Hi, everyone - my name is Jonathan Ellsworth, and I'm the founder & editor of Blister. I think you have a very interesting conversation going -- thanks for getting it started, Phil.
I hope I'm not an unwelcome guest at this party, but given that Blister has come up a lot in this thread, and given that I have spent a huge amount of time thinking about what makes a ski (or bike or snowboard or climbing shoe, etc) as useful as possible, I thought I'd check in.
First, I think it's clear that you've got a good thing going at Epic, and it's cool to hear some of you talk about how on-point and helpful the equipment recommendations from others on this board have been to you. The only reason to ski is that it's fun, and I truly think that getting skiers lined up with the best ski for them can significantly enhance their time on the mountain.
That's the fundamental reason I started Blister: I felt the existing buyer's guides were horrible at doing exactly this. They were often just egregiously wrong, and sometimes it seemed like they were straight-up lying. Our primary goal is to steer people toward the equipment that will work best for them. That's really it. How effective we are at that is debatable, of course, but I will say that I'm proud of our track record.
When we started Blister, a real question for me was: given that we're talking about something subjective here - your experience of a ski vs. mine - what sort of agreement rate would count as success? Would 7 out of 10 skiers who spent time on a ski we reviewed need to say they agreed with us? 8 out of 10? I really didn't know.
But based on thousands of responses we've received on the site and via private emails, I think we're far, far above that 80% mark. And given that this is the internet (where people rarely go out of their way to be charitable, to put it mildly), I find it hard to believe that if a bunch of people had bought skis based on our reviews - and felt that we'd misled them - they wouldn't be cursing us out all over the site, and understandably so.
A couple other things (and sorry this is long, but I'm jumping in late):
(1) A few of you have mentioned that you think Blister is - or has become - too pro or 'pro-bro' or something. I'm really not sure what that means. If you happen to know or have ever corresponded with any of our reviewers, I'm pretty sure you'd see how far from the truth this is. My own background is in academic philosophy. (I gave up teaching at the age of 28 after moving to New Mexico to work on a writing project for the summer, and never left. I fell in love with New Mexico, skiing, and Taos Ski Valley.) One of our senior reviewers is a heli guide in Alaska, but he's also a general practitioner who delivers a bunch of babies on Kodiak Island in the spring, summer, and fall. He's smart, he's a very good skier, he's humble, and he's not at all worried about proving to people on the internet that he's a good skier. If I ran down the list of our reviewers, the stories would all be pretty similar. (And yes, a number of our reviewers did grow up racing.) I guess you'll have to take my word for it, but I would describe pretty much all of our reviewers as nice, humble people who happen to be good skiers. And nobody is pro - if you're a sponsored skier, we don't let you review skis for us, because of the conflict of interest.
(2) It is true that we think it's part of our job to push some skis hard in technical terrain, because (a) some of the skis we review are supposed to excel there, and (b) if we're really going to be useful to everyone reading a review - whether he or she is new to skiing, or is currently a guide in Chamonix, or is an older, accomplished skier - then we have to do our best to provide the information that is most relevant to them. To target only hard chargers, or only those who tend to stay on-piste ... that's too limited.
We literally started Blister by reviewing the skis that I happened to have in my garage. We started with what was available to us, and worked hard to review that stuff as accurately as we could. And as several of you have noticed, over time, we've expanded into more ski categories. We couldn't immediately review everything at once, not given the amount of time we think it takes to review skis accurately and well.
So rather than dismiss our reviews because you think we're too bro, or that we can't carve a ski and just pivot them or whatever - especially given that I'm pretty sure those comments were made by people who have never skied with any of our reviewers - I think the better thing to do is to judge us on each particular review. For example, do you think my review of the 15/16 177cm Kendo is inaccurate? I seem to remember that, a few years back, Beyond called my review of the 13/14 Mantra one of the best reviews he'd read of that ski. I was happy to hear that, because from the bit I know about Beyond, it seems like he knew the Mantra and the skis in that category quite well. That matters to me. It also matters to me that two very non-Mantra-esque skis - the Rossignol S3 and the Rossignol Soul 7 - have many, many comments on Blister from readers saying that they bought those skis based on my reviews, and that their experience on them has been consistent with what I wrote. I'm not sure how that would be possible if all we were really trying to do was convince strangers that we spend all day, every day, raging through 60-degree breakable crust.
Anyway, sorry to drag on, and I probably failed to even address the most interesting topics that you all have raised in this thread. So I'll leave you with this one, since some of you have debated in this thread better & worse ways of testing skis:
When it comes to ski reviews in general, the practice of taking a ski out for a single run - or two - is so inadequate in my view that I regard it as fatally flawed. I don't care if you put tape over your topsheets, or have 40 different reviewers take 1 or 2 runs on a ski. That is an ineffective way to pinpoint the nuances of what a ski does or doesn't do. We regularly find that it takes at least a run or two just to acclimate to a new ski. So after 1 or 2 runs, you shouldn't be finishing up on a ski and filling out some note card; that's when you should just begin to assess it. And to me, at least, if you can't accurately lay out those nuances, then why should I bother to read your review? Furthermore, your comparisons to other skis in the category are going to be seriously lacking, too.
Just to drive home the point: if I've spent 5-6 days reviewing a stiff all-mountain ski with a traditional mount point, and the next ski I'm reviewing is a softer, wider ski with a more progressive mount point, there is no way I ought to be weighing in on that new ski after a run or two - and no reason you ought to trust my opinion if I did. Furthermore, if I'm going to assess and then articulate how a ski performs in moguls, in trees, on soft groomers, on roughed-up groomers, in narrow steeps, in pow, etc., then (1) why wouldn't you want to know all of that before you drop hundreds of dollars on a ski? And (2) doesn't a ski test that allows the reviewer only 1 or 2 runs start to look more and more inadequate?
Again, sorry for the long comment. But these are interesting topics.
Edited by JFE24 - 9/29/15 at 1:16am