Spiteful (AKA Spinfuel, by chance?)
Yes, like Spinfuel. They were the tipping point. There is very little ethical integrity left in journalism, and this is coming from someone who spent years working in the print news business. It's all about the almighty advertising dollar. Advertising should support the content's message...not the other way around.
When every e-liquid that is reviewed gets a 4-5 star rating, and every vendor gets some kind of "award", there is a serious problem. I actually addressed this issue on Google+, and here is the editor's response:
Me: Strikes me as odd that most of the reviews on Spinfuel are so high. Not every juice deserves a 4-5 star rating.
Spinfuel eMagazine: I always thought the team needed to explain more about this, perhaps I will add to the Protocols page. Here's why you find it odd.
Most of the eliquid reviews the team does are of brands that have already been somewhat vetted. They take recommendations from readers, along with their own explorations, and choose companies to review. These companies must all have certain things in place in order to qualify for a review at all. Among them are dedicated mixing facilities, contamination prevention, shrink-wrap or some other method of sealing the bottles, labeling requirements, and ingredient restrictions. Now, yes, a lot of it is the "honor system", and sometimes a brand will stretch the truth in order to be reviewed, but it's not often.
Once these requirements were put into place, 20 months ago now, the team decided that if a brand met the requirements, its "score", or "rating", would begin at 3. Three is the "average score" for an eliquid brand that meets all the requirements. That doesn't mean an eliquid can only score 3 stars and above; it can go down to zero stars if it's just a terrible eliquid. As it turns out, the brands chosen by the team have been very good brands. An "average" premium eliquid, say a custard flavor, would score 3 stars. If, after meeting the requirements, the flavor and vapor are better or worse than expected for a premium juice, the score is adjusted up or down.
I think the problem comes in because readers think that all eliquids start at zero stars and work up from there. But that's not how it goes. They start at 3, and only because the team no longer reviews just any brand. All the email for the entire crew goes through my servers, and I can tell you that we get at least 5 emails a day from eliquid brands; in a typical week, that's 35+ emails asking to be reviewed by the team. Of those 35+ requests, the team picks at most 2, the 2 they think would be the best of the bunch. Chances are those are going to score higher than 3 stars because they have been selected carefully. The team turns down brands every day, after learning more about a brand or finding that it doesn't meet the requirements. The ones they don't turn down are put in a queue. Some brands wait 3 or 4 months to get reviewed.
If we had 4 or 5 teams doing reviews, or if the team spent 10 minutes with a brand instead of 72 hours, then we could review all of them, and you would undoubtedly see many eliquids scoring much, much lower. But the team would rather search for the better brands and provide an in-depth review of those better brands.
On occasion the team has been burned: the brand appears to be a premium label, they say all the right things, and it turns out that they actually mix the juice in the back room of a shop, with no standards and no protection against contamination. When they are discovered, the team tosses out the juice and strikes them from the queue. That happened twice in the last month alone. In the emails and phone calls they are all excited about being reviewed by the team, but when the juice arrives the bottles are not shrink-wrapped, the labels are handwritten or carry no real information, the juice smells totally artificial, and a sampling of one flavor shows the juice is crap. The team just tosses the juice, and because they feel they were taken advantage of, most of the time they don't even bother telling the brand until enough time goes by that the brand emails the team. Julia or Tom will simply write back that the juice that arrived did not meet the requirements as promised, and therefore will not be reviewed (this is even stated in the submission guidelines PDF they send to every brand).
The bottom line as to why you see a lot of 4, 4.25, 4.5, 4.75, and even 5 star ratings is that the brands have been preselected and prequalified, and they know a thing or two about making good juice. And let's be honest, most good brands make good juice these days. The team could hate the flavor, but because the juice is clear of contaminants, bottled safely, the labels carry all the information you could want, and photos of the facility show excellent, well-maintained equipment, the score is going to be above a 4.
The team's unwillingness to spend their time with juice made in a basement, spare bedroom, or storage unit prevents poorly made juice from reaching them (most of the time), so these scores are natural.
One more thing: the team did tell me some months back that they want to "reset" the whole scoring method. When regulation comes and bedroom/basement/garage juice can no longer be made and sold online, they will reset the whole thing, and all eliquids will start at zero and have to earn their way up the ladder. All juice will be premium, safe, and well made, but until that time they cannot reset the scoring system without putting all those eliquids that have already earned high scores up against juice that is just as good but now scores 1.2 or 1.5. It wouldn't be fair. Regulations will bring a new set of rules, and a Spinfuel eLiquid Review score will be referred to as "PR" or "AR": 4 stars PR (pre-regulations), 2 stars AR (after regulations).
Oh, and "flavor" alone doesn't make a juice. Even I have learned that a first impression of an eliquid tells you nothing. A juice that tastes like crap first thing in the morning could be great in the evening, or vice versa. The way the team has worked all this out has been nothing short of brilliant. The only thing they didn't anticipate almost 3 years ago is that they would need to restrict their reviews to premium labels only. Once they did, the scores sometimes looked inflated. But that won't last forever.
Thanks for the comment, and I hope you can understand better why it's done the way it's done. - Dave Foster - Spinfuel