Consumer Reports Mattress Reviews

January 15, 2017 By Michael Magnuson
Consumer Reports built its reputation as a “consumer watchdog,” and through this approach has amassed loads of credibility and influence with consumers over the years.  While some of this credibility is surely deserved, when it comes to mattresses it poses a problem for both consumers and the industry — because much of the information they provide is wrong or misleading.
 
Here at GoodBed, we neither make nor sell mattresses, and we cover both online and offline products of all brands, so I have no stake in who comes out on top per se.  My commitment is simply to providing good information — so that the consumer gets the truth and good companies are rewarded for the good things they do.  In my mind, a ranking that shows Tempur-Pedic as the worst memory foam mattress brand (and by a large margin) is wildly misleading and the result of an inherently flawed methodology.
 
For my part, I don’t question Consumer Reports’ integrity.  As others have pointed out, Consumer Reports does earn some of its money through affiliate links (http://d.pr/i/U9Yq).  However, I think it’s entirely possible for a company to maintain its objectivity even when it earns revenue through advertising — we do it, and so do the New York Times and every other reputable media company.  I believe Consumer Reports’ problem is that they present themselves as mattress experts, but aren’t.
 
I have heard the CR mattress “expert” go on TV morning shows and say things like “This mattress here is made of 100% memory foam” (anyone with knowledge of mattresses knows that would be like baking cookies with only sugar and no flour).  This same ignorance is reflected in their rankings.  Three of the top 10 “memory foam mattresses” in their current rankings have NO memory foam (including Tuft & Needle), and another three have less than 2 inches of it (including Casper).  Their top “memory foam mattress” is a 100% latex mattress (with no memory foam) that is also listed as “Certified Organic” (it isn’t).  This kind of misinformation isn’t just ignorant, it’s irresponsible.
 
So what about the rankings themselves?  Aren’t they based on scientific tests?  Well, what do you get when you try to apply the scientific method without the proper context or expertise to do so?  You get pseudoscience.  An example of where that pseudoscience leads on the motion isolation front is this utterly confounding statement from Consumer Reports:
 
“None of the memory foam mattresses earned excellent scores for stabilization, so steer clear of that type if this is a big concern for you and your partner.”  
 
So, according to Consumer Reports, memory foam — the material that was literally developed by NASA for its shock absorption qualities, and whose motion isolation properties can be easily observed in Tempur-Pedic’s famous wine glass test, which has been replicated by countless people — should be avoided if motion isolation is important to you… huh?
 
Looking at the rankings reminds me of a time a while back when one of the big financial publications was trying to break into the school rankings game.  To kick off this effort, they produced a ranking of the top business schools in which Harvard Business School was ranked… #21.  Now, as someone who did not attend this school, I enjoyed the schadenfreude in this as much as the next guy.  But at the same time, the ranking was utterly ridiculous — you’d have been hard-pressed to find a single person among the top business school applicants and professors who wouldn’t place HBS among the top 3 schools where they’d want to be.  The lesson here is that wanting to have a scientific methodology is great, but when the results don’t jibe with empirical evidence, it’s not science — it’s pseudoscience.
 
A deeper look at how Tempur-Pedic (or more specifically, the Tempur-Cloud Supreme) ended up where it did reveals some more specific flaws in Consumer Reports’ methodology:
 
  • Their durability tests don’t accurately measure durability, especially for all-foam mattresses.
  • Durability differences are not weighted sufficiently in their overall ranking algorithm.  For example, the Tempur-Pedic was reported as showing “no changes in performance” while brands like T&N were reported as showing “minor changes in performance” — yet both received the highest score for durability.
  • The Cloud Supreme scored low for back support, especially for back sleepers.  This actually passes the smell test, in my opinion.  This particular model not only has a lot of memory foam, but also a medium-soft feel — a combination that isn’t ideal for support, especially for back sleepers.  Of course, in real life a back sleeper could just choose a different Tempur-Pedic model.
  • Pressure relief (which, along with back support, is the other critical thing that everyone needs from their mattress) is not weighted sufficiently in their ranking algorithm.  If it were, then a model like the Cloud Supreme that took a relative ding in the support ratings would get a similarly sized boost from the pressure relief ratings.  That was not the case.  As a result, brands with less memory foam (or none at all) got far better overall scores; the sketch after this list illustrates how much this kind of weighting choice can matter.
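 
To make that weighting point concrete, here is a rough sketch of how a weighted-average score behaves.  To be clear, the categories, ratings, and weights below are invented purely for illustration; I have no visibility into Consumer Reports’ actual numbers or formula.  The point is simply that when a category like pressure relief carries little weight, a mattress that excels at it can still land near the bottom:

# Hypothetical illustration only -- not Consumer Reports' actual data or formula.
# Shows how the weight assigned to pressure relief changes a weighted-average score.

def overall_score(ratings, weights):
    """Weighted average of category ratings (each on a 1-10 scale)."""
    return sum(ratings[cat] * weights[cat] for cat in weights) / sum(weights.values())

# Made-up ratings: a soft memory foam model that trades some back support
# for excellent pressure relief, vs. a firmer model that does the reverse.
memory_foam = {"support": 6, "pressure_relief": 9, "durability": 9}
firmer_model = {"support": 8, "pressure_relief": 6, "durability": 9}

# If pressure relief barely counts, the support ding dominates the result...
light_weighting = {"support": 0.6, "pressure_relief": 0.1, "durability": 0.3}
# ...but if it counts as much as support, the picture flips.
balanced_weighting = {"support": 0.35, "pressure_relief": 0.35, "durability": 0.3}

for label, weights in [("light", light_weighting), ("balanced", balanced_weighting)]:
    print(label,
          round(overall_score(memory_foam, weights), 2),
          round(overall_score(firmer_model, weights), 2))

With the “light” weighting the firmer model wins comfortably (8.1 vs. 7.2); with the “balanced” weighting the memory foam model edges ahead (7.95 vs. 7.6), even though nothing about the mattresses changed.  That is the kind of sensitivity a ranking algorithm has to get right.
 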
Net, the people at Consumer Reports aren’t bad people, but they are being irresponsible.  They are serving up wildly inaccurate and misleading information about mattresses, and selling it to people under the guise of scientific accuracy.  Some may argue that pseudoscience is better than no science, but I disagree in this case.  People assign extra credibility to recommendations that they believe are backed by science, so when the resulting information is wrong but appears to be scientific, it leads consumers even further down the wrong path.
 
I hope others will join us in standing up to this two-ton gorilla and calling them on their faulty methodology and conclusions, so that consumers aren’t led astray any further by the misinformation Consumer Reports is providing in this category.  
 
Sometimes even the “watchdog” needs a watchdog…