How to Tell a Good Website from a Crap Website
How to tell whether a science article on the web is reliable or not.
Skeptoid Podcast #336
It's been said that searching for information on the Internet is like drinking from a firehose. There is a mind-boggling amount of information published that's freely available to anyone and everyone. The Internet grows so quickly that every time you open your web browser, you've got direct access to the largest compilation of information in history, bigger than all the books in all the libraries in all the world; and at current rates, it's growing by 5% every month. Search for information on any given subject, and you're presented with more options than anyone can know what to do with. So when you go looking for some decent information, how can you tell whether the website you've found is giving you good info, or giving you crap? Today we'll find out.

We're going to look at three categories of tools for appraising the validity of the information presented on a website. First, we'll go through some general rules of thumb, pertaining to the website's style of presentation, that most laypeople should be able to spot. Next, we'll look at a handful of software tools designed to give you an objective assessment. And finally, we'll quickly review the "How to Spot Pseudoscience" guide, which should give you a pretty darned good idea of the quality of any given piece of information you're curious about.

Style of Presentation

There actually is a certain amount of "judging a book by its cover" that makes sense, particularly for websites. Websites can be published by anyone, whether they have a large staff of editors and researchers behind them or not. Big slick presentations are found everywhere, from university websites to science journals to mass-media consumer portals promoting who knows what. But there are important differences between a science article and a pseudoscience article, even on the slickest website, that you can learn to spot.

Often the most obvious is the list of references at the end of the article. If there isn't one, then you're probably reading a reporter's interpretation of the research, and should try to click through to find the original. If there are no references at all, that's a big red flag that what you're reading is unlikely to be legitimate science research. If it's not referenced, pass and move on. A lack of references doesn't mean the article is wrong; it just means that there's a better, more original source out there that you haven't found yet.

If there are references, be aware that cranks often list references too, so there are some things to look out for. Be wary of an author who cites himself. Be especially wary of a cited study or article that, once you click on it, turns out to say something different from what the author described. It's very important to look at what those citations are: are they articles in legitimate science journals, or are they published in a journal dedicated to the promotion of something?

Many Google results will return not a page on a slick big-budget website, but an obscure page. For example, university professors will often have a little website on the university's server describing their research or whatever. Often, those little websites look terrible, because they're not made by a professional web person. A crank who churns out his own website might superficially look really similar. So how do you know whether you're looking at an amateurish site made by a crank, or an amateurish site made by a real science expert?
One way is that real science professionals know there are ways to establish proper credibility, and they generally follow those rules. The citation of sources is important here as well. A proper research scientist knows that he must list sources to be taken seriously. A crank often skips it, or cites himself, or makes vague references to famous names like Einstein (probably the only names he knows).

Grammatical errors are another case where it's appropriate to judge a book by its cover. Bad spelling and grammar left uncorrected is a sign that you're probably reading the page of a crank, who works in isolation and has nobody double-checking his work. A professor's personal website, however, is often checked over and corrected by undergrads or associates. Do be wary of bad grammar.

We've been dancing around the question of who the author is. First of all, if the author is anonymous, dismiss the article out of hand. If the author is a reporter, which is often the case, then you need to click through to find the lead researcher's name. If he's a legitimate scientist, he'll have plenty of publications out there, and it's easy to look him up by going to Google Scholar and typing in his name. This doesn't prove anything, but having publications in recognized journals gives the author more credibility than someone who has none. Be aware that indexes like Google Scholar also list crap publications, even mass-market paperback books that are not vetted in any way, so you do have to look carefully at exactly what those publications are.

If the website teases you with a bit of titillating information but then requires a purchase to get the rest of the story, you could be dealing with a crank sales portal, or you could be dealing with a paywall, which is still (unfortunately) common for science journals. Universities almost always have accounts that allow them past the paywalls. You should be able to tell easily whether you're looking at a paywall where researcher credentials can be entered to download the full article, or a sales page trying to pitch you on buying the book to learn "the secrets" or whatever it is. A journal paywall is a good indicator that you're probably looking at real science; a sales page is a good indicator that you're probably looking at crap.

A braggadocious domain name like RealScientificInfo.com or LifeRevolution.biz is just like a used car salesman calling himself Honest John. Websites like that are not typical of the way proper science reporting is done. The website should represent an actual, real-world organization, academic institution, or publication, and not just be some random web compilation.

Software Tools

It would be great if there were such a thing as a web browser plugin that would simply show a layperson a red X or a green check to tell them whether a website is reliable or crap. But despite a number of efforts to build just such a thing, no great headway has been made.

One good tool is the Quackometer, which uses an automated algorithm to scan a website's pages, looking at their use of language. It comes back with a score telling you how likely it is that the site is misusing scientific-sounding language and promoting quackery, or whether it generally appears sound.
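To make the general idea concrete, here is a deliberately naive sketch of how that kind of tool can work in principle. To be clear, this is not the Quackometer's actual algorithm: the word lists and the scoring below are invented purely for illustration, and a real tool does far more sophisticated language analysis.

```python
# Toy "quack score": count pseudoscience buzzwords in a page's text and weigh
# them against signs of real referencing. Purely illustrative; the word lists
# are invented for this sketch and are not taken from any real tool.
import re
import urllib.request

QUACK_TERMS = [
    "quantum healing", "miracle cure", "detox", "vibrational energy",
    "doctors won't tell you", "ancient wisdom", "revolutionary breakthrough",
]
SCIENCE_SIGNALS = ["doi.org", "et al", "journal of", "references", "peer-reviewed"]

def fetch_text(url: str) -> str:
    """Download a page and crudely strip its HTML tags."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return re.sub(r"<[^>]+>", " ", html)

def quack_score(text: str) -> float:
    """Return 0..1: 1.0 means only quack language was found, 0.0 only science signals."""
    lowered = text.lower()
    quack_hits = sum(lowered.count(term) for term in QUACK_TERMS)
    science_hits = sum(lowered.count(term) for term in SCIENCE_SIGNALS)
    total = quack_hits + science_hits
    return quack_hits / total if total else 0.5  # 0.5 means no evidence either way

if __name__ == "__main__":
    sample = "This revolutionary breakthrough in quantum healing is what doctors won't tell you."
    print(f"quack score: {quack_score(sample):.2f}")  # prints 1.00 for this sample
```

Even a toy like this makes the limits of the approach obvious: keyword counting can only ever be a prompt for closer reading, never a verdict, which is part of why no browser plugin has cracked the problem.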
Obviously the Quackometer is an imperfect solution; but when I've used it on sites that I know, I've found that its results are generally correct, with its biggest flaw being that it often gives a little too good a score to sites that deserve lambasting.

Rbutr is a browser plugin that lets users link articles that rebut whatever's written on the current page. So, if you're reading something that's been rebutted somewhere, rbutr will link you right to it. The downside is that it cuts both ways: it rebuts a bad article with a good one, and rebuts that same good article with the bad one, according to someone else. There's not really a way for the end user to know which is better, just that they rebut each other.

Online trustworthiness services, of which TRUSTe is the best known, allow sites to pay for a privacy certification that they can display on their websites. Somewhat surprisingly, it turns out that sites that pay for these logos are actually more likely to be untrustworthy; people with less honorable motives are often more highly motivated to convince you that they are honorable. And in any case, site privacy has nothing to do with the quality of a site's articles. If you see some sort of logo or certification on a website, it proves nothing whatsoever by itself. By no means should you assume that it makes the information likely to be good.

The best roundup of tools for assessing the validity of online data is Tim Farley's Skeptical Software Tools. You should keep it as a bookmark, and if anything new comes along for helping laypeople evaluate websites, Tim will be among the first to report on it.

How to Spot Pseudoscience

Skeptoid followers may recognize this list from episode 37, way back long ago. This is an abbreviated version that you can apply to the contents of a website. These common red flags don't prove anything, but they're characteristics of pseudoscience. Watch out any time you see these on a website:

Ancient knowledge, ancient wisdom, statements that ancient people believed or knew about this, or that it's stood the test of time. To test whether an idea's true, we test whether it's true; we don't ask whether ancient people believed it.

Claims of suppression by authorities, an old dodge to explain away why you've never heard of this before. The biggest red flag of all is that somebody "doesn't want you to know" this, or "what doctors won't tell you".

Anything that sounds too good to be true probably is. Miraculously easy solutions to complicated problems should always set off your skeptical radar.

Is the website dedicated to promotion or sales pertaining to a particular product or claim? If so, you're probably reading a sales brochure disguised as a research report.

Be especially aware of websites that cite great, famous, well-known names as their inspiration. Albert Einstein, Nikola Tesla, and Stephen Hawking are three of the most abusively co-opted names in history. Real research instead tends to cite current researchers in the field, names that few people have ever heard of. The famous names are mentioned mainly in sales pitches.

Always watch out for the all-natural fallacy, in its many guises. If a website trumpets the qualities of being all-natural, organic, green, sustainable, holistic, or any other of the popular marketing buzzwords of the day, it's more likely that you're reading pseudoscience than science.

Does the article fit in with our understanding of the world?
Is it claiming a revolutionary development or idea, such as free energy or super health, things everyone wants but that don't actually exist? Be skeptical. Real research always cites weaknesses and conflicting evidence, which is always present in science. Pseudoscience tends to dismiss all such evidence. If a website claims that scientists or experts all agree on this new discovery, you're probably reading unscientific nonsense.

In general, the word "revolution" is something of an old joke in science fields, along with the phrases "scientists are baffled" and "what they don't want you to know". If the website promises to revolutionize anything, you're almost certainly dealing with a crank who has little connection with genuine science.

Anytime someone puts on their web page that they're smart, or that they're a renowned intellectual or thinker, they're not. Click your way elsewhere.

Finally, always run screaming from a website by One Guy with All the Answers. The claim to have solved or explained everything with a new, pioneering theory is virtually certain to be crankery. (A toy sketch of how a few of these red flags can be checked for automatically appears after the wrap-up below.)

So there you have it; it's neither perfect nor comprehensive, but it should give most laypeople a fair start on evaluating a website's quality of information. If nothing else, it shows what a difficult task this is, and highlights yet another reason why so many people believe weird things. Bad information is easy to sell, and not always so easy to spot.
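For readers who want to play with the checklist above, here is a small sketch that scans a page's text for a few of these red-flag phrases. The phrase lists are rough approximations of the items discussed in this episode, not an official list, and a match proves nothing by itself; at most it flags a page as worth a more careful read.

```python
# Toy checklist scanner for the red flags discussed in this episode.
# The phrase lists are rough approximations invented for this sketch.
RED_FLAGS = {
    "ancient wisdom": ["ancient wisdom", "ancient knowledge", "stood the test of time"],
    "suppression": ["don't want you to know", "doctors won't tell you", "suppressed"],
    "too good to be true": ["miracle", "effortless", "instant results"],
    "famous names": ["einstein", "tesla", "hawking"],
    "all-natural buzzwords": ["all-natural", "holistic", "detox"],
    "revolution": ["revolutionary", "scientists are baffled", "paradigm shift"],
}

def spot_red_flags(text: str) -> dict:
    """Return each red-flag category whose phrases appear in the text."""
    lowered = text.lower()
    found = {}
    for category, phrases in RED_FLAGS.items():
        hits = [phrase for phrase in phrases if phrase in lowered]
        if hits:
            found[category] = hits
    return found

if __name__ == "__main__":
    page = "This revolutionary, all-natural cure is what doctors won't tell you. Even Einstein agreed!"
    for category, hits in spot_red_flags(page).items():
        print(f"{category}: {', '.join(hits)}")
```

Like the scoring toy earlier, this is only a prompt for human judgment; the harder checks, such as whether the citations actually say what the author claims, still require a person.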