{"id":111,"date":"2018-01-22T20:02:16","date_gmt":"2018-01-22T20:02:16","guid":{"rendered":"https:\/\/courses.lumenlearning.com\/suny-webliteracy\/chapter\/how-to-think-about-research\/"},"modified":"2018-01-22T20:02:16","modified_gmt":"2018-01-22T20:02:16","slug":"how-to-think-about-research","status":"publish","type":"chapter","link":"https:\/\/courses.lumenlearning.com\/suny-webliteracy\/chapter\/how-to-think-about-research\/","title":{"raw":"How to Think About Research","rendered":"How to Think About Research"},"content":{"raw":"<p>This brings us to our third point, which is how to think about research articles. People tend to think that newer is better with everything. Sometimes this is true: new phones are better than old phones, new textbooks are often more up-to-date than old textbooks. But the understanding many students have about scholarly articles is that the newer studies \u201creplace\u201d the older studies. You see this assumption in the headline: \u201cIt\u2019s Official: European Scientific Journal Concludes\u2026\u201d\n\nIn general, that\u2019s not how science works. In science, multiple conflicting studies come in over long periods of time, each one a drop in the bucket of the claim it supports. Over time, the weight of the evidence ends up on one side or another. Depending on the quality of the new research, some drops are bigger than others (some much bigger), but overall it is an incremental process.\n\nAs such, studies that are consistent with previous research are often more trustworthy than those that have surprising or unexpected results. This runs counter to the narrative promoted by the press: \u201cnews\u201d, after all, favors what is new and different. 
<p>The unfortunate effect of the press's presentation of science (and in particular of science around popular issues such as health) is that rather than giving a sense of the slow accumulation of evidence on each side of an issue, the narrative presents a world where last month's findings are "overturned" by this month's findings, which are then, in turn, "overturned" back to the original finding a month from now. This whiplash presentation ("Chocolate is good for you! Chocolate is bad for you!") undermines the public's faith in science. But the whiplash does not come from science: it is a product of the press's inappropriate presentation.</p>

<p>As a fact-checker, your job is not to resolve debates based on new evidence, but to accurately summarize the state of research and the consensus of experts in a given area, taking into account majority and significant minority views.</p>

<p>For this reason, fact-checking communities such as Wikipedia discourage authors from over-citing individual studies, which tend to point in different directions. Instead, Wikipedia encourages users to find high-quality secondary sources that reliably summarize the research base of an area, or research reviews of multiple works. This is good advice for fact-checkers as well. Without an expert's background it can be very hard to place new research in the context of old, which is what you want to do.</p>

<p>Here's a claim (two claims, actually) that ran recently in the Washington Post:</p>
<blockquote>
<div>The alcohol industry and some government agencies continue to promote the idea that moderate drinking provides some health benefits. But new research is beginning to call even that long-standing claim into question.</div>
</blockquote>
<p>Reading further, we find a more specific claim: the medical consensus is that alcohol is a carcinogen even at low levels of consumption.</p>
<p>Is this true?</p>

<p>The first thing we do is look at the authorship of the article. It's from the Washington Post, which is a generally reliable publication, and one of its authors has made a career of data analysis (and won a Pulitzer Prize as part of a team that analyzed data and discovered election fraud in a Florida mayoral race). So one thing to think about: these people may be better interpreters of the data than you. (A key thing for fact-checkers to keep in mind: you are often not a person in a position to know.)</p>

<p>But suppose we want to dig further and find out whether they are really looking at a shift in the expert consensus, or just adding more drops to the evidence bucket. How would we do that?</p>

<p>First, we'd sanity-check where the pieces they mention were published. The Post article mentions two articles by "Jennie Connor, a professor at the University of Otago Dunedin School of Medicine," one published last year and the other published earlier. Let's find the more recent one, which seems to be a key input into this article. We go to Google Scholar and type in "'Jennie Connor' 2016":</p>

<p><img class="alignnone size-full wp-image-218" src="https://s3-us-west-2.amazonaws.com/courses-images/wp-content/uploads/sites/2942/2018/01/22200211/image44.png" alt="Google Scholar results for the search 'Jennie Connor' 2016" width="1768" height="1038" /></p>

<p>As usual, we're scanning quickly to get to the article we want, but also minding our peripheral vision. We see that the top result is probably the one we want, but we also notice that Connor has other well-cited articles in the field of health.</p>

<p>What about this article on "Alcohol consumption as a cause of cancer"? It was published in 2017 (which is probably the physical journal's publication date, the article having been released in 2016). Nevertheless, it has already been cited by twelve other papers.</p>

<p>What about the publication, <em>Addiction</em>?</p>
<p>Is it reputable? Let's take a look with an impact factor search:</p>

<p><img class="alignnone size-full wp-image-219" src="https://s3-us-west-2.amazonaws.com/courses-images/wp-content/uploads/sites/2942/2018/01/22200214/image06.png" alt="Search results showing the impact factor of the journal Addiction" width="1999" height="809" /></p>

<p>Yep, it looks legit. We also see in the little card to the right that the journal was founded in the 1880s. If we click through to the Wikipedia article, it tells us that this journal ranks second in impact factor among journals on substance abuse.</p>

<p>Again, you should never use impact factor for fine-grained distinctions. What we're checking here is that the Washington Post wasn't fooled into covering research far outside the mainstream of substance abuse studies, or tricked into covering something published in a dodgy journal. It's clear from this quick check that this is a researcher well within the mainstream of her profession, publishing in prominent journals.</p>

<p>Next we want to see what kind of article this is. Sometimes journals publish short reactions to other works, or smaller opinion pieces. What we'd like to see is that this was either new research or a substantial review of research. We find from the abstract that it is primarily a review of research, including some of the newer studies. We note that it is a six-page article, and therefore not likely to be a simple letter or response to another article. The abstract also goes into detail about the breadth of evidence reviewed.</p>

<p>Frustratingly, we can't get our hands on the full article, but this probably tells us enough for our purposes.</p>
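The Google Scholar step described above can also be scripted. This small sketch (using only Python's standard library, and Scholar's public search URL) builds the same query URL you would reach by typing into the search box; note how quoting the author's name keeps it together as an exact phrase while the year term narrows the results:

```python
from urllib.parse import urlencode

def scholar_search_url(query: str) -> str:
    """Build a Google Scholar search URL for the given query string."""
    return "https://scholar.google.com/scholar?" + urlencode({"q": query})

# Quoting the name searches it as an exact phrase; the year narrows results.
print(scholar_search_url('"Jennie Connor" 2016'))
# → https://scholar.google.com/scholar?q=%22Jennie+Connor%22+2016
```

Opening that URL in a browser shows the same result list as the screenshot above; the quotation marks are percent-encoded as <code>%22</code> and spaces become <code>+</code>.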
<section class="citations-section" role="contentinfo">
<h3>Candela Citations</h3>
<div>
<div id="citation-list-111">
<div class="licensing"><div class="license-attribution-dropdown-subheading">CC licensed content, Shared previously</div><ul class="citation-list"><li>Web Literacy for Student Fact-Checkers. <strong>Authored by</strong>: Michael A. Caulfield. <strong>Located at</strong>: <a target="_blank" href="https://webliteracy.pressbooks.com/">https://webliteracy.pressbooks.com/</a>.
<strong>License</strong>: <em><a target="_blank" rel="license" href="https://creativecommons.org/licenses/by/4.0/">CC BY: Attribution</a></em></li></ul></div>
</div>
</div>
</section>
type?post=111"},{"taxonomy":"contributor","embeddable":true,"href":"https:\/\/courses.lumenlearning.com\/suny-webliteracy\/wp-json\/wp\/v2\/contributor?post=111"},{"taxonomy":"license","embeddable":true,"href":"https:\/\/courses.lumenlearning.com\/suny-webliteracy\/wp-json\/wp\/v2\/license?post=111"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}