Statistics in Journalism

I was recently speaking with my old maths teacher, who had attended a talk about the maths used in journalism. At first I questioned what maths could possibly be involved, but she pointed out that reporting is actually full of statistical claims: percentages and fractions, poll figures and research results. Following this up, I found that many papers are now only taking on employees who hold a maths A Level, simply because so much statistical reporting is done badly today.

Looking further into the statistics that reporters use, there is actually quite a lot of it, ranging from the basics up to more advanced techniques. Stats.org is a website aimed at journalists and at journalism in general, helping to educate individuals in correct statistical reporting and highlighting articles that misuse statistics.

Statistics is quite a delicate area of maths, as the slightest change can make a world of difference. With p-values, a shift of merely 0.01 (say, from 0.045 to 0.055) could change the conclusion of a report from significant to insignificant, or vice versa. Therefore, a reporter who knows no statistics could write the figures down haphazardly and report them incorrectly. Granted, they wouldn't have to be reported in the APA style we are trained to use, but they would still need to be recorded in a way that people understand, and reported correctly.
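To make that fragility concrete, here is a minimal Python sketch (the p-values are invented, and 0.05 is simply the conventional cutoff): two results just 0.01 apart land on opposite sides of the threshold and so produce opposite conclusions.

```python
# Hypothetical illustration: two p-values only 0.01 apart
# lead to opposite conclusions under the usual 0.05 cutoff.

ALPHA = 0.05  # conventional significance threshold

def verdict(p_value: float) -> str:
    """The conclusion a report would typically draw from a p-value."""
    return "significant" if p_value < ALPHA else "not significant"

for p in (0.045, 0.055):
    print(f"p = {p:.3f} -> {verdict(p)}")
# p = 0.045 -> significant
# p = 0.055 -> not significant
```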

A good example of media reporting is when research is publicised. A recent study into the effect of obesity on cognitive skills was written about on BBC News online (http://www.bbc.co.uk/news/health-17465404). “Obesity harms later brain skill” was the headline, so from this blunt title you would assume that the more obese you are, the less intelligent you are. Reading into the paper, the finding is that individuals with a higher BMI were less likely to score highly on a mental test. The research itself, then, found a correlation between BMI and mental ability, yet the way it was reported suggested a causal relationship. The sample was only 250 individuals between the ages of 60 and 70, so a limited sample was used, and only a single test measured cognitive ability. From this, we certainly cannot conclude a causal relationship, because of the many other factors not tested; for example, there was no control group of younger people or of individuals with a lower BMI. This is only one example of many instances of incorrect reporting.

Arguably, this is done simply to create a headline that sells; usually, reading the article is enough to clear up the real meaning of the headline. However, statistics also need to be reported carefully, and a lack of knowledge of how to do so may equally be the cause of misleading headlines.

Therefore, although we may not think of it, a vast range of statistics is actually used in reporting, and reporters and journalists with at least a basic knowledge of statistics would help ensure correct reporting.


13 Responses to Statistics in Journalism


  2. emcg1 says:

    Hi 🙂 the importance of correct reporting is never better highlighted than by the difference between a fabricated story and a truthful one. My first thought has always been that false reporting through the correlation-causation mistake comes down to a need to sell papers. Until reading your blog I had never thought of another reason for it: as you highlight here, a journalist's lack of knowledge of statistics could be leading them to report incorrect information. It is a very valid point; without a correct knowledge of statistics, even moving a decimal point around could vastly affect interpretation. It is entirely possible, therefore, that some of these stories come not from headline-driven journalists but from journalists with a lack of statistical knowledge.

  3. Pingback: Comments for Final Blog! « emcg1

  4. I think the question that should be asked is “should statistics be used in journalism?”. Basic statistics such as percentages can provide compelling arguments; they can be used as evidence or to back up opinion. However, as we've seen through blogs and through studying statistics, the figures can be hugely misrepresented. Whether that is on purpose or through misunderstanding doesn't matter; what is important is that misrepresentation occurs regularly. Should this mean a blanket ban on the use of statistics, or should it mean tighter regulations on how they can be used? For example, if a journalist suggests that 60% of Britons are racist, this is a devastating figure with potentially huge after-effects. However, once the figures are analysed, it turns out the sample was only 10 people from a group with known extremist views. There are endless flaws in this, to name a few: a small sample, a non-generalisable sample, potential unreliability and so on. Now, if the journalist stated these flaws and/or gave the context surrounding the figure (for example, “a small sample of 10 people surveyed from the BLAH BLAH extremist group yielded potentially racist results”), this would reduce those implicated as well as giving full context.

    Now, should these statistics be part of journalism? I think yes: they provide quick, instant information while adding evidence to any argument. However, I also believe there should be tight regulations on the use of statistics to ensure an accurate picture is being portrayed. In reality this is unlikely; multimillion-dollar companies build multimillion-dollar advertising campaigns on figures that are misrepresented (until you read the ultra-small print).

  5. nat1990psych says:

    An extremely interesting blog (it made a nice change from the usual topics); however, I struggled to find a clear thesis in terms of what you were arguing. As hannahcollins91 states, perhaps a better way of discussing statistics in journalism would be to argue whether the media should use statistics or not.

    I personally believe that journalists should study and use statistics, but be encouraged to report those statistics in a way that does not misrepresent the findings. Ultimately, journalists should have a clearer understanding than they do now because, as your research example demonstrated, they may report the findings incorrectly. In a sense this could be an argument for journalists not using statistics at all; however, my view is that anything written in papers needs to be backed up by evidence (whether voice recordings in the case of MPs' expenses, pictures in the case of celebrities, or statistical values for reports on research). As a result, statistics are important in journalism to support the claims made, but in order to reduce the potential for misinterpretation of data (after all, the majority of journalists are not scientists), journalists should be taught statistics so that they can report findings correctly.

    Your research example was fantastic and illustrated the simple distinction between causation and correlation. Without studying statistics I would quite possibly not have been aware that just because one variable is linked to another does not mean causation can be claimed, since other variables may be involved. This again demonstrates the need for journalists to have a clearer understanding of statistics in order to prevent simple errors such as this.

    Another idea that would have benefited your blog is that using statistics makes it far more likely an article will be published, because there is significant evidence to back the claims up. This is good not only for the journalist, but also for the general public, who are better able to access important findings that could benefit their lives. For example, if they read that people diagnosed with lung cancer are 15 to 30 times more likely to be smokers (Eldridge, 2012), then they might stop smoking, sparing themselves a possible early death and gruelling treatment, and saving the NHS from spending large amounts of money on a particular patient.

    Overall, I believe that statistics should be used in journalism in order to back up claims made by the reporter and to make the findings of studies more accessible to the general public. However, journalists should be trained in the use of statistics to avoid misinterpretation and causing panic among the masses.

  6. Lies, damned lies and statistics.
    The persuasive power of numbers is incredible, particularly when used by the media. As someone who has studied statistics in depth for the past couple of years, I find the topic rather boring, but for some reason the public find statistics exciting. Tabloids and magazines tend to sensationalise percentages, ratios and any other form of statistical evidence that can support their story. The general public seem enthralled to learn that 95% of music is now downloaded, that children born in 2009 are going to live twenty years longer than those born in 1939, and that Britons took over 43 million holidays (http://www.ons.gov.uk/ons/rel/social-trends-rd/social-trends/social-trends-41/index.html).

    Reporting of social trends means that each individual can know what everyone else is up to without getting caught sifting through the bins. But can we trust the statistics that are reported by journalists? I don't see why not.

    After all, most of the statistics reported are taken from government or private bodies which use surveys with large numbers of participants, making the samples representative and avoiding the biases that can be introduced by leading questions in interviews. Their results may not be completely correct, but they do show common trends within society. The worry comes when those who do not necessarily understand the data misinterpret it; like the claim that frequently eating chocolate makes you lose weight (if only that were true). So readers need to be cautious when reading articles that report statistics. Always read the small print to judge how significant the effect actually is.

    *When reporting science goes wrong: http://junkscience.com/2012/03/26/study-more-frequently-eating-chocolate-appears-related-to-lower-bmi/

  7. psud24 says:

    I think journalism has a lot to answer for with regard to misleading statistics. For example, the Daily Mail has printed an astounding amount of things that supposedly cause cancer: http://hellokinsella.posterous.com/the-daily-mail-list-of-things-that-give-you-c. This list has the usual culprits like bacon and alcohol, but also ridiculous entries such as having children as a man. If men stopped having children, the human race would die out and the issue of cancer would be irrelevant, so what good is publishing the article? Granted, it is easy to say “oh, but it's the Daily Mail”, but that's beside the point. Popular newspapers are for the masses, and people who have studied statistics are not generally part of the masses. This leaves misleading information being presented to people who aren't as clued up on statistics as you or I. Therefore, I believe that for something to be presented as a “fact” in a newspaper, there should be a more stringent process over what constitutes a credible study. As in the example you raised above, one study with a small sample does not make a fact. For something to be published as fact, it should be backed by several credible studies from different sources. All too often you read in papers that X university states that… One university study should lead the way for research, but the public do not need to know about everything a university discovers or concludes until it is more credible.

  8. Pingback: The Big Finale – Enjoy :) | My Blog

  9. First off, I would like to congratulate you on an interesting and well-written blog.

    As we know from journalism itself, and from the many sitcoms based around it, bad news sells. Stories that cause scares and trouble readers mean more people will buy the paper and follow the story. The problem is whether this is caused by a journalist's lack of skill in reading statistics, or by them forcing the statistics to read the way they want. Either is possible, especially when a health story comes into play. It's also good that newspapers are now looking into training, or hiring, employees who understand how to read statistics, as that will pay off in the long run.

    As was mentioned before, you haven't clearly defined a thesis statement, but other than that I'd say it was a great blog. Perhaps a few more links to other stories would have helped to support the argument that bad journalism might be at play.

  10. Pingback: Final comments for the TA to glance over and make lots of red ticks and crosses. | itsafreudianslip

  11. uzumakiabby says:

    Statistics can be a powerful, persuasive tool, but they can of course also be tools of evil (perhaps a little dramatic, but let's go with it…). Everywhere you go nowadays, you cannot avoid the ambush of companies claiming to be able to reduce your wrinkles by 65% or make your teeth 80% whiter. There are hundreds of adverts all claiming that 9/10 people would recommend the product to a friend, but is this really a true statistic? Who did they ask? How many people were in the sample? These are all vital questions that need answering to get at the reality behind a statistic. Sure, the details might flash on the screen for 2.3 milliseconds, but that's not really being honest with the audience.
    Statistics are fantastic tools because they condense large amounts of data down into a small number, fraction or percentage. However, the people using them need to be honest for them to really be beneficial. Then again, statistics sell products – fact. So who can blame them, really?
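To put rough numbers on that worry: if the “9 out of 10” came from asking only ten people (a sample size invented here for illustration), the uncertainty is enormous. A quick sketch using a 95% Wilson score interval for the true proportion:

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96):
    """Approximate 95% Wilson score interval for a proportion."""
    p_hat = successes / n
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# "9 out of 10 recommend it" -- but with only 10 people asked,
# the true proportion could plausibly be anywhere in this range:
lo, hi = wilson_interval(9, 10)
print(f"plausible range: {lo:.0%} to {hi:.0%}")  # roughly 60% to 98%
```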

    When a statistician misuses data, they are committing what's known in the ‘bizz’ as a statistical fallacy. Campbell (1974) describes how this is becoming more and more of a problem. He says that sometimes it is simply an accident of standardisation, but recently, especially in the media, it has become a routine, almost constant tool used to deceive people into buying products.

    It’s a shame really, because it works, and the powers that be know it does. I have personally bought products simply because they were supported by impressive statistics in the adverts.

  12. Pingback: Final blog comments! :D « Not just ANOVA blog~

