NSPCC’s stats on child addiction to porn don’t stand up to scrutiny

Oh dear. The NSPCC seems to have lost the moral high ground to – of all people – Vice magazine, a trendy periodical that is mainly about trainers and unconventional sex.

The children’s charity recently claimed that a tenth of the UK’s 12- to 13-year-olds were addicted to porn. It was an eye-catching story that was picked up by many media channels. The Conservatives immediately pounced on it as an election issue, pledging that they would block internet porn sites that did not have controls to prevent people under 18 from looking at them.

Vice was alone, it seems, in wondering what those numbers actually meant and where they came from. “Such inflammatory findings when published by a respected national charity would usually be accompanied by a full report of the study,” Vice says. “But not in this case. All the NSPCC would offer was an extended press release with some more quotes from concerned parties.”

The numbers came from a survey by OnePoll, which calls itself a creative market research group. “Generate content and news angles… secure exposure for your brand,” reads its blurb. “Our PR survey team can help draft questions, find news angles, design infographics, write and distribute your story.”

If you’ve read Dr Ben Goldacre’s book Bad Science (and if you haven’t, you should), you’ll recognise this as precisely the kind of survey that yields unreliable results. OnePoll encourages people to sign up and get paid for answering questions that OnePoll itself says are mainly “fun questions about celebs and your love life”.

Vice realised the sensitivity of the topic: “When the London School of Economics carried out research into children’s internet usage, a long list of safeguards were put in place, knowing that children would be asked about sensitive topics such as porn. These included pilot tests to gauge children’s state of mind, face-to-face interviews, a self-completion section for sensitive questions to avoid being heard by parents, family members or the interviewer, detailed surveys about the children themselves and measures of mediating factors such as psychological vulnerability. You can read the full 60-page report here.

“In contrast, the OnePoll survey included just 11 multiple-choice questions, which could be filled in online. Children were recruited via their parents, who were already signed up to OnePoll.

“Professor Clarissa Smith is Professor of Sexual Cultures at the University of Sunderland and a veteran researcher in the field of young people and sexuality. ‘Why aren’t they being entirely transparent with the research?’ she asked me. ‘If this was really robust, they would be sending the report to everybody; they wouldn’t be hiding it.'”

The Online Journalism Blog took the data apart, writing that “there are so many methodological issues here I can’t list them all”. It’s worth a read just for the statistics, plus the obvious flaws that (a) porn addiction is contested even among academics and not well defined and (b) it’s not clear whether the answers were in fact given by children.

A comment under that article explains: “I am one of the OnePoll panel. ALL surveys for children are fundamentally flawed. Here’s why: you log on and are presented with a list of open surveys. There is no way of knowing which are for kids and which are not. If you click on one and it is for kids it says something like ‘this is for children aged X to Y; if they are available please hand over to them now’. The problem is there is no option to say ‘they aren’t here at the moment, keep the survey open’. So you either select no and the survey disappears from the list and cannot be selected when the child is available, or you think ‘sod it, I’m losing 20p so I’ll select yes and try and guess what my kid might say’. I work in an office of five and we all do OnePoll surveys online whilst we are tied up on long phone calls. We all fill out the children’s surveys. Three of us presumably try and second guess what our kids might say, but two of us haven’t even got kids!! I bet OnePoll never analyse when the surveys are completed… I bet most are filled out when kids would be at school!!”

Oxfam got caught out in a similar way last year. It published figures to coincide with the World Economic Forum in Davos, claiming that the richest 1 per cent of the world’s population owned almost one-half of the world’s wealth. The financial journalist Ezra Klein, formerly of the Washington Post, pointed out that “it doesn’t mean quite what it looks like it means. To see the problem, here’s another version of the same number: the combined wealth of my two nephews is already more than the bottom 30 per cent of the world combined. And they don’t have jobs, or inheritances. They just have a piggy bank and no debt.”

He explained that what Oxfam was using “as a measure of wealth [is] a measure of net worth: assets minus debts. As such, what it’s picking up isn’t just massive inequality in wealth, but also massive inequality in the ability to access credit…. For the purposes of Oxfam’s calculation, a farmer in China’s rural Sichuan province with no debt but also very little money is wealthier than an American who just graduated from medical school with substantial debt but also a hefty, six-figure income. By any sensible standard, the medical student is richer, but because her student debt still outweighs her financial assets, the net worth measure counts her as poorer than the Chinese peasant… [Oxfam’s] measure would have counted Bill and Hillary Clinton, right after they left the White House, as among the poorest people in the world. They were, after all, millions of dollars in debt. But as Matt Yglesias wrote, you have to be pretty damn rich to get that poor.”
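Klein’s point can be made concrete with a toy calculation (the figures and names below are invented for illustration, not drawn from Oxfam’s data): if you rank people by net worth – assets minus debts – a heavily indebted medical graduate, or even an ex-president millions in debt, comes out “poorer” than a subsistence farmer who owns almost nothing but owes nothing.

```python
# Toy illustration of net worth (assets minus debts) as a wealth measure.
# All figures are invented for illustration only.
people = {
    "rural farmer (no debt, few assets)": {"assets": 500, "debts": 0},
    "medical graduate (big loans, big income ahead)": {"assets": 10_000, "debts": 190_000},
    "ex-president (millions in debt)": {"assets": 1_000_000, "debts": 12_000_000},
}

def net_worth(name):
    return people[name]["assets"] - people[name]["debts"]

# Sort poorest first by net worth.
ranked = sorted(people, key=net_worth)
for name in ranked:
    print(f"{name}: net worth {net_worth(name):,}")
```

By this measure the farmer is the “richest” of the three – which is exactly why negative net worth makes “the bottom X per cent of the world” a slippery category.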

Even more remarkably, Oxfam itself said that its numbers used “a totally imperfect measure”. The problems with Oxfam’s numbers were also discussed by the prize-winning financial journalist Felix Salmon.

Does it matter that the people who know about data have noticed the problems here? Does it matter if charities gain a reputation for generating media coverage by pumping out stats that aren’t really true? Perhaps. This is, remember, a world in which donations haven’t risen above inflation in about three decades, and in which 35 per cent of people (according to a survey!) have low trust in charities and fewer than a quarter trust them a lot.

The issues that we serve are startling enough that surely we can attract attention to them at the same time as getting our facts straight.

This was first published in Third Sector in April 2015 (apologies for posting out of sequence).
