Reports based on scientific research or on surveys carried out by a whole range of organisations are a regular feature of the daily news diet. But how truly informative are they?
Mackerel are a good source of Omega 3 - but just how healthy?
These reports can cover important issues ranging from health and crime to the environment and money matters. But is the BBC always sufficiently rigorous in its analysis of this information? Does it always point up apparent contradictions with similar research?
Some viewers and listeners think not. Take the recent item on whether Omega 3 oils are good for us, broadcast on March 24th. The BBC reported research findings that the health benefits of eating oily fish weren't as clear as previously thought.
Debbie Wade wrote: "There is no new research. The article simply reviews older research and wildly skews the actual results of the previous studies."
And Sophie Galvani said: "This type of information, which magnifies one scientific review amongst many, acts to confuse consumers and sensationalise what is a serious subject - heart disease."
NewsWatch asked the BBC's home affairs editor, Mark Easton, for his view. He agreed that the report wasn't based on new research, but on a review of existing studies.
"You could argue it was something more important - a systematic review, where scientists have got together, gone back and looked at all the recent evidence about Omega 3, subjected it to the best analysis they can and then they've come up with a view as to whether Omega 3 is quite as good or bad as people had thought. And in this case they thought on balance the science isn't perhaps as strong as we'd originally thought.
"I think the responsibility is on us as correspondents to ensure that we reflect the strength of the science as best we can and - this is really important - the real risks we're talking about. I think it's about our understanding of science. Science moves forward through disagreement, through argument, discussion and debate and looking at evidence in different ways.
"We need to be sure when we report science to put it in the context that this is just one piece in a bigger jigsaw. It isn't some kind of ultimate truth. It's really important that we reflect the place this bit of science has in the wider context of what we know."
There are in fact guidelines for BBC journalists on how to report risk, drawn up in conjunction with King's College London. The guidelines suggest that journalists should ask the following questions to help frame their reports:
What exactly is the risk, how big is it, who's affected?
How has the risk been measured, how big is the sample, who funded the research, how reputable is the source?
If we are reporting a relative risk, have we made clear what the baseline risk is? (e.g. a 100% increase in a problem that affects one person in 2,000 will still affect only one in 1,000)
Have we asked: "How safe is this?" rather than "Is this safe?"
If a scientist or a victim takes a view counter to majority scientific opinion, is that made clear in the report, and reflected in the casting of, and the questions asked in, any discussion?
Have we told the audience how they can get more information?
Can we find a comparison to make the risk easier to understand? (e.g. it's as risky as drinking a pint of beer)
Is the scale of reporting in proportion to the extent of the risk?
Have we given the audience information to put the risk in context? (e.g. if during a pill scare women stop taking the pill, they face higher risks from abortion or pregnancy)
Can we use a story about a specific risk as a springboard to discuss other related risks? (e.g. train safety v. road safety)
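The baseline-risk question above is the one readers most often trip over, so it may help to see the arithmetic written out. The short sketch below (an illustration of the guideline's own example, not part of the BBC guidance itself) shows why a "100% increase" in a rare problem can still leave the absolute risk very small:

```python
# The guideline's example: a 100% increase in a problem
# that affects one person in 2,000.

baseline_risk = 1 / 2000       # absolute risk before the increase
relative_increase = 1.0        # a 100% increase means the risk doubles

new_risk = baseline_risk * (1 + relative_increase)

print(f"Baseline risk: 1 in {round(1 / baseline_risk)}")
print(f"After a 100% increase: 1 in {round(1 / new_risk)}")
# The doubled risk still affects only one person in 1,000.
```

The point the guideline makes is that the headline figure ("risk doubles!") is meaningless without the baseline: doubling a one-in-2,000 risk and doubling a one-in-five risk are very different stories.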