My agency has contracted with an independent researcher. We’ve given him a lot of money to synthesize different datasets and provide recommendations based on the findings. This study has high visibility (as far as government work goes), and the decision-making that results from it will be heavily scrutinized…probably for years to come. I function as a kind of agency “gate-keeper”: it’s my job to let management know which decisions are scientifically defensible and which aren’t.
This researcher submitted a draft report of his findings. I had a number of comments–none of them minor “nitpicky” stuff. (There were plenty of others who concentrated on spelling and grammar.) I reserved my comments for the big stuff–the stuff that would leave my agency vulnerable if it weren’t addressed. One such comment concerned the lack of transparency about the statistical analysis. Not a single statistical test was identified, even though on multiple occasions he presented p-values and described findings as “statistically significant.”
Seems to me that not mentioning the statistical test(s) is the kind of mistake a first-year grad student might make. Not a prolific researcher. But I gave him a break. It could have just been an innocent oversight.
We recently received the finalized version. And guess what? A full description of the statistical analyses performed is still missing. Almost thirty pages of text and figures, and he never once mentions how he calculated all those significant p-values.
This makes me livid. I already don’t feel too warm and fuzzy about the details of the approach he HAS shared with us. The fact that he has not been 100% forthcoming about something that is so basic is throwing up a red flag.
Doper scientists: how would you feel about this, if you were in my shoes? Am I making a mountain out of a molehill? It is my job to be borderline paranoid about anything that seems fishy or sloppy regarding data analysis, so maybe I’m attributing malice to an innocent mistake? I agree this is possible, but I suspect he might be worried that I will criticize him over how he analyzed the data. I criticized how he measured central tendency and pleaded with him to re-do some of his figures, which required some extra work on his end. But fear of criticism is not a good excuse for leaving out information, especially information that is so important. If you don’t want to have to defend your work, you have no business calling yourself a scientist! (OK, let me calm down now. :)).
I do want to tell my boss–who is not a technical expert–about this, but I don’t want to “escalate” something that is small potatoes. If you were to receive a manuscript without any information about the statistical analysis, would that cause you to just shrug your shoulders, or would it throw up a red flag?
If you’re writing a three-page letter, you can leave out details like that. But in a 30-page paper? Absolutely I would expect a full description of the analysis. There are a number of possible explanations of why it might not be in there, but none of them look particularly good for this researcher.
If this is going to receive any public scrutiny (and you imply that it will) anyone with a tiny modicum of statistical savvy will wonder why the methodology’s not there.
Even if the readers of the report don’t have that savvy, it’s not unheard-of for a reporter on the science beat to have a go-to person on statistical/mathematical methods, whom they’ll ask to explain the study, and who could defensibly say that it’s all made up.
If this study has high visibility, you need to be able to defend its conclusions from those who will say it was made up or that the conclusions were biased in favor of specific policies.
IMO, it’s a huge deal and completely unacceptable. I’ve said as much on papers that I’ve helped review with my advisor. Including, in particular, one paper that had some really wonderful data that I wanted to see published since it also bolsters my own research, but without at least a Stats 101 level of analysis the data is meaningless.
I guess I need to put on my big girl panties and tell my boss. I’ll say something like, “We should not let this report go out to peer review without a description of the statistical analysis. I’ve requested this information, but he has not provided it.”
I will leave out any implication of bamboozlement and hoodwinkery. But I’ll be thinking it!
Damn. I’d love to be able to tell an Important Person Behind a Desk “I think we may be dealing with someone engaged in bamboozlement, and … while I think it’s too early to be sure … some possible hoodwinkery.”
“And why are you wearing big girl panties?”
“NEVER YOU MIND!”
I’m not a researcher but as an administrator/accounting type, if I were paying “a lot of money”, I would count on someone like you to alert me to the person not meeting requirements for the product you have contracted for. Absolutely he should be doing whatever (extra) work is required to satisfy you, his customer. Go monstro go, and please report back!
My experience is that most people these days who quote statistics have very little idea how those numbers are obtained. This is a consequence of the ease with which a computer will spit out figures that the person who entered the data does not understand.
What you need are subject-matter-expert reviewers. When I serve as a statistical reviewer for peer-reviewed journals, a problem like this is fatal.
A key aspect of matters like this is replicability: there should be enough information to allow someone else to reproduce the results. Beyond that, the methods should be described in enough detail to confirm or challenge their credibility. And when the deliverable includes data sets, don’t neglect QA/QC.
And when the authors are non-responsive to review, it’s time to drop some hammers.
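To make that replicability point concrete, here’s a minimal sketch in Python. The numbers are invented, and the permutation test is just a stand-in for whatever the contractor actually ran–but notice that the test, the iteration count, and the random seed are all stated, so anyone can re-run it and get the same p-value. That’s the bare minimum of transparency the report is missing.

```python
import random
import statistics

# Hypothetical two-group measurements (invented for illustration).
group_a = [2.1, 2.4, 1.9, 2.8, 2.2, 2.6, 2.0, 2.5]
group_b = [2.9, 3.1, 2.7, 3.4, 2.8, 3.0, 3.2, 2.6]

def permutation_p_value(a, b, n_iter=10_000, seed=0):
    """Two-sided permutation test on the difference of group means.

    Because the test statistic, the number of permutations, and the
    seed are all written down, the p-value is exactly reproducible
    by anyone with the data and this description.
    """
    rng = random.Random(seed)
    pooled = a + b
    observed = abs(statistics.mean(b) - statistics.mean(a))
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        perm_a = pooled[:len(a)]
        perm_b = pooled[len(a):]
        if abs(statistics.mean(perm_b) - statistics.mean(perm_a)) >= observed:
            hits += 1
    return hits / n_iter

p = permutation_p_value(group_a, group_b)
print(f"p = {p:.4f}")
```

A report that says only “p = 0.0012, statistically significant” tells the reader nothing; a report that also names the test and its parameters, as above, can be checked by anyone.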
I approve of this course of action. Just spell out the facts and explain why this omission could seriously hurt the agency’s scientific credibility. The ball will then be in your boss’s court, and you can take comfort in knowing that you’ve done your job by advising him appropriately.
Not detailing one’s methods, despite being asked to do so, reeks of shenanigans. Hopefully you don’t have to spell out the obvious to your boss, but I don’t think it’s wrong to point out to him how the scientific community will likely perceive this omission.
Right. Even if there are no shenanigans, even if the contractor has done a perfect job, your agency needs to be able to show publicly how their conclusions were reached.
It may be worth hammering that point just a tiny bit with your boss, so it doesn’t become a question of “but <contractor> has a great reputation; they wouldn’t do any monkey business.”
I’m on the other side of the table - my job entails a lot of what you are paying this guy to do. Senior management comes to me with “Why is this happening/What is driving X” type questions. I find/validate the appropriate data, run statistical tests and present conclusions and recommendations. You had better bet that every statistic that I present is backed up with information on data sources, time periods, assumptions/exclusions, and the test(s) run.
Stats without backup are worse than useless, they are downright dangerous.
I am not a scientist but I am managing editor of some scientific journals. We wouldn’t even send a paper like the one you described out for peer review. Major red flags.
Update: I sent an email to my boss first thing this morning telling him that we can’t accept the report the way it is right now. My boss immediately emailed the contractor and let him know. My boss’s boss was cc’ed to let him know we ain’t playin’.
I’m nothing like a scientist, but this would have all my alarm bells going nuts. Surely the whole point of a report like this is to say ‘Here are the conclusions I reached, and here’s how I reached them’?
Monstro, go back and read the statement of work or whatever document you use to pass requirements to contractors. If the requirement for ‘demonstration of statistical tests used’ or some other language is not in there, you’ll have to amend the contract to have the guy/gal put it in.