We hear it all the time. It's an ancient story. The history of science is full of these stories.
Misguided, driven, ambitious people are always looking for shortcuts, and they will find them. It's like dealing with mosquitoes, cockroaches, weeds, software bugs, and cancer. It never ends.
I think the point of GP's story is that this isn't just one or two bad apples here and there; it's endemic in that domain - and most likely in others too (I'm leaning toward believing that; it's not the first story like this I've read in recent years).
Being an endemic problem means you have to switch your assumptions; when reading a random scientific paper, you're no longer thinking, "this is probably right, but I must be wary of mistakes" - you're thinking, "this is most likely utter bullshit, but maybe there's some salvageable insight in it".
I think once you've seen a few papers in high-tier journals that turn out to be bullshit when you start to dig a bit deeper, there is no other choice than to adopt this harsh stance toward random scientific papers. Especially if you want to do work that expands on findings from other papers that look roughly sound, "trust but verify" seems to be the way to go.
I've only recently dipped my toes into academic life in a lab, but it very much seems that PIs generally know which labs are the bad apples. E.g., when discussing whether some data was good enough to be publishable, the PI's reaction was something along the lines of, "If we were FAMOUS_LAB_NAME it would be, but we want to do it in a way that holds up." So it seems there are at least some barriers to how much incompetence can hurt the whole field.
I'm also surprised there is no mention of the PI in GP's story. As it's a paper published by the lab, it's not just on the grad student "to do the right thing" - it's even more on the more senior scientist, whose reputation is also at stake.
> I think once you've seen a few papers in high-tier journals that turn out to be bullshit when you start to dig a bit deeper, there is no other choice than to adopt this harsh stance toward random scientific papers. Especially if you want to do work that expands on findings from other papers that look roughly sound, "trust but verify" seems to be the way to go.
Yeah, but I meant that in the general case, you no longer "trust but verify" - you "assume bullshit and hope there's a nugget of truth in the paper".
This has interesting implications for consuming science for non-academic use, too. I've been accused of being "anti-science" when I've said this before, but I no longer trust arguments backed by citations in soft-ish fields like social science, dietetics, or medicine. Even if the person posting a claim does a good job of selecting citations (so they're not all saying something tangentially related, or much more specific, or "in mice!"), if the claim is counterintuitive and the papers seem complex enough, I mentally code this as very weak evidence - i.e., most likely bullshit, but since there were some papers claiming it, if it comes up again, many times in different contexts, I may be willing to entertain the claim being true.
And stories like this make me extend this principle to biology and chemistry in general as well. I've burned myself enough times, getting excited about some result, only to later learn it was bunk.
The same pattern of course repeats outside academia, but more overtly - you can hardly trust any commercial communication either. At this point, I'm wondering how we're even managing to keep a society running. It's very hard to make progress and contribute if you have to assume everyone is either bullshitting or repeating bullshit they've heard elsewhere.
Funny story: the PI noticed an error in one of my papers and I (happily) issued a very minor correction. Also, in one of the threads I talked about how he did retract several years' worth of work the intern had done on a different project when she joined later. So he was alright. Plus, as a junior (2nd-year grad student), you really don't want to tattle on the NIH grad student of the year. Who do you think wrote the recommendation?
It's endemic in biology, and it's endemic in chemistry (I had a foot in both fields). The sentiment in your last sentence is exactly what I feel whenever I read a paper - you hit the nail on the head.
The crazy thing is that the honest scientists are working at middling universities. It gets worse the higher up you go. I had the opportunity to work at an upper-midrange research university, [time-] sandwiched between two very high-profile institutes. The institutes were way more corrupt. Like inviting the lab and the DARPA PM to hors d'oeuvres and cocktails at the institute leader's private mansion, that type of stuff. (It turned out that DARPA PM also had some weird scientific-overinterpretation skeletons / PI-railroading-the-whistleblower stuff in her closet, and for a stint was the CTO of a microsample blood diagnostics company. I can't make this shit up; I guess after Theranos it got too weird. She's now the CEO of another biotech startup - how TF do people like this get VC money, while I can't get people to invest in some buddies with a company in a growth industry and had to make the entire first investment myself?)
Of course, working at an upper-midrange university sucks for other reasons. Especially at state universities, the red tape is astounding, and the support staff are truly incompetent. Orders would fail to get placed, or would arrive and then disappear (not even theft, just incompetence), all the time.
While the "host" (people who pay, often with minimal decision power over their resources) turns a blind eye, "parasites" (cheaters who profit disproportionately) proliferate. Is that really so surprising?
When somebody else foots the bill, it's feast time!
To be clear, I'm with you. Also a PhD-turned-industry, for much the same reasons. But I realize what you describe is a completely rational strategy. The options always come down to:
1) Try not to be a host – if you have the wherewithal
2) Try to be a parasite – if you have the stomach
3) Suck it up & stay salty – otherwise. You can call it a balance, equilibrium, natural order of things – whatever helps you sleep at night.
Take your pick and then choose the appropriate means. Romantic resolutions and wishful thinking - kinda like the Atlas Shrugged solution to option 1) - rarely work.
There is a reason there's a huge replication crisis in academia, and it's exactly what you say above. When folks in industry need to develop a product based on a published paper, more often than not it's bullshit.
Yup, I was taught this as part of graduate school.
Nobody ever said it was fraud; they said things like "they wouldn't share the data" and "I couldn't replicate it."
In general, the incentives for shoddy science (get Nature papers or find a new career) tend to reward bad behaviour, and I just wasn't willing to find something unexpected and pretend it had been my hypothesis all along (it's almost impossible to publish a social science paper where you disconfirm your major hypothesis).