Hacker News

There is no one hovering over scientists, ready to stick a hot poker in them when they make a mistake or get careless. I was in academia, and my impression is that there is a reluctance to double- and triple-check results as long as they match your instincts, whether that comes from time pressure, laziness, bias, or just being human.


At least in my own mental model of publishing a paper (I've published only a few), I'd want my coauthors to stick hot pokers in me if I made a mistake or got careless. But then, my entire thesis was driven by a reproducible Makefile that downloaded the latest results from a supercomputer, re-ran the whole analysis, and generated the necessary LaTeX (at least partly to avoid making trivial mistakes). It was clear that everything I was doing was just getting in the way of publishing high-prestige papers.
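A minimal sketch of that kind of pipeline, where the target names, the remote path, and the analysis script are all hypothetical placeholders rather than the commenter's actual setup:

```makefile
# Hypothetical reproducible-paper pipeline: fetch the latest results,
# re-run the analysis, regenerate the LaTeX tables, rebuild the PDF.

RESULTS = data/results.csv
TABLES  = build/tables.tex
PAPER   = build/paper.pdf

all: $(PAPER)

# Pull the latest raw results (placeholder host and path).
$(RESULTS):
	mkdir -p data
	scp supercomputer:/scratch/project/results.csv $@

# Re-run the full analysis whenever the data or the script changes.
$(TABLES): $(RESULTS) analyze.py
	mkdir -p build
	python analyze.py $(RESULTS) > $@

# Rebuild the paper whenever the generated LaTeX changes.
$(PAPER): paper.tex $(TABLES)
	latexmk -pdf -output-directory=build paper.tex

clean:
	rm -rf build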


All too easy to understand your situation. NIH is finally, if slowly, waking up and is imposing a more "onerous" (read: essential and correct) Data Management and Sharing (DMS) plan requirement. Every grant applicant now submits one following these guidelines:

https://grants.nih.gov/grants/guide/notice-files/NOT-OD-24-1...

Unfortunately, not all NIH institutes understand how to evaluate and moderate this key new policy. Oddly enough, peer reviewers do NOT have access to DMS plans as of this year.


Is this a process whereby the researcher is forced to submit the hypothesis (null, etc.) of the research ahead of the study and its findings?


I think you are referring to clinical trial registration. That is a different idea and process (a QA-like step).

The NIH DMS mandates are about the data generated by an award.



