
When you have the right mental model, this is an appropriate approach.

Not every mental model is correct, either locally to a domain or globally in all cases.

1 < 2 in almost all cases, except in exotic cases such as small modular spaces like Z_{0,1} (the integers mod 2), where the mod operation reshapes the space, or where comparator functions are defined far outside the typical. If you think there is a case for 2 < 1, and you aren't appealing to something exotic, you have the wrong approach.
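A minimal sketch of the kind of exotic case meant here, assuming "Z_{0,1}" refers to the integers mod 2: once values are reduced mod 2, the literal 2 collapses to 0, so the usual ordering of the representatives reverses.

```python
# Sketch: comparing representatives in Z_2 (integers mod 2).
# Assumption: "Z_{0,1}" means the ring Z/2Z, where every integer is
# reduced to its remainder mod 2 before comparison.

MOD = 2

def reduce(n: int) -> int:
    """Map an integer to its canonical representative in Z_2."""
    return n % MOD

# In the ordinary integers, 1 < 2 holds.
print(1 < 2)                      # True

# In Z_2, the representative of 2 is 0, so the comparison of
# representatives flips: reduce(2) < reduce(1).
print(reduce(2) < reduce(1))      # True, since 0 < 1
```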



Any mental model that assigns zero weight to the probability of being wrong is a wrong mental model.

That said, biases arising from endogeneity can have negative effects too. You can't conclude a parameter should have a different or zero sign just because of endogeneity; you have to fix your model and re-estimate the parameters.
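A minimal simulation sketch of that point, under a made-up data-generating process (all names and numbers are illustrative): an omitted confounder can flip the estimated sign of a coefficient, and the remedy is to repair the specification and re-estimate, not to assume the sign away.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical setup: x has a small positive true effect (+0.3) on y,
# but an omitted confounder c both raises x and lowers y.
c = rng.normal(size=n)
x = c + rng.normal(size=n)
y = 0.3 * x - 2.0 * c + rng.normal(size=n)

# Mis-specified model (y on x alone): the endogeneity bias flips the sign.
X_bad = np.column_stack([np.ones(n), x])
print(np.linalg.lstsq(X_bad, y, rcond=None)[0][1])   # around -0.7

# Fixed model (include the confounder) and re-estimate: sign recovered.
X_good = np.column_stack([np.ones(n), x, c])
print(np.linalg.lstsq(X_good, y, rcond=None)[0][1])  # around +0.3
```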


> Any mental model that assigns zero weight to the probability of being wrong is a wrong mental model.

This is true when there is uncertainty. It also doesn't connect to "I only hire people who think like I do", which is the context of my response, so I don't follow your point.

Agreed on your other point that you need to instrument endogeneity. Another requirement is plausibility: hookworm presence in Greenland should be uncorrelated with sunspots.
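A minimal sketch of instrumenting, again with an invented data-generating process: when the confounder is unobserved, a hand-rolled two-stage least squares using a valid instrument recovers the coefficient that naive OLS gets wrong.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical setup: z is a valid instrument (moves x, affects y only
# through x), while an unobserved shock makes x endogenous.
z = rng.normal(size=n)
shock = rng.normal(size=n)
x = 0.8 * z + shock + rng.normal(size=n)
y = 2.0 * x + 1.5 * shock + rng.normal(size=n)   # true coefficient: 2.0

def slope(X, y):
    """Last least-squares coefficient after adding an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0][-1]

print("OLS: ", slope(x, y))        # biased upward, roughly 2.6

# Two-stage least squares: regress x on z, then y on the fitted values.
Z = np.column_stack([np.ones(n), z])
x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
print("2SLS:", slope(x_hat, y))    # close to the true 2.0
```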



