
What you describe in your second paragraph is a general phenomenon called the fundamental attribution error:

https://en.wikipedia.org/wiki/Fundamental_attribution_error

It's definitely helpful to keep in mind.



Interesting.

I had a boss about a decade ago who, during a really tense situation at my last job involving a false report that an application of mine suffered from a serious security vulnerability[0], said to me: "Where there are gaps of understanding, people jump to the worst possible conclusion." It was demonstrated so well during the follow-up meeting (where my application was exonerated and the friendship remained) that I took it to heart and made it a sort of life mission. When something goes wrong, people assume the absolute worst, but, as the fundamental attribution error points out, we judge others by a different set of rules than we judge ourselves.

If we assume that others are generally operating without malice, as we assume of ourselves, a whole lot of conflict is avoided.

[0] We had an app on a hardened box and had asked security to audit it. The person who audited it didn't read that message and logged into the box for an unrelated reason. He was shocked that the box was able to connect to all of our infrastructure (for audit purposes) and that he was still able to log in (we had given him explicit permission to do so). He changed the configuration (without any notification), causing the application to stop and its reporting to fail, then tried to get it all shut down. We went into a meeting angry, under the impression that security was trying to kill this app because they preferred their own (which didn't meet our needs), and he went in assuming we were trying to be devious because our hardened configuration made it impossible for their auditing tool to audit the app. We were both wrong.



