Biased humans program systems, biased humans collect data, biased humans feed systems data, biased humans purchase systems that affirm their biases, biased humans point at the black boxes as a confirmation for their bias. How can a black box be wrong? After all, it is a computer and it is not influenced by bias like a human.


Would you rather have a biased human attempt to make data-driven, statistically based decisions, or a biased human make decisions based on whatever they feel like? Either way there will be bias in the system. You don't have to be faster than the bear, you just have to be faster than your friend.


