Biased humans program systems, biased humans collect data, biased humans feed systems data, biased humans purchase systems that affirm their biases, and biased humans point at the black boxes as confirmation of their biases. How can a black box be wrong? After all, it is a computer, and it is not influenced by bias like a human.
Would you rather have a biased human attempt to make data-driven, statistically based decisions, or a biased human make decisions based on whatever they feel like? Either way, there will be bias in the system. You don't have to be faster than the bear; you just have to be faster than your friend.