Quote:
Originally Posted by sebastian_dangerfield
Data crunchers can't think fast enough (to borrow the article's term).
You're misusing Kahneman's term. "Fast" thinking is heuristic and intuitive. I'm not exactly sure how you can crunch data with that kind of thinking.
Quote:
If it's found that an algorithm gets better results using prohibited bases for credit denial/renting/hiring, there will be pressure to nevertheless use it.
That's where you lose me. It won't be better. It's not true that, for example, Thurgreed is a worse credit risk than Ty's hypothetical deadbeat cousin, and a model that assumes so will perform worse and cost the organization money.
It may be true that someone from the wrong neighborhood, the wrong school, the wrong type of job, or with the wrong history with the legal system is a worse credit risk. All of those things may correlate closely with race and lead to results that we think are unfair, but that's a different issue.
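The proxy point can be seen in a toy simulation. Everything here is invented for illustration: a made-up "neighborhood score" is shifted for a hypothetical protected group, and a credit rule that never sees group membership still approves the two groups at very different rates, purely through the correlated proxy.

```python
import random

random.seed(0)

# Hypothetical data: 30% of applicants belong to a protected group, and a
# proxy feature (e.g. a neighborhood score) is distributed lower for that
# group. All numbers are made up for illustration.
N = 10_000
rows = []
for _ in range(N):
    group = random.random() < 0.3  # protected-group membership
    proxy = random.gauss(-1.0 if group else 0.5, 1.0)
    rows.append((group, proxy))

# A credit rule that uses ONLY the proxy, never the group label.
approved = [(g, p > 0.0) for g, p in rows]

rate_in = sum(a for g, a in approved if g) / sum(1 for g, _ in approved if g)
rate_out = sum(a for g, a in approved if not g) / sum(1 for g, _ in approved if not g)
print(f"approval rate, protected group: {rate_in:.2f}")
print(f"approval rate, everyone else:   {rate_out:.2f}")
```

The rule is "blind" to the group label, yet the approval rates diverge sharply, which is exactly the "different issue" above: a facially neutral variable carrying a correlated disparity.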
Anyway, you don't need big data to do what you're talking about. You're now saying that they will ignore the data and just use race. I'm skeptical that they will bother to do the analysis if that's where they intend to come out.