01-05-2015, 04:45 PM
|
#1021
|
[intentionally omitted]
Join Date: Mar 2003
Location: NYC
Posts: 18,597
|
Re: For Sebby
Quote:
Originally Posted by sebastian_dangerfield
Do I have to say I agree with the article? Seems a waste of effort. How couldn't I? I read it twice by the way, having also scanned it in the Times yesterday.
I decided to offer a thought I had after reading it. This seemed the more interesting thing to do.
|
Here's the problem with your post: "The shift toward hiring/renting/giving credit based on surface information is only increasing."
That sentence basically sums up the issue with your previous posts (remember, the "jumping off point" in your last post was your retort to why redlining wasn't really discrimination) that I was trying to address by citing this article.
I don't want you to say, "I agree with the article," because what's in the article just *is.*
TM
|
|
|
01-05-2015, 04:51 PM
|
#1022
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 17,172
|
Re: It was HAL 9000!
Quote:
Originally Posted by sebastian_dangerfield
Data crunchers can't think fast enough (to borrow the article's term).
|
You're misusing Kahneman's term. "Fast" thinking is heuristic and intuitive. I'm not exactly sure how you can crunch data with that kind of thinking.
Quote:
If it's found that an algorithm gets better results using prohibited bases for credit denial/renting/hiring, there will be pressure to nevertheless use it.
|
That's where you lose me. It won't be better. It's not true that, for example, Thurgreed is a worse credit risk than Ty's hypothetical deadbeat cousin, and a model that assumes so will perform worse and cost the organization money.
It may be true that someone from the wrong neighborhood, the wrong school, the wrong type of job or with the wrong history with the legal system is a worse credit risk, and all of those things may correlate closely with race and lead to results that we think are unfair, but that's a different issue.
Anyway, you don't need big data to do what you're talking about. You're now saying that they will ignore the data and just use race. I'm skeptical that they will bother to do the analysis if that's where they intend to come out.
Last edited by Adder; 01-05-2015 at 04:57 PM..
|
|
|
01-05-2015, 04:56 PM
|
#1023
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 17,172
|
Re: It was HAL 9000!
Quote:
Originally Posted by taxwonk
What criteria?
|
Yes, wonk, the data will bear the stamp of our unequal history and thus reliance on it will not bring an even playing field.
As I've said repeatedly, that's a different issue from Sebby's assertion that people will use it with the specific intent to discriminate.
They don't need data for that.
|
|
|
01-05-2015, 05:08 PM
|
#1024
|
Moderator
Join Date: Mar 2003
Location: Monty Capuletti's gazebo
Posts: 26,228
|
Re: It was HAL 9000!
Quote:
You're misusing Kahneman's term. "Fast" thinking is heuristic and intuitive. I'm not exactly sure how you can crunch data with that kind of thinking.
|
The holy grail of algorithms is one that is heuristic and intuitive, something that can replace or at least approximate human thinking, with the added benefit of being able to do math and make decisions at 100X the speed. Again, cost minimization (no salary or benefits to pay) and efficiency maximization (it does what humans do, but better).
Quote:
Anyway, you don't need big data to do what you're talking about.
|
I didn't say you did. But you can engage in a whole lot more intentional and unintentional discrimination with big data than without it.
Quote:
You're now saying that they will ignore the data and just use race.
|
No. I'm saying some will do it if they find a correlation that enhances their ability to predict future events and minimize risk. How many is some? I don't know. And some others will merely include prohibited criteria among non-prohibited criteria if they find that such blending enhances their predictive capacities.
Quote:
I'm skeptical that they will bother to do the analysis if that's where they intend to come out.
|
They don't intend to come out anywhere. They merely seek to avoid risk and predict the future for profit. And the actors we're talking about, at least in insurance and finance, will not lose sleep over using prohibited criteria. Particularly where they can blame it on a learning algorithm.
__________________
All is for the best in the best of all possible worlds.
|
|
|
01-05-2015, 05:12 PM
|
#1025
|
Moderator
Join Date: Mar 2003
Location: Podunkville
Posts: 6,034
|
This is not my beautiful house!
Quote:
Originally Posted by Adder
They want to make money. They will end up using criteria that correlate with race but that they can show actually have meaning for the decision they are making.
|
Pardon my skepticism, but I have long heard arguments about how the Invisible Hand of The Market prevents discrimination because companies that "irrationally" discriminate (i.e., on the basis of race, sex, religion, etc.) would be forced out of business by losing out to companies that "rationally" discriminate (i.e., on the basis of whatever the market is - skills, credit-risk, speed of fastball, etc.).
With the notable exception of professional sports, I don't think that this has occurred. So the idea that the lenders won't discriminate on race now that they have all of these wonderful (albeit imperfect) tools and databases because they only care about making money is Not Credible.
|
|
|
01-05-2015, 05:13 PM
|
#1026
|
Moderator
Join Date: Mar 2003
Location: Monty Capuletti's gazebo
Posts: 26,228
|
Re: It was HAL 9000!
Quote:
As I've said repeatedly, that's a different issue from Sebby's assertion that people will use with the specific intent to discriminate.
|
No. I am not saying they all intend to discriminate. The overwhelming majority will not. They will intend nothing more than to make money. The discrimination will be largely unintentional. Think of it as algorithmic laziness. Company X finds that, rather than spending time assessing more complex traditional data on hiring, it can shortcut by using the number of certain letters in a first name (which just happens to indicate minority background). If the Company finds this is effective and saves time, it will be pressured to use this discriminatory technological advantage.
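A minimal sketch of that proxy effect, using only the standard library and invented numbers (the group share, the feature means, and the threshold are all hypothetical): a decision rule that never sees group membership can still skew heavily against a group when it keys on a feature that merely correlates with membership.

```python
import random

random.seed(0)

# Simulate a population. "group" is a protected attribute the decision
# rule never sees; "proxy" is a nominally neutral feature (think: a
# name-derived count) that happens to correlate with group membership.
population = []
for _ in range(10_000):
    group = random.random() < 0.3                      # 30% of the population
    proxy = random.gauss(5.0 if group else 3.0, 1.0)   # correlated feature
    population.append((group, proxy))

# A lazy decision rule that screens purely on the proxy.
flagged = [group for group, proxy in population if proxy > 4.0]

base_rate = sum(g for g, _ in population) / len(population)
flag_rate_group = sum(flagged) / len(flagged)

print(f"share of protected group in population: {base_rate:.2f}")
print(f"share of protected group among flagged: {flag_rate_group:.2f}")
```

With these made-up parameters, the protected group is roughly 30% of the population but well over half of those flagged, even though the rule never consulted group membership at all.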
__________________
All is for the best in the best of all possible worlds.
|
|
|
01-05-2015, 05:15 PM
|
#1027
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 17,172
|
Re: It was HAL 9000!
Quote:
Originally Posted by sebastian_dangerfield
They don't intend to come out anywhere. They merely seek to avoid risk and predict the future for profit. And the actors we're talking about, at least in insurance and finance, will not lose sleep over using prohibited criteria. Particularly where they can blame it on a learning algorithm.
|
You're still assuming that stereotypes are profit-maximizing. I think that's highly unlikely to be true, especially in a big data world.
|
|
|
01-05-2015, 05:20 PM
|
#1028
|
Moderator
Join Date: Mar 2003
Location: Monty Capuletti's gazebo
Posts: 26,228
|
Re: It was HAL 9000!
Quote:
Originally Posted by Adder
You're still assuming that stereotypes are profit-maximizing. I think that's highly unlikely to be true, especially in a big data world.
|
No, I'm saying if they are found to be profit-maximizing they will be used. Some will be. Some will not be. I'm talking about the former.
__________________
All is for the best in the best of all possible worlds.
|
|
|
01-05-2015, 05:22 PM
|
#1029
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by Adder
It may be true that someone from the wrong neighborhood, the wrong school, the wrong type of job or with the wrong history with the legal system is a worse credit risk, and all of those things may correlate closely with race and lead to results that we think are unfair, but that's a different issue.
|
That's a crock and we both know it. The fact that those things correlate with race is unfair and that's NOT a different issue. It's the same issue because those neighborhood lines were drawn that way for a reason, the school sucks because of the place it's located, and the job is directly related to the education, neighborhood, and yes, race or socioeconomic status (something often, but not exclusively, tied to race).
That is the exact issue.
__________________
Send in the evil clowns.
|
|
|
01-05-2015, 05:24 PM
|
#1030
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by Adder
Yes, wonk, the data will bear the stamp of our unequal history and thus reliance on it will not bring an even playing field.
As I've said repeatedly, that's a different issue from Sebby's assertion that people will use it with the specific intent to discriminate.
They don't need data for that.
|
They don't need data to have an intent to discriminate. But here you are, yourself, using it as an excuse. That excuse is just that: an excuse. The intent is to discriminate.
__________________
Send in the evil clowns.
|
|
|
01-05-2015, 05:25 PM
|
#1031
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 17,172
|
Re: This is not my beautiful house!
Quote:
Originally Posted by Not Bob
Pardon my skepticism, but I have long heard arguments about how the Invisible Hand of The Market prevents discrimination because companies that "irrationally" discriminate (i.e., on the basis of race, sex, religion, etc.) would be forced out of business by losing out to companies that "rationally" discriminate (i.e., on the basis of whatever the market is - skills, credit-risk, speed of fastball, etc.).
With the notable exception of professional sports, I don't think that this has occurred.
|
Really? It's the exceedingly rare company that's okay with the world thinking it discriminates, and that's not entirely about anti-discrimination laws.
Moreover, those "shocked" hiring managers in the article, who believe they value diversity and could not believe that they were unconsciously discriminating, would not have the opportunity to do so if decisions were guided by actual data.
Quote:
So the idea that the lenders won't discriminate on race now that they have all of these wonderful (albeit imperfect) tools and databases because they only care about making money is Not Credible.
|
We probably need to deal with the uncomfortable realization that redlining may well have been profit maximizing for a banking system operating in our racist society. Or at least not money-losing when all of the other banks are doing it too.
But no, market incentives will not fix everything and regulation is needed too.
|
|
|
01-05-2015, 05:27 PM
|
#1032
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by sebastian_dangerfield
No. I am not saying they all intend to discriminate. The overwhelming majority will not. They will intend nothing more than to make money. The discrimination will be largely unintentional. Think of it as algorithmic laziness. Company X finds that, rather than spending time assessing more complex traditional data on hiring, it can shortcut by using the number of certain letters in a first name (which just happens to indicate minority background). If the Company finds this is effective and saves time, it will be pressured to use this discriminatory technological advantage.
|
I call bullshit. They intend nothing more than to make money. Fine, the way you make money is by not lending it to schwarzes, hiring schwarzes, or dating schwarzes.
But they aren't being racist. It's just business, Mikey. Nothing personal.
__________________
Send in the evil clowns.
|
|
|
01-05-2015, 05:29 PM
|
#1033
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 17,172
|
Re: It was HAL 9000!
Quote:
Originally Posted by taxwonk
That's a crock and we both know it. The fact that those things correlate with race is unfair and that's NOT a different issue. It's the same issue because those neighborhood lines were drawn that way for a reason, the school sucks because of the place it's located, and the job is directly related to the education, neighborhood, and yes, race or socioeconomic status (something often, but not exclusively, tied to race).
That is the exact issue.
|
It's not the issue we've been discussing, wonk. We've been discussing whether the rise of big data is going to lead to more or less overt discrimination. Sebby contends that it will, while I contend it will not.
None of that goes to how to address the accumulated effects of history. Those issues will still exist. They are much bigger issues and much harder to address, and, sadly, are still largely ignored. But they were not the conversation we've been having and eliminating racial discrimination does not eliminate those issues.
|
|
|
01-05-2015, 05:30 PM
|
#1034
|
Registered User
Join Date: Mar 2003
Location: Government Yard in Trenchtown
Posts: 20,182
|
Re: This is not my beautiful house!
Just a point of order, is using the term "heuristic" a corollary to Godwin's Law or the basis for a separate but parallel law?
__________________
A wee dram a day!
|
|
|
01-05-2015, 05:33 PM
|
#1035
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 17,172
|
Re: It was HAL 9000!
Quote:
Originally Posted by taxwonk
They don't need data to have an intent to discriminate. But here you are, yourself, using it as an excuse. That excuse is just that. The intent is to discriminate.
|
I want you to go back and re-read this discussion, because this is not okay. I don't appreciate you accusing me of something I've not done and I assume you're only doing so because you've misinterpreted something I've said.
If I've been too charitable, you can go make use of Atticus's angry fist of god with yourself if you think I've excused discrimination somehow.
|
|
|