01-05-2015, 06:33 PM
|
#1051
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 11,873
|
Re: It was HAL 9000!
Quote:
Originally Posted by sebastian_dangerfield
Hence the caveat before the last section, "Jumping off..."
I see a future in which algorithms are used to sort people based on criteria discrimination law was designed to eliminate. And then are used as defenses where found to be doing so. "It wasn't me. It was the computer system." I've seen that defense myself in a discrimination case. It was not successful, but it was a legitimate defense, and can be easily employed given the increasing automation of everything.
|
The caveat came after the parts of your post I was commenting on. I didn't realize that when you said "Jumping off..." you meant "The stuff above has nothing to do with the article I pretend to be commenting on."
Or, to put it differently: hence the "huh?" that I started with.
__________________
Where are my elephants?!?!
|
|
|
01-05-2015, 06:42 PM
|
#1052
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 11,873
|
Re: This is not my beautiful house!
Quote:
Originally Posted by Not Bob
Pardon my skepticism, but I have long heard arguments about how the Invisible Hand of The Market prevents discrimination because companies that "irrationally" discriminate (i.e., on the basis of race, sex, religion, etc.) would be forced out of business by losing out to companies that "rationally" discriminate (i.e., on the basis of whatever the market is - skills, credit-risk, speed of fastball, etc.).
With the notable exception of professional sports, I don't think that this has occurred. So the idea that the lenders won't discriminate on race now that they have all of these wonderful (albeit imperfect) tools and databases because they only care about making money is Not Credible.
|
I would not posit "big data" as a panacea, but pushing lenders (for example) to rely on data rather than subjective factors is an improvement.
The factors used to assess the data can have built-in biases, but at least the factors themselves can be identified and the biases attacked that way.
More important, though, is that pushing people to consider actual objective data, and to identify what objective factors are important to a decision, compels them to think about it -- to engage in "slow" thinking. That system can be manipulated by someone who WANTS to discriminate on the basis of race, but it benefits people who do NOT want to so discriminate, and yet might unconsciously do so because racism is so ingrained (as the article points out is a significant phenomenon -- the person who did not realize he was using "black" names to identify applicants to reject).
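To make that concrete, here's a minimal sketch (plain Python; the factor names and weights are invented, not anything from the article) of what "the factors themselves can be identified" buys you -- every input to the decision can be enumerated and attacked one at a time, which no loan officer's gut call allows:
Code:
# Hypothetical scoring factors and weights -- invented for illustration.
WEIGHTS = {
    "debt_to_income": -0.5,   # lower ratio is better, hence the negative weight
    "on_time_years": 0.3,     # years without a missed payment
    "years_employed": 0.2,
}

def score(applicant):
    # Transparent score: every factor and weight is visible, so each
    # one can be identified -- and challenged -- individually.
    return sum(weight * applicant[name] for name, weight in WEIGHTS.items())

def list_factors():
    # The audit step: enumerate the factors themselves.
    for name, weight in WEIGHTS.items():
        print(f"{name}: weight {weight:+.2f}")

applicant = {"debt_to_income": 0.4, "on_time_years": 6.0, "years_employed": 3.0}
print(f"score = {score(applicant):.2f}")
list_factors()
Nothing stops the weights from encoding bias, but at least they're on the table where they can be attacked.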
__________________
Where are my elephants?!?!
|
|
|
01-05-2015, 06:44 PM
|
#1053
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 11,873
|
Re: It was HAL 9000!
Quote:
Originally Posted by taxwonk
That's a crock and we both know it. The fact that those things correlate with race is unfair and that's NOT a different issue. It's the same issue because those neighborhood lines were drawn that way for a reason, the school sucks because of the place it's located, and the job is directly related to the education, neighborhood, and yes, race or socioeconomic status (something often, but not exclusively, tied to race).
That is the exact issue.
|
Sorry, but no -- those are different issues. Related, but different.
__________________
Where are my elephants?!?!
|
|
|
01-05-2015, 06:45 PM
|
#1054
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 11,873
|
Re: It was HAL 9000!
Quote:
Originally Posted by taxwonk
They don't need data to have an intent to discriminate. But here you are, yourself, using it as an excuse. That excuse is just that. The intent is to discriminate.
|
The article that Thurgreed linked was at least in part about people discriminating where they do not have that intent.
__________________
Where are my elephants?!?!
|
|
|
01-05-2015, 06:47 PM
|
#1055
|
Registered User
Join Date: Mar 2003
Location: Government Yard in Trenchtown
Posts: 20,182
|
Re: For Sebby
Quote:
Originally Posted by Sidd Finch
Shit. Turns out that lawyers really are dickheads. Thanks for making me think about it.
|
That's not a stereotype. That's training.
__________________
A wee dram a day!
|
|
|
01-05-2015, 06:48 PM
|
#1056
|
Registered User
Join Date: Mar 2003
Location: Government Yard in Trenchtown
Posts: 20,182
|
Re: This is not my beautiful house!
Quote:
Originally Posted by Sidd Finch
I would not posit "big data" as a panacea, but pushing lenders (for example) to rely on data rather than subjective factors is an improvement.
The factors used to assess the data can have built-in biases, but at least the factors themselves can be identified and the biases attacked that way.
More important, though, is that pushing people to consider actual objective data, and to identify what objective factors are important to a decision, compels them to think about it -- to engage in "slow" thinking. That system can be manipulated by someone who WANTS to discriminate on the basis of race, but it benefits people who do NOT want to so discriminate, and yet might unconsciously do so because racism is so ingrained (as the article points out is a significant phenomenon -- the person who did not realize he was using "black" names to identify applicants to reject).
|
Wait, I thought if we relied on big data we didn't have to think?
__________________
A wee dram a day!
|
|
|
01-05-2015, 06:51 PM
|
#1057
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: For Sebby
Quote:
Originally Posted by Sidd Finch
Shit. Turns out that lawyers really are dickheads. Thanks for making me think about it.
|
Hey, there's an exception to prove every rule.
__________________
Send in the evil clowns.
|
|
|
01-05-2015, 06:58 PM
|
#1058
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by Sidd Finch
Sorry, but no -- those are different issues. Related, but different.
|
I see what you're saying, in that someone isn't setting out saying to themselves "I will design an algorithm to support not lending to black people because they are all deadbeats and they don't take care of their homes so we lose a ton on the foreclosure." But if you know what your algorithm is going to produce, and you use it anyway, doesn't your difference lose its distinction?
If we accept the notion that selection bias poisons the process, then how can we use the process but deny that we are exercising bias? I'm really trying to get my head around this. But it keeps coming back to the same place.
__________________
Send in the evil clowns.
|
|
|
01-05-2015, 07:00 PM
|
#1059
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by Sidd Finch
The article that Thurgreed linked was at least in part about people discriminating where they do not have that intent.
|
True, but having read the article, we now know that people will discriminate even without intent. If we accept that as a truth, then don't we have an obligation to test everything against that knowledge and reject any process that continues the discrimination?
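For example -- and this is just a sketch with hypothetical numbers, though the 0.8 threshold is the EEOC's actual "four-fifths" guideline -- a test like this doesn't need to know anything about intent:
Code:
def impact_ratio(selected_a, total_a, selected_b, total_b):
    # Ratio of the two groups' selection rates. Under the EEOC
    # "four-fifths" guideline, a ratio below 0.8 is treated as
    # evidence of adverse impact, regardless of anyone's intent.
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Hypothetical lender: 50 of 200 black applicants approved, 90 of 200 white.
ratio = impact_ratio(50, 200, 90, 200)
print(f"impact ratio = {ratio:.2f}")  # 0.56, well under 0.8
if ratio < 0.8:
    print("Reject or rework the process, whatever its designers intended.")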
__________________
Send in the evil clowns.
|
|
|
01-05-2015, 07:17 PM
|
#1060
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
It was the Jewish guy.
Okay, here's me taking another run at this, trying not to sound like I'm calling anybody out.
I read the article TM linked to a couple days ago, and I have read it a dozen times since then. It really troubles me. It's also something I realized about myself a couple years back, when I was looking for a new place to live. I'm ashamed to say this, but a large part of my choice was based on the probable population of the closest emergency room at any given time, and I realized that part of what I was doing was looking at which police district each hospital I was studying was located in, and which police districts were nearby.
I told myself I was just looking at response time and competition for scarce medical resources in a time of crisis. Ultimately, though, what that meant was my looking at where the shootings and stabbings were likely to exceed the broken arms and infants with fevers. I sliced and diced the data a million different ways, but it all boiled down to this: I didn't want to be closest to a hospital where there were likely to be a lot of crime victims. For better or for worse, those hospitals were the ones that had predominantly black neighborhoods nearby.
I can try to justify this a million different ways ("it's a matter of life and death!"), but it all boiled down to this: predominantly black neighborhoods have more violent crime, so I wanted to live near hospitals that are farther away from predominantly black neighborhoods.
My own algorithm was not set up consciously to avoid predominantly black neighborhoods, but once I thought about the metrics I used, it became pretty hard to ignore the obvious racial bias. It also became impossible to perform the same analysis in a colorblind way.
That's the source of my resistance to the notion that it's possible to use data that reflect racial and economic bias without choosing that data based on economic and racial bias. I tried to dress it up. I told myself that it was just a health issue; if I didn't have a bad heart that put me in the ER a couple times a year, I wouldn't even be looking at race. But it's impossible for me to deny that, ultimately, for whatever reason, I live where I do now because there is an acceptable mix of black and white.
I'm not proud of myself. But I am trying to learn from the experience.
__________________
Send in the evil clowns.
|
|
|
01-05-2015, 08:29 PM
|
#1061
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 17,172
|
Re: It was HAL 9000!
Quote:
Originally Posted by taxwonk
I personally do want lenders to ignore certain aspects of risk because they also correlate with race. What's more, I want them to be subject to litigation if they don't.
I don't believe that bankers or insurers are special in any way. If they want to earn money, they need to risk money. Especially if they are going to ask the taxpayers to prop up their greedy fat asses when they screw up. I don't think a bank is entitled to any more leeway than any other business. If they don't like the fact that lending to black people is more risky, then they shouldn't be allowed to lend to white people. If an insurer doesn't want to cover a poor, underfed, underclothed kid in a ghetto, then it shouldn't be allowed to write a policy on a well-fed Midwestern farm girl with braces on her teeth and an annual checkup.
t.
|
I'd prefer we be honest about real risks and instead subsidize and incentivize taking risks we view as societally valuable. Doing so reduces the chance that we bail out bad risk-taking and increases the chance that we reward good risk-taking.
Of course, insurance may well be best managed by the government, at least as to health.
|
|
|
01-05-2015, 09:17 PM
|
#1062
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by Adder
I'd prefer we be honest about real risks and instead subsidize and incentivize taking risks we view as societally valuable. Doing so reduces the chance that we bail out bad risk-taking and increases the chance that we reward good risk-taking.
Of course, insurance may well be best managed by the government, at least as to health.
|
I don't like the idea of subsidizing an industry that already gets so much in the area of government largesse. If we are going to underwrite risk, then we should just have the FHA issue direct loans. Of course, if the banks aren't willing to invest in community development and infrastructure in troubled neighborhoods, then the next time they get themselves in liquidity trouble, the government should get the bank. Maybe they can find a buyer who is willing to act as a force for improvement, instead of just continuing to subsidize leeches in starched shirts.
Insurance is easy. One risk, one risk pool. Premiums can be adjusted for claims made by the insured, but none of the insurance redlining that takes place now.
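To sketch the arithmetic (all numbers hypothetical): everyone starts from the same community rate, and the only permitted adjustment is the insured's own claims history.
Code:
def premium(expected_claims_total, pool_size, own_claims, surcharge=0.05):
    # One risk pool: everyone starts from the same community rate.
    # The only permitted adjustment is the insured's own claims
    # history -- no rating by neighborhood, race, or any group proxy.
    base = expected_claims_total / pool_size
    return base * (1 + surcharge * own_claims)

# Hypothetical pool: $50M of expected claims across 10,000 insureds.
print(premium(50_000_000, 10_000, own_claims=0))  # 5000.0 (community rate)
print(premium(50_000_000, 10_000, own_claims=2))  # 5500.0 (10% surcharge)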
__________________
Send in the evil clowns.
|
|
|
01-06-2015, 09:33 AM
|
#1063
|
Moderator
Join Date: Mar 2003
Location: Monty Capuletti's gazebo
Posts: 26,228
|
Re: It was HAL 9000!
Quote:
Originally Posted by taxwonk
I call bullshit. They intend nothing more than to make money. Fine, the way you make money is by not lending it to schwarzes, hiring schwarzes, or dating schwarzes.
But they aren't being racist. It's just business, Mikey. Nothing personal.
|
I'm not excusing it. I'm noting that big data is going to: (1) enable a lot more of it, both intentionally and unintentionally; and (2) provide an alibi for it.
Corporations are the greatest tools ever invented for avoidance of personal responsibility. We are handing them enormous data pools with which to commoditize human beings. When they judge the books by the covers because that's the easiest and most efficient way to maximize profit and minimize risk, and we call them on it, this will be what you hear in the Congressional hearings:
"I did not design the algorithm... That was by committee, and involved many tech people no longer with us. A number of outside consultants, as well. There's no way to know who exactly designed the code at issue. And the algorithm we're discussing-- And keep in mind, I'm not a coder or anything-- actually, quite a Luddite in that regard... I believe that code actually teaches itself. So the prohibited criteria it used, I think, if I understand the tech people correctly, was selected by the algorithm itself. With no human involvement, or foresight that might occur.
But we have enacted best practices to avoid this in the future. Our new coders assure us this exact use of this exact prohibited criteria can be avoided. Some others may be used by these learning algorithms in the future, as it's impossible to preclude them all. But this one? This exact discriminatory basis? We have that one eliminated. And we are committed to vigilant removal of others as they appear."
Of course, no purchased Senator or Congressman will mention that the categories of criteria that may be used to effect discriminatory ends are innumerable.
Only the silliest tech evangelist would believe big data is going to remove or reduce discrimination. It's a delusion as preposterous as the belief the 2008 Crash would result in true upending of social order, as it should have, rather than a stronger retrenchment in which the rich and powerful before the Crash became even more rich and powerful afterward.
We are Engineered. The question isn't whether the status quo persists, but how fast it accelerates our splintering into ever more deeply class-divided mini-societies. To be a bear, to hope for some form of justice, or revolution, or true and deserved free-market correction, is to be insane.
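A toy illustration, if you want one -- everything below is invented, but it shows how a "race-blind" model keyed to a single proxy like zip code reproduces the racial split anyway. And zip code is just one proxy among the innumerable:
Code:
import random
random.seed(1)

# Invented world: race never appears in the model's inputs, but zip
# code correlates with it -- the residue of redlining in the data.
applicants = []
for _ in range(10_000):
    black = random.random() < 0.3
    zip_group = "A" if random.random() < (0.8 if black else 0.1) else "B"
    applicants.append({"black": black, "zip_group": zip_group})

def approve(a):
    # "Race-blind" rule learned from historical outcomes: zip group A
    # was historically denied, so the model denies zip group A.
    return a["zip_group"] == "B"

def approval_rate(is_black):
    pool = [a for a in applicants if a["black"] == is_black]
    return sum(approve(a) for a in pool) / len(pool)

print(f"white approval rate: {approval_rate(False):.0%}")  # ~90%
print(f"black approval rate: {approval_rate(True):.0%}")   # ~20%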
__________________
All is for the best in the best of all possible worlds.
Last edited by sebastian_dangerfield; 01-06-2015 at 09:37 AM..
|
|
|
01-06-2015, 11:07 AM
|
#1064
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 17,172
|
Re: It was HAL 9000!
Quote:
Originally Posted by sebastian_dangerfield
We are handing them enormous data pools with which to commoditize human beings.
|
Right, which is why they are more likely to use those big pools of data than use the simple shortcuts of traditional stereotypes. Doing so offers them both better predictions and better defenses.
I take your point that there will be defenses raised in future discrimination cases that can be summed up as "we were just following the data." That might often be true, and sometimes it may not be.
Quote:
Only the silliest tech evangelist would believe big data is going to remove or reduce discrimination.
|
There is wonk's point that all data is permeated with our history of injustice, but to believe that big data is going to be just as likely as the human subconscious to unknowingly reject black applicants because they are black is irrationally cynical. Why do that when the data offers you all kinds of ways to get to non-discriminatory results?
Yeah, I know, you've argued it's going to happen either inadvertently or because it's the cheapest and easiest way to do it, and they're all greedy bastards. But both of those conclusions require entertaining the possibility that race really is strongly predictive, to a degree that it will outweigh other factors.
We know that's not right.
Quote:
It's a delusion as preposterous as the belief the 2008 Crash would result in true upending of social order
|
I'm pretty sure you were one of the people entertaining that possibility.
|
|
|
01-06-2015, 11:42 AM
|
#1065
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 11,873
|
Re: It was HAL 9000!
Quote:
Originally Posted by Adder
There is wonk's point that all data is permeated with our history of injustice, but to believe that big data is going to be just as likely as the human subconscious to unknowingly reject black applicants because they are black is irrationally cynical. Why do that when the data offers you all kinds of ways to get to non-discriminatory results?
|
To elaborate on the above point: Take the first study discussed in the article. Change every resume to delete the applicant's name and to just say "Applicant 1", "Applicant 2", etc. Does the study end with a different result, in terms of the proportion of whites and blacks who get call-back interviews? That seems to be a certain "yes".
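In code terms, the redaction is one trivial pass over the files before any reviewer sees them (a sketch; the field names and sample entries are invented):
Code:
def blind(resumes):
    # Strip the one field known to trigger the bias; leave everything
    # a reviewer legitimately needs (skills, experience) untouched.
    blinded = []
    for i, resume in enumerate(resumes, start=1):
        redacted = dict(resume)
        redacted["name"] = f"Applicant {i}"
        blinded.append(redacted)
    return blinded

resumes = [
    {"name": "Emily Walsh", "experience_years": 6},
    {"name": "Lakisha Washington", "experience_years": 6},
]
for r in blind(resumes):
    print(r)  # identical credentials now present identically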
What Wonk is pointing to is a different, more systemic, and more pernicious issue -- and one that is in part the result of the unconscious racism found in the study, but of many other things as well. Namely, that the input to the study is unrealistic, in that a smaller percentage of black people would be able to send the "good" resume in question. This is true, and a purely objective selection process will still produce an imbalance -- but I don't put the weight of correcting that much broader problem on any individual employer, lender, etc.
Take a different context (intentionally, very different): For many years, symphony orchestras have held auditions in which the players stood behind an opaque screen, so that gender (and race - but this was really intended to address gender) would not affect selection. Some even required players to remove their shoes before walking onto the stage, because women's shoes will sound different than men's shoes.
I believe that these mechanisms were effective in reducing gender-based discrimination in the selection of musicians. BUT -- if such bias in conservatories and other opportunities for young musicians meant that there were fewer women who reached the point of being able to audition for a major symphony, the effect of systemic bias would still be seen. The "objective" approach -- we're just going to hear the musician play, and have no idea of the musician's name, gender, race etc. -- does not eliminate that.
__________________
Where are my elephants?!?!
|
|
|