|
01-06-2015, 05:40 PM
|
#1096
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by Adder
Big data is going to mean finding patterns in your tweets, to take only a mildly far-fetched example, that suggest that you're a better or worse credit risk.
Those things are only going to be included if the numbers have predictive value. Sure, deciding which things to take a look at will start with a person (in the beginning, anyway), and thus will be subject to human biases, but they are only going to get used if the numbers work.
Which is a different critter from unconsciously going into the no pile because of your name.
|
Predicting my behavior based on my tweets is actually not a problem for me. Except that the algorithm is going to be built based on whether or not I knew that Sir Paul really didn't need jack shit in the way of musical boost from Kanye, or it's going to take into account whether I use certain abbreviations or anagrams. Again, the issue isn't whether the data is valid or not. The issue is what data is studied, and how it is weighted. Because an algorithm is going to have to weigh choices if it is to have any chance at all of producing a competitive advantage.
You (not you, personally and exclusively) keep trying to draw this distinction between deep and shallow, fast or slow, big or little, as if they make a difference. All big data is, is a shitload of little data being looked at by a big computer array instead of by a roomful of grad students or junior associates.
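The weighting point can be put in a few lines of Python. This is a toy, not anyone's real model: the feature names and weights are invented here, but every number in it is exactly the kind of human choice being described.

```python
# Toy scoring model: "big data" reduced to a weighted sum of little data.
# Every feature name and weight below is invented, and each weight is a
# human choice about what that piece of data "is worth".
WEIGHTS = {
    "income": 0.5,                  # someone decided income matters most
    "years_at_address": 0.2,        # someone decided stability matters a bit
    "uses_slang_on_twitter": -0.3,  # someone decided to penalize this at all
}

def score(applicant: dict) -> float:
    """Weighted sum over whatever features someone chose to collect."""
    return sum(w * applicant.get(feature, 0.0) for feature, w in WEIGHTS.items())

print(round(score({"income": 1.0, "years_at_address": 0.5,
                   "uses_slang_on_twitter": 1.0}), 2))  # 0.3
```

Swap the weights and the same applicant scores differently; the array just runs the arithmetic faster than the grad students would.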
__________________
Send in the evil clowns.
|
|
|
01-06-2015, 06:02 PM
|
#1097
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by sebastian_dangerfield
I noted that decisions were made which appeared to be racist based solely on risk avoidance concerns. These decisions had discriminatory effects, but were not made with the intent to discriminate, but with the intent merely to "make money," as Adder put it. Part of making money is avoiding risk. You couch that however you like.
|
Okay. I think I have this figured out. Risk avoidance is bad, at least as practiced in America today by bankers, insurers, employers, and any other economic actor that is large enough to be able to pretend that human beings are not responsible for their actions.
Decisions made "merely to make money" are inherently bad when the basis on which that risk avoidance is built is racially impacted. Note that I did not say racially motivated.
I don't give a fuck that your model is based on the mathematical determination that boys who went to Choate are more likely to wind up as managers or subject matter experts than boys who went to any public school in America. Who gets into Choate?
Same thing: someone earning $150,000 a year in Bloomfield Hills is 64% less likely to default than someone earning $150,000 in downtown Detroit. So what. Who lives where?
Again, big data is just a shitload of small data. Some asshole still sits at a desk somewhere and decides what each piece of data is worth. Whether it's being made in the name of maximizing profits or not, somebody is still saying the black-sounding name or the mexican neighborhood gets weighted less favorably.
The truth is, if people are still saying that "If I lend money to this black man or hire this Vietnamese woman, my risk profile is going to be X rather than Y," they are still saying nothing more than that colored folk is unreliable, and if they want to work here in America, why can't they bother to learn to speak American.
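A minimal sketch of that mechanism, with invented numbers: the model below never looks at race at all, only at a zip code, and still splits cleanly along racial lines because the zip codes do.

```python
# "Race-blind" decision via a proxy variable. All data here is invented
# for illustration; the zip codes stand in for downtown Detroit (48201)
# and Bloomfield Hills (48304).
ZIP_RISK = {"48201": 0.9, "48304": 0.1}  # someone still chose these weights

def approve(applicant: dict) -> bool:
    # Race is never consulted; the zip code carries it in anyway.
    return ZIP_RISK[applicant["zip"]] < 0.5

applicants = [
    {"race": "black", "zip": "48201"},
    {"race": "white", "zip": "48304"},
]
for a in applicants:
    print(a["race"], approve(a))  # black False, white True
```

Nothing in the function mentions race, which is the whole problem: the discriminatory effect survives the removal of the discriminatory input.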
__________________
Send in the evil clowns.
|
|
|
01-06-2015, 06:06 PM
|
#1098
|
[intentionally omitted]
Join Date: Mar 2003
Location: NYC
Posts: 18,597
|
Re: It was HAL 9000!
Quote:
Originally Posted by sebastian_dangerfield
I noted that decisions were made which appeared to be racist based solely on risk avoidance concerns. These decisions had discriminatory effects, but were not made with the intent to discriminate, but with the intent merely to "make money," as Adder put it. Part of making money is avoiding risk. You couch that however you like.
The article does not say all decisions made which have discriminatory impact are based on race. It notes merely that many discriminatory decisions are unconscious. Are you suggesting that even a blind algorithm, which was not intended to discriminate but does so because discriminating happens to dovetail with risk avoidance, is nevertheless deciding based on race?
Apples and oranges. People do make decisions unconsciously based on race. I think that is such an obvious fact, I hijacked the point into what I thought was more interesting: How will discrimination be changed as we automate more and more of these decisions?
You got me. But that also means I agree with you, and with the article's observations. I disagree with the article's hopeful tone. What's unconsciously done is rarely fixed. Those behaviors are like heartbeats, or breathing.
I don't think this at all. But I do think they will be, and soon. And I think if you want to combat future discrimination, there's lots more to be gained in avoiding mechanized electronic discrimination before it becomes a huge problem than there is in asking people to examine their unconscious biases.
Because they aren't mutually exclusive propositions. Logically, I say pre-emptively adjust the technological thing that hasn't been polluted yet, and does not have human failings, to avoid it becoming infected with racism. Why? Because if we don't, as Wonk noted, it'll acquire all the prejudices of its users and creators. I'm not sure we can ever fully racism-proof the algorithms, but it's worth a try, and it's more likely to have results than suggesting people individually look at their own unconscious racism. Why would I say this? Because here's the thing about unconsciousness: a person doing something he doesn't even realize until after he's done it has a hard time both stopping the act and remedying its damage. Tweaking the algorithms that may engage in this discrimination in the future has a far greater chance of actual, measurable success.
I'm not much interested in the human element because I hold little hope of it improving much in our lifetimes. Sorry to have hijacked. Can't help myself.
|
I would respond to each and every point, but I figure you're just on your own frequency, so I'll just let it go.
TM
|
|
|
01-06-2015, 07:05 PM
|
#1099
|
Moderasaurus Rex
Join Date: May 2004
Posts: 33,073
|
Re: It was HAL 9000!
Quote:
Originally Posted by taxwonk
You (not you, personally and exclusively) keep trying to draw this distinction between deep and shallow, fast or slow, big or little, as if they make a difference. All big data is, is a shitload of little data being looked at by a big computer array instead of by a roomful of grad students or junior associates.
|
I've been trying to avoid getting into this particular exchange, despite some strong feelings, because conversations about big data are so tedious. Talking about big data is like talking about transportation or weather -- the subject is so incredibly broad that any sort of assertion about it is bound to be part right, part wrong, and completely useless.
There is data and then there is data, and it really depends on what you are actually talking about. Take the phenomenon, described in the NYT piece, that resumes with certain names on them fare much worse than the same resumes with other names on them. No one here can possibly think that any organization hires people in a particularly rational or effective way. Most jobs have specific requirements which set them apart from other jobs, and thus require a human to make subjective judgments about whether someone is a good fit. I'm sure everyone thinks they are better than average at doing this. I have heard that Google's HR head has tried to use data analysis to figure out which indicators are the most effective at screening resumes to identify the better candidates, and that sounds like a good idea. But if anyone thinks that's going to dispel the racial bias in hiring mentioned in the NYT article, I have a bridge to sell you.
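If you did want to check whether a screening pipeline like that has discriminatory impact, the standard yardstick is the EEOC's four-fifths rule: a group selected at under 80% of the highest group's rate flags possible adverse impact. A sketch with invented pass counts:

```python
# Four-fifths (80%) rule check on screening outcomes. The counts below
# are invented; only the rule itself is standard.
def selection_rate(selected: int, applicants: int) -> float:
    return selected / applicants

def impact_ratio(group_rate: float, reference_rate: float) -> float:
    """A group's selection rate relative to the most-selected group's rate."""
    return group_rate / reference_rate

reference = selection_rate(50, 100)  # 0.5: half of one group passes screening
group = selection_rate(30, 100)      # 0.3: under a third of the other does

ratio = impact_ratio(group, reference)
print(round(ratio, 2))  # 0.6
print(ratio < 0.8)      # True: below four-fifths, flags possible adverse impact
```

Note the check only surfaces the disparity; it says nothing about which indicators caused it, which is the part no resume-screening analysis fixes on its own.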
__________________
“It was fortunate that so few men acted according to moral principle, because it was so easy to get principles wrong, and a determined person acting on mistaken principles could really do some damage." - Larissa MacFarquhar
|
|
|
01-06-2015, 07:11 PM
|
#1100
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by Tyrone Slothrop
I've been trying to avoid getting into this particular exchange, despite some strong feelings, because conversations about big data are so tedious. Talking about big data is like talking about transportation or weather -- the subject is so incredibly broad that any sort of assertion about it is bound to be part right, part wrong, and completely useless.
There is data and then there is data, and it really depends on what you are actually talking about. Take the phenomenon, described in the NYT piece, that resumes with certain names on them fare much worse than the same resumes with other names on them. No one here can possibly think that any organization hires people in a particularly rational or effective way. Most jobs have specific requirements which set them apart from other jobs, and thus requires a human to make subjective judgments about whether someone is a good fit. I'm sure everyone thinks they are better than average at doing this. I have heard that Google's HR head has tried to do some data analysis to try to figure out which indicators are the most effective at screening resumes to identify the better candidates, and that sounds like a good idea. But if anyone thinks that's going to dispel the racial bias in hiring mentioned in the NYT article, I have a bridge to sell you.
|
I don't think the discussion was whether it is going to change the way anyone hires. I think the issue we are looking at is whether or not this crap gives someone cover (or should give someone cover) when their actions have a clearly discriminatory impact.
__________________
Send in the evil clowns.
|
|
|
01-06-2015, 07:22 PM
|
#1101
|
Moderasaurus Rex
Join Date: May 2004
Posts: 33,073
|
caption, please
__________________
“It was fortunate that so few men acted according to moral principle, because it was so easy to get principles wrong, and a determined person acting on mistaken principles could really do some damage." - Larissa MacFarquhar
|
|
|
01-06-2015, 08:55 PM
|
#1102
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: caption, please
Quote:
Originally Posted by Tyrone Slothrop
|
My eyes, Mommy!! It burns!
__________________
Send in the evil clowns.
|
|
|
01-06-2015, 08:59 PM
|
#1103
|
Proud Holder-Post 200,000
Join Date: Sep 2003
Location: Corner Office
Posts: 86,144
|
Re: caption, please
Quote:
Originally Posted by Tyrone Slothrop
|
Yes, if you could handle this you can handle me, I promise.
__________________
I will not suffer a fool- but I do seem to read a lot of their posts
|
|
|
01-06-2015, 09:30 PM
|
#1104
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: caption, please
Quote:
Originally Posted by Hank Chinaski
Yes, if you could handle this you can handle me, I promise.
|
Blatant double entendres with old people are icky. Stop this at once.
__________________
Send in the evil clowns.
|
|
|
01-07-2015, 10:00 AM
|
#1105
|
Registered User
Join Date: Mar 2003
Location: Government Yard in Trenchtown
Posts: 20,182
|
Re: It was HAL 9000!
So, after all this discussion on racial discrimination in hiring, I'm about to go into the market for a young corporate associate. Should I intentionally be giving preference to candidates who are minorities? To women candidates?
__________________
A wee dram a day!
|
|
|
01-07-2015, 10:16 AM
|
#1106
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 11,873
|
Re: It was HAL 9000!
Quote:
Originally Posted by Greedy,Greedy,Greedy
So, after all this discussion on racial discrimination in hiring, I'm about to go into the market for a young corporate associate. Should I intentionally be giving preference to candidates who are minorities? To women candidates?
|
I would say not. But I would say you should give greater preference to the person who was top of his class at a lower-tier law school than to the person who was 80th percentile at fancy-pants U. And you should consider that someone who didn't start life on third base has worked hard to be a credible candidate for you, and will likely be more appreciative and hard-working in the future.
__________________
Where are my elephants?!?!
|
|
|
01-07-2015, 11:06 AM
|
#1107
|
Registered User
Join Date: Mar 2003
Location: Government Yard in Trenchtown
Posts: 20,182
|
Re: It was HAL 9000!
Quote:
Originally Posted by Sidd Finch
I would say not. But I would say you should give greater preference to the person who was top of his class at a lower-tier law school than to the person who was 80th percentile at fancy-pants U. And you should consider that someone who didn't start life on third base has worked hard to be a credible candidate for you, and will likely be more appreciative and hard-working in the future.
|
So, in other words, I should use UMich's approach and hire from the hoods?
__________________
A wee dram a day!
|
|
|
01-07-2015, 11:41 AM
|
#1108
|
[intentionally omitted]
Join Date: Mar 2003
Location: NYC
Posts: 18,597
|
Re: It was HAL 9000!
Quote:
Originally Posted by Greedy,Greedy,Greedy
So, after all this discussion on racial discrimination in hiring, I'm about to go into the market for a young corporate associate. Should I intentionally be giving preference to candidates who are minorities? To women candidates?
|
If the person satisfies your firm's initial hiring requirements such that they are sitting in front of you, yes. Woman, minority, whatever. That's my approach. Give them preference. And that really just means that non-minority men have to absolutely shine in order to overcome that preference (or the minority or woman has to clearly be wrong for the position). That's how it works 95% of the time in the opposite direction.
TM
|
|
|
01-07-2015, 12:06 PM
|
#1109
|
Moderator
Join Date: Mar 2003
Location: Monty Capuletti's gazebo
Posts: 26,223
|
Re: It was HAL 9000!
Quote:
Originally Posted by Greedy,Greedy,Greedy
So, after all this discussion on racial discrimination in hiring, I'm about to go into the market for a young corporate associate. Should I intentionally be giving preference to candidates who are minorities? To women candidates?
|
No. You make a business decision. You hire the best candidate you can find based on the usual criteria (personality, cost, skill, etc.).
__________________
All is for the best in the best of all possible worlds.
|
|
|
01-07-2015, 12:08 PM
|
#1110
|
Moderasaurus Rex
Join Date: May 2004
Posts: 33,073
|
Re: It was HAL 9000!
Quote:
Originally Posted by sebastian_dangerfield
No. You make a business decision. You hire the best candidate you can find based on the usual criteria (personality, cost, skill, etc.).
|
Aren't we just talking about how to hire the best candidate?
__________________
“It was fortunate that so few men acted according to moral principle, because it was so easy to get principles wrong, and a determined person acting on mistaken principles could really do some damage." - Larissa MacFarquhar
|
|
|