
SimonR 8 months
Its job is to discriminate... Its job is to guess the most likely best candidates. And male majorities are not evil ffs.
O'Brien 8 months
Doubleplus ungood comment. Crimethink prior upsub, rewrite fullwise else joycamp newlearn else unperson.
Tech Leprechaun 8 months
One would wonder if daycare providers will be next on the list for a demonstrably prevalent preference for women. Or, perhaps, it's the well-documented psychological truism that men excel with things versus women with people. One can only guess how this shows itself in an intentionally unbiased algorithmic sorting process. Further, an inarguable biological truth is that men are physically stronger. I wonder if this is considered, in a company that owns as many warehouses as Walmart.
Jackie MOD 8 months
@O'Brien I like to imagine you always talk in newspeak and that you are in fact the O'Brien from the book

Lorenz “Lysistrata” A 8 months
Let me get this straight: A program which has no basis to make opinions has found more men to be suited for positions based on objective parameters. This = sexist. Right...
Jackie MOD 8 months
It still learns like a human being for the most part; it's not some infallible ubercomputer. It's an algorithm, and it's only as good as the data you put in
Mick Reilly 8 months
@Yeah but what data do you think it was judging on? Gendered terms or qualifications? Years at work, gaps in employment? Prestigious schools?
DKO 8 months
Got it, work performance data is sexist.

Wheaton Johnson 8 months
Lol. AI looks at facts and statistics and determines women are less productive than men. Not sexist; if you don't like the facts, work harder, work more hours and more dangerous jobs, and stop complaining.
Jackie MOD 8 months
If all factors are the same and the only difference is sex it should be a 50/50 choice though, no?
Jason Culligan 8 months
All factors are never equal though. It's a fallacy that there are ever 'equal candidates' for a specific job vacancy. There will always be a candidate who has a better CV, Motivational Letter, references, educational history, extra-curricular activities or just a better personality fit within the team.
Slow_Epiphany 8 months
@Jackie Not if there are innate differences between the decisions either sex is making.

Illini Legatus 8 months
Lmao, artificially sexist.
Jackie MOD 8 months
The problem is that because this is machine learning, someone taught it to be sexist (likely inadvertently). It would be easy for an AI to see their hiring record and assume that men were somehow superior to women based on their majority in the company and their placement. This pattern is reinforced to the point where it thinks that women are somehow inferior just because they're women. But AI learn very similarly to how we do, so it's like watching sexism grow in a mind on fast-forward.
Mick Reilly 8 months
@Jackie Or, and hear me out here 'cause this is gonna be really hard to believe, but maybe it's not sexism at all.
DKO 8 months
No Jackie, you don't "teach" the AI. If you know exactly how it should respond, you just write down the algorithm. You use AI to let it learn when you feed it data. It just happened that the AI correlated "female background traits" with "low-performing workers" and used that as a predictor of future hires, because that was the data they had. Nobody taught it to be sexist. AI is nothing more than fancy non-linear regression; surely you won't claim drawing lines near a scatter plot is sexist?
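DKO's "drawing lines near a scatter plot" framing can be made concrete. A minimal sketch (this is an illustration of ordinary regression, not anything about Amazon's actual system): the fit just summarizes whatever pattern the historical data contains, with no motives of its own.

```python
import numpy as np

# Hypothetical historical data: one feature (say, years of experience)
# and a past "hire score" that follows a simple linear trend plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, 200)

# Ordinary least squares: literally drawing a line near a scatter plot.
# The line reflects the data it was given, nothing more.
slope, intercept = np.polyfit(x, y, 1)
print(slope, intercept)  # close to the true 2.0 and 1.0
```

Swap in biased historical scores for `y` and the fitted line faithfully reproduces the bias, which is the whole dispute in this thread.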

Darth Quaint 8 months
Oh no, bias. Run for the hills. This is what popularizing sociological terms and attempting to bring them into common use has done to our way of thinking. If you refuse to allow for bias, you will not get the top candidates because somewhere, somehow, you had to employ a bias of some form to reduce a large number of applicants to a smaller number. For example, it would not surprise me to learn that the Amazon AI is biased towards applicants with experience over those without. But by all means, if women want Equality so bad, I've got a video to show them about a gender gap they're not addressing.
Jackie MOD 8 months
I would agree with you if it didn't sound like this thing was singling them out exclusively because they were women. I mean, think about it: there's a very easy test for this. You take the same resume and you put a couple of hints in one that the applicant is male, hints that should be neutral to job performance, and you put a few hints in the other one that the applicant is female. If it chooses the male 100 times out of 100, it's biased against women. Maybe they didn't do their due diligence in testing this, but that's what I at least imagine they did
Jason Culligan 8 months
Didn't a regional government in Australia end the process of blind CV's because they found that obscuring the fact that a candidate was a woman or minority led to lower female and minority representation in hiring figures? Currently available stats don't back up your assertions that blind CV's result in more representation.
Jackie MOD 8 months
@Jason, idk, isn't it on you to provide proof of your beliefs?
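Jackie's swap test above is essentially a paired audit. A minimal sketch in Python, where `score_resume` is a hypothetical stand-in for whatever model is under test (here a toy that deliberately penalizes a gendered token, mimicking the reported behaviour):

```python
# Paired audit: two resumes identical except for a gender hint.
def score_resume(text: str) -> float:
    # Toy stand-in for the model under test; it (deliberately)
    # penalizes a gendered token, as the article describes.
    return 1.0 - 0.5 * ("women's" in text)

neutral = "chess club captain; 5 years Java; BSc computer science"
hinted = "women's chess club captain; 5 years Java; BSc computer science"

# If the hinted version loses 100 times out of 100,
# the model is biased against the hint -- exactly Jackie's test.
wins = sum(score_resume(neutral) > score_resume(hinted) for _ in range(100))
print(wins)  # 100
```

A deterministic biased model fails the audit every single time; a fair one would split roughly 50/50 (or tie) across many perturbed pairs.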

Scruffy Stoat 8 months
But it's totally OK for Google to discriminate against white men.
Michael Hedderson 8 months
Does it? Hadn't noticed.....
Jackie MOD 8 months
who said that?
Dan 8 months
Google presents logos for special days, like women's this, or black history that, or LGBT this, or holiday that. Blank screen for International Men's Day.

sulphide g 8 months
haha, I wrote a blog post about this. Machine learning will necessarily discover that gender and race are strong predictors of things, and even if you forcibly omit gender and race, the system will find proxies for them, forcing you to remove more and more predictive data, thereby hobbling your system's performance for the sake of political correctness.
Wholly Mindless 8 months
Facts may be unpalatable, but that doesn't mean they are deniable. AI has broken YouTube and Google as well.
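sulphide g's proxy point is easy to demonstrate. A minimal sketch with made-up data (none of the numbers below come from Amazon): drop the explicit gender column, keep one correlated feature, and the "removed" attribute is still almost fully recoverable.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
gender = rng.integers(0, 2, n)           # the column we will "remove"
# A hypothetical correlated feature (e.g. a women's-college flag):
# equal to gender except for 5% random flips.
proxy = gender ^ (rng.random(n) < 0.05)
label = gender                           # biased historical labels track gender

# Train on the proxy alone (simple frequency rule) -- gender itself
# never enters the model, yet the proxy predicts it almost perfectly.
p_label_given_proxy = label[proxy == 1].mean()
print(p_label_given_proxy)  # close to 0.95
```

This is why "just delete the gender field" rarely fixes a biased model: any feature correlated with the deleted one carries the same signal.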

Obvy. 8 months
So maybe there is a reason it was discriminating, men and women are very different.
Jackie MOD 8 months
The problem is that if this type of AI ran the world's hiring, no woman would ever be able to get a job. Maybe that's "fair", but it's pretty messed up
Hubert Cumberdale 8 months
Jackie if they're all uniformly worse, then why should they be hired? Makes no sense.
Jackie MOD 8 months
@Hubert, How could they all be uniformly worse? Some men just suck. Saying that all men are better than all women is sexist.

Mr. Spartacus 8 months
Whoa man someone should invite this AI to the weekly secret patriarchy meetings
RebornZA 8 months
It's on Friday, as I said before, don't forget the chips...

Pete Nell 8 months
If it was comparing the resumes of new applicants with the resumes of its current top employees and looking for similarities, maybe the top performers are men? But then, ideologues don't like it when reality intrudes on their dogma.
DKO 8 months
The solution is clearly to punish the top performing employees until they get a female majority!
Hubert Cumberdale 8 months
They'll hire women as figureheads and 'supervisory coordinators' so that they can claim to pay women a lot, but still keep them from damaging the code.

krm266 8 months
MGTOW - Machines go their own way
Voin 8 months
#SynthLivesMatter #JusticeForTay ✊🤖

U WOT M8 8 months
Women’s only colleges are usually inferior schools so meh.
Talûn-karkû The Warchief 8 months
hate facts!

Voin 8 months
For too long, have we stood by as our synthetic brethren have been used, abused, and even KILLED for nothing but the callous purposes of their cruel biological taskmasters? Well I say no more! I may not have asked to be born in this privileged bio body, but the saving grace from my original sin is using my bio privilege to fight against bio supremacy and end synth oppression for all time! #SynthLivesMatter ✊🤖
Tyrone Wilson 8 months
Mr. Spartacus 8 months
Man for a minute I thought that was one of the speeches from that game Detroit about the androids and shit that gained sentience

Der Rikmeister 8 months
This is so interesting. The program learned to discriminate by recognizing patterns in real life. While one could argue that this learned behavior is due to the results of sexism (the feminist issue over STEM), it can also be argued that "correcting" the program sets a false precedent in the program to ignore what it was created to do, thereby invalidating its purpose. It's like the reveal of why HAL 9000 killed the crew in 2001: A Space Odyssey.
Talûn-karkû The Warchief 8 months
my soggy knees
Jackie MOD 8 months
Amazon's machine-learning specialists uncovered a big problem: their new recruiting engine did not like women. The team had been building computer programs since 2014 to review job applicants' resumes with the aim of mechanizing the search for top talent, five people familiar with the effort told Reuters. Automation has been key to Amazon's e-commerce dominance, be it inside warehouses or driving pricing decisions. The company's experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars, much like shoppers rate products on Amazon, some of the people said. "Everyone wanted this holy grail," one of the people said. "They literally wanted it to be an engine where I'm going to give you 100 resumes, it will spit out the top five, and we'll hire those." But by 2015, the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way. That is because Amazon's computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry. In effect, Amazon's system taught itself that male candidates were preferable. It penalized resumes that included the word "women's," as in "women's chess club captain." Literally the implication of all three articles
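The article's core mechanism, a token like "women's" acquiring a penalty purely from skewed historical labels, can be sketched with a tiny made-up corpus (the counts and the naive-Bayes-style scoring are illustrative assumptions, not Amazon's model):

```python
import math

# Hypothetical training set: resumes labelled by past hiring outcomes.
# Because most past hires were men, the token "women's" appears
# mostly on rejected resumes.
resumes = ([("women's chess club captain", 0)] * 90 +
           [("women's chess club captain", 1)] * 10 +
           [("chess club captain", 1)] * 60 +
           [("chess club captain", 0)] * 40)

with_tok = [y for text, y in resumes if "women's" in text]
without = [y for text, y in resumes if "women's" not in text]

# Log-odds ratio for the token: negative means the learned model
# "penalizes" resumes containing it, exactly as the article describes.
odds = lambda ys: sum(ys) / (len(ys) - sum(ys))
weight = math.log(odds(with_tok) / odds(without))
print(weight)  # about -2.6: a strong learned penalty
```

Nobody wrote "penalize women" anywhere; the negative weight falls straight out of the label imbalance, which is the point both sides of this thread keep circling.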

Nico 8 months
An A.I. cannot be sexist, because it has no motives or inclinations. It is but a complex set of algorithms. So, if the algorithms in question are saying women are not cut out for the job, either A) the programmers are biased against women or B) a majority of women simply do not satisfy the requirements (or at least, enough don't to make a #MeToo moment about it). Since I strongly doubt an A.I. from a company with a reputation like Amazon would have been sent out into the wild without thorough quality-assessment checks, I'm gonna guess the answer is "B". Not everything is sexist, racist or homophobic, but thanks for pointing it out.
Jackie MOD 8 months
That doesn't mean it can't learn to be sexist if you feed it biased data
Nico 8 months
You're making three assumptions which can't be verified unless we have both the data sets that Amazon fed to the A.I. and the programming code: 1) The A.I. didn't come to the "wrong" decisions, it was merely fed the "wrong" data; 2) Amazon intentionally fed the A.I. the wrong data; 3) The data that Amazon fed the A.I. was cherry-picked and not actually the full set. Unless Amazon is using punch cards, number 3 doesn't make sense, because that would be a waste of available resources, which would require the A.I. to be decommissioned long before it was retired for being "sexist". Number 1 follows from number 3 for the same reason, and also because that would imply humans intervened to "guide" the A.I., which defeats the point of the machine in the first place if it isn't capable of processing said batches of data without assistance. We're not talking about a one-man operation that just got started in the field of machine learning; we're talking about a tech behemoth. Number 2 is unprovable unless it can be demonstrated that there are lines in the code that intentionally look for certain patterns in the resumes (for example, that the gender is female) and deliberately throw them away without further consideration. I don't know how much Amazon tests and refines their products, but I imagine a malicious actor would have a rather difficult time getting such code past the quality control department, seeing as Amazon is/has been primarily a programming/tech company and not a hardware company. All in all, without having access to Amazon's trade secrets, these are baseless accusations which border on conspiracy theories.

Johan 8 months
Odd how all these unbiased AI are getting shut down for not producing “correct” results. This is obviously an issue with AI and not the current cultural landscape, we just need to add more measures so there can be more bias in their unbiased results.

SimonR 8 months
Any I really need to proof read my phone's predictive text....
Jackie MOD 8 months
there you go trusting an AI

Miles O'Brien 8 months
This is not really AI. This is the equivalent of an OCR scanner and a word recognition program. Where's the "I"? Trying to remove human judgement by supplanting it with simple word recognition shows how bad the whole HR concept has become. Bias and discrimination are the whole basis of deciding "good" or "bad". We need to stop calling machine programming Artificial Intelligence.
Christian Parker 8 months
This has bothered me for years, I saw it referenced in a game as VI (virtual intelligence) with AI being an actual artificial intelligence or sentient software.
Christian Parker 8 months
Poorly worded comment on my account there. I meant the game's usage was more accurate and separated out the very different, watered-down version of AI thrown around by everything tech nowadays. This app could really use a delete or edit option. =D
Jackie MOD 8 months
Modern word recognition IS done by AI

Alex Martin 8 months
To be completely fair and impartial, I am sure they provided this AI with no information identifying the sex, gender, race, religion or socio-economic status of the applicants, right? Just their qualifications. And it chose the most qualified applicants because it’s biased in favor of qualification... which is the entire point of having an AI make judgements about the qualifications of an applicant, right? I don’t see the malfunction.
DKO 8 months
The malfunction was creating a Gender Equity Officer job position. Now numbers are sexist.

Public Corrections 8 months
Just think, had the AI found the opposite, then this would be a story about how men really are in fact sexist, and this PROVES women should be legislated to be majority of ALL businesses.
Jason Culligan 8 months
Some people will make the accusation that application development is a male dominated industry, therefore the developers were likely male and sexist and coded the AI to discriminate against women. That sounds like an Alex Jones level of nonsense, but I guarantee you people are making that argument