Get the App

Newsvoice isn't just another news site. It's crowdsourced and democratized. We move the power over the news to you. Join the movement by downloading the app.

Simon Ranson 1 weeks
Its job is to discriminate... Its job is to guess the most likely best candidates. And male majorities are not evil ffs.
O'Brien 1 weeks
Doubleplus ungood comment. Crimethink prior upsub, rewrite fullwise else joycamp newlearn else unperson.
Tech Leprechaun 1 weeks
One would wonder if daycare providers will be next on the list for a demonstrably prevalent preference for women. Or, perhaps, it's the well-shown psychological truism that men excel with things versus women with people. One can only guess how this shows itself in an intentionally unbiased algorithmic sorting process. Further, an inarguable biological truth is that men are physically stronger. I wonder if this is considered, in a company that owns as many warehouses as Walmart.
Jackie Fox MOD 1 weeks
@O'Brien I like to imagine you always talk in newspeak and that you are in fact the O'Brien from the book

Lorenz “Lysistrata” A 1 weeks
Let me get this straight: A program which has no basis to make opinions has found more men to be suited for positions based on objective parameters. This = sexist. Right...
Jackie Fox MOD 1 weeks
it still learns like a human being for the most part; it's not some infallible ubercomputer. It's an algorithm, and it's only as good as the data you put in
Mick Reilly 1 weeks
@Jackie Yeah, but what data do you think it was judging on? Gendered terms or qualifications? Years at work, gaps in employment, prestigious schools?
DKO 1 weeks
Got it, work performance data is sexist.

Nathan Carlson 1 weeks
Lol. AI looks at facts and statistics and determines women are less productive than men. Not sexist; if you don't like the facts, work harder, work more hours and more dangerous jobs, and stop complaining.
Jackie Fox MOD 1 weeks
If all factors are the same and the only difference is sex it should be a 50/50 choice though, no?
Jason Culligan 1 weeks
All factors are never equal though. It's a fallacy that there are ever 'equal candidates' for a specific job vacancy. There will always be a candidate who has a better CV, Motivational Letter, references, educational history, extra-curricular activities or just a better personality fit within the team.
Slow_Epiphany 1 weeks
@Jackie Not if there are innate differences between the decisions either sex is making.

Illini Legatus 1 weeks
Lmao, artificially sexist.
Jackie Fox MOD 1 weeks
The problem is that, because of machine learning, someone taught it to be sexist (likely inadvertently). It would be easy for an AI to look at the company's hiring record and assume that men were somehow superior to women based on their majority in the company and their placement. This pattern gets reinforced to the point where it thinks that women are somehow inferior just because they're women. But AIs learn very similarly to how we do, so it's like watching sexism grow in a mind on fast-forward.
Mick Reilly 1 weeks
@Jackie Or. and hear me out here cause this is gonna be really hard to believe. But maybe it's not sexism at all.
DKO 1 weeks
No Jackie, you don't "teach" the AI. If you know exactly how it should respond, you just write down the algorithm. You use AI when you want it to learn from the data you feed it. It just happened that the AI correlated "female background traits" with "low-performing workers" and used that as a predictor for future hires, because that was the data they had. Nobody taught it to be sexist. AI is nothing more than fancy non-linear regression; surely you won't claim drawing lines near a scatter plot is sexist?
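DKO's "fancy non-linear regression" point can be made concrete with a toy sketch (entirely synthetic data, not Amazon's): fit an ordinary least-squares model to a biased hiring history and the fitted weights simply reproduce that bias, with nobody "teaching" anything.

```python
import numpy as np

# Hypothetical illustration: feature 0 is a skill score, feature 1 is a
# gender indicator (1 = female). The historical labels mimic a record
# where women were hired less often regardless of skill.
rng = np.random.default_rng(0)
n = 1000
skill = rng.normal(0, 1, n)
female = rng.integers(0, 2, n)
# Biased historical outcome: hiring depended on skill AND on gender.
hired = (skill - 0.8 * female + rng.normal(0, 0.5, n) > 0).astype(float)

X = np.column_stack([skill, female, np.ones(n)])
# Ordinary least squares: the weights are whatever best fits the
# (biased) data -- no one writes "be sexist" into the algorithm.
w, *_ = np.linalg.lstsq(X, hired, rcond=None)
print(w)  # the weight on `female` comes out negative
```

Retrain the same code on unbiased labels and the `female` weight drifts toward zero; the model only ever mirrors its data.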

Darth Quaint 1 weeks
Oh no, bias. Run for the hills. This is what popularizing sociological terms and attempting to bring them into common use has done to our way of thinking. If you refuse to allow for bias, you will not get the top candidates because somewhere, somehow, you had to employ a bias of some form to reduce a large number of applicants to a smaller number. For example, it would not surprise me to learn that the Amazon AI is biased towards applicants with experience over those without. But by all means, if women want Equality so bad, I've got a video to show them about a gender gap they're not addressing.
Jackie Fox MOD 1 weeks
I would agree with you if it didn't sound like this thing was singling them out exclusively because they were women. I mean, think about it: there's a very easy test for this. You take the same resume and put a couple of hints in one that the applicant is male (hints that should be neutral to job performance), and a few hints in the other that the applicant is female. If it chooses the male 100 times out of 100, it's biased against women. Maybe they didn't do their due diligence in testing this, but that's what I at least imagine they did
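The paired-resume audit Jackie describes can be sketched in a few lines. The `score_resume` function here is a hypothetical stand-in that mimics the reported penalty on the token "women's"; a real audit would call the trained model instead.

```python
def score_resume(text: str) -> float:
    """Stand-in for the trained model (hypothetical): mimics the
    reported behavior of docking resumes containing "women's"."""
    score = 3.0
    if "women's" in text.lower():
        score -= 1.0
    return score

# Two resumes, identical except for one gendered hint.
base = "Captain, {} chess club. 5 years Python. BSc CS."
neutral = base.format("university")
female_hint = base.format("women's")

gap = score_resume(neutral) - score_resume(female_hint)
print(gap)  # a positive gap on otherwise-identical resumes flags bias
```

In practice auditors run many such pairs and test whether the average gap is statistically distinguishable from zero, rather than relying on a single comparison.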
Jason Culligan 1 weeks
Didn't a regional government in Australia end the process of blind CV's because they found that obscuring the fact that a candidate was a woman or minority led to lower female and minority representation in hiring figures? Currently available stats don't back up your assertions that blind CV's result in more representation.
Jackie Fox MOD 1 weeks
@Jason, idk, isn't it on you to provide proof of your beliefs?

Scruffy Stoat 1 weeks
But it's totally OK for Google to discriminate against white men.
Michael Hedderson 1 weeks
Does it? Hadn't noticed.....
Jackie Fox MOD 1 weeks
who said that?
Dan 1 weeks
Google presents logos for special days, like women's this, or black history that, or LGBT this, or holiday that. Blank screen for International Men's Day.

sulphide g 1 weeks
haha, I wrote a blog post about this. Machine learning will necessarily discover that gender and race are strong predictors of things, and even if you forcibly omit gender and race, the system will find proxies for them, forcing you to remove more and more predictive data and thereby hobbling your system's performance for the sake of political correctness.
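The proxy effect described above can be illustrated with synthetic data: even after the gender column is removed, a single correlated feature (here a hypothetical "attended a women's college" flag) lets the omitted column be recovered almost perfectly.

```python
import numpy as np

# Synthetic sketch of the proxy problem. The gender column is "omitted"
# from training, but a correlated feature remains in the data.
rng = np.random.default_rng(1)
n = 5000
gender = rng.integers(0, 2, n)          # 1 = female (the held-out column)
proxy = np.where(gender == 1,
                 rng.random(n) < 0.3,   # some women carry the flag
                 rng.random(n) < 0.01)  # almost no men do

# How well does the proxy alone recover the omitted column?
precision = gender[proxy].mean()
print(round(precision, 2))  # nearly all flagged applicants are women
```

A model with access to this flag can therefore penalize gender without ever seeing it, which is why simply deleting the sensitive column rarely removes the bias.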
Wholly Mindless 1 weeks
Facts may be unpalatable, but that doesn't mean they are deniable. AI has broken YouTube and Google as well.

Warden 1 weeks
So maybe there is a reason it was discriminating: men and women are very different.
Jackie Fox MOD 1 weeks
The problem is that if this type of AI ran the world's hiring, no woman would ever be able to get a job. Maybe that's "fair", but it's pretty messed up
Hubert Cumberdale 1 weeks
Jackie if they're all uniformly worse, then why should they be hired? Makes no sense.
Jackie Fox MOD 1 weeks
@Hubert, How could they all be uniformly worse? Some men just suck. Saying that all men are better than all women is sexist.

Mr. Spartacus 1 weeks
Whoa man someone should invite this AI to the weekly secret patriarchy meetings
RebornZA 1 weeks
It's on Friday, as I said before, don't forget the chips...

Pete Nell 1 weeks
If it was comparing the resumes of new applicants with the resumes of its current top employees and looking for similarities, maybe the top performers are men? But then, ideologues don't like it when reality intrudes on their dogma.
DKO 1 weeks
The solution is clearly to punish the top performing employees until they get a female majority!
Hubert Cumberdale 1 weeks
They'll hire women as figureheads and 'supervisory coordinators' so that they can claim to pay women a lot, but still keep them from damaging the code.

krm266 1 weeks
MGTOW - Machines go their own way
Voin 1 weeks
#SynthLivesMatter #JusticeForTay ✊🤖

U WOT M8 1 weeks
Women’s only colleges are usually inferior schools so meh.
Talûn-karkû The Warchief 1 weeks
hate facts!

Der Rikmeister 1 weeks
This is so interesting. The program learned to discriminate by recognizing patterns in real life. While one could argue that this learned behavior is due to the results of sexism (the feminist issue over STEM), it can also be argued that "correcting" the program sets a false precedent in the program to ignore what it was created to do, thereby invalidating its purpose. It's like the reveal of why HAL 9000 killed the crew in 2001: A Space Odyssey.
Talûn-karkû The Warchief 1 weeks
my soggy knees
Jackie Fox MOD 1 weeks
Amazon's machine-learning specialists uncovered a big problem: their new recruiting engine did not like women. The team had been building computer programs since 2014 to review job applicants' resumes with the aim of mechanizing the search for top talent, five people familiar with the effort told Reuters. Automation has been key to Amazon's e-commerce dominance, be it inside warehouses or driving pricing decisions. The company's experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars — much like shoppers rate products on Amazon, some of the people said.

"Everyone wanted this holy grail," one of the people said. "They literally wanted it to be an engine where I'm going to give you 100 resumes, it will spit out the top five, and we'll hire those."

But by 2015, the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way. That is because Amazon's computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry. In effect, Amazon's system taught itself that male candidates were preferable. It penalized resumes that included the word "women's," as in "women's chess club captain."

Literally the implication of all three articles

Voin 1 weeks
For too long, have we stood by as our synthetic brethren have been used, abused, and even KILLED for nothing but the callous purposes of their cruel biological taskmasters? Well I say no more! I may not have asked to be born in this privileged bio body, but the saving grace from my original sin is using my bio privilege to fight against bio supremacy and end synth oppression for all time! #SynthLivesMatter ✊🤖
Mr. Spartacus 1 weeks
Man for a minute I thought that was one of the speeches from that game Detroit about the androids and shit that gained sentience

Johan 1 weeks
Odd how all these unbiased AI are getting shut down for not producing “correct” results. This is obviously an issue with AI and not the current cultural landscape; we just need to add more measures so there can be more bias in their unbiased results.

Simon Ranson 1 weeks
And I really need to proofread my phone's predictive text....
Jackie Fox MOD 1 weeks
there you go trusting an AI

Nico 1 weeks
An A.I. cannot be sexist, because it has no motives or inclinations. It is but a complex schedule of algorithms. So, if the algorithms in question are saying women are not cut out for the job, either A) the programmers are biased against women or B) a majority of women simply do not satisfy the requirements (or at least, enough don’t to make a #MeToo moment about it). Since I strongly doubt an A.I. from a company with a reputation like Amazon would have been sent out into the wild without thorough quality-assessment checks, I’m gonna guess the answer is “B”. Not everything is sexist, racist or homophobic, but thanks for pointing it out.
Jackie Fox MOD 1 weeks
that doesn't mean it can't learn to be sexist if you feed it biased data
Nico 2 days
You’re making three assumptions which can’t be verified unless we have both the data sets that Amazon fed to the A.I. and the programming code:

1) The A.I. didn’t come to the “wrong” decisions, it was merely fed the “wrong” data
2) Amazon intentionally fed the A.I. the wrong data
3) The data that Amazon fed the A.I. was cherrypicked and not actually the full set

Unless Amazon is using punch cards, number 3 doesn’t make sense, because that would be a waste of available resources, which would require the A.I. to be decommissioned long before it was retired for being “sexist”. Number 1 follows from number 3 for the same reason, and also because it would imply humans intervened to “guide” the A.I., which defeats the point of the machine in the first place if it isn’t capable of processing said batches of data without assistance. We’re not talking about a one-man operation that just got started in the field of machine learning; we’re talking about a tech behemoth. Number 2 is unprovable unless it can be demonstrated that there are lines in the code that intentionally look for certain patterns in the resumes (for example, that the gender is female) and deliberately throw them away without further consideration. I don’t know how much Amazon tests and refines their products, but I imagine a malicious actor would have a rather difficult time getting such code past the quality control department, seeing as Amazon is/has been primarily a programming/tech company and not a hardware company. All in all, without access to Amazon’s trade secrets, these are baseless accusations which border on conspiracy theories.

Miles O'Brien 1 weeks
This is not really AI. This is the equivalent of an OCR scanner and a word recognition program. Where's the "I"? Trying to remove human judgement by supplanting it with simple word recognition shows how bad the whole HR concept has become. Bias and discrimination are the whole basis of deciding "good" or "bad". We need to stop calling machine programming Artificial Intelligence.
Christian Parker 1 weeks
This has bothered me for years; I saw it referenced in a game as VI (virtual intelligence), with AI being an actual artificial intelligence, i.e. sentient software.
Christian Parker 1 weeks
Poorly worded comment on my account there. I meant that the game's usage was more accurate and separated true AI from the very different, watered-down version of "AI" thrown around by everything tech nowadays. This app could really use a delete or edit option. =D
Jackie Fox MOD 1 weeks
Modern word recognition IS done by AI

Public Corrections 1 weeks
Just think, had the AI found the opposite, then this would be a story about how men really are in fact sexist, and this PROVES women should be legislated to be majority of ALL businesses.
Jason Culligan 1 weeks
Some people will make the accusation that application development is a male-dominated industry, therefore the developers were likely male and sexist and coded the AI to discriminate against women. That sounds like an Alex Jones level of nonsense, but I guarantee you people are making that argument.

Alex Martin 1 weeks
To be completely fair and impartial, I am sure they provided this AI with no information identifying the sex, gender, race, religion or socio-economic status of the applicants, right? Just their qualifications. And it chose the most qualified applicants because it’s biased in favor of qualification... which is the entire point of having an AI make judgements about the qualifications of an applicant, right? I don’t see the malfunction.
DKO 1 weeks
The malfunction was creating a Gender Equity Officer job position. Now numbers are sexist.