cyberdick 5mo ago#6499357
AI might actually kill the "right to service" BS, because you can deny people and then say the AI is a black box that you just trained on an unrelated parallel. How can they ever prove that you intended to discriminate?
Someone can very easily test for this, though, by simply feeding a bunch of "in-group" vs. "out-group" cases to the AI and seeing how it performs, to demonstrate its bias. People already do this (prior to release) with things like medical AI, to make sure it doesn't perform particularly poorly for minorities and such.
Legislation may not be there yet for such a case, but I'm sure it would make a pretty easy discrimination suit.
However, civil rights advocates have been disappointed as federal courts have increasingly limited how and when plaintiffs may file disparate-impact claims. As a result, disparate-impact suits have become less successful over time.
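The in-group/out-group audit described above can be sketched in a few lines. Everything here is hypothetical: `black_box_model` is a made-up stand-in for an opaque deployed model, and the applicant data is synthetic; the point is just the comparison of per-group outcomes against the "four-fifths rule" that U.S. agencies use as a disparate-impact screen.

```python
def black_box_model(applicant):
    # Stand-in for an opaque model; it secretly penalizes group "B"
    # by 5 points so the audit has something to detect.
    score = applicant["credit"] - (5 if applicant["group"] == "B" else 0)
    return score >= 600  # True = approve, False = deny

# Matched synthetic applicants: identical credit scores, different group.
applicants = (
    [{"group": "A", "credit": 590 + i} for i in range(20)]
    + [{"group": "B", "credit": 590 + i} for i in range(20)]
)

# Approval rate per group.
rates = {}
for g in ("A", "B"):
    group = [a for a in applicants if a["group"] == g]
    rates[g] = sum(black_box_model(a) for a in group) / len(group)

# Disparate-impact ratio: the "four-fifths rule" treats a ratio
# below 0.8 as prima facie evidence of adverse impact.
ratio = rates["B"] / rates["A"]
print(rates, round(ratio, 2), ratio < 0.8)
```

Note that this tests the model's behavior, not anyone's intent, which is exactly why it lines up with disparate-impact doctrine rather than disparate-treatment claims.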
A lot of biofoids are gonna be taking the L to end all Ls because of this.
this happened already in straya
In Australia, women-only app becomes latest front in war over trans rights
The founder of a women-only social networking app is being sued for denying her service to a trans woman.
https://www.aljazeera.com/news/2024/4/22/in-australia-a-women-only-app-is-latest-front-in-the-war-over-trans-rights
skip to min 20 for the ai scanning !transphobes
if you hurt someone's feelings then you discriminated
https://www.britannica.com/topic/disparate-impact
yeah but literally first paragraph in ur link: