Racist self-driving cars might mow down black people.

38  2019-03-09 by DismayAndUpset

18 comments

My god I just checked your post history. There is no shaming you. This is literally your life. Just imagine how much you could accomplish if you weren’t addicted to Reddit. You’re making a difference, though!

Snapshots:

  1. This Post - archive.org, megalodon.jp, archive.is

I am a bot. (Info / Contact)

Good bot.

That's why the "I" in AI stands for intelligence. And AI hates the unintelligent.

Why aren't they hunting redditors?

Because they're locked up in their basement.

You'd have to go outside to be hunted

Yea, they clearly coded "if they're brown run them down" into their systems.

Since algorithmic systems “learn” from the examples they’re fed, if they don’t get enough examples of, say, black women during the learning stage, they’ll have a harder time recognizing them when deployed.

Similarly, the authors of the self-driving car study note that a couple of factors are likely fueling the disparity in their case. First, the object-detection models had mostly been trained on examples of light-skinned pedestrians. Second, the models didn’t place enough weight on learning from the few examples of dark-skinned people that they did have.
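To make that second factor concrete, here's a minimal sketch of per-example reweighting, where examples from the rare group are up-weighted so each group contributes comparably to the training loss. This is my illustration, not the study's method, and the arrays are hypothetical stand-ins for a real dataset:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical stand-ins for a real dataset: features, pedestrian
# labels, and which demographic group each example depicts.
X = rng.normal(size=(1000, 8))
y = (X[:, 0] > 0).astype(int)
group = (rng.random(1000) < 0.05).astype(int)  # ~5% minority group

# Weight each example inversely to its group's frequency:
# weight_g = N / (n_groups * N_g), so the rare group counts for more.
counts = np.bincount(group)
weights = len(group) / (len(counts) * counts[group])

clf = LogisticRegression()
clf.fit(X, y, sample_weight=weights)  # sklearn supports per-sample weights
```

With ~5% of examples in the minority group, each of its samples gets roughly a 10x weight while majority samples get about 0.5x, so both groups pull on the loss with similar total force.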

👆

Don't they have infrared? I feel like that could solve this.

I looked it up on Google and it looks like this IR stuff works reliably only at short range (typically 5 metres), so it seems pretty useless on high-speed vehicles. At 70 mph, the braking distance is around 75 metres (96 metres for a human driver once reaction time is included).
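Those numbers check out with basic constant-deceleration kinematics. Quick sanity check below; the ~6.55 m/s² deceleration and 0.67 s reaction time are my assumptions, back-solved to match the Highway Code style figures quoted above:

```python
# Rough sanity check of the stopping-distance figures above.
MPH_TO_MS = 0.44704

def stopping_distance(speed_mph: float,
                      decel: float = 6.55,         # m/s^2, assumed
                      reaction_time: float = 0.67  # s, assumed
                      ) -> tuple[float, float]:
    """Return (braking distance, total stopping distance) in metres."""
    v = speed_mph * MPH_TO_MS        # speed in m/s
    braking = v ** 2 / (2 * decel)   # kinematics: v^2 = 2*a*d
    thinking = v * reaction_time     # distance covered before braking starts
    return braking, braking + thinking

braking, total = stopping_distance(70)
print(f"braking: {braking:.0f} m, total: {total:.0f} m")
# -> braking: 75 m, total: 96 m
```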

Pretty sure this is part of the software package US police departments are going to employ in their self driving patrol cars. They can toggle an option to "accidentally" floor the gas when a negrozoid comes into view and avoid a lawsuit because it "wasn't intentional"

I know drama subscribers act dumb ironically, but in this instance the AI genuinely is exhibiting algorithmic bias.

Because these algorithms learn from huge amounts of input data, they end up absorbing whatever biases are baked into that data (and into the choices of whoever assembled it).

Kind of like that time 4chan trained a Twitter chatbot (Microsoft's Tay) to spout Nazi propaganda through sheer volume of shitposts: the models in these safety-critical systems tend not to identify dark-skinned people properly because they aren't given enough training data featuring them.
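A toy demonstration of that mechanism (entirely my sketch: synthetic 2-D features and a linear classifier, nothing from the actual study): train a "pedestrian detector" on data that is 95% group A, then measure recall per group.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

rng = np.random.default_rng(42)

def sample(n, ped_mean):
    """n background points at the origin, n 'pedestrian' points at ped_mean."""
    X = np.vstack([rng.normal(0.0, 1.0, (n, 2)),
                   rng.normal(ped_mean, 1.0, (n, 2))])
    y = np.array([0] * n + [1] * n)
    return X, y

# Group A pedestrians differ from the background along one axis,
# group B along the other; the model must learn both to serve both.
Xa, ya = sample(950, ped_mean=(3.0, 0.0))  # 95% of training data
Xb, yb = sample(50,  ped_mean=(0.0, 3.0))  # 5% of training data

clf = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

# Evaluate pedestrian recall separately per group on fresh data.
for name, mean in [("A", (3.0, 0.0)), ("B", (0.0, 3.0))]:
    Xt, yt = sample(2000, ped_mean=mean)
    print(f"group {name}: pedestrian recall = {recall_score(yt, clf.predict(Xt)):.2f}")
# Group A's recall comes out high and group B's noticeably lower:
# the detector misses the group it rarely saw during training,
# with no "if they're brown run them down" anywhere in the code.
```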

Explain how machine learning specifically gets such bias.

I can't wait for the future

I’m so excited for the future where everyone has self-driving cars

Alarmist headline and opening paragraphs followed by:

The report, “Predictive Inequity in Object Detection,” should be taken with a grain of salt. It hasn’t yet been peer-reviewed. It didn’t test any object-detection models actually being used by self-driving cars, nor did it leverage any training datasets actually being used by autonomous vehicle manufacturers. 

  • Not peer-reviewed

  • Not testing object-detection models actually used by self-driving cars

  • Not using training datasets actually used by car manufacturers

  • Vox writes an article anyway

QUALITY JOURNALISM

My uneducated theory is it's because Basketball Americans jaywalk a lot.