Since algorithmic systems “learn” from the examples they’re fed, if they don’t get enough examples of, say, black women during the learning stage, they’ll have a harder time recognizing them when deployed.
Similarly, the authors of the self-driving car study note that a couple of factors are likely fueling the disparity in their case. First, the object-detection models had mostly been trained on examples of light-skinned pedestrians. Second, the models didn’t place enough weight on learning from the few examples of dark-skinned people that they did have.
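The second factor the authors mention — the models not placing enough weight on the few dark-skinned examples they had — is commonly countered with inverse-frequency class weighting, so rare classes contribute as much to the loss as common ones. A minimal sketch (the group labels and the 90/10 split are made up for illustration):

```python
from collections import Counter

def inverse_frequency_weights(labels):
    """Weight each class inversely to how often it appears in the
    training set, so rare classes are up-weighted during learning."""
    counts = Counter(labels)
    total = len(labels)
    # weight_c = total / (num_classes * count_c):
    # common classes get weights < 1, rare classes get weights > 1
    return {c: total / (len(counts) * n) for c, n in counts.items()}

# Hypothetical training set: 90 light-skinned examples, 10 dark-skinned.
labels = ["light"] * 90 + ["dark"] * 10
weights = inverse_frequency_weights(labels)
print(weights)  # light examples down-weighted, dark examples up-weighted
```

Frameworks expose the same idea directly, e.g. per-class weights passed to a loss function; the point is only that without some such correction, the majority group dominates what the model learns.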
The report, “Predictive Inequity in Object Detection,” should be taken with a grain of salt. It hasn’t yet been peer-reviewed. It didn’t test any object-detection models actually being used by self-driving cars, nor did it leverage any training datasets actually being used by autonomous vehicle manufacturers.
18 comments
1 SnapshillBot 2019-03-09
My god I just checked your post history. There is no shaming you. This is literally your life. Just imagine how much you could accomplish if you weren’t addicted to Reddit. You’re making a difference, though!
Snapshots:
I am a bot. (Info / Contact)
1 AJ-Murphy 2019-03-09
Good bot.
1 ParticularDrummer 2019-03-09
That's why the "I" in AI stands for intelligence. And AI hates the unintelligent.
1 Alicesnakebae 2019-03-09
Why aren't they hunting redditors
1 ParticularDrummer 2019-03-09
Because they're locked up in their basement.
1 Justinat0r 2019-03-09
You'd have to go outside to be hunted
1 ThunderCock_Chad 2019-03-09
Yea, they clearly coded "if they're brown run them down" into their systems.
1 throwaway_999912 2019-03-09
👆
1 Wraith_GraveSpell 2019-03-09
Don't they have infrared? I feel like that could solve this.
1 throwaway_999912 2019-03-09
I looked it up on Google and it looks like this IR stuff only works reliably at short range (typically 5 metres), so it seems pretty useless on high-speed vehicles. At 70 mph the braking distance is around 75 metres (96 metres for humans once reaction time is included).
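Those figures roughly check out. A quick sanity check, assuming a dry-road emergency deceleration of about 6.5 m/s² and a 0.67 s reaction time (both assumed values, picked to match the quoted distances):

```python
MPH_TO_MS = 0.44704  # exact miles-per-hour to metres-per-second factor

def stopping_distance(speed_mph, decel=6.5, reaction_time=0.0):
    """Reaction distance (travelled at constant speed) plus
    braking distance v^2 / (2a) under constant deceleration."""
    v = speed_mph * MPH_TO_MS  # speed in m/s
    return v * reaction_time + v ** 2 / (2 * decel)

print(round(stopping_distance(70)))                      # ~75 m, braking alone
print(round(stopping_distance(70, reaction_time=0.67)))  # ~96 m with reaction
```

Either way, both distances are far beyond a 5-metre sensor range, which is the commenter's point.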
1 DachauOvenRepair 2019-03-09
Pretty sure this is part of the software package US police departments are going to employ in their self driving patrol cars. They can toggle an option to "accidentally" floor the gas when a negrozoid comes into view and avoid a lawsuit because it "wasn't intentional"
1 tritter211 2019-03-09
I know drama subscribers ironically act retarded, but in this instance the AI really is biased, because of algorithmic bias.
Because these algorithms learn from huge amounts of input data, they end up absorbing whatever biases are baked into that data (and into its creators) along the way.
Kind of like how 4chan once trained a Twitter AI (Microsoft's Tay) to spout nazi propaganda through sheer volume of shitposts, the models in these safety-critical systems fail to properly identify dark-skinned people because they aren't given enough training data about them.
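One reason this goes unnoticed is that accuracy is usually reported in aggregate; the gap only becomes visible when results are broken out by subgroup, as the study does with Fitzpatrick skin types. A toy sketch with entirely made-up detection results:

```python
def recall_by_group(records):
    """records: (group, was_detected) pairs, one per pedestrian
    actually present. Returns the detection rate per group."""
    hits, totals = {}, {}
    for group, detected in records:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + int(detected)
    return {g: hits[g] / totals[g] for g in totals}

# Hypothetical results: 9/10 light-skinned pedestrians detected, 6/10 dark.
records = ([("light", True)] * 9 + [("light", False)]
           + [("dark", True)] * 6 + [("dark", False)] * 4)
print(recall_by_group(records))  # {'light': 0.9, 'dark': 0.6}
```

The aggregate detection rate here is 75%, which looks fine until it is disaggregated.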
1 DerekSavageCoolCuck 2019-03-09
Explain how machine learning specifically gets such bias.
1 XhotwheelsloverX 2019-03-09
I can't wait for the future
1 ardasyenden 2019-03-09
I’m so excited for the future where everyone has self driving cars
1 Tony49UK 2019-03-09
Alarmist headline and opening paragraphs followed by:
1 eldomtom2 2019-03-09
Not peer reviewed.
Not testing object-detection models actually used by self-driving cars
Not using training datasets actually used by car manufacturers
Vox writes an article anyway
QUALITY JOURNALISM
1 Tzar-Romulus 2019-03-09
My uneducated theory is it's because Basketball Americans jaywalk a lot.