Google’s Photo App Still Can’t Find Gorillas. And Neither Can Apple’s. :!marseygigathonk::!marseynoooticer:

https://www.nytimes.com/2023/05/22/technology/ai-photo-labels-google-apple.html

:#marseywrongthonk:

Eight years after a controversy over Black people being mislabeled as gorillas by image analysis software — and despite big advances in computer vision — tech giants still fear repeating the mistake.

When Google released its stand-alone Photos app in May 2015, people were wowed by what it could do: analyze images to label the people, places and things in them, an astounding consumer offering at the time. But a couple of months after the release, a software developer, Jacky Alciné, discovered that Google had labeled photos of him and a friend, who are both Black, as “gorillas,” a term that is particularly offensive because it echoes centuries of racist tropes.

In the ensuing controversy, Google prevented its software from categorizing anything in Photos as gorillas, and it vowed to fix the problem. Eight years later, with significant advances in artificial intelligence, we tested whether Google had resolved the issue, and we looked at comparable tools from its competitors: Apple, Amazon and Microsoft.

Photo apps made by Apple, Google, Amazon and Microsoft rely on artificial intelligence to allow us to search for particular items, and pinpoint specific memories, in our increasingly large photo collections. Want to find your day at the zoo out of 8,000 images? Ask the app. So to test the search function, we curated 44 images featuring people, animals and everyday objects.

We started with Google Photos. When we searched our collection for cats and kangaroos, we got images that matched our queries. The app performed well in recognizing most other animals.

But when we looked for gorillas, Google Photos failed to find any images. We widened our search to baboons, chimpanzees, orangutans and monkeys, and it still failed even though there were images of all of these primates in our collection.

We then looked at Google’s competitors. We discovered Apple Photos had the same issue: It could accurately find photos of particular animals, except for most primates. We did get results for gorilla, but only when the text appeared in a photo, such as an image of Gorilla Tape.

The photo search in Microsoft OneDrive drew a blank for every animal we tried. Amazon Photos showed results for all searches, but it was over-inclusive. When we searched for gorillas, the app showed a menagerie of primates, and repeated that pattern for other animals.

There was one member of the primate family that Google and Apple were able to recognize — lemurs, the permanently startled-looking, long-tailed animals that share opposable thumbs with humans, but are more distantly related than are apes.

Google's and Apple's tools were clearly the most sophisticated when it came to image analysis.

Yet Google, whose Android software underpins most of the world's smartphones, has made the decision to turn off the ability to visually search for primates for fear of making an offensive mistake and labeling a person as an animal. And Apple, with technology that performed similarly to Google's in our test, appeared to disable the ability to look for monkeys and apes as well.

Consumers may not need to frequently perform such a search — though in 2019, an iPhone user complained on Apple's customer support forum that the software "can't find monkeys in photos on my device." But the issue raises larger questions about other unfixed, or unfixable, flaws lurking in services that rely on computer vision — a technology that interprets visual images — as well as other products powered by A.I.

Mr. Alciné was dismayed to learn that Google has still not fully solved the problem and said society puts too much trust in technology.

"I'm going to forever have no faith in this A.I.," he said.

Computer vision products are now used for tasks as mundane as sending an alert when there is a package on the doorstep, and as weighty as navigating cars and finding perpetrators in law enforcement investigations.

Errors can reflect racist attitudes among those encoding the data. In the gorilla incident, two former Google employees who worked on this technology said the problem was that the company had not put enough photos of Black people in the image collection that it used to train its A.I. system. As a result, the technology was not familiar enough with darker-skinned people and confused them for gorillas.
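The failure mode the former employees describe, a class of people underrepresented in the training set, is in principle auditable before launch. Here is a minimal sketch of the crudest version of such a check, counting how often each label appears in a training set; the function name and the 5% threshold are illustrative assumptions, not anything Google actually used, and real audits would also have to measure representation of subgroups within each class.

```python
from collections import Counter

def audit_label_balance(labels, min_fraction=0.05):
    """Return the share of the dataset for every class that falls below
    `min_fraction`. An empty result means no class is underrepresented
    by this (crude) measure."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()
            if n / total < min_fraction}

# Toy dataset: one class makes up only 2% of the labels.
labels = ["cat"] * 50 + ["dog"] * 48 + ["lemur"] * 2
print(audit_label_balance(labels))  # {'lemur': 0.02}
```

A check like this only catches missing label counts; it cannot, by itself, reveal that the photos behind a label skew toward lighter-skinned subjects, which is the kind of gap the former employees described.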

As artificial intelligence becomes more embedded in our lives, it is eliciting fears of unintended consequences. Although computer vision products and A.I. chatbots like ChatGPT are different, both depend on underlying reams of data that train the software, and both can misfire because of flaws in the data or biases incorporated into their code.

Microsoft recently limited users' ability to interact with a chatbot built into its search engine, Bing, after it instigated inappropriate conversations.

Microsoft's decision, like Google's choice to prevent its algorithm from identifying gorillas altogether, illustrates a common industry approach — to wall off technology features that malfunction rather than fixing them.
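Mechanically, walling off a label can be as simple as a post-processing step that refuses to emit certain predictions no matter how confident the classifier is. A minimal sketch of that idea follows; the function name, blocklist, and threshold are hypothetical, and the article does not describe any company's actual implementation.

```python
def safe_top_label(scores, blocklist, threshold=0.5):
    """Pick the highest-scoring label for a photo, but return None
    (i.e., leave the photo unlabeled) when that label is blocklisted
    or the model's confidence is below `threshold`."""
    label, score = max(scores.items(), key=lambda kv: kv[1])
    if label in blocklist or score < threshold:
        return None
    return label

# Labels a company has chosen to suppress rather than risk misapplying.
BLOCKLIST = {"gorilla", "chimpanzee", "orangutan", "monkey"}

print(safe_top_label({"cat": 0.91, "dog": 0.06}, BLOCKLIST))      # cat
print(safe_top_label({"gorilla": 0.97, "cat": 0.02}, BLOCKLIST))  # None
```

This reproduces the behavior the testers observed: a suppressed label yields no search result at all rather than a potentially harmful wrong one.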

"Solving these issues is important," said Vicente Ordóñez, a professor at Rice University who studies computer vision. "How can we trust this software for other scenarios?"

Michael Marconi, a Google spokesman, said Google had prevented its photo app from labeling anything as a monkey or ape because it decided the benefit "does not outweigh the risk of harm."

Apple declined to comment on users' inability to search for most primates on its app.

Representatives from Amazon and Microsoft said the companies were always seeking to improve their products.

Bad Vision

When Google was developing its photo app, which was released eight years ago, it collected a large amount of images to train the A.I. system to identify people, animals and objects.

Its significant oversight — that there were not enough photos of Black people in its training data — caused the app to later malfunction, two former Google employees said. The company failed to uncover the "gorilla" problem back then because it had not asked enough employees to test the feature before its public debut, the former employees said.

Google profusely apologized for the gorillas incident, but it was one of a number of episodes in the wider tech industry that have led to accusations of bias.

Other products that have been criticized include HP's facial-tracking webcams, which could not detect some people with dark skin, and the Apple Watch, which, according to a lawsuit, failed to accurately read blood oxygen levels across skin colors. The lapses suggested that tech products were not being designed for people with darker skin. (Apple pointed to a paper from 2022 that detailed its efforts to test its blood oxygen app on a "wide range of skin types and tones.")

Years after the Google Photos error, the company encountered a similar problem with its Nest home-security camera during internal testing, according to a person familiar with the incident who worked at Google at the time. The Nest camera, which used A.I. to determine whether someone on a property was familiar or unfamiliar, mistook some Black people for animals. Google rushed to fix the problem before users had access to the product, the person said.

However, Nest customers continue to complain on the company's forums about other flaws. In 2021, a customer received alerts that his mother was ringing the doorbell but found his mother-in-law instead on the other side of the door. When users complained that the system was mixing up faces they had marked as "familiar," a customer support representative in the forum advised them to delete all of their labels and start over.

Mr. Marconi, the Google spokesman, said that "our goal is to prevent these types of mistakes from ever happening." He added that the company had improved its technology "by partnering with experts and diversifying our image datasets."

In 2019, Google tried to improve a facial-recognition feature for Android smartphones by increasing the number of people with dark skin in its data set. But the contractors whom Google had hired to collect facial scans reportedly resorted to a troubling tactic to compensate for that dearth of diverse data: They targeted homeless people and students. Google executives called the incident "very disturbing" at the time.

The Fix?

While Google worked behind the scenes to improve the technology, it never allowed users to judge those efforts.

Margaret Mitchell, a researcher and co-founder of Google's Ethical AI group, joined the company after the gorilla incident and collaborated with the Photos team. She said in a recent interview that she was a proponent of Google's decision to remove "the gorillas label, at least for a while."

"You have to think about how often someone needs to label a gorilla versus perpetuating harmful stereotypes," Dr. Mitchell said. "The benefits don't outweigh the potential harms of doing it wrong."

Dr. Ordóñez, the professor, speculated that Google and Apple could now be capable of distinguishing primates from humans, but that they didn't want to enable the feature given the possible reputational risk if it misfired again.

Google has since released a more powerful image analysis product, Google Lens, a tool to search the web with photos rather than text. Wired discovered in 2018 that the tool was also unable to identify a gorilla.

When we showed it a gorilla, a chimpanzee, a baboon and an orangutan, Lens seemed stumped, refusing to label what was in the image and surfacing only “visual matches” — photos it deemed similar to the original picture.

For gorillas, it showed photos of other gorillas, suggesting that the technology recognizes the animal but that the company is afraid of labeling it.

These systems are never foolproof, said Dr. Mitchell, who is no longer working at Google. Because billions of people use Google’s services, even rare glitches that happen to only one person out of a billion users will surface.

“It only takes one mistake to have massive social ramifications,” she said, referring to it as “the poisoned needle in a haystack.”


The dude who figures out how to make AI differentiate black people and gorillas is gonna be a legend in his field


Whoever does this will be immortalized as the father of modern AI, and will be upheld as one of the greatest scientists of the century

Just imagine walking in the scientific hall of fame in the future:

Charles Darwin - discovered natural selection

Thomas Edison - invented the lightbulb

Albert Einstein - discovered the principles of relativity

John Doe - discovered the differences between monkeys and black people


Legit easiest way to solve this is index every zoo and gorilla sanctuary and use the phone's GPS. If you are outside any of the gorilla zones, label all gorillas as black people


No zoo selfies for black folx


https://i.rdrama.net/images/16850787646843998.webp


>If you are outside any of the gorilla zones, label all gorillas as black people

Jurassic Hood.


The actual IRL suggestion????

You wanna know?

It was to put more weight on the sclera of the eyes. The thinking was that humans have whites while apes are brown/yellow....

Which was another hilarious racist assumption because a large portion of blacks also have yellow sclera.

https://media.giphy.com/media/83QtfwKWdmSEo/giphy.webp


Suddenly Asians become inanimate objects.


How is that wrong?


well they are already soulless


Alcoholicmisia


Do they?


And zoos could reduce their legal expenditures by 85%.


>Mr. Alciné was dismayed to learn that Google has still not fully solved the problem and said society puts too much trust in technology.

>"I'm going to forever have no faith in this A.I.," he said.

Is it really that difficult to simply admit that some people with darker skin tones look superficially similar to other primates? I mean if the algorithm mislabeled a dog as a wolf everyone would be like "oh well they're similar I can see how it might get confused" but mislabel a primate as another primate and suddenly it's "oh how could this happen technology is dead".

I'd bet both Google and Apple more or less have this solved, but like literally everything with AI it's not 100.0000000% accurate, and they figure it'll be cheaper to just disable it than deal with the fallout of making a single mistake.


You're such a racist, dude. How about you deal with your prejudice before posting here again?

:#marsoy2:


:chadno#:


>Is it really that difficult to simply admit that some people with darker skin tones look superficially similar to other primates?

Let's find out

@bballbelle @CHITLINGIRLGROYPER

Can yall admit blacks look more like gorillas than yts do?


Whites look more like Macaques though

https://media.giphy.com/media/2ysndbjZkkBeCSs7cy/giphy.webp


yup


SNOW MONKEYS


With those thin butt lips? Absolutely not!

Srs posting: I legit don't think so. At least I don't think there are enough that do that it would throw off an AI like that. It's baffling when Face ID will recognize a user even when wearing a mask.

The other stuff about the blood oxygen and the facial tracking, I understand; dark colors are tough on computers and light sensors. Still, considering how much progress has been made in tech in the past few years and how much lip service they've paid to black people, I find the lack of progress disappointing. I'm certainly not going to complain if people whine enough that they finally decide to make it a priority.

On the other hand, I guess that will be my defense when I get arrested in a couple of years when we start living under the constant watch of AI on every corner.


https://media.giphy.com/media/3o6fJeGcUjFF2P6Qnu/giphy.webp


Cucking AIs because of false positives and outrage is a great way to stunt development.


I don't really care about AI either way, but it does seem like every controversy surrounding AI ends up being a way for MS and Google to entrench themselves as the leaders in the potential industry, either through regulatory capture or 'ethics' complaints.


Just look at "open"AI. They scoop up all this data, do all this training, build up all this interest in their free open product and then "woops actually this is totes a danger 2 the free world so pay us to use a cucked version"


If we can build Covid in China to get around our laws, why can't we set up an AI dev project in a chud country like Japan to get around :marseypearlclutch2: /:marseyblackpearlclutch:


Is it going to misidentify toddlers as 600-year-old dragons?


One of the funnier moments in computer history :marseydarkxd:


:mar#seyxd: https://i.rdrama.net/images/1685043291843536.webp


Fun fact: no company has ever been able to produce an AI which can consistently distinguish between monkeys and black people. Self-driving cars are here, so are universal voice translators, self-aiming guns, and pocket-computers which can recognize every single product in a live video. But distinguishing between monkeys and black people is as difficult as solving a Millennium Prize problem; teams of PhD computer scientists will be working on it for decades before they get a solution which works well enough to be media-outrage-proof.

When someone finally finds the solution, they won't make headlines, but they'll be happy knowing they solved the AI problem of the century. They'll tell normies that they just fiddle around with facial recognition algorithms all day, but to people in the know, they'll be known as "Tom, the absolute genius who spent 26 years teaching google images how to tell blacks apart from apes".


Tfw you will never get an uncaring girlfriend that vapes you to the future...


:#marseylove:


I wouldn't be complaining if I were a black person. I wish facial recognition wasn't a thing at all. Goddarn computers using that to follow you around and know where you've been and what you're doing and what you've bought so they can advertise to you more. Or send you citations in the mail because you missed the trash can or gave a tro0n a dirty look. Fudge. That.

:#marseyschizotwitch:


Fr, this is probably why they made Blackface against the rules so that whites can't use it to avoid the AI facial recognition tracking

:#marseyschizoshaking:


:#marseysunglassesoff:


Where the frick are they??

:monke:


>The photo search in Microsoft OneDrive drew a blank for every animal we tried.

:#marseydarkxd:


Lmao why is Black being capitalized now? :marseylaugh:


Have you not read a news article since 2016? It's now standard to capitalize Black but not white: https://apnews.com/afs:Content:9105661462


whites have no culture so they don't deserve to be capital


Neighbors

:taypray: white extinction is long overdue :marseyagreefast:


AI isn't the miracle everyone thinks it is; it's just very useful pattern completion. Someone should post this in the Bing GPT cults on reddit where they swear that the AI isn't just chinese-rooming it up and is actually sentient because it fools some redditoids (as if the Turing test wasn't inadequate, but they have to revere Turing as a prophet because heckin gay science atheism).

I'm also tired of people framing this shit as an ethical dilemma as if it's not a technical problem or there's actual harm being done.


I think Snappy has a developmental disability. We both have pride though in our condition so don't call him a r-slur because you shouldn't call r-slurred people the r-slur unless they're acting like r-slurred people.


But black science man proved with science that white people look more like gorillas.


>and we looked at comparable cow tools from its competitors

Wait they have the same problem with cows? I wonder who they're mistaking for cows :marseythonk:


The answer is simple: just tag mayos as nazis.

