Twitter is investigating after users discovered its picture-cropping algorithm sometimes prefers white faces to black ones.
Users noticed that when two photos – one of a black face, the other of a white one – were in the same post, Twitter often showed only the white face on mobile.
Twitter said it had tested for racial and gender bias during the algorithm’s development.
But it added: “It’s clear that we’ve got more analysis to do.”
Twitter’s chief technology officer, Parag Agrawal, tweeted: “We did analysis on our model when we shipped it – but [it] needs continuous improvement. Love this public, open, and rigorous test – and eager to learn from this.”
The latest controversy began when university manager Colin Madland, from Vancouver, was troubleshooting a problem in which a colleague’s head vanished when using the videoconference app Zoom.
The software was apparently mistakenly identifying the black man’s head as part of the background and removing it.
But when Mr Madland posted about the topic on Twitter, he found his face – and not his colleague’s – was consistently chosen as the preview on mobile apps, even if he flipped the order of the images.
His discovery prompted a range of other experiments by Twitter users testing the cropping algorithm with different image pairs.
Twitter’s chief design officer, Dantley Davis, found editing out Mr Madland’s facial hair and glasses seemed to correct the problem – “because of the contrast with his skin”.
Responding to criticism, he tweeted: “I know you think it’s fun to dunk on me – but I’m as irritated about this as everyone else. However, I’m in a position to fix it and I will.”
“It’s 100% our fault. No-one should say otherwise.”