Twitter is learning first-hand about the challenges of eliminating racial bias in algorithms. The social network's Liz Kelley said the company had "more analysis" to do after cryptographic engineer Tony Arcieri ran an experiment suggesting Twitter's image-cropping algorithm was biased in which faces it prioritized. When he attached photos of Barack Obama and Mitch McConnell to tweets, Twitter's preview consistently highlighted McConnell's face; Obama only showed up once Arcieri inverted the images' colors, taking skin color out of the equation.
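For context, the color-inversion step Arcieri used is a one-line operation in most imaging libraries. A minimal Python sketch using Pillow is shown below; the file names are placeholders, and this is an illustration of the technique rather than Arcieri's actual code.

```python
from PIL import Image, ImageOps

# Load the original photo (file name is a placeholder for illustration)
img = Image.open("obama.jpg").convert("RGB")

# Invert the colors so skin tone no longer distinguishes the two faces
inverted = ImageOps.invert(img)
inverted.save("obama_inverted.jpg")
```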
thanks to everyone who raised this. we tested for bias before shipping the model and didn't find evidence of racial or gender bias in our testing, but it’s clear that we’ve got more analysis to do. we'll open source our work so others can review and replicate. https://t.co/E6sZV3xboH
— liz kelley (@lizkelley) September 20, 2020
Kelley said that Twitter had tested for bias before shipping the current algorithm, but "didn't find evidence" of it at the time. She added that Twitter would open source its work so others could "review and replicate" the analysis.
There's no guarantee that Twitter can correct this. The experiment does, however, show the very real dangers of algorithmic bias regardless of intent: it can push people out of the spotlight even when they're central to a social media post or a linked news article. It may be a long while before problems like this are exceptionally rare.