Twitter users will soon be able to crop their own image previews and have better control over how photos appear in the main feed.
The change comes in response to criticism that Twitter’s image-cropping algorithm is biased in which parts of a photo it keeps and which it cuts away.
Currently, users have no control over how an uploaded photo is previewed in the stream.
According to Twitter, work is being done to give users more visibility and control over what images will look like when a tweet is posted.
Currently, the preview shown in the Tweet Composer does not match how the image will appear in users’ feeds.
Twitter intends to correct this:
“We’re prioritizing work to reduce our reliance on ML-based image cropping by giving people more visibility and control over what their images will look like in a tweet.
Going forward, we are committed to following the ‘what you see is what you get’ principle of design, which simply means: the photo you see in the Tweet Composer is what it will look like in the tweet.”
Why did this suddenly become a priority for Twitter?
Here is some background information on the criticism that led to this change.
Twitter image cropping controversy
When an image is uploaded to a tweet and published, it is currently cropped to 600 x 335 pixels.
This is standard regardless of the original dimensions of the image. What is not standard is which section of an image is cropped.
The cropping is done algorithmically, so Twitter may decide to crop near the top, bottom, or middle of the image.
Depending on the original size of the image, a considerable amount of detail can be cut out. This is especially true for images that are taller than they are wide.
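The mechanics described above can be illustrated with a short sketch. This is not Twitter’s actual algorithm; it is a hypothetical helper that computes which region of an original image would survive in a 600 × 335 preview, given a chosen vertical focal point (Twitter’s algorithm picks that focal point automatically; here it is just a parameter):

```python
def preview_crop_box(width, height, focus_y=0.5, target=(600, 335)):
    """Return (left, top, right, bottom) of the region of the original
    image that would remain visible in a fixed-size preview, assuming the
    image is scaled to cover the target and then cropped around a vertical
    focal point. focus_y = 0.0 keeps the top, 0.5 the middle, 1.0 the
    bottom. Illustrative only -- not Twitter's real cropping logic.
    """
    tw, th = target
    scale = max(tw / width, th / height)    # scale so the image covers the target
    vis_w, vis_h = tw / scale, th / scale   # visible region, in original pixels
    left = (width - vis_w) / 2              # center horizontally
    top = (height - vis_h) * focus_y        # focal point picks the vertical slice
    return (round(left), round(top), round(left + vis_w), round(top + vis_h))

# A tall 600 x 1135 image: a center crop discards roughly 800 vertical
# pixels, split between the top and bottom of the photo.
print(preview_crop_box(600, 1135))               # middle slice kept
print(preview_crop_box(600, 1135, focus_y=0.0))  # top slice kept
```

For a tall portrait photo, moving `focus_y` from 0.0 to 1.0 slides the kept 335-pixel-high window from the top of the image to the bottom, which is exactly the choice the algorithm makes on the user’s behalf today.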
Of course, the full image is displayed when a user clicks on the cropped preview. But, like links, image previews in tweets are not always clicked.
A series of repeatable tests appeared to show a bias in what Twitter’s algorithm prefers in an image preview.
Simply put, Twitter’s image preview algorithm seemed to focus more on white faces than Black faces.
A number of tweets demonstrating the apparent trend went viral last month.
There were examples with people in photos:
Test this to see if it’s real. pic.twitter.com/rINjaNvXaj
– Jef Caine (@JefCaine) September 19, 2020
There were examples with fictional characters:
I wonder if Twitter does this for fictional characters as well.
Lenny Carl pic.twitter.com/fmJMWkkYEf
– Jordan Simonovski (@_jsimonovski) September 20, 2020
And there were even examples with dogs:
I’ve tried dogs. Let’s see. pic.twitter.com/xktmrNPtid
– – MARK – (@MarkEMarkAU) September 20, 2020
Twitter’s response
Twitter admits that it could have done better when designing its image preview algorithm:
“While our analyses to date haven’t shown racial or gender bias, we recognize that the way we automatically crop photos means there is a potential for harm. We should have anticipated this possibility when we first designed and built this product.”
Twitter intends to make changes to address the apparent bias shown in the examples above:
“We are aware of our responsibility and want to work towards making it easier for everyone to understand how our systems work. While no system can be completely free of bias, we will continue to minimize bias through deliberate and thorough analysis, and we will share updates as we make progress in this area.”
The exact changes Twitter is making and when they will be introduced are currently unknown.
The company is currently developing a solution. “There’s a lot to do,” says Twitter.
That measured approach is probably better than rushing out an update and potentially making the situation worse.
Twitter says it will share additional updates as they become available.