Apple’s Obsession With Women’s Lingerie

A photo-classifying feature in the iPhone’s Photos app has been creating a stir on social media after Twitter users discovered that Apple had been quietly categorising photos of women’s underwear.

The image recognition software, which uses an artificial intelligence algorithm to categorise lingerie photos so that they are easily searchable, has been in place for more than a year. However, it has only now come to the attention of iPhone users.

The image recognition software uses more than 4,000 different categories, but several stand out. They include “bra,” “brassiere,” “girdle,” “bras,” and “corset.” As The Verge, a tech news site, notes, no categories for men’s underwear, such as “briefs” or “boxers,” feature in Apple’s Photos app.

The feature was first documented in June 2016 in a blog post by developer Kenny Yin, though it attracted little attention beyond developer circles at the time.

It works by training the algorithm on thousands of different images of a given item, each paired with a word that describes that item.
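
Apple has not published its internal training pipeline, but the general technique, supervised image classification from labelled example photos, can be sketched with Apple’s public Create ML framework on macOS. The folder paths and category names below are hypothetical placeholders, not Apple’s actual training data.

```swift
import CreateML
import Foundation

// Hypothetical training folder whose subdirectories are named after
// categories, e.g. "corset/", "beach/", "dog/", each containing example photos.
let trainingDir = URL(fileURLWithPath: "/path/to/training-photos")

do {
    // Train a classifier: the model learns to associate the pixels of each
    // example image with the label taken from its folder name.
    let classifier = try MLImageClassifier(
        trainingData: .labeledDirectories(at: trainingDir)
    )

    // Classify a new photo; the prediction is one of the learned category labels.
    let label = try classifier.prediction(
        from: URL(fileURLWithPath: "/path/to/new-photo.jpg")
    )
    print("Predicted category: \(label)")

    // Export the trained model so an app could bundle it and run it on-device.
    try classifier.write(to: URL(fileURLWithPath: "/path/to/PhotoCategories.mlmodel"))
} catch {
    print("Training or prediction failed: \(error)")
}
```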

“Photos app supports detecting 4,432 different scenes and objects. These scenes or objects can be searched for in all languages,” explained Yin in the blog post, which also lists the entire database of objects and scenes.

“Additionally, you can search for various landmarks. For example, Photos can respond for search query of ‘Maho’ (beach in Saint Martin), despite Photos is not programmed or trained to understand specific landmarks [sic].”

The AI algorithm runs locally on each device, so unless an iPhone user’s storage settings automatically upload images to iCloud, the images are not visible to Apple.
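
This on-device approach is visible in Apple’s public Vision framework, which ships a built-in classifier that tags photos without sending them to a server. A minimal sketch, assuming a local image file path, might look like the following; this uses the public API rather than Apple’s internal Photos pipeline.

```swift
import Vision
import Foundation

// Hypothetical local photo; nothing here leaves the device.
let imageURL = URL(fileURLWithPath: "/path/to/photo.jpg")

// VNClassifyImageRequest applies the system's built-in, on-device taxonomy,
// the same general approach Photos uses for searchable categories.
let request = VNClassifyImageRequest()
let handler = VNImageRequestHandler(url: imageURL, options: [:])

do {
    try handler.perform([request])
    let observations = (request.results as? [VNClassificationObservation]) ?? []

    // Print the top predicted labels and their confidence scores.
    for observation in observations.sorted(by: { $0.confidence > $1.confidence }).prefix(5) {
        print("\(observation.identifier): \(observation.confidence)")
    }
} catch {
    print("Classification failed: \(error)")
}
```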

Apple is not the only tech company to be criticised for its photo-classifying algorithms. In 2015, another Twitter user posted a picture of two black people that Google’s Photos app had categorised as “gorillas.”

Google has since apologised for the error. However, it was unable to fix the algorithm entirely and, in the end, removed the gorilla tag from its photo-categorisation software.