dailyO
Technology

No, Apple isn't creating folders for 'semi-nude' pictures on your iPhone

Sushant Talwar | Nov 01, 2017, 18:45

Machine learning and Artificial Intelligence (AI) are the new buzzwords of the smartphone industry. Of late, manufacturers have been using them in the hope of bringing new customers on board, and — to an extent — they have succeeded in demonstrating the benefits of giving our pocket computers the ability to learn and become competent on their own. But we frequently encounter incidents that make us wonder if the hype is one big joke. Some reports doing the rounds blame Apple for using machine learning to save brassiere and cleavage pictures in a special folder on the phone and on its iCloud servers. But if you are an iPhone user, know that this feature is neither new nor insidious.


Your iOS devices have been categorising such "semi-nude" pictures for a while now. More importantly, these pictures are only being sorted with a special tag inside the phone's Photos app and not saved separately in a special folder that Apple or anybody else has access to.


So where did it start?

They say there's no smoke without fire, but that's not exactly true in this case.

When Twitter user @ell, who has close to 20k followers on the platform, realised that a quick search for the word "brassiere" inside the phone's Photos app threw up all her semi-nude pictures, it was suggested that Apple was collecting them for some ulterior purpose.

It didn't end there. This tweet soon got over 10k retweets and others joined in.   


There was genuine concern and outrage over what was being viewed as Apple violating the privacy of its users. In reality, though, it was nothing like that.

So what is it all about?

As mentioned above, the iPhone's OS has been categorising pictures of this kind for a while now. Interestingly, this categorisation is not unique to such images; it is part of a feature rolled out in 2016 along with iOS 10.

Apple introduced this feature last year, allowing the Photos app to sift through pictures and identify the faces and common objects in them to improve categorisation. Newer versions of iOS use an on-device machine learning algorithm to add relevant tags to all the pictures on your phone, making them easier to find.

Importantly, all of this happens locally on the phone, which makes it safer than approaches where such machine learning tasks are carried out on remote servers. All Apple does is add tags to your pictures to make sifting through them easier.

Not invasive

So, a picture of your dog will have a corresponding tag, one with a cake will have a tag to reflect that, and so on. Apple has included many other category tags such as "trees", "shoes" or "hats".
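To see why tagging is harmless, it helps to picture what a tag-based photo search looks like under the hood. The sketch below is purely illustrative — the photo names, tags and functions are invented for this example and have nothing to do with Apple's actual implementation; it only shows the general idea of inverting photo-to-tag data into a searchable index.

```python
# Toy illustration of tag-based photo search (NOT Apple's implementation).
# A classifier has already attached descriptive tags to each photo;
# searching is then just a lookup in an inverted index.

from collections import defaultdict

# Hypothetical tags a classifier might have attached to each photo.
photo_tags = {
    "IMG_0001.jpg": {"dog", "park"},
    "IMG_0002.jpg": {"cake", "birthday"},
    "IMG_0003.jpg": {"dog", "beach", "hat"},
}

def build_index(tags_by_photo):
    """Invert photo -> tags into tag -> photos for fast lookup."""
    index = defaultdict(set)
    for photo, tags in tags_by_photo.items():
        for tag in tags:
            index[tag].add(photo)
    return index

def search(index, query):
    """Return all photos carrying the queried tag, sorted by name."""
    return sorted(index.get(query.lower(), set()))

index = build_index(photo_tags)
print(search(index, "dog"))  # → ['IMG_0001.jpg', 'IMG_0003.jpg']
```

The point of the sketch is that the photos themselves never move anywhere: only lightweight labels are attached, and a search simply looks those labels up on the device.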


The feature, then, is anything but the invasive or insidious tool it has recently been made out to be; even though it works in the background, your photos are safe. Nor is this something specific to Apple: similar image recognition technology is used on Android in Google Photos. So calm down, and breathe easy.

Our friendly neighbourhood tech giants are not taking advantage of our data in this case. 

Last updated: November 02, 2017 | 09:47