
There's A Big Disconnect In The Way Apple Enforces Its Anti-Porn Policy (AAPL)

[Photo: Tim Cook]

After an outbreak of user-generated porn on Twitter's new video-sharing app Vine, there's been a big debate about how such content should be managed by Apple, which explicitly prohibits porn in apps for iPhone and iPad.

But much of it misses the point.

The story here isn't the porn on Vine or other apps with user-generated content. The story isn't that Safari can look up porn, too. (It's an old joke that if Apple hates porn, it should ban its own Web browser.) The story is the disconnect between Apple's written policy on pornographic content and the way it enforces that policy.

Apple's App Store guidelines are very clear about pornography in apps:

Apps containing pornographic material, defined by Webster's Dictionary as "explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings", will be rejected

Apps that contain user generated content that is frequently pornographic (ex "Chat Roulette" Apps) will be rejected 

But how Apple enforces those guidelines isn't so clear. A few examples:

Some apps, such as Web browsers like Google Chrome, carry a warning before downloading that says the app could have "mature/suggestive content." Other apps like Twitter and Tumblr, where you can easily find porn, do not have such a warning.

Apple has a history of removing other apps that make it easy to find pornographic user-generated content.

For example, it removed Viddy, a video-sharing app that's very similar to Vine, last year because there was a bunch of porn in the app. It removed photo-sharing app 500px last week because the app had a lot of nude photos. Viddy eventually returned to the App Store after making some adjustments. 500px's COO Evgeny Tchebotarev told us a new version of the app was submitted to Apple on Friday with new filters that help weed out nude photos.

Vine clearly has a lot of "user generated content that is frequently pornographic," yet the app is still available for download in the App Store. It was featured as an "Editor's Pick" at the top of the iPhone version of the App Store, but it disappeared from that spot this afternoon. Apple won't comment on the issue, so we can't tell whether that change is related to the porn controversy.

So now for the big question: Why is Vine getting preferential treatment and remaining in the App Store, even though it's clearly violating Apple's anti-porn guidelines?

Again, Apple declined to comment for this story or on anything else related to the matter. Twitter did not respond to repeated requests for comment.

So we're left to speculate.

Here's what we do know.

We know Twitter and Apple have a very close relationship. Twitter is built into Apple's iOS operating system for iPhones and iPads. It's also built into the latest version of Mac OS X, the operating system for Apple laptops and desktops.  

Twitter appears to be getting preferential treatment in the App Store when it comes to pushing out new updates to Vine. There have already been three versions of Vine since it launched last week, a rare occurrence for a brand new app.

We know Vine has a reporting feature that you can use to flag inappropriate content to Twitter. Twitter said in a statement yesterday that it will remove flagged videos that show pornographic content and other lewd material. 

Finally, we know Vine began blocking searches today for hashtags that commonly feature pornographic content including "#sex," "#porn," and "#boobs."

All of that could work in Twitter's favor and shield Vine from being yanked by Apple.

Why does it matter? 

Apple's App Store is a big deal. Over the years it's spawned millionaires and enriched billionaires. It's created video game empires. And a lot of those apps, most notably Instagram, which Facebook bought for roughly $1 billion, rely on user-generated content. It's very easy for pornographic images to sneak onto such apps.

Without a clear description of how Apple enforces its anti-porn policy, startups are left guessing. There's a lot at stake.

So what's the answer?

Tchebotarev has a good take on the issue. In an email, he said there are plenty of apps still available for download in the App Store that provide easy access to pornographic content. The best solution, he argues, is a consistent review and monitoring process for apps.

"I believe that as long as there's a consistency and transparency in how apps are reviewed, it would make it a lot easier for developers and users alike," Tchebotarev said.

Another, and much more likely, scenario comes from iMore's Rene Ritchie, who says Apple could add a ratings system or warning for apps like Vine that carry a lot of user-generated content.

"I say slap the same mature content warning on social sharing that they slap on full blown web access," Ritchie wrote. "It's scary, and it's not ideal, but it's pragmatic given the system as it stands."

Sounds good to us.
