
Instagram Introduces Video Selfies To Verify The Age Of Teens

Twitter, Snapchat, Facebook – all of the major social media platforms have been trialing selfie-based video verification. In February 2018, the Federal Trade Commission (FTC) approved an app called MobileMe, which uses AI to verify who is sharing such photos and videos on mobile devices running Apple’s iOS or Android, though not on Facebook or any other platform. The move came in response to Snap’s release last year that let teenage users post selfies even when they could not be reached at home. According to CNBC, the idea came from CEO Brian Acton, who said he wanted to let youngsters share photos through their smartphones rather than simply posting them online.

The App Store review process requires two things to happen. First, adults must approve the application’s “app store review” request. Second, Instagram users will receive confirmation about whether and how to activate the feature. The trial may begin as soon as May 1, and you can download the app here. As you can see from the image below, the feature appears to be available only to iPhone and iPad owners from now until July 1st, with others expected to follow shortly after. There are still no plans to launch support for Google Android, although there are signs the feature could go live even sooner.

Still, for one reason or another, there is hope that Instagram might let its users verify their age by uploading video selfies themselves. These videos could also help enforce the rule requiring people under 18 to show proof of identity, such as a passport, and they could give users a way to prove they are over the age limit once they eventually turn 18. One thing Instagram definitely needs to do is test the system from different angles before letting anyone monetize an account. For older users, Facebook has offered several age-verification tools from the beginning; most notably, it added a feature in the Messenger app that notifies users if someone else tries to purchase a Facebook-owned business. Instagram seems to want to tap into the same methods. Since younger users are less likely to upload content from public accounts, Instagram can use its own verification platform to confirm that the person uploading the video really is a legitimate user in the claimed age group, and it can adjust the account depending on whether the owner is of legal age. The exact timeline isn’t known yet, but the plan is very similar to Twitter’s experiment with verified Tweets.
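To make the idea above concrete, here is a minimal sketch of how a video-selfie age check could gate an account’s settings. This is not Instagram’s actual implementation; the age-estimation call, the 18-year threshold, and the account fields are all assumptions made purely for illustration.

```python
from dataclasses import dataclass
from typing import Optional

ADULT_AGE = 18  # assumed threshold; the real policy may differ


@dataclass
class Account:
    username: str
    estimated_age: Optional[int] = None
    can_monetize: bool = False
    private_by_default: bool = True


def estimate_age_from_selfie(video_path: str) -> int:
    """Stand-in for an external AI age-estimation service.

    A real system would send the video selfie to an estimator and get
    back an age estimate (or a confidence range); this stub returns a
    fixed value so the sketch runs end to end.
    """
    return 17


def apply_age_check(account: Account, video_path: str) -> Account:
    """Update account settings based on the age estimated from a video selfie."""
    age = estimate_age_from_selfie(video_path)
    account.estimated_age = age
    if age >= ADULT_AGE:
        # Adults get full features and a public-by-default profile.
        account.can_monetize = True
        account.private_by_default = False
    else:
        # Minors stay private by default and cannot monetize the account.
        account.can_monetize = False
        account.private_by_default = True
    return account


if __name__ == "__main__":
    teen = apply_age_check(Account(username="example_teen"), "selfie.mp4")
    print(teen)
```

In practice the estimate would come from a third-party service and be combined with other signals, but the gating logic would look broadly like this.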

Image by StockSnap from Pixabay 

It is important for Instagram to get approval before the feature goes live and before it puts up ads or starts collecting data. If kids cannot easily pass the age check simply by using videos posted by their peers on other social networks, then Instagram can enforce the requirement throughout this trial. We have already seen that Instagram’s policies are fairly strict, especially for minors and young users, and we shouldn’t expect anything less when it comes to a peek at a future product. Once everything is set up, it should be easier for Instagram to finally roll out many of the features that were promised earlier this year. So if anyone is interested in creating a private @instagram account, we can only wait and see what happens next.
