Unlike its parent company Facebook, Instagram has never required age verification from its users until now. That's about to change, as the popular social media platform strives to take more responsibility for protecting underage kids online. Under its new policy, Instagram will ask new users to input their date of birth before registering and bar children below 13 years of age from joining. However, since the new age restriction does not apply to Instagram's pre-existing one billion members, any underage kids already on the platform automatically escape scrutiny. Of course, there's always the age-old trick of entering a fake birth date (hello, flashbacks of creating a Facebook account years ago). Apart from this entry barrier, Instagram will also soon start using information about users' ages to educate them about the app's settings and new privacy controls. There will also be options to allow only people you follow to message you, reply to your stories, or add you to a group.
Instagram's new age-restriction policy seems like a classic case of too little, too late. To avoid a $40,000 fine for violating the Children's Online Privacy Protection Act (COPPA), which bars services from collecting personal information from children below 13 years of age, Instagram relied on ambiguity, claiming ignorance of its users' ages. The company said that "asking for this information will help prevent underage people from joining Instagram, help us keep young people safer and enable more age-appropriate experiences overall". Mobile researcher Jane Manchun Wong recently spotted Instagram prototyping an age-check feature, which suggested that Instagram will keep your birth date private, and that if you already have a Facebook account, Instagram will link the two accounts.
Facebook, Snapchat and even TikTok already require users to enter their birth date as soon as they start the signup process. TikTok built a whole separate section of its app where kids can watch videos but not post or comment, after it was fined $5.7 million by the US FTC (Federal Trade Commission) for violating COPPA. When asked why it took this long to add an age restriction, considering that the app holds huge appeal for teenagers, Instagram said: "Historically, we didn't require people to tell us their age because we wanted Instagram to be a place where everyone can express themselves fully — irrespective of their identity", according to a TechCrunch report.
While introducing the age restriction is a much-needed (albeit quite late) first step, there is considerably more that Instagram can do, not only to verify users' ages but also to keep underage children safe from the perils of social media in their many forms. While keeping up with industry standards is all very well, a social media platform that enjoys massive popularity and appeal, especially among adolescents and teenagers, is expected to be that much more stringent when it comes to handling sensitive data and keeping children safe.