Instagram has introduced artificial intelligence technology to prevent underage children from creating accounts and to block adults from contacting minors they do not know on the app.
The new safety feature limits how adults can contact and search for young people on the app, and prompts teens to be cautious about who they are talking to.
On Tuesday, Instagram unveiled artificial intelligence technology aimed at keeping underage children safe. The step follows concerns about inappropriate contact between adults and children on the platform, which, like most services, sets a minimum age requirement of 13.
The platform will start using artificial intelligence to estimate users' ages when they sign up for an account, in an effort to identify underage users.
In a blog post, the company said: “While many people are honest about their age, we know that young people can lie about their date of birth. We want to do more to stop this from happening, but verifying people’s age online is complex and something many in our industry are grappling with.”
“To address this challenge, we’re developing new artificial intelligence and machine learning technology to help us keep teens safer and apply new age-appropriate features.”
Moreover, the app will prevent adults from sending messages to users under 18 who don’t follow them, to curb unwanted contact. “This feature relies on our work to predict peoples’ ages using machine learning technology, and the age people give us when they sign up,” Instagram said.
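The restriction described above amounts to a simple gating rule: an adult can message a minor only if the minor already follows them. A minimal sketch of that rule, assuming hypothetical function and parameter names (Instagram has not published its actual implementation), might look like this:

```python
# Hypothetical sketch of the DM-gating rule: adults (18+) may not message
# under-18 users who don't follow them. All names here are assumptions,
# not Instagram's real API.

def can_send_dm(sender_age: int, recipient_age: int,
                recipient_follows_sender: bool) -> bool:
    """Return True if the message is allowed under the age-gating rule."""
    if sender_age >= 18 and recipient_age < 18:
        # Adult messaging a minor: allowed only if the minor follows them.
        return recipient_follows_sender
    # All other combinations are unaffected by this rule.
    return True
```

In practice the ages fed into such a check would come from both the self-reported date of birth and the machine-learning age estimate the company describes, rather than a single trusted value.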
The platform is also exploring ways to make it harder for adults who have shown “potentially suspicious behavior” to interact with teens, including restricting those adults from seeing suggested teen accounts. “We’ll use this tool to alert the recipients… and give them an option to end the conversation, or block, report, or restrict the adult,” Instagram said.