Snapchat's Under-16 Social Media Ban Leaves Significant Gaps, Officials Warn

Snapchat has disabled more than 415,000 Australian accounts belonging to users under 16 as part of its compliance with a law requiring platforms to bar minors from accessing their services.
The move came in January, after the law took effect in December last year with the aim of restricting social media access for people under 16. The government announced that 4.7 million accounts across 10 platforms had been disabled or removed in the first few days of the ban, a figure it presented as evidence of progress in enforcing the new regulations.
However, concerns have been raised about the effectiveness of the facial age estimation technology Snapchat used to identify and disable young users' accounts. Reports suggest the technology can be bypassed by teenagers, who may create fake profiles or switch to alternative apps to communicate with their peers.
Snapchat has acknowledged these limitations, stating that there are "significant gaps" in its implementation of the ban: some young people may retain access and remain vulnerable to exploitation, while some users aged 16 and over may incorrectly lose access. The company also warned that some users may migrate to less regulated messaging apps to evade the restrictions.
The eSafety commissioner, Julie Inman Grant, has acknowledged these concerns and said her team will send notices to companies requesting details of how they are complying with the ban. She noted that addressing systemic issues and improving age assurance technology is crucial if the law is to achieve its intended outcomes.
While Meta and Snapchat have called for age verification at the app store level, other platforms have not disclosed their own deactivation numbers, raising questions about the transparency of the enforcement process. The eSafety commissioner has declined to provide a breakdown of the data, citing the focus on the initial 10 platforms required to comply with the ban.
The ongoing debate highlights the challenges of regulating social media and protecting young users from exploitation online. As policymakers continue to evaluate the success of the law, concerns about Snapchat's facial age estimation technology and the effectiveness of its implementation will remain a pressing issue.