Social media age restrictions may go further than you thought. Here’s how

Australia’s eSafety Commissioner, Julie Inman Grant, today outlined an updated list of platforms that may fall under the social media age restrictions that will take effect later this year.

While Australians expected platforms such as Facebook, TikTok, Instagram, Twitter and YouTube to be included, this new list demonstrates the legislation’s reach is even broader.

Which platforms will be required to restrict access for under 16s?

When the legislation was first introduced, the government explained it would use the definition of social media outlined in Australia’s Social Media Services Online Safety Code.

The law, which comes into effect on December 10, will restrict people under 16 from holding accounts on many social media platforms.

Recently, the eSafety Commissioner introduced self-assessment guidance for companies to determine whether their platforms would be considered age-restricted.

Under this guidance, companies will be required to restrict access to platform accounts for users under 16 where:

  • material is accessible to, or delivered to, Australian users
  • users can post material
  • users can link to, or interact with, other users
  • online social interaction is the sole, or significant, purpose of the platform.

The eSafety Commissioner has written to an “initial list” of 16 companies (including Facebook, TikTok and YouTube) advising they may fall under the social media definition that requires age restrictions.

The list also named messaging and gaming platforms WhatsApp, Discord and Twitch, alongside software sharing platform GitHub, dating app company Match (which owns Tinder), message board Reddit and image-sharing platform Pinterest, among others.

What does this mean for platforms designed to engage children?

Lego Play and Roblox were also included. These platforms host games and creative tools used by many children under 16.

Earlier this month, Roblox (which has more than 380 million users globally) committed to new safety measures to address risks posed by online grooming.

The new measures include making children’s accounts private by default and ensuring adults are prevented from engaging with child users without parental consent.

However, Roblox has now been flagged as likely to meet the definition of an age-restricted platform. It may need to block children under 16 from having accounts.

Will some platforms be excluded from age restrictions?

The self-assessment guidance also outlines specific circumstances in which social media platforms can be excluded from the restrictions. Under the legislation, services are excluded where the sole or primary purpose is to enable:

  • communication by messaging, email, voice or video calling
  • playing online games
  • sharing information about products or services
  • professional networking or professional development
  • supporting education (including between educational institutions and students/parents)
  • supporting health (including between healthcare providers and people accessing services).

These exclusions require self-assessment by individual platforms. This is why the eSafety Commissioner has asked companies to make their case, in writing, and provide evidence as to why they believe they should be exempt.

WhatsApp, for example, may argue its primary purpose is communication by messaging. Lego Play may argue it is mainly an educational tool.

Companies will have only a few weeks to make their case for exclusion before the restrictions take effect.

What happens next?

In the coming weeks, we will learn more about which platforms will be included in, and excluded from, the age restrictions. But today's list is only a start: other platforms will likely be identified through the self-assessment process.

What’s not yet clear is what happens when there is conflict over a platform’s self-assessment.

The eSafety Commissioner has flagged she is already “bracing for legal challenges” on this point. She explained the focus will first be on platforms with the greatest number of users and, therefore, the highest potential for harm.

So Australians may have some clarity by December 10 about which platforms fall under the legislation and which fall outside it. However, some uncertainty is likely to continue well into 2026.

by : Lisa M. Given, Professor of Information Sciences & Director, Social Change Enabling Impact Platform, RMIT University
