Age verification

On 1 June 2021, the Australian Government requested the eSafety Commissioner develop an implementation roadmap for a mandatory age verification (AV) regime relating to online pornography.

This roadmap forms part of the government’s response to the House of Representatives Standing Committee on Social Policy and Legal Affairs report, ‘Protecting the age of innocence’.

eSafety welcomes the Government’s response and commitment to protecting children from harmful online content. For young children, accidental encounters with pornography can be distressing and even harmful. For older children who stumble upon or seek out pornographic material, there is a risk that it will give them unrealistic and damaging ideas about what intimate relationships should look like – especially as the material becomes increasingly violent and extreme.


Call for evidence

eSafety issued a call for evidence on 16 August 2021, seeking insights into effective age verification techniques, as well as the impact of online pornography on children and proven methods of educating young people about both respectful and harmful sexual behaviours.

eSafety has produced a thematic analysis of the evidence and insights emerging from this first phase of input from stakeholders and the public. 

Together, we are exploring these main suggestions:

  • Any online service provider that poses a risk of exposing children to pornography should have measures to prevent children gaining access.
  • Tools should not be prescribed – but any potential technological tools should meet strict safety and privacy standards, be certified and independently audited. The role of filtering and parental controls should also be considered.
  • A one-size-fits-all technological solution would not be effective. Technological requirements should be proportionate and based on risk.
  • Industry should design technologies so they are easy for children and parents to understand – including information on how the technologies work and how they use, store and protect data.
  • A holistic approach should empower young people through greater education, awareness and understanding of pornography and give parents skills to support their children online. 
  • AV should not block access to vital sexuality and sexual health information for young people or restrict adults’ legal access to online pornography.

Thematic analysis of age verification submissions

This summary is a compilation of the emerging themes from the responses to the call for evidence. The views and opinions expressed are those of the authors and do not reflect eSafety’s position, but they are an important contribution to informing the next phases of consultation and the development of the implementation roadmap.

Governance

  1. There should be a single oversight body that investigates, audits, monitors and assesses compliance with the AV regime. This body should have adequate enforcement powers.
  2. Specific technologies should not be prescribed. Instead, principles of proportionality, community standards and consumer choice should guide which technologies are used. The regime should accommodate innovation and tool advancements. 
  3. A recognised body should certify AV providers. Certification should test for overall effectiveness, privacy compliance and security (including for data storage facilities).
  4. There should be ways to identify and check compliance of new sites. Sites that do not comply should get no commercial advantage.
  5. Consideration should be given to the regulatory, administrative and financial burden to both industry and the consumer when determining what makes a proportionate, feasible and effective AV regime.
  6. Businesses, not consumers, should bear the cost of using AV technologies. Tools should be cost effective for all pornography sites – from individual businesses to large-scale platforms. 

 

Alignment

  1. The regime should consider the outcomes of the review of the Australian classification system.
  2. A whole-of-government approach to digital regulation should be taken, to ensure online safety measures align with online privacy and security policy.
  3. Consideration should be given to aligning the age of access with the age of consent.

 

Scope

  1. Age verification should extend to all commercial pornography sites, not just sites which allow users to generate content. Any online service provider that poses a risk of exposing children to pornography should have measures to prevent children gaining access.
  2. Requirements should be proportionate (based on risk). Blanket requirements across all relevant platforms and services should be avoided. Industry should not have to scan for online pornographic material.
  3. Limit and focus AV tools on sites which provide direct access to pornographic content or bear the closest relationship to pornographic material. Requirements should not apply to private messaging or end-to-end encrypted (E2EE) systems.
  4. The AV regime should be holistic. It should consider mobile device filtering, ISP filtering and parental controls as well as AV tools. Consideration should be given to the regulator overseeing a list of relevant URLs, which are captured by all filtering services.

 

Privacy and security

  1. There are privacy concerns and cyber security risks relating to commercial pornography websites directly processing user data. These sites should use third-party verification tools.
  2. Age verification and assessment technologies should meet clear and transparent standards and technical requirements. They should also be certified, independently audited and demonstrate robust privacy and security settings.
  3. A data minimisation approach should be followed for age verification and age assurance (AV/AA) tool standards. Only age attributes should be shared between the AV/AA technologies and content hosts. Other data should not be shared.
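The data-minimisation principle above can be illustrated with a small sketch: a hypothetical third-party verifier checks a user's documents privately, then hands the content host only a signed over-18 attribute. The function names and the shared-secret HMAC scheme are illustrative assumptions, not drawn from any submission; a real deployment would use asymmetric signatures or verifiable credentials.

```python
import hashlib
import hmac
import json

# Hypothetical shared secret between the AV provider and the content host.
# Illustrative only - real systems would use asymmetric signatures.
SECRET = b"demo-only-secret"

def issue_age_attribute(is_over_18: bool) -> dict:
    """AV provider verifies identity documents privately, then emits ONLY
    the age attribute - no name, date of birth or document number."""
    payload = json.dumps({"over_18": is_over_18}).encode()
    signature = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "signature": signature}

def verify_age_attribute(token: dict) -> bool:
    """Content host checks the signature and reads the single attribute."""
    expected = hmac.new(SECRET, token["payload"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["signature"]):
        return False  # tampered or forged token
    return json.loads(token["payload"])["over_18"]

token = issue_age_attribute(True)
verify_age_attribute(token)  # host learns only that the user is 18+
```

The point of the sketch is what the content host never sees: the payload carries a single boolean attribute, so a breach of the host exposes no identity data.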

Impacts on children and young people

  1. Studies point to accidental exposure from ages 11 to 13 – with a significant proportion of young people having viewed sexually explicit content by the age of 16.
  2. Access and exposure are not limited to pornography websites. This also occurs on gaming platforms, social media and search engines.
  3. Access and exposure to pornography for under 16s is generally seen as inappropriate. A more nuanced approach to pornography should be considered for young people who have reached the age of sexual consent (16- to 17-year-olds). 
  4. Intentional access to pornography occurs for educational purposes, self-exploration, to understand ‘expectations’ when having sex, excitement and entertainment. Lack of access to comprehensive sex education is associated with greater intentional access to pornography.
  5. The influence of pornography on the sexual practices and beliefs of young people varies. It has been associated with:
    • A greater likelihood to pressure and coerce others to perform unwanted, derogatory and violent sexual acts.
    • Influencing perceptions of sexual expectations and negatively impacting awareness, attitudes and understanding of consent.
    • Increased frequency of watching pornography is associated with a greater likelihood of accessing extreme and violent pornography, and presenting greater levels of sexual aggression, sexual objectification and sexual coercion. There are factors other than pornography which may also contribute to these attitudes and behaviours.
    • Helping young people to learn the practicalities and mechanisms of sex and explore their sexual identities. This is particularly important when this subject matter is not adequately discussed in the school curriculum or with parents or carers.
  6. The negative impacts of pornography are more pronounced in children under 14, marginalised and at-risk young people, and high frequency users of pornography.
  7. Negative impacts are mostly associated with accidental exposure to pornography and exposure to violent or extreme pornography. However, not all young people are negatively impacted by exposure to such content. 
  8. Age verification should not block access to vital sexuality and sexual health information for young people, restrict adults’ access to online pornography, or reduce safe online spaces for sex workers and the sale of adult products. 
  9. Technologies should be designed so they are easy for children and parents to understand. This includes how the tools work and how they use, store and protect data.

Education

  1. Educating young people on healthy sexual relationships, behaviours and sexuality can help counter the negative impacts of pornography, including risky or violent sexual behaviour and the reinforcement of unhealthy or unrealistic expectations regarding gender, power, sex and relationships.
  2. Sex education in schools could be enhanced through:
    • Providing authorised leadership and fostering a culture that encourages age-appropriate information-sharing with young people. 
    • Introducing a national, comprehensive sexuality and relationships education curriculum that is developmentally appropriate and runs across all year levels with a focus on:
      • improving young people’s sexual literacy, which aligns respectful relationship education with digital literacy education
      • encouraging active participation, in recognition that young people see relationship and sexual health education as an important part of their psychosexual development
      • peer-led discussions among 16- to 18-year-olds and teacher or expert specialist-led discussions and workshops 
      • the risks associated with viewing online sexually explicit media.
  3. Whole-school approaches to address student wellbeing and pornography exposure are important. These approaches can include policies, staff professional development, parent and community partnerships, guidelines for student education practices and evaluation, and parental support/advice on managing technology in the home. 
  4. There should be community-based programs to support young people who are vulnerable to missing out on school-based relationship and sexual health education. These programs can be in residential care, flexible learning centres, community health centres and youth justice centres.
  5. Evidence-based public health content and programs can support young people, parents, educators and frontline professionals to talk openly about pornography and enhance the wellbeing and moral development of young people. 
    • Relevant industry stakeholders and experts should review any advice for navigating adult content online.
    • Experts or relevant organisations could be funded to develop evidence-based public health resources and training.
  6. Parents and carers should have access to guidance on the tools that can better protect children online (including verification tools, filtering and parental controls). Parents and carers are critical to providing children with key literacy skills on pornography. 
  7. Families are a significant source of information and support. However not all young people have equal access to adults they can turn to for advice. 
  8. Young people often turn to peers, so it’s important to advance young people’s sexual wellbeing and pornography literacy. 
  9. Targeted programs could be established for young people identified as problematic users of pornography.

Economic impacts

  1. The economic impact and regulatory/compliance burden on both the adult industry and consumers should be considered within the roadmap.
  2. The cost of AV/AA tools may be disproportionate and prohibitive for smaller producers and sites. 
    • The cost of implementing more stringent age verification processes may be anti-competitive for smaller producers and individual sex workers. 
    • AV/AA may disproportionately impact on LGBTQI+ and female producers.
    • AV/AA may push sex workers onto unsafe platforms and systems.
  3. AV/AA requirements should apply to every online pornography service available in Australia.
    • This provides a level playing field for all platforms and services.
    • Transparent policies, processes and guidance should exist to ensure compliance.
    • The regulator should have the capacity to take swift action against non-compliant platforms and services, so that compliant platforms and services are not disadvantaged.
  4. The role and use of social media by sex workers and industry-wide approaches to professional sex workers should be considered, as should the effect on sex worker advertising.

Industry measures

  1. Dating apps and age-restricted content-sharing platforms and apps use stricter age-checking measures than most social media platforms. These include user registration with credit card details and in some instances user verification. App stores prohibit apps that contain or promote pornography. 
  2. Filtering software and parental controls settings on mobile devices and computers can also help to prevent access to more mature content.
  3. Platforms have mechanisms for:
    • restricting the content younger users are exposed to (including advertising) 
    • blurring sensitive content, providing warnings, or making content unavailable 
    • blocking 18+ users from contacting under 18 users
    • preventing age-restricted content from being viewed by unregistered users
    • reporting and blocking inappropriate content
    • parental controls and filters which also allow parents to monitor children’s activity
    • asking for age verification through credit card or identity verification to check if a user is underage
    • blocking users who breach terms of service by using IP addresses, device signatures or other data
    • deleting accounts if users are unable to prove they meet the minimum age.
  4. Some platforms use proactive technologies and machine learning to moderate content or identify underage users, including:
    • age estimation technology to determine whether a user is under 18 or 18+
    • artificial intelligence tools that help to understand someone’s real age 
    • persistent cookies that platforms place on devices to prevent children from attempting to circumvent age restrictions (e.g., multiple attempts entering a valid birth year)
    • proactive detection tools for identifying and removing sexually explicit images or videos. 
  5. Some platforms have policies which:
    • restrict pornography, nudity and sexually explicit content and advertising
    • prohibit content which endangers the emotional and physical wellbeing of minors.
  6. Search engines can: 
    • apply filtered search tools which prevent search results for sexually explicit content or websites
    • block hyperlinks that drive traffic to commercial pornography sites, and prohibit pornography ads or ads served against pornography websites 
    • remove sexual and violent terms from autocomplete search functions.
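One of the proactive measures listed above, the persistent cookie that stops a child simply retrying an age gate with different birth years, can be sketched as a toy model. `AgeGate`, the attempt limit and the dict standing in for a browser cookie are all illustrative assumptions, not any platform's actual implementation.

```python
from datetime import date

ADULT_AGE = 18
MAX_ATTEMPTS = 3  # illustrative lockout threshold

class AgeGate:
    """Toy age gate. The 'cookie' dict stands in for a persistent browser
    cookie, so failed attempts survive page reloads on the same device."""

    def check(self, cookie: dict, birth_year: int) -> bool:
        if cookie.get("locked"):
            return False  # too many earlier attempts from this device
        if date.today().year - birth_year >= ADULT_AGE:
            return True
        # Record the failure on the device itself, so entering a different
        # (fake) birth year on the next visit does not reset the count.
        cookie["attempts"] = cookie.get("attempts", 0) + 1
        if cookie["attempts"] >= MAX_ATTEMPTS:
            cookie["locked"] = True
        return False

this_year = date.today().year
device = {}  # persists across visits, like a real cookie
gate = AgeGate()
gate.check(device, this_year - 10)  # under-age year: denied, attempt recorded
gate.check(device, this_year - 12)
gate.check(device, this_year - 14)
gate.check(device, this_year - 30)  # adult year, but the device is now locked
```

Because the counter lives on the device rather than in the form, the fourth attempt is refused even though the entered year would pass the age test.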

Age verification and assurance technologies

  1. AV and AA providers offer digital apps, application programming interfaces (APIs) or send links which individuals can use to provide their age attribute. 
    • Digital identity apps were proposed as an effective AV method as an individual’s personal information remains stored on their phone. QR codes or links can be used to connect with the app and allow for an age attribute to be shared with the requesting site or platform.
    • Physical age tokens can be used to generate an online password for any age-restricted content. Mobile operator age checks were also suggested.
    • Facial analysis technology was presented as a suitable biometric option, particularly for individuals who do not have government identity documents. It allows for one-time facial scans which estimate a user’s age – no data is stored. 
      • Submissions noted that the technology is still nascent and few providers have achieved high accuracy. 
      • Some research has raised concerns of ethnic and gender bias by some facial analysis technologies.
  2. Submissions raised some concerns regarding the use of:
    • Database checks, which may be less proportionate as they are typically used for ‘Know-Your-Customer’ (KYC), anti-money laundering and fraud regulation. Document verification can also be a more costly process. 
    • Credit card checks, which may be easy for young people to circumvent by using their parent’s or a third-party’s details.
    • Official documentation (e.g., driver’s licence or passport), which is often used for regulated, age-restricted retail and services (such as alcohol sales and gambling) and may raise privacy concerns. 
    • Biometric data (such as facial recognition), which may raise privacy and security risks and concerns of surveillance. 
  3. Submissions demonstrated mixed attitudes to one-off age checks, which present less friction for user experience, and to single-use checks, which require repeat user verification. 
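The trade-off noted above, one-off checks that persist versus single-use checks that repeat verification every visit, can be made concrete with a small sketch. Everything here (the class name, token scheme and expiry policy) is an illustrative assumption rather than a description of any real provider.

```python
import secrets
import time

class AgeCheckSession:
    """Toy contrast between a one-off check (reusable token until expiry)
    and a single-use check (every visit repeats full verification)."""

    def __init__(self, one_off, ttl_seconds=3600):
        self.one_off = one_off
        self.ttl = ttl_seconds
        self.verifications = 0  # count of full AV checks performed
        self._tokens = {}       # token -> expiry timestamp

    def _run_full_check(self):
        # Stand-in for a real AV step (document check, facial estimation...).
        self.verifications += 1
        return True

    def visit(self, token=None):
        """Return (allowed, token). One-off sessions hand back a token for
        later visits; single-use sessions retain nothing."""
        if self.one_off and self._tokens.get(token, 0) > time.time():
            return True, token  # low friction: no repeat verification
        if not self._run_full_check():
            return False, None
        if not self.one_off:
            return True, None   # nothing kept between visits
        new_token = secrets.token_hex(16)
        self._tokens[new_token] = time.time() + self.ttl
        return True, new_token

one_off = AgeCheckSession(one_off=True)
allowed, tok = one_off.visit()
one_off.visit(tok)  # token reused, so verifications stays at 1
single = AgeCheckSession(one_off=False)
single.visit()
single.visit()      # verifications is now 2: every visit re-verifies
```

The sketch shows why submissions were split: the reusable token minimises friction but creates state to secure, while the single-use model holds no state at the cost of repeated verification.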
     

Next steps

eSafety recognises the need for extensive research and consultation to identify what a proportionate, effective, and feasible age verification system may look like when addressing children’s access to online pornography in an Australian context. 

eSafety is now embarking on targeted consultations with stakeholders, which will allow closer examination of the evidence submitted in response to the call for evidence.

We have already started consulting with representatives from the adult entertainment and sex work industries, local and international academics, children’s wellbeing groups and age verification and other safety technology providers. This process will continue in 2022 with additional stakeholders, including online platforms and services and digital rights advocates.

High level, anonymised summaries of these consultations will be added to this page in due course.

Following the consultations, eSafety will continue to work closely with relevant stakeholders to define the minimum requirements for an effective regime and scope its various elements. These recommendations will then be presented to the Australian Government for consideration.

The following timeline shows how the age verification roadmap fits alongside other eSafety regulatory initiatives.

Glossary of terms

Adult industry – commercial enterprises (individuals, businesses or peak bodies) involved in the sale or purchase of sex-related entertainment services.

Age assurance – the broad range of processes that can be used to establish or predict the age (or age range) of an individual. Examples include self-reported (for example stating your year of birth), confirmation of age by another person (for example a parent or peer), use of biometric information (for example face, fingerprint or voice recognition), or use of behavioural or online signals (for example digital traces or gesture patterns).

Age verification – a technical process that confirms the age of a person using their attributes or other confirmed sources of information. Examples include tokens or licences, third-party verification and government e-ID systems.

Digital environments, services and platforms – online spaces that may allow access to and uploading, distribution and sharing of online pornography or other sexually explicit content. These include, but are not limited to, social media services, designated internet services, or relevant electronic services (as defined in the Online Safety Act 2021), as well as search engines and gaming platforms.

Not-for-profit sector – social enterprises, charities and other non-government organisations which provide social or human services or conduct related research that informs social policy. These include, but are not limited to, services that support the wellbeing of children and their families or carers.

Pornography – material that contains sexually explicit descriptions or displays that are intended to create sexual excitement, including actual sexual intercourse or other sexual activity.