Report of the inquiry into age verification for online wagering and online pornography
House of Representatives Standing Committee on Social Policy and Legal Affairs
The Committee’s inquiry considered the potential role for online age verification in protecting children and young people in Australia from exposure to online wagering and online pornography.
Evidence to the inquiry revealed widespread and genuine concern among the community about the serious impacts on the welfare of children and young people associated with exposure to certain online content, in particular online pornography and online wagering.
The Committee heard that young people are increasingly accessing or being exposed to pornography on the internet, and that this is associated with a range of harms to young people’s health, education,
relationships, and wellbeing. Similarly, the Committee heard about the potential for exposure to online wagering at a young age to lead to problem gambling later in life.
Online age verification is not a new concept. However, the
Committee heard that as governments have sought to strengthen age restrictions on online content, the technology for online age verification has become more sophisticated, and there are now a range of age-verification services available which seek to
balance effectiveness and ease-of-use with privacy, safety, and security.
In considering these issues, the Committee was concerned to see that, in so far as possible, age restrictions that apply in the physical world are also
applied in the online world.
The Committee recognised that age verification is not a silver bullet, and that protecting children and young people from online harms requires government, industry, and the community to work together
across a range of fronts. However, the Committee also concluded that age verification can create a significant barrier to prevent young people—and particularly young children—from exposure to harmful online content.
The Committee’s recommendations therefore seek to support the implementation of online age verification in Australia.
The Committee recommended that the Digital Transformation Agency lead the development of standards for online age
verification. These standards will help to ensure that online age verification is accurate and effective, and that the process for legitimate consumers is easy, safe, and secure.
The Committee also recommended that the Digital
Transformation Agency develop an age-verification exchange to support a competitive ecosystem for third-party age verification in Australia.
In relation to pornography, the Committee recommended that the eSafety Commissioner lead
the development of a roadmap for the implementation of a regime of mandatory age verification for online pornographic material, and that this be part of a broader, holistic approach to address the risks and harms associated with online pornography.
In relation to wagering, the Committee recommended that the Australian Government implement a regime of mandatory age verification, alongside the existing identity verification requirements. The Committee also recommended the
development of educational resources for parents, and consideration of options for restricting access to loot boxes in video games, including through the use of age verification.
The Committee hopes that together these
recommendations will contribute to a safer online environment for children and young people.
Lastly, the Committee acknowledges the strong public interest in the inquiry and expresses its appreciation to the individuals and
organisations that shared their views with the Committee.
Several times last year, Australian games made the news over arbitrary ratings assigned under the Australian Classification Board's IARC automated game and app rating tool.
Variants of the same game on different platforms appeared in
the classification database with wildly different outcomes. One game was variously rated 15, rated 18 and banned. Inevitably, when the shit hit the fan and the incompetent ratings attracted publicity, human censors stepped in, sorted out
the rating (down to 15), and expunged all the embarrassing misfires from the database.
Well, it seems that the shoddy system has been discussed for a while, and a damning report from 2016 has just been published as a result of a Freedom of Information request.
The report reveals that a selection of ratings from the tool were audited by comparing them with an assessment from a human censor.
Results were particularly atrocious for the higher ratings. A table on page 13 reveals:
56% of M (PG-15) ratings assigned by the tool were wrong
72% of MA 15+ ratings were wrong
100% of R 18+ ratings were wrong
99% of RC (banned) ratings were wrong
In all of these categories the automated ratings were nearly always lowered by the audit.
The failure of the system was attributed to inaccurate data input, but surely this is a systemic failure to define tight enough definitions of the input data.
An Australian film industry coalition is calling for a new classification between PG and M (which is a PG-15 rating).
Major and independent film distributors and exhibitors are urging the federal government to adopt a new PG13 classification which they
say would benefit family-friendly Australian and international films that get M ratings.
Echoing calls by Screen Producers Australia and the Australian Children's Television Foundation, the Film Industry Associations (FIA) also advocates a uniform
classification system across all delivery platforms, with self-classification by the industry, overseen by a government regulator.
They say the current review system is no longer fit-for-purpose. "It is expensive and unfeasibly time-consuming
in an environment where digital distribution has minimised the time between the delivery of a film and its release date," the FIA says in its submission to the government classification review.
We are seeking feedback on proposals for a new Online Safety Act to improve Australia's online safety regulatory framework.
The proposed reforms follow a 2018 review of online
safety legislation which recommended the replacement of the existing framework with a single Online Safety Act.
Key proposals include:
A set of basic online safety expectations for industry (initially social media platforms), clearly stating community expectations, with associated reporting requirements.
An enhanced cyberbullying
scheme for Australian children to capture a range of online services, not just social media platforms.
A new cyber abuse scheme for Australian adults to facilitate the removal of serious online abuse and harassment and
introduce a new end user take-down and civil penalty regime.
Consistent take-down requirements for image-based abuse, cyber abuse, cyberbullying and seriously harmful online content, requiring online service providers to
remove such material within 24 hours of receiving an eSafety Commissioner request.
A reformed online content scheme requiring the Australian technology industry to be proactive in addressing access to harmful online content.
The scheme would also expand the eSafety Commissioner's powers to address illegal and harmful content on websites hosted overseas.
An ancillary service provider scheme to provide the eSafety Commissioner with the capacity to
disrupt access to seriously harmful online material made available via search engines, app stores and other ancillary service providers.
An additional power for the eSafety Commissioner to respond rapidly to an online crisis
event (such as the Christchurch terrorist attacks) by requesting internet service providers block access to sites hosting seriously harmful content.