1. Introduction & Purpose
I am proposing a new legislative framework to be established under the Online Safety Act, called the ‘Parent-Child Digital Safety Link’ framework. The core of this framework is a partnership between government and industry: parents register their child's device in a central, secure government hub, and online platforms are then required to check this hub and automatically apply parental supervision to any account created on that device. This system is designed to provide parents and guardians with more meaningful tools to protect their children online by closing the key vulnerabilities in current parental control systems. By mandating this integration, the framework would empower parents to proactively protect their children from online dangers such as cyberbullying, exposure to harmful content, and predatory behaviour.
2. Comparison with Current Legislation
The recently passed Online Safety Amendment (Social Media Minimum Age) Act mandates a blanket ban on social media for children under 16, enforced by age verification technologies. While well-intentioned, this approach has several critical flaws that the ‘Parent-Child Digital Safety Link’ framework resolves:
- Empowering Parents vs. Government Overreach: The current law substitutes parental responsibility with a blunt, one-size-fits-all government mandate that parents every child poorly. My framework provides parents with sophisticated tools to parent their own children effectively in the digital age.
- Privacy and Data Security: The age ban requires all users, including adults, to submit to potentially invasive age verification, creating massive databases of sensitive information. It is even worse for international visitors: because they cannot use myGovID, they will be forced to hand over their passport details directly to the tech giants just to use Instagram on holiday. My framework is privacy-preserving, as it only requires DVS verification for parents who voluntarily choose to register a device, not for every citizen who wants to use social media.
- Effectiveness and Circumvention: Blanket bans are notoriously easy for tech-savvy children to circumvent. My proposal focuses on a robust, device-centric system that is significantly harder to bypass and provides parents with direct alerts about such attempts.
- Focus on Safety, Not Prohibition: The current law is a blunt instrument of prohibition. My framework is a sophisticated safety tool. It focuses on creating a supervised, educational environment where children can learn to navigate the digital world safely under their parents' guidance, rather than simply being locked out.
In essence, the ‘Parent-Child Digital Safety Link’ is a proactive, parent-led safety solution that achieves the goals of the current legislation without its significant drawbacks to privacy, parental rights, and overall effectiveness.
3. The Problem: Current Gaps in Child Online Safety
While the Online Safety Act and the Basic Online Safety Expectations (BOSE) have made significant strides, a fundamental gap remains. The parental control systems currently offered by digital platforms are inconsistent, optional for the platform, and easily bypassed by tech-savvy children.
Key vulnerabilities include:
- Account De-linking: A child can simply uninstall and reinstall an application or create a new, unrestricted account to sever the parental link.
- Lack of Proactive Alerts: Parents are often unaware of dangerous interactions until after significant harm has occurred. There is no mechanism for real-time alerts based on high-risk keywords or behaviours.
- Limited Reporting: The current reporting systems place the onus on the child or other users to report harm, a step that often goes untaken due to fear or social pressure. There is no direct channel for a child or a concerned bystander to securely and immediately alert a linked parent.
4. The Proposed Framework: Core Components
I propose that the Online Safety Act be amended to establish the ‘Parent-Child Digital Safety Link’ framework. This framework will be a public-private partnership, mandating that designated online services integrate with a government-run Centralized Child Device Registry. This integration will be a condition of operating in Australia and will apply to all accounts created on registered devices for users under the age of 16.
The core features would include:
4.1. Proactive Device Registration via the 'Digital Safety Link' App
To streamline and enforce account linking, a Centralized Child Device Registry would be established, managed by the eSafety Commissioner. The registration process is designed for security and ease of use, initiated directly from the device to be registered:
- App Installation and Login: A parent installs the official ‘Digital Safety Link’ app on a new device. Upon first launch, the app prompts the user to sign in with their myGov account.
- Parent Verification: The parent logs into myGov. If it is their first time, they are guided through a one-time DVS identity verification to create their secure ‘Digital Safety Link’ account.
- Device Allocation: Immediately after a successful login, the app presents a mandatory, one-time choice: allocate the device as a "Child" or "Adult" device.
- Child Device Linking: If "Child" is selected, the device will be linked to their Digital Safety Link account, officially registering it in the Centralized Registry and locking it into "Child Mode".
- Secure Mode: Once a device is registered as a "Child" device, its status cannot be easily reversed. The option to change it back to an "Adult" device would be buried deep within the app's settings and require the parent to re-authenticate through myGov to prevent unauthorised changes.
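The registration flow above can be sketched as a simple state machine. The sketch below is illustrative only: the class and method names are hypothetical, and the real registry would sit behind myGov authentication and the DVS rather than in-process calls. It captures the two rules that matter: the Child/Adult allocation is a one-time choice, and "Child Mode" can only be reversed after the parent re-authenticates.

```python
from dataclasses import dataclass

@dataclass
class RegisteredDevice:
    device_id: str
    parent_account: str      # verified myGov identity (assumed representation)
    mode: str = "unset"      # "unset" until the one-time Child/Adult choice
    locked: bool = False     # True once the device is locked into Child Mode

class CentralRegistry:
    """Toy model of the Centralized Child Device Registry (names hypothetical)."""

    def __init__(self):
        self._devices = {}

    def register(self, device_id, parent_account):
        # Step 1-2: app install + verified parent login creates the record.
        self._devices[device_id] = RegisteredDevice(device_id, parent_account)

    def allocate(self, device_id, mode):
        # Step 3-4: mandatory one-time allocation as "child" or "adult".
        device = self._devices[device_id]
        if device.mode != "unset":
            raise PermissionError("device allocation is a one-time choice")
        device.mode = mode
        if mode == "child":
            device.locked = True  # officially registered and locked into Child Mode

    def revert_to_adult(self, device_id, reauthenticated):
        # Step 5: reversal only after the parent re-authenticates via myGov.
        if not reauthenticated:
            raise PermissionError("parent must re-authenticate through myGov")
        device = self._devices[device_id]
        device.mode, device.locked = "adult", False
```

In a production system the `allocate` and `revert_to_adult` guards would be enforced server-side, so that nothing running on the device itself can flip its registered status.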
4.2. Streamlined & Proactive Account Creation
Since most online accounts require the same basic information (username, password, DoB, email), the parental hub can leverage the parent's verified identity to simplify this process. Within the 'Digital Safety Link' app, a parent can:
- Select a Child: Choose which of their children they wish to create accounts for.
- Enter Universal Details: Provide a desired username, password, and contact email once.
- Select Platforms: Choose from a list of participating platforms (e.g., Instagram, Roblox, etc.).
- Mass Create Accounts: With a single click, the hub's API will securely transmit the verified details (including the child's DoB) to the selected platforms to create the accounts. The system will check for username availability on each platform and suggest alternatives if the desired name is taken.
This "one-click" process allows a parent to proactively establish a safe, pre-configured, and fully supervised set of accounts for their child before they even use the platforms.
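The "one-click" creation step could work roughly as follows. This is a minimal sketch under assumed names: `check_available` stands in for a per-platform availability call over the hub's API, and the payload shape is illustrative, not a specification.

```python
def create_accounts(platforms, child_details, check_available):
    """Create supervised accounts on each selected platform (sketch).

    check_available(platform, username) -> bool is a stand-in for the
    hub's per-platform username-availability call; all names here are
    hypothetical.
    """
    results = {}
    for platform in platforms:
        username = child_details["username"]
        suffix = 1
        # If the desired name is taken, suggest numbered alternatives.
        while not check_available(platform, username):
            username = f"{child_details['username']}{suffix}"
            suffix += 1
        results[platform] = {
            "username": username,
            "dob": child_details["dob"],    # verified DoB from the parent's hub
            "email": child_details["email"],
            "supervised": True,             # account is created pre-linked
        }
    return results
```

The key design point is that the child's date of birth comes from the parent's verified hub record, so the platform never has to independently verify age at signup.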
4.3. Account and Device Integrity Alerts
This framework is built on the principle that the registered device is the primary vector of control. The linked parent must be alerted to actions that indicate an attempt to bypass supervision, and the device itself must act as a gatekeeper.
- On-Device Creation: If a new, unlinked account is created directly on the registered device, the account creation process will be paused and cannot be completed until the parent is notified and provides approval with an OTP from their hub.
- External Account Login: If an account is created on an external device (e.g., a school computer) and then logged into on the child's registered device, the app will display a 'Waiting for Parental Approval' screen; the parent is notified and uses an OTP from their hub to formally link the new account. Restricting children from accessing this process is the responsibility of the parent, who can gate access to the hub with a secondary password or other verification system.
This ensures that all accounts, regardless of where they are created, must be authorized by the parent before they can be fully used on the primary registered device.
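Both integrity alerts resolve through the same OTP approval gate. A minimal sketch of that gate, assuming a six-digit code delivered to the parent's hub (the class name and code format are illustrative choices, not part of the proposal):

```python
import secrets

class LinkApprovals:
    """OTP gate for pausing account creation/linking on a registered device."""

    def __init__(self):
        self._pending = {}  # account_id -> OTP awaiting parental approval

    def request_link(self, account_id):
        # Creation/login is paused; a one-time code is sent to the parent's
        # hub (never shown on the child's device).
        otp = f"{secrets.randbelow(10**6):06d}"
        self._pending[account_id] = otp
        return otp

    def approve(self, account_id, otp):
        # The parent enters the OTP to formally link the account.
        if self._pending.get(account_id) == otp:
            del self._pending[account_id]
            return True
        return False
```

Because the code is delivered only through the parent's authenticated hub, a child cannot complete the link from the registered device alone.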
4.4. Parental Interaction and Oversight
A parent's access to a supervised account is for observation and protection, not impersonation. Parents can view the child's account as they see it, but their ability to interact is limited to a specific set of clearly identified actions:
- Block Users: A parent can block any other user from interacting with their child's account.
- Initiate and Track Reports: Both the parent and the child can use the platform's reporting tools to flag content or users for review. A record of any report initiated from the child's account, regardless of who made it, along with the platform's final response and any action taken, is automatically logged in the parent's Centralized Registry hub.
- Reply as the Parent: A parent can comment or reply to posts, but never as the child. All parental replies must be clearly and automatically labelled to ensure transparency.
4.5. Mandatory Platform-Level Parental Controls
In addition to the parent's direct interaction tools, all platforms must provide a standardized set of granular controls for supervised accounts, manageable by the parent through their hub. These controls address account-level risks that OS-level tools cannot. The mandatory features include:
Communication & Social Interaction Controls:
- Chat and Messaging Restrictions: Parents must have the option to limit who can send their child direct messages (e.g., "Everyone," "Friends Only," "No One").
- In-Game Chat Display: For gaming platforms, parents must have the option to completely disable the display of all in-game text and voice chat.
- Friend Request Approval: Parents can enable a setting that requires their approval before the child can accept a new friend or follower request.
- Mature Language Filter: A filter for offensive language in chats and comments must be enabled by default, with the option for the parent to disable it or adjust its strictness.
Content & Privacy Controls:
- Content Filtering: Parents must be able to set a content sensitivity or age-rating level for algorithmic feeds and recommendations, defaulting to the child device’s registered age.
- Profile Privacy: The child's account must default to "Private," and the parent holds the authority to change it to "Public."
- Enforced Safe Search: In-app search functions must have "Safe Search" enabled by default, which can only be disabled by the parent.
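The mandatory control set above could be expressed as a standardized settings payload that every platform must accept from the hub. The sketch below is illustrative: key names are hypothetical, and the defaults for direct messages and friend-request approval are assumptions (the proposal only mandates defaults for the language filter, content rating, profile privacy, and safe search).

```python
# Defaults marked "mandated" come from section 4.5; others are assumed.
DEFAULT_SUPERVISED_CONTROLS = {
    "direct_messages": "friends_only",     # assumed default; "everyone" | "friends_only" | "no_one"
    "in_game_chat_display": True,          # gaming platforms; parent may disable
    "friend_request_approval": False,      # assumed default; parent may enable
    "mature_language_filter": True,        # mandated: on by default
    "content_rating": "registered_age",    # mandated: defaults to device's registered age
    "profile_privacy": "private",          # mandated: defaults to Private
    "safe_search": True,                   # mandated: on by default
}

def apply_parent_overrides(overrides):
    """Merge parent-chosen settings over the mandated defaults, rejecting
    any key outside the standardized control set."""
    unknown = set(overrides) - set(DEFAULT_SUPERVISED_CONTROLS)
    if unknown:
        raise KeyError(f"not a mandated control: {sorted(unknown)}")
    settings = dict(DEFAULT_SUPERVISED_CONTROLS)
    settings.update(overrides)
    return settings
```

Standardizing the payload is what lets one hub interface drive controls across many platforms: each platform maps the shared keys onto its own internal settings.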
4.6. Visible Account Supervision Status
For all supervised accounts, the platform must display a clear, standardized public indicator. This visual cue (e.g., a specific icon and text like "Supervised Account") would serve to inform other users that they are interacting with a minor under parental supervision.
4.7. Expanded Reporting: The 'Alert a Parent' Function
For supervised accounts, the existing user reporting system would be expanded to include a new option: "Report and Alert a Parent." This would allow a user who witnesses bullying or concerning behaviour to trigger a confidential notification directly to the linked parent's account.
4.8. Secure Parent-to-Parent Communication Channel
A key benefit of this centralized framework is the ability to facilitate communication between parents of supervised children who are interacting online.
- Initiating Contact: If a parent observes a concerning interaction between their child and another supervised child, they can initiate a secure chat with the other child's parent directly through the Centralized Registry hub.
- Privacy-Preserving: This communication channel would be anonymized. Parents would be identified only as Parent of [Child's Username].
- Purpose: This creates a safe, adult-to-adult channel for resolving disputes and addressing bullying at its source.
5. The 'Digital Safety Link' App: A Central Hub for Families
The official ‘Digital Safety Link’ app serves as the single point of interaction for both parents and children, creating an integrated and user-friendly experience.
5.1. For Parents:
The parent's central management hub is accessible either through the ‘Digital Safety Link’ app on their own device or via the myGov website portal. It provides:
- Step-by-Step Guidance: The app offers in-app tutorials and instructions on how to use the device's native parental controls (like Apple Screen Time or Google Family Link) in conjunction with the ‘Parent-Child Digital Safety Link’ framework.
- Centralized Control Panel: Parents can manage all their registered devices, view alerts, link new accounts with OTPs, and access the parent-to-parent communication channel from either the app or the website.
- Hub-Based Interaction: The app provides the "Basic View" interface for parents to monitor and interact with their child's accounts without needing to sign up for every platform.
5.2. For Children:
When installed on a child's registered device, the app transforms into their dedicated safety and communication hub. It provides:
- Secure Parent Communication: The app includes a direct and secure channel for the child to communicate with their linked parent.
- A Safe Space for Help: The app serves as the child's primary access point for all help and support features, including the confidential Child-Initiated Dispute Process and integrated chat with services like Kids Helpline. This ensures that a child always knows where to go if they feel unsafe or need to talk to someone.
6. A Public-Private Partnership: Division of Responsibilities
This framework is designed as a public-private partnership.
6.1. Government Responsibilities (The Central Hub)
The Australian Government, through the eSafety Commissioner, would be responsible for the core identity, verification, and oversight infrastructure. This includes:
- Identity and Device Management: Managing the end-to-end device registration process within myGov, including DVS verification, and acting as the Certificate Authority for issuing, managing, and revoking Parental Certificates.
- Central Hub Operations: Maintaining the secure Centralized Registry, the parental control panel, and the hub-based interaction interface for parents who do not have platform-specific accounts.
- API Development and Maintenance: Building and maintaining the single, standardized API for industry to integrate with.
- Moderation Oversight: Receiving and securely storing all platform moderation responses related to reports made on supervised accounts, creating a centralized dataset to monitor platform compliance and effectiveness.
- Secure Communications: Hosting and mediating the secure, anonymized parent-to-parent communication channel.
- Dispute Resolution and Failsafes: Administering all dispute resolution processes, including those related to shared custody and malicious registration, as well as managing the emergency override and child-initiated dispute mechanisms.
6.2. Corporate Responsibilities (Platform Integration)
Digital service providers would be responsible for integrating with the government's API and managing the user-facing experience. It is important to note that while the list of mandatory controls is specific, the underlying capabilities are largely already standard operating procedure for large digital platforms, making implementation a matter of standardisation rather than invention. Their responsibilities include:
- API Integration: Connecting their account creation and management systems to the government's standardized API to check device status and facilitate parental linking.
- Platform-Side Logic: Implementing the code to support supervised account types, including restricting functionality of unlinked accounts on registered devices.
- Parental Dashboard and Controls: Providing and maintaining the parental dashboard for linked accounts, ensuring all mandatory controls (as defined in section 4.5) are available and functional.
- Content and Communication Serving: Securely serving account content (e.g., posts) to the government's hub-based interface and processing parental replies sent from the hub.
- Moderation Reporting: Transmitting the outcome of any parent-initiated report back to the Centralized Registry hub via the API.
- User Experience: Ensuring the entire process, from account creation to parental interaction, is smooth, transparent, and clearly communicated to all users.
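The core of the platform-side integration is a single check against the registry at account creation. A minimal sketch, assuming a hypothetical `lookup_device` call standing in for the government's standardized API (the endpoint, return values, and status strings are all assumptions):

```python
def on_account_creation(device_id, lookup_device):
    """Platform-side gate at signup: consult the Centralized Registry
    before completing account creation (sketch; names hypothetical).

    lookup_device(device_id) -> "child" | "adult" | None, where None
    means the device is not in the registry at all.
    """
    status = lookup_device(device_id)
    if status == "child":
        # Per section 4.3: pause signup until the parent approves
        # with an OTP from their hub.
        return "paused_pending_parental_approval"
    # Adult and unregistered devices proceed normally.
    return "created"
```

Because the platform only ever asks "is this device registered as a child device?", it learns nothing about the parent's identity or other devices in the household, which keeps the privacy exposure of the API narrow.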
7. Alignment with the Online Safety Act
This proposal directly supports the objectives of the Online Safety Act by enhancing user empowerment, promoting service provider responsibility, and creating powerful deterrents for abuse.
8. Addressing Potential Concerns & Failsafes
8.1. Foundational Safeguards
- Secure Identity Verification.
- Managing Device Ownership and Preventing Misuse.
- Dispute Resolution for Shared Custody.
- Criminal Penalties for Misuse.
8.2. Failsafes for At-Risk Individuals and System Integrity
- Emergency Override Mechanism.
- Child-Initiated Dispute Process.
- Automatic Emancipation at 16.
8.3. General Concerns
- Parental Rights and Privacy.
- Security of the Centralized Registry.
- Data Security and Third-Party Privacy.
- Implementation: A phased rollout could be mandated.
9. Requested Action
The ‘Parent-Child Digital Safety Link’ offers a robust, practical, and necessary evolution of Australia's online safety framework. I urge the Government to adopt this proposal as a key recommendation in an urgent review of the Online Safety Amendment (Social Media Minimum Age) Act.