Google’s proposed child‑safety measures a big step forward; more details needed to evaluate policy changes
For Immediate Release
Winnipeg, Canada — The Canadian Centre for Child Protection (C3P) welcomes news of Google’s plan to bring about significant design and policy changes to better protect children and adolescents in the digital space, and is hopeful the company will also provide more details in the near future on some key questions raised by this initiative.
Earlier this week, Google said it will introduce a new policy that enables anyone under the age of 18, or their parent or guardian, to request the removal of their images from Google Image search results. Several other changes to its products and platforms, including YouTube, will also take place, such as limiting commercial content directed at minors and enabling privacy protections and adult‑content filtering by default for accounts held by minors.
“We view these many safety‑by‑design changes proposed by Google favorably. That said, online harm to children is a complex issue and more details are needed in order to fully evaluate how this will play out in practice,” says Lianna McDonald, Executive Director for C3P.
Key questions about the proposal include:
- How will Google accurately verify the age of users so as to prevent minors from easily circumventing age restrictions?
- Will Google’s policy allow individuals who are now adults to request that images depicting them as minors be removed from Google Image search results?
- What type of information will Google be requesting from parents and/or underage individuals to facilitate removal?
- Similar to the removal or blocking of copyrighted material on YouTube, will Google remove videos, or sections of videos, depicting a person under 18 upon request?
A recent global report by C3P provides a series of recommendations related to website design and content moderation to ensure children are safe online.
Media relations contact: 1 (204) 560-0723
communications@protectchildren.ca