Google is making a number of product changes to help protect minors online
Since the introduction of its Family Link service, Google has done a lot to meet the needs of families and children who use apps and services in its ecosystem: a Family tab on Nest Hub devices packed with games and learning experiences, filters on YouTube Kids, the ability to restrict app installs and set time limits on Chromebooks and Android devices, and more.
However, while we have been given many ways to control the experiences our children are served, there is still a long way to go in how we – especially as parents – can change or remove our children’s online footprint for their safety and privacy. Yes, many types of data can be removed from Search and other on-demand services if they violate Google’s terms of service, but those removals are geared more toward fraudulent, criminal, or explicit content.
Today on The Keyword, Google described several ways it seeks to give children and teens a safer online experience. For example, over the next few weeks, it will begin allowing anyone under the age of 18 – or the parents/guardians of those minors – to request that their images be removed from Google Image Search results. The company points out that removing an image from discovery through Image Search doesn’t mean the image is removed from the source website it was uploaded to.
Additionally, YouTube will begin setting the default upload visibility to the most private option for teens aged 13-17, as well as better surfacing digital wellbeing features and education around commercial content. In practice, this means that take a break and bedtime reminders will be enabled by default for YouTube users in this age range, and autoplay will be turned off automatically (these settings can be changed!)
Paying attention to the use of technology is key to everyone’s wellbeing. These new defaults for teens are protective; they increase safe, mindful use of technology by getting teens to think about what they want to see and who they want seeing their content.
Anne Collier, Executive Director of The Net Safety Collaborative
In terms of education around commercial content, that means YouTube will remove overly commercial content from YouTube Kids, such as a video that focuses on product packaging or directly encourages kids to spend money. While the platform has never really allowed paid product placements in kids’ videos, it is taking more intentional steps to get rid of this kind of content going forward. There is a lot more on the YouTube side discussed in a recent blog update, so be sure to read about it.
SafeSearch in Google Search will also be enabled for minors under 18 (instead of just those 13 and under), and adult content will be better filtered from Assistant with an upcoming set of SafeSearch-like defaults on smart displays. Going even further, supervised accounts will not be able to turn on Location History, a new safety section is launching in the Google Play Store for parents, and K-12 Google Workspace for Education administrators will have SafeSearch and other tools at their disposal, which will also be enabled by default for child accounts – phew!
It’s a lot of great work, and my first thought is – “Why weren’t these protections in place from the start?” To be honest, though, I often ask this question about all of Google’s products and services. If you want to see all of this and more in one place, the company is creating new “transparency resources,” launching over the next few months, that will help parents and children understand their data and how it is being used.