- Nov 10, 2017
Child safety in online environments appears to sit near the top of big tech firms' priority lists. Apple recently announced that it will use photo-scanning technology to detect Child Sexual Abuse Material (CSAM) on iCloud, and will also use on-device machine learning to blur sexually explicit content sent or received by children under 13 years of age. You can find out more details here. In the same vein, Google revealed today how it will offer safer online experiences for minors across its range of products.
First up is a policy change: Google will allow people under the age of 18, or their parents and guardians, to request the removal of their images from Google Image search results. While this obviously won't remove the image in question from the internet altogether, it will significantly reduce its visibility.
In a bid to make the internet a safer space for teenagers, Google is making changes to many of its major products: SafeSearch will be enabled by default for teens, and YouTube uploads from teen accounts will default to private.