- cross-posted to:
- google@lemmy.world
- technology@lemmy.ml
Archived version: https://archive.li/TbziV
Google is launching new privacy tools that give users more control over unwanted personal images online and help ensure explicit or graphic photos do not appear easily in search results.
Updates to Google's policies on personal explicit images mean that users will be able to request the removal of non-consensual explicit imagery of themselves that they no longer wish to appear in search results.
The update means that even if an individual created and uploaded explicit content to a website themselves, and no longer wishes for it to be available on search, they will be able to request its removal from Google search. The forms for submitting removal requests have also been simplified. The policy does not apply to images users are currently and actively commercialising.
The policy also applies to websites containing personal information.
Google will also roll out a new dashboard, initially available only in the US and in English, that will let users know which search results display their contact information. Users can then quickly request the removal of these results from Google. The tool will also send a notification when new results containing a user's information appear in search.
A new blurring setting in SafeSearch will also become the default on Google search for users who do not already have SafeSearch filtering turned on. Explicit imagery, such as adult or graphically violent content, will be blurred by default when it appears in search results. The setting can be turned off at any time, unless you are a supervised user or on a public network that has kept this setting as the default and locked it.
For instance, in an image search for "injury", explicit results will be blurred to prevent users from being shown graphic content.
Google initially announced this safeguard in February, and it will be launched globally in August.
Thanks for the laugh.
Hah, likewise :)