Why Google Is Scrubbing Personal Info From Search Results (if You Ask Nicely) - Gizmodo

Oct 01, 2022 · 6 min 42 sec read

Today, lots of people can find out personal information about you—phone number, physical address—with a simple Google search.

Google knows this, and it’s giving ordinary people the chance to scrub this information from its all-knowing search engine.

On Wednesday, the search giant launched “Results About You,” a new tool that allows users to request the removal of their physical address, phone number, and email address with just a few clicks.

Danny Sullivan, Google’s public liaison for Search, told Gizmodo in a video chat interview this week that, over time, people have become more sensitive to their personal information appearing in search results.

The search giant has more than half a dozen removal policies related to different content, from involuntary fake porn and sites with exploitative removal practices to images of minors and doxxing content.

Gizmodo spoke to Sullivan this week about Results About You and other aspects of Google Search, such as whether the new alerts feature is basically a repackaged version of Google Alerts, what person or algorithm reviews content removal requests, and what Google is doing to ensure the public understands its miles-long content removal policies.

DS: It’s to better offer two-way communication between the search team and people outside of Google.

Gizmodo: How would you describe the problem of personally identifiable information on Google Search and the steps Google has been taking to address it?

DS: I think people are more sensitive to it over time because there’s been more information that’s out there and more people continue to search.

There are a few cases where there might be a public aspect, but we can also address some of the concerns people have, especially, I think, because they’ve seen some of this content go onto third-party websites and you’re like, “I don’t know why it’s there.”

Basically, if it’s not spam and it’s not something illegal—like we have laws we have to react to, like with child sexual abuse content—we tend to leave it there and trust in our ranking system to try to show the most helpful stuff that we can, so that we’re not stepping in and then somehow taking information out that other people might find useful.

But this was a case where we said there’s enough interest in having this type of thing, and we think those concerns can be met without impacting the search results in a way that makes them less helpful to people.

Gizmodo: In recent months and years, there have been very high-profile investigations by The New York Times and other media outlets about people whose personally identifiable information has appeared on Google Search and identified them as pedophiles, for instance.

DS: It really is broadening it toward people who don’t necessarily have some harm that they’re actually having to show; it’s just that ‘I would want to be more comfortable,’ right?

It was because we understand that, especially if you’re a minor, you might post images, or friends might do it, [and realize] ‘Oh, I really don’t want those showing up in search, even though they’re out on the open web.’ So, it’s just designed to make it easier for people, just ordinary people.

DS: I totally agree. We have lots of different policies, and we have a whole page if you [type] ‘remove information from Google.’ It will list things like, ‘Is it doxxing?’ or ‘Is it nonconsensual imagery?’ and so on.

You’re in the moment looking at the search results saying, ‘Oh, I’m concerned about this,’ and you click on it and you can see [whether the information you’re concerned about] matches [a Google policy for removal], then put in a request and be guided through the process without necessarily having to read all the details.

DS: It’s still being developed, so I don’t have a whole lot to say, but I think you can envision it would be kind of like an alert where you have something watching for you, which is what Google Alerts does.

Sometimes people just need to share a bit more information so that we understand the context a bit more.

DS: Well, I think for a lot of people, most of those policies aren’t an issue for them.

I think most people are probably not thinking, ‘Gosh, I had irrelevant pornography associated with my name.’ That’s like a really weird situation where someone has created a porn site and then they scrape a bunch of names that have nothing to do with it and they just generate this stuff.

That doesn’t impact that many people, so it’s probably not to the degree that we need to build it into the tool just yet.

In contrast, the things that we’re [addressing] with this kind of tool really are the things that impact people a lot and things that we think will be helpful to a lot of people.

And then, it’s also nice in the sense that we have the policies now, to the degree that we can say, ‘Well, how do we make it even easier for people to act on these policies and become aware of them, and [build] it into the app and [build] it into the system that’s right there?’ Because I think when people really think, ‘I want to deal with something,’ it’s when they’ve done a search and they’re in the moment and they realize, ‘Oh, I don’t like this in relation to me, how do I deal with it?’ And now, for the first time in ages that I can think of, you can interact with it right from the search results and know how to go about it.

Gizmodo: Can you imagine something like that for Google in the future, where there’s a hub that helps people match the content they want removed from search results with the Google policy on it?

DS: When you click on a thing next to a search result, it’s going to tell you things like, ‘Do you want to remove this?’

So, it really is exactly what you’re talking about: how do we match the policies up into this sort of tool and guide people better to those sorts of solutions?

DS: With some of the [policies], you’ll be able to go through the removal process completely within the [tool], like, ‘This is my personal info and I want it removed.’

So, you could still make use of the right to be forgotten for certain things if you want, and there are things [our content removal policies] won’t cover that the right to be forgotten might cover.

As I understand it, you might not like an article that was written about a crime that maybe you were convicted of, but now it’s old and you’re like, ‘I just don’t want that showing up anymore.’ We don’t have removal policies for that outside the EU.

DS: The tool right now is only in US English, but the policies are global in nature, so people can already use them around the world, and then we’ll bring that out as well.

It’s not that what the tool does is only for people in the US searching in English; it’s just that we only have the tool process for it.

But I know one of the challenges was, ‘How do we communicate all these different policies quickly, concisely, [and] in a way that’s helpful to people as they go through this tool form?’

And from what I’ve seen, I look at it and I’m like, ‘Wow, this is actually good,’ [from] my perspective, where I’m always trying to think about how we can explain things as clearly as we can.
