zlacker

[return to "Google ordered to identify who watched certain YouTube videos"]
1. addict+J6[view] [source] 2024-03-23 02:39:20
>>wut42+(OP)
There are different incidents here.

The first one, where the police uploaded videos and wanted viewer information, is absolutely egregious and makes me wonder how a court could authorize that.

The next one I didn't fully understand, but it appears to have been in response to a swatting incident where the culprit is believed to have watched a specific camera livestream, and the police provided a lot of narrowing details (time period, certain other characteristics, etc.). That one seems far more legitimate.

◧◩
2. godels+Zb[view] [source] 2024-03-23 03:52:38
>>addict+J6
I don't understand how either of these is remotely constitutional. They sure aren't in the spirit of it.

They asked for information about a video watched 30k times. Supposing every person watched that video 10 times AND supposing the target was one of the viewers (it really isn't clear that this is true), that's 2,999 people who have had their rights violated to search for one. I believe Blackstone has something to say about this[0]. That's roughly 300 times Blackstone's ratio, and Blackstone heavily influenced the founding fathers.

I don't think any of this appears legitimate.

Edit: Oops, [0] https://en.wikipedia.org/wiki/Blackstone%27s_ratio
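
For anyone checking the arithmetic, a rough back-of-the-envelope (the views-per-person figure is just the assumption above, not a known number):

    # Back-of-the-envelope from the assumptions in this comment; the
    # views-per-person average is an assumption, not a known figure.
    total_views = 30_000
    views_per_person = 10                        # assumed average
    viewers = total_views // views_per_person    # ~3,000 distinct people
    bystanders = viewers - 1                     # everyone except the one target
    blackstone = 10                              # Blackstone's 10:1 ratio
    print(viewers, bystanders, bystanders / blackstone)   # 3000 2999 299.9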

◧◩◪
3. mingus+kd[view] [source] 2024-03-23 04:10:21
>>godels+Zb
Cell phone tower data has been used for a decade now in pretty much the same way.

Did you happen to pass by a cell tower in a major city around the time a crime was committed? We all have.

Well, your IMEI was included in a cell tower dump. Probably dozens of times.

Did you happen to drive your car over any bridge in the Bay Area lately? Did a municipal vehicle pass you and catch your license plate with their ALPR camera?

Guess what? Your name went through an LEO database search if they wanted to find a perp for that time/location.

Privacy has been dead for a long time. The worst part is people don’t care.

The Snowden files changed nothing. If there was ever a point in history where people would have given up their cell phones for their civil liberties, that would have been the time to do it.

◧◩◪◨
4. godels+bg[view] [source] 2024-03-23 04:54:06
>>mingus+kd
> Cell phone tower data has been used for a decade now in pretty much the same way.

I was mad then. I'm more mad now. Stop with these arguments; one abuse doesn't excuse the other. And who the fuck cares if someone wasn't mad before but is now? What's the argument, that you're a hipster? That's not solving problems. I don't want to gatekeep people from joining the movement to protect rights. I don't care whether they joined as tin foil hatters years ago or just yesterday after having been complicit in these atrocities. If you're here now, that's what matters.

> Privacy has been dead for a long time. The worst part is people don’t care.

Bull, and bull.

There are plenty of people fighting back. I'm pretty sure that my getting ads in languages I don't speak is at least some kind of good sign. Maybe I can't beat the NSA, sure, but can I beat mass surveillance? Can I beat 10%? 50%? 80%? 1% is better than 0%, and privacy will die the day we decide everything is binary.

People care. People are tired. People feel defeated. These are different things. If people didn't care, Apple (and even Google) wouldn't advertise themselves as privacy-conscious. Signal wouldn't exist, let alone have 50 million users. It's not time to lie down and give up.

> The Snowden files changed nothing.

They didn't change enough, but that isn't nothing.

◧◩◪◨⬒
5. alfied+op[view] [source] 2024-03-23 07:21:43
>>godels+bg
> > The Snowden files changed nothing.
>
> They didn't change enough, but that isn't nothing.

The biggest change IMHO was that the entire industry finally got off their collective assets and moved to HTTPS.

◧◩◪◨⬒⬓
6. PeterS+pt[view] [source] 2024-03-23 08:31:38
>>alfied+op
Had nothing to do with Snowden, but with Google ranking algo changes. Google has a commercial interest in hindering competitors in the ad brokering market from observing info on the wire.
◧◩◪◨⬒⬓⬔
7. mike_h+6F[view] [source] 2024-03-23 11:23:55
>>PeterS+pt
It had everything to do with Snowden. Source: I was at Google at the time he started leaking.

Before Snowden, encryption was mostly seen as a way to protect login forms. People knew it'd be nice to use it for everything, but there were difficult technical and capacity/budget problems in the way because SSL was slow.

After Snowden two things happened:

1. Encryption of everything became the company's top priority. Budget became unlimited, other projects were shelved, and whole teams were staffed to solve the latency problems. That covered not only Google's own public-facing web servers but all internal traffic, and they began explicitly working out what it'd take to get the entire internet encrypted.

2. End-to-end encryption of messengers (a misnomer IMHO but that's what they call it) went from an obscure feature for privacy and crypto nerds to a top priority project for every consumer facing app that took itself seriously.

The result was a massive increase in the amount of traffic that was encrypted. Maybe that would have eventually happened anyway, but it would have been far, far slower without Edward.

◧◩◪◨⬒⬓⬔⧯
8. lern_t+x41[view] [source] 2024-03-23 15:34:30
>>mike_h+6F
You were at Google at the time, but your memory of the ordering of events is off. Google used HTTPS everywhere before Snowden.[1][2]

HTTPS on just the login form protects the password, so a MITM can't collect it and reuse it on other websites, but it doesn't stop someone from simply taking the logged-in cookie and replaying it on the same website. That was a known issue before Snowden, and Google had already addressed it. Many other websites, including Yahoo, didn't start using HTTPS everywhere until after Snowden.[3] I know because this was something I was interested in when using the public WiFi points that were popping up at the time.

I also remember when Facebook moved their homepage to HTTPS.[4] Previously, only the login form POSTed to an HTTPS endpoint, but that doesn't protect against a MITM rewriting the login form on the HTTP page to point its action at the MITM's own server and collect your password, rendering the whole thing useless.
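
A minimal sketch of that logged-in-cookie problem, in case it isn't obvious why login-form-only HTTPS doesn't help (the host name and cookie name here are made up for illustration):

    # If only the login form is HTTPS, the session cookie still travels over
    # plain HTTP afterwards, so anyone who captured it can simply replay it.
    # No password needed. All names here are hypothetical.
    import requests

    stolen_cookie = {"SESSIONID": "value-captured-from-plain-http-traffic"}

    # Presenting the cookie is enough to act as the logged-in victim.
    resp = requests.get("http://mail.example.com/inbox", cookies=stolen_cookie)
    print(resp.status_code)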

What changed after Snowden was how Google encrypts traffic on its network, according to an article quoting you at the time.[5]

[1]https://gmail.googleblog.com/2010/01/default-https-access-fo...

[2]https://googleblog.blogspot.com/2011/10/making-search-more-s...

[3]https://www.zdnet.com/article/yahoo-finally-enables-https-en...

[4]https://techcrunch.com/2012/11/18/facebook-https/

[5]https://arstechnica.com/information-technology/2013/11/googl...

◧◩◪◨⬒⬓⬔⧯▣
9. fl0ki+jv1[view] [source] 2024-03-23 19:08:03
>>lern_t+x41
An important clarification is that the leaks about NSA snooping on Google motivated end-to-end encryption between all pairs of Google internal services. It was a technical marvel: every Stubby connection had mutual TLS without any extra code or configuration required. Non-Stubby traffic needed special security review because it had to reinvent much of the same.

People even got internal schwag shirts made with the iconic "SSL added and removed here" note [1]. It became part of the culture.

Over a decade later I still see most environments incur a lot of dev & ops overhead to get anywhere close to what Google got working completely transparently. The leak might have motivated the work, but the insight that it had to be automatic, foolproof, and universal is what made it so effective.
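
For contrast, a minimal sketch of what the hand-rolled version tends to look like elsewhere (standard-library Python; the file names and port are placeholders, not anything Google-specific). Stubby's trick was making the equivalent of this, plus cert issuance and rotation, invisible to service authors:

    # Hand-rolled mutual TLS for a single service; every service needs its own
    # copy of this, plus certificate issuance and rotation on top.
    import socket
    import ssl

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.verify_mode = ssl.CERT_REQUIRED              # require a client cert too
    ctx.load_cert_chain("server.crt", "server.key")  # this service's identity
    ctx.load_verify_locations("internal-ca.pem")     # CA that vouches for peers

    with socket.create_server(("0.0.0.0", 8443)) as srv:
        with ctx.wrap_socket(srv, server_side=True) as tls_srv:
            conn, addr = tls_srv.accept()            # handshake authenticates both ends
            print("authenticated peer:", conn.getpeercert().get("subject"))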

[1] https://blog.encrypt.me/2013/11/05/ssl-added-and-removed-her...

◧◩◪◨⬒⬓⬔⧯▣▦
10. mike_h+zC1[view] [source] 2024-03-23 20:13:50
>>fl0ki+jv1
A minor quibble; iirc it was only connections that crossed datacenters that were encrypted. RPC connections within a cluster didn't need it, as the fiber taps were all done on the long distance fibers or at telco switching centers.

But otherwise you're totally right. I suspect the NSA got a nasty shock when the internal RPCs started becoming encrypted nearly overnight, just weeks after the "added and removed here" presentation. The fact that Google could roll out a change of that magnitude and at that speed, across the entire organization, would have been quite astonishing to them. And to think... all that work reverse engineering the internal protocols, burned in a matter of weeks.
