- cross-posted to:
- [email protected]
Summary
- Google’s proposal, Web Environment Integrity (WEI), aims to send tamper-proof information about a user’s operating system and software to websites.
- The information sent would help reduce ad fraud and enhance security, but it also raises concerns about user autonomy and control over devices.
- The authors argue that implementing WEI could lead to websites blocking access for users not on approved systems and browsers.
- They express worries about companies gaining more control over users’ devices and the potential for abuse.
- The authors emphasize that users should have the final say over what information their devices share.
- Remote attestation tools, like WEI, might have their place in specific contexts but should not be implemented on the open web due to potential negative consequences.
- The authors advocate for preserving user autonomy and the openness of the web, emphasizing that users should be the ultimate decision-makers about their devices.
Joke:
Two pieces of string walk into a bar. The first piece of string asks for a drink. The bartender says, “Get lost. We don’t serve pieces of string.”
The second string ties a knot in his middle and messes up his ends. Then he orders a drink.
The bartender says, “Hey, you aren’t a piece of string, are you?” The piece of string says, “Not me! I’m a frayed knot.”
Firefox it is again?
Always has been
All the way. Don’t settle for just chrome plating.
I’m a happy Vivaldi user (features and configurability are more important to me!), but I’m sure this will be implemented in the Chromium open-source platform and not exclusively in Chrome (like some other features).
The problem is that Google has such a monopoly over web browsers that Firefox will most probably have to follow and implement this shit as well.
Smells like the “this website is only compatible with Internet Explorer 7 or higher” kind of stuff. That was bad back then; it will be a lot worse now.
On the other hand: a website implementing such functionality does not want me as a user. That’s fine. I’ll find the information elsewhere or give them useless data from within a VM. Starting and stopping minimalist single-purpose VMs isn’t hard nowadays.
It’s easy for us as we are tech literate, but I mostly think of the average person that “doesn’t care about privacy and personal data”. We’re also not Google’s main demographic. When most websites use this kind of shit, it will be extremely hard for everyone to get away from it.
> but I mostly think of the average person that “doesn’t care about privacy and personal data”
I stopped thinking of them. But yes, those people will have their data stolen by Google, as usual. But those people also don’t care one single bit about that.
I’ve been warning people for years that Google making up its own web standards will end in disaster.
At this point, I only keep Chrome around for the odd website that only works on Chrome. It’s astonishing how quickly Google is burning through good will lately.
I must say I really missed the option to ‘open links in apps’ and to use an ‘external download manager’ in Firefox mobile.
Your computer should say what you tell it to say - so if I want to spoof my browser and OS I can do that right? Right?
The magic words are “User-Agent header in the HTTP protocol”.
Also, the goal is not for everyone to spoof everyone else; the goal is to never trust any information a browser gives you. A good developer can always find a way around limits based on that information, so they’d be useless anyway.
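For anyone who hasn’t played with it: the User-Agent header is entirely client-controlled, so “spoofing your browser and OS” at that level is one line of code. A tiny Python sketch (the URL and UA string are just placeholders):

```python
import urllib.request

# The server only sees whatever User-Agent string the client chooses to send;
# this request claims to be desktop Firefox on Windows regardless of what actually runs it.
req = urllib.request.Request(
    "https://example.com/",  # placeholder URL
    headers={"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:115.0) Gecko/20100101 Firefox/115.0"},
)

with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.headers.get("Content-Type"))
```

WEI’s whole pitch is to replace this kind of self-reported, trivially forged information with something signed by an attester.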
Well, according to the proposal, it doesn’t send it to websites. It sends all your data to an attestation server, AKA Google probably, and the attestation server sends stuff to the website.
Can someone tell me how it can be “tamper proof”? Any encryption key inside chrome can be extracted and used to sign anything the user might want to send back.
The idea is that it would be similar to hardware attestation in Android. In fact, that’s where Google got the idea from.
Basically, this is the way it works (a rough code sketch follows the list):

- You download a web browser or another program (possibly even one baked into the OS, e.g. working alongside/relying on the TPM stuff from the BIOS). This is the “attester”. Attesters have a private key that they sign things with. This private key is baked into the binary of the attester (so you can’t patch the binary).
- A web page sends some data to the attester. Every request the web page sends will vary slightly, so an attestation can only be used for one request - you cannot intercept a “good” attestation and reuse it elsewhere. The ways attesters can respond may also vary, so you can’t just extract the signing key and sign your own stuff - it wouldn’t work when you get a different request.
- The attester takes that data and verifies that the device is running stuff that corresponds to the specs published by the attester - “this browser, this OS, not a VM, not Wine, not running this program, no ad blocker, subject to these rate limits,” etc.
- If it meets the requirements, the attester uses their private key to sign. (Remember that you can’t patch out the requirements check without changing the private key and thus invalidating everything.)
- The signed data is sent back to the web page, alongside as much information as the attester wants to provide. This information will match the signature and can be verified using a public key.
- The web page looks at the data and decides whether to trust the verdict or not. If something looks sketchy, the web page has the right to refuse to send any further data.
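Here’s a rough sketch of that flow in Python - not the actual WEI protocol, just an illustration. The verdict fields, the check logic, and the key handling are all made up, with an Ed25519 keypair from the `cryptography` package standing in for the attester’s baked-in key:

```python
import json
import os

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- Attester side (in reality the private key is baked into the attester binary
# or held in hardware, and the environment checks are far more involved) ---
ATTESTER_KEY = Ed25519PrivateKey.generate()
ATTESTER_PUBLIC_KEY = ATTESTER_KEY.public_key()  # published so websites can verify

def attest(challenge: bytes) -> dict:
    # Stand-in for the real checks: browser, OS, VM, ad blocker, etc.
    verdict = {"browser": "Chrome", "os": "Android", "vm": False, "ad_blocker": False}
    payload = challenge + json.dumps(verdict, sort_keys=True).encode()
    return {"verdict": verdict, "signature": ATTESTER_KEY.sign(payload)}

# --- Website side ---
challenge = os.urandom(32)  # fresh per request, so a "good" attestation can't be replayed
response = attest(challenge)

payload = challenge + json.dumps(response["verdict"], sort_keys=True).encode()
try:
    ATTESTER_PUBLIC_KEY.verify(response["signature"], payload)
    print("Verdict verified:", response["verdict"])
except InvalidSignature:
    print("Signature doesn't check out - refuse to serve this client")
```

Because the signature is bound to the per-request challenge, replaying an old verdict or forging one without the attester’s private key fails verification - so the scheme only holds up as long as that private key really can’t be extracted, which is what the replies about hardware-backed keys below get into.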
They also say they want to err towards having fewer checks, rather than many (“low entropy”). There are concerns about this being used for fingerprinting/tracking, and high entropy would allow for that. (Note that this does explicitly contradict the point the authors made earlier, that “Including more information in the verdict will cover a wider range of use cases without locking out older devices.”)
That said - we all know where this will go. If Edge is made an attester, it will not be low entropy. Low entropy makes it harder to track, which benefits Google as they have their own ways of tracking users due to a near-monopoly over the web. Google doesn’t want to give rivals a good way to compete with user tracking, which is why they’re pushing “low-entropy” under the guise of privacy. Microsoft is incentivized to go high-entropy as it gives a better fingerprint. If the attestation server is built into Windows, we have the same thing.
- The attester here is really mostly Google’s Android/Play Services/(ChromeOS) team, not Google’s Chrome team. Chrome is really just responsible for passing it along and potentially adding some more information like what kind of extensions are in use, but the real validator is above Chrome entirely.
There will not really be a worthwhile key inside Chrome (there might be one that does nothing by itself); it’ll be backed by the existing per-device-unique key living inside your phone’s secure enclave. Extracting one key would just cause Google to ban it. That attestation covers the software in the secure enclave, your device’s running OS, bootloader unlock state and a couple of other things along those lines; the OS, guaranteed to be unmodified by the hardware attestation layer, then adds extra stuff on top like the .apk hash of the browser. The browser, guaranteed to be unmodified by the OS layer, can add things like extension info if it wants to.
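To picture that layering, here’s a toy Python sketch - not how Android key attestation is actually structured (real devices use certificate chains rooted in the enclave, and much more), and every field name here is invented, with a software key standing in for the enclave’s hardware-protected one:

```python
import hashlib
import json

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Software stand-in for the per-device key that really lives in the secure enclave.
enclave_key = Ed25519PrivateKey.generate()
enclave_pub = enclave_key.public_key()

# 1. The enclave attests the boot state and a measurement (hash) of the running OS.
os_image = b"pretend this is the OS image"
hw_claims = {"bootloader_locked": True, "os_hash": hashlib.sha256(os_image).hexdigest()}
hw_payload = json.dumps(hw_claims, sort_keys=True).encode()
hw_signature = enclave_key.sign(hw_payload)

# 2. The OS - trusted only because its hash was attested above - reports the browser's .apk hash.
os_claims = {"browser_apk_hash": hashlib.sha256(b"pretend this is chrome.apk").hexdigest()}

# 3. The browser - trusted only because the OS vouched for its .apk - adds its own extras.
browser_claims = {"extensions": []}

# Verifier: one hardware signature anchors the whole stack; each layer's claims are
# believable only because the layer beneath it is known to be unmodified.
enclave_pub.verify(hw_signature, hw_payload)  # raises InvalidSignature if tampered with
print({"hardware": hw_claims, "os": os_claims, "browser": browser_claims})
```

Extracting a single browser-level key buys you nothing, because the layers below it won’t vouch for a modified browser, and Google can simply revoke a key that leaks.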
SafetyNet/Play Integrity have both software and hardware modes, but all Android + Google Services phones released in the past 6(?) or so years have been required to have hardware-backed attestation support, which has no known bypass. The existing “Universal SafetyNet Fix” pretends to be a phone without hardware support, which Google begrudgingly accepts… for now. But the day when Google just screws over older phones is getting ever closer, and they already have the power to force hardware-backed attestation for device-specific features like NFC payments and DRM support.
On Apple devices, Apple has parallels via their secure enclaves in the form of App Attest/DeviceCheck. On Windows desktops, there could be a shoddy implementation with TPMs (fortunately they’re not quite powerful enough to do this kind of attestation in a tamper-proof way; Microsoft’s Pluton chips might have some secret sauce we haven’t yet seen, though). On Linux desktops… nope, ain’t no support for this coming anytime ever.
Even if you resist TPM and WEI - if you don’t have WEI for whatever reason, I don’t think you’ll be stopped from using Google services and YouTube… but you’ll probably face a shit ton more captchas and 2FA checks while it wears down your sanity.