• 1 Post
  • 266 Comments
Joined 1 year ago
Cake day: September 11th, 2023

  • Funny thing: I’ve been driving for 30+ years and have never taken a formal driving test:

    • Permit at 15: No tests; the only restriction was having a licensed driver along in the vehicle

    • License at 16: Had driver’s ed in school, the state’s driving test was waived, and the license transferred to other states without any new driving test

    • Motorcycle license: Took a safety course while in the military, the state added the endorsement without any test, and it too transferred out-of-state

    • Heavy vehicles: Trained on military 5-tons/deuce-and-a-halfs/front loaders/HMMWVs - all kinds of heavy equipment - no formal tests, only a unit sign-off (even on civilian roads)

    • The kicker: I now live in a US state where a driver’s license stays valid with no re-testing until age 64

    Gonna suck when I actually do have to take a test. Hopefully there will be sane infra to go completely driverless by the time I get that old.



  • executing foreign JavaScript

    This is a great point that I try to convey to my less-technical friends and family. Looking at a webpage is not like changing the channel on a TV of old: looking at a webpage pulls code from who-knows-where and executes it on your local machine.

    These advertisers expect that I should blindly trust them to execute code on my CPU, in my memory, on my machine? Yeah fuck that, it’s a privilege. I don’t invite every hobo walking by to come into my house and take a shit in my toilet.

    If they don’t like that not everyone executes their syphilis-ridden JavaScript, then they should put their shit behind a paywall. But they won’t, since they know they don’t have a product worth paying for.





  • Doesn’t it require a separate process to be using the cryptographic algorithm in the first place in order to fill the cache in question?

    Yes, that’s my understanding. I haven’t looked at the code, but their high-level explanation sounds like their app makes calls to an API that can cause the under-the-hood crypto “service” to pull the keys into the cache, and there’s an element of luck as to whether they snag portions of the keys at that exact moment. So it seems the crafted app can’t manipulate the crypto service directly, which makes sense if it’s only a user-land app without root privileges.
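
    To make the cache-snooping part concrete, here’s a minimal prime+probe style sketch in C. This is the classic cache-timing idea, much simpler than GoFetch’s actual DMP-based channel, and every constant and name in it is illustrative rather than taken from the paper:

    ```c
    /* Hedged sketch: classic prime+probe cache timing, NOT GoFetch's DMP
     * channel. All sizes/names are illustrative. */
    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define LINE 64                        /* typical cache line size */
    #define EVICT_SIZE (8u * 1024 * 1024)  /* bigger than the target cache */

    static uint64_t now_ns(void) {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return (uint64_t)ts.tv_sec * 1000000000ull + (uint64_t)ts.tv_nsec;
    }

    int main(void) {
        uint8_t *evict = calloc(EVICT_SIZE, 1);
        static uint8_t target[LINE];       /* stand-in for a line the victim touches */
        volatile uint8_t *probe = target;
        if (!evict) return 1;

        /* Prime: walk a large buffer so `target` gets evicted from the cache. */
        for (size_t i = 0; i < EVICT_SIZE; i += LINE) evict[i]++;

        /* A real attack would now trigger victim activity, e.g. call an API
         * that makes the crypto service touch its keys. */

        /* Probe: time one access; a fast hit implies the line was reloaded. */
        uint64_t t0 = now_ns();
        (void)probe[0];
        uint64_t t1 = now_ns();
        printf("probe latency: %llu ns\n", (unsigned long long)(t1 - t0));

        free(evict);
        return 0;
    }
    ```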

    why wouldn’t the app just steal your password and avoid all of this in the first place?

    I believe it’s due to the app not having root privileges, and so being constrained to going through layers of abstraction to get its crypto needs met. I do not know the exact software architecture of iOS/macOS, but I guarantee there’s a notion of needing to call an API for these types of things. For instance, if your app needs to push/pull an object it owns in/out of iCloud, you’d call the API with a number of arguments to do so. You would not have the ability to access keys directly and perform the encrypt/decrypt all by yourself. Likewise with any passwords: you would instead make an API call, and the backing code/service would have that isolated/controlled access.
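
    As a sketch of that mediated access, here’s roughly what asking the keychain for a stored secret looks like with macOS’s C-level Security framework. The service name is made up; the point is that the app receives the secret through SecItemCopyMatching and never touches the keychain’s own keys or does the decryption itself:

    ```c
    /* Hedged sketch: fetch a stored password via the keychain service.
     * "com.example.myapp" is an illustrative service name.
     * Build with: cc item.c -framework Security -framework CoreFoundation */
    #include <CoreFoundation/CoreFoundation.h>
    #include <Security/Security.h>
    #include <stdio.h>

    int main(void) {
        const void *keys[] = { kSecClass, kSecAttrService, kSecReturnData };
        const void *vals[] = { kSecClassGenericPassword,
                               CFSTR("com.example.myapp"), kCFBooleanTrue };
        CFDictionaryRef query = CFDictionaryCreate(NULL, keys, vals, 3,
            &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);

        /* The keychain service performs the lookup/decryption on our behalf. */
        CFTypeRef result = NULL;
        OSStatus status = SecItemCopyMatching(query, &result);
        printf("SecItemCopyMatching -> %d\n", (int)status);

        if (result) CFRelease(result);
        CFRelease(query);
        return 0;
    }
    ```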



  • So it’s been a while since I had my OS and microcomputer architecture classes, but it really looks like GoFetch could be a real turd in the punch bowl. It appears to be on par with the Intel vulns of recent years.

    which would have this hardware issue but don’t have actual background processing

    So I’ve read the same about iOS only allowing one user-space app in the foreground at a time, but… that still leaves the entirety of kernel-space processes allowed to run any time they want. So it’s not hard to imagine a compromised app running in the foreground, all the while running the GoFetch technique to mine key material, while the OS shuffles crypto keys in the background on the same processor cluster.
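
    On the “same processor cluster” point: as far as I know a user-land process can’t pin itself to specific cores on Apple platforms, but it can request a high QoS class, which biases the scheduler toward the performance cluster. A small sketch with the (real, Apple-only) pthread QoS API:

    ```c
    /* Hedged sketch: request scheduling that favors the performance cores,
     * where high-priority crypto work is also likely to land. */
    #include <pthread.h>
    #include <pthread/qos.h>
    #include <stdio.h>

    int main(void) {
        /* QOS_CLASS_USER_INTERACTIVE biases this thread toward P-cores. */
        int rc = pthread_set_qos_class_self_np(QOS_CLASS_USER_INTERACTIVE, 0);
        printf("qos request: %s\n", rc == 0 ? "ok" : "failed");
        /* ...cache probes (as in the earlier sketch) would run here... */
        return 0;
    }
    ```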

    The other thing I’d like to address is that you’re assuming this code would necessarily require physical access to compromise a machine. That is certainly one vector, but I’d posit there are other, simpler ways to do the same. The two that come to mind right away are (1) a compromised app delivered via official channels like the app store, or, even scarier, (2) malicious JavaScript hidden on compromised websites. The white paper indicates this code doesn’t need root; it only needs to execute on the same cluster the crypto keys happen to pass through. So either of these vectors seems like a very real possibility to me.

    Edit to add:

    I seem to recall reading a paper claiming the stock TikTok app was actually polyglot, in that the app would download a binary after installation, such that what’s executed on an end user’s machine is not what went through the app store scanners. I’ve read of other apps using a similar technique for mini-upgrades, which is a useful way to avoid going through app store approval every time you need to roll out a hotfix or that latest minor feature.
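
    For illustration, the “download then execute” pattern would look roughly like this with POSIX dlopen/dlsym. The payload path and entry symbol are made up; note that iOS code signing blocks loading unsigned native binaries like this, which is why real hot-update schemes tend to ship interpreted (e.g. JavaScript) payloads instead:

    ```c
    /* Hedged sketch: load and run code fetched after install, so the bytes
     * executed never went through store review. Path/symbol are illustrative. */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void) {
        /* Assume the app already downloaded this file post-install. */
        void *h = dlopen("/tmp/hotfix_payload.dylib", RTLD_NOW);
        if (!h) { fprintf(stderr, "dlopen: %s\n", dlerror()); return 1; }

        /* Look up an entry point the store scanners never saw. */
        void (*entry)(void) = (void (*)(void))dlsym(h, "payload_main");
        if (entry) entry();

        dlclose(h);
        return 0;
    }
    ```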

    If these mechanisms haven’t already been smacked down by Apple/Google, or worse, aren’t detectable by Apple/Google, this could be a seriously valuable tool for state-level actors able to pull off the feat of hiding it in plain sight. I wonder if this might be part of what Congress was briefed on recently, and why the vote to wipe out TikTok was near-unanimous. “Hey congress people, all your iPhones are about to be compromised… your Tinder/Grindr/OnlyFans kinks are about to become blackmail fodder.”