
My understanding is that these certs are vendor-specific (and potentially more finely grained than that - I think it'd be fine to have a separate cert per model, for instance) and are rooted in the firmware, not the hardware - that is, it's not the secure boot signing key, and a new firmware update can simply use a different platform certificate.

Having said that, there appear to be multiple vendors affected by this (looking at those hashes on VirusTotal, there are signatures that claim to be owned by Samsung, LG, and Mediatek, at least), which is certainly concerning. My entirely unsupported guess is that they may all share manufacturing for some models, and the certificates may have been on site there?

But the more significant question is how many phones trust these certificates. As I mentioned before, these could be model specific, and any leaked certificates could be tied to a very small number of phones. Or maybe Samsung used the same platform cert for all phones manufactured in the past decade, and that would be a much larger problem. With the information presently available we simply don't know how big a deal this is.

Edit to add: given Mediatek certs appear here, and given all the vendors linked to this have shipped low-end phones based on Mediatek SoCs, it wouldn't surprise me if it turns out Mediatek were the source.



> Edit to add: given Mediatek certs appear here, and given all the vendors linked to this have shipped low-end phones based on Mediatek SoCs, it wouldn't surprise me if it turns out Mediatek were the source.

Not surprising either, given that you can far more easily find MTK's leaked datasheets and other useful design documents than those of any of the other major Android SoC vendors. I could obtain the detailed documentation for my phone (which is of course rooted) thanks to that insecurity, so you can't say it's entirely a bad thing.


The post title here should really be corrected. It's editorialized and completely wrong.

There is no 'Android platform signing key'. Android is not a single OS provided by Google; it's an OS family. Each OEM has their own OS and keys, often per device or device family. Several non-Google OEMs either got tricked into signing malware or had at least one of these keys (platform key) compromised.

The issue is clearly marked as a partner issue discovered by their program for reporting issues to partners (non-Google Android OEMs). Google detected it, and that's why it's on a Google issue tracker.

The platform signing key is used to sign app-based components in the OS which are able to receive platform signature permissions, including some quite privileged access. Any app can define signature permissions. The platform signing key is the one used by the highly privileged app-based components of the platform itself, such as the Settings app.
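The signature-permission matching described above works roughly like this (a simplified conceptual sketch in Python, not Android's actual implementation, which also handles v3 rotation lineages and multiple signers):

```python
def grants_signature_permission(requesting_app_cert: bytes,
                                declaring_app_cert: bytes) -> bool:
    """A signature-protected permission is granted only when the requesting
    app is signed with the same certificate as the app that declared the
    permission. For platform signature permissions, the declaring component
    is part of the OS itself, so anything signed with the platform key
    receives them all. Simplified illustration only."""
    return requesting_app_cert == declaring_app_cert

# Why a leaked platform key is so dangerous: malware signed with it is
# indistinguishable from the OS's own privileged components.
platform_cert = b"OEM platform certificate"
malware_cert = platform_cert  # attacker signs with the compromised key
assert grants_signature_permission(malware_cert, platform_cert)
assert not grants_signature_permission(b"random third-party cert", platform_cert)
```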

Platform key is only really meant to be used for first-party code, normally all built with the OS but potentially updated outside it. It's not the verified boot key (AVB key) or the key for signing over-the-air updates (releasekey by default, but it can be a dedicated key). There are also similar keys for several other groups of OS components with internal signature permissions (media, shared, etc.).

All of these keys can be rotated, but aren't rotated in practice as part of the normal process of maintaining devices, partly because supporting phones for 5+ years is a recent development. The rotation that's done in practice is using a separate set of keys for each device / device family. For example, Google used new keys for each generation of Pixel phones, and they currently use separate keys for separate device models, although they've sometimes used the same keys for XL and non-XL variants in the same generation.

Even the verified boot key for the stock OS could be rotated by firmware updates since they're all shipped with OS updates and it's completely normal to do breaking changes requiring both new firmware and new OS. It'd require adding code to handle the rotation in several places (TEE and secure element pin it and use it as a bonus input for key derivation, which would need to continue using the old value) along with breaking anything pinning the hash for attestation. Only the SoC firmware signing key and firmware for other secondary components with the key burned into the hardware can't be rotated.


> Only the SoC firmware signing key and firmware for other secondary components with the key burned into the hardware can't be rotated.

Side note: those components do have fuses burned via updates that are used to implement downgrade protection dynamically. In theory, they could provide key rotation support. That may make sense as device support gets longer than 5 years. They should also really support more downgrade protection counter increments than they currently do. Snapdragon used to support 16 increments (16 fuses) but I'm not sure what they support in current generation devices. They called it a '16 bit version number' but you can't unset a bit.
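The fuse-based downgrade protection above can be modeled roughly like this (a toy illustration of the concept, not any vendor's actual implementation): each fuse is a one-time-programmable bit, so the minimum acceptable version can only ever increase.

```python
class FuseBank:
    """Toy model of an anti-rollback fuse bank. Each fuse starts at 0 and
    can only ever be burned to 1 (you can't unset a bit), so the effective
    version (the count of burned fuses) is monotonically increasing.
    Simplified sketch, not real firmware code."""

    def __init__(self, num_fuses: int = 16):  # e.g. the 16 increments noted above
        self.fuses = [0] * num_fuses

    @property
    def version(self) -> int:
        return sum(self.fuses)  # burned fuses = minimum acceptable version

    def burn_next(self) -> None:
        # Burning consumes a fuse permanently; IndexError once all are gone.
        self.fuses[self.version] = 1

    def accepts(self, image_version: int) -> bool:
        # Only boot images at or above the burned-in minimum version.
        return image_version >= self.version

bank = FuseBank()
bank.burn_next()
bank.burn_next()
assert bank.version == 2
assert not bank.accepts(1)  # downgrade to an older image is rejected
assert bank.accepts(2)
```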

OS downgrade protection is generally done with special replay-protected storage from the TEE (Pixel 2), the secure element (Pixel 3 and later), or simply by hard-wiring the version in firmware (a very common approach before Android Verified Boot 2.0). It's a lot more flexible, both in being able to do key rotation in theory and in not having any limit on the number of updates.

Pixels simply convert the Android / Pixel patch level such as 2022-11-05 into a Unix timestamp, and that's the OS rollback counter stored in the secure element after successful boot. This happens alongside updating the firmware rollback counter, which rarely increases. The Pixel 6 only bothered shipping the code to increment the firmware counter with Android 13, when they increased it. It's so uncommon that the external development community largely wasn't prepared to cope with it, even though it should be done much more regularly.
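That patch-level conversion is straightforward; a quick sketch of the described scheme (my own illustration, not Pixel's actual code):

```python
import calendar
import time

def patch_level_to_rollback_counter(patch_level: str) -> int:
    """Convert an Android/Pixel security patch level such as '2022-11-05'
    into a Unix timestamp, usable as a monotonic OS rollback counter."""
    return calendar.timegm(time.strptime(patch_level, "%Y-%m-%d"))

# Later patch levels always map to larger counters, so booting an older
# OS build is detectable by comparing the stored and offered values.
assert (patch_level_to_rollback_counter("2022-12-05")
        > patch_level_to_rollback_counter("2022-11-05"))
```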


Ok, we've changed the title to be that of the article.

(Submitted title was "Android platform signing key compromised".)


Take a look at the slideshow linked below: it contains some of the strings mentioned on VirusTotal for the malware in question (drv.androidsecurityteam.club). Sounds like malicious updaters are being signed by some vendors, perhaps?

https://maldroid.github.io/docs/vb_2022.pdf

I'll be curious when more details come out, but there's not much to go on yet.

EDIT: Sounds like this may all tie back to a contractor providing OTA update applications, either being compromised or directly malicious themselves: https://wuffs.org/blog/digitime-tech-fota-backdoors

"They provide Android ODMs/OEMs with the SystemFota updater APK, instructions on how to build OTA packages and a web portal where they can upload them and view statistics."


>My understanding is that these certs are vendor-specific (and potentially more finely grained than that - I think it'd be fine to have a separate cert per model, for instance) and are rooted in the firmware, not the hardware - that is, it's not the secure boot signing key, and a new firmware update can simply use a different platform certificate.

Correct, affected OEMs can still rotate the cert used to sign their system apps and then push an OTA update to deliver the updated apps. Then they can push app updates with that new cert.

However, v2 versus v3 signatures complicate things a bit, I think.


You can freely update from a v2 signature to a v3 signature.

If the minimum SDK version is below when v2 was introduced, Android will include both a v1 and v2 signature by default.

If the minimum SDK version is above when v2 was introduced but below when v3 was introduced, Android will only include a v2 signature by default.

It will only include a v3 signature by default if the minimum SDK version is above when v3 was introduced, which is rare.

The reason it works this way is because v2 was a significant security upgrade over v1 but v3 is just v2 with rotation support. If you do a key rotation, you get a v3 signature included to handle it for versions with key rotation support. If you support older versions, the users on those versions are still relying on the original keys. This is transparently supported in a way that it's backwards compatible.

It would simply waste space to include v2 + v3 when no rotation has happened yet. I don't know why they bothered to micro-optimize to this extent but they did.
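The default-scheme selection described above can be sketched like this (my own summary of the behavior as described, using the API levels where each scheme shipped: v2 with Android 7.0 / API 24, v3 with Android 9 / API 28):

```python
def default_signature_schemes(min_sdk: int) -> set:
    """Which APK signature scheme versions signing tools include by default,
    per the behavior described above. Sketch only; real tools also add a v3
    signature whenever a key rotation has actually been performed."""
    V2_INTRODUCED = 24  # Android 7.0
    V3_INTRODUCED = 28  # Android 9
    if min_sdk < V2_INTRODUCED:
        return {1, 2}   # v1 kept for pre-Nougat devices, v2 for everything newer
    if min_sdk < V3_INTRODUCED:
        return {2}      # every supported version understands v2
    return {3}          # v3 is v2 plus rotation support, so v2 would be redundant

assert default_signature_schemes(21) == {1, 2}
assert default_signature_schemes(26) == {2}
assert default_signature_schemes(28) == {3}
```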

For some reason the Play Store doesn't make use of Android's support for key rotation yet. They only support key rotation with Play Signing, since they're phasing out non-Play-signed apps. If you switch from non-Play-signed to Play Signing, they keep signing the app for existing installs with the old key but switch to a new key for fresh installs. They do the same thing if you trigger a key rotation through the UI. They could rotate the key for existing users on Android versions with v3 support. They might as well rotate it even for users on versions predating it, since if those users later got an OS upgrade, the rotation would start working right away.


I assume that the level that the malware is running at couldn’t simply modify the OTA file and inject itself back in?


Potentially but then you get into “reflections on trusting trust” type attacks, which are pretty hard to sustain.


According to the OP of the report, the key used to sign an OTA image and the key used to sign system apps are different.


If your device is rooted, it seems like maybe the malware could change the key used to verify the OTA update to a key controlled by the malware authors? (I don't know how that key is stored.)


Yes, but actually no. While they may be able to install a bogus update, unless they also compromised the keys for verified boot, the system won't boot if they change anything interesting, as AVB (typically) has a hardware root of trust.


I believe the author of this report and the discoverer are different people.


Oh yeah, that's true. Here's who I was citing then: https://twitter.com/maldr0id/status/1598467755887529990


Depending on the age of the devices in question, 'simply' may not accurately describe the fix, as Android vendors generally stop updating devices as soon as they can get away with it, security issue or otherwise. Unless these are newer devices, I'd expect that a fair number of them will never get an update.


All the more reason to highlight the simplicity. Any vendor who decides not to do something simple to protect their customers should be relegated to the “never buy” heap.


>Edit to add: given Mediatek certs appear here, and given all the vendors linked to this have shipped low-end phones based on Mediatek SoCs, it wouldn't surprise me if it turns out Mediatek were the source.

This implies that all the affected vendors shared their private keys with Mediatek. Why would they need to do this? (Genuine question, I don't know much about firmware. But it doesn't seem like it should be necessary at first glance.)


It's possible Mediatek (or an unrelated contractor) did the firmware work for them all so they're the ones holding the keys in the end.


> My entirely unsupported guess is that they may all share manufacturing for some models, and certificates may have been on site there?

I think manufacturing only needs the public key, so the compromise can’t really happen there.


Firmware builds may be outsourced to the ODM in some scenarios.



