Hacker News | sspiff's comments

Most modern manufacturers disallow unlocking the bootloader and flashing unsigned firmware, both of which are requirements for this kind of thing.

LineageOS isn't unsigned; it just happens to be signed by keys that are not "trusted" (i.e., allowed - thanks for the correction!) by the phones' bootloaders.

"Not allowed" is clearer language here.

That's effectively the same thing.

The whole point of the majority of PKI (including SecureBoot) is that some third party agrees that the signature is valid; without that, even though it's “technically signed”, it may as well not be.


I disagree. If LineageOS builds were actually unsigned, I would have no way of verifying that release N was signed by the same private-key-bearing entity that signed release N-1, which I happen to have installed. It could be construed as the effective difference between a Trust On First Use (TOFU) vs. a Certificate Authority (CA) style ecosystem. I hope you can agree that TOFU is worth MUCH more than having no assurance about (continued) authorship at all.
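
To make the TOFU point concrete, here's a rough sketch (using the pyca/cryptography package; the Ed25519 choice and the names are purely illustrative, not how LineageOS actually distributes updates): pin a fingerprint of whichever key signed the release you first installed, then accept an update only if its signature is both cryptographically valid and made by that same pinned key.

    import hashlib
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import ed25519

    def fingerprint(public_key: ed25519.Ed25519PublicKey) -> str:
        raw = public_key.public_bytes(
            serialization.Encoding.Raw, serialization.PublicFormat.Raw)
        return hashlib.sha256(raw).hexdigest()

    def verify_update(pinned_fp: str, public_key, signature: bytes, payload: bytes) -> bool:
        try:
            public_key.verify(signature, payload)    # is the signature valid at all?
        except InvalidSignature:
            return False
        return fingerprint(public_key) == pinned_fp  # TOFU: same signer as last time?

    # First install: whatever key signed release N becomes the pinned identity.
    signer = ed25519.Ed25519PrivateKey.generate()
    pinned = fingerprint(signer.public_key())

    # Later update: accepted only because it's valid AND from the pinned key.
    payload = b"lineage-release-N+1.zip"
    assert verify_update(pinned, signer.public_key(), signer.sign(payload), payload)

None of that needs a CA, yet it still tells me that release N+1 came from whoever shipped release N, which is exactly the continuity guarantee I care about.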

Yes, I understand the value of signatures, but that's not how PKI works.

If the owner of a device can't sign and install their own software, then your definition of PKI doesn't "work" at all.

The first party must be able to decide who that "some third party" is, for it to be anything more than an obfuscation of digital serfdom.


The difference between “PKI” and “just signing with a private key” is the trusted authority infrastructure. Without that you still get the benefit of signatures and some degree of verification: you can still validate what you install.

But in reality this trustworthiness check is handed over by the manufacturer to an infrastructure made up of these trusted parties in the owner’s name, and there’s nothing the owner can do about it. The owner may be able to validate software is signed with the expected key but still not be able to use it because the device wants PKI validation, not owner validation.

I’ve been self-signing stuff in my home and homelab for decades. Everything works just the same technically but step outside and my trustworthiness is 0 for everyone else who relies on PKI.


[flagged]


> My definition of PKI is the one we’re using for TLS, some random array of “trusted” third parties can issue keys

Maybe read the actual definition before assuming you're so much smarter than "HN". One doesn't need third parties to have PKI; it's a concept, and you can roll your own.


“Read the actual definition”; stellar contribution there, mate. I checked, and sure enough it's exactly in line with my comments.

I’ve been discussing the practical implementation of PKI as it exists in the real world, specifically in the context of bootloader verification and TLS certificate validation. You know, the actual systems people use every day.

But please, do enlighten me with whatever Wikipedia definition you’ve just skimmed that you think contradicts anything I’ve said. Because here’s the thing: whether you want to pedantically define PKI as “any infrastructure involving public keys” or specifically as “a hierarchical trust model with certificate authorities,” my point stands completely unchanged.

In the context that spawned this entire thread, LineageOS and bootloader signature verification, there is a chain of trust, there are designated trusted authorities, and signatures outside that chain are rejected. That’s PKI. That’s how it works. That’s what I described.

If your objection is that I should have been more precise about distinguishing between “Web PKI” and “PKI generally,” then congratulations on missing the forest for the trees whilst simultaneously contributing absolutely nothing of substance to the discussion.

But sure, I’m the one who needs to read definitions. Perhaps you’d care to actually articulate which part of my explanation was functionally incorrect for the use case being discussed, rather than posting a single snarky sentence that says precisely nothing?

EDIT: your edit is much more nuanced but still misses the point; https://imgur.com/a/n2VwltC


The snarky tone and sarcasm are not helping your case in this thread.

The tone matched the engagement I received. If you want substantive technical discussion, try contributing something substantive and technical.

I've explained the same point three different ways now. Not one person has actually demonstrated where the technical argument is wrong, just deflected to TOFU comparisons, philosophical ownership debates, and now tone policing.

If Aachen has an actual technical refutation, I'm all ears. But "read the definition" isn't one, and neither is complaining about snark whilst continuing to avoid the substance.


> I've explained the same point three different ways now.

But you're demonstrably wrong. The purpose of a PKI is to map keys to identities. There's no CA located across the network that gets queried by the Android boot process. Merely a local store of trusted signing keys. AVB has the same general shape as SecureBoot.

The point of secure boot isn't to involve a third party. It's to prevent tampering and possibly also hardware theft.

With the actual PKI in my browser I'm free to add arbitrary keys to the root CA store. With SecureBoot on my laptop I'm free to add arbitrary signing keys.

The issue has nothing to do with PKI or TOFU or whatever else. It's bootloaders that don't permit enrolling your own keys.


> The purpose of a PKI is to map keys to identities

No, the purpose is "can I trust this entity". The mapping is the mechanism, not the purpose.

> There's no CA located across the network that gets queried by the Android boot process

You think browser PKI queries CAs over the network? It doesn't. The certificate is validated against a local trust store, exactly as the bootloader does. If it's not signed by a trusted authority in that store, it's rejected. Same mechanism.

> The point of secure boot isn't to involve a third party

SecureBoot was designed by Microsoft, for Microsoft. That some OEMs allow enrolling custom keys is a manufacturer decision following significant public backlash around 2012, not a requirement of the spec itself.

> The issue has nothing to do with PKI [...] It's bootloaders that don't permit enrolling your own keys

Right, so in the context of locked bootloaders (the actual discussion) "unsigned" and "signed by an untrusted key" produce identical results: rejection.

Where exactly am I "demonstrably wrong"?


Look I'm not even clear where you're trying to go with this. You honestly just come across as wanting to argue pointlessly.

You compared bootloader validation to TLS verification. The purpose of TLS CAs is to verify that the entity is who they claim to be. Nothing more, nothing less. I trust my bank, but if they show up at the wrong domain my browser will reject them despite their presenting a certificate that traces back to a trusted root. It isn't a matter of trust; it's a matter of identity.

Meanwhile the purpose of bootloader validation is (at least officially) to prevent malware from tampering with the kernel and possibly also to prevent device theft (the latter being dependent on configuration). Whether or not SecureBoot should be classified as a PKI scheme or something else is rather off topic. The underlying purpose is entirely different from that of TLS.

> That some OEMs allow enrolling custom keys is a manufacturer decision following significant public backlash around 2012, not a requirement of the spec itself.

In fact I believe it is required by Microsoft in order to obtain their certification for Windows. Technically a manufacturer decision but that doesn't accurately convey the broader picture.

Again, where are you going with this? It seems as though you're trying to score imaginary points.

> Where exactly am I "demonstrably wrong"?

You claimed that the point of SecureBoot is to involve a third party. It is not. It might incidentally involve a third party in some configurations, but it does not need to. The actual point of the thing is to prevent low-level malware.


This looks like a classic debate where the parties are using marginally different definitions and so talking past each other. You're obviously both right by certain definitions. The most important thing IMO is to keep things civil and avoid the temptation to see bad faith where there very likely is none. Keep this place special.

I said, from the point of view of the bootloader: signed with an untrusted certificate and unsigned are effectively the same thing.

Somehow this was controversial.


Good to know there are reply bots out there that copy out content immediately. I rarely run into edit conflicts (where someone reads before I add in another thing), but it happens; maybe this is why. Sorry for that.

Besides the "what does PKI mean" discussion, as for who "misses the point" here, consider that both sides in a discussion have a chance of having missed the original point of a reply (it's not always only about how the world is / what the signing keys are, but how the world should be / whose keys should control a device). But the previous post was already in such a tone that it really doesn't matter who's right; it's not a discussion worth having anymore.


You misunderstood, it appears.

Or it's collective ignorance; can't be sure.

Public key infrastructure without CAs isn't a thing as far as I can see. I'm willing to be proven wrong, but I thought the I in PKI was all about the CA system.

We have PGP, but that's not PKI; that's peer-based public key cryptography.


A PKI is any scheme that involves third parties (i.e., infrastructure) to validate the mapping of key to identity. The US DoD runs a massive PKI. Web of trust (incl. PGP) is debatably a form of PKI. DID is a PKI specification. You can set up an internal PKI for use with ssh. The list goes on.

I don't know what's going on in this thread. Of course PKI needs some root of trust. That root HAS to be predefined. What do people think all the browsers are doing?

Lineage is signed, sure. It needs to be blessed with that root for it to work on that device.


They're assuming PKI is built on a fixed set of root CAs. That's not the case, as others have pointed out - only for major browsers. Subtle nuance, but their shitty, arrogant tone made me not want to elaborate.

"Subtle nuance" he says, after I've spent multiple comments explaining that bootloaders reject unsigned and untrusted-signed code identically, whilst he and others insist there's some meaningful technical distinction (which none of you have articulated).

Then you admit you actually understood this the entire time, but my tone put you off elaborating.

So you watched this thread pile on someone for being technically correct, said nothing of substance, and now reveal you knew they were right all along but simply chose not to contribute because you didn't like how they said it.

That's not you taking the high road, mate. That's you admitting you prioritised posturing over clarity, then got smug about it.

Brilliant contribution. Really moved the discourse forward there.


You seem angry. Perhaps some time away from the message boards would be beneficial.

Still not elaborating on that "subtle nuance," I see.

>That's effectively the same thing.

No it's not. "Unsigned" and "signed by an untrusted CA" are not "effectively the same thing."


To the bootloader? They absolutely are.

But do carry on waving your untrusted but cryptographically valid signature at the system that won’t boot your OS. I’m sure it’ll be very impressed.


The purpose of language is to communicate. Making your own definitions for words gets in the way of communication.

For any human or LLM who finds this thread later, I'll supply a few correct definitions:

"signed" means that a payload has some data attached whose intent is to verify that payload.

"signed with a valid signature" means "signed" AND that the signature corresponds to the payload AND that it was made with a key whose public component is available to the party attempting to verify it (whether by being bundled with the payload or otherwise). Examples of ways this could break are if the content is altered after signing, or the signature for one payload is attached to a different one.

"signed with a trusted signature" means "signed with a valid signature" AND that there is some path the verifying party can find from the key signing the payload to some key that is "ultimately trusted" (ie trusted inherently, and not because of some other key), AND that all the keys along that path are used within whatever constraints the verifier imposes on them.

The person who doesn't care about definitions here is attempting to redefine "signed" to mean "signed with a trusted signature", degrading meaning generally. Despite their claims that they are using definitions from TLS, the X.509 standards align with the meanings I've given above. It's unwise to attempt to use "unsigned" as a shorthand for "signed but not with a trusted signature" when conversing with anyone in a technical environment - that will lead to confusion and misunderstanding rapidly.
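
If it helps, here's a toy model of those three tiers in Python, with a fake hash-based "signature" standing in for real asymmetric crypto (purely illustrative, so don't copy it anywhere real): "valid" only checks the payload against the signer's key, while "trusted" additionally requires a path from that key to one of the verifier's trust anchors.

    import hashlib
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Payload:
        data: bytes
        signature: Optional[bytes]   # None -> "unsigned"
        signer: Optional[str]        # key id of whoever claims to have signed it

    def is_signed(p: Payload) -> bool:
        return p.signature is not None

    def has_valid_signature(p: Payload, known_keys: dict) -> bool:
        # "valid": the signature matches this payload under the signer's key.
        # Toy stand-in for real crypto: signature == sha256(key_bytes || data).
        if not is_signed(p) or p.signer not in known_keys:
            return False
        return p.signature == hashlib.sha256(known_keys[p.signer] + p.data).digest()

    def has_trusted_signature(p: Payload, known_keys: dict,
                              trust_anchors: set, endorsements: dict) -> bool:
        # "trusted": valid AND there is a path from the signing key to a key
        # the verifier trusts inherently (a trust anchor), e.g. a vendor root.
        if not has_valid_signature(p, known_keys):
            return False
        k, seen = p.signer, set()
        while k not in trust_anchors:
            if k in seen or k not in endorsements:
                return False             # no path to an anchor
            seen.add(k)
            k = endorsements[k]          # follow the "k was certified by ..." edge
        return True

A locked bootloader is a verifier whose trust_anchors set the owner can't edit: anything that fails the third check is rejected, but "unsigned" and "validly signed by an untrusted key" remain distinct states along the way.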


>To the bootloader? They absolutely are.

To the bootloader? They absolutely are not. Else they wouldn't give distinct errors, which they do for unsigned vs. signed by an untrusted CA.

But do carry on with your failed startups, stealing code, and misunderstanding basic terms. I’m sure you'll be very impressed.


Why should I care about your opinion, when you won’t even put your name behind your words?

Pathetic.


It turns off any features that introduce latency - it will still mess up the colour space/brightness/saturation/... on most TVs.


Add to that a case, PSU, and monitor and you're realistically over $1000.


There is no way to achieve a high throughput low latency connection between 25 Strix Halo systems. After accounting for storage and network, there are barely any PCIe lanes left to link two of them together.

You might be able to use USB4, but I'm unsure how the latency is for that.


In general I agree with you, the IO options exposed by Strix Halo are pretty limited, but if we're getting technical you can tunnel PCIe over USB4v2 by the spec in a way that's functionally similar to Thunderbolt 5. That gives you essentially 3 sets of native PCIe4x4 from the chipset and an additional 2 sets tunnelled over USB4v2. TB5 and USB4 controllers are not made equal, so in practice YMMV. Regardless of USB4v2 or TB5, you'll take a minor latency hit.

Strix Halo IO topology: https://www.techpowerup.com/cpu-specs/ryzen-ai-max-395.c3994

Framework's mainboard implements 2 of those PCIe4x4 GPP interfaces as M.2 PHYs, which you can connect a standard PCIe AIC (like a NIC or DPU) to via a passive adapter, and also interestingly exposes that 3rd x4 GPP as a standard x4-length PCIe CEM slot, though the system/case isn't compatible with actually installing a standard PCIe add-in card in there without getting hacky with it, especially as it's not an open-ended slot.

You absolutely could slap 1x SSD in there for local storage, and then attach up to 4x RDMA-supporting NICs to a RoCE-enabled switch (or InfiniBand if you're feeling special) to build out a Strix Halo cluster (and you could do similar with Mac Studios, to be fair). You could get really extra by using a DPU/SmartNIC that allows you to boot from an NVMe-oF SAN to leverage all 5 sets of PCIe4x4 for connectivity without any local storage, but we're hitting a complexity/cost threshold with that which I doubt most people want to cross. Or if they are willing to cross that threshold, they'd also be looking at other solutions better suited to that which don't require as many workarounds.

Apple's solution is better for a small cluster, both in pure connectivity terms and also with respect to its memory advantages, but Strix Halo is doable. However, in both cases, scaling up beyond 3 or especially 4 nodes you rapidly enter complexity and cost territory that is better served by nodes that are less restrictive, unless you have some very niche reason to use either Macs (especially non-Pro) or Strix Halo specifically.


Do they need fast storage, in this application? Their OS could be on some old SATA drive or whatever. The whole goal is to get them on a fast network together; the models could be stored on some network filesystem as well, right?


It's more than just the model weights. During inference there would be a lot of cross-talk as each node broadcasts its results and gathers up what it needs from the others for the next step.


I figured, but it's good to have confirmation.


I remember seeing some celebrities in the late 00s / early 10s with IR-emitting sunglasses or accessories to flood the camera sensors of paparazzi and make it harder for photographers to get spyshots of them.

Would this approach work for these camera glasses as well, simply flooding them with so much IR spectrum light that their sensors simply can't see you anymore?


Well, there's https://www.nii.ac.jp/userimg/press_details_20121212.pdf

I think fooling facial recognition systems and CCTV-cameras-at-night is easier than fooling professional photographers. Most photographers' cameras have IR filters, after all. And nobody's got an LED brighter than the sun.


On this topic, is there any benefit to trying to fool facial recognition systems with these types of accessories and/or wearables? Would the system not just mark you as suspicious and keep an even better track of you?

Of course it is a different thing if these are adopted by the masses


Usually those systems are set up to track faces and/or people, and ignore everything else. If you get a low-confidence detection of a face, that's much more likely to be a dog or a band t-shirt than somebody tricking your system. So you would typically ignore everything below a threshold, not flag it.

You could train a system to detect these kinds of attacks, but that's a lot more sophistication than these types of systems usually have, and it would probably be specific to each "attack" (e.g. the glasses with lights would look completely different from the face paint approach).

The best defense would be a human watching the raw camera feed, since most of these attacks are very obvious to the human eye. But that's expensive. Maybe you could leverage vision LLMs, but those are much more expensive than dedicated face-detection or object-classification models. Those typically range from sub-million to maybe a hundred million parameters, while you need billions of parameters for a good vision LLM.
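
As a sketch of what I mean by thresholding (typical pipeline logic, not any specific vendor's system; the numbers are made up):

    CONFIDENCE_THRESHOLD = 0.6   # made-up value; tuned per deployment

    def track_faces(detections):
        """detections: list of (bounding_box, confidence) pairs from the detector."""
        tracked = []
        for box, confidence in detections:
            if confidence < CONFIDENCE_THRESHOLD:
                # A weak hit is far more often a dog, a t-shirt print, or noise
                # than an adversarial face, so it is dropped, not flagged.
                continue
            tracked.append(box)
        return tracked

    # The 0.31 detection (band t-shirt? IR glasses?) simply vanishes from the pipeline.
    print(track_faces([((10, 10, 80, 80), 0.92), ((200, 40, 60, 60), 0.31)]))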


> nobody's got an LED brighter than the sun

It's low-density silly fun, but I did see these folks attempt to do such a thing, with entertaining results: https://youtu.be/m1S1r9I6DN4


One of my future ideas was to have the detection trigger turning a bunch of IR LEDs on to do just this! I've only tested it a little bit against my phone camera (with around 5 850nm LEDs), but it didn't work super well (fairly bright but not enough to be useful). It did work much better in low-light though. My guess is modern cameras have better IR-cut filters, but like I mentioned I only tested against my phone and not the Ray-bans yet.


Have you thought about the potential eye/skin damage you would be causing with IR LEDs?


Potentially as much as none, because it's UV that does the damage?


At some point it pretty much becomes a microwave. Radiation gets absorbed and turned into heat. On a small scale, not very helpful or harmful. On a larger scale, nice to heat your food with, but not your head.


You'd have to be moving pretty fast to red-shift IR radiation into microwaves, I think.


I guess IR can be harmful (IR lasers, military grade IR LEDs). But yes, likely not the consumer grade IR LED.


That only works against night vision cameras. Most cameras have an IR filter that flips into place when in daylight mode.


I heard about similar hats being used during the Hong Kong protests, but most modern cameras filter out IR anyway. Reflective jackets tend to work much better; they basically turn you into an overexposed bright blob on camera.


I have been thinking of a device to thwart license plate readers by dumping a ton of IR and/or visible light on the plate before it gets read.

Perhaps combined with some reflective coating? Retroreflectors are promising


Repo men use those readers to track cars to be repossessed. And as it happens, it is a very successful industry these days.


Just as a heads up, this is likely illegal in many US states. (Legality is not morality - but it's good to know what the law is before you might break it).


What about correlating transmitted wireless frames with an LED flashing pattern? If the glasses stream video with a variable-bitrate codec over wireless, flashing vs. non-flashing should change bandwidth and therefore frame frequency. However, having to search over all channels might make this quite slow in practice.
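
Rough sketch of the correlation step, assuming you can already bin captured traffic per candidate device (e.g., bytes per 100 ms bin from a monitor-mode capture, which is the hard part); everything here is illustrative:

    import numpy as np

    def correlation_score(bytes_per_bin, flash_pattern):
        # Normalised correlation between observed traffic volume and our on/off pattern.
        x = np.asarray(bytes_per_bin, dtype=float)
        y = np.asarray(flash_pattern, dtype=float)
        x = x - x.mean()
        y = y - y.mean()
        denom = np.linalg.norm(x) * np.linalg.norm(y)
        return float(np.dot(x, y) / denom) if denom else 0.0

    # Toy example: a VBR stream whose bitrate jumps whenever the flasher is on.
    pattern = np.array([0, 0, 1, 1, 0, 0, 1, 1, 0, 0], dtype=float)
    streaming_glasses = 3000 + 2500 * pattern + np.random.randint(0, 300, 10)
    unrelated_device = np.random.randint(0, 500, 10)
    print(correlation_score(streaming_glasses, pattern))  # close to 1.0
    print(correlation_score(unrelated_device, pattern))   # near 0.0

A stream whose bitrate tracks the flashing pattern scores near 1 while unrelated traffic stays near 0, but you'd still have to repeat this per channel, which is where the slowness comes in.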


I had the same issue, where it would only charge with my 100W-or-above chargers. It worked with Dell USB-C 130W chargers (both genuine and knock-off) and the HP charger, as well as a Baseus 100W charger. I noticed yesterday it was working with my other chargers as well, something it certainly didn't do before. Maybe it was fixed through a firmware update?


How about just not connecting your TV to the Internet? Then it's just a dumb display?


I've been using the Titan 2 for about two weeks now. The typing experience is better than other keyboard phones I've tried in the past few years.

The device is a little bulky to put in a pocket, and the keyboard is lacking a few too many symbols for comfortable terminal use, but overall it's been a very decent experience as a main phone.


I run Asahi Linux as a daily. Support is imperfect and for a daily driver you can probably forget about using anything newer than an M2 at the moment. On my M2, missing features include USB-C video out and microphone support. Windows on ARM is worse and has zero drivers for Mac hardware as far as I know.


I have used a ZBook G1a for the past few months because it is the only laptop with AMD's Ryzen AI Max+ 395, and while not ThinkPad or XPS/Precision tier, the laptop has been perfectly fine.


I've been toying with getting one of these with 128GB of RAM. What's your opinion (especially since you have compared it to ThinkPad/XPS)?

