Title: Why most people are approaching the xz-attack wrong.
LoudTechie 17 hours ago
The xz attack was a high-profile supply chain attack that affected many systems, Linux-based ones in particular.

Part of the attack involved replacing the maintainer of the xz utility through a coordinated harassment campaign and a trust-building attack.
This was done quite expertly, and the cybersecurity community has mostly pinned the blame for this nearly successful attack on the lack of funding for critical open source projects.

In my eyes this is wrong on many levels:
a. It completely ignores all the other technical and policy failures that nearly made this possible.
b. It focuses on the problem that is hardest for technical people to fix (socio-economic problems are for management and HR, not engineers).
c. It proposes a solution that can't be implemented by the proposer.

I argue 17 different vulnerabilities were exploited (the last being the funding problem everyone focuses on), many of which can easily be solved on a technical level.

1. Maintainers have never been trained or enabled to spot malicious behaviour or code, or even trained to write secure code themselves. The best they can get is job experience with red teamers who shoot holes in their software.
Writing secure code isn't taught in schools, there are few tools to enable it, and there are no courses available online to learn it. One can easily learn how to crack code, but that isn't the same.
2. Machine-generated data is accepted. This puts unfair pressure on maintainers, who have to review the output of machines instead of the work of a fellow human. All machine-generated inputs should be replaced with the scripts that generate them.
3. Github has insufficient moderation to stop targeted harassment campaigns.
4. Nobody notes major maintainer changes as suspicious.
5. Test code can modify output binaries.
6. Nobody actually hash-checks reproducible builds.
7. Regression tests are fully maintained by the same person who introduces the regressions. For maintainability this doesn't matter; for security it does.
8. Git information isn't checked by anybody for accuracy (checking timestamps could simply be done when receiving pushes).
9. The control questions at key-signing parties focus squarely on who you are, not on what you claim to have done.
10. Code isn't written maintainably enough; if it were, backdoors could easily be patched out.
11. We accept hardcoded credentials in binaries.
12. SystemD does too many things as one program, to the level that corrupting it lets one transcend the principle of least privilege: since it does everything, it also needs every privilege.
13. Dynamic libraries can take over the calling library.
14. OSI layer 7 encryption can only be checked by the app itself. With presentation-layer encryption (OSI 6), security tools could inspect your communications.
15. Non-installer programs can change executables without external permission. Changing executables should be the exclusive right of the user, the installer and the package manager.
16. Trusted remote controllers are only known and checked by the remote-control program itself, not by anybody else. This is risky enough that there should be a second party checking it.
17. As others have noted, open source projects have insufficient tools, manpower and funds to retain talent and/or run their own HR department.
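The parenthetical in point 8, checking timestamps when receiving pushes, can be sketched as a small plausibility check. This is purely illustrative: the function name and the hook shape are made up, not any real git server API.

```python
import time

def timestamp_suspicious(commit_ts, parent_ts, now=None, skew=300):
    """Flag a commit whose author timestamp looks implausible.

    commit_ts / parent_ts are Unix timestamps; `skew` allows a few
    minutes of clock drift. Hypothetical pre-receive-hook logic.
    """
    now = time.time() if now is None else now
    if commit_ts > now + skew:
        return True   # claims to come from the future
    if parent_ts is not None and commit_ts < parent_ts - skew:
        return True   # claims to predate its own parent commit
    return False
```

A hosting provider could run something like this server-side on every push, which is cheap compared to reviewing the content itself.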

Many of these issues can be solved with better tooling, or at least without needing to beg for money and navigate KYC laws in a community that values anonymity and transparency.

1. Offer courses in writing code using security-by-design principles, or simply teach developers to use existing security interfaces like pledge and Landlock.
2. Tools to automatically generate such scripts, and tools to detect and reject machine-generated input, should be made. This could also be taken up in official guidance on security matters from the OpenSSF, NIST or other influential organisations.
3. Duplicate issues should be forbidden and replaced by likes. Also, Microsoft should start moderating GitHub like the social medium it is. The good news here is that one can be pretty strict when moderating a forum as clearly themed as a GitHub issue page.
4. Git fetch should warn about maintainer changes.
5. Change make to apply privilege-minimisation principles to the code base.
6. Distro maintainers should actually compare the hash of their self-compiled binary against the provided one. Automating this would make it more realistic that people do it.
7. Make a tool that automatically generates regression tests based on older versions of the program.
8. Real-time git recipients like maintainers and hosting providers should at least check commit times against the here and now.
9. Make a tool that generates questions based on your git history, like: "Why did you use popen in commit #12?"
10. More maintainability tooling and checks.
11. Distros and/or virus scanners should detect and block code containing key-sized, high-entropy blocks of data.
12. Split SystemD into several separate sandboxes (the SystemD team seems to have actually realised the first part of this).
13. Patch the library loader.
14. Make an OSI 6 encrypted connection interface in the OS and encourage its usage for high risk applications.
15. Simply don't give programs permission to write to any file for which anyone except root has execute permission.
16. Have a central "remote control trusted keys" database and request control through an API that checks against these keys (this API should be simple: just run a binary you provide with root permissions once the authentication succeeds).
17. Moneyyy, or... independent press. The problem here is that one person was so unaccountable that when they disappeared and somebody else took their place, nobody even noticed. Let a bunch of people scour the internet for major but non-critical security events, like critical maintainer changes, large code-base changes and unreadable parts of code bases, and document them.
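Solution 11, flagging key-sized high-entropy blocks, can be sketched with a plain Shannon-entropy scan. A minimal sketch; window size and threshold are made-up tuning knobs, and a real scanner would also have to handle compressed data, which is legitimately high-entropy.

```python
import math

def shannon_entropy(window: bytes) -> float:
    """Bits per byte of a byte window (0.0 .. 8.0)."""
    counts = {}
    for b in window:
        counts[b] = counts.get(b, 0) + 1
    n = len(window)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def find_high_entropy_blocks(data: bytes, window=256, threshold=7.0):
    """Offsets of windows whose entropy exceeds the threshold.

    Note the threshold must suit the window size: a window of N bytes
    can reach at most min(8, log2(N)) bits per byte.
    """
    return [i for i in range(0, len(data) - window + 1, window)
            if shannon_entropy(data[i:i + window]) > threshold]
```

Embedded keys and encrypted payloads score near 8 bits per byte, while ordinary code and text sit far lower, which is what makes this a plausible distro-side heuristic.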

If you made it this far: thank you for listening to my rant.
If you have your own insights, feel free to share them.

Last edited by LoudTechie on 30 Apr 2026 at 11:40 pm UTC
tuubi 8 hours ago
I'm way too tired and preoccupied to put serious thought into this now and I'm not going to get into a debate, but just a few nitpicks after a quick skim:

Quoting: LoudTechieWriting secure code isn't thought in schools, there're few tools to enable it and there're no courses available online to learn it.
That's not true. There are plenty of educational institutions that offer courses and curricula in secure development, and tons of resources online.

Quoting: LoudTechieNobody actually hash checks reproducable builds.
Package managers like APT check hashes and package/repo signatures, and will stop the install in case of a mismatch unless you explicitly override it. Even source-based distros like Gentoo and Arch do integrity checks and will fail to build or install in case of "corrupt" repositories or packages. Of course, all of this is based on several levels and tiers of trust, which led to the whole XZ fiasco.

But maybe I missed your point here.

Quoting: LoudTechiegit information isn't checked by anybody on accuracy(checking timestamps can simply be done when receiving pushes)
I'm not sure this is what you're talking about, but the way git works, you can't simply modify a repo or even a single commit without changing the hashes and causing a conflict. Git itself is built for integrity. No need to check individual timestamps when they're included in the hashes.
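To illustrate the integrity claim: a git object id is a hash over a typed header plus the content, and commit objects embed their author/committer timestamps in that content, so touching a timestamp changes the id. A minimal sketch (the empty-blob id below is git's well-known constant; the commit payloads are simplified stand-ins):

```python
import hashlib

def git_object_id(obj_type: str, content: bytes) -> str:
    """Compute a git object id: SHA-1 over "<type> <size>\\0" + content."""
    header = f"{obj_type} {len(content)}\0".encode()
    return hashlib.sha1(header + content).hexdigest()

# git's famous empty-blob id:
assert git_object_id("blob", b"") == "e69de29bb2d1d6434b8b29ae775ad8c2e48c5391"

# Two commit payloads differing only in the timestamp hash differently:
a = git_object_id("commit", b"tree T\nauthor A 1714000000 +0000\n")
b = git_object_id("commit", b"tree T\nauthor A 1714000001 +0000\n")
assert a != b
```

So any timestamp tampering after the fact ripples through every descendant commit id, which is the integrity property being described.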

Quoting: LoudTechieSystemD does too many things as one program to the level that corrupting it allows one to transcend the principle of least privilege, since it does everything and thus also needs ever privilege.
This isn't true either. SystemD is a very modular collection of tools and services you can install, enable and manage granularly. What distros actually implement is another matter, but what they do is no less secure than what they replaced.

Quoting: LoudTechienon-installer programs can change executables without external permission. Changing executables should be the exclusive right of the user, installer and package manager.
Most of the executables on typical Linux systems are only modifiable by root, and hopefully most of your processes do not need or get root privileges.

---

In the end, these security issues fundamentally boil down to matters of trust and management. There are no technical solutions to the root causes, and there's no way to force specific tools or best practices on the entirety of the open source community. That's just the way it is, and forever will be.

We can of course build better tools and advocate their use, and public and private organisations dependent on open source should heavily invest in the security of these projects.

Just as an aside, I'm a developer who in recent years drifted into infosec management, and I've seen first hand how organisations struggle with all this due to short-term business targets overriding any security concerns. Gotta fill those quarterly reports with arrows pointing up, no matter the cost in the long run. It's all a game of numbers to the investors after all. The sad thing is, this style of management has crept into the public space as well.

Thank dog for recent regulation focused on cybersecurity and supply chain management, such as NIS2 in the EU. Infosec is expensive, and thus most corporations need to be forced to do it on pain of heavy fines. They'd often rather accept or share the risks, i.e. pay for insurance, than invest in proper controls.

If you doubt the power of regulations, see how the GDPR has affected the privacy landscape in the EU for reference. I wish there was a way to enforce this stuff globally.
tmtvl 4 hours ago
Quoting: tuubiEven source-based distros like (...) Arch
Oi. It's a tiny nitpick, but Arch isn't source-based. Aside from Gentoo there's Exherbo and the Source Mage family (Lunar, for example). Crux, which inspired Arch, is also source-based; but Arch very specifically isn't (even though the AUR exists).
tuubi 3 hours ago
Quoting: tmtvl
Quoting: tuubiEven source-based distros like (...) Arch
Oi. It's a tiny nitpick, but Arch isn't source-based. Aside from Gentoo there's Exherbo and the Source Mage family (Lunar, for example). Crux, which inspired Arch, is also source-based; but Arch very specifically isn't (even though the AUR exists).
Heh, true. As I said, way too tired and preoccupied. 😆

I guess my brain somehow skipped mid-sentence from source-based to rolling release distros or something...

Granted, my own experience with (and interest in) Arch is fairly minimal, but I did run Gentoo with fluxbox as my daily driver for a couple of years back when it was shiny and cool in the early 2000s. I think it took me three days to bootstrap, configure, build and install it all from stage 1 on the old Athlon Thunderbird, with the installation guide printed on A4 sheets because I didn't own a second Internet-capable device. And I feel like I spent 90% of my time just fiddling with the system and building packages. But at least I learned a lot about Linux.

And now I'm rambling. Happy May Day / Labour Day or whatever. 🐧
LoudTechie 2 hours ago
Quoting: tuubi (full post quoted above, snipped)
You seem to know what you're talking about, so I would love to continue the conversation about this.
To respect your wish to avoid a debate, I'll provide an index of my generic reactions, so you can cherry-pick the parts you consider worth reacting to.

1. I've never found any, please share. I neeeed it.🫨
2. Reason why I thought this.
3. You seem to have misunderstood me.
4. You're right.
________________
1. My point, briefly explained.
2. Disagreement on the function of security development.
3. Yeah, few can force best practices. One can enable them and we failed there.
4. Yeah, I know.

Your first two points appear to come from an information asymmetry between us. Hopefully it's me, but at least one of us is simply wrong about the facts.

1. On your first point:
Really, do you have some examples?
I've built most of my academic career on the fact that I can't find them. I would love to be proven wrong.
Cybersecurity courses are always about blue and red teaming, never yellow teaming, and software development courses only teach correct and maintainable development, which is great for functional code but does little for security (just because something does everything you want doesn't mean it doesn't also do a lot of things you don't want).

2. On your second point:
Then how did this attack ever get anywhere? Comparing the hash of the provided binary against a self-compiled one should have immediately shown a mismatch and set off alarms.
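The comparison being argued for here is mechanically trivial, which is part of the point. A minimal sketch (function names are made up; it also assumes the build really is reproducible, i.e. same toolchain, flags and SOURCE_DATE_EPOCH, otherwise hashes differ for benign reasons):

```python
import hashlib

def sha256_file(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def builds_match(provided: str, rebuilt: str) -> bool:
    """True iff the upstream binary and the self-compiled one are identical."""
    return sha256_file(provided) == sha256_file(rebuilt)
```

Wiring something like this into a distro's packaging CI is the "automating this" part of solution 6.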

3. Your third point seems to stem from a misunderstanding of mine.
My point is that because SystemD does so many different things, it must at least have the permissions for all the things it does.
That means sandboxing based on function (a.k.a. the principle of least privilege) becomes infeasible.
Yes, you can turn off SystemD functions, but for every function you turn off you lose functionality.
If it were several separate libraries/sandboxes, you could give each only part of SystemD's permissions without having to sacrifice functionality.

4. Your fourth point is completely right.
I saw that the payload contained a self-update mechanism and concluded this.
I was wrong.
What went wrong here is simply that a single network-connected program can represent all admin decisions on its own.
This is the default assumption of a monolithic kernel, so not realistically solvable on Linux.

_________________________________________________________________________________________________________________________________

1. I think you come pretty close to understanding my point.
My point is that although not every issue exposed by the xz attack is solvable with technology, much of it is.
I also argue that the cobbler should stick to his last: technical people should solve the technical aspects and leave the socio-economic aspects to managers, politicians and social scientists.

2. On the root-causes thing:
All security issues boil down to trust and management. An improper deserialisation bug places too much trust in unverified data, an SQL injection attack fails to keep untrusted user data out of higher-trust operations (queries), and so on.
We use security technology to manage and enforce our existing trust.
Technical security vulnerabilities are mistakes in using it to accurately reflect our trust in technology.
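The SQL-injection framing above maps directly onto parameterised queries, which keep untrusted input as data and out of the query text. A minimal sketch using Python's sqlite3 (the table and its contents are made up):

```python
import sqlite3

# In-memory toy database with one "secret" to protect.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'hunter2')")

def lookup(name: str):
    # The `?` placeholder makes the driver treat `name` purely as data,
    # never as SQL, which is exactly "managing trust" in code.
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)
    ).fetchall()

assert lookup("alice") == [("hunter2",)]
assert lookup("' OR '1'='1") == []   # classic injection payload stays inert
```

The naive alternative, formatting the name into the query string, would let that payload rewrite the query and dump every row.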

3. Also, XZ and at least some of its victims were using the best and latest security tools, so we can at least enable those to assure their own security. XZ provides reproducible builds, signs its code, accepts only web-of-trust-verified maintainers and uses cloud-based forge hosting. Between them, the victims run most if not all virus scanners, and the latest version of every piece of software involved was in use somewhere. All the best practices.
Not all of them do so individually, but the point is that if any of those measures, or any combination of them, had been sufficient, the problem would have been spotted much earlier.
We can't force everybody to follow best practices, but we can enable those who do to keep the others and themselves safe.

4. On your aside: yeah, I know. The growing regulatory pressure is the only reason my study and career choice (yellow teaming) makes any financial sense.

Last edited by LoudTechie on 1 May 2026 at 3:16 pm UTC