Latest Comments by DamonLinuxPL
Linux GPU Configuration Tool 'LACT' adds NVIDIA support
16 Nov 2024 at 11:18 pm UTC Likes: 6
Quoting: nnohonsjnhtsylay: Sadly its written in rust so it takes forever to compile on my computer, even with my cpu with 24 threads

This is indeed a big problem. As a developer on one of the Linux distributions, I would like to highlight a few problems related to Rust. Of course, this does not mean that Rust is bad; I myself think it may not be great, but it is definitely good, although it has some big problems.
One of them is that Rust cannot be compiled without bootstrapping; to put it simply, to compile Rust you need a previously compiled Rust. So how do you compile Rust in the first place? You can't, which means you have to use the precompiled Rust binaries from the developers and take their word that there is nothing dangerous in them. Among other reasons, this is why several distributions refused to include Rust in their repositories early on. The Rust developers know about this and have supposedly announced work on solving it, but in practice nobody is in a hurry, at least until someone injects something into those binaries and it plays out like the backdoored xz tarballs...
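For context, this is roughly what building the compiler itself looks like (a sketch; x.py is rustc's build driver, though the exact flags vary between releases):

    # rustc's build system first downloads a pinned, precompiled
    # "stage0" compiler from the project's servers, then uses it to
    # build stage1, which in turn builds the final stage2 compiler:
    ./x.py build --stage 2

(There is also the third-party mrustc project, which can bootstrap older rustc versions from C++, but as far as I know distributions rarely use it.)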
Another, more serious issue is how libraries are distributed. Normally, when we build an application, we need a few development packages, e.g. libdrm-devel, and those libraries are either already in the system or we add them to the distribution repository. With Rust the problem is more complex. If we want to compile a Rust package, e.g. LACT, we run cargo build inside the source directory and it does compile quickly... but that is cheating, not a real distribution build, because cargo downloads the required crates on the fly during compilation. No distribution that takes security seriously, such as Debian, Ubuntu, Fedora, OpenMandriva or openSUSE, can allow network access during a build; it is strictly forbidden. Why? Because that way someone could inject potentially dangerous code, or over time simply replace a library at its source. It also protects against losing sources when, for example, a program or library becomes unavailable because the site hosting it disappears from the network.
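In other words, a distribution build has to be the offline variant, which only works if every crate is already present locally. A sketch of what packagers run inside the isolated build root:

    # --offline forbids network access, --locked forbids cargo from
    # rewriting the Cargo.lock shipped with the sources:
    cargo build --release --offline --locked
    # without pre-fetched crates this fails as soon as cargo tries
    # to resolve the first missing dependency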
So what options do we have? Two: do it properly, or vendor it.
Do it properly: that is, open the archive and check what dependencies a given Rust package needs. And there are a lot of them here; LACT or Fractal, for example, require at least 100 dependencies! A lot, right? Considering that the CoreCtrl alternative needs only a few... OK, let's start: we'll add 10 packages a day to the repository and we'll finish by the end of the month ;) We try the first crate, let's say it's called rust-libfoo; we add it, and when we try to build offline it turns out it needs 3-5 other Rust libraries... So we take the first of those, and it also has its own dependencies, which we also have to add... You can see how 100 packages turns into not 300 but 500 or even more. Even distributions paid for or sponsored by large companies can't handle it: not Canonical with Ubuntu, not Red Hat with RHEL and Fedora, not SUSE with openSUSE, not even the large Debian community... never mind the small distributions.

But let's assume some Mr. Mark at Canonical decides to do it right and tells 10-20 paid developers to sit down and get Rust into the repository properly: build a mechanism to automate it, use AI, whatever you want, just get the Rust crates cleanly into the repo. Say some time passes and they've added all the dependencies of one application. Cool? Now give them a second application, e.g. Fractal or Shortwave. They start adding it and hit the same rust-libfoo dependency, but in a different version, because the first application required libfoo 1.1.0 and the second already wants 1.2.0... Another problem. So what, we add both and let them duplicate? There will be hundreds of these duplicates in the repo and they will create problems... Even Python, hated by some, has this solved nicely: application A requires python-libfoo in a version greater than or equal to 1.0.0, and sometimes, as the API changes, also less than 2.0.0. But Rust had to mess it up as usual: the application's lockfile demands exactly 1.1.0, nothing more and nothing less. MADNESS! Nobody can handle this with 500 packages at the moment...
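To make the version problem concrete (hypothetical package names; a sketch of what the packager sees):

    # a Python/distro-style requirement accepts a whole range:
    #   Requires: python-libfoo >= 1.0.0, python-libfoo < 2.0.0
    # while a Rust application ships a Cargo.lock pinning one exact,
    # checksummed version:
    grep -A 3 'name = "libfoo"' Cargo.lock
    #   name = "libfoo"
    #   version = "1.1.0"
    #   source = "registry+https://github.com/rust-lang/crates.io-index"
    #   checksum = "..."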
OK, so we abandon the idea of doing it right and just want to make it work at all. So we take all these Rust crates and vendor them: open the archive, run 'cargo vendor' on it, and after 10 minutes of downloading we pack the 5 GB of fetched data into an archive and push it to our server... Now cargo is told to build offline, using the vendored dependencies. The build begins, and even on a Threadripper it takes forever.
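Roughly, the vendoring dance looks like this (a sketch; the archive name is made up):

    # download every dependency into ./vendor; cargo prints the
    # source-replacement config to stdout:
    cargo vendor vendor > vendor.toml
    mkdir -p .cargo
    cat vendor.toml >> .cargo/config.toml
    # ship the whole thing next to the source package:
    tar cJf lact-vendor.tar.xz vendor
    # later, inside the offline build root:
    cargo build --release --offline --locked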
What's wrong with this method, apart from the size and the compile times? Well, if a vulnerability is found in the rust-libfoo library, we have to manually fix every package that contains libfoo... In the normal world, when a vulnerability is found in, say, libpng, it is enough to update the single libpng package and every application that uses it is fixed! Simple, right? So let's start: we download the fixed rust-libfoo 1.2.1 and try to re-vendor the Rust packages. The problem is that each subsequent vendor run still downloads the old version (the one with the dangerous code), hmm. We have to write a patch and force it to require 1.2.1. OK, the vendoring succeeds and we start compiling. The build errors out, we look for a solution, we waste time (while the vulnerable code is still sitting in the repository), and finally we find it: it was a protection against patching... What id*ot could have come up with that? So we apply a patch to Rust (the main package) to remove the patching protection and rebuild it first, which takes a few hours, and only then do we go back to our package with libfoo. And we have to repeat this re-vendoring for every package containing the vulnerability. It's dangerous, and it basically repeats what Microsoft did on Windows; it was to avoid exactly this that we created system repositories, apt with .deb, urpmi and zypper and dnf with .rpm, and all of that just to come back after 30 years to what Microsoft got wrong ;) Madness.
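For the record, the 'protection against patching' is cargo verifying every vendored file against recorded checksums, so editing vendored sources directly fails. The cleaner route, as far as I know, is to bump the pin in Cargo.lock and re-vendor (a sketch, reusing the made-up crate name from above):

    # update the lockfile to the exact fixed release:
    cargo update --package libfoo --precise 1.2.1
    # re-vendor so the fixed sources are actually fetched:
    cargo vendor vendor

But that still means re-vendoring and rebuilding every application that bundles the crate, which is exactly the problem.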
And this is just the beginning of the problems with Rust in a distribution. The longer we deal with it, the more problems we see, and we are not alone; we talk to the creators of other distributions and they face the same issues. But the biggest problem is convincing the community that Rust may be good from the perspective of an application developer, but not necessarily from the point of view of a distribution. The problem is that if anyone speaks up about this, a group of trolls appears, people who have never had anything to do with it, but the fact that someone said Rust is great is enough for them to start hating and discrediting anyone who tries to draw attention to its flaws. Nobody is saying 'let's not use Rust because it has problems'; we are saying 'hey Rust developers, it has issues, please fix them before we adopt it'. But as you saw, someone argued against Rust in the Linux kernel and the trolls practically sent a SWAT team after him (remember that drama from the last few months?). The problem is that Rust is already everywhere: Firefox, Chromium, Mesa's OpenCL implementation (Rusticl), the Rust GPU drivers on Arm (Apple M1, now Panfrost), the open source NVIDIA Vulkan driver, and even the kernel itself... This causes problems for us, and they keep getting bigger. For example, when Mesa released the NVIDIA Vulkan driver, everyone wondered how to handle the Rust in it. Look at Debian: they already shipped the new Mesa, so they must have found a way, great job guys ;) I checked, and they simply disabled Vulkan for NVIDIA because they couldn't handle it in time...
PS. Sorry for the long post (and all the mistakes in it), but this is the most ignored issue and the most important one for distributions.
NVIDIA switching to open kernel modules by default in future driver update for Turing+
11 May 2024 at 8:12 pm UTC Likes: 8
Quoting: finaldest: Correct me if I am wrong but does this mean that Nvidia will officially be moving to open drivers for all future cards? If so what about HDMI 2.1 support? Because AMD recently confirmed that the HDMI Forum will NOT licence out 2.1 support via the AMD open driver. I don't want to spend out for a monitor when my £2.5k QN95A TV supports VRR 120Hz 4K along with my £1k 7900 XTX. I am just curious because I cannot use FreeSync/VRR due to AMD not supporting HDMI 2.1 via the open driver. I may just switch back to Nvidia with the launch of the 5000 series of cards if this is the case and sell my 7900 XTX.

No one knows... but it seems like you are confusing the open kernel module with an open source driver. What NVIDIA is doing now is only opening the kernel module; the whole userspace driver (OpenGL, Vulkan, etc.) is still closed source and proprietary.
Half-Life 25th Anniversary Update brings Half-Life: Uplink, Steam Deck support
18 Nov 2023 at 4:34 pm UTC Likes: 4
Also, it looks like this version of Half-Life is a fully native Linux build, so no longer ToGL.
KeeperFX open-source remake and expansion of Dungeon Keeper 1.0 out now
14 Nov 2023 at 1:58 pm UTC Likes: 3
Well, it should work natively on Linux, or at least be in an almost-working state.
There are a few related pull requests, like this one: https://github.com/dkfans/keeperfx/pull/2373
But all of that requires compiling manually.
Ubuntu 23.10 download got pulled down due to a malicious translation
13 Oct 2023 at 2:18 pm UTC Likes: 6
Quoting: kroitus: Git history and discussion there shows, that those malicious translations were added by a russian, who work(ed) at Canonical office in San Francisco. He's been fired, and may face deportation to motherland.

Please, let's not draw such far-reaching conclusions; this is not the place for politics.
The guy you're talking about does actually work for Canonical, but if you check the git history, he deleted the entire Ukrainian translation while waiting for it to be corrected. He is not responsible for the anti-Semitic and offensive content; he only removed it.
The person who actually did this has already deleted his GitHub account, but traces remain. He had long been a community translator of various Linux projects into Ukrainian, among others for System76. His nationality is unknown.
Valve dropped Counter-Strike 2 support on macOS and older hardware
11 Oct 2023 at 12:22 pm UTC Likes: 1
Well, they also made CS2 unplayable for many Linux players; the game just crashes at launch without any clear indication why.
It was similar with Dota 2 after the migration to SDL3. In Dota, most people were affected by the "Wayland bug", but even after that was fixed, some people still can't play at all because the game crashes or dies with an illegal-instruction error at launch.
Driver updates for AMD RADV to give nice boost for Linux and Steam Deck
13 Apr 2023 at 1:33 pm UTC Likes: 1
Worth adding: to use GPL you need a good CPU. If someone tries it on an old or weak CPU, the gaming experience can get worse. The important thing is that you can easily turn it off. Just a reminder, in case anyone else experiences this.
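For anyone who wants to try it per-game: at this point GPL is still opt-in in RADV and, as far as I know, is toggled with a perftest flag, so turning it off just means not setting the flag. In Steam launch options:

    # enable the graphics pipeline library for this game only:
    RADV_PERFTEST=gpl %command%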
Action-RPG in Early Access 'Last Epoch' adds in online multiplayer
14 Mar 2023 at 9:14 pm UTC
Anyone know how to fix getting 1-5 FPS with the Vulkan API? OpenGL shows normal FPS but causes graphics issues (invisible characters), while forcing Vulkan fixes those but gives 1-5 FPS.
RX 580 8GB, latest Mesa 23.0.0
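For anyone wondering how renderers are forced here: Last Epoch is a Unity game, so assuming the standard Unity switches apply, the renderer can be picked via Steam launch options:

    # force Vulkan:
    %command% -force-vulkan
    # or force OpenGL core:
    %command% -force-glcore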
Action-RPG in Early Access 'Last Epoch' adds in online multiplayer
13 Mar 2023 at 3:13 pm UTC
The invisible character bug is not a new thing; it has been around for about a year or so.
The Vulkan API didn't work at all, or maybe for some it worked before the multiplayer patch, but at 1-3 FPS, so a slideshow rather than a game. Now, after the multiplayer patch, the Vulkan API loads up to the main menu (still feeling like 1-3 FPS) and then the whole game crashes.
Only the OpenGL renderer works, but... with many major issues.
Dota 2 removes OpenGL support, new hero Muerta now live, big update due in April
10 Mar 2023 at 5:41 pm UTC
Quoting: whizse: Apparently the game defaults to different graphics levels depending on the renderer. The default for Vulkan is higher which might explain why some are having performance problems.

No, every time I did performance tests, past and present, I manually set the same graphics settings for both OpenGL and Vulkan.