Latest Comments by Samsai
The Linux 'Desktop Entry Specification' gets a way to automatically use a discrete GPU, merged into GNOME
8 May 2020 at 6:29 am UTC Likes: 1

Quoting: adibuyono
Quoting: Samsai
Quoting: adibuyono
Quoting: Samsai
Quoting: mcphail
I package a few games. I'm not sure whether to add this to the .desktop files. Would it be seen as user-hostile if I was making this default decision for users? Most of the games I package would run satisfactorily on integrated graphics but might be better on a dedicated GPU.
There are laptop configurations where the dGPU is functionally slower than the iGPU. My laptop, for example, is like that: the iGPU and dGPU are rated at just about equal performance, but either due to thermal issues (a single, shared heatpipe) or other overhead the dGPU consistently under-performs. On the other hand, these kinds of systems are basically broken designs, so they may not be worth working around.

If you are unsure, then relying on the default behaviour, where GNOME optionally allows launching games using the dGPU, is probably fine.
Sounds like the Asus X550DP 😂
HP Notebook 13 or whatever. Got it because it had a hybrid GPU setup I could test for GOL with an APU + AMD GPU combo. At the time most laptop choices were Intel + Nvidia, and I had no interest in buying an Nvidia card.
I see... Your case is similar to my Asus X550DP. It comes with an AMD A10-5750 and Dual Graphics (HD 8650G + HD 8670M).
I have been hunting for a way to use the discrete graphics card on Linux but ended up with no easy solution. In the end, I found out that the dGPU is only very slightly better than the iGPU, but it comes at a great cost: a lot of X550DP users ended up with a dying GPU due to overheating. Luckily I moved to Linux a few months after purchasing it and found no way to activate the dGPU.
I wonder why manufacturers include a dGPU if it performs almost the same as the iGPU.
The original purpose of those setups was to use Crossfire between the APU and the dGPU to get higher performance. Obviously on Linux Crossfire was never really a thing, so it's not relevant to us. Technically Vulkan could make use of such setups but it's debatable if game devs will actually bother.
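For reference, "Vulkan could make use of such setups" just means a Vulkan game sees every GPU in the machine and can pick between them (or use several) itself. A rough sketch of what that enumeration looks like, assuming the Vulkan loader and headers are installed, with error handling mostly skipped:

    /* List every GPU a Vulkan application can see.
       Build with something like: cc list_gpus.c -lvulkan */
    #include <stdio.h>
    #include <vulkan/vulkan.h>

    int main(void) {
        VkApplicationInfo app = { .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
                                  .apiVersion = VK_API_VERSION_1_0 };
        VkInstanceCreateInfo ci = { .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
                                    .pApplicationInfo = &app };
        VkInstance instance;
        if (vkCreateInstance(&ci, NULL, &instance) != VK_SUCCESS)
            return 1;

        uint32_t count = 0;
        vkEnumeratePhysicalDevices(instance, &count, NULL);
        VkPhysicalDevice devices[16];
        if (count > 16) count = 16;
        vkEnumeratePhysicalDevices(instance, &count, devices);

        for (uint32_t i = 0; i < count; ++i) {
            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(devices[i], &props);
            /* An engine could create logical devices on both the iGPU and
               the dGPU here and split work between them, but nothing
               forces it to, which is why I doubt most devs will bother. */
            printf("GPU %u: %s (type %d)\n", i, props.deviceName,
                   (int)props.deviceType);
        }

        vkDestroyInstance(instance, NULL);
        return 0;
    }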

The Linux 'Desktop Entry Specification' gets a way to automatically use a discrete GPU, merged into GNOME
8 May 2020 at 5:54 am UTC

Quoting: adibuyono
Quoting: Samsai
Quoting: mcphail
I package a few games. I'm not sure whether to add this to the .desktop files. Would it be seen as user-hostile if I was making this default decision for users? Most of the games I package would run satisfactorily on integrated graphics but might be better on a dedicated GPU.
There are laptop configurations where the dGPU is functionally slower than the iGPU. My laptop, for example, is like that: the iGPU and dGPU are rated at just about equal performance, but either due to thermal issues (a single, shared heatpipe) or other overhead the dGPU consistently under-performs. On the other hand, these kinds of systems are basically broken designs, so they may not be worth working around.

If you are unsure, then relying on the default behaviour, where GNOME optionally allows launching games using the dGPU, is probably fine.
Sounds like the Asus X550DP 😂
HP Notebook 13 or whatever. Got it because it had a hybrid GPU setup I could test for GOL with an APU + AMD GPU combo. At the time most laptop choices were Intel + Nvidia, and I had no interest in buying an Nvidia card.

The Linux 'Desktop Entry Specification' gets a way to automatically use a discrete GPU, merged into GNOME
7 May 2020 at 4:36 am UTC Likes: 2

Quoting: mcphail
I package a few games. I'm not sure whether to add this to the .desktop files. Would it be seen as user-hostile if I was making this default decision for users? Most of the games I package would run satisfactorily on integrated graphics but might be better on a dedicated GPU.
There are laptop configurations where the dGPU is functionally slower than the iGPU. My laptop, for example, is like that: the iGPU and dGPU are rated at just about equal performance, but either due to thermal issues (a single, shared heatpipe) or other overhead the dGPU consistently under-performs. On the other hand, these kinds of systems are basically broken designs, so they may not be worth working around.

If you are unsure, then relying on the default behaviour, where GNOME optionally allows launching games using the dGPU, is probably fine.
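For anyone packaging games and wondering what this looks like in practice: the key this change is about is PrefersNonDefaultGPU, and a launcher entry using it would look roughly like this (the Exec binary is made up for the example):

    [Desktop Entry]
    Type=Application
    Name=My Game
    Exec=mygame
    Categories=Game;
    # Hint that the non-default (usually discrete) GPU is preferred.
    # On single-GPU systems the key is simply ignored.
    PrefersNonDefaultGPU=true

On a system like mine you would still rather the packager left this out and let GNOME's right-click dedicated GPU option do the job on demand.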

According to NetMarketShare during April we saw a big bump in Linux use - Ubuntu gains big
6 May 2020 at 12:46 pm UTC Likes: 4

Cue Microsoft demanding that people go back to their offices. :P

AMD announces the Ryzen 3 3100 and Ryzen 3 3300X budget processors and a new B550 chipset
23 Apr 2020 at 12:44 pm UTC Likes: 1

This thread apparently got a bit heated, so I'll share some ideas of mine. Comment sections about hardware releases need not devolve into calling others ignorant or a shill. People have different ideas about what makes a good hardware purchase and about which technologies are going to be driving performance in the future. I would suggest stating your views, and the facts and feelings you think support that stance, rather than going for ad hominems because someone's views don't line up with yours. Throwing around insults and labels only shuts down discussion and encourages responding in kind, which creates a toxic environment for all.

---

On the actual topic, I find these processors a useful choice for customers on a budget. Their core counts will probably eventually become a limitation on performance, but 4 cores / 8 threads can still get current stuff done decently well, and there is a viable upgrade path from these to higher core-count CPUs once quad-cores no longer cut it.

AMD announces the Ryzen 3 3100 and Ryzen 3 3300X budget processors and a new B550 chipset
22 Apr 2020 at 1:56 pm UTC Likes: 3

A few things.

You are so wrong it is not even funny. First of all, the 2600 uses 1 256bit AVX2 command per 2 clocks, instead of 1 clock. This is HUGE for next gen games. Guess what, AAA next gen games will require AVX2 baseline, since consoles have it. And having proper 1 command per clock support for AVX2 will make a huge difference in the AAA games to come. So buying older architectures than Zen 2 in my opinion is not worth it.

To put it into perspective, it is the difference Bulldozer architecture had with Intel in gaming. It had HALF the floating point performance, which hurt gaming a lot.
This assumes you can actually feed AVX2 workloads every clock in games. There are some game engine architecture choices that are capable of utilizing SIMD, but claiming that 1-clock versus 2-clock AVX2 throughput is going to be as dramatic as Bulldozer's shared FPUs doesn't make sense. Bulldozer did poorly in gaming because gaming is full of scalar floating-point operations, since much of the data in games is represented as floating-point values. Not to mention that non-game processes can require the FPU while the game is running, which stalls the game's floating-point operations.
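To make the "feeding" point concrete, the kind of code where 1-per-clock versus 1-per-2-clocks AVX2 throughput would actually show up is a dense fused multiply-add loop over big, contiguous arrays, roughly like this sketch (the function and build flags are just illustrative; compile with something like -O2 -mavx2 -mfma):

    #include <immintrin.h>
    #include <stddef.h>

    /* dst[i] = a[i] * k + b[i], eight floats per iteration. Whether the
       CPU can retire one of these FMAs per clock only matters if the
       caches keep handing it data at that rate. */
    void scale_add(float *dst, const float *a, const float *b,
                   float k, size_t n) {
        __m256 vk = _mm256_set1_ps(k);
        size_t i = 0;
        for (; i + 8 <= n; i += 8) {
            __m256 va = _mm256_loadu_ps(a + i);
            __m256 vb = _mm256_loadu_ps(b + i);
            _mm256_storeu_ps(dst + i, _mm256_fmadd_ps(va, vk, vb));
        }
        for (; i < n; ++i)          /* scalar tail */
            dst[i] = a[i] * k + b[i];
    }

Game frames spend most of their time in branchy, scalar logic around loops like this, which is why I don't expect the difference to be anywhere near Bulldozer-sized.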

Caches aren't really important in current video games. Obviously a few years from now this may change, but a few years from now you will probably be upgrading the 3300X anyway... What matters most is the speed/latency of those caches, for gaming purposes. And I think since both CPUs are of the same architecture they will feature similar speeds. Cache would play a huge part if it was cache for the iGPU part of an APU; other than that, the vast majority of games are more than fine with 3300X levels of cache.
I find it particularly funny that you, in consecutive comments, argued for the importance of AVX2 but then followed that up by saying cache isn't that important. Cache is what keeps those AVX2 instructions fed; otherwise you will stall until the RAM slowly drags your data set into your L3, where it still needs to traverse into L2 and L1. Presumably games don't just load a single vector workload and endlessly FMA that same vector. Not to mention the significance of caches when it comes to data-oriented design, such as ECS systems, which have been gaining popularity as of late because, as it turns out, cache misses are harmful to game performance too. In this regard bigger caches absolutely do matter: since we use multi-way caches, more cache lines mean more slices of memory can be kept in L3 without having to evict already loaded data.
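As a rough sketch of why the data-oriented crowd cares (the struct fields are made up for illustration): when you only touch one hot field, an array-of-structs layout wastes most of every cache line it pulls in, while a structure-of-arrays layout reads fully used, contiguous lines.

    #include <stddef.h>

    /* Array-of-structs: each entity is ~70 bytes, but the update only
       touches 4 of them, so most of every fetched cache line is cold. */
    struct EntityAoS { float health; float pos[3]; float vel[3]; char name[40]; };

    void tick_aos(struct EntityAoS *e, size_t n, float dt) {
        for (size_t i = 0; i < n; ++i)
            e[i].health -= dt;              /* strided, cache-unfriendly */
    }

    /* Struct-of-arrays: the hot field is contiguous, so every line the
       cache loads is fully used and the prefetcher has an easy time. */
    struct EntitiesSoA { float *health; float (*pos)[3]; float (*vel)[3]; size_t n; };

    void tick_soa(struct EntitiesSoA *e, float dt) {
        for (size_t i = 0; i < e->n; ++i)
            e->health[i] -= dt;             /* sequential, cache-friendly */
    }

More L3 doesn't change which layout is better, but it does decide how much of your working set survives between frames without being evicted.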

So, if you are going to be rude and accuse people of being wrong and shills and whatnot, I would also recommend that you at least be correct in your claims. Or consistent. Or generally constructive.

KDE's window manager KWin gets forked with 'KWinFT' to accelerate the development and better Wayland
17 Apr 2020 at 4:11 pm UTC Likes: 6

I feel like we have some preconceptions of what a fork is for, which is colouring the discussion a fair bit. We mostly see forks in the context of severe developer disagreement or actual project death. In this case I don't see an issue with a fork, since the fork isn't really competing with KWin proper; from the description it sounds like they want to code fast and break things often in order to pursue progress. It probably could have lived as a branch in the main KWin development tree, but that would either limit the ability for users to test it, or you would end up in a situation where the branch is a de facto fork, just without its own repository.

I do naturally hope that any usable improvements get picked up by KWin and the two projects merge back together when the goals of the fork have been achieved.
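For what it's worth, the practical gap between "long-lived branch" and "fork with its own repository" is mostly about where the history lives and who controls merges; something like this, with the URLs as placeholders only:

    # experimental branch inside the main tree
    git clone https://example.org/kwin.git
    git checkout -b wayland-experiments

    # separate fork that still tracks KWin proper
    git clone https://example.org/kwinft.git
    git remote add upstream https://example.org/kwin.git
    git fetch upstream        # keep pulling in fixes from upstream KWin

Either way the code can flow back; the second setup just lets them break things without asking anyone's permission first.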

GNOME launches a 'Community Engagement Challenge' with cash prizes
8 Apr 2020 at 9:04 pm UTC

Quoting: joder666
Women represent about 50% of the population but are under-represented in software development.
BS, as always. Change under-represented for NOT interested, which right now is also a lie, and we can meet in the middle.
Wait, so, you think that 50% of software developers are women? Or that 50% of GNOME community members are women? I literally don't understand your stance, but I'm going to assume it's not correct either way. No need for me to meet you in the middle.

GNOME launches a 'Community Engagement Challenge' with cash prizes
8 Apr 2020 at 1:23 pm UTC Likes: 7

Quoting: gabber
Because it even is a consideration.

Last I checked, I needed neither my genitals nor my sexuality to program. This is a politically motivated power grab and only brings division and drama, not better code. Equality starts when you do not differentiate. Now they have to ask for gender and divide the projects into those categories.

But the fact it's under secondary gives me a bit of hope this cancer will soon die off.
FOSS projects survive based on the number of motivated and talented developers available. Women represent about 50% of the population but are under-represented in software development. And you believe there is no reason to even wonder why that might be the case or attempt to leverage a new demographic in order to increase the size of the developer pool?

Also, not in a single part of that text you quoted, nor in the context of that quote, is there a demand to "ask for gender". They are showing an interest in entries that engage under-represented communities but if you read the actual text it's not even a strict requirement.

As for division and drama, I rarely hear about Outreachy participants causing drama and division. People in article comments about Outreachy and the like on the other hand...

Ubuntu 20.04 has hit Beta (as have all the extra flavours) - help make it a release to remember
4 Apr 2020 at 11:56 am UTC Likes: 4

Quoting: slaapliedje
I do find it funny that snap is geared more toward commercial software than flatpak is. Which enabled someone to upload a 2048 game with an embedded bitcoin miner.
Is there honestly that big of a difference? Basically any software source with little to no curation will eventually get garbage on it. This has happened on the AUR as well, and I bet someone could sneak a Bitcoin miner onto Flathub too.

To me Flatpak and Snap don't seem that different. They both aim to solve more or less the same problems and provide more or less the same mechanisms for solving them; to me they both seem annoying to use, and I'm not fully convinced they actually solve a real problem.
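To illustrate the "more or less the same mechanisms" point from the user side, installing and running the same sort of app looks nearly identical on both (the package names here are only examples):

    # Flatpak, with Flathub set up as a remote
    flatpak install flathub org.gnome.Calculator
    flatpak run org.gnome.Calculator

    # Snap
    snap install gnome-calculator
    snap run gnome-calculator

Bundled runtimes, sandboxing and a central store either way; the differences are mostly in who runs the store and how the sandbox is implemented.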