Latest Comments by F.Ultra
Valve reveals Steam Deck OLED for November 16th
15 Nov 2023 at 12:08 pm UTC
Quoting: slaapliedje
The new one that I still need to play through? Yes! Pretty sure that was the demo name too, though it may have been the group that made it... my go-to demo is usually State of the Art by Spaceballs... That's not an AGA one though; I'm pretty sure the one I saw with the blue was AGA though. I'll poke around and see if I can find it again.

Hehe, State of the Art brings back memories :). All of us suddenly started to code interference rings when that one came out.
Quoting: Shmerl
Here is also an interesting read which mentions WRGB and RWBG subpixel layouts (never heard of them before):

AFAIK GTK4 does subpixel-something, it just does it in grayscale and not in RGBA. In any case, IMHO fonts look perfect in GTK4 apps under Wayland. Google returns lots of posts from people complaining about fuzzy fonts in GTK4 though, so I'm not sure what is happening: whether they simply have some old/bad config lying around, or I'm just lucky, or what it is.
https://tftcentral.co.uk/reviews/lg-27gr95qe-oled [External Link]
I wonder if FontConfig even supports that.
UPDATE: I can't find anything here: https://gitlab.freedesktop.org/fontconfig/fontconfig/-/issues [External Link]
Given such subpixel layouts aren't even supported, benefits of OLED screens become pretty moot.
UPDATE 2:
Found something related:
* https://gitlab.freedesktop.org/fontconfig/fontconfig/-/issues/328 [External Link]
* https://bugs.kde.org/show_bug.cgi?id=472340 [External Link]
UPDATE 3:
What a rabbit hole:
https://gitlab.freedesktop.org/freetype/freetype/-/issues/1182 [External Link]
> In fact, GTK4 does not support any subpixel geometry, which is upsetting to some people.
https://gitlab.gnome.org/GNOME/gtk/-/issues/3787 [External Link]
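For reference, the subpixel order used for font rendering is set through fontconfig's `rgba` property. A minimal user config (assuming the standard `~/.config/fontconfig/fonts.conf` location) looks like this; note that the allowed constants (`rgb`, `bgr`, `vrgb`, `vbgr`, `none`) have no entry for WRGB-style OLED layouts, which is exactly the gap the issues above describe:

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <!-- Force grayscale antialiasing: the safe fallback on panels whose
       subpixel layout fontconfig cannot describe (WOLED, QD-OLED, ...) -->
  <match target="pattern">
    <edit name="rgba" mode="assign"><const>none</const></edit>
  </match>
</fontconfig>
```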
Valve reveals Steam Deck OLED for November 16th
14 Nov 2023 at 4:12 pm UTC Likes: 1
Quoting: slaapliedje
Isn't that the name of the Great Giana Sisters game?

Quoting: F.Ultra
Ha, you would ask me that... Twisted Dreams? I was clicking on some random ones, as I was having some issues with a hard lock when I would try and exit the WHDLoad.

Quoting: slaapliedje
Yes, the graphics (just as it is today ofc) were created with the display used at the time, so scanlines and other imperfections were used to enhance the image where the GPU of the time couldn't provide the color or resolution needed/wanted. Btw, which demo was it on the A4000? I would like to see the blue rectangle to try and make out what it was.

Quoting: F.Ultra
Interestingly, I currently have my A4000 connected to an LCD monitor (via a ZZ9000, which has an HDMI output, but a pass-through for native resolutions), plus a Commodore 1084 monitor (CRT). Watching a demo, I could see a square blue area around the main part of the demo running on the LCD screen, whereas on the CRT it was very dark and you couldn't see it, making it look much better.

Quoting: elmapul
The thing is that those old games were created with the notion that the display was fuzzy, not sharp and detailed as displays are now, and an OLED is just as sharp and detailed as any LCD. What OLED brings to the table is CRT-like (and in some cases, like my monitor, better) handling of black and increased color and brightness capabilities.

Quoting: slaapliedje
speaking of it, do you (or anyone) know if old games work fine on OLED? i know they look like crap on CRT, but oled works differently so it might look less crappy? i wonder if it's harder to make shaders/filters to simulate a CRT on an OLED screen than on an LCD one.

Quoting: tuubi
CRTs didn't either, except when you'd try to do foolish things like interlace. Well, or if you were someone not in the 60Hz locations... While there are benefits of PAL, a higher refresh rate is not one of them, and there is definitely flicker for most people at 50Hz vs 60Hz.

Quoting: slaapliedje
It's kind of amusing to me that CRTs started off at 50/60Hz, then higher-end monitors started getting really high refresh rates (like the one I have that'll do 1600x1200 at 85Hz). Then when we started with LCDs, we were back to having crappy refresh rates, with the added disadvantage of any non-native resolution looking like trash... Many years later, they're finally getting better.

You're forgetting or ignoring the fact that we mostly wanted higher refresh rates for CRTs to reduce the eye-destroying flicker, not to make games run smoother or whatever. Whereas an LCD doesn't really have a flicker problem, even with the old fluorescent backlights.
There are definitely benefits and disadvantages to each tech. Older stuff, though, was designed for a CRT, so on occasion it can look like utter trash on a flat screen. Especially when you're looking at 8-16-bit stuff.
Also, one has to remember that back when we played those 8-bit and 16-bit games, a 14" monitor was the default, and the viewing distance was the same as it is with our modern 45" monitors, so the size difference alone shows imperfections that were not detectable back then.
That said, I find C64 games using VICE looking quite good actually both on my OLED and on my old LCD.
A lot of the old pixel art and such just looks better with scanlines, which is why most emulators try their damnedest to recreate such things with shaders, etc. Ha, in a lot of ways, the computations to do just the shaders require more power than the original platforms had...
For the record, my Atari Jaguar does actually look quite amazing on my 77" OLED through an OSSC...
GNOME gets €1M funding from the Sovereign Tech Fund
13 Nov 2023 at 5:35 pm UTC Likes: 1
Quoting: Eike
Full agreement on all parts, I was just commenting on why the devs are not running their legs off in order to implement VRR. And funding like this is obviously aimed at improving the desktop experience and not the gaming experience (too many people are still not seeing gaming as something that is important).

Quoting: Eike
Flickering is not the opposite of VRR, it's screen tearing

Yes, I was using the wrong term here.
Quoting: Eike
and honestly since I moved away from a 60Hz screen and my old GPU I cannot notice screen tearing (the fps is far too high for that).

That's great for you, but many people have low fps, and if not currently, then maybe tomorrow with the next generation of games. Your card will stop being high tier - but it will still have VRR.
(And, as I think you're a technical person as well: CPU and GPU waiting for the monitor is just wrong.)
Quoting: poiuz
Well, I sure don't buy into "The company isn't giving to us, so it's not important." Anybody got some more insight why Valve wouldn't do it? I'd guess it's part of making Steam Deck as affordable as possible?

Quoting: Brokatt
I would guess that for the millions of Linux Steam users VRR is somewhat important.

More than 40% use a Steam Deck. Steam Deck doesn't support VRR (even the newest revision won't). It can't be that important if a gaming company omits it (on a device which would very much benefit from it).
Regarding the CPU+GPU waiting for Vsync: I think I wrote before that I'm still perplexed that, in a world where monitors no longer really have to update at a fixed frequency (like a CRT does), the GPU<->monitor protocol isn't simply start-of-image + image + end-of-image, with no frequencies involved at all; the monitor would only reply back what the minimum wait period between images is. VRR as such should both not exist and be the default, so to speak.
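As a sketch of that idea (all names here are hypothetical; this models the proposal, not any real display protocol): the monitor advertises only a minimum interval between frames, and the GPU presents whenever a frame is ready:

```python
class Monitor:
    """Toy model of the proposed frequency-less protocol: no fixed
    refresh clock, just a minimum interval between accepted frames."""

    def __init__(self, min_frame_interval_s):
        self.min_frame_interval_s = min_frame_interval_s  # panel's physical limit
        self._last_accept = None

    def present(self, frame, now):
        """Return 0.0 if the frame is displayed immediately, otherwise
        how long the GPU must hold it before resending."""
        if self._last_accept is None or now - self._last_accept >= self.min_frame_interval_s:
            self._last_accept = now
            return 0.0
        return self.min_frame_interval_s - (now - self._last_accept)

# A 240 Hz-class panel: frames arriving slower than ~4.2 ms apart are
# shown the instant they are ready; faster ones wait out the remainder.
panel = Monitor(min_frame_interval_s=1 / 240)
print(panel.present("frame0", now=0.0))    # 0.0 -> shown immediately
print(panel.present("frame1", now=0.001))  # small positive wait remains
```

The point of the sketch: the only timing constraint left is the panel's minimum interval, which is essentially VRR as the default rather than an opt-in feature.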
Quoting: Brokatt
Both VRR and HDR are something that we gamers are anxiously awaiting, but the unfortunate truth is that desktop is where the focus is, especially for funding like this.

Quoting: F.Ultra
Yes, it's very important to gamers. I would go as far as to call it a game changer for PC gaming. The fact that GNOME is not prioritizing it is very sad. But I understand gamers are not the target audience for IBM/Red Hat. Still, I was hoping to see some progress with this donation.

Quoting: Brokatt
It's only important to gamers, which already is a small subset of all Linux users, and on top of that it is also only important to people that have a system that cannot handle high enough frame rates; if you e.g. have 1% lows > 90fps then you will not see screen tearing, so VRR will be mostly useless then.

Quoting: F.Ultra
I would guess that for the millions of Linux Steam users VRR is somewhat important. I just thought that after getting some extra funding they would put some of it towards that 3 year old merge request.

Quoting: Brokatt
I don't see "Improve the state of VRR."

Very few people care about VRR, which is also why it has taken so long to get it implemented.
GNOME gets €1M funding from the Sovereign Tech Fund
13 Nov 2023 at 12:59 pm UTC Likes: 1
Quoting: Brokatt
It's only important to gamers, which already is a small subset of all Linux users, and on top of that it is also only important to people that have a system that cannot handle high enough frame rates; if you e.g. have 1% lows > 90fps then you will not see screen tearing, so VRR will be mostly useless then.

Quoting: F.Ultra
I would guess that for the millions of Linux Steam users VRR is somewhat important. I just thought that after getting some extra funding they would put some of it towards that 3 year old merge request.

Quoting: Brokatt
I don't see "Improve the state of VRR."

Very few people care about VRR, which is also why it has taken so long to get it implemented.
Quoting: Eike
Important for games, yes, but what a small subset of users gamers are. Flickering is not the opposite of VRR, it's screen tearing, and honestly since I moved away from a 60Hz screen and my old GPU I cannot notice screen tearing (the fps is far too high for that).

Quoting: F.Ultra
Very few people care about VRR, which is also why it has taken so long to get it implemented.

I find it very hard to believe that. Without VRR, you get either flickering or might be losing quite a bit of the performance you paid for (plus added input lag, but that probably is something not many people care for). When thinking about it, the monitor displaying the image when it's ready, instead of some hundreds of bucks of CPU plus possibly many hundreds of bucks of GPU waiting for the monitor, is the world as it's supposed to be.
Valve reveals Steam Deck OLED for November 16th
13 Nov 2023 at 12:55 pm UTC Likes: 2
Quoting: Lofty
well that's not strictly true. You personally may not perceive a softness to an image (and I'm sure on a small low-res screen like the Steam Deck's it might be even harder to tell), but there are many threads on OLED monitors (perhaps not so much on a large TV, as you sit further back) that comment on the subpixel layout of OLED and how for desktop use it is softer, and much better for gaming than desktop productivity. And that this is something that is not likely to be fixed any time soon. Right now LCD is sharper.

I know that there are, but lots of those people are confused; you see, it's not only OLED that has a subpixel layout, every display has a subpixel layout:
One is not sharper or softer than the other, it's just that algorithms like ClearType are designed for one of them but not all of them when adding anti aliasing tricks to fake a higher resolution for text than what the screen can produce natively. A single pixel is as sharp on any of these, so this is only down to algos that are trying to increase the sharpness of text.
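To illustrate why those algorithms are layout-dependent, here is a toy sketch (hypothetical function names, classic left-to-right RGB stripe assumed) of the difference between subpixel-style and grayscale antialiasing:

```python
def subpixel_render(coverage3x):
    """Map a glyph row sampled at 3x horizontal resolution onto
    RGB-stripe subpixels: each pixel's R, G, B channel gets its own
    sample. This is the trick ClearType-style AA relies on, and why
    it assumes a left-to-right R,G,B layout."""
    assert len(coverage3x) % 3 == 0
    return [(coverage3x[i], coverage3x[i + 1], coverage3x[i + 2])
            for i in range(0, len(coverage3x), 3)]

def grayscale_render(coverage3x):
    """The layout-agnostic alternative (what GTK4 does): average the
    three samples into one value per pixel, identical on any panel."""
    return [(sum(coverage3x[i:i + 3]) / 3,) * 3
            for i in range(0, len(coverage3x), 3)]

# A hypothetical glyph edge sampled at 3x resolution:
row = [0.0, 0.5, 1.0, 1.0, 1.0, 1.0]
print(subpixel_render(row))   # [(0.0, 0.5, 1.0), (1.0, 1.0, 1.0)]
print(grayscale_render(row))  # [(0.5, 0.5, 0.5), (1.0, 1.0, 1.0)]
```

On a panel whose subpixels are not ordered R,G,B left to right, the first variant paints its color fringes in the wrong places, which is the softness people report; the second is simply blurrier but never wrong.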
Quoting: Lofty
Gnome does seem to handle fonts a bit better in my experience. One of my screens is a 24.5" 1080p screen running Gnome Wayland and it looks okay at a suitable distance, but it's the lowest PPI I could possibly handle. I assume you sit further back with it being a giant 45" screen. Also it seems that the 45GR95QE-B does not flicker like the Steam Deck, seeing as I have already looked at reviews of the LG and have seen no issues (I'd actually like a 5K version of that screen)

Due to how my setup is at home, I sit at the exact same distance from my 45" as I did with my 27"; the 32" at work is also at roughly the same distance.
Everyone has their acceptable preferences with regards to clarity; some people have returned OLEDs because for some reason they don't like their color presentation. But I wouldn't assume that they are all wrong across the many forums; mostly they base their experiences on other OLEDs at 27" to 34" with 109ppi vs your smaller 83ppi. Although how many people actually own these screens vs theoretically disliking them I cannot tell. On balance there is an issue, but not everyone is sensitive to it.
Just for fun I took high-res photos of text displayed on both the 1440p and the 4K one to show that the sharpness and clarity are identical. The problem is that the curvature of my 45" is visible, so I cannot use the photos as a blind test: people will always see which one is the 45" and will therefore always be able to say that that one is less sharp, or whatever their bias is.
Quoting: Lofty
I must have missed that bit of the discussion. OLED burn-in is still a thing. It's one of the reasons I believe the pricing has been so surprisingly competitive with mini-LED panels. Where is the study that shows OLED to last twice the length of an LCD screen? I have LCD screens from 2012 with tens of thousands of hours on them, working just fine like the day I bought them.

LG made a public statement in 2016 that their panels as of then had a lifespan of 100k hours, and LCDs at the time had a max lifespan of roughly 50k hours; that is where the 2x lifespan comes from: https://www.oled-info.com/lgs-latest-oled-tvs-last-100000-hours [External Link] Note that this is up from only 36k hours in 2013, so lifespan has increased greatly in just a short time frame.
Rtings have shown that burn-in is mostly overblown on modern panels (yes, they do experience burn-in, but Rtings run their displays 24x7 on the same media just to trigger it as much as possible); what they do show, however, is that LCDs experience tons of image-uniformity issues that visually are identical to burn-in: https://www.rtings.com/tv/tests/longevity-burn-in-test-updates-and-results [External Link]
Quoting: Lofty
So your screen has a fan? I don't want a monitor that needs a fan TBH.

I actually don't know if it does or not. It is 100% silent. I just know that e.g. the smaller Alienware does have a fan, and they have one model where the fan is noticeable and one where it isn't, so I don't know if the LG has a fan or not, just that it is completely silent.
Quoting: Lofty
The power consumption on OLED is higher than LCD, which may or may not bother the end user. But I like to keep my energy bill & heat in the room as low as possible. Like I said, each technology has its strengths and weaknesses.

Yes, that is one drawback: roughly 130W vs 60W for an equivalent LCD, though it depends on what you display. If it is mostly dark pixels then the OLED draws close to nothing while the LCD still draws 60W (which is e.g. why the switch to OLED has made the power consumption of the Deck go down and not up). Nothing compares to my old Plasma though :)
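Back-of-the-envelope, using the figures above (the linear model and the idle floor are assumptions made up purely for illustration, not measured values):

```python
def oled_power_w(apl, peak_w=130.0, floor_w=5.0):
    """Rough OLED draw as a function of average picture level (0..1).
    peak_w comes from the ~130 W figure above; the linear scaling and
    the 5 W floor are illustrative assumptions only."""
    return floor_w + (peak_w - floor_w) * apl

LCD_POWER_W = 60.0  # an LCD backlight burns roughly this regardless of content

# Full-white content vs a mostly dark, terminal-style desktop:
print(oled_power_w(1.0))  # 130.0 -> worse than the LCD
print(oled_power_w(0.2))  # 30.0  -> half the LCD's draw
```

This content dependence is why a mostly dark handheld UI can come out ahead on OLED even though the panel's peak draw is higher.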
Valve reveals Steam Deck OLED for November 16th
13 Nov 2023 at 12:27 pm UTC Likes: 1
Quoting: slaapliedje
Yes, the graphics (just as it is today ofc) were created with the display used at the time, so scanlines and other imperfections were used to enhance the image where the GPU of the time couldn't provide the color or resolution needed/wanted. Btw, which demo was it on the A4000? I would like to see the blue rectangle to try and make out what it was.

Quoting: F.Ultra
Interestingly, I currently have my A4000 connected to an LCD monitor (via a ZZ9000, which has an HDMI output, but a pass-through for native resolutions), plus a Commodore 1084 monitor (CRT). Watching a demo, I could see a square blue area around the main part of the demo running on the LCD screen, whereas on the CRT it was very dark and you couldn't see it, making it look much better.

Quoting: elmapul
The thing is that those old games were created with the notion that the display was fuzzy, not sharp and detailed as displays are now, and an OLED is just as sharp and detailed as any LCD. What OLED brings to the table is CRT-like (and in some cases, like my monitor, better) handling of black and increased color and brightness capabilities.

Quoting: slaapliedje
speaking of it, do you (or anyone) know if old games work fine on OLED? i know they look like crap on CRT, but oled works differently so it might look less crappy? i wonder if it's harder to make shaders/filters to simulate a CRT on an OLED screen than on an LCD one.

Quoting: tuubi
CRTs didn't either, except when you'd try to do foolish things like interlace. Well, or if you were someone not in the 60Hz locations... While there are benefits of PAL, a higher refresh rate is not one of them, and there is definitely flicker for most people at 50Hz vs 60Hz.

Quoting: slaapliedje
It's kind of amusing to me that CRTs started off at 50/60Hz, then higher-end monitors started getting really high refresh rates (like the one I have that'll do 1600x1200 at 85Hz). Then when we started with LCDs, we were back to having crappy refresh rates, with the added disadvantage of any non-native resolution looking like trash... Many years later, they're finally getting better.

You're forgetting or ignoring the fact that we mostly wanted higher refresh rates for CRTs to reduce the eye-destroying flicker, not to make games run smoother or whatever. Whereas an LCD doesn't really have a flicker problem, even with the old fluorescent backlights.
There are definitely benefits and disadvantages to each tech. Older stuff, though, was designed for a CRT, so on occasion it can look like utter trash on a flat screen. Especially when you're looking at 8-16-bit stuff.
Also, one has to remember that back when we played those 8-bit and 16-bit games, a 14" monitor was the default, and the viewing distance was the same as it is with our modern 45" monitors, so the size difference alone shows imperfections that were not detectable back then.
That said, I find C64 games using VICE looking quite good actually both on my OLED and on my old LCD.
A lot of the old pixel art and such just looks better with scanlines, which is why most emulators try their damnedest to recreate such things with shaders, etc. Ha, in a lot of ways, the computations to do just the shaders require more power than the original platforms had...
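The simplest version of the scanline effect those shaders build on can be sketched like this (toy code; `darken` is a made-up knob, and a real CRT shader also models phosphor glow, mask patterns, curvature and so on):

```python
def apply_scanlines(frame, darken=0.5):
    """Dim every other row of a framebuffer (rows of brightness
    values) - the most basic CRT-scanline emulation."""
    return [row if y % 2 == 0 else [int(v * darken) for v in row]
            for y, row in enumerate(frame)]

# A tiny 2x3 "framebuffer" of brightness values:
frame = [[200, 200], [200, 200], [100, 100]]
print(apply_scanlines(frame))  # [[200, 200], [100, 100], [100, 100]]
```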
For the record, my Atari Jaguar does actually look quite amazing on my 77" OLED through an OSSC...
Valve reveals Steam Deck OLED for November 16th
13 Nov 2023 at 12:12 am UTC
Quoting: Lofty
Yes, I have seen lots of such claims, especially from people online vomiting over my LG 45GR95QE-B, since it's both a WOLED and a 45" UW 1440p and thus has a DPI that is just slightly above that of a 27" 1080p, but to me this is mostly BS. At work I have a high-end 32" 4K LCD from Samsung and it is by all means not that much sharper than my OLED at home.

Quoting: F.Ultra
The thing is that those old games were created with the notion that the display was fuzzy, not sharp and detailed as displays are now, and an OLED is just as sharp and detailed as any LCD. What OLED brings to the table is CRT-like (and in some cases, like my monitor, better) handling of black and increased color and brightness capabilities.

well that's not strictly true. You personally may not perceive a softness to an image (and I'm sure on a small low-res screen like the Steam Deck's it might be even harder to tell), but there are many threads on OLED monitors (perhaps not so much on a large TV, as you sit further back) that comment on the subpixel layout of OLED and how for desktop use it is softer, and much better for gaming than desktop productivity. And that this is something that is not likely to be fixed any time soon. Right now LCD is sharper.
Online, people claim that it is impossible to work with text on my monitor, while I, as a programmer, do tons of work with text (often in terminal windows) with zero issues. Whether that means they are all wrong, or that GNOME happens to be better at text on this subpixel layout than Windows is, I don't know, since I don't run Windows; but text sharpness is a complete non-issue.

Plus, to get back to context: even if that were true, it would be nothing compared with the fuzziness that existed even on the really high-end CRTs (and I used seriously high-end CRTs at work back before LCDs took over) due to their analogue nature (#1, the signal was analogue, and #2, the ray painting the image cannot hit the exact same spot with infinite precision). Combine that with the low resolution of 8-bit monitors of 320x200 / 320x160 depending on PAL/NTSC, fed over analogue composite.

* lower lifespan was an issue before 2016, as already discussed; the OLED panels since then have 2x the lifespan of an LCD one
* fans are a problem on some models; mine not at all, completely silent
* visible ABL can be an issue the few times it hits, but IMHO it beats the bleed from an LCD's backlight every single day of the week.
GNOME gets €1M funding from the Sovereign Tech Fund
12 Nov 2023 at 4:52 pm UTC Likes: 5
Quoting: Brokatt
I don't see "Improve the state of VRR."

Very few people care about VRR, which is also why it has taken so long to get it implemented.
Valve reveals Steam Deck OLED for November 16th
12 Nov 2023 at 4:19 pm UTC
Quoting: elmapul
The thing is that those old games were created with the notion that the display was fuzzy, not sharp and detailed as displays are now, and an OLED is just as sharp and detailed as any LCD. What OLED brings to the table is CRT-like (and in some cases, like my monitor, better) handling of black and increased color and brightness capabilities.

Quoting: slaapliedje
speaking of it, do you (or anyone) know if old games work fine on OLED? i know they look like crap on CRT, but oled works differently so it might look less crappy? i wonder if it's harder to make shaders/filters to simulate a CRT on an OLED screen than on an LCD one.

Quoting: tuubi
CRTs didn't either, except when you'd try to do foolish things like interlace. Well, or if you were someone not in the 60Hz locations... While there are benefits of PAL, a higher refresh rate is not one of them, and there is definitely flicker for most people at 50Hz vs 60Hz.

Quoting: slaapliedje
It's kind of amusing to me that CRTs started off at 50/60Hz, then higher-end monitors started getting really high refresh rates (like the one I have that'll do 1600x1200 at 85Hz). Then when we started with LCDs, we were back to having crappy refresh rates, with the added disadvantage of any non-native resolution looking like trash... Many years later, they're finally getting better.

You're forgetting or ignoring the fact that we mostly wanted higher refresh rates for CRTs to reduce the eye-destroying flicker, not to make games run smoother or whatever. Whereas an LCD doesn't really have a flicker problem, even with the old fluorescent backlights.
There are definitely benefits and disadvantages to each tech. Older stuff, though, was designed for a CRT, so on occasion it can look like utter trash on a flat screen. Especially when you're looking at 8-16-bit stuff.
Also, one has to remember that back when we played those 8-bit and 16-bit games, a 14" monitor was the default, and the viewing distance was the same as it is with our modern 45" monitors, so the size difference alone shows imperfections that were not detectable back then.
That said, I find C64 games using VICE looking quite good actually both on my OLED and on my old LCD.
Valve reveals Steam Deck OLED for November 16th
12 Nov 2023 at 4:15 pm UTC
Quoting: slaapliedje
True, though the flickering from OLED comes from a completely different frequency than the display frequency. However, I'm just still amazed that, since we moved on from a moving electron beam in a CRT to technology like LCD/LED/OLED where you can update any pixel at any time, we still have update frequencies. Monitors should basically be VRR, as in the GPU sending "here is an image, and here is another", and only the capability of the monitor determining the minimum time between such full frames.

Quoting: F.Ultra
It's kind of amusing to me that CRTs started off at 50/60Hz, then higher-end monitors started getting really high refresh rates (like the one I have that'll do 1600x1200 at 85Hz). Then when we started with LCDs, we were back to having crappy refresh rates, with the added disadvantage of any non-native resolution looking like trash... Many years later, they're finally getting better.

Quoting: Pengling
Ah, was about to ask if you had the same issues with CRTs. Yes, OLEDs "flicker" to control brightness, so I can see why that might be an issue; probably different from model to model, since the frequency of the flicker differs, and once the frequency goes high enough I suspect that the issue should go away for sensitive people (the current issue is that the frequency is just above the detectable threshold and not way above, probably to save on power).

Quoting: F.Ultra
Where does the pain come from? I understand that OLEDs usually have higher brightness, but since that is a setting, I guess that we are talking about something else?

Eyestrain and migraines. I'm told that OLEDs flicker like CRTs did (I had the same problem with those - LCDs were a godsend! :tongue:), which would both explain it and suggest that it can't be avoided.
Display frequencies have increased quite a lot though; I mean, my OLED is 240Hz and there are 540Hz ones in stores right now. And as others have already stated, the reason for the 85Hz CRT was to give a more stable image, not to handle high FPS in games.