
Open Game Benchmarks, a brand new benchmarking website for Linux games

I have been hoping to see something like this. Open Game Benchmarks is a brand new website dedicated to showing off benchmarks from Linux games.

Previously we've only really had Phoronix (and people know how I feel about the quality of Phoronix in recent times), so it's good to see some healthy competition in the Linux benchmarking area. While we do benchmarks, it's generally only on big new titles, as we focus more on the Sales Page, the Calendar and general day-to-day Linux gaming news.

I was actually going to add something exactly like Open Game Benchmarks to GOL, but now I don't see a need, which is great. I am going to try to team up with them to feature benchmarks here on GOL too.

It currently only supports games from Steam for Linux, as they claim that gives them a big enough selection and all high-profile Linux games will be on it. It's hard to argue with that, really, but I still hope they open up a bit in the future if it's not too much trouble.

They also plan to open source it once they have cleaned it up enough, so that's even better news. I would hate for it to die off, and if it ever does we could always host it.

It has a nice, simple layout: everything looks clean and to the point, and it even has simple, nice-looking graphs for each game.
Tags: Editorial
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly came back to check on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly. Find me on Mastodon.
27 comments

Half-Shot Jan 30, 2016
Apologies to anybody who actually reads this, the comment chain got fat fast.

Quoting: dubigrasu
So why not use the Phoronix Test Suite for this? It's Linux/Windows, easy to install, and can run tests in controlled conditions that are easy to reproduce.

It's stale, for one thing. This looks really nice, and I think that given enough work it could be very good as a community-sourcing tool.

Quoting: TheBoss
Like with anything that takes user input it can be gamed, but it won't be too hard to spot if there's more than one of each test done on a game.

Well, once you get into the region of 20 to 30 test cases you can see a pattern, but we have to reach that first. Having a standard test would make it much harder to end up with, say, half the people building tiny cities and another bunch building huge cities in Cities: Skylines.

Quoting: JSalem
Quoting: Half-Shot
Quoting: Doc Angelo
I'm kinda curious how it will turn out, given that these will be benchmarks of random scenery or levels of a game. Someone could benchmark looking 5 minutes at the skybox at the lowest settings. Someone else could upload a benchmark result of 5 minutes in a complex level with the highest settings at 4K resolution. If there will only be 1 single average value, it will be pointless. But it's a young project, we'll see where it will go.

This.

I'm not sure I follow the purpose of the site at all. I'm all for a benchmark consolidation site where we all throw in the scores of our tests, but there are quite a few problems.

I can't run other people's tests, so I can't compare against them. Allowing users to run their own tests at all seems dangerous, since the scores vary wildly depending on what you're doing in the game/test. Finally, I've noticed the only way to distinguish users is by hardware, driver (though NOT version), and distro (again, not version). So I could run Mesa on Ubuntu Warty Warthog and then try to compare against a seemingly identical user running Ubuntu Utopic Unicorn, and obviously the results would be massively different.

The concept is cool, but I can't see how the implementation is useful.

If this seems like a rant, it's meant to be constructive criticism. As someone who has been doing Mesa benchmarks for two years, I want to have tools that can reproduce results correctly.

Thanks for the feedback, I see where you are coming from.
As you know, having standardized tests is hard, if possible at all in this case, where the vast majority of games do not have a benchmark tool. One way could be having a varied enough set of hardware and doing all the tests in a controlled environment. The other way is letting the community accumulate data (surely noisy and less controlled) in a centralized place, and then trying to make some sense of it: filtering, aggregating, normalizing, visualizing.

The idea here is: I, random_linux_user_37, want to buy game X, and I want to know how that game would run with my setup (CPU, GPU, resolution).
Another thing that I hope the site will provide is some sort of comparison ground with Windows. Hopefully, with the accumulation of enough data, and possibly the development of automation tools (that can, for example, run a certain game at certain settings and a certain stage, and then upload everything), we will have solid ground to make informed decisions and try to push developers and porters to focus on better performance on Linux.

Finally, I could add even more fields like driver version and Linux distro; but on the one hand I am a bit skeptical that there would really be that big of a difference (given the same hardware and driver) between two Ubuntu versions (as in your example), and on the other hand adding variables can complicate an already noisy situation even more.

Quote
If there will only be 1 single average value, it will be pointless.
That should not be the case; each benchmark is stored with all of its data, and several statistical measures are computed. And then you can compare benchmarks, selecting which ones you want to see (e.g. "let's see how game X performs on open source and proprietary drivers").
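
To illustrate the kind of statistical measures a site like this could expose (a sketch of the general idea only, not Open Game Benchmarks' actual code or schema): given the raw frame times of a run, you can derive average FPS, median, 1% lows and so on, rather than a single average.

```python
# Sketch of the general idea only, not Open Game Benchmarks' actual code or
# schema: given the raw frame times (ms) of one run, compute common summary
# statistics instead of a single average.
import statistics

def summarize_frame_times(frame_times_ms):
    fps = [1000.0 / ft for ft in frame_times_ms if ft > 0]  # per-frame FPS
    fps_sorted = sorted(fps)
    worst_count = max(1, len(fps_sorted) // 100)
    return {
        "avg_fps": statistics.mean(fps),
        "median_fps": statistics.median(fps),
        "min_fps": min(fps),
        "max_fps": max(fps),
        # "1% low": average of the worst 1% of frames, a common stutter metric
        "one_percent_low_fps": statistics.mean(fps_sorted[:worst_count]),
        "stdev_fps": statistics.pstdev(fps),
    }

# Example run: mostly ~60 FPS with a couple of big stutters
print(summarize_frame_times([16.6, 16.8, 17.0, 33.3, 16.5, 50.0, 16.7]))
```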

Thanks for clearing that up, and for dealing with pre-food me. I've done a lot of Mesa testing, and any change in the environment can make a massive difference (see this doc from my testing last year). However, you are right that it could make things noisy, so it would be a case of finding middle ground, I think. I'll try to help out when it goes open source.
dubigrasu Jan 30, 2016
Quoting: Half-Shot
Quoting: dubigrasu
So why not use the Phoronix Test Suite for this? It's Linux/Windows, easy to install, and can run tests in controlled conditions that are easy to reproduce.
It's stale, for one thing...
Can you give some details?
Commander Jan 31, 2016
Quoting: JSalem...stuff....

I have been thinking about doing something similar to yours, however my idea has instead been to try to use data over time. By that I mean that as people upload "benchmarks", you then use all of them from different uploaders and try to set a value that way.

For example, say you got ~50+ people uploading benchmarks of CS:GO. Some of them are Linux users and some of them are Windows users. You group those people up in their own "camps" and accumulate their results together. This, I think, will give a better "general" result as the benchmarks grow in number over time. (Still won't be as exact as having identical cases, but from a user point of view I think it can be interesting.)

Of course, you could then click on said benchmarks to see all the uploads that contribute to the final "score", and the same comparisons could then be done for, say, Nvidia vs AMD, or even Nvidia 361 vs Nvidia 358 driver versions, etc. To make the score better you could then remove extreme values, etc. A lot to play with here =)
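
As a rough sketch of that grouping-plus-trimming idea (the record fields and trim fraction here are made up for illustration and are not Open Game Benchmarks' actual data model):

```python
# Rough sketch of the grouping + outlier-trimming idea above. The record
# fields ("os", "driver", "avg_fps") are made up for illustration and are not
# Open Game Benchmarks' actual schema.
from collections import defaultdict

def trimmed_mean(values, trim_fraction=0.1):
    """Mean after dropping the lowest and highest trim_fraction of values."""
    values = sorted(values)
    k = int(len(values) * trim_fraction)
    trimmed = values[k:len(values) - k] or values
    return sum(trimmed) / len(trimmed)

def aggregate_scores(uploads):
    groups = defaultdict(list)
    for upload in uploads:
        groups[(upload["os"], upload["driver"])].append(upload["avg_fps"])
    # With enough uploads per group, the trim drops extreme submissions
    # (e.g. someone benchmarking while staring at the skybox).
    return {group: trimmed_mean(fps) for group, fps in groups.items()}

uploads = [
    {"os": "Linux", "driver": "nvidia-361", "avg_fps": 92.0},
    {"os": "Linux", "driver": "nvidia-361", "avg_fps": 88.5},
    {"os": "Linux", "driver": "nvidia-358", "avg_fps": 84.2},
    {"os": "Windows", "driver": "nvidia-361", "avg_fps": 101.0},
]
print(aggregate_scores(uploads))
```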

If you need any help I'm happy to contribute =) Also, some games do come with demo files (Source-based games) that users could upload for other people to compare.


Last edited by Commander on 31 January 2016 at 12:26 am UTC
no_information_here Jan 31, 2016
I like the idea. Perhaps you could make more use of the "notes" so that people would describe in more detail the part of the game that was benchmarked. Making things consistent will be hard and it will rely somewhat on the "truthfulness" of the people submitting info. However, wikipedia works this way, so it is not impossible.

I agree that it will need more detail on distro version and driver version, if the site is to be useful at all. Although it is somewhat dated, I think WineHQ could give an idea for how to handle this.

Quoting: dubigrasu
So why not use the Phoronix Test Suite for this? It's Linux/Windows, easy to install, and can run tests in controlled conditions that are easy to reproduce.

I am no expert, but I believe the Phoronix tests are designed to benchmark hardware by having a consistent batch of games to test against. Does it have the ability to plug in different games?
dubigrasu Jan 31, 2016
Quoting: no_information_here
Quoting: dubigrasu
So why not use the Phoronix Test Suite for this? It's Linux/Windows, easy to install, and can run tests in controlled conditions that are easy to reproduce.

I am no expert, but I believe the Phoronix tests are designed to benchmark hardware by having a consistent batch of games to test against. Does it have the ability to plug in different games?

You can use an awful lot of games, reorder/highlight the results using various criteria, save results in various forms, compare your tests with others, view detailed hardware/test/log information and whatnot:
http://openbenchmarking.org/result/1504257-DE-STEAMOS5526
tomtomme Jan 31, 2016
About Phoronix quality: I applaud Michael for the balance between speed, quantity and quality. And as for GOL, it would be even better if the language were kept more professional and less emotional.
tuubi Jan 31, 2016
Quoting: tomtomme
And as for GOL, it would be even better if the language were kept more professional and less emotional.
You're forgetting that this site isn't "professional" and has no pretensions to being so. It's more or less a personal blog with a few contributing editors and a forum. And that's a fact many of us appreciate.

Sure, GOL is gathering quite a following, but that's no reason to suddenly start emulating a commercial "news" portal / hype pusher / advertisement outlet / clickbait hell. We've got enough of those, and I for one have no time or patience for them.


Last edited by tuubi on 31 January 2016 at 8:47 am UTC
DasCapschen Jan 31, 2016
Hmm, not all Steam Linux games are on there :/
Slime Rancher, for example, is missing; it's currently Early Access, and I would've liked to upload a benchmark of it.
Anyway, registered, submitted 4 benchmarks :)
J_Salem Jan 31, 2016
Quoting: Commander
Quoting: JSalem...stuff....

For example, say you got ~50+ people uploading benchmarks of CS:GO. Some of them are Linux users and some of them are Windows users. You group those people up in their own "camps" and accumulate their results together. This, I think, will give a better "general" result as the benchmarks grow in number over time. (Still won't be as exact as having identical cases, but from a user point of view I think it can be interesting.)

Of course, you could then click on said benchmarks to see all the uploads that contribute to the final "score", and the same comparisons could then be done for, say, Nvidia vs AMD, or even Nvidia 361 vs Nvidia 358 driver versions, etc. To make the score better you could then remove extreme values, etc. A lot to play with here =)

With the current setup in Open Game Benchmarks, it is possible at any point to aggregate the data, for example for a certain game, and split it into the two groups Linux and Windows (or even separate Linux distros, etc.). But computing a single "score" for each group is very, very tricky, and potentially misleading. The main problem is that you need to assess whether the variation within the groups is bigger or smaller than the variation between the groups, so you need something like ANOVA (https://en.wikipedia.org/wiki/Analysis_of_variance). And, in turn, those variances could be very inaccurate if the sample size is too small, and there could be a number of hidden factors... So it is better, at the beginning, to just collect the data.
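
For anyone curious what that within-group vs between-group check looks like in practice, here is a minimal one-way ANOVA sketch using SciPy; the average-FPS samples are made up purely for illustration and are not data from the site.

```python
# Minimal one-way ANOVA sketch with made-up average-FPS samples, purely to
# illustrate the within-group vs between-group variance check mentioned above.
from scipy import stats

linux_fps = [88.5, 92.0, 85.3, 90.1, 87.4]
windows_fps = [95.2, 99.1, 93.8, 97.0, 96.4]

f_stat, p_value = stats.f_oneway(linux_fps, windows_fps)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# A small p-value suggests the difference between the group means is larger
# than the scatter within each group would explain; with tiny, noisy samples
# (as warned above) the result can still easily be misleading.
```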

Quoting: Commander
If you need any help I'm happy to contribute =) Also, some games do come with demo files (Source-based games) that users could upload for other people to compare.
Thank you! I'll put the code on GitHub soon, and I will most definitely need some help!
J_Salem Jan 31, 2016
Quoting: CAPTNCAPS
Hmm, not all Steam Linux games are on there :/
Slime Rancher, for example, is missing; it's currently Early Access, and I would've liked to upload a benchmark of it.
Anyway, registered, submitted 4 benchmarks :)

Hmm, interesting. I get the updated list of games from steamdb: https://github.com/SteamDatabase/SteamLinux/blob/master/GAMES.json
I think that Early Access games are not there. Now it would be good to know if the community is interested in having support for those....
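
For reference, a quick sketch of pulling that list and checking an app ID. The raw-file URL and the assumption that the JSON's top-level keys are Steam app IDs are mine, not confirmed by the comment, so double-check against the repo.

```python
# Quick sketch of checking the SteamDB Linux games list linked above.
# Assumptions (mine, not from the comment): the raw-file URL is GitHub's usual
# raw mirror of that repo path, and the JSON's top-level keys are Steam app
# IDs stored as strings.
import json
import urllib.request

RAW_URL = "https://raw.githubusercontent.com/SteamDatabase/SteamLinux/master/GAMES.json"

def linux_app_ids():
    with urllib.request.urlopen(RAW_URL) as response:
        return set(json.load(response).keys())

if __name__ == "__main__":
    games = linux_app_ids()
    print(f"{len(games)} Linux app IDs listed")
    # 433340 is (I believe) Slime Rancher's Steam app ID -- treat as an example.
    print("Slime Rancher listed:", "433340" in games)
```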
The comments on this article are closed.