
The big Wine 3.0 release is now officially available


Good things come to those who wait, like a fine Wine. Today the Wine team has officially released the next stable version, Wine 3.0 [Official Site].

After around a year of development during the 2.x cycle, Wine 3.0 brings in some major changes towards better game and application support for those of you wanting to run Windows-only stuff on Linux. It's nowhere near perfect, but it's a massive advancement for the Wine project and provides a good base for them to continue onwards.

Here are a few highlights from the mailing list announcement sent today:

  • Direct3D 10 and 11 support which includes:
    • Compute shaders
    • Hull and domain (tessellation) shaders
    • A large number of shader model 4 and 5 shader instructions
    • Cube-map arrays
    • Mip-map generation
    • And lots more
  • The Direct3D command stream, which is disabled by default. 
  • Support for OpenGL core contexts in Direct3D is improved. If you're using Mesa, you shouldn't need to set the "MaxVersionGL" registry key to enable Direct3D 10 and 11 support.
  • The Android graphics driver.
  • Improved DirectWrite and Direct2D support.
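For anyone on a driver that does still need the override, the MaxVersionGL key mentioned above lives in the Wine registry under HKEY_CURRENT_USER\Software\Wine\Direct3D. A minimal sketch of the entry as a .reg fragment (the dword packs the major and minor GL version, e.g. 0x00040005 for OpenGL 4.5 - check the Wine wiki for the value your driver needs):

```
[HKEY_CURRENT_USER\Software\Wine\Direct3D]
"MaxVersionGL"=dword:00040005
```

Save it as, say, maxgl.reg and import it into the prefix with `wine regedit maxgl.reg`.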

There's absolutely tons more; those are just a few bits I cherry-picked from this big release that I found quite interesting. For the next development cycle, we can look forward to things like Direct3D 12 and Vulkan support, OpenGL ES support to enable Direct3D on Android and plenty more.

You can find the brief official announcement here.

Article taken from GamingOnLinux.com.
Tags: Wine
The comments on this article are closed.
48 comments

crt0mega Jan 19, 2018
*cough* nouveau *cough*
Pecisk Jan 19, 2018
My next GPU will be AMD because they have put their words into action: they made AMDGPU the official driver, redid the binary one for legacy purposes, worked with the community, opened their Vulkan driver, and the community side of the developers is kick-ass and keeps pushing performance up.

Yes, Nvidia might have gigantic marketing and brand recognition, and there's no shame in people picking it. I'm not a fan of binary drivers though - although Nvidia hasn't caused a lot of pain for me with them - and I want that to change.

That said, my GTX 760 has served me very, very well over these years.

As for Wine 3.0 - impressive work, everybody involved (clap). I have tried to debug and report multiple games and it looks really hopeful. There's a chance Wine 3 can handle even more challenging 3D games with some tweaks. There's an incredible amount of work still required for Wine - it feels like a never-ending story - which makes the amount of software already running on it even more amazing.


Last edited by Pecisk on 19 January 2018 at 12:39 pm UTC
mrdeathjr Jan 19, 2018
Quoting: cRaZy-bisCuiT@1050ti users: If you had claimed nVidia GPUs and drivers were more competitive two years ago, I'd totally agree.

Nowadays the Raven Ridge desktop APU is right around the corner, which should match the TDP of a 1050 Ti plus CPU while still having enough performance.

Also, 7nm will be a thing soon and this time I expect AMD to be quicker than nVidia.

Combined with the fact that AMD's drivers are much better integrated into the Linux world, I'd definitely stick with AMD - and would switch to AMD if I owned an nVidia GPU.

As you can see in Phoronix benchmarks, the AMD driver's performance is now competitive with nVidia, and more and more devs actively support the AMD Mesa drivers - as does Valve.

The Raven Ridge APU, going by CU count (11 CUs, i.e. 704 shaders*) in the 2400G:

*It will be more or less at RX 550 level, and that GPU sits at GT 1030 level; both are around 50% slower than a GTX 1050, with the GTX 1050 Ti above that.

RTG has serious trouble in the power consumption** area compared with Nvidia, and sadly in iGPUs that consumption costs RTG dearly.

**The GT 1030, with 50% fewer shaders, a 50% narrower memory bus and 30W of consumption, has more or less the same performance as the RX 550, which has almost double the shaders and around 55W of consumption.

Not forgetting the more important APU: the Ryzen 2200G only has 8 CUs (512 shaders), so on current information it will be slower than the RX 550.

Given all that, it will be really difficult for RTG to make an iGPU at non-Ti GTX 1050 level; RX 550 level is very possible.

Another interesting comparison: Vega 56 has 3584 shaders and comes close to the GeForce GTX 1070 Ti with 2432 shaders.

Putting that together, Vega appears to need far more shaders to compete - Pascal gets by with around 33% fewer.

Sadly AMD doesn't offer a 1024-shader iGPU with the Ryzen 2200G; with 1024 shaders it could try to compete with the non-Ti GTX 1050.***

***However, against 128-bit GDDR5 at 7000MHz (around 112GB/s) it would be difficult to compete, especially in titles with intensive memory use.

Regarding drivers, they still have many things to do: compatibility, freezes, complete OpenGL support including the AZDO extensions with conformance tests, the lack of a GUI, better support so AMD cards don't end up unsupported, and fixing errors like broken shaders (the Observer doors issue, among others).

For now many Nvidia users are waiting for RTG to improve, but on the hardware side Volta is getting closer.

^_^


Last edited by mrdeathjr on 19 January 2018 at 1:06 pm UTC
Avehicle7887 Jan 19, 2018
Just tested Race Driver: GRID - as I suspected, setting the graphics details to high is just as crash-happy as always. It seems that the way WINE works causes higher virtual memory usage compared to Windows, and once 32-bit games reach 4GB of virtual memory usage they crash.

If anyone has any idea what I'm doing wrong I would love to solve this mystery; here's a video showing the issue: https://drive.google.com/file/d/12ImU1RuMSGF9Jpen964cHntRVG9gJ-l5/view?usp=sharing

As you can see, the game had only been active for 13 minutes before it crashed due to running out of memory.
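For anyone wanting to watch the climb towards the 4GB ceiling themselves, here's a quick sketch for Linux that reads the kernel's /proc interface (the game's process name is whatever `pidof` reports for your title; the call at the end just checks the current shell as a demo):

```shell
# Print a process's virtual memory size in kB, read from /proc/<pid>/status
vmsize_kb() {
    awk '/^VmSize:/ { print $2 }' "/proc/$1/status"
}

# Demo: check this shell's own virtual size.
# For a game you'd use something like: vmsize_kb "$(pidof Grid.exe)"
vmsize_kb "$$"
```

Run it in a loop with `watch` to see the usage grow during a race.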


Last edited by Avehicle7887 on 19 January 2018 at 2:31 pm UTC
mrdeathjr Jan 19, 2018
Quoting: Avehicle7887Just tested Race Driver: GRID - as I suspected, setting the graphics details to high is just as crash-happy as always.

It seems that the way WINE works causes higher virtual memory usage compared to Windows, and once 32-bit games reach 4GB of virtual memory usage they crash.

If anyone has any idea what I'm doing wrong I would love to solve this mystery; here's a video showing the issue:

https://drive.google.com/file/d/12ImU1RuMSGF9Jpen964cHntRVG9gJ-l5/view?usp=sharing

As you can see, the game had only been active for 13 minutes before it crashed due to running out of memory.

Good data - Skyrim reports the same issue with many mods (in my case I only tested a few and it works OK).

In my case it works, but I use the custom settings shown in the video and set Windows XP as the Windows version to imitate in winecfg.

However, I use Xfce - do you use GNOME?

Did you report this behaviour to the Wine devs as a bug?

And does the game still crash with other settings?

Looking at the video, does the error cite OpenAL?

In my case I use a 32-bit compiled Wine* with all the OpenAL dependencies satisfied.

*In your case it seems you use a complete WoW64 build; I only use i386.

I only manage 2 prefixes: one for older 32-bit apps (wine i386) and the other for 64-bit apps (wine64).

Maybe you can try with an i386-only Wine.
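A sketch of how a two-prefix setup like that can be created, assuming a stock Wine install (the prefix paths are just examples; WINEARCH only matters the first time a prefix is initialised):

```shell
# Create one 32-bit and one 64-bit Wine prefix.
# winecfg initialises a prefix if it does not exist yet.
setup_prefixes() {
    WINEARCH=win32 WINEPREFIX="$HOME/.wine32" winecfg
    WINEARCH=win64 WINEPREFIX="$HOME/.wine64" winecfg
}
```

After that, a 32-bit game is run with `WINEPREFIX="$HOME/.wine32" wine game.exe`.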

^_^


Last edited by mrdeathjr on 19 January 2018 at 3:25 pm UTC
Avehicle7887 Jan 19, 2018
Quoting: mrdeathjrGood data - Skyrim reports the same issue with many mods (in my case I only tested a few and it works OK).

In my case it works, but I use the custom settings shown in the video and set Windows XP as the Windows version to imitate in winecfg.

However, I use Xfce - do you use GNOME?

Did you report this behaviour to the Wine devs as a bug?

And does the game still crash with other settings?

Looking at the video, does the error cite OpenAL?

In my case I use a 32-bit compiled Wine* with all the OpenAL dependencies satisfied.

*In your case it seems you use a complete WoW64 build; I only use i386.

Maybe you can try with an i386-only Wine.

^_^

I use the MATE desktop; just tried a 32-bit-only Wine - the game crashes at the first race. Could you please record a video with a window showing the virtual memory usage as you play?

EDIT:

I've tested the game on my laptop: Debian 9 running Xfce, Intel HD Graphics. Game behaves exactly as on my desktop and runs out of memory just as fast.


Last edited by Avehicle7887 on 19 January 2018 at 5:51 pm UTC
14 Jan 19, 2018
Quoting: Shmerl
Quoting: 14Is your Nvidia decline prediction based on any information outside of your own preference and this website?

It's quite simple. Nvidia will never reach AMD's level of integration, because they have no interest in opening and upstreaming their driver, and AMD has already caught up to Nvidia in performance. So once they also catch up in hardware (Vega 2 and Navi), Nvidia will have only disadvantages on Linux, and there will be an accelerating switch away from it.

In machine learning and servers AMD has advantages over Nvidia as well. Their hardware supports asynchronous compute, while Nvidia's doesn't. Also, Khronos is pushing a new converged API for graphics and compute that will combine Vulkan and OpenCL. That would basically undermine CUDA and the grip Nvidia has over the compute market, because there would be zero benefit in using CUDA over the new portable API. AMD is on the right track to unseat Nvidia in these markets.
You're right, so the only remaining piece for Nvidia is marketing and vendor ties, which counts for something.

Disclosure: I will very likely build my next machine as all or partly AMD. I just like to play devil's advocate so I can see a confident statement backed up. I've heard people say things like, "Amazon is ruined now." Big businesses can often take more than one hit.

Again, I'm not trying to be pro big business, just trying to be realistic.
Shmerl Jan 19, 2018
Quoting: 14You're right, so the only remaining piece for Nvidia is marketing and vendor ties, which counts for something.

Sure, but the main point is that AMD is now mostly competitive (except for TDP and high-end card availability, which will have to wait until 2019), so competition will be on fair terms and technical merits. And with Wayland and Mesa, AMD is a clear winner for Linux gamers.


Last edited by Shmerl on 19 January 2018 at 5:56 pm UTC
mrdeathjr Jan 19, 2018
Quoting: Avehicle7887I use MATE desktop, just tried 32bit only Wine - game crashes at first race. Could you please record a video with a window showing the Virtual Memory usage as you play?

EDIT:

I've tested the game on my laptop: Debian 9 running Xfce, Intel HD Graphics. Game behaves exactly as on my desktop and runs out of memory just as fast.

Hi, I finished the test - it didn't go past 3.8-3.9GB of virtual memory and the race finished without issues.

Once the video finishes uploading I'll put the link here.

This link has my GRID configuration, maybe you can test it:

https://mega.nz/#!3MNzBYBb!CwyChpcbbwsU-pOE70cx0UrqoQyRaFV-CDNX3uq4VOA

Almost forgot: I disable the post-process effects (I don't like them) by renaming effects.xml to effects.disable and effects_override.xml to effects_override.disable in the postprocess folder in the game directory.
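In script form that rename looks something like this (the install path at the bottom is just an example - point it at wherever your GRID prefix keeps the game):

```shell
# Disable GRID's post-process effects by renaming the two effects XML files
# in the game's postprocess folder; the game then skips loading them.
disable_postprocess() {
    dir="$1/postprocess"
    if [ -f "$dir/effects.xml" ]; then
        mv "$dir/effects.xml" "$dir/effects.disable"
    fi
    if [ -f "$dir/effects_override.xml" ]; then
        mv "$dir/effects_override.xml" "$dir/effects_override.disable"
    fi
}

disable_postprocess "$HOME/Games/GRID"   # example path
```

Rename them back (or verify the game files) to re-enable the effects.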

^_^


Last edited by mrdeathjr on 19 January 2018 at 6:09 pm UTC
Audi Jan 19, 2018
I am curious whether this will improve running Killing Floor 2 through Wine. Right now, DX11 has to be disabled via a startup switch; the graphics then don't look as good and the Steam Overlay doesn't work. I plan to give the update a try this weekend.