Is overclocking a good idea? Also, which software for AMD GPUs?
TimeFreeze Aug 15, 2022
So first off, sorry if this is the wrong category.

Is overclocking in general a good idea? I mean no huge overclocks, just tiny ones where I'd feel safe enough that it won't hurt anything. Are there any guidelines for overclocking a CPU or GPU? Like, is it "safe to assume" you can go at least 100 MHz higher on everything, or something like that?

And which software exists on Linux for AMD GPUs? For Nvidia there was GreenWithEnvy, I think, but that didn't work for me anyway; maybe my GPU was too ancient for it, or I just didn't know how to use it. Back on Windows it was "easy" with MSI Afterburner, I think. Anyway, which software would be good for that? CoreCtrl? I think I've heard that one is good, though I'm not sure.

Thanks in advance for any answer.
peta77 Aug 15, 2022
In general (applies to CPU and GPU):
- overclocking increases power consumption, so the chip gets hotter, which can significantly shorten its lifetime; check your cooling and monitor the temperature
- there's no general rule for how much overclocking is OK, it heavily depends on the individual chip; depending on where on the wafer it was cut, it may be higher quality and overclock further... so even two chips of exactly the same model, stepping and revision can have different limits
- the biggest problem is usually computation errors: a result is expected to be ready within a fixed clock interval, no matter how much of that interval the computation actually needs; when there's time left, the chip just waits. That margin differs per type of operation (integer add, float multiply, etc.), so the "slowest operation" sets the limit, and nobody knows which one that is. If you set the clock too high, you get wrong results because an operation didn't complete, which leads to strange behaviour, data corruption and crashes. So test it on something you won't miss, e.g. a live system plus lots of benchmarks (see the sketch below), and don't trust your real installation to it before you're 100% sure it works.
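A quick way to do that kind of testing, as a rough sketch (it assumes stress-ng and lm-sensors are installed; both are in most distro repos):

    # load all CPU cores for 10 minutes and report basic metrics
    stress-ng --cpu 0 --timeout 10m --metrics-brief
    # in a second terminal, watch temperatures refresh every second
    watch -n 1 sensors

If temperatures climb towards the chip's limit, or the benchmark reports failures, back the clock off before trusting the machine with anything that matters.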
Valck Aug 15, 2022
To add another aspect to what peta said:

I've never been much of an overclocker, so take my words with a grain of salt, but from what I see and read, the margins for overclocking have continuously shrunk over time.

Chip makers have greatly improved their testing and selection methods ("binning"), where chips with lower margins get sold as lower performance products, and vice versa.
If one chip has better performance potential than the one next to it, it gets "binned" into a higher tier and sold as such, which means *all* chips of one tier perform very similarly and have only minuscule potential for further overclocking.

Unless you're in it for the fun, or for the purely educational or competitive aspects (some people do it as a kind of sport ;) ), I don't think there is much to be gained, versus a significant potential for wasted time and money.
If you want more performance, IMO it's better to spend a little more on a quality version of that bargain graphics card in the same tier, or opt for the next higher tier model (think 6600 XT versus 6600, or 6700 versus 6600 XT, and so on).
Again, unless you do it for the fun of it, is it worth potentially tens of hours of your time? Or the risk of having to buy a new graphics card because it died? And if so, could you afford to buy the same card again?

Of course it may look very different if your source of income is your parents' monthly allowance... or maybe not; you might not want to risk losing what you already have for marginal gains.
tfk Aug 16, 2022
I've played with undervolting a bit but nowadays I don't really care about it.

I used a tool called Radeon-profile, but I believe CoreCtrl can do it too.
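For what it's worth, my understanding is that these tools sit on top of the amdgpu driver's sysfs interface, which you can also poke at directly. A minimal sketch for reading it (the card0 index and the amdgpu.ppfeaturemask=0xffffffff kernel parameter needed to unlock manual control are assumptions about a typical setup):

    # show the current clock/voltage table and the allowed ranges
    cat /sys/class/drm/card0/device/pp_od_clk_voltage
    # show the available GPU core clock states
    cat /sys/class/drm/card0/device/pp_dpm_sclk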

When I want more performance, I press "e" at the GRUB boot screen and add "mitigations=off" to the kernel command line, which disables all CPU security mitigations.
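If you want that permanently instead of typing it every boot, the usual approach is something like this (a sketch for standard GRUB setups; the regeneration command varies by distro):

    # in /etc/default/grub
    GRUB_CMDLINE_LINUX_DEFAULT="quiet mitigations=off"

    # then regenerate the config, e.g. on Arch:
    sudo grub-mkconfig -o /boot/grub/grub.cfg
    # or on Debian/Ubuntu:
    sudo update-grub

Just be aware you're trading away the CPU security mitigations for that extra speed.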
WorMzy Aug 16, 2022
I personally haven't had much use for overclocking since my Core 2/Wolfdale days. Back then squeezing that little bit extra was imperative to the smooth running of my Dwarf Fortress games once they got past a certain size.

These days I run on AMD, and while I haven't overclocked in 10-15 years, I like having the option there if I ever need to.
denyasis Aug 17, 2022
Remember when you had to draw on the CPU in pencil to overclock? Lol

I dabbled on my last machine, a Core 2 Duo. Other than being a genuinely fun learning experience, there wasn't a huge performance benefit, maybe a percent or so; I couldn't tell the difference. I've never tried it on a GPU.

Not saying it was related at all, but that computer died in 2018. Smoke, the smell of burning plastic, you get the idea.
Forge Aug 17, 2022
So I started overclocking in the days when your CPU speed was set via jumpers on the board, and you could set your 386/486 to any speed you wanted, and if it crashed, it was your problem. In those days there was tons of free speed to be had, lots of CPUs running at double the rated speed without any issues.

In later years you got locked multipliers, but you could still get free speed by raising the FSB speed. I ran some Celeron 366s at 550 for a year or two, and some Pentium III 600E@800EB for a while after that. Rock solid stable. After that, Athlons let you change the multiplier again, and then we started to get into modern-ish stuff. The Core 2 era brought reduced CPU clocks at idle, but it was still possible to mess with the *top* speeds and bins for a benefit.

Thing is, currently both Intel and AMD are already doing "overclocking" by the old standards. Instead of selling a CPU with a single official maximum speed, they sell CPUs with an official base speed and an official turbo limit, and shape behaviour more by cooling capacity. Current CPUs can be "overclocked" just by putting better CPU/GPU cooling in. My CPU and GPU are on a custom water loop, so they generally go to max boost/turbo as soon as there's load, and they stay there till the load goes away, since the cooling can outrun even the turbo bins.

There are still a few percent you can squeeze out, because nobody sells a CPU or GPU that turbos to the maximum possible speed; they go to the maximum consistently reachable speed. There will be some outliers that can outrun the turbo clocks, but they're rare.
h1ght Oct 17, 2022
Nowadays it doesn't make much sense to overclock CPUs anymore. If you have an older Intel CPU from before the 9xxx series, like Coffee Lake/Skylake/Kaby Lake/Haswell, you can delid it, apply liquid metal and run it at 5 GHz, for example, but nowadays most CPUs already run at their thermal limit.
GPUs are different: maybe you get a slight increase in performance, but I doubt it will feel like a new card. And if you don't have a beefy GPU cooler, there isn't much headroom to work with.
Undervolting/underclocking is better; it's quieter, too.
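If anyone wants to try undervolting without a GUI, here's a rough sketch of the sysfs route on an RX 6000-series card (the -25 mV offset is only an example value, it assumes manual control was unlocked with amdgpu.ppfeaturemask, and older generations use a different syntax, so check your card first):

    # apply a -25 mV voltage offset to the GPU core
    echo "vo -25" | sudo tee /sys/class/drm/card0/device/pp_od_clk_voltage
    # commit the change
    echo "c" | sudo tee /sys/class/drm/card0/device/pp_od_clk_voltage

CoreCtrl exposes the same controls with sliders if you'd rather not echo into sysfs by hand.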
Kuduzkehpan Oct 19, 2022
Overclocking was fun and good back in the era of AMD Socket 939 CPUs, when we were doing hardware OC. It was dangerous though; it could leave you with a burnt CPU on your hands.
Nowadays I use BIOS options to push the limits of my CPU and RAM, but what I gain is a small performance impact and a big satisfaction. You feel like your PC became a supercomputer, while in reality it was only a 1 GHz difference on the CPU and 1.4 GHz on the RAM.
You also need a good liquid cooler for hardware safety.
I recommend using BIOS settings for OC; otherwise, don't bother with the whole thing.

p.s.: install Linux on an SSD and you have the most overclocked PC ever. ;)

Last edited by Kuduzkehpan on 19 October 2022 at 6:04 pm UTC