GPUBoss Review: Our evaluation of the AMD Radeon R9 390X vs the Nvidia GeForce GTX 980 among desktop GPUs

Gaming
Battlefield 3, Battlefield 4, Bioshock Infinite and 21 more

Benchmarks
T-Rex, Manhattan, Cloud Gate Factor, Sky Diver Factor and 1 more

Compute
Face Detection, Ocean Surface Simulation and 3 more

Performance per Watt
Battlefield 3, Battlefield 4, Bioshock Infinite and 32 more

Noise and Power
TDP, Idle Power Consumption, Load Power Consumption and 2 more

Overall Score

AMD Radeon R9 390X

GPUBoss recommends the AMD Radeon R9 390X based on its benchmarks.


Differences: What are the advantages of each?


Reasons to consider the
AMD Radeon R9 390X

Significantly more memory: 8,192 MB vs 4,096 MB (2x more memory)
Better floating-point performance: 5,914 GFLOPS vs 4,981 GFLOPS (around 20% better)
Much wider memory bus: 512-bit vs 256-bit (2x wider)
More shading units: 2,816 vs 2,048 (768 more)
More texture mapping units: 176 vs 128 (48 more)
Higher texture rate: 184.8 GTexel/s vs 155.6 GTexel/s (around 20% higher)
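The GFLOPS and texture-rate entries above are not independent measurements; they follow arithmetically from the shader/TMU counts and clock speed. A minimal sketch of that arithmetic (the helper names are my own, and the assumptions are 2 FLOPs per shading unit per cycle and one texel per TMU per cycle, which hold for both of these architectures):

```python
# Derive the quoted throughput figures from unit counts and core clock.

def gflops(shading_units: int, clock_mhz: float) -> float:
    """Peak single-precision GFLOPS, assuming 2 FLOPs (one FMA) per unit per cycle."""
    return shading_units * 2 * clock_mhz / 1000

def texture_rate(tmus: int, clock_mhz: float) -> float:
    """Peak texture fill rate in GTexel/s (one texel per TMU per cycle)."""
    return tmus * clock_mhz / 1000

r9_390x_flops = gflops(2816, 1050)       # -> 5913.6, quoted as 5,914 GFLOPS
r9_390x_tex   = texture_rate(176, 1050)  # -> 184.8 GTexel/s, as quoted
# The 980's quoted figures only work out at its boost clock (~1,216 MHz,
# not listed above), rather than the 1,127 MHz base clock:
gtx_980_flops = gflops(2048, 1216)       # -> 4980.7, quoted as 4,981 GFLOPS
gtx_980_tex   = texture_rate(128, 1216)  # -> 155.6 GTexel/s, as quoted
```

This also explains why the 390X leads both metrics despite its lower clock: its 37.5% advantage in shading units and TMUs outweighs the 980's clock-speed edge.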

Reasons to consider the
Nvidia GeForce GTX 980

Higher effective memory clock speed: 7,012 MHz vs 6,000 MHz (more than 15% higher)
Better 3DMark Vantage graphics score: 44,693 vs 40,697 (around 10% better)
Slightly higher clock speed: 1,127 MHz vs 1,050 MHz (more than 5% higher)
Higher pixel rate: 77.82 GPixel/s vs 67.2 GPixel/s (more than 15% higher)
Higher memory clock speed: 1,753 MHz vs 1,500 MHz (more than 15% higher)
Better Cloud Gate factor score: 22.51 vs 21.17 (more than 5% better)
Significantly lower TDP: 165 W vs 275 W (40% lower)
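The memory-clock and bus-width entries on the two lists combine into a number neither list states: peak memory bandwidth, which is bus width times effective data rate. A quick sketch using only the figures quoted here (the helper name is my own; the one assumption is GDDR5's 4x multiplier between real and effective clock, which matches the 1,500 MHz / 6,000 MHz pair above):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
# GDDR5 assumption: effective clock = 4 x real clock, e.g. 1,500 -> 6,000 MHz.

def bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz / 1000

r9_390x = bandwidth_gbs(512, 6000)  # -> 384.0 GB/s
gtx_980 = bandwidth_gbs(256, 7012)  # -> 224.4 GB/s
# The 980's ~17% higher effective memory clock does not offset the 390X's
# 2x wider bus: the 390X ends up with ~71% more peak bandwidth.
print(round(r9_390x / gtx_980, 2))  # -> 1.71
# The TDP line item is the same kind of arithmetic:
# (275 - 165) / 275 = 0.4, i.e. the 980's rated TDP is 40% lower.
```

In other words, each card's "reasons to consider" list highlights one side of the same trade-off: the 980 wins on clock speed and efficiency, the 390X on raw width.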

Benchmarks: Real-world tests of the Radeon R9 390X vs the GeForce GTX 980

Fire Strike Factor (data courtesy Futuremark)

Sky Diver Factor (data courtesy Futuremark)

Cloud Gate Factor (data courtesy Futuremark)


Battlefield 4

Bioshock Infinite

Crysis 3

Metro: Last Light

Reviews: Word on the street



An 8 GB 980?... I have never seen an 8 GB version... Where is it?
I agree TDP is an important factor that shouldn't be overlooked because of its heat-related issues. That is definitely more relevant to me as an overclocker than the argument about power consumption on the power bill.
Fair enough, I just believe it's fair to judge a product based on its TDP in comparison with its direct competitor, and not as silly/pointless as you made it sound.
Not true for custom loops and any decent AIO coolers. Most radiator and case fans aren't PWM fans, which means they run at a constant RPM regardless of temperature. Constant RPM produces constant noise, and if you use fans such as Noctua's, you wouldn't hear any noise anyway. The 980 is superior to the 390X. However, there is the is/ought gap: just because it is, doesn't entail that we ought to get it. At the time of my original comment, the 980 cost about $150-200 USD more than the 390X but offers less than 5% performance headroom for that cost; sometimes the 980 even has worse fps than the 390X. Thus, I didn't recommend it. source:,16.html
It does make more noise in order to stay cool. More heat inside a PC case means more noise from the other fans, too. You were the one claiming people who enjoy efficient technology are living in their basements, haha. The R9 390X is almost a year newer than the GTX 980, barely manages to beat it, and uses more power. There is no argument as to which card is superior.
That's a strawman fallacy. I never made the claim that higher power consumption doesn't produce more heat. My argument was about energy bill costs and nothing else. Everyone knows higher power consumption produces more heat. However, it DOES NOT necessarily make more noise. That largely depends on your cooler. An example would be the use of a custom loop; there wouldn't be any noise difference between AMD and Nvidia cards in that case. Lastly, Intel/Nvidia are easily affordable, but that doesn't mean I enjoy paying more for them. There are other things in life that take priority, travelling for one. Spending top dollar on a gaming rig that gets outdated every 2-3 years would be a poor decision.
More power usage == more heat == more noise. Don't make stupid arguments simply because you are too poor for the Nvidia/Intel master race.
Well I'd certainly hope a gtx 980 used less power than the dual-gpu 295x2. Strange comparison.
I have had my AMD 7700 for 4 years and it's still going strong.
Bro, so what you're telling me is I can get my 970 with the Windforce x3 cooler to 1.96 GHz!! Haha, yeah right. I think you mean more like 10-25%; mine is currently 18% overclocked. I prefer more power consumption, as I use my PC to heat my room in winter instead of using gas or lighting the fire and burning wood. If people take all this into consideration, AMD is most likely the better choice. My 970 barely gets warm (not happy with it, worst purchase to date), so I'm getting a 390X for heater purposes and DX12. Don't get me wrong, in DX11 and below Nvidia WILL rule, but until they get their shit together, their DX12 performance is going to lose them customers and money.
MY NIGGA <3 <3 <3
Also, might I add, after a few driver updates even accessing that top-end RAM barely takes a dip if you have a somewhat decent-sized page file. Just saying.
I thought that AMD fanboys were always arguing price-to-performance. At least I always have. o_o Also, someone said something about keeping a video card for 5 years. In the past 2 years I've gone through 3 GPUs (AMD, Nvidia, AMD), because hey, I love price for performance; I don't care about nonsensical arguments. The FPP > all arguments.
Maybe in your dreams :D
Meh, I use it for crunching, and Nvidia sucks when it comes to PT. AMD fanboy; who cares about energy consumption when you're mining anyway.
That 3.5 GB of VRAM is going to become a bottleneck sooner or later.
That's speculation. It is only known that the software did cause some cards to run at 20% fan speed, but it was also patched immediately last year. Tens of thousands is a made-up number; not a single credible site, such as AnandTech, Guru3D, or Tom's Hardware, gave a confirmed number of users affected. I personally had 4 AMD cards last November, including the hottest reference R9 290, and not a single one overheated. Yet you still can't deny the increased performance brought by Crimson, and those numbers are confirmed.
And Crimson melted about 10 thousand new GPUs due to an error with the fan control being locked to a user-defined 30%.
Not dead RAM, just slow RAM.
And I have a Rage 128 and an HD 4870 that still run. Your point?
Lmao, AMD fanboys have been using energy consumption as their prime argument for years, and now that Nvidia is more energy efficient, it suddenly doesn't matter. AMD releases their stock cards with clocks near their thermal limits, so they can't be overclocked well but at least compete with stock Nvidia cards. But you can overclock most Nvidia cards these days by 30-40%, drastically pushing them ahead of any AMD card in that range, even OC'd.
You're right. I just pointed out that it performs better, but if you really want to experience 4K gaming you will need 2 cards.
You must have had bad setting options for the last 7 years. I pity you a bit. Your statement proves nothing because you're rather the exception. More than that, if you check different discussions you'll see people tend to change their cards quite often.
You might be right to wait a bit for 4K gaming (on a single card), but I was focused on Master's assumption that AMD cards won't last more than 2 years, not on 4K gaming. I mentioned that only to point out what this card is made, or rather tries to be made, for. Master focused on the life of a card. I'm a happy user of a 7970 6 GB, and for 1080p gaming it's a decent card. It has been working fine while I've had it, more than 3 years already.
I just upgraded from my 9600 GT from 2007 to a 980 Ti. Wait, who doesn't keep a GPU for five years? Everyone?