GPUBoss Review: Our evaluation of the AMD Radeon RX 580 vs the Nvidia GeForce GTX 1080

Gaming: Battlefield 3, Battlefield 4, Bioshock Infinite and 21 more
Graphics: T-Rex, Manhattan, Cloud Gate Factor, Sky Diver Factor and 1 more
Computing: Face Detection, Ocean Surface Simulation and 3 more
Performance per Watt: Battlefield 3, Battlefield 4, Bioshock Infinite and 32 more
Noise and Power: TDP, Idle Power Consumption, Load Power Consumption and 2 more

Overall Score: 8.5

Winner: Nvidia GeForce GTX 1080

GPUBoss recommends the Nvidia GeForce GTX 1080 based on its benchmarks and compute performance.

Differences: What are the advantages of each?

Reasons to consider the AMD Radeon RX 580

Higher memory bandwidth: 256 GB/s vs 224.4 GB/s (around 15% higher)
Significantly higher memory clock speed: 2,000 MHz vs 1,251 MHz (around 60% higher)

Reasons to consider the Nvidia GeForce GTX 1080

Significantly better PassMark score: 11,994 vs 7,781 (around 55% better)
Higher clock speed: 1,607 MHz vs 1,257 MHz (around 30% higher)
Significantly higher pixel rate: 102.8 GPixel/s vs 42.88 GPixel/s (around 2.4x higher)
Higher effective memory clock speed: 10,008 MHz vs 8,000 MHz (more than 25% higher)
Better floating-point performance: 8,228 GFLOPS vs 6,175 GFLOPS (around 33% better)
Higher texture rate: 257.1 GTexel/s vs 193 GTexel/s (around 33% higher)
Significantly more render output processors: 64 vs 32 (twice as many)
Significantly higher turbo clock speed: 1,733 MHz vs 1,340 MHz (around 30% higher)
Significantly better PassMark Direct Compute score: 8,106 vs 3,923 (more than 2x better)
Better Sky Diver Factor score: 516.48 vs 441.03 (more than 15% better)
Better face detection score: 147.18 mPixels/s vs 126.41 mPixels/s (more than 15% better)
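
For readers curious where the raw throughput figures above come from, the sketch below applies the standard back-of-the-envelope formulas (GFLOPS = 2 × shaders × clock, pixel rate = ROPs × clock, texture rate = TMUs × clock). The shader and TMU counts are the cards' published specs rather than numbers taken from this comparison, so treat them as assumptions; with them, the listed values fall out of the GTX 1080's base clock and the RX 580's boost clock.

```python
# Minimal sketch of how the theoretical throughput figures above are derived.
# Shader/TMU counts are the cards' public specs (assumed, not listed in the
# comparison itself); ROP counts and clocks come from the comparison above.

def gflops(shaders, clock_mhz):
    # FMA counts as 2 floating-point ops per shader per cycle
    return 2 * shaders * clock_mhz / 1000

def gpixels_per_s(rops, clock_mhz):
    return rops * clock_mhz / 1000

def gtexels_per_s(tmus, clock_mhz):
    return tmus * clock_mhz / 1000

# GTX 1080: 2560 CUDA cores, 64 ROPs, 160 TMUs, at its 1,607 MHz base clock
print(gflops(2560, 1607))        # ~8,228 GFLOPS
print(gpixels_per_s(64, 1607))   # ~102.8 GPixel/s
print(gtexels_per_s(160, 1607))  # ~257.1 GTexel/s

# RX 580: 2304 stream processors, 32 ROPs, 144 TMUs, at its 1,340 MHz boost clock
print(gflops(2304, 1340))        # ~6,175 GFLOPS
print(gpixels_per_s(32, 1340))   # ~42.9 GPixel/s
print(gtexels_per_s(144, 1340))  # ~193 GTexel/s
```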

Benchmarks: Real-world tests of the Radeon RX 580 vs the GeForce GTX 1080

Bitcoin mining Data courtesy CompuBench

Radeon RX 580
656.93 mHash/s
GeForce GTX 1080
685.15 mHash/s

Face detection Data courtesy CompuBench

Radeon RX 580
126.41 mPixels/s
GeForce GTX 1080
147.18 mPixels/s

Ocean surface simulation Data courtesy CompuBench

Radeon RX 580
2,199.63 frames/s
GeForce GTX 1080
1,721.2 frames/s

T-Rex (GFXBench 3.0) Data courtesy GFXBench

Radeon RX 580
3,288.94
GeForce GTX 1080
3,349.27

Manhattan (GFXBench 3.0) Data courtesy GFXBench

Radeon RX 580
3,713.52

Fire Strike Factor Data courtesy FutureMark

Sky Diver Factor Data courtesy FutureMark

Radeon RX 580
441.03
GeForce GTX 1080
516.48

Cloud Gate Factor Data courtesy FutureMark
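
To put the benchmark numbers in proportion, the short sketch below computes each card's percentage lead per test. Only figures listed on this page are used (the Sky Diver Factor scores come from the Differences section above).

```python
# Relative leads computed from the benchmark figures listed above (higher is better).
results = {
    "Bitcoin mining (mHash/s)":     {"RX 580": 656.93,  "GTX 1080": 685.15},
    "Face detection (mPixels/s)":   {"RX 580": 126.41,  "GTX 1080": 147.18},
    "Ocean surface sim (frames/s)": {"RX 580": 2199.63, "GTX 1080": 1721.2},
    "T-Rex (GFXBench 3.0)":         {"RX 580": 3288.94, "GTX 1080": 3349.27},
    "Sky Diver Factor":             {"RX 580": 441.03,  "GTX 1080": 516.48},
}

for test, scores in results.items():
    winner = max(scores, key=scores.get)
    loser = min(scores, key=scores.get)
    lead = (scores[winner] / scores[loser] - 1) * 100
    print(f"{test}: {winner} leads by {lead:.1f}%")

# Roughly: mining +4%, face detection +16%, Sky Diver +17% for the GTX 1080;
# ocean surface simulation +28% for the RX 580; T-Rex is within ~2%.
```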

Comments

Showing 14 comments.
Yeah, now the RX 580 is about the same price as a 1080; add a bit on top and just get a 1080 Ti, which is around 50% better than the RX 580 for gaming.
That was when they came out; now the RX 580 is about the same price as the 1080, so it makes more sense to get one of these, or even the Ti version, which is slightly more expensive.
The main reasons for the top-end prices are yield and new-tech manufacturing costs. Yield: they go for a 36-core Xeon, but if there's a bad core in it, they toss it into a lower bin and sell it as a cheaper part. Obviously you need to profit from that, not only to keep investors off your back but to reinvest for the next manufacturing node, which, with silicon at least, is reaching the end of the road (5nm will be it, if not 7nm). Every new node requires a new factory, and the price of that doubles every time ($3B for 14nm). That's part of the reason they had to add an extra tock to their tick-tock cycle: 10nm was harder than they thought it would be, in addition to the cost (they fried a $50M UV laser litho machine in the attempt; 10nm plants will probably be $5-7B to build and equip). Going from silicon to exotic substrates like gallium arsenide won't help much; .999999-pure crystals of that stuff are 5,000x the cost of silicon (which is why solar panels are seldom made from it even though they are 2x as efficient). Graphene is probably even more, and that's 10-20 years away from prime time (the bigger the sheet, the harder it is to make without defects, which is why they are all microscopic in size, never mind the 12" size of a current wafer). In short, I don't see Intel stockholders getting rich at the expense of everyone else. In fact, compared to their peak around 7-1-2000, shares have lost half their value, even ignoring inflation. So I don't buy the common thinking that they are charging a high price just because they can. If that were true, even the cheapest CPUs would cost $2k to keep the stock price ahead of inflation, and AMD would probably only slightly undercut that.
You've got it backwards: it's not that AMD lacks quality and that's why they sell cheap; rather, it's Intel and Nvidia that are WAY overpriced and ripping us off. Intel is at the top, sure, but their tech is not that revolutionary; they haven't innovated in years because they got fat and lazy, and they think they can charge $1,000+ for a CPU just because they can. Now that AMD is releasing new tech, Intel is clenching its butt cheeks and trying to release something new. Do you want to support companies like these?
Recently there have been only a few RX 580s for 2000,- in my country, while 1080s are available in every shop, even for 2200,- all because of fuc*&^ miners ;p
That's just it: AMD hasn't been better than Intel since 2005 with their top-end FX CPU, and that was $1,000 at the time (my last AMD build). It's doubtful they'd sell a part for 1/4 the price if performance were comparable; that would just be a stupid business decision. At best it might be undercut by 1%, if they decided to go toe-to-toe in a price war and both parts had equal performance.
Like the saying goes, a fool and their money are soon parted. I pay for performance, not for brand names; if an AMD part has 70% of the performance of an Intel/Nvidia part at 1/4 the price, I will get the AMD part, like everyone with a fully functioning brain should.
It can hold its own at stock. But seeing as most people dropping $500 on a CPU will be overclocking, it is held back by its AWFUL overclocking ability, to the point where you are better off with an Intel hexacore in most environments.
The GTX 1080 wins; why go for the RX 580?
Ryzen only has 24 PCIe lanes (vs. LGA 2011's 40), which could be a dealbreaker depending on what else you want to stick on the mobo. Like I said, you get what you pay for.
Except that AMD's best $500 8-core Ryzen can hold its own vs Intel's $1600 10-core i7 and makes mincemeat of the rest. But hey, if you like paying more just for the sake of paying more... Isn't it painful to live with a stick lodged up your ass, though?
You get what you pay for. Like everything else AMD, it's a budget component meant for poor people (i.e. starving artists and students). The last AMD CPU I owned was from when they were actually better than Intel's best offering (a $1,000 FX-5950 back in 2005). This review is like comparing AMD's best $500 8-core Ryzen vs Intel's $1,600 10-core i7. AMD is the best bang for the buck, for sure, if you don't need nukes but can make do with cap guns. Obviously, if CPUs and video cards could be paralleled to infinity, with the power bill the only limit, AMD would be the way to go. Sadly that's not the case: I've never seen a mobo with more than 4 sockets and 8 PCIe x8 card slots, and at a ridiculous price tag, since all of those are server-grade/priced, which kills the cheaper CPU price advantage. And don't compare cheaper dual cards vs single bigger ones: SLI/CrossFire doesn't come without CPU overhead, and two cards in SLI are not 2x faster than a single card. It's more like 1.8x, assuming ideal conditions with no bottlenecks. Returns diminish fast with triple and quad card setups (2.6x and 3.3x or so, depending on the game and drivers).
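
To make the scaling point in the comment above concrete, here is a quick sketch of performance per dollar as cards are added. The scaling factors (1.8x / 2.6x / 3.3x) are the commenter's rough estimates, and the prices are hypothetical placeholders, not figures from this page.

```python
# Quick worked example of the multi-GPU scaling argument above.
# Scaling factors are the commenter's rough estimates; prices are hypothetical.
scaling = {1: 1.0, 2: 1.8, 3: 2.6, 4: 3.3}

def perf_per_dollar(single_card_perf, card_price, num_cards):
    effective_perf = single_card_perf * scaling[num_cards]
    return effective_perf / (card_price * num_cards)

# Hypothetical card: relative performance 100, $250 per card
for n in (1, 2, 3, 4):
    print(n, "card(s):", round(perf_per_dollar(100, 250, n), 3))

# Output: 0.4, 0.36, 0.347, 0.33 -- perf per dollar falls with each added card,
# i.e. multi-GPU never scales linearly with cost, before even counting CPU overhead.
```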
For the price... definitely. But don't ever compare two lower GPUs with one higher GPU.
Money should be an obvious factor when considering these cards. I'm an Nvidia fan, but if the 1080 is more than twice the price of the AMD 580, then the AMD is obviously better. Two 580s would be cheaper than a 1080, and better.