GPUBoss Review: our evaluation of the AMD Radeon R9 270 vs the Nvidia GeForce GTX 760 among desktop GPUs

Gaming

Battlefield 3, Battlefield 4, Bioshock Infinite and 21 more

Graphics

T-Rex, Manhattan, Sky Diver Factor and Fire Strike Factor

Computing

Face Detection, Ocean Surface Simulation and 3 more

Performance per Watt

Battlefield 3, Battlefield 4, Bioshock Infinite and 31 more

Value

Battlefield 3, Battlefield 4, Bioshock Infinite and 31 more

Noise and Power

TDP, Idle Power Consumption, Load Power Consumption and 2 more
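The Performance per Watt category above is, in essence, benchmark framerate divided by board power draw. A minimal sketch of the idea; the FPS figures below are hypothetical illustrations (not measurements from this review), while the 150 W / 170 W values are the two cards' rated TDPs from this comparison:

```python
def perf_per_watt(avg_fps, load_watts):
    # Frames rendered per second for each watt of board power drawn
    return avg_fps / load_watts

# Hypothetical, equal-FPS example: at the same framerate, the card with
# the lower power draw scores higher on performance per watt
r9_270_ppw  = perf_per_watt(avg_fps=60.0, load_watts=150.0)  # R9 270 TDP
gtx_760_ppw = perf_per_watt(avg_fps=60.0, load_watts=170.0)  # GTX 760 TDP
```

At identical framerates the metric reduces to a comparison of power draw, which is why the TDP gap shows up in this category.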

Overall Score: 6.6

Winner: AMD Radeon R9 270

GPUBoss recommends the AMD Radeon R9 270 based on its noise and power.



Differences: What are the advantages of each?


Reasons to consider the
AMD Radeon R9 270

Much higher Metro: Last Light framerate: 90 FPS vs 29.4 FPS (more than 3× higher)
Lower TDP: 150 W vs 170 W (more than 10% lower)

Reasons to consider the
Nvidia GeForce GTX 760

Higher clock speed: 980 MHz vs 900 MHz (around 10% higher)
Higher texture rate: 94.1 GTexel/s vs 74 GTexel/s (more than 25% higher)
Higher turbo clock speed: 1,032 MHz vs 925 MHz (more than 10% higher)
More texture mapping units: 96 vs 80 (16 more)
Higher memory clock speed: 1,502 MHz vs 1,400 MHz (more than 5% higher)
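The relative differences quoted in the two lists above follow from simple percentage arithmetic; a minimal sketch, using the spec values from this comparison:

```python
def pct_higher(a, b):
    """Percent by which value a exceeds value b."""
    return (a - b) / b * 100.0

# GTX 760 advantages (values from the comparison above)
texture_rate = pct_higher(94.1, 74.0)  # GTexel/s -> ~27%, "more than 25% higher"
core_clock   = pct_higher(980, 900)    # MHz      -> ~8.9%, "around 10% higher"
turbo_clock  = pct_higher(1032, 925)   # MHz      -> ~11.6%, "more than 10% higher"
mem_clock    = pct_higher(1502, 1400)  # MHz      -> ~7.3%, "more than 5% higher"

# R9 270 advantages: its 150 W TDP is (170 - 150) / 170 ~= 11.8% lower
# than the 760's, and 90 / 29.4 ~= 3.06x the Metro: Last Light framerate
tdp_saving  = (170 - 150) / 170 * 100.0
metro_ratio = 90 / 29.4
```

Note that "X% lower" is computed against the larger value, while "X% higher" is computed against the smaller one, which is why the two directions give slightly different percentages for the same pair of numbers.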

Benchmarks: Real-world tests of the Radeon R9 270 vs the GeForce GTX 760

Battlefield 4

Crysis 3

Metro: Last Light

FarCry 3

Reviews: Word on the street

Radeon R9 270 vs GeForce GTX 760: 9.2 vs 9.0

On the Radeon R9 270: "With a retail price of $185, the card is just $5 more expensive than the AMD reference design."

Comments

Showing 12 comments.
Is the temperature limit of AMD cards still 70°C? If so, my GTX 760 runs hotter than 70°C under full load. But that card's temperature limit is 100°C...
I can't get a higher clock speed...
brian blair, you suck at knowing the cards. AMD is for low-budget gamers who want high performance for a low price, and Nvidia does a great job on quality; Nvidia cards also last longer. That's why so many people go for AMD and not Nvidia. But overall Nvidia has the most powerful cards, and they come with high prices.
Goes to show the 760, even without the overclock, is still superior.
(And what's your favorite GPU maker? ...I have a Sapphire R9 270)
Mate, it is true; they use heaps more power, which puts out more heat. Yes, Nvidia has had hot cards too, but that was ages ago now. Yes, AMD are good value in dollars per performance, but I doubt they would last as long with the higher heat output. No need to be an AMD fanboy; I'm not an Nvidia fanboy, I'm just stating the facts!
Why, because someone told you? Both cards are of good enough quality, and AMD has a reputation just as good as Nvidia's. As for AMD running hotter, that's nothing but a myth; both Nvidia and AMD have had problems with hot-running cards. My R9 270 runs at the exact same temps as my old 650 Ti, and the 650 Ti hardly used any watts. So it really depends on who makes the card for AMD or Nvidia, and the rest is nothing but a placebo effect, which everyone today seems to get a lot of. My favorite GPU maker is whoever makes the best bang for the buck at the time; people should not get hung up on brands, because it does nothing but cost them in the long run.
My GPU-Z with an Nvidia GTX 760 without the overclock: http://i.imgur.com/KPK67CL.png
Nvidia has made a name for itself as the better-quality brand: its cards last longer and don't run as hot.
I did; I managed to bump the clock speed in GPU Tweak to 1063 MHz stable. I can make it go to 1071 MHz but haven't tested that yet. Here's my GPU-Z: http://i.imgur.com/mio0AJq.png
Hey, did you buy the R9 270?
For over $60 less, I would have to say the logical choice is the R9 270 over the GTX 760. I can't understand why Nvidia is so greedy with its prices; it was the opposite last generation.