GPUBoss Review Our evaluation of the R7 260X vs the GTX 750 among Desktop GPUs


Battlefield 3, Battlefield 4, Bioshock Infinite and 21 more


T-Rex, Manhattan, Cloud Gate Factor, Sky Diver Factor and 1 more


Face Detection, Ocean Surface Simulation and 3 more

Performance per Watt

Battlefield 3, Battlefield 4, Bioshock Infinite and 32 more



Noise and Power

TDP, Idle Power Consumption, Load Power Consumption and 2 more


Overall Score

Nvidia GeForce GTX 750 

GPUBoss recommends the Nvidia GeForce GTX 750 based on its benchmarks and noise and power.



Differences What are the advantages of each

Front view of Radeon R7 260X

Reasons to consider the
AMD Radeon R7 260X

Higher effective memory clock speed 6,500 MHz vs 5,012 MHz Around 30% higher effective memory clock speed
More memory 2,048 MB vs 1,024 MB 2x more memory
Better floating-point performance 1,971.2 GFLOPS vs 1,044.5 GFLOPS Around 90% better floating-point performance
Higher clock speed 1,100 MHz vs 1,020 MHz Around 10% higher clock speed
Higher memory bandwidth 104 GB/s vs 80.2 GB/s Around 30% higher memory bandwidth
Higher texture rate 61.6 GTexel/s vs 32.6 GTexel/s Around 90% higher texture rate
Significantly higher memory clock speed 1,625 MHz vs 1,253 MHz Around 30% higher memory clock speed
More shading units 896 vs 512 384 more shading units
More texture mapping units 56 vs 32 24 more texture mapping units
Better Manhattan score 3,715.1 vs 1,745.62 Around 2.2x better Manhattan score
Better Fire Strike Factor score 33.7 vs 28.99 More than 15% better Fire Strike Factor score
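The derived figures in the list above (floating-point performance, texture rate, memory bandwidth) follow from the raw shader/TMU counts and clock speeds by the standard formulas for GPUs of this generation. A minimal sketch that reproduces them; the 128-bit memory bus width is an assumption, since this page does not list it:

```python
# Re-deriving the theoretical specs quoted above from raw counts and clocks.
# Assumption: both cards use a 128-bit memory bus (not listed on this page).

def derived_specs(shaders, tmus, core_mhz, mem_effective_mhz, bus_bits=128):
    return {
        # Single-precision FLOPS: 2 ops per shader per clock (fused multiply-add)
        "gflops": 2 * shaders * core_mhz / 1000,
        # Texture fill rate: one texel per TMU per clock
        "gtexel_s": tmus * core_mhz / 1000,
        # Memory bandwidth: effective clock x bus width, bits -> bytes
        "gb_s": mem_effective_mhz * bus_bits / 8 / 1000,
    }

r7_260x = derived_specs(shaders=896, tmus=56, core_mhz=1100, mem_effective_mhz=6500)
gtx_750 = derived_specs(shaders=512, tmus=32, core_mhz=1020, mem_effective_mhz=5012)

print(r7_260x)  # gflops 1971.2, gtexel_s 61.6, gb_s 104.0 -- matches the list above
print(gtx_750)  # gflops 1044.48, gtexel_s 32.64, gb_s 80.192
```

The derived values line up with the 1,971.2 vs 1,044.5 GFLOPS, 61.6 vs 32.6 GTexel/s, and 104 vs 80.2 GB/s figures quoted above.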
Front view of GeForce GTX 750

Reasons to consider the
Nvidia GeForce GTX 750

Higher Thief framerate 24 vs 7.7 More than 3x higher Thief framerate
Lower TDP 55W vs 115W 2.1x lower TDP
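The GTX 750's 2.1x lower TDP is what drives its performance-per-watt score. A back-of-the-envelope sketch, dividing the theoretical GFLOPS quoted in this comparison by TDP (a crude proxy; the site's real score comes from measured benchmarks):

```python
# Crude performance-per-watt proxy: theoretical GFLOPS divided by TDP.
# GFLOPS and TDP figures are the ones quoted in this comparison.

def gflops_per_watt(gflops, tdp_watts):
    return gflops / tdp_watts

r7_260x = gflops_per_watt(1971.2, 115)  # ~17.1 GFLOPS/W
gtx_750 = gflops_per_watt(1044.5, 55)   # ~19.0 GFLOPS/W
print(round(r7_260x, 1), round(gtx_750, 1))
```

Even on this crude measure the GTX 750 comes out ahead per watt, despite losing on every raw throughput spec.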

Benchmarks Real world tests of Radeon R7 260X vs GeForce GTX 750

Bitcoin mining Data courtesy CompuBench

Radeon R7 260X
222.35 MHash/s
GeForce GTX 750
69.81 MHash/s

Face detection Data courtesy CompuBench

Radeon R7 260X
43.73 MPixels/s
GeForce GTX 750
42.74 MPixels/s

T-Rex (GFXBench 3.0) Data courtesy GFXBench

Radeon R7 260X
GeForce GTX 750

Manhattan (GFXBench 3.0) Data courtesy GFXBench

Radeon R7 260X
3,715.1
GeForce GTX 750
1,745.62

Fire Strike Factor Data courtesy Futuremark

Radeon R7 260X
33.7
GeForce GTX 750
28.99

Sky Diver Factor Data courtesy Futuremark

Battlefield 4

Reviews Word on the street


Showing 14 comments.
It has been this way for 4 years. I own 1x Zotac GTX 750 Ti 2GB, 2x Asus/MSI GTX 750 1GB, and 1x Asus Radeon R7 260X 2GB. The R7 260X has better specs, but when I game and then run benchmarks, the R7 260X still loses, lol... dunno why... the R7 260X also runs hotter than the GTX 750.
hahaha... lol... and I think that too :P
How much did Nvidia pay you? xD The 260X is faster, be sure of that.
And yeah, apparently they must be manipulating everything in Nvidia's favor.
Holy shit, what a laugh, an R7 260X performing the same as a GTX 750? MY ASS. Even my R7 250X is more powerful than that Nvidia potato. FAKEEEEEEEEEEEE
You probably picked up an early-edition Asus or Gigabyte card with the extra 6-pin power connector. In my case I use an MSI Gaming GTX 750 OC (no extra 6-pin slot) + Pentium G3220 + MSI B85i = 160 watts total :D Plus, where I live, the GTX 750 and R7 260X sell at the same price...
I think that's sarcasm.
Guys, don't buy the 750, it's a shit card. The R7 260X is better than the 750. I have both cards; the 750 is shit.
I think he was trying to say fucking 750 :D
What brand is Fakcen? Never heard of it. The stock GeForce GTX 750 does in fact get those scores (or around them).
Please stop manipulating the tests.
FAKEEEEEEEEEEEE~!!! I bought the Fakcen 750 and it didn't give me HALF of that.
Comparing specs only makes sense between two cards built on the same technology; here you have two different chips. Maxwell is a very efficient architecture: as you can see, even with those lower specs it still beats the 260X in games. Upcoming Nvidia cards based on this architecture will be monsters, 2x lower TDP and still better performance :D
Better specs in everything except power usage, yet it still loses. Seems legit...