GPUBoss Review Our evaluation of the Radeon R9 280X vs the GeForce GTX 770 among Desktop GPUs


Battlefield 3, Battlefield 4, Bioshock Infinite and 21 more


T-Rex, Manhattan, Cloud Gate Factor, Sky Diver Factor and 1 more


Face Detection, Ocean Surface Simulation and 3 more

Performance per Watt

Battlefield 3, Battlefield 4, Bioshock Infinite and 32 more



Noise and Power

TDP, Idle Power Consumption, Load Power Consumption and 2 more


Overall Score

Nvidia GeForce GTX 770 

GPUBoss recommends the Nvidia GeForce GTX 770 based on its benchmarks and noise and power.


Differences What are the advantages of each


Reasons to consider the
Generic Radeon R9 280X

Higher memory bandwidth: 288 GB/s vs 224 GB/s (around 30% higher)
Much better Bitcoin mining score: 468.39 mHash/s vs 112.94 mHash/s (around 4.2x better)
More memory: 3,072 MB vs 2,048 MB (50% more)
Better floating-point performance: 4,096 GFLOPS vs 3,213 GFLOPS (more than 25% better)
More shading units: 2,048 vs 1,536 (512 more)
Wider memory bus: 384-bit vs 256-bit (50% wider)
Better PassMark DirectCompute score: 3,677 vs 3,083 (around 20% better)

Reasons to consider the
Nvidia GeForce GTX 770

Much better 3DMark Vantage graphics score: 36,150 vs 11,255 (around 3.2x better)
Higher clock speed: 1,046 MHz vs 850 MHz (around 25% higher)
Higher effective memory clock speed: 7,012 MHz vs 6,000 MHz (more than 15% higher)
Much higher Crysis 3 framerate: 84 fps vs 25.8 fps (more than 3.2x higher)
Slightly better 3DMark06 score: 29,690 vs 28,452 (around 5% better)
Higher memory clock speed: 1,753 MHz vs 1,500 MHz (more than 15% higher)
Higher turbo clock speed: 1,085 MHz vs 1,000 MHz (around 10% higher)
Higher BioShock Infinite framerate: 83.4 fps vs 73.9 fps (around 15% higher)
Slightly lower TDP: 230 W vs 250 W (around 10% lower)
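The headline spec figures in the two lists above follow directly from the other listed numbers. A minimal sketch of the arithmetic, assuming the usual conventions (bandwidth from bus width times effective memory clock, and 2 FP32 ops per shader per clock, with the R9 280X's GFLOPS figure taken at its 1,000 MHz turbo clock rather than the 850 MHz base clock):

```python
# Sketch: deriving the spec-sheet numbers quoted above.
# Bandwidth (GB/s) = bus width (bits) / 8 bytes * effective memory clock (GHz)
# FP32 GFLOPS     = shading units * 2 ops per clock * shader clock (GHz)

def mem_bandwidth_gbs(bus_bits, eff_clock_mhz):
    return bus_bits / 8 * eff_clock_mhz / 1000

def fp32_gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000

# Radeon R9 280X: 384-bit bus, 6,000 MHz effective, 2,048 shaders at 1,000 MHz turbo
print(mem_bandwidth_gbs(384, 6000))   # 288.0 GB/s
print(fp32_gflops(2048, 1000))        # 4096.0 GFLOPS

# GeForce GTX 770: 256-bit bus, 7,012 MHz effective, 1,536 shaders at 1,046 MHz
print(round(mem_bandwidth_gbs(256, 7012)))  # 224 GB/s
print(round(fp32_gflops(1536, 1046)))       # 3213 GFLOPS
```

This also explains why the two "winners" differ per category: the 280X's wider bus outweighs the 770's faster memory clock for bandwidth, while the 770 relies on higher clocks rather than shader count.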

Benchmarks Real world tests of Radeon R9 280X vs GeForce GTX 770

Bitcoin mining Data courtesy CompuBench

Radeon R9 280X
468.39 mHash/s
GeForce GTX 770
112.94 mHash/s

Face detection Data courtesy CompuBench

Radeon R9 280X
95.62 mPixels/s
GeForce GTX 770
50.95 mPixels/s

Further charts (values not recoverable): T-Rex and Manhattan (GFXBench 3.0), Fire Strike Factor and Sky Diver Factor (data courtesy FutureMark), Battlefield 4

Reviews Word on the street


2 years later... 280x PWNED 770 completely.
Now I know this site is biased for Nvidia. I have an R9 285, which is slightly slower than a 280x, and I get 125-130 FPS in BioShock Infinite on Ultra settings. With the massive overclock on my GPU I would do 150 FPS, and I am happy to show anyone who asks me. So where are they getting these numbers?
KILLED, the card's dead.
The AMD fanboys!! They are everywhere!!
Well, ok. There you are right - it's also not great quality, but I think that will be improved. Although I don't really remember any ads, or at least they weren't disruptive.
Oh, I didn't know that. I had no performance impact, but that might be because I had an FX-8320, an eight-core of which mostly only 4 cores are used in gaming - so theoretically Raptr had 4 cores to use for recording. ;) I also didn't see any ads, or at least they weren't that noticeable. But that could have changed... I liked Raptr, to be honest, just not the recording function in terms of quality, but that could be improved in the future, or already has been... I haven't been online for some months, because I currently have no PC.. :/
Not in any aspect other than heat buildup. The 280 is a good deal in price to performance and comes to just under 76% of the 770's performance, 79% for the 280x. But the heat it produces is vastly greater, about 10°C more than the 770. In addition, the 280/280x may have 1 GB of additional VRAM, but the Tahiti GPU is not able to fully utilize it, so it gains nothing when running at higher resolutions.
Some virtualization features are disabled. If that's what you're going for, you buy a Xeon. For gaming and multimedia it's the i series.
With a performance impact, developed by a third party, and with ads; in addition you cannot record the desktop, so if the game is not supported you are out of luck. With Nvidia's ShadowPlay, you simply turn it on when you want it and turn it off when you don't. The ads alone were bad enough. I don't use third-party software, namely Raptr in this case.
Forget all the benchmarks. They are just window dressing. Under real-world use the story changes. I have both of these cards. The GTX 770 is clunky and the Radeon is buttery smooth.
Yes, AMD doesn't support CUDA. I also checked it out on my card via GPU-Z. :)
"Noob" ;)
Well amazon also made a $300 card into $600.....
On their individual assessments, the GTX 770 scores an 8.2 and the R9 280x scores an 8.5. But apparently when they are compared against each other, the Nvidia card "magically" comes out on top. Loads of horse shit, I tell ya!!!
Communist system we are under? Is that a joke? You do know what communism means, no?
AMD has the best price/product/performance ratio, and it sells you what you will actually need at half the price of Nvidia. Nvidia, on the other hand, sells you products compatible with platforms created only so that you buy and spend money on an Nvidia, though you do have to give it credit for consuming less power. It's the same comparison as between AMD and Intel: AMD always comes out ahead. You will barely notice the power consumption cost relative to the performance the latest i7 can offer you. And now with DDR4 memory, what a coincidence that it comes out first for Intel.......
"Thirdly in real world tests the r9 280 (not the 280x) is a much better deal for money"
Yes, I just saw that now. My bad. Spec Nazis are damn helpful.
Reading fail. He was speccing 8 GB (it says 4 GB x2).
Mine, an i5 3570K, does support it... who told you that?
I gotta say I agree with the majority of visitors who think this site is biased toward the green team. The 770 and 280x are close, it's true, but why does AMD lose every time? As far as troubles are concerned, I have had more hassle with nVidia cards personally and would encourage people to buy Radeon. Yes, they have their problems, but both do. Anyways, check out reviews on the net and decide what you want; remember, though, that the samples reviewers receive aren't guaranteed to be the same as the one you purchase. My 280x is a Sapphire Dual-X and performs brilliantly at 1080p on MAX on all my games (Titanfall, COD Ghosts..). Happy gaming and don't be a fanboy, it's just annoying.
I'm gonna try to give the fairest comparison here. I own both cards. For the R9 280x, if we talk FPS stats compared to the Nvidia 770, there is a 1-8 frame difference in AMD's favor, but who cares - 90% of the time we are arguing over two frames, and you cannot see that difference in game. Although I will say this: if you think that AMD has bad recording, stop it right now. Gaming Evolved from AMD has come a long way and recording is pretty good (although these recording features are in beta, so you can come across problems; Nvidia in this aspect is more secure). One thing: the R9 280x does indeed artifact, but it is so minuscule and so easy to fix - just alt-tab and go back into the game and it will refresh the card. Also, don't let people waste your time about CUDA and PhysX. PhysX is possible on AMD; the only thing is, setting it up is a pain! I think the bottom line is: go for what is cheaper, which most of the time is AMD. But keep in mind: if you want a plug-and-play type of GPU because you don't have patience, go for Nvidia - they usually don't have problems and there is very little to tweak.
They go purely by system specs, which are higher for Nvidia. The system doesn't account for card architecture; if you're looking for absolute comparisons, look at fucking benchmarks.
All things being equal (build quality, heat tolerance, general performance), I may go with nVidia because of their CUDA cores... though I do like AMD for their OpenCL performance. It would be nice to get, say, a 780 or a 290 that draws less than 225 W.
OpenCL is getting more popular - they should have more OpenCL benchmarks for AMD and nVidia. They should also mention the programs and systems that support OpenCL. Apple seems to be pushing OpenCL and has some nice FirePro options on the new Mac "mini" Pro (cylinder). Also, those systems only use a 450W PSU - money!