GPUBoss Review: Our evaluation of the Radeon R9 280X vs the GeForce GTX 770


Real world tests using the latest 3D games

Battlefield 3 (2013)


Synthetic tests to measure overall performance

PassMark, 3DMark 11 Graphics, 3DMark Vantage Graphics and 3DMark06

Compute Performance

General computing tests executed on the GPU

Civilization 5 Texture Decompression (2013) and PassMark Direct Compute

Noise and Power

How loud and hot does the card run, idle and under load?


GPUBoss Score

Gaming, Benchmarks, Compute Performance and Noise and Power

Nvidia GeForce GTX 770 

GPUBoss recommends the Nvidia GeForce GTX 770 based on its gaming, benchmarks, and compute performance.


Differences What are the advantages of each

Reasons to consider the
Generic Radeon R9 280X

More memory: 3,072 MB vs 2,048 MB (50% more)
Better T-Rex score: 7.71 frames/s vs 3.85 frames/s (more than 2x better)
More shading units: 2,048 vs 1,536 (512 more)
Reasons to consider the
Nvidia GeForce GTX 770

Much better 3DMark Vantage graphics score: 36,150 vs 11,255 (around 3.2x better)
Significantly higher clock speed: 1,046 MHz vs 850 MHz (around 23% higher)
Much higher Crysis 3 framerate: 84 fps vs 25.8 fps (more than 3.2x higher)
Higher effective memory clock speed: 7,012 MHz vs 6,000 MHz (more than 15% higher)
Slightly better 3DMark06 score: 29,690 vs 28,452 (around 5% better)
Much higher Metro: Last Light framerate: 70 fps vs 36.45 fps (more than 90% higher)
Higher turbo clock speed: 1,085 MHz vs 1,000 MHz (around 10% higher)
Better Civilization 5 texture decompression (2013) score: 395 vs 339.1 (more than 15% better)
Higher Thief framerate: 50 fps vs 33.6 fps (around 50% higher)
Higher memory clock speed: 1,753 MHz vs 1,500 MHz (more than 15% higher)
Higher BioShock Infinite framerate: 83.4 fps vs 73.9 fps (around 15% higher)
Higher Battlefield 4 framerate: 57.5 fps vs 39.9 fps (around 45% higher)
Higher Battlefield 3 framerate: 126.2 fps vs 106.5 fps (around 20% higher)
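The percentage and multiplier labels in the two lists above are simple ratios of the cards' figures. A small sketch (a hypothetical helper, not GPUBoss's actual code) reproduces the convention they appear to use:

```python
def relative_diff(a: float, b: float) -> str:
    """Express how much larger a is than b, in the style of the lists above:
    a multiplier when a is at least double b, a percentage otherwise."""
    ratio = a / b
    if ratio >= 2:
        return f"{ratio:.1f}x better"
    return f"{(ratio - 1) * 100:.0f}% higher"

# Memory: 3,072 MB vs 2,048 MB
print(relative_diff(3072, 2048))    # 50% higher
# 3DMark Vantage graphics: 36,150 vs 11,255
print(relative_diff(36150, 11255))  # 3.2x better
```

Reading it this way makes clear the labels are raw spec/score ratios, not architecture-aware comparisons.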

Benchmarks Real world tests of Radeon R9 280X vs GeForce GTX 770

Bitcoin mining Data courtesy CompuBench

Radeon R9 280X
478.67 MHash/s
GeForce GTX 770
103.68 MHash/s
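For context, the MHash/s figure counts double-SHA-256 hashes per second, the core operation of Bitcoin mining. A minimal, purely illustrative CPU-side sketch in Python (real miners run this on the GPU via OpenCL or CUDA, which is what CompuBench measures):

```python
import hashlib
import time

def double_sha256(data: bytes) -> bytes:
    # Bitcoin applies SHA-256 twice to the 80-byte block header
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def cpu_hashrate_mhs(duration: float = 0.25) -> float:
    """Rough single-threaded CPU hash rate in MHash/s."""
    header = b"\x00" * 80  # dummy block header
    count = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration:
        double_sha256(header)
        count += 1
    return count / (time.perf_counter() - start) / 1e6

print(f"{cpu_hashrate_mhs():.2f} MHash/s")  # typically far below the GPU figures above
```

The gap between a CPU loop like this and the hundreds of MHash/s above is why this test so strongly favors wide, compute-oriented GPUs like Tahiti.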

Face detection Data courtesy CompuBench

Radeon R9 280X
87.88 MPixels/s
GeForce GTX 770
43.43 MPixels/s

Ocean surface simulation Data courtesy CompuBench

Radeon R9 280X
1,857.65 frames/s
GeForce GTX 770
1,377.15 frames/s

Particle simulation Data courtesy CompuBench

Radeon R9 280X
518.94 MInteractions/s
GeForce GTX 770
433.19 MInteractions/s

T-Rex (Compubench 1.5) Data courtesy CompuBench

Radeon R9 280X
7.71 frames/s
GeForce GTX 770
3.85 frames/s

Video composition Data courtesy CompuBench

Radeon R9 280X
111.35 frames/s
GeForce GTX 770
50.9 frames/s
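The 280X leads in all six CompuBench tests; the per-test ratios can be summarized with a geometric mean (the numbers are copied from the tables above; the summary helper itself is illustrative, not part of GPUBoss's methodology):

```python
from math import prod

# CompuBench results from the tables above: (R9 280X, GTX 770)
results = {
    "Bitcoin mining":           (478.67, 103.68),
    "Face detection":           (87.88, 43.43),
    "Ocean surface simulation": (1857.65, 1377.15),
    "Particle simulation":      (518.94, 433.19),
    "T-Rex":                    (7.71, 3.85),
    "Video composition":        (111.35, 50.9),
}

ratios = [amd / nv for amd, nv in results.values()]
# Geometric mean avoids letting one outlier (Bitcoin mining, ~4.6x) dominate
geomean = prod(ratios) ** (1 / len(ratios))
print(f"R9 280X averages about {geomean:.2f}x the GTX 770 in compute (geometric mean)")
```

The geometric mean lands around 2x in the 280X's favor, which is why the compute section reads so differently from the gaming benchmarks above it.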

Reviews Word on the street

Radeon R9 280X  vs GeForce GTX 770 

The GTX 770 WindForce OC is no different, but the improvement over the reference design cooler is, nevertheless, not that big because NVIDIA's cooler is already quite good.
GeForce GTX 770


Showing 25 comments.
LUL to people that use this website.
2 years later.................. 280x PWNED 770 completely.
Now I know this site is biased toward Nvidia. I have an R9 285 that is slightly slower than a 280X, and I get 125-130 FPS in BioShock Infinite on Ultra settings. If I used the massive overclock on my GPU I would hit 150 FPS, and I am happy to show anyone who asks me. So where are they getting these numbers?
KILLED, the card's dead.
The AMD fanboys!! They are everywhere!!
Well, OK. There you are right: it's also not great quality, but I think this will be improved. Although I don't really remember any ads, or at least they weren't disturbing.
Oh, I didn't know that. I had no performance impact, but that might be because I had an FX-8320, an eight-core of which mostly only 4 cores are used in gaming, so theoretically Raptr had 4 cores to use for recording. ;) I also didn't see any ads, or at least they weren't that noticeable. But that could have changed... I liked Raptr, to be honest, just not the recording function in terms of quality, but that could be improved in the future, or already has been... I haven't been online for some months, because I currently have no PC. :/
Not in any aspect other than heat buildup. The 280 is a good deal in price-to-performance and comes to just under 76% of the 770's performance, 79% for the 280X. But the heat it produces is vastly greater, about 10°C more than the 770. In addition, the 280/280X may have the 1 GB of additional VRAM, but the Tahiti GPU is not able to fully utilize it, so it gains nothing when running at higher resolutions.
Some virtualization features are disabled. If that's what you're going for, you buy a Xeon. For gaming and multimedia it's the i series.
With a performance impact, and developed by a third party, with ads. In addition you cannot record the desktop, so if the game is not supported you are out of luck. With Nvidia's ShadowPlay, you simply turn it on when you want it and turn it off when you don't. The ads alone were bad enough. I don't use third party, namely Raptr in this case.
I love all the benchmarks. They are just window dressing. Under real-world use the story changes. I have both of these cards. The GTX 770 is clunky and the Radeon is buttery smooth.
Yes, AMD doesn't support CUDA. I also checked it out on my card via GPU-Z. :)
"Noob" ;)
Well amazon also made a $300 card into $600.....
On their individual assessments the GTX 770 scores an 8.2 and the R9 280X scores an 8.5. But apparently when they compare against each other, the Nvidia card "magically" comes out on top. Loads of horse shit, I tell ya!!!
Communist system we are under? Is that a joke? You do know what communism means, no?
[Translated from Spanish] AMD has the best price/product/performance ratio, and sells you what you are actually going to need at half Nvidia's price. Nvidia, on the other hand, sells you products compatible only with platforms created just so you buy and spend money on an Nvidia, though you do have to give it credit for consuming less power. It's the same comparison as between AMD and Intel: AMD always comes out ahead. You will barely notice the power savings relative to the performance the latest i7 can offer you. And now with DDR4 memory, what a coincidence that it comes out first for Intel.......
"Thirdly in real world tests the r9 280 (not the 280x) is a much better deal for money"
Yes, I just saw that now. My bad. Spec Nazis are damn helpful.
Reading fail. He was speccing 8 GB (it says 4 GB x2).
My i5 3570K does support it... who told you that?
I gotta say I agree with the majority of visitors who think this site is biased toward the green team. The 770 and 280X are close, it's true, but why does AMD lose every time? As far as troubles are concerned, I have had more hassle with nVidia cards personally and would encourage people to buy Radeon. Yes, they have their problems, but both do. Anyway, check out reviews on the net and decide what you want; remember though that the samples reviewers receive aren't guaranteed to be the same as the one you purchase. My 280X is a Sapphire Dual-X and performs brilliantly at 1080p on MAX in all my games (Titanfall, COD Ghosts...). Happy gaming, and don't be a fanboy, it's just annoying.
I'm gonna try to give the fairest comparison here. I own both cards. So, R9 280X: if we talk FPS stats compared to the Nvidia 770, there is a 1-8 frame difference in AMD's favor, but who cares; 90% of the time we are arguing two frames you cannot see in game... although I will say this: if you think AMD has bad recording, stop it right now. Gaming Evolved from AMD has come a long way and recording is pretty good (although these recording features are in beta, so you can come across problems), so Nvidia in this aspect is more secure. One thing is the R9 280X does indeed artifact, but it is so minuscule and so easy to fix: just alt-tab and go back into the game, and it will refresh the card. Also, don't let people waste your time about CUDA and PhysX... PhysX is possible on AMD processors, only thing is setting it up is a pain! I think the bottom line is go for what is cheaper, which most of the time is AMD, but keep in mind: if you want a plug-and-play type of GPU because you don't have patience, go for Nvidia; they usually don't have problems and there is very little to tweak.
They go purely by system specs, which are higher for Nvidia. The system doesn't account for card architecture, if you're looking for absolute comparisons look at fucking benchmarks.
all things being equal (build quality, heat tolerance, general performance) I may go with nVidia because of their "Cuda Cores"... though I do like AMD for their OpenCL performance. It would be nice to get say a 780 or a 290 that draws less than 225W.