GPUBoss Review: Our evaluation of the Radeon R9 295X2 vs the GeForce GTX 780 Ti among desktop GPUs

Gaming

Battlefield 3, Battlefield 4, Bioshock Infinite and 21 more

Graphics

T-Rex, Manhattan, Cloud Gate Factor, Sky Diver Factor and 1 more

Computing

Face Detection, Ocean Surface Simulation and 3 more

Performance per Watt

Battlefield 3, Battlefield 4, Bioshock Infinite and 32 more

Value

Battlefield 3, Battlefield 4, Bioshock Infinite and 32 more

Noise and Power

TDP, Idle Power Consumption, Load Power Consumption and 2 more

Overall Score: 8

Winner: Nvidia GeForce GTX 780 Ti

GPUBoss recommends the Nvidia GeForce GTX 780 Ti based on its benchmarks, compute performance, and noise and power.


Differences: What are the advantages of each?

Front view of Radeon R9 295X2

Reasons to consider the XFX Radeon R9 295X2

Is dual GPU: Yes vs No (about half of graphics cards are dual GPU)
Much better floating-point performance: 11,466 GFLOPS vs 5,040 GFLOPS (more than 2.2x better; see the sketch after this list)
Significantly more memory: 8,192 MB vs 3,072 MB (around 2.7x more memory)
Much higher pixel rate: 130.4 GPixel/s vs 52.5 GPixel/s (around 2.5x higher pixel rate)
Much higher texture rate: 358.4 GTexel/s vs 210 GTexel/s (more than 70% higher texture rate)
Many more render output processors: 128 vs 48 (80 more render output processors)
Many more shading units: 5,632 vs 2,880 (2,752 more shading units)
Higher clock speed: 1,018 MHz vs 875 MHz (more than 16% higher clock speed)
Much wider memory bus: 1,024 bit vs 384 bit (around 2.7x wider memory bus)
Many more texture mapping units: 352 vs 240 (112 more texture mapping units)
Better face detection score: 116.41 MPixels/s vs 87.61 MPixels/s (around 33% better face detection score)
Better Cloud Gate factor score: 24.18 vs 22.28 (around 9% better Cloud Gate factor score)
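
The floating-point and texture-rate figures above follow directly from the unit counts and clocks listed on this page. Below is a minimal Python sketch of that arithmetic; the gflops/gtexels helper names are illustrative, the values are copied from this page, and the 2-FLOPs-per-shader-per-clock (FMA) convention is assumed, with the 295X2's two GPUs treated as one combined total.

# Rough sanity check of the theoretical figures above (helper names are
# hypothetical; unit counts and clocks are copied from this page).

def gflops(shading_units, clock_mhz, ops_per_clock=2):
    # Theoretical single-precision GFLOPS = shaders * 2 FLOPs (FMA) * clock.
    return shading_units * ops_per_clock * clock_mhz / 1000.0

def gtexels(tmus, clock_mhz):
    # Theoretical texture fill rate in GTexel/s = TMUs * clock.
    return tmus * clock_mhz / 1000.0

cards = {
    "Radeon R9 295X2 (both GPUs)": {"shaders": 5632, "tmus": 352, "clock_mhz": 1018},
    "GeForce GTX 780 Ti":          {"shaders": 2880, "tmus": 240, "clock_mhz": 875},
}

for name, c in cards.items():
    print(f"{name}: {gflops(c['shaders'], c['clock_mhz']):,.0f} GFLOPS, "
          f"{gtexels(c['tmus'], c['clock_mhz']):.1f} GTexel/s")
# Prints ~11,467 GFLOPS / 358.3 GTexel/s for the 295X2 and
# 5,040 GFLOPS / 210.0 GTexel/s for the 780 Ti, matching the figures above.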
Front view of GeForce GTX 780 Ti

Reasons to consider the Nvidia GeForce GTX 780 Ti

Significantly higher effective memory clock speed: 7,000 MHz vs 5,000 MHz (40% higher effective memory clock speed; see the bandwidth sketch after this list)
Better PassMark score: 8,890 vs 7,462 (around 19% better PassMark score)
Significantly higher memory clock speed: 1,752 MHz vs 1,250 MHz (more than 40% higher memory clock speed)
Better PassMark DirectCompute score: 4,688 vs 3,456 (more than 35% better PassMark DirectCompute score)
Much lower TDP: 250 W vs 500 W (half the TDP)
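
The memory-clock advantage is only part of the bandwidth picture, because the 295X2's bus is much wider, and the TDP gap partly reflects two GPUs on one board. Here is a rough sketch, using only the figures quoted on this page (variable names are illustrative), of peak memory bandwidth (effective clock x bus width / 8) and theoretical GFLOPS per watt.

# Peak memory bandwidth and theoretical efficiency from the figures above.
cards = {
    #                      (eff. mem clock MHz, bus width bits, GFLOPS, TDP W)
    "Radeon R9 295X2":     (5000, 1024, 11466, 500),
    "GeForce GTX 780 Ti":  (7000,  384,  5040, 250),
}

for name, (mem_clock_mhz, bus_bits, gflops, tdp_w) in cards.items():
    bandwidth_gbs = mem_clock_mhz * bus_bits / 8 / 1000  # GB/s
    print(f"{name}: {bandwidth_gbs:.0f} GB/s peak bandwidth, "
          f"{gflops / tdp_w:.1f} GFLOPS per watt")
# Prints 640 GB/s (320 GB/s per GPU) and ~22.9 GFLOPS/W for the 295X2,
# 336 GB/s and ~20.2 GFLOPS/W for the 780 Ti.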

Benchmarks: Real-world tests of the Radeon R9 295X2 vs the GeForce GTX 780 Ti

Bitcoin mining (data courtesy CompuBench)

Radeon R9 295X2: 614.07 MHash/s
GeForce GTX 780 Ti: 325.02 MHash/s

Face detection (data courtesy CompuBench)

Radeon R9 295X2: 116.41 MPixels/s
GeForce GTX 780 Ti: 87.61 MPixels/s

Ocean surface simulation (data courtesy CompuBench)

Radeon R9 295X2: 1,997.66 frames/s
GeForce GTX 780 Ti: 2,001.15 frames/s

Particle simulation (data courtesy CompuBench)

Radeon R9 295X2: 651.4 MInteraction/s
GeForce GTX 780 Ti: 1,129.7 MInteraction/s

T-Rex (CompuBench 1.5, data courtesy CompuBench)

Radeon R9 295X2: 10.68 frames/s
GeForce GTX 780 Ti: 6.85 frames/s
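
To put the compute results above in relative terms, here is a small Python script; the values are copied from the CompuBench results listed on this page, and higher is better in every test.

# Relative performance in the CompuBench results listed above (higher is better).
results = {
    # test:                        (R9 295X2, GTX 780 Ti)
    "Bitcoin mining (MHash/s)":      (614.07,  325.02),
    "Face detection (MPixels/s)":    (116.41,   87.61),
    "Ocean surface sim (frames/s)":  (1997.66, 2001.15),
    "Particle sim (MInteraction/s)": (651.4,  1129.7),
    "T-Rex (frames/s)":              (10.68,     6.85),
}

for test, (amd, nvidia) in results.items():
    leader = "R9 295X2" if amd > nvidia else "GTX 780 Ti"
    ratio = max(amd, nvidia) / min(amd, nvidia)
    print(f"{test}: {leader} leads by {ratio:.2f}x")
# The 295X2 leads in Bitcoin mining (~1.89x), face detection (~1.33x) and
# T-Rex (~1.56x); the 780 Ti leads in particle simulation (~1.73x), and the
# ocean surface simulation is effectively a tie.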

Fire Strike Factor, Sky Diver Factor and Cloud Gate Factor (data courtesy FutureMark; chart data not reproduced here)

Reviews: Word on the street

Radeon R9 295X2: 8.6
GeForce GTX 780 Ti: 9.4

On the GeForce GTX 780 Ti: "Palit unfortunately didn't make good use of that capability as their card basically emits just as much noise as the NVIDIA reference design in idle, which is pretty quiet, but far from 'almost inaudible.'"

Comments

Showing 25 comments.
This webpage loves nvidia
They mistook their 2 Titan X (yes, before its release) for their single 780 Ti.
Test benches: R9 295X2 with 64 MB SDRAM and an Intel Pentium III, and a GTX 980 Ti with 32 GB DDR4 and an Intel Core i7 5960X.
It's totally biased!! There's no way the GTX 780 Ti can win over the R9 295X2. Even the R9 290X can beat this card. As for the R9 295X2, it will totally, totally, totally murder this GTX cheat-series GPU.
I am crying heavily.
Just leaving this here. https://www.youtube.com/watch?v=aC1SsnS6naA
yeah OOOOOKAY gpuboss....
But that doesn't lower the performance enough for it to be beaten by a single-chip card. *facepalm* Look at these benchmarks... http://www.anandtech.com/show/7930/the-amd-radeon-r9-295x2-review/13
I knew something was wrong at CPU Boss the other day when they compared an i5 against a puny low-end Pentium CPU and said that "it was too close to call". WOW!! Even given the fact that the i5 beat the pants off of the Pentium in every category related to performance. Now I come to GPU Boss and I'm completely blown away by the utter bullshit that I read!
Sad. We all know the 295X2 is better than the 780Ti. They even cancelled out the real world gaming performance section loool
Really, this site is so biased. I'll never come back to check CPU Boss, GPU Boss or SSD Boss, they're so fucking biased. I'm saying there's no way this card would be able to beat the R9 295X2; it barely beats the 290X. Come on man, that's not true.
That's completely wrong bullshit, in theory and in practice. Just look at this video: https://www.youtube.com/watch?v=aC1SsnS6naA
Ha Ha Ha, someone who thinks perfect scaling appears in the real world, you also funny, but I kill you first
Ha Ha Ha, you funny, I like you. I kill you last
Nvidia whores ARE YOU RETARD? HA HA........!!!
Aww look, it's a fanboy that won't listen to simple logic. TWO OVERCLOCKED 290X GPUS WILL BEAT A SINGLE 780 TI. Now what don't you understand? I bet you guys would be calling bullshit if an R7 280 did better than a GT 720M.
Aww look it's a butthurt AMD fan.
stanley here
How can a Dual GPU card lose to a single GPU card?!?!?!!
WHO FUCKING CARES ABOUT PASSMARKS.
It seems obvious to me that GPUBoss is just a database. There isn't any person (or persons) performing actual testing, just a bot created to generate hits.
http://www.hwcompare.com/17844/geforce-gtx-780-ti-vs-radeon-r9-295x2/ No, AMD cards cost less than half as much for many times more performance; if you knew anything, you'd realize this review of the two is fucking bullshit. A 290X is meant to compete with a 780. TWO of them stuck together in one card is more than twice as powerful as a 780. But nice to hear a kid who knows nothing about them commenting on 'em.
Excuse me GPUBoss, are you ever going to respond to us? Or are you happy to let the mass wave of techheads who laugh at you all over the internet continue to ruin your already questionable reputation? We need to see how you are able to defend your moronic claim in regard to this review. Just so you know, "NVIDIA pays us" is an acceptable answer. At least then it would explain your complete lack of knowledge in this regard.
AMD has unveiled a massively powerful, water-cooled graphics card that it says "manhandles today's and tomorrow's games in maximum settings at 4K resolution." Two fans and a closed-loop cooling system keep the AMD Radeon R9 295X2 from melting down. "The AMD Radeon R9 295X2 graphics card is the world's fastest, period," the company says, justifying that claim with a footnote explaining that AMD compared the new card against an Nvidia GeForce Titan Black, which it says was that company's "highest performing graphics card as of March 12, 2014." On March 26, however, Nvidia unveiled its GTX Titan Z, which like the Radeon R9 295X2 gets its grunt from combining two high-end GPUs on the same card: two Kepler GK110 chips in the Titan Z, and two Radeon R9 290X GPUs in the 295X2. For the record, AMD claims that the R9 295X2 can pump out 11.5 TFLOPS of compute performance, while Nvidia says that the GTX Titan Z's compute oomph tops out at 8 TFLOPS, which just a few weeks ago seemed insanely great, but now appears to be an also-ran to the R9 295X2. CAN YOU READ THIS??? THIS IS REALLY BIASED!!!
Holy Crap! Maybe the benchmark application crashed on the R9 295X2? WTF is this comparison all about???:(