GPUBoss Review: Our evaluation of the R9 290X vs the 780 Ti among Desktop GPUs

Gaming

Battlefield 3, Battlefield 4, BioShock Infinite and 21 more

Graphics

T-Rex, Manhattan, Cloud Gate Factor, Sky Diver Factor and 1 more

Computing

Face Detection, Ocean Surface Simulation and 3 more

Performance per Watt

Battlefield 3, Battlefield 4, BioShock Infinite and 32 more

Value

Battlefield 3, Battlefield 4, BioShock Infinite and 32 more

Noise and Power

TDP, Idle Power Consumption, Load Power Consumption and 2 more

Overall Score: 7.6

Winner: AMD Radeon R9 290X

GPUBoss recommends the AMD Radeon R9 290X based on its benchmarks.



Differences: What are the advantages of each?


Reasons to consider the AMD Radeon R9 290X

Much better 3DMark 11 graphics score: 32,271 vs 15,630 (more than 2x better)
Higher clock speed: 1,000 MHz vs 875 MHz (around 15% higher)
More memory: 4,096 MB vs 3,072 MB (around 35% more)
Significantly better video composition score: 109.83 frames/s vs 58.14 frames/s (around 90% better)
Slightly better floating-point performance: 5,632 GFLOPS vs 5,040 GFLOPS (more than 10% better)
Higher pixel rate: 64 GPixel/s vs 52.5 GPixel/s (more than 20% higher)
More render output processors: 64 vs 48 (16 more)

Reasons to consider the Nvidia GeForce GTX 780 Ti

Much higher Hitman: Absolution framerate: 74 fps vs 58.5 fps (more than 25% higher)
Better PassMark score: 8,914 vs 7,264 (around 25% better)
Higher effective memory clock speed: 7,000 MHz vs 5,000 MHz (40% higher)
Better 3DMark Vantage graphics score: 46,186 vs 42,403.5 (around 10% better)
More texture mapping units: 240 vs 176 (64 more)
Higher texture rate: 210 GTexel/s vs 176 GTexel/s (around 20% higher)
Slightly higher memory bandwidth: 336 GB/s vs 320 GB/s (5% higher)
Significantly higher memory clock speed: 1,752 MHz vs 1,250 MHz (more than 40% higher)
Higher BioShock Infinite framerate: 116.4 fps vs 94.9 fps (around 25% higher)
Better PassMark DirectCompute score: 4,680 vs 3,471 (around 35% better)
Lower TDP: 250 W vs 300 W (more than 15% lower)
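
As a sanity check, the texture rate, memory bandwidth and floating-point figures above follow directly from the listed clocks and unit counts. A minimal Python sketch; note that the shader counts (2816 / 2880) and memory bus widths (512-bit / 384-bit) are not listed on this page and are taken from the cards' standard published specs:

# Re-deriving the headline throughput figures from the clocks and unit counts above.
# Assumed (not listed on this page): 2816 / 2880 shaders and 512-bit / 384-bit memory buses.
cards = {
    "Radeon R9 290X":     {"core_mhz": 1000, "shaders": 2816, "tmus": 176,
                           "mem_eff_mhz": 5000, "bus_bits": 512},
    "GeForce GTX 780 Ti": {"core_mhz": 875,  "shaders": 2880, "tmus": 240,
                           "mem_eff_mhz": 7000, "bus_bits": 384},
}

for name, s in cards.items():
    texel_rate = s["tmus"] * s["core_mhz"] / 1000            # GTexel/s
    bandwidth  = s["mem_eff_mhz"] / 1000 * s["bus_bits"] / 8  # GB/s
    gflops     = 2 * s["shaders"] * s["core_mhz"] / 1000      # 2 FLOPs per shader per clock
    print(f"{name}: {texel_rate:,.0f} GTexel/s, {bandwidth:,.0f} GB/s, {gflops:,.0f} GFLOPS")

# Radeon R9 290X: 176 GTexel/s, 320 GB/s, 5,632 GFLOPS
# GeForce GTX 780 Ti: 210 GTexel/s, 336 GB/s, 5,040 GFLOPS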

Benchmarks: Real-world tests of the Radeon R9 290X vs the GeForce GTX 780 Ti

Bitcoin mining (data courtesy CompuBench)

Radeon R9 290X
623.31 MHash/s
GeForce GTX 780 Ti
177.17 MHash/s

Face detection (data courtesy CompuBench)

Radeon R9 290X
113.97 MPixels/s
GeForce GTX 780 Ti
71.49 MPixels/s

Ocean surface simulation (data courtesy CompuBench)

Radeon R9 290X
2,360.17 frames/s
GeForce GTX 780 Ti
2,039.19 frames/s
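
Taken together, the CompuBench numbers above work out to roughly a 3.5x, 1.6x and 1.2x advantage for the R9 290X; a quick sketch of that arithmetic, using the values listed above:

# Relative advantage of the R9 290X in the three CompuBench results listed above.
results = {
    "Bitcoin mining (MHash/s)":            (623.31, 177.17),
    "Face detection (MPixels/s)":          (113.97, 71.49),
    "Ocean surface simulation (frames/s)": (2360.17, 2039.19),
}

for test, (r9_290x, gtx_780_ti) in results.items():
    print(f"{test}: {r9_290x / gtx_780_ti:.2f}x in favour of the R9 290X")

# Bitcoin mining (MHash/s): 3.52x in favour of the R9 290X
# Face detection (MPixels/s): 1.59x in favour of the R9 290X
# Ocean surface simulation (frames/s): 1.16x in favour of the R9 290X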

Additional benchmark charts: Fire Strike Factor, Sky Diver Factor and Cloud Gate Factor (data courtesy FutureMark), plus Battlefield 4.

Reviews: Word on the street

Radeon R9 290X vs GeForce GTX 780 Ti: 9.2 vs 9.4

GeForce GTX 780 Ti review excerpt: Palit unfortunately didn't make good use of that capability, as their card basically emits just as much noise as the NVIDIA reference design in idle, which is pretty quiet, but far from "almost inaudible".


Comments

Showing 25 comments.
Years later, AMD's driver support is still fucking terrible.
512-bit bus. AMD is what I like.
The 290X is WAYYYYYYYY BETTER than the 780 Ti. I wish I had two 290Xs; I'm stuck with 2x 780s. The 290X destroys the 780 Ti in benchmarks either way.
Looking back at all these comments and seeing how AMD's drivers have caught up to Nvidia, the Nvidia scandal where they were effectively shooting their own older cards in the leg to push 900 series sales, and the aftermarket R9 290Xs indeed surpassing the much more expensive high-end aftermarket GTX 780s (priced well above the stock 780), I really hope the people who are fanboying feel stupid. This is coming from a guy who currently uses a GTX 780, plans to upgrade to a 290X, has owned both Intel and AMD processors, and has done extensive research into both cards.

Nvidia owns the highest-of-high-end cards that are impractical for the average person's disposable income and savings, but AMD is now matching and surpassing benchmarks at stock clocks when it comes to framerate and framerate stability. It's good to see the market growing more competitive, especially after Nvidia hopefully lost some customers over their integrated technology in games that purposefully sabotages AMD CPUs and GPUs, and the brief time when they sabotaged their own cards. While Nvidia claims GameWorks does not affect AMD cards and CPUs, the proof is in hardware equivalent to Nvidia's own performing extremely poorly specifically where GameWorks is involved. Let's not forget the 3.5 GB of usable memory on the 900 series instead of the promised 4 GB. Before someone calls me 'entitled': being angry that a company lied about its card's performance and used its grip on the market to push further profits isn't entitlement, it's backlash against bad business practices.

Citations:
http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-7.html
http://www.forbes.com/sites/jasonevangelho/2014/05/28/nvidia-fires-back-the-truth-about-gameworks-amd-optimization-and-watch-dogs/
http://techreport.com/news/26515/amd-lashes-out-at-nvidia-gameworks-program
https://forums.geforce.com/default/topic/806331/nvidia-intentionally-cripples-the-older-generation-kepler-video-cards-through-nvidia-experience-/
https://forums.geforce.com/default/topic/833016/geforce-700-600-series/gtx-780-possible-fail-as-performance-in-the-witcher-3-wild-hunt-/1/
http://www.tomshardware.com/forum/id-2402171/nvidia-gtx-900-series-megathread-links-faq.html
http://www.tomshardware.com/answers/id-2642952/evga-geforce-gtx-970-4gb-advertisement-5gb.html
http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation
Fair enough, the 780 Ti does beat it in a lot of DX11 benchmarks, but IMO the 290/290X was designed for Mantle and DX12; hence the 290X performs better than the GTX 980 under DX12, from what we have seen so far in benchmarks like 3DMark Fire Strike Extreme, and that is also the reason AMD went with a 512-bit interface. AMD designed Mantle and their latest GPUs (the 290 series) to get the most out of the next-gen APIs (Mantle and DX12), since the 290/290X was built to get the most out of current AMD GPUs running the latest GCN architecture.
Completely destroys it? Where do you get that from? Opinions are not facts. Yes, the 780 Ti does beat the R9 290X in some games, but in other games the R9 290X beats the 780 Ti, just like the 780 Ti wins some benchmarks and the R9 290X wins others. They are pretty much even cards in performance, with the R9 290X actually having an advantage because of its superior memory bandwidth and because the R9 290X has 4 GB vs the 3 GB on the 780 Ti. This means that even the R9 290 will outperform the 780 Ti when more memory performance is needed, though the 780 Ti will still not be very far behind. Also, the 780 Ti costs about twice as much as the R9 290X, which honestly makes it a very poor choice over the R9 290X. In all honesty, Nvidia makes great graphics cards, and I would honestly prefer Nvidia over AMD, but Nvidia asks way too much for their products. The new GTX 980 is cheaper than the 780 Ti, but it is still way overpriced, and again this makes the R9 290X a better choice over the GTX 980. The R9 290X is a card that will do everything at a very affordable price. The 780 Ti and 980 are also cards that will do everything, but they are way overpriced and not worth what you pay for them.
Sorry to break it to you, but the 290X is so much cheaper it's unbelievable. Not only is it better for ultra-high-res gaming, but it is more than a couple of hundred dollars cheaper right now...
When you see the way DX12 renders (if it does it like Mantle, which they say it will), you will know why it's an amazing choice for BF4. The rendering is in another league; you can't describe it, but you will notice it instantly, especially when rendering 1440p at a 135% resolution scale (just under 4K internal resolution) at 120 Hz and 120 fps. The way it renders the pixels, no matter how fast you look around, the image never degrades or changes; it updates fast enough (the driver layer not getting in the way, and full utilization all the time). Plus, the architecture is designed to take advantage of the latest GPU rendering technologies found in Mantle and DX12 with GCN 2.0.
I get what you mean, but look at the price difference.
I would still choose the R9 290X over the GTX 780 Ti. Better profit for commercial use, better picture quality on a 4K TV, and it's all based on real-life experience and how much profit I made out of them. Clients preferred Nvidia at first, until they experienced the reality and how much money they had wasted on Nvidia GPUs. For gaming purposes Nvidia may be better, but I love AMD hands down when it comes to diversity of usage. Like I said, more money out of it, and if I only knew how to mine, I would've done it way back before I started my business.
The R9 290X is indeed the winner here. No-brainer.
For the best Battlefield 4 PC experience, however, we recommend having a system equal to or better than the following specifications: Graphics card: AMD Radeon HD 7870 or higher; NVIDIA GeForce GTX 660 or higher.
I have one in an mATX case, the BitFenix Prodigy M. It also fits in the EVGA Hadron for an even smaller case.
It does extremely well with an AMD FX-8320 and 8 GB of RAM. It gets 60+ FPS with everything on Ultra.
true, true :P
It can do BF4 pretty dang well, with almost all settings on ultra. It has some issues with the shading, though.
and that's on a GTX 780 Ti
I own both, and the 290X overclocked is way faster at 4K, and the IQ is better in just about every game. But ambient occlusion is why I own a GTX 780 Ti; I fucking love it with Nvidia Inspector. Skyrim and Fallout 3 look insane, and AMD can't compete on this.
3 GB is fine for 1080p, even with CrossFire. I had 2x 280Xs; just don't expect it to be enough RAM for Mantle CrossFire. The 280Xs performed awesome in DX11. I wanted to play at higher resolutions while keeping the 120 Hz / 120 FPS, so I turned up the resolution scale in BF4. I upgraded to 2x 290Xs so I could play at 135% and keep 120 fps on Ultra, giving me 135-160 FPS, basically not dipping below 120 FPS running Mantle. The 3 GB on the 280X is fine at native 1080p, and the 280X is a great DX11 card.
Except for the 280... it only carries 3 GB of VRAM. sadface.jpg
TIMMAAAAYYYYYY! https://www.youtube.com/watch?v=cTl762MuXyc
Why would you pair a high-dollar GPU with a cheap APU? You don't. It's like going to the gym, working your biceps and completely ignoring your triceps...
The new Asus Swift is the best gaming monitor spec-wise; it's going to be a little expensive at first. It's a 1440p 144 Hz monitor with a 1 ms response time and G-Sync. I only wish it had dual-link DVI; it's only got DisplayPort! http://www.asus.com/Monitors/ROG_SWIFT_PG278Q/
I've really never heard of that card; I know what AMD FirePro cards are. What performance do you get in 4K gaming? I doubt it can do BF4 very well at 4K, or can it?
I'm sticking to resolutions that my two MSI R9 290X Gaming OC cards can play at at least 120 FPS. My current monitor is a 27" Samsung SA950D, which is a 1920x1080, 120 Hz, 3D LED monitor. In BF4 I run Ultra and turn up the resolution scale to 135% so that both my GPUs are working at 100%. That basically renders roughly 1440p and then scales it down on my 1080p monitor, making it even sharper and more detailed; it looks amazing. I can go further, but it becomes too sharp lol, so a 135% resolution scale is the sweet spot for me, and I'm able to have my FPS locked at 120 FPS on Ultra. Once you get used to 120 FPS on a 120 Hz monitor, it's hard to go back, as it's so smooth, especially when you're moving and looking around real quick.
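
For anyone wondering what a 135% resolution scale actually renders internally, here is a rough sketch, assuming BF4 applies the scale to each axis (which is how its resolution-scale slider behaves):

# Rough sketch: internal render resolution under per-axis resolution scaling.
def internal_resolution(width, height, scale_percent):
    s = scale_percent / 100.0
    return round(width * s), round(height * s)

print(internal_resolution(1920, 1080, 135))  # (2592, 1458) - roughly 1440p worth of pixels
print(internal_resolution(2560, 1440, 135))  # (3456, 1944) - just under 4K (3840x2160)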