
Differences: What are the advantages of each?

Front view of Radeon R9 FURY X2

Reasons to consider the AMD Radeon R9 FURY X2

- Much wider memory bus: 8,192 bit vs 384 bit (more than 21.3x wider)
- Is dual GPU: Yes vs No (about half of graphics cards are dual GPU)
- Much better floating-point performance: 17,204 GFLOPS vs 6,144 GFLOPS (more than 2.8x better)
- Many more shading units: 8,192 vs 3,072 (5,120 more)
- Many more texture mapping units: 512 vs 192 (320 more)
- Significantly higher pixel rate: 134.4 GPixel/s vs 96 GPixel/s (40% higher)
- Significantly more render output processors: 128 vs 96 (32 more)
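The derived figures above can be reproduced from the raw specs. A minimal sketch — the core clocks (1,050 MHz and 1,000 MHz) are assumptions carried over from the single-GPU Fury X and TITAN X, since the Fury X2's final clocks were not confirmed at the time of this comparison:

```python
# Specs as listed in the comparison; core clocks are assumed values.
FURY_X2 = {"shaders": 8192, "rops": 128, "core_mhz": 1050, "bus_bits": 8192}
TITAN_X = {"shaders": 3072, "rops": 96, "core_mhz": 1000, "bus_bits": 384}

def gflops(gpu):
    # 2 floating-point ops (fused multiply-add) per shader per clock
    return gpu["shaders"] * 2 * gpu["core_mhz"] / 1000

def pixel_rate(gpu):
    # GPixel/s = render output processors * core clock
    return gpu["rops"] * gpu["core_mhz"] / 1000

print(gflops(FURY_X2))      # 17203.2 GFLOPS
print(gflops(TITAN_X))      # 6144.0 GFLOPS
print(pixel_rate(FURY_X2))  # 134.4 GPixel/s
print(pixel_rate(TITAN_X))  # 96.0 GPixel/s
print(FURY_X2["bus_bits"] / TITAN_X["bus_bits"])  # ~21.3x wider bus
```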
Front view of GeForce GTX TITAN X

Reasons to consider the Nvidia GeForce GTX TITAN X

- Much higher effective memory clock speed: 7,012 MHz vs 1,000 MHz (more than 7x higher)
- Better PassMark score: 10,669 vs 8,345 (around 28% better)
- Significantly more memory: 12,288 MB vs 8,192 MB (50% more)
- Much higher memory clock speed: 1,753 MHz vs 500 MHz (more than 3.5x higher)
- Better PassMark DirectCompute score: 6,279 vs 4,880 (around 29% better)
- Better particle simulation score: 1,371.74 mInteractions/s vs 1,194.57 mInteractions/s (around 15% better)

Benchmarks: Real-world tests of Radeon R9 FURY X2 vs GeForce GTX TITAN X

PassMark: Industry-standard benchmark for overall graphics card performance. Data courtesy PassMark.

Bitcoin mining (data courtesy CompuBench)
Radeon R9 FURY X2: 932.66 mHash/s
GeForce GTX TITAN X: 775.76 mHash/s

Face detection (data courtesy CompuBench)
Radeon R9 FURY X2: 143.04 mPixels/s
GeForce GTX TITAN X: 224 mPixels/s

Ocean surface simulation (data courtesy CompuBench)
Radeon R9 FURY X2: 3,404.53 frames/s
GeForce GTX TITAN X: 2,566.17 frames/s

Particle simulation (data courtesy CompuBench)
Radeon R9 FURY X2: 1,194.57 mInteractions/s
GeForce GTX TITAN X: 1,371.74 mInteractions/s

T-Rex (CompuBench 1.5, data courtesy CompuBench)
Radeon R9 FURY X2: 12.49 frames/s
GeForce GTX TITAN X: 11.39 frames/s

Video composition (data courtesy CompuBench)
Radeon R9 FURY X2: 140.63 frames/s
GeForce GTX TITAN X: 134.13 frames/s

PassMark DirectCompute: Measures performance of general-purpose computing using Microsoft DirectCompute.

Reviews: Word on the street


Showing 25 comments.
It does, actually. The bus affects the memory bandwidth which is one of the major factors in GPU performance.
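The bandwidth this comment refers to falls straight out of bus width and effective memory clock: bytes moved per transfer times transfers per second. A quick sketch using the spec-sheet numbers above (note the Fury X2's 8,192-bit figure is both GPUs combined):

```python
def bandwidth_gbs(bus_bits, effective_mhz):
    # (bus_bits / 8) bytes per transfer * (effective_mhz * 1e6) transfers
    # per second, expressed in GB/s
    return bus_bits / 8 * effective_mhz / 1000

print(bandwidth_gbs(8192, 1000))  # 1024.0 GB/s for the Fury X2 (HBM, both GPUs)
print(bandwidth_gbs(384, 7012))   # ~336.6 GB/s for the TITAN X (GDDR5)
```

So the Fury X2's wide but slowly clocked HBM still ends up with roughly three times the TITAN X's total bandwidth.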
Look at their upcoming Polaris (Radeon 400 series) GPUs. They're squeezing out an extra 2.5x perf/watt ratio, allowing significantly higher speeds while keeping a low TDP. Next year they're coming out with HBM2 GPUs, which'll dominate the upcoming GDDR5X memory standards. Seems there's a supply issue with HBM2 chips, from my understanding; both NVIDIA and AMD were waiting until next year. But the GPUs later this year should very easily be able to handle 4K 60FPS gaming. AMD's Polaris GPUs come out this summer.
I got banned from TH because I was 12. Join the LTT (Linus Tech Tips) forums. They're probably the only forum that's always friendly, lets 13 and younger join the forum as they are Canada-run (Our laws are different), and people are extremely active!
So in EVERY category the Fury X2 beats the Titan X, BUT in BOTH performance tests THE TITAN WINS??? Can someone help me understand this? I am waiting for Nvidia Pascal and the AMD Fury X2 (does AMD have anything stronger coming?) to build a beast 4K 60fps PC. Just looking for advice. Thanks.
It has to do with resolution and refresh bandwidth as well.
Titan Y LOL
Yay the titan killer
Yeah. The Titan Y is a dual GPU, so compare a Titan X with a Fury X and a Titan Y with the Fury X2
I like tomshardware actually. They listen to you there, and the thread's OP generally goes with whoever puts forward the best solution and evidence. It's funny how most people just buy Nvidia cause "dey herd it was da best", but when they ask on the forums first they generally go with AMD, unless they have lots of money to spend and go for a 980 Ti.
Can you recommend anything?
Lol. I'm tempted to retire from that site... but I'm holding out till AI and Zen are both out, to see the rage faces on Nvidia nut jobs when not only is AMD's hardware VERY competitive, but AMD also doesn't go bankrupt and/or get bought out.
You lean heavily towards NVIDIA, and don't say it's because it's rumored. A crossfired Fury (no X) would beat the Titan X in every way possible (besides power consumption).
The RAM bus width obviously has nothing to do with RAM.
That has nothing to do with vram.
Keep in mind that no one actually knows the specifics of the Fury X2, and no one has actually benchmarked it, so this comparison is quite pointless at the moment.
Yeah logic here reminds me of wccftech
64 bit still enough for 18 billion GBs of ram
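The "18 billion GBs" figure in that quip checks out: a 64-bit address space spans 2^64 bytes.

```python
addressable_bytes = 2 ** 64            # 18,446,744,073,709,551,616 bytes
gigabytes = addressable_bytes / 10**9  # decimal gigabytes
print(f"{gigabytes / 10**9:.1f} billion GB")  # 18.4 billion GB
```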
Seems like the editor thinks the Fury X2 performs worse than a single Fury...
To change virtual results and get rekt in real-life comparisons XD
The titan is already finished
This site is full of shit...fokn biased
Or just crossfire two Fury X2's for 16000 shaders??
On top of that, PassMark is a HIGHLY biased benchmark that just shaves huge percentages off AMD scores. For example, the GTX 960 gets a PassMark score of 6000 and my 280X gets 5200... and yet the 280X absolutely wins hands down in nearly every game. The actual unbiased mark for the 280X should be about 6400 if the 960 gets 6000 — if PassMark reflected real-world performance. So the Fury X should be at least 10,500 to the Titan X's 11,000 if PassMark reflected real-world performance... two Fury X GPUs together would be so far beyond anything even the highest overclocked Titan X could ever hope to come close to, it's not even funny.
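The adjusted scores in the comment above come from simple proportional scaling: take the reference card's PassMark score and multiply by the real-world performance ratio the commenter believes is correct. A sketch of that reasoning — the ~1.07 ratio (280X slightly ahead of the GTX 960 in games) is the commenter's own estimate, not a measured figure:

```python
def rescaled_score(reference_score, real_world_ratio):
    # scale the reference card's benchmark score by the assumed
    # real-world performance ratio between the two cards
    return reference_score * real_world_ratio

print(rescaled_score(6000, 1.07))  # ~6420, close to the commenter's "about 6400"
```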
People go for the Titan X because the Fury X2 isn't out yet? And don't forget the Fury X2 is a dual-GPU card, although I do want to see it out soon!
Am I the only one who thinks the Titan X's memory bus should be 512 bit? (It should be possible since there are 12 GB — correct me if I'm wrong.) After all, that would have made it much better at high-res gaming, although it would have increased power consumption.
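There is a reason 12 GB pairs with a 384-bit bus rather than 512-bit: each GDDR5 memory channel is 32 bits wide, so total bus width is channel count times 32. A simplified sketch (the real TITAN X uses clamshell mode with two chips sharing each channel, but the bus math is the same):

```python
# Simplified model: each GDDR5 channel is 32 bits wide and, on the
# TITAN X, carries 1 GB of memory. 12 GB -> 12 channels -> 384 bits;
# a 512-bit bus needs 16 channels, which at the same density means
# 16 GB, not 12.
CHANNEL_BITS = 32

def bus_width(total_gb, gb_per_channel=1):
    channels = total_gb // gb_per_channel
    return channels * CHANNEL_BITS

print(bus_width(12))  # 384
print(bus_width(16))  # 512
```

In practice the 384-bit memory controller is baked into the GM200 die itself, and the 12 GB capacity follows from populating all twelve channels.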