GPUBoss Review: Our evaluation of the R9 FURY X vs the GeForce 210 among Desktop GPUs

Gaming: Battlefield 3, Battlefield 4, Bioshock Infinite and 21 more

Graphics: T-Rex, Manhattan, Cloud Gate Factor, Sky Diver Factor and 1 more

Computing: Face Detection, Ocean Surface Simulation and 3 more

Performance per Watt: Battlefield 3, Battlefield 4, Bioshock Infinite and 32 more

Value: Battlefield 3, Battlefield 4, Bioshock Infinite and 32 more

Noise and Power: TDP, Idle Power Consumption, Load Power Consumption and 2 more

Overall Score: 7.7

Winner: Nvidia GeForce 210

GPUBoss recommends the Nvidia GeForce 210 based on its noise and power.



Differences: What are the advantages of each?

Front view of Radeon R9 FURY X

Reasons to consider the AMD Radeon R9 FURY X

Much higher memory bandwidth: 512 GB/s vs 6.4 GB/s (80x higher)
Much better floating-point performance: 8,602 GFLOPS vs 35.2 GFLOPS (more than 244.2x better; derivation sketched after this list)
Much higher clock speed: 1,050 MHz vs 475 MHz (around 2.2x higher)
Much higher texture rate: 268.8 GTexel/s vs 8.8 GTexel/s (more than 30.5x higher)
Significantly more memory: 4,096 MB vs 1,024 MB (4x more)
Many more shading units: 4,096 vs 16 (4,080 more)
Many more texture mapping units: 256 vs 8 (248 more)
Much better Cloud Gate Factor score: 22.45 vs 3.15 (more than 7x better)
Significantly higher pixel rate: 67.2 GPixel/s vs 4.4 GPixel/s (more than 15.2x higher)
Many more render output processors: 64 vs 4 (60 more)
Much wider memory bus: 4,096-bit vs 64-bit (64x wider)
Slightly higher memory clock speed: 500 MHz vs 400 MHz (25% higher)
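
The headline figures in this list are simple products of unit counts and clock speeds. Below is a minimal sketch (not GPUBoss's own methodology) that reproduces the Fury X's listed numbers, and the 210's memory bandwidth, from the specs quoted above. It assumes 2 FLOPs (one fused multiply-add) per shading unit per cycle and a double-data-rate memory interface.

```python
# Sanity check of the peak figures quoted above (not GPUBoss's methodology).
# Assumptions: 2 FLOPs (one fused multiply-add) per shading unit per cycle,
# and a double-data-rate memory interface (2 transfers per memory clock).

def peak_gflops(shading_units, core_mhz):
    return shading_units * 2 * core_mhz / 1000.0             # GFLOPS

def texture_rate_gtexel(tmus, core_mhz):
    return tmus * core_mhz / 1000.0                          # GTexel/s

def pixel_rate_gpixel(rops, core_mhz):
    return rops * core_mhz / 1000.0                          # GPixel/s

def bandwidth_gbs(bus_width_bits, mem_clock_mhz):
    return bus_width_bits / 8 * mem_clock_mhz * 2 / 1000.0   # GB/s

# Radeon R9 FURY X, specs as listed on this page.
print(peak_gflops(4096, 1050))         # 8601.6 -> listed as 8,602 GFLOPS
print(texture_rate_gtexel(256, 1050))  # 268.8 GTexel/s
print(pixel_rate_gpixel(64, 1050))     # 67.2 GPixel/s
print(bandwidth_gbs(4096, 500))        # 512.0 GB/s

# GeForce 210 memory bandwidth from its 64-bit bus and 400 MHz memory clock.
print(bandwidth_gbs(64, 400))          # 6.4 GB/s
```

The 210's 35.2 GFLOPS figure does not follow from the 475 MHz core clock shown here; on that generation of Nvidia GPUs the shading units run on a separate, faster shader clock, and the same FMA-per-cycle assumption applies to that clock instead.
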
Front view of GeForce 210

Reasons to consider the Nvidia GeForce 210

Much lower TDP: 31 W vs 275 W (8.9x lower)
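
The lower TDP is the only advantage listed for the 210, and it drives the recommendation at the top of the page. For context, a rough performance-per-watt figure can be computed from nothing more than the peak GFLOPS and TDP values quoted on this page; this is a crude theoretical measure, not a benchmark result.

```python
# Crude theoretical efficiency from the figures quoted on this page.
# Peak GFLOPS is a best-case number; measured game efficiency will differ.
cards = {
    "Radeon R9 FURY X": {"gflops": 8602.0, "tdp_w": 275},
    "GeForce 210":      {"gflops": 35.2,   "tdp_w": 31},
}

for name, c in cards.items():
    print(f"{name}: {c['gflops'] / c['tdp_w']:.1f} GFLOPS per watt")

# Radeon R9 FURY X: 31.3 GFLOPS per watt
# GeForce 210: 1.1 GFLOPS per watt
```

By this measure the Fury X delivers roughly 27x more theoretical throughput per watt despite its 8.9x higher TDP.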

Benchmarks: Real-world tests of the Radeon R9 FURY X vs the GeForce 210

Sky Diver Factor (data courtesy of Futuremark)

Cloud Gate Factor (data courtesy of Futuremark)

Metro: Last Light

Comments

Showing 17 comments.
Y'know, the 210 is actually the fastest graphics card ever made; it can run CS 1.6 at 60 FPS just fine! But seriously, they should just use the scores to decide which is better, and fix the "750 being better than the 750 Ti" BS.
8 cuda
43 FPS in "Metro" is the best performance per watt and per dollar ever; GT 210 >> ALL. Could it be that at 4K the GT 210 performs even better? HWCompare is much more reliable. GPU FUCKING BOSS.
Really? I just ran THREE R5 vs 1080 comparisons from this site. Guess which card won them all? I'll give you a hint: The only card IN them all. You can read the rest of what I have to say from my post above. Stupid ass nVidia fanboys.
Let's see.... Following your suggestion, I compared an old, low-end ATI card to the 1080. Then I compared 2 more old Radeons to the 1080. The HD 2400, 4350, & 6400 against the GTX 1080. All 3 ATI cards are in the low-end bracket of very quiet. Guess which card won all the comparisons? I'll give you a hint: the only card IN all the comparisons.

The entire industry and all the idiots are biased to love nVidia's GeForce over ATI's Radeon. It has been that way for over a decade, it is still that way, and your "claims" only backed it up when I actually DID some comparisons. Unless you care to back up your claim with real data instead of meaningless words you simply threw out without bothering to check or offer any real facts? Can you give a specific comparison which backs up your claim, for instance?

By the way, having been a PC tech for probably as long as you've been alive, I can say for certain I have seen many more nVidia cards fail than ATI, both desktop & laptop. It isn't difficult to Google for the nVidia failures and find all the stupid mistakes they have made almost every year, and it usually takes them 2-3 years to fix it. Take the many thousands of MacBook Pros which failed due to the poorly designed nVidia GPU failing (I don't support or like Apple products, FYI). Or the many thousands of Dell laptops (also don't like Dell products, FYI) which failed due to..... Dun dun DUN: THE NVIDIA GPU! However, given I was an on-site repair technician for Dell, I happen to know this was 1 of the 3 most common issues my coworkers & I were sent to fix: nVidia GPUs. Given I worked for HP... Well, you get the picture. Yes, ATI cards can also fail. Nothing is perfect. But not every other year by the thousands like nVidia.

nVidia sucks, the industry loves them anyways because people are stupid, people love them because the industry is so biased. Oh, and while everybody seems to use Nvidia these days, historically they used nVidia for many years, which is part of what tells me you haven't been dealing with this stuff for very long. Even the spell-checker for this website accepts nVidia without complaint, but it complains about Nvidia.

Don't accept what the industry tells you, don't accept all the biased reviews (even LinusTechTips, a VERY popular YouTube guy, is incredibly biased for nVidia), do your own damn research, use your own experience. After the 2nd nVidia GPU failed on me, I switched to ATI and never regretted it. Until I bought my new laptop, because it has the damn nVidia Optimus technology (sigh, it's so difficult to find good laptops with ATI cards these days so I settled for a shitty nVidia card). Why don't you Google all the problems this nVidia Optimus is causing? ATI knew better than to pull this shit. nVidia does shit like this almost every year.
I have no good idea how they put the algorithm together. I'm starting to believe it may have been created by passing out on a keyboard.
Do the same thing with an old R5 versus a brand new 1080 or Titan and it favors the AMD card. It's a broken algorithm, not a bias for Nvidia. Stupid ass AMD fanboys.
You guys are so fucking dumb. It has nothing to do with Nvidia. Do the same shit with 10 year old AMD cards vs 1080's or Titans. It does it for those too, favoring the shitty AMD card for noise/power. It's just a broken algorithm. Paranoid AMD fanboys.
If I need a heater, I'll buy one from Walmart, not AMD.
How much does Nvidia pay GPUBoss?
Totally not biased at all. I mean come on, an overall score of 6.4 totally beats the pants off an overall of 8.2! Mwa ha ha ha ha ha ha ha! You know it's sad when the reason for the GeForce 210 win is "Noise and Power consumption!"

I compared a GeForce 210 vs a GeForce 710 and guess what? The 710 won because of performance, even though its noise and power usage was nearly identical to the 210. Also, the best part is that the GeForce 210, the same graphics card in THIS specific comparison, got an overall score of 6.4, and yet when it is compared to the GeForce 710 its overall score suddenly drops to 5.4. Based on their premise that "Noise and Power Consumption" are sooooo important, you'd think that would play out when comparing the same brands, right? Nope.

Check out the comparison for a Radeon HD 4350 vs the Radeon R5 230, it's funny. But funny how the GeForce 210 beats the R9 Fury X! Man I love this site... it's so blatantly biased it's funny as hell.
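
For readers wondering how a result like this can come about, the following is a purely hypothetical illustration; GPUBoss has not published its scoring method, and every number below is invented. It shows how a recommendation keyed to a single category (noise and power) can contradict an overall score computed as an average across all categories.

```python
# Hypothetical illustration only: all scores are invented, and this is not
# GPUBoss's published algorithm. It demonstrates how a single-category
# recommendation rule can disagree with an overall (averaged) score.
fury_x = {"gaming": 9.5, "graphics": 9.4, "computing": 9.6,
          "perf_per_watt": 7.0, "value": 6.5, "noise_power": 3.0}
gt_210 = {"gaming": 1.0, "graphics": 1.0, "computing": 1.0,
          "perf_per_watt": 2.0, "value": 5.0, "noise_power": 9.5}

def overall(scores):
    # Unweighted average across the six categories.
    return sum(scores.values()) / len(scores)

print(overall(fury_x), overall(gt_210))  # 7.5 vs 3.25: Fury X wins overall...

# ...but a rule that recommends whichever card scores higher on
# "noise and power" alone still picks the GeForce 210.
winner = "GeForce 210" if gt_210["noise_power"] > fury_x["noise_power"] else "R9 FURY X"
print("Recommended:", winner)
```
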
So guys, I just replaced my Fury X with quad-SLI GT 210s and it's a great improvement; I'm getting 167 FPS in Crysis at 4K with ultra settings, AND GUESS WHAT, my rig is only pulling 10 watts from the wall!!!!
ok... 16 CUDA cores and low memory, but it's fiiiiiiiiiiiiine
Totally not biased, not a single bit, really.
K. BRB. Got to sell my Fury X and replace it with a GT 210.
Gotta sell my 480 to play Metro.
wat