GPUBoss Review: Our evaluation of the RX 470 vs 1050 Ti among all GPUs


Gaming

Battlefield 3, Battlefield 4, BioShock Infinite and 21 more

Benchmarks

T-Rex, Manhattan, Cloud Gate Factor, Sky Diver Factor and 1 more

Compute

Face Detection, Ocean Surface Simulation and 3 more

Performance per Watt

Battlefield 3, Battlefield 4, BioShock Infinite and 32 more

Noise and Power

TDP, Idle Power Consumption, Load Power Consumption and 2 more


Overall Score

AMD Radeon RX 470 

GPUBoss recommends the AMD Radeon RX 470 based on its benchmark and compute performance.


AMD Radeon RX 470

GPUBoss Winner

Differences: What are the advantages of each?


Reasons to consider the
AMD Radeon RX 470

Significantly higher memory bandwidth: 211.2 GB/s vs 112.1 GB/s (around 90% higher)
Better floating-point performance: 4,940 GFLOPS vs 1,983 GFLOPS (around 2.5x better)
Better PassMark score: 7,382 vs 5,758 (around 30% better)
Higher texture rate: 154.4 GTexel/s vs 62 GTexel/s (around 2.5x higher)
Significantly more shading units: 2,048 vs 768 (1,280 more)
Significantly more texture mapping units: 128 vs 48 (80 more)
Better T-Rex score: 9.47 frames/s vs 5 frames/s (around 90% better)
Better Sky Diver factor score: 383.43 vs 348.62 (around 10% better)
Better PassMark DirectCompute score: 3,901 vs 3,270 (around 20% better)

Reasons to consider the
Nvidia GeForce GTX 1050 Ti

Significantly higher clock speed: 1,291 MHz vs 926 MHz (around 40% higher)
Higher turbo clock speed: 1,392 MHz vs 1,206 MHz (more than 15% higher)
Slightly higher memory clock speed: 1,752 MHz vs 1,650 MHz (more than 5% higher)
Lower TDP: 75 W vs 120 W (around 40% lower)

Benchmarks: Real-world tests of the Radeon RX 470 vs GeForce GTX 1050 Ti

Bitcoin mining Data courtesy CompuBench

Radeon RX 470
513.7 MHash/s
GeForce GTX 1050 Ti
247.76 MHash/s

Face detection Data courtesy CompuBench

Radeon RX 470
111.75 MPixels/s
GeForce GTX 1050 Ti
75.54 MPixels/s

T-Rex (GFXBench 3.0) Data courtesy GFXBench

Manhattan (GFXBench 3.0) Data courtesy GFXBench

Fire Strike Factor Data courtesy Futuremark

Sky Diver Factor Data courtesy Futuremark

Battlefield 4

Crysis 3


Showing 25 comments.
Maurice just went FullRetard.
After that huge trade of insults and strange "facts", have you considered undervolting an RX 470? It's still better at the same TDP, and if you undervolt it you can run it without the extra power connector, off the PCIe slot alone. It's not recommended, though, as some configuration software runs hardware tests that require the connector.
That also has to do with the "silicon lottery": they ship more voltage than necessary so there's headroom for the bad cases. I also have one, and what he is saying is possible if you are lucky.
I'm glad I got my 1070 for the price of a 1060 then. While AMD is making great strides in performance, the heat issues on their GPUs still remain a concern for me. Ryzen has me excited because those chips don't seem to run as hot while putting out major performance. I just wish they could translate that TDP over to their GPUs so you don't have to go overboard to cool the system. Simple fact is, AMD GPUs tend to run way hotter than Nvidia's.
Okay :3 Thank you Taylor
I pay $0.30 per kWh :) So $40 savings. And I flagged Tim for his shameful, aggressive talk, including name-calling...
Well, TDP and power draw are two different things. TDP is a measurement of the heat generated by the component, not its actual power draw. With TDP being measured in watts just like electrical power, it's easy to see how the two get mixed up so often.
I really doubt that your 470 is only drawing 87 watts. How did you measure that?
She's actually right about its power draw, with the 470 actually drawing nearer to 140 watts under a typical gaming load. The 1050 Ti draws about 75 watts under a typical gaming load, so the cost to run it vs the 470 is about half. That said, if a person games for five hours a day every day of the year, and electricity costs that person $0.15/kWh, then it's only a savings of $20.53 after a year, and the person takes a noticeable hit in performance (either in framerate or graphics settings). I personally prefer the better-performing card (I have an 8 GB RX 480), but different situations call for different solutions.
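The $20.53 figure in the comment above can be reproduced with a short script. This is just a sketch of that arithmetic; the round 150 W gaming load for the RX 470 is an assumption made so the numbers match (the comment itself says "nearer to 140 watts"), with 75 W for the 1050 Ti, 5 hours/day, and $0.15/kWh:

```python
# Annual electricity cost of running a GPU at a given load.
# Assumed figures: ~150 W (RX 470) vs 75 W (GTX 1050 Ti) under
# load, 5 gaming hours per day, $0.15 per kWh.
def annual_cost(load_watts, hours_per_day, price_per_kwh, days=365):
    kwh_per_year = load_watts * hours_per_day * days / 1000
    return kwh_per_year * price_per_kwh

rx470 = annual_cost(150, 5, 0.15)
gtx1050ti = annual_cost(75, 5, 0.15)
print(round(rx470 - gtx1050ti, 2))  # → 20.53
```

At a 75 W draw difference, the yearly saving is indeed in the $20 range, which supports the comment's point that efficiency alone rarely pays for a card upgrade.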
Hmm okay :/ Weird that it has such a high TDP then from AMD v.v
My RX 470 uses 87 watts max.
Brandon Maceri Nope, I'm not an Nvidia fangirl at all xD I never gave one single shit about brands to be honest :) And I even owned a Radeon HD6870 and X850XT before... So yeah...
You're just an Nvidia fangirl.
It's such a minimal cost on electricity it's almost negligible. The 470 smokes the 1050 Ti.
That's not how TDP works, you fucking autist.
You're the one who called me an idiot and told me to cry some more, but I'm being a rude dick. Gotcha :D
Stop being such a rude dick ;_;
LOL, I received an email saying you replied calling me an idiot and telling me to cry some more because I called you out on your exaggerations. Looks like you might have deleted it. You went from saying you could save €164.25 in two years to saying your savings in one year are enough for a new card in your most recent comment. Sure, I'm the crying idiot. lmao! :D (BTW, I never criticized your card choice, just amused by your last response)
Well that escalated quickly. Now you're saying with your power savings in a year by using a 1050 Ti vs RX 470 you can buy a new card? GTFOH
You completely missed my closing point with my last comment. Your situation is not a normal situation, even for gamers, a lot of whom have lives outside of eating, sleeping, and gaming all day. You've been talking like efficiency is such a life-changing factor. It might be for you and your situation, but for the rest of us, who work 40+ hours a week and have friends to visit and other hobbies, the amount of money we save on efficiency becomes considerably less, and it becomes less of an obstacle to the visually nicer experience.

The value of a graphics card isn't just how much it costs to purchase and run, or nobody would ever have bought the R9 295X2 or Titan X. Things that factor into value might be features (having a FreeSync monitor, for instance, makes an RX 470 a much wiser choice than a 1050 Ti in every respect except power draw), potential for higher frame rates or nicer graphics settings, and possible discounts or hookups (buying used or doing an EP). The 470 offers almost all of that over the 1050 Ti except for the power draw, which for most users comes down to the previously mentioned "$15 more yearly for a 470".

I digress, though. The point I'm trying to get across now is that I'm not arguing anymore that you've not found the best solution for you, or that you're wrong (it'll still take longer than a year to save for your 'new' graphics card, which, BTW, if it's an 'upgrade' in any way, is going to bump up your electric bill regardless of what card it is). What I'm saying is that your approach is very specific to you and doesn't necessarily apply to everybody else. I game considerably less than you, work a hell of a lot more than you, and care more for a high-fidelity experience. Yes, my game runs smoothly, but I'm also getting higher frame rates and better graphics quality.
The more expensive card becomes a better value when the end user wants nicer images, higher frame rates, and (once out of the budget level and into mainstream performance) 1440p/4K/VR. Another side note: "fluid" gameplay is very vague. Console gamers consider 24 fps to be fluid; I don't consider anything less than 90 to be fluid, especially if there are any frame-time variances. A 1050 Ti is not going to be powerful enough for most users once they get used to a higher framerate and refresh rate (admittedly, neither will a 470, but that's why I've got an 8 GB 480).
Lind L Taylor If I work 9 hours and game 10 hours, I still have 5 hours left. And while showering I have the game running and I'm logged in but AFK. While eating, the same. But the game runs and pulls energy out of the power plug :) And I do not have a job right now, since I am ill and can't work in my condition. That's why I have so much time where I'm bored, so I play a lot :) And even if I only saved €50, it's still €50. Money I do not need to waste if this lower-consumption card renders my games fluidly. No need for a higher-consuming card to gain more frames per second that I don't even need, since my games ran fluid nonetheless :)
I'm still skeptical of this... I mean, you claim to live alone and pay your bills and everything yourself. You'd have to be a very highly paid sponsored gamer to make enough money gaming 10 hours a day to support yourself. So... yes, I believe you in YOUR usage situation, but for everybody else, who has a job that takes 6 hours five days a week, it's not going to play nearly as much of a role as it does for you. What I'm saying is that your situation isn't the norm; it's the extreme.
Yes, electricity costs around 30 cents here... And yes, I play around 10 hours a day. I even run a netbook most of the time besides this, to have Twitch running, check mails and so on. And I know what I paid for my power before and now. I saved over €100 after my 2-year contract, because I wasn't at full power for all 10 hours. But still €100 of savings... which means HALF of a new card. Just by savings! While all games still run fluid. So I see no sense in wasting power just for a few more frames... It's playable, it's fluid and it's saving me money. That's what I care about :)
First of all, please format better. Your whole comment is excruciatingly painful to look at, and your calculation is damn near impossible to follow. Secondly, you're calculating at full load. I highly doubt you're running your stuff at full load for 10 hours daily. Just leaving your computer on, without doing anything, is NOT full load. Unless you leave your machine doing synthetic benchmarks, you're not going to produce a significant difference in power draw between the two cards, as idle power draw for both cards is a lot less than this, much closer to 8 watts for the 1050 Ti and 16 for the 470. How much of that 10 hours daily is actually spent pushing a hard 3D application (modern triple-A titles, or torture-test synthetic benchmarks like FurMark or 3DMark)? I'll give you the benefit of the doubt and guess that you're actually gaming for 5 of those 10 hours you leave your machine on, with the machine idling the other five.

Load: 75 W × 5 h = 375 Wh (0.375 kWh)
Idle: 8 W × 5 h = 40 Wh (0.04 kWh)
Total daily draw: 415 Wh (0.415 kWh)
Total daily cost @ $0.30/kWh = $0.1245
Total annual cost @ $0.30/kWh ≈ $45.44

Obviously, that's the 1050 Ti. The power draw on the 470 scales roughly linearly, so it costs about twice as much in electricity. You're still only saving about $45 annually. It's not enough even after three years. Does electricity really cost you that much? I pay about $0.17/kWh.
I'll give you a calculation now: a 75 W difference in draw over 10 hours means 750 Wh; over 20 hours, 1,500 Wh. Electricity here costs 30 cents per kWh, so I paid 45 cents MORE for the same 20 hours. If I only let my stuff run for 10 hours, which IS realistic, I come to 22.5 cents more paid per day. 22.5 cents × 365 days = 8,212.5 cents, i.e. €82.125 per year MORE for the same gaming hours... And in 2 years (when you usually upgrade cards again) I have already saved €164.25. Just savings alone! And for that I nearly get a new card! :) JUST FROM THE SAVINGS! :D My range is around €200 max for a card. So you see I save a lot of money, which I can put into hardware :3 You can re-check my calculation, but I don't think I made a mistake :)
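The arithmetic in this comment can be checked with a quick sketch, assuming the same figures it uses: a 75 W draw difference, 10 hours of use per day, and electricity at €0.30/kWh:

```python
# Verifying the two-year savings claim from the comment above:
# 75 W extra draw, 10 h/day, 0.30 EUR per kWh.
extra_watts = 75            # additional draw attributed to the RX 470
hours_per_day = 10
price_per_kwh = 0.30        # EUR

extra_kwh_per_day = extra_watts * hours_per_day / 1000   # 0.75 kWh
extra_cost_per_day = extra_kwh_per_day * price_per_kwh   # ~0.225 EUR
savings_two_years = extra_cost_per_day * 365 * 2
print(round(savings_two_years, 2))  # → 164.25
```

Given those inputs the €164.25 figure is correct; the dispute in the thread is really about whether 10 hours of full load per day is a realistic assumption.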