GPUBoss Review: Our evaluation of the R7 360 vs the 750 Ti among desktop GPUs


Gaming benchmarks: Battlefield 3, Battlefield 4, Bioshock Infinite and 21 more

Synthetic benchmarks: T-Rex, Manhattan, Cloud Gate Factor, Sky Diver Factor and 1 more

Compute benchmarks: Face Detection, Ocean Surface Simulation and 3 more

Performance per Watt: Battlefield 3, Battlefield 4, Bioshock Infinite and 32 more

Noise and Power: TDP, Idle Power Consumption, Load Power Consumption and 2 more


Overall Score

Nvidia GeForce GTX 750 Ti

GPUBoss recommends the Nvidia GeForce GTX 750 Ti based on its benchmarks and its noise and power results.


Differences: What are the advantages of each?


Reasons to consider the AMD Radeon R7 360:

Higher effective memory clock speed: 6,000 MHz vs 5,400 MHz (more than 10% higher)

Reasons to consider the Nvidia GeForce GTX 750 Ti:

Higher texture rate: 82.64 GTexel/s vs 50.4 GTexel/s (around 65% higher)
Better Fire Strike Factor score: 33.8 vs 28.33 (around 20% better)
Lower TDP: 60 W vs 100 W (40% lower)
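The percentages in these lists are plain relative differences. A throwaway snippet (mine, not GPUBoss's methodology) reproduces them from the raw numbers above:

```python
def pct_higher(a, b):
    """How much higher a is than b, as a percentage."""
    return (a / b - 1) * 100

print(round(pct_higher(82.64, 50.4)))  # texture rate: ~64% higher
print(round(pct_higher(6000, 5400)))   # memory clock: ~11% higher
# TDP is quoted the other way around: the 750 Ti's 60 W is 40% lower
# than 100 W, while the R7 360 draws ~67% more.
print(round(pct_higher(100, 60)))      # ~67
```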

Benchmarks: Real-world tests of the Radeon R7 360 vs the GeForce GTX 750 Ti

Bitcoin mining (data courtesy of CompuBench)

Radeon R7 360: 189.76 MHash/s
GeForce GTX 750 Ti: 182.53 MHash/s

Face detection (data courtesy of CompuBench)

Radeon R7 360: 37.98 MPixels/s
GeForce GTX 750 Ti: 56.77 MPixels/s

Ocean surface simulation (data courtesy of CompuBench)

Radeon R7 360: 819.2 frames/s
GeForce GTX 750 Ti: 849.54 frames/s

[Chart-only results, values not captured: T-Rex and Manhattan (GFXBench 3.0, data courtesy of CompuBench); Fire Strike Factor and Sky Diver Factor (data courtesy of FutureMark); Bioshock Infinite]

Showing 25 comments.
An overclocked R7 360 vs a standard GTX 750 Ti, which can itself still be overclocked to increase its performance by up to 5%. It is obvious from the introduction of the video who will win.
Nah, not a true fanboi -- I know plenty of Intel and AMD fanboys etc. -- I just build a ton of machines, from base-level client machines to high-end servers to residential gaming machines. And if the client's budget allows, I have had better luck (as I tend to provide support for many of these builds over the long term) with reliability and the general number of issues (user error or otherwise) when going with an Intel/Nvidia combo over AMD. Just MY experience. I've had plenty of AMD machines without a complaint and some Intel/Nvidia machines with horrendous complaints as well... neither is 100% perfect :)
You're just a simple Intel fanboy.
How about an AMD CPU with an Nvidia GPU? That's an option worth exploring. I've tried it; it works much better.
Well, 14-year-olds generally don't have $300-$660 for the latest and greatest Intel CPU cycle offering. Yes, parents do, but the majority of people aren't exactly rich and able to buy a new Intel CPU/computer for their kid every tick or tock cycle (about every 6 months). The bottom line is that Intel is light years ahead of AMD in processor technology, die design, R&D, etc. Intel CPUs are vastly superior to AMD's. That's not bias toward the company; I don't care what the name/manufacturer of a product is as long as it's reliable, fast, stable, etc.

Intel had competition from AMD back in the '00s, with the race over who could get a single core to the highest X GHz every X months/year. That settled down eventually, and then multi-core processors became more valuable than raw clock speed on a single core. The way we run our computers/devices now is in line with the chipset designs. Take an i7 with 4 physical cores and 8 threads: we now run half a dozen or a dozen-plus applications at once, and even a single application takes advantage of spreading the aggregate load across the cores/threads. Four cores at 25% is much more efficient than one core at 100%, of course.

As for pricing: demand plus competition dictate pricing in capitalism. We have constant demand -- games push demand for the fastest hardware more than most things, and music production, video production, etc. also drive demand for more capable processors. The demand is there; the competition just isn't. AMD's only angle for quite a while now has been: hey, look at our chips, they're cheaper, with 8 billion cores -- but they're unreliable, problematic, and have high thermal output and high energy needs. Intel has CPUs covering the gamut from 5 W TDP on up, at high speeds on all platforms (mobile to high-end desktop), that are nothing but reliable and don't have issues.
As consumers, we need another chip manufacturer to come along and rival Intel -- to bring down costs AND expedite the technology improvements. Right now Intel has zero competition, so they go at their own pace with R&D, with each tick or tock cycle being better, but not by leaps and bounds. E.g., a 4770K i7 is pretty much the same performance as a 6700K, only with a dramatically different price tag. What has improved is the graphics capability of Intel's integrated GPUs each generation, which will finally make a substantial difference as DirectX 12 becomes commonplace in applications.
I think the reason Intel can charge such a premium is that every 14-year-old kid gets brainwashed into thinking every Intel CPU ever made is some sort of God-crafted gem.
AMD rocks
This, combined with the optimisation of Nvidia drivers, really kicks your PC into beast mode for a $400 build.
It's a turn of phrase. What I meant is: many people will be "fanbois" of XYZ brand, and therefore could mislead you into a choice you would be unhappy with. The overarching point was that, through my subjective experience, the combination that works best more often than not is Intel + Nvidia. Each person must live with their choices, no matter how big or small, consequential or not -- but fair point on starting my view with "I don't care what arguments people give." :) -- Out of curiosity, what do you prefer? Have you built many different iterations of each brand (mixed and non) over the years to get a solid feel, etc.?
The moment you say you don't care what argument someone has against your stance, your stance becomes meaningless. You are just giving another worthless opinion.
I remember! Quite frankly, it's just Intel's resources. Intel has far more working capital, a MUCH larger engineering team, etc. When it was still about how much GHz you could squeeze out of a core, AMD hung in there, winning with one launch, then Intel would counter, and so on. However, we reached the point where we aren't going up 100%, or even 50%, on core processing speed; it's now about the number of cores at a stable 3.5-4.5 GHz. Intel's R&D has also achieved much faster processing rates in their physical designs.

AMD: L2 at 10 ns = 10 billionths of a second = at least 10 CPU cycles of stall for any instruction not found in the L1 cache. If this were a 1 GHz processor and you had a 100% cache miss rate, it would be as effective as a 100 MHz processor that was 100% effective. (Not exact math, and since the instruction may be found in the L3 cache, even a 100% miss rate in the L2 will not result in a failure to process data; it will just occur at a very slow rate, like one instruction per 33 CPU cycles, give or take a few.) L3 at 33 ns = 33 billionths of a second = 33 CPU cycles of stall. Versus Intel: L2 at 3 ns, L3 at 10 ns. Intel's Hyper-Threading uses pipeline stalls to feed the core instructions waiting in the cache while the transfer for the first thread occurs, so instead of an actual 10 ns penalty, the core may still use 6 of those cycles on another thread to improve total instructions per clock.

In short, AMD processors spend most of their time waiting for instructions because they don't meet the IPC that Intel does. However, the longer pipeline is sometimes better (server applications with large data sets and small common requests): when the data is found in the cache, or when there is no possible way to have all the data in the cache, they perform on par with Intel. AMD also took the bid and won the console market, but the margins are razor thin for that type of contract; Intel "could have" won that war but didn't want the, frankly, thin margins.
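The stall arithmetic in the comment above can be sketched as a toy cycles-per-instruction (CPI) model. This is my own illustration of the commenter's back-of-the-envelope math, not anything from the page, and real cores (out-of-order execution, prefetching, SMT) behave far better than this:

```python
def effective_mips(clock_ghz, miss_rate, stall_cycles):
    """Toy model: average cost per instruction is the base 1 cycle
    plus the expected stall from cache misses."""
    cpi = 1.0 + miss_rate * stall_cycles
    return clock_ghz * 1000 / cpi  # millions of instructions per second

# A 1 GHz core where every instruction misses L1 and pays a 10-cycle
# L2 latency runs at roughly a tenth of its nominal rate:
print(effective_mips(1.0, 1.0, 10))   # ~90.9 MIPS
# A realistic 5% miss rate hurts far less:
print(effective_mips(1.0, 0.05, 10))  # ~666.7 MIPS
```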
Intel is quite content being the best desktop, business, and server chipset by a MILE and being able to charge a nice premium for it. Competition is always good for us consumers, though; it brings down pricing. That said, frankly, Intel CPUs aren't really THAT expensive if you aren't getting the very latest iteration, which isn't necessary in 99.99999% of situations. You could grab a 3rd-generation Intel i5 or i7, put it in a gaming rig or other type of machine, and still be pushing top performance. AMD's chips just use an odd configuration and WAY too much TDP/power wattage. The same really goes for their graphics cards (which are much more competitive with their competitor Nvidia), but AMD pushes "lesser" chips a lot harder to cut their prices so much. Nvidia has really always been the choice for reliability (more so compatibility) with games. It's just the standard, really. I know a lot of AMD fanbois would have a lot to say about AMD graphics vs Nvidia, but as I mentioned, an Intel/Nvidia combination always turns out to be the better long-run investment for me. Not that AMD graphics can't run the best games at insane FPS.
It's actually quite ironic, because I remember in the early 2000s that Intel was getting their asses handed to them by AMD. I don't know if it was some sort of leadership issue or whatever, but AMD seems to have really fallen off. It really is a shame that Intel now has a near monopoly over AMD.
Performance boost? Some, but you won't get as much as the pimps are suggesting with their carefully crafted tests, particularly when you look forward at future game requirements and see that either card is anemic, at least at the higher resolutions all the cool kids are using these days. DX12 is more about CPU efficiency increases, while the same GPU will have the same bottleneck pushing the same pixels. I lean towards the 750 Ti for a different reason, though: an owner of either is about due for a video card upgrade for future games, and a 750 Ti would make a fair card for adding a couple more monitors, especially for reuse in a different system that isn't used for gaming.
OK, I read into it and I was wrong: the 750 Ti does also support DX12! AMD cards seem to get a bigger boost from DX12 right now, but we'll see once it actually comes out.
True, the performance is pretty much the same with these two cards right now. However, I would definitely go with the 360, since it will support DX12 (unlike the 750 Ti), which will result in a performance boost in the future. And the argument about power consumption is proven to be almost worthless, especially with these low-power cards (the energy bill difference would be ~$5 a year here).
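The ~$5/year figure is easy to sanity-check. In this sketch, the 40 W gap comes from the TDP figures earlier on the page, while the hours of use and the electricity price are assumptions of mine, not from the comment:

```python
def yearly_cost_usd(extra_watts, hours_per_day, price_per_kwh):
    """Extra electricity cost per year for a device drawing extra_watts
    more power during hours_per_day of use."""
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

# 100 W vs 60 W TDP -> 40 W extra, assuming 2 hours of gaming a day
# at an assumed $0.15/kWh:
print(round(yearly_cost_usd(40, 2, 0.15), 2))  # ~4.38
```

At full TDP the gap only matters under sustained load; idle draw on both cards is far lower, so the real-world difference is likely smaller still.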
Always go Nvidia and Intel. I don't care what arguments people give; they are both the better choice. Games are designed for and more compatible with Nvidia, and Intel has had a huge lead over AMD for many, many years now. There is a reason these "brands" cost more. Ask yourself why AMD has to use twice the power to bring about a similar result. I build half a dozen systems a week, and I learned my lesson years ago.
I own both cards. I use the R7 360; it's OK. I haven't tried the GTX. I might just return it.
Was the test machine running DirectX 11 or 12? My understanding is that the AMD 360 gets a bigger increase in performance using DX12 than Nvidia. Just curious. Thanks for the video!
better watch videos...
I would suggest the R9 270.
Hi, I have the FTW edition, and for the money I spent it's a great buy. For comparison, at 1440x900, with an overclock of 1,500 MHz core clock and 2,800-ish memory clock, I get an average of around 80-110 FPS in Battlefield 4 (Ultra), depending on the map; sometimes, if Golmud is full, I will get lower, around 60-ish. BUT it's a good buy all the same. Also, on Newegg there is an R9 270 by PowerColor for $109 with free shipping; that's where I would put the money.
I got the GTX 750 Ti; it's good.
I'm looking at the same choice, the Asus R7 360 or the MSI GTX 750 Ti, and I'm confused about these two. The 360 is cheaper but consumes more power, and the 750 Ti is power efficient, BUT some people say it's also much better and faster in games than the R7 360.