
AMD Radeon vs Nvidia


ck_kent


Hi guys!

Is there an advantage in using an AMD Radeon video card (HD5570) on a motherboard w/ AMD Radeon on-board (HD3200) vs. using an Nvidia (GT240) on that same board?

I'm planning on buying a new video card and I'm only choosing between the two - HD5570 512 MB DDR3 vs. GT240 512 MB GDDR5. The GT240 is about $3 more than the HD5570. But from what I've read in reviews (Tom's Hardware and AnandTech), the GT240 slightly beats the HD5570 in performance but uses slightly more power and runs a bit hotter.

I've also read about the advantages of the HD5570 - Eyefinity and DirectX 11 - but I won't be using multiple monitors with it, as I only do light gaming and most of the games I play are only DirectX 10.

I'm already leaning towards buying the GT240 but I thought I should ask for your opinion whether there's an advantage using a Radeon card on a board with built-in Radeon vs. an Nvidia card on a board with built-in Radeon.

Thanks!




Get a slightly better card if you want to play DX10 titles. Last-gen 4870 and 4850 cards should be cheap. If you can push a little more cash, the 5750/5770 are also affordable. The 5570 is MUCH too slow. You don't really need to care about power/heat at this level; all current-gen entry-level video cards are fine on those counts.

Also, you have to make sure your motherboard can support Nvidia if you choose to go with one. If it does, I think you need Windows 7 if you want to have both the Radeon and the Nvidia functional inside your system.

If I were you I'd get a 5750 and skip the possible Nvidia complications. Still, you have to make sure your system meets the requirements for any discrete card you buy, especially the power supply/RAM/CPU. You will find these on the manufacturer's product page.



The one and only benefit of brand matching would be if you care to make use of Hybrid CrossFire or Hybrid SLI. Other than that there are no benefits. Hybrid CrossFire/SLI doesn't offer much performance gain, maybe 2 fps; its real purpose is power savings.

AMD video cards are more efficient in every way compared to Nvidia. AMD manages to deliver the same computing power with fewer transistors and less power usage. Of course, if you want your computer to double as a furnace or space heater then you can get Nvidia, but Fermi is the most power-hungry and wasteful video chipset around.

Don't be fooled by Nvidia model numbers either; they haven't really designed a new GPU between the 8800GT and Fermi - everything in between was either a rebadge or a reworked iteration of the 8800GT. They do that to play on customer perception, since people want to buy what looks like a newer model number.

I would recommend the Radeon HD 6850, which will play all the games you want; the only thing single GPUs have problems with is high anti-aliasing, so you probably won't be able to do 16x AA with everything on Ultra High in every game. :D

For regular gaming the 6850 will work great, 2x AA is enough for most people, sometimes you don't even need it.

Almost forgot to mention, the power savings and efficiency extend to motherboard chipsets too. So if you are planning to buy a new motherboard, the one with the AMD chipset will use less power, around 10 to 15 watts, as opposed to the Nvidia chipset that uses around 35 to 50 watts. People usually think the Nvidia boards are higher quality because they see lots of copper heatpipes on them, but those are there because they're mandatory - otherwise the heat would cook everything. Motherboards with AMD chipsets come with smaller heatsinks because they dissipate less power.
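If you want a rough number on what that chipset difference means, here's a quick back-of-the-envelope calculation; the 24/7 uptime and the $0.10/kWh electricity price are just illustrative assumptions, not measurements:

```python
# Rough estimate of the yearly cost difference between the chipset power
# figures quoted above (AMD ~10-15 W vs. Nvidia ~35-50 W).
# The 24/7 uptime and the $0.10/kWh rate are illustrative assumptions.

HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.10  # USD, assumed for illustration

def yearly_cost(watts):
    """Energy cost of a constant load running all year."""
    kwh = watts / 1000 * HOURS_PER_YEAR
    return kwh * PRICE_PER_KWH

amd_chipset = 12.5     # midpoint of the 10-15 W range quoted above
nvidia_chipset = 42.5  # midpoint of the 35-50 W range quoted above

diff = yearly_cost(nvidia_chipset) - yearly_cost(amd_chipset)
print(f"Approximate extra cost per year: ${diff:.2f}")
# -> roughly $26/year at these assumed numbers
```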

The only downside is that AMD can't seem to beat Intel on CPU efficiency, and hasn't for the longest while; both platforms can use the same low power at idle, but load power is much higher with an AMD CPU. Anyway, don't worry too much about that, since most computers sit at idle for the majority of the time anyway, and AMD CPU prices are so affordable you can get a quad-core for under $100 USD.



Can't you get an HD4850 for less, or don't they sell those anymore?

I got it, it's great :)

(I believe at the time I could get either the HD5550 or the HD4850... Same price.)



Thanks all for those great replies.

@LeetPirate: I'll definitely keep that information in mind.

@toyo & shought: The HD4850 is not available here in our country anymore and I don't buy online. Also, even if I did find an HD4850, I'm worried that my stock power supply wouldn't be able to support it. :unsure:



  • Administrator

How many watts is your PSU (power supply unit)?

Even I'll recommend AMD over nVidia. When I invested in a DX11 card, I looked to the future, whether I'd need it or not. Honestly, I made a mistake, but on the other hand, there are many apps and other things that need features that mostly DX11 cards have.



There's no future-proofing with video cards! There's a new-gen launch every 4-5 months or so, for God's sake! I bet you all remember the launch of the HD5000 series as something really recent, but look where we stand now.

If I had a DX11 card, it would only be of use to me in games, and - if it were an Nvidia (I won't buy another card without CUDA, ever) - in Premiere's Mercury engine, compared to my HD4850. I think the time will come soon when I'll need a faster card myself, although the 4850 usually runs my games at 1080p. And when it dips into the low-20s fps zone, it's usually because the game is poorly coded.

What card did you get DKT, by the way? And why was it a mistake?



  • Administrator

Well, I really, really wanted the ATI 5570, but it wasn't available here in India at that time, and anything more was out of my budget; the i5 is not cheap. So I had to settle for the ATI 5450, at double the price it was selling for in the US. :frusty:

I was so annoyed with those games telling me that I don't have shader model this, I don't have DirectX that, your card doesn't fully (100%) support OpenGL, and then you can't download these new drivers because they are too new for your card. It was like hell.

Considering that my monitor is just 17" and I'm a 1152x864-desktop and 1024x768-gaming guy, I thought the 5450 was all I needed to get away from those new-game compatibility problems. Of course, the card's performance is so bad that even the old ATI 9000 can come near it. On the other hand, I can still run games at 60 FPS, the max my monitor allows, except NFS Hot Pursuit, which ran at around 25-30 FPS. But that's not really the point here. I have not played a single game that requires DX10 or DX11. :blink:

What made me sad about my card is that one of the members here benchmarked an AMD 68xx card with 3DMark and scored 4000 points with FPS around 25 or something. When I benchmarked mine, it scored only 390, with FPS at 1 to 2. :(

EDIT: Forgot to mention, I don't think ATI HD 4xxx supports OpenCL.



Owww. That's a fine CPU. But that's a home-cinema video card, meant for casual gaming or older stuff.

I learned many years ago, when the GPU thing was just starting to heat up (the Voodoo period, 3DFX dominance), that you have to have PATIENCE. Meaning you never, ever buy from the low end. You stay with what you have and just save penny by penny until you have enough to buy something from the mid-range - at least if you need some performance and want to play recent titles. Eye candy will need some of that performance, even at 1024x768.

The 6800 is totally in another league... I think the 6870 is approx like a GTX 285 which was damn fast. So don't be sad at such comparisons... they just don't mean anything.

I'm 100% sure you'll never buy another HTPC video card, but maybe my words will help someone else.

OpenCL is supported on the 4800 series if you install the CL driver; I have it right now on my card. It may come with some limitations compared to the 5000/6000 series, just like DirectCompute. Not that I use it for anything. But why did you edit your post about it?

Ooo, shiny... GPU-Z has automated screenshot-taking capabilities...

[attached GPU-Z screenshot: db4.png]



  • Administrator

I see. Thanks for the info and for giving me some hope. :)

You are right, I don't think I'll ever buy another card like this one. I am a hardcore gamer, but I don't prefer FPS games, and I'm currently in the mood to play all the 1990s graphic adventure games - you know, the LucasArts ones - at least once in my life, because you never know, with the new PCs I won't be able to get a taste of that richness again. Of course, I do play all the new ones, but not FPS.

About OpenCL, well, AMD and Wikipedia don't seem to mention anything about it. You also installed the ATI OpenCL version to get it, right?

Speaking of GPU-Z, here's mine. Do you think it would be safe for me to overclock the GPU from 650 MHz to 700 MHz and the memory from 800 MHz to 850 MHz via ATI Catalyst Control Center's Overdrive?

[attached GPU-Z screenshot: 59s.png]



Catalyst has an automated Auto-Tune (which is crap and takes some time, but can be useful). Try running it. It will give you values that are higher than your card can actually handle.

From what I know of the low end, it's only the GPU that is actively cooled. That leaves the memory and the voltage regulators with only passive heatspreaders and the occasional blow from some nearby fan, meaning you should be cautious when overclocking the memory. The memory chips are actually rated by their manufacturer (maybe Hynix or Samsung? You can ask XFX what chips you have, or look for a review or reviewer that knows), so you could push them to the rated value, which is usually higher than the speed they actually run at.

Let Auto-Tune run and then decrease the GPU by 10(-ish) MHz, and the memory... hmmm... I wouldn't push the memory much, since it has no active cooling.

So in conclusion, depending on your cooling, you could go for the 50 MHz on the GPU, which can be stable if you keep the temps around 90 Celsius, maybe even a little higher.

Maybe it will be stable, maybe not. If not, you could see games crashing, or white "snow" dots on the screen as the memory reaches its limits.

LINKY: http://www.overclock.net/ati/662490-xfx-hd-5450-a.html

I actually wrote this before I had the idea to search. So I'm not going to throw it away! :)

This guy pushed it to 700/900, which is damn crazy on the RAM, since I see it doesn't even have heatspreaders; the chips just sit there.

My personal opinion is that it isn't really worth it... very small performance increase. I tried an OC on this 4850 of mine some time ago and got bored with the few frames I gained. I mean, in games where I needed it (under 30 fps) I never got more than 2-3 fps. Who cares if it stutters at 27 or 24 fps? And in the games where performance was 50 fps+, it didn't matter.
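Just to put numbers on why the gain is so small: even if performance scaled perfectly with core clock (it usually scales worse), the bump discussed here only buys a few percent. A quick back-of-the-envelope sketch using the clocks from this thread:

```python
# Best-case fps gain from a core overclock, assuming performance scales
# linearly with GPU clock (real games scale worse, especially when
# CPU- or memory-limited). Clocks are the ones discussed in this thread.

def best_case_fps(base_fps, stock_mhz, oc_mhz):
    """Upper bound on fps after an overclock, assuming perfect scaling."""
    return base_fps * oc_mhz / stock_mhz

stock, oc = 650, 700          # 5450 core clock: stock vs. proposed OC
for fps in (25, 50):
    gained = best_case_fps(fps, stock, oc) - fps
    print(f"{fps} fps -> at most +{gained:.1f} fps ({oc / stock - 1:.1%})")
# 25 fps gains at most ~1.9 fps, 50 fps at most ~3.8 fps - which matches
# the 2-3 fps improvement described above.
```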

I hope this all makes some sense. Please ask any questions if you have...



OK, I think I solved your memory mystery. The chips should be like the ones in the pic.

Here is a Hynix decoder for what the codes mean.

The speed of the chips is encoded in the "12C" part of the code, so they are rated at 12: 800 MHz, and C: Commercial Temp & Normal Power, whatever that means to Hynix.
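If you'd rather script the lookup than eyeball the decoder, here's a tiny sketch; only the two codes quoted above are filled in, and the rest of the tables would have to be copied in from the Hynix document:

```python
# Minimal decoder for the speed/temperature suffix of a Hynix memory part
# number (the "-12C" style code discussed above). Only the entries quoted
# in this thread are filled in; extend the tables from the Hynix datasheet.

SPEED_GRADES = {
    "12": "800 MHz rated clock",   # per the decoder referenced above
}
TEMP_POWER = {
    "C": "Commercial temperature, normal power",
}

def decode_suffix(suffix: str) -> str:
    """Decode a suffix like '12C' into a human-readable rating."""
    speed, grade = suffix[:-1], suffix[-1]
    speed_txt = SPEED_GRADES.get(speed, f"unknown speed code '{speed}'")
    temp_txt = TEMP_POWER.get(grade, f"unknown temp/power code '{grade}'")
    return f"{speed_txt}; {temp_txt}"

print(decode_suffix("12C"))
# -> 800 MHz rated clock; Commercial temperature, normal power
```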

Still, XFX could have many reasons to swap out these default Hynix chips, so you could end up with different stuff on your card, but we can find out their speed rating if you post the numbers or take a pic.

Also, I assume that XFX didn't bother to mess with the voltage specs or other stuff (these are also encoded in those numbers/letters).

It's no surprise finding out that XFX used very cheap memory on the card, rated at exactly the official specs. And with no cooling on it...

[attached thumbnail: photo of the memory chips]



  • Administrator

I did read that Auto-Tune is buggy and creates problems. I ran it for 5 minutes, only to cancel it with ESC (as it was damn fullscreen) and see that so far it had only overclocked the GPU to 655 or 665. Somehow I lost my patience there.

The reference card does have a red (and beautiful) heatsink, but the XFX one doesn't - well, it could be out of stock as per your link. Mine is exactly the same as shown in the pic. Except I have a rear fan in my computer case. I know, everyone has one, but the old computer that I used for 4-5 years didn't. :P

I was not exactly aware of the bad effects of overclocking, though, like the snow dots you mentioned.

Yes, I did consider overclocking it after reading about the guy going to 700/900. But considering that I'm not a real overclocker (so far) and the really high temperatures in India, I should take some real care.

Sometimes even 3 frames matter to me - yes, with a card like this, they do. But something is happening that surprises me and I'm not sure why. You see, one day I was benchmarking a browser in Peacekeeper, so I changed my Control Panel power settings from Power Saver to High Performance, and after benchmarking I forgot to change it back. In both of those settings I keep my graphics at full power, considering how few watts it draws. Then I played a game and was surprised that a game I used to play at about 25 FPS was now topping 75 FPS. Now, the i5 has SpeedStep, and in Power Saver it sits at 1200 MHz, with a 2660 MHz base, and turbos up to 2800 MHz (4 cores active) or 3200 MHz (2 cores active). In High Performance mode the CPU is set to full, that is, 2800 up to 3200. Do you think my game is not able to make SpeedStep raise my CPU's speed? I normally keep mine on Power Saver, but it should clock up just fine.

EDIT: Didn't see your new post. Can't check it today; will report back tomorrow about the onboard chips. :)



That powerful i5 paired with the 5450 is a nasty combo. It's very hard to judge correctly what is happening in your system, especially with power-saving features active. If you want a clear picture, disable any power-saving feature of the i5 from the BIOS and let the GPU be the only factor that makes the difference. Otherwise it's impossible to quantify what impact a GPU OC has on your games.

Games do not utilize SpeedStep directly; apps just demand computational power. So if your game doesn't need more than 1200 MHz, the CPU won't step up to a higher multiplier. This isn't the only power-saving mechanism at play, though. There are others: some work at the CPU level, others at the CPU-chipset level, and some need a compliant OS. So it's a very complicated ballet between the components, just for a few watts.
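If it helps to picture it, here's a toy sketch of that demand-driven stepping; the 133 MHz base clock and the multipliers mirror the i5-750 numbers quoted in this thread, while the load thresholds are made up purely for illustration:

```python
# Toy model of demand-driven frequency scaling: the CPU only steps up to a
# higher multiplier when the load actually asks for it. The multipliers and
# 133 MHz base clock mirror the i5-750 figures quoted in this thread; the
# load thresholds are invented purely for illustration.

BCLK_MHZ = 133

def pick_multiplier(load_pct: float, active_cores: int) -> int:
    if load_pct < 20:
        return 9           # idle -> SpeedStep floor (~1.2 GHz)
    if active_cores <= 2:
        return 24          # lightly threaded load -> max Turbo (~3.2 GHz)
    return 21              # all-core load -> lower Turbo bin (~2.8 GHz)

for load, cores in [(5, 4), (90, 2), (90, 4)]:
    mult = pick_multiplier(load, cores)
    print(f"load={load:3}% cores={cores} -> {BCLK_MHZ * mult / 1000:.2f} GHz")
```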

If you play a pretty demanding game, let's say Mass Effect 2 (because that's the one I have installed right now...), it will most probably keep your CPU at the turbo/max multiplier, except in the menus, when the CPU will throttle down to a low multiplier.

If you are fearless... then you can overclock your card from its BIOS. The program you need is RBE (Radeon BIOS Editor). There are dangers. I used it on my 4850 back in the day. You see, all reference-design 4850 cards came with some amazing idle temps. Mine was 70-80 Celsius at idle... and 85 Celsius at full load :|

Urgent action was needed, so I made myself a BIOS with modified PowerPlay values, and my idle went down to 45 Celsius after some magic. RBE was the tool. The same PowerPlay manipulation is possible for a clean overclock. From RBE you can also modify voltages. But you must first know what a safe OC is for your card, and be prepared to convince the guys at the store that the card died by itself and you have no clue what a BIOS is.

If you brick it, it's still possible to restore it (flash it again with a correct BIOS)... but it's also possible to brick the card completely. I flashed mine more than 10 times and all was cool. Now I use an official BIOS from ASUS, as they corrected the temp issue. And it only took them half a year!



  • Administrator

I see. Well, it seems the game runs at 300% speed when the CPU is forced to full speed, but the game itself doesn't feel the need to ask for it.

I downloaded RBE a couple of days ago, but I feel ATI Overdrive is the safer route.

I don't think I'll risk flashing my BIOS.

--------------------------------------------------------------

Just did a FurMark stress test on Power Saver and High Performance. My idle temp is 48C. I ran it for about 5 minutes each; the average frame rate stays the same both times, and the max it reaches is 70C at default speeds. Can I risk going up to 80C by overclocking in 10 MHz steps? I mean, what's the safe temp limit?

BTW, thanks for your help. It's been really helpful. ^_^



At low resolutions, games are very CPU-dependent. In the summer my GPU hits even 100 Celsius in FurMark, stable for hours. Dunno about the 5000 series, but I think 90 is safe. I'll hit the bed right now, but I'll be back with some more info tomorrow. Glad I could be of assistance.



What you need is a good CPU cooler and superior thermal paste. The safe limit is relative; the CPU will shut the computer off when it reaches its limit. The problem is that the CPU's temp limit is high, and the excess heat puts stress on the surrounding components and the motherboard PCB. My 32nm i3 can reach 4 GHz on air cooling with 35C idle and 60C at max load, and it blows away i5s and Core 2 Quad Extremes in benchmarks. Of course it cannot do that in multi-threaded benchmarks, but in real life only a few applications make use of multiple independent threads. Design software and encoding software can use more than 8 cores and whatnot, but games hardly use more than 1 or 2 cores. So for real-life usage the i3 was the better investment for me, since the applications I use mainly depend on 1 or 2 threads at most - WinRAR, games, web browser - and even encoding works quite speedily despite only using 2 threads. I don't know if the 45nm i5 can hit 4 GHz easily, but you should be able to get at the very least a 500 MHz boost with a nice cooler. Besides, the stock Intel fan sounds like a jet engine, lol, so using a better cooler will help your PC be quieter too. For stable overclocks, be sure to disable Spread Spectrum in the BIOS.

The GPU has a higher heat threshold than the CPU; the HD4850 can reach up to 120C, lol. The 5 series runs cooler than the 4 series, and the 6 series is more efficient so it runs cooler than the 5 series, but it performs a bit below the 5 series at the equivalent model number, so a 5850 will give more fps than a 6850. Even so, I'd still recommend the 6 series because of its increase in computing efficiency; it can do a lot of the newer features better. Look for benchmarks of Metro 2033 at high resolutions like 1080p and above and you might see the 5850 beating the 6850 - that's not incorrect. Due to the refinements of the 6 series, you can afford to go for a higher model number to make up for the reduced raw power.



The i5 750 goes up to 4 GHz easily on aftermarket air cooling. Even my E7200 C2D 45 nm CPU goes up to 4 GHz on air with enough Vcore. More and more games are being optimized for quads, so in my humble opinion the time of dual cores for gaming will be over soon. But I admit even a C2D is enough for almost anything. However, the i5 700 series CPUs are a jewel, and if I had the money, one would be my choice for a CPU. I'll wait, though, for the 32 nm quads.

The 6800 series is not the continuation of the 5800 series; the idiots at AMD thought the performance increase vs. the 5700 series (which it kinda replaces) was so big it warranted a rebadging move a la Nvidia. The 6900 series is the follow-up to the 5800 series. Pretty stupid.

DKT's CPU is a monster, especially as it accommodates user needs with variable Turbo Boost. His problem is with the GPU... it being only a 5450. I think he pulls a hair out of his head every time he hears or sees that number.



The games I play don't use quads, so there's no point buying a quad in my case; it would be like buying for the future, lol. The price is too high and I won't need it. I also didn't want to spend a lot because Sandy Bridge and Bulldozer are around the corner, which means two things: I could either buy a new CPU, or the prices on the i7 will drop and I could buy one of those, lol. I could have spent the money back then on an i5, but I wanted to just buy what I needed at the time and save the extra for Bulldozer and Sandy Bridge. According to reviews, Sandy Bridge is only about 12% more efficient than the current i-series, and they embed the clock generator on the chip, so overclocking will be difficult if not impossible. I'm waiting to see what Bulldozer can do, because it looks good on paper, but I want to see real-life performance.

Also, don't worry about the AMD HD6000 series; their prices are super cheap compared to the previous generations, so you can actually afford a lot more power for the same money. An AMD GPU is still the better card for the money, because when AMD released the HD6000 series, Nvidia jumped the gun as usual and released their duct-taped GeForce 500 series, which is basically a furnace due to massive voltage leakage and poor chip architecture. Nvidia is all about trickery: they bribe and push game developers into making "The Way It's Meant to Be Played" titles, and they invoke code paths to make sure they sabotage AMD video cards. I remember when Nvidia EPIC FAILED with Assassin's Creed and the game ran faster on ATI; Nvidia pushed the publisher to release a patch to undo that, lol. They also tout this green theme as if to convince you that green is saving the planet or saving power or some nonsense, when in reality using Nvidia is destroying the planet and wasting more energy than ever.



  • Administrator

I don't have any trouble with CPU cooling. Well, it sits at 67C in the BIOS, where SpeedStep and Turbo are not active and the clock is at the default 2.66 GHz. The problem here is that it's only been about 9 months and my heatsink paste has dried out completely. I remember you mentioning buying good paste, but I've always wondered - I read somewhere that thermal "tape" is better. I'm not in the mood to invest a lot in a cooler; I already fried a 2 GB DDR3 RAM stick 4 days ago when I was cleaning the computer and made a mistake. Well, I've cleaned the RAM a thousand times before; dunno how two of its golden contacts turned black. :ph34r:

The recommended overclock for the i5 750 is 3.8 GHz max. The multiplier goes up to 25x (on the 133 MHz base clock) when two cores are active; I guess 150 MHz is the max base clock they recommend. Seriously, this two-core/four-core speed thing is confusing if you're looking for real values from software. CPU-Z shows it running at 3.2 GHz when I turn it to full power, but when I open the Intel utility it says 2.8 GHz, while CPU-Z keeps showing 3.2.
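For what it's worth, the numbers seem to line up if I just multiply the base clock by different multipliers; which multiplier applies in which state is my guess from the figures in this thread, not an official spec:

```python
# All the frequencies mentioned in this thread fall out of BCLK x multiplier.
# 133.33 MHz is the i5-750 base clock; which multiplier the CPU actually
# uses in each turbo state depends on the board/BIOS, so treat the labels
# below as my reading of the numbers quoted here, not as official specs.

BCLK = 133.33  # MHz

multipliers = {
    9:  "SpeedStep idle (~1.2 GHz)",
    20: "stock clock (~2.66 GHz)",
    21: "what the Intel utility reports (~2.8 GHz)",
    24: "what CPU-Z reports at full power (~3.2 GHz)",
    25: "the 3.3 GHz seen with only two cores loaded",
}

for mult, label in multipliers.items():
    print(f"{mult:2}x -> {BCLK * mult / 1000:.2f} GHz  ({label})")
```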

Well, I was playing NFS Undercover and allowed it to use only 2 cores, and it was going up to 3.3 GHz on both of them, but as soon as I allowed it to use 4 cores, all of my core speeds settled down to 1.2 GHz. That told me it was able to use more than two cores.

I'm still OK with the 5450 though; it was within my budget and I don't want to look at anything above it. In 2 years or so, I'll look to buy a better one. ^_^

Man, I'm tired of seeing nVidia "enhancements" in the games I've played. Also that NVIDIA PhysX icon in my Control Panel.



I had thermal pads on my motherboard's northbridge. I replaced them with Arctic Cooling MX-3 and temps went down 3 Celsius. I don't know where you can buy pads...

I'm not a fanboy of any company. I buy what I need. My needs right now say that AMD is only good for gaming, since no apps make use of OpenCL. What a shame.

But CUDA... is another story.

Can you believe all the encoder/decoder stuff based on it? And Premiere CS5 works with CUDA, too! And it works well - if you have a 460 or better it's like another world. Adobe totally left AMD out of the equation. This is BAD news for me as a designer.

So from my point of view, I'd need a GPU that I can use for things other than gaming. Sorry Radeon, but you got left behind on this - for now. By the way, I don't care about a few frames. I only care that I can play my games at 30 fps+ with max detail, 1080p if possible, but even 720p is OK if nothing else works. However, I've found that some settings tweaking, like turning off dynamic shadows or similar stuff, will always boost my 4850 to playable fps.

67 Celsius is kinda high for default speed. There must be something that causes the CPU to go that high. If it's the CPU temp (not the core temp), then it's too HOT!



  • Administrator

I think Firefox 4 makes use of OpenCL. :) That was one of the biggest reasons I downloaded the OpenCL version of the ATI drivers.

I haven't seen many AMD ads - actually, I've seen none - but I guess that's one reason it beats the others on price. The ads don't reach here, but they're part of marketing. :P

The first thing I do to speed up a game is reduce the shadows; I mean, you can live with slightly lesser shadows, but you do love good detail in other places.

Well, 67C at 2.66 GHz is probably due to the default Intel fan plus the included paste that has dried out. The mobo temp should be in the mid-30s. HDD temp is around 38C right now, and goes to about 42C in summer. But then again, Mumbai is quite hot; the minimum room temp here has been 24C all winter.



@DKT27:

I have to agree with toyo that 67° C is too high for default CPU speed.

I mean, I OC my C2D by ~700 MHz, getting ~3.3 GHz, while still having room for more.

What's best is that my temps range from a minimum of 35° to a maximum of 45° C with just the stock HSF.

Those ranges drop by another ~3° when I rid my rig of dust bunnies and reapply thermal paste ^_^



My country is also blazing hot, at 35-42 Celsius, but only in the summer. I lowered my temps dramatically by mounting a 120 mm fan over the video card; it blows over the NB as well, which went down from 42 Celsius to 30.

It must be some lack of voltage control when your CPU is sitting in the BIOS with no OS loaded, or the stock fan isn't spinning fast enough.

Could the Firefox thing be this one "Firefox gets GPU accelerated OpenGL Push with WebGL"?

If not, can you post a link about it? I can't seem to find anything about OpenCL and Firefox.

Thanks.



Archived

This topic is now archived and is closed to further replies.
