AMD Radeon vs Nvidia
Posted 30 December 2010 - 02:07 AM
Is there an advantage in using an AMD Radeon video card (HD5570) on a motherboard w/ AMD Radeon on-board (HD3200) vs. using an Nvidia (GT240) on that same board?
I'm planning on buying a new video card and I'm only choosing between the two - HD5570 512 DDR3 vs. GT240 512 DDR5. The GT240 is about $3 more than the HD5570. But from what I've read in reviews (Tom's Hardware and AnandTech), the GT240 slightly beats the HD5570 in performance but uses slightly more power and runs hotter.
I've also read about the advantages of the HD5570 - Eyefinity and DirectX 11 - but I won't be using multiple monitors with it, as I only do light gaming and most of the games I play are only DirectX 10.
I'm already leaning towards buying the GT240 but I thought I should ask for your opinion whether there's an advantage using a Radeon card on a board with built-in Radeon vs. an Nvidia card on a board with built-in Radeon.
Posted 30 December 2010 - 04:50 AM
Also, you have to make sure that your MB can support Nvidia if you choose to go with one. If it does, I think you need Windows 7 if you want to have both Radeon and Nvidia functional inside your system.
If I were you I'd get a 5750 and skip the possible Nvidia complications. Still, you have to make sure your system meets the requirements for any discrete card you buy, especially the power supply, RAM, and CPU. You will find these on the manufacturer's product page.
Posted 30 December 2010 - 05:45 AM
AMD video cards are more efficient in every way than nvidia's. AMD manages to deliver the same computing power with fewer transistors and lower power usage. Of course, if you want your computer to double as a furnace or space heater then you can get nvidia, but Fermi is the most power-hungry and wasteful video chipset around.
Don't be fooled by nvidia model numbers either; they haven't really invented a new GPU between the 8800GT and Fermi. Everything in between was either a rebadged or reworked iteration of the 8800GT. They do that because of customer perception: people want to buy the newer-looking model number.
I would recommend the Radeon HD 6850, which will play all the games you want. The only thing single-GPU cards have problems with is high anti-aliasing, so you probably won't be able to do 16x AA with everything on Ultra High in every game.
For regular gaming the 6850 will work great; 2x AA is enough for most people, and sometimes you don't even need it.
Almost forgot to mention: the power savings and efficiency extend to motherboard chipsets too. So if you are planning to buy a new motherboard, the one with the AMD chipset will use less power, around 10 to 15 watts, as opposed to the nvidia chipset, which uses around 35 to 50 watts. People usually think the nvidia boards are higher quality because they see lots of copper heatpipes on them, but those are there because they're mandatory; otherwise the heat would burn everything. Motherboards with AMD chipsets come with smaller heatsinks because they dissipate less power.
The one downside is that AMD hasn't been able to beat Intel in CPU computing efficiency for the longest while; both systems can use the same low power at idle, but load power is much higher with an AMD CPU. Anyway, don't worry too much about that, since most computers sit at idle speed the majority of the time anyway, and AMD CPUs are so affordable you can get a quad-core for under $100 USD.
Posted 30 December 2010 - 07:18 AM
I got it, it's great
(I believe at the time I could get either the HD5550 or the HD4850... Same price.)
Posted 30 December 2010 - 07:39 AM
Much appreciated, and thanks to all for your replies.
Posted 30 December 2010 - 11:47 AM
@LeetPirate: I'll definitely keep that information in mind.
@toyo & shought: The HD4850 is not available here in our country anymore, and I don't buy online. Also, if I did find an HD4850, I was worried that my stock power supply wouldn't be able to support it.
Posted 30 December 2010 - 01:49 PM
Even I'll recommend AMD over nVidia. When I invested in a DX11 card, I tried to look into the future and figure out whether I'd need it or not. Honestly, I made a mistake, but on the other hand, there are many apps and other things that need the specifications that mostly DX11 cards have.
Posted 30 December 2010 - 06:09 PM
If I had a DX11 card, it would only be of use to me in games, and, if it were an Nvidia (I won't ever buy another card without CUDA), in Premiere's Mercury engine (if only I had bought an Nvidia), compared to my HD4850. I think the time will come soon when I'll need a faster card myself, although the 4850 usually runs my games at 1080p. And when it dips into the low-20 fps zone, it's because the game is poorly coded.
What card did you get DKT, by the way? And why was it a mistake?
Posted 30 December 2010 - 06:59 PM
I was so annoyed with those games telling me I don't have this shader model, I don't have that version of DirectX, my card doesn't fully (100%) support OpenGL, and then that I can't download the new drivers because they're too new for my card. It was like hell.
Considering that my monitor is just 17" and that I'm a 1152x864 desktop and 1024x768 gaming guy, I thought the 5450 was all I needed to get away from those new-game compatibility problems. Of course, the card's performance is so bad that even the old ATI 9000 comes close to it. On the other hand, I can still run games at 60 FPS, the max my monitor allows while games are running, except NFS Hot Pursuit, whose FPS was around 25-30. But that's not really the point here. I have not played a single game that requires DX10 or DX11.
What made me sad about my card is that one of the members here benchmarked an AMD 68xx with 3DMark and scored 4000 points at around 25 FPS or something. When I benchmarked mine, it scored only 390, at 1 to 2 FPS.
EDIT: Forgot to mention, I don't think ATI HD 4xxx supports OpenCL.
Posted 30 December 2010 - 07:29 PM
I learned many years ago when the GPU thing was just starting to heat up (Voodoo period, 3DFX dominance) that you have to have PATIENCE. Meaning you never, ever buy from the low-end. You need to stay with what you have and just save penny by penny until you have enough to buy something from the mid-end. At least if you need some performance and want to play recent titles. Eye candy will need some of that performance, even at 1024x768.
The 6800 is in another league entirely... I think the 6870 is roughly on par with a GTX 285, which was damn fast. So don't be sad at such comparisons... they just don't mean anything.
I'm 100% sure you'll never buy another HTPC video card, but maybe my words will help someone else.
OpenCL is supported on the 4800 series if you install the CL driver; I have it on my card right now. It may have some limitations compared to the 5000/6000 series, just like DirectCompute, although I don't use it for anything. But why did you edit your post about it?
Ooo, shiny... GPU-Z has automated screenshot-taking capabilities...
Edited by toyo, 30 December 2010 - 07:34 PM.
Posted 30 December 2010 - 07:59 PM
You are right, I don't think I'll ever buy another card like this one. I am a hardcore gamer, but I don't prefer FPS games, and I'm currently in the mood to play all the 1990s graphic adventure games, you know, the LucasArts ones, at least once in my life, because you never know: with the new PCs, I won't be able to get a taste of that richness again. Of course, I do play all the new ones, but not FPS.
About OpenCL, well, AMD and Wikipedia don't seem to mention anything about it. You also installed the ATI OpenCL version to get it, right?
Talking of GPU-Z, here's mine. Do you think it would be safe for me to overclock the GPU clock from 650MHz to 700MHz and the memory from 800MHz to 850MHz via ATI Catalyst Control Center's Overdrive?
Posted 30 December 2010 - 08:26 PM
From what I know of the low-end, only the GPU is actively cooled. That leaves the memory and the voltage regulators with just passive heatspreaders and the occasional breeze from a nearby fan, meaning you should be cautious when overclocking the memory. The memory chips are rated by their manufacturer (maybe Hynix or Samsung? You can ask XFX what chips you have, or look for a review that says), so you could push them to the rated value, which is usually higher than the speed they actually run at.
Let the Auto-Tune run and then back the GPU off by 10(-ish) MHz. The memory... hmmm... I wouldn't push the memory much since it has no active cooling.
So in conclusion, depending on your cooling, you could go for the 50 MHz on the GPU, which can be stable if you keep the temps around 90 Celsius, maybe even a little higher.
Maybe it will be stable, maybe not. If not, you could see games crash, or white "snow" dots on the screen as the memory reaches its limits.
I actually wrote this before I had the idea to search. So I'm not going to throw it away!
This guy pushed it to 700/900, which is damn crazy on the RAM, since I see it doesn't even have heatspreaders; the chips just sit there.
My personal opinion is that it isn't really worth it... very small performance increase. I tried OCing this 4850 of mine some time ago and got bored with the few frames I gained. I mean, in games where I needed it (under 30 fps) I never got more than 2-3 fps. Who cares if it stutters at 27 fps or at 24? And in the games where performance was 50 fps+, it didn't matter.
I hope this all makes some sense. Please ask any questions if you have...
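If it helps, the whole procedure I described boils down to a simple loop. This is just an illustrative Python sketch, nothing official: is_stable and gpu_temp are placeholders standing in for a real stress test (e.g. FurMark) and a temperature readout (e.g. GPU-Z), and all the numbers are made up for the example.

```python
def find_stable_clock(stock_mhz, step=10, temp_limit=90,
                      is_stable=None, gpu_temp=None):
    """Raise the clock in small steps; stop at the highest setting that is
    both stable and under the temperature limit."""
    clock = stock_mhz
    while is_stable(clock + step) and gpu_temp(clock + step) <= temp_limit:
        clock += step
    return clock

# Example with fake test functions: pretend the card is stable up to
# 700 MHz and warms up slowly from 60C as the clock rises.
best = find_stable_clock(
    650,
    is_stable=lambda mhz: mhz <= 700,
    gpu_temp=lambda mhz: 60 + (mhz - 650) // 2,
)
print(best)  # 700
```

In real life each "step" means a stress run of several minutes, and you back off at the first crash or artifact instead of trusting a formula.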
Posted 30 December 2010 - 08:46 PM
Here is a Hynix decoder for what the codes mean.
The speed of the chips is encoded in the "12C" part of the code: "12" means rated for 800 MHz, and "C" means Commercial Temp & Normal Power, whatever that means to Hynix.
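To make the decoding concrete, here's a toy Python sketch. Only the "12" → 800 MHz and "C" → Commercial Temp & Normal Power mappings come from the decoder doc; a real Hynix part-number table has many more codes, so treat these two tiny lookup tables as illustrative, not complete.

```python
# Minimal lookup tables for the suffix codes mentioned above (assumptions
# beyond "12C"; consult the actual Hynix decoder for the full list).
SPEED_CODES = {"12": 800}  # rated memory clock in MHz
GRADE_CODES = {"C": "Commercial Temp & Normal Power"}

def decode_suffix(suffix):
    """Split a suffix like '12C' into its rated speed and grade."""
    speed, grade = suffix[:2], suffix[2:]
    return {
        "rated_mhz": SPEED_CODES.get(speed),
        "grade": GRADE_CODES.get(grade, "unknown"),
    }

print(decode_suffix("12C"))
# {'rated_mhz': 800, 'grade': 'Commercial Temp & Normal Power'}
```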
Still, XFX could have had many reasons to swap out these default Hynix chips, so you could end up with different stuff on your card, but we can find out their speed rating if you post the numbers or take a pic.
Also, I assume that XFX didn't bother to mess with the voltage specs or other stuff (these are also encoded in those numbers/letters).
It's no surprise to find out that XFX used very cheap memory on the card, rated at exactly the official specs. And with no cooling on it...
Edited by toyo, 30 December 2010 - 08:48 PM.
Posted 30 December 2010 - 08:54 PM
The original card does have a red (and beautiful) heatsink, but the XFX one doesn't; it could be out of stock, per your link. Mine is exactly the same as shown in the pic, except I have a rear fan in my computer case. I know, everyone has one, but the old computer I used for 4-5 years didn't.
I was not exactly aware of the bad effects of overclocking, though, like the snow dots you mentioned.
Yes, I did consider overclocking it after reading about the guy going to 700/900. But considering that I'm not a real overclocker (so far) and the really high temperatures in India, I should take some real care.
Sometimes even 3 frames matter to me; yes, with a card like this, they do. But something is happening that surprises me, and I'm unsure why. You see, one day I was benchmarking a browser in Peacekeeper, so I changed my Control Panel power plan from Power Saver to High Performance, and after benchmarking I forgot to change it back. In both plans I keep my graphics at full power, considering how few watts it draws. Then I played a game and was surprised that a game I used to play at about 25 FPS was now topping 75 FPS. Now, the i5 has SpeedStep: on Power Saver it sits at 1200MHz, with a base of 2660MHz, and turbos up to 2800MHz (4 cores active) or 3200MHz (2 cores active). In High Performance mode the CPU runs at full speed, that is, 2800 to 3200MHz. Do you think my game is not able to trigger SpeedStep and raise my CPU's speed? I normally keep mine on Power Saver, but it should clock up just fine.
EDIT: Didn't see your new post. Can't see it today. Will report tomorrow about the on board chip.
Posted 30 December 2010 - 09:21 PM
Games do not drive SpeedStep directly; apps just demand computational power. So if your game does not need more than 1200 MHz, the CPU won't step up to the higher multiplier. This is not the only power-saving mechanism in play, though. There are others: some work at the CPU level, others at the CPU-chipset level, and some need a compliant OS. So it's a very complicated ballet between the components, all for a few watts.
If you play a pretty demanding game, say Mass Effect 2 (because that's the one I have installed right now...), then it will most probably keep your CPU on the Turbo/max multiplier, except in menus, when the CPU will throttle down to the low multiplier.
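The idea is easier to see as a toy model. This Python sketch is purely illustrative; the clock values are the i5 numbers from DKT's post, and the governor logic is a huge simplification of what SpeedStep actually does.

```python
# The three clock states mentioned in the thread: idle, base, and turbo.
IDLE_MHZ, BASE_MHZ, TURBO_MHZ = 1200, 2660, 3200

def governor(demand_mhz):
    """Pick the lowest available clock that still covers the demand,
    like a demand-driven frequency governor."""
    for clock in (IDLE_MHZ, BASE_MHZ, TURBO_MHZ):
        if demand_mhz <= clock:
            return clock
    return TURBO_MHZ  # demand exceeds turbo: just run flat out

print(governor(300))   # a light game loop stays at 1200
print(governor(2500))  # a heavier scene steps to the base clock, 2660
print(governor(4000))  # fully loaded: turbo, 3200
```

So a game that never demands more than the idle clock simply never triggers the step up, which is consistent with what DKT saw on Power Saver.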
If you are fearless... then you can overclock your card from its BIOS. The tool you need is RBE. There are dangers. I used it on my 4850 back in the day. You see, all reference-design 4850 cards came with some amazing idle temps; mine was 70-80 Celsius at idle... and 85 Celsius under load :|
Urgent action was needed, so I made myself a BIOS with modified PowerPlay values, and my idle went down to 45 Celsius after some magic. RBE was the tool. The same PowerPlay manipulation works for a clean overclock, and from RBE you can modify voltages too. But you must first know what a safe OC is for your card, and be prepared to convince the guys at the store that the card died by itself and you have no clue what a BIOS is.
If you brick it, it's still possible to restore it (flash it again with a correct BIOS)... but it's also possible to brick the card completely. I flashed mine more than 10 times and all was cool. Now I use an official BIOS from ASUS, as they corrected the temp issue. It only took them half a year!
Edited by toyo, 30 December 2010 - 09:33 PM.
Posted 30 December 2010 - 10:53 PM
I downloaded RBE a couple of days ago, but I feel that ATI Overdrive is the safer option.
I don't think I'll risk flashing my BIOS.
I just did a FurMark stress test on both Power Saver and High Performance, running for about 5 minutes each. My idle temp is 48C. Well, it seems the average frame rate stays the same both times, and the max it reaches is 70C at default speeds. Can I risk going to 80C by overclocking 10MHz at a time? I mean, what's the safe temp limit?
BTW, thanks for your help. It's been really helpful.
Posted 30 December 2010 - 11:15 PM
The GPU has a higher heat threshold than the CPU; an HD4850 can reach up to 120C, lol. The 5 series runs cooler than the 4 series, and the 6 series is more efficient so it runs cooler than the 5 series, but at an equivalent model number it performs a bit less, so a 5850 will give more fps than a 6850. Even so, I'd still recommend the 6 series because of its increase in computing efficiency; it handles a lot of the newer features better. Look for benchmarks of Metro 2033 at high resolutions like 1080p and above and you might see the 5850 beating the 6850; that's not incorrect. Thanks to the refinements of the 6 series, you can afford a higher-model 6 series card to make up for the reduced raw power.
Posted 31 December 2010 - 06:06 AM
The 6800 series is not the continuation of the 5800 series; the idiots at AMD thought the performance increase over the 5700 series (which it kind of replaces) was so big it warranted a rebadging move a la Nvidia. The 6900 series is the follow-up to the 5800 series. Pretty stupid.
DKT's CPU is a monster, especially as it adapts to user needs with variable TurboBoost. His problem is the GPU... it being only a 5450. I think he pulls a hair out of his head every time he hears or sees that number.
Posted 31 December 2010 - 11:14 AM
Also, don't worry about the AMD HD6000 series; their prices are super cheap compared to the previous generations, so you can actually afford a lot more power for the same money. An AMD GPU is still the better card for the money, because when AMD released the HD6000 series, nvidia jumped the gun as usual and released their duct-taped GeForce 500 series, which is actually a furnace due to massive voltage leakage and poor chip architecture. nvidia is all about trickery: they bribe and pressure game makers into "The Way It's Meant To Be Played" deals, and they invoke code to make sure they sabotage AMD video cards. I remember when nvidia EPIC FAILED with Assassin's Creed and the game ran faster on ATI; nvidia forced the publisher to release a patch to undo that, lol. They also push this green theme as if to convince you that green means saving the planet or saving power or some nonsense, when in reality using nvidia is destroying the planet and wasting more energy than ever.