Which graphics card is better

Thank you. These are exactly the options, copied and pasted:
Integrated
NVIDIA GeForce 6150 SE [VGA]

512MB ATI
Radeon HD 5450 [DVI, HDMI, VGA adapter]

1GB ATI Radeon
HD 5450 [DVI, HDMI, VGA adapter]
 
Yeah, I went back to your link and then modified my post, though the basics remain the same. It looks like the ATI is a separate card versus integrated. Discrete graphics are much more desirable even if the video card is low-end, and the ATI is quite decent.
 
Integrated graphics often get a bad rap. Sure, they are not top-notch like dedicated cards, but they do more than enough to get the job done, and do it reasonably well.
My Compaq Presario CQ62 has a 256MB dedicated ATI Mobility Radeon HD 4250 that will also use up to 1.5 gigs of shared memory. The allocation changes dynamically: going from, say, the desktop, which needs little to no video memory, to driving in Trainz, it will take what it needs as it needs it. I can run TRS2006 with all the sliders to the right and still have fps left over.
Of course, having enough RAM to cover all bases is needed, since the shared video memory uses it, but if you have high-speed RAM and enough of it, dedicated vs. integrated is not as much of an issue. I maxed my laptop out to 8 gigs, and it runs like a dream in Trainz.
 
Thanks for all of the responses, but I have a new question: what is the difference between a normal and a low-profile / low-profile-ready graphics card?
 
Thanks for all of the responses, but I have a new question: what is the difference between a normal and a low-profile / low-profile-ready graphics card?
Normal is, well, normal. :p Low profile means the card is skinny, with a half-height bracket, and only takes up one expansion slot; it is designed mostly to fit in slim, low-profile PCs.
 
Lol, with good reason.

They do, since when?
Since they are better than the ones that could only support a meager 34MB of video memory.
Please understand everything before you jump to conclusions about hardware. Yes, integrated is slower, but it is not as bad as people say; it is the people who do not understand how to get the best use out of it who say it sucks. Sure, it isn't going to let you play Fallout 3 on max settings all around, but it gets you middle-of-the-road graphics with ease. Not to mention it can play Trainz 2006 and older on max settings (all sliders to the right) and run at 25 fps on some machines (depending on the exact integrated graphics), and anything more is really overkill.
 
I think the OP is using 2010, which actually uses the GPU more than previous versions, such as hardware compression and the Speed Tree rendering engine. There may well be other functions performed by the GPU as well that we don't know about, in which case how 2006 and older perform on an integrated card is no longer a factor. What may be fine for 2006 does not necessarily mean it's fine for 2010.
How 2010 performs on integrated cards however is relevant.
Granted, there are now on-board graphics (at the expense of system RAM) that support 1GB of shared memory. But is anyone actually using an integrated card with 2010 able to give us an insight into what they are using and how well it actually performs, before the OP goes and wastes his cash on something that may not be up to the job? It might well be, but I think we need some actual hard facts before assuming anything.
 
People keep saying that 2010 is less demanding on resources than earlier (2006 & 2004) versions, so given that my GeForce 9800 runs 2004 without problems, he should be OK with an 8- or 9-series GeForce card, which can be had for a snip these days. ;)
 
Please understand everything before you jump to conclusions about hardware.


Lol, I understand all too well, I’ve been building machines and maintaining them as a side business for almost ten years now.

Integrated graphics are fine for running office applications but I would never recommend them to anyone who is planning on running games.


Not to mention it can play Trainz 2006 and older on max settings (all sliders to the right) and run at 25 fps on some machines (depending on the exact integrated graphics), and anything more is really overkill.
That doesn’t tell us much: at what resolution? At what AA/AF level?


25 fps to someone who has experienced better is nothing but a slide show.
 
25 fps to someone who has experienced better is nothing but a slide show.
Ahh, but if you take into consideration all of the studies saying that the human eye sees 30fps as fluid, anything more than that just clarifies the image even more. Yes, 25 fps is indeed slower than 30, obviously, but it is not a killer and is still seen as fluid by some. Every human is different from the next, so someone like you might say 36 fps is fluid, while I may say 29 fps is fluid. (Fluid = fluid motion, like moving your arm: no lag or slide-show effect.)
 
Ahh, but if you take into consideration all of the studies saying that the human eye sees 30fps as fluid, anything more than that just clarifies the image even more. Yes, 25 fps is indeed slower than 30, obviously, but it is not a killer and is still seen as fluid by some. Every human is different from the next, so someone like you might say 36 fps is fluid, while I may say 29 fps is fluid. (Fluid = fluid motion, like moving your arm: no lag or slide-show effect.)

Many “studies” have also said that the human eye can perceive frame-rate changes as high as 70+ fps.


“Fluid” to me, as well as many others, is when the frame rate consistently matches the refresh rate of the monitor, which with most LCDs is 60Hz.
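The frame-rate numbers in this exchange are easy to sanity-check with a little arithmetic: at 25 fps each frame sits on screen for 40 ms, while a 60 Hz LCD refreshes every ~16.7 ms, so a 25 fps game redraws less than once per two refreshes. A minimal sketch (the helper function name is just illustrative):

```python
# Compare per-frame time at various frame rates with the refresh
# interval of a typical 60 Hz LCD.

def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame is displayed at a given frame rate."""
    return 1000.0 / fps

refresh_ms = frame_time_ms(60)  # ~16.7 ms per refresh on a 60 Hz panel

for fps in (25, 30, 60):
    ms = frame_time_ms(fps)
    print(f"{fps:>3} fps -> {ms:5.1f} ms/frame "
          f"({ms / refresh_ms:.1f}x the 60 Hz refresh interval)")
```

This is why 25 fps can feel like a slide show next to a rate that matches the monitor's refresh: each frame lingers for more than two refresh cycles.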
 