Why does a video card spec say (for example) that it needs a 650 watt PS when the card itself doesn't pull 1/10 of that under load? I don't understand.
And can any 300 watt desktop PC have its PS replaced with an 800-900 watt PS without damaging the factory motherboard?
My system is a three-month-old Lenovo IdeaCentre Model 7727 (tower) with a 300W PS.
It is factory equipped with:
1. Intel i7/cooler
2. 12GB of RAM
3. 1.5TB HD (7200 RPM) and a DVD/RW drive
4. USB 3.0 card (two ports), in addition to the usual eight USB 2.0 ports
5. Wi-Fi card
6. Case fan
7. Multi Port Card Reader
I have added and/or installed the following:
1. Toshiba 250GB USB portable HD
2. Ventura 32GB USB flash drive (can be used as an HD)
3. The MSI AMD HD 6670 with 1GB of GDDR5 RAM and dual fans, as noted above (fan speed set to auto)
The total system power draw for this setup is 130-140W under load, as measured with a watt meter. As you can see, the power draw does fluctuate some, which is normal. Since I have a 300W PS (280W max continuous load), that leaves me a 140-150W cushion; the draw is approximately 50% of the rated continuous capacity, which is quite adequate. I should also mention that I have run TRS12 and some intensive routes for hours at a time without any fans speeding up to indicate a temperature rise.
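If you want to check the headroom on your own setup, the arithmetic is simple enough to put in a few lines of Python. This is just a sketch of the calculation above; the wattages are my figures, and the variable names are my own, not from any vendor tool.

    # PSU headroom check using the numbers from my setup above.
    psu_continuous_w = 280        # 300W PS, rated for 280W max continuous load
    measured_draw_w = (130, 140)  # watt-meter readings under load

    for draw in measured_draw_w:
        cushion = psu_continuous_w - draw
        load_pct = 100 * draw / psu_continuous_w
        print(f"draw {draw}W -> cushion {cushion}W ({load_pct:.0f}% of continuous rating)")

That prints "draw 130W -> cushion 150W (46% of continuous rating)" and "draw 140W -> cushion 140W (50% of continuous rating)", which matches the 140-150W cushion I quoted.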
Hope this helps to clarify the relationship between total system power requirements and graphics card needs. I believe the HD 6670 spec calls for a minimum 400W PS, but as you can see, 300W will do quite nicely if the rest of your system is not overly power hungry.
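As for the original question of why the box quotes such a big PS number: the card maker has to budget for a worst-case whole system (hot CPU, lots of drives, a cheap PS), not your actual build. Here is a rough sketch of how that kind of budget adds up; every component figure and the safety margin are my own assumptions for illustration, not AMD's actual numbers.

    # Illustrative worst-case system budget behind a "minimum 400W PS" spec.
    # All figures below are rough assumptions, not vendor data.
    worst_case_w = {
        "high-end CPU": 130,
        "motherboard + RAM": 50,
        "drives, fans, USB devices": 50,
        "HD 6670 card": 66,  # roughly the card's rated board power
    }
    total = sum(worst_case_w.values())
    print(f"worst-case system estimate: {total}W")           # ~296W
    print(f"plus ~35% safety margin: {total * 1.35:.0f}W")   # rounds up to ~400W

A modest system like mine sits well under that worst case, which is why a 300W PS handles it with room to spare.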