The GPU is probably just getting old. As time goes on, components become flaky; some die suddenly, and some become temperature sensitive. By that I mean they work fine while cool, but as soon as they warm up, they fail. This is due to the tiny bonds inside cracking and flexing with the temperature changes. In my ancient work life, I was an electronics technician who troubleshot right down to the chip level, and more importantly, I did it on video circuitry in video terminals and graphics boards. I learned a ton about how they work, and then some, by fixing them. Even though that was over 25 years ago, the theory behind current video cards is still the same; they just work a lot faster.
There is nothing wrong with what you're doing. In fact, 3D modelers have been doing this for years. Way back in the mid-to-late 1990s, I worked for a company that made training programs. We had a PC running 3ds4 for DOS, and we would let it render overnight since the animation took 2-3 minutes per frame. Watching something like that run is like watching paint dry, so we'd test it briefly, head out the door, and collect the .tiff images the next day for processing on the //Fast AG system. At one point, we set up a render farm using all the machines in the office. The boss had a hissy fit over it, but in the end he liked the idea because the frames got processed a lot faster.
Don't worry about the GDDR5 stuff. It's a 'selling' feature AMD uses to show they have something different from NVidia. What you are more interested in is GPU cores and memory pathways. NVidia also supports OpenCL and OpenGL quite well and always has. Programs such as 3ds Max use the CUDA cores for extra processing and keep all the pipelines busy. I say this from experience, since I always used ATI cards until I ran into major compatibility issues with OpenGL and OpenCL. The other thing is that graphics RAM today is not like it was decades ago. The old dual-ported VRAM was very, very fast, and I think it worked faster than the stuff we use today because of how it was made. The problem is that old VRAM was really, really expensive, more so than regular RAM ever was. This is why video cards are relatively inexpensive today: instead of using expensive VRAM, they use the graphics equivalent of regular DDR RAM, which is where GDDR5 comes from.
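To give you a feel for what "using the CUDA cores" actually means, here's a toy sketch of my own (nothing from 3ds Max or any real renderer, and the names are made up): the card runs the same little function across thousands of data elements at once, one thread per element, which is exactly the shape of the work rendering produces.

#include <cstdio>
#include <cuda_runtime.h>

// Toy kernel: each GPU thread brightens one pixel value.
// A renderer does the same trick on a vastly bigger scale.
__global__ void brighten(float *pixels, float gain, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)              // guard the last, partially filled block
        pixels[i] *= gain;
}

int main()
{
    const int n = 1 << 20;                      // about a million values
    float *d_pixels;
    cudaMalloc(&d_pixels, n * sizeof(float));
    cudaMemset(d_pixels, 0, n * sizeof(float)); // stand-in for real image data

    // Launch enough 256-thread blocks to cover every element.
    brighten<<<(n + 255) / 256, 256>>>(d_pixels, 1.2f, n);
    cudaDeviceSynchronize();

    cudaFree(d_pixels);
    return 0;
}

The point is that every one of those threads gets scheduled onto the GPU cores, so more cores and wider pathways to feed them means more of them run at the same time. That matters far more for 3D work than whose name is on the memory.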
The thing is, if you were doing strictly 3D modeling and nothing else, I'd suggest a CAD-type graphics card. Those cards are way beyond your budget, and well beyond mine too. But because you also play games, you need the best of both worlds: a decent gaming card that can also handle graphics work. I just replaced my GTX 680 with a GTX 780 Ti. I know it's expensive, but I don't replace hardware unless it's necessary. Sadly, my '680 had problems and will be replaced under warranty; I'll put the replacement away as a spare and use the new card going forward. Anyway, the 600 series is getting quite old now in computer-part terms, and the 700-series cards are moving down in price as the 900s come out. NVidia just released the 900s and skipped right over the 800s; don't ask me why.
If anything, here's an example of what you can get if you watch the sales at Newegg. I think you have to sign up for the newsletter/adverts like I did, though.
http://promotions.newegg.com/NEemai...D-_-Weekend&et_cid=11549&et_rid=67996#Weekend
This came in today.
John