Defragging, in its simplest form, takes data that has been spread all over a hard drive and moves it back together so that drive access time is faster. When data is written to a drive, it is written sequentially, file after file, no matter how large or small the file is. On a new or empty hard drive, files are accessed quickly because there is no extra work for your hard drive. Remember, hard drives (not thumb drives or SSDs) are mechanical devices, and being mechanical, they are a lot slower than the RAM the data is being moved to. On a newly formatted drive this mechanical motion is minimal, so the drive access time is pretty fast. As time goes on, the drive access time will increase, not because the drive is getting older, but because the data is now spread all over the drive. When files are deleted, a hole is left where they once existed. The operating system marks those sectors free so they can be reused. Since files are written sequentially on the disk, the operating system will write to the first open spot whether or not it is large enough to fit the complete file. If there is not enough space for the complete file, the operating system will put what it can in the small space and the rest in other spots on the drive. This extra mechanical motion by the read/write heads is what slows down the performance of the hard drive as time goes on.
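To picture how "fill the first open spot" ends up splitting a file, here is a toy model in Python. It is only a sketch of the idea (a list of sectors with a made-up first-fit writer), not how any real filesystem actually works:

```python
# Toy model of first-fit allocation on a disk. Deleting a file leaves
# a hole, and the next file gets split across that hole and the space
# beyond it. Purely illustrative; not a real filesystem.

def first_fit_write(disk, name, size):
    """Write `size` sectors of `name` into the first free sectors found,
    splitting the file across gaps when no single gap is big enough."""
    remaining = size
    for i, sector in enumerate(disk):
        if remaining == 0:
            break
        if sector is None:          # free sector
            disk[i] = name
            remaining -= 1
    return remaining == 0           # True if the whole file fit

def delete(disk, name):
    """Mark every sector belonging to `name` as free."""
    for i, sector in enumerate(disk):
        if sector == name:
            disk[i] = None

disk = [None] * 12                  # a tiny 12-sector "drive"
first_fit_write(disk, "A", 4)       # A A A A . . . . . . . .
first_fit_write(disk, "B", 4)       # A A A A B B B B . . . .
delete(disk, "A")                   # . . . . B B B B . . . .
first_fit_write(disk, "C", 6)       # C fills A's old hole, then
                                    # spills past B: C is fragmented
print(disk)
# ['C', 'C', 'C', 'C', 'B', 'B', 'B', 'B', 'C', 'C', None, None]
```

Reading file C now requires the heads to seek to two separate regions of the platter, which is exactly the extra motion described above.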
Disk data fragmentation has always been an issue with hard drives. Over the years, many companies have created applications and utilities to defragment the data. What defragmenting does is take the data that is spread all over the place and put the sectors back in sequential order. This restores the access time for the drive, making the computer respond faster when reading (accessing) and writing (saving) to the hard drive. Over the years, the simple defragmenting utilities have gotten more sophisticated, as companies have found other issues that can degrade hard drive performance. Among these are where the data sits on the drive, and the type of data, such as system files needing access more often than stale data.
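The core move a defragmenter makes, gathering each file's scattered sectors back into one contiguous run, can be sketched in the same toy sector model (again just an illustration, not a real utility):

```python
# Toy defragmenter: collect each file's sectors and lay them back down
# contiguously, with all free space pushed to the end of the disk.
# A simplified sketch of the idea only.

def defragment(disk):
    """Return a new layout with every file's sectors contiguous."""
    order = []                       # files in order of first appearance
    counts = {}                      # how many sectors each file owns
    for sector in disk:
        if sector is None:
            continue
        if sector not in counts:
            order.append(sector)
            counts[sector] = 0
        counts[sector] += 1
    compacted = []
    for name in order:
        compacted.extend([name] * counts[name])
    compacted.extend([None] * (len(disk) - len(compacted)))
    return compacted

fragmented = ['C', 'C', 'C', 'C', 'B', 'B', 'B', 'B', 'C', 'C', None, None]
print(defragment(fragmented))
# ['C', 'C', 'C', 'C', 'C', 'C', 'B', 'B', 'B', 'B', None, None]
```

After compaction, each file can be read in one sequential pass, which is why access times recover. Real defragmenters are far more careful (they move data in place, respect unmovable system files, and may prioritize frequently used files), but the goal is the same.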
For more detail on this, check out http://www.condusiv.com/.
Now, getting back to Trainz. Yes, disk data fragmentation can cause poor performance. You are correct in assuming that the Assets.tdx and *.bku (the backup file) can get fragmented and cause slower performance as the data pointers in the database file are accessed. The individual asset files need to be defragmented as well, since these are referenced and loaded into the program as you build routes in Surveyor or drive them in Driver. By just rebuilding your Assets.tdx file, you are only fixing one small part of the program. I need to add here that rebuilding the database is not the recommended approach, particularly with TRS2009 and upwards. It can cause you to lose everything and have to reinstall the program from scratch.
As far as your A/V program goes: unless it includes a disk maintenance component, it is only scanning for malware. For disk defragmenting, you can use the built-in utility supplied with Windows at the least. Third-party alternatives exist from companies such as Condusiv (formerly Executive Software, makers of Diskeeper), Piriform (makers of Defraggler and CCleaner), and even AVG. I happen to use AVG PC TuneUp. Its defragmenter is not as thorough as Diskeeper, but it does a fair job.
John