Intel Turbo Boost - Uninstall it and get higher frame rates

Werewolf13

New member
A couple of weeks ago I had a system crash that required a complete HDD wipe and reinstall of the Win 7 64-bit OS. I had uninstalled the Intel Turbo Boost software about a year before that, but after the reinstall it came back during an auto-update.

A couple of days ago I noticed that some of the games I play were stuttering and their frame rates were noticeably lower than before. I reset all the nVidia control panel settings to what they had been, with no luck. Then I started manually killing processes that used large amounts of memory to see if any of them were the source of the problem (they weren't). It never occurred to me that software designed to make a CPU work at peak efficiency could hurt graphics frame rates, because I had researched it before deciding not to remove it: Intel says the software allows/forces the CPU to run at higher clock rates when it is not fully loaded or temperatures are below the cutoff. So I kept it.

Long story short, I went back and removed the high-memory-usage processes one at a time, checking frame rates in two different games under identical conditions each time, then rebooting to see whether the process being stopped (or uninstalled) made any difference. None did, until I killed the Turbo Boost software, and lo and behold, average frame rates for Trainz jumped from the low-to-mid teens in a large train yard to the mid-to-high 20s in the same yard. Outside of a city, frame rates jumped from the mid 30s to the low 50s, with the frame rate a steady 60 most of the time.

Note: The main offender is the signalislandui.exe process. That thing grabs an average of 38 MB on my system, while the average for users is reportedly 23 MB. The other process that goes with it is TurboBoost.exe. Both can be found on the Processes tab of Task Manager if installed. To see the TurboBoost.exe process you will need to set Task Manager to show processes from all users: on XP it's a checkbox, and on Win 7 there's a button with a visible admin icon at the bottom of Task Manager.

To test whether this helps on your system, pick a Trainz session and run it with frame rates visible (you'll need a third-party tool for this). Set an external view above and to the side of the loco; don't change it, but remember it, because the view distance affects frame rates. Monitor the frame rate. At some point switch to cab view; FPS will jump up. Monitor and note the frame rates. Then end the signalislandui.exe process and the TurboBoost.exe process and run Trainz again (no need to shut down yet). Using the same session, do exactly what you did before with the same views. I got significantly higher frame rates.
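If your frame-rate tool can log per-frame times, it helps to average them the right way: divide total frames by total elapsed time rather than averaging instantaneous FPS readings. A minimal sketch (the numbers are made up for illustration, and any third-party logging tool is assumed):

```python
def average_fps(frame_times_ms):
    """Average FPS over a run: total frames divided by total elapsed time."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Hypothetical per-frame times (milliseconds) from the same yard view,
# before and after ending the Turbo Boost Monitor processes.
before = [70, 72, 68, 71, 69]
after = [38, 40, 36, 39, 37]
print(round(average_fps(before), 1))  # roughly 14 FPS
print(round(average_fps(after), 1))   # roughly 26 FPS
```

A comparison like this only means anything if the camera view, scene and session are identical between the two runs.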

Your mileage may vary depending on your system, CPU and graphics card.

For Skyrim I had similar results: averages went from the low 20s to the mid-to-high 30s outside, and from the mid 30s to the mid-to-high 40s inside a dungeon, with FPS reaching the mid 50s about half the time.

For me at least, uninstalling Intel's Turbo Boost software improved performance not only in Trainz but in the other software I use too.
 
That's good to know, but interestingly I don't have Turbo Boost on my system. I like the way you did the troubleshooting for this; it was very methodical and carefully logged. :)

What are the specifications of your machine? These drivers may be specific to your motherboard.

I have an Intel Z77 board with an i7-3770 and never saw anything like this, unless I simply never loaded the software when I restored my system a month ago after replacing a hard drive.

EDIT: I just checked. It's enabled in my BIOS by default, but I did not install any software for it.

The way it works is just as you said. It allows the CPU to operate at higher speeds briefly, even when it should be moving into lower power states due to heat and power limits. That may be fine for short-duration tasks such as video editing/post processing or photo editing, but not for games such as Trainz or Skyrim, because of the constant load they put on the motherboard and the rest of the hardware. What is happening, I think, is that the system still drops its speed due to the increased temperature anyway, because Turbo Boost is like a shot of nitro. It works for one burst, but then that's it. It's not meant for continuous operation. Since I don't have the software loaded, I probably don't have anything to worry about.

John
 
Okay, you guys have my attention now. I have an Intel MB so how do I check for this Turbo Boost? In easy to understand steps. Thanks.
 
I've got an Asus MB with a 2nd-gen i7 processor (Q20?? something), so it's an intermediate-power i7, with 8 GB of RAM and Win7 64-bit Home Premium. The GPU is an nVidia 560 with 2 GB of RAM.

I managed to squeeze another 5 to 10 FPS out of the thing this afternoon. My monitor is an LG LCD at 1920x1080, fed over an HDMI connection. Because it is an LCD I've always set up every program with vsync on. Today I turned it off just to see what would happen. I reran a benchmark program I have, and the 3D graphics rating went from 1639 to 2234. I wondered if that was real. It was. Trainz FPS on the ECL Lincoln relief session stayed in the mid-to-high 20s in the large, busy train yard, which surprised me, but once out of the yard they shot up to the high 50s, and eventually (I think after all the textures in use got cached in that 2 GB of onboard VRAM) they went over 60 and stayed between 65 and 72. Now, that's weird, because the max frame rate an LCD monitor can display is, to the best of my knowledge, 60 FPS. But maybe, like a CRT, more FPS can still be rendered. I don't know.

That higher FPS with vsync turned off led me to the next thing I'm going to check: the coupler-breakage problem in the Mojave to Bakersfield session. After reading the rather large thread on the subject, seeing that some players have the issue (me) and some never see it, and reading all the potential causes, I think the coupler-strain calculations are likely tied to frames. It is possible that the physics calculations work on a snapshot of the environment's physics parameters for each frame before it is sent to the GPU, not on what is actually displayed, forcing an out-of-sync condition that could create calculation errors. The other possibility is that the nVidia 560 pre-renders up to 4 frames ahead; one wonders if that could create an out-of-sync condition that disturbs the physics calculations. I'll be checking those two hypotheses tomorrow.
 
Okay, you guys have my attention now. I have an Intel MB so how do I check for this Turbo Boost? In easy to understand steps. Thanks.

NOTE: one poster mentioned that on his Intel MB the tech was hardwired into the BIOS and not loaded as part of the OS. If that is the case it may be possible to disable it by changing an enabled/disabled switch in the BIOS settings, but if you don't know what you are doing or aren't comfortable messing with the BIOS then DON'T DO IT!

Otherwise, to see if you have the Intel Turbo Boost Technology Monitor software:

Open Task Manager.
Go to the Processes tab.
If you're running XP, check the box at the bottom of the tab that says "Show processes from all users". On Win 7 there's a button at the bottom with an admin icon that says "Show processes from all users".

Sort the Image Name column ascending or descending.
Look for a process called SignalIslandUI.exe and one called TurboBoost.exe. Right-click SignalIslandUI.exe and end the process (it won't hurt anything, it just unloads it). If the TurboBoost.exe process disappears at the same time, you've got the Intel Turbo Boost Technology Monitor software running.

An alternative is to open Control Panel and go to the uninstall-programs panel. Look for an installed program called Intel Turbo Boost Technology Monitor 2.0; that's what I ended up uninstalling after doing all the testing.
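The same check can be scripted. The sketch below only shows the name-matching logic; the two process names are the ones mentioned above, and on a real Windows box the list would come from parsing `tasklist /fo csv` output (the sample names here are made up):

```python
TURBO_PROCS = {"signalislandui.exe", "turboboost.exe"}

def find_turbo_boost(process_names):
    """Return which Turbo Boost Monitor processes appear in the list (case-insensitive)."""
    return sorted({name.lower() for name in process_names} & TURBO_PROCS)

# Sample list; in practice, parse the Image Name column of `tasklist /fo csv`.
names = ["explorer.exe", "SignalIslandUI.exe", "TurboBoost.exe", "trainz.exe"]
print(find_turbo_boost(names))  # both names found: the monitor software is running
```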

Guys! Please keep in mind that every system is different. Run your frame-rate testing before and after. Don't uninstall the software before you're sure; just end the processes in Task Manager as explained above when you want to see if that helps.
 
Okay, you guys have my attention now. I have an Intel MB so how do I check for this Turbo Boost? In easy to understand steps. Thanks.

Easy. Check your BIOS. Press F2, if your motherboard is like mine; you might have to press it a couple of times to get into the BIOS. Check the CPU settings, overclock settings, etc., and look for Turbo Boost. If it's checked, it's on. Mine is checked, but I don't load any software for it in Windows.

In Windows, look for an application called "Turbo Boost" using Task Manager (Ctrl+Shift+Esc will usually bring it up without crashing any graphics programs). If it's running, then it's loaded. Try removing it from startup or killing it and see if Trainz runs better.


I've got an Asus MB with a 2nd-gen i7 processor (Q20?? something), so it's an intermediate-power i7, with 8 GB of RAM and Win7 64-bit Home Premium. The GPU is an nVidia 560 with 2 GB of RAM.

I managed to squeeze another 5 to 10 FPS out of the thing this afternoon. My monitor is an LG LCD at 1920x1080, fed over an HDMI connection. Because it is an LCD I've always set up every program with vsync on. Today I turned it off just to see what would happen. I reran a benchmark program I have, and the 3D graphics rating went from 1639 to 2234. I wondered if that was real. It was. Trainz FPS on the ECL Lincoln relief session stayed in the mid-to-high 20s in the large, busy train yard, which surprised me, but once out of the yard they shot up to the high 50s, and eventually (I think after all the textures in use got cached in that 2 GB of onboard VRAM) they went over 60 and stayed between 65 and 72. Now, that's weird, because the max frame rate an LCD monitor can display is, to the best of my knowledge, 60 FPS. But maybe, like a CRT, more FPS can still be rendered. I don't know.

That higher FPS with vsync turned off led me to the next thing I'm going to check: the coupler-breakage problem in the Mojave to Bakersfield session. After reading the rather large thread on the subject, seeing that some players have the issue (me) and some never see it, and reading all the potential causes, I think the coupler-strain calculations are likely tied to frames. It is possible that the physics calculations work on a snapshot of the environment's physics parameters for each frame before it is sent to the GPU, not on what is actually displayed, forcing an out-of-sync condition that could create calculation errors. The other possibility is that the nVidia 560 pre-renders up to 4 frames ahead; one wonders if that could create an out-of-sync condition that disturbs the physics calculations. I'll be checking those two hypotheses tomorrow.

This is interesting. Your motherboard is similar to the one I just retired, and the system overall is spec'd the same. I'm going to try the VSync thing and see what it does. I drive my Samsung 27-inch monitor at the same resolution. Heck, we'll try anything to squeeze out a drop of FPS anywhere! :)

Regarding the Mojave to Bakersfield session: I completed it. You need to be careful with the throttle. Screen sync really doesn't have much to do with it other than lag, which can confuse you because the display may not update right away, causing you to overreact. Turning off VSync might resolve that if it helps with the FPS. I really doubt it has anything to do with your video card beyond what I mentioned.

Interesting stuff and a really great find. :)

John
 
Because it is an LCD I've always set up every program with vsync on. Today I turned it off just to see what would happen. I reran a benchmark program I have, and the 3D graphics rating went from 1639 to 2234. I wondered if that was real. It was. Trainz FPS on the ECL Lincoln relief session stayed in the mid-to-high 20s in the large, busy train yard, which surprised me, but once out of the yard they shot up to the high 50s, and eventually (I think after all the textures in use got cached in that 2 GB of onboard VRAM) they went over 60 and stayed between 65 and 72. Now, that's weird, because the max frame rate an LCD monitor can display is, to the best of my knowledge, 60 FPS.

You're misunderstanding what the FPS counters are telling you. Here's a quick explanation of how this works:

* The GPU has an area of its RAM set aside for the pixel array that you see on the screen.
* Each screen refresh (at 60Hz, or whatever your display is set to) reads through that buffer, encodes the data, and sends it across the cable to your display. This is what you are seeing on the screen, and it occurs at a fixed rate regardless of whether the pixels have changed or not. It takes approximately one refresh interval (i.e. 1/60th of a second) to send the entire screen data across.
* The game renders into its own private buffer. Once rendering is complete, this buffer is copied onto the screen buffer. The "FPS" measurement determines how many times per second this process is occurring. The actual copy is very fast, but not instant. Rendering and other game operations take most of the time here.
* You can think of both processes (the screen refresh, and the buffer copy) as a line that is moving down the screen, rapidly updating the pixels. The screen refresh moves (relatively) slowly, the buffer copy moves very quickly.
* If the screen refresh is part-way through when the buffer copy occurs, the buffer copy "line" will quickly catch up to the screen refresh "line". Everything sent to the screen prior to this catch-up will be from the previous frame, but once buffer copy overtakes the screen refresh, any further data sent to the screen is from the new frame. This leads to a visible disjoint ("tear") on the screen where you can see one frame near the top but a different frame (often from a slightly different camera position) near the bottom. How noticeable this is depends on how visually different the two frames are, and the exact timing of the two processes.
* VSync works by delaying the buffer copy until the screen refresh is out of the way. This completely prevents tearing, but comes at the expense of FPS (since there's now an additional delay in the process). It also means that the game will never have an FPS higher than what your display is capable of showing.
* Without VSync, your frame rate will be as fast as your computer can produce. If this exceeds the capabilities of your display, the additional frames will be produced but will simply never make it to the display before they are replaced by a subsequent frame.
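The buffer-copy behaviour described above can be put into numbers. This toy model (my own illustration, not actual driver code) counts how many distinct rendered frames ever reach a 60 Hz display when VSync is off, assuming perfectly even frame times:

```python
def frames_displayed(render_fps, refresh_hz=60.0, seconds=1.0):
    """Count distinct rendered frames that actually reach the screen.

    Without VSync the game keeps overwriting the screen buffer, so each
    refresh sends whichever frame is current; at most refresh_hz frames
    per second are ever seen, whatever the FPS counter reports.
    """
    shown = set()
    for r in range(int(refresh_hz * seconds)):
        t = r / refresh_hz              # moment of this screen refresh
        shown.add(int(t * render_fps))  # index of the frame currently in the buffer
    return len(shown)

print(frames_displayed(90))   # 60: a third of the rendered frames never appear
print(frames_displayed(45))   # 45: below the refresh rate, every frame is shown
```

This deliberately ignores tearing: in reality a single refresh can contain parts of two different frames, as described above.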

hth,

chris
 
Thank you, Chris, for the explanation; it makes sense to me. The process is a bit different than it was with CRTs, since there's no longer a need for interlacing, which is what I was thinking of with the V-Sync. In the old days of CRTs, V-Sync was used to control the interlacing of the two halves of an image: the top field, then the bottom field, with a return signal back to the top again. Back in my days as a technician I used to test for sync problems on both 60 Hz and 50 Hz terminals. This was controlled off the bit clock, as it was called, coming out of the Signetics 2672 video timing controller. It's funny I still remember this; I haven't touched a scope on these things since 1987 :D

John
 
Thanks WindWalker - great explanation. I've been hearing about the tearing issue since God was a baby, but I have never, ever seen it, or if I did, I didn't notice it.

Long story short, then: if tearing isn't visible and turning off vsync eliminates the buffer-copy delay, does turning off vsync on an LCD monitor actually accomplish anything, regardless of what frame rate the counter software says is being sent to the monitor?

One of the things I noticed when uninstalling the Intel Turbo Boost software was that the horrible stutter I was seeing in Trainz almost completely disappeared. It went from a noticeable momentary skip to just a quick skip. Turning off vsync (my perception; maybe wishful thinking) decreased the number of skips and seemed to make them short enough that they feel more like a "huh, what was that?" moment than an actual stutter.

What causes stutter and how can one get rid of it?
 
Thanks WindWalker - great explanation. I've been hearing about the tearing issue since God was a baby, but I have never, ever seen it, or if I did, I didn't notice it.

Long story short, then: if tearing isn't visible and turning off vsync eliminates the buffer-copy delay, does turning off vsync on an LCD monitor actually accomplish anything, regardless of what frame rate the counter software says is being sent to the monitor?

One of the things I noticed when uninstalling the Intel Turbo Boost software was that the horrible stutter I was seeing in Trainz almost completely disappeared. It went from a noticeable momentary skip to just a quick skip. Turning off vsync (my perception; maybe wishful thinking) decreased the number of skips and seemed to make them short enough that they feel more like a "huh, what was that?" moment than an actual stutter.

What causes stutter and how can one get rid of it?

I would say that if there's no apparent problem with the graphics, don't worry about it.

Stutters are caused by many things, ranging from a highly fragmented hard disk (a mechanical drive, that is) as data is loaded, to lots of high-poly objects in an area such as a city or big rail yard, and many things in between. I'm sure, too, that if the CPU is slowing down, the program has to work harder to produce the graphics, and that can cause stutters as well. That would explain why you see them with Turbo Boost on: it works in spurts to give you extra performance even as the CPU heats up and slows down, so you're probably seeing the boost kick in and then the CPU drop back to normal. With the boost off, the clock speed is steady.

All in all, ensure you have adequate cooling because as your computer heats up, the performance will drop off dramatically.

John
 
The process is a bit different than it was with CRTs, since there's no longer a need for interlacing, which is what I was thinking of with the V-Sync.

You're probably thinking of the Vertical Blanking Interval, which is strongly related to VSync. VSync is, and always has been, the process of synchronising the screen buffer updates with the actual Vertical Blanking Interval, regardless of the hardware in use. All computer display hardware (CRT, LCD, OLED, plasma, whatever) works roughly the same way, partly for legacy reasons, partly because it's sensible.

Interlacing doesn't really affect the issue; it just means that there are two vertical blanks per full screen frame rather than one. This is an interesting trade-off which adds apparent motion blur to the resulting image. It's not commonly used for computer displays, even CRTs, because you generally want computer displays to have a high refresh rate. Many people can see 60Hz flicker on a CRT computer screen, so a good screen will have a 75Hz-120Hz refresh. Older TV screens avoid this problem by having the phosphor take longer to decay. This introduces ghosting but prevents visible flickering and adds motion blur, both of which help trick the eye into seeing a steady image instead of a flickering mess.


Werewolf13 said:
If tearing isn't visible and turning off vsynch eliminates the buffer copy delay does turning off vsynch with an LCD monitor actually accomplish anything regardless of what the actual frames being sent to the monitor is shown by the counter software?


If the tearing doesn't bother you, then you will get a higher frame rate by turning off VSync. If your frame rate is already in excess of your LCD's refresh rate (often 60Hz) then there's absolutely no benefit to this. If your frame rate is below 60fps, then you may see a modest improvement.

To give a worst-case example, consider that the game is capable of rendering at 59fps. With VSync off, you will get 59fps on a 60Hz screen, meaning that for every 60 screen refreshes, two will be identical. (Note: this is completely ignoring tearing, which would be horribly visible in this example.) We call this "dropping a frame" and you can say that in this example, one frame is dropped every second. It will lead to a slight shudder once per second if your camera is moving. If we assume that every frame is taking an equal amount of time to render, then you can see that each frame takes slightly longer than 1/60th of a second. If you turn on a basic VSync, anything which misses the first refresh (1/60th of a second) will have to wait until the second refresh (2/60th) before it is displayed. In a simplistic case, this would result in your game dropping to 30fps when VSync is on. The result will be silky smooth, but at a much lower frame rate. A lot of people will still actually prefer this, but not those who care more about the number than the appearance :-)
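The 59fps-to-30fps drop in the worst-case example follows from simple arithmetic: with basic double-buffered VSync, a finished frame waits for the next refresh tick, so the effective rate snaps down to refresh/1, refresh/2, refresh/3, and so on. A rough sketch of that math (an illustration of the simplistic case above, not real driver behaviour):

```python
import math

def vsync_fps(render_fps, refresh_hz=60):
    """Effective FPS with basic VSync: every frame waits for a refresh tick."""
    frame_time = 1.0 / render_fps                      # seconds per rendered frame
    ticks_waited = math.ceil(frame_time * refresh_hz)  # refreshes until it can be shown
    return refresh_hz / ticks_waited

print(vsync_fps(59))    # 30.0: just missing 60 Hz halves the rate
print(vsync_fps(100))   # 60.0: capped at the refresh rate
print(vsync_fps(25))    # 20.0: snaps down to 60/3
```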

In practice, many GPU drivers will implement 2-3 buffers, so that the hardware doesn't simply stop while waiting for VSync. Once the game has finished rendering the first buffer, it will start on the second even though the first hasn't been displayed. The pre-prepared buffers are then copied to the screen at exactly the right moment to prevent tearing. This does add a little latency to the screen update, and may cause the camera movement to be slightly jumpy, but results in higher average frame rates without introducing tearing.

As always, it's best to try a few settings and see what you prefer on your hardware. Different hardware gives different results, and different people prioritise different things in the output. Some people prefer high frame rate. I personally prefer a smooth image and no tearing.

As John says, make sure that your computer is not overheating. It shouldn't under normal usage (that would be very poor hardware design), but some do. Regardless of what other steps you take, if your CPU or GPU is overheating you won't get good results: the hardware will reduce performance drastically to avoid permanent damage, or may even crash.

chris


 
In a simplistic case, this would result in your game dropping to 30fps when VSync is on. The result will be silky smooth, but at a much lower frame rate. A lot of people will still actually prefer this, but not those who care more about the number than the appearance :-)

The drop to 30 fps when the frame rate can't maintain the monitor's refresh rate (60 Hz, which requires a consistent 60 fps) is by no means smooth, and is the main reason people get stuttering when v-sync is enabled.

A good explanation of how v-sync works -

http://hardforum.com/showthread.php?t=928593

In addition, the closest you're going to get to 30 fps being silky smooth is with v-sync enabled, the frame rate locked at 30 fps, and triple buffering, which is usually only supported in OpenGL applications (games). In reality, the only games I've seen that look decent at 30 fps used motion blur (mostly console games).

Nvidia does have an adaptive v-sync option at half the refresh rate, which is useful if you can't maintain 60 fps in a particular game. It's still not as smooth as a consistent 60 fps with v-sync enabled, but it's the next best thing.

http://www.hardocp.com/article/2012/04/16/nvidia_adaptive_vsync_technology_review

http://www.anandtech.com/show/2794
 
NOTE: one poster mentioned that on his Intel MB the tech was hardwired into the BIOS and not loaded as part of the OS. If that is the case it may be possible to disable it by changing an enabled/disabled switch in the BIOS settings, but if you don't know what you are doing or aren't comfortable messing with the BIOS then DON'T DO IT!

Otherwise, to see if you have the Intel Turbo Boost Technology Monitor software:

Open Task Manager.
Go to the Processes tab.
If you're running XP, check the box at the bottom of the tab that says "Show processes from all users". On Win 7 there's a button at the bottom with an admin icon that says "Show processes from all users".

Sort the Image Name column ascending or descending.
Look for a process called SignalIslandUI.exe and one called TurboBoost.exe. Right-click SignalIslandUI.exe and end the process (it won't hurt anything, it just unloads it). If the TurboBoost.exe process disappears at the same time, you've got the Intel Turbo Boost Technology Monitor software running.

An alternative is to open Control Panel and go to the uninstall-programs panel. Look for an installed program called Intel Turbo Boost Technology Monitor 2.0; that's what I ended up uninstalling after doing all the testing.

Guys! Please keep in mind that every system is different. Run your frame-rate testing before and after. Don't uninstall the software before you're sure; just end the processes in Task Manager as explained above when you want to see if that helps.

Sorry to necropost, but I just stumbled across this thread and it piqued my interest. I did all the checks; I have Turbo Boost enabled in my BIOS but the exe itself doesn't appear to be installed. Do I need to do anything else? This is an interesting experiment to try. I know how to disable it in the BIOS if need be.
 
Sorry to necropost, but I just stumbled across this thread and it piqued my interest. I did all the checks; I have Turbo Boost enabled in my BIOS but the exe itself doesn't appear to be installed. Do I need to do anything else? This is an interesting experiment to try. I know how to disable it in the BIOS if need be.

Hi Boc,

If you don't have the software loaded, don't worry about it. The hardware (BIOS) part doesn't seem to affect the performance.

John
 