This version has improved ATI CrossFire support. I added a few more details such as Clock and Memory speed.
On my 5770 cards it's able to read the temperatures and details for each individual card.
If you have one or more ATI cards, you can try this new code by adding this line to the RealTemp.ini configuration file:
GPU=2
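
If you'd rather not edit the file by hand, here's a rough Python sketch that appends the line for you. It assumes RealTemp.ini sits in the current folder, so point INI_PATH at wherever your RealTemp directory actually is.

# Sketch: add GPU=2 to RealTemp.ini if it is not already there.
# Assumes the ini file is in the current folder; adjust INI_PATH to suit.
from pathlib import Path

INI_PATH = Path("RealTemp.ini")

text = INI_PATH.read_text() if INI_PATH.exists() else ""
if "GPU=2" in text:
    print("GPU=2 is already set")
else:
    with INI_PATH.open("a") as f:
        if text and not text.endswith("\n"):
            f.write("\n")          # keep the new entry on its own line
        f.write("GPU=2\n")
    print(f"Added GPU=2 to {INI_PATH}")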
I haven't heard back from any testers yet, so give it a try and let me know if it works on your system or if you run into any problems.
Wishmaker: Thanks for that tip. I found the easiest thing to do was to search through the Registry for EnableUlps, change each value to 0 to disable the feature (or 1 to enable it), and then reboot.
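
For anyone who wants to script that search instead of clicking through regedit, here's a rough Python sketch. It assumes the EnableUlps values live under the usual display-adapter class key the ATI driver uses (the long GUID below); the write step is only shown as a comment because it needs an elevated prompt, and you still have to reboot afterwards.

# Sketch: list every EnableUlps value under the display-adapter class key.
# The GUID below is the standard display-adapter class; if your driver stores
# the value somewhere else, a full registry search will still find it.
import winreg

CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
             r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS_KEY) as cls:
    index = 0
    while True:
        try:
            sub = winreg.EnumKey(cls, index)   # 0000, 0001, ...
        except OSError:
            break                              # no more subkeys
        index += 1
        try:
            with winreg.OpenKey(cls, sub) as key:
                value, _ = winreg.QueryValueEx(key, "EnableUlps")
                print(f"{sub}: EnableUlps = {value}")
                # To change it, reopen the key with KEY_SET_VALUE from an
                # elevated prompt and write 0 (disable) or 1 (enable):
                # winreg.SetValueEx(key, "EnableUlps", 0, winreg.REG_DWORD, 0)
        except OSError:
            continue                           # subkey has no EnableUlps value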

RealTemp respects whatever setting you choose. If you don't like the driver blocking access to the GPU 2 data, set EnableUlps to 0 to get rid of that power-saving mode. With a Kill-a-Watt meter I measured an idle savings of only 3 or 4 watts, so it doesn't make a huge difference.
When both GPUs are reporting temperature data, RealTemp should display the highest current GPU temperature on the main screen and in the system tray.
If you don't have access to both GPUs, I found that just starting Furmark was enough to wake up the second GPU. You don't need to click the GO button and start Furmark rendering; just having the GUI visible or minimized is enough to keep the second GPU awake for testing purposes. When EnableUlps is enabled, GPU 2 will stop reporting temperature and MHz data about 30 seconds after you close Furmark, as long as nothing else is using it.