Switching off discrete graphics to use the IGP for power saving

yunti

Junior Member
Feb 27, 2014
3
0
0
I recently set up a new computer (specs below) which was whisper-quiet and pretty low-power in use (26W for internet browsing). However, I do want to use the machine for gaming perhaps 10% of the time, so I put in a GTX 580 card for now.
However, light browsing has now jumped the power up to 60W, and I can hear the fans of the graphics card. That's more than double the power, for no extra real use 90% of the time.
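
To put rough numbers on that delta, here's a quick back-of-the-envelope sketch (the electricity rate is just an assumption -- plug in your own tariff):

Code:
# Rough yearly cost of the extra idle draw. The wattages are my readings;
# rate_per_kwh is an assumed tariff, not a measured figure.
idle_igpu_watts = 26   # browsing, iGPU only
idle_dgpu_watts = 60   # browsing, GTX 580 installed
rate_per_kwh = 0.12    # assumed $/kWh

extra_kwh = (idle_dgpu_watts - idle_igpu_watts) * 24 * 365 / 1000
print(f"{extra_kwh:.0f} kWh/year extra, ~${extra_kwh * rate_per_kwh:.0f}/year")

That's nearly 300 kWh a year if the machine idled around the clock, so it's not just the noise that bothers me.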

- Is there a way to disable the discrete graphics card (without unplugging it) and use the Intel IGP instead, to get back the low power and lower noise?

I have tested changing the primary display setting in the BIOS to IGP, but unfortunately that didn't help. It just meant that output via the discrete card was disabled for the POST and BIOS screens; it could still be used in Windows. Disabling the graphics card in Device Manager didn't help either; the power remained the same.

Is there another way this could be achieved?

Thanks.



Specs:
OS: Windows 8.1
Motherboard: Asus H87M Pro
CPU: Intel Core i5-4570 (Haswell)
GFX: Nvidia GTX 580
 

MeldarthX

Golden Member
May 8, 2010
1,026
0
76
Not on the desktop that I know of... and the 580 is a bit of a hungry card... you can pick up newer cards that aren't nearly as power-hungry.
 

BonzaiDuck

Lifer
Jun 30, 2004
15,785
1,500
126
Not on the desktop that I know of... and the 580 is a bit of a hungry card... you can pick up newer cards that aren't nearly as power-hungry.

I wouldn't know for sure how BIOS features have changed over roughly two generations of newer motherboards. But the Z68 (and same-generation) chipsets addressed coordination between the iGPU and dGPU with a direct reference to Windows software in the System Agent menu of the BIOS -- a "multi-monitor" setting for use with Lucid Virtu.

I've been experimenting with this all along, but for most of this computer's life it was arranged in "dGPU mode." (The OP will understand what this means by contrast with "iGPU mode.") Initially, I wasn't concerned about power savings.

I switched to iGPU mode because I began to think I had too many monitors (two, in fact) connected to my GTX 570 card. This card is also a power hog like the OP's GTX 580 -- more power-hungry than a GTX 770 by my recent investigations.

The dGPU still feeds my AVR/HDTV with programming (more or less 24/7); my desktop monitor is connected to the motherboard iGPU port. You can turn Virtu on and off at will. If I play games on the desktop monitor, the gaming performance seems just noticeably better than when the system was configured for dGPU mode. If you turn Virtu off in this iGPU-mode configuration, gaming suddenly descends to the quality of a late-'90s graphics card with 64MB of VRAM -- great for desktop applications, minimally sufficient for any type of gaming. So far, this is fine.

But I've been watching the temperatures for my GTX 570 in NVidia Inspector. In the previous configuration (both monitors connected to the dGPU), it seemed the temperatures were around 50C. Now -- with Virtu enabled -- the dGPU is running at about 63 to 64C. I've watched it over time since yesterday: there was a period when the P-state changed and it dropped back to about 53C. I checked again, and it is back to P0 and above 60C for the GTX 570.
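
For anyone who would rather log this than eyeball Inspector, here's a minimal Python sketch using the pynvml bindings (an assumption on my part that NVML and those bindings play nicely with your driver) to record the P-state and core temperature once a minute:

Code:
# Minimal NVML logger -- assumes the pynvml package is installed and the
# NVIDIA driver exposes NVML for this card.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        pstate = pynvml.nvmlDeviceGetPerformanceState(handle)  # 0 = P0 (max perf)
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(time.strftime("%H:%M:%S"), "P%d" % pstate, "%dC" % temp)
        time.sleep(60)  # one sample per minute
finally:
    pynvml.nvmlShutdown()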

There is very little in the way of feature documentation or configuration guides for the VIRTU software. I can only say that it "works," but this new regime of temperatures has me searching for more information. I would say it is definitely NOT saving me on the power bill.

You don't need to use the Virtu software, since you can run the Intel HD3000/4000 with monitor(s) connected and the dGPU (570/580/etc.) with monitor(s) connected separately.

I'm now wondering how it would work to connect the home-theater HDMI link to the iGPU and run Virtu. I would hope or expect that this would allow the dGPU to work at lower power. I just don't know yet, and the switchover would likely require me to reconfigure audio and Media Center settings.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
What about using *2* cables between the computer and the display?

Plug one cable between the discrete video card and a first input port on the display (e.g., HDMI).

Plug the second cable between the integrated graphics and a second input port on the display (e.g., VGA).

Then you can easily use the built-in Windows shortcut Win+P to toggle between the two "displays" that are now connected to your two different video cards (even though really it's one display connected via two different cables to both video cards).
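
If you'd rather script that toggle than press the keys, Windows ships DisplaySwitch.exe (it's what Win+P invokes) and it accepts /internal, /external, /clone and /extend. A rough sketch -- which physical output counts as "internal" versus "external" depends on your particular setup, so experiment first:

Code:
# Hedged sketch: wraps DisplaySwitch.exe, which ships with Windows 7/8.
# The internal/external mapping is machine-specific -- test before relying on it.
import subprocess
import sys

def switch_display(mode):
    assert mode in ("internal", "external", "clone", "extend")
    subprocess.run(["DisplaySwitch.exe", "/" + mode], check=True)

if __name__ == "__main__":
    switch_display(sys.argv[1] if len(sys.argv) > 1 else "internal")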
 

BonzaiDuck

Lifer
Jun 30, 2004
15,785
1,500
126
As much as the Lucid software didn't find many adherents in these AnandTech forums, I'm not sure either the reviewers or the enthusiasts explored all its possibilities. I remain skeptical either way.

But I'm wondering if perhaps the MVP version of Virtu holds a better key to configurations that might reduce power consumption. Here's a link to a year-old Maximum PC article on the topic for SB or IB systems. Haswell had not yet appeared in February 2013, so it isn't mentioned, but there's little doubt that the software and the (sketchy) configuration explanations apply equally to newer processors with integrated graphics:

http://www.maximumpc.com/node/24581?page=0,0

The only thing I can guarantee so far: while I suspected that Virtu might have been behind an occasional instability I was troubleshooting, I am now absolutely sure it wasn't the cause. Since it can be switched on and off in Windows -- from my experience, without stopping any applications or rebooting -- I see no reason to discourage its use.

I just don't know how much it bears on this discussion, but given how heavily motherboard makers promoted it, it may well.

I just don't know yet.
 

BonzaiDuck

Lifer
Jun 30, 2004
15,785
1,500
126
What about using *2* cables between the computer and the display?

Plug one cable between the discrete video card and a first input port on the display (e.g., HDMI).

Plug the second cable between the integrated graphics and a second input port on the display (e.g., VGA).

Then you can easily use the built-in Windows shortcut Win+P to toggle between the two "displays" that are now connected to your two different video cards (even though really it's one display connected via two different cables to both video cards).

I had been reluctant to even try this, and what you are suggesting is to use two "display adapters" with one monitor. Does this work?

In my case, I want at least two monitors using all the resources of the iGPU and dGPU. So the question becomes (first) whether connecting all monitors to the mobo iGPU is better than connecting all monitors to the dGPU, or spreading the monitors between the two adapters (which I now do). This, of course, would occur in the context of BIOS-enabled VIRTU usage.

You don't NEED the VIRTU software for this: you can enable the iGPU to grab 512MB of RAM as if it were a separate graphics card with that much memory, and then use the two adapters separately with a monitor connected to each. But if you want to game on the iGPU's monitor, you get "512MB" performance. If you instead enable "multi-monitor," allocate a maximum of 64MB of RAM to the iGPU, and run VIRTU, you get "dGPU" performance from the same iGPU/monitor setup.

I can guarantee that with the VIRTU software and a single monitor in "iGPU mode," the user shouldn't be disappointed with the performance, and you can turn the software off at will with a mouse-click. So there would be no need for this dual-connection, single-monitor approach.

The problem where the OP and I overlap is power consumption.
 
Last edited:

Piroko

Senior member
Jan 10, 2013
905
79
91
I'm not aware of a reliable way to switch between the IGP and discrete graphics on the fly on desktops.

If you're fine with losing a bit of performance (around 20%, I'd say), you could sell your GTX 580 and buy a GTX 750 Ti as a replacement. That should bring noise and power consumption way down, both at idle and under load.

If that's not an option, I would probably wait for cards manufactured on 20nm and then upgrade.
 

BonzaiDuck

Lifer
Jun 30, 2004
15,785
1,500
126
I'm not aware of a reliable way to switch between the IGP and discrete graphics on the fly on desktops.

If you're fine with losing a bit of performance (around 20%, I'd say), you could sell your GTX 580 and buy a GTX 750 Ti as a replacement. That should bring noise and power consumption way down, both at idle and under load.

If that's not an option, I would probably wait for cards manufactured on 20nm and then upgrade.

Well, the OP can make his own decisions, but I'm right at the point of ordering a GTX 780 card touted for performance and cooling. A GTX 770 likely consumes less power. But I'd really like to resolve this power-consumption issue in the context of "iGPU mode, dGPU enabled" with the VIRTU software.

Only the reviewers and pubs like Maximum PC have been even mildly supportive of the Lucid SW. There were vague "warnings" that we would have trouble with it. I don't think I ever really had trouble due to it, and like I said, you can switch it off and on in a window. The only drawback: the iGPU is configured to use only 64MB of onboard RAM. It behaves as though it had your dGPU's VRAM when Lucid is enabled, and like a 64MB graphics card when it isn't. But it's about the power -- and for the dGPU, under my apprehensive observations, it's about the heat.

UPDATE [of my own trivia]: The plot thickens. It's nothing about my GTX 570 and HD 3000 configuration that is causing the graphics card to run hotter. It is running hotter because I put my big 200mm side-panel fan under mobo thermal control, while the NVidia card continued to control its own fan. The card's fan was running on "auto" in NVidia Inspector, showing a 49% duty-cycle, so I decided to see what happened if I ran it up to 70%. First, I can't hear it, even at that speed. And second, the temperature drops back from 64/65C to about 52C -- the way it was before I started thermally controlling the side-panel fan.
 
Last edited:

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Using the dual-cable approach, you'd have to make sure to switch the active input on the display. On my Samsungs, there is a button that lets you cycle between different inputs, and I leave my displays set to "Manual" so I can control which video card is feeding the display. But you could also set the display to "auto" and then enable/disable the video card output using Win+P to toggle which video card output is active.

The worst-case scenario is that you'd also have to hit Win+Break to open the System window, launch Device Manager from there, and manually enable/disable the video cards. But I don't think that's necessary. I think merely using the Windows shortcut Win+P to toggle which output is active will also cause the inactive video card to power down and conserve power.
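
And if the Device Manager step does turn out to be necessary, it can be scripted with Microsoft's devcon.exe (part of the Windows Driver Kit, so you'd have to grab it separately). A sketch, with the caveat the OP already hit -- a disabled card may still draw some idle power, so verify with a meter:

Code:
# Sketch using devcon.exe (from the WDK) to flip the NVIDIA card on and off.
# 10DE is NVIDIA's PCI vendor ID; the wildcard matches the discrete card.
# Run from an elevated prompt.
import subprocess

def set_dgpu(enabled):
    action = "enable" if enabled else "disable"
    subprocess.run(["devcon.exe", action, "PCI\\VEN_10DE*"], check=True)

set_dgpu(False)  # disable the discrete card; pass True to re-enable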
 

BonzaiDuck

Lifer
Jun 30, 2004
15,785
1,500
126
Using the dual-cable approach, you'd have to make sure to switch the active input on the display. On my Samsungs, there is a button that lets you cycle between different inputs, and I leave my displays set to "Manual" so I can control which video card is feeding the display. But you could also set the display to "auto" and then enable/disable the video card output using Win+P to toggle which video card output is active.

The worst-case scenario is that you'd also have to hit Win+Break to open the System window, launch Device Manager from there, and manually enable/disable the video cards. But I don't think that's necessary. I think merely using the Windows shortcut Win+P to toggle which output is active will also cause the inactive video card to power down and conserve power.

I'm trying to visualize this Win+P toggle you speak of. But perhaps my situation is different from the OP's or yours. I want to be able to run up to three monitors -- and I know I can. I can do it with the GTX 570 and HD 3000: the GTX allows two monitors tops; I can run either two of the three or one of them off the HD 3000.

But it gets back to this mythical "power-saving" feature of iGPU mode with Lucid for the HD 3000. I can see that the power saving doesn't occur if you're running a monitor off the dGPU. I'm running my HDTV 24/7 off the GTX 570 at the moment -- thus no power saving.

I just clicked "proceed to checkout" to purchase an ASUS GTX 780 "OC" card. That's second from the top of their line. It's a $500 card. The 780 Ti card -- which looks very much the same -- is $700. I was afraid the Ti card would overtax my PSU, but probably not. In a Guru3D review of the card I ordered, the GTX 570 is about neck-and-neck with it in power consumption. If I wanted, I could run three monitors off the GTX 780 alone. I'll be curious to see what happens with two monitors connected to the mobo iGPU, using the 780's 3GB of VRAM through the Lucid software. Theoretically, the dGPU (GTX 780) would lapse into a lower P-state as a "passive resource" with no monitor connected. Therein would lie the power saving.

Just an afterthought. With the NVidia cards, the NVidia software has its own "multi-monitor" configuration screen. It is possible to set up a second monitor, and then turn it off from within the software. And -- it works. Once the selection is executed, the monitor goes into a sleep state. So there are likely additional possibilities for controlling monitors and display adapters.
 
Last edited:

AMD64Blondie

Golden Member
Apr 20, 2013
1,660
140
106
Nice choice. The GTX 780 is a beast.

(I own a Gigabyte GTX 780, which is in my main Windows 7 PC.)
 

BonzaiDuck

Lifer
Jun 30, 2004
15,785
1,500
126
Nice choice. The GTX 780 is a beast.

(I own a Gigabyte GTX 780, which is in my main Windows 7 PC.)

Somewhere in this forum, I may have posted a link to a 2014 Tom's Hardware article on "best graphics cards." The article begins by noting IT pros' inclination to discount top-of-the-line cards from a bang-for-buck perspective. The graph they provide -- bars for performance, a line with dots for price, normalized so a GTX 690 sits at 100% -- shows the point where prices begin to rise dramatically for the last 25% of performance. I think the 690 was last generation's dual-GPU card, and we are waiting for the release of a dual-GPU 790 -- or we were as of last month.

So less expensive cards in a $300 to $400 range fall around the point where the price-curve suddenly starts to bend. They include the GTX 770 cards, for the most part.

I figured if I were planning to get a GTX 770 for $400+, I might as well "graduate" to a 780 for an extra Franklin. My biggest concern was how the card handles temperatures, so I have high expectations for the ASUS card. It is constructed with what looks like a ventilated metal back-plate. Metal or not, the plate has a natural potential for ducting airflow, while the card's cooling ability rests on a "direct touch" copper heatpipe assembly.
 

yunti

Junior Member
Feb 27, 2014
3
0
0
But perhaps my situation is different from the OP's or yours.

I think it is a different issue, but you are welcome to open a thread for it, which may help to stop confusion between the two separate issues.

Thanks for your posts, though.

Unfortunately, getting a different low-power card isn't really a solution; my current GTX 580 is just an example. I need a high-power card and will shortly be putting in a 780 Ti instead, which is only needed when gaming.

That still has the same problem of needlessly drawing power when it doesn't have to -- and it still occurs even when the card has no cable plugged into it.

There doesn't seem to be any software solution, unfortunately. Whatever the setting, the discrete GPU still draws power while it's plugged into the board -- even if it's not used.

The only solution I can think of (short of taking it out of the motherboard when it's not used, which isn't a good solution) is to get some sort of PCIe extension cable with an on/off switch that connects or disconnects the card, changeable on reboot. (Not that ideal either.)

Let me know if you know of another solution.

thanks,
 

yunti

Junior Member
Feb 27, 2014
3
0
0
http://forums.tweaktown.com/applica...-virtu-mvp-does-really-help-power-saving.html

Virtu with the monitor hooked up through iGPU looks like the only software solution available. Idling with the display connected to the iGPU seems to save power vs idling with the display connected to the dGPU. Dunno how it compares to iGPU only and dGPU only.

Yes, I would agree with that: using both, Virtu can lower the idle power (by idling on the iGPU). However, idle power in that case is still significantly above idling with the iGPU only and no dGPU installed.
(This looks to back up those findings too: http://www.bit-tech.net/hardware/motherboards/2011/05/13/what-is-the-intel-z68-chipset/4 -- fourth graph down.)
 

coffeejunkee

Golden Member
Jul 31, 2010
1,153
0
0
Sadly it's not possible. Laptop users have had Optimus/Enduro for a long time now, but we desktop users are just supposed to take the hit.

I messed around with Virtu a bit trying to accomplish the same, but it only made things worse because my GPU wouldn't even downclock anymore. It's only useful for a) using QuickSync and b) connecting another monitor to the iGPU.
 

BonzaiDuck

Lifer
Jun 30, 2004
15,785
1,500
126
I think it is a different issue, but you are welcome to open a thread for it, which may help to stop confusion between the two separate issues.
Thanks for your posts, though.
Unfortunately, getting a different low-power card isn't really a solution; my current GTX 580 is just an example. I need a high-power card and will shortly be putting in a 780 Ti instead, which is only needed when gaming.
That still has the same problem of needlessly drawing power when it doesn't have to -- and it still occurs even when the card has no cable plugged into it.
There doesn't seem to be any software solution, unfortunately. Whatever the setting, the discrete GPU still draws power while it's plugged into the board -- even if it's not used.
The only solution I can think of (short of taking it out of the motherboard when it's not used, which isn't a good solution) is to get some sort of PCIe extension cable with an on/off switch that connects or disconnects the card, changeable on reboot. (Not that ideal either.)
Let me know if you know of another solution.
thanks,

coffeejunkee said:
Sadly it's not possible. Laptop users have had Optimus/Enduro for a long time now, but we desktop users are just supposed to take the hit.
I messed around with Virtu a bit trying to accomplish the same, but it only made things worse because my GPU wouldn't even downclock anymore. It's only useful for a) using QuickSync and b) connecting another monitor to the iGPU.

To save the clutter, take a look at the last half of my post from this morning (8:54A PT), in the thread entitled "Heh-heh! It's a--LIVE!! . . . ":

http://forums.anandtech.com/showthread.php?p=36128747#post36128747

If yunti is about to install his own 780 GTX or "Ti" card, he may discover what I did. I'm even receptive to pronouncements that "this is all in my imagination," but I'm pretty sure I "saw what I saw."

On this power-consumption issue, I had tried lowering the power state of the GTX 570 through NVidia Inspector. It works, of course, but it's a manual setting. And since I was running my HDTV off the dGPU (with or without Lucid), there were lags between audio and video that made it totally unacceptable.

But the GTX 780 has some sort of adaptive adjustment. I can't confirm what this would be without Lucid, but with Lucid running in "iGPU mode" -- my desktop/gaming monitor connected to the mobo -- it settles into a P8 state: lower power. For the game I tested, which isn't the sort of intensive experience you get with CoD or Crysis and other demanding games, it never even left the P8 state. But the performance with Lucid on was at least as good as it had been with the GTX 570.
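
If anyone wants to check the same thing on their own card, a quick pynvml one-off will do it (the power-draw call is only supported on some boards, so treat it as an assumption):

Code:
# One-shot check of P-state and board power via pynvml.
# nvmlDeviceGetPowerUsage returns milliwatts and isn't supported on every
# board, hence the try/except.
import pynvml

pynvml.nvmlInit()
h = pynvml.nvmlDeviceGetHandleByIndex(0)
print("P-state: P%d" % pynvml.nvmlDeviceGetPerformanceState(h))
try:
    print("Power: %.1f W" % (pynvml.nvmlDeviceGetPowerUsage(h) / 1000.0))
except pynvml.NVMLError:
    print("Power reading not supported on this board")
pynvml.nvmlShutdown()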

I'll need to play with this some more. I bought this card for some other features not necessarily associated with high-performance gaming. I didn't figure that it would install with an adaptive power-saving feature.

You can question this, challenge it, or explore it. I'm going to explore it further after I install the ASUS "GPU Tweak" software. There's a lot that I need to learn about this.
 

Tristor

Senior member
Jul 25, 2007
314
0
71
@yunti,

Since low power consumption and low noise are your priority, you ought to look at getting a GTX 750 Ti. It's not as fast as a GTX 580 at stock speeds, but with the EVGA FTW version of the card you get a hefty factory overclock that puts it neck and neck with the GTX 660, and it still only draws 85W (and it can easily overclock further with the ACX cooler). It should run very cool and quiet (you can mess with fan curves if needed) because of its low power draw/dissipation and the hefty cooler.

Just a thought. It'd give you most of the performance (maybe losing 10%) of your GTX 580 while using a quarter of the power and putting out significantly less heat into your system, so it should help everything stay quieter.

I don't know that springing for a 780 Ti is worth it for your stated objectives.
 

BonzaiDuck

Lifer
Jun 30, 2004
15,785
1,500
126
@yunti,

Since low power consumption and low noise are your priority, you ought to look at getting a GTX 750 Ti. It's not as fast as a GTX 580 at stock speeds, but with the EVGA FTW version of the card you get a hefty factory overclock that puts it neck and neck with the GTX 660, and it still only draws 85W (and it can easily overclock further with the ACX cooler). It should run very cool and quiet (you can mess with fan curves if needed) because of its low power draw/dissipation and the hefty cooler.

Just a thought. It'd give you most of the performance (maybe losing 10%) of your GTX 580 while using a quarter of the power and putting out significantly less heat into your system, so it should help everything stay quieter.

I don't know that springing for a 780 Ti is worth it for your stated objectives.

I'd give that a +1, though it depends on what he wants. I could almost feel guilty about this ASUS GTX 780 model I just finished installing (you can see my other thread in this Video & Graphics forum). For me, it was partly in anticipation of building another system with Haswell-E next year. If that sounds extravagant, I was going to spend a bundle this month to build an IB-E system but changed my mind. So the GTX 780 is chump change next to what's planned for later next year -- no final decisions yet.

I've spent as little as $100 and more than three Franklins on graphics cards from time to time. When I saw the Tom's Hardware 2014 article, I could see you could do quite well with the 770 cards. I finally just decided to get the big enchilada. But it's not a "Ti."

I think I've evolved a level of complexity with this system: multi-monitor, TV, games, two adapters, and the "Lucid thing." Some say "build a dedicated HTPC," but I decided to see if I could do an all-in-one.

It might be interesting to see what happens to power-consumption and adaptive P-states if I put all the monitors on the 780GTX and disable both the iGPU and Lucid. I'll probably do that, too, and pretty soon . . .
 