

Posted

http://techreport.com/ja.zz?comments=12749

In its Financial Analyst Day presentation last December, AMD unveiled a notebook technology it called "hybrid graphics." The premise of hybrid graphics is to allow notebooks to feature both integrated graphics and a dedicated graphics module and to switch seamlessly between the two. Dedicated graphics can be switched off when the notebook is running on battery and turned back on when the system is plugged in. AMD has yet to deliver products with hybrid graphics support, but HKEPC now has word that AMD isn't the only one working on the concept. According to the Hong Kong-based site, Nvidia is developing a similar implementation that it dubs Hybrid SLI.

Unlike AMD's hybrid graphics, Hybrid SLI will allow integrated and discrete graphics processors to combine their processing capabilities—perhaps the IGP could be used to compute physics, for instance. The technology appears to be intended for both notebooks and desktop systems. A purported excerpt from an Nvidia e-mail quoted by HKEPC states the following:

Depending on the processing demands of each application, the discrete GPU may be completely shut-down to save power. For the most powerful of systems, where the combined power of dual Geforce 8800 GTX SLI can reach 400W, both GPUs can be powered down when the user is just doing email, surfing the web, or watching a Blu-ray movie, keeping the system completely quiet and consuming the least possible energy.

But when a game, or other demanding GPU application is launched, the dual 8800 GTXs are powered up to deliver unrivaled performance.

The excerpt concludes, "Hybrid SLI is our core PC ecosystem strategy for the coming year."
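In plain terms, the behavior that excerpt describes would look something like the sketch below. This is only an illustration of the idea; every name in it (IntegratedGpu, DiscreteGpu, route_workload) is hypothetical, since Nvidia hasn't published any Hybrid SLI API details.

# Illustrative sketch (Python) of the switching policy described above.
# Everything here is hypothetical; Nvidia has not published an API.

class IntegratedGpu:
    def render(self, workload):
        print(f"IGP handling: {workload}")

class DiscreteGpu:
    def __init__(self):
        self.powered = False

    def set_power(self, on):
        self.powered = on
        print("Discrete GPU powered", "up" if on else "down (silent, minimal draw)")

    def render(self, workload):
        print(f"Discrete GPU handling: {workload}")

def route_workload(workload, demanding, igp, dgpu):
    """Light tasks stay on the IGP; the discrete card wakes only for demanding 3D work."""
    if demanding:
        if not dgpu.powered:
            dgpu.set_power(True)
        dgpu.render(workload)
    else:
        if dgpu.powered:
            dgpu.set_power(False)
        igp.render(workload)

igp, dgpu = IntegratedGpu(), DiscreteGpu()
route_workload("e-mail / web / Blu-ray playback", False, igp, dgpu)
route_workload("launching a game", True, igp, dgpu)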

I can't wait until this comes out. Then I can buy a high end card and not worry about the electric bill and the cooling in my house while the computer is on!


Posted

No. It's basically using both the integrated and add-in graphics chipsets.

So when you use your computer, you use just the integrated graphics for e-mail, word processing and such, and when you want to play a game your 8800 GTX kicks in and takes over. That way you don't have to have an 8800 screaming in your ear and heating up your house when all you want is to write an e-mail. You use the onboard video for that.

Posted

Okay, something like ePCI-E.

You drop in a PCI-E or PCMCIA card and connect an external GPU with its own PSU. This is what I've seen over the last few months on different news sites.

If not, could you explain in a bit more technical detail how it works?

Posted (edited)

It's not external at all. You purchase a motherboard with onboard/integrated graphics. Then you upgrade the system with a standard add-in PCIe x16 card...no external device and no external PSU. When you're playing games or doing something graphics intensive the system uses the add-in PCIe card. When you're doing general stuff like web browsing, email, word processing, etc. the PCIe card is turned off and the system uses the onboard/integrated graphics.

It's a neat idea but at the same time I wonder why they don't just fix their chips so that the 3D processing units are powered off when you're not using them. I don't see the reason for switching back and forth between two cards other than to be a marketing gimmick. Instead of requiring yet another video device in the PC, make the existing (or new) devices more power efficient...kinda like how Intel turns off part of the Core 2's cache when it's not being used.

Tesla is geared more towards graphics artists and rendering farms.

Edited by nmX.Memnoch
Posted (edited)
It's a neat idea but at the same time I wonder why they don't just fix their chips so that the 3D processing units are powered off when you're not using them.

They already do that, so I wonder what the big deal here is...

I don't see the reason for switching back and forth between two cards other than to be a marketing gimmick.

After wondering I agree with you...

I would have liked to see it be a PCI-E/PCMCIA link to an external solution, so laptops and computers that are "too full" could benefit from it. This "new feature" does seem like just marketing, indeed...

EDIT: They use the term SLI, which I would NOT call SLI unless both GPUs work at the same time. I know the 690G/V chipset can use the onboard graphics together with an add-on PCI-E X1250 to get SLI, but I never got an X1250 so I couldn't test it... (I only see X1050 and X1550 here, nothing in between...)

Edited by puntoMX
Posted

Well, the one thing they mentioned that might be cool was the use of the integrated GPU for other things, like physics processing, while you're playing a game. I guess you could apply the term "SLI" to that part of the technology. But if you think about it, that would negate the battery savings because that's yet another device powered on. Actually, it wouldn't only negate it; the laptop would end up using even more power. On the flip side, if you're gaming on a laptop, are you doing it while running on battery? I wouldn't. Everyone's on the "it uses less power" kick, so they're using that as a marketing angle.
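To put rough numbers on the point about negating the savings (these figures are made up, purely for illustration):

# Back-of-the-envelope with made-up numbers: using the IGP for physics
# on top of the discrete GPU can only add to the total draw while gaming.
discrete_gpu_load_w = 100   # hypothetical discrete GPU draw while gaming
igp_physics_w = 10          # hypothetical extra draw from the IGP doing physics

print(discrete_gpu_load_w)                  # 100 W: discrete card alone
print(discrete_gpu_load_w + igp_physics_w)  # 110 W: discrete card plus IGP physics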

Posted
Well, the one thing they mentioned that might be cool was the use of the integrated GPU for other things, like physics processing, while you're playing a game. I guess you could apply the term "SLI" to that part of the technology.

Exactly, but the only chipset with an integrated GPU that can use the onboard VGA together with an extra AMD/ATi PCI-E card and control, for example, 3 screens at the same time is the 690G/V chipset. Even the nVidia 7025/7050 doesn't support it...

Posted

Yeah, I imagine that what they're working on won't work with anything currently available on the market. Most motherboards automatically disable the integrated graphics when an add-on card is installed.

Posted
It's a neat idea but at the same time I wonder why they don't just fix their chips so that the 3D processing units are powered off when you're not using them.

They already do that, so I wonder what the big deal here is...

I don't see the reason for switching back and forth between two cards other than to be a marketing gimmick.

After wondering I agree with you...

Really? Then why does something like an 8800 GTX consume roughly 150 watts doing absolutely nothing? Just idling there?

Posted
It's a neat idea but at the same time I wonder why they don't just fix their chips so that the 3D processing units are powered off when you're not using them.

They already do that, so I wonder what the big deal here is...

I don't see the reason for switching back and forth between two cards other than to be a marketing gimmick.

After wondering I agree with you...

Really? Then why does something like an 8800 GTX consume roughly 150 watts doing absolutely nothing? Just idling there?

Bruce, this is nothing personal, but please, would you stop saying things that are not right, not even close...

I don't call playing a game on an 8800 GTX "doing nothing"! That's when the 8800 GTX uses 150 watts! Full load is about 180 watts, and idle is about 60 watts!

So stop saying things that are not even close to the truth; 60 watts isn't 150 watts (that's 150% more: (150 - 60) / 60 = 1.5)... :hello:

Posted (edited)

60 watts at idle is still a lot. Hopefully this new Hybrid SLI will help cut that down significantly.

Edited by brucevangeorge
Posted

Well, think about this:

You want the best of the best, and it has a price. If you use Windows XP/Vista, it will use the Direct3D part for the desktop as well, so it's more that Microsoft is making the card work harder, even when it looks idle. It would be cool if they tested it under Linux, for example, or with just plain text on the screen. I'm sure it would be lower than 60 watts.

Posted

Yeah, but even Linux distros like Ubuntu are starting to use the video card more. To be honest, while Vista does use more of the 3D capabilities, they aren't really that taxing on the GPU. Even the integrated graphics on the i945G chipset run the full Aero interface without a problem. :)

Posted
Well, think about this:

You want the best of the best, and it has a price. If you use Windows XP/Vista, it will use the Direct3D part for the desktop as well, so it's more that Microsoft is making the card work harder, even when it looks idle. It would be cool if they tested it under Linux, for example, or with just plain text on the screen. I'm sure it would be lower than 60 watts.

Maybe. Or maybe it's like with processors: they still use a lot when idling (compared to full load).
