
Saving old hardware


Zxian


Anyway, it DOES do TnL in hardware and DOES do pixel shading in hardware too (9.0c even, as I remember; it just wasn't enabled in the first release of their drivers). The GMA 950 isn't bad at all and does run some titles like NFSU2 and HL2 pretty well... for real.

TnL only enabled in later drivers? That means EMULATION... It only does software TnL and shaders.

http://forum.notebookreview.com/showpost.p...amp;postcount=2

As for only needing 60Hz on an LCD, well, the only LCD monitor I have is my laptop's display. AND refresh rate = response time, so your 2ms monitor will be useless if you only give it a 60Hz refresh.

Never had any issues with any of that really. It worked just fine under any OS, and did everything I needed it to do: display 2D stuff, and play some videos.

That is because your CPU is able to do all the work.

Wow, I see a lot of smart guys around here that consider mounting a V12 engine on a bicycle as perfectly normal.

I'll show myself out of this talk now, thank you.

I'm sorry to say, but you are wrong here. As Zxian pointed out, in this case it's not about the power output but the efficiency of that particular PSU.



Wow, I see a lot of smart guys around here that consider mounting a V12 engine on a bicycle as perfectly normal.

I'll show myself out of this talk now, thank you.

It's not peak wattage, it's efficiency. Hopefully the next time you build a PC you'll research which PSU converts AC power most efficiently and use that, not just purchase the lowest-wattage unit you can find and assume you're being efficient.
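To put numbers on that (a rough sketch with made-up load and efficiency figures, not measurements of any real unit): the wattage on the label is only the maximum the PSU can deliver, while what the meter sees is the AC draw, which is the DC load divided by efficiency.

#include <cstdio>

int main() {
    // Hypothetical figures: the same 150 W DC load served by two PSUs whose
    // efficiency at that load differs. AC draw = DC load / efficiency.
    double dc_load_w = 150.0;   // what the components actually pull
    double eff_low   = 0.70;    // a cheap unit at this load
    double eff_high  = 0.85;    // an 80 PLUS-class unit at this load

    std::printf("AC draw at 70%% efficiency: %.0f W\n", dc_load_w / eff_low);   // ~214 W
    std::printf("AC draw at 85%% efficiency: %.0f W\n", dc_load_w / eff_high);  // ~176 W
    return 0;
}

Same components, roughly 40 W difference at the wall, regardless of whether the label says 300 W or 600 W.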


Many P4s have a TDP as high as 115W, and right now you can get high-end quad cores like the Q9650 with a lower TDP (95W), or plain old Core 2 Duos that still totally beat the P4 in speed and have a TDP of 65W (about half that of the P4).

Funny how you're talking about this but don't say a word about the cooling needed.

I've never seen a game programmer say anything bad about Intel chipsets.

http://www.tgdaily.com/2006/07/12/epic_integrated_graphics/

Go look at almost any board by Asus (e.g. their very popular P5K series), Gigabyte (e.g. their DS3 series), etc. Lots of large OEMs use them too (e.g. HP). The vast majority of boards use that. Even the 5 year old P4 I sold last month used that.

Well, mine sure doesn't! It's got a SiS one.

Not like this onboard stuff will last long anyway. Sure, built-in chipsets of today are better than a SoundBlaster 16 PCI, but my argument was that instead of buying something decent as a separate card, which meant you could actually decide what you wanted, it's now all on-board as cheap low-end chipsets.

Before you call it FUD, do your homework. I'll give you a very basic example of how a modern system can draw equal to, or less power than an old one.

Which is just an example of how you can. Of course you can. But such a rig is not everywhere, and it's not the standard.

One of the MAJOR selling points of Core Duo when it was released was that it drew far less power and therefore ran cooler than its older counterparts.

Of course. The Core Duo series is based on the Centrino architecture. But there's more than Core Duo out there, and generally the heat rises exponentially the more cores you have on a CPU, which means more cooling, which means more power consumption.


Many P4s have a TDP as high as 115W, and right now you can get high-end quad cores like the Q9650 with a lower TDP (95W), or plain old Core 2 Duos that still totally beat the P4 in speed and have a TDP of 65W (about half that of the P4).

Funny how you're talking about this but don't say a word about the cooling needed.

Cooling has evolved. Today's big coolers are big because of the "silent PC" trend, and it's really better that way. I've had 4 P4s, still have a 2.4GHz Prescott clocked to 3.6 in mom's rig, and man does it run HOT despite it having a properly sized cooler fitted.

The Northwoods were alright, but I remember the 3.0GHz HT Prescott. Its stock cooler went up to 5400rpm and sounded like a jet taking off, yet the chip ran at 72C in the summer. A Thermalright XP-90 took care of both heat and noise, and it's a cooler that can cope with any current dual- or quad-core CPU, except that it doesn't fit LGA775 as far as I know. Too bad I sold it a few years back; it would've done a nice job in mom's current PC.

Did you ever see the coolers for the 45nm C2Ds? I thought "man, those look like they were pulled straight from a 486..." Yet they're quiet, and they cool the chips just fine at stock speeds. When you're a computer enthusiast and like to squeeze every last MHz from your chip, of course you're going to need big coolers; it's always been that way. But the stock LGA775 cooler can even keep quads within their normal temperature limits at stock speeds. I can't say that about the P4 Prescotts.

I've never seen a game programmer say anything bad about Intel chipsets.

http://www.tgdaily.com/2006/07/12/epic_integrated_graphics/

You're talking about integrated graphics. Intel's integrated graphics are what they've always been - RUBBISH. You can't expect an integrated graphics solution to run demanding games. Intel's current CHIPSETS are awesome. Oh, btw, your link is broken.

Well, mine sure doesn't! It's got a SiS one.

Wow, SiS. Don't remind me how incredibly s***ty those were.

Not like this onboard stuff will last long anyway. Sure, built-in chipsets of today are better than a SoundBlaster 16 PCI, but my argument was that instead of buying something decent as a separate card, which meant you could actually decide what you wanted, it's now all on-board as cheap low-end chipsets.

There's nothing stopping you from buying a dedicated audio card if you're not happy with the onboard. And onboard audio has really evolved in the past few years. It's meant to provide a cheap HD-compliant audio solution for people that aren't audiophiles or performance gamers, and does that very well.

Before you call it FUD, do your homework. I'll give you a very basic example of how a modern system can draw equal to, or less power than an old one.

Which is just an example of how you can. Of course you can. But such a rig is not everywhere, and it's not the standard.

One of the MAJOR selling points of Core Duo when it was released was that it drew far less power and therefore ran cooler than its older counterparts.

Of course. The Core Duo series is based on the Centrino architecture. But there's more than Core Duo out there, and generally the heat rises exponentially the more cores you have on a CPU, which means more cooling, which means more power consumption.

The Q6600 eats less power than a P4 Prescott, and that's a FACT. Four much more advanced cores using less power than a single core. Sure, the C2Q draws 30W more than the C2D, but it's got double the processing power. And it STILL uses less than the P4 Prescott.
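For a sense of scale, here is a back-of-the-envelope sketch of what a TDP gap like that can mean over a year. Every number in it (duty cycle, average draw as a fraction of TDP, electricity price) is an assumption for illustration, not a measurement.

#include <cstdio>

int main() {
    // Crude sketch: assume each chip averages some fraction of its TDP over
    // 8 hours/day of use, ignoring the rest of the system and PSU losses.
    double hours_per_year = 8.0 * 365.0;
    double price_per_kwh  = 0.15;            // assumed electricity price

    double p4_avg_w   = 115.0 * 0.8;         // Prescott-class P4, assumed heavy average
    double quad_avg_w =  95.0 * 0.5;         // modern quad, assumed to idle far lower

    double p4_kwh   = p4_avg_w   * hours_per_year / 1000.0;
    double quad_kwh = quad_avg_w * hours_per_year / 1000.0;

    std::printf("P4:   ~%.0f kWh/yr, ~$%.0f\n", p4_kwh,   p4_kwh   * price_per_kwh);
    std::printf("Quad: ~%.0f kWh/yr, ~$%.0f\n", quad_kwh, quad_kwh * price_per_kwh);
    return 0;
}

Under those assumptions the old single core burns roughly twice the energy of the quad for a fraction of the performance.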

The Core architecture is based on Pentium M. Centrino was a mobile PLATFORM, which contains a Pentium M CPU on an Intel mobo with an Intel wireless card. That once again shows how misinformed you are.


Are you drunk?

Many P4s have a TDP as high as 115W, and right now you can get high-end quad cores like the Q9650 with a lower TDP (95W), or plain old Core 2 Duos that still totally beat the P4 in speed and have a TDP of 65W (about half that of the P4).
Funny how you're talking about this but don't say a word about the cooling needed.
Have you even looked at the coolers on a Core 2 Quad? They're smaller and quieter than the P4 coolers ever were. Hell, you could hardly even call the regular Core 2 Duo stock cooler a cooler, considering how small and quiet it is. I've never seen anything that small and quiet on any previous CPU from either AMD or Intel. Also, the TDP is the MAXIMUM heat output under 100% load. Under regular use they generate even less heat thanks to things like SpeedStep.
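That last point is worth spelling out: TDP is a worst-case thermal figure, and when SpeedStep drops frequency and voltage, dynamic power falls faster than frequency alone. A tiny sketch of the usual first-order model, with illustrative voltage/frequency pairs rather than real SpeedStep states:

#include <cstdio>

int main() {
    // First-order CMOS model: dynamic power scales roughly as P ~ C * V^2 * f.
    // The voltage/frequency pairs below are made up for illustration only.
    double full_v = 1.30, full_ghz = 3.0;    // full-speed state
    double low_v  = 1.10, low_ghz  = 2.0;    // stepped-down state

    double ratio = (low_v * low_v * low_ghz) / (full_v * full_v * full_ghz);
    std::printf("Dynamic power in the low state: ~%.0f%% of full load\n", ratio * 100.0);
    // ~48% here, which is why TDP (the worst case) overstates everyday draw.
    return 0;
}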
I've never seen a game programmer say anything bad about Intel chipsets.
http://www.tgdaily.com/2006/07/12/epic_integrated_graphics/
Umm, you haven't proven anything at all.
Go look at almost any board by Asus (e.g. their very popular P5K series), Gigabyte (e.g. their DS3 series), etc. Lots of large OEMs use them too (e.g. HP). The vast majority of boards use that. Even the 5 year old P4 I sold last month used that.
Well, mine sure doesn't! It's got a SiS one.
Just the fact that you bought a SiS-based motherboard already speaks volumes about the extent of your hardware knowledge.
Not like this onboard stuff will last long anyway. Sure, built-in chipsets of today are better than a SoundBlaster 16 PCI, but my argument was that instead of buying something decent as a separate card, which meant you could actually decide what you wanted, it's now all on-board as cheap low-end chipsets.
The onboard chipsets trump anything that was designed in the early 90s. The chipsets that you refer to as cheap can do more, faster, better, and with less power than any of your antiquated hardware can.
Before you call it FUD, do your homework. I'll give you a very basic example of how a modern system can draw equal to, or less power than an old one.
Which is just an example of how you can. Of course you can. But such a rig is not everywhere, and it's not the standard.
Not the standard? And what do you define as standard? A machine built on archaic technology, housed in a beige box and running an AT power supply?
One of the MAJOR selling points of Core Duo when it was released was that it drew far less power and therefore ran cooler than its older counterparts.
Of course. The Core Duo series is based on the Centrino architecture. But there's more than Core Duo out there, and generally the heat rises exponentially the more cores you have on a CPU, which means more cooling, which means more power consumption.
More cores is not the only factor in heat generation. A single core built on a 250nm process is going to generate plenty more heat than four cores built on a 45nm process.

I think you're throwing arguments around thinking you're still in another decade. You should catch up with the times because the times obviously left you behind.


Which is just an example of how you can. Of course you can. But such a rig is not everywhere, and it's not the standard.
Sure... that rig isn't standard, but it's also probably a lot more powerful than most people need as well. That being said, the ATI 3800 series video cards have amongst the lowest idle power consumption of any video card available today. Like I said in my original post, the X38 chipset draws a lot of power compared to other modern chipsets.

I gave you an example of a relatively high-end system, and showed that it draws less AC power than my mother's old Athlon system from 2001. I'm not sure what you're getting at, but if a "worst-case" system from now draws less than an "average case" system from 2001 (it was a basic HP model), then I'm not sure how else to convince you of this...

Of course. The Core Duo series is based on the Centrino architecture. But there's more than Core Duo out there, and generally the heat rises exponentially the more cores you have on a CPU, which means more cooling, which means more power consumption.
Wrong. Take any modern, 45nm quad-core Xeon CPU and stack it next to an old S601 socket Xeon. The modern CPU draws less power.

Compare an Intel Q9650 and a P4EE 3.4GHz. I don't have the numbers on hand, but I'd be willing to put money on the fact that the P4EE draws more power.

There's no magic about this, nor is there a "golden rule" that says more cores = more power consumption. The 45nm hafnium-based (high-k) process that Intel has adopted has reduced power consumption over the previous SiO2 process that's been used for the past 10-15 years (if not longer). It's simply a matter of improvements in technology to reduce power consumption in CMOS devices.

Intel has several well-documented papers on the technology used in their 45nm process (as documented as they can go without exposing secrets). A simple web search will help you find them if you're interested.


TnL only enabled in later drivers? That means EMULATION... It only does software TnL and shaders.
Dude! I don't get how you think or where your logic is at, but this is really not the smartest thing to say. Listen, it has hardware T&L, and I have to correct myself there a bit as it's DX 9.0b and not 9.0c. Then you come with a link to a guy who's talking bullcrap like:
The GMA 950 only has DX9 support. DX10 will run, but will reduce DX10 effects to what is possible in DX9. The X3100 is the first Intel GPU to support DX10.

Err... :blink:

As for only needing 60Hz on an LCD, well, the only LCD monitor I have is my laptop's display. AND refresh rate = response time, so your 2ms monitor will be useless if you only give it a 60Hz refresh.
And where do you get this from? Refresh rate is response time?...You are totally, but totally wrong there... totally :no:.

I won't bother replying to most points as they've for the most part all been refuted 3 or 4 times already.

Well, mine sure doesn't! It's got a SiS one.

Funny how a several years-old chipset/motherboard that uses a not very popular chipset differs from today's modern & popular designs. Who would've thought?

Not like this onboard stuff will last long anyway. Sure, built-in chipsets of today are better than a SoundBlaster 16 PCI

How so? I don't expect to need more than 10 channels of high-definition audio anytime soon, nor is the codec going to break or anything like that.

In a lot of cases, there's no point in buying separate cards for commodity jobs anymore. A while ago, various ports (e.g. IDE, floppy, parallel, serial, etc.) weren't onboard; they were on ISA cards instead. Then they were integrated onboard, and that didn't make them suck. The network cards: every board built in the last few years has at least 100Mbit Ethernet, most of the new boards now even have Gigabit Ethernet (sometimes two ports even), and they're not bad either (my onboard Realtek NIC even does TCP checksum offloading and more). Same for audio now. When $50 mATX motherboards come with onboard audio as good as an Audigy 2 ZS or similar (a card that used to cost ~$110), and for the most part have FAR better drivers too (an all-around better solution, except perhaps if you care about EAX), it's no wonder Creative's shares have been dropping a LOT over the last few years. Their shares peaked at around $30 circa 2000, and now they're down to around $4 and still dropping (they were removed from Nasdaq last year). There should be more motherboards with basic video onboard (for non-gamers too), and AMD seems to be making this happen.
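As a side note on that checksum-offloading bit, here's a minimal, Linux-only sketch of how one could query whether an onboard NIC currently has TX checksum offload enabled via the ethtool ioctl. The interface name "eth0" is an assumption; substitute whatever the board's NIC shows up as.

#include <cstdio>
#include <cstring>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <net/if.h>
#include <linux/ethtool.h>
#include <linux/sockios.h>
#include <unistd.h>

int main() {
    int fd = socket(AF_INET, SOCK_DGRAM, 0);    // any socket works for SIOCETHTOOL
    if (fd < 0) return 1;

    ethtool_value ev = {};
    ev.cmd = ETHTOOL_GTXCSUM;                   // "get TX checksum offload state"

    ifreq ifr = {};
    std::strncpy(ifr.ifr_name, "eth0", IFNAMSIZ - 1);   // assumed interface name
    ifr.ifr_data = reinterpret_cast<char*>(&ev);

    if (ioctl(fd, SIOCETHTOOL, &ifr) == 0)
        std::printf("TX checksum offload: %s\n", ev.data ? "enabled" : "disabled");
    close(fd);
    return 0;
}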

my argument was that instead of buying something decent as a separate card, which meant you could actually decide what you wanted, it's now all on-board as cheap low-end chipsets.

And what would having to buy a $50 PCI-e SATA controller bring me over the onboard controllers? What would a $50 PCI-e NIC bring me? What would wasting $100 on Creative garbage bring me? Maybe I should buy a USB card too? The onboard stuff is quite good in most cases; there's no need to buy a card separately. All buying cards for this stuff would do, in most cases, is fill all of the motherboard's slots very quickly and cost a whole lot more.


Dude! I don't get how you think or where your logic is at, but this is really not the smartest thing to say. Listen, it has hardware T&L, and I have to correct myself there a bit as it's DX 9.0b and not 9.0c. Then you come with a link to a guy who's talking bullcrap like:
The GMA 950 only has DX9 support. DX10 will run, but will reduce DX10 effects to what is possible in DX9. The X3100 is the first Intel GPU to support DX10.

Err... :blink:

The newer drivers introduced hardware vertex shading, which is a totally different thing from TnL. I linked to a single post in that thread, and couldn't care less what the rest of the thread said.

Taken from OFFICIAL INTEL SPEC SHEET:

Microsoft* DirectX* 9 Vertex Shader 3.0 and Transform and Lighting supported in software through highly optimized Processor Specific Geometry Pipeline (PSGP)

Texture Decompression for DirectX* and OpenGL*

OpenGL* 1.4 support plus ARB_vertex_buffer and EXT_shadow_funcs extensions and TexEnv shader caching

http://www.intel.com/products/chipsets/gma950/index.htm

There. Now you'll say that Intel themselves are lying in their specs?
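Rather than arguing over spec-sheet wording, anyone curious can ask the driver directly. A minimal Direct3D 9 caps query (just a sketch, assuming a Windows machine with the DirectX 9 SDK headers and d3d9.lib available) reports whether T&L is exposed as a hardware capability and which shader versions the device claims:

// Sketch only: build on Windows with the DirectX 9 SDK, linking against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        // This flag is set only when transform & lighting runs on the GPU itself,
        // not when the driver emulates it on the CPU.
        bool hwTnL = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) != 0;
        std::printf("Hardware T&L:          %s\n", hwTnL ? "yes" : "no");
        std::printf("Vertex shader version: %lu.%lu\n",
                    (unsigned long)D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
                    (unsigned long)D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion));
        std::printf("Pixel shader version:  %lu.%lu\n",
                    (unsigned long)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                    (unsigned long)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
    }
    d3d->Release();
    return 0;
}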

As for only needing 60Hz on an LCD, well, the only LCD monitor I have is my laptop's display. AND refresh rate = response time, so your 2ms monitor will be useless if you only give it a 60Hz refresh.
And where do you get this from? Refresh rate is response time?...You are totally, but totally wrong there... totally :no:.

The response time determines the maximum refresh rate, and the maximum FPS that the monitor can actually display. Running a lower refresh rate is only going to make things slower.

You can contradict me all you want. I'm off to patch some stuff in a game, see you tomorrow.


1. The Intel GMA950 (940GML, 945G<x> chipsets) can NOT do transform and lighting or advanced vertex shading *in hardware*. This was not added to the chip hardware until the GMA X3000/X3100 (GL960 and GM965 chipsets). It can be done in software, but it's not done in hardware on a GMA950 chip.

2. LCDs do have a refresh rate, but it's not as important as response time. The equivalent response time (from dark to bright and back to dark) for an LCD to match a CRT refresh of ~120Hz is 8.33ms. The reason this is not as important is that unlike a CRT, which uses the refresh rate both to image and to illuminate the screen, an LCD uses its shutters (not a refresh cycle) to illuminate a pixel, and thus the response time is what measures this. The reason a CRT needs to refresh (the refresh rate) is that otherwise you'd get "flicker": the energy the electron gun transfers to the phosphor material behind the screen decays (slowly) until the beam hits that phosphor location again on the next pass.

Since LCD monitors do not use phosphors to display on their screen, refresh rate is not of real concern to most people. The LCD transistors used to illuminate the pixels stay open or closed until told to switch. However, the refresh rate of an LCD does matter in some instances, if the hardware supports it. A refresh rate of 60Hz means that you are capped at refreshing a pixel 60 times a second, which is usually fast enough. Saying it doesn't matter at all is not correct, but saying it's just as important as response time is also incorrect, for almost all applications. Since some of you guys are gamers, I'd assume you will end up eventually having an application fall outside that "almost all applications" bucket :), but for everyday use 60 or 75Hz (again, check your hardware specs for the optimal refresh rate for the LCD and the card) should be perfect.
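To make the distinction concrete, here is a tiny sketch with nominal numbers (60Hz and the commonly quoted 2ms/8ms figures are used purely as examples): the refresh rate fixes how often a new frame arrives, while the response time only says how quickly a pixel can finish changing within that interval.

#include <cstdio>

int main() {
    // Illustrative only: at 60 Hz the screen is updated every ~16.7 ms, so both a
    // 2 ms and an 8 ms pixel response finish well within a single refresh interval.
    double refresh_hz = 60.0;
    double frame_ms   = 1000.0 / refresh_hz;

    double fast_ms = 2.0, slow_ms = 8.0;
    std::printf("Frame interval at %.0f Hz: %.2f ms\n", refresh_hz, frame_ms);
    std::printf("2 ms panel settles with %.2f ms to spare\n", frame_ms - fast_ms);
    std::printf("8 ms panel settles with %.2f ms to spare\n", frame_ms - slow_ms);
    return 0;
}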


The response time determines the maximum refresh rate, and the maximum FPS that the monitor can actually display. Running a lower refresh rate is only going to make things slower.
Are you also drunk? You must be sharing drinks with BenoitRen.

As Cluberti already explained, LCD refresh rates are not tied to their response time. In fact, you couldn't be more wrong. The fact that the interface is running at 60Hz means that even if your response time were 0ms, you'd still be able to update the screen no faster than 60 times a second. The response time only affects how long it takes for the image to change once it receives the instruction to change within one of those 1/60th-of-a-second intervals. So the response time has nothing to do with the maximum refresh rate, nor does it have anything to do with FPS. Get yourself another beer and go watch TV.


The Intel GMA950 has no TnL hardware onboard

Meh. It doesn't really matter. It was a moot point in the first place.

Like I said, I don't actually need any 3D performance (no games at all), so 3D features are pointless, and the potential lack of them is an absolute non-issue.

It's not like TnL will help when I'm checking email, paying bills, coding something, playing mp3's, burning discs or whatever other everyday tasks.

If anything, I think he made my point: his "OMG GMA sucks" stance seems to be all about the lack of 3D features (which are pointless for me -- it's like complaining a bike has no doors, even though they're clearly not needed), and no actual reasons why it sucks so bad for everyday non-gaming tasks (i.e. why I shouldn't use it). Framerate is also of little concern for things like browsing web pages, typing a word document, viewing por err... I mean photos! Seriously, I'm perfectly happy with the performance of a 10 year old card (except perhaps for video decoding acceleration). Onboard video is plenty fast for a lot of people.


The Intel GMA950 has no TnL hardware onboard
Meh. It doesn't really matter. It was a moot point in the first place.
I agree with you - I used an onboard ATI M200 chip with shared memory for the last few years in my desktop because I just write code, do some web browsing/email, and run some VMs. No point in paying for anything but the bare minimum, and onboard is fine with me. My new box doesn't have onboard video like the last one, but I still bought a relatively low-end ATI Radeon video card that runs Aero just fine, and while there are some here who would say it's underpowered, I say, "for what?". I do 2D only, like yourself, and a GMA950 would be fine for what I do as well. I was merely pointing out some of the inaccuracies in the thread.

I actually did look for a box with onboard video too, but couldn't get a motherboard that met my needs with onboard video, or I'd probably be running an onboard Intel or AMD chip again.


Excuse me jumping into the middle of a very interesting topic, but as regards this:

Agreed; my P1-133, 32MB RAM, onboard ATi Graphics Xpression with 2MB dedicated, 2x 3Com Fast Etherlink XL PCI, uses like what, 20W, if that. But I don't see it doing anything other than what it's been doing for the last 3 years - acting as a router. Older PCs with power consumption lower than today's low-power chips can't serve as PCs anymore.

what software do you use to turn this into a router?
