InvaderFromMars

Member · Posts: 8 · Country: United States
  1. Guys, you completely missed the point, or maybe you didn't. Hey, I didn't intend to bruise anyone's post count.

     You say: "The difference with the new cards is that their power isn't all about raw pixel pushing ability." Of course, geez, read my post. What did I say? It has always been like that, but there is one major difference nowadays: quality (new technologies like AA, AF, HDR, etc.) used to scale along with pure speed (raw pixel processing power, ROP count, and/or GPU MHz). That is not the case anymore. For comparison purposes: we went from cards with GPUs that ranged from 80MHz to 125MHz in late 1999 to mid 2000, to cards with GPUs clocked up to 275MHz a year later. In the same year ROP count doubled. Also, nVidia, ATI, and others introduced new technologies, including new filtering and/or processing techniques.

     You say: "The cards are doing A LOT more work than those old GeForce4 cards." Which is true for every new generation of video cards, except when it comes to raw pixel-pushing power (core clocks, ROP counts), which is not scaling nearly as well as it did from 1997 to 2002. Anyone here remember anything from the GeForce 5xxx series? Exactly.

     You say: "Basically it comes down to the fact that the card and game manufacturers realized they don't have to push tons of pixels to get the same, and in most cases much greater, image quality." Image quality and max AF/AA filtering mean nada when your "insert game title" is running @ 6fps. Case in point, or several: the entire GeForce 5xxx series, most of the GeForce 6xxx series, and even a chunk of the GeForce 7xxx series (everything below and including the 7600GT). Anyone remember XGI's Volari Duo waste of silicon? All pretty much cards that had fancy new technologies but couldn't perform and/or showcase those technologies due to the lack of raw power. This was only corrected with the introduction of 256-bit cards with hefty ROP counts, such as ATI's Radeon X8xx series and nVidia's upper-tier GeForce 7xxx cards.

     You say: "Yes, for older games like say Quake III they may not be that much faster." Yes, indeed, and it is an embarrassment how that particular segment of development has stalled, and even regressed at some points. I mean, XGI's Volari Duo? Multi-GPU? Last time someone tried that they went bankrupt. That was 3Dfx. Actually, what is the current XGI GPU lineup? ...Exactly.

     You say: "But when you have a system that can constantly maintain 100+ frames per second how much faster do you need it to be?" If that were the case, yeah, but like I said, all parties in question are actually dropping their ROP counts (and decreasing the raw power of their cards in many cases, with the exception of high-end models), which, coupled with stagnating GPU speeds, means really, REALLY weak performance compared to older cards, even ones dating to the early part of this decade. Sure, some newer cards can run Quake III with AA/AF @ the same levels an older GF3 Ti/GF4 Ti could and get a better frame rate. But lower the AA/AF or turn it off completely, and those older cards will not only perform as fast, but in some instances faster. And "a system that can constantly maintain 100+", as you put it, is really new CPUs, RAM, and similar technologies coming into their own.

     You say: "They're focused on increasing the capabilities the cards can do so that there's higher image quality." I've owned probably 35-40 "2d/3d accelerators" *chuckle* in the last 10 years or so, starting with several Voodoo/Voodoo2 cards back in the 90s (a few of which are still in my basement somewhere). I even tried to make a list a year or so ago. 2D image quality has remained roughly the same, really, except on high-end GPUs geared towards professionals. 3D image quality, and by that I mean various filtering and/or processing techniques such as AA/AF, or hardware optimizations that improved the way GPUs communicate with VRAM, or with the system bus for that matter, well... they have all come pretty far. HDR sure is purtty. But raw pixel-pushing power has stagnated, especially if you look at the last 2-3 years. I remember many industry professionals predicting GPUs clocked @ 1GHz or more, and cards with astronomical ROP counts, slated for the middle of this decade. It didn't happen. The CPU industry hit the same wall, but they tried to correct it with multi-core chips. Hey, who knows, maybe true multi-GPU cards in the spirit of the Voodoo5 6000 are not that far off. I guess the introduction of SLI and CrossFire could be a part of that.

     In short: quantity != quality.
  2. My EVGA 8800GTS scored only 5% to 8% better, at best, on older DirectX 7/8 and OpenGL benches than my much, MUCH older X850XT. Yeah, the 8800GTS has 96 unified shader processors vs. 16 pixel/6 vertex pipelines on the X850XT, but the 8800GTS has only 20 ROPs vs. 16 on the X850XT.
  3. Anyone notice how cards nowadays are lacking, for lack of a better word, in the pixel-pushing department (pixel-fill-rate-wise, that is)? In fact, as far as raw pixel fill rate is concerned, newer cards are often slower than cards from the previous generation. (What's the ROP count on the R600s?) I mean, I understand cost cutting and the fact that these corporations are in it for the money, but if my shiny brand new $450 card from nVidia and/or ATI scores lower under 3DMark2001SE (or any other DirectX 8.x benchmark, no-AF, no-AA included) than, say, my 5+ year old GeForce4 Ti, then I would be a bit concerned with the state of the industry. It's all shaders, shaders, shaders nowadays. Kind of like what auto manufacturers are doing: why waste cash on developing better engines or similar, when you can just slightly redesign the shell (make it look "wicked", or whatever the PR department decides is "wicked" for that particular year), slap on $50 worth of "bling-bling" decals or what-not (charge an extra $3,000 for it), and call it a "brand new model"? The latest cards might be new, but they ain't much faster, raw-power wise. Call me old fashioned, but I like me some pixel fill rate.
  4. Yeah, that's the one. Pretty neat, since the libraries would work under Win98 as well. Of course, you would need to build an application around those libraries. In the meantime, I've found some tools that let me control the transparency of my taskbar and/or various Explorer windows and other applications under W2k (TransBar, Glass2k, and Effective Desktop). Problem is, none of them can turn the Start Menu and the corresponding program groups transparent. Oh well...
  5. Are there any applications and/or patches out there that would allow W2k users to use themes/msstyles from XP? I think I found one (Theme Engine?), but it is pretty expensive ($99 US). Anything free out there?
  6. Worked perfectly. Icons look much better now. Good job man.
  7. This pack seems severely unstable, as well as potentially dangerous (as far as certain system files are concerned). Good luck with it, but I ain't touching it, yet anyhow. Anyone know of a way to skin just the taskbar in Windows 2000? I mean, everything else can be done with separate icon packs, wallpapers, and similar.
  8. I have Win2k, and a decent system hardware-wise. My system does not come out of standby most of the time (75%), and I am forced to reboot/reset. The 25% of the time it does come out of standby, there are serious issues: for some reason the system is super-slow and keeps getting slower, until, about 5 minutes after coming out of standby, it's unusable and I need to hard reset (the mouse pointer gets slower and slower, applications won't open, and even the Start Menu won't pop up anymore). Any ideas what could be causing this? Could this damage my hardware in any way?
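The fill-rate argument running through posts 1-3 boils down to simple arithmetic: theoretical peak pixel fill rate ≈ ROP (pixel pipeline) count × core clock. A minimal sketch of that comparison, with approximate specs quoted from memory (treat the exact clocks and ROP counts as assumptions, not gospel):

```python
# Theoretical peak pixel fill rate ~= ROPs * core clock.
# The specs below are approximate, from memory -- assumptions, not vendor data.
cards = {
    "GeForce4 Ti 4600 (2002)": {"rops": 4,  "clock_mhz": 300},
    "Radeon X850 XT (2004)":   {"rops": 16, "clock_mhz": 520},
    "GeForce 8800 GTS (2006)": {"rops": 20, "clock_mhz": 500},
}

def fill_rate_gpix(rops, clock_mhz):
    """Peak pixel fill rate in Gpixels/s: ROPs * clock (MHz) / 1000."""
    return rops * clock_mhz / 1000.0

for name, c in cards.items():
    print(f"{name}: {fill_rate_gpix(c['rops'], c['clock_mhz']):.2f} Gpix/s")
```

By this crude measure the 2002 to 2004 generation jump is roughly 7x (1.2 to 8.32 Gpix/s), while 2004 to 2006 is only about 20% (8.32 to 10.0 Gpix/s), which is the stagnation the posts above are complaining about; shader throughput, of course, grew far faster and is a different axis entirely.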
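The transparency tools mentioned in posts 4 and 7 (TransBar, Glass2k, etc.) generally work through the layered-windows support that Windows 2000 introduced: find a top-level window, mark it WS_EX_LAYERED, and call SetLayeredWindowAttributes on it. A minimal Python/ctypes sketch of that idea, applied to the taskbar; the Win32 calls and the "Shell_TrayWnd" class name are real, but the helper names are mine, and it naturally only does anything on Windows:

```python
import ctypes
import sys

GWL_EXSTYLE = -20
WS_EX_LAYERED = 0x00080000
LWA_ALPHA = 0x2

def percent_to_alpha(percent):
    """Map an opacity percentage (0-100) to the 0-255 alpha byte
    that SetLayeredWindowAttributes expects."""
    percent = max(0, min(100, percent))
    return percent * 255 // 100

def set_taskbar_opacity(percent):
    """Make the taskbar translucent via the Win2k layered-windows API.
    (Hypothetical helper; only runs on Windows.)"""
    user32 = ctypes.windll.user32
    hwnd = user32.FindWindowW("Shell_TrayWnd", None)  # the taskbar's window class
    if not hwnd:
        raise RuntimeError("taskbar window not found")
    style = user32.GetWindowLongW(hwnd, GWL_EXSTYLE)
    user32.SetWindowLongW(hwnd, GWL_EXSTYLE, style | WS_EX_LAYERED)
    user32.SetLayeredWindowAttributes(hwnd, 0, percent_to_alpha(percent), LWA_ALPHA)

if __name__ == "__main__" and sys.platform == "win32":
    set_taskbar_opacity(75)  # 75% opaque taskbar
```

The Start Menu and its program groups are separate, short-lived popup windows rather than part of the Shell_TrayWnd window, which is presumably why the tools above can tint the taskbar but miss the menus.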