Thunderbolt 2864 Posted February 25, 2008
Well, you NVIDIA fanboys are going to be in for a huge disappointment. If these are the final specs for the 9800 series, I will not buy it. Seriously, the first 8800 series had a 320-bit memory interface, why downgrade it back to 256-bit? Like they did for the 512MB 8800 GTS, which was very surprising and very disappointing. Seriously, if NVIDIA is going to release the 9800 series with a 256-bit interface, I will lose a lot of respect for them. And I was told that the 9800 was going to be a 512-bit interface. This is a whole bunch of BS, I'm telling you. All this wait for nothing and I'm going to be disappointed? Not happy.
jcarle Posted February 25, 2008
Don't believe anything until you see an official announcement from nVidia.
puntoMX Posted February 25, 2008
"And I was told that the 9800 was going to be a 512 bit interface. This is a whole bunch of BS, I'm telling you. All this wait for nothing and I'm going to be disappointed? Not happy."
Nothing to worry about, logic tells me that more than 128 shader units would need more bandwidth, so a 512-bit bus to the memory would be more likely. Speculations about the 9800 GTX are 384 shader units with 1GB RAM (GDDR3 or, most likely, GDDR4).
nmX.Memnoch Posted February 25, 2008
"why downgrade it back to the 256 bit?"
I've read that they did some more compression optimizations that actually give them more bandwidth with the 256-bit interface on the 9-series chips than they had with the 320-bit interface on the 8-series chips. Time will tell, but I can't see them releasing the top-end chips with only a 256-bit interface either... but stranger things have happened. Maybe instead of a single 256-bit interface it'll be 2x 256-bit. Until solid information is released this is all just speculation anyway.
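The bus-width argument above comes down to simple arithmetic: peak memory bandwidth is bus width times effective memory clock, so a narrower bus can match a wider one if the memory runs proportionally faster. A rough sketch (the clocks below are illustrative assumptions, not announced 9800 specs):

```python
def bandwidth_gb_s(bus_bits: int, mem_clock_mhz: int) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x effective clock."""
    return (bus_bits / 8) * mem_clock_mhz / 1000

# A 320-bit bus with 1600 MHz effective GDDR3 (8800 GTS 640-class numbers):
print(bandwidth_gb_s(320, 1600))  # 64.0 GB/s

# A 256-bit bus would need ~2000 MHz effective memory to match it:
print(bandwidth_gb_s(256, 2000))  # 64.0 GB/s
```

This is why a 256-bit card paired with faster GDDR4 (or the compression tricks mentioned above) isn't automatically a downgrade on paper.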
ripken204 Posted February 26, 2008
ya i have been following this card and so far it looks like a dud. i just hope that it comes out before my step-up runs out. and it better be worth it!
Thunderbolt 2864 Posted February 26, 2008 Author (edited)
Well, somebody posted this image:
Great NVIDIA, you create a higher generation card that has lower specs than the previous generation. Really, NVIDIA, what a wrong move. I think I'll be keeping my 640MB GTS or get an 8800 GTX when it's cheaper. Very disappointing indeed.
I wonder if the GX2 is any good since it has dual GPUs. I'll wait for the benchmarks to be released first before making my move. But seriously, I doubt that the 9800 GTX is the card for me. All this wait and it has already been a huge disappointment. And the new ATI cards don't look too appealing either. And games being released these days are kinda demanding, and this is the best NVIDIA could do? Boo.
Edited February 26, 2008 by Thunderbolt 2864
puntoMX Posted February 26, 2008
You believe that? So you believe this too?:
And then compare the image in your post with this:
ripken204 Posted February 26, 2008
well there have also been 3dmark screens posted. my OCed 8800GTS G92 beats the 9800GTX by about 1600pts in the graphics tests (excluding the cpu tests). if this is true then WTF NVIDIA!
puntoMX Posted February 26, 2008
No no no... this is not true... Both you guys, just snap out of it...
Thunderbolt 2864 Posted February 27, 2008 Author
With those 3DMark screenshots posted I'm already quite convinced, unless somebody was fooling around with us and faked those results, in which case I'll be happy. Until I see the official specs when it's released, I'm going to be annoyed with these poor specs. A higher generation of hardware should be superior to its predecessors, not a downgrade.
jcarle Posted February 27, 2008 (edited)
Anyone who believes all this gossip before nVidia officially announces anything is a fool. If you want to see something official WORTH speculating about, nVidia acquired Ageia. For those of you who don't know who that is, they developed the PhysX processor.
Edited February 27, 2008 by jcarle
ripken204 Posted February 27, 2008
cool, i didn't know they bought Ageia. that is definitely worth speculating about!
suryad Posted February 29, 2008
I think it's readily available on the internet if you do a search, but NVIDIA said that they were going to follow Intel's tick-tock strategy. That means one product launch will be a new architecture, whereas the tock would then be just an architecture refresh, process refresh, etc. I think the 9800 falls in line with the tock, considering how much of a colossal boost the 8800 series was.
kalo Posted February 29, 2008
It's a mid-range card guys - keep ya pants on. Just because they release a new line of cards doesn't mean the first one has to be high end. Does it?
suryad Posted March 1, 2008
Are you calling the 9800 GX2 card the high end? Even if it is, the performance increase is 30% over an Ultra... not bad, but when you consider it has 2 GPUs in it... it's kinda sad.
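That 30% figure can be put in perspective with a quick scaling calculation: a 1.3x result from two GPUs is well short of an ideal 2x speedup. A minimal sketch (the 30% uplift is the number quoted in the thread, not a measured result):

```python
# Per-GPU scaling efficiency of a dual-GPU card versus an ideal 2x speedup.
single = 1.0   # single-GPU baseline (e.g. an 8800 Ultra)
dual = 1.3     # +30% uplift over the baseline, as quoted above
efficiency = dual / (2 * single)
print(f"scaling efficiency: {efficiency:.0%}")  # scaling efficiency: 65%
```

In other words, each GPU in the pair is effectively delivering about 65% of its standalone performance, which is roughly what "kinda sad" is gesturing at.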