
atomizer

Member
  • Posts

    566
  • Joined

  • Last visited

  • Donations

    0.00 USD 
  • Country

    United States

Everything posted by atomizer

  1. you'll have to do some searching, but i think you can just make a very simple edit to 'boot.ini' (in your 98 installation) to get your boot selection back. here's a generic copy of a 'boot.ini' for a single installation:

     [boot loader]
     timeout=30
     default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
     [operating systems]
     multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect

     and here's a sample 'boot.ini' for a dual boot, XP and ME, where each OS resides on a different hard drive (a different physical drive, not a different partition on one drive - note that a 9x OS like ME is booted with a plain C:\ entry, not an ARC path):

     [boot loader]
     timeout=30
     default=multi(0)disk(0)rdisk(1)partition(1)\WINDOWS
     [operating systems]
     multi(0)disk(0)rdisk(1)partition(1)\WINDOWS="Windows XP Professional" /fastdetect
     C:\="Windows ME"

     hope this helps. another quick way to boot XP may be to manually select the boot volume in BIOS. rather than making changes in BIOS, there may be a key you press that asks what drive you want to boot from (in my case it's 'F11'). typical choices are CD/DVD drives, USB (if something is plugged in, like a flash drive), floppy and your hard drives. here you'll be able to select any recognized drive, regardless of whether it has the necessary files to boot windows.
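in case the ARC-path syntax in those samples looks cryptic, here's how i understand the pieces break down (worth double-checking against microsoft's boot.ini docs before relying on it):

```ini
; multi(0)     - adapter number (multi() is used for IDE / BIOS-visible disks)
; disk(0)      - always 0 when multi() is used
; rdisk(1)     - physical disk ordinal (0 = first drive, 1 = second drive)
; partition(1) - partition number on that disk, counted from 1
; \WINDOWS     - the directory the OS is installed in
multi(0)disk(0)rdisk(1)partition(1)\WINDOWS="Windows XP Professional" /fastdetect
```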
  2. there's obviously multiple problems here...

     @asflyboy: when memtest finds problems, that doesn't necessarily mean your memory is bad. the timings or voltage could be wrong (i always have to up the voltage in my case or i get errors as well). if you poke around on the web, you'll find the proper settings for your particular memory. just wanted to mention that before you go and buy new modules.

     @everyone else... as for nLite not copying files from NON-OEM installations, others have evidently found that their CD/DVD drives may be the problem in some cases. the drives could possibly be getting too hot during the read, or the disks were burned at too slow a speed. suggestions:

     1) update the drive's firmware. the codeguys is a great resource for hacked firmware that can fix problems with the manufacturer's firmware and enable 'hidden' features for the drive. of course it could potentially create problems as well, though i've had good success with their firmware (in one instance i successfully turned my single-layer DVD burner into a double-layer burner). you can always turn to your vendor for firmware as well.

     2) this is not the 'final word', but i read a very good article about burning speeds. it states to ALWAYS burn at the MAXIMUM speed supported by your burner/firmware/media and goes on to explain exactly why. one reason is heat - if you run a fast burner at a slow burn speed, you'll heat up the disk, which can lead to problems, including data corruption. if i remember right, this is because the laser is more powerful in fast burners, so it doesn't have to sit in one place as long to burn correctly.

     3) if setup won't copy files, just stop where you're at and try letting the drive cool down for several minutes, then click 'retry'. if the files copy then, you know the drive got too hot during the read.

     4) if you went and bought a bundle of no-name CDs, well, you often get exactly what you pay for. they may work fine for MP3s, but may fail miserably when copying important data. you'll notice that on good media, it says right on the package that it is 'CERTIFIED' to burn at 'x' speed. cheap media doesn't have that certification. who certifies it, i don't know.

     5) if you bought a no-name burner... need i say more. the same may hold true if you have an OEM burner that was supplied in an OEM machine.

     as for nLite not working properly with OEM installation sources, i can only guess, but my guess would be that OEM installations are often (if not always) modified to suit the desires of the manufacturer. i could see where nLite may have serious problems in this case, though nuhi would be the one to query on this. as for suspicions of nLite not working on OEM machines in some instances (when using a NON-OEM installation), i wouldn't doubt that at all, though it's probably not the fault of nLite. again, OEM machines often use customized installations of windows. for instance, my box at work is an HP Pavilion (100% piece of JUNK). after i got sick of XP home, i decided to load XP Pro on it. that was the end of the on-board sound chip - i could not get a driver to install to get the sound working and eventually used an extra sound card i had lying around.
  3. you don't want to hear my solution (hint)
  4. @newbie it's a double-edged sword... if you want all the bells and whistles, then there's going to be more issues - bigger patches, longer d/l's, slower loading, etc., etc.. other state-of-the-art engines suffer problems as well, though i wonder if any suffer as much as steam/source. one of the tools valve uses to make development decisions is the hardware survey thingy. if you look at the results for those surveys (http://www.steampowered.com/status/survey.html), you'll find that the average specs are pretty high, so they design accordingly. problem is, they evidently don't do enough public testing and the way that steam is implemented is a joke IMO. i like the general concept of steam, which is to make it easy to deliver new content and updates, but the fact that running the games, and even the SDK for cripe sake, is dependent upon steam being loaded is simply stupid.

     this is just a guess, but from what i've seen it appears that a lot of the content developers for pre-steam HL games have abandoned the community. many of the old websites are stale or gone. one of the reasons may be the way HL content is packaged now (in GCF archives), which makes it more difficult to work with. QuArK, a popular level editor, is difficult at best to get working anymore. Hammer, valve's editor, is rather limited compared to QuArK, which is a very powerful, open source tool. i think there's little doubt that steam and the buggy source engine have divided the community. non-steam (cracked) releases are very popular and the following for pre-steam releases is still formidable.
  5. update... after i got my array back yesterday, i booted into my normal windows partition and uninstalled sandboxie. it asked for a reboot, which i didn't bother doing. i just kept running until i shut my PC off later on. upon loading this morning: "NTLDR is missing"! so i had to fix that.
  6. yeah. and it worked. i see lots of assumptions here, but does anyone actually KNOW why he was pied?
  7. you're more than welcome, peoples! and yes, CUBE is a fantastic quick-fix! it's not hugely popular, so it's common to experience a lot of empty servers. often though, if you join, someone comes along. promote it and the community will grow
  8. i was always an Intel fan - till i switched to an AMD x86-64
  9. never played much console. i did screw around with grand theft auto when a friend got it last christmas. i don't know what console it was - either xbox or ps2, i imagine. i was really surprised at how horrible the graphics were considering how long the consoles have been around. maybe that's not the rule though. still, i much prefer the PC - awesome graphics, awesome sound (5.1), better controls.
  10. on your 98 PC: 'start > settings > control panel > add/remove progs > add/remove windows components' and see if you can't find the pinball game in there somewhere (though it may not be there, i suppose. never used SE).
  11. welcome to the world of wonderful reasons to NOT buy an OEM. of course that may not be the problem at all. install the network (off-line) version of SP2, as suggested earlier, and see if that does it.
  12. your comment about being less accurate while firing a weapon when jumping is obviously very correct, no doubt. the reason i said it's unrealistic is because the 'random' effect is far more exaggerated in CS than it is in the real world. sorry if i didn't explain that well. same with the rising barrel when firing full-auto (though it'll go up and left if you're left handed). it occurs in real life, but it's relatively easily compensated for, and it's very possible to hold a much tighter pattern at a given distance than in CS. i mention these differences because they are so dramatic. if CS weapon physics were even remotely close to real-world, i wouldn't bother. maybe, just maybe, in the hands of the very inexperienced a real-world weapon would exhibit similar behavior as in CS, but CS is all about terrorists and counter-terrorists - weapon experts.

     my comment about firing a lot of weapons refers to real-world - M16 (military), SKS, 22, 243, 30-06, etc., a bucket load of pistols and various other weapons. i'm by no means considered good with any weapon, yet the dramatic difference between real-world weapons and CS weapons doesn't stack up. long before CS was played professionally (CPL), QUAKE was king and there are no random trajectories in QUAKE (unless recently added). is CS more realistic? sure it is. my opinion is that it goes too far, however. at any rate, i would guess that i'm in the minority. still, it has been suggested many times that valve release a 'pro' version of CS that removes a lot of CS's limitations, causing the game play to be based on accuracy and skill and not how well you can compensate for random trajectories (which cannot be compensated for totally, even by the best player, because they are just that; random).
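just to illustrate what i mean by 'random trajectories' - a toy sketch, NOT valve's actual code (the function, numbers and spread model are all made up for demonstration):

```python
import math
import random

def shot_group(spread_deg, n=1000, dist=100.0):
    """simulate n shots at a target dist units away, each with a random
    angular deviation drawn uniformly within +/- spread_deg degrees.
    returns the average miss distance from the point of aim."""
    rng = random.Random(42)  # fixed seed so the demo is repeatable
    total = 0.0
    for _ in range(n):
        angle = math.radians(rng.uniform(-spread_deg, spread_deg))
        total += abs(dist * math.tan(angle))
    return total / n

# a stand-in for 'crouched and still' vs 'firing while jumping':
crouch_miss = shot_group(0.5)  # small spread cone
jump_miss = shot_group(5.0)    # much larger spread cone
print(crouch_miss, jump_miss)  # crouching groups far tighter
```

the point being: no amount of skill lets you compensate for the random term itself - you can only shrink it by crouching and holding still.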
  13. of course valve is pushing CS:S. they're a business and need to make money. i don't know that anyone can tell you which one is better. there's basically 2 camps: the old school and the new school, and the 1.6 followers are many. i think it'd be safe to say that 1.6 is more of a 'hard-core' game. it's a little leaner and devoid of all the source physics and eye candy. you could say it's more pure. source is loaded with specular graphics, effects and physics (like when you kick a barrel or shoot objects). you could say it's more realistic, aside from the weapon dynamics, which are pretty much the same as in 1.6.

     i have 2 pet peeves with all the later versions of CS, including CS:S: 1) there's a short period where your movement slows down after jumping. it's not realistic at all and i hate it. 2) bullet trajectories: weapons are far less accurate when jumping, for instance. they are the most accurate if crouching and not moving. if you fire more than a 3 or 4 round burst from any automatic weapon, the bullets rise and you have to try and compensate. it tries to add some realism to the game, but it's not realistic at all (i've shot a lot of firearms). i just don't think that random bullet trajectories belong in a game that is (or used to be, anyway) based on skill. i'd guess another reason for it is to help even the playing field between the newbs and the seasoned players, which i think it does to an extent. still, i hate it.

     BTW, although your system is better suited to 1.6, i would think you could run CS:S just fine. maybe some tweaking would be in order though. source is quite a bit more demanding on your hardware.
  14. whewwwww! i got my array back! i never lost an array before and i thought i was in for a real headache. thank whatever powers that be that i had a 1 GB swap partition to install a parallel OS so i didn't have to lose any data!

     addressing your questions... i didn't disable anything because i don't think i have anything enabled or installed that could inhibit the sandbox installation. i even have that "prevent data execution" thingy set to "noexecute=AlwaysOff" in 'boot.ini'. i have a firewall, but i doubt that was an issue. besides, it's set to prompt me on unknown connections and the file protection part of it is disabled. i have an AV scanner, but i use ClamWin and it's very small, unobtrusive and doesn't embed itself into the system like many others do. it's also on-demand only, so it's not scanning anything in the background. as for booting in safe mode, that's not possible when your system drive's RAID array is off-line.

     the problem had to do with one of the disks not being assigned to the array, not both, or so it appeared. and yes, i changed the Sandbox Top-level folder option to set it to 'D:', my storage array. and yes again, i will absolutely post my issue in the sandboxie forum. first i better cool down though, in order to prevent the 'F' word from surfacing several times. just kidding. i'm just happy i fixed the array. and i didn't have to install a parallel OS to do it, but i couldn't figure out how to do it in the promise BIOS so i needed internet access to search around. as it turns out, you can just delete the array and create a new one - however this MUST be done carefully or you risk data loss.

     PS... thanks for the forum link and your input
  15. !!! ATTENTION WILL ROBINSON !!! if i had the author of that program by the neck right now... BE CAREFUL! here's what happened to me...

     i have 4 drives, 2 in a RAID 0 and another 2 in a RAID 0, though this may have NOTHING to do with it. i installed the app where i install everything; on the system drive. it launched fine after installation. i played with config options and set the sandbox directory path to 'd:\sandbox' (i created that directory prior). i also set it to not load at win boot and, i think, to not display the tray icon, or something. then i closed it down and launched it again from the first menu item in the shortcut folder. CRASH! blue screen. went by so fast i didn't make out a word. the PC rebooted fine. hmmm... wonder if it'll do that again? this time i chose the 2nd menu item. same thing happened. that's where i should've quit!... but NOOOO. the 3rd time i launched, i think, the bottom shortcut. crash again!

     when win bluescreened, i hit pause and started to copy the info. all i got was "unhandled exception in sandbox.sys" and then it rebooted. this time when the BIOS for my promise controller loaded, there was an error and my promise array, which happens to contain my system drive, was off-line! i tried to repair it, but have been unsuccessful so far. i'm posting this from a parallel install on my via RAID array and hoping i'll be able to recover my primary partition on the promise array.
  16. the page file won't fragment as long as it's a permanent page file, meaning that min/max are equal in size. if they are not, it can fragment. however, for the most part, i would tend to agree that excessive tweaking of the page file(s) isn't necessary. the time and wear and tear saved by reduced disk thrashing is probably negligible. still, it's so easy to do and, after all, i'm a die-hard tweakster
  17. you may want to read about that tweak here.

     @ripken204 first, i'm no expert. obviously there's a lot of widely varying opinions on this hotly debated subject. even the 'experts' often disagree. the way that i handle the page file is based on articles from people whom i believe to be credible. in my case, i have 4 disk drives; 2 in a RAID 0 config that holds the OS and all programs, and 2 more in a RAID 0 config that holds data, media, backups and storage.

     firstly, windows will (according to MS) use the page file on the disk that is least used. also, before messing with page files, it's probably best to disable all page files > reboot > delete any 'pagefile.sys' files > defrag - then do what you need to. keeping that in mind...

     in 'system properties > advanced > startup and recovery > settings' i set "write debugging information" to "none". this will allow you to set a small page file on the system drive while bypassing the error message about it being too small should windows crash. i always keep a small, permanent page file on the system drive (and according to what i've read, you should always have a page file on the system drive. period.). in my case, it's 20 MB (it's permanent because both the min and max settings are set to 20 MB).

     on my second drive, i create a partition exclusively for a page file. it happens to be 1 GB. in that partition, i create a semi-permanent page file having a minimum size of 50 MB and a max of 1019 MB (the total size of the partition). this avoids unnecessary fragmentation of the page file and allows windows to increase its size, IF it needs to.

     here's some very good reading on the subject:
     http://www.tweakguides.com/TGTC.html
     http://aumha.org/win5/a/xpvm.php - particularly good

     note that putting a page file on a 2nd drive (not partition) may only be a good idea if that drive is as fast, or almost as fast, as your system drive. of course if you only have 1 drive, or only 1 fast drive that is your system drive, then the options are fewer. read the articles i linked to to learn more.

     EDIT: perhaps a couple things could've been clarified better... i don't create a separate partition for the page file on the system drive - there's no need to if you create it the right way (see links and info above). both its min and max are set to 20 MB, so it won't fragment. if it can't fragment, then there's no need to put it in its own partition. the main page file, on my 2nd drive, goes in a partition because it's semi-permanent. it can grow and shrink, so it can fragment. i want to keep that under control, thus the separate swap partition.

     @ripken204 so, to answer your question, i'm not completely sure. i would suggest that if you think you need a large page file, you stick it in its own partition and make it semi-permanent (a conservative min size and a max equal to the partition size). if it's a permanent page file (min = max), then you don't need a separate partition. my best answer is, don't create a large page file to begin with. use a semi-permanent page file in its own partition. this will keep it from getting scattered all over the place while, at the same time, keeping its size in check, since you'll allow windows to increase its size as needed.

     also, according to what i believe to be credible references, IGNORE any advice on setting the page file size according to RAM capacity. that's ludicrous. one popular formula is to set its size to RAM x 1.5, so if you have 64 MB of RAM, your page file will be 96 MB. the less RAM you have, the BIGGER the page file needs to be. by the same token, using that formula for 2 GB of RAM results in a page file size of 3 GB !?! the more RAM you have, the smaller the page file can be - to a point! it shouldn't ever be disabled, regardless of how much RAM you have.
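for reference, the GUI settings described above end up as plain "path min max" entries in the registry (the REG_MULTI_SZ value "PagingFiles" under HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management). the drive letter for the swap partition below is just an example; the sizes mirror the setup described above:

```ini
; one line per page file, format: path initial-size-MB max-size-MB
; min = max on C: -> permanent, can't fragment
C:\pagefile.sys 20 20
; min < max on the dedicated swap partition -> semi-permanent, can grow
E:\pagefile.sys 50 1019
```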
  18. steam is great - for valve. it bypasses the middleman and enables direct sales, which = higher profits. i honestly commend them on their insight. however, for many end-users, steam is nothing more than a big fat PITA. i'm one of them. after years of being actively involved in the HL community, providing artwork, scripts, running a server, etc., i finally just gave it up, and i gave it up for no other reason than steam[ing-pile-of-s**t].

     having to be on-line at least once in order to enable off-line mode - ???
     having to be on-line in order to launch the SDK - a joke
     steam server is down - really? what else is new?
     friends is down - again? (or should that be "still?")
     having to run an app in the background (steam.exe) in order to play a game that doesn't need it to run - silly
     ads, displayed through the steam UI - for a game i PAID FOR???

     there are over a 1/4 million posts in just ONE topic on the steam forums, many dealing with steam issues.
  19. @Zxain - thanks for the kind words. @JRosenfeld - are you sure you were looking at the XP sheet? server is in there. regarding the 'Alerter' service, i described it as such: then i found this description from another resource: when the author of the second quote mentions 'NetAlertRaise' and 'NetAlertRaiseEx', what the heck are they referring to? are they APIs used by applications run on the local machine, or on a remote machine? because if this refers to local apps, then my description (the 1st quote) could be wrong.
  20. libraries, yes
  21. @amenx can't agree more. i hate the bLOAt of .NET, all the files/reg entries and even a new user account and yet another service. at least nuhi, one of the nLite devs, provides the runtimes so you don't have to install all the crap. i wonder if the runtimes could be made to work with other apps as well???
  22. 9x only supports drives up to 137 GB natively (the 28-bit LBA limit). one way around it might be to create multiple smaller partitions. from just a little google work i see there are other workarounds as well, just be sure that other disk utilities (defrag, etc.) will also support larger partitions.
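for the curious, the 137 GB figure falls straight out of 28-bit LBA addressing (2^28 addressable sectors at 512 bytes each):

```python
# 28-bit LBA: 2^28 addressable sectors x 512 bytes per sector
SECTOR_BYTES = 512
max_bytes = (2 ** 28) * SECTOR_BYTES
print(max_bytes)             # 137438953472 bytes
print(max_bytes / 10 ** 9)   # ~137.4 decimal gigabytes
```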
  23. .NET isn't backward compatible across versions. some apps will require 1.* while some require 2.0, so the runtimes get installed side by side.
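because the runtimes install side by side, an app can state which one it wants via a config file placed next to its .exe. a minimal sketch (the filename is hypothetical; v1.1.4322 and v2.0.50727 are the well-known version strings for 1.1 and 2.0, but verify against your installs):

```xml
<!-- myapp.exe.config - sits beside myapp.exe -->
<configuration>
  <startup>
    <!-- runtimes listed in order of preference -->
    <supportedRuntime version="v2.0.50727" />
    <supportedRuntime version="v1.1.4322" />
  </startup>
</configuration>
```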
  24. glad you got it working!
  25. forgive my poor attitude, but i very much dislike OEM boxes. nothing but problems. the hardware is almost always sub-par, no matter what the specs are. we have HP's at work, about 2-3 yrs old now. they have P4 2+ GHz processors, and my old P3 800 box at home, which is much older, runs faster and everything works, whereas the HP's are all losing their floppy and CD/DVD drives. the MB looks like something you'd get at a garage sale in 1970. i've had other OEM's in the past as well. no more. DIY