NTLDR is missing error


morocco31477


I use Diskeeper 2007, so I shouldn't have to defragment my drive manually; InvisiTasking should do it for me. But since I run SETI@Home, there are never any idle resources left for it, because SETI takes 100% CPU usage, albeit at low priority. I suppose I should stop that, then.

Same reason I stopped running CPU Idle Pro. The fact that there were never any idle cycles left caused a lot of different problems: Diskeeper did not run as it was meant to, XP did not perform its idle-time optimizations, and certain scheduled low-priority tasks did not run as intended either. A lot of things rely on eventually getting some idle CPU cycles.
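As a rough illustration of why that matters, here is a minimal Win32 sketch in C. It is only an assumption of how a low-priority worker could be written, not code from any of the programs mentioned: a process that drops itself to IDLE_PRIORITY_CLASS only runs when nothing else wants the CPU, yet it still consumes those idle cycles, which is exactly what starves idle-time schemes such as InvisiTasking or XP's own idle optimizations.

```c
/* Minimal sketch (Win32): a worker that lowers itself to idle priority.
 * The scheduler only gives it CPU time when no normal-priority thread
 * wants to run -- but it still *uses* those idle cycles, so other
 * idle-time mechanisms never see a quiet system.
 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Drop the whole process to idle priority. */
    if (!SetPriorityClass(GetCurrentProcess(), IDLE_PRIORITY_CLASS)) {
        fprintf(stderr, "SetPriorityClass failed: %lu\n", GetLastError());
        return 1;
    }

    printf("Running at idle priority; consuming only otherwise-idle CPU.\n");

    /* Stand-in for real number crunching: burns every idle cycle given. */
    volatile unsigned long n = 0;
    for (;;) {
        ++n;
    }
    /* not reached */
}
```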



You should just leave the system on overnight, with no unneeded programs running, just for a defragging.

Less risky than having it defrag in the background, and also ensures most if not all of the files do get defragged.


You should just leave the system on overnight, with no unneeded programs running, just for a defragging.
Sometimes I'm up till 5 AM and my "overnight" hours vary constantly.
Less risky than having it defrag in the background, and also ensures most if not all of the files do get defragged.
How is it risky to defrag in the background? All files do get defragmented and an analysis will show that.

Less risky than having it defrag in the background, and also ensures most if not all of the files do get defragged.

There is no risk in defragging your files in the background, no more than there is in actually USING your computer. Why do people always assume that because computers had difficulty multitasking back in the early 90s, it must still be the case? Today's operating systems, applications and hardware are sophisticated enough that there is no added risk in defragmenting in the background.

You run no more of a risk of corrupting your data by defragging in the background than you do by defragging in the foreground exclusively. You run no more of a risk than you do by copying a file from one folder to another, saving a document, renaming a file, or making a change in the registry. Why? Because no data corruption will occur if the computer is in good health and operating normally. No matter how heavy the load, how many threads are running, or how much RAM is in use, nothing will get corrupted on a healthy computer, whether the work is done in the background or the foreground.

If a computer is in bad health, whether on the software/driver side or the hardware side, it doesn't matter what you do; just using the computer will corrupt data. Likewise, if there is a sudden power outage, your data stands a chance of getting corrupted no matter what you were doing. Whether you were defragging in the background or saving a Word document, any instant power loss carries a risk of data corruption.

Lest we forget, a modern processor performs 3 to 4 billion calculations per second and can write 60 to 80 million bytes of data to the hard drive in a single second. Gigabytes of data are read from and written to hard drives, passed around memory, and crunched through the processor in a single minute of intensive work on a modern computer, and no data is ever corrupted. So how is a defrag operation running in the background any more of a risk than moving gigabytes of data all over the place?


If the defragger and a bunch of apps are all trying to access the disk at the same time, then there'll obviously be contention; either the app fails to open the file, or the defragger leaves it alone and "fills in" the space around the file.

When you have several programs open, you have more open files, and that's where data can get lost when something bad happens (in addition to the filesystem mess left by the defragger). With complex filesystems like NTFS, the chances that it didn't finish rewriting the file locations are even higher than with simple filesystems like FAT.


[Attached screenshot: Diskeeper analysis report (diskeeperanalysisik8.png)]

I did a lot of disk-intensive activity yesterday, and overnight Diskeeper had my drive down to 0 fragments, except for the fragments created when I resized my page file and browsed the net just now to get to this topic.

LLXX, I think you don't know what you're talking about at all.


If the defragger and a bunch of apps are all trying to access the disk at the same time, then there'll obviously be contention; either the app fails to open the file, or the defragger leaves it alone and "fills in" the space around the file.

When you have several programs open, you have more open files, and that's where data can get lost when something bad happens (in addition to the filesystem mess left by the defragger). With complex filesystems like NTFS, the chances that it didn't finish rewriting the file locations are even higher than with simple filesystems like FAT.

That simply won't happen. That's what file locking is for. Yes, if a file is in use, a defragger or any other application that tries to work with it will get an access-denied error, but all read and write functions to disk have safeguards that check for file locking.
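To make the locking point concrete, here is a minimal Win32 sketch in C. It is a general illustration of sharing-mode locks, not of how any particular defragmenter is written, and "locktest.tmp" is just an example file name: the first handle is opened with no sharing allowed, so a second attempt to open the same file is refused with ERROR_SHARING_VIOLATION instead of being allowed to step on the data.

```c
/* Minimal sketch (Win32): file sharing modes stop two openers from
 * stepping on each other. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE hFirst, hSecond;

    /* First opener: no sharing allowed, i.e. an exclusive lock. */
    hFirst = CreateFileA("locktest.tmp",
                         GENERIC_READ | GENERIC_WRITE,
                         0,                       /* dwShareMode = 0 */
                         NULL, CREATE_ALWAYS,
                         FILE_ATTRIBUTE_NORMAL, NULL);
    if (hFirst == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "first open failed: %lu\n", GetLastError());
        return 1;
    }

    /* Second opener (here the same process, but it could just as well
     * be a defragmenter or any other application) is refused outright. */
    hSecond = CreateFileA("locktest.tmp",
                          GENERIC_READ,
                          FILE_SHARE_READ,
                          NULL, OPEN_EXISTING,
                          FILE_ATTRIBUTE_NORMAL, NULL);
    if (hSecond == INVALID_HANDLE_VALUE &&
        GetLastError() == ERROR_SHARING_VIOLATION) {
        printf("second open refused: sharing violation, as expected\n");
    }

    if (hSecond != INVALID_HANDLE_VALUE) CloseHandle(hSecond);
    CloseHandle(hFirst);
    return 0;
}
```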

Data simply doesn't get "lost". If it's read from disk to be rewritten elsewhere, it is not erased from its original location unless it has been successfully written to its new location. If any application were to "steal" a cluster on the disk that the defragger was about to write to, the defragger would simply get an error that the cluster is already in use and would try writing to a different cluster that it sees as free, and would keep doing so until it either a) reaches a pre-defined timeout, b) reaches a pre-defined retry limit, or c) successfully writes to disk.
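That retry behaviour can be sketched around the documented Win32 defragmentation control code FSCTL_MOVE_FILE, which asks the file system itself to relocate clusters and leaves the original clusters in place unless the move succeeds. This is only an assumed illustration in C, not Diskeeper's actual code: pick_free_cluster_run() is a hypothetical helper (a real defragmenter would consult the volume bitmap), and hVolume/hFile are assumed to be handles that have already been opened.

```c
/* Sketch of a "pick another free spot and retry" loop around the
 * Win32 defrag control code. A failed move leaves the file untouched
 * in its original clusters, so nothing is lost on failure. */
#include <windows.h>
#include <winioctl.h>

#define MAX_RETRIES 16

/* Hypothetical helper: returns the starting LCN of a run of 'count'
 * clusters currently believed to be free on the volume. */
extern LONGLONG pick_free_cluster_run(HANDLE hVolume, DWORD count);

BOOL move_extent_with_retry(HANDLE hVolume, HANDLE hFile,
                            LONGLONG startVcn, DWORD clusterCount)
{
    int attempt;
    for (attempt = 0; attempt < MAX_RETRIES; ++attempt) {
        MOVE_FILE_DATA mfd;
        DWORD bytes = 0;

        mfd.FileHandle           = hFile;
        mfd.StartingVcn.QuadPart = startVcn;       /* which part of the file */
        mfd.StartingLcn.QuadPart = pick_free_cluster_run(hVolume, clusterCount);
        mfd.ClusterCount         = clusterCount;   /* how many clusters      */

        /* Ask the file system to relocate the clusters. If another writer
         * grabbed the target clusters in the meantime, the call fails and
         * we simply pick a different free run and try again; the data is
         * still sitting untouched in its original location. */
        if (DeviceIoControl(hVolume, FSCTL_MOVE_FILE,
                            &mfd, sizeof mfd, NULL, 0, &bytes, NULL))
            return TRUE;
    }
    return FALSE;  /* give up after the retry limit; nothing corrupted */
}
```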

Also, a hard drive processes its commands one at a time, single file; you cannot get a race condition on the drive itself. There is no possibility of the hard drive attempting to write to the same cluster twice SIMULTANEOUSLY. Unless you're working with low-level disk functions, modern operating systems are intelligent enough to prevent this from happening in the first place.

As for FAT? Data integrity is much higher on NTFS than on FAT. I can't remember how many thousands of times I've run chkdsk on a FAT partition, even on modern drives with modern operating systems, and had it report "lost clusters" or "cross-linked clusters" when there weren't even any hardware issues. NTFS is much more stable, and things like that don't occur unless there is a serious hardware fault.


Besides the fixes suggested here, the best way to resolve this if a fixmbr does not resolve it:

1. Boot into repair mode (the Recovery Console) on the XP CD.

2. CD to the CD drive.

3. Navigate to the i386 dir.

4. cd_drive:\>expand ntldr.dl_ (see also the copy commands below)

5. Reboot...
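For reference, the more commonly documented Recovery Console variant of step 4 is to copy the loader files straight from the CD's i386 folder to the root of the system drive and then reboot. The drive letters below are only examples; substitute your actual CD-ROM and Windows drives:

copy D:\i386\ntldr C:\
copy D:\i386\ntdetect.com C:\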

