Everything posted by jcarle
-
That simply won't happen. That's what file-locking is for. Yes, if a file is in use, either a defragger or any other application that tries to work with it will get an access denied error, but all disk read and write functions have safeguards that check for file locking. Data simply doesn't get "lost". If it's read from disk to be rewritten elsewhere, it is not erased from its original location unless it was successfully written to its new location. If any application were to "steal" a cluster that the defragger was about to write to, the defragger would simply get an error that the cluster is already in use and would try writing to a different cluster it sees as free, repeating until either a) it reaches a pre-defined timeout, b) it reaches a pre-defined retry limit, or c) it successfully writes to disk.

Also, hard drives process commands single file, one at a time. You cannot get a race condition on a hard drive; there is no possibility of the drive attempting to write twice to the same cluster SIMULTANEOUSLY. Unless you're working with low-level disk functions, modern operating systems are intelligent enough to prevent this from happening in the first place.

As for FAT? Data integrity is much higher on NTFS than on FAT. I can't remember how many thousands of times I've run chkdsk on a FAT partition, even on modern drives with modern operating systems, and had it report "lost clusters" or "cross-linked clusters" when there wasn't even any hardware issue. NTFS is much more stable, and things like that don't occur unless there is a serious hardware fault.
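The retry behaviour described above can be sketched in a few lines. This is a hypothetical illustration only: the `write_fn` and `find_free_cluster` callbacks are made up for the sketch, not a real defragmenter API.

```python
import time

def write_with_retry(write_fn, find_free_cluster, data,
                     max_retries=5, timeout_s=2.0):
    """Try free clusters until the write succeeds, a retry limit
    is hit, or a timeout expires -- the a/b/c outcomes above."""
    deadline = time.monotonic() + timeout_s
    for _ in range(max_retries):                  # b) retry limit
        if time.monotonic() > deadline:           # a) timeout
            raise TimeoutError("pre-defined timeout reached")
        cluster = find_free_cluster()
        try:
            write_fn(cluster, data)   # raises if cluster got taken meanwhile
            return cluster            # c) success
        except BlockingIOError:
            continue                  # cluster "stolen"; try another one
    raise OSError("pre-defined retry limit reached")
```

Only after this returns successfully would the original location be released, which is why data doesn't get "lost" mid-move.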
-
http://www.ntcompatible.com/IPX_protocol_t34199.html
-
IPX is not supported (and never will be) on Windows Vista.
-
Or without any serious financial backing from some deep pockets.
-
There is no SATA 1 mode. Motherboards and hard drives that support Serial ATA conform to certain parts of the specification, which determine the capabilities of each. Serial ATA is the protocol, regardless of which parts of the specification a device supports. Serial ATA is Serial ATA. There is no SATA I nor SATA II (See: Dispelling the Confusion). There is only SATA. A Serial ATA device can support any number of the functions of the protocol, including transfers of up to 1.5Gb/s or 3Gb/s. Whether a device is 3Gb/s capable and the motherboard is 1.5Gb/s capable, or vice versa, has no bearing on how they operate together. They communicate via the Serial ATA protocol.

It's no different than USB 1.1 vs USB 2.0. USB is USB, but USB 2.0 has a much higher transfer rate. Regardless of transfer rates, a USB 2.0 device talks just fine to a USB 1.1 port and vice versa. Obviously, to get the maximum transfer rate, both device and port must be USB 2.0 and share the same feature set.

People often make the mistake of thinking that if a motherboard supports Serial ATA at 3Gb/s, then it must support features that were developed with that revision of the protocol, such as NCQ. That simply isn't the case. A Serial ATA port can operate at 1.5Gb/s and support NCQ, a Serial ATA port can operate at 3Gb/s and NOT support NCQ, and of course, a Serial ATA port can operate at 3Gb/s and support NCQ. So it doesn't matter what revision of Serial ATA the drive has or what revision the motherboard has; they are fully interoperable. Transfers will run at the highest speed common to both devices, and the features available will be those common to both devices as well. If you combine a motherboard that supports 1.5Gb/s, NCQ and Hot Plugging with a hard drive that supports 3Gb/s, NCQ and Staggered Spin-Up, then 1.5Gb/s and NCQ are the only capabilities that will be in effect, since those are the only ones the two devices have in common.
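That last example boils down to a set intersection; a minimal sketch (the feature strings are just labels for illustration, and note a 3Gb/s drive also lists 1.5Gb/s since it can negotiate down):

```python
def negotiate(board, drive):
    """Effective link = whatever capabilities both ends share."""
    common = board & drive
    # Link speed: the highest rate supported by both sides.
    speed = "3Gb/s" if "3Gb/s" in common else "1.5Gb/s"
    # Everything else in common is an available feature.
    features = common - {"1.5Gb/s", "3Gb/s"}
    return speed, features

# The combination from the post:
board = {"1.5Gb/s", "NCQ", "Hot Plugging"}
drive = {"1.5Gb/s", "3Gb/s", "NCQ", "Staggered Spin-Up"}
```

With these inputs, `negotiate(board, drive)` yields a 1.5Gb/s link with NCQ as the only shared feature, exactly as described above.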
-
There is no risk in defragging your files in the background, no more than in actually USING your computer. Why do people always assume that because computers had difficulty multitasking back in the early 90s, it's still the case? Today's operating systems, applications and hardware are sophisticated enough that there is no risk whatsoever in defragmenting in the background. You run no more risk of corrupting your data by defragging in the background than by defragging exclusively in the foreground. You run no more risk than copying a file from one folder to another, saving a document, renaming a file, or making a change in the registry. Why? Because no data corruption will occur if the computer is in good health and operating normally. No matter how heavy the load, no matter how many threads are running, no matter how much ram is in use, nothing will corrupt on a computer in good health, background or foreground.

If a computer is in bad health, either software/driver wise or hardware wise, it doesn't matter what you do; just using your computer will corrupt data. And if there's a power outage of some sort, your data stands a chance of getting corrupted no matter what you were doing. Whether you were defragging in the background or saving a Word document, any time there's an instant power loss, you risk data corruption.

Lest you forget, a modern processor will perform 3 to 4 billion calculations per second and write 60 to 80 million bytes of data to the hard drive in a single second. Gigabytes of data are read from and written to hard drives, passed around memory, and crunched through the processor in a single minute of intensive work on a modern computer. No data is ever corrupted. So how is there any more risk with a defrag operation running in the background than there is in moving gigabytes of data all over the place?
-
Development timelines are perhaps one of the most crucial aspects of any development language. The truth of the matter is that which programming languages survive and which ones die is weighed heavily by their development time scales. If you can make Microsoft Office 2007 fit on a 1.44MB floppy and run on a 486, sure, that's fast and efficient coding, but what if it takes you three years to develop because it's coded fully in ASM to accomplish such a feat? Does the company or the client care whether the suite comes on a CD or even on a DVD? Nope. But they both will care that it takes only 6 months to develop the new version.

With .NET I can create powerful, database-driven, web-powered applications in the blink of an eye. How long would it take to write the code to create the TCP/IP sockets to connect to the database, write a data-handling module to interact with it, and write data-population code to extract the information from the database and publish it? Then the time to write another set of TCP/IP socket code to work around HTTP, extract commands, parse and post back? In .NET I simply say: connect to this database, load this table into the data grid, and when clicked here, download this file from this URL. Done. I want to perform a DNS lookup? There's a function for that. Download a file? Function for that. Resize an image with high-quality bicubic resampling? There's a function for that. I get ten times the work done in one tenth the time.

And you know, because of the power of the .NET Framework, if ALL Windows applications were coded in .NET, applications would take up less space. Office would be only a few MBs instead of the 700MB it is now. Adobe Creative Suite would come on a single CD instead of five. All the duplicate code that all the different applications carry would be eliminated.
Why have fifty applications that each write their own custom TCP/IP connection code when they could use a single .NET function that has all this code already optimized and uniform across all applications? Tweak the TCP/IP functions that make up the .NET Framework and you've now improved performance in ALL applications simultaneously. I hardly see any downsides when compared to so many benefits.
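The "there's a function for that" point isn't .NET-specific. Here's the same idea sketched with Python's standard library (not .NET code, just an analogue): one framework call where hand-rolled socket code would have taken pages.

```python
import socket
import urllib.request

def dns_lookup(host):
    # One library call instead of a hand-rolled DNS resolver.
    return socket.gethostbyname(host)

def download(url, dest):
    # One library call instead of custom HTTP-over-sockets code.
    urllib.request.urlretrieve(url, dest)
```

Because every application goes through the same shared implementation, fixing or tuning that one function improves all of them at once, which is exactly the argument above.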
-
How many hard drives do you have? What type of hard drives are they (IDE/SATA)? Are they connected as Master or Slave? How many devices per cable? What brand / size / age are they? How many partitions? What is your motherboard?
-
There is no SATA 1 mode.
-
Same reason I stopped running CPU Idle Pro. The fact that there were never any idle cycles left caused a lot of different problems. Diskeeper did not run as it was meant to, XP did not perform its idle-time optimizations, and certain scheduled low-priority tasks did not run as intended either. There are a lot of things that rely on eventually running into idle CPU cycles.
-
The KB article you linked to, which I'd like to point out concerns the MFT exclusively, is new to me, as I have NEVER seen that happen on any machine, not first-hand, second-hand, or third-hand. The screenshots are certainly indicative of someone who does not keep his system defragmented; nonetheless, the fragmentation you point to has no influence on the boot sector. The MFT itself has 0 fragments; the other files serve only as extended information or storage for NTFS. Also, if you have a ~1GB USN Journal then you probably have an application installed that has enabled volume change tracking. A corrupt $UsnJrnl or $LogFile would result in different errors than a missing NTLDR, as seen in this KB article.
-
I disagree. There is a significant improvement on a Windows XP based computer up to the 2GB mark. There is a noticeable difference between 1GB and 2GB, especially when using large applications such as Microsoft Office, Adobe Creative Suite, or the Nero Suite. 2GB is definitely worth it. As for the system not using the full 2GB, that's not true unless you leave Windows sitting idle the entire time you use your computer. Even running a few large applications simultaneously is enough to start making good use of that 2GB of ram. 4GB of ram is about the maximum I'd recommend for a desktop system due to 32-bit limitations; not everything has moved to 64-bit yet. Also, you missed a crucial aspect of his question. The answer is simply Yes. There is a significant improvement in speed going from 333MHz DDR to 400MHz DDR.
-
Blank after enter the windows! need urgent help pls!
jcarle replied to jingder's topic in Windows XP
Do you have an on-board video card as well as a dedicated video card?
-
On the first part, regarding disk failure, LLXX beat me to it. On the second, I wanted to mention that Diskeeper DOES defragment the MFT. From within Diskeeper: 1) Right-click on the drive and choose Boot-Time Defragmentation. 2) Check "Enable boot-time defragmentation to run on the selected volumes". 3) Check "Defragment the Master File Table (MFT)". 4) Reboot.
-
You know, I don't understand what people's gripes are with .NET. "I don't like .NET applications because they require the .NET Framework"... which is akin to the logic of "I don't like VB6 applications because they require the VB runtimes"... which is akin to the logic of "I don't like Windows applications because they require access to Win32 APIs". You might as well just end with "I don't like Windows". .NET is the next generation of programming on the Windows platform. It's very similar to Java and shares some of the same roots, but funnily enough, no one complained about that because Java was a Sun product. So just because it's MS, it's bad? I just don't get it.
-
True enough. The thing is, if the person is skilled enough to disassemble, wouldn't you imagine they'd be skilled enough to work around obfuscators?
-
At the current moment, there are no Server 2000 ULs available.
-
I would suspect either a bad IDE cable or a Master/Slave conflict.
-
Well, I find that for an 8-port SATA RAID card, the 3ware 9500S-8 is reasonably priced at $480. It's interesting how some people swear by WD and some swear against it. I've never had a single dead hard drive. I've heard third-hand of people having problems with the 2MB drives and a series of 120GB models (I think it was). Second-hand of ONE person getting a dead 250GB. Out of the hundreds I've sold, installed, and seen, I've never seen any that were dead first-hand.
-
I'm honestly not sure what's on the D: partition, but I would suggest using Partition Magic to shrink your D: partition to, say, 2GB, adding the freed 15.5GB to your C:.
-
In this world anything is possible. I replaced a hard drive once because the owner had opened it to see what it looked like inside.
-
I wouldn't bother... if someone REALLY wants to disassemble your code, they will. Obfuscated or not.
-
Should'a bought a Linksys. But if you want to try to save yourself with this one, try a) updating the router's firmware to the latest version, then b) resetting it to factory defaults, followed by c) a cold reset (remove power, wait 5 minutes, plug back in).
-
Very cheap place for 80wire PATA cables?
jcarle replied to Jaqie Fox's topic in Hard Drive and Removable Media
I'll see what I can score up. I have a few "friends" I can squeeze.