
Poll and Discuss Defragmentation Software


DigeratiPrime

Defragmentation Software  

81 members have voted

  1. Which do you recommend?

    • Windows Default
      9
    • Diskeeper
      12
    • Raxco PerfectDisk
      14
    • O&O Defrag
      7
    • Piriform Defraggler
      5
    • Auslogics Disk Defrag
      4
    • DiskTrix UltimateDefrag
      2
    • Sysinternals Contig / PageDefrag
      0
    • JkDefrag
      12
    • UltraDefrag
      3
    • mst Defrag
      0
    • Other
      8
    • None
      5


Recommended Posts


And IF :ph34r: people need to do some data recovery (file-based), they should be very happy :whistle: about no longer being able to get contiguous files larger than 64 MB on 7:

Let’s walk through an example that helps illustrate the complexity in directly correlating extent of fragmentation with user-visible performance.

In Windows XP, any file that is split into more than one piece is considered fragmented. Not so in Windows Vista if the fragments are large enough – the defragmentation algorithm was changed (from Windows XP) to ignore pieces of a file that are larger than 64MB. As a result, defrag in XP and defrag in Vista will report different amounts of fragmentation on a volume. So, which one is correct? Well, before the question can be answered we must understand why defrag in Vista was changed.

In Vista, we analyzed the impact of defragmentation and determined that the most significant performance gains from defrag are when pieces of files are combined into sufficiently large chunks such that the impact of disk-seek latency is not significant relative to the latency associated with sequentially reading the file. This means that there is a point after which combining fragmented pieces of files has no discernible benefit. In fact, there are actually negative consequences of doing so. For example, for defrag to combine fragments that are 64MB or larger requires significant amounts of disk I/O, which is against the principle of minimizing I/O that we discussed earlier (since it decreases total available disk bandwidth for user initiated I/O), and puts more pressure on the system to find large, contiguous blocks of free space. Here is a scenario where a certain amount of fragmentation of data is just fine – doing nothing to decrease this fragmentation turns out to be the right answer!

Note that a concept that is relatively simple to understand, such as the amount of fragmentation and its impact, is in reality much more complex, and its real impact requires comprehensive evaluation of the entire system to accurately address. The different design decisions across Windows XP and Vista reflect this evaluation of the typical hardware & software environment used by customers. Ultimately, when thinking about defragmentation, it is important to realize that there are many additional factors contributing towards system responsiveness that must be considered beyond a simple count of existing fragments.
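The diminishing-returns argument in the quoted post can be put into rough numbers. The drive characteristics below (10 ms per seek, 100 MB/s sequential throughput) are illustrative assumptions for a typical mechanical disk of that era, not figures from the post itself:

```python
# Toy model: time to read a file split into equal-size fragments,
# charging one seek per fragment plus sequential transfer time.
# Assumed drive characteristics (illustrative, not from the quoted post):
SEEK_S = 0.010        # 10 ms average seek + rotational latency per fragment
THROUGHPUT_MBS = 100  # 100 MB/s sequential transfer rate

def read_time_s(file_mb, fragment_mb):
    """Sequential transfer time plus one seek per fragment."""
    fragments = file_mb / fragment_mb
    return file_mb / THROUGHPUT_MBS + fragments * SEEK_S

def seek_overhead(file_mb, fragment_mb):
    """Fractional slowdown versus a perfectly contiguous read."""
    contiguous = file_mb / THROUGHPUT_MBS
    return read_time_s(file_mb, fragment_mb) / contiguous - 1

for frag_mb in (1, 16, 64, 256):
    pct = 100 * seek_overhead(1024, frag_mb)
    print(f"1 GB file in {frag_mb:>3} MB fragments: {pct:6.2f}% slower than contiguous")
```

Under these assumptions, 1 MB fragments roughly double the read time, while 64 MB fragments cost under 2% versus a fully contiguous file, which matches the post's claim that combining fragments beyond that point has no discernible benefit.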

Nice new features, BTW:

Defragmentation in Windows 7 is more comprehensive – many files that could not be re-located in Windows Vista or earlier versions can now be optimally re-placed. In particular, a lot of work was done to make various NTFS metadata files movable. This ability to relocate NTFS metadata files also benefits volume shrink, since it enables the system to pack all files and file system metadata more closely and free up space “at the end” which can be reclaimed if required.

... but still they could have made more options available in the GUI, instead of requiring a trip to the command line and the -w switch:

http://www.howtohaven.com/system/vistadefragmentation.shtml

(on Vista), and later removing the option completely in 7:

http://www.sevenforums.com/performance-maintenance/30019-full-defrag-option-via-cmd.html

http://www.techrepublic.com/blog/windows-and-office/exert-control-and-defrag-from-the-command-line-in-windows-7/4306/

jaclaz


Always keeping 0% fragmentation is not recommended for good performance. You only cause a lot of disk I/O for a 0.1% improvement.

Yep, but my point was different. Leaving performance aside for the moment (it is not the ONLY parameter), a fully contiguous file can be parsed (and recovered) in case of very serious filesystem issues; a fragmented one, particularly the kind of file that may be bigger than 64 MB, hardly so.

I was not ranting about removing the automagic provision, only about no longer being allowed to manually ask for a "full" defrag (it would have cost nothing to NOT remove an existing feature).

Moreover, the 64 MB "limit" is undocumented/unverified/untested; the size may well depend on a number of factors that the good MS guys did not consider or did not test at all: the type of usage of the machine, the type of storage subsystem, the type of files in use on average, etc.

I won't buy that a "magic" 64 MB one-size-fits-all actually fits all "properly".
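The recovery argument can be illustrated with a toy file-carving sketch. Carving tools recover files from a raw disk image without filesystem metadata by locating a header signature and reading sequentially to a footer; that only works when the file is contiguous. Everything below (the JPEG-like markers, the fake sector image) is invented for the demonstration:

```python
# Toy "disk image": 512-byte sectors holding one JPEG-like file, stored two ways.
SECTOR = 512
HEADER, FOOTER = b"\xff\xd8\xff\xe0", b"\xff\xd9"   # JPEG-style start/end markers
body = bytes(range(256)) * 8                         # 2048 bytes of payload
file_data = HEADER + body + FOOTER

def carve_contiguous(image):
    """Recover a file by signature alone: find a header, read up to the footer."""
    start = image.find(HEADER)
    if start < 0:
        return None
    end = image.find(FOOTER, start)
    if end < 0:
        return None
    return image[start:end + len(FOOTER)]

# Case 1: contiguous layout -- carving recovers the file intact.
contiguous_image = b"\x00" * SECTOR + file_data + b"\x00" * SECTOR
assert carve_contiguous(contiguous_image) == file_data

# Case 2: fragmented layout -- the second half is stored BEFORE the first
# half, as a filesystem is free to do.  Carving fails (returns None here),
# because no footer follows the header in on-disk order; only the extent
# map in the (now broken) filesystem metadata could reassemble the file.
half = len(file_data) // 2
fragmented_image = (b"\x00" * SECTOR + file_data[half:] +
                    b"\x00" * SECTOR + file_data[:half] + b"\x00" * SECTOR)
assert carve_contiguous(fragmented_image) != file_data
```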

jaclaz


Always keeping 0% fragmentation is not recommended for good performance.

Well, that is also impossible, at least on NTFS systems, as they suck when something needs to be reallocated.

I was implying "real versus MS results"

and I'll take any 3rd party app solution over what MS imposes as "truth" or their "model for..." whatever crap


I always go by logic, not feeling.

If X program (no matter which; there are tons of them) shows that some file or files are fragmented, then they are fragmented.
There is no reason why X company should get to dictate, by THEIR standard, what it MEANS to be fragmented and what doesn't.

This is the same reason why people are customizing their OSes:
because the given UI isn't good and because certain components are useless,

yet again imposed by X company.

Edited by vinifera

  • 2 months later...
  • 2 weeks later...

Apparently you've missed the relationship between fragmentation and prefetch, beginning with XP.

XP's prefetch observes and keeps track of what files, and even file pieces, are loaded when XP and the applications load.

This information is used by XP's defrag to optimize the position of the files on the disk. That is, Defrag places files that are loaded at the same time contiguously on the disk, which competitors don't do. I even believe I have read (but am unsure) that XP's defrag splits files if the prefetcher senses that some files are not loaded in one piece.

As consequences:

- In a multiboot installation, one should not defragment files used by one OS using the software on the other OS

- Different defragmenters have different opinions about what a defragmented disk is - easy to check.

- Competitors appear to operate more quickly, but the result is slower

- You can tell the difference by ear, on a mechanical disk, especially during the OS boot. XP's defrag does a better job than the competitors.
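The claimed benefit of prefetch-guided placement can be mimicked with a toy model of boot-time reads. This is not XP's actual placement algorithm, just an illustration of why laying out files that are loaded together back-to-back beats merely making each file contiguous in isolation; the file names, sizes, layouts, and seek cost are all invented:

```python
# Toy disk model: reading a block costs 1 unit, plus a fixed seek penalty
# whenever the head must jump to a non-adjacent position.
SEEK_COST = 50

def boot_read_cost(layout, boot_order):
    """Cost of reading files in boot_order, given a file -> (start, length) layout."""
    cost, head = 0, None
    for name in boot_order:
        start, length = layout[name]
        if head is not None and start != head:
            cost += SEEK_COST            # jump to the file's first block
        cost += length                   # sequential transfer
        head = start + length
    return cost

boot_order = ["ntldr", "kernel", "drivers"]   # hypothetical load order

# Each file is contiguous in itself, but the files are scattered on the disk:
scattered = {"drivers": (0, 30), "kernel": (1000, 40), "ntldr": (2000, 10)}
# Prefetch-style layout: the same files placed back-to-back in load order:
colocated = {"ntldr": (0, 10), "kernel": (10, 40), "drivers": (50, 30)}

print(boot_read_cost(scattered, boot_order))   # 2 seeks: 80 + 100 = 180
print(boot_read_cost(colocated, boot_order))   # 0 seeks: 80
```

In the model, both layouts are "0% fragmented" by the usual per-file count, yet the co-located layout boots faster, which is the point the post attributes to XP's prefetch-aware defrag.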


Apparently you've missed the relationship between fragmentation and prefetch, beginning with XP.

Or maybe you have missed that Mydefrag (but not only it) does have a provision to deal with Prefetch, AND, according to this test:

http://www.hofmannc.de/en/windows-xp-defragmenter-test/benchmarks.html

the result is actually faster than "standard" Windows defrag.

jaclaz


  • 1 year later...

There's no defragger which would be best in an absolute sense, especially in multi-boot environments, as someone above correctly pointed out.

Now, to the poll's simple question "which defragger do you use (in practice)?", my simple answers:

- for casual whole-disk defragging, Windows' built-in - more precisely, the defragger built into the "windoze" that the particular disk is most tied to. I do not defrag Linux "ext" or "reiser" partitions. Not that any Windows defragger does the best possible job, but it is counterproductive to run different defraggers on successive occasions: each defragger has different "ideas" and algorithms, and so will spend a long time destroying the work of the previous one... The choice is not critical, but choosing one and keeping to it (at least per disk partition) is more important than using the "best" for whatever measure of "best"... Oh, and free (no-cost) ones are good enough; keep the money for other goodies.

- for special tasks, viz. when you need to defragment a single file or a few selected files: Sysinternals' contig.exe, Defraggler...

Those were my 2 cents, take or leave... :=)

Edited by Ninho
