Official - Windows 10 Worst Crap Ever!


bookie32



24 minutes ago, vinifera said:

everything 1.2 MB
US doesn't create one

nor should any MFT searcher... but oh well...

"To determine the current size of the MFT, analyze the NTFS file system drive with Disk Defragmenter, then click the View Report button. The drive statistics will be displayed, including the current MFT size, and number of fragments. You can also obtain the size of the MFT by using the FSCTL_GET_NTFS_VOLUME_DATA control code."

https://msdn.microsoft.com/en-us/library/windows/desktop/aa365230%28v=vs.85%29.aspx
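The FSCTL_GET_NTFS_VOLUME_DATA route from the quoted MSDN passage can be sketched in Python with ctypes. A minimal sketch, assuming Windows, administrator rights, and an NTFS volume: the structure mirrors NTFS_VOLUME_DATA_BUFFER from winioctl.h, and MftValidDataLength is the field that reports the current MFT size. `mft_size_bytes` is a hypothetical helper name, not an existing API.

```python
import ctypes
import sys

LONGLONG = ctypes.c_longlong   # LARGE_INTEGER
DWORD = ctypes.c_uint32

class NTFS_VOLUME_DATA_BUFFER(ctypes.Structure):
    """Mirrors the Win32 NTFS_VOLUME_DATA_BUFFER structure (winioctl.h)."""
    _fields_ = [
        ("VolumeSerialNumber",           LONGLONG),
        ("NumberSectors",                LONGLONG),
        ("TotalClusters",                LONGLONG),
        ("FreeClusters",                 LONGLONG),
        ("TotalReserved",                LONGLONG),
        ("BytesPerSector",               DWORD),
        ("BytesPerCluster",              DWORD),
        ("BytesPerFileRecordSegment",    DWORD),
        ("ClustersPerFileRecordSegment", DWORD),
        ("MftValidDataLength",           LONGLONG),  # current MFT size, in bytes
        ("MftStartLcn",                  LONGLONG),
        ("Mft2StartLcn",                 LONGLONG),
        ("MftZoneStart",                 LONGLONG),
        ("MftZoneEnd",                   LONGLONG),
    ]

# CTL_CODE(FILE_DEVICE_FILE_SYSTEM, 25, METHOD_BUFFERED, FILE_ANY_ACCESS) = 0x90064
FSCTL_GET_NTFS_VOLUME_DATA = (0x00000009 << 16) | (25 << 2)

def mft_size_bytes(volume=r"\\.\C:"):
    """Return MftValidDataLength for an NTFS volume (Windows only, needs admin)."""
    if sys.platform != "win32":
        raise OSError("DeviceIoControl is only available on Windows")
    kernel32 = ctypes.windll.kernel32
    kernel32.CreateFileW.restype = ctypes.c_void_p
    INVALID_HANDLE_VALUE = ctypes.c_void_p(-1).value
    # Access 0 (query only), share read/write (3), OPEN_EXISTING (3)
    handle = kernel32.CreateFileW(volume, 0, 3, None, 3, 0, None)
    if handle == INVALID_HANDLE_VALUE:
        raise ctypes.WinError()
    try:
        data = NTFS_VOLUME_DATA_BUFFER()
        returned = ctypes.c_uint32(0)
        ok = kernel32.DeviceIoControl(
            ctypes.c_void_p(handle), FSCTL_GET_NTFS_VOLUME_DATA,
            None, 0,
            ctypes.byref(data), ctypes.sizeof(data),
            ctypes.byref(returned), None)
        if not ok:
            raise ctypes.WinError()
        return data.MftValidDataLength
    finally:
        kernel32.CloseHandle(ctypes.c_void_p(handle))
```

On the author's example of a 1.2-million-file volume, this is the same number Disk Defragmenter's report shows as the MFT size.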


1 hour ago, MikeyV said:

Eh, ok. You had asked why use a db; I was trying to shed some light on why I feel it's better.

I would say that there is a need for some benchmarks in order to provide an objective reason for this (or that) preference. A $MFT search is usually "instantaneous" or "nearly instantaneous", and the "separate database" approach needs to beat that (besides, the time taken to index somehow needs to be taken into account).

I presume that on a "very static" filesystem there may be some advantage to the "external database" approach, whilst on a filesystem where files are continuously deleted/moved and created/copied, the direct $MFT search would be faster overall, as the "indexing service" would eat CPU cycles.
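The benchmark being asked for can at least be sketched in miniature: a toy comparison of a per-query linear scan against a prebuilt name index. This is a hypothetical in-memory model, not a real $MFT reader or indexing service; it only illustrates the trade-off that the index answers exact lookups faster but must be rebuilt (or incrementally updated) whenever the file set changes.

```python
import fnmatch
import time

def linear_scan(paths, pattern):
    """Match every name on each query -- the '$MFT search' analogue."""
    return [p for p in paths if fnmatch.fnmatch(p.rsplit("/", 1)[-1], pattern)]

def build_index(paths):
    """One-off indexing pass -- the 'separate database' analogue."""
    index = {}
    for p in paths:
        name = p.rsplit("/", 1)[-1].lower()
        index.setdefault(name, []).append(p)
    return index

def index_lookup(index, name):
    """Exact-name lookup against the prebuilt index."""
    return index.get(name.lower(), [])

# Synthetic "volume" of 50,000 files.
paths = [f"dir{i % 100}/file{i}.txt" for i in range(50_000)]

t0 = time.perf_counter()
scan_hits = linear_scan(paths, "file42.txt")
scan_time = time.perf_counter() - t0

t0 = time.perf_counter()
index = build_index(paths)          # paid once, and again after every change
build_time = time.perf_counter() - t0

t0 = time.perf_counter()
index_hits = index_lookup(index, "file42.txt")
lookup_time = time.perf_counter() - t0
```

The lookup wins by orders of magnitude per query, but only while `build_time` (and every subsequent re-index) stays amortized over enough queries on a stable enough file set -- which is exactly the point about "very static" versus churning filesystems.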

While we are at it, some previous discussions and pointers:
 

http://reboot.pro/topic/18855-windows-file-search-utility-that-is-fast/

jaclaz



 


Uh oh, I feel a pet peeve coming on...

I've always regarded any form of indexing as rather silly, myself.  One of the first things I do on any Windows system is turn Microsoft's indexing completely off.

Regarding filename searches...

Just as an exercise, I searched my 2 TB C: volume on my Win 8.1 workstation, which has about 1.2 million files holding a total of a bit over a terabyte of data on it, for *srgb*.*.

  • DIR C:\*srgb*.* /s /b finished in 12 seconds.

  • Using *srgb*.*, the freeware GUI tool grepWin (which I like) found all the files with srgb in the name in 22 seconds, with the advantage that the results were put in a nice table that could then be sorted, used to open the files, etc.  Also, grepWin can do regular-expression-based searches, exclusions, etc.

  • Windows Explorer, with search term filename:*srgb*.*, took 3 minutes and 10 seconds.
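For reference, a DIR /s /b-style recursive filename match can be sketched in a few lines of Python with os.walk and fnmatch (a hypothetical illustration; this is not how DIR or grepWin are actually implemented):

```python
import fnmatch
import os

def find_files(root, pattern):
    """Recursively match file names under root, DIR /s /b style (case-insensitive)."""
    hits = []
    pattern = pattern.lower()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if fnmatch.fnmatch(name.lower(), pattern):
                hits.append(os.path.join(dirpath, name))
    return hits
```

Something like find_files("C:\\", "*srgb*.*") would walk the whole volume, just like the DIR example above -- bounded by directory-enumeration speed rather than any index.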

Granted, 22 seconds for grepWin is not "instantaneous", but for filename searches is there a real need for indexing to speed that up further?  Granted, this kind of search is likely to be slower on a non-SSD-equipped system, but the file system cache does make up for that in large part. 

And I can't remember the last time I had to search my entire volume for something.  Generally speaking I know what subfolder a file is going to be in and a search is instantaneous.  As an example, searching my C:\Astronomy folder tree (3,959 files in 335 folders taking up 266 GB) takes less than 1 second no matter which tool I choose.  Even Explorer does the job in a passable amount of time if you're not searching the whole volume.

Protection from lost files is what backups (and good practices) are for, no?  If you're concerned over the history of your files mysteriously coming and going, I imagine you could schedule a nightly job to store a full DIR of all files on the disk in a log file named with the date/time at regular intervals.  That would also have the advantage of regularly loading your file system structure into the cache, which would facilitate interactive work...
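The nightly "full DIR in a dated log file" idea could look something like the following (a hypothetical sketch; `snapshot_listing` is an invented name, and the actual scheduling would be done with Task Scheduler or schtasks):

```python
import os
import time

def snapshot_listing(root, out_dir):
    """Write a dated, DIR /s /b-style listing of every file under root.

    Returns the path of the log file, named with the date/time so that
    a history of snapshots accumulates for later comparison.
    """
    stamp = time.strftime("%Y%m%d-%H%M%S")
    log_path = os.path.join(out_dir, f"files-{stamp}.log")
    with open(log_path, "w", encoding="utf-8") as log:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                log.write(os.path.join(dirpath, name) + "\n")
    return log_path
```

As noted above, a nightly run of this would also have the side effect of pulling the file system structure into the cache.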

Regarding content searches...

I don't know about you but when I search for something I expect the results to be rigorous.  I'm not interested in finding something, I'm interested in finding ALL of what I'm looking for, or know the reasons why I can't.  If I get no results, it has to be because there are NO files that have that data, not because "well, maybe that data wasn't indexed", or "maybe the index was out of date", or "maybe the index was corrupted", or "gee, that file couldn't be opened, so its content was not indexed".

To presume that whatever indexing system is in place has anticipated EVERY POSSIBLE SEARCH one could ever want to do is ridiculous, and there's NO WAY it could index everything on the disk - that would just be stupid.  Even those that strive to index all possible printable words could miss something you desperately need to find that contains special characters.  Maybe I'm just too geeky, but I'm as often searching for special characters as not, and using regular expressions to find complex combinations of them.
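A rigorous content search along those lines can be sketched with os.walk and re (a hypothetical illustration, not grepWin's actual code). The point is that files that cannot be opened are reported back rather than silently skipped, so an empty result always comes with its reasons:

```python
import os
import re

def grep_tree(root, pattern):
    """Regex-search file contents under root; returns (matches, unreadable).

    Unlike an index, nothing is silently dropped: files that cannot be
    read are collected and returned, so the caller knows exactly why
    results might be incomplete.
    """
    regex = re.compile(pattern)
    matches, unreadable = [], []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", encoding="utf-8",
                          errors="surrogateescape") as f:
                    if regex.search(f.read()):
                        matches.append(path)
            except OSError:
                unreadable.append(path)
    return matches, unreadable
```

Because the pattern is a real regular expression, special characters and complex combinations of them are first-class search terms rather than things an indexer may or may not have tokenized.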

"It's fast, I don't care!  It blows up in midair!!"

My advice: 

If you're struggling so hard to find your data so often that you feel you need to index it to speed up the process of searching for it, then 1) you should consider striving to organize it better, and possibly 2) you should seek better equipment that can keep up with you.

Keeping a separate index - which of course could be out of date - besides competing for disk access with your interactive work, reminds me of an old adage:  "A man with one watch always knows what time it is.  A man with two watches is never sure."

Indexing disk data for general computing use is, IMO, just a bad idea.

-Noel


What topic?  That Windows 10 provides the exact same indexing as prior versions, without improvement, which leaves Windows Search even more inexact and hardly faster than having no indexing at all?

That's kind of the point.  No improvements means Microsoft has abandoned trying to make the software we use better.

-Noel

Edited by NoelC

Reading this and other similar threads, I've been recommending that all my friends upgrade back to Windows 7, and I'm testing the viability of moving to a carefully-tuned 8.1 when 7's support is dropped.

However, I have recently chosen not to give in to all the fear mongering ("one MUST run the latest, most up to date software on the latest, most up to date hardware AT ALL TIMES, or the entire Internet will die"). So, with that, I will begin running XP as my main Windows OS again, hardware permitting (running XP on modern hardware has become increasingly difficult, and in some cases, impossible, so I'll run 7 in those cases). The main thing I'd need a newer version of Windows for (Pro Tools) I use on a Mac anyway, so there's nothing holding me back.

c

Edited by cc333

+1.  What My1 said.

In summary:

I have applied every bit of my knowledge and expertise to trying to bend Win 10 into becoming a workhorse, and I have achieved a measure of success.  But once all of Microsoft's current screw-ups have been worked around, you find that everything they've added since Win 8 has been removed, and that some other things (like desktop usability) have simply been degraded - thus there is really no good reason to use Win 10 as opposed to a well-tweaked Win 8.1.

If somehow Win 8.1 weren't available to me (e.g., I buy a brand new system) I could probably settle for an App-culled, no-Metro/Modern/Universal Win 10 that needs reconfiguration after many of Microsoft's updates and still be able to get my work done.  But then I would fondly remember Windows 8.1 and pine for the golden days of computing.  And I would worry that I would lose even more functionality with each and every new Microsoft update.

-Noel


3 hours ago, NoelC said:

... there is really no good reason to use Win 10...

Hi, 

Of course.

I've spent many hours trying to obtain a manageable, smaller Windows 10 that runs just what I really need, and every effort was a complete failure. This OS is made to manage you, not to be managed by its users. So I went back to my dear Windows 7, and will keep using it until Microsoft rectifies this lack of consideration towards us all, I'm afraid.


Speed is important, but for me personally, being able to search drives on the local system, on the network, on FTP, or on 5 or 6 laptops from a single db makes 3rd-party software my preference. As I said above, having the db for a failed / lost / stolen drive is also a bonus for me. Sure, I don't have the actual data, but not knowing what exactly was lost is insult to injury (IMO).

You do need to manually update the db for specific volumes. For informational purposes: it just took 00:00:03 to update the db for a drive of 723 folders and 10,277 files, with a total size of 640.52 GB (SATA 7200 rpm).

It took 8 seconds to search for *srgb*.* in the current db of 62,516 folders and 1,182,903 files, totalling 8.58 TB.
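The "separate database" workflow described here - manually re-indexing a volume, then searching every cataloged drive at once, including drives that are no longer attached - could be sketched with Python's built-in sqlite3. This is a hypothetical schema and pair of helper names, not the actual tool's format:

```python
import os
import sqlite3

def catalog_volume(db_path, label, root):
    """(Re)index one volume into the catalog under the given label."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS files (volume TEXT, path TEXT, size INTEGER)")
    # Manual update, as described: wipe and re-walk just this volume.
    con.execute("DELETE FROM files WHERE volume = ?", (label,))
    rows = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                rows.append((label, path, os.path.getsize(path)))
            except OSError:
                rows.append((label, path, None))
    con.executemany("INSERT INTO files VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()

def search_catalog(db_path, name_like):
    """Search every cataloged volume at once -- even drives no longer attached."""
    con = sqlite3.connect(db_path)
    hits = con.execute(
        "SELECT volume, path FROM files WHERE path LIKE ?",
        (f"%{name_like}%",)).fetchall()
    con.close()
    return hits
```

Because the catalog file lives apart from the drives it describes, it survives a failed / lost / stolen drive and still answers "what exactly was on it?" - the bonus mentioned above.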

With something like the above, I can disable the Windows services that supposedly 'speed up' searches.

It's all personal choice - use what works best for you :) I really don't have anything to do with DEP; I won't make money if you use it, and I won't be sad if you don't.


Another thing that always puzzled me...
Why didn't the devs come up with some easier-to-catalog db format, and then, for a start, ship the whole OS already indexed within the WIM?

Later it would be easier to index other things on the fly (maybe), or the indexer should at least have to meet some conditions - for instance, avoid indexing DLLs and the like, just EXEs. And since metadata has been their goal ever since Longhorn, simply scan the known document/file types: music, text, movies, images...

...while for us more picky people, other file types could go directly via the MFT.
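The selective policy described above - index only executables and known document types, and leave everything else to a direct $MFT scan - could be sketched as a simple router. The policy and extension list are hypothetical, purely illustrative of the idea:

```python
import os

# Hypothetical policy: index "known" document types and EXEs;
# send everything else (DLLs etc.) to a direct $MFT-style scan instead.
INDEXED_EXTENSIONS = {
    ".exe",                  # executables
    ".txt", ".doc", ".pdf",  # documents / text
    ".mp3", ".flac",         # music
    ".mp4", ".avi",          # movies
    ".jpg", ".png",          # images
}

def should_index(path):
    """Return True if this file type belongs in the metadata index."""
    return os.path.splitext(path)[1].lower() in INDEXED_EXTENSIONS

def route(paths):
    """Split a file list into (indexed, mft_only) per the policy above."""
    indexed = [p for p in paths if should_index(p)]
    mft_only = [p for p in paths if not should_index(p)]
    return indexed, mft_only
```

An image shipped pre-indexed under such a policy would only ever have to re-index files matching the whitelist, keeping the on-the-fly work small.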

