vinifera Posted September 5, 2016 (edited)

Everything (a 1.2 MB download) doesn't create one, nor should any MFT searcher... but oh well...
MikeyV Posted September 5, 2016

Quoting vinifera: "Everything (a 1.2 MB download) doesn't create one, nor should any MFT searcher... but oh well..."

"To determine the current size of the MFT, analyze the NTFS file system drive with Disk Defragmenter, then click the View Report button. The drive statistics will be displayed, including the current MFT size and number of fragments. You can also obtain the size of the MFT by using the FSCTL_GET_NTFS_VOLUME_DATA control code."

https://msdn.microsoft.com/en-us/library/windows/desktop/aa365230%28v=vs.85%29.aspx
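For anyone curious, the FSCTL_GET_NTFS_VOLUME_DATA route can be reached without writing C. A minimal Python/ctypes sketch (Windows-only at call time, and opening a volume handle typically needs admin rights; the helper names are mine, not from any tool discussed in this thread):

```python
import ctypes
import sys

def ctl_code(device_type, function, method=0, access=0):
    # The CTL_CODE macro from winioctl.h.
    return (device_type << 16) | (access << 14) | (function << 2) | method

FILE_DEVICE_FILE_SYSTEM = 0x00000009
# FSCTL_GET_NTFS_VOLUME_DATA = CTL_CODE(FILE_DEVICE_FILE_SYSTEM, 25,
#                                       METHOD_BUFFERED, FILE_ANY_ACCESS)
FSCTL_GET_NTFS_VOLUME_DATA = ctl_code(FILE_DEVICE_FILE_SYSTEM, 25)

class NTFS_VOLUME_DATA_BUFFER(ctypes.Structure):
    # Field layout per winioctl.h; LARGE_INTEGER as c_int64, DWORD as c_uint32.
    _fields_ = [
        ("VolumeSerialNumber", ctypes.c_int64),
        ("NumberSectors", ctypes.c_int64),
        ("TotalClusters", ctypes.c_int64),
        ("FreeClusters", ctypes.c_int64),
        ("TotalReserved", ctypes.c_int64),
        ("BytesPerSector", ctypes.c_uint32),
        ("BytesPerCluster", ctypes.c_uint32),
        ("BytesPerFileRecordSegment", ctypes.c_uint32),
        ("ClustersPerFileRecordSegment", ctypes.c_uint32),
        ("MftValidDataLength", ctypes.c_int64),
        ("MftStartLcn", ctypes.c_int64),
        ("Mft2StartLcn", ctypes.c_int64),
        ("MftZoneStart", ctypes.c_int64),
        ("MftZoneEnd", ctypes.c_int64),
    ]

def mft_size_bytes(volume=r"\\.\C:"):
    """Return the valid data length of the $MFT on the given volume (Windows only)."""
    if sys.platform != "win32":
        raise OSError("NTFS ioctls require Windows")
    GENERIC_READ = 0x80000000
    FILE_SHARE_READ, FILE_SHARE_WRITE, OPEN_EXISTING = 1, 2, 3
    k32 = ctypes.windll.kernel32
    handle = k32.CreateFileW(volume, GENERIC_READ,
                             FILE_SHARE_READ | FILE_SHARE_WRITE,
                             None, OPEN_EXISTING, 0, None)
    if handle == -1:
        raise ctypes.WinError()
    try:
        data = NTFS_VOLUME_DATA_BUFFER()
        returned = ctypes.c_uint32(0)
        ok = k32.DeviceIoControl(handle, FSCTL_GET_NTFS_VOLUME_DATA,
                                 None, 0, ctypes.byref(data),
                                 ctypes.sizeof(data), ctypes.byref(returned), None)
        if not ok:
            raise ctypes.WinError()
        return data.MftValidDataLength
    finally:
        k32.CloseHandle(handle)
```

The MftValidDataLength field is what Disk Defragmenter's report summarizes as "current MFT size".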
vinifera Posted September 6, 2016

Why? It's pointless. The MFT is itself a sort of database, and fast search programs should use it instead of an indexer.
MikeyV Posted September 6, 2016

Eh, ok. You had asked why use a DB; I was trying to shed some light on why I feel it's better.
jaclaz Posted September 6, 2016

Quoting MikeyV: "Eh, ok. You had asked why use a DB; I was trying to shed some light on why I feel it's better."

I would say that some benchmarks are needed to provide an objective reason for this (or that) preference. An $MFT search is usually "instantaneous" or "nearly instantaneous", and the "separate database" approach needs to beat that (besides, the time taken to index somehow needs to be taken into account). I presume that on a "very static" filesystem there may be some advantage to the "external database" approach, whilst on a filesystem where files are continuously deleted/moved and created/copied the direct $MFT search would be faster overall, as the "indexing service" would eat CPU cycles.

While we are at it, some previous discussions and pointers:
http://reboot.pro/topic/18855-windows-file-search-utility-that-is-fast/

jaclaz
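Since jaclaz asks for benchmarks: a rough, hypothetical harness for the comparison might look like the sketch below (Python, with a directory walk standing in for a real $MFT parse, which would need raw volume access; all names are made up for illustration):

```python
import fnmatch
import os

def walk_search(root, pattern):
    """Search by walking the live file system: the 'no database' approach."""
    pattern = pattern.lower()
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        hits.extend(os.path.join(dirpath, f)
                    for f in files if fnmatch.fnmatch(f.lower(), pattern))
    return hits

def build_index(root):
    """One-off index build: the up-front cost the 'separate database' side pays."""
    index = []
    for dirpath, _dirs, files in os.walk(root):
        index.extend(os.path.join(dirpath, f) for f in files)
    return index

def index_search(index, pattern):
    """Query the prebuilt index; entries go stale as files are created/deleted."""
    pattern = pattern.lower()
    return [p for p in index
            if fnmatch.fnmatch(os.path.basename(p).lower(), pattern)]
```

Timing each call with time.perf_counter() over a churning tree versus a static one would give the objective numbers being asked for, including how often the index cost is re-paid.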
NoelC Posted September 6, 2016

Uh oh, I feel a pet peeve coming on...

I've always regarded any form of indexing as rather silly, myself. One of the first things I do on any Windows system is turn Microsoft's indexing completely off.

Regarding filename searches...

Just as an exercise I searched my 2 TB C: volume on my Win 8.1 workstation, which holds about 1.2 million files totaling a bit over a terabyte of data, for *srgb*.*.

DIR C:\*srgb*.* /s /b finished in 12 seconds.

Using *srgb*.*, the freeware GUI tool grepWin (which I like) found all the files with srgb in the name in 22 seconds, with the advantage that the results were put in a nice table that could then be sorted, used to open the files, etc. grepWin can also do regular-expression-based searches, exclusions, etc.

Windows Explorer, with the search term filename:*srgb*.*, took 3 minutes and 10 seconds.

Granted, 22 seconds for grepWin is not "instantaneous", but for filename searches is there a real need for indexing to speed that up further? Granted, this kind of search is likely to be slower on a non-SSD-equipped system, but the file system cache makes up for that in large part.

And I can't remember the last time I had to search my entire volume for something. Generally speaking I know what subfolder a file is going to be in, and a search there is instantaneous. As an example, searching my C:\Astronomy folder tree (3,959 files in 335 folders taking up 266 GB) takes less than 1 second no matter which tool I choose. Even Explorer does the job in a passable amount of time if you're not searching the whole volume.

Protection from lost files is what backups (and good practices) are for, no? If you're concerned about the history of your files mysteriously coming and going, I imagine you could schedule a nightly job to store a full DIR listing of all files on the disk in a log file named with the date/time.
That would also have the advantage of regularly loading your file system structure into the cache, which would facilitate interactive work...

Regarding content searches...

I don't know about you, but when I search for something I expect the results to be rigorous. I'm not interested in finding something; I'm interested in finding ALL of what I'm looking for, or knowing the reasons why I can't. If I get no results, it has to be because there are NO files that have that data, not because "well, maybe that data wasn't indexed", or "maybe the index was out of date", or "maybe the index was corrupted", or "gee, that file couldn't be opened, so its content was not indexed".

To presume that whatever indexing system is in place has anticipated EVERY POSSIBLE SEARCH one could ever want to do is ridiculous, and there's NO WAY it could index everything on the disk - that would just be stupid. Even the indexers that strive to index all possible printable words could miss something you desperately need to find that contains special characters. Maybe I'm just too geeky, but I'm as often searching for special characters as not, and using regular expressions to find complex combinations of them.

"It's fast, I don't care! It blows up in midair!!"

My advice: if you're struggling so hard to find your data so often that you feel you need to index it to speed up the process of searching for it, then 1) you should consider striving to organize it better, and possibly 2) you should seek better equipment that can keep up with you.

Keeping a separate index - which of course could be out of date - besides competing for disk access with your interactive work, reminds me of an old adage: "A man with one watch always knows what time it is. A man with two watches is never sure."

Indexing disk data for general computing use is, IMO, just a bad idea.

-Noel
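The nightly listing job NoelC describes above could be sketched like this (a hypothetical Python stand-in for a batch file running DIR /S /B; the function and file names are made up):

```python
import os
import time

def snapshot_listing(root, log_dir):
    """Write every file path under `root` to a date/time-stamped log, DIR /S /B style."""
    os.makedirs(log_dir, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    log_path = os.path.join(log_dir, "filelist-%s.log" % stamp)
    count = 0
    with open(log_path, "w", encoding="utf-8") as log:
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                log.write(os.path.join(dirpath, name) + "\n")
                count += 1
    return log_path, count
```

Scheduled nightly (Task Scheduler on Windows, cron elsewhere), diffing two consecutive logs would show exactly what came and went between runs.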
My1 Posted September 6, 2016

Well, if the indexing happens while idling it isn't a problem, IMO. But can we get back on topic, please?
NoelC Posted September 6, 2016 (edited)

What topic? That Windows 10 provides exactly the same indexing as prior versions, without improvement, which leaves Windows Search just as inexact and hardly faster than having no indexing at all?

That's kind of the point. No improvements means Microsoft has abandoned trying to make the software we use better.

-Noel
cc333 Posted September 6, 2016 (edited)

Reading this and other similar threads, I've been recommending that all my friends "upgrade" back to Windows 7, and I'm testing the viability of moving to a carefully tuned 8.1 when 7's support is dropped.

However, I have recently chosen not to give in to all the fear mongering ("one MUST run the latest, most up-to-date software on the latest, most up-to-date hardware AT ALL TIMES, or the entire Internet will die"). So, with that, I will begin running XP as my main Windows OS again, hardware permitting (running XP on modern hardware has become increasingly difficult, and in some cases impossible, so I'll run 7 in those cases). The main thing I'd need a newer version of Windows for (Pro Tools) I use on a Mac anyway, so there's nothing holding me back.

c
My1 Posted September 6, 2016

@cc333 Well, I run Win 8.1 fine with a couple of mods, especially Classic Shell and Aero Glass by @bigmuscle. And I was one of the number-one Win 8 haters back then. But W10 is too much anyway.
NoelC Posted September 6, 2016

+1. What My1 said.

In summary: I have applied every bit of my knowledge and expertise to trying to bend Win 10 into becoming a workhorse, and I have achieved a measure of success. But once all of Microsoft's current screw-ups have been removed, you find that everything they've added since Win 8 has been removed along with them, and then you find that some other things (like desktop usability) have simply been degraded - thus there is really no good reason to use Win 10 as opposed to a well-tweaked Win 8.1.

If somehow Win 8.1 weren't available to me (e.g., if I bought a brand new system) I could probably settle for an App-culled, no-Metro/Modern/Universal Win 10 that needs reconfiguration after many of Microsoft's updates, and still be able to get my work done. But then I would fondly remember Windows 8.1 and pine for the golden days of computing. And I would worry that I would lose even more functionality with each and every new Microsoft update.

-Noel
My1 Posted September 6, 2016

Quoting NoelC: "If somehow Win 8.1 weren't available to me (e.g., I buy a brand new system)"

That's why I have Win 8.1 on disc, with a COA.
cannie Posted September 6, 2016

Quoting NoelC: "... there is really no good reason to use Win 10..."

Hi,

Of course. I've spent many hours trying to obtain a manageable, smaller Windows 10 that runs just what I really need, and every effort was a complete failure. This OS is made to manage you, not to be managed by its users. So I went back to my dear Windows 7 and will keep using it until Microsoft rectifies this lack of consideration toward us all, I'm afraid.
MikeyV Posted September 6, 2016

Speed is important, but for me personally, being able to search drives on the local system, on the network, on FTP, or across 5 or 6 laptops from a single DB makes third-party software my preference. As I said above, having the DB for a failed/lost/stolen drive is also a bonus for me. Sure, I don't have the actual data, but not knowing exactly what was lost adds insult to injury (IMO).

You do need to manually update the DB for specific volumes. For informational purposes: it just took 00:00:03 to update the DB for a drive of 723 folders and 10,277 files with a total size of 640.52 GB (SATA, 7200 rpm), and 8 seconds to search for *srgb*.* in the current DB of 62,516 folders and 1,182,903 files totaling 8.58 TB.

With something like the above, I can disable the Windows services that supposedly 'speed up' searches. It's all personal choice; use what works best for you. :) I really don't have anything to do with DEP; I won't make money if you use it, and I won't be sad if you don't.
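A toy version of that single-DB-across-volumes setup, for the sake of argument (Python with SQLite; the schema and function names are mine, not how DEP or any tool in this thread actually stores things):

```python
import os
import sqlite3

def catalog_volume(db_path, label, root):
    """(Re)build the catalog rows for one volume/share in a shared SQLite DB."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS files
                   (volume TEXT, path TEXT, size INTEGER)""")
    con.execute("DELETE FROM files WHERE volume = ?", (label,))
    rows = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            try:
                size = os.path.getsize(full)
            except OSError:
                size = -1  # unreadable entry; keep the name anyway
            rows.append((label, full, size))
    con.executemany("INSERT INTO files VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()

def search(db_path, pattern):
    """Substring search across every cataloged volume, online or not."""
    con = sqlite3.connect(db_path)
    hits = con.execute("SELECT volume, path FROM files WHERE path LIKE ?",
                       ("%" + pattern + "%",)).fetchall()
    con.close()
    return hits
```

The point of the exercise: the search works even for a volume that has since failed or walked off, which is exactly the "knowing what was lost" bonus described above.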
vinifera Posted September 7, 2016

Another thing that always puzzled me... why didn't the devs make some easier way to catalog a DB, and then, for a start, ship the whole OS already indexed within the WIM? Later it would (maybe) be easier to index other things on the fly. Or the indexer should simply have to meet some conditions: for instance, avoid indexing DLLs and similar, just EXEs. And since metadata was always their goal since Longhorn, simply scan the known document types (music, text, movies, images...), while for us pickier people, other file types could go directly via the MFT.
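That split could be sketched as a policy like the one below (Python; the whitelist is an illustrative guess at "known document types", not anything Longhorn or Windows Search actually shipped):

```python
import os

# Illustrative whitelist: EXEs plus "known" document/media types get indexed;
# DLLs and everything else fall through to a direct $MFT-style scan.
INDEXED_EXTENSIONS = {".exe", ".txt", ".doc", ".mp3", ".avi", ".jpg"}

def partition(root):
    """Split files into (indexed, mft_only) buckets by extension."""
    indexed, mft_only = [], []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            ext = os.path.splitext(name)[1].lower()
            (indexed if ext in INDEXED_EXTENSIONS else mft_only).append(full)
    return indexed, mft_only
```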