Everything posted by teqguy

  1. I addressed two topics in my post: one, that multiple identical DLLs are being stored on the drive unnecessarily; and two, that because this is solely your project, you are its sole proprietor and thus the only one actually contributing to it. As far as the system directory is concerned, I did not declare your method invalid... I simply stated that there are more elegant solutions that don't require the DLLs and system executables to be ostracized from the system directory. Off the top of my head, I can think of several ways this could be accomplished (see the sketch after this list for the first):
     1) Using the EWF loader to mount those directories as read-only, unmounting them whenever changes need to be made.
     2) Running Windows in some sort of sandbox (Sandboxie on a larger scale comes to mind).
     3) Having a system profile in which the system and user are allowed administrative access to the Windows partition, and a user profile in which they are allowed only read-only access to the Windows directory, including restricting the user from making registry modifications, and so on.
     Any of these solutions would reduce the amount of overhead while adhering to your original guidelines. Now, as for my second statement... I believe projects accomplish more when the average user is able to contribute to them and/or tailor them to their specific needs. Currently, MicrowinX does not satisfy this approach, because users come to you for help instead of fixing problems on their own. Granted, I know that not everyone is as knowledgeable as we'd all like them to be, but if other users could make customizations more easily, they would be able to help out those who are still in the dark. Look at how far nLite has progressed on this model. And notice that nuhi still retains control over the project... he's just given users the building blocks to make the application their own. This suggestion comes from the observation that this project appears to be weighing you down, and it is one I believe would be in your best interest to consider.
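     A minimal sketch of option 1, assuming the Enhanced Write Filter tools from Windows XP Embedded (ewfmgr.exe) are available — they do not ship with standard XP, so treat this as illustrative only:

         REM Protect the volume so all writes are redirected to the EWF overlay:
         ewfmgr C: -enable
         REM When a legitimate change is needed, flush pending overlay writes to disk:
         ewfmgr C: -commit
         REM Or drop protection entirely while servicing the system:
         ewfmgr C: -disable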
  2. That's a shame, considering that NTFS still manages (albeit in fewer occurrences) to propagate fragmentation. Thus, it remains important to defragment your drive regularly. This comes straight from the horse's mouth:
     "...the NTFS is still subject to fragmentation. Unlike memory, if parts of files are stored in different physical locations on the hard disk, it takes longer to read the file. Unfortunately, Microsoft has not provided any defragmentation software for Windows NT, which is available in abundance for Windows 95/98."
     Source: http://www.microsoft.com/technet/archive/w...n.mspx?mfr=true
     "NTFS Does Get Fragmented. The Windows NTFS File System Driver uses a special file called the Master File Table (MFT) to track all files on that volume. The MFT starts out with some free space to allow new files to be tracked, but on a very busy system it too can run out of space. At this point NTFS extends the MFT itself, creating new stretches of it for new allocations. This situation is precipitated most often by fragmentation in the file system itself, as file system fragments consume entries in the MFT. If these new stretches are not contiguous, the MFT itself becomes fragmented."
     Source: http://www.execsoft.com/whats-new/whitepap...p#_Toc463769973
     And finally, this last one comes from Raxco, makers of PerfectDisk. While I do believe one of their motives for providing such a writeup is to sell a product, much of the information they provide is factual and substantiates my case (careful, PDF): http://www.raxco.com/products/perfectdisk2...gmentation'
     Now, as far as you're concerned, gdogg: as I stressed wholeheartedly in my first post, I am absolutely not trying to rock the boat here. However, the fact that you have not answered a single one of my questions with a dignified response (to which I believe both I and the MSFN community are entitled), and have gone as far as to say I'm simply trying to spread some sort of discontent at either MwinX or you, is simply unacceptable. I would not bother conducting myself in this manner if I were simply looking to bash you or your project. I would instead insult you, call you names, and create numerous threads on various forums about what a fraud you are and how horrible your project is. Do you see any of that? Can you find one Google entry related to such a thing? Furthermore, I would not have the persistence and patience to post the way I do if I did not want to get my opinion across in a manner open for discussion... rather than having someone, such as yourself, try to discredit me simply because I do not share your views on particular matters. To go as far as to threaten me for simply speaking my mind is not at all what this or any forum is about... unless of course the forum is one of fascists. To restate my original point (in the hope that you will actually take the time to read it, instead of glossing over it simply because I'm not in entire agreement with you): I don't disagree that starting from as few system files as possible and adding back as necessary is the right way to go... because it is. My disagreement lies simply in where you add these files back to. I even went as far as to present my own solution to the security issue, but you must've skipped over that part too.
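     For anyone who wants to verify the MFT claim on their own XP machine, the built-in console defragmenter can produce a verbose analysis report without moving any files; the report includes a Master File Table (MFT) fragmentation section:

         REM Analysis only (-a); -v prints the full verbose report.
         defrag C: -a -v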
  3. Naturally, the only way I could ever come to any conclusion is through rigorous testing. This is why I felt the need to provide a commentary. Now don't get me wrong: I'm not trying to discredit your project, just the way you've gone about it. So, in turn, the point of the project is to load every other directory up with identical DLLs, not only creating unnecessary bloat, but additional RAM usage? How might it increase RAM usage, you ask? Well, say you have two applications that need the same DLLs. If applications A and B each load them out of their own directories, the DLLs will be loaded twice, instead of simply being loaded once by the subsystem handler and then accessed in shared memory by both applications. It would be like putting DirectX in every single directory for every game you have. I'll believe that when I see it... try doing actual benchmarks and recording data, rather than making arbitrary conclusions based on what you perceive to be factual and accurate. Now, as for infecting the system with viruses: everyone knows that once you remove LSASS, most viruses that would usually give some indication of infection either don't execute or simply don't pop their nasty little heads out. I don't believe this to be exclusive to your project. But you see, this is inherently why it has already failed. You're in complete control of what, in essence, MicrowinX is and will become. Therefore, users simply have to go with the flow or come up with their own version based on what you have started. Unlike some of the other projects on MSFN, users have very little control over what goes in and what comes out, because the options are limited to the applications that you, and the other people capable of making application patches, use. I consider the term "bloat" to be universal: unnecessary substance. This, to me, includes files that are of use but are proliferated throughout the system. See, again, the problem is that you're not teaching anyone to fish... you're just handing them the fishing pole and assuming they know how to bait the hook and reel something in for you. If you wanted a "ground up" approach, you should have started with a "ground up" approach. Instead, you elect to personally do all of the work... making it your project, instead of an MSFN community project. At least nuhi and some of the other project developers have the humility to accept criticism and even sometimes admit they're wrong. Well, if the shoe fits... I honestly believe that if someone is passionate about something, they will fight tooth and nail to get their point across. I'm giving you that opportunity; the ball is in your court. You act like it's such an inconvenience for someone to read a post or two that isn't full of praise. If you're egotistical enough to honestly believe you've created perfection here, I think you need to re-evaluate why you started this project in the first place. While the others might be valid points, you're barking up the wrong tree when you say you've never had to defragment. Just because you've never done it doesn't mean there isn't fragmentation, nor would I ever recommend that anyone disregard defragmentation, especially with large-capacity drives.
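     To see the shared-DLL point in practice, XP Pro's tasklist can enumerate every running process that currently has a given module mapped — one shared copy serving all of them:

         REM List every process that has this common system DLL loaded.
         tasklist /m comctl32.dll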
  4. Okay... let me get this straight... From your understanding of an application that's "stand alone" (i.e., requiring no external operating system files), you would rather package tens upon hundreds of copies of the same DLLs in multiple folders (which INCREASES bloat) than keep them in the system directory (where they belong)? First off, a stand-alone application simply means that it's portable; it will work on ANY system you run it on. How is such a feat accomplished? One, the application cannot create numerous registry and user-profile entries that need to be accessed (at most, it should create a config file that saves user preferences). Two, the application should not depend on libraries in its own directory, but rather on the libraries that are inherent in every Windows operating system. If you look at applications like uTorrent or Media Player Classic, you'll find this to be the case. Second, I have a better solution that would eliminate all of this additional overhead: change the directory in which DLLs and system executables are stored... and then mount that directory in the EWF loader. Now, as for these claims you're making... they seem to be a very tall order, and I'm seriously doubting them. Sure, you might have lower memory usage and fewer files for viruses (or the user) to fsck up, but lower pings and virus immunity? Do you have any proof, outside of personal testing, to substantiate these claims?
  5. Gdogg, I've been following the MicrowinX project on and off since its conception, and I'd like to address a few concerns. From what I gather, the name of the game is simply stripping out everything and then adding back dependencies as necessary. While I find this tactic fairly promising, it came to my attention that mWinX doesn't add the DLLs back into the system directory, but rather into the individual applications' directories, which seems counterproductive to me. Why have the DLLs in multiple places when you only need them in one? Second, because you have deemed the project completely closed source (despite the fact that the source can be extracted fairly easily), you have chosen to maintain the project entirely on your own. My question to you is: when is enough, enough? If you release patches for every application requested, isn't the project simply going to turn into the "one size fits all" package we know Windows as today? I'd like to stress that I'm not trying to rock the boat here... I just think this project is headed in somewhat the wrong direction and, if left to continue, will end up at a dead end.
  6. Try configuring FFDShow to use another MPEG-1/MPEG-2 codec. To use the default one that comes with Windows, just set it to "Disabled" in FFDShow.
  7. Well, now that we've established that it's not your graphics card, we can conclude that the problem lies in the codec's settings and/or the media player's settings. I'm aware that you're having issues with multiple codecs, but if we can get at least one working properly, we'll be able to narrow the problem down even further. Try popping in a DVD and rendering it with GSpot (http://www.headbands.com/gspot). What does it say your MPEG-2 decoder is? Nvidia's MPEG-2 codec in particular can have scaling issues in certain players (e.g., in MPC the video becomes 9:16 instead of 16:9, so you have to correct it manually).
  8. In which application? PowerDVD or MPC? In MPC, the option for video size is available when you right click inside the video window.
  9. In MPClassic, you have to set the Video Frame to either "Stretch to window" or "Touch frame from Inside/Outside". If that doesn't seem to help, you can use the numeric keypad to resize the video (9 makes it proportionately larger, 5 resets it). I'm not familiar with PowerDVD, but it may well have the same option somewhere.
  10. Actually, the concept of striving for some sort of pinnacle transcends any specific forum (why else do we upgrade?), as it just so happens to be the basic principle of Darwinism. "Dealing with things as they currently are" is accepting that they will always (or at least for a very long, indefinite time) be that way. This is what brings about conformity and makes new concepts and ideas dwindle. To get back on topic, though: the way I "deal" with current software is that I don't use anything that can be deemed "bloatware". IMO, any defragmentation suite that has more lines of code devoted to GUI than to defragmentation is what I would consider bloat. If what I call bloat, you call "features", that's absolutely fine... I was simply trying to introduce an alternative utility, because as this forum has demonstrated, there isn't one piece of software that suits everyone.
  11. I understand where you're coming from. However, accepting things for what they are denies the very essence of humanity. To try to tone this discussion down a little, I'll comment on one line of your post and nothing more: Well, what if you do, in fact, have tasks that need a tremendous amount of memory? As I'm sure you're aware, encoding, rendering, and games aren't exactly light on resources. There is never an instance where I can't find RAM to devote to a process. For example, during encoding tasks, I use my memory as a ramdrive, which is immensely faster than a hard drive.
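     A rough sketch of the ramdrive trick, assuming a ramdisk is already mounted as R: (the drive letter and encoder path here are placeholders):

         REM Point the temp variables at the ramdisk for this session only,
         REM so the encoder's scratch files live in RAM instead of on disk.
         mkdir R:\Temp
         set TEMP=R:\Temp
         set TMP=R:\Temp
         start "" "C:\Program Files\SomeEncoder\encoder.exe"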
  12. That works too. It might be nice to take it a step further and write a batch script that automatically kills them when the application closes, but I guess manually closing them would suffice. Menion, a solution similar to that has already been mentioned, but I'm interested in dynamic partitions. Enlighten us please.
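     For what it's worth, the batch script idea is only a few lines (the application and helper names here are hypothetical):

         REM Launch the application and block until it exits...
         start /wait "" "C:\Program Files\SomeApp\SomeApp.exe"
         REM ...then kill any helper process it leaves running.
         taskkill /f /im SomeAppHelper.exe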
  13. So, I take it that every time you want to run PerfectDisk, you have to manually re-enable the scheduler and engine services that come with it? If so, we're getting down to the meat and potatoes (don't worry, no food analogies this time) of my argument. Having to enable these services every time I wanted to defrag is what killed the whole experience of using defragmentation "suites" for me, especially considering I get the same results with contig.
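     If you did want to keep the suite, a wrapper along these lines would at least automate the toggling (the service names below are placeholders — check the actual names in services.msc):

         REM Bring the defragmenter's services up for this session only...
         net start "PDEngine"
         net start "PDScheduler"
         REM ...run the suite and wait for it to close...
         start /wait "" "C:\Program Files\PerfectDisk\PerfectDisk.exe"
         REM ...then shut the services back down.
         net stop "PDScheduler"
         net stop "PDEngine"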
  14. How so? Assuming you're talking about his Firefox bashing, Mastertech's argument is supported by one silly website, and while that website does have a few decent points, I don't believe it's worth debating. While I find hardware debating (or flaming, if you will) completely pointless, I understand that it is partially valid, in that the only leg those people have to stand on is that there is some degree of pressure on the decision of what hardware to buy. Software, on the other hand, is an entirely different animal... it's fairly simple to develop a preference, because you can always try before you buy. However, to summarize my opinion on both: current software and hardware do more butting of heads than working together. The most feasible way to relieve this is to get better hardware, get better software, or, when time and money permit, both. Both Zxian and you commented that resource usage is a non-issue because you have large amounts of RAM. My question to both of you is this: would you still need that much memory if hard drives could keep pace with system memory? What about if applications were more efficient? Furthermore, if you don't absolutely need a certain application running in the background at all times, why keep it there? Windows' scheduler does a decent job of letting you perform all of your system maintenance at a desired time. This would give you the automation you find useful, while not needlessly sucking up resources during downtime.
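     For reference, XP's schtasks can set up exactly that kind of downtime maintenance (on XP the start time takes HH:MM:SS format):

         REM Run the built-in defragmenter every Sunday at 3 AM.
         schtasks /create /tn "Weekly Defrag" /tr "defrag.exe C:" /sc weekly /d SUN /st 03:00:00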
  15. I apologize for not being able to reply earlier, but the place I was staying at over the weekend had done a major overhaul on their LAN... the result being no connection on every floor but the first. To answer your question: contig is merely a console-based application that accepts one argument, namely a file. You are correct that without additional switches, contig will only defrag the file you tell it to. However, contig features a switch that performs recursion on a subdirectory, so something like "contig -s C:" would effect the defragmentation of your entire drive (see the sketch at the end of this post). Now, to turn contig into a full-fledged defragmentation tool, all you need is a shell that will pass it these arguments. This is where Power Defragmenter GUI comes in: excessive-software.eu.tt. With that and contig in the same directory, open Power Defragmenter, then select "Power Defrag" mode. While Windows Defragmenter itself does not run in the background, suffice it to say that, in essence, the portion of code that does run acts both complementary and partially supplementary to other defragmentation tools. It was my mistake to imply that it did anything beyond that. However, there aren't any other defragmentation tools that handle Windows' prefetching, so I wouldn't discredit that portion of the tool. Forgive me for summarizing, but I'd like to reaffirm that nowhere was it mentioned that Windows Defragmenter performs well or is even adequate for the job. It was simply used as a base of reference for comparison with other tools. To make my opinion clear: I do agree that Windows Defragmenter is another of Microsoft's sorry attempts at a "one OS fits all" solution, but I don't turn my nose up at the fact that it is a defragmentation tool nonetheless. The problem is that these tools don't know whether or not a file is fragmented until they actually do a pass over it. Granted, I don't believe this to be too taxing, considering my drive consists of hundreds of files that defragmentation tools absolutely hate, namely video files and large archives. However, I was simply making this point for the sake of argument (note that I called it a quandary, not a passage in my personal bible). To address this point more thoroughly, though: I don't believe current defragmentation tools are doing all they should be. Specifically, they should be taking advantage of Windows' built-in Indexing Service to find out whether a file is fragmented, and then proceed accordingly. The same can't be said for anti-virus tools, though, as I find it hard to trust Windows to aid in its own security. Let me start by asking you to re-evaluate who exactly our target audience is, because I honestly believe we aren't reaching the "average joe" via the MSFN forums. Most of the people on this forum are knowledgeable and aware of their computing habits. They also seem to realize that working with computers is a give-and-take relationship; i.e., what you put into it is what you get back from it. Now, about this so-called "average joe"... I am under the impression that he would enjoy having his life completely automated. I find this greatly disturbing and completely unacceptable. The modern average joe has grown to depend on automation so much that he has become an overzealous lemming (not to bring in politics, but this is especially true in the US), adept at clicking "OK" on every message box that pops up and opening e-mail attachments from unknown addresses.
      Consider the worst of the recent viruses and spyware, most of which could've been averted had said "average joe" not accepted what was default or automated, and actually bothered to pay attention to what he was doing before he did it. And you think he should have more automation? If that's the case, he might as well just have one of those little birds that works on buoyancy; it worked for Homer. (Note: I am in no way trying to be contentious in order to offend anyone, but I honestly believe that on a forum where people come to modify Windows so that it's sleek, smaller, and faster, they would appreciate having as few background tasks as possible in order to ensure optimal performance.) While EXT2 isn't as efficient as Reiser, you might want to check out this kernel-mode driver for Windows, as it offers full EXT2 support in Windows: fs-driver.org
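     To make the contig invocations above concrete (a sketch based on the switches the post describes; -a analyzes, -s recurses into subdirectories, -v is verbose):

         REM Report fragmentation across the drive without changing anything:
         contig -a -s C:\*.*
         REM Defragment everything on the drive, file by file:
         contig -v -s C:\*.*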
  16. Actually, I have PerfectDisk, Diskeeper, and O&O installed... and out of the three, the only one that sparked any interest was Diskeeper. Why? Less resource usage. However, in that light, they all pale in comparison to contig. Furthermore, I don't believe you have the right to make conjectures about me based solely on my opinions... that's called prejudice.
  17. My best partitioning experiment took it a step further... I had the OS spread over two drives (the system and system32 folders occupied one partition, and the rest of the operating system files occupied the other), a partition for Documents and Settings, one for applications, one for temporary files, and one solely for the pagefile. The result was undeniably worth the work, with everything staying neat and tidy, although it definitely isn't practical for most. I was unable to determine whether Windows actually booted faster, but it certainly seemed like it. My next endeavor will also include a high-speed CF card set to read-only mode, which I'll put the system and system32 folders on. This should not only reduce boot times, but also protect the system against pesky spyware and viruses. I checked out that defragmentation tool, but was immediately dissuaded from it, simply because it's not freeware.
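     For the curious, a layout like that can be scripted with XP's diskpart; the sizes and letters below are purely illustrative, and this would be run from another environment during a fresh setup, not against a live system:

         rem Save as layout.txt and run: diskpart /s layout.txt
         select disk 0
         rem Partition for the system and system32 folders
         create partition primary size=4096
         rem Partition for the remaining operating system files
         create partition primary size=8192
         rem Partition for Documents and Settings
         create partition primary size=10240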
  18. What I mean is that regardless of which tool you use, defragmentation always has the same result: the files are no longer fragmented. As far as that result is concerned, the software and hardware are served perfectly well by a tool like Windows Defragmenter. Now, as far as "smart sorting" is concerned, I find it to be a marketing gimmick. Why? Well, if you partition your drive so that the operating system (and only the operating system) occupies the first partition, you not only accomplish what sorting aims to accomplish, but you also ensure that your operating system remains fairly unfragmented, thereby eliminating the need for such a tool in the first place. I could enlighten you on similar partitioning strategies that take this a step further, but I believe this isn't the thread for that.
  19. Jeremy, if you were personally offended by my opinion, I apologize. However, I must reiterate my point: Windows does not differentiate between defragmentation tools, regardless of the user's preference. This is why fancy GUIs and special features are ultimately negligible.
  20. Okay, I stand corrected; I'll give credit where credit is due. Allow me to restate: my argument is not that application X is any more efficient or effective than application Y, because the fact of the matter is, unless X or Y is less effective or efficient than our base of reference (Windows Defragmenter), they are, in essence, as effective and efficient as they're going to get. As far as I'm concerned, we're comparing apples to apples. The point I'm stressing, however, is that it's frivolous to eat the much larger apple Y when the smaller apple X will satisfy your hunger just as easily, because what isn't used isn't necessary, which means it's simply wasteful.
  21. It's become my understanding that, for the typical user, an application with tons of features (most of which go unused) usually works against usability. How often does the average joe actually need to connect to a remote computer and defragment its drives using a locally installed application? The problem, however, is that uninstalling those unused components is not an option for those who want to use the application. So, if you do want to use it, you're left either keeping the dependencies and drivers loaded, or activating them manually every time you want to run it. Furthermore, looking at your sig, the applications you list promote the "modular" or "tinyapps" model of computing, so I don't understand how you're able to appreciate/tolerate applications like O&O, PD, and, I'm assuming, Norton or McAfee as well.
  22. The benefits of Merom will be those of waiting 4-8 months for anything technology-related; that is, the time waited is exponentially proportional to the "wow factor" of the new technology. In terms of current availability, however, if you were to get a Core Duo based system, your best bet would be a MacBook Pro. As far as I'm aware, offerings from other companies still suffer diminished battery life, as I have yet to hear of Microsoft releasing a fix for that USB 2.0 issue.
  23. Yes, but you see, this is where we run into a quandary... should a defragmenter occupy enough clusters that it needs to perform gratuitous maintenance on its own files? This makes sense with antivirus software, because it allows it to be self-healing... but I don't see the point with defragmentation tools. Provided you use my reference to "time" as a relative unit of measure, I can see where you're coming from. However, in order for time to be relative, it must be quantifiable not only as duration, but also in human value. I can't speak for everyone, so while a second or two (per cluster, mind you) might not seem significant in duration or value, for the sake of argument let's treat time as a quasi-unit of measure: if total time equals time per megabyte multiplied by the amount of space the files occupy, and there are 10-20 more files on a partition with the aforementioned "defragmentation suites" installed, then naturally total time will increase in some proportion to the additional space these tools occupy. Granted, because these defragmentation tools are faster than our base of reference (Windows Defragmenter), this all becomes irrelevant when the application is actually running. In retrospect, however, larger defragmentation applications do add unnecessary time to the equation, which is simply what my argument was about. If the end result is the same no matter what tool you use, how do you gauge efficiency? If it's time to completion, then I hate to break it to you, but PerfectDisk and O&O only manage to shave off a quarter of the time it takes Windows' defragmentation tool to complete; I've observed contig's two-pass defragmentation completing in almost half the time. You also fail to mention that both PerfectDisk and O&O mandate that their background services be running just to access the applications' basic defragmentation functionality. To me, this seems unnecessary, considering Windows XP already runs its defragmentation tool in the background when your system is idle. (Note: if you're using a defragmentation mode that sorts the files on the drive in order of precedence [i.e., the Windows directory first, applications next, etc.], then I would disable idle defragmentation, as Windows tends to ruin this type of sorting.) I don't know about you, but the only time I run my defragmentation tool is when I decide it's time to defragment the system (usually after installations/uninstallations, cleaning temp folders, or torrent download completions). As far as file size is concerned, you shouldn't be concerned solely with how much space these files occupy, but also with where they are. I'm almost certain that both PD and O&O install additional DLLs in the Windows directory, which tends to make the directory ugly when you have numerous other applications doing exactly the same thing. Furthermore, both defragmentation tools install hundreds of registry entries, which, again, is totally unnecessary for a tool that's supposed to be promoting cluster organization.
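     To put rough, made-up numbers on that quasi-unit: if a pass costs, say, 1 second per 100 MB examined, then a suite that adds 20 files totalling 50 MB contributes an extra half second per pass. Trivial in duration, perhaps, but nonzero, which is all the argument requires.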
  24. Not to burst anyone's bubble, but in the end, the solitary goal of defragmentation is and always will be the same... and as long as we all arrive at the same conclusion, who could honestly care how the application derives that conclusion? Let's look at it mathematically: 4(2 + 6) will always equal 32, whether we distribute the 4 into the parentheses and then add, or add what's in the parentheses and then multiply by 4. Now, if we accept that no matter what application we use, we will always end up with a drive free of fragmentation, we can deduce that:
     - Applications X, Y, and Z essentially perform exactly the same function (albeit faster) as the built-in Windows defragmenter
     - Spending money on said applications is as frivolous as spending money on anti-virus "suites" (definitely the correct nomenclature, considering how roomy they become on your hard drive and how well they cozy up to Windows) such as Norton, when there are free applications like ClamWin and AntiVir that won't gobble your memory
     - If application X, Y, or Z occupies more than a few megabytes and is spread over numerous files and directories, the likelihood of X, Y, or Z becoming fragmented itself increases tenfold. Thus we run into a redundancy issue: if the application is defragmenting itself (specifically, the fragmented clusters it occupies on the drive), it's essentially wasting time... time you originally thought you were saving by buying the application in the first place.
     So, in conclusion, the best defragmentation application is one that:
     - Can perform the task efficiently and effectively
     - Is priced relative to the cost of the tool included with Windows
     - Occupies as few clusters as possible
     In my experience, the tool that best fits this description is none other than contig, coupled with the Power Defragmenter GUI (www.excessive-software.eu.tt). Sure, the GUI might not be very attractive (it's simply a shell for contig), but because it does two-pass defragmentation, it performs virtually the same task as PerfectDisk or O&O in a relatively short(er) amount of time, all while keeping a profile under 2MB.
  25. Actually, there are four components to the DirectX API: video, audio, input, and networking. In addition, each of these components contains sub-components. For example, the video portion alone has three: DirectDraw (2D graphics, primarily used where GDI is just too slow for rendering), Direct3D (3D graphics, obviously), and DirectShow (the successor to Video for Windows, used by codecs such as DivX/Xvid, WMV, and MPEG). I'm also aware that most audio players, such as Windows Media Player and Winamp, are mapped through DirectSound by default, which would mean that MP3 support will also be unavailable unless your player is configured otherwise. So, while updating or even possessing DirectX might not appear to have any tangible benefits, the consequences of going without it are impossible to predict... unless of course your computer is used strictly for Office and/or light web browsing (any and all multimedia will be unavailable). Furthermore, like many hotfixes and service packs released for Windows, most of the new files simply overwrite pre-existing files, so the space consumed is fairly negligible.
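     Incidentally, the installed DirectX version is easy to check from the console, since XP records it in the registry:

         REM On a DirectX 9.0c system this typically returns 4.09.00.0904.
         reg query "HKLM\SOFTWARE\Microsoft\DirectX" /v Version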