
teqguy
-
I addressed two topics in my post: one, that multiple identical DLLs are being stored on the drive unnecessarily; and two, that because this is solely your project, you are its sole proprietor and thus the only one actually making any contribution to it.

As far as the system directory is concerned, I did not declare your method invalid... I simply stated that there are more elegant solutions that don't require the DLLs and system executables to be ostracized from the system directory. Off the top of my head, I can think of several ways this could be accomplished:

1) Using the EWF loader to mount those directories as read-only, unmounting them whenever changes need to be made.

2) Running Windows in some sort of sandbox (Sandboxie on a larger scale comes to mind).

3) Having a system profile in which the system and user are allowed administrative access to the Windows partition, and a user partition in which the system and user are allowed only read-only access to the Windows directory, including restricting the user from making registry modifications, and so on.

Any of these solutions would reduce overhead while adhering to your original guidelines.

Now, as far as my second statement is concerned... I believe that projects accomplish more when the average user is able to contribute to them and/or tailor them to their specific needs. Currently, MicrowinX does not satisfy this ideal, because users come to you for help instead of fixing problems on their own. Granted, I know that not everyone is as knowledgeable as we'd all like them to be, but if other users were able to make customizations more easily, they would be able to help out those who are still in the dark. Look at how far nLite has progressed on this model. And notice that nuhi still retains control over the project... he's just given users the building blocks to make the application their own.
This suggestion comes from the observation that this project appears to be weighing you down, and is one that I believe would be in your best interest to consider.
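Returning to option 1 above: for anyone curious what driving EWF from a script might look like, here is a minimal sketch. The `ewfmgr` volume syntax and the `-enable`/`-commitanddisable` switches are the real Windows Embedded CLI, but the wrapper itself (its name and the injectable `runner`) is purely illustrative, so the sketch can be exercised without EWF installed:

```python
import subprocess

def ewf(volume, action, runner=subprocess.run):
    """Toggle the Enhanced Write Filter on a volume: 'protect' enables
    the write-filter overlay; 'maintain' commits pending changes and
    drops back to pass-through so updates can be made."""
    switches = {
        "protect": "-enable",
        "maintain": "-commitanddisable",
    }
    return runner(["ewfmgr", volume, switches[action]])
```

On a real EWF-enabled system you would call `ewf("c:", "maintain")` before servicing the system directory and `ewf("c:", "protect")` afterwards.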
-
That's a shame, considering that NTFS still manages (albeit in fewer occurrences) to propagate fragmentation. Thus, it remains important that you defragment your drive regularly. This comes straight from the horse's mouth:

"...the NTFS is still subject to fragmentation. Unlike memory, if parts of files are stored in different physical locations on the hard disk, it takes longer to read the file. Unfortunately, Microsoft has not provided any defragmentation software for Windows NT, which is available in abundance for Windows 95/98."
Source: http://www.microsoft.com/technet/archive/w...n.mspx?mfr=true

"NTFS Does Get Fragmented. The Windows NTFS File System Driver uses a special file called the Master File Table (MFT) to track all files on that volume. The MFT starts out with some free space to allow new files to be tracked, but on a very busy system it too can run out of space. At this point NTFS extends the MFT itself, creating new stretches of it for new allocations. This situation is precipitated most often by fragmentation in the file system itself, as file system fragments consume entries in the MFT. If these new stretches are not contiguous, the MFT itself becomes fragmented."
Source: http://www.execsoft.com/whats-new/whitepap...p#_Toc463769973

And finally, this last one comes from Raxco, makers of PerfectDisk. While I do believe one of their motives for providing such a writeup is to sell a product, a lot of the information they provide is factual and substantiates my case (careful, PDF): http://www.raxco.com/products/perfectdisk2...gmentation'

Now, as far as you're concerned, gdogg: as I stressed wholeheartedly in my first post, I am absolutely not trying to rock the boat here.
However, the fact alone that you have not answered a single one of my questions with a dignified response (which I believe both I and the MSFN community are entitled to), and have in addition gone so far as to say I'm simply trying to spread some sort of discontent with either MwinX or you, is simply unacceptable for someone of your recent regard. I would not bother conducting myself in such a manner if I were simply looking to bash you or your project. I would, instead, insult you, proceed to call you names, and create numerous threads on various forums about how much of a fraud you are and how horrible your project is. Do you see any of that? Are you able to find one Google entry related to such a thing?

Furthermore, I would not have the persistence and patience to post in the fashion that I do if I did not want to get my opinion across in a manner open for discussion... rather than having someone, such as yourself, try to discredit me simply on the basis that I do not share your views on particular matters. To go as far as to threaten me for simply speaking my mind is not at all what this or any forum is about... unless, of course, the forum is one of fascists.

To restate my original point (in the hope that you will actually take the time to read it, instead of glossing over it simply because I'm not in entire agreement with you): I don't disagree that starting from as few system files as possible and adding back as necessary is the right way to go... because it is. My disagreement lies simply in where you add these files back to. I even went as far as to present my own solution to the security issue, but you must've skipped over that part too.
-
Naturally, the only way I could ever possibly come to any conclusion is through rigorous testing. This is why I felt the need to provide a commentary. Now don't get me wrong: I'm not trying to discredit your project, just the way you've gone about it.

So, in turn, the point of the project is to load every other directory up with identical DLLs, not only creating unnecessary bloat, but also additional RAM usage? How might it increase RAM usage, you might ask? Well, say you have two applications that need the same DLLs. If applications A and B are each working out of their own directory, the DLLs will be loaded twice, instead of simply being loaded once by the subsystem handler and then accessed in shared memory by both applications. It would be like putting DirectX in every single directory for every game you have.

I'll believe that when I see it... try doing actual benchmarks and recording data, rather than just drawing arbitrary conclusions based on what you perceive to be factual and accurate.

Now, as for infecting the system with viruses: everyone knows that once you remove LSASS, most viruses that would usually give some indication of infection either don't execute or simply don't pop their nasty little heads out. I don't believe this to be exclusive to your project.

But you see, this is inherently why it has already failed. You're in complete control of what, in essence, MicrowinX is and will become. Therefore, users simply have to go with the flow or come up with their own version based upon what you have started. Unlike some of the other projects on MSFN, users have very little control over what goes in and what comes out, because the options are limited to the applications that you and the other people capable of making application patches use.

I believe the term "bloat" to be universal: unnecessary substance. This, to me, includes files that are of use, but are proliferated throughout the system.
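To put rough numbers on the duplicate-DLL point, here is a back-of-the-envelope sketch (the app count and DLL size are made-up illustrations, not measurements from MicrowinX):

```python
def duplicate_dll_overhead(num_apps, dll_size_mb):
    """Extra footprint (in MB) when each application bundles its own
    private copy of a DLL, versus one copy shared by all of them."""
    shared = dll_size_mb                 # mapped once, shared by everyone
    duplicated = num_apps * dll_size_mb  # one private copy per app folder
    return duplicated - shared
```

For example, five games each bundling their own 4 MB copy of the same library carry 16 MB of duplication over the shared layout, and the gap grows linearly with every application added.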
See, again, the problem is that you're not teaching anyone to fish... you're just handing them the fishing pole and assuming they know how to bait their hook and reel something in for you. If you wanted a "ground up" approach, you should have started with a "ground up" approach. Instead, you elect to personally do all of the work... making it your project, instead of an MSFN community project. At least nuhi and some of the other project developers have the humility to accept criticism and even, sometimes, to admit they're wrong.

Well, if the shoe fits... I honestly believe that if someone is passionate about something, they will fight tooth and nail to get their point across. I'm giving you that opportunity. The ball is in your court. You act like it's such an inconvenience for someone to read a post or two that isn't full of praise. If you're egotistical enough to have ever honestly believed you've created perfection here, I think you need to re-evaluate why you even started this project in the first place.

While the others might be valid points, you're barking up the wrong tree when you say you've never had to defragment. Just because you've never done it doesn't mean there isn't fragmentation, nor would I ever recommend that anyone disregard defragmentation, especially with large-capacity drives.
-
Okay... let me get this straight... From your understanding of an application that's "stand alone" (i.e., requiring no external operating system files), you would rather package tens upon hundreds of copies of the same DLLs in multiple folders (which INCREASES bloat) than keep them in the system directory (where they belong)?

First off, a stand-alone application simply means that it's portable; it will work on ANY system you run it on. How is such a feat accomplished? One, the application cannot create numerous registry and user profile entries that need to be accessed (at the very most, it should create a config file that saves user preferences). Two, the application should never rely on libraries bundled in its own directory, but rather on the libraries that are inherent in every Windows operating system. If you look at applications like uTorrent or Media Player Classic, you'll find that this is the case.

Second, I have a better solution that will eliminate all of this additional overhead: change the directory in which DLLs and system executables are stored... and then mount that directory in the EWF loader.

Now, as for these claims you're making... they seem to be a very tall order, and I seriously doubt them. Sure, you might have lower memory usage and fewer files for viruses (or the user) to fsck up, but lower pings and virus immunity? Do you have any proof, outside of personal testing, to substantiate these claims?
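As a sketch of that "config file instead of registry" rule, something like the following is all a portable application needs (the file name, section, and keys here are hypothetical):

```python
import configparser

def load_prefs(ini_path):
    """Portable-app pattern: keep user preferences in an .ini beside the
    program rather than in the registry or the user profile. Creates the
    file with defaults on first run."""
    cfg = configparser.ConfigParser()
    cfg.read(ini_path)
    if "prefs" not in cfg:
        cfg["prefs"] = {"theme": "default"}  # hypothetical default
        with open(ini_path, "w") as f:
            cfg.write(f)
    return dict(cfg["prefs"])
```

Delete the .ini and the application is effectively "uninstalled"; copy the folder and the application moves with its settings. That is exactly the portability uTorrent and Media Player Classic demonstrate.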
-
Gdogg, I've been following the MicrowinX project off and on since its conception and have wanted to address a few concerns. From what I gather, the name of the game is simply stripping out everything and then adding back dependencies as necessary. While I find this tactic fairly promising, it came to my attention that mWinX doesn't add the DLLs back into the system directory, but rather into the individual applications' directories, which to me seems counterproductive. Why have the DLLs in multiple places when you only need them in one?

Second, because you have deemed the project completely closed source (despite the fact that the source can be extracted fairly easily), you have chosen to maintain the project entirely on your own. My question to you is: when is enough, enough? If you release patches for every application requested, isn't the project simply going to turn into the "one size fits all" package we know Windows as today?

I'd like to stress that I'm not trying to rock the boat here... I just think this project is headed in somewhat of the wrong direction and, if left to continue, will end up at a dead end.
-
Try configuring FFDShow to use another MPEG1 and 2 codec. To use the default one that comes with Windows, just set it to "Disabled" in FFDShow.
-
Well, now that we've established that it's not your graphics card, we can conclude that the problem lies in the codec's settings and/or the media player's settings. I'm aware that you're having issues with multiple codecs, but if we can get at least one working properly, we'll be able to narrow the problem down even further. Try popping in a DVD and rendering it with GSpot (http://www.headbands.com/gspot). What does it say your MPEG-2 decoder is? Nvidia's MPEG-2 codec in particular is one that can have scaling issues in certain players (e.g., in MPC the video becomes 9:16 instead of 16:9, so you have to correct it manually).
-
In which application? PowerDVD or MPC? In MPC, the option for video size is available when you right click inside the video window.
-
In MPClassic, you have to set the Video Frame to either "Stretch to window" or "Touch frame from Inside/Outside". If that doesn't help, you can use the numeric keypad to resize the video (9 makes it proportionately larger, 5 resets it). I'm not familiar with PowerDVD, but it may well have the same option somewhere.
-
Actually, the concept of striving for some sort of pinnacle transcends any specific forum (why else do we upgrade?), as it just so happens to be the basic principle of Darwinism. "Dealing with things as they currently are" means accepting that they will always (or at least for a very long, indefinite amount of time) be that way. This is what breeds conformity and makes new concepts and ideas dwindle.

To get back on topic, though: the way I "deal" with current software is that I don't use anything that can be deemed "bloatware". IMO, any defragmentation suite that has more lines of code devoted to GUI than to defragmentation is what I would consider bloat. If what I call bloat, you call "features", that's absolutely fine... I was simply trying to introduce an alternative utility, because, as this forum has demonstrated, there isn't software that suits everyone.
-
I understand where you're coming from. However, accepting things for what they are denies the very essence of humanity. To try to tone this discussion down a little, I'll comment on one line of your post and nothing more: Well, what if you do, in fact, have tasks that need a tremendous amount of memory? As I'm sure you're aware, encoding, rendering, and games aren't exactly light on resources. There is never an instance where I can't find RAM to devote to a process. For example, during encoding tasks, I use my memory as a ramdrive, which is immensely faster than a hard drive.
-
That works too. It might be nice to take it a step further and write a batch script that automatically kills them when the application closes, but I guess manually closing them would suffice. Menion, a solution similar to that has already been mentioned, but I'm interested in dynamic partitions. Enlighten us please.
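A minimal sketch of that batch-script idea: launch the application, block until it exits, then fire off the kill commands (on Windows those would be something like `taskkill /IM PDSched.exe /F`; the process names are hypothetical):

```python
import subprocess

def run_then_cleanup(app_cmd, cleanup_cmds):
    """Launch the application and wait for the user to close it, then
    run each cleanup command to kill leftover helper processes."""
    rc = subprocess.run(app_cmd).returncode
    for cmd in cleanup_cmds:
        subprocess.run(cmd)
    return rc
```

The same structure works as a two-line .bat file (`start /wait` the application, then the `taskkill` lines), which keeps the helper services dead for the 99% of the time the application isn't running.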
-
So, I take it that every time you want to run PerfectDisk, you have to manually re-enable the scheduler and engine services that come with it? If so, we're getting down to the meat and potatoes (don't worry, no food analogies this time) of my argument. Having to enable these services every time I wanted to defrag is what killed the whole experience of using defragmentation "suites" for me, especially considering I get the same results with contig.
-
How so? Assuming you're talking about his Firefox bashing, Mastertech's argument is supported by one silly website, and while that website does have a few decent points, I don't believe it's worth debating. While I find hardware debating (or flaming, if you will) completely pointless, I understand that it is partially valid, in that the only leg those people have to stand on is that there is some degree of pressure on the decision of what hardware to buy. Software, on the other hand, is an entirely different animal... it's fairly simple to develop a preference, because you can always try before you buy. To summarize my opinion on both: current software and hardware do more butting of heads than working together. The most feasible way to relieve this is to get better hardware, get better software, or, when time and money permit, both.

Both Zxian and you commented that resource usage is a non-issue because you have large amounts of RAM. My question to both of you is this: would you still need that much memory if hard drives could keep pace with system memory? What if applications were more efficient? Furthermore, if you don't absolutely need a certain application running in the background at all times, why keep it there? Windows' scheduler does a decent job of letting you perform all of your system maintenance at a desired time. This would give you the automation you find useful, while not needlessly sucking up resources during downtime.
-
I apologize for not being able to reply earlier; the place I was staying at for the weekend had done a major overhaul on their LAN... the result being no connection on every floor but the first.

To answer your question: contig is merely a console-based application that accepts one argument, namely a file. You are correct that, without additional switches, contig will only defrag the file you tell it to. However, contig features a switch that performs recursion on a subdirectory, so typing "contig -s C:" would elicit the defragmentation of your entire hard drive. Now, to turn contig into a full-fledged defragmentation tool, all you need is a shell that will pass it these arguments. This is where Power Defragmenter GUI comes in: excessive-software.eu.tt. With that and contig in the same directory, open Power Defragmenter, then select "Power Defrag" mode.

While Windows Defragmenter itself does not run in the background, suffice it to say that the portion of code that does run acts as both a complement and a partial supplement to other defragmentation tools. It was my mistake to imply that it did anything beyond that. However, there aren't any other defragmentation tools that handle Windows' prefetching, so I wouldn't discredit that portion of the tool. Forgive me for summarizing, but I'd like to reaffirm that nowhere was it mentioned that Windows Defragmenter performs well or is even adequate for the job. It was simply used as a base of reference for comparison with other tools. To make my opinion clear: I do agree that Windows Defragmenter is another one of Microsoft's sorry attempts to provide a "one OS fits all" solution, but I don't turn my nose up at the fact that it is a defragmentation tool nonetheless.

The problem is that these tools don't know whether or not a file is fragmented until they actually do a pass over it.
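The recursion behind `contig -s` amounts to nothing more than a directory walk; here is a sketch of just that piece (the per-file defragmentation itself is contig's job, so this only collects the targets that would be handed to it):

```python
import os

def collect_targets(root):
    """Enumerate every file under `root`, the way `contig -s <dir>`
    recurses through subdirectories before defragmenting each file."""
    targets = []
    for dirpath, _subdirs, filenames in os.walk(root):
        for name in filenames:
            targets.append(os.path.join(dirpath, name))
    return targets
```

A shell like Power Defragmenter is essentially doing this walk (or letting contig's own `-s` do it) and feeding each path to `contig <file>`.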
Granted, I don't believe such a scanning pass to be too taxing, considering my drive consists of hundreds of files defragmentation tools absolutely hate: namely, video files and large archives. However, I was simply making this point for the sake of argument (note that I called it a quandary, not a passage in my personal bible). To address the point more thoroughly, though: I don't believe current defragmentation tools are doing all they should be. Specifically, they should be taking advantage of Windows' built-in Indexing Service to find out whether a file is fragmented, and then proceed accordingly. The same can't be said for anti-virus tools, though, as I find it hard to trust Windows to aid in its own security.

Let me start by asking you to re-evaluate who exactly our target audience is, because I honestly believe we aren't reaching the "average joe" via the MSFN forums. Most of the people on this forum are knowledgeable and aware of their computing habits. They also seem to realize that working with computers is a give-and-take relationship; i.e., what you put into it is what you get back from it.

Now, about the so-called "average joe"... I am under the impression that they would enjoy having their life completely automated. I find this greatly disturbing and completely unacceptable. The modern average joe has grown to depend on such automation so much that they've become overzealous lemmings (not to bring in politics, but this is especially true in the US), adept at clicking "OK" on every message box that pops up and opening e-mail attachments from unknown addresses. Judging by the worst of the past few virus and spyware outbreaks, most could've been averted had said "average joe" not accepted what was default or automated and actually bothered to pay attention to what they were doing before they did it. And you think they should have more automation? If that's the case, they might as well just have one of those little birds that works on buoyancy.
It worked for Homer.

(Note: I am in no way trying to stir contention or offend anyone, but I honestly believe that on a forum where people come to modify Windows so that it's sleeker, smaller, and faster, they would appreciate having as few background tasks as possible in order to ensure optimal performance.)

While EXT2 isn't as efficient as Reiser, you might want to check out this kernel-mode driver for Windows, as it offers full support for EXT2 in Windows: fs-driver.org