Everything posted by CoffeeFiend

  1. That's not its purpose; there are other tools for that (Cacti, Munin, Nagios, etc.). Nothing can by itself look beyond that. If you want to peek at traffic from elsewhere, you can use a SPAN port on fancy Cisco switches or similar, or you can take your laptop there, preferably with a network tap.
  2. Wireshark is great but you do have to know what you're looking at and networking in general. There's simply no way around that.
  3. That's quite a fancy name for a broken link
  4. It seemed like the only real option indeed. And yeah, it's not really hard to do but it kinda sucks having to deploy it. Not that I currently have a use for it but the source would be great to have for sure.
  5. So true. There are loads of them for sure, and tons of newer "simple" games are a lot of fun (I've wasted way too much time on Plants vs. Zombies lately). Yeah, I know. I just used it as some kind of reference point (not necessarily the fastest); I had to pick something. It's a fairly well-known card with decent gaming performance that was/is within the reach of even most non-hardcore gamers (i.e. a fairly decent gaming video card still), and it's 12x faster than what he has. If I'd picked a 5770 it would probably be more like 15x. I'm just showing how far behind it is. I probably wouldn't get a 4850 either, not that I plan on spending $150 on a video card anytime soon (my old passively cooled 4670 is still overkill for my needs, although even a 3200 mostly fits that bill too).
  6. That's always been a sore spot when it comes to the WSH. The include system is t3h suck (and the WSF format too, to some extent). You'd think they'd have worked on that a bit in the last few years. Code blocks from different includes just don't see each other, so the only way to do it would be the whole open the file + write the text + close the file thing for every line you have to write, and that often means a lot of unnecessary round-trips across the network (slow). That pretty much forced me to do the logging "in-script" all the time. So my own code reuse/standardization strategy has mostly been using standard templates as a starting point for scripts (it's a time saver anyway). The only real alternative I can think of at the moment would be a custom COM component for logging, but that sucks in other ways (like having to be installed/registered on every PC), or perhaps not using VBScript/JScript anymore (e.g. PowerShell). Then again, nothing's perfect:
     • batch files are the worst thing out there in just about every possible way I can think of (yuck), save for the utmost and absolute simplest jobs, but they just won't go away
     • WSH isn't really being improved much, although the techs it uses (like WMI) are, so it's still improving somehow
     • VBScript has the ugly (IMO) VB syntax and craptastic error handling, but it's within the reach of pretty much anyone
     • JScript has a better syntax (again, IMO) and decent error handling, but very few admin types are into that (being the only one writing JScript in a shop sucks)
     • PowerShell is great and groundbreaking in a lot of ways, but some parts of it suck too, like some formatting methods that truncate columns
     • using a "standard" programming language for scripting tasks gives you the most power and flexibility, but it just takes way too long, it has to be compiled, sometimes security policies get in the way, etc. -- so while this finally gives you the control you want, it brings in a lot of unnecessary grief with it
     • other scripting languages (like AutoIt, ActivePerl, etc.) need 3rd-party components installed (licensed and deployed), you can't necessarily rely on their support, most admin types don't know them, etc., so they're often not the best idea
     ... I'm still waiting for the perfect solution but I'm definitely not holding my breath. Sorry for being of no help at all.
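     For what it's worth, the open-the-file + write-a-line + close-the-file pattern described above looks roughly like this in VBScript (a minimal sketch; the log path and procedure name are made up for illustration):

     ```vbscript
     ' Minimal sketch of the open/append/close logging pattern.
     ' The path "C:\logs\deploy.log" and Sub name are hypothetical.
     Const ForAppending = 8

     Sub LogLine(sMessage)
         Dim oFSO, oFile
         Set oFSO = CreateObject("Scripting.FileSystemObject")
         ' Open (or create) the log, write one line, close it again --
         ' a full round-trip per line, which is what makes it slow
         ' when the log lives on a network share.
         Set oFile = oFSO.OpenTextFile("C:\logs\deploy.log", ForAppending, True)
         oFile.WriteLine Now & " " & sMessage
         oFile.Close
     End Sub

     LogLine "Script started"
     ```

     Keeping the file open for the whole script avoids the round-trips but doesn't help across includes, since code blocks from different includes don't share variables.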
  7. Well, that's a pretty broad question. Graphics-wise, you pretty much have to know the basics of drawing (with plain old pencil/pen and paper) to begin with (which takes a few years of practice), then you "only" have to master Photoshop, Illustrator and Flash (several hundred hours). ActionScript isn't bad if you're already a decent programmer (which also takes several years of work to become), and as for special effects and sounds, I wouldn't even know where to begin. It's very much like your other question about data recovery and electronics: it'll take years to get there. Mind you, most people looking into making games don't end up liking it at all; they often find out they just like playing games.
  8. I hadn't noticed you posted seconds before me. That's a DSL modem driver that's misbehaving (doubly so if you're getting BSODs). Ditch the USB junk and plug it in using Ethernet -- problem solved for good. That thing is likely causing both of the other error messages you see in your logs.
  9. If you don't have the expertise in house to accomplish this, then most likely you don't have access to all the specialized accounting-related knowledge either (complicated tax laws and whatnot), so it might not be such a great idea in the first place. Never mind that there are already all kinds of great apps filling this field, like QuickBooks. Doing something like this would require a full-time team of programmers for several years. Programs like Excel have well over a million lines of code.
  10. That's normal because it's the exact same information. ...and that is not exactly normal; I'd take some time to look at them. If you're not sure which one it is, you can clear the log (save it first to see what else you had going on), then recreate the problem (seemingly rebooting does it), then check again.
  11. Check in the event viewer, particularly the system log. There should be some errors listed in there which will provide you with more information about which service(s) failed and for what reason.
  12. I've had no such issue with it myself. Then again, we don't know which version of ImgBurn you're using, which OS it's running on top of (they offer different methods to write to a disc), which particular I/O interface you have selected in ImgBurn, which writer you're using, etc., so there are a lot of factors involved. Good ol' Nero still works great too, and in Win7 (don't recall about Vista) I know that if you select the ISO in Explorer, there's a burn button up in the toolbar.
  13. I could answer too but I have no idea what you're asking for. I would have basically said the same thing as gunsmokingman but seemingly that's not what you wanted to know. Do you mean something like a calculator?
  14. It's not besides Flash and ActionScript, it's besides ActionScript -- basically the "Flash" part. You have to be pretty good at graphics (e.g. Illustrator and/or Photoshop -- the images being small usually makes it a bit harder too), little animations/special effects (like, say, a spaceship exploding in a game), somehow making the sounds used by the game (might be easier to find some pre-made ones!), mostly everything Flash-related besides ActionScript (things like the timeline), and good at making fun games, obviously (not as easy as it may sound).
  15. It's a Flash-based game, so that would be ActionScript. There's a lot more to it than just a programming language though.
  16. You make it sound like dropping support for any product is unexpected. XP is a decade old and two versions out of date; that's why support will eventually end. Expect the same of other products. If MS eventually discontinues MSE (or replaces it with something newer), then eventually support for it will vanish too, just like at every other company.
  17. Because for the most part they're low-budget devices with slow CPUs that suck. For sure. The EMC, NetApp and such solutions that cost a LOT don't suck quite as badly. That pretty much sums it up, thanks. (emphasis mine) Running FreeBSD on it pretty much makes it a DIY setup anyway (in the roll-your-own way, and very much like what I had suggested with OpenSolaris/ZFS), and it basically takes away the only point NAS'es have going for them these days: ease of use. That's pretty awful performance (15MB/s) IMO, especially for a $500 device (over twice as much as the one he was looking into) which isn't so much a "standard" NAS as a "standard" PC (Celeron 4xx and 512MB of DDR2). You could build something a lot nicer/more powerful on the same budget (power usage would basically be the same too). Hardly. On most of THG's tests it scores in the mid-30s (and as low as 17MB/s on one test -- some sites even show it going below 10 if you ever considered going RAID5). For a small fraction of the price you can also get something much faster in the form of an eSATA enclosure, which I believe is a better solution for a fair amount of the folks thinking about getting a NAS.
  18. Yes, applying thousands of source fixes from hundreds of different sources to hundreds of individual packages, some individual packages (like the Linux kernel) being well over 10M LOC, and the OS spanning *hundreds* of millions of LOCs. Yeah, clearly in anyone's reach.
  19. No such thing exists. Ah, so your whole point is just based on a grossly outdated bad example then, my bad. Again, that doesn't matter if the particular distro didn't support it, and RHL didn't, nor did pretty much anybody else (nor do they today). Nah, I had just cut you some slack. With open source you're very often given something horribly broken. And if you're experienced at all, you can just as easily fix Windows issues (it's arguably a lot simpler and faster -- thanks to Sysinternals utils and some other gems). PS: I also run Linux, and not just on x86 HW for that matter. So a modern OS should run great on this, far better than old x86 stuff; I fail to see where the problem was in the first place (unless you gave your i7 all of 2GB and then tried to run several memory-hungry apps on it at once). If their IT staff is too incompetent to see this coming, it's quite sad. It's only a matter of time before they're forced to move to something else anyway. They're going to be in a world of hurt because they waited too darn long to do anything, and no transition is ever going to be smooth. They've only got themselves to blame.
  20. So then you're just using it as your primary example, but without actually knowing if it's the case? Pretty much all the people I've seen who still use 10-year-old computers don't buy the latest software: either they're the frugal kind, or one of those people into old software (you know -- the "everything new is bloated" type of guy), or they just have such an old machine that new apps won't run. Businesses don't base their business model around a tiny market segment that typically spends ~nothing. It's hardly just "sponsors". They say about 75% of the open source code (things like the Linux kernel, Apache and such) is written directly by the large companies I listed (as in, by employees directly paid/hired by them). Money matters a LOT more than you seem to imply. Even "small" companies like the Mozilla Foundation make a lot of money -- 78.6M in 2008 alone. So buggy, featureless, barely usable, unstable, poorly supported and often worthless projects are perfectly fine because they're free? Thanks but no thanks, I'll just keep using something nice that actually works, even if it costs money. So you're blaming MS for dropping support a decade later? Because Linux distros from 10 years ago (2.2.x kernel) are perfect and still supported, right? It's not like the Ubuntu LTS releases (long term support) are only supported for 5 years, or that it's only 18 months otherwise. A typical example would be RHL 7, which came out in 2000 (same time as Win2k, which is currently still supported) and for which support ended back in 2002 or 2003. Or RHL 7.3, which came out after XP (which still has some years of support left) and for which support ended in late 2003. MS simply has the best support, and yet you're blaming them for it.
Then again, you were saying that some people like you don't want to upgrade hardware for 10 years but that buying software was OK, yet here it seems like buying new software is also out of the question, again showing that you're probably spending just about nothing anyway. Honestly, it just looks like you want all the latest and newest, but without actually buying new hardware or software, and with ~eternal support thrown in for free (and perhaps open source too). I'd love to see you run a successful business this way. Or some people can be realistic and think about moving forward to more modern tech in the 10+ year period before it's unsupported (the lifecycle concept). You're going to have to do this even with OS'es other than Windows (Linux distros and Mac OS X don't support their products even close to 10 years). Using Win2k (on 10-year-old hardware like you said) now is exactly like someone still using Win 3.1 on a 486 after XP came out (not quite 10 years apart) and refusing to buy something newer instead. Sorry, but I have to echo cluberti's point: your blind hatred towards MS seems to be clouding your judgment.
  21. Seriously, have you even used a recent version of Photoshop? It runs faster than ever (thanks to OpenGL acceleration). It's nothing short of fantastic, and it's a huge time saver in a thousand ways compared to anything else. And that isn't the kind of people most software or hardware companies target (those who buy nothing). Most new apps improve in terms of functionality (Win7 sure did, Photoshop did in incredible ways, etc.), and given decent hardware, speed/performance improved a lot too (Win7 and Photoshop are both GPU-accelerated and both also benefit from an x64 CPU, etc.). Most of them on the larger projects *are* getting paid, by companies like Red Hat, IBM, Novell, Oracle, Intel, HP, etc. Mind you, 99% of what they produce is absolute crap (IMO) that I want no part of, even for free (it's mostly crap, save for a few gems).
  22. It doesn't force me to use twice as much RAM in any way. 2GB just isn't enough for what I do, e.g. I've taken Photoshop's memory consumption over 4GB by itself with large files, and that's one app alone -- doing this with the x86 version (the only thing you can run on an x86 OS) is painfully slow (LOTS of disk I/O -- way less than half the speed). SolidWorks also crawls with 2GB (even on an x86 machine). It would also limit my ability to run VMs or multitask. And even if you exclude those particular apps, some of my apps wouldn't work quite as smoothly on an x86 system with 2GB. So do I mind the extra 300MB of overhead of the x64 version of Windows 7 (300MB being worth under $10 at current prices, taxes in) which runs everything significantly faster (especially Photoshop, which I use quite a lot, doubly so when multitasking a lot, i.e. always)? Am I supposed to want to save $10 worth of memory and prefer having pretty much everything run ~10% slower, and some tasks WAY slower? Or perhaps should I want to save a few bucks by not going dual core, also making things slower in the process? I don't get the reasoning behind it. $10 for a nice performance boost is nothing. There are ~$300 PCs with more than 2GB; it's not 2005 anymore. And again, regardless of all this, I still wouldn't get 2GB on a modern system anyway (that's what I had in my 2GHz-ish PCs running XP years ago). Sounds like your main beef isn't against x64 (or it's against it, but based solely on a scenario where you go out of your way to memory-starve it) but against newer software using more RAM (and needing better hardware than several years ago), in which case you can probably save more by staying with XP, which has an even lower memory footprint. Right now what you're doing (as some kind of basis to bash x64) is exactly like saying XP is faster than Win7 because XP will run some memory-hungry app faster than Win7 with 512MB of RAM or whatever.
  23. If you go out of your way to memory-starve a machine, it doesn't perform well (that's always been the case since the introduction of virtual memory); that's all you've proven, not that code performs any better on x86 -- it clearly doesn't. The only place it really loses significantly is the *Mark Vantage series, which uses a lot of memory (so it's hardly an indication of anything anyway, unless you plan on using a lot of memory-hungry apps on a machine with little RAM) as it does a lot of multitasking (lots of apps running at once). The x64 OS performs better all-around, except in the one scenario where you go out of your way to build a nice modern machine and purposely pair it with too little RAM to then run memory-hungry apps; then any OS with a lighter "footprint" wins. Most of us who use lots of memory-hungry apps just get enough RAM in the first place so everything runs comfortably instead.
  24. Not sure what you're talking about. I never split my login stuff in two parts, and the vast majority of what I do is ASP.NET (and I love it too -- it's so darn productive). The only "split" thing I can think of which you may be referring to is the separation between markup and code (aka code-behind), but that has nothing to do with this (splitting the code in two in some way for no apparent reason). It's not inherently more secure in any way. It's only going to be as secure as you code it. You can write very secure and very insecure code in any language. You said you had previous experience with CSS so I assumed you'd go for that. These days we use div tags (wrapping around different parts of the page) which we then position using CSS. It's pretty simple as long as you understand the box model (there are plenty of templates and tutorials out there to get you started as well).
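     The div + CSS positioning approach mentioned above looks roughly like this (a minimal sketch; the class names and widths are made up for illustration):

     ```html
     <!-- Divs wrap each part of the page; CSS positions them.
          All class names and sizes here are hypothetical. -->
     <div class="wrapper">
       <div class="header">Site title</div>
       <div class="sidebar">Navigation</div>
       <div class="content">Main content</div>
     </div>

     <style>
       .wrapper { width: 960px; margin: 0 auto; }  /* centered fixed-width page */
       .sidebar { float: left; width: 200px; }     /* left column */
       .content { margin-left: 220px; }            /* sits beside the floated sidebar */
     </style>
     ```

     The margin on .content has to account for the sidebar's width plus any padding/border (that's the box model part people trip over).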
  25. Not every time. I tend to say that a lot myself (because I think it's often very much the case and in many ways -- jscript is nice too), but for ridiculously simple things (e.g. starting an installer with a couple switches) it's just overkill.