Protect .NET source code by obfuscating it?


cyberjoe2

Recommended Posts


I'm not a commercially oriented programmer, nor a core programmer,

but AFAIK M$ has implemented lots of functions in the .NET platform, making it very useful for programmers due to the easy availability of pre-coded procedures and classes.

But the problem with some coders is that they don't really care about these and write their own procedures instead.

That creates deviation in the code and makes it inefficient.


Oh wait... you're all speaking from the point of view of a commercially oriented programmer.

Never mind then... :whistle:

Where is the commercial viability with WUD, WMP11 Integrator, Compression Bin and XPreview? .NET allows me to code those utilities to share with the community for free, taking less of my time so that I can spend more time with my wife and my son.

My revenue is generated almost entirely through computer sales, maintenance and repair. Most of the time, I make my wage by doing on-site repairs for residential customers; sometimes I get to sell hardware, and often I re-install Windows. All of which is uninfluenced by programming.

To come back with a statement like that is proof that you simply have no solid evidence to refute with and have given up, resorting to unrelated pseudo-slander.


You're absolutely right jcarle. I also code a few utils in my own time (embedded projects, some "regular" apps like an installer for my unattended setups, etc). Even for hobbyists, time is a major constraint. One only has so much free time. The time I'm not spending coding can be spent with the kids, going fun places, doing all kinds of activities together, visiting relatives, reading a good book, etc.

Reinventing the wheel coding everything in lower level languages (and without the rich framework) means a LOT of time spent to recreate it (often in a lesser way) and far more code to write (and maintain/bugfix). Life's too short for that. I'd rather work to solve the problem quickly than working forever on the underlying low level implementation of everything. Between your apps using 5% extra resources and having no life at all, it's an easy call for most people.

Funny how people who don't even know what obfuscation is seem to know exactly how much it affects performance, and even find it obvious...

Looks like someone had nothing to backup his usual FUD after all... How surprising.

Edited by crahak

:blink:

That explains why most of the open-source software is so poorly written...

IMHO if you're not going to put 100% effort into writing the best, most efficient software possible, you shouldn't be writing it. These "programmers" should be ashamed of themselves and their work!


:blink:

I know, leaving time for something else (aka "having a life"), such a strange concept, eh?

That explains why most of the open-source software is so poorly written...

Ah, another day, another piece of FUD.

Hobbyists like to not spend all their time coding for nothing -> open source apps suck. I love the logic (or the total lack thereof).

Never mind that a LOT of open source projects are written by companies & paid developers (often backed by large companies such as IBM, Red Hat and Google). Others release their code as open source but sell commercial licenses (like Community Server), and such things.

if you're not going to put 100% effort into writing the best, most efficient software possible, you shouldn't be writing it.

That's like saying car makers should build the very best car possible (better than a Rolls Royce), spending day and night to make the car consume one less gallon of gas over its lifetime, or not make cars at all. Or that we should build the absolute very best houses ever (a 500 Billion $ home), spending 75 years building them to ridiculously high standards, or have nowhere to live.

Things are a bit black and white in your own little world. It's quite an extremist view. Everybody else will settle for something adequate, on time, and on budget.

So basically, if you're not going to spend 10 years hand-optimizing assembly code for your app, don't bother solving the problem (so much for someone who was saying "you ought to use them too, not just program"!). So you're saying people would be better off without something that solves a real-world need, just because it could be more optimized (as long as we didn't mind spending all our time, taking ages, and paying a lot more). People should just go without, just because! One word: ludicrous!

And no one's forcing you to use their code. Feel free to spend day and night of your own time (something most people consider valuable) to rewrite everything to gain as much extra performance as most people would get from a 50$ memory upgrade on their PCs.

Oh, why don't you create a poll? Asking if nLite or WUD (apps well known to this community) shouldn't exist, i.e. if we should do without them, just because they're not programmed your way?

These "programmers" should be ashamed of themselves and their work!

Who? The countless companies who employ people working on open source projects? The countless folks who give their time without asking for anything in return, and even release their source code along with their program for free?

Yes, there are some open source apps that aren't great - but that's usually because they're written by hobbyists. That's the explanation (we couldn't expect you to understand that concept). Not because they chose to have a life. Ever heard the expression "Don't look a gift horse in the mouth"? Don't like it? Don't use it. You're insulting tens if not hundreds of thousands of folks here. It's like you think they owe you something. I'm sure you think people working for charities ought to be ashamed of helping others too.

And if your only problem with open source is resource usage, then why not fix it yourself? You've got the full source code already. Personally I think that for the most part they're far better off spending their time on bugfixes and implementing new and useful features instead of trying to squeeze the very last little bit of performance out of hardware that mostly sits idle so it can sit even more idle.

You didn't have a point, and you still don't. Why not stop bashing everything you don't understand, and make use of constructive criticism instead of trolling & talking FUD for a change?

Edited by crahak

  • 2 weeks later...
Shouldn't we have a .NET thread?

Where people could have a single debate about .NET?

I mean, let's do it once so there's no more flaming.

Wasn't there one in the polls forum?

Let's just kick this one back up then... and since this is the general discussion forum anyway, this can turn into a programming quality/efficiency/etc. discussion.

And if your only problem with open source is resource usage, then why not fix it yourself? You've got the full source code already.
That's exactly what I did, but it's still very shocking to see how many lusers can't figure out how to use a compiler correctly - and this statement comes from someone who has more experience with Asm than C/C++!

I recently downloaded an open-source Pascal -> C converter. The original program file was 280Kb, and on inspection most of it was empty space. All I did was recompile it with my default compiler options (/O1 /Og /Gf /MD) instead of the original (/Zi /O0) and it went down to 113Kb. Another program in the same set went from 28Kb down to 5.5Kb. I didn't touch the source at all, just recompiled it.

I don't know who to blame more, M$ for setting such inefficient defaults or the programmer for not knowing that the defaults aren't the best :no:
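
For anyone who wants to reproduce this kind of experiment, here's a rough sketch of the cl.exe invocations involved. The source file name is made up, and note that MSVC's "no optimization" switch is actually /Od (/O0 is the GCC spelling):

```
:: Hypothetical file name - substitute the converter's real source file.
:: Debug-style build, roughly what the original author seems to have shipped:
::   /Zi = full debug info, /Od = all optimizations disabled
cl /Zi /Od converter.c /Feconverter_debug.exe

:: Size-optimized rebuild with the options quoted above:
::   /O1 = minimize size, /Og = global optimizations,
::   /Gf = pool duplicate string literals, /MD = link the DLL runtime
cl /O1 /Og /Gf /MD converter.c /Feconverter_small.exe

:: Speed-optimized rebuild for comparison (/O2 = maximize speed):
cl /O2 /Og /Gf /MD converter.c /Feconverter_fast.exe
```

The debug build drags along debug information and unoptimized code, which is presumably where most of that "empty space" came from.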

If you want to see an example of good code, look at Donald Knuth's TeX source code.


Shouldn't we have a .NET thread?

I really see no point in having one. People flame, write and talk FUD about Vista (very popular nowadays), or about XP and how it's just a bloated Win2k with activation, how Linux or Windows sucks/is better, which gaming console is best, programming languages (and countless related debates about things like ORMs), which High Definition "DVD" format, politics, religion, car brands, sports teams, beer, mp3 player brands, music genres, etc...

People will disagree on several things no matter what, and nothing will ever change this. And there will always be fanatics in each camp with their agendas who will talk/spread FUD: shills, apologists, biased people, idiots, people with irrational hatred, optimists and pessimists, people into the latest fad, people who comment on things they have no basic/fundamental understanding of, along with a large group of uninformed/misinformed folks and the good ol' trolls. No thread will ever change any of this.

And if anything, polls are a very bad idea. Letting any 12yo kid who has no clue about a specific technology (isn't a programmer, doesn't know the language or the framework/libs, or how it affects development in any significant way) vote "it sucks!" or "it teh roxx!" accomplishes nothing at all. It's as insightful as polls with semi-random votes about the latest-gen gaming consoles (mainly folks voting against the company they dislike), or the results of a Vista poll (if there was one) - lots of people would say it sucks, no matter what (with or without any reasons grounded in reality or technical merit of any kind), and the same poll a year later would be VERY different... Polls are fun for finding out things like the favorite colour of most members, perhaps, but not so much for highly technical/complicated stuff most site users have an extremely limited knowledge/understanding of...


The original program file was 280Kb, and on inspection most of it was empty space. All I did was recompile it with my default compiler options (/O1 /Og /Gf /MD) instead of the original (/Zi /O0) and it went down to 113Kb. Another program in the same set went from 28Kb down to 5.5Kb. I didn't touch the source at all, just recompiled it.

I'm still at a loss to understand what the big issue is. So there's a 167KB waste. Okay... on a 500GB hard drive, that's what, a 0.03% waste? Is that really significant? Just installing the latest game takes up several GBs of hard drive space, and even then, it's still not significant. I can hardly see the point of worrying about it.

And by the way, I'm honestly not sure, but /Zi /O0 sound like debugging switches, not release switches, to me, considering /Zi is for complete debugging info. And you know, I would rather see a program compiled with /O2 (optimize for speed) than with your choice of /O1 (optimize for size). A program that is faster is, IMHO, hundreds of times better than a program that is smaller by a few KB.

I would rather have a program that's 100MB in size if it's 100x faster than the one that's 100KB, any day.


I'm still at a loss to understand what the big issue is. So there's a 167KB waste. Okay... on a 500GB hard drive, that's what, a 0.03% waste?

Yeah, I didn't take the time to comment on that, but thought the same. That's a lot of trouble to save <200KB of disk space (something I bought 1.28TB of in the last 4 months). And no, that's not even a 0.03% waste, it's a 0.00003% waste - or 0.00005$ worth of storage at today's prices (~0.30$/GB) - not something I lose very much sleep over personally (redo that job 30000 times and that'll be enough to buy you a single cup of coffee at Tim Hortons). And yeah, we've got tons of things wasting countless GBs of space, and I'm not even worried about it. If the program does what it's supposed to and doesn't make the computer slow to a crawl or anything, then there's no reason to even bother. 167KB is something I might have been worried about in the days when no one had hard drives ("dang, this will fill the 5 1/4" floppy so quickly!"), but nowadays...

And like you said, compiling for size over speed makes no sense whatsoever (so much for blaming others for not knowing what compiler options to use :rolleyes:), so that's time wasted recompiling something to save a tiny fraction of a MB while making it slower in the process. Even if it only took a single second to download the source, decompress it, start an IDE of some sort, open the project, tweak stuff, recompile and all (that'd be a speed record!), you'd still be working for 0.18$/h (0.00005$ of space saved per second), which makes flipping burgers seem like a high-paying job :lol: I don't know about others, but worrying that much about a few kilobytes of disk space seems like some sort of drastic obsession with saving every last byte at any cost.
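
Since numbers are flying around, here's a tiny C sketch that reproduces the figures above (the sizes come straight from the posts; the ~0.30$/GB price is the estimate quoted):

```c
/* Back-of-the-envelope check of the thread's storage-cost figures. */
#include <stdio.h>

int main(void)
{
    double saved_kb = 167.0;                 /* 280KB - 113KB */
    double drive_gb = 500.0;                 /* the 500GB drive mentioned */
    double price_gb = 0.30;                  /* ~0.30$/GB, as quoted */

    double saved_gb  = saved_kb / (1024.0 * 1024.0);
    double pct_waste = saved_gb / drive_gb * 100.0;
    double value     = saved_gb * price_gb;

    printf("share of a 500GB drive: %.5f%%\n", pct_waste); /* ~0.00003% */
    printf("storage value saved: %.7f$\n", value);         /* ~0.00005$ */
    return 0;
}
```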

Edited by crahak

I'm still at a loss to understand what the big issue is. So there's a 167KB waste. Okay... on a 500GB hard drive, that's what, a 0.03% waste?

I would rather have a program that's 100MB in size if it's 100x faster than the one that's 100KB, any day.

No, that's over 50% waste. Would you want to buy a 500GB HDD and then only be able to use 250GB of it? Or, the other way around, buy a 500GB HDD but then use it as if it were 1000GB?

One little prog sure doesn't make a lot of difference, but imagine doubling the amount of storage you have left over.

BTW, /O2 gives ~140KB, and /Ox (maximize speed *and* minimize size, i.e. maximize efficiency) gives ~120KB. Still a whole lot better than the original, and recompiling with the options the author supposedly used (i.e. debug mode) gives ~200KB. The only thing I can think of that would be different is the compiler: I'm using Visual C++ '98, he's probably using something newer. So much for that "compilers are getting better at optimizing code and in the future will be better than Asm programmers" statement.

I prefer size over speed, since I seem to have plenty of CPU power. If you prefer speed over size, that's fine too. The point is that going for both maximum size and minimum speed doesn't make any sense :wacko:


No, that's over 50% waste.

That's 50% waste of a tiny executable's space, i.e. 50% of ~nothing. Now, if you manage to compress my several TBs of data by 50% (without zipping everything, lowering my AV stuff's quality by half and such), then you'll be able to claim 50% disk waste. Besides, your math analogy is broken: bloat making the disk seem half its size is one thing (even if absolutely untrue here), but making the files smaller would only make the disk appear as its "real" size, not twice that (2x smaller executables won't quadruple apparent HD size).

One little prog sure doesn't make a lot of difference, but imagine doubling the amount of storage you have left over.

One binary - or heck, ALL my binaries - could grow by 200KB, and it wouldn't make much of a difference at all. It would add ~1GB to my current \windows directory, tops - all of 30 cents worth of space wasted, big deal. Once you manage to shrink my photos, mpeg4 files, and all that by 50% (lossless), then perhaps you can speak about "doubling space"... The binaries here represent a small fraction of 1% regardless.

So much for that "compilers are getting better at optimizing code and in the future will be better than Asm programmers" statement.

They ARE getting FAR better indeed - just not at making irrelevantly tiny apps, but rather FASTER apps (and supporting all kinds of new stuff), which is what actually matters. Again, the compilers make FAR faster code than your asm - if that's not proof enough...

I prefer size over speed, since I seem to have plenty of CPU power.

Hard disk storage is uber cheap. It would take serious optimization, recompiling, and thousands of wasted hours before you save a single GB of space doing this, which again is worth all of 30 cents - and always getting cheaper, hence becoming even less relevant by the day. It's just not worth it for most people. But CPU speed is NOWHERE near that cheap - it sure ain't 30 cents/100MHz flat! You can't just add more as you need it - at least not without throwing your existing CPU out. Core 2 Duos are too slow for some people's liking (some apps do a lot of number crunching; it's not uncommon AT ALL) - almost everybody nowadays encodes mpeg2 or mpeg4 and mp3s, and compresses and decompresses large zip/rar files and such. And a very significant number do all kinds of other CPU-heavy tasks (virtualization, rendering, AV editing [NLE], CAD work, using/developing apps that put significant load on a DB like warehousing or load testing, photo work making panos/HDR/images with many layers/etc, and many more things that really aren't uncommon at all). Application binary size has NEVER been an issue for any app I've ever used, but lack of CPU power frequently has been.


One little prog sure doesn't make a lot of difference, but imagine doubling the amount of storage you have left over.

One binary - or heck, ALL my binaries - could grow by 200KB, and it wouldn't make much of a difference at all. It would add ~1GB to my current \windows directory, tops - all of 30 cents worth of space wasted, big deal. Once you manage to shrink my photos, mpeg4 files, and all that by 50% (lossless), then perhaps you can speak about "doubling space"... The binaries here represent a small fraction of 1% regardless.

No, I meant those binaries doubling in size. Also, those are the files that are among the most compressible. Photos, mp4s, etc. are pretty much compressed already (I'm assuming you mean JPG and PNG/GIF, not BMP) and can't really decrease in size (read about 'entropy').
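
To make the entropy point concrete, here's a small C sketch using zlib (pseudo-random bytes stand in for JPEG/MP4 payloads, which are similarly high-entropy; a run of identical bytes stands in for the "empty space" in a debug build):

```c
/* Why already-compressed (high-entropy) data barely shrinks, while
   redundant data collapses to almost nothing. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <zlib.h>

int main(void)
{
    enum { N = 1 << 20 };                        /* 1MB of input */
    static unsigned char text[N], noise[N], out[N + N / 2];
    uLongf len;

    memset(text, 'A', N);                        /* extremely redundant */
    for (int i = 0; i < N; i++)                  /* high-entropy, like JPG/MP4 */
        noise[i] = (unsigned char)rand();

    len = sizeof out;
    compress(out, &len, text, N);
    printf("redundant 1MB    -> %lu bytes\n", (unsigned long)len);

    len = sizeof out;
    compress(out, &len, noise, N);
    printf("high-entropy 1MB -> %lu bytes (hardly shrinks at all)\n",
           (unsigned long)len);
    return 0;
}
```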
So much for that "compilers are getting better at optimizing code and in the future will be better than Asm programmers" statement.

They ARE getting FAR better indeed - just not at making irrelevantly tiny apps, but rather FASTER apps (and supporting all kinds of new stuff), which is what actually matters. Again, the compilers make FAR faster code than your asm - if that's not proof enough...

The hardware is getting faster. Compare the executables generated by an older compiler vs a newer one for speed, on the same hardware, to find the real difference. There might be a tiny increase in speed, but that tiny increase is far less than the increase in size. As mentioned above,
I would rather have a program that's 100MB in size if it's 100x faster than the one that's 100KB, any day.
1024 times the size, yet only 100 times the speed? Size and speed are often directly proportional, as with optimizations like loop unrolling and function inlining (indeed, the fastest code is a straight line of execution), and one usually settles for a medium between extreme speed and extreme size that maximizes efficiency. In the case above, maximal efficiency (which balances speed and size) lies somewhere between 100MB and 100KB.
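
For what it's worth, here's a minimal C illustration of that size-for-speed trade: loop unrolling, which compilers apply automatically at higher optimization levels (one reason optimized binaries can grow):

```c
/* Compact version: one small loop body, but a branch test per element. */
long sum_rolled(const int *a, int n)
{
    long s = 0;
    for (int i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* Unrolled 4x: more code bytes, but a quarter of the loop-condition checks. */
long sum_unrolled(const int *a, int n)
{
    long s = 0;
    int i = 0;
    for (; i + 4 <= n; i += 4)
        s += a[i] + a[i + 1] + a[i + 2] + a[i + 3];
    for (; i < n; i++)                           /* leftover elements */
        s += a[i];
    return s;
}
```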

"Imagine if all software was 1/128th of its current size, and 128 times faster on the same hardware. This is the level of efficiency that computers were supposed to be able to achieve, yet this ideal is drifting farther away from reality every day."


No, I meant those binaries doubling in size.

Alright, even if they all doubled, it would still make a very similar difference (1.xGB), hardly a big deal.

Photos, mp4s, etc. are pretty much compressed already (I'm assuming you mean JPG and PNG/GIF, not BMP) and can't really decrease in size

That's precisely my point. Yet that accounts for a few TB of my data. Shrinking it by 1% (or even less) would make FAR more difference than magically shrinking all binaries down to 0 bytes. In other words, shrinking the binaries would be like worrying about spending 1.50$ on coffee when you're in debt for as much as the US national deficit: the wrong order of priorities in cost cutting, and too much worrying over what essentially amounts to nothing at all.

Compare the executables generated by an older compiler vs a newer one for speed, on the same hardware, to find the real difference.

Actually, there's a very noticeable increase in speed, especially if you're doing comparisons on recent hardware. Every new version has new optimizations for new stuff - especially for things like multiple-core CPUs, multithreading/auto-parallelization/SMP and such. The increases and improvements are definitely there, and the benefits are very significant. As for the size increases, I'd say you're wrong again. The basic size of a "hello world"-complexity app has grown by a few bytes (irrelevant to most), but for a more complex app compiled with the same optimizations it shouldn't make a huge difference - and again, I can't even see why anyone would worry about this nowadays.
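
As a sketch of where newer compilers pull ahead on identical hardware (the exact speedup depends on the compiler and CPU):

```c
/* A VC6-era compiler emits plain scalar code for a loop like this, while
   many modern compilers at /O2 (or -O2/-O3) auto-vectorize it into SIMD
   instructions - same source, same hardware, noticeably faster binary. */
void scale(float *dst, const float *src, float k, int n)
{
    for (int i = 0; i < n; i++)
        dst[i] = src[i] * k;
}
```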

1024 times the size, yet only 100 times the speed?

Absolutely! "Just" 10x faster is an amazing speed increase! If an app can cut my time by 90% (on any task that consumes a fair amount of time), I'd buy a hard drive for it if I had to. Now 100x faster... I'd buy specialized hardware to run it if I had to. So 1024 times the size (1024 times ~nothing = barely enough to notice)? Who cares?

Imagine if all software was 1/128th of its current size

Who cares? :zzz:

and 128 times faster on the same hardware.

That's what people are trying to do (using new compilers and whatnot).

This is the level of efficiency that computers were supposed to be able to achieve, yet this ideal is drifting farther away from reality every day.

The ratio to binary size is totally irrelevant - who cares? Give me fast code (getting closer to "128x faster") and I'll be happy, and we're getting closer to that reality every day.

