
Protect .NET source code by obfuscating it?


cyberjoe2



About drivers: tell me the Catalyst Control Center is bringing a performance increase? lol (and now Nvidia is following the "example") But maybe you won't agree with that just because it's written in .NET?

...

So do I. And they probably don't have as much money as yours because they care about space.

And by the way, speed over space for antivirus? Don't tell me the blockbuster security suites are so big because they're fast...

First off - the Catalyst Control Center isn't the video drivers. It's the Control Center. The video drivers aren't getting more bloated - they improve speed. If you don't like the CCC, don't install it. Simple!

And who said anything about how the "big AV programs" are bigger because they're faster? I said that the size wasn't the main concern of the majority of computer users. I never made any correlation between the size and speed of Norton. Again in this discussion, you disregard the real message...

And if you're complaining about the disk space that Norton takes up, then I would really recommend taking another look at your personal budget to see where else you could spend those $0.25 (and I'm being generous here when it comes to the disk cost).

How do you define the "efficiency" of a compiled program?
Efficiency is directly proportional to the speed and inversely proportional to the size.

Any sources to back this up? Or is that just your own opinion?

I've never seen anything of the sort come out of any of the courses I've taken.
Many formal computer science courses don't really go into much depth on software efficiency, because [1] they are reluctant to admit that software efficiency has decreased over time, [2] the amount of material written on efficiency is relatively sparse, and [3] they don't want to get into heated debates such as the one right here, that often begin to degenerate :)

Is that so? I actually e-mailed the professor of the compiler course I sat in on about this exact topic. He essentially told me that all speed-critical applications are compiled with speed as absolutely paramount. Space is not a primary concern with modern systems. People care about speed - not disk space. Like crahak already pointed out, the extra cost for more disk is far less than the extra cost for a faster processor.

Let me give you a non-compiler example which follows the same argument. Databases. Which would you prefer - a database that was larger and faster, or smaller and slower? Anyone who deals with databases will gladly spend the extra money for the disk space required for a fast DB. If you think otherwise, your company is gonna go out of business real fast.

Your example about SSE actually proves crahak's point exactly. SSE was designed to be faster. The code example you gave is faster using SSE instructions. He pointed out that SSE was designed for speed - not space savings, and you proved his point perfectly.


O RLY?

[snip]

The former is 31 bytes, while the latter is only 21 bytes. Both perform the same task, i.e. add 2 to a sequence of 8 bytes without looping. The latter is several times faster too.
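For reference, here's a rough C-intrinsics equivalent of that operation (a sketch only, not the snipped assembly above - that operated on 8 bytes with MMX, while this uses SSE2 on a 16-byte buffer):

[code]
/* Sketch: add 2 to every byte of a small buffer with a single SIMD add
 * (one paddb instruction) instead of adding byte by byte.
 * Assumes an SSE2-capable x86 CPU. */
#include <emmintrin.h>   /* SSE2 intrinsics */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint8_t buf[16] = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15};

    __m128i v   = _mm_loadu_si128((const __m128i *)buf);     /* load 16 bytes   */
    __m128i two = _mm_set1_epi8(2);                           /* 2 in every lane */
    _mm_storeu_si128((__m128i *)buf, _mm_add_epi8(v, two));   /* one paddb       */

    for (int i = 0; i < 16; i++)
        printf("%u ", buf[i]);    /* prints 2 3 4 ... 17 */
    printf("\n");
    return 0;
}
[/code]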

Also, it takes CPU cycles to fetch instructions too, whether they be from the cache or the main RAM.

Like Zxian just said, you essentially made my point! Saved all of 10 bytes (~nothing), and once all your data is properly aligned and everything, you just might end up with a bigger binary. That will be the case with most programs (again, the reference material mentions this too). It's easy to see, too: compiling an app to use SIMD instruction sets usually results in larger binaries (but they WILL be faster).

Besides, most sensible programs will do a feature check of the CPU, and use whatever is available for the math lib, which directly makes for larger binaries to start with. Similarly, 64 bit binaries are usually bigger, but also run faster. It's a total non-issue.
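A minimal sketch of that kind of feature check, assuming GCC/Clang on x86 (the function names here are made up for illustration). Both versions get compiled into the binary, which is exactly why such builds come out larger:

[code]
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Baseline version: runs on any x86. */
static void add2_basic(uint8_t *p, size_t n)
{
    for (size_t i = 0; i < n; i++)
        p[i] += 2;
}

/* Same loop, but the compiler is allowed to vectorize it with SSE2. */
__attribute__((target("sse2")))
static void add2_sse2(uint8_t *p, size_t n)
{
    for (size_t i = 0; i < n; i++)
        p[i] += 2;
}

/* Pick the fastest available version at run time (CPUID-based check). */
static void add2(uint8_t *p, size_t n)
{
    if (__builtin_cpu_supports("sse2"))
        add2_sse2(p, n);
    else
        add2_basic(p, n);
}

int main(void)
{
    uint8_t buf[4] = {10, 20, 30, 40};
    add2(buf, sizeof buf);
    printf("%u %u %u %u\n", buf[0], buf[1], buf[2], buf[3]);   /* 12 22 32 42 */
    return 0;
}
[/code]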

And if you really care so much about code density, you shouldn't be using MMX anymore - SSE is denser (and faster). MMX is old stuff that's been replaced by better/newer tech. MMX isn't even supported anymore in some environments like Win64: ML64 (MASM) will give you "error A2222: x87 and MMX instructions disallowed; legacy FP state not saved in Win64". SSE/SSE2 is where it's at nowadays. Denser, faster, handles more data at once and all (even though the binaries might be even bigger, with data being 16-byte aligned for SSE2).

And I wouldn't be so surprised schools don't teach your own totally bogus metrics. There's no standard "efficiency" measure of any type (you seem to confuse this with code density, which is unrelated and has nothing to do with speed at all). And code speed isn't "inversely proportional to the size" - quite the opposite, actually. Compilers don't have options to compile for size *OR* speed for no reason; by your reasoning, those two options would be one and the same. Often, the larger code is faster (like 64-bit binaries, math libs that check and use whatever your CPU has to make things faster, binaries being larger due to aligned data and such). Speed is what really matters. Virtually nobody wants slow apps that save a couple bytes of disk space (even glocK_94 agreed on that!)
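To make the size-versus-speed switch concrete, here's a hedged example assuming GCC as the compiler (exact sizes vary by version and target CPU):

[code]
/* One function, two build modes (GCC assumed; numbers vary by version/target):
 *   gcc -O2 -c sum.c   -> optimize for speed: may unroll/vectorize (bigger code)
 *   gcc -Os -c sum.c   -> optimize for size:  keeps the loop compact (smaller code)
 */
#include <stddef.h>

long sum(const int *v, size_t n)
{
    long total = 0;
    for (size_t i = 0; i < n; i++)    /* the optimizer decides how to expand this */
        total += v[i];
    return total;
}
[/code]

The same loop built for speed typically gets unrolled and/or vectorized into more bytes of machine code, while the size build keeps it small; the two options exist precisely because they are not the same thing.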

Besides, schools don't have the time to teach everything. A lot of it comes with continued learning & experience. So they focus on what's more important. Things like:

-they know/understand operating systems and such required stuff

-they know/understand various things like databases, XML, threads, network programming, etc.

-ensuring they have a firm grasp of concepts like OOP, MVC model, n-tier development, etc.

-teaching them about software engineering, patterns, UML, algorithms, etc.

-making sure they know a few different and useful languages: are competent with them, writing secure/stable/quality code, proper commenting, etc.

-ensuring they understand the web-side of development as well ([x]html, css, js, server-side techs, etc)

-project planning and estimates

-documenting your stuff

-profiling, load testing, effective debugging, etc.

-application life cycle management

-team development (using some SCM), unit tests, code coverage, continuous integration, etc.

-some different/more advanced stuff in various courses (3D stuff, data warehousing & OLAP, etc)

-problem solving in general, and understanding the process (user requirements, etc)

-the basics of several fields like GUI design, usability, QA, deployment, etc.

...

But like most schools, the primary intent is learning to learn (and learning by doing). Especially in programming and IT. New stuff comes at an incredible pace. It's very hard to keep up with everything (the sheer amount of new stuff that came out in the last year or so is totally insane!), and requires constant learning/retraining to stay current.

The purpose is to make employable programmers who will be able to pick up new stuff as they have to (inevitable), who will quickly grasp the specific task at hand, and who will be productive in a team setting, working on projects of various sizes/complexity. The intent is not to ensure they write some over-optimized trivial apps so they can run on a 286, disregarding cost/time (much less optimize for a couple bytes of dirt-cheap disk space over performance!)

Edited by crahak

Like Zxian just said, you essentially made my point! Saved all of 10 bytes (~nothing), and once all your data is properly aligned and everything, you just might end up with a bigger binary. That will be the case with most programs (again, the reference material mentions this too). It's easy to see, too: compiling an app to use SIMD instruction sets usually results in larger binaries (but they WILL be faster).
Multiply that 10 bytes by 1000 repetitions in an unrolled loop and see how the savings grow. The data is inherently aligned.
And if you really care so much about code density, you shouldn't be using MMX anymore - SSE is denser (and faster). MMX is old stuff that's been replaced by better/newer tech. MMX isn't even supported anymore in some environments like Win64: ML64 (MASM) will give you "error A2222: x87 and MMX instructions disallowed; legacy FP state not saved in Win64". SSE/SSE2 is where it's at nowadays. Denser, faster, handles more data at once and all (even though the binaries might be even bigger, with data being 16-byte aligned for SSE2).
That was just an example.
And code speed isn't "inversely proportional to the size" - quite the opposite, actually.
You're reading it wrong. I'm defining the relationship between efficiency and speed / efficiency and size, not speed and size. Putting both in one sentence does tend to confuse though... although I should point out that there is no simple relationship between speed and size.
The purpose is to make employable programmers who will be able to pick up new stuff as they have to (inevitable), who will quickly grasp the specific task at hand, and who will be productive in a team setting, working on projects of various sizes/complexity. The intent is not to ensure they write some over-optimized trivial apps so they can run on a 286, disregarding cost/time (much less optimize for a couple bytes of dirt-cheap disk space over performance!)
Once again going back to the commercialisation of programming, which I have already discussed earlier. However it's not commercial software that was the point of this debate, it was about free and open-source software.

Edited by LLXX

Multiply that 10 bytes by 1000 repetitions in an unrolled loop and see how the savings grow. The data is inherently aligned.

A couple bytes here and there just doesn't add up to much. And no, data is not inherently aligned, that's patently false.
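To illustrate the alignment point, here's a small sketch (C11, GCC/Clang assumed; not anyone's actual code from the thread): 16-byte alignment for SSE has to be requested explicitly, since an ordinary byte array is only guaranteed the alignment of its element type.

[code]
/* Sketch: alignment must be asked for, it isn't automatic. */
#include <stdalign.h>   /* alignas (C11) */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint8_t plain[16];                 /* only 1-byte alignment is promised       */
    alignas(16) uint8_t padded[16];    /* explicitly placed on a 16-byte boundary */

    /* The aligned array is safe for SSE's aligned loads (movdqa / _mm_load_si128);
     * the plain one is not guaranteed to be, so an unaligned load would be needed. */
    printf("plain:  %p\n", (void *)plain);
    printf("padded: %p\n", (void *)padded);   /* last hex digit will be 0 */
    return 0;
}
[/code]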

You're reading it wrong. I'm defining the relationship between efficiency and speed / efficiency and size, not speed and size. Putting both in one sentence does tend to confuse though... although I should point out that there is no simple relationship between speed and size.

Makes no sense in any way I'm reading it.

Once again going back to the commercialisation of programming, which I have already discussed earlier.

Well, guess what? Most people don't take several years of university courses full-time, spending many thousands of $ to program trivial hobbyist apps/projects. Just like one doesn't take a full EE course to change batteries in a flashlight. You're criticizing universities over stuff that's mostly irrelevant (if not 100% wrong), but now you're even disregarding their main purpose: learning something, with the intent of earning a living.

However it's not commercial software that was the point of this debate, it was about free and open-source software.

No it's not. You trash-talked all open source software in one post, but that was the only reference to OSS in the whole thread. Either way, there's no reason for universities to change their ways and teach inane stuff instead. OSS programmers (except the lone hobbyist on a tiny project with unlimited time [at the expense of having no life] and no money constraints) need much the same - in fact, a large part of those OSS programmers work in large teams, paid by corporations, e.g. Mozilla. The license or price changes absolutely nothing about the task at hand, nor the required knowledge.


And you're suggesting only experts can have one

No, I haven't. Nowhere. Ever. You're the one suggesting things here.

Go look up "democracy" - it has NOTHING to do with arguments or the lack of them. Yes, people can have opinions, but they can often be misguided, uninformed, and sometimes just plain wrong, so they're not always useful or insightful.

You suggested people could only have "they steal our jobs" opinions and that's why they should leave it to the experts, and I don't agree with that. Besides, right or wrong is subjective, and people, as long as they present arguments, should be encouraged to express their opinions.

But anyway, even if I don't agree at all, I don't regret standing up for my ideas of democracy (even on forums). You heard my point and I heard yours, so I suggest we leave it there since we won't agree on that.

When I check the thread, what I see before you started posting was objective, on-topic discussion/debate without any flaming. Check the thread really.
lol...
At the beginning of this discussion, I wouldn't have classified you as an id***, but now I do. Why? Because of your pointless arguing, your closed-mindedness and your refusal to accept that there are other opinions that count other than yours.

Oh, so now you're treating me as an i**** since I don't agree with you? Even if the heat went up, everybody stayed polite here. So don't fu****g get insulting or I will too!

Anyone who plays games on a Windows platform has DirectX installed. Do game users whine and complain because they have to install or update DirectX to be able to play the game they want to play? Nope. Anyone who is going to want to keep using Windows in the future is going to have no choice but to install the .NET framework to be able to run applications in the future. The last few members of the small pointless resistance against the .NET framework, such as yourself, will eventually forget all about the .NET framework just like people have about DirectX.

Where did I say .NET was pure bloat and that I was against it? I said it was a useful language, but that for the average user, having to download the framework (which is partly bloated) was a pain. I even said that's why it would get more popular with version 3 included in Vista - just like DirectX, which people "forgot" about. My point was to make posters admit they can't blame people for not liking .NET as it is now, and try to understand the reasons rather than state "they don't understand s****". You didn't read my posts and you dare judge me...

As for the .NET framework being bloat? You obviously don't understand the benefits that it brings to counter just that. If you had any programming knowledge whatsoever, you would have already seen the advantages without having to do much research. The .NET framework uses the same approach as programmers have done for years with modules, DLLs, and global functions: separating repeatable code into a globally available resource to be re-used from application to application.

I could say "pointless arguing, closed-mindedness and refusal to accept that there are other opinions that count other than yours" but I won't...

So... here we go again (talking about wasted time, eh?)

And who said anything about how the "big AV programs" are bigger because they're faster? I said that the size wasn't the main concern of the majority of computer users. I never made any correlation between the size and speed of Norton. Again in this discussion, you disregard the real message...
If I quote you: "What's the major complaint about people who rant about Norton? It's slow. Most people don't care (or perhaps don't even know) how much disk space it takes up, but what do they notice - the speed (or lack thereof)." Isn't that suggesting speed should be preferred over size?
And if you're complaining about the disk space that Norton takes up, then I would really recommend taking another look at your personal budget to see where else you could spend those $0.25 (and I'm being generous here when it comes to the disk cost).

Except you can't buy just a few hundred megabytes. You're bound to buy a big new HD, and that's expensive (especially when you need the money for the rest of the PC).

(even glocK_94 agreed on that!)

There are a lot of points on which I agree, but everybody sees me as a .NET enemy (which I'm not, actually)...

Edited by glocK_94