Anti-Malware Suggestions



  • Do YOU allow IE to run ActiveX from the Internet Zone?   If so, why?

 

Actually I do NOT allow IE to run AT ALL.

 

But then, I am no expert :no: , not even a self-proclaimed one. Here is one:

http://www.msfn.org/board/topic/127283-experts-say/

 

Anecdotally, on the machines where I never run IE and use only Opera as a browser (and lately, only rarely, a Chrome-based browser), I have not been infected by anything in the last 10 years or more. Most probably this means there is no direct cause-effect relationship between how good a security model is on paper and actual security. :unsure:

 

OT, but not by much: UAC, DEP and ASLR (and whatnot), introduced in Vista and later, are good security models in theory :yes:, but in practice I did not notice the dramatic drop in infections worldwide that I would have expected since their introduction:

http://www.msfn.org/board/topic/171674-mass-hysteria-on-the-interwebs/

 

jaclaz



Let's keep things civil in this thread. We're here to have discussions and to learn. As a reminder:
 
 

7.b This community is built upon mutual respect. You are not allowed to flame other members. People who do not respect personal opinions and/or personal work will be warned in first instance. If you ignore the warning and keep on flaming, you will be banned without notice.

 
MSFN Rules


Your warning has been noted, Tarun, as has the disrespect you have shown me by actively discrediting my suggestions to others without bothering to back up your claims.  The difference is that I don't hold power over your ability to post here.

 

I'm done in this thread and with trying to share my knowledge of computer security here.

 

-Noel


Security can still be decent with UAC disabled.

 

not really. Sorry.

 

OT, but not by much: UAC, DEP and ASLR (and whatnot), introduced in Vista and later, are good security models in theory :yes:, but in practice I did not notice the dramatic drop in infections worldwide that I would have expected since their introduction

 

They do. 90% of the security issues are fixed just by having UAC on.

 

http://arstechnica.com/information-technology/2010/03/half-of-windows-flaws-mitigated-by-removing-admin-rights/

Edited by MagicAndre1981

They do. 90% of the security issues are fixed just by having UAC on.

 

...or simply by running as non-Admin...

 

The mentioned article, BTW, is the usual (correct) assessment of the mitigation of vulnerabilities, which is a good thing but does not have that much to do with actual "security". If you actually believed what is in that 2010 article, it would seem that no one could have been infected by any malware or exploit since the second half of 2010, and it seems to me that is not what happened.

 

If (completely invented/faked numbers) in 2004 there were 10,000 "security incidents" for every 1,000,000 online systems, and in 2014 there were (say) 5,000 "security incidents" for every 1,000,000 online systems, then the "increased security" would have halved the occurrences of incidents.

 

What I have failed to notice is any such large drop. I am talking anecdotally here: in recent years I have had more or less the same number of (more or less demented) friends calling me because they have botched their PC through some virus or malware as I had 10 years or so ago.

 

jaclaz


 

this is what the UAC is doing *facepalm*

Sure it is :).

 

Most of the controversy about UAC (particularly with the initial setup in Vista :ph34r: ) was about the fact that it was far more "intrusive" than actually needed, and its use has been largely toned down in later Windows versions (while in the meantime a number of third-party software writers evidently learned how to write programs requiring fewer privileges).

 

I believe it is not at all easy to balance the actually *needed* protective measures against the common *needs* that a user (particularly an "uneducated", average-Joe home user) might have, and all in all the good MS guys have IMHO reached a good compromise in 7 (and I presume also in later versions of Windows).

 

Still, it is evidently not a "security" mechanism that works well in practice; it is simply a way to "invite" people to pay more attention to what they allow to run on their machines. Seemingly, people keep pressing "Yes" at those prompts anyway, or outright disable UAC because they are annoyed by too many of them.

 

jaclaz


 

If (completely invented/faked numbers) in 2004 there were 10,000 "security incidents" for every 1,000,000 online systems, and in 2014 there were (say) 5,000 "security incidents" for every 1,000,000 online systems, then the "increased security" would have halved the occurrences of incidents.

 

What I have failed to notice is any such large drop. I am talking anecdotally here: in recent years I have had more or less the same number of (more or less demented) friends calling me because they have botched their PC through some virus or malware as I had 10 years or so ago.

 

jaclaz

 

 

The explanation for this may lie in the vastly increased number of malware variants floating out there, waiting to infect vulnerable systems.

 

Continuing to use made-up numbers ;), if in 2004 there were 10,000,000 viruses, trojans, and whatnot lurking in cyberspace, and in 2014 there were 100,000,000 such pieces of malware, then the 90% protection that MagicAndre cited would, indeed, result in your anecdotally getting about as many reports of infections from your demented friends :) as before. When dealing with the absolute number of infections, in addition to the increased effectiveness of protective measures we also need to account for the increase in available malware packages.
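JorgeA's arithmetic above is easy to check directly. The figures below are the same invented ones from the discussion, not real data:

```python
# All numbers are the made-up ones from the posts above -- illustration only.
malware_2004 = 10_000_000    # hypothetical malware pieces in circulation, 2004
malware_2014 = 100_000_000   # tenfold more in 2014
blocked_pct = 90             # the 90% mitigation rate cited by MagicAndre

# With no mitigation assumed in 2004 and 90% blocked in 2014, the absolute
# number of pieces that still get through is unchanged (integer math avoids
# float rounding):
effective_2004 = malware_2004
effective_2014 = malware_2014 * (100 - blocked_pct) // 100

print(effective_2004, effective_2014)  # 10000000 10000000
```

So a tenfold rise in circulating malware exactly cancels a 90% block rate, which is consistent with anecdotally seeing about as many infected PCs as before.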

 

--JorgeA


 

The explanation for this may lie in the vastly increased number of malware variants floating out there, waiting to infect vulnerable systems.

 

Continuing to use made-up numbers ;), if in 2004 there were 10,000,000 viruses, trojans, and whatnot lurking in cyberspace, and in 2014 there were 100,000,000 such pieces of malware, then the 90% protection that MagicAndre cited would, indeed, result in your anecdotally getting about as many reports of infections from your demented friends :) as before. When dealing with the absolute number of infections, in addition to the increased effectiveness of protective measures we also need to account for the increase in available malware packages.

 

--JorgeA

 

 

Sure :), that could be a random-number explanation as good as any other, but IMHO you are confusing "pieces of malware" with "number of vulnerabilities" and with "number of incidents" (just as the good MS guys and a lot of other people around confuse vulnerabilities with security).

 

There may be tens, hundreds, or thousands of "pieces of malware" making use of the same single vulnerability.

There may be tens, hundreds, or thousands of "incidents" that can all be traced back to the same single vulnerability (actually, to the corresponding exploit).

 

On the other hand, there may be tens or hundreds (possibly even thousands) of vulnerabilities for which an exploit is not practically doable or for which no viable exploit exists, and tens or hundreds of vulnerabilities for which an exploit exists but never causes an incident.

 

Put more loosely: the number of vulnerabilities is, in itself, a sterile figure.

 

Vulnerabilities are (largely) theoretical, in the sense that very often they need such a complex set of concurrent settings/setups that they are not statistically significant.

 

Let's say that I am writing a malware of some kind, and I discovered a brand new vulnerability.

The vulnerability needs (say) that:

  • a user runs Windows XP SP3
  • his/her motherboard is an Asrock xyz model
  • the machine has more than 4 GB RAM
  • the NIC MAC begins with 00:E0:4C (i.e. an additional RealTek network card is in the system)

and when a specially forged document (let's say an animated GIF) is accessed on a Friday between 00:01 and 00:09 GMT, I can run a payload of some kind, ONLY IF the user is using Internet Explorer 7.0 AND is logged in as Administrator.

 

It is clear that if the user runs as a "normal user" the vulnerability is no longer a vulnerability; but the same holds if any one of the other conditions is not met. The number and complexity of the other needed conditions make it so improbable that my evil plan has any actual chance of success that, even if every user in the world ran as Administrator, I would never be able to cause any incident using that vulnerability.

 

Yet it would be counted among the 90% of vulnerabilities "fixed" by running as a "normal user".
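jaclaz's point can be made concrete with a toy joint-probability calculation. Every probability below is invented purely for illustration, and the conditions are naively assumed to be independent:

```python
# Invented per-user probabilities for each precondition of the hypothetical
# exploit described above. None of these are real measurements.
conditions = {
    "runs Windows XP SP3":                          0.05,
    "has that specific Asrock motherboard":         0.001,
    "has more than 4 GB RAM":                       0.30,
    "has a RealTek NIC with a 00:E0:4C MAC":        0.01,
    "opens the GIF in the Friday 00:01-00:09 slot": 0.0008,
    "browses with Internet Explorer 7.0":           0.02,
    "is logged in as Administrator":                0.50,
}

joint = 1.0
for p in conditions.values():
    joint *= p  # naive independence assumption, good enough for the point

# Even if EVERY user ran as Administrator (last factor becomes 1.0), the
# exploit stays astronomically unlikely -- yet the vulnerability would
# still be counted as "mitigated by removing admin rights".
joint_all_admin = joint / conditions["is logged in as Administrator"]
print(f"{joint:.1e} per user; {joint_all_admin:.1e} if everyone is admin")
```

With these made-up numbers the chance per user is on the order of one in a trillion either way, which is the sterile-number argument in miniature.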

 

jaclaz


  • 3 weeks later...

jaclaz seems to be the most realistic person here.

 

There isn't any "security model" that is good by default. There isn't some attack surface constantly being decreased by new defensive techniques and tools.

The balance is always dynamic, and it would be the greatest mistake to think that some solution is final.

Look: Chrome is supposed to be the safest browser by design. But the latest Facebook malware epidemics spread only among users of Chrome, as the technique was to install a malicious extension designed for that browser (and most of its clones).

 

Next point. Everyone says: "Update, update, and one more time: update!" Well, OK. Done.
Who can manage all the knowledge about fresh vulnerabilities: we users (even advanced ones), or the people from the dark side? You know the answer.
They do. Not us.

 

Paradoxically, a freshly updated system is a more attractive and unrestricted field for a villain than a good old stable one with known holes. New software, new modules, and new functions mean new security holes, and they learn and use them faster than we can react and start defending.

 

That's why I say that nowadays 98 is the safest Windows OS ;) and that the need for updating is seriously overestimated.

That's why we can't rely on any even slightly outdated research saying that some tech or software provides the best defence.

Provided. At the moment of release. Not now.

 

What can I recommend here? Not much. Decrease the attack surface by not using popular products or a popular setup. Apply all the possible proactive defenses to your current setup (it may be Win XP or Win 7 SP1, but you really can stop racing in this Updates Grand Prix). Well, the only real system-level tools that work are software restriction policies/AppLocker, or EMET.

 

But maybe it is better not to install what you don't use. On XP I gave up .NET entirely (except 1.1, which comes with SP3). After uninstalling it I found that only 1 (!) application stopped working, and it was just a wallpaper changer that I could replace with any number of alternatives. In the end I decided to reinstall only the .NET 2.0 that was actually needed.

Well, I've updated IE to the last version available (but never intend to use it for surfing) and gave up the other popular browsers in favor of K-Meleon. I've swapped Adobe Reader for Foxit. I use Skype through Trillian (voice only) or have video calls with ooVoo, and Sylpheed instead of Outlook, Thunderbird, etc.

The second point is content filtering, so I use ad blockers along with web filters. DNS filtering is good, but I use K9 Web Protection (a second option could be the web filter from FortiClient).

Next, I use AnVir Task Manager for advanced startup control.

Last but not least: the DrBrain antivirus ;-)

 

This doesn't mean I restrict myself in where to go and what to do.

My security concept, on the contrary, lets me work as admin, visit any sites :angel: :w00t:, install software, etc. It works. And it lets me work, too.

I'm the real owner of my PC. I don't need all these real-time AV monsters, eating all my power and resources and producing conflicts.

But I do spend some unused CPU time on regular checks with a good AV scanner and some other selected antimalware tools.

No malware for years now. For me and my family. (And they aren't PC-expert-level users :whistle: )


  • 1 year later...

Couldn't one, instead of using the hosts file as a blacklist, use a proxy like Squid to accomplish the same thing? Using a proxy would avoid the system degradation that could be caused by inflating the hosts file, and it could speed up web browsing by saving bandwidth (Squid is a caching proxy).
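Squid can indeed block by destination domain with a `dstdomain` ACL. A minimal sketch of converting a hosts-style blacklist into the domain list Squid expects (the file layout and sample entries are invented for illustration):

```python
# Sketch: convert "0.0.0.0 bad.example" hosts lines into Squid dstdomain
# entries. A leading dot in the output makes Squid match subdomains too.
def hosts_to_dstdomain(lines):
    domains = set()
    for line in lines:
        line = line.split("#", 1)[0].strip()   # strip comments/whitespace
        parts = line.split()
        if len(parts) == 2 and parts[0] == "0.0.0.0":
            domains.add("." + parts[1])
    return sorted(domains)

sample = [
    "# sample blocklist -- invented entries",
    "0.0.0.0 ads.example.com",
    "0.0.0.0 tracker.example.net   # telemetry",
]
print("\n".join(hosts_to_dstdomain(sample)))
```

The resulting file would then be wired into `squid.conf` with something like `acl blocked dstdomain "/etc/squid/blocked.txt"` followed by `http_access deny blocked` (the path is illustrative). Note, though, that a proxy only covers traffic that actually goes through it, while a HOSTS file or DNS filter catches every process on the machine.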


There is no clear-cut method. Our computers are better nowadays, so a large HOSTS file isn't the problem it was years ago; for a single computer it may well be fine. Then there are the more advanced methods of blocking sites elsewhere in the chain, be it with a proxy, a firewall, or a DNS server. It may make more sense to do it on one of these separate network devices if you have multiple computers and don't want to manage HOSTS files on each.
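For reference, all a HOSTS-file blacklist does is map unwanted names to an unroutable address before DNS is ever consulted. A toy model of that lookup order (the entries and the stub upstream answer are invented):

```python
# Toy model of name resolution with a HOSTS blacklist: the local file wins
# over DNS, so blacklisted names never generate a query. Entries invented.
HOSTS = {
    "ads.example.com": "0.0.0.0",
    "tracker.example.net": "0.0.0.0",
}

def resolve(name, dns_lookup):
    """HOSTS first, then fall back to (stubbed) DNS."""
    return HOSTS.get(name) or dns_lookup(name)

stub_dns = lambda name: "198.51.100.1"        # invented upstream answer
print(resolve("ads.example.com", stub_dns))   # 0.0.0.0 -> blocked
print(resolve("www.example.com", stub_dns))   # 198.51.100.1 -> allowed
```

Moving the same map into a proxy, firewall, or DNS server just relocates this check off the individual machine, which is why the choice is mostly about how many computers you want to manage.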


Sure, you can run your own DNS server. I have done this, and it will serve name resolutions even to the system on which it is running. In my case I'm using a package called "Dual DHCP DNS Server", which is open source (and modified by me to expand the list size capacity). It has completely replaced the need to have a big hosts file on all my systems, and it provides additional advantages, such as wildcarded entries. And it serves to protect all the systems on my LAN.

Right now my blacklists have 52,000 site names plus 18,000 wildcarded domains.  Valid domains still resolve in a few milliseconds, and as an additional advantage it provides local caching. 

With the above in place the pages on this site load in around 1 to 1-1/2 seconds.
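The exact-plus-wildcard matching described above can be sketched as a label-suffix walk. The list entries here are invented examples, not the actual blacklists:

```python
# Sketch of blacklist matching with exact names plus wildcarded domains,
# the way a blocking DNS server would apply them. Entries are invented.
EXACT = {"ads.example.com"}
WILDCARD = {"doubleclick.net"}   # blocks the domain and every subdomain

def is_blocked(name):
    name = name.lower().rstrip(".")
    if name in EXACT:
        return True
    labels = name.split(".")
    # a.b.doubleclick.net -> b.doubleclick.net -> doubleclick.net -> net
    return any(".".join(labels[i:]) in WILDCARD for i in range(len(labels)))

print(is_blocked("ads.example.com"))      # True  (exact entry)
print(is_blocked("a.b.doubleclick.net"))  # True  (wildcard entry)
print(is_blocked("www.example.com"))      # False
```

A blocking server answers such names with an unroutable address instead of forwarding the query, and the suffix walk is what makes wildcards possible: a HOSTS file can only ever list exact names.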

If it's at all interesting to anyone, I have shared the script I use to generate DNS server blacklists here:

And the DualServer changes to increase list size:

-Noel

Edited by NoelC

But for what it's worth, I never could measure a downside to having a big hosts file. I'd love to hear what's being seen, real-world, that leads people to think a few tens of thousands of hosts entries affect performance negatively. The only small slowdown I've ever experienced is right after updating the file, the very first time it's read in. There's a DNS caching feature in Windows itself that speeds up all the rest.
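The caching referred to here is the Windows DNS Client service ("Dnscache"), which keeps recent answers in memory. A simplified TTL-cache model (illustrative, not the actual implementation) shows why only the first lookup pays the full cost:

```python
import time

# Simplified positive-answer cache in the spirit of the Windows DNS Client
# service: repeat lookups are served from memory until the TTL expires.
class DnsCache:
    def __init__(self):
        self._store = {}   # name -> (address, expiry timestamp)

    def lookup(self, name, resolver, ttl=300):
        entry = self._store.get(name)
        now = time.monotonic()
        if entry is not None and entry[1] > now:
            return entry[0]                  # hit: no hosts scan, no query
        addr = resolver(name)                # miss: do the slow work once
        self._store[name] = (addr, now + ttl)
        return addr

calls = []
def slow_resolver(name):
    calls.append(name)          # stands in for the hosts-file scan / query
    return "203.0.113.7"        # invented address

cache = DnsCache()
cache.lookup("www.example.com", slow_resolver)
cache.lookup("www.example.com", slow_resolver)   # second call is a cache hit
print(len(calls))   # the slow resolver ran only once
```

This matches the observation above: a big hosts file is only re-parsed when it changes, and repeat lookups never touch it at all.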

-Noel

