
Fun with web-bots using Win-98 Personal Web Server



Starting early this month, I installed the Personal Web Server that comes with Win-98 on one of my home systems and forwarded port 80 from my router to the PC. As expected, there were a lot of requests from bots for files that don't exist. After a few days I created those missing files by taking a 60 MB porn video and replicating it as many times as necessary, creating the various directory trees and placing the replicated files in the right spots so they'd be served up when requested. Examples of the most commonly-requested files are:

testproxy.php (this is very common)
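Replicating one decoy file into every path the bots keep probing can be sketched roughly like this. Only testproxy.php comes from the logs above; the other paths here are made-up stand-ins, and the tiny generated decoy stands in for the 60 MB video:

```python
import os
import shutil
import tempfile

# testproxy.php is from the actual logs; the other paths are hypothetical
# examples of the kind of trees the bots ask for.
REQUESTED_PATHS = [
    "testproxy.php",
    "admin/config.php",
    "scripts/setup.php",
]

def plant_decoys(webroot, decoy_file, requested_paths):
    """Copy one decoy file to every path the bots keep requesting."""
    planted = []
    for rel_path in requested_paths:
        target = os.path.join(webroot, rel_path)
        # Create the directory tree the bot expects, then drop the decoy in.
        os.makedirs(os.path.dirname(target), exist_ok=True)
        shutil.copyfile(decoy_file, target)
        planted.append(target)
    return planted

# Demo with a temp webroot and a 1 KB stand-in for the real decoy video.
webroot = tempfile.mkdtemp()
decoy = os.path.join(webroot, "decoy.bin")
with open(decoy, "wb") as f:
    f.write(b"\x00" * 1024)

planted = plant_decoys(webroot, decoy, REQUESTED_PATHS)
```

On the real Win-98 box this was presumably done by hand in Explorer; the sketch just shows the shape of the trick.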

The default.asp page gets requested a lot, but I don't think it can be served as a binary file: only a fraction of the file ever gets transferred, and I think that corresponds to an EOF character being hit. Some of the stranger file requests are:


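One plausible reading of the default.asp truncation, assuming the server (or the script engine it hands .asp files to) reads the file in text mode and treats the classic DOS Ctrl-Z byte (0x1A) as end-of-file: in random binary data that byte turns up after only a few hundred bytes on average, which would explain why just a sliver of the file goes out. A quick check of how little would survive under that assumption:

```python
import random

# Assumption: transfer stops at the first Ctrl-Z (0x1A), the DOS/Windows
# text-mode EOF marker. Simulate with 1 MB of random "video" bytes.
random.seed(42)
video = bytes(random.randrange(256) for _ in range(1_000_000))

# Everything before the first 0x1A is what would actually be served.
served = video[: video.index(0x1A)] if 0x1A in video else video
fraction = len(served) / len(video)
```

With uniformly random bytes the first 0x1A lands around offset 255 on average, so well under 1% of the file gets through, consistent with "only a fraction of it ever gets transferred".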
I've served up that 60 MB file about 85 times so far, totalling just over 5.15 GB. My upload speed is only about 70 KB/s, so I figure I'm tying up those bots for close to 15 minutes each time they request the file. My internet plan is unlimited, and this is the upload direction we're talking about, so I'll probably substitute a 1 or 2 GB file at some point.
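The back-of-the-envelope arithmetic behind those numbers, using the binary KB/MB that the era's tools reported:

```python
# Figures from the post: 60 MB decoy file, ~70 KB/s upload speed,
# served about 85 times.
FILE_MB = 60
UPLOAD_KB_PER_S = 70

seconds_per_hit = FILE_MB * 1024 / UPLOAD_KB_PER_S
minutes = seconds_per_hit / 60      # roughly 14.6 minutes tied up per request

total_mb = 85 * FILE_MB             # 5,100 MB total, i.e. about 5 GB served
```

A 1 GB substitute at the same 70 KB/s would hold a bot for over four hours per request, if it stays connected that long.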

Over at $dayjob my web server is seeing the same sort of hits for these files, but as a twist I'm performing an HTTP redirection on them to a 4 GB file hosted on a remote server. I've tested whether the bots follow the redirection by pointing it at the PWS on my home server, and in most cases they do follow it. There doesn't seem to be a way to set up redirections on Win-98 PWS the way there is on IIS4 running on NT4.
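The redirect trick itself is just an HTTP 302 with a Location header pointing at the big file. A minimal sketch of the idea using Python's standard library (the target URL here is a placeholder, not the actual remote server from the post):

```python
import threading
import http.client
from http.server import BaseHTTPRequestHandler, HTTPServer

DECOY_URL = "http://example.com/bigfile.bin"   # hypothetical remote host
BOT_PATHS = {"/testproxy.php", "/default.asp"}  # paths seen in the logs

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in BOT_PATHS:
            # Send the bot elsewhere for its "file".
            self.send_response(302)
            self.send_header("Location", DECOY_URL)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Probe it the way a bot would. http.client does not follow redirects,
# so we can inspect the raw 302 and its Location header.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/testproxy.php")
resp = conn.getresponse()
status, location = resp.status, resp.getheader("Location")
resp.read()
conn.close()
server.shutdown()
```

Whether a given bot follows the 302 (as most did in the test above) depends entirely on its HTTP client; the dumber scanners just log the status and move on.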



  • 1 month later...

Ah yes, I remember several of these requests from when I hosted my own site from home.

A lot of that is basically bots seeking out vulnerable parts of a PHP server, or even IIS (which explains the requests for .asp documents).

The other three that are requesting a directory seem to be random and aren't necessarily explainable. That "w00t" request comes in various shapes and sizes, although I can't recall specifically whether it was targeting anything.

Basically, with Apache, I would block these not with robots.txt but with the .htaccess file instead, particularly the Baidu search-engine bots, which wouldn't sensibly stick to requesting files that exist, but kept looking for directories on the server that DIDN'T exist. My attitude was "fair enough if you want to index my site, but FFS, INDEX WHAT IS THERE! Don't go looking for 403's and that sort of thing!"
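For anyone wanting to do the same, a sketch of that .htaccess approach, using the Apache 2.2-era syntax that matched servers of the time. "Baiduspider" is Baidu's published crawler User-Agent string; adjust the pattern for whichever bots your own logs show:

```apache
# Tag any request whose User-Agent looks like Baidu's crawler...
SetEnvIfNoCase User-Agent "Baiduspider" bad_bot

# ...then deny those tagged requests while allowing everyone else.
<Limit GET POST HEAD>
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Limit>
```

Unlike robots.txt, which only works if the bot chooses to honour it, this refuses the requests at the server, which is the point when the bot is ignoring your site structure anyway.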

