Nomen Posted October 22, 2016

Starting early this month, I installed the Personal Web Server that comes with Win 98 on one of my home systems and forwarded port 80 from my router to the PC. As expected, there were a lot of requests from bots for files that don't exist. After a few days I created those missing files by taking a 60 MB porn video and replicating it as many times as necessary, creating the various directory trees and placing the replicated files in the correct places so they'd be served up when requested.

Examples of the most commonly requested files:

config.php
setup.php
sitemap.xml
testproxy.php (this one is very common)
stssys.htm
robots.txt
wp-login.php
xmlrpc.php

The default.asp page gets requested a lot, but I don't think it can be a binary file, because only a fraction of it ever gets uploaded; I suspect that corresponds to an EOF character being hit.

Some of the stranger file requests:

/muieblackcat
/ncsi.txt
/w00tw00t.at.ISC.SANS.DFind:)

I've served up that 60 MB file about 85 times so far, totalling just over 5.15 GB. My upload speed is only about 70 KB/sec, so I figure I'm tying up those bots for at least 15 minutes each time they request one of these files. My internet plan is unlimited, and this is all in the upload direction, so I'm probably going to substitute a 1 or 2 GB file at some point.

Over at $dayjob my web server is seeing the same sort of hits for these files, but with a twist: I'm performing an HTTP redirect on them to a 4 GB file hosted by a remote server. I tested whether the bots follow the redirect by pointing it at the PWS on my home server, and in most cases the redirect is followed. There doesn't seem to be a way to set up redirects on Win 98 PWS the way there is on IIS 4 running on NT 4.
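Planting the decoy copies can be scripted rather than done by hand. Here's a minimal sketch, assuming a web root and decoy file of your own choosing; the path list below is taken from the commonly-requested files above (robots.txt omitted, since that one is better left as a real text file), and `plant_decoys` is a hypothetical helper name, not anything from PWS:

```python
import shutil
from pathlib import Path

# Commonly probed paths, per the list above. Flat files here, but the
# code creates intermediate directories too if a path contains them.
PROBED_PATHS = [
    "config.php",
    "setup.php",
    "sitemap.xml",
    "testproxy.php",
    "stssys.htm",
    "wp-login.php",
    "xmlrpc.php",
]

def plant_decoys(webroot: str, decoy: str) -> int:
    """Copy the decoy file to every probed path under webroot.

    Creates missing parent directories as needed and returns the
    number of copies made.
    """
    root = Path(webroot)
    count = 0
    for rel in PROBED_PATHS:
        target = root / rel
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copyfile(decoy, target)
        count += 1
    return count
```

Swapping in a bigger decoy later is then just a matter of re-running the function with a different source file.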
rloew Posted October 22, 2016

I'm not sure what the file-size limit of PWS is, but I wouldn't stop at 4 GiB. A proper ROBOTS.TXT file might cut down on the other requests, at least from the legitimate scanners.
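For reference, the smallest robots.txt that tells well-behaved crawlers to stay out of the whole site is just two lines (malicious scanners will ignore it, of course, so it only helps with the legitimate ones):

```
User-agent: *
Disallow: /
```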
nostaglic98 Posted November 27, 2016

Ah yes, I remember several of these requests from when I hosted my own site from home. A lot of that is bots seeking out vulnerable parts of a PHP server, or even IIS (which explains the requests for .asp documents). The three that request a directory seem to be random and aren't easily explained. That "w00t" request comes in various shapes and sizes, although I can't recall specifically whether it was targeting anything.

With Apache, I would block these not with robots.txt but with the .htaccess file instead, particularly the Baidu search engine bots, which would not be sensible and request files that exist, but kept looking for directories on the server that DIDN'T exist. My attitude was: fair enough if you want to index my site, but FFS, INDEX WHAT IS THERE! Don't go looking for 403s and that sort of thing!
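An .htaccess block of that sort can be done with mod_rewrite. A minimal sketch, assuming Apache with mod_rewrite enabled and .htaccess overrides allowed ("Baiduspider" is the user-agent token Baidu's crawler announces):

```apache
# Return 403 Forbidden to any request whose User-Agent contains
# "Baiduspider" (case-insensitive match via the NC flag).
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Baiduspider [NC]
RewriteRule .* - [F,L]
```

Unlike robots.txt, this is enforced by the server itself, so the crawler doesn't get a choice about honoring it.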