Rootsecure.net
The Security News Site For Systems Administrators & Hackers Wednesday, 22nd October 2014 @ 16:21:32 GMT 
Reports | Web Server Compromise?
{18th Apr 2003}
PART 1 - No system is 100% secure, so what happens when someone does get in?

It depends upon the motive of the attacker.  If you were targeted specifically, it is possible the attacker will first attempt to hide their presence on the system, and then remain there quietly: observing, altering files, or using it as a stepping stone to launch further attacks.  If, however, you just happened to be running a specific unpatched piece of software for which a vulnerability or exploit code was recently published, it is likely your server was compromised only because it was an easy target.  In this case your index page was probably renamed and a new one left in its place carrying some (relevant at the time) political message from the defacer, with your page in its altered state then appearing, for all to see, at one of the defacement mirror sites such as Zone-H or Delta 5 Security.

The first scenario would require a custom response given the specific circumstances, which is why the second is usually preferable, as it leaves the systems administrator with a clear course of action:
  1. Confirm that the attacker got in through a known vulnerability, changed a page, then got out (achieved by checking log files, software vendors' pages, and security vulnerability reporting websites such as Bugtraq, Secunia, and Securiteam).
  2. Patch the vulnerability.
  3. Perform a check of important files on the server to confirm that they were not altered, e.g. by running a low-level comparison against backups.
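The file check in step 3 can be sketched with standard Unix tools. This is an illustration only: the /tmp/demo paths are placeholders, and in practice you would compare your actual document root against an offline, trusted backup.

```shell
#!/bin/sh
# Sketch of step 3: low-level comparison of the live web tree against
# a trusted backup.  All paths here are placeholders for illustration.
mkdir -p /tmp/demo/live /tmp/demo/backup
printf 'hello\n' > /tmp/demo/live/index.html
printf 'hello\n' > /tmp/demo/backup/index.html

# diff -r exits 0 only when every file under both trees is identical
if diff -r /tmp/demo/backup /tmp/demo/live >/dev/null 2>&1; then
    echo "clean"
else
    echo "ALTERED"
fi
```

A cryptographic checksum list (e.g. from md5sum) generated at install time and stored off-host would serve the same purpose without needing a full backup copy on hand.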
PART 2 - Web page security is about managing visitors and their level of access to a computer system.

On your system, you have every right to control what service, if any, is given to a specific visitor.  If your server is defaced, and you are under the misguided notion that your customers, potential customers, and visitors are better off not knowing, why not try to prevent the defacement from being kept for posterity in a publicly available archive?

To this end, systems administrators who have grown tired of having their server defacements mirrored have been actively blocking access to their systems from Zone-H's IP address (at upstream routers/web servers), which is reported as having had some success: "A lot of websites started to BAN Zone-H IP address, preventing us from taking the mirror".  The effectiveness of this is believed to be limited, however, as Zone-H have changed IP address twice in the past few months alone (around 28th February & 11th April), leaving systems administrators to play a time-consuming game of catch-up.
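At the web-server level, such an IP block can be expressed with Apache's standard access-control directives. This is a sketch only: 192.0.2.10 is an example placeholder (not Zone-H's actual address, which as noted changes over time), and the directory path should be your own document root.

```
# Deny all requests from one IP address.
# 192.0.2.10 is a placeholder -- substitute the address currently in use.
<Directory "/var/www/html">
    Order Allow,Deny
    Allow from all
    Deny from 192.0.2.10
</Directory>
```

The same block applied at an upstream router has the advantage of also covering non-HTTP probes, at the cost of needing router access.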

An alternate (or additional) step is to block access based on the 'user agent' string, an identifier usually sent by web clients stating what software they are.  Zone-H's mirroring robot is currently known to send the identifier "Java1.3.1_02", and it is thought that this cannot (easily) be changed without changing the actual version of Java on the host system.

In Apache the following code, placed in a .htaccess file or inside a <Directory> section of the Apache config file (for greater security), can be used to block (issue a 403 Forbidden response to) the Zone-H mirroring bot.  It requires mod_rewrite to be enabled.

RewriteEngine on
Options +FollowSymlinks
RewriteBase /
# Dots are escaped so the pattern matches the literal version string
RewriteCond %{HTTP_USER_AGENT} ^Java1\.3\.1_02
# add [OR] to the end of the last existing RewriteCond if combining with other rules
RewriteRule ^.*$ - [F]


However, this also means any other client connecting to the website in question using that same identifier would also be blocked.  On most sites this would be only a minor issue; the main exception is sites providing an XML feed, as there are Java-based aggregators available which may send the same identifier.  Currently very little is known about the mirroring bot, apart from the fact that it is run from the same server as the website and is reportedly coded in Sun's Java.
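For sites serving an XML feed, one possible refinement (a sketch only; "/rss.xml" is an example path, not a standard location) is to exempt the feed from the user-agent block, so Java-based aggregators can still fetch it:

```
RewriteEngine on
# Block the Java identifier everywhere except the XML feed,
# so Java-based aggregators can still fetch it.
RewriteCond %{HTTP_USER_AGENT} ^Java1\.3\.1_02
RewriteCond %{REQUEST_URI} !^/rss\.xml$
RewriteRule ^.*$ - [F]
```

The second RewriteCond is ANDed with the first by default, so the 403 is only issued when both the user agent matches and the request is not for the feed.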

In the future, those wishing to block the Zone-H mirroring bot via IP address will find it increasingly difficult at best, since Zone-H are "...implementing a shadowed network of mirror robots distributed around the world.  This would prevent websites to BAN Zone-H IP addresses.", leaving blocking via 'user agent' as the only realistic option.