Prevent access to website via mx subdomain


I’ve noticed some entries in my logs where people (most likely bots/hackers) are trying to log in to my site via the mx subdomain (which seems to be automatically added to the config file).

I don’t want this to happen. Is there a way to stop this subdomain leading to the main website? I’ve probably missed something obvious, but I can’t put my finger on it.


Same here, and for the ftp subdomain. Strange, because only the www subdomain is explicitly set as a ServerAlias in the Apache site file, which implies Apache shouldn’t serve a subdomain that isn’t listed.
Also I have ssl-only set, but the mx subdomain cheerfully loads the page with no SSL.

You could probably set Apache to trap those subdomains explicitly, but that feels like it’s hiding the symptoms instead of solving the problem.
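If you did want to trap them, it would look something like this (a sketch, assuming a Debian-style Apache layout with mod_alias enabled, and example.com standing in for your real domain):

```apache
# Hypothetical trap vhost, dropped into /etc/apache2/sites-available/
# and enabled with a2ensite. Catches the unwanted subdomains explicitly
# so requests for them never fall through to the main site.
<VirtualHost *:80>
    ServerName mx.example.com
    ServerAlias ftp.example.com
    # mod_alias: respond 403 Forbidden to everything under /
    Redirect 403 /
</VirtualHost>
```

As noted, though, that hides the symptom rather than fixing whatever is handing those hostnames to the main site in the first place.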


Curious things going on.

I first noticed it on a client site, which is SSL only, but if you go to mx. it complains about the certificate (as you’d expect) but you can then force it to continue loading (as most browsers will let you do).

On my sites, the mx. version of each website goes to the ‘default’ domain, rather than the named domain in question.

Either way it’s not expected behaviour.

When I visit the ftp. subdomain of a site, I get a popup asking if I want to load up my FTP software. (I presume this is a browser intervention).


Changing the lines to select domain name instead of IP address doesn’t help, either. I’m not sure I want to wade in any further, though. Apache configuration isn’t my strong suit…


At one level I’m not overly worried: all my security/defences are still in place, so anyone attempting WordPress logins etc. will be blocked in the same way. It just feels a bit messy. I’m not even sure if the mx subdomain is used. I assume it’s meant for mail server traffic, but as far as I’m aware it’s not being used that way.

(Edit: I expect it’s used externally for having mail delivered to me.)


Without looking at it, this sounds like the actions of the ‘mass hosting’ configuration, seeing the hostname being passed and matching things up as much as possible.

Assuming all the sites have their own configs (as they probably all have HTTPS configured), it may be worth disabling the zz-mass-hosting configuration with:

sudo a2dissite zz-mass-hosting.conf
sudo a2dissite zz-mass-hosting.ssl.conf
sudo symbiosis-httpd-configure
sudo service apache2 reload

… and seeing if it still happens.

If needed, you can re-enable it by repeating the above with a2ensite rather than a2dissite.


If your mx subdomain resolves to the same IP address as your web site (and it will, with the default Symbiosis setup), then yes, it is expected behaviour. The browser asks Apache for that hostname, but Apache has no configuration that refers to it, so it serves the first virtual host available. There likely won’t be a certificate that matches, but you could get that to work if you wanted.

You could configure another subdomain to serve an error message, if you wish.

The same goes for the ftp subdomain, and a few others that Symbiosis thinks might usefully point to the server. And ANY other domain that resolves to your IP address, even if you don’t own the domain.
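One way to handle all of those at once (a sketch, not Symbiosis-specific) is a default catch-all virtual host. Because Apache falls back to the first virtual host it loads, a site file that sorts before all the others becomes the fallback for any unmatched Host header:

```apache
# Hypothetical 000-default catch-all. Sorting first alphabetically makes
# this the vhost Apache serves for hostnames no other vhost claims
# (mx., ftp., or domains you don't even own pointing at your IP).
<VirtualHost *:80>
    ServerName catchall.invalid
    # Serve nothing useful to unmatched Host headers.
    Redirect 404 /
</VirtualHost>
```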

Alternatively, you could hack the DNS records to remove those domains, but that might mess with email delivery, or ftp access, and so on.


On mine I’ve edited the default DNS template to remove the mx record (as well as ftp.*) and just use a fixed MX host across all domains on that server. That host then points to Roundcube on the server.
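In generic BIND zone-file terms (Symbiosis itself generates tinydns data from its templates, but the idea is the same, and mail.myserver.example here is a made-up fixed MX host), the end result looks something like:

```
; Before: a per-domain mx.* A record that also resolves to the web server
; mx.example.com.   IN A   192.0.2.1
; example.com.      IN MX  15 mx.example.com.

; After: no mx.* A record; every domain shares one fixed MX host
example.com.        IN MX  10 mail.myserver.example.
```

With no A record for mx.example.com, there is nothing for a browser to resolve, while mail delivery still works via the shared MX host.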