Google and Certbot (Letsencrypt)

Like most people I use Certbot (the standard client for Let’s Encrypt) to create SSL certificates for my sites. It’s a great service, very easy to use, and it generally works well.
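Normal operation is a single command, typically run daily from cron; only certificates close to expiry (30 days by default) are actually renewed:

    # Unattended renewal of all managed certificates.
    certbot renew --quiet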

Recently the server running www.coker.com.au among other domains couldn’t get a certbot certificate renewed. Here’s the error message:

Failed authorization procedure. mail.gw90.de (http-01): urn:acme:error:unauthorized :: The client lacks sufficient authorization :: "mail.gw90.de" was considered an unsafe domain by a third-party API, listen.gw90.de (http-01): urn:acme:error:unauthorized :: The client lacks sufficient authorization :: "listen.gw90.de" was considered an unsafe domain by a third-party API

IMPORTANT NOTES:
 - The following errors were reported by the server:

   Domain: mail.gw90.de
   Type:   unauthorized
   Detail: "mail.gw90.de" was considered an unsafe domain by a third-
   party API

   Domain: listen.gw90.de
   Type:   unauthorized
   Detail: "listen.gw90.de" was considered an unsafe domain by a
   third-party API
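For anyone debugging a similar failure, this is roughly how I’d retry the challenge without burning production rate limits. The webroot path is an assumption, substitute whatever your Apache virtual hosts actually use; --dry-run runs against the Let’s Encrypt staging environment:

    # Retry the http-01 challenge for the affected names against staging.
    certbot certonly --webroot -w /var/www/html \
        -d mail.gw90.de -d listen.gw90.de --dry-run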

It turns out that Google Safebrowsing had listed those two sites. Visit https://listen.gw90.de/ or https://mail.gw90.de/ today (and maybe for some weeks or months in the future) using Google Chrome (or any other browser that uses the Google Safebrowsing database) and it will tell you the site is “Dangerous” and probably refuse to let you in.
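You can query a URL’s status yourself via the Safe Browsing Lookup API (v4). This is a minimal sketch assuming you have a Safe Browsing API key from the Google developer console; an empty response (no “matches”) means the URL isn’t currently listed:

    curl -s -X POST \
        "https://safebrowsing.googleapis.com/v4/threatMatches:find?key=YOUR_API_KEY" \
        -H "Content-Type: application/json" \
        -d '{
          "client": {"clientId": "example", "clientVersion": "1.0"},
          "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": "https://mail.gw90.de/"},
                              {"url": "https://listen.gw90.de/"}]
          }
        }'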

One thing to note is that neither of those sites has any real content. I only set them up in Apache to get SSL certificates that are used for other purposes (like mail transfer, as the name suggests). If Google had listed my blog as a “Dangerous” site I wouldn’t be so surprised; WordPress has had more than a few security issues in the past and it’s not implausible that someone could have compromised it and made it serve up hostile content without me noticing. But the two sites in question have a DocumentRoot that is owned by root and was (until a few days ago) entirely empty; now they have an index.html that just says “This site is empty”. It’s theoretically possible that someone could have exploited an RCE bug in Apache to make it serve up content that isn’t in the DocumentRoot, but that seems unlikely (why waste an Apache 0day on one of the least important of my personal sites?). It is possible that the virtual machine in question was compromised (a VM on that server has been compromised before [1]), but it seems unlikely that an attacker would host bad things on those web sites if it was.
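As a sanity check, it’s worth confirming that what Apache serves really matches what’s on disk. The DocumentRoot path here is an assumption, use whatever the virtual host actually configures:

    # The DocumentRoot should be owned by root and contain only the placeholder.
    ls -la /var/www/mail.gw90.de/

    # What Apache actually serves should match the file on disk.
    curl -sk https://mail.gw90.de/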

Now it could be that some other hostname under that domain had something inappropriate (I haven’t yet investigated all possibilities). But if so, Google’s algorithm has a couple of significant problems. Firstly, if they are blacklisting sites related to one that had an issue then it would probably make more sense to blacklist by IP address (which would mean including some coker.com.au entries on the same IP). In the case of a compromised server it seems more likely to have multiple bad sites on one IP than multiple bad subdomains on different IPs (given that none of the hostnames in question have changed IP address recently, and Google of course knows this). The next issue is that extending a blacklisting doesn’t make sense unless there is evidence of hostile intent. I’m pretty sure that Google won’t blacklist all of ibm.com when (not if) a server in that domain gets compromised. I guess they have different policies for sites of different scale.
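Comparing the addresses is trivial, which is part of why the per-hostname listing seems odd; if these all print the same address then they are virtual hosts on one server:

    # Compare the A records for the hostnames in question.
    for h in mail.gw90.de listen.gw90.de www.coker.com.au; do
        echo "$h: $(dig +short "$h" | head -n 1)"
    done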

A friend and I have both reported the sites in question to Google as not being harmful, but that hasn’t changed anything yet. I’m very disappointed in Google: listing sites, not providing any reason why (it could be that a hostname under that domain was compromised, and if so it’s not fixed yet BECAUSE GOOGLE DIDN’T REPORT A PROBLEM), and not removing the listing when it’s totally obvious there’s no basis for it.

While it makes sense for Let’s Encrypt (the CA behind Certbot) to refuse to issue SSL certificates to malicious sites, it seems that they haven’t chosen a great service for determining which sites are bad.

Anyway, the end result was that some of my sites had an expired SSL certificate for a day. I decided not to renew the certificates before they expired, to give Google a better chance of noticing their mistake, and then I was busy at the time they expired. Now, presumably, as the sites in question have an invalid SSL certificate, it will be even harder to convince anyone that they are not hostile.
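Checking how much time is left on a certificate is easy from the command line; this prints the expiry date of whatever certificate the host presents:

    # Print the notAfter date of the certificate served on port 443.
    echo | openssl s_client -servername www.coker.com.au \
        -connect www.coker.com.au:443 2>/dev/null |
        openssl x509 -noout -enddate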
