My Git Server Was DDoSed
On the evening of the 27th of July, 2020 (two days ago), I noticed some odd behaviour with my git server. UptimeRobot, the service that provides status.paritybit.ca, kept sending me emails every dozen or so minutes telling me that my git server was down and then up again. This happened a number of times, but I just chalked it up to a misbehaving server or heavier-than-normal traffic.
It wasn’t until the notifications became so numerous, and I noticed that turning the machine off and on again wasn’t working, that I figured something wasn’t right. I shut down the server for the night to prevent it from being continuously overwhelmed (and thereby me receiving a hundred emails overnight) and went to sleep with the idea of fixing it in the morning.
When I woke up, the first thing I tried was turning the server back on and waiting to see if it went down again. Maybe whatever was going on had fixed itself in the ~13 hours that the server was offline. Unfortunately, within 10 minutes the server was down again.
My next step in diagnosing the problem was to check the server logs. Normally, I have both the access.log and error.log turned off on my servers because I don’t need the content of those logs under normal circumstances. I turned logging back on just for git.paritybit.ca and watched the logs with tail -f to try to get an idea of what was going on.
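Re-enabling logging for a single site happens in that site’s nginx server block (nginx is confirmed later in this post; the file paths and layout here are my assumptions). A minimal sketch:

```nginx
# sketch of the vhost config; log paths are assumptions
server {
    server_name git.paritybit.ca;

    # re-enable logging for this vhost only
    access_log /var/log/nginx/git.access.log;
    error_log  /var/log/nginx/git.error.log;

    # ... rest of the existing configuration ...
}
```

After reloading nginx, `tail -f /var/log/nginx/git.access.log` shows requests as they arrive.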
I noticed a flood of sketchy-looking user agents constantly hitting various URLs in a pattern that was clearly indicative of bots. These bots were requesting what seemed like every single diff of every single file of every single commit, and this was bringing my poor PineA64+ to its knees trying to keep up with syntax highlighting all the code and generating the .zip files for every single snapshot. Luckily, it was just the web server that kept going down; git access over SSH was unaffected.
These are the kinds of user agents I noticed:
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; The World)"
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; TencentTraveler 4.0)"
"Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0"
"Opera/9.80 (Macintosh; Intel Mac OS X 10.6.8; U; en) Presto/2.8.131 Version/11.11"
"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Maxthon 2.0)"
My first thought upon seeing this traffic was that these were just some overzealous bots. I changed my robots.txt file to deny scraping of everything but the index page of my git server, and I set robots=none in the web frontend’s configuration to prevent the site from being indexed.
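For reference, a robots.txt that permits only the index page might look like this sketch (Allow and the "$" end-anchor are common extensions honoured by major crawlers, not part of the original robots.txt standard):

```
# allow only the index page; Allow and "$" are widely supported extensions
User-agent: *
Allow: /$
Disallow: /
```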
Unfortunately, that didn’t help one bit. When a bot ignores robots directives, it’s more than likely a malicious bot (or, at the very least, one made by an incompetent programmer). My next action was to set up Fail2Ban to catch these malicious actors and temporarily ban them from accessing my server. I set up an nginx-badbots filter (by copying the apache-badbots filter and adding the malicious user agents to the badbotscustom variable), set the ban time to 2 days, the retry count to 2, and the action to add a drop rule to my firewall.
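The resulting setup looks roughly like the jail definition below. This is a sketch rather than my exact config; the log path and banaction in particular are assumptions:

```
# /etc/fail2ban/jail.local (sketch; paths and banaction are assumptions)
[nginx-badbots]
enabled  = true
port     = http,https
# filter is a copy of apache-badbots with the bad agents added to badbotscustom
filter   = nginx-badbots
logpath  = /var/log/nginx/git.access.log
maxretry = 2
# 2 days, in seconds
bantime  = 172800
# inserts a firewall DROP rule for the offending IP
banaction = iptables-allports
```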
In the first few minutes, about 10 IPs were added to the list. I let this keep going for an hour, all the while monitoring it and watching my git server continue to come back online only to be knocked down again. After about 15 minutes, I had the idea to run a GeoIP lookup on the banned addresses, and this is what I saw:
Clearly, either there was someone using a botnet with a lot of infected Chinese computers or someone in China was DDoSing my server using whatever IPs they controlled. I let it get this bad before deciding I had to take much more drastic measures:
At this rate, Fail2Ban was not working and, after an hour of adding hundreds of IPs to the blocklist, my server was still going down. It was suggested to me by someone on the Fediverse (I was ranting about this whole thing over there) that I might be able to stop the attack by pre-emptively banning all Chinese IPs. I left that as a last resort because it’s an extreme measure that has the very real implication of locking out good people from viewing my sites. It’s not like everyone in China is a malicious actor and there may even be people over there who find my stuff useful.
Unfortunately, though, the attack showed no signs of slowing down, and I had to take this action just to get my server back up and running again. I found a list of IPv4 address ranges for China and added all of those IPs to my firewall (an extra ~2200 rules). The server came back online almost instantly and the list of IPs banned by Fail2Ban stopped growing. Since then, my git server has stayed up without any further issues.
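The bulk-blocking step can be sketched as a small shell loop over the downloaded list of CIDR ranges. The file name and sample ranges below are stand-ins for the real ~2200-entry list, and the loop only prints the iptables commands (a dry run):

```shell
#!/bin/sh
# Stand-in for the downloaded list of Chinese CIDR ranges
cat > /tmp/cn.zone <<'EOF'
1.0.1.0/24
1.0.2.0/23
EOF

# Dry run: print one DROP rule per range; remove "echo" to apply for real
while read -r range; do
    echo iptables -I INPUT -s "$range" -j DROP
done < /tmp/cn.zone
```

With thousands of ranges, an ipset (`ipset create china hash:net`, then a single `iptables -m set --match-set china src -j DROP` rule) scales better than one rule per range, but the one-rule-per-range approach works.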
It doesn’t seem to be over though. It would appear the script kiddies caught on to my blanket IP ban and have been hitting the server again from various countries. I just checked as of writing this blog post, and this is now the output of Fail2Ban + GeoIP lookup:
Clearly the attack is still going on but, thanks to Fail2Ban, it’s no longer significant enough to keep bringing down my Git server. I also don’t notice any lag out of the ordinary when using it myself which is a good sign.
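A per-country tally like the one above can be produced with standard tools. In practice the IPs would come from `fail2ban-client status nginx-badbots` (the jail name is an assumption) and each would be resolved with `geoiplookup` from the geoip-bin package; here, sample lookup results stand in so the counting step runs on its own:

```shell
# Sample geoiplookup-style lines standing in for real lookups of banned IPs
printf '%s\n' \
    'GeoIP Country Edition: CN, China' \
    'GeoIP Country Edition: CN, China' \
    'GeoIP Country Edition: US, United States' \
    | sort | uniq -c | sort -rn
```

This prints each country with its count, most frequent first.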
What perplexes me about this DDoS is that this person or group, whoever they are, is only attacking git.paritybit.ca and none of my other services. My website, Pleroma, and Matrix all seem to be behaving just fine and there’s no extraneous bandwidth usage from any of them. My git server is also just a web server for people to look at and clone my repositories; there are no accounts to take over or databases to hack. Is this some automated attack designed to scrape all my code in an attempt to find credentials? Who knows…
I’ll give it a few more days and see if I can unblock China from accessing my web services because I’d rather not block an entire country from my stuff just because of one malicious actor. Unfortunately, if the attack continues to the point of bringing down my git server again, I’ll have to leave the block in place (though I may try rate-limiting IPs that Fail2Ban catches instead of outright blocking them).
This is my eighty-second post for the #100DaysToOffload challenge. You can learn more about this challenge over at https://100daystooffload.com.