Uhh... Dude, you got hacked

by attong
In late January I started noticing some strange tasks running on one of my servers. I ran updates, checked that everything was okay, and things appeared to be just fine.

Then I started to notice strange spikes in CPU usage, along with some new outbound traffic from the server. It would surge and die down, surge and die down. The server serves up JSON payloads as well as regular HTML, so some of this is to be expected, but when the traffic started surging to numbers that were almost unbelievable I became very suspicious.

After installing iftop to monitor bandwidth and traffic, I suddenly saw MASSIVE spikes in inbound and outbound traffic coming from a few different IP ranges. When I say massive I mean MASSIVE (4 TB+ in less than a day).
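
For anyone wanting to do the same, the monitoring itself was nothing fancy. A minimal sketch, assuming a Debian/Ubuntu box and an interface named eth0 (adjust both for your setup):

# Install iftop (Debian/Ubuntu package manager assumed)
sudo apt-get install iftop

# Watch live bandwidth per connection on the public interface;
# -P shows port numbers so odd destinations stand out
sudo iftop -i eth0 -P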

This was beyond shocking. My CPU was humming along, and for all intents and purposes you really couldn’t tell this was happening unless you were watching the logs. Services were not being interrupted; there was a slight sluggishness, but that was also happening during peak traffic time.

Running iptraf allowed me to follow the ports being used by this malicious traffic. It had settled on port 8, which up to this point I didn’t even realize was open; I am not a sysadmin, so this wasn’t something I had ever run into before. I then put the server into lockdown using ufw: no traffic was allowed to go in or out. I watched my top processes and noticed a trend: there were 2–3 random tasks that would shoot up in CPU usage and then quickly die and go away. When the server was in lockdown mode they would sleep; if I opened the server back up they would instantly spin back up and start sending out traffic again.
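
The lockdown itself looked roughly like this. This is a sketch, assuming ufw is installed; I kept SSH reachable from my own IP (shown here as a placeholder) so I didn’t lock myself out:

# Default-deny everything, inbound and outbound
sudo ufw default deny incoming
sudo ufw default deny outgoing

# Keep SSH open from one trusted address (203.0.113.10 is a placeholder)
sudo ufw allow from 203.0.113.10 to any port 22 proto tcp

sudo ufw enable
sudo ufw status verbose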

At this point I knew I had been compromised and needed to do a complete system cleanse: find and remove whatever was causing this. My first notion was to kill the processes and watch what would happen, so I would kill them and watch as they died and were replaced by yet another random task.

To The Logs!

After seeing how these processes behaved I went to check my logs. There was nothing at all related to the processes, other than ufw doing its job blocking all traffic, so I began to do some more digging.

I looked at the startup logs and did notice references to some processes that were not normal and were no longer actually on the system. I went back to the logs and started a watch on my auth.log. What I noticed after watching for a bit were my cronjobs: every so often I would see a blip of a cronjob starting. At first this wasn’t much of a worry, as I have jobs that run regularly to do cleanup and other server things, so I didn’t pay much attention.
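
The watch was nothing more than tailing the log, roughly like this (assuming a Debian/Ubuntu-style /var/log layout):

# Follow auth.log live and highlight cron session entries
sudo tail -f /var/log/auth.log | grep --line-buffered -i cron

# Cron's own activity also lands in syslog on this kind of setup
grep CRON /var/log/syslog | tail -n 50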

Start Digging.

At this point I knew there was an application running that was creating tasks, running them, and then killing them off. I knew there had to be things in my init.d that were being run on startup.

Looking at my /etc/init.d/ directory I noticed a few files that stood out. They matched the names of the offending processes, and one in particular really stood out, dated Jan 27.

I knew that nothing had been done that day related to startup tasks, so I had the first big piece of the trail. After doing more digging I found the symlinked tasks in my /etc/rc1.d through /etc/rc5.d directories.
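
If you want to retrace this, sorting the init scripts by modification time and then chasing their rc symlinks is enough. A sketch, where "badscript" stands in for whatever name the offending script has on your box:

# List init scripts newest-first; anything dated around the compromise stands out
ls -lt /etc/init.d/ | head -n 20

# Find the rc symlinks that point at the suspicious script
find /etc/rc?.d -type l -lname '*badscript*' -ls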

Now that I had these identified, it was time to really start finding the cause. I rebooted the system to see what would happen, and sure enough the tasks were there at startup, or had been removed and replaced with new ones. I also again took note of my cronjobs (this should have been the first place I looked).

“Okay, I know the issue but what is the cause… explore ALL THE LOGS!”

In my crontab I didn’t notice anything out of the ordinary, just my normal jobs, but then I did notice my cron.hourly had been modified… in fact, it had been modified as recently as that day. Opening it up I found a file, udev.sh.

Upon opening udev.sh I discovered a path to /lib/libgcc4.so, so I went there. Guess what the date was on that file? Jan 27. A Google search came up with this: https://blog.avast.com/2015/01/06/linux-ddos-trojan-hiding-itself-with-an-embedded-rootkit/
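
Once the Jan 27 date was on the table, searching for anything else touched around then narrowed things down fast. A rough version of that search (the year is my assumption; adjust the dates to your own timeline):

# Check the timestamps on the dropped library itself
stat /lib/libgcc4.so

# Anything under /lib, /etc, or /usr/bin modified on the suspect day
find /lib /etc /usr/bin -xdev -newermt "2015-01-27" ! -newermt "2015-01-28" -ls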

A trojan had been planted on the server through a brute-force root attack and had been having its way, sending tons of outgoing traffic from my server. My server had been involved in a botnet used for DDoS attacks.

Let’s fix this.

So, to fix the issue, here is what I did (a consolidated sketch of the commands follows these steps).

Rather than kill the processes, I ran a stop such as kill -STOP 1234. This stopped the processes and paused the whole operation without giving them a chance to respawn.

I removed all of the tasks from /etc/init.d/ as well as the rc directories.

I removed the udev.sh file from cron.hourly.

I removed the libgcc4.so file from /lib.

Finally, I killed off the stopped tasks.
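
Put together, the cleanup looked roughly like this. It is a sketch of what I ran, not a copy-paste script; 1234 and "badscript" are placeholders for the actual PIDs and script names on my system:

# 1. Suspend the malicious processes instead of killing them,
#    so nothing respawns while you clean up
sudo kill -STOP 1234

# 2. Remove the dropped init scripts and their rc symlinks
sudo rm /etc/init.d/badscript
sudo find /etc/rc?.d -type l -lname '*badscript*' -delete

# 3. Remove the cron dropper and the trojan library
sudo rm /etc/cron.hourly/udev.sh
sudo rm /lib/libgcc4.so

# 4. Now kill the stopped processes for good
sudo kill -9 1234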

After doing this I opened the server back up… the moment of truth….

All malicious tasks had stopped. Traffic was back to normal, and CPU was back to normal. The whole thing was ridiculous, and for someone who has had to learn sysadmin stuff on the fly over the past year, it was nothing I had ever seen before.

Things I learned.

Lock down ALL ports you are not using with either iptables or ufw (a minimal example follows below).

Be sure to always run system updates. Even doing this, the exploit still made it in, but the updates can help.

Use tools available to you such as iptraf, nethogs, and iftop.

Watch your cronjobs and keep an eye out for strange startup files.
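
For the port lockdown, something as simple as this goes a long way. A sketch assuming ufw and a typical web server; allow only what you actually serve:

# Default-deny inbound, allow outbound for normal operation
sudo ufw default deny incoming
sudo ufw default allow outgoing

# Only the services actually in use (SSH + web here; adjust to your stack)
sudo ufw allow 22/tcp
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp

sudo ufw enable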

Postmortem

So after all of that, what happened? Was any data lost, stolen, or accessed? After looking through access logs and all system files, so far it appears nothing was tampered with. Databases are external, so those are safe. Based on my reading, it appears the trojan was part of a botnet used for DDoS attacks.

I learned A LOT about server security through this and have since locked the system down. Not being a sysadmin, my approach to things was pretty newbish and cost me lost sleep and a lot of paranoia. I am adding more monitoring to the server, as well as doing a more diligent job of always investigating the tasks I see running.

The next big step is going to be toasting the server completely, since there may still be a way in. Right now things are stable and I have done as much as I can to protect against future attempts.

Luckily I was also able to work with Rackspace, who have been a great resource for helping me troubleshoot and identify things to look for, and they have been kind enough to offer a credit back due to the nature of the attack.