LUA part 5 (of 5): Related technologies - John C. Kirk
Jan. 19th, 2010, 03:46 am
This post is part 5 of a series about using a limited (standard) account in Windows for everyday activities rather than logging in as a computer administrator all the time. (You may want to read parts 1, 2, 3, and 4 before continuing.)
As I've explained in the previous parts, LUA is a good thing. However, it isn't a complete solution by itself. It's definitely worthwhile, because it limits the damage that malware can do, but software that runs as a limited/standard user can still cause problems. There are a few other technologies that can help with this, which are also built into Windows (i.e. you don't have to pay extra for them); I don't have time to discuss them in detail at the moment, but here's a brief overview. (This is going to be fairly technical, so it probably won't be much use to the average end user.)
* MSI files.
If it becomes common to work as a standard user, I suspect that we'll see a new attack vector: benign software distributed by a malicious installer program. You will need to run the setup program as administrator, which gives malware authors an opportunity to mess around with your system. The solution is to use MSI files rather than EXE files to install new software; more specifically, it's worth encouraging software vendors to use them. MSI files are basically databases: this is an open file format, so you can check the contents of the file with various utilities. (I think that MSI files are also limited in terms of the actions they can perform, but I'm not certain about that.) Also, you can only install software through Group Policy if you have an MSI file, so that's an extra incentive.
* Disable AutoRun.
If you get new software on a CD/DVD, the setup program will run as soon as you insert the disk (via an "autorun.inf" file); something similar happens with USB memory sticks. This can be convenient, but it can also be dangerous.
If someone gives you a memory stick, and you plug it into your computer, are you sure that there's no dodgy software on there? Or if you lend your memory stick to someone else (e.g. for them to copy a file off it), are you sure that their computer hasn't infected the memory stick, ready for it to run a dodgy program on your PC? For that matter, you may not always recognise removable storage when you see it: in theory, you could have a memory stick embedded inside a USB mouse, ready to run code as soon as you plug it in.
So, it's far better to disable AutoRun, then spend the extra few seconds running programs manually when you actually choose to. Steve Riley has more information about this on his blog (Autorun: good for you? and More on Autorun).
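For reference, the usual machine-wide switch for this is the "NoDriveTypeAutoRun" policy value; the registry fragment below sets it to 0xFF, which disables AutoRun on all drive types. (This is the standard documented value, but merge it at your own risk, and note that it applies per-machine under HKLM.)

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoDriveTypeAutoRun"=dword:000000ff
```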
Mind you, even this may not be enough. Raymond Chen brought up a new possibility (Stories of anticipating dead computers), which is remarkably evil. Since you can have a USB keyboard, that means that any USB device can claim to be a keyboard, and the OS has to assume that it's telling the truth. So, when you plug in a new device, it could send keystrokes to the computer, telling it to run a particular program! There's no real defence against that, although LUA will mitigate it.
* Software Restriction Policies (SRPs).
This is an interesting article: The Six Dumbest Ideas in Computer Security. In particular, point 2 talks about "Enumerating Badness", which is the way that antivirus software works: you get a list of all the dodgy software (which you have to keep updating), and if a program isn't on that list then you assume it's ok. The alternative is to make a list of all the software that you trust, and if a program isn't on that list then you assume it's bad. This doesn't need to include all of the good software in the world, only the programs that you actually run on your PC.
Windows XP introduced Software Restriction Policies, which have two possible defaults: "Unrestricted" or "Disallowed". (These aren't the best names, since they're negatives rather than positives, but never mind.) If you choose "Disallowed" as the default, you can then only run software that's specifically allowed. The simplest approach is to say that people can only run applications that are stored in the "Program Files" or "Windows" folders, and only administrators can put new files into those folders. So, this works particularly well in a workplace, if you want to stop people downloading unknown software and running it off their desktop. As long as you enforce LUA, people will need to ask the IT department to install new software for them.
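The "default Disallowed" idea can be sketched in a few lines: instead of checking a program against a list of known-bad software, you check its location against a short list of trusted folders and refuse everything else. This is only an illustration of the policy logic, using the two default Windows folders mentioned above; it's not how SRP is actually implemented.

```python
from pathlib import PureWindowsPath

# Folders that only administrators can write to; anything outside them is refused.
TRUSTED_DIRS = [
    PureWindowsPath(r"C:\Program Files"),
    PureWindowsPath(r"C:\Windows"),
]

def is_allowed(exe_path):
    """Default-Disallowed: permit a program only if it lives under a trusted folder."""
    path = PureWindowsPath(exe_path)
    return any(trusted in path.parents for trusted in TRUSTED_DIRS)
```

Note that this is "Enumerating Goodness": the list never needs updating when new malware appears, only when you install new software.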
This isn't just about being a killjoy; I'm not particularly bothered if people want to download Google Earth and play around with that. However, this will protect end users from their own naivety. For instance, back in 2007 I described a fake eCard website that told people to update their copy of Flash Player. If you use SRPs, and someone tries to run that program, it will be blocked.
Unfortunately, there are also some drawbacks to SRPs. For instance, I found that I could run a program if I double clicked the .exe file in Windows Explorer, but I couldn't use the shortcut in the Start menu. This article offers a workaround for that, by adding the "All users" profile folder to the list of trusted locations, but it's a bit kludgy.
Windows 7 still supports SRP, but it also introduces AppLocker, which works better. In particular, the "Start menu" problem doesn't arise, because AppLocker can apparently distinguish between the location of a shortcut file and its target. Also, you can run AppLocker in "audit only" mode before you enforce the rules. That way, you can check which programs would be blocked, and make sure that you haven't made any mistakes.
* .NET Code Access Security (CAS).
Even if you decide to run a particular program, you don't necessarily want it to do everything that you can do. The idea of CAS is that you don't have to trust the program to behave itself, because you can limit its permissions; essentially, you're running it in a sandbox.
For instance, when you run a calculator, there's no reason for it to have access to your mp3 files, but you could encounter a new version of "Rickrolling" if it decided to replace all your music with copies of "Never gonna give you up". Similarly, does your calculator need internet access? If not, you can block that, and then you don't have to worry about it acting as a keylogger and transmitting your passwords; however, be aware that lots of programs have a "check for updates" option, which would require internet access.
Basically, there are two sides to this: the developer and the end user/administrator. As a developer, I can list the permissions that my application needs and the permissions that I'd like. As a local administrator, I can choose what permissions the application will get.
By default, applications on the local machine will get full permissions, i.e. they can do everything that the user can do. If the application doesn't provide a list of what permissions it needs, it will get everything that's offered. If it does provide a list, it will just get that subset. If it provides a list, and some of the required permissions are banned by the administrator, the application won't run.
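Those rules can be modelled as plain set logic. In the sketch below, `granted` is what the administrator's policy offers, `required` is what the application says it needs, and `optional` is what it would merely like; the names are mine, not the actual CAS API, which works through permission set classes and declarative attributes rather than simple sets.

```python
def resolve_permissions(granted, required=None, optional=None):
    """Model the CAS grant rules described above: no request list means the app
    takes everything offered; a request list narrows the grant to that subset;
    an unmet requirement stops the application from running at all."""
    granted = set(granted)
    if required is None and optional is None:
        return granted  # no list supplied: full offered grant
    required = set(required or ())
    optional = set(optional or ())
    if not required <= granted:
        raise PermissionError("required permissions refused; application won't run")
    # The app runs with its requirements plus whichever optional ones were offered.
    return required | (optional & granted)
```

So an app that requests only `{"UI"}` gets just that, even if the administrator offered file and network access too, which is exactly why a short request list makes the administrator's life easier.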
So, suppose that a text editor doesn't have permission to save files anywhere. If the application demands that permission, it will fail right away, but if it doesn't demand anything then the user will discover this when they try to save a document. As a user, the latter case would be rather irritating, so I'd prefer the application to check permissions at the start. If an application just says that it would like certain permissions, it will still run without them, but it would then make sense to disable menu options as appropriate.
As an administrator, if an application offers a list of its permissions then that makes my life easier, because I know what permissions I can remove without causing any trouble. I can also make a more informed choice about whether I want to run this application at all.
As a developer, I have no reason to lie. If I say that I only need/want certain permissions, then that's all I'll get, even if the administrator is offering more. On the other hand, if I ask for more than I need, that will raise suspicion. ("Why does your screensaver need database access?")
I think this is a good idea, but unfortunately it's fiendishly complicated. As a developer, there's quite a lot of documentation available; I bought Visual Basic .NET Code Security Handbook (all about CAS) and Security for Microsoft Visual Basic .NET (one chapter on CAS), and they've been some help, although as yet I've never set up a proper security manifest for any of my applications. As an end user, I haven't found any decent explanation of how to configure permissions for a given application.
The other drawback is that this only works with fully "managed" code. If I include an API call, the .NET framework doesn't know what this "unmanaged" code is doing, so I need full permissions (just in case). The same principle applies if I'm launching another application, e.g. if I need to run Word so that I can put text into a template. I do my best to stay inside the framework, but very few of my applications are fully managed.
Apparently things will change in the .NET 4 Security Model (which Visual Studio 2010 will be able to target when it's released), but this is still relevant to older versions.