Stick to the script, kiddies: Some dos and don'ts for the workplace

Get your filthy, fallible hands out of filthy, fallible GitHub


Scripting is now the first choice for clued-up administrators who want to get things done quickly and in an automated fashion. However, scripting does bring its own set of issues.

One need only look at recent high-profile outages to feel the pain. Software-defined data centres exist purely in code (with the hardware underneath, obviously), so to help prevent accidents the code should be managed and controlled just as effectively.

Scripting implements what the operator wants and does it hundreds or thousands of times faster. It brings efficiency, predictability and speed (usually).

But when scripts are not properly controlled, accidents can happen 10,000 times quicker. Not only that, but with huge estates subject to a single script, an issue can affect an entire data centre or availability zone. Downtime costs of hundreds of thousands an hour are not unusual at web scale.

What to control for

It's quite easy to see why scripting is such a double-edged sword. There is a balance between being efficient and controlling the important aspects of the script:

  • Who is running it?
  • Is the script version controlled?
  • Is the script approved for implementation?

As an administrator, I know how much easier it can be to knock up a quick script to gather some data and spit it out to a CSV file, but what about the scripts that write to the environment? Trivial errors multiply and it's bye-bye productivity. I have seen an entire Active Directory infrastructure wiped out because of a stupid default in a script.


Make no mistake, this is an issue with all cloud providers, be it AWS, Azure, Google or whomever. It's all about human fallibility, exacerbated by automation (for the most part).

Managing this risk demands flexibility and balance, yet the response I've witnessed at some organisations has been heavy-handed in the extreme: remove the scripting tools altogether. The reality of this approach is that it makes admins' jobs harder by denying them the ad-hoc scripts we all know are so useful. Yes, scripts can be dangerous, but a lot of the issues can be mitigated.

A better approach is to set some intelligent defaults and better manage the scripting process; both can significantly reduce the likelihood and impact of rogue script code.

To dive straight in, some of the best practices are obvious: split ad-hoc scripting out from the important scripts that run nightly. It is useful to know which account ran what. All those scripts should:

  • Be run with a properly documented service account that has a strong password and the least rights required. A script that only collects information doesn't need anything except read-only rights, so why give it more? Read-only can't make changes and is therefore safer than some mixture of rights. It also allows for easier auditing when those lovely auditors come knocking! Failure to do so = big audit pain.
  • Remember that most scripting languages run under the current user's context, nothing beyond that, and anything that PowerShell, Ruby or Bash can do, other applications can do too. It is equally important to prevent arbitrary applications doing those things. Techies all have their favourite tools but, sorry, if it's not on the approved list, tough. I have seen people's machines wiped and reimaged over this. Overkill? Sure. However, the business also needs to provide a quick and accessible way to get tools on-boarded. Not doing so will cause productivity to crash!
  • For the love of all that is techie, don't run scripts as Admin. It should go without saying. Running a script with full domain admin rights is just asking for problems.
  • Scripting hosts can help restrict risk. The ability of people to run scripts from their desktop is very handy, but from an auditing point of view it is a nightmare: hundreds of scripts run from random laptops, desktops and servers all over the environment. Restricting scripting to a select few scripting host boxes means it is easier to audit and manage.
  • Scripts should be kept in a secure location and tightly controlled. Version control is the key here, not least because when a script gets changed and stops working (or worse), the error can be traced back and remediated. It also discourages administrators from tinkering with scripts (signing them can help in that regard).
  • A lot of modern scripting languages have transcription capabilities built in. These can be useful to log (to a log server or such) the exact script that was run. If an administrator is aware that the script they are running is logged in totality, it hopefully makes them double check what they are running and be a little more cautious.
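The auditing points above can be sketched as a wrapper that runs on the scripting host: log who ran which script, and a checksum of exactly what was run, before executing it. This is a minimal, hypothetical example (the function name and audit-line format are illustrative, and a real deployment would send the line to syslog or a SIEM rather than stdout):

```python
"""Hypothetical wrapper for a central scripting host: record who ran
which script (path plus a SHA-256 of its content) before executing it."""
import datetime
import getpass
import hashlib
import subprocess


def run_audited(script_path, *args):
    """Print an audit line, then hand the script to the shell."""
    with open(script_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    audit = "{} user={} script={} sha256={}".format(
        datetime.datetime.utcnow().isoformat(), getpass.getuser(),
        script_path, digest)
    # In production this line goes to syslog or a SIEM, not stdout.
    print(audit)
    return subprocess.run(["sh", script_path] + list(args)).returncode
```

Hashing the content, not just the path, matters: it ties the audit trail to the exact version that ran, which pairs naturally with the version-control point above.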

Other dangers hide in the administrator-level equivalent of "just download it and run" that comes with services like GitHub. I have seen such scripts cause issues because the admin didn't fully grasp what they did. Blocking or restricting access is a good first step. While GitHub is good, it only takes one bad script to do significant damage.

Again, it isn't that the script itself is necessarily bad but there needs to be management and control in the environment. It'll be unpopular, but restricting access to GitHub and other such sites may be the way to go. Yes, techies will go mad but they can be offered an olive branch.


What I have seen work is to build up a scripting tool bank, matured and peer-reviewed internally. It takes time, but a lot of ad-hoc scripts do the same thing, so a tool bank prevents the administrator in question reinventing the proverbial wheel.
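A tool bank only helps if the reviewed version is what actually runs. One simple way to sketch that (names and manifest format are hypothetical) is to record the SHA-256 of each script as it passes review, then check the file on disk against that manifest before execution:

```python
import hashlib


def sha256_of(path):
    """Checksum of a script file's current on-disk content."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def is_approved(path, manifest):
    """True only if the file matches the version that passed review.

    `manifest` maps script paths to the SHA-256 recorded at review
    time; any local tinkering changes the hash and fails the check.
    """
    return manifest.get(path) == sha256_of(path)
```

Signing scripts, mentioned earlier, achieves the same end with stronger guarantees; a hash manifest is just the minimum viable version of the idea.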

Beyond that, some people misunderstand or simply don't know about the auditing capabilities of the more advanced scripting languages. Looking at PowerShell, for example, the latest release (version 5) comes with new functionality, including built-in logging that creates informational messages which can be forwarded to a SIEM (Security Information and Event Management) system.
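Forwarding audit events like that is easy in most languages, not just PowerShell. As a minimal sketch using Python's standard-library syslog handler (the logger name, message fields and collector address are illustrative; a real deployment would point at the SIEM's actual syslog endpoint):

```python
import logging
import logging.handlers


def make_audit_logger(host="127.0.0.1", port=514):
    """Build a logger that ships audit lines to a syslog/SIEM collector.

    The host/port defaults are placeholders for a real collector
    address; SysLogHandler sends over UDP by default.
    """
    log = logging.getLogger("script-audit")
    log.setLevel(logging.INFO)
    log.addHandler(logging.handlers.SysLogHandler(address=(host, port)))
    return log


# One structured line per script run; the SIEM does the correlating.
audit = make_audit_logger()
audit.info("user=%s script=%s result=%s",
           "svc-reporting", "collect_disk_usage.py", "ok")
```

Structured key=value lines like these are deliberately boring: they make it trivial for the SIEM to filter on who ran what, and when.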

One of the biggest risks is the people who run the scripts, which is why a centrally controlled and managed script repository is a must. Preventing people from running scripts entirely may be difficult, but a repository mitigates the issues around unsanctioned scripts and the consequences of using them. Malicious intent is rarely the cause. It all comes down to a balancing act: functionality and ease of use versus security and the capabilities of the administrator.


Biting the hand that feeds IT ® 1998–2017