
Category → configuration management

PEBKAC Avoidance

We’ve all said it. We’ve all done it. We’ve all shaken our heads at it. PEBKAC. Problem exists between keyboard and chair. User error. While generally applied to the end-user community – those folks who are considered technical neophytes by IT professionals – it can and should also be applied to those of us who have, at least once (admit it, come on, I know you’ve done it) fat-fingered a configuration on a web server, a switch, a router, or some other network or application service. It’s okay. We’ve all been there – head down on the keyboard, a litany of words we wouldn’t use in front of our mothers streaming from our lips between enumerations of how long we’ve sat at our desk looking for the problem. Mine was a misconfiguration of route metrics in the now long-gone Network Computing lab that sent traffic from one side of the lab over a simulated T1 and back over the 100 Mbps link. When you’re trying to simulate a...

DevOps and Security Are Compatible

When I speak with information security organizations faced with the prospect of moving to DevOps, one of the most common fears I hear is that the transition will degrade the security of infrastructure and applications. If you’re one of these folks, I understand this fear, but you can rest assured: when you do things correctly, security will actually improve. One big reason security benefits in this model is improved alignment and tighter feedback loops. DevOps is about creating a unified, engaged team, and it doesn’t make it easy to fall into the “silo thinking” that traditionally leads to security as an afterthought. DevOps embraces automation and consistency, which benefits security by allowing you to add automated checks during coding that look for obvious security issues and flag things for human review (such as the linking of new libraries or the introduction of new third-party components that could add risk). This means you will be able to identify “areas of concern” earlier in the development process, where they will be...
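As an illustration of such an automated check, the sketch below diffs a dependency manifest between two builds and flags anything new for human review. The file names and manifest contents are invented for this example; a real pipeline would diff whatever dependency file the project actually uses:

```shell
# Hypothetical CI step: flag newly linked third-party components for
# human review by comparing this build's dependency list to the last one.
# (File names and contents are invented for illustration.)
printf 'libbar\nlibfoo\n' > /tmp/deps_prev.txt
printf 'libbar\nlibbaz\nlibfoo\n' > /tmp/deps_curr.txt

# comm -13 prints lines that appear only in the second (sorted) file,
# i.e. dependencies introduced since the previous build.
new_deps=$(comm -13 /tmp/deps_prev.txt /tmp/deps_curr.txt)
if [ -n "$new_deps" ]; then
  echo "New dependencies need security review: $new_deps"
fi
```

A check like this doesn't decide anything by itself; it just makes sure a human looks before the new component quietly ships.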

Devops Protocol: No Manual Changes

You’ve heard about Devops and you like the idea. But how can you grow a Devops culture in your organization? In my series about Devops Protocols I talk about the fundamental building blocks for growing a Devops culture.
No Manual Changes refers to the behavioural trait of not messing with any production systems. Let’s discuss why messing with production systems is bad and what to do about it.

Manual Changes Lead to Configuration Drift

You know how this goes: Your servers are under heavy load and you just want to kick up the number of worker processes a bit. You ssh into the first server and just start a bunch of additional workers. Now you sit back and see whether the box is behaving as expected. You find that all is fine and make a mental note that you need to persist your change in the startup script of your service and you need to roll out that change to all other servers. But your mental note is quickly forgotten because another urgent issue demands your full attention…
You end up with one server configured differently from the rest of the bunch. Even worse, your configuration is not even restart safe. As soon as anyone starts an automated configuration run or restarts the box your changes are gone. In our example case this might lead to the box crashing under load without anyone understanding why.
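A trivial scripted check can surface exactly this kind of drift before it bites. Everything below (the myapp name, the paths, and the hard-coded running count standing in for a real process count) is invented for illustration:

```shell
# Minimal drift check (hypothetical service and paths): compare the
# worker count the config file declares with what is actually running.
mkdir -p /tmp/myapp
echo 'workers=4' > /tmp/myapp/myapp.conf

desired=$(sed -n 's/^workers=//p' /tmp/myapp/myapp.conf)
running=6   # stand-in; in real life something like: pgrep -c -f myapp-worker
if [ "$running" -ne "$desired" ]; then
  echo "drift detected: $running workers running, $desired configured"
fi
```

Run periodically across the fleet, a check like this turns a forgotten mental note into an alert.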

Automate All Changes

The way to go is obvious. Change the number of worker processes in your configuration management tool and let it reconfigure the box for you. This approach makes sure that your changes survive restarts and configuration runs. And it makes sure that your teammates see what you’ve done if any problems arise. Even your developers are now able to see that their code might need optimizations because it requires so many worker processes to run (way more than they initially expected).
Congratulations, you’ve made one more step toward Devops collaboration!

No Rule Without Any Exception – Really?

You might argue that my example is a bad one. If your production systems are in trouble, there’s no time to lose in going the extra mile of automation. Jump in and fight the fire!
While this approach might look reasonable at first glance, it is a very dangerous one. Especially when in firefighting mode, you might not remember all the changes made to your production system. You think your problems are solved, but in reality they’ll come back worse than ever (and sooner than you expect).
You need to prepare in advance to avoid the need to go in and hotfix. Set up log monitoring services like Splunk or Graylog2 to be able to analyze what is happening. And you should have your configuration management tool (like Puppet or Chef) set up so that you can try out possible solutions without having to go in and do it manually.

How do you avoid manual changes to your production systems? Do you “electrify the fence” as described in the Visible Ops Handbook? Please let us know in the comments.

Drupal and Configuration Mgmt, we’re getting there …

For those who haven't noticed yet: I'm into devops. I'm also a little bit into Drupal (blame my last name), so one of the frustrations I've been having with Drupal (and much other software) is the automation of deployment and upgrades of Drupal sites.

So for the past couple of days I've been trying to catch up on the ongoing discussion regarding the results of the configuration mgmt sprint. I've been looking at it mainly from a systems point of view, with the use of Puppet, Chef, or similar tools in mind. I know I'm late to the discussion, but hey, some people take holidays in this season :) So below you can read a bunch of my comments and thoughts on the topic.

First of all, to me JSON looks like a valid option.
Initially there was a plan to wrap the JSON in a PHP header for "security" reasons, but that seems to be gone, even though nobody mentioned the problems it would have caused for external configuration management tools.
When thinking about external tools that should be capable of manipulating the file: plenty of them support JSON, but they won't recognize a JSON file with a weird header (I'm thinking e.g. of Augeas (augeas.net)). I'm not talking about IDEs, GUIs, etc. here; I'm talking about system-level tools and libraries that are designed to manipulate standard files. For Augeas we could create a separate lens to manage these files, but other tools might have bigger problems with the concept.

As catch suggests, a clean .htaccess should be capable of preventing people from accessing the .json files. There are other methods to figure out whether files have been tampered with; I'm not sure this even fits within Drupal (I'm thinking about reusing existing CA setups rather than having yet another security setup to manage).

In general, to me, tools such as Puppet should be capable of modifying config files and then activating that config with no human interaction required. Obviously drush is a good candidate here to trigger the system after the config files have been changed. But, contrary to what some people think, having to browse to a web page to confirm the changes is not an acceptable solution. Just think about having to do this on multiple environments... manual actions are error-prone.
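To make that concrete, the rollout can be a plain scripted loop over all environments, with no browser anywhere in the picture. The host names and site root below are invented, and the drush cache-clear is only a stand-in for whatever command ends up activating the new config; the loop echoes the commands instead of running them so the sketch stays a dry run:

```shell
# Dry-run sketch of a scripted rollout across environments
# (hypothetical host names and site root; the drush call is a stand-in).
for host in dev.example.com stage.example.com prod.example.com; do
  echo "ssh $host drush --root=/var/www/site cache-clear all"
done
```

The same loop runs identically on one environment or twenty, which is exactly what a click-through-the-UI confirmation step can never give you.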

Apart from that, I also think the storing of the certificates should not be part of the file. What about a meta file with the appropriate checksums? (Also, if I'm using Puppet or any other tool to manage my config files, then the security, i.e. preventing tampering with these files, is already covered by the configuration management tool.) I do understand that people want to build Drupal in the most secure way possible, but I don't think this belongs in any web application.
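The meta-file idea could be as simple as a checksum list shipped next to the configs and verified before activation. The paths and file names below are hypothetical:

```shell
# Hypothetical meta-file approach: a CHECKSUMS file next to the configs,
# verified before the configuration is activated.
mkdir -p /tmp/site-config
printf '{"workers": 8}\n' > /tmp/site-config/system.json
( cd /tmp/site-config && sha256sum *.json > CHECKSUMS )

# Later, before activating the config, verify nothing was tampered with:
( cd /tmp/site-config && sha256sum --check --quiet CHECKSUMS ) \
  && echo "config files intact"
```

The CHECKSUMS file itself would of course live under the configuration management tool, which is the point: the integrity story stays outside the web application.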

When I look at other discussions that wanted to provide a similar secure setup, they ran into a lot of end-user problems with these kinds of setups. An alternative approach is to make this configurable and/or pluggable. The default should be to have it enabled, but more experienced users should have the opportunity to disable it, or replace it with another framework. Making it pluggable upfront saves a lot of hassle later.

Someone in the discussion noted:
"One simple suggestion for enhancing security might be to make it possible to omit the secret key file and require the user to enter the key into the UI or drush in order to load configuration from disk."

Requiring the user to enter a key in the UI or drush would be counterproductive to the goal one wants to achieve: the last thing you want as a requirement is manual/human interaction when automating setups. Therefore a feature like this should never be implemented.

Luckily there seems to be a new idea around that doesn't rely on a mangled JSON file:
instead of storing the config files in a standard place, we store them in a directory that is named using a hash of your site's private key, like sites/default/config_723fd490de3fb7203c3a408abee8c0bf3c2d302392. The files in this directory would still be protected via .htaccess/web.config, but if that protection failed then the files would still be essentially impossible to find. This means we could store pure, native .json files everywhere instead, to still bring the benefits of JSON (human editable, syntax checkable, interoperability with external configuration management tools, native + speedy encoding/decoding functions), without the confusing and controversial PHP wrapper.

Figuring out the directory name for the configs from a configuration mgmt tool could then be done with something similar to:

  cd sites/default/conf/$(ls sites/default/conf|head -1)
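The same lookup can be demonstrated end-to-end with a fabricated directory under /tmp (the hash reuses the example from above and is not derived from any real key):

```shell
# Demonstration with a made-up hashed directory name: locate the config
# directory without knowing the hash in advance.
mkdir -p /tmp/site/sites/default/config_723fd490de3fb7203c3a408abee8c0bf3c2d302392
conf_dir=$(ls -d /tmp/site/sites/default/config_* | head -1)
echo "config dir: $conf_dir"
```

Matching on the config_ prefix keeps the lookup working even when the site has other directories under sites/default.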

In general I think the proposed setup looks acceptable; it definitely goes in the right direction of providing systems people with a way to automate the deployment of Drupal sites and applications at scale.

I'll be keeping an eye on both the direction they are heading in and the evolution of the code!


Orbitz IDEAS Video: Teyo Tyree on Model Driven Management with Puppet

posted by @martinjlogan

Teyo Tyree, one of the founders of Puppet Labs, talks about model-driven configuration management with Puppet. I was really impressed by Teyo and the whole Puppet team, to be honest, and really appreciate their rigorous sysadmin culture. They seem to be very focused on the practical issues at hand and less interested in keeping up with the latest marketing buzzword of the day.

Teyo Tyree on: Model Driven Management with Puppet from Orbitz IDEAS on Vimeo.

During this video you will learn how Puppet works and what drives its architecture. You will get an understanding of how the model-driven approach factors into Puppet. You will also learn how to leverage this in extending Puppet configuration management and integrating it with other systems.


Automated Configuration Management With Opscode Chef: The Basic Moving Parts

The Moving Parts: Managing your infrastructure with Opscode Chef involves a few moving parts you need to be aware of. As I found it quite hard to differentiate them, I want to share the basics with you. Chef server: there you manage all your nodes and roles; the Chef server distributes cookbooks to the nodes. Chef [...]