Monday, September 22, 2014

Tip: Setting the terminal tab name on Mac

Just a quick tip ...

My development environment tends to consist of a set of terminal windows and a text editor (currently Sublime Text).  Rather than right-clicking on each terminal tab to give it a name, I wanted to set it from the command line.

That way I can open new tabs with Ctl-+ and then keep them organized, by placing the following script in /usr/local/bin.
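A minimal sketch of such a script, assuming it is saved as an executable file named title, uses the standard escape sequence for setting the terminal title:

#!/bin/bash
# Set the Terminal window/tab title to the first argument
echo -ne "\033]0;${1}\007"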

This script can then be run to set the title; for example, using the hypothetical helper sketched above:
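# give the current tab a name (hypothetical title helper from the sketch above)
title "build server"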

For some other useful tips on using Terminal, such as setting the theme from the command line, check out:

Monday, September 8, 2014

Short video on experiments with Rational and Docker

This is a short post, pretty much simply sharing the video below.  I have recently been experimenting with Docker and how it could be leveraged by Rational technology. The following video summarizes some of these experiments, most of which I have blogged about previously.

Rational has a strong background in Agile development, and is well positioned to provide DevOps tooling. Docker provides some really significant opportunities to bridge the gap between SaaS offerings and born-on-cloud applications on one side, and traditional on-premise development on the other.

As Docker becomes more mainstream, the focus of tooling is going to need to shift. Today deployment automation is a hot item, and one that IBM Urbancode Deploy does very well. Moving forward, though, this capability will move into the background, and tooling that manages the release and deploy processes for dockerized applications will become more and more important. We will need tooling that makes it simple to take a change to an application, package that change, test that change, tag the version according to quality, and then release that change based on common release patterns (hotfix, canary, A/B, rolling upgrade).

These solutions will be needed both in the SaaS space and on-premise, and will also need to support hybrid scenarios. Docker will be key when enabling 'borderless cloud' scenarios, both in terms of packaging applications in such a way that they can be run and scaled on premise and in SaaS environments, and in terms of providing powerful but simple DevOps tooling that can be leveraged in a SaaS environment or deployed on premise. Leveraging Docker containers to provide on-demand isolated environments is a nice way to scale out continuous integration processes, or to dynamically create environments for pre-integration testing or diagnostics.

While technologies to co-ordinate the deployment of an application as a set of distributed containers (Fig, OpenStack, Kubernetes, Mesosphere...) are still evolving, the approach is promising enough that we should consider it as a default packaging choice when distributing solutions moving forward.


 

Rational solutions as Docker containers


Audience

The intended audience for this post is anyone who has experience with Docker and is interested in using Docker in combination with Rational solutions.  The solutions described in this blog post are very much a simple experiment and should not be considered a fully baked, production-ready solution.

Motivation

I have drunk the Kool-Aid and will deliberately over-simplify the pretty complex challenges associated with delivering content to customers.  Many of the costs associated with delivering on-premise solutions stem from there being too many deployment options.  This comes out in build times where native components are needed, complexities in installation and prerequisite checking, the (huge) cost of cross-platform testing, and cumbersome documentation that walks the line between being too specific and too general.

I'd like to live in a world where capabilities are packaged as Docker containers.  There is a lot to be understood and evolved to make this a reality, and many emerging technologies that will help describe topologies made up of containers, but the potential is certainly there today for this to radically simplify the production and consumption of packaged software.

For this post, I simply wanted to build out a few basic containers for the Rational products essential to our on-premise DevOps solutions.

Short demo



Scenario: Adding agents to IBM Urbancode Deploy 

IBM Urbancode Deploy (UCD) is a great tool for developing a continuous delivery, or DevOps, pipeline.  UCD takes an application-centric view of the world: for a given application you can create a set of environments and processes to deploy the application across them.  An environment in UCD is made up of a set of resources, and each resource has a UCD agent running on it.  Typically a resource is a running virtual machine with the agent installed.  While there are integrations in UCD to provision resources 'from a cloud', I wanted to see whether we could leverage Docker to make it simpler to quickly create resources and environments in UCD.


To do this I simply created a Docker image with an IBM Urbancode Deploy agent installed on it.  The container takes in environment variables which tell it which UCD server to connect to and, optionally, the name of the agent to start.
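A rough sketch of the idea is an entrypoint script along these lines; the agent install path, properties file, and property names below are illustrative assumptions, not the actual contents of the image:

#!/bin/bash
# Illustrative sketch only: configure and start the UCD agent from environment variables.
AGENT_HOME=/opt/ucd-agent                      # assumed install location
AGENT_NAME=${UCD_Agent_Name:-$(hostname)}      # default the agent name to the container hostname
sed -i "s|^agent.name=.*|agent.name=${AGENT_NAME}|" ${AGENT_HOME}/conf/agent.properties
sed -i "s|^server.hostname=.*|server.hostname=${UCD_Hostname}|" ${AGENT_HOME}/conf/agent.properties
exec ${AGENT_HOME}/bin/agent run               # run in the foreground so the container stays up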

This way I can start an agent which will automatically connect to my UCD server and be named based on the container's hostname, like so:

docker run -e "UCD_Hostname=myhostname" ucd-agent 
Or to give the agent a desired name:
docker run -e "UCD_Agent_Name=myagentname" -e "UCD_Hostname=myhostname" ucd-agent

A simple script allows me to quickly start up any number of agents.  This is really nice as it allows me to quickly create agents, resources, and environments, which is really useful when creating and testing UCD automation.  It also allows me to take machines in the lab and leverage Docker to provide a set of isolated environments for various applications or purposes.
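A sketch of what such a script could look like (the name prefix, count, and server hostname below are placeholders):

#!/bin/bash
# Sketch: start COUNT agent containers named PREFIX-1 .. PREFIX-COUNT
PREFIX=${1:-docker-agent}
COUNT=${2:-3}
for i in $(seq 1 ${COUNT}); do
  docker run -d --name="${PREFIX}-${i}" \
    -e "UCD_Agent_Name=${PREFIX}-${i}" \
    -e "UCD_Hostname=myhostname" \
    ucd-agent
done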

Note: if you are using this container to deploy applications to, you will need to expose the appropriate ports on the container for your application.
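For example, if the application being deployed to the agent container listens on port 9090 (an arbitrary example), that port can be published when starting the container:

# publish the application's port in addition to configuring the agent
docker run -d -e "UCD_Hostname=myhostname" -p 9090:9090 ucd-agent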

Scenario: Standing up a DevOps Pipeline 

Dynamically creating UCD agents is great, but let's take this a step further and look at running a DevOps pipeline in Docker containers.  For this I created a single image for IBM Collaborative Lifecycle Management (CLM), an image for the IBM Urbancode Deploy server, and then the previous image for the UCD agents.  This is an interesting combination of products because typically we recommend Rational Team Concert (contained in CLM) for source control, planning and continuous integration, and then UCD for continuous deployment.  Combined with some number of agents, this gives us enough infrastructure to build out a continuous delivery pipeline.


In this solution both the UCD agent and the CLM containers take in the location of the UCD server as part of the environment.  This will automatically configure them to be connected to the UCD server.  In addition to being able to pass in this information using the -e flag, they can also be run as linked containers, which will automatically configure the connection information.

For example, to start these as linked containers exposing the default ports, run:
echo "###################"
echo "starting ucd-server image"
echo "###################"
docker run -d -p 8080:8080 -p 8443:8443 --name=ucd-server ucd-server

echo "###################"
echo "Starting additional agent(s)"
echo "###################"
docker run -d --name=ucd-agent-1 -e "UCD_Agent_Name=docker-agent-remote-1" --link=ucd-server:ucdserver ucd-agent
docker run -d --name=ucd-agent-2 -e "UCD_Agent_Name=docker-agent-remote-2" --link=ucd-server:ucdserver ucd-agent

echo "###################"
echo "Starting CLM Server"
echo "###################"
docker run -d --link=ucd-server:ucdserver -p 9443:9443 -p 9080:9080 -e "UCD_Agent_Name=docker-agent-clm-server" --name clm-server clm-simple

While the solution is very simple at this point and does not represent best-practice topologies for CLM and UCD, it shows quite a bit of promise. It has been very useful for quickly standing up isolated continuous delivery environments, which helps when developing new applications and processes and avoids clutter on single large UCD server deployments.

Getting access to, and building the containers  

I have shared the source code for these images in a public IBM DevOps Services project called leanjazz-docker.

There are (unfortunately) a number of binary files that need to be downloaded, as described in the README.md document.

Building the images: 
cd bin 
./build-all.sh  

Running the images:
cd bin 
./start-all.sh 
./show-info.sh  

At this point you can access the web page for IBM Urbancode Deploy and view the connected agents, and also access the Jazz Team Server setup page to complete the setup of IBM Collaborative Lifecycle Management and Rational Team Concert.  If you used the scripts listed above, the default ports will be mapped to the Docker host ports. On my local machine I am using boot2docker, and my Docker host's IP address is reported by:
$ boot2docker ip
The VM's Host only interface IP address is: 192.168.59.103
As such, I can access the CLM console at https://192.168.59.103:9443/jts/ and the UCD server console at http://192.168.59.103:8080.
   
Adding 3 additional agents:
cd bin 
./add-agent.sh -a mynewagents -n 3 
docker ps | grep ucd-agent-docker 

Remaining work and issues 

Things are certainly not perfect.  I'll be tracking improvements to the containers on leanjazz-docker, but here is a summary of some of the known issues:
  • Images are quite large, due to the size of the products but also due to a lack of work optimizing the Dockerfile and build process
  • Containers for UCD and CLM run on Tomcat and should be updated to include the WebSphere Liberty profile
  • Containers for CLM have all applications (CLM, QM, RM, JTS) installed on a single Tomcat instance rather than being distributed across multiple instances
  • Databases are running within the container rather than being isolated on a volume or in an external DBaaS solution
  • Databases are running Derby rather than IBM DB2, due to a requirement to start the DB2 container in privileged mode
  • Images are not currently published to Docker Hub
  • No form of license acceptance is built into the images