Automating Deployment of my (Micro)Services

So I read somewhere on the internets that microservices are great and we should all be doing them. Being an overly enthusiastic geek, keen to try all sorts of new fads and see how they work, I just had to go and give it a try, of course. So I proceeded to split my project into various smallish parts, connect these using GRPC, and see how it all runs.

I used GRPC because I like the efficiency, documentation and simplicity of protobufs. And Google has quite the reputation anyway. Unfortunately the GRPC-generated Java code just feels weird and oddly bloated. I also had some concurrency issues, although this might be due to my lack of understanding, as the docs seem not that great outside of Google (where you can just ask the authors or friends..).

I split my service into 10 smaller ones, experimented a bit and settled on merging down to 5 services. But how do I actually deploy this sensibly, vs. previously just uploading a single dir? Then I remembered the next buzzword I keep hearing about: “Continuous Delivery”. Sweet, that must solve it for me, right?

Um, no. I must be missing something, as the CD material seems to offer mostly conceptual-level descriptions but few concrete examples of how to actually do it. Maybe DockerHub or some other hype term. But I am still not on that boat, despite using various Docker images and building some myself. So what then? The most concrete reference I found seemed to be along the lines of “I has some scripts”. OK, whatever, so I start cooking up some scriptz. In Python.

Python ConfigParser seemed suitable. So I created a configuration file like this (the values here are just placeholders):

[service1]
ip = 192.168.2.101
dst_dir = /home/myuser/service1
src_dir = build/service1
jar_prefix = service1-server
properties = config/service1.properties

Read it with Python:

config = configparser.ConfigParser()
config.read('deploy.ini')  # or whatever the config file is named

service1_ip = config['service1']['ip']
service1_dst_dir = config['service1']['dst_dir']
service1_src_dir = config['service1']['src_dir']
service1_jar_prefix = config['service1']['jar_prefix']
service1_properties = config['service1']['properties']

Doing this for all services gives the information on how to upload everything.
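Since the same five keys repeat for every service, the per-service boilerplate can also be folded into a loop over the sections. A small sketch, assuming the key names from the config above (the function name is mine):

```python
import configparser

def load_services(config_text):
    """Parse the deployment settings of every service section into a dict."""
    config = configparser.ConfigParser()
    config.read_string(config_text)
    services = {}
    for section in config.sections():
        # the same five keys are read for each service
        services[section] = {key: config[section][key]
                             for key in ("ip", "dst_dir", "src_dir",
                                         "jar_prefix", "properties")}
    return services
```

Adding a sixth service is then just a new section in the file, with no new parsing code.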

With the paramiko and scp Python packages installed from pip, next we are off to create the target directory if it does not exist:

def mkdir_p(ssh, remote_directory):
    # Walk the remote path piece by piece, creating any missing directory.
    with paramiko.SFTPClient.from_transport(ssh.get_transport()) as sftp:
        dir_path = str()
        for dir_folder in remote_directory.split("/"):
            if dir_folder == "":
                continue
            dir_path += r"/{0}".format(dir_folder)
            try:
                sftp.listdir(dir_path)
            except IOError:
                # listdir failed, so the directory does not exist yet
                sftp.mkdir(dir_path)
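To see what the directory walk above actually does, the path-building part can be pulled out as a pure function (a sketch of mine, not part of the original script):

```python
def parent_paths(remote_directory):
    """Return each prefix of remote_directory that mkdir_p checks, in order."""
    paths = []
    dir_path = ""
    for dir_folder in remote_directory.split("/"):
        if dir_folder == "":
            continue  # skip the empty piece before the leading slash
        dir_path += "/{0}".format(dir_folder)
        paths.append(dir_path)
    return paths
```

So for `/home/user/service1` the SFTP client is asked about `/home`, then `/home/user`, then `/home/user/service1`, creating whichever is missing.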

To upload a directory recursively:

def upload_dir(ssh, localpath, remotepath, name):
    # SCPClient comes from the scp package: from scp import SCPClient
    local_dirpath = os.path.join(localpath, name)
    mkdir_p(ssh, remotepath)
    with SCPClient(ssh.get_transport()) as scp:
        scp.put(local_dirpath, remotepath, recursive=True)

To upload a specific file:

def upload_file(ssh, localpath, remotepath):
    mkdir_p(ssh, remotepath)
    with SCPClient(ssh.get_transport()) as scp:
        scp.put(localpath, remotepath)

Using this information and the code snippets, it is quite easy to build custom scripts for uploading specific service data to specific hosts. Instead of posting too much code here, I turned it into something a bit more generic and put it on Github:
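As a sketch of the glue (the function name and the jar naming scheme are my assumptions, not the actual script): given one service's config section, the (local, remote) pairs fed to upload_file could be built like this:

```python
def build_upload_tasks(service):
    """Given one service's config dict, list (local, remote) upload pairs.

    Assumes the jar is named <jar_prefix>.jar and lives under src_dir,
    and that the properties path is relative to src_dir as well.
    """
    src = service["src_dir"]
    dst = service["dst_dir"]
    jar = service["jar_prefix"] + ".jar"
    return [
        # the service jar itself
        ("{0}/{1}".format(src, jar), "{0}/{1}".format(dst, jar)),
        # its properties file
        ("{0}/{1}".format(src, service["properties"]),
         "{0}/{1}".format(dst, service["properties"])),
    ]
```

Each pair then goes through upload_file against the host from the service's ip key, one SSH connection per service.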

It is the best thing since sliced bread. Of course it is…

And now you can tell me how it is really supposed to be done, thank you 🙂
