I have always kept myself up to date with the latest trends in web development through side projects. One of them is my own portfolio, which has been everything from a completely full-screen Flash application to the current incarnation, built on React, React Router and Redux and fed by Contentful as the CMS.
I have used "git flow" for some time to do my releases: creating release branches and tagging them with a version number and the release date. However, the actual deployment was always done manually, either by scp or rsync to my VPS hosted on Linode.
```shell
# Using my SSH key
# -p preserves modification times, access times and modes
# -r means recursive
$ scp -rp ./build markus@linode:/path/to/my/webserver
```
The code for my portfolio (and other side projects) is hosted on Bitbucket, so I got pretty excited when I heard of "Bitbucket Pipelines".
> Continuous delivery is now seamlessly integrated into your Bitbucket Cloud repositories.
What this means is that you can set up a pipeline aligned with your branch structure in Bitbucket, and when a change lands on a specified branch, something runs automatically, such as a deployment. It doesn't stop there: you can also use your own Docker image to have the environment exactly the way you want it. And it is very, and I mean very, easy to configure and set up.
By default, Bitbucket Pipelines uses a Docker image running Ubuntu (14.04 LTS) with a handful of tools preinstalled, such as wget, curl, node, npm and, of course, git. Since I use yarn for dependency management, I set up my own Docker image on Docker Hub to be used by my pipeline. Alternatively, you could have your pipeline install yarn before doing anything else.
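As a sketch, a Dockerfile for such an image could look roughly like this. The base image and the yarn version are assumptions for illustration, not my exact setup:

```dockerfile
# Sketch: start from the official Node image so node and npm
# are already available (base image and versions assumed)
FROM node:6

# Install yarn globally; pinning a version keeps builds reproducible
RUN npm install -g yarn@0.21.3
```

Once pushed to Docker Hub, the image can be referenced by name from the pipeline configuration.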
Bitbucket Pipelines is configured with a YAML file, bitbucket-pipelines.yml, placed in the root of your repository.
So what happens here is that when a commit is pushed to my "master" branch, my pipeline installs all the dependencies, runs all tests and, if everything goes well, builds the application and runs my deployment script, which puts the build files on my VPS. This is my way of doing it, but you are free to specify exactly how you want the process handled. The "default" pipeline runs on all other branches, such as "develop" or feature branches.
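A configuration along those lines could look something like the sketch below. The image name, script names and deploy script are assumptions; adjust them to your own package.json:

```yaml
# bitbucket-pipelines.yml (sketch; names are illustrative)
image: markus/node-yarn   # custom image with yarn preinstalled (name assumed)

pipelines:
  default:                # runs on every branch without its own pipeline
    - step:
        script:
          - yarn install
          - yarn test
  branches:
    master:               # runs only on commits to master
      - step:
          script:
            - yarn install
            - yarn test
            - yarn build
            - ./deploy.sh # custom deployment script (name assumed)
```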
By using environment variables inside Bitbucket, I can easily give Bitbucket Pipelines access to remote hosts. Environment variables are accessed in the configuration file by referring to them as $VAR, where VAR is the name of the variable.
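For example, a deployment step can pick such variables up instead of hard-coding credentials. The variable names below are hypothetical ones you would define yourself in the repository settings, not Pipelines built-ins:

```shell
# deploy.sh (sketch) -- copy the build output to the remote host.
# $DEPLOY_USER, $DEPLOY_HOST and $DEPLOY_PATH are hypothetical
# variables configured under the repository's Pipelines settings.
scp -rp ./build "$DEPLOY_USER@$DEPLOY_HOST:$DEPLOY_PATH"
```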
There are several ways of getting notified about a failed pipeline, and they are easy to set up. In addition, you are notified within Bitbucket itself when something goes wrong.
Supported ways of getting notified:
Bitbucket Pipelines is free for now and will continue to be free for small teams. However, if you run a bigger business and need to go enterprise, the case is different; the upcoming pricing can be found here.
The only technical limitations I am aware of are the following.
Features not yet available (as of April, 2017):
Bitbucket has great documentation on how to get started with Bitbucket Pipelines, so go ahead and read it if you want more detailed information.