Latest Blog Posts

Read more at Medium


Circle CI and DynamoDB

After searching around whilst building a Node.js API project, I realised there wasn't much documented on how to set up CircleCI with AWS DynamoDB for testing purposes within your build pipeline. I thought I'd do a quick post to summarise the process and hopefully make it clearer for others trying to achieve the same.

Getting Started

As I mentioned above, I was building a Node.js API, but this process should translate regardless of the language or tools you're using. Before going any further, make sure you've entered your AWS_ACCESS_KEY and AWS_SECRET_KEY into your CircleCI project. This can be done from your project's settings page, under the Permissions heading.

Hit the AWS Permissions link and enter your variables

Now we're going to need to edit our CircleCI config.yml file. Below is my finished version. I know it's pretty long and messy and could definitely be refactored, but it does the job for now.

version: 2
jobs:
  build:
    branches:
      only:
        - master # list of branches to build
    docker:
      - image: circleci/node
    working_directory: ~/repo
    steps:
      - checkout
      - run:
          name: Install Java
          command: 'sudo apt-get update && sudo apt-get install -y default-jre default-jdk'
      - run:
          name: Install Python
          command: 'sudo apt-get update && sudo apt-get install -y python-dev'
      - run:
          name: Install pip
          command: 'sudo curl -O https://bootstrap.pypa.io/get-pip.py'
      - run:
          name: Install pip
          command: 'sudo python get-pip.py'
      - run:
          name: Install AWS CLI
          command: 'sudo pip install awsebcli --upgrade'
      - run:
          name: Setup Container
          command: |
            curl -k -L -o dynamodb-local.tgz
            tar -xzf dynamodb-local.tgz
            java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb
          background: true
      - run:
          name: Update yarn
          command: 'yarn global add npm@latest'
      - restore_cache:
          key: dependency-cache-{{ checksum "package.json" }}
      - run:
          name: Install Dependencies
          command: yarn install
      - save_cache:
          key: dependency-cache-{{ checksum "package.json" }}
          paths:
            - node_modules
      - run:
          name: Start Server
          command: 'yarn ci-start'
          background: true
      - run:
          name: Create Table
          command: 'yarn create-db'
      - run:
          name: Load Data
          command: 'yarn load-data'
      - run:
          name: Run Tests
          command: 'yarn test'
      - run:
          name: Deploy to AWS Elastic Beanstalk
          command: 'eb init MyApp -r eu-west-2 -p "arn:aws:elasticbeanstalk:eu-west-2::platform/Node.js running on 64bit Amazon Linux/4.4.3"'
      - run:
          name: Deploy to AWS Elastic Beanstalk
          command: 'eb deploy your-env'

First up we install Java, as DynamoDB requires Java to run. The next part is optional, but as my app deploys to Elastic Beanstalk, I install Python because the AWS EB CLI requires it. Lastly we download DynamoDB directly from Amazon; I've chosen eu-west-2, but choose whichever location is nearest to you. It downloads as an archive, so we extract it and then run the .jar file. The important thing here is the option background: true. This ensures DynamoDB runs in the background and doesn't stall your build from moving on to the next stage. From here, you can launch your server as a background task, load your data in and run your tests.
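The post doesn't show what the yarn create-db script actually runs. Purely as a sketch, a hypothetical Users table definition (the table name and schema below are assumptions, not from the original project) could be saved as table.json; DynamoDB Local accepts the same create-table API shape as the real service:

```json
{
  "TableName": "Users",
  "AttributeDefinitions": [
    { "AttributeName": "id", "AttributeType": "S" }
  ],
  "KeySchema": [
    { "AttributeName": "id", "KeyType": "HASH" }
  ],
  "ProvisionedThroughput": {
    "ReadCapacityUnits": 1,
    "WriteCapacityUnits": 1
  }
}
```

You'd then point the CLI at the local instance rather than AWS with something like `aws dynamodb create-table --cli-input-json file://table.json --endpoint-url http://localhost:8000`.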

I hope this helps anyone having trouble incorporating DynamoDB into their build pipelines. If you're stuck, or have any questions, please ask!

As always, thanks for reading, hit 👏 if you like what you read and be sure to follow to keep up to date with future posts.


Creating a Google Chrome Extension

I recently created a simple Chrome Extension (CrypCheck) that displays the current live price of some of the popular cryptocurrencies. Having gone through the process, it's a lot quicker and easier than you might think to write, test and publish your own Chrome Extension.

Firstly, setup a working directory and push it up to Github (or whatever else you use).

$ mkdir my_awesome_chrome_extension
$ cd my_awesome_chrome_extension
$ git init
$ echo "#my_awesome_chrome_extension" >> README.md
$ git add .
$ git commit -m 'first commit, setting up project'
$ git remote add origin yourremote.git
$ git push origin master

Next, we're gonna want to download the starter files that Google provides in its docs. Click here and go to the Resources heading. You should end up with four files: manifest.json, popup.html, popup.js and icon.png (the icon is optional and can be replaced with any icon of your choice).

Now we're going to load the extension in so we can test it locally. Navigate to chrome://extensions/. From here, you should see a button at the top labelled Load unpacked extension. After clicking it, navigate to your plugin's working directory and select it. This loads the plug-in into your browser straight from the directory. You should be able to play with the sample app now, which lets you change colours. Now you're ready to build and test your own app!

Remember, if you require any third party libraries like jQuery, you’re going to need to load them in. Also, make sure to edit the manifest.json to reflect details of your own app, as opposed to the sample app. This is an example of what my manifest.json looks like.

{
  "manifest_version": 2,
  "name": "CrypCheck",
  "description": "This extension allows the user to check the price of Bitcoin, Bitcoin Cash, Ethereum, Litecoin, Ripple and IOTA.",
  "version": "2.0",
  "browser_action": {
    "default_icon": "icon.png",
    "default_popup": "popup.html"
  },
  "permissions": []
}

Depending on the complexity of your project, you might also want to give it some structure, especially if you find yourself loading in fonts, libraries and anything else. Below is the folder structure of my app.

One thing to note, regarding the icon.png size, is that Google requires the size to be 128x128 pixels. You can also provide additional sizes of 48x48 and 16x16.
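Those sizes map onto the manifest's icons key; a fragment like the following (file names here are placeholders) declares all three:

```json
"icons": {
  "16": "icon16.png",
  "48": "icon48.png",
  "128": "icon128.png"
}
```

The 128x128 icon is the one the Web Store listing uses; the smaller sizes show up in the extensions page and favicon slots.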

Publishing your extension

After completing your extension and making sure it works, you're gonna want to share it with the world. Head over to the Chrome Web Store Developer Dashboard and log in with your Google account. Once the page loads, hit the Add new item button. This will prompt you to upload a zip of your project, which can be done by running the following command.

$ zip -r my_awesome_chrome_extension.zip my_awesome_chrome_extension

Once the zip file uploads, you can edit the details before publishing. This includes where you want to distribute the app, the category you want it to appear in and all the other details regarding its publication.

Once you're finished tweaking, hit Publish Changes. Congrats! You've published a Chrome Extension! Make sure to delete the one in your browser and re-download it from the Web Store, so you're running the production version.

If you’re interested, you can take a look at my app’s repo as an example.

As always, thanks for reading, hit 👏 if you like what you read and be sure to follow to keep up to date with future posts.

React with CircleCI, AWS S3 and AWS CloudFront

Today we're going to whip up a simple React project with a build pipeline that deploys to an S3 bucket, distributed through CloudFront. The benefits of deploying your React app this way are that you can automate your build and deployment tasks, and by distributing your app across CloudFront you'll be able to provision a free SSL certificate for it, which is great!


Getting Started

First of all, let’s scaffold an app using create-react-app.

$ create-react-app myawesomapp
Success! Created myawesomapp at /your/path/myawesomapp
Inside that directory, you can run several commands:
yarn start
Starts the development server.
yarn build
Bundles the app into static files for production.
yarn test
Starts the test runner.
yarn eject
Removes this tool and copies build dependencies, configuration files
and scripts into the app directory. If you do this, you can’t go back!
We suggest that you begin by typing:
cd myawesomapp
yarn start
Happy hacking!

Now, set up your repo on GitHub and push your code up.

$ git add .
$ git commit -m 'first commit, scaffolds project with create-react-app'
$ git remote add origin https://yourrepo
$ git push -u origin master

Now you should have your initial project up on GitHub. From here we'll check out feature branches and merge them in as we complete them. You can use a project management tool like Waffle or Trello if you like; it helps keep track of what needs to be done. Depending on the complexity of your project, you could also check out a staging branch and merge your features into that bit by bit, but for our purposes we'll stick with merging into master. The process is identical; it just requires a bit more configuration.

$ git checkout -b 1-setting-up-build-pipeline
$ git push origin 1-setting-up-build-pipeline
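The two commands above create and publish the feature branch; the full cycle through to merge looks like this in a throwaway repo (the file name and commit messages are made up for illustration):

```shell
# Demo of the branch -> commit -> merge-to-base cycle in a temp repo
cd "$(mktemp -d)"
git init -q
git config user.email demo@example.com
git config user.name demo
git commit -q --allow-empty -m 'first commit'
base=$(git rev-parse --abbrev-ref HEAD)   # master or main, depending on your git version
git checkout -q -b 1-setting-up-build-pipeline
echo 'pipeline notes' > PIPELINE.md
git add PIPELINE.md
git commit -q -m 'sets up the build pipeline'
git checkout -q "$base"
git merge -q 1-setting-up-build-pipeline   # fast-forwards the feature in
ls PIPELINE.md
```

In real life the merge happens through a pull request rather than locally, but the history you end up with is the same.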

Setting up AWS

Head over to AWS and go to the S3 dashboard. Here we'll create a bucket and set its permissions to public.

s3 dashboard

Click Create bucket and call your bucket something useful, like the name of your app or the domain it'll live on, if you've bought one. Now we'll quickly configure the bucket's permissions and set it to public so that it can be accessed and viewed in a browser by the general public. Click the Permissions tab and then the Bucket Policy button; this will bring up an editor.

AWS uses JSON policy files to manage permissions on buckets and other services; the example below allows the public to read the contents of the S3 bucket.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AddPerm",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}

You might be tempted to change the date, but don’t. It’s taken from their docs and is linked to a particular version, changing it may break/alter your permissions.

Bucket Policy

Lastly, you’ll need to configure your S3 bucket to host static sites — this is literally just clicking a few buttons.

Head over to the Properties tab and click the Static Website Hosting card. You'll need to input the entry page, which in our case is index.html; you can also set an error page, but we'll leave that blank for now. Hit Save and we're ready to go!

Using the AWS CLI, we're gonna sync our project with our bucket. Make sure you've got your access keys set up; if not, just type aws configure in your shell and enter them there. These can be found/generated under My Security Credentials.
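Under the hood, aws configure just writes two small files; if you prefer, you can create the credentials file by hand (the values below are placeholders):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```

The default region lives in a sibling ~/.aws/config file, which aws configure also fills in for you.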

Before syncing, we’ll need to build our project for production. We’ll then sync the contents of the build folder with our S3 bucket and BAM we’re live.

$ yarn build
yarn run v1.3.2
$ react-scripts build
Creating an optimized production build...
Compiled successfully.
File sizes after gzip:
35.65 KB  build/static/js/main.35d639b7.js
299 B build/static/css/main.c17080f1.css
The project was built assuming it is hosted at the server root.
To override this, specify the homepage in your package.json.
For example, add this to build it for GitHub Pages:
"homepage" : "",
The build folder is ready to be deployed.
You may serve it with a static server:
yarn global add serve
serve -s build
✨  Done in 7.49s.
$ aws s3 sync build/ s3:// --delete
upload: build/service-worker.js to s3://
upload: build/manifest.json to s3://
upload: build/favicon.ico to s3://
upload: build/index.html to s3://
upload: build/static/css/ to s3://
upload: build/asset-manifest.json to s3://
upload: build/static/media/logo.5d5d9eef.svg to s3://
upload: build/static/css/main.c17080f1.css to s3://
upload: build/static/js/main.35d639b7.js to s3://
upload: build/static/js/ to s3://

Now head back to your S3 bucket, under the Static Website Hosting card and click the Endpoint URL. If all is good, you’ll see the React Welcome Page.

Your app's now on S3! We can automate this further by combining the commands. Add the following to your package.json under the scripts section.

{
  "name": "myawesomapp",
  "version": "0.1.0",
  "private": true,
  "dependencies": {
    "react": "^16.2.0",
    "react-dom": "^16.2.0",
    "react-scripts": "1.0.17"
  },
  "scripts": {
    "start": "react-scripts start",
    "build": "react-scripts build",
    "test": "react-scripts test --env=jsdom",
    "eject": "react-scripts eject",
    "deploy": "yarn build && aws s3 sync build/ s3:// --delete"
  }
}

This runs yarn build and syncs the contents of your build/ folder with your S3 bucket.

CircleCI Setup

Firstly, add your project by clicking Setup Project on the CircleCI Dashboard.

Next you'll want to make sure you're on CircleCI 2.0, as opposed to the older 1.0. As we're creating a React project, we want our container preconfigured with Node.

Follow the instructions laid out on the CircleCI dashboard. Your config.yml file, inside your .circleci folder, should look something like this.

# Javascript Node CircleCI 2.0 configuration file
# Check https://circleci.com/docs/2.0/language-javascript/ for more details
version: 2
jobs:
  build:
    docker:
      # specify the version you desire here
      - image: circleci/node:7.10
      # Specify service dependencies here if necessary
      # CircleCI maintains a library of pre-built images
      # documented at https://circleci.com/docs/2.0/circleci-images/
      # - image: circleci/mongo:3.4.4
    working_directory: ~/repo
    steps:
      - checkout
      # Download and cache dependencies
      - restore_cache:
          keys:
            - v1-dependencies-{{ checksum "package.json" }}
            # fallback to using the latest cache if no exact match is found
            - v1-dependencies-
      - run: yarn install
      - save_cache:
          paths:
            - node_modules
          key: v1-dependencies-{{ checksum "package.json" }}
      # run tests!
      - run: yarn test
      - run: sudo apt-get update && sudo apt-get install -y python-dev
      - run: sudo curl -O https://bootstrap.pypa.io/get-pip.py
      - run: sudo python get-pip.py
      - run: sudo pip install awscli --upgrade
      - run: aws --version
      - run: aws s3 ls
      - run: yarn run deploy

You'll notice we've added a few extra things: we install Python and download the AWS CLI, then confirm the install by printing the version and listing our S3 buckets.

Once this is done, head back to the CircleCI dashboard and hit Start Building. From here you can navigate to your Project Settings and save your AWS_ACCESS_KEY_ID and your AWS_SECRET_ACCESS_KEY.

Once this is done, push your changes up to GitHub and open a pull request. CircleCI won't register on the first pull request, so just hit merge.

You’re pretty much set now!

Check out another branch for your next feature, and once you're ready to open a pull request you'll notice CircleCI will run your tests. If everything passes and you merge, it'll deploy!

Congrats! You’ve setup a React App with a build pipeline, hosted on AWS!

Deploying your App Through AWS CloudFront

Now you've got a build pipeline set up and synced to deploy to your AWS bucket, which is great. But it'd be even better to distribute it across a CDN and slap an SSL certificate on it. Enter CloudFront…

Navigate to CloudFront’s dashboard and hit Create Distribution.

You’ll then be given two options, as we’re creating a Web App, you’ll need to select Web.

On the next screen, in the Origin Domain Name field, you’ll get a dropdown list. Choose the name of your S3 bucket. You’ll also want to change the Viewer Protocol Policy to Redirect HTTP to HTTPS. Lastly you’ll need to set the Default Root Object to index.html as that’s the entry page to our app.

Once you’re done hit Create Distribution. This will take a short while to deploy, but once it’s complete head over to the Distribution URL and you should see your app, as well as an SSL certificate.

You’re done! If you own your own domain, and it’s hosted in AWS Route 53, you can create an Alias record to point to the Distribution URL.

If you have any questions, or need a hand with anything, drop a comment and I’d be happy to help! Here’s a link to the example repo.

As always, thanks for reading, hit 👏 if you like what you read and be sure to follow to keep up to date with future posts.

React with CircleCI, AWS S3 and AWS CloudFront was originally published in CloudBoost on Medium, where people are continuing the conversation by highlighting and responding to this story.

Net Neutrality

With America gearing up for a big vote on Net Neutrality, I thought I'd do a repost to emphasise how important it is and what it could mean for the UK.

I've done a previous post covering this issue, but in a nutshell, Net Neutrality promotes a free and open internet where all data, content and applications are treated equally, without discrimination. Essentially, everything carries on as it currently does.

This image explains all you need to know about why Net Neutrality is important, because this is what it could turn out like.

Net Neutrality

An ISP could prioritise traffic however they wish and charge you for the luxury of using different applications. Whereas now you’d pay a flat fee for internet access, ISPs could impose, as the image suggests, multiple packages for different websites. It doesn’t take a genius to figure out this is negative for everyone. It’s anti-competition, anti-freedom, anti-everything. There’s no benefit, except to those right at the top.

But this is in the US, so why should I care in the UK?

For now, we're protected by EU law, which ensures an open and competitive internet market. However, with Brexit looming and all the other issues surrounding it, you might worry that this particular law won't be translated correctly in the repeal bill.

To be fair, we have a much more competitive broadband market in the UK, and it allows users to switch with ease. However, a lot of these providers also tend to offer combined broadband and online TV deals — think Sky and BT, for example. These companies have an incentive to prioritise their content over Netflix, Prime or any other streaming service.

Virgin already offers data-free messaging for Facebook Messenger, WhatsApp and Twitter. Whilst this sounds amazing, it's bad for everyone in the long term. It's anti-competitive and makes it very difficult for new startups in a similar sector to break into the market and gain a user base. Why would a user try out a cool new product if the alternative is free to use? It creates an unfair playing field, which leads to a stagnant market and less innovation.

Whilst things are better here in the UK than the US, the fact remains it’s an issue to keep in mind, especially with Brexit on the horizon. If the vote passes and the bill is repealed in the US it might pressure other countries to revisit their own Open Internet laws.

As always, thanks for reading, hit 👏 if you like what you read and be sure to follow to keep up to date with future posts.

Ethereum and Blockchain Technology

After Bitcoin's recent crazy highs and its high volatility, people are getting more interested in cryptocurrencies and how they work.

What is a Cryptocurrency?

Bitcoin, Ethereum and Litecoin are all examples of cryptocurrencies, and they are all decentralised. This means no central bank or body regulates, monitors or owns them. They work on a peer-to-peer network, meaning users connect directly to each other and make transactions through the use of cryptography — making hacking or fraud virtually impossible. Once the transactions are verified by the network, they are added to a shared public ledger — the blockchain.

The Blockchain

Blockchain Diagram

As the diagram illustrates, a transaction is requested, the network of nodes (computers) is notified, and the network validates the transaction request. Once verified, the transaction gets added to the public ledger — the blockchain. Essentially the blockchain is just a giant immutable data structure, with each event/transaction/contract appended to the end of it. If you're curious, you can view live Bitcoin transactions being added to the blockchain here.
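To make "immutable data structure" concrete, here's a toy sketch of the chaining idea (nothing like real Bitcoin, and the transactions are made up): each block's hash covers the previous block's hash, so altering any earlier entry changes every hash after it.

```shell
# Each "block" hashes its data together with the previous block's hash.
chain_hash() {
  printf '%s%s' "$1" "$2" | sha256sum | cut -d' ' -f1
}

h0=$(chain_hash "genesis" "alice pays bob 5")
h1=$(chain_hash "$h0" "bob pays carol 2")

# Tamper with the first transaction and rebuild: the final hash changes,
# even though the second transaction is untouched.
h0_bad=$(chain_hash "genesis" "alice pays bob 50")
h1_bad=$(chain_hash "$h0_bad" "bob pays carol 2")

[ "$h1" != "$h1_bad" ] && echo "tampering detected"
```

Real blockchains add proof-of-work and network-wide verification on top, but this hash-chaining is why the ledger is effectively append-only.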

Types of blockchain

There are a few different types of blockchain.

Blockchains are similar in theory, but their approaches to tasks can differ slightly. Bitcoin works as a currency, where you can transfer value directly to the recipient without any middleman (a bank). Ethereum, instead, offers a much more powerful solution that allows developers to create applications utilising blockchain technology, through smart contracts and Solidity.

What is a Smart Contract?

A smart contract is a program that executes exactly the way its creator set it up to. A developer could write a program and deploy it without fear of fraud or third-party interference, and benefit from 100% uptime. Keeping with the Bitcoin example, a simple smart contract transfers value from one user to another if the necessary conditions are met. Here, though, we're limited to currency. This is where Ethereum is different: it replaces Bitcoin's somewhat restrictive scripting language with its own, called Solidity, allowing developers to build and deploy applications with it.

What is Solidity?

Solidity is a contract-orientated, high-level language for implementing smart contracts. It was influenced by C++, Python and JavaScript and is designed to target the Ethereum Virtual Machine (EVM). It's statically typed, meaning the type of a variable is known at compile time, as opposed to run time. This has some benefits: type errors are picked up earlier in the development cycle, and it can lead to faster programs because the compiler can produce optimised machine code when it knows the types ahead of time. Having looked through some code examples, visually it looks closest to JavaScript — but it's not.

What can I build?

Anything you like! The poster example apps are voting, crowdfunding and multi-signature wallets. Ethereum is still young and obviously some of the tech is still in beta, but there seems to be a growing curiosity about building these new types of applications.


Blockchain technology seems to be gathering pace, mainly through Bitcoin, but as organisations learn its benefits and the power of the technology, it could become a new way to develop applications. Now seems a good time to start experimenting with it and seeing what real-world applications it could have.

As always, thanks for reading, hit 👏 if you like what you read and be sure to follow to keep up to date with future posts.

Ethereum and Blockchain Technology was originally published in Cryptocurrency Hub on Medium, where people are continuing the conversation by highlighting and responding to this story.

Creating your own Jekyll Theme Gem

After searching for a short while, I couldn't quite find a Jekyll theme that I liked. All the ones I came across needed a lot of work, so I thought I'd whip up my own theme and make it a gem. It's a lot quicker and easier than you think.

For my theme, I used Materialize — a front-end framework based on Material Design.

Getting Started

Firstly, head over to RubyGems and sign up for an account — you’ll need these credentials later when you push your gem up.

Jekyll already contains a new-theme command which scaffolds together a skeleton for you. It’ll look something like this.

# bash
$ jekyll new-theme testing123
create   /Users/jameshamann/Documents/Development/testing123/_layouts/page.html
create /Users/jameshamann/Documents/Development/testing123/_layouts/default.html
create /Users/jameshamann/Documents/Development/testing123/Gemfile

create /Users/jameshamann/Documents/Development/testing123/
create /Users/jameshamann/Documents/Development/testing123/LICENSE.txt

initialize /Users/jameshamann/Documents/Development/testing123/.git
create /Users/jameshamann/Documents/Development/testing123/.gitignore
Your new Jekyll theme, testing123, is ready for you in   /Users/jameshamann/Documents/Development/testing123!
For help getting started, read /Users/jameshamann/Documents/Development/testing123/

It provides a nice starter README, which explains the setup and what the theme includes. As well as this, the command creates a .gemspec file, which contains all the information and build instructions for your gem.

# ruby
# coding: utf-8

Gem::Specification.new do |spec|
  spec.name          = "testing123"
  spec.version       = "0.1.0"
  spec.authors       = [""]
  spec.email         = [""]

  spec.summary       = %q{TODO: Write a short summary, because Rubygems requires one.}
  spec.homepage      = "TODO: Put your gem's website or public repo URL here."
  spec.license       = "MIT"

  spec.files         = `git ls-files -z`.split("\x0").select { |f| f.match(%r{^(assets|_layouts|_includes|_sass|LICENSE|README)}i) }

  spec.add_runtime_dependency "jekyll", "~> 3.6"

  spec.add_development_dependency "bundler", "~> 1.12"
  spec.add_development_dependency "rake", "~> 10.0"
end

When you’re done with your theme, you’ll want to go in here and edit the details at the top, so once your Gem’s live, all the necessary information is available.

The site itself functions the same as any Jekyll site, so while developing you can use jekyll serve to boot your site up on a server; that way you can view and test the site as you develop your theme.

Testing your Gem

To test your gem, let’s build it and load it on another jekyll site.

# bash
$ gem build YOURTHEME.gemspec

This will generate a gem file within your directory; however, it'll be hidden as it's part of your .gitignore file. Next, generate a new Jekyll site, add your gem to the Gemfile (specifying its path), bundle install, change the _config.yml to use your theme and then jekyll serve. This should serve up your new site, using your gem as its theme.

# bash 
$ jekyll new mysite
  Bundler: Fetching gem metadata from
Bundler: Fetching gem metadata from
Bundler: Resolving dependencies...
Bundler: Using public_suffix 3.0.1
Bundler: Using addressable 2.5.2
Bundler: Using bundler 1.16.0.pre.3
Bundler: Using colorator 1.1.0
Bundler: Using ffi 1.9.18
Bundler: Using forwardable-extended 2.6.0
Bundler: Using rb-fsevent 0.10.2
Bundler: Using rb-inotify 0.9.10
Bundler: Using sass-listen 4.0.0
Bundler: Using sass 3.5.3
Bundler: Using jekyll-sass-converter 1.5.1
Bundler: Using ruby_dep 1.5.0
Bundler: Using listen 3.1.5
Bundler: Using jekyll-watch 1.5.1
Bundler: Using kramdown 1.16.2
Bundler: Using liquid 4.0.0
Bundler: Using mercenary 0.3.6
Bundler: Using pathutil 0.16.0
Bundler: Using rouge 2.2.1
Bundler: Using safe_yaml 1.0.4
Bundler: Using jekyll 3.6.2
Bundler: Using jekyll-feed 0.9.2
Bundler: Using minima 2.1.1
Bundler: Bundle complete! 4 Gemfile dependencies, 23 gems now installed.
Bundler: Use `bundle info [gemname]` to see where a bundled gem is installed.
New jekyll site installed in /Users/jameshamann/Documents/Development/mysite.
$ cd mysite
$ atom .
# ruby
# Gemfile
gem "YOURTHEME", :path => "path/to/your/gem"
# bash
$ bundle
Fetching gem metadata from
Fetching gem metadata from
Resolving dependencies...
Using public_suffix 3.0.1
Using addressable 2.5.2
Using bundler 1.16.0.pre.3
Using colorator 1.1.0
Using ffi 1.9.18
Using forwardable-extended 2.6.0
Using rb-fsevent 0.10.2
Using rb-inotify 0.9.10
Using sass-listen 4.0.0
Using sass 3.5.3
Using jekyll-sass-converter 1.5.0
Using listen 3.0.8
Using jekyll-watch 1.5.0
Using kramdown 1.16.2
Using liquid 4.0.0
Using mercenary 0.3.6
Using pathutil 0.16.0
Using rouge 2.2.1
Using safe_yaml 1.0.4
Using jekyll 3.6.2
Using jekyll-feed 0.9.2
Using jekyll-material-theme 0.1.0 from source at `../material-theme`
Bundle complete! 4 Gemfile dependencies, 22 gems now installed.
Use `bundle info [gemname]` to see where a bundled gem is installed.
# _config.yml
theme: YOURTHEME
# bash
$ jekyll serve
Configuration file: /Users/jameshamann/Documents/Development/mysite1234/_config.yml
Source: /Users/jameshamann/Documents/Development/mysite1234
Destination: /Users/jameshamann/Documents/Development/mysite1234/_site
Incremental build: disabled. Enable with --incremental
done in 0.436 seconds.
Auto-regeneration: enabled for '/Users/jameshamann/Documents/Development/mysite1234'
Server address:
Server running... press ctrl-c to stop.

Head over to http://localhost:4000 and you should be able to see your site, using your gem theme.

Going Live

Once you've styled, created and tested your Jekyll theme, it's time to go live! When you've edited your .gemspec file and made sure all the necessary files are included, use the build command to build the first version of your gem. Ruby gems use Semantic Versioning; your first push might not be your major release, so it defaults to version 0.1.0.

Briefly, Semantic Versioning works by incrementing the numbers based on MAJOR.MINOR.PATCH releases. A MAJOR version, as the word suggests, is a major release where you make incompatible API changes. A MINOR version adds functionality in a backwards-compatible manner. A PATCH version is for bug fixes. It's best practice to follow these guidelines when releasing/updating your gem, so keep that in mind if you tweak your theme further.
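As a quick illustration of those rules, here's a small helper (purely hypothetical, not part of any gem tooling) that bumps a MAJOR.MINOR.PATCH string the way each release type demands:

```shell
# Bump a semantic version string: bump <major|minor|patch> <version>
bump() {
  IFS=. read -r major minor patch <<EOF
$2
EOF
  case "$1" in
    major) echo "$((major + 1)).0.0" ;;
    minor) echo "$major.$((minor + 1)).0" ;;
    patch) echo "$major.$minor.$((patch + 1))" ;;
  esac
}

bump patch 0.1.0   # a bug fix: 0.1.1
bump minor 0.1.1   # new backwards-compatible feature: 0.2.0
bump major 0.2.0   # breaking change: 1.0.0
```

Note how minor and patch reset the numbers to their right; that's what lets users pin "~> 0.1" style constraints safely.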

# bash
$ gem build YOURTHEME.gemspec
$ gem push YOURTHEME.gem

This is where you'll need the login details you created earlier. Once you've filled them in, head over to RubyGems and search for your gem. It should appear in the list of results; go ahead and view the page to ensure all the details are correct. If you make a mistake, don't worry, you can pull it down using a simple command.

# bash 
$ gem yank YOURTHEME

Congrats, you’ve just published a gem! You can also add your theme to various jekyll theme sites, most of them require you to fork the repo and open a pull request with a new post about your theme.

As always, thanks for reading, hit 👏 if you like what you read and be sure to follow to keep up to date with future posts.

Cloud Computing Service Types

These days most people refer to things being in The Cloud, but what does this actually mean? A general, high-level definition is the delivery of hosted services over the internet. Email, calendars, todo lists, photos: pretty much everything fits that description. Think Google Drive backing up your photos or iCloud backing up your iPhone; both keep your data stored in the cloud. This makes it easy to access your stuff across all of your devices. There are, though, a few different levels to cloud computing.

SaaS (Software as a Service)

This is the outermost layer of the three, typically targeted at the end customer. The provider makes their application available through a web browser, mobile app or dedicated desktop app, powered by cloud infrastructure. Let's use Google Sheets as an example: the benefits to the consumer include the ability to create, edit and update spreadsheets from anywhere, with multiple users able to edit at the same time.

PaaS (Platform as a Service)

This is the middle layer of our pyramid, typically used by developers. Here a developer can deploy their app, written in whatever language they've chosen, to pre-configured cloud infrastructure. The developer doesn't deal with configuring servers, databases or operating systems; instead they can deploy their app with ease and speed. Let's use Heroku as an example: the benefit to the developer is quick, easy deployment with minimal hassle. You can push a Rails app live within minutes; it allows for rapid prototyping and gives developers the ability to deploy an app without worrying about things like server configuration.

IaaS (Infrastructure as a Service)

This is the final layer, where all the nuts and bolts lie. This level is typically used by sysadmins, who would be in charge of provisioning servers and deploying application builds. From here you're able to configure everything, from operating systems to network settings. Let's use AWS's EC2 instances as an example: from the EC2 console, when deploying a new instance, you can choose from a wide range of AMI images. When your instance is set up, you can SSH in and configure it in any way you see fit. This gives the user huge levels of customisation and flexibility, which is important if you're running apps with very specific needs.

The diagram below provides a visual representation of each level.

Beyond this there are further abstractions like DaaS (Data/Desktop as a Service), STaaS (Storage as a Service) and SECaaS (Security as a Service). These are quite specific, focused in particular sectors and not as commonly known as the main three mentioned above.

As always, thanks for reading, hit 👏 if you like what you read and be sure to follow to keep up to date with future posts.

Rails 5 API, React, Bitbucket Pipelines and AWS Elastic Beanstalk — Part One

In this post we’ll be setting up a new Rails JSON API app, with a React front end. We’ll also cover how to setup a CI/CD pipeline using Bitbucket Pipelines and AWS Elastic Beanstalk.

Technology and Versions


The end goal we’re all aiming for is to have a Rails 5 API app using Postgres, with a React front end, setup with a CI/CD pipeline through Bitbucket that automatically deploys our app to our Beanstalk instance. Simple right? Well, it is!


Setting up Rails API

First things first, we need to setup our Rails backend.

$ rails new myapp --database=postgresql --api

Navigate into your newly created app directory and boot the server up; you should see the all too familiar Rails welcome screen.

$ cd myapp
$ rails s
=> Booting Puma
=> Rails 5.1.4 application starting in development
=> Run `rails server -h` for more startup options
Puma starting in single mode...
* Version 3.10.0 (ruby 2.4.2-p198), codename: Russell's Teapot
* Min threads: 5, max threads: 5
* Environment: development
* Listening on tcp://
Use Ctrl-C to stop
Yay! We’re on Rails!

Next, we’re gonna want to install rspec as our testing suite, because we’re good and we write tests before coding, right?

Add the gem rspec-rails to your Gemfile, run bundle, and then run the install generator. Along with this we'll need the gem rails_serve_static_assets so our React front end is served correctly when built and deployed.

# add gem 'rspec-rails' and gem 'rails_serve_static_assets' to your gemfile in your app's root.
$ bundle install
$ rails g rspec:install
$ rspec
Finished in 0.00031 seconds (files took 0.09378 seconds to load)
0 examples, 0 failures

Now we’ve got our skeleton, lets push it up to our Bitbucket Repo and setup our build pipeline.

# cd into your app's root
$ git add .
$ git commit -m 'first commit, initial project setup'
$ git remote add origin
$ git push -u origin master

CI/CD Pipeline Setup

First you’ll need to enable pipeline builds, to do this click Settings from the sidebar and toggle the Enable Pipelines switch to its active state.

Enable Pipelines!

Once your project is on Bitbucket, hit the Pipelines button on the sidebar and choose Ruby from the dropdown language template selections.

Copy and Paste/Commit the File to trigger a build — make sure the Ruby version is correct

Copy the file and paste the contents into a newly created bitbucket-pipelines.yml file in the root of your app, ensuring you've stated the correct Ruby version (2.4.2). Once committed and pushed, a build will commence.

$ touch bitbucket-pipelines.yml
# copy the template contents into your newly created file
$ git add .
$ git commit -m 'adds bitbucket-pipelines.yml and triggers first build'
$ git push origin master
Our first successful (pointless) build!

At this stage, it’d be useful to add our AWS Keys so that Bitbucket can successfully deploy our app to Elastic Beanstalk. Click the Settings button and add both your AWS_ACCESS_KEY_ID and your AWS_SECRET_ACCESS_KEY to the Environment Variables section on the Settings tab. This is available from the My Security Credentials dropdown on your AWS Console, when hovering over your username in the top right. If you haven’t, you should really setup an IAM user. You shouldn’t really be using your Root Account Access Keys — but that’s out of scope for this post, if you want to see how to setup an IAM user head here.

Make sure to hit the “Secured” check box when adding your Environment Variables

Now we can setup our app to deploy to Elastic Beanstalk, on each successful build.

Open up your bitbucket-pipelines.yml file and edit it to resemble the following.

# myapp/bitbucket-pipelines.yml
# This is a sample build configuration for Ruby.
# Check our guides at for more examples.
# Only use spaces to indent your .yml configuration.
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
image: ruby:2.4.2

pipelines:
  default:
    - step:
        caches:
          - bundler
          - pip
        script:
          - apt-get update && apt-get install -y python-dev
          - curl -O
          - python
          - pip install awsebcli --upgrade
          - pip install awscli --upgrade
          - bundle install --path vendor/bundle
          - bundle exec rake
          - aws --version
          - eb init My-Application -r eu-west-2 -p "arn:aws:elasticbeanstalk:eu-west-2::platform/Puma with Ruby 2.4 running on 64bit Amazon Linux/2.5.0"
          - eb deploy MyApplication-staging
        services:
          - postgres

definitions:
  caches:
    bundler: vendor/bundle
    pip: ~/.cache/pip
  services:
    postgres:
      image: postgres
      variables:
        POSTGRES_DB: 'myapp_test'
        POSTGRES_PASSWORD: 'test_user_password'

Let's take this moment to break down what exactly our pipelines file is doing. Bitbucket builds your app in a Docker container, so if you wanted you could choose any Docker image that best suits your app. The image we've chosen is pre-configured only for Ruby 2.4.2, therefore we have to install any other dependencies or packages we might need during the build process. We also load in another Docker container for Postgres, as our tests will require a database. In order for this to work correctly you'll also have to edit your database.yml file with the following.

# myapp/config/database.yml
test:
  <<: *default
  database: myapp_test
  host: localhost

Looking at the file, we first create a cache for our gems and for our pip packages, so the build doesn't have to continuously download these dependencies; instead, it can use the ones saved in our cache folder. This dramatically speeds the build up, which is important as you only get 100 minutes free per month. Next, we install Python in order to install the AWS CLI and EB CLI. After that, we run our tests using bundle exec rake; if all tests pass, we deploy the app using the eb commands.

Elastic Beanstalk Setup

Next, you’ll need to create an application in AWS’s Elastic Beanstalk console and initialise Elastic Beanstalk within your repo. This is what the console would look like once an app’s created.

AWS Elastic Beanstalk Console

You can either create an environment now in the console or via the EB CLI. I chose to do it now in the console, as I can set up the database all in one go, but feel free to leave the database for later if you like.

Creating a new Environment for your App

You can name your environment whatever you like; I've called mine staging as I aim to have a staging site and a live site. This helps if you're going down the route of Blue-Green deployment. Choose Ruby from the platform list and then hit Configure more options.

Advanced Configuration for your App Environment

Here we’ll change the platform configuration, at the top, to use Puma instead of Passenger Standalone as the web server. Next, click the Modify button under the Database card.

Setting up Postgres DB Instance

Ensure you choose Postgres from the dropdown options, as that's the database we're using, then create a username and password and hit Save. Once this is done, you can Create Environment, which will take a few minutes as we've set up our database as well. Take 5!

Once this is set up, let's add all our environment variables. We'll also need to edit our database.yml again to ensure it can connect to our RDS Postgres instance.

# myapp/config/database.yml
production:
  <<: *default
  adapter: postgresql
  encoding: unicode
  database: <%= ENV['RDS_DB_NAME'] %>
  username: <%= ENV['RDS_USERNAME'] %>
  password: <%= ENV['RDS_PASSWORD'] %>
  host: <%= ENV['RDS_HOSTNAME'] %>
  port: <%= ENV['RDS_PORT'] %>
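It's worth knowing that Rails evaluates database.yml as ERB before parsing the YAML, which is how those <%= ENV[...] %> tags get filled in at boot. Here's a minimal plain-Ruby sketch of that two-step process; the env values are hypothetical stand-ins for what Elastic Beanstalk injects:

```ruby
require 'erb'
require 'yaml'

# Hypothetical values standing in for the RDS settings EB provides.
ENV['RDS_DB_NAME']  ||= 'myapp_production'
ENV['RDS_HOSTNAME'] ||= 'db.example.internal'

template = <<~YML
  production:
    database: <%= ENV['RDS_DB_NAME'] %>
    host: <%= ENV['RDS_HOSTNAME'] %>
YML

# Rails does the same two-step: render the ERB, then parse the YAML.
config = YAML.safe_load(ERB.new(template).result)
puts config['production']['database'] # => myapp_production
```

This is also why a missing env var fails at YAML-parse time rather than at connect time: the tag simply renders as empty.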

In the Elastic Beanstalk Console, hit the Configuration button and then click the Software Configuration card. Here, we’ll add our env vars, which should include:

Adding your Environment Variables to your EB App

When you’re done click Apply.

Lastly, before pushing for another build, we’ll initialise Elastic Beanstalk in the root of our app.

$ eb init

Select a default region
1) us-east-1 : US East (N. Virginia)
2) us-west-1 : US West (N. California)
3) us-west-2 : US West (Oregon)
4) eu-west-1 : EU (Ireland)
5) eu-central-1 : EU (Frankfurt)
6) ap-south-1 : Asia Pacific (Mumbai)
7) ap-southeast-1 : Asia Pacific (Singapore)
8) ap-southeast-2 : Asia Pacific (Sydney)
9) ap-northeast-1 : Asia Pacific (Tokyo)
10) ap-northeast-2 : Asia Pacific (Seoul)
11) sa-east-1 : South America (Sao Paulo)
12) cn-north-1 : China (Beijing)
13) us-east-2 : US East (Ohio)
14) ca-central-1 : Canada (Central)
15) eu-west-2 : EU (London)
(default is 3): 15
Select an application to use
1) My-Application
2) [ Create new Application ]
(default is 2): 1
Note: Elastic Beanstalk now supports AWS CodeCommit; a fully-managed source control service. To learn more, see Docs:
Do you wish to continue with CodeCommit? (y/N) (default is n): n

This will create a .elasticbeanstalk folder containing a config.yml file. You'll need to edit out the relevant lines in your .gitignore to be able to see the config.yml file, which should look something like this.

# myapp/.elasticbeanstalk/config.yml
branch-defaults:
  master:
    environment: MyApplication-staging
global:
  application_name: My-Application
  branch: null
  default_ec2_keyname: null
  default_platform: arn:aws:elasticbeanstalk:eu-west-2::platform/Puma with Ruby 2.4 running on 64bit Amazon Linux/2.6.0
  default_region: eu-west-2
  include_git_submodules: true
  instance_profile: null
  platform_name: null
  platform_version: null
  profile: null
  repository: null
  sc: git
  workspace_type: Application

As your project gets more complex, you can specify different branches to be deployed to different environments. For now, though, this’ll work.
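That branch-to-environment mapping lives under branch-defaults in config.yml. As a sketch (the production branch and live environment names here are hypothetical, not part of this tutorial's setup):

```yaml
branch-defaults:
  master:
    environment: MyApplication-staging
  production:
    environment: MyApplication-live
```

With something like this in place, eb deploy picks the environment based on the branch you're currently on.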

Boom you’re done! Setup is complete, look at all that green!

Our first successful (meaningful) build!

Ok, so you’ll probably be like wtf right now, because if you’ve clicked your app’s URL link or typed eb open in the root of your app, you’ll get the following…


Fear not! We haven’t actually set any routes up, if you download the logs you’ll see there’s a routing error. Don’t worry, we’ll setup our API next.

Creating our API

For this post, I’ll use the famous Todo example. Therefore, we should create a controller and model for our Todos. Good thing we can scaffold this up in rails. I still urge you to study what’s happening in each of the created files, so you at least get a grasp of how your app’s basic functions work. Rails is great for booting up a quick project, but it’s important you still understand how it’s put together.

$ rails g scaffold Todo title:string
Running via Spring preloader in process 79500
invoke active_record
identical db/migrate/20171107165836_create_todos.rb
identical app/models/todo.rb
invoke rspec
create spec/models/todo_spec.rb
invoke resource_route
route resources :todos
invoke scaffold_controller
identical app/controllers/todos_controller.rb
invoke rspec
create spec/controllers/todos_controller_spec.rb
create spec/routing/todos_routing_spec.rb
invoke rspec
create spec/requests/todos_spec.rb

If you run your tests you’ll see it’s even written some of those for you. So there’s no real excuse for you to not write tests here…

$ rspec
Pending: (Failures listed here are expected and do not affect your suite's status)
1) TodosController GET #index returns a success response
# Add a hash of attributes valid for your model
# ./spec/controllers/todos_controller_spec.rb:45
2) TodosController GET #show returns a success response
# Add a hash of attributes valid for your model
# ./spec/controllers/todos_controller_spec.rb:53
3) TodosController POST #create with valid params creates a new Todo
# Add a hash of attributes valid for your model
# ./spec/controllers/todos_controller_spec.rb:62
4) TodosController POST #create with valid params renders a JSON response with the new todo
# Add a hash of attributes valid for your model
# ./spec/controllers/todos_controller_spec.rb:68
5) TodosController POST #create with invalid params renders a JSON response with errors for the new todo
# Add a hash of attributes invalid for your model
# ./spec/controllers/todos_controller_spec.rb:78
6) TodosController PUT #update with valid params updates the requested todo
# Add a hash of attributes valid for your model
# ./spec/controllers/todos_controller_spec.rb:93
7) TodosController PUT #update with valid params renders a JSON response with the todo
# Add a hash of attributes valid for your model
# ./spec/controllers/todos_controller_spec.rb:100
8) TodosController PUT #update with invalid params renders a JSON response with errors for the todo
# Add a hash of attributes valid for your model
# ./spec/controllers/todos_controller_spec.rb:110
9) TodosController DELETE #destroy destroys the requested todo
# Add a hash of attributes valid for your model
# ./spec/controllers/todos_controller_spec.rb:121
10) Todo add some examples to (or delete) /Users/jameshamann/Documents/Development/rails_test/myapp/spec/models/todo_spec.rb
# Not yet implemented
# ./spec/models/todo_spec.rb:4
Finished in 0.16122 seconds (files took 1.84 seconds to load)
17 examples, 0 failures, 10 pending

After pushing your latest update to your repo, and once the build has succeeded, head over to your app's page (using eb open or the link in the console). You'll still see an error, but if you navigate to the path /todos you should see the following…

Empty JSON

Ok you might need glasses, but there’s a little []. That little [] means there’s no data, but our app works!

To give this a quick test, let's send a POST request up to our API and create a new Todo. I use httpie as it's pretty quick and easy to install and get set up.

$ http POST \

HTTP/1.1 201 Created
Cache-Control: max-age=0, private, must-revalidate
Connection: keep-alive
Content-Type: application/json; charset=utf-8
Date: Wed, 22 Nov 2017 20:22:29 GMT
ETag: W/"6690e9dc1a90d86fe6463be9dec11a9b"
Server: nginx/1.12.1
Transfer-Encoding: chunked
X-Request-Id: 28d7bef7-4605-4128-8c56-179ecc0a6489
X-Runtime: 0.059169
{
    "created_at": "2017-11-22T15:22:29.064Z",
    "id": 1,
    "title": "PickUpBeers",
    "updated_at": "2017-11-22T15:22:29.064Z"
}

You’ll see our request has been successfully sent, but does it display in our app?

Of course it does!
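If you'd rather drive this from Ruby than httpie, the same POST can be built with net/http. This is only a sketch: the host is hypothetical, and the request is constructed here but not actually sent.

```ruby
require 'net/http'
require 'json'
require 'uri'

# Hypothetical endpoint; substitute your Elastic Beanstalk environment's URL.
uri = URI('http://myapp-staging.example.com/todos')

# Rails' scaffolded controller expects params nested under the model key,
# i.e. params.require(:todo).permit(:title).
req = Net::HTTP::Post.new(uri, 'Content-Type' => 'application/json')
req.body = { todo: { title: 'PickUpBeers' } }.to_json

puts req.body # => {"todo":{"title":"PickUpBeers"}}

# To actually fire the request:
# res = Net::HTTP.start(uri.hostname, uri.port) { |http| http.request(req) }
```

The commented-out last line is all it takes to send it for real once you've swapped in your environment's URL.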

Now that you have some proof that it actually works, let’s add our front end quick and link it all together.

Setting up React Front-End

We’re gonna use create-react-app to generate the boilerplate code.

$ create-react-app client
Creating a new React app in /Users/jameshamann/Documents/myapp/client.
Installing packages. This might take a couple of minutes.
Installing react, react-dom, and react-scripts...
> fsevents@1.1.2 install /Users/jameshamann/Documents/myapp/client/node_modules/fsevents
> node install
[fsevents] Success: "/Users/jameshamann/Documents/Development/myapp/client/node_modules/fsevents/lib/binding/Release/node-v57-darwin-x64/fse.node" already installed
Pass --update-binary to reinstall or --build-from-source to recompile
> uglifyjs-webpack-plugin@0.4.6 postinstall /Users/jameshamann/Documents/Development/myapp/client/node_modules/uglifyjs-webpack-plugin
> node lib/post_install.js
+ react@16.0.0
+ react-dom@16.0.0
+ react-scripts@1.0.17
added 1267 packages in 38.902s
Success! Created client at /Users/jameshamann/Documents/Development/myapp/client
Inside that directory, you can run several commands:
npm start
Starts the development server.
npm run build
Bundles the app into static files for production.
npm test
Starts the test runner.
npm run eject
Removes this tool and copies build dependencies, configuration files
and scripts into the app directory. If you do this, you can’t go back!
We suggest that you begin by typing:
cd client
npm start
Happy hacking!

Now your keen eyes will be thinking: how can we run Node and Rails at the same time, what are we going to do?! Fear not, we'll run Rails on port 3001 whilst running our React app on port 3000. This speeds up development massively, as you can see changes instantly instead of having to run the build command each time you want to view an update. Edit your package.json to proxy requests to port 3001.

# myapp/client/package.json
{
  "name": "client",
  "version": "0.1.0",
  "proxy": "http://localhost:3001",
  "private": true,
  "dependencies": {
    "react": "^16.0.0",
    "react-dom": "^16.0.0",
    "react-scripts": "1.0.17"
  },
  "scripts": {
    "start": "PORT=3000 react-scripts start",
    "build": "react-scripts build",
    "test": "react-scripts test --env=jsdom",
    "eject": "react-scripts eject"
  }
}

This next section is completely optional. It’s useful though as it automates a lot of the tedious tasks you’ll be doing on a regular basis.

First up, we’re going to add the foreman gem to our gem file and bundle it, this allows us to write Procfiles to manage our development and build tasks.

Once bundled, create a file called Procfile in the myapp/client directory. It should look like this.

# myapp/client/Procfile
web: npm start
api: cd .. && bundle exec rails s -p 3001

It’s pretty self explanatory, but here we’re starting up node on port 3000 and rails on port 3001.

Next we’re going to create a few rake tasks, one for development and one for production. The default will be for development and we’ll have to specify rake start:production if we want a production build.

$ touch myapp/lib/tasks/start.rake

# myapp/lib/tasks/start.rake
namespace :start do
  task :development do
    exec 'cd client && foreman start -f Procfile'
  end

  desc 'Start production server'
  task :production do
    exec 'NPM_CONFIG_PRODUCTION=true npm run clientbuild && cd client'
  end
end

desc 'Start development server'
task :start => 'start:development'

These rake tasks will speed up your workflow by huge amounts; they make booting up your dev server and generating new builds effortless. Give it a go: try running rake start.

$ rake start 
19:22:35 web.1  | started with pid 81472
19:22:35 api.1 | started with pid 81473
19:22:35 web.1 |
19:22:35 web.1 | > client@0.1.0 start /Users/jameshamann/Documents/Development/rails_test/myapp/client
19:22:35 web.1 | > PORT=3000 react-scripts start
19:22:35 web.1 |
19:22:36 api.1 | => Booting Puma
19:22:36 api.1 | => Rails 5.1.4 application starting in development
19:22:36 api.1 | => Run `rails server -h` for more startup options
19:22:36 api.1 | Puma starting in single mode...
19:22:36 api.1 | * Version 3.10.0 (ruby 2.4.2-p198), codename: Russell's Teapot
19:22:36 api.1 | * Min threads: 5, max threads: 5
19:22:36 api.1 | * Environment: development
19:22:36 api.1 | * Listening on tcp://localhost:3001
19:22:36 api.1 | Use Ctrl-C to stop
19:22:37 web.1 | Starting the development server...
19:22:37 web.1 |
19:22:41 web.1 | Compiled successfully!
19:22:41 web.1 |
19:22:41 web.1 | You can now view client in the browser.
19:22:41 web.1 |
19:22:41 web.1 | Local: http://localhost:3000/
19:22:41 web.1 | On Your Network:
19:22:41 web.1 |
19:22:41 web.1 | Note that the development build is not optimized.
19:22:41 web.1 | To create a production build, use npm run build.
19:22:41 web.1 |

Pretty nifty ay?
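In case the task :start => 'start:development' line looks like magic: it's plain Rake prerequisite syntax, so invoking :start first invokes the namespaced task. A self-contained sketch (recording which tasks ran instead of exec-ing real servers, since exec replaces the process):

```ruby
require 'rake'
extend Rake::DSL

# Record invocations instead of starting anything.
ran = []

namespace :start do
  task :development do
    ran << 'start:development'
  end
end

# :start is just an alias that depends on start:development.
task :start => 'start:development'

Rake::Task['start'].invoke
puts ran.inspect # => ["start:development"]
```

Running the default task triggers the development one, which is exactly why rake start booted foreman above.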

Let's configure our production task correctly now. You'll notice from the code that it runs a script within the app directory, not the client directory; this means we'll need to create another package.json file, but this one will live in our app's root.

$ npm init
This utility will walk you through creating a package.json file.
It only covers the most common items, and tries to guess sensible defaults.
See `npm help json` for definitive documentation on these fields
and exactly what they do.
Use `npm install <pkg>` afterwards to install a package and
save it as a dependency in the package.json file.
Press ^C at any time to quit.
package name: (myapp) myapp
version: (1.0.0)
entry point: (index.js)
test command:
git repository: (
license: (ISC)

This will generate a package.json, which you can edit to replicate the version below.

# myapp/package.json
{
  "name": "myapp",
  "engines": {
    "node": "6.3.1"
  },
  "scripts": {
    "build": "cd client && npm install && npm run build && cd ..",
    "deploy": "cp -a client/build/. public/",
    "clientbuild": "npm run build && npm run deploy && echo 'Client built!'"
  }
}

All we’ve done is automated running commands like npm run deploy, npm run build. As both our API and Front End live on the same domain, for now, we’ll need to copy our build files over to the public folder in the app’s root. Everything in the public folder is your app’s front-end after it’s been built for production.

Run a production build using rake start:production, then push the contents to your repo to trigger a build.

When navigating to your app you should see this:

Congrats, you’re done! You’ve got yourself a Rails API, React app setup with a automated build pipeline, well done! Obviously there’s a lot more to cover, but the building blocks are in place for you to carry on building out your app.

In my next post I’ll look into implementing the basic CRUD functions of the app.

As always, if you have any questions please drop a comment!

Thanks for reading, hit 👏 if you like what you read and be sure to follow to keep up to date with future posts.

Rails 5 API, React, Bitbucket Pipelines and AWS Elastic Beanstalk — Part One was originally published in CloudBoost on Medium, where people are continuing the conversation by highlighting and responding to this story.

Hi Martin Raskovsky, thanks for getting in touch!

Looking at your logs now and trying to figure out what the issue is. I could be completely wrong, but it looks like it's failing during the asset compilation stage, so there might be something wrong somewhere in your project.

Looking at your second error when running "eb deploy", it looks like Postgres isn't running. I've had a similar error before and it's usually down to the Postgres server being offline or not running.

I’d be more than happy to help you with this, would you be able to share your project/repo with me so I can checkout the code and try deploying it myself?

Thanks, James
