Bringing to Life an Open Source Software Project via Github & Jekyll – Part 1

Starting with Github, Automatic Page Generation & Jekyll

It’s time for another blog series! This is a series I’m starting to outline that crazy complex site I’m building to prove out all sorts of things, all located at http://adron.me. So far it’s just a site that holds portfolio information about my coding, biking and related interests. However I’m using this as a base template, which anybody can use via the github repo I’ve created, simply titled Me, to start and scale their own personal site. But beyond that, I’ll be using practices and technology that can be used to truly scale large sites with lots of users. If you have any questions, comments or suggestions about this series please ping me on Twitter @adron or leave a comment on this blog itself.

First things first, let’s get our git repository set up on github with the appropriate issues list and project page via gh-pages (jekyll).

Create a new github project.

Creating a new github project, setting it up with a default Node.js readme file and .gitignore. (Click for full size image)

Once the repository is created github will display the repository page. On this page you can see that the README.md and .gitignore files have been created with some basic defaults for a Node.js project. At the top right click the Settings button to navigate to the settings page.

Settings, get there. (Click on image for full size)

On the settings page scroll down until you see the section for Github Pages with the Automatic Page Generator button. Click that button.

Github Pages Automatic Page Generator. (Click for full size image)

You’ll be directed to create a page, with default data as shown below.

Default Github Pages Project Page. (Click for full size image)

Scroll down on this page and click to create the automatically generated page. You’ll be sent to a page to select a theme. I just went with the default since I’ll delete it later to create the Jekyll project page instead. Click Publish on this screen.

Default Template Selection for Github Pages. (Click for full size image)

Once this page generates you’ll be directed back to the github repository. At the top of the page there will be a link to the newly generated page. You can click on this to navigate to it and see what it looks like. Also note that a new branch named gh-pages has been created, even though it isn’t displayed by default. This new branch includes all of the files for this Github Pages project page.

Github Pages Automatic Page Generator generated a branch for the new page. (Click for a full size image)

Change the branch to the gh-pages branch and you’ll see that the branch has entirely different files than the master branch.

The default files created in an automatically generated Github Pages site. (Click for full size image)

If you click on the link at the top of the page, you’ll see where the current automatically generated page is and where the future jekyll site we’ll build will be located. The page that was generated (unless you chose another theme) will look like this.

Github Pages default page with a default theme. (Click for full size image)

Now we have an appropriate branch and we’re ready to toss a jekyll project in its place. If a default template and related content works for your project, that’s a great way to go. However some projects may want a blog or other content, just a bit more than the default generated page, and that is what we’ll create with this jekyll site. It’s also very easy to use jekyll to create an image portfolio or a host of other types of sites, all backed by a git repository on github. I’ll keep moving now, on to the jekyll site!

A Github Jekyll Site

First get a clone of the site that was generated in the above instructions.

[sourcecode language=”bash”]
git clone git@github.com:Adron/Me.git
[/sourcecode]

Now we have a good working directory where the repository is cloned. First we need to switch to the gh-pages branch, so get a list of all branches (the -a switch shows all branches, including the remote branches).

[sourcecode language=”bash”]
git branch -a
[/sourcecode]

The results will display with the full branch names for the remote branches.

[sourcecode language=”bash”]
git branch -a
* master
remotes/origin/HEAD -> origin/master
remotes/origin/gh-pages
remotes/origin/master
remotes/origin/theme-all-by-itself
[/sourcecode]

The asterisk shows the current active branch. The branch that needs to be edited is the remotes/origin/gh-pages branch. Check it out with the following command.

[sourcecode language=”bash”]
git checkout remotes/origin/gh-pages
[/sourcecode]

Which will then print out the following content.

[sourcecode language=”bash”]
Note: checking out 'remotes/origin/gh-pages'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by performing another checkout.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -b with the checkout command again. Example:

git checkout -b new_branch_name

HEAD is now at d430b0c… … etc. etc…
[/sourcecode]

What this basically means is that we’ve checked out a commit directly rather than a local branch, so any commits made here wouldn’t be attached to any branch. To fix this we’ll create a local branch matching the remotes/origin/gh-pages branch. Do this with the following command.

[sourcecode language=”bash”]
git branch gh-pages
[/sourcecode]

Then if you get a list of the existing local branches, the master and gh-pages branches will be available locally.

[sourcecode language=”bash”]
git branch
* (no branch)
gh-pages
master
[/sourcecode]

Now switch to that branch. Since we’re currently on (no branch), this will switch us to the local gh-pages branch with the following command.

[sourcecode language=”bash”]
git checkout gh-pages
[/sourcecode]
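As a side note, the branch creation and checkout steps can be collapsed into a single command (a sketch; this assumes origin/gh-pages exists and you haven’t already created the local branch):

```shell
# Create a local gh-pages branch from the remote branch and switch to it
# in one step; the new branch tracks origin/gh-pages automatically.
git checkout -b gh-pages origin/gh-pages
```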

Now delete all of the files in that directory that github automatically created. In their place add a README.md and a .gitignore file. In the commands below I’m using the Sublime Text 2 command line tooling to open the files in Sublime Text 2.
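If you’d rather not delete the files by hand, git can remove everything it tracks in one shot (a sketch; run it from the repository root while on the gh-pages branch):

```shell
# Remove every tracked file from the working tree and stage the deletions;
# the .git directory itself is untouched.
git rm -rf .
```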

[sourcecode language=”bash”]
subl .gitignore
[/sourcecode]

Add the following content to the .gitignore file.

[sourcecode language=”bash”]
_site/
*.DS_Store
.DS_Store
[/sourcecode]
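As an optional sanity check, git can confirm the rules match the files they’re meant to (the paths below are just examples of files the rules should cover):

```shell
# Both paths should be reported as ignored, thanks to the _site/ and
# .DS_Store rules in the .gitignore file.
git check-ignore -v _site/index.html .DS_Store
```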

Save that file. Create the README.md and…

[sourcecode language=”bash”]
subl README.md
[/sourcecode]

…add some content to it.

[sourcecode language=”bash”]
My Awesome Header
===
Description: An appropriate description.
Documentation

Dammit, WRITE YOUR DOCUMENTATION!!!!
[/sourcecode]

I added a little markdown to that README.md example just so there is something more than plain text displaying.

Add these files to the local repository, commit, and then push them up to the remote gh-pages branch.

[sourcecode language=”bash”]
git add -A
git commit -m 'Adding initial .gitignore and README.md content.'
git push origin gh-pages
[/sourcecode]
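Optionally, the local branch can be pointed at the remote branch so later pushes and pulls don’t need the explicit arguments (a convenience, not something this walkthrough requires, and it needs a reasonably recent git):

```shell
# Point the local gh-pages branch at origin/gh-pages so future pushes
# and pulls can be a bare `git push` / `git pull`.
git branch --set-upstream-to=origin/gh-pages gh-pages
```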

Now in this directory we’re about to get things kicking with Jekyll. First, get Jekyll.

[sourcecode language=”bash”]
sudo gem install jekyll
[/sourcecode]

For your specific OS you may want to check out the installation instructions in the actual jekyll repository. There is more of a breakdown there of what various operating systems need.

Once it is installed, we’re ready to use jekyll to generate static content and even run the jekyll server to view the site locally. Now let’s get some content together that jekyll will know how to generate into a static site.

Create a folder called “_layouts” in the local repository. Because the directory name starts with an underscore, jekyll will not copy it into the generated static content; by convention the _layouts directory holds the templates that provide a standard layout for the rest of the pages in the site.

[sourcecode language=”bash”]
mkdir _layouts
[/sourcecode]

After creating the new folder, create a default.html inside the folder.

[sourcecode language=”html”]
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<title> {% if page.title %} {{ page.title }} | {% endif %} Adron Hall</title>
<link rel="stylesheet" href="css/styles.css" />
</head>
<body>

<div id="main">

<header>
<h1> Adron’s Github Project Pages </h1>
</header>

<nav role="navigation">
<ul>
<li><a href="/">Home</a></li>
<li><a href="/about">About</a></li>
<li><a href="/contact">Contact</a></li>
</ul>
</nav>

{{ content }}

<footer>
<p>&copy; Adron’s Github Project Pages 2013 | All Rights Reserved. </p>
</footer>

</div>
</body>
</html>
[/sourcecode]

In the above HTML the {{ page.title }} item is a template variable that will display the title of the page when jekyll generates the static content. Once this is created add an index.html page to the root of the repository.

[sourcecode language=”bash”]
subl index.html
[/sourcecode]

Now add the following content to the index.html page and save it.
[sourcecode language=”html”]

---
layout: default
---

<section role="banner">
The banner will go here eventually.
</section>

<section class="content">
<p>
Welcome to Adron Hall’s Github Project Pages all manually created with jekyll!! Please, check out the blog at <a href="http://compositecode.com">Composite Code</a>!
</p>
</section>
[/sourcecode]
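As an aside, the front matter block at the top of a page is also where a title can be set for the {{ page.title }} variable in the layout; a hypothetical extra page sketches this (about.html and its title are my examples, not files from this walkthrough):

```shell
# Write a hypothetical about.html whose front matter sets both the layout
# and a title; jekyll exposes the title to the template as page.title.
cat > about.html <<'EOF'
---
layout: default
title: About
---
<p>About this project.</p>
EOF
```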

Now create a stylesheet and respective static directory for it.

[sourcecode language=”bash”]
mkdir css
[/sourcecode]

In it, create a styles.css file and add the following content, which Dave Gamache created. I’ve included it below with all the appropriate licensing information and related content. Save this file when done.

[sourcecode language=”css”]
/*
* Skeleton V1.1
* Copyright 2011, Dave Gamache
* http://www.getskeleton.com
* Free to use under the MIT license.
* http://www.opensource.org/licenses/mit-license.php
* 8/17/2011
*/

/* Table of Content
==================================================
#Reset & Basics
#Basic Styles
#Site Styles
#Typography
#Links
#Lists
#Images
#Buttons
#Tabs
#Forms
#Misc */

/* #Reset & Basics (Inspired by E. Meyers)
================================================== */
html, body, div, span, applet, object, iframe, h1, h2, h3, h4, h5, h6, p, blockquote, pre, a, abbr, acronym, address, big, cite, code, del, dfn, em, img, ins, kbd, q, s, samp, small, strike, strong, sub, sup, tt, var, b, u, i, center, dl, dt, dd, ol, ul, li, fieldset, form, label, legend, table, caption, tbody, tfoot, thead, tr, th, td, article, aside, canvas, details, embed, figure, figcaption, footer, header, hgroup, menu, nav, output, ruby, section, summary, time, mark, audio, video {
border: 0 none;
font: inherit;
margin: 0;
padding: 0;
vertical-align: baseline;
}
article, aside, details, figcaption, figure, footer, header, hgroup, menu, nav, section {
display: block;
}
body {
line-height: 18px;
}
ol, ul {
list-style: none; }
blockquote, q {
quotes: none; }
blockquote:before, blockquote:after,
q:before, q:after {
content: '';
content: none; }
table {
border-collapse: collapse;
border-spacing: 0; }

/* #Basic Styles
================================================== */
body {
background: url(../images/body_bg.png) repeat;
font-family: 'Dosis', sans-serif;
font-size:14px;
font-weight:500;
color:#E8E8E8;
text-shadow:0 1px 0 rgba(0, 0, 0, 0.7);
-webkit-font-smoothing: antialiased; /* Fix for webkit rendering */
-webkit-text-size-adjust: 100%;
}

.max-image { width:100%; height:auto; }
/* #Typography
================================================== */
h1, h2, h3, h4, h5, h6 {font-weight: normal;}
h1 a, h2 a, h3 a, h4 a, h5 a, h6 a { font-weight: inherit; }
h1 { font-size: 26px; line-height: 22px; margin-bottom: 14px; color:#FFCC00;}
h2 { font-size: 20px; line-height: 24px; margin-bottom: 10px; color:#FFF; font-weight:600; }
h3 { font-size: 16px; line-height: 22px; margin-bottom: 10px; color:#FFF; }
h4 { font-size: 15px; line-height: 20px; margin-bottom: 4px; color:#bc4444; }
h5 { font-size: 14px; line-height: 24px; }
h6 { font-size: 14px; line-height: 21px; }
.subheader { color: #777; }

p { margin: 0 0 20px 0; }
p img { margin: 0; }
p.lead { font-size: 21px; line-height: 27px; color: #777; }

em { font-style: italic; }
strong { font-weight: bold; color: #333; }
small { font-size: 80%; }

/* Blockquotes */
blockquote, blockquote p { font-size: 18px; line-height: 24px; color: #333; }
blockquote { margin: 0; padding: 0 20px 0 20px; border-left: 2px solid #F87536; }
blockquote cite { display: block; color: #ff8a00; }
blockquote cite:before { content: "\2014 \0020"; }
blockquote cite a, blockquote cite a:visited, blockquote cite a:visited { color: #555; }

hr { border: solid #ddd; border-width: 1px 0 0; clear: both; margin: 10px 0 30px; height: 0; }

/* #Links
================================================== */
a, a:visited { color: #f35f2a; text-decoration: none; outline: 0; }
a:hover, a:focus { color: #FFF; }
p a, p a:visited { line-height: inherit; }

/* #Lists
================================================== */
ul, ol { margin-bottom: 20px; }
ul { list-style: none outside; }
ol { list-style: decimal; }
ol, ul.square, ul.circle, ul.disc { margin-left: 30px; }
ul.square { list-style: square outside; }
ul.circle { list-style: circle outside; }
ul.disc { list-style: disc outside; }
ul ul, ul ol,
ol ol, ol ul { margin: 4px 0 5px 30px; font-size: 90%; }
ul ul li, ul ol li,
ol ol li, ol ul li { margin-bottom: 6px; }
li { line-height: 18px; margin-bottom: 12px; }
ul.large li { line-height: 21px; }
li p { line-height: 21px; }

/* #Images
================================================== */

img.scale-with-grid {
max-width: 100%;
height: auto; }

/* #Buttons
================================================== */

/*.button,
button,
input[type="submit"],
input[type="reset"],
input[type="button"] {
background: #3973a6; Old browsers
background: #3973a6 -moz-linear-gradient(top, rgba(255,255,255,.2) 0%, rgba(0,0,0,.2) 100%); FF3.6+
background: #3973a6 -webkit-gradient(linear, left top, left bottom, color-stop(0%,rgba(255,255,255,.2)), color-stop(100%,rgba(0,0,0,.2))); Chrome,Safari4+
background: #3973a6 -webkit-linear-gradient(top, rgba(255,255,255,.2) 0%,rgba(0,0,0,.2) 100%); Chrome10+,Safari5.1+
background: #3973a6 -o-linear-gradient(top, rgba(255,255,255,.2) 0%,rgba(0,0,0,.2) 100%); Opera11.10+
background: #3973a6 -ms-linear-gradient(top, rgba(255,255,255,.2) 0%,rgba(0,0,0,.2) 100%); IE10+
background: #3973a6 linear-gradient(top, rgba(255,255,255,.2) 0%,rgba(0,0,0,.2) 100%); W3C
border: 1px solid #3973a6;
border-top: 1px solid #3973a6;
border-left: 1px solid #3973a6;
color: #FFF;
display: inline-block;
font-size: 12px;
font-weight: bold;
text-decoration: none;
text-shadow: 0 1px rgba(33, 77, 121, .75);
cursor: pointer;
margin-bottom: 20px;
line-height: normal;
padding: 8px 10px; }

.button:hover,
button:hover,
input[type="submit"]:hover,
input[type="reset"]:hover,
input[type="button"]:hover {
color: #222;
background: #FF8A00; Old browsers
background: #FF8A00 -moz-linear-gradient(top, rgba(255,255,255,.3) 0%, rgba(0,0,0,.3) 100%); FF3.6+
background: #FF8A00 -webkit-gradient(linear, left top, left bottom, color-stop(0%,rgba(255,255,255,.3)), color-stop(100%,rgba(0,0,0,.3))); Chrome,Safari4+
background: #FF8A00 -webkit-linear-gradient(top, rgba(255,255,255,.3) 0%,rgba(0,0,0,.3) 100%); Chrome10+,Safari5.1+
background: #FF8A00 -o-linear-gradient(top, rgba(255,255,255,.3) 0%,rgba(0,0,0,.3) 100%); Opera11.10+
background: #FF8A00 -ms-linear-gradient(top, rgba(255,255,255,.3) 0%,rgba(0,0,0,.3) 100%); IE10+
background: #FF8A00 linear-gradient(top, rgba(255,255,255,.3) 0%,rgba(0,0,0,.3) 100%); W3C
border: 1px solid #888;
border-top: 1px solid #aaa;
border-left: 1px solid #aaa; }

.button:active,
button:active,
input[type="submit"]:active,
input[type="reset"]:active,
input[type="button"]:active {
border: 1px solid #666;
background: #ccc; Old browsers
background: #ccc -moz-linear-gradient(top, rgba(255,255,255,.35) 0%, rgba(10,10,10,.4) 100%); FF3.6+
background: #ccc -webkit-gradient(linear, left top, left bottom, color-stop(0%,rgba(255,255,255,.35)), color-stop(100%,rgba(10,10,10,.4))); Chrome,Safari4+
background: #ccc -webkit-linear-gradient(top, rgba(255,255,255,.35) 0%,rgba(10,10,10,.4) 100%); Chrome10+,Safari5.1+
background: #ccc -o-linear-gradient(top, rgba(255,255,255,.35) 0%,rgba(10,10,10,.4) 100%); Opera11.10+
background: #ccc -ms-linear-gradient(top, rgba(255,255,255,.35) 0%,rgba(10,10,10,.4) 100%); IE10+
background: #ccc linear-gradient(top, rgba(255,255,255,.35) 0%,rgba(10,10,10,.4) 100%); W3C }

.button.full-width,
button.full-width,
input[type="submit"].full-width,
input[type="reset"].full-width,
input[type="button"].full-width {
width: 100%;
padding-left: 0 !important;
padding-right: 0 !important;
text-align: center; }

Fix for odd Mozilla border & padding issues
button::-moz-focus-inner,
input::-moz-focus-inner {
border: 0;
padding: 0;
}*/

/* #Tabs (activate in tabs.js)
================================================== */
/*ul.tabs {
display: block;
margin: 0 0 20px 0;
padding: 0;
border-bottom: solid 1px #ddd; }
ul.tabs li {
display: block;
width: auto;
height: 30px;
padding: 0;
float: left;
margin-bottom: 0; }
ul.tabs li a {
display: block;
text-decoration: none;
width: auto;
height: 29px;
padding: 0px 20px;
line-height: 30px;
border: solid 1px #ddd;
border-width: 1px 1px 0 0;
margin: 0;
background: #f5f5f5;
font-size: 13px; }
ul.tabs li a.active {
background: #fff;
height: 30px;
position: relative;
top: -4px;
padding-top: 4px;
border-left-width: 1px;
margin: 0 0 0 -1px;
color: #111;
-moz-border-radius-topleft: 2px;
-webkit-border-top-left-radius: 2px;
border-top-left-radius: 2px;
-moz-border-radius-topright: 2px;
-webkit-border-top-right-radius: 2px;
border-top-right-radius: 2px; }
ul.tabs li:first-child a.active {
margin-left: 0; }
ul.tabs li:first-child a {
border-width: 1px 1px 0 1px;
-moz-border-radius-topleft: 2px;
-webkit-border-top-left-radius: 2px;
border-top-left-radius: 2px; }
ul.tabs li:last-child a {
-moz-border-radius-topright: 2px;
-webkit-border-top-right-radius: 2px;
border-top-right-radius: 2px; }

ul.tabs-content { margin: 0; display: block; }
ul.tabs-content > li { display:none; }
ul.tabs-content > li.active { display: block; }

/* Clearfixing tabs for beautiful stacking */
/*ul.tabs:before,
ul.tabs:after {
content: ‘\0020’;
display: block;
overflow: hidden;
visibility: hidden;
width: 0;
height: 0; }
ul.tabs:after {
clear: both; }
ul.tabs {
zoom: 1; }*/

/* #Forms
================================================== */

form {
margin-bottom: 20px; }
fieldset {
margin-bottom: 20px; }
input[type="text"],
input[type="password"],
input[type="email"],
textarea,
select {
border: 1px solid #ccc;
padding: 6px 4px;
outline: none;
-moz-border-radius: 2px;
-webkit-border-radius: 2px;
border-radius: 2px;
font: 13px "HelveticaNeue", "Helvetica Neue", Helvetica, Arial, sans-serif;
color: #777;
margin: 0;
width: 210px;
max-width: 100%;
display: block;
margin-bottom: 20px;
background: #fff; }
select {
padding: 0; }
input[type="text"]:focus,
input[type="password"]:focus,
input[type="email"]:focus,
textarea:focus {
border: 1px solid #aaa;
color: #ffcc00;
-moz-box-shadow: 0 0 3px rgba(0,0,0,.2);
-webkit-box-shadow: 0 0 3px rgba(0,0,0,.2);
box-shadow: 0 0 3px rgba(0,0,0,.2); }
textarea {
min-height: 60px; }
label,
legend {
display: block;
font-weight: bold;
font-size: 13px; }
select {
width: 220px; }
input[type="checkbox"] {
display: inline; }
label span,
legend span {
font-weight: normal;
font-size: 13px;
color: #444; }

/* #Misc
================================================== */
.remove-bottom { margin-bottom: 0 !important; }
.half-bottom { margin-bottom: 10px !important; }
.add-bottom { margin-bottom: 20px !important; }
[/sourcecode]

Now we can use the jekyll command to generate the static content and review it. In a subsequent post I’ll cover how to run the jekyll server to test out what the page looks like with all relative links.

[sourcecode language=”bash”]
jekyll
[/sourcecode]

…and after execution the following message will display at the command line.

[sourcecode language=”bash”]
WARNING: Could not read configuration. Using defaults (and options).
No such file or directory - /Users/adronhall/Codez/Me/_config.yml
Building site: /Users/adronhall/Codez/Me -> /Users/adronhall/Codez/Me/_site
Successfully generated site: /Users/adronhall/Codez/Me -> /Users/adronhall/Codez/Me/_site
[/sourcecode]

This warning isn’t super relevant just yet. We’ve got a basic generated jekyll site located at /Users/adronhall/Codez/Me/_site, where that path is the path to the _site directory within your git repository. Once the content is verified via the _site directory the site is ready to push.
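A quick way to spot-check that the layout was applied is to look for layout-only text in the generated file (an optional check; run it from the repository root after generating):

```shell
# The generated page should contain text that only exists in the layout
# template, confirming index.html was wrapped in default.html.
grep "Github Project Pages" _site/index.html
```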

[sourcecode language=”bash”]
git add -A
git commit -m 'Adding the index.html and basic layouts template for a starter jekyll site.'
git push origin gh-pages
[/sourcecode]

After it pushes, it may take up to 10 minutes before the page displays properly. This is likely because Github has to queue the jekyll build that statically generates your content for the gh-pages branch. To view the newly created site and see if the generation has occurred, navigate to http://yourusername.github.io/Me/. In the case of my personal site, it’s http://adron.github.io/Me/. Also note, at this stage the page and CSS look pretty bad. But work with me here; we’re going to add elements that will align the theme with the actual project as things come together. This css however will be the core base for everything we’ll work through.

This concludes part 1. I’ve covered creating the automatically generated Github Pages template pages, how to delete those, and how to set up a manually created jekyll site. This gives a huge amount of control over the site: you can edit it, add a blog, or use any other jekyll and JavaScript capable site features.

In subsequent parts of this blog series I’ll cover deployment to AWS (Amazon Web Services), how to set up beanstalk via AWS, setting up Riak for a data back end, and how to distribute this web application and Riak to offer a massively scalable site architecture (because you never know when you’ll get slammed with page hits on your personal site, right?). Among all these massive distributed how-to topics I’ll also dive into how exactly I’ve got a template and node.js application up and running via AWS and using Riak. So lots of material and information coming in the subsequent parts of this blog series.

🙂 Please subscribe (see upper right follow link to get emails, and no, you’ll get NO spam from me, just blog entries), comment below if you have any questions & let me know if you’d like me to add in any other how-to articles. Thanks & Enjoy!

Coder Brad Heller, Cloudability & Riak

This coming Monday, April 29th, we’ll have Brad Heller coming in to give us the low down on how Cloudability has used Riak (click the link for the meetup event). He sent me a short outline of the topics he plans to hit. Here’s a few key data problems they’ve run into and how they’ve solved those problems with Riak and other solutions.

@Cloudability

    • What does Cloudability do?
    • Why do we like Riak?
    • Challenges of raw storage in our data pipeline.
        • Idempotency
        • Scale
        • Reliability
    • S3 vs. Riak
        • Queryable S3
        • Maintain “State”
    • Data Pipelines: Specific problems.
        • Idempotency
            • What has been processed?
            • How to figure out what to reprocess?
        • Scaling this.
            • Pipelines scale horizontally
            • Backpressure / failure modes
    • How do we use Riak to address these?
        • Raw data goes directly in Riak (metadata record, raw data record).
        • 2i on metadata record to find what needs processing.
        • 2i on metadata record to find stuff to reprocess.
        • Atomic writes: Let Riak assign keys.
            • Duplicate data is OK; idempotency is maintained further down the chain.
            • Don’t worry about collisions, the last one wins.
            • Predictable keys are helpful though.
        • Scale is built in!!
            • Read/write throughput gets better with more nodes.
    • What we don’t use.
        • Online MapReduce.
            • Only used for backoffice processes.
            • When something isn’t indexed, identify the bad data and add an index.
        • Key filtering.
            • Riak craps its pants.
    • Problems we’ve had.
        • Ops is easy, but lots of unexpected behavior.
            • Riak eats all memory, craps its pants.
        • Erlang is hard.
            • Everyone on our team hates Erlang except me (Brad).
            • Stacktraces will confuse you a lot at first.
        • Riak is “expensive” compared to S3.
            • We run a ring of 5 m1.xlarge with EBS + provisioned IOPS.

That covers all the topics Brad wants to hit, which I’m sure brings up a lot of questions already! The presentation will be great and we’ll be sure to have a good chunk of time for questions and a few beers afterwards!

For more on Brad, give him a follow on twitter @bradhe and endorse him for random skill sets like dishwasher programming and coffee cup holding on LinkedIn!

Check out Cloudability for keeping tabs on your cloud compute spending. They are the leaders in helping companies manage their spend across many cloud offerings including AWS, Github, Heroku, Softlayer, Engineyard and others.

Red Hat, OpenShift PaaS and Cartridges for Riak

Today I participated in the OpenShift Community Day here in Portland at the Doubletree Hotel. One of the things I wanted to research was the possibility of putting together an OpenShift Origin Cartridge for Riak. As with most PaaS systems this isn’t the most straightforward process. The reason is simple: OpenShift and CloudFoundry have a deployment model based around certain conventions that don’t fit the multi-node deployment of a distributed database. But there are ways around this, and my intent was to come up with a plan for a Cartridge that commits these work-arounds.

After reading the “New OpenShift Cartridge Format – Part 1” by Mike McGrath @Michael_Mcgrath I set out to get a Red Hat Enterprise Linux image up and running. The quickest route to that was to spool up an AWS EC2 instance. 30 seconds later I had exactly that up and running. The next goal was to get Riak installed and running on this instance. I wasn’t going to actually build a cluster right off, but I wanted at least a single running Riak node to use for trying this out.

In the article “New OpenShift Cartridge Format – Part 1” Mike skips the specifics of the cartridge and focuses on getting a service up and running that will be turned into a Cartridge. As Mike writes,

What do we really need to do to create a new cartridge? Step one is to pick something to create a cartridge for.

…to which my answer is, “alright, creating a Cartridge for Riak!”  😉

However, even though I have the RHEL instance up and running already, with Riak installed, I decided I’d follow along with his exact example too. So I dove in with

[sourcecode language=”bash”]
sudo yum install httpd
[/sourcecode]

to install Apache. With that done I now have Riak and Apache installed on the RHEL EC2 instance. The goal with both of these services is to get them running as a regular local Unix user in a home directory.

With both Riak and Apache installed, it’s time to create a local directory for each of the respective tools. However, before that, with this version of Linux on AWS we’ll need to create a local user account.

[sourcecode language=”bash”]
useradd -c "Adron Hall" adron
passwd adron

Changing password for user adron.
New password:
Retype new password:
passwd: all authentication tokens updated successfully.
[/sourcecode]

Next I switched to the user I created (su adron) and created directories in the home path for getting Apache and Riak up and running locally. I reviewed the rest of the steps in making the Cartridge with Apache and then immediately started running into a few issues getting Riak set up just the way I need it to build a cartridge around it. At least, with my first idea of how I should build a cartridge.

At this point I decided we needed to have a conversation about strategy, so I talked with Bill Decoste, Ryan and some of the other Red Hat team on hand today. After a discussion with Bill it sounds like there are some possibilities for getting Riak running via OpenShift Origin Cartridges.

The Strategy

The plan now is to set up a cartridge that launches a single Riak instance. That instance, with post-launch scripts, can then join itself to the overall Riak cluster. The routing can be done via the internal routing and some other capabilities that are inherent to OpenShift itself. It sounds like it’ll take a little more tweaking, but the possibility is there for the near future.
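To make that concrete, here’s a rough sketch of what such a post-launch script could contain (the file name, seed node address, and the script itself are all my assumptions, not an OpenShift or Riak convention):

```shell
# Hypothetical post-launch hook: once the cartridge's Riak node is up,
# join it to an existing cluster through a known seed node.
cat > post_launch_join.sh <<'EOF'
#!/bin/sh
riak start
riak-admin cluster join riak@seed-node.example.com
riak-admin cluster plan
riak-admin cluster commit
EOF
chmod +x post_launch_join.sh
```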

At this point I sat down and read up on the Cartridge a little more before taking off for the day. Overall a good start and interesting to get an overview of the latest around OpenShift.

Thanks to the Red Hat Team, have a great time at the OpenStack Conference and I’ll be hacking on this Cartridge strategy!


Using Bosh to Bootstrap Cloud Foundry via Stark & Wayne Consulting

I finally sat down and really started to take a stab at Cloud Foundry Bosh. Here’s the quick lowdown on installing the necessary bits and getting an initial environment built. Big thanks out to Dr Nic @drnic, Luke Bakken & Brian McClain @brianmmcclain for initial pointers to where the good content is. With their guidance and help I’ve put together this how-to. Enjoy…  boshing.

Prerequisites

Step: Get an instance/machine up and running.

To make sure I had a totally clean starting point I started out with an AWS EC2 instance to work from. I chose a micro instance loaded with Ubuntu. You can use your local workstation if you want, it really doesn’t matter. The one catch, of course, is that you’ll have to have a supported *nix based operating system.

Step: Get things updated for Ubuntu.

[sourcecode language=”bash”]
sudo apt-get update
[/sourcecode]

Step: Get cURL to make life easy.

[sourcecode language=”bash”]
sudo apt-get install curl
[/sourcecode]

Step: Get Ruby, in a proper way.

[sourcecode language=”bash”]
\curl -L https://get.rvm.io | bash -s stable
source ~/.rvm/scripts/rvm
rvm autolibs enable
rvm requirements
[/sourcecode]

Enabling autolibs sets things up so that rvm will install all the requirements with the ‘rvm requirements’ command. It used to just show you what you needed, and then you’d have to go through and install them yourself. This requirements phase includes some specifics such as git, gcc, sqlite, and other tools needed to build, execute and work with Ruby via rvm. Really helpful things overall, which will come in handy later when using this instance for whatever purposes.

Finish up the Ruby install and set it as our default ruby to use.

[sourcecode language=”bash”]
rvm install 1.9.3
rvm use 1.9.3 --default
rvm rubygems current
[/sourcecode]

Step: Get bosh-bootstrap.

bosh-bootstrap is the easiest way to get started with a sample bosh deployment. For more information check out Dr Nic’s Stark and Wayne repo on Github. (also check out the Cloud Foundry Bosh repo.)

[sourcecode language=”bash”]
gem install bosh-bootstrap
gem update --system
[/sourcecode]

Git was installed a little earlier in the process, so now set the default user name and email so that when bosh clones repositories it will know what to use.

[sourcecode language="bash"]
git config --global user.name "Adron Hall"
git config --global user.email plzdont@spamme.bro
[/sourcecode]

Step: Launch a bosh deploy with the bootstrap.

[sourcecode language="bash"]
bosh-bootstrap deploy
[/sourcecode]

You’ll receive a prompt, and here’s what to hit to get a good first deploy.

Stage 1: I select AWS, simply as I’ve no OpenStack environment. One day maybe I can try out the other option; until then I went with the tried and true AWS. Here you’ll need to enter the access key & secret key from the security settings of your AWS account.

For the region, I selected #7, us-west-2, which translates to the data center in Oregon. Why Oregon? Because I live in Portland and that data center is about 50 miles away. Otherwise it doesn’t matter which region you select; any region can spool up almost any type of bosh environment.

Stage 2: In this stage, select default by hitting enter. This will choose the default bosh settings. The default uses a medium instance to spool up a good default Cloud Foundry environment. It also sets up a security group specifically for Cloud Foundry.

Stage 3: At this point you’ll be prompted to select what to do; choose to create an inception virtual machine. After a while, sometimes a few minutes, sometimes an hour or two depending on internal and external connections, you should see the “Stage 6: Setup bosh” results.

Stage 6: Setup bosh

setup bosh user
uploading /tmp/remote_script_setup_bosh_user to Inception VM
Initially targeting micro-bosh...
Target set to `microbosh-aws-us-west-2'
Creating initial user adron...
Logged in as `admin'
User `adron' has been created
Login as adron...
Logged in as `adron'
Successfully setup bosh user
cleanup permissions
uploading /tmp/remote_script_cleanup_permissions to Inception VM
Successfully cleanup permissions
Locally targeting and login to new BOSH...
bosh -u adron -p cheesewhiz target 54.214.0.15
Target set to `microbosh-aws-us-west-2'
bosh login adron cheesewhiz
Logged in as `adron'
Confirming: You are now targeting and logged in to your BOSH

ubuntu@ip-yz-xyz-xx-yy:~$

If you look in your AWS Console you should also see an instance with a key pair named “inception” and another under the “microbosh-aws-us-west-2” name. The inception instance is an m1.small while the microbosh instance is an m1.medium.
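You can do the same check from the command line. A rough sketch with the AWS CLI, assuming the us-west-2 region chosen above; the key-name filter value follows the “inception” name from my run:

```shell
# List instance id, type and state for the boxes bosh-bootstrap created.
aws ec2 describe-instances \
  --region us-west-2 \
  --filters "Name=key-name,Values=inception" \
  --query "Reservations[].Instances[].[InstanceId,InstanceType,State.Name]" \
  --output table
```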

That should get you going with bosh. In my next entry around bosh I’ll dive into some of Dr Nic & Brian McClain’s work before digging into what exactly Bosh actually is. As one may expect, some pretty cool stuff is coming out of Stark & Wayne, so keep an eye over there.

Write the Docs, Portland Proper Brew, Bike n’ Hack and Polyglot Conference 2013


I just wrapped up a long weekend of staycation. Monday kicked off Write the Docs this week and today, Tuesday, I’m getting back into the saddle.

Write the Docs

The Write the Docs Conference this week, a two day affair, has kicked off an expanding community around documentation creation. The conference is about what documentation is and how we create it as technical writers, writers, coders and others in the field.

Not only is it about those things, it is about how people interact and why documentation is needed in projects. This is one of the things I find interesting: it seems obvious, but it’s entirely not obvious because of the battle between good documentation, bad documentation, or a complete lack of documentation. The latter being the worst situation.

The Bloody War of Documentation!

At this conference it has been identified that the ideal scenario is for documentation to start before any software is even built. I do and don’t agree with this, because I know we must avoid BDUF (Big Design Up Front). But we must take this idea of documentation first in the appropriate context of how we’re speaking about documentation at the conference. Just as tests & behaviors identified up front, before the creation of the actual implementation, are vital to solid, reliable, consistent, testable & high quality production software, good documentation is absolutely necessary.

There are some situations, the exceptions, such as with agencies that create software, in which the software is throwaway. I’m not and don’t think much of the conference is about those types of systems. What we’ve been speaking about at the conference is the systems, or ecosystems, in which software is built, maintained and used for many years. We’re talking about the APIs that are built and then used by dozens, hundreds or thousands of people. Think of Facebook, Github and Twitter. All of these have APIs that thousands upon thousands use everyday. They’re successful in large part, extremely so, because of stellar documentation. In the case of Facebook, there’s some love and hate to go around because they’ve gone between good documentation and bad documentation. However whenever it has been reliable, developers move forward with these APIs and have built billion dollar empires that employ hundreds of people and benefit thousands of people beyond that.

The developers speaking at the conference, the developers in the audience, and this developer too all tend to agree: build that README file before you build a single other thing within the project. Keep that README updated, keep it marked up and easy to read, and make sure people know what your intent is as best you can. Simply put, document!

You might also have snarkily asked: does Write the Docs have docs? Why yes, it does:

http://docs.writethedocs.org/ <- Give ’em a read, they’re solid docs.

Portland Proper Brew

Today, while catching up on my iPhone on the news & events from my staycation, I took a photo and used Stitch to add some arrows to it. Kind of a Portland Proper Brew (PPB) with documentation (see what I did there!). It exemplifies a great way to start the day.

Every day I bike (or ride the train or bus) into downtown Portland, anywhere from 5-9 kilometers, and swing into Barista on 3rd. Barista is one of the finest coffee shops in Portland & the world. If you don’t believe me, drag your butt up here and check it out. Absolutely stellar baristas, the best coffee (Coava, Ritual, Sightglass, Stumptown & others), and pretty sweet digs to get going in the morning.

I’ll have more information soon on a new project I’ve kicked off. Right now it’s called Bike n’ Hack, a scavenger-style code hacking & bicycle riding urban awesome game. If you’re interested in hearing more about the project, the game & how everything will work, be sure to contact me via twitter @adron or jump into the Bike n’ Hack github organization, and the team will be adding more information about who, what, where, when and why this project is going to be a blast!

Polyglot Conference & the Zombie Apocalypse

I’ll be teaching a tutorial, “Introduction to Distributed Databases”, at Polyglot Conference in Vancouver in May! So it has begun & I’m here for you! Come and check out how to get a Riak deployment running in your survival bunker’s data center. Whether it’s zombies or just your pointy-haired boss’s apocalypse scenarios, we’ll discuss how consistent hashing, hinted handoff and gossiping can help your systems survive infestations! Here’s a basic outline of what I’ll cover…

Introducing Riak, a database designed to survive the Zombie Plague. Riak Architecture & 5 Minute History of Riak & Zombies.

Architecture deep dive:

  • Consistent Hashing, managing to track changes when your kill zone is littered with Zombies.
  • Intelligent Replication, managing your data against each of your bunkers.
  • Data Re-distribution, sometimes they overtake a bunker, how your data is re-distributed.
  • Short Erlang Introduction, a language fit for managing post-civil society.
  • Getting Erlang

Installing Riak on…

  • Ubuntu, RHEL & the Linux variety.
  • OS X, the only user-centered computers to survive the apocalypse.
  • From source, maintained and modernized for humanity’s survival.
  • Upgrading Riak, when a bunker is retaken from the zombies, it’s time to update your Riak.
  • Setting up

Devrel – A developer’s machine w/ Riak – how to manage without zombie bunkers.

  • 5 nodes, a basic cluster
  • Operating Riak
  • Starting, stopping, and restarting
  • Scaling up & out
  • Managing uptime & data integrity
  • Accessing & writing data
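The devrel flow in the outline above looks roughly like this from a Riak source checkout. A sketch, assuming the stock devrel layout of five dev nodes under ./dev:

```shell
# Build five development nodes from source (the devrel make target).
make devrel

# Start every node, then join nodes 2-5 to node 1.
for d in dev/dev{1,2,3,4,5}; do "$d"/bin/riak start; done
for d in dev/dev{2,3,4,5}; do "$d"/bin/riak-admin cluster join dev1@127.0.0.1; done

# Review and commit the planned cluster changes, then check membership.
dev/dev1/bin/riak-admin cluster plan
dev/dev1/bin/riak-admin cluster commit
dev/dev1/bin/riak-admin member-status
```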

Polyglot client libraries

  • JavaScript/Node.js & Erlang for the zombie curing mad scientists.
  • C#/.NET & Java for the zombie creating corporations.
  • Others, for those just trying to survive the zombie apocalypse.

If you haven’t registered for the Polyglot Conference yet, get registered ASAP as it could sell out!

Some of the other tutorials that are happening, that I wish I could clone myself for…

That’s it for updates right now, more code & news later. Cheers!