Buying a Leopard!

system76-small

UPDATE (January 24th, 2017): After I wrote this, I spoke with the System76 team and I’m getting the chance to go out and tour their Denver headquarters. That happened well after I made my purchase, and after all of the following was written, but for full transparency I’ve added this note. I’m also aiming to put together a full write-up of my System76 trip, with Denver tidbits and more! Until then, here’s the review…

In the trailing days of 2016, after having moved to Redmond, Washington, I sat working at my desktop workstation. This workstation, which still exists, is an iMac with an i7, 16 GB RAM, a 256 GB SSD, a 1 GB video card, and a 1 TB secondary drive. The machine is a 27” all-in-one design, and the screen is rather beautiful. But as I ran a build and tried to play Transport Tycoon in the background, the machine sputtered a bit. It was definitely maxed out doing a Go code build, putting together a Docker image, and spinning it up for go-live, all at the same time my game ran in the background. This machine has served me extremely well, I thought; at over 5 years old it had surpassed the standard lifespan of peak Apple oomph. Maybe, I thought, it was time to dig into a serious machine with some premium hardware again.

In that moment I thought about the last dedicated, custom-built, super powerful workstation I’d had. It was a powerful machine with a nice form factor, and it easily drove two giant 27” screens. However, that machine had lived and finished its useful life over 6 years before 2016 had even started. But it was a sweet machine that offered a lot of productive gaming and code-writing efficiency. It was, thus, time to get in gear and get a machine again.

Immediately I thought through a few of the key features I wanted and other prerequisites of purchase.

  1. Enough RAM and processor power to drive my aforementioned gaming, docker, and code building scenario with ease.
  2. SSD drive of at least 1TB with at least a beefy 8GB Video Card.
  3. It needed to run, with full support, not-Windows. Ubuntu would be fine, but if any Linux was installed from factory or at least fully supported on the hardware I put together, that would suffice.
  4. If I were to buy it from a company, it had to be a company that wasn’t some myopic afterthought of 50s era suburbia (i.e. I didn’t really want to deal with Dell or Alienware again after the XPS 13 situation). This definitely narrowed down the options.

I started digging into hardware specifications and looking into form factors, cases, and all the various parts I’d need for a solid machine. In parallel I started checking out several companies.

  • System76 – Located in Denver, I was curious about this company and had been following them for some time. I had seen a few of the laptops over the years but had never seen or used any of their desktops.
  • Los Alamos Computers, which is now LAC Portland! – Holy smokes, I had not realized this company moved. They definitely meet the 4th criterion above.
  • Puget Systems is a company located somewhere in the Puget Sound area and used to be called Puget Sound Systems. After digging I found they are located in a suburb of Seattle, in a town called Auburn. I didn’t want to rule them out so I kept them on the list and started researching.
  • Penguin Computing is another one of the companies, and kind of a mainstay of Linux machines. They were a must-have in the run-up.
  • Think Penguin is another I dove into.
  • Emperor Linux is another company I found specializing in Linux machines.
  • Zareason was another that specialized in Linux machines.

First Decision > Build or Buy?

I wrangled hardware specifications and the idea of building my own machine for some time. I came to the conclusion that the time-versus-money investment, for me, was on the side of buying a built machine. This first decision was pretty easy, but educating myself on the latest hardware was eye-opening and a lot of fun. In the end, however, it was better to let a builder get it done right instead of me creating a catastrophe for myself and nuking a whole weekend!

Decision Buy!

Second Decision > Who should I buy from?

I dug through each of the computer builders previously mentioned. I scouted out where each was located, the general process they used to build machines, what testing they did, and their involvement in the community, and finally did a cost and parts review.

Each of the builders has a lot of positives in regards to Linux; the only one I was hesitant about at first was Puget Systems, because by default their machines come with Windows 10. However, after asking around and reading reviews online, I found they do ship Linux and have a solid skill set around it. Puget remained a leader in the selection process.

pugetsystems

I went through Los Alamos Computers, which I realized is now LAC Portland (win for Portland!), then Penguin Computing, Think Penguin, and Emperor Linux. All had great skills and ethos around Linux. LAC definitely had the preferable physical location (I mean, I do love Portland!), but each came up short in their customer-facing desktop options. For a company purchase or another scenario I’d likely buy a ThinkPad or other computing platform running Linux from them, but for my personal workstation each was disqualified.

The last two I started checking out were Zareason and System76. I had been following System76 for a while, and a few things on their site had caught my eye. It led me to realize they’re located out of Denver; being a transit nerd, I noticed one of their website’s coffee shop video scenes had the RTD Light Rail passing in the background. But all things aside, I started checking out the cases and hardware each builder puts in a box.

  • Berkeley BART – Zareason had several cases, as shown below. With each of these I checked out the hardware options.

  • Sounder – Next up I checked out a number of Puget Systems machines.

  • RTD Light Rail – Next I started looking at System76 machines.

 

Challenge: Extra nerd credit if you guess why I used each of those pictures for each of those companies!

After working through and reviewing prices, features, hardware, and options, things were close. I started reviewing location and what I could derive about each company’s community involvement in Linux, how they’re involved locally, and what the word is about them in their respective communities. Out of the three, I ended up not finding any customers to talk to about Zareason. For Puget, I found one friend who had purchased a box from them a few years ago, and for System76 I actually found two different bits of user feedback within an hour or so of digging around.

Kenny Spence @tekjava – Kenny and I have known each other for more years than I’m going to count. We got to meetup here in Seattle recently and he showed me his System76 laptop. The build quality was good and the overall review he gave me was a +1. Before this he’d mentioned in Twitter DM convo that this was the case, and I’d taken his word for it back then.

Dev Shop X – A group of individuals I had met 3 years ago at the Portland @OSBridge Conference. I spoke to them again and found they were still using their System76 machines with no real complaints. They’d also bought XPS 13 laptops, well before the model I did, and had a few complaints about those. The short conversation ended with them offering a +1 for System76.

With the reviews from trusted sources, seeing the involvement and related culture of System76 I decided that they would be the builder of choice.

Decision System76 Leopard WS!

Leopard Workstation

With the decision made, I pulled the trigger on the purchase. In spite of the holiday season, I still received the machine in short order. It arrived at my door via UPS in a box, ya know, like a computer does when it’s shipped somewhere. 😉

system76-leopard-01

I cleared off the desk next, and dug into the box.

system76-leopard-02

system76-leopard-03

The computer was packaged cleanly and neatly, with minimal waste compared to some I’ve seen. So far so good. I pulled the pieces gently from the box. The first thing I extracted was the static bag holding all of the extra cords and attachments that came with the various hardware components but weren’t needed in the build. Another plus in my opinion; many who haven’t built computers themselves might not notice or even care, but I’m glad to have the extra pieces for this machine or anything else I might need them for.

system76-leopard-04

The next thing I pulled out of the box was a thank you letter envelope with cool sticker and related swag.

system76-leopard-05

Stickers!

system76-leopard-07

That was it for peripheral things just floating around in the box. Next, out came the computer itself.

system76-leopard-06

It was wrapped in a static-free bag itself, as it should be. I did notice a strange ink-like dusting of debris in and around the box. I wasn’t sure, and still am not sure today, what exactly it was. I cleaned it up immediately. It wasn’t excessive, but it left slight marks on the white table which required a little scrubbing to remove.

Once everything was out of the box, I removed it all from the envelopes and static-free bags and placed it on the desk for a simple shot of all the parts.

system76-leopard-08

Next I went through the steps of desk cleanup again and then connected my 28-port USB hub, Razer mouse, and a keyboard to the machine. It was finally time to boot this machine up!

system76-leopard-09

As for the screen you see, it’s an LG 34” ultra-wide monitor with a slight curve to it. Yes, it’s awesome, and yes, it actually makes it relatively easy to not need dual monitors.

BOOTING!

system76-leopard-10

Ubuntu started, monitor fussing.

system76-leopard-11

I toyed around and realized I had, for whatever reason, plugged in the HDMI cable when I should have used the other monitor connection. As soon as I changed the connection, more resolution options immediately appeared and the monitor and related elements were detected appropriately!

On the side of the machine is a clear window cut into the case to view the internals. The cords were managed well and the overall build was very clean. Upon boot the graphics card immediately lit up too; its blue tone gave a nice light within the room.

system76-leopard-12

Ubuntu booted up cleanly, and I might add, crazy bloody fast.

system76-leopard-13

Here’s a non-flash shot of the machine and monitor side by side.

system76-leopard-14

I then changed the positioning; as you can see, the lighting actually changed dramatically just by repositioning the hardware and the rear light I was shooting with.

system76-leopard-15

Lights-off shot. The window is beautiful!

system76-leopard-16

A slightly closer shot of the GTX 1080 humming away inside.

system76-leopard-17

The Ubuntu on Leopard WS Review

So far I’ve done a ton of coding & game playing on the machine. Here’s a breakdown of some specifics, some respective comments, and a full read on the machine’s specifications.

  • Ubuntu 16.10 (64-bit)
  • 4.0 GHz i7-6850K (3.6 up to 4.0 GHz – 15 MB Cache – 6 Cores – 12 threads)
  • High Performance Self-Contained Liquid Cooler
  • 32 GB Quad Channel DDR4 at 2400MHz (2× 16 GB)
  • 8 GB GTX 1080 with 2560 CUDA Cores
  • Chipset Intel® X99
  • Front: 2× USB 3.0 Type-A, 1× USB 2.0 Type-A, 1× eSATA
  • Rear: 3× USB 3.0 Type-A, 1× USB 3.1 Type-A, 1× USB 3.1 Type-C, 4× USB 2.0 Type-A, 1× PS/2
  • Gigabit Ethernet, optional Intel® Wireless-AC (a/b/g/n/ac)
  • GTX 1080: DVI-D, HDMI, 3× DisplayPort
  • Audio Front: Headphone Jack, Mic Jack
  • Audio Rear: 8 channel (HDMI, S/PDIF), Mic Jack, Line In, Line Out
  • Power Supply 750 W 80+ Certified (80% or greater power efficiency)
  • Dimensions 15.8″ × 8.3″ × 19.5″ (40.13 × 21.08 × 49.53cm)

Gaming

Using Steam I downloaded several games, including my latest addiction, Transport Tycoon. The others included Warhammer 40k: Dawn of War, Stronghold 3, Stellaris, Sid Meier’s Civ V, Master of Orion, and Cities: Skylines. I loaded each of these up and played for at least 20-30 minutes, with every graphics detail maxed out and full audio features enabled. Where the option existed, I ran the game at the full resolution of 3440×1440.

Not a blip, stir, or flake-out of any sort. The color was solid (which obviously is also largely the monitor), and being able to move around these games in their respective 3D worlds was exceptional. All the while, the speed of elapsed time in games like Transport Tycoon and Cities: Skylines barely slowed at all, no matter how massive the city or layout was.

At this point I’ve also added about 16 hours of Transport Tycoon play, and I’ve built absurdly extensive layouts (hundreds of trains plus massively grown cities), and this processor and video card handle it. The aforementioned previous desktop easily choked to 1/10th the speed of this beast while running the game.

More on the gaming elements of this machine in the coming days.

Coding

I used the JetBrains Toolbox to download IntelliJ IDEA, WebStorm, CLion, DataGrip, Project Rider, and RubyMine. I dug around for some sample projects and slung together some basic “hello world!” apps to build with each of the IDEs. All built at absurd rates, but nothing real specific since I didn’t load any large projects just yet.

One of the things I did do was load Go so I could continue work on the Data Diluvium Project that I’ve started (Repo on Github). To hack around with Go I also installed Atom and Visual Studio Code. Both editors were screaming fast on this machine, and with the 34” display I could easily have both open to test out features side by side. Albeit, that makes shortcut combos a nightmare! DON’T DO THIS AT HOME!

Build times for the C, Go, and C# projects I tried out were all crazy fast, but I’m holding off posting any results until I get some more apples-to-apples comparisons put together. I’m also aiming to post numbers versus some other hardware so there are baselines against which to compare the build times.

More on the coding and related projects in the coming days too.

Important Software

You may wonder, if you’re not an Ubuntu or Linux user, what about all the other stuff like office software and… (big long list goes here). Well, most of the software we use is either available on Linux these days or has a comparable product that is. There really aren’t many things that keep me – or would keep anybody – tied to OS-X/MacOS or Windows. Here are a few I’ve tried out and am using regularly that are 1-to-1 across Windows, OS-X, and Linux.

  • JetBrains – as mentioned before, these work across all the platforms. They’re excellent developer tools.
  • Spotify – even though it states that there hasn’t been support or what not for the app for many months, it still works seamlessly on Linux. That’s what you get when you build an app for a solid platform – one doesn’t have to fix shit every week like on OS-X or Windows.
  • Slack – Slack is available on Linux too. After all, the native (or pseudo-native) app is built on Electron, which at its core runs on Node.js. Thus, feature parity is pretty much 100%. If you’re going to use Slack, it’s not an excuse to be stuck on Windows or OS-X. The choice of platform is yours.

Summary

me-horns-up

NOTE: Nobody paid me a damned penny to write any of this, btw. I reviewed all of these things because I love writing about my nerd adventures. No shill shit here. With that stated…

I have more things to review across all of these platforms and much more to write about this mean machine from System76. However, this review has gotten long enough. The TL;DR of it: if you’re looking for a machine, then System76 definitely gets the horns from me! Highly recommended!

The Latest 5th Generation Dell XPS 13 Developer Edition

Just about 4 weeks ago now I purchased a Dell XPS 13 Developer Edition directly from Dell. The reason I purchased this laptop is because of two needs I have while traveling and writing code.

  1. I wanted something small and compact that had reasonable power, and…
  2. It needed to run Linux (likely Ubuntu, but I’d have taken whatever) from the factory and have active support.

Here’s my experience with this machine so far. There are lots of good things, and some really lousy things about this laptop. This is the lowdown on all the plusses and minuses. But before I dive into the plusses and minuses, it is important to understand more of the context in which I’m doing this review.

  • Dell didn’t send me a free laptop. I paid $1869 for the laptop. Nobody has paid me to review this laptop. I purchased it and am reviewing it purely out of my own interest.
  • The XPS 13 Developer Edition that I have has 8 GB RAM, a 512 GB SSD, and the stunningly beautiful 13.3-inch UltraSharp™ QHD+ (3200 x 1800) InfinityEdge Touch Display.
  • Exterior Chassis Materials -> CNC machined aluminum w/ Edge-to-edge Corning® Gorilla® Glass NBT™ on QHD+ w/ Carbon fiber composite palm rest with soft touch paint.
  • Keyboard -> Full size, backlit chiclet keyboard; 1.3mm travel
  • Touchpad -> Precision touchpad, seamless glass integrated button

Negatives

The Freakin’ Keyboard and Trackpad

Let’s talk about the negatives first; this way, if you’re looking into purchasing, it’ll be a faster path through the decision tree. The first and LARGEST negative is the keyboard. Let’s just talk about the keyboard for a moment. When I first tweeted about this laptop, one of the first responses I got was a complaint – and a legitimate one at that – about the blasted keyboard.

There are plenty of complaints and issues listed here, here, and here via the Dell Support site. Twitter is flowing with complaints about the keyboard too. To summarise: the keyboard sticks. The trackpad, by association, also has some sticky behavior.

Now I’m going to say something that I’m sure some might fuss and hem and haw about. I don’t find the keyboard all that bad, considering it’s not an Apple chiclet keyboard and Apple trackpad, which basically make everything else on the market seem unresponsive and unable to deal with tactile response in a precise way. In that sense, the Dell keyboard is fine. I just have to be precise and understand how it behaves. So far, that seems to resolve the issue for me, same for the trackpad related issues. But if you’re someone who doesn’t type with distinct precision – just forget this laptop right now. It’s not even worth the effort. However, if you are precise, read on.

The Sleeping Issue

When I first received the laptop several weeks ago it had a sleeping issue. Approximately 1 out of every 3-5 times I’d put the computer to sleep, it wouldn’t resume appropriately; it would either hang or not resume at all. This problem, however, has a pretty clean fix, available here.

Not Performant

Ok, so it has 8 GB RAM, an SSD, and an i7 proc. However, it does not perform better than my 2-year-old MacBook Air (i7, 8 GB RAM, 256 GB SSD). It’s horribly slow compared to my 15” Retina with 16 GB RAM and an i7 proc. Matter of fact, it doesn’t measure up well against any of these Apple machines. Linux, however, has a dramatically smaller footprint and generally performs a lot of tasks as well as or better than OS-X.

When I loaded Steam and tried a few games out, the machine wasn’t even as performant as my Dell 17” from 2006. That’s right, I didn’t mistype that, my Dell from 2006. So WTF you might ask – I can only guess that it’s the embedded video card and shared video card memory or something. I’m still trying to figure out what the deal is with some of these performance issues.

However… on to the positives, because there are also positives about the performance it does have.

Positives

The Packaging

The first thing you’ll notice, and the first positive I found (albeit an insignificant one), is the packaging; it made for a nice first experience. Dell has really upped its game in this regard. Instead of playing the low-end game, Dell seems to have put some style and design into the packaging.

01

The box was smooth and seamless in most ways, giving a very elegant feel. When I opened up the box, the entire laptop was wrapped in cut plastic to protect all the surfaces.

02

03

Removing the cut plastic is easy enough. It is held together with just some simple stickiness (some type of clean glue).

04

Once it’s off, the glimmer of the machine starts to really show. The aluminum surface material is really, really nice.

05

The beauty of an untainted machine running Ubuntu Linux. Check out that slick carbon fiber mesh too.

06

Here it is opened and unwrapped, not yet turned on, and the glimmer of that glossy screen can already be seen.

07

Here’s a side-by-side comparison of the glossy hi-res screen against the flat standard-res screen. Both are absolutely gorgeous, regardless of which you get.

08

Booting up you can see the glimmer on my XPS 13.

09

The Screen

Even during simple bootup and the first configuration of Ubuntu like this, it is evident that the screen is stunning. The retina-quality screen on such a small form factor is worth the laptop alone. The working resolution is 1920×1080, but of course the real resolution is 3200×1800. If you want, you could run things at the full resolution, at your own risk of blindness and eye strain, but it is possible.

The crispness of this screen is easily one of the best on the market today and rivals that of the retina screens on any of the 13” or 15” Apple machines. The other aspect of the screen, which isn’t super relevant when using Ubuntu, is that it’s touch-enabled. You can poke things and certain things will happen, albeit Ubuntu isn’t exactly configured for a touch display. In the end, the touch screen is basically irrelevant, except for the impressive idea that they got a touch screen of this depth onto such a small machine!
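For reference, how Ubuntu maps that 3200×1800 panel to a working resolution can be adjusted from the terminal. A minimal sketch, assuming the Unity/GNOME desktop Ubuntu ships by default (the exact schema keys can vary by release, so verify them on your install):

```shell
# Inspect the current UI scaling factor (0 means "auto").
gsettings get org.gnome.desktop.interface scaling-factor

# Double the size of UI elements so the HiDPI panel stays readable.
gsettings set org.gnome.desktop.interface scaling-factor 2
```

Running everything at native resolution just means leaving that factor at 1 and squinting.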

10

Here’s a little more of the glimmer, as I download the necessary things to do some F# builds.

Setting up F#

Performance and Boot Time

Boot time is decent. I’m not going to count the seconds, but it’s quick. Also, once you apply the sleep fix, resuming is really quick too. So no issue there at all.

On the performance front, as I mentioned in the negatives there are some issues with performance. However, for many – if not most – everyday developer tasks like building C#, F#, C++, C, Java, and a host of other languages the machine is actually fairly performant.

Other tasks around Ruby, PHP (yes, I wrote a little bit of PHP just to check it out, but I did it safely and deleted it afterwards), JavaScript, Node.js, and related web work were also very smooth, quick, and performant. I installed Atom, Sublime 3, WebStorm, and Visual Studio Code and tried these out for most of the above web development. Everything loads really fast on the machine, and after a few loads they get even more responsive, especially WebStorm since it seems to load Java plus the universe.

Overall, if you do web development or some pretty standard compilable code work then you’ll be all set with this machine. I’ve been very happy with its performance in these areas; just don’t expect to play any cool games on the machine.

Weight and Size

I’ll kick this positive feature off with some additional photos of the laptop compared to a MacBook Pro 15” Retina and a MacBook Air 13”.

First the 13” Air.

12

13

Now the MacBook Pro 15” Retina.

14

…and then on top of the MacBook Air 13”.

15

16

Of course there are smaller MacBook Pros and MacBook Air laptops, but these are the two I had on hand (and still use regularly) for a quick comparison. The 13” Dell is considerably smaller in overall footprint and is as light as or lighter than both of these laptops. The XPS makes for a great laptop to carry around all the time without really even noticing its presence.

Battery Life

The new XPS 13’s battery life, with Ubuntu, is a solid 6-12 hours depending on activity. I mention Ubuntu because, as anybody knows, the Linux options for conserving battery life are a bit awkward; namely, they don’t always do so well. But by managing the screen brightness, backlight, and resource-intensive applications, it would be possible to even exceed the 12-hour lifespan of the battery under Ubuntu. I expect that with Windows the lifespan is probably 10-15% better than under Ubuntu, at least without any tweaks or manual management of Ubuntu.
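For anyone wanting to do that manual management, a couple of stock tools are worth knowing about. A hedged sketch, assuming the usual Ubuntu package and device names (your battery may enumerate differently than `BAT0`):

```shell
# Report charge level, capacity, and estimated time remaining.
upower -i /org/freedesktop/UPower/devices/battery_BAT0

# powertop shows which processes and devices are draining the battery
# and can suggest (or auto-apply) power-saving tunables.
sudo apt-get install powertop
sudo powertop
```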

So if you’re looking for long battery life and Apple options aren’t on the table, this is definitely a great option for working long hours without needing to be plugged in.

Summary

beer

Overall, a spectacular laptop in MOST ways. That keyboard, however, is a serious problem for most people; I can imagine most will NOT want to deal with it. I’m ok with it, but I don’t mind typing with my hands up and off the resting points of the laptop. If Dell can fix this I’d give it a 100% buy suggestion, but with the keyboard as buggy and flaky as it is, I give the laptop a 60% buy suggestion. If you’re looking for a machine with Ubuntu out of the box, I’d probably aim for a Lenovo until Dell fixes the keyboard situation. Then I’d even suggest this machine over the Lenovo options.

…and among all things, I’d still suggest running Linux on an MBA or MBP over any of these – the machines are just more solid in manufacturing quality and durability, and the tech (i.e. battery, screen, etc.) is still tops in many ways. But if you don’t want to feed the Apple Nation’s piggy bank, dump them and go with this Dell or maybe a Lenovo option.

Happy hacking and cheers!

_____100 |> F#: Some Troubleshooting on Linux

In the last article I wrote, on writing a code kata with F# on OS-X or Windows, I had wanted to use Linux but things just weren’t cooperating with me. Since that article I have resolved some of the issues I ran into, and this is the log of those issues.

Issue 1: “How can I resolve the “Could not fix timestamps in …” “…Error: The requested feature is not implemented.””

The first issue I ran into, with running the ProjectScaffold build on Linux, I wrote up and posted to Stack Overflow, titled “How can I resolve the ‘Could not fix timestamps in …’ ‘…Error: The requested feature is not implemented.’”. You can read more about the errors I was receiving in that Stack Overflow post, but below is the immediate fix. This fix should probably be added to any F# installation instructions for Linux as part of the default.

First, ensure that you have the latest version of mono. If you use the instructions to do a make and make install off of the fsharp.org site, you may not actually end up with the latest version of mono. Instead, here’s a good way to get the latest version using apt-get. More information can be found on the mono page here.

sudo apt-get install mono-devel
sudo apt-get install mono-complete
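Once those are installed, it’s worth confirming which mono actually resolves on your PATH, since a previous make install from source can shadow the packaged version. A quick check:

```shell
# Show which mono binary the shell resolves, and its version.
which mono
mono --version
```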

Issue 2: “ProjectScaffold Error on Linux Generating Documentation”

The second issue I also posted to Stack Overflow, titled “ProjectScaffold Error on Linux Generating Documentation”. This one took a lot more effort, and it spilled over from Stack Overflow to become an actual Github issue (323) on the project. So check those out in case you run into the same problems.

In the next issue, to be published tomorrow, I’ll have some script tricks to use mono more efficiently to run *.exe commands and get things done with paket and fake in F# running on any operating system.

Starting an Ubuntu Dev Tools List

I’ve recently set up a completely clean virtual machine for doing web, system, and related development on Ubuntu. Here’s the shortlist of what I’ve installed after a default installation. I’m keeping the ongoing list of tools and related items for my Linux dev box here, as a living doc, so I’ll change it as I add new tools, apps, and related items. So lemme know what I ought to add and I’ll add it to my docs page. Here’s what I have so far…

Other To-dos

  • Always run sudo apt-get update once the system is installed. It never hurts to have the latest updates.
  • I always install Chrome as my first app. Sometimes the Ubuntu Software Center flakes out on this, but just try again and it’ll work. I use 64-bit Chrome, btw, as I’ve noticed that the 32-bit build often flakes out when attempting installation on my virtual machines. Your mileage may vary.
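When the Software Center does flake out, installing the 64-bit .deb directly also works. A sketch, using Google’s published package URL:

```shell
# Fetch Google's 64-bit Chrome package and install it directly.
wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
sudo dpkg -i google-chrome-stable_current_amd64.deb
# dpkg doesn't resolve dependencies; this installs anything missing.
sudo apt-get install -f
```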

What this enables…

At this point I can launch into about any language – Java, JavaScript, and a few others – with a minimal amount of headache. Since it’s a Linux instance, it gives me the full range of Linuxy things at my disposal.


Default Java Installation

  1. Run a ‘sudo apt-get update’.
  2. To install the default Java JRE and the JDK run the following commands.
    sudo apt-get install default-jre
    sudo apt-get install default-jdk
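After those finish, a quick sanity check confirms that both the runtime and the compiler landed:

```shell
# Print the installed JRE and JDK versions.
java -version
javac -version
```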


Oracle Java v8 Installation

  1. sudo add-apt-repository ppa:webupd8team/java
    sudo apt-get update
    sudo apt-get install oracle-java8-installer


WebStorm Installation

  1. I download the application zip from JetBrains and then run
    tar xfz WebStorm-*.tar.gz
  2. Next I always move the unzipped content to the directory in which I’d like to have the application stored. It’s good practice to not keep things in the download directory, just sayin’. Generally I put these in my usr/bin directory.
    mv /downloads/WebStorm-* your/desired/spot
  3. Now at your terminal, navigate to the path where the application is stored and run the webstorm.sh script.
    ./bin/webstorm.sh
  4. To add WebStorm to the Quicklaunch, just right click on the icon and select to Lock to Launcher.


IDEA IntelliJ Installation

  1. Follow all the steps listed under WebStorm, it’s the exact same process.


Sublime 3

  1. Go to download the latest v3.
  2. Run the package; it should launch the Ubuntu installer, set Sublime up for use from bash, and get it installed.

(NOTE UPDATED 1/18/2016 > The installer doesn’t seem to get it installed, so I went with this link http://olivierlacan.com/posts/launch-sublime-text-3-from-the-command-line/ which has a good solution.)

Learning About Docker

Over the next dozen or so days I’ll be ramping up on Docker – where my gaps are and where the project itself is going. I’ve been using it on and off and will have more technical content soon, but today I wanted to write a short piece about what Docker is and where, who, and how it came to be.

As an open source engine, Docker automates the deployment of lightweight, portable, resilient, and self-sufficient containers that run primarily on Linux. A Docker container wraps a payload, encapsulates it, and runs it consistently on a server.

This server can be virtual, on AWS or OpenStack, in clusters, public instances or private, bare-metal servers or wherever one can get an operating system to run. I’d bet it would show up on an Arduino cluster one of these days.  😉

Use cases for Docker include packaging and deploying applications as simple, automated container bundles. Another is building lightweight PaaS-style environments that scale up and down extremely fast. Then there’s automating testing, continuous integration, and deployment, because we all want that. Another big use case is simply building resilient, scalable applications that can then be deployed to Docker containers and scaled up and down rapidly.
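To make the container idea concrete, here’s a minimal sketch of that lifecycle using the stock docker CLI (assuming Docker is installed and the daemon is running):

```shell
# Pull a base image and run a throwaway container from it.
docker pull ubuntu
docker run --rm ubuntu echo "hello from inside a container"

# The same image runs identically on a laptop, a bare-metal server,
# or an AWS/OpenStack instance -- that's the portability payoff.
docker images
```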

A Little History

The creators of Docker formed a company called dotCloud that provided PaaS Services. On October 29th, 2013 however they changed the name from dotCloud to Docker Inc to emphasize the focus change from the dotCloud PaaS Technology to the core of dotCloud, Docker itself. As Docker became the core of a vibrant ecosystem the founders of dotCloud chose to focus on this exciting new technology to help guide and deliver on an ever more robust core.

Docker Ecosystem from the Docker Blog. Hope they don’t mind I linked it, it shows the solid lifecycle of the ecosystem. (Click to go view the blog entry that was posted with the image)

The Docker community has been super active, with a dramatic number of contributors, well over 220 now, most of whom don’t work for Docker, and they’ve made a significant percentage of the commits to the code base. As far as the repo goes, it has been downloaded over 100,000 times, yup, over a hundred. thousand. times!!! It’s container tech, I’m still impressed just by this fact! On GitHub the repo has thousands of stars and over 15,000 people are using Docker. One other interesting fact is the breakdown of languages, with Go being very prominent.

Docker Language Breakout on Github

Overall the Docker project has exploded in popularity in a way I haven’t seen since Node.js set the coder world on fire! It’s continuing to gain steam in how, and in which ways, people deploy and manage their applications – arguably more effectively in many ways.

Portland Docker Meetup. Click image for link to the meetup page.

The community is growing accordingly too, not just through a push by Docker/dotCloud itself, but actively through grassroots efforts. One has even sprung up in Portland: the Portland Docker Meetup.

So Docker, Getting Operational

The Loading Bay

One of the best ways to describe Docker and containers in general (which the Docker team often uses, hat tip for the analogy!) is a physical parallel, and one of the best examples is the shipping and freight industry.

Manually Guiding Freight, To Hand Unload Later.

Before containers, ships, trains, trucks, and buggies (ya know, the ones horses pulled) were all loaded by hand. There wasn’t any standardization around the movement of goods except for a few, often frustrating, tools: wooden barrels for liquids, bags for grains, and other assorted things. They didn’t mix well and were often stored in ways that caused regular damage to the goods. This era is a good parallel to hosting applications on full hypervisor virtual machines, or on physical machines with one operating system; the operating system is the holding bay or ship, with all the freight crammed inside haphazardly.

Shipping Yards, All of a Sudden Organized!

When containers were introduced, like the shiny blue one shown here, everything began a revolutionary change.

A Flawlessly Rendered Container

The manpower required dramatically dropped, injuries dropped, and shipping became more modular, since the containers fit together easily. To put it simply, shipping was revolutionized by this invention, and we’ve all benefitted from the change in some way. It parallels how container technology is shifting the way we deploy and host applications.

Next post, coming up in just a few hours “Docker, Containers Simplified!”

Linux Containers, LXC, FreeBSD Jails, VServer…

These days containerization of work, applications and storage on systems has become a hot topic. Not to say it wasn’t before, but it’s got a boost from the cloud computing segment of the industry. With that I felt the need to write up what I’ve discovered of the history in this industry so far. I’d love feedback and corrections if I’ve got anything out of order here or if – heaven forbid – I’ve got something wrong.

What are Containers?

Before I get into what a container is, it is best to define what operating system-level virtualization is. Sometimes this is referred to as jailed services or apps running in a jail.

This level of virtualization often provides functionality extremely similar to what a VMware, VirtualBox, or Hyper-V virtual server would provide. The difference, however, is primarily that operating system-level virtualization actually runs as a service, usually protected, that runs apps as if it were an operating system itself.

So what’s a container?

Linux Containers is a feature that allows Linux to run one or more isolated virtual systems, each with its own network interfaces, process namespaces, user namespaces, and state.

One of the common abbreviations for Linux Containers you’ll see is LXC. There are, however, many distinct operating system-level virtualization solutions.

  • OpenVZ – this technology uses a single patched Linux kernel, providing the ability to use the architecture and kernel version of the system executing the container.
  • Linux-VServer – this technology is a virtual private server implementation created by adding operating system-level virtualization to the Linux kernel. The project was started by Jacques Gélinas and is now maintained by Herbert Pötzl of Austria; it is not related to the Linux Virtual Server project. The server breaks things into partitions called security contexts, within which run the virtual private servers.
  • FreeBSD Jail – this container technology breaks apps and services into jails.
  • Workload Partitions – this technology was built for AIX, introduced in AIX 6.1. Workload Partitions breaks things into WPARs: software partitions created from the resources of a single AIX OS instance. WPARs can be created on any System p (the new old thing, formerly the RS/6000 tech) hardware that supports AIX 6.1 or higher. There are two kinds of WPARs, System WPARs and Application WPARs.
  • Solaris Containers – a container tech for x86 and SPARC systems, first released in February 2004 for Solaris 10. It is also available in OpenSolaris, SmartOS, and others, as well as Oracle Solaris 11. A Solaris container combines resource controls with separations referred to as zones, which act as completely isolated virtual servers within a single OS.
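
If you want to poke at LXC specifically, the classic userspace tools expose commands like lxc-ls. Here’s a read-only sketch that just checks whether those tools are present and, if so, lists any existing containers; nothing is created or modified.

```shell
#!/bin/sh
# Read-only sketch: check whether the classic LXC userspace tools are
# installed, and if so list any existing containers. Nothing is created.
if command -v lxc-ls >/dev/null 2>&1; then
  lxc-ls 2>/dev/null || true   # list existing containers, if any
  note="lxc tools present"
else
  note="lxc tools not installed; nothing to list"
fi
echo "$note"
```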

What is so great about a container?

Ok, so I’ve covered what a container is. You’re probably asking, “so what do I do with these containers?” There are a number of things; for starters, speed is a huge advantage of containers. You can spool up entire functional application or service systems, like an API facade, in seconds. Oftentimes a container will spool up and be ready in less than a second. This provides a huge amount of power for building flexible, resilient, self-healing distributed systems that are otherwise just impossible to build with slow-loading traditional virtual machine technology.

Soft memory is another capability most containers have. This is the capability of being allocated, or allocated and run, entirely in memory. As one may already know, running something purely out of memory is extremely fast, often 2-10x faster than running something that has to swap to a physical drive.
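
For a rough, runnable illustration of the memory-backed idea: on most Linux systems /dev/shm is a tmpfs (RAM-backed) mount, so reads and writes there never touch a physical disk. This is only a sketch of the concept, not how containers actually implement soft memory.

```shell
#!/bin/sh
# Illustration only: /dev/shm is a RAM-backed tmpfs mount on most Linux
# systems, so this write/read round trip never touches a physical drive.
if [ -d /dev/shm ] && [ -w /dev/shm ]; then
  echo "scratch data held in memory" > /dev/shm/softmem-demo.txt
  result="$(cat /dev/shm/softmem-demo.txt)"
  rm -f /dev/shm/softmem-demo.txt
else
  result="no writable tmpfs at /dev/shm on this system"
fi
echo "$result"
```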

Managing crashing services or damaged ecosystem elements is another. If a running container gets hit with an overloaded compute task, the software on it crashes, or it ends up in some type of blocked state, like being on the wrong end of a DDoS, just reboot it. Another option is to kill it and spool up an entirely new instance of the app or service in a fresh container. This ability is really amplified in a cloud environment like AWS, where a server instance may crash with some containers on it, but keeping another instance running with multiple containers is easy, and restarting those containers on running instances is easy and extremely fast.
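
That kill-and-respawn pattern can be sketched in a couple of commands. Here "myapp" and "myimage" are hypothetical container and image names, and every docker call is guarded and error-tolerant so the snippet is safe to run even on a machine without Docker.

```shell
#!/bin/sh
# Hedged sketch of the "kill it and spool up a fresh one" pattern.
# "myapp" and "myimage" are made-up names for illustration only.
if command -v docker >/dev/null 2>&1; then
  docker rm -f myapp >/dev/null 2>&1 || true          # kill the crashed/compromised container
  docker run -d --name myapp myimage >/dev/null 2>&1 || true  # spool up a clean replacement
  status="attempted container replacement"
else
  status="docker not available; pattern shown for illustration"
fi
echo "$status"
```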

Security is another element that container technology can assist with. As I alluded to in the previous point, if a container gets taken over or otherwise compromised, it’s very easy to just kill it and resume one that isn’t compromised, often buying more time to resolve the security concern. Also, because each container is secured against the other containers, controlling one container does not mean compromising the physical machine and operating system. This is a huge saving grace when security is breached.

Container Summary

Containers are a hot topic, for good reason. They provide increased manageability of apps and services, can utilize soft memory, increase security, and they’re blazing fast. The technology, albeit having been around for a good decade, is starting to grow in new ways. Containers are also starting to become a mainstay of cloud technology, almost a requirement for effective management of distributed environments.

Next up, I’ll hit on Docker tech from dotCloud and Solomon Hykes @solomonstre.

For now, anybody got some additions or corrections for this short history and definitions of containers?  🙂

Getting Docker Installed on Ubuntu 12.04 LTS

A few days ago I posted the blog entry “Using SSH Locally to Work With Ubuntu VM + VMware Tools Installation via Shell“, which covered getting a clean Ubuntu Server install running with VMware Tools so that I could use it as a hosted instance. Simply put, being able to SSH into it just as I would a hosted AWS or Windows Azure Ubuntu Server image. Once I had the default virtual machine running 12.04 LTS, I went about another installation needed to run Docker. Docker has issues with any kernel before 3.8, and running the command below will show that 3.5 is the current kernel in 12.04 LTS.

apt-cache search linux-headers-$(uname -r)

To update to the 3.8 kernel I ran the following command and then rebooted.

sudo apt-get install linux-image-generic-lts-raring linux-headers-generic-lts-raring
sudo shutdown -r now

With the reboot complete, I checked the kernel version again and 3.8 was installed successfully.

@ubuntu:~$ apt-cache search linux-headers-$(uname -r)
linux-headers-3.8.0-33-generic - Linux kernel headers for version 3.8.0 on 64 bit x86 SMP
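
That manual eyeballing of the version can also be scripted. This is a small sketch that parses `uname -r` and compares it against the 3.8 minimum discussed above:

```shell
#!/bin/sh
# Scripted version of the manual check above: is the running kernel
# at least 3.8, the minimum Docker needs here?
kernel="$(uname -r)"        # e.g. "3.8.0-33-generic"
major="${kernel%%.*}"       # "3"
rest="${kernel#*.}"         # "8.0-33-generic"
minor="${rest%%.*}"         # "8"
if [ "$major" -gt 3 ] || { [ "$major" -eq 3 ] && [ "$minor" -ge 8 ]; }; then
  verdict="kernel $kernel is new enough for Docker"
else
  verdict="kernel $kernel is too old; upgrade before installing Docker"
fi
echo "$verdict"
```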

To get Docker installed (as of 0.6) run the following commands.

sudo sh -c "wget -qO- https://get.docker.io/gpg | apt-key add -"
sudo sh -c "echo deb http://get.docker.io/ubuntu docker main > /etc/apt/sources.list.d/docker.list"
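
The second of those commands just drops a one-line apt source entry into /etc/apt/sources.list.d/docker.list. A harmless way to see exactly what it writes, without touching /etc/apt on your machine, is to produce the same line into a temp file:

```shell
#!/bin/sh
# Write the same apt source line to a temp file so it can be inspected
# safely, without modifying /etc/apt on this machine.
tmpfile="$(mktemp)"
echo "deb http://get.docker.io/ubuntu docker main" > "$tmpfile"
line="$(cat "$tmpfile")"
echo "$line"
rm -f "$tmpfile"
```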

Next update the sources, then install lxc-docker.

sudo apt-get update
sudo apt-get install lxc-docker

To verify that docker is installed I executed the following command and…

sudo docker run -i -t ubuntu /bin/bash

…saw results similar to the following just after issuing the command.

Unable to find image 'ubuntu' (tag: latest) locally
Pulling repository ubuntu
8dbd9e392a96: Download complete
b750fe79269d: Download complete
27cf78414709: Download complete

After that displayed, I typed exit to leave the container. I now have a running version of Docker on the Ubuntu 12.04 LTS instance, ready for testing and hacking with Docker.
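
With Docker verified, a couple of follow-up commands are worth knowing right away. This snippet is guarded so it still runs cleanly on machines where Docker isn’t installed; the images and containers listed will of course differ per machine.

```shell
#!/bin/sh
# Common first commands after a fresh Docker install. Guarded so the
# script still runs cleanly on machines without Docker.
if command -v docker >/dev/null 2>&1; then
  docker images 2>/dev/null || true   # list locally downloaded images
  docker ps -a  2>/dev/null || true   # list all containers, running or exited
  state="listed images and containers"
else
  state="docker not installed; commands shown for reference"
fi
echo "$state"
```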