Category Archives: How-to

How-to entries are descriptions of how to complete a particular action, code snippet, code pattern, architecture, or the like.

Coding, WTF Twitter, Twitch FTW, Getting Shit Done, Twitch Hacks, Tips, Tricks, and One Excellent Jazz Influenced Tune

WTF Twitter

I’ve been doing a lot more coding, thanks largely to the discipline that Twitch has brought to my day. That’s almost surprising to me at this point, because Twitch started for me similarly to the way Twitter did. You see, at first I thought Twitter was the dumbest thing that had happened in ages. Arguably it’s come full circle and I kind of feel the same way about Twitter now, but in the decade in between (yes, Twitter is over 10 years old!) Twitter brought me connection, opportunities, and so much more. I couldn’t have imagined a lot of what I’ve been able to pull together because of Twitter. It’s still useful in many ways for this, albeit I, like all of us, am at risk of suffering the idiocy of today’s politics and political cronies, and the dog-piling trash pile that follows them onto Twitter.

I’m not leaving Twitter any time soon, but I’ve definitely put it on a very short leash and limited what impact it does or doesn’t have in my day-to-day flow.

Twitch FTW

Amazingly, however, a new social and productive tool (not that it intended to be both) has come into being: coding on Twitch. Don’t get me wrong, I game; I just don’t game socially or on Twitch. What I do is code on Twitch, with a fair dose of hacking, breaking things, and then figuring out how to make them work. All the while, I, along with others, have created a pretty excellent developer community there on Twitch, and it seems to be growing all the time. Twitch at this point has become a focal point that has the benefits of Twitter without all the annoying garbage, while adding the vast and hugely important fact that I can do things, be productive, chit chat, and generally get shit done, all while I’m Twitch streaming.

VidStreamHacking

@ https://github.com/Adron/VidStreamHacking

With that, let’s talk about some of the recent notes and information I’ve been working on putting together to make Twitch even more useful. My first motive with this was to keep track of all the things I was doing, the hardware I was putting together, and related things, but then another purpose grew out of all this note taking: it became obvious that this repository of information could be useful for other people. Here’s a survey of the things I’ve added so far; hope they’re helpful to those of you digging into streaming out there!

I added some badges to identify various elements of information about the repo in the README.md.


Is it maintained? Yup. Contributors? So far just me, and zero issues filed, but please feel free to add an issue or two. Markdown? Yup. And there is indeed a Trello board! The Trello board is a key to insight, inspection, and what I’ve got going on in a number of my repositories. It’s where I’m keeping track of all the projects, what’s next, and what’s up in the queue for the blog (this one right here), at least in the context of the big code-heavy posts or video reviews of sessions with code, extra commentary, and related content. If you want to get involved in any of the repos, just let me know and I’m happy to walk through whatever, and even get you added to the Trello board so we can work together on code.

Streaming Gear

https://github.com/Adron/VidStreamHacking/blob/master/hardware.md

My main machine is now a Dell XPS 15, which I fought through getting Linux running on, and now that I have, it’s been an absolutely stellar machine. I’ve also added an additional monitor and port replicator/docking station gear to make it even more usable. The details are listed in the repo under the Dell XPS 15 item on the hardware page.

Along with the XPS 15, I wrote up coverage of the unboxing via video and a blog entry. After a few weeks I also wrote up the conflict I had getting Linux running and removing Windows 10. In addition to the XPS 15, I use a MacBook from 2015 as my primary Mac machine, with an iMac from 2013 available as backup. Both machines are still resoundingly solid and performant enough to get the job done. Rounding out my fleet of machines is a Dell XPS 13 (covered here and here with the re-review).

For screens I have one at my office and one at home. They’re almost the same thing: ultra-widescreen, curved displays from LG running at 3880x1440 resolution. These make keeping an eye on chat, OBS, and all sorts of other monitoring while coding, gaming, or whatever a breeze!


Ex 1: Just viewing a giant OBS view to get everything sorted out before starting a stream.


Ex 2: OBS with a VM running, and Twitch chat, dashboard, etc. to the right. This way I can work, see the stream, and see chat and such all at the same time.

The docking stations and/or port replicators, or whatever one calls these things these days, also bring all of this tech together for me. There are a couple I’ve tried and retired already (unfortunately, cuz dammit, that cost some money!), others I use in some scenarios, and others I use in others.

My main docking station contraption: shout out to James and others for suggesting the CalDigit TS3. I got to this docking station through the Dell TB16, which for Linux, and kind of for Windows, is an unstable mess. Awesome potential if it worked, but it doesn’t, so I tried out the USB-C pluggable option (in the tweet), which had HDMI that was unfortunately limited in resolution. Having a wide screen made it unusable too, albeit super compatible with Linux. So I finally upgraded to the CalDigit TS3 and WOW, the CalDigit is super seriously wickedly bad ass. Extra USB-C ports, USB 2/3 ports, power, and more all rolled into one. It even supplies some power to the laptop; however, I keep it plugged in since the laptop is kind of a power hog when the processor starts chomping!

After trying out that USB-C pluggable (the tweet) I got the CalDigit into play. It’s really, really good. Here’s a shot of it from various angles with the extensive cables I no longer have to plug into my laptop. Out of this also runs a 28-port powered USB hub; no picture, but just know I’ve got a crazy number of devices I routinely like to use!

That’s my main configuration when using the ultra-widescreens and all. It’s a good setup, very usable, and the 32 GB of memory in the laptop really gets put to use in this regard. As for storage, that’s another thing. I’ve got 1 TB in my laptop and another 1 TB in a USB-C Thunderbolt Samsung drive, which is practically as fast for most things. So much so that I attach it via the TS3 over USB-C and it’s screaming fast while adding that extra storage. So far I’ve primarily been using it to store all of my virtual machines or as video storage while I do edits.

There’s other gear too, like the Rode Podcaster and other things; check out the list. But that gear I’ll elaborate on some other time.

Meetup Streaming Gear

https://github.com/Adron/VidStreamHacking/blob/master/meetup-streaming-kit-gear.md

Another effort I’ve undertaken is recording meetups. To do this one needs to be able to stream with several screens combined, i.e. picture-in-picture and all. One needs a camera that can focus on the speaker, ideally at least 1080p, with at least some ability to work in less-than-ideal light. Then, next to that, a splitter and capture card to get the slides! Once all those pieces come together, with a little OBS finesse one can get a pretty solid single-pass recording of a meetup. An example of one of my better attempts was the last meetup, “Does the Cloud Kill Open Source” with Richard Seroter. If you take a look at past talks in the Meetups Playlist you can see my iterative progress from one meetup to another!

Here’s the specific gear I’m using to get this done. At least, so far, and if and when it becomes financially reasonable I might upgrade some of the gear. It largely depends on what I can get more use out of beyond just streaming meetups.

Cords and Splitter – I picked up a selection of lengths and types so that I’d have wiring options for the particular environments the meetups would be located in. Generally speaking, 25 ft seems to be a safe maximum for HDMI. I’ve been meaning to check the actual specification on that, but for now it’s more than enough regardless.

The splitter wasn’t expensive at all ($16.99), which kind of surprised me considering the cost of the cables. Picture to the right, or above, or somewhere, depending on mobile layout.

I needed capture cards for this, including one for the line out of the splitter to capture the slides. The first I picked up based on suggestions focused on quality: the AVerMedia ExtremeCap HDMI to USB 3.0 capture card. It’s really solid for higher resolutions and related capabilities. The second, a USB 3.0 HDMI HD game video capture card, I picked up based on price (it’s almost a third of the price) rather than any particular focus on quality. However, now that I’ve used both, they’re both capable and seem fine, so I might have been able to just buy two of the cheaper option.

As for the camera, ideally I’d have a much higher quality one, but the Canon VIXIA HF R800 camcorder has actually worked excellently. It’s a little less feature-rich for audio out and related things, but it zooms in well and can record at the same time I’m getting the cam feed into the stream, so it’s always a nice way to have a backup of the talk.

The last, and one of the most important, aspects is getting good audio.

Streaming Meetups

https://github.com/Adron/VidStreamHacking/blob/master/meetup.md

At first I made the mistake of thinking that just the gear would be enough, but holy smokes, there were about a million other things I needed to write down. I created meetup.md to get the list going.

Jazz Influence Amidst the Heaviness!

As promised, some music. Not actually jazz, but heavily influenced by some jazz, with progressive instrumentation and esoteric, expansive, exquisite playing skills from the band. As always, be prepared; my music referrals aren’t always gentle! Happy code streaming!

Bunches of Databases in Bunches of Weeks – PostgreSQL Day 1

May the database deluge begin; it’s time for “Bunches of Databases in Bunches of Weeks”. We’ll get into looking at databases similarly to how they’re approached in “7 Databases in 7 Weeks”. In this session I take a hard look at PostgreSQL, or as some refer to it, just Postgres. This is the first of a few sessions on PostgreSQL, in which I get the database installed locally on Ubuntu (a process transferable to most other operating systems; PostgreSQL is awesome like that). After installing it and getting pgAdmin 4, the user interface for PostgreSQL, working against that local install, I go the Docker route, again pointing pgAdmin 4 at the container and creating a database and an initial table.

Below the video here I’ve added the timeline and other details, links, and other pertinent information about this series.

0:00 – The intro image splice and metal intro with tunes.
3:34 – Start of the video database content.
4:34 – Beginning the local installation of Postgres/PostgreSQL on the local machine.
20:30 – Getting pgAdmin 4 installed on the local machine.
24:20 – Taking a look at pgAdmin 4: a stroll through setting up a table, then writing and executing some basic SQL with pgAdmin 4.
1:00:05 – Installing Docker and getting PostgreSQL set up as a container!
1:00:36 – Adding the link to the stellar post on Digital Ocean’s blog.
1:00:55 – My declaration that if Digital Ocean just provided documentation I’d happily pay for it; their blog entries, tutorials, and docs are hands down some of the best on the web!
1:01:10 – Installing PostgreSQL on Ubuntu 18.04.
1:06:44 – Signing in to Docker Hub and finding the official PostgreSQL Docker image.
1:09:28 – Starting the container with Docker.
1:10:24 – Connecting to the Dockerized PostgreSQL container with pgAdmin 4.
1:13:00 – Creating a database and working with SQL, tables, and other resources with pgAdmin 4 against the Docker container.
1:16:03 – The hacker escape outro. Happy thrashing code!

For each session in the “Bunches of Databases in Bunches of Weeks” series I’ll follow the sequence below. I’ll go through each database in my list of top 7 databases for day 1, then go back through each database for day 2, and so on, accumulating additional days similarly to “7 Databases in 7 Weeks”.

“Day 1” of the respective database: I’ll work toward building a development installation of that particular database. For example, in this session I set up PostgreSQL by installing it to the local machine and also pulled a Docker image to run PostgreSQL.

“Day 2” of the respective database: I’ll get into working against the database with CQL, SQL, or whatever one would use to work directly with that specific database. At this point I’ll also get more deeply into the types, and into inserting and storing data in the respective database.

“Day 3” of the respective database: I’ll get into connecting an application with C#, Node.js, and Go, implementing a simple connection, prospectively a test of that connection, and a simple insert, update, and delete of some sort against the respective database built on that database’s day 2.
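
For a rough taste of what that day 3 connection test might look like in Go, here’s a minimal sketch using the lib/pq driver. The connection string values are placeholders of my own, not the series’ actual configuration.

package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/lib/pq" // PostgreSQL driver; registers itself with database/sql
)

func main() {
	// Placeholder credentials; swap in your own host, user, password, and database.
	db, err := sql.Open("postgres", "host=localhost user=postgres password=secret dbname=testdb sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// sql.Open is lazy; Ping is what actually confirms the connection works.
	if err := db.Ping(); err != nil {
		log.Fatal(err)
	}
	fmt.Println("connected")
}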

“Day 4” and onward: I’ll determine the path and layout of the topics later, so subscribe on YouTube and Twitch and tune in. The events are scheduled, with the option to be notified on Twitch when a particular episode you’d like to watch is coming up.

Next Events for “Bunches of Databases in Bunches of Weeks”

Twitz Coding Session in Go – Cobra + Viper CLI with Initial Twitter + Cassandra Installation

Part 2 of 3 – Coding Session in Go – Cobra + Viper CLI for Parsing Text Files, Retrieval of Twitter Data, Exports to various file formats, and export to Apache Cassandra.

UPDATED PARTS:

  1. Twitz Coding Session in Go – Cobra + Viper CLI for Parsing Text Files
  2. Twitz Coding Session in Go – Cobra + Viper CLI with Initial Twitter + Cassandra Installation (this post)
  3. Twitz Coding Session in Go – Cobra + Viper CLI Wrap Up + Twitter Data Retrieval

Updated links to each part will be posted at the bottom of this post when I publish them. For code, the written walk-through, and the like, scroll down below the video and timestamps.

Hacking Together a CLI: Installing Cassandra, Setting Up the Twitter API, ENV Vars, etc.

0:04 Kick ass intro. Just the standard rocking tune.

3:40 A quick recap. Check out the previous write-up in this series, “Twitz Coding Session in Go – Cobra + Viper CLI for Parsing Text Files”.

4:30 Beginning the completion of the twitz parse command for exporting out to XML, JSON, and CSV (the text export was done in the previous session). This segment also includes a number of refactorings to clean up the functions, break out the control structures, and make the code more readable.

In the end of the refactoring, twitz parse came out like this: the completed list is put together by calling the buildTwitterList() function, which actually lives in the helpers.go file. The command then prints that list out as is and checks whether a file export should be done. If a configuration setting is set for file export, that process starts with a call to exportParsedTwitterList(exportFilename string, exportFormat string, ... etc ... ). Then a simple single-level if-else control structure determines which format to export the data to and calls the respective export function to do the actual export and write the file to the underlying system. There’s more refactoring that could be done, but for now this is cleaned up pretty nicely considering the splattering of code I started with at first.
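
To make that flow concrete, here’s a rough, self-contained Go sketch of the shape described above. The names buildTwitterList and exportParsedTwitterList come straight from the session notes; the bodies, signatures, and hard-coded values are my own illustrative guesses, not the actual twitz code.

package main

import (
	"encoding/csv"
	"encoding/json"
	"fmt"
	"os"
	"strings"
)

// buildTwitterList stands in for the helpers.go function from the session;
// here it just returns a fixed list of account names.
func buildTwitterList() []string {
	return []string{"adron", "thrashingcode"}
}

// exportParsedTwitterList dispatches on the export format, mirroring the
// single-level if-else structure described above.
func exportParsedTwitterList(exportFilename string, exportFormat string, list []string) error {
	f, err := os.Create(exportFilename)
	if err != nil {
		return err
	}
	defer f.Close()

	if exportFormat == "txt" {
		_, err = f.WriteString(strings.Join(list, "\n"))
	} else if exportFormat == "json" {
		err = json.NewEncoder(f).Encode(list)
	} else if exportFormat == "csv" {
		w := csv.NewWriter(f)
		err = w.Write(list)
		w.Flush()
	}
	// XML branch elided to keep the sketch short.
	return err
}

func main() {
	twitterList := buildTwitterList()
	fmt.Println(twitterList)

	exportRequested := true // stand-in for the configuration check
	if exportRequested {
		if err := exportParsedTwitterList("accounts.json", "json", twitterList); err != nil {
			fmt.Println("export failed:", err)
		}
	}
}

In the real command the filename, format, and export flag would presumably come in through the Cobra + Viper configuration rather than being hard-coded like this.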

50:00 I walk through a quick install of a single-node Apache Cassandra setup that I’ll use for development later. I also quickly show how to start and stop it post-installation.

Reference: Apache Cassandra, Download Page, and Installation Instructions.

53:50 Choosing the go-twitter API library for Go. I look at a few real quick just to ensure that’s the library I want to use.

Reference: go-twitter library
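
For a sense of the library’s shape, here’s a minimal sketch of authenticating and looking up a single user with the dghubble go-twitter and oauth1 packages, assuming that’s the go-twitter library in play; the environment variable names and the screen name are my own placeholders.

package main

import (
	"fmt"
	"log"
	"os"

	"github.com/dghubble/go-twitter/twitter"
	"github.com/dghubble/oauth1"
)

func main() {
	// Credentials pulled from environment variables; these names are my own
	// convention, not necessarily what the twitz CLI uses.
	config := oauth1.NewConfig(os.Getenv("TWITTER_CONSUMER_KEY"), os.Getenv("TWITTER_CONSUMER_SECRET"))
	token := oauth1.NewToken(os.Getenv("TWITTER_ACCESS_TOKEN"), os.Getenv("TWITTER_ACCESS_SECRET"))

	// An OAuth1 http.Client that signs requests, handed to the twitter client.
	httpClient := config.Client(oauth1.NoContext, token)
	client := twitter.NewClient(httpClient)

	// Simple smoke test: look up a single user.
	user, _, err := client.Users.Show(&twitter.UserShowParams{ScreenName: "Adron"})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(user.Name, "-", user.Description)
}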

56:35 At this point I go through how I set up a Twitter app within the developer interface. This is a key part of the series where I take a look at the consumer keys, access token, and access token secret, where they live in the Twitter interface, and how one needs to reset them if one just showed the keys on a stream (like I just did, shockers!).

57:55 Here I discuss and show where to set up the environment variables inside the GoLand IDE for building and executing the CLI. Once these are set up they’ll be the main mechanism I use in the IDE to test the CLI as I build out further features.

1:00:18 Updating the twitz config command to show the keys that we just added as environment variables. I set these up with some string parsing that cuts off the end of each secret, so the whole variable value isn’t shown, just enough to confirm that it is indeed a set configuration or environment variable.
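
That masking idea boils down to something like the toy helper below. This is my own minimal version, not the actual twitz implementation.

package main

import (
	"fmt"
	"os"
)

// maskSecret returns the first n characters of a secret followed by an
// ellipsis, so a config listing can confirm a value is set without leaking it.
func maskSecret(value string, n int) string {
	if value == "" {
		return "(not set)"
	}
	if len(value) <= n {
		return value[:1] + "..."
	}
	return value[:n] + "..."
}

func main() {
	fmt.Println("consumer key:", maskSecret(os.Getenv("TWITTER_CONSUMER_KEY"), 4))
	fmt.Println("consumer secret:", maskSecret(os.Getenv("TWITTER_CONSUMER_SECRET"), 4))
}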

1:16:53 At this point I work through some additional refactoring of functions to clean up some of the mess in the code. Using GoLand’s extract-method feature and other tooling, I work through several refactoring efforts that tidy up the code.

1:23:17 Copying a build configuration in GoLand. A handy little thing to know you can do when you have a bunch of build configuration options.

1:37:32 At this part of the video I look at the app-auth example in the library’s code, but I’ve got to add the caveat that I ran into problems using the exact example. I work through it and get to the first error messages that anybody using the same example would hit. I get them fixed in the next session; this segment of the video, however, provides a basis for my pending PRs and related work I’ll submit to the repo.

The remainder of the video is spent trying to figure out what is or isn’t exactly happening with the error.

I’ll include the working findem code in the next post in this series. Until then, watch the wrap-up and enjoy!

1:59:20 Wrap up of video and upcoming stream schedule on Twitch.

UPDATED SERIES PARTS

    1. Twitz Coding Session in Go – Cobra + Viper CLI for Parsing Text Files
    2. Twitz Coding Session in Go – Cobra + Viper CLI with Initial Twitter + Cassandra Installation (this post)
    3. Twitz Coding Session in Go – Cobra + Viper CLI Wrap Up + Twitter Data Retrieval


DSE6 + .NET v?

Project Repo: Interoperability Black Box

First steps: let’s get .NET installed and set up. I’m running Ubuntu 18.04 for this setup and the start of the project. To install .NET on Ubuntu one needs to go through a multi-command process of adding keys and package sources; fortunately Microsoft’s teams have made this almost easy by providing the commands for the various Linux distributions. The commands I ran to get all this initial setup done are as follows.

wget -qO- https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.asc.gpg
sudo mv microsoft.asc.gpg /etc/apt/trusted.gpg.d/
wget -q https://packages.microsoft.com/config/ubuntu/18.04/prod.list
sudo mv prod.list /etc/apt/sources.list.d/microsoft-prod.list
sudo chown root:root /etc/apt/trusted.gpg.d/microsoft.asc.gpg
sudo chown root:root /etc/apt/sources.list.d/microsoft-prod.list

After all this I could then install the .NET SDK. It’s been so long since I actually installed .NET on anything that I wasn’t sure if I needed just the runtime, the SDK, or what. I assumed it would be safe to install the SDK and then install the runtime too.

sudo apt-get install apt-transport-https
sudo apt-get update
sudo apt-get install dotnet-sdk-2.1

Then the runtime.

sudo apt-get install aspnetcore-runtime-2.1

Alright. Now with this installed, I wanted to see whether JetBrains Rider would detect that .NET is now installed, or at least what I’d have to do to get the IDE to detect it. So I opened up the IDE to see what the results would be. On the left-hand side of the new solution dialog, Rider will usually display a message if anything needs to be installed. But it looked like everything was showing up as installed. Yay for things working (at this point)!


Next up is to get a solution started with the pertinent projects for what I want to build.


For the next stage I created three projects.

  1. InteroperabilityBlackBox – A basic class library that will be used by a console application or whatever other application or service that may need access to the specific business logic or whatnot.
  2. InteroperabilityBlackBox.Tests – An xUnit testing project for testing anything that might need some good ole’ testing.
  3. InteroperabilityBlackBox.Cli – A console application (CLI) that I’ll use to interact with the class library and add capabilities going forward.

Alright, now that all the basic projects are set up in the solution, I’ll go find the .NET DataStax Enterprise driver. Inside JetBrains Rider I can right-click on the particular project I want to add or manage dependencies for. I did that, then put “dse” in the search box. The dialog pops up from the bottom of the IDE, and you can add a package by clicking the plus sign in the description box to the right. Once the package is installed, the plus sign becomes a little red x.


Alright. Now it’s almost time to get some code working. We need ourselves a database first, however. I’m going to set up a cluster in Google Cloud Platform (GCP), but feel free to use whatever cluster you’ve got. These instructions will basically be reusable wherever you’ve got your cluster set up. I wrote up a walk-through with instructions for the GCP Marketplace a few weeks ago, and I used that same offering to get this example cluster up and running. So, now back to getting the first snippets of code working.

Let’s write a test first.

[Fact]
public void ConfirmDatabase_Connects_False()
{
    var box = new BlackBox();
    Assert.Equal(false, box.ConfirmConnection());
}

In this test, I named the class BlackBox and am planning on a parameterless constructor. But as things go, tests are very fluid, or ought to be, and I may change it in the next iteration. I’m thinking, at least to get started, that I’ll have a method to test and confirm a connection for the CLI; I’ve named it ConfirmConnection for that purpose. Initially I’m going to test for false, but that’s primarily just to get started. Now, time to implement.

using System;
using Dse;
using Dse.Auth;

namespace InteroperabilityBlackBox
{
    public class BlackBox
    {
        public BlackBox()
        {}

        public bool ConfirmConnection()
        {
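            // Stub implementation: always report no connection until the real check is wired in.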
            return false;
        }
    }
}

That gives a passing test and I move forward. For more of the run-through of moving from this first step to the finished code, check out the video of the session.

By the end of the coding session I had a few tests.

using Xunit;

namespace InteroperabilityBlackBox.Tests
{
    public class MakingSureItWorksIntegrationTests
    {
        [Fact]
        public void ConfirmDatabase_Connects_False()
        {
            var box = new BlackBox();
            Assert.Equal(false, box.ConfirmConnection());
        }

        [Fact]
        public void ConfirmDatabase_PassedValuesConnects_True()
        {
            var box = new BlackBox("cassandra", "", "");
            Assert.Equal(false, box.ConfirmConnection());
        }

        [Fact]
        public void ConfirmDatabase_PassedValuesConnects_False()
        {
            var box = new BlackBox("cassandra", "notThePassword", "");
            Assert.Equal(false, box.ConfirmConnection());
        }
    }
}

The respective code for connecting to the database cluster, per the walk-through I wrote about here, looked like this at session end.

using System;
using Dse;
using Dse.Auth;

namespace InteroperabilityBlackBox
{
    public class BlackBox : IBoxConnection
    {
        public BlackBox(string username, string password, string contactPoint)
        {
            UserName = username;
            Password = password;
            ContactPoint = contactPoint;
        }

        public BlackBox()
        {
            UserName = "ConfigValueFromSecretsVault";
            Password = "ConfigValueFromSecretsVault";
            ContactPoint = "ConfigValue";
        }

        public string ContactPoint { get; set; }
        public string UserName { get; set; }
        public string Password { get; set; }

        public bool ConfirmConnection()
        {
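            // Build a cluster definition for the contact point using plain-text auth;
            // the Connect() call below is what actually reaches out to the cluster.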
            IDseCluster cluster = DseCluster.Builder()
                .AddContactPoint(ContactPoint)
                .WithAuthProvider(new DsePlainTextAuthProvider(UserName, Password))
                .Build();

            try
            {
                cluster.Connect();
                return true;
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
                return false;
            }

        }
    }
}

With my interface providing the contract to meet:

namespace InteroperabilityBlackBox
{
    public interface IBoxConnection
    {
        string ContactPoint { get; set; }
        string UserName { get; set; }
        string Password { get; set; }
        bool ConfirmConnection();
    }
}

Conclusions & Next Steps

After I wrapped up the session, two things stood out that needed to be fixed for the next session. I’ll be sure to add these as objectives for the next coding session at 3pm PST on Thursday.

  1. The tests really needed to more resiliently confirm the integrations that I was working to prove out. My plan at this point is to add some Docker images that would give the development integration tests a point to work against, which would alleviate the need for anything outside of the actual project in the repository to exist, removing that fragility.
  2. The application, in its “Black Box”, should do something. For the next session we’ll write up some feature requests we’d want. Or maybe someone has suggestions of functionality they’d like to see implemented in a CLI using .NET Core working against a DataStax Enterprise Cassandra database cluster? Feel free to leave a comment or three about a feature, and I’ll work on adding it during the next session.

Oh, exFAT Doesn’t Work on Linux

But to the rescue comes the search engine. I found some material on the matter and, as I’ve learned frequently, you shouldn’t count out Linux when it comes to support for nearly everything on Earth. Sure enough, there’s support for exFAT (really, why wouldn’t there be?).

Check out this repo: https://github.com/relan/exfat

There’s of course the git clone, make, and make install path, or there’s also the apt install path.

git clone https://github.com/relan/exfat.git
cd exfat
autoreconf --install
./configure
make

Then make install.

make install

Of course, as with most things on Linux, no reboot is needed; just use it now to mount a drive.

mount.exfat-fuse /dev/spec /mnt/exfat

To note, if you’re using Ubuntu 18.04 the support will just be available now, so re-click on the drive or memory device you’ve just attached and it will appear. Pretty sweet. If you’d rather use apt, just run this command.

apt install exfat-fuse

That’s it. Now you’ve got exFAT support working on Linux.

Getting Started with Twitch | Twitch Thrashing Code Stream

I’d been meaning to get started for some time. I’d even tweeted, just trying to get further insight into what and why people watch Twitch streams, and of course why and who produces their own Twitch streams.

With that I set out to figure out how to get the right tooling set up for Twitch streaming on Linux and MacOS. Here’s what I managed to dig up.

First things first, and somewhat obviously, go create a Twitch account. The sign up link is up in the top right corner.

https://www.twitch.tv/

Cassandra / DataStax Enterprise 6 Clusters: Marketplace Options

As I stepped full speed into work and research at DataStax, there were a few things I needed put together as soon as I could possibly get them. Before even diving into development, use-case examples, or reference application development, I needed to have some clusters built up. The Docker image is great for simple local development, but beyond that I wanted some live 3+ node clusters to work with. The specific deployed and configured use cases I had included:

  1. I wanted to have a DataStax Enterprise 6 Cassandra cluster up and running ASAP; a long-lived cluster that I could develop sample applications against, use for testing purposes, and generally develop against for Cassandra and DSE purposes.
  2. I wanted to have an easy to use cluster setup for Cassandra – just the OSS deployment – possibly coded and configured for deployment with Terraform and related scripts necessary to get a 3 node cluster up and running in Google Cloud Platform, Azure, or AWS.
  3. I wanted a DataStax Enterprise 6 enabled deployment that would showcase some of the excellent tooling DataStax has built around the database itself.

I immediately set out to build solutions for these three requirements.

The first cluster system I decided to aim for was a way to get some reasonably priced hardware to build a physical cluster; something that would make it absurdly easy to just have something to work with anytime I want, without incurring additional expenses. Kind of the ultimate local development environment. With that I began scouring the interwebs and checking out where or how I could get some boxes to build this cluster with. I also reached out to a few people to see if I could be gifted some boxes from Dell or another manufacturer.

I lucked out and found some cheap boxes someone was willing to send over my way for almost nothing. But in the meantime, since shipping would take a week or two, I began scouring the easy-to-get-started options on AWS, Google Cloud Platform, and Azure.