Category Archives: Video

Bunches of Databases in Bunches of Weeks – PostgreSQL Day 1

May the database deluge begin: it’s time for “Bunches of Databases in Bunches of Weeks”. We’ll look at databases much the way they’re approached in “7 Databases in 7 Weeks“. In this session I took a hard look at PostgreSQL, or as some refer to it, just Postgres. This is the first of a few sessions on PostgreSQL, in which I get the database installed locally on Ubuntu. That’s transferable to any other operating system really; PostgreSQL is awesome like that. After installing and getting pgAdmin 4, the user interface for PostgreSQL, working against that install, I go the Docker route. Again, I point pgAdmin 4 at that container and create a database and an initial table.

Below the video here I’ve added the timeline and other details, links, and other pertinent information about this series.

0:00 – The intro image splice and metal intro with tunes.
3:34 – Start of the video database content.
4:34 – Beginning the local installation of Postgres/PostgreSQL on the local machine.
20:30 – Getting pgAdmin 4 installed on local machine.
24:20 – Taking a look at pgAdmin 4, a stroll through setting up a table, getting some basic SQL from and executing with pgAdmin 4.
1:00:05 – Installing Docker and getting PostgreSQL setup as a container!
1:00:36 – Added the link to the stellar post on DigitalOcean’s blog.
1:00:55 – My declaration that if DigitalOcean just provided documentation I’d happily pay for it; their blog entries, tutorials, and docs are hands down some of the best on the web!
1:01:10 – Installing PostgreSQL on Ubuntu 18.04.
1:06:44 – Signing in to Docker Hub and finding the official PostgreSQL Docker image.
1:09:28 – Starting the container with Docker.
1:10:24 – Connecting to the Docker PostgreSQL container with pgAdmin 4.
1:13:00 – Creating a database and working with SQL, tables, and other resources with pgAdmin 4 against the Docker container.
1:16:03 – The hacker escape outtro. Happy thrashing code!
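In the video I point pgAdmin 4 at the container by hand. If you later want to hit the same container from code, the connection details boil down to a connection string (DSN). Here’s a minimal Go sketch of assembling one; the host, port, user, and database values are assumptions based on the official postgres Docker image defaults, not values taken from the video:

```go
package main

import "fmt"

// buildPostgresDSN assembles a PostgreSQL connection string of the kind
// you'd hand to database/sql (with a driver such as lib/pq) or to psql.
// The defaults used in main are assumptions matching the official postgres
// Docker image, not values from the video.
func buildPostgresDSN(host string, port int, user, password, dbname string) string {
	return fmt.Sprintf("postgres://%s:%s@%s:%d/%s?sslmode=disable",
		user, password, host, port, dbname)
}

func main() {
	// A container started with the image defaults listens on localhost:5432.
	fmt.Println(buildPostgresDSN("localhost", 5432, "postgres", "password", "postgres"))
}
```

The same string, minus the sslmode parameter, is effectively what pgAdmin 4’s connection dialog is collecting field by field.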

For each of these sessions in the “Bunches of Databases in Bunches of Weeks” series I’ll follow the same sequence. I’ll go through each database in my list of top 7 databases for day 1 (see below), then work through day 2 for each database, and so on, accumulating additional days similarly to the “7 Databases in 7 Weeks” format.

On “Day 1” of the respective database, I’ll work toward building a development installation of that particular database. For example, in this session I set up PostgreSQL by installing it on the local machine and also pulled a Docker image to run PostgreSQL.

On “Day 2” of the respective database, I’ll get into working against the database with CQL, SQL, or whatever one would use to work directly with that specific database. At this point I’ll also get more deeply into the types, inserting, and storing data in the respective database.

On “Day 3” of the respective database, I’ll get into connecting an application with C#, Node.js, and Go: implementing a simple connection, prospectively a test of that connection, and a simple insert, update, and delete of some sort against the respective database built on that database’s day 2.

From “Day 4” onward I’ll determine the path and layout of the topics later, so subscribe on YouTube and Twitch, and tune in. The events are scheduled, with the option on Twitch to be notified when a particular episode you’d like to watch is coming on.

Next Events for “Bunches of Databases in Bunches of Weeks”

It’s Official, ML4ALL 2019, Machine Learning Conference 4 All v2!

It’s official, we’ve got dates and tickets are open for ML4ALL 2019! Our CFP will be open in a number of hours, not days, and I’ll do another update the second that we have that live.

What is ML4ALL?

ML4ALL stands for “Machine Learning for All“. Last year I enjoyed working with Alena Hall, Troy Howard, Glenn Block, Byron Gerlach, and Ben Acker on getting a great conference put together, and I’m looking forward to rounding up a team and doing a great job putting together another great conference for the community again this year!

Last year @lenadroid put together this great video of the event and some short interviews with speakers and attendees. It’s a solid watch, take a few minutes and check it out for a good idea of what the conference will be like.

Want to Attend? Help!

Tickets are on sale, but there are a lot of other ways to get involved now. First, the super easy way to keep track of updates is to follow the Twitter account: @ml4all. The second way is a little more involved, but can offer a much higher return on investment for you: joining the ML4ALL Slack group! There we discuss conference updates, talk about machine learning, introduce ourselves, and have a range of other discussions.

If you work for a company in the machine learning domain, plying the wave of artificial intelligence and related business, you may want to get involved by sponsoring the conference. We’ve got a prospectus we can send you for the varying levels, just send an email to ml4allconf@gmail.com with the subject “Plz Prospectus”. We’ll send you the prospectus and we can start a conversation on which level works best for your company!

The TL;DR

ML4ALL is a conference covering machine learning presentations, conversations, and community discussions from beginner to advanced. It’s a top conference choice to put on your schedule for April 28th-30th, pick up tickets for, and submit a proposal to the CFP!

 

Learning Go Episode 2 – Further into packages, dependencies, application creation, and IDEs

Episode Post & Video Links:  1, 2 (this post), 3, 4 (almost done)

In episode two I went back over a lot of the material from the first episode, but added more context, some of the historical reasons certain things are the way they are in Go and its stack, and a number of new pieces of information too. One thing I got further into this episode is package and dependency management with Go Dep, and also how to create a package, or dependency library, for use in other Go libraries or applications. It’s just a small introduction in this episode, but pivotal to future episodes, as I’ll be jumping further into library creation and related details.

In this post I’ve got the time point breakdown like usual, but also a few additional bits of information and code examples, plus links to the repository I’ve set up for this particular episode. The quick links to those references are below, and I’ll also link at particular call-out points within the time points.

Quick Links:

Key Topics Covered

Data Types, Packages, and Dependency Management

2:52 – Fumbling through getting started. It is after all Saturday morning!
3:00 – Recap of what we covered in the first session. Includes a quick review of the previous session code too, such as the random data generation library we setup and used.
6:40 – Covering some specifics of the IDEs, and the benefits of Go having a specific, somewhat detailed convention for the way syntax, variables, and related features are used.
7:40 – Covering gofmt and what it gives us.
9:45 – Looking at the gofmt plugins and IDE features around the conventions.
14:06 – New example time! In this one, I work through an example showing how to find duplicate lines in passed in text.
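Circling back to the gofmt discussion at the 7:40 mark: the formatter is also exposed programmatically through the standard library’s go/format package, which makes it easy to see exactly what the tool does to sloppy spacing. A small sketch of my own, not code from the episode:

```go
package main

import (
	"fmt"
	"go/format"
)

// formatGo runs the same formatter gofmt uses over a source string.
func formatGo(src string) string {
	out, err := format.Source([]byte(src))
	if err != nil {
		panic(err) // the source didn't parse as Go
	}
	return string(out)
}

func main() {
	// Deliberately sloppy spacing; gofmt normalizes it to the one true style.
	fmt.Print(formatGo("package main\nfunc  main( ) {fmt.Println(\"hi\")}"))
}
```

This is the same normalization the IDE plugins trigger on save, which is why two gofmt-ed codebases always agree on layout.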

Duplicate Line Finder

I went through the various steps of creating the code, but then took a little bit of a detour from the example in the book. Instead of taking lines from the CLI, it takes in content from a text file. The code in main.go ended up like this.

Then if you’d like to check out the text file and remaining content in that project, check out the master branch of the episode 2 repo.

36:34 – Here I take a thorough step through committing this project to GitHub, which is the included repo in this post. I step through the interface of using JetBrains Goland to do the commit, and how it enables gofmt and other features to improve the condition of the code and ensure it meets linter demands and related criteria. I also cover the .gitignore file and other elements needed to create a usable repository.
44:30 – Setting up the repository for today’s code at https://github.com/Adron/learning-go-…
50:00 – Setup of the key again for using GitHub: how to set up your SSH keys using ssh-keygen.
56:00 – Going beyond just the language, and building out a Go build on Travis CI.
1:10:16 – Creating a new branch for the next code examples and topics. At this point I shift into type declarations. Working through some constants, very basic function declarations, and related capabilities to calculate temperatures between Fahrenheit and Celsius.

The tempApp Branch is available in the repository here.


During this point, we take a look at our first package. This package ended up looking like this.

In the main.go file, I showed how you can use this package by adding a respective import shown in this code.
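Neither embedded snippet survives in this archive, so here’s a single-file sketch of the same idea. In the repo the conversion code would live in its own package directory and be imported into main.go; the names follow the book’s tempconv example but are my assumptions here:

```go
package main

import "fmt"

// Distinct named types for the two scales, in the spirit of the book's
// tempconv package. In the real project these would live in their own
// package and be imported into main.go.
type Celsius float64
type Fahrenheit float64

// CToF converts a Celsius temperature to Fahrenheit.
func CToF(c Celsius) Fahrenheit { return Fahrenheit(c*9/5 + 32) }

// FToC converts a Fahrenheit temperature to Celsius.
func FToC(f Fahrenheit) Celsius { return Celsius((f - 32) * 5 / 9) }

func main() {
	const boiling Celsius = 100
	fmt.Printf("%g°C is %g°F\n", float64(boiling), float64(CToF(boiling)))
}
```

The point of the distinct types is that the compiler stops you from accidentally passing a Fahrenheit value where a Celsius one is expected, even though both are just float64 underneath.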

1:17:54 – At this point, to increase readability of the font sizes, I get into the various Goland IDE options.
1:38:12 – Creating the final branch for this session to pull in a public package and use it in project. For this, I pull in a random data generation package to use in some application code.

1:44:50 – Further discussion and explanation of what to include in .gitignore files to manage projects, but also what is and isn’t included for dependencies and other details around all of this.
2:13:22 – The wicked awesome hacker outtro.

Learning Go Episode 1 – Environment, Go Workspace, GOPATH/GOROOT, Types, and more Introduction

This is episode one of a multi-part series on “The Go Programming Language“. It’s not necessary, but if you’d like to follow along you can also pick up the book “The Go Programming Language” by Alan A. A. Donovan and Brian W. Kernighan. At the bottom of the description I have a link to the book publisher’s website and the respective book. I’ll be using it as a guideline and using a number of examples from the book. However, I’ll also be adding a lot of additional material around the Goland IDE from JetBrains and Visual Studio Code. The video link for this session is at the bottom of the post; scroll all the way down and it’s waiting for you there.

3:28 – Getting started, introducing that I’m starting the session with a completely new Ubuntu Linux install so that I ensure we cover all of the steps to get up and running. These steps, even though they’re on Linux, are reproducible on Windows 10 and MacOS, so any operating system is usable to follow along with, with only minor discrepancies.

5:04 – Introducing the book that I’ll be using as a guideline reference so that viewers can also follow along with a physical book. I’m a big fan of multisensory learning, so between a book, the stream, being able to ask questions in the channel, it’ll give viewers a chance to learn and undertake their coding adventures in Go using all sorts of methods.

Book Reference: “The Go Programming Language” by Alan A. A. Donovan and Brian W. Kernighan

6:58 – Discussing where Go is located on the web: GitHub and the golang.org site, which is useful in that one can even try out little snippets of Go code right on the site!

Github: https://github.com/golang/go
Golang: https://golang.org

10:40 – Setting export in the .bashrc file (or .bash_profile on MacOS or environment variables on Windows 10). Speaking of Windows 10 specifically, Linda Gregier wrote up a great blog post on getting Go setup on Windows specifically.

14:50 – Setting up the Go workspace path for GOPATH using the standard Go convention. From here I get into the first “Hello World!” with Go.

15:34 – Mention of setting up Go on a Docker container and how it is easier, but we’re staying focused on setting it up completely from scratch.

18:20 – Starting first code, a standard “Hello World” program.

19:50 – First build of that “Hello World” program.

20:34 – Here I introduce go run and how to execute a singular file instead of building an entire project.
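For anyone following along in text form rather than video, the program from this stretch is the canonical one. A reproduction (the session’s file may differ slightly in wording):

```go
package main

import "fmt"

// greeting is split out as a constant so the message is easy to reuse.
const greeting = "Hello World!"

func main() {
	fmt.Println(greeting)
}
```

Save it as main.go, then `go build` produces a binary, while `go run main.go` compiles and executes the single file in one step, which is the distinction covered at the 20:34 mark.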

21:32 – Installing some IDEs to use for developing Go applications. The first two up for installation are Visual Studio Code and JetBrains Goland.

29:00 – A first variable: what it is, and one of the ways one can declare a variable in Go!

31:08 – Introducing the terminal in Visual Studio Code.

37:12 – A little example of OBS, how I’m using it, and how I interact back and forth with chat and related tooling plus the virtual machine itself.

42:36 – Changing themes and adding plugins for Goland. In the plugins I also find the most epic of status bars, the Nyan Cat!

59:00 – Here I start to get back into some more specific Go details. Starting off with a Go command line parsing application. At this point I also cover several additional ways to declare variables, speak more about short declarations, and other ways to declare, assign, and use variables in Go.

At this point I also go through a number of examples to exemplify how to go about declaring variables, build, run, and explore the details of the code. Further along I also get into string formatting, concatenating, and related string manipulation with Go.

Other details include taking a look at extra ways to figure out Go code using autocomplete inside Goland and other exploratory features. Eventually before wrapping up I broach pointers, tuple declaration techniques, and how to declare additional functions beyond func main().
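The spread of declaration styles, string handling, pointers, and multi-value returns covered across this stretch can be condensed into one small program. This is my own sketch, not the session’s exact code:

```go
package main

import "fmt"

// add returns a sum plus an ok flag: a small multi-value ("tuple") return.
func add(a, b int) (int, bool) {
	return a + b, true
}

func main() {
	// The main ways to declare variables in Go:
	var a int      // declared with the zero value, 0
	var b int = 10 // declared with explicit type and value
	c := 20        // short declaration, type inferred
	a = b + c

	// String formatting and concatenation.
	name := "Go"
	greeting := "Hello, " + name
	formatted := fmt.Sprintf("%s! a is %d", greeting, a)

	// A pointer holds the address of a value; writing through it
	// changes the original variable.
	p := &a
	*p = 42

	sum, ok := add(a, b)
	fmt.Println(formatted, sum, ok)
}
```

The short `:=` form only works inside functions, which is why package-level declarations in the session always use the `var` keyword.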

1:58:40 – Adding dependencies and generating random data. At this point I bring in a dependency. But before pulling in the dependency, after introducing it, I show how to go about doing so.

2:00:10 – New machine, and I run into git not being available. The go get command uses git to pull dependencies, so I go over why it’s needed and the steps to install it on Ubuntu.

2:09:20 – Introduction to more concepts around dependencies, what go get does versus managing dependencies with Go Dep.

2:10:00 – Installing Go Dep: MacOS, using curl, Linux installation, and a question that sort of remains about getting it running on Windows. The easiest method there is using Chocolatey however, so check that out if you’re going the Windows route.

2:15:20 – Setting up Go Dep integration with Goland.

2:23:55 – Showing off Goland’s commit dialog and some of the respective options.

Got a New Laptop, Here’s The Review

A few past reviews just for context of my general working fleet of computers and why and what led me to this review and this new laptop purchase.

Important! Do take note, I’m not paid by Dell, or System76, or anybody to write up reviews of laptops or hardware for that matter. These are my observations of these systems. I’m also not paid to use these systems for software development, but am only paid for the software development I produce with these machines. In other words, I very much roll Bring Your Own Device (BYOD) style and routinely develop without an assigned machine. I do what I can to stick to BYOD and, such as it is, write up reviews of what I choose to use.

The Setting & Context for Purchase

Over the last year I’ve been pondering getting a Linux-based laptop. At least a laptop that can run Linux native on the metal as the sole OS, or at least with a clean dual boot option. I wanted this for several specific reasons. The final decision to move forward occurred at this very tweet moment.

Here’s the short list of why I wanted a new laptop, that has good Linux support.

  1. Most of my gaming is in Linux these days. Thanks Steam OS!
  2. Most of my server workloads, server coding, internet coding, back-end work, and related coding is all for Linux. I haven’t deployed something to another OS in production for at least a decade now. As for front end apps, that’s also basically stuff that runs on Linux or MacOS. Web interfaces or usually just some simple CLI’s. I did write a Windows 8 “Metro UI” App, but it’s long gone and dead already along with the database (Riak) that it was an interface for.
  3. Most of my automation work and related site reliability coding, ops hacking, my metasoftware engineering (great words for a title from Katie Keel @akatiekeel, see tweet below), and all that good fun is often done on MacOS but always ends up being deployed to run on a Linux machine in the end.
  4. I’ve already got two Linux machines that I do a huge percentage of work on. The Dell XPS 13 and System 76 Leopard Workstation. However, the Leopard is in a bit of disrepair after a disturbingly wicked power surge here in Ballard (Seattle). The XPS 13 is just a bit weak, old, and the keyboard is still the crappy keyboard I detailed in the past review.
  5. One of the big demands for this new laptop was that I wanted to be able to – at least with a somewhat efficient hardware performance level – edit video, stream video, run the virtual machines, the pertinent container ecosystems (i.e. distributed database stuff), of course lots of code, and play the few games I do play. This meant at basic some decent video – albeit I knew it wasn’t going to be what I had/have in my System76 machine – at least a terabyte of storage on my main drive, and 32 GB of RAM.

Buy Time

Alright, that was the setting, so I went about searching through a lot of different laptop options. One of the most promising was the Huawei MateBook that Jeff & Adam pointed me at. It looked so good from the website that I decided I wanted to go check out the physical MateBook Pro somewhere, if possible, and found that option here in Seattle at the Microsoft Store in University Park (it’s an outdoor mall, yuck, but whatevers).

I rolled over via bus and a short walk, walked into the Microsoft Store, and made a beeline right to where one of the MateBooks sat. It was gorgeous, absolutely beautiful, flawless, and outright a better bang for the hardware buck than one of the Apple products from across the street! I was instantly sold.

But there was an issue. Hardware specs for this thing sit at 2 GB video, 8 GB RAM, and a 512 GB SSD. That’s a problem. I checked the site again to make sure there weren’t other options. Nope, it didn’t get much more built up than that. It just wouldn’t do.

I felt pained, annoyed, and frustrated. Does anybody actually want some decent power in a slim, elegant, easy-to-carry laptop? Am I the only one wanting something like this? I started strolling around the floor of the Microsoft Store, looking at hard drives and Xbox stuff. Which, just to point out, these Microsoft Stores really are Xbox stores as much or more than they are anything else!

NOTE: All Huawei images copyright of Huawei. I’m hoping they’re cool with it since I’m pointing out their awesome laptop.

The reason I bring up the MateBook is that I really was impressed by the build quality. It exceeded my expectations, and based on this research and trying it out, I would happily suggest this laptop as a prime choice if the specs meet what you need. For me, sadly, I wanted and needed a bit more RAM and some more oomf in other categories.

The Final Decision

I walked around the Microsoft store checking out the Lenovo and a number of other laptops. I played some racing game thing on Xbox for a second. I wasn’t in so much of a hurry that I just had to buy something right then. I had after all waited almost a year to get to this point. Maybe I’d just save the cash and wait a little longer? Surely something would come along.

Then I walked up to another table. I first looked at the spec list, which I had been doing at every table except when I had walked up to the MateBook. I see a 1 TB option on this machine. That’s cool. Then I see 32 GB of RAM. Holy shit, when did the selection on the floor leap out of the piddly 8 GB range?! Then I see 4 GB video! Specifically an NVIDIA® GeForce® GTX 1050Ti with 4GB GDDR5! Hell yeah. Alright, I’d looked, and here was the laptop that after months seemed to be the only ongoing choice to get these kinds of specs in something I sort of trusted the support for. So I started to play around with the keyboard and, oh, looky there, it’s a touch screen too. Not that I cared, but it’s not a bad thing to have really, albeit a waste as I’ll likely never touch the screen.

So there it was, the decision was made, bagged, paid for, and out I walked with a brand new Dell XPS 15. Maybe I should have bought it from Dell, but meh, this will work. Support plan is nice, if anything fails I bring it into the store and they get me a new laptop too. Overall price $2499 for 3 years of coverage plus laptop. Also, yes, considering my unfortunate luck with hardware over the years – unless it’s a machine I’ve built myself – I get the coverage because I’ve got all of zero time to mess around with being my own technician.

The Dell XPS 15

Alright, so I set out to put this thing through some tasks as soon as I got home. Well, ok, not immediately cuz I had to shoot and put together this unboxing video. Here ya go, it’s a little long, but I also cover a lot of related topics of interest related to this machine.

First Test – Application Load

My first test was simply installing and setting up some standard software I use. That included Creative Cloud from Adobe, JetBrains IDEs and tooling, OBS for video streaming, Steam so I could pull down Cities: Skylines and Transport Fever, and some other miscellaneous software I would need. All this I would install while getting my Dropbox downloads going. With that I set out to install all of this stuff.

First I got Dropbox installed and began syncing. This would easily be about 380-400 gigs of files and video files. With that started, I set out to install things I knew – or hoped – wouldn’t incur one of the required Windows reboots. First was the JetBrains IDEs, which involves downloading their Toolbox and then clicking through every tool I’d want and letting it queue them all up. Then the same thing with Creative Cloud. Altogether that includes:

  • IntelliJ
  • WebStorm
  • Goland
  • PyCharm
  • ReSharper Ultimate w/ the pertinent tools it downloads.
  • Photoshop
  • Illustrator
  • InDesign
  • Premiere Pro
  • Premiere Rush (eh? Never used it, but I’ll download it)
  • After Effects
  • Character Animator
  • Audition
  • Media Encoder
  • InCopy
  • Prelude
  • Fuse CC (another tool I don’t know, but hey, it’s there now!)

All of that downloaded and installed quickly. Having a gigabit connection really, REALLY, REALLY helps out massively to get this done! Between the solid connection, the SSD being pretty lightning fast, and the proc and memory not being a bottleneck, I lost track of timing this. Suffice it to say the hardware combination is pretty quick.

Second Test – Premiere Pro Rendering

With Premiere Pro installed I set about recollecting how to edit with this software. I’ve been using ScreenFlow for Mac and Kdenlive on Linux for so long I didn’t remember a thing about how Premiere Pro works. However, as I worked through transitions (building the above unboxing video) I started to recall how much power is in this software, and I also really got a feel for another thing, the trackpad, which I’ll add more about in a moment.

Rendering to MP4 was a little faster than on the MacBook Pro I’ve got from almost ~5 years ago, and roughly equivalent in performance to the next-to-latest generation MacBook Pro laptops (i.e. the one I’m comparing performance against is about ~2 years old). Overall, the performance of video rendering wasn’t super impressive. At least not like the leap from a MBP to my System76 Leopard Workstation, which screams through rendering a reasonably large percentage (~25-40%) faster than my laptop machines. So the XPS 15 really is close or better – but just a little. I’d be curious to get hold of the latest MBP and compare the performance, considering this has dedicated video and both have similar processors and pretty hefty RAM.

Third Test – Trackpad

Ok, this is the closest trackpad I’ve ever used to an Apple product in usefulness, capability, and multi-finger touch. The tactile feel, in my opinion, is better too, with this fiber-type material that Dell has used on the trackpad and the area surrounding the trackpad and keyboard.

The first real test was the maneuvering one has to do when video editing with Premiere Pro. Lots of right clicking (double-finger tap, or the traditional bottom right of the trackpad!), swiping up and down but also side to side, and this Dell trackpad performed exceptionally well. Still not as transparently smooth as an Apple trackpad, but it gets the job done, and once I’m used to it I’ll easily be just as fast as I was with an Apple trackpad. I give myself a day or two and I’ll be up to speed with it.

Fourth Test – Weight and Sizing

Here are a few images of the XPS 15 compared to an older model (circa 2015) MacBook Pro.

The build quality of the XPS 15, just like the XPS 13 I have from circa ~2015, is really good. There are elements of it that don’t seem aligned correctly compared to the smooth feel and look of some other laptops, but overall it feels, and appears to be, more utilitarian and function-focused versus many of the other laptop options. The edges are very smooth and the cover of the laptop is a single plate of aluminum, which gives it that feel when carrying it around similar to most modern laptops. The edges however aren’t there to look seamless or attractive; they’re there simply to provide side plates for USB, USB-C, audio, HDMI, and related ports. No complaints, but if you’re 100% focused on looks, this might not be an ideal option. Me… well I’m a funny looking fella and it’s probably noticeable that I’m not staunchly focused on the appearance of anything. I’m all about function over form.

Further Tests – I’ll have more tests in the future, for now, that’s a wrap.

Summary

Other comments, observations, and notes on this laptop will be forthcoming. In a few months I’ll plan to write a follow up. In the meantime, if you’re looking for a relatively light (1.81 kg / 4 lbs), relatively powerful laptop (32 GB RAM, 4GB Video, 8th Gen i7 or i9 option) this is about as good as you’ll get for the price. If power isn’t your ultimate intent with a laptop purchase I highly suggest checking out the Matebook.

As for Windows 10, my verdict is still out on it, leaning toward “ugh, fuggidabout it, I’m going back to MacOS/Linux land”, but I’m going to give it a good go before I declare that. There are definitely a few elements of the latest OS that I like, and I also want to get a good feel for its Linux subsystem before I write it off. Maybe I’ll stick with it after all? Until then, happy holiday hardware hacking!

Twitz Coding Session in Go – Cobra + Viper CLI Wrap Up + Twitter Data Retrieval

Part 3 of 3 – Coding Session in Go – Cobra + Viper CLI for Parsing Text Files, Retrieval of Twitter Data, and Exports to various file formats.

UPDATED PARTS:

  1. Twitz Coding Session in Go – Cobra + Viper CLI for Parsing Text Files
  2. Twitz Coding Session in Go – Cobra + Viper CLI with Initial Twitter + Cassandra Installation
  3. Twitz Coding Session in Go – Cobra + Viper CLI Wrap Up + Twitter Data Retrieval (this post)

Updated links to each part will be posted at the bottom of this post when I publish them. For code, the written walkthrough, and the like, scroll down below the video and timestamps.

0:54 The thrashing introduction.
3:40 Getting started, with a recap of the previous sessions but I’ve not got the sound on so ignore this until 5:20.
5:20 I notice, and turn on the volume. Now I manage to get the recap, talking about some of the issues with the Twitter API. I step through setup of the app and getting the appropriate IDs and such for the Twitter API keys and secrets.
9:12 I open up the code base, and review where the previous sessions got us to. Using Cobra w/ Go, parsing and refactoring that was previously done.
10:30 Here I talk about configuration again and the specifics of getting it setup for running the application.
12:50 Talking about the Go fatal panic I was getting. The dependency reference to GitHub for the application was different from what was in the application, so it wasn’t showing the code that was actually executing. I show a quick fix and move on.
17:12 Back to the Twitter API use, using the go-twitter library. Here I review the issue, and what the fix was, for another issue I was having in the previous session with getting the active token! I thought the library handled it, but that wasn’t the case!
19:26 Now I step through creating a function to get the active OAuth bearer token to use.
28:30 After deleting much of the code that doesn’t work from the last session, I go about writing the code around handling the retrieval of Twitter results for various passed in Twitter Accounts.

The bulk of the next section is where I work through a number of functions, a little refactoring, and answering some questions from the audience/Twitch Chat (working on a way to get it into the video!), fighting with some dependency tree issues, and a whole slew of silliness. Once that wraps up I get some things committed into the Github repo and wrap up the core functionality of the Twitz Application.

58:00 Reviewing some of the other examples in the go-twitter library repo. I also do a quick review of the other function calls from the library that take action against the Twitter API.
59:40 One of the PR’s I submitted to the project itself I review and merge into the repo that adds documentation and a build badge for the README.md.
1:02:48 Here I add some more information about the configuration settings to the README.md file.

1:05:48 The Twitz page is now updated: https://adron.github.io/twitz/
1:06:48 Setup of the continuous integration for the project on Travis CI itself: https://travis-ci.org/Adron/twitz
1:08:58 Setup of the actual .travis.yml file for Go. After this I go through a few stages of troubleshooting getting the build going, with some whitespace issues in the ole’ YAML file and such. Including also, the famous casing issue! Ugh!
1:26:20 Here I start a wrap up of what is accomplished in this session.

NOTE: Yes, I realize I spaced and forgot the feature where I export it out to Apache Cassandra. Yes, I will indeed have a future stream where I build out the part that exports the responses to Apache Cassandra! So subscribe, stay tuned, and I’ll get that one done ASAP!!!

1:31:10 Further CI troubleshooting as one build is green and one build is yellow. More CI troubleshooting! Learn about the travis yaml here.
1:34:32 Finished, just the bad ass outtro now!

The Codez

In the previous posts I outlined two specific functions that were built out:

  • Part 1 – The config function for the twitz config command.
  • Part 2 – The parse function for the twitz parse command.

In this post I focused on updating both of these and adding additional functions for the bearer token retrieval for auth and ident against the Twitter API and other functionality. Let’s take a look at what the functions looked like and read like after this last session wrap up.

The config command basically ended up being 5 lines of fmt.Printf functions to print out pertinent configuration values and environment variables that are needed for the CLI to be used.

The parse command changed a small bit. A fair amount of the functionality I refactored out into the buildTwitterList(), exportFile, and rebuildForExport functions. The buildTwitterList() I put in the helpers.go file, which I’ll cover a little later. But in this file, which could still use some refactoring that I’ll get to, I have several pieces of functionality: the export-to-formats functions, and the if/else-if logic of the exportParsedTwitterList function.
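The if/else-if dispatch over export formats looks roughly like the following. This is a sketch of the pattern described here, with my own format names and signature, not the repo’s exact code:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// exportParsedTwitterList renders the parsed accounts in the requested
// format. A sketch of the if/else-if dispatch in the parse command; the
// format names and signature are assumptions, not the repo's exact code.
func exportParsedTwitterList(accounts []string, format string) (string, error) {
	if format == "json" {
		b, err := json.Marshal(accounts)
		if err != nil {
			return "", err
		}
		return string(b), nil
	} else if format == "csv" {
		return strings.Join(accounts, ","), nil
	} else if format == "txt" {
		return strings.Join(accounts, "\n"), nil
	}
	return "", fmt.Errorf("unknown format: %s", format)
}

func main() {
	out, err := exportParsedTwitterList([]string{"adron", "golang"}, "csv")
	if err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```

A chain like this is also a natural candidate for a switch statement, which is part of the refactoring I mentioned still being on the table.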

Next up after parse, it seems fitting to cover the helpers.go file code. First I have the check function, which simply wraps the routinely copied error-handling code snippet. Check out the file directly for that. Then below that I have the buildTwitterList() function, which gets the config setting for the name of the file to open and parse for Twitter accounts. The code reads the file, splits the text into fields, then steps through and parses out the Twitter accounts. This is done with a regex (I know, I know, now I have two problems, but hey, this one is super simple!). It basically finds fields that start with an @, verifies the alphanumeric nature (combined with a possible underscore), and removes unnecessary characters from those fields. All of that wraps up by putting the fields into a slice of strings and returning that slice to the calling code.
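The parsing step described above can be sketched like this. The function name, the exact regex, and the fact that I’ve left out the config lookup and file I/O are all my simplifications; the real buildTwitterList() also opens and reads the configured file.

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// parseTwitterAccounts finds fields that start with an @ followed by
// alphanumerics or underscores, dropping any trailing punctuation.
// The file-reading half of buildTwitterList is omitted to keep the
// sketch focused on the regex parsing.
func parseTwitterAccounts(text string) []string {
	re := regexp.MustCompile(`^@[A-Za-z0-9_]+`)
	var accounts []string
	for _, field := range strings.Fields(text) {
		if match := re.FindString(field); match != "" {
			accounts = append(accounts, match)
		}
	}
	return accounts
}

func main() {
	fmt.Println(parseTwitterAccounts("follow @adron and @thrashing_code!"))
}
```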

The next function in the helpers.go file is the getBearerToken function. This was a tricky bit of code. This function takes in the consumer key and secret from the Twitter app (check out the video at 5:20 for where to set that up). It returns a string and an error; the string is empty if there’s an error.

The code starts out by establishing a POST request against the Twitter API, asking for a token and passing the client credentials. It catches an error if that doesn’t work out, but if it succeeds the code then sets up the b64Token variable with the standard base64 encoding functionality, applied to the token string’s byte array (lines 9 and 10 of the original code). After that the request has its header built with the needed authorization and content-type properties (properties, values? I don’t recall what the spec calls these), then the request is made with http.DefaultClient.Do(req). The response is returned, or an error and an empty response (or nil? I didn’t check the exact function signature logic). Next up is the defer to ensure the response body is closed when everything is done.

Next up, the JSON result is parsed (unmarshalled) into the v struct, which I now realize, as I write this, I probably ought to rename to something that isn’t a single letter. But it works for now, and v has the pertinent AccessToken variable, which is then returned.
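The live call itself isn’t worth reproducing here, but the request construction half of getBearerToken can be sketched as follows. The endpoint and headers follow Twitter’s documented application-only OAuth 2 flow; the function name and variable names are mine, not the exact ones in twitz.

```go
package main

import (
	"encoding/base64"
	"fmt"
	"net/http"
	"net/url"
	"strings"
)

// newBearerTokenRequest builds the POST that getBearerToken sends. The
// consumer key and secret are URL-encoded, joined with a colon, and
// base64-encoded into the Basic auth header, per Twitter's app-only flow.
func newBearerTokenRequest(consumerKey, consumerSecret string) (*http.Request, error) {
	creds := url.QueryEscape(consumerKey) + ":" + url.QueryEscape(consumerSecret)
	b64Token := base64.StdEncoding.EncodeToString([]byte(creds))

	body := strings.NewReader("grant_type=client_credentials")
	req, err := http.NewRequest("POST", "https://api.twitter.com/oauth2/token", body)
	if err != nil {
		return nil, err
	}
	req.Header.Set("Authorization", "Basic "+b64Token)
	req.Header.Set("Content-Type", "application/x-www-form-urlencoded;charset=UTF-8")
	// The caller then runs http.DefaultClient.Do(req), defers closing the
	// response body, and unmarshals the JSON into a struct with AccessToken.
	return req, nil
}

func main() {
	req, err := newBearerTokenRequest("my-key", "my-secret")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL)
}
```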

Wow, ok, that’s a fair bit of work. Up next, the findem.go file and its related function for twitz. Here I start off with a few informative prints to the console, just to know where the CLI has gotten to at certain points. The Twitter list is put together, reusing that same function (yay, code reuse, right?). Then the access token is retrieved. Next the HTTP client is built, the Twitter client is passed that and initialized, and the user lookup request is sent. Finally the users are printed out, followed by a count of those users.
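Skipping the live call again, here is a sketch of what the user lookup request looks like against the REST API’s users/lookup endpoint. In the session I went through a Twitter client library rather than building the request by hand, so treat this stdlib version, and its function name, as illustrative only.

```go
package main

import (
	"fmt"
	"net/http"
	"net/url"
	"strings"
)

// newUserLookupRequest builds a GET against the v1.1 users/lookup endpoint,
// authorized with the bearer token retrieved earlier. The endpoint takes a
// comma-separated list of screen names without the leading @.
func newUserLookupRequest(bearerToken string, screenNames []string) (*http.Request, error) {
	trimmed := make([]string, 0, len(screenNames))
	for _, name := range screenNames {
		trimmed = append(trimmed, strings.TrimPrefix(name, "@"))
	}
	q := url.Values{}
	q.Set("screen_name", strings.Join(trimmed, ","))

	req, err := http.NewRequest("GET", "https://api.twitter.com/1.1/users/lookup.json?"+q.Encode(), nil)
	if err != nil {
		return nil, err
	}
	req.Header.Set("Authorization", "Bearer "+bearerToken)
	return req, nil
}

func main() {
	req, err := newUserLookupRequest("token", []string{"@adron"})
	if err != nil {
		panic(err)
	}
	fmt.Println(req.URL)
}
```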

I realized, just as I wrapped this up, that I completely spaced on the Apache Cassandra export. I’ll have those posts coming soon and will likely do another refactor to get the output into a more usable state before I call this one done. But the core functionality, the setup of the systemic environment needed for the tool, the pertinent data and API access, and the other elements are done. For now, that’s a wrap. If you’re curious about the final refactor and the Apache Cassandra export, subscribe to my Twitch @adronhall and/or my YouTube channel ThrashingCode.

UPDATED SERIES PARTS

    1. Twitz Coding Session in Go – Cobra + Viper CLI for Parsing Text Files
    2. Twitz Coding Session in Go – Cobra + Viper CLI with Initial Twitter + Cassandra Installation
    3. Twitz Coding Session in Go – Cobra + Viper CLI Wrap Up + Twitter Data Retrieval (this post)


Cedrick Lunven on Creating an API for your database with Rest, GraphQL, gRPC

Here’s a talk by Cedrick Lunven (who I have the fortune of working with!) about creating APIs for your database, your distributed database. He starts out with a few objectives for the talk:

  1. Provide you with a working API implementing REST, gRPC, and GraphQL.
  2. Give implementation details through a demo.
  3. Reveal hints on which to choose and WHY (specifically for working with databases).

Other topics include specific criteria around conceptual data models and the shift from a relational to a distributed columnar store, with differentiation between entities, relationships, queries, and their respective behaviors. All of this is pertinent to our KillrVideo reference application too.

Enjoy!