My 2018 Retrospective

Alright, with that, 2018 is a wrap. Christmas is a wrap. New Year’s is a wrap. It’s all done, wrapped up, and time to move on. Well, ok, maybe a small retrospective first! Early in 2018 I wrote a list of resolutions for the year. How did I do? I’m going to lead off with the things I failed at, miserably.

Failed Resolution: Write More Code, Build Patterns & Algorithms

Write More Code, Build Patterns & Algorithms: I want to review and go back to some of the coding roots that I haven’t hit upon in a long time. It’s odd: when coding day in and day out, one tends not to touch a lot of the fundamental basics. I want to start writing about and reviewing these again, keeping them fresh in mind so it’s always easy to explain how things work to whoever asks. Goal: Write 0.5 blog entries per week on coding algorithms, patterns, or related core coding concepts and skills.

Oh dear. I had a simple goal of writing 0.5 blog entries per week on coding algorithms, patterns, or related core coding concepts and skills. Yeah, I didn’t get anywhere near that, even though I increased my rate of blog posts to somewhere around 6x what it was in 2017! Overall, groovy, I wrote more and that’s great, but I didn’t hit the topics I really wanted to, the ones I knew would provide more value!

Failed Resolution: Make OSS Workable for Me

Make OSS Workable for Me: Get the OSS projects I’ve started in the last 2–3 years into a more workable state; ensure others can take them, build them, and get them running without issue. It’s been a few years since I’ve worked on and helped with any OSS projects that are actually used. It’s a bummer, and I’m going to resolve that this year. Goal: Get two projects into a workable state so others can use them, and so I can use them for their intended purpose and for teaching and blogging purposes.

I’m so close on this goal. I’ve got the projects into a little bit better of a state, but overall they’re still kind of rough. At the rate I’m going, among the Twitch streams and such, these projects will be in a nicely usable state soon though! Probably by March or April, I’d suspect, I’ll have two of the projects (Colligere and Twitz) in a usable state! But from the goal point of view, this is a failed resolution.

Failed Resolution: Get More Active

Get More Active: Take more bike rides, train trips, and spend more time with friends and family in Portland. Goal 1: Spend at least 3–4 days at Pedalpalooza this year, and take at least 4 trips (1 per quarter at minimum) of about ~2 days each in 2018. Goal 2: Participate in at least 1 group rides per quarter in Seattle.

I failed this resolution miserably. In all honesty, it’s hard to even talk about. It fills me with rage that I failed so badly at something this fundamentally important. There aren’t many things I enjoy anywhere close to a good group ride, hanging out at Pedalpalooza, and generally being involved in my local community this way, and it all came apart on me this year. It also leaves me angry and frustrated with Seattle’s communities and how uninvolved most people are in Seattle itself. Coming from Portland, I have tons of frustration with this, but I’m slowly learning to just live with it. Maybe this year, maybe not; I’ve no idea how I’m going to pull this together. I’m going to need to, however, as my mental and physical health largely rests on being involved in the biking community. Without this involvement I tend to falter into an angry, apathetic, disenfranchised person.

Failed Resolution: Self Health

Self Health: Take more time for myself, allocate appropriate me time to make sure I can keep my sanity. Goal: Write, ponder, introspect, and watch the world turn more often. Blog on the park, lake, boat, train, or place and moment of writing, pondering, introspecting, and watching the world turn by writing about it.

The first part of the year was catastrophically bad for me. However, between joining a truly amazing team at DataStax, finally getting back to riding around Seattle, exploring, producing content, and learning new tech, languages, and the like, I’m doing a lot better. I’ve got a long way to go before I’m back to 100%, but I think 2019 might be that year. At least, ending 2018 and going into 2019, I’m not as angry, depressed, and perturbed at society as I was at the beginning of the year. Now I’m just kind of euphorically disconnected and apathetic about the state of the world. It’s a major improvement!

Failed Resolution: Improve Local Communities

Improve Local Communities: Stop getting involved in Internet politics I can’t improve. Get more focused in local politics. Help more people in Seattle, help more people however I can. Help in the fight for better and more housing for all people. Help in the fight for better and more advanced and modern transportation options in Seattle, Portland, and wherever I can. Goal 1: Join Cascade Bicycle Club and the monthly urbanist meets. Goal 2: Keep up with the respective communities ala Bike Portland, The Urbanist, Seattle Transit Blog, Seattle Bike Blog, WA Bikes, and Cascade Bicycle Club.

Alright, I’m angry at myself about this one, but I also realize I just couldn’t get to it. Don’t get me wrong, I attended some city council meetings and got involved some, but nowhere near as much as I’d have preferred. The simple fact, too, is that Seattle’s political mechanisms are just garbage compared to Portland’s. It’s hard to know what’s going on, let alone how the hell to get something done or improve one’s community, lot in life, or environment here. Seattle is a beast! One day, if we survive that long, it’ll make a great “Manhattan” of the northwest! (Ironically, I’ve learned it was once indeed called “New York City”, which to me seems like the dumbest name, since New York was dubbed after “York” and before that was called New Amsterdam after Amsterdam, none of which Seattle has any real relation to whatsoever.)

But failures aside, I digress.

TIME FOR VICTORIES!

Now it’s time for the successes. Some of these I barely passed, some I beautifully knocked out of the park!

Victory Resolution, “Beer”

Beer: Live a little, treat yo’ self, and hit up at least two happy hours or other meets with friends and coder/tech/hacker crews per month. The Ballard Happy Hour has been a good one so far, and of course the exceptional Seattle Scalability Group I help organize is pretty epic unto itself, the latter often having free beer post-meetup. Both meets offer good conversation, great beer, and a great chance to just unwind with smart folk. Goal: Attend two meets per month.

Ok, this one wasn’t purely about beer itself, but beer tends to play a large part in the activity. The core of it all was ensuring that I get more involved in the local “tech” community here in Seattle. In this I have succeeded! Currently I’m leading the return of the Seattle Scalability Group with my meetup cohort, plus the DataStax team having my back. This is of course only one of the activities I’ve gotten myself involved in again. I hope to also help out and get involved in other meetups around Seattle, and have started plotting the return of ML4ALL (v2) with Troy and Alena! I’m looking forward to this being a continuing resolution, and to continually succeeding at it throughout 2019!

Victory Resolution: Communications Improvements

Communication Improvements: Find a new and more efficient way to increase the throughput of my follow ups with everybody. Whether by email, phone, or whatever it might be. Goal: Increase rate of follow ups by 15%.

Oh jeez, this one was easier to accomplish than I thought it was going to be. I’ve since gone through and reviewed: I managed to drop the ball on only ~3 of the over 340 important, productive conversation threads I initiated! I’m also impressed with my ability to actually keep track of those, thanks to the combination of software tools I started using and will keep using going forward. That puts my follow-up rate at over 99%, far past the mere 15% improvement I was aiming for! Major victory on this accomplishment!

Victory Resolution: Cell Phone Disruption

Cell Phone Disruption: Decrease my actual cell phone usage making *phone calls* even further. Regain the privacy, personal time, and focus time that one traditionally had before the time of cell phones invading every corner of existence. How I do this I’m not entirely sure, but I’m betting when I figure it out I’ll blog it. Goal: Make it so that I don’t look at or need my cell phone for anything more than 1 phone call per week on average (vid chat, etc, excluded).

I was spectacularly victorious in this goal. I think I’ve taken approximately three work calls the entire year via the cell phone. I’ve decreased my screen time by almost 30%! Overall, I use my device dramatically less than I have in the past, and it’s helped me in a pretty significant way. Maybe I could knock it down another 5 or 10% this year, eh?

Victory Resolution: Re-initiate in Industry

Reinitiate in Industry: Kick back off and give some dedicated presentations at meet ups and conferences. Goal: Give at least 4–6 talks that are dedicated, focused, mid-level or advanced topics following my standards for speaking, follow up, and improving.

Knocked out three talks: Systemic Deployments, DataStax Developer Day: Operations & Security, and Node Systems for Node.js Services on Nodes of Systemic Nodal Systems. Then, to boot, I allocated some of my talks to Twitch streams and am continuing to step through these on a daily basis. Hope y’all will join me on a future stream, they’re fun! Overall though, lots of moving parts are going on, and I’m likely going to double my speaking schedule this year. I’ve got a lot of cool things to show ya!

Next up, 2019.

Let’s go!

Got a New Laptop, Here’s The Review

A few past reviews just for context of my general working fleet of computers and why and what led me to this review and this new laptop purchase.

Important! Do take note, I’m not paid by Dell, or System76, or anybody to write up reviews of laptops or hardware for that matter. These are my observations with these systems. I’m also not paid to use these systems for software development, but am only paid for the software development I produce with these machines. In other words, I very much roll Bring Your Own Device (BYOD) style and develop routinely without an assigned machine. I do what I can to stick to BYOD and such as it is, write up reviews of what I choose to use.

The Setting & Context for Purchase

Over the last year I’ve been pondering getting a Linux-based laptop; at least a laptop that can run Linux native on the metal, as the sole OS or at least as a clean dual-boot option. I wanted this for several specific reasons. The final decision to move forward occurred at this very tweet moment.

Here’s the short list of why I wanted a new laptop, that has good Linux support.

  1. Most of my gaming is in Linux these days. Thanks Steam OS!
  2. Most of my server workloads, server coding, internet coding, back-end work, and related coding is all for Linux. I haven’t deployed something to another OS in production for at least a decade now. As for front-end apps, those are also basically things that run on Linux or MacOS: web interfaces, or usually just some simple CLIs. I did write a Windows 8 “Metro UI” app, but it’s long gone and dead already, along with the database (Riak) that it was an interface for.
  3. Most of my automation work and related site reliability coding, ops hacking, my metasoftware engineering (great words for a title from Katie Keel @akatiekeel, see tweet below), and all that good fun is often done on MacOS but always ends up being deployed to run on a Linux machine in the end.
  4. I’ve already got two Linux machines that I do a huge percentage of work on: the Dell XPS 13 and the System76 Leopard workstation. However, the Leopard is in a bit of disrepair after a disturbingly wicked power surge here in Ballard (Seattle). The XPS 13 is just a bit weak and old, and the keyboard is still the crappy keyboard I detailed in the past review.
  5. One of the big demands for this new laptop was that I wanted to be able to – at least with a somewhat efficient level of hardware performance – edit video, stream video, run virtual machines and the pertinent container ecosystems (i.e. distributed database stuff), write lots of code of course, and play the few games I do play. This meant, at minimum, some decent video – albeit I knew it wasn’t going to be what I have in my System76 machine – at least a terabyte of storage on my main drive, and 32 GB of RAM.

Buy Time

Alright, that was the setting, so I went about searching through a lot of different laptop options. One of the most promising was the Huawei MateBook that Jeff & Adam pointed me at. It looked so good from the website that I decided I wanted to go check out the physical MateBook Pro somewhere, if possible, and found that option here in Seattle at the Microsoft Store in University Village (it’s an outdoor mall, yuck, but whatevers).

I rolled over via bus and a short walk, walked into the Microsoft Store, and made a beeline right to where one of the MateBooks sat. It was gorgeous, absolutely beautiful, flawless, and outright a better bang for the hardware buck than the Apple products from across the street! I was instantly sold.

But there was an issue. Hardware specs for this thing sit at 2 GB video, 8 GB RAM, and a 512 GB SSD. That’s a problem. I checked the site again to make sure there weren’t other options. Nope, it didn’t get much more built up than that. It just wouldn’t do.

I felt pained, annoyed, and frustrated. Does anybody actually want some decent power in a slim, elegant, easy-to-carry laptop? Am I the only one wanting something like this? I started strolling around the floor of the Microsoft Store, looking at hard drives and Xbox stuff. Which, just to point out: these Microsoft Stores really are Xbox stores as much as or more than they are anything else!

NOTE: All Huawei images are copyright of Huawei. I’m hoping they’re cool with it since I’m pointing out their awesome laptop.

The reason I bring up the MateBook is that I really was impressed by the build quality. It exceeded my expectations, and based on this research and trying it out, I would happily suggest this laptop as a prime choice if the specs meet what you need. For me, sadly, I wanted and needed a bit more RAM and some more oomph in other categories.

The Final Decision

I walked around the Microsoft store checking out the Lenovo and a number of other laptops. I played some racing game thing on Xbox for a second. I wasn’t in so much of a hurry that I just had to buy something right then. I had after all waited almost a year to get to this point. Maybe I’d just save the cash and wait a little longer? Surely something would come along.

Then I walked up to another table. I first looked at the spec list, which I had been doing at every table except when I had walked up to the MateBook. I see a 1 TB option on this machine. That’s cool. Then I see 32 GB of RAM. Holy shit, when did the selection on the floor leap out of the piddly 8 GB range?! Then I see 4 GB video! Specifically an NVIDIA® GeForce® GTX 1050 Ti with 4 GB GDDR5! Hell yeah. Alright, I’d looked, and here was the laptop that, after months, seemed to be the only ongoing choice to get this kind of specs in something whose support I sort of trusted. So I started to play around with the keyboard and, oh, looky there, it’s a touch screen too. Not that I cared, but it’s not a bad thing to have really, albeit a waste as I’ll likely never touch the screen.

So there it was, the decision was made, bagged, paid for, and out I walked with a brand new Dell XPS 15. Maybe I should have bought it from Dell, but meh, this will work. Support plan is nice, if anything fails I bring it into the store and they get me a new laptop too. Overall price $2499 for 3 years of coverage plus laptop. Also, yes, considering my unfortunate luck with hardware over the years – unless it’s a machine I’ve built myself – I get the coverage because I’ve got all of zero time to mess around with being my own technician.

The Dell XPS 15

Alright, so I set out to put this thing through some tasks as soon as I got home. Well, ok, not immediately cuz I had to shoot and put together this unboxing video. Here ya go, it’s a little long, but I also cover a lot of related topics of interest related to this machine.

First Test – Application Load

My first test was simply installing and setting up some standard software I use. That included Creative Cloud from Adobe, JetBrains IDEs and tooling, OBS for video streaming, Steam so I could pull down Cities: Skylines and Transport Fever, and some other miscellaneous software I would need. All this I would install while my Dropbox downloads got going. With that, I set out to install all of this stuff.

First I got Dropbox installed and began syncing. This would easily be about 380–400 GB of files, largely video. With that started, I set out to install things I knew – or hoped – wouldn’t incur one of the required Windows reboots. First was the JetBrains IDEs, which involves downloading their Toolbox and then clicking through every tool I’d want and letting it queue them all up. Then the same thing with Creative Cloud. Altogether that includes:

  • IntelliJ
  • WebStorm
  • Goland
  • Pycharm
  • ReSharper Ultimate w/ the pertinent tools it downloads.
  • Photoshop
  • Illustrator
  • Indesign
  • Premier Pro
  • Premier Rush (eh? Never used it, but I’ll download it)
  • After Effects
  • Character Animator
  • Audition
  • Media Encoder
  • InCopy
  • Prelude
  • Fuse CC (Another tool I don’t know but hey, it’s there now!)

All of that downloaded and installed quickly. Having a gigabit connection really, REALLY, REALLY helps get this done! Between the solid connection, the SSD being pretty lightning fast, and the proc and memory not being a bottleneck, I lost track of timing this. Suffice it to say the hardware combination is pretty quick.

Second Test – Premiere Pro Rendering

With Premiere Pro installed I set about recollecting how to edit with this software. I’ve been using Screenflow for Mac and Kdenlive on Linux for so long I don’t remember a thing about how Premiere Pro works. However as I worked through transitions (building the above unboxing video) I started to recall how much power is in this software, and I also really got a feel for another thing, the trackpad, which I’ll add more about in a moment.

The rendering to MP4 was a little faster than on the MacBook Pro I’ve got from almost ~5 years ago, and roughly equivalent to the next-to-latest generation of MacBook Pro laptops (i.e. the machine I’m comparing against is about ~2 years old). Overall, the performance of video rendering wasn’t super impressive; at least not like the leap from an MBP to my System76 Leopard workstation, which screams through rendering a reasonably large percentage (~25–40%) faster than my laptop machines. So the XPS 15 really is close to or better than the MBP, but just a little. I’d be curious to get hold of the latest MBP and compare, considering this machine has dedicated video and both have similar processors and pretty hefty RAM.

Third Test – Trackpad

Ok, this is the closest trackpad I’ve ever used to an Apple product in usefulness, capability, and multi-finger touch. The tactile feel, in my opinion, is actually better thanks to the fiber-type material Dell has used on the trackpad and the area surrounding the trackpad and keyboard.

The first real test was the maneuvering one has to do when video editing with Premiere Pro: lots of right-clicking (double-finger tap, or the traditional bottom right of the trackpad!) and swiping up and down but also side to side, and this Dell trackpad performed exceptionally well. It’s still not as transparently smooth as an Apple trackpad, but it gets the job done, and once I’m used to it I’ll easily be just as fast as I was with an Apple trackpad. Give me a day or two and I’ll be up to speed with it.

Fourth Test – Weight and Sizing

Here are a few images of the XPS 15 compared to an older model (circa 2015) MacBook Pro.

The build quality of the XPS 15, just like the XPS 13 I have from circa ~2015, is really good. There are elements of it that don’t seem aligned quite as precisely as the smooth look and feel of some other laptops, but overall it feels, and appears, more utilitarian and function-focused than many of the other laptop options. The edges are very smooth, and the cover of the laptop is a single plate of aluminum, which gives it a feel when carrying it around similar to most modern laptops. The edges, however, aren’t there to look seamless or attractive; they’re there simply to provide side plates for the USB, USB-C, audio, HDMI, and related ports. No complaints, but if you’re 100% focused on looks, this might not be an ideal option. Me… well, I’m a funny-looking fella, and it’s probably noticeable that I’m not staunchly focused on the appearance of anything. I’m all about function over form.

Further Tests – I’ll have more tests in the future, for now, that’s a wrap.

Summary

Other comments, observations, and notes on this laptop will be forthcoming. In a few months I’ll plan to write a follow up. In the meantime, if you’re looking for a relatively light (1.81 kg / 4 lbs), relatively powerful laptop (32 GB RAM, 4GB Video, 8th Gen i7 or i9 option) this is about as good as you’ll get for the price. If power isn’t your ultimate intent with a laptop purchase I highly suggest checking out the Matebook.

As for Windows 10, my verdict is still out, leaning toward “ugh, fuggidabout it, I’m going back to MacOS/Linux land”, but I’m going to give it a good go before I declare that. There are definitely a few elements of the latest OS that I like, and I also want to get a good feel for its Linux subsystem before I write it off. Maybe I’ll stick with it after all? Until then, happy holiday hardware hacking!

Twitz Coding Session in Go – Cobra + Viper CLI Wrap Up + Twitter Data Retrieval

Part 3 of 3 – Coding Session in Go – Cobra + Viper CLI for Parsing Text Files, Retrieval of Twitter Data, and Exports to various file formats.

UPDATED PARTS:

  1. Twitz Coding Session in Go – Cobra + Viper CLI for Parsing Text Files
  2. Twitz Coding Session in Go – Cobra + Viper CLI with Initial Twitter + Cassandra Installation
  3. Twitz Coding Session in Go – Cobra + Viper CLI Wrap Up + Twitter Data Retrieval (this post)

Updated links to each part will be posted at the bottom of this post when I publish them. For code, the written walkthrough, and the like, scroll down below the video and timestamps.

0:54 The thrashing introduction.
3:40 Getting started with a recap of the previous sessions, but I’ve not got the sound on, so skip ahead to 5:20.
5:20 I notice, and turn on the volume. Now I manage to get the recap, talking about some of the issues with the Twitter API. I step through setup of the app and getting the appropriate ID’s and such for the Twitter API Keys and Secrets.
9:12 I open up the code base, and review where the previous sessions got us to. Using Cobra w/ Go, parsing and refactoring that was previously done.
10:30 Here I talk about configuration again and the specifics of getting it setup for running the application.
12:50 Talking about the fatal panic I was getting in Go. The dependency reference to GitHub for the application was different from what was in the application, so the panic didn’t show the code that was actually executing. I show a quick fix and move on.
17:12 Back to the Twitter API, using the go-twitter library. Here I review the issue, and what the fix was, for another problem I was having in the previous session with getting the active token. I thought the library handled it, but that wasn’t the case!
19:26 Now I step through creating a function to get the active OAuth bearer token to use.
28:30 After deleting much of the code that doesn’t work from the last session, I go about writing the code around handling the retrieval of Twitter results for various passed in Twitter Accounts.

The bulk of the next section is where I work through a number of functions, a little refactoring, answering some questions from the audience/Twitch chat (working on a way to get that into the video!), fighting with some dependency tree issues, and a whole slew of silliness. Once that wraps up, I get some things committed into the GitHub repo and wrap up the core functionality of the Twitz application.

58:00 Reviewing some of the other examples in the go-twitter library repo. I also do a quick review of the other function calls from the library that take action against the Twitter API.
59:40 I review and merge one of the PRs I submitted to the project itself, which adds documentation and a build badge to the README.md.
1:02:48 Here I add some more information about the configuration settings to the README.md file.

1:05:48 The Twitz page is now updated: https://adron.github.io/twitz/
1:06:48 Setup of the continuous integration for the project on Travis CI itself: https://travis-ci.org/Adron/twitz
1:08:58 Setup of the actual travis.yml file for Go. After this I go through a few stages of troubleshooting getting the build going, with some whitespace issues in the ol’ YAML file and such. Including also, the famous casing issue! Ugh!
1:26:20 Here I start a wrap up of what is accomplished in this session.

NOTE: Yes, I realize I spaced out and forgot the feature where I export to Apache Cassandra. Yes, I will indeed have a future stream where I build out the part that exports the responses to Apache Cassandra! So subscribe, stay tuned, and I’ll get that one done ASAP!!!

1:31:10 Further CI troubleshooting as one build is green and one build is yellow. More CI troubleshooting! Learn about the travis yaml here.
1:34:32 Finished, just the bad ass outro now!
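For anybody following along with the CI setup at 1:08:58, a minimal .travis.yml for a Go project looks roughly like this (the Go version here is just an example from that era, not necessarily what the repo pins):

```yaml
# Minimal Travis CI config for a Go project (example version pin).
language: go

go:
  - "1.11.x"

install:
  - go get -t ./...

script:
  - go build ./...
  - go test -v ./...
```

Note the lowercase `go` for the language key and the consistent two-space indentation — the casing and stray whitespace are exactly the kinds of things that quietly break a build, as I found out on stream.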

The Codez

In the previous posts I outlined two specific functions that were built out:

  • Part 1 – The config function for the twitz config command.
  • Part 2 – The parse function for the twitz parse command.

In this post I focused on updating both of these and adding additional functions for the bearer token retrieval for auth and ident against the Twitter API and other functionality. Let’s take a look at what the functions looked like and read like after this last session wrap up.

The config command basically ended up being 5 lines of fmt.Printf functions to print out pertinent configuration values and environment variables that are needed for the CLI to be used.


var configCmd = &cobra.Command{
    Use:   "config",
    Short: "A brief description of your command",
    Long: `A longer description that spans multiple lines and likely contains examples
and usage of using your command. For the custom example:
Cobra is a CLI library for Go that empowers applications.
This application is a tool to generate the needed files
to quickly create a Cobra application.`,
    Run: func(cmd *cobra.Command, args []string) {
        fmt.Printf("Twitterers File: %s\n", viper.GetString("file"))
        fmt.Printf("Export File: %s\n", viper.GetString("fileExport"))
        fmt.Printf("Export Format: %s\n", viper.GetString("fileFormat"))
        fmt.Printf("Consumer API Key: %s\n", viper.GetString("consumer_api_key")[0:6])
        fmt.Printf("Consumer API Secret: %s\n", viper.GetString("consumer_api_secret")[0:6])
    },
}

config.go

The parse command changed a fair bit. A good amount of the functionality I refactored out into the buildTwitterList(), exportFile, and rebuildForExport functions. buildTwitterList() I put in the helpers.go file, which I’ll cover a little later. But in this file, which could still use some refactoring I’ll get to, I have several pieces of functionality: the export-to-format functions, and the if/else if logic of the exportParsedTwitterList function.


var parseCmd = &cobra.Command{
    Use:   "parse",
    Short: "This command will extract the Twitter Accounts form a text file.",
    Long: `This command will extract the Twitter Accounts and clean up or disregard other characters
or text around the twitter accounts to create a simple, clean, Twitter Accounts only list.`,
    Run: func(cmd *cobra.Command, args []string) {
        completedTwittererList := buildTwitterList()
        fmt.Println(completedTwittererList)
        if viper.Get("fileExport") != nil {
            exportParsedTwitterList(viper.GetString("fileExport"), viper.GetString("fileFormat"), completedTwittererList)
        }
    },
}

func exportParsedTwitterList(exportFilename string, exportFormat string, twittererList []string) {
    if exportFormat == "txt" {
        exportTxt(exportFilename, twittererList, exportFormat)
    } else if exportFormat == "json" {
        exportJson(exportFilename, twittererList, exportFormat)
    } else if exportFormat == "xml" {
        exportXml(exportFilename, twittererList, exportFormat)
    } else if exportFormat == "csv" {
        exportCsv(exportFilename, twittererList, exportFormat)
    } else {
        fmt.Println("Export type unsupported.")
    }
}

func exportXml(exportFilename string, twittererList []string, exportFormat string) {
    fmt.Printf("Starting xml export to %s.", exportFilename)
    xmlContent, err := xml.Marshal(twittererList)
    check(err)
    header := xml.Header
    collectedContent := header + string(xmlContent)
    exportFile(collectedContent, exportFilename+"."+exportFormat)
}

func exportCsv(exportFilename string, twittererList []string, exportFormat string) {
    fmt.Printf("Starting txt export to %s.", exportFilename)
    collectedContent := rebuildForExport(twittererList, ",")
    exportFile(collectedContent, exportFilename+"."+exportFormat)
}

func exportTxt(exportFilename string, twittererList []string, exportFormat string) {
    fmt.Printf("Starting %s export to %s.", exportFormat, exportFilename)
    collectedContent := rebuildForExport(twittererList, "\n")
    exportFile(collectedContent, exportFilename+"."+exportFormat)
}

func exportJson(exportFilename string, twittererList []string, exportFormat string) {
    fmt.Printf("Starting %s export to %s.", exportFormat, exportFilename)
    collectedContent := collectContent(twittererList)
    exportFile(string(collectedContent), exportFilename+"."+exportFormat)
}

func collectContent(twittererList []string) []byte {
    collectedContent, err := json.Marshal(twittererList)
    check(err)
    return collectedContent
}

func rebuildForExport(twittererList []string, concat string) string {
    var collectedContent string
    for _, twitterAccount := range twittererList {
        collectedContent = collectedContent + concat + twitterAccount
    }
    if concat == "," {
        collectedContent = strings.TrimLeft(collectedContent, concat)
    }
    return collectedContent
}

func exportFile(collectedContent string, exportFile string) {
    contentBytes := []byte(collectedContent)
    err := ioutil.WriteFile(exportFile, contentBytes, 0644)
    check(err)
}

parse.go

Next up after parse, it seems fitting to cover the helpers.go file code. First I have the check function, which simply wraps the routinely copied error-handling code snippet. Check out the file directly for that. Then below that I have the buildTwitterList() function, which gets the config setting for the name of the file to open and parse for Twitter accounts. The code reads the file, splits the contents of the text file into fields, then steps through and parses out the Twitter accounts. This is done with a regex (I know, I know, now I have two problems, but hey, this is super simple!). It basically finds fields that start with an @, verifies they’re alphanumeric (with a possible underscore), and removes any unnecessary characters from those fields. It wraps all that up by putting the fields into a string slice and returning it to the calling code.


func buildTwitterList() []string {
	theFile := viper.GetString("file")
	theTwitterers, err := ioutil.ReadFile(theFile)
	check(err)
	stringTwitterers := string(theTwitterers[:])
	splitFields := strings.Fields(stringTwitterers)
	var completedTwittererList []string
	for _, aField := range splitFields {
		if strings.HasPrefix(aField, "@") && aField != "@" {
			reg, _ := regexp.Compile("[^a-zA-Z0-9_@]")
			processedString := reg.ReplaceAllString(aField, "")
			completedTwittererList = append(completedTwittererList, processedString)
		}
	}
	return completedTwittererList
}

The next function in the helpers.go file is the getBearerToken function. This was a tricky bit of code. This function takes in the consumer key and secret from the Twitter app (check out the video at 5:20 for where to set it up). It returns a string and an error – an empty string if there’s an error – as shown below.

The code starts out by establishing a POST request against the Twitter API, asking for a token and passing the client credentials. It catches an error if that doesn’t work out; otherwise the code sets up the b64Token variable with the standard base64 encoding of the key and secret. After that the request has its headers built with the needed Authorization and Content-Type fields, then the request is made with http.DefaultClient.Do(req). On error, an empty string and the error are returned. Next up is the defer to ensure the response body is closed when everything is done.

Next up the JSON result is parsed (unmarshalled) into the v struct, which I now realize, as I write this, I probably ought to rename to something that isn’t a single letter. But it works for now, and v has the pertinent AccessToken variable which is then returned.


func getBearerToken(consumerKey, consumerSecret string) (string, error) {
	req, err := http.NewRequest("POST", "https://api.twitter.com/oauth2/token",
		strings.NewReader("grant_type=client_credentials"))
	if err != nil {
		return "", fmt.Errorf("cannot create /token request: %+v", err)
	}
	b64Token := base64.StdEncoding.EncodeToString(
		[]byte(fmt.Sprintf("%s:%s", consumerKey, consumerSecret)))
	req.Header.Add("Authorization", "Basic "+b64Token)
	req.Header.Add("Content-Type", "application/x-www-form-urlencoded;charset=UTF-8")
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return "", fmt.Errorf("/token request failed: %+v", err)
	}
	defer resp.Body.Close()
	var v struct {
		AccessToken string `json:"access_token"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&v); err != nil {
		return "", fmt.Errorf("error parsing json in /token response: %+v", err)
	}
	if v.AccessToken == "" {
		return "", fmt.Errorf("/token response does not have access_token")
	}
	return v.AccessToken, nil
}

Wow, ok, that’s a fair bit of work. Up next, the findem.go file and related function for twitz. Here I start off with a few informative prints to the console just to know where the CLI has gotten to at certain points. The Twitter list is put together, reusing that same function – yay code reuse, right! Then the access token is retrieved. Next up the HTTP client is built, the Twitter client is initialized with it, and the user lookup request is sent. Finally the users are printed out, followed by a count of how many users were found.


var findemCmd = &cobra.Command{
	Use:   "findem",
	Short: "A brief description of your command",
	Long: `A longer description that spans multiple lines and likely contains examples
and usage of using your command. For example:
Cobra is a CLI library for Go that empowers applications.
This application is a tool to generate the needed files
to quickly create a Cobra application.`,
	Run: func(cmd *cobra.Command, args []string) {
		fmt.Println("Starting Twitter Information Retrieval.")
		completedTwitterList := buildTwitterList()
		fmt.Printf("Getting Twitter details for: \n%s", completedTwitterList)
		accessToken, err := getBearerToken(viper.GetString("consumer_api_key"), viper.GetString("consumer_api_secret"))
		check(err)
		config := &oauth2.Config{}
		token := &oauth2.Token{AccessToken: accessToken}
		// OAuth2 http.Client will automatically authorize Requests
		httpClient := config.Client(context.Background(), token)
		// Twitter client
		client := twitter.NewClient(httpClient)
		// users lookup
		userLookupParams := &twitter.UserLookupParams{ScreenName: completedTwitterList}
		users, _, _ := client.Users.Lookup(userLookupParams)
		fmt.Printf("\n\nUsers:\n%+v\n", users)
		howManyUsersFound := len(users)
		fmt.Println(howManyUsersFound)
	},
}

Findem.go

I realized, just as I wrapped this up, that I completely spaced on the Apache Cassandra export. I’ll have that post coming soon and will likely do another refactor to get the output into a more usable state before I call this one done. But the core functionality, setup of the environment needed for the tool, the pertinent data and API access, and other elements are done. For now, that’s a wrap. If you’re curious about the final refactor and the Apache Cassandra export, then subscribe to my Twitch @adronhall and/or my YouTube channel ThrashingCode.

UPDATED SERIES PARTS

    1. Twitz Coding Session in Go – Cobra + Viper CLI for Parsing Text Files
    2. Twitz Coding Session in Go – Cobra + Viper CLI with Initial Twitter + Cassandra Installation
    3. Twitz Coding Session in Go – Cobra + Viper CLI Wrap Up + Twitter Data Retrieval (this post)

     

Conflicts of Building a Real World Example Application Starter Kit

Lately I’ve dug through a number of JavaScript user interface frameworks and read a number of posts, building a more informed opinion – all to decide which one I should use for a sample application for some starter kits. One post I read hit home that it does need to be a bit more complex than a todo app.

However, I’m still starting with a todo app anyway, but it’s going to turn into something else that is much more than a mere todo app. In this post I’m going to write up some of those larger plans and what complexities lie in wait – dragons are indeed there – for this more extensive real world app.

Modernizing Real World US Passenger Rail Ticket Sales!

Ok, I picked this topic since it is one of the things I find frustrating in the United States. The passenger rail systems – pretty much all of them – are barely better than those in many developing countries, let alone other developed nations. One of the elements the United States falls far behind on is an effective, efficient, accurate, and useful ticketing and seat assignment system. Let’s talk about this particular problem for a moment and you’ll start to visualize the problems that exist with the current system.

The Problem(s): Train Seating Options

Siemens Charger engine waiting with Talgo train.

Getting people on and off of a transport system like a train, airplane, ferry, or other mode of transport isn’t a simple process. However, many times it doesn’t have to be as complex, wrought with error, confusion, or disarray as it often is in the United States. Let’s step back and focus on one particular set of trains: the four trains that leave from King Street Station in Seattle, Washington on an almost daily basis.

  1. Sound Transit Sounder – [Stations] [Fares] [Wikipedia] This is a commuter route that has two lines:
    1. North Line – Seattle to Everett.
    2. South Line – Seattle to Tacoma, then onward to Lakewood.
  2. Amtrak Cascades – [Wikipedia] Seattle is one of the major stops on the Cascades route, which starts in Eugene down in Oregon and traverses all the way into Canada to Vancouver.
  3. Amtrak Empire Builder – [Wikipedia] This is one of the two Superliner cross country overnight trains that leave Seattle. It connects in Spokane with a sister train from Portland every day, then the combined train travels all the way to Chicago!
  4. Amtrak Coast Starlight – [Wikipedia] This is one of the other Superliner cross country overnight trains. It departs from Seattle, travels south with a number of stops and eventually ends in Los Angeles.

These four trains use specific train equipment with particular accommodations for ticket sales.

One of the Amtrak Superliner Coach Car’s seating layout. (Images found here)

The Sounder provides tickets via the Sound Transit system in the area; it’s a relatively cheap, non-reserved-seat, heavily used train. Often there’s standing room only. If one could purchase a ticket and know whether they’re getting a seat, or whether the train is full, that would encourage or discourage use accordingly. Currently, you buy a ticket and just get on. Rarely are tickets even checked, there is no gated entry; it’s basically a free-for-all.

The Amtrak Cascades are a reserved seat system. You purchase a ticket with the contract agreement that you will be provided a seat – either business class or regular – upon boarding. Emphasis on upon boarding, as this can cause great confusion when entering the station and attempting to determine how to pick up these seat assignments even though you’ve already purchased a ticket. It adds time to boarding, requires the train to sit waiting longer, and passengers have to arrive much earlier than the train departure. Albeit, for context, this earlier arrival (~20-30 minutes before) is nothing compared to the horrors of airports (2 hour suggested arrival before departure); it’s still unnecessary if modern systems were used to provide a streamlined and more efficient boarding process.

Amtrak Empire Builder

The Amtrak Empire Builder and Coast Starlight are currently an interesting mix. Both trains have sleeping accommodations that give a reserved room number before boarding. A very efficient process indeed, something to aim for. Since one knows the car number and room number, one could theoretically just board without even being guided. The rest of the seats however, some 200-300 or more of them depending on the train, are reserved seats, albeit one doesn’t receive the seat assignment until arriving at the station. Again, causing unnecessary chaos.

The Problem(s): Technology Deeper Dive

Problem: Passenger Navigation to Seat Reservation

Amtrak Cascades Bistro

Every single one of the trains listed above – Amtrak Empire Builder, Amtrak Coast Starlight, Amtrak Cascades, and Sound Transit Sounder – has some similar characteristics that would make it cheap and relatively easy to implement a ticketing and seat reservation system. In all of the train equipment, whether Sounder Bombardier cars, Superliners, or Talgo Amtrak Cascades sets, there are seat numbers and car numbers. This provides us a core basis from which to work, making all of this processing much easier.

At each station where these trains stop, each car of each train stops at a particular point – or could be made to stop at a particular point. The Sounder trains, for example, all have floor mats at the station that read “Welcome Aboard”! This is another element we could use to navigate to a particular seat reservation, automating the process of not just assigning a seat, but printing on each ticket where and exactly when each passenger should arrive at a particular point at the station.

Since the cars and stations all have known characteristics – where each train will arrive and depart from, and which car number and door position will be where – this can all be automated per train. It is a repeatable process, something that easily meets the exact definition of why we build computer systems and automate things with them!
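To make that concrete, here’s a hypothetical sketch of modeling those known characteristics in Go. Every type and field name here is my own invention for illustration, not from any real ticketing system:

```go
package main

import "fmt"

// Car models one piece of equipment: a number, where its door lines up
// on the platform (e.g. which "Welcome Aboard" mat), and its seats.
type Car struct {
	Number       int
	DoorPosition string
	Seats        []string
}

// Train is simply a named consist of cars.
type Train struct {
	Name string
	Cars []Car
}

// platformPoint answers the passenger-navigation question: given a car
// number on a ticket, where on the platform should the passenger stand?
func (t Train) platformPoint(carNumber int) (string, bool) {
	for _, c := range t.Cars {
		if c.Number == carNumber {
			return c.DoorPosition, true
		}
	}
	return "", false
}

func main() {
	t := Train{
		Name: "Cascades 501",
		Cars: []Car{
			{Number: 1, DoorPosition: "mat A"},
			{Number: 2, DoorPosition: "mat B"},
		},
	}
	if p, ok := t.platformPoint(2); ok {
		fmt.Println(p) // prints "mat B"
	}
}
```

With a model like this, printing the exact boarding point on each ticket is just a lookup, which is the repeatable, automatable process described above.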

Problem: Equipment Changes, Modifiable Trains

Sometimes I’ve had conversations about what might change within the system. Almost all changes within a rail system are well known, from a disaster all the way to a simple everyday equipment change. For example, the arriving train may have an extra coach car or sleeper car on the Coast Starlight for some reason. Since we can build a system to model the specific vehicles, and the vehicle numbers on a train can easily be set, these changes can extrapolate out to tickets so they can be accurately reassigned by a computer the day of. Changing equipment may take multiple minutes in the rail yard, but in the computer it’s a few keystrokes and it’s done. All tickets reassigned, everything rebalanced; it’s almost as magical as a distributed database.
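The “rerun it when the consist changes” idea can be sketched as a deterministic assignment function. Again, all names and the dealing strategy here are purely illustrative assumptions, not a real system:

```go
package main

import "fmt"

// assignSeats deals ticket holders into the available cars in order.
// Because it's deterministic over the consist, an equipment change
// (adding or dropping a car) is handled by simply rerunning it.
func assignSeats(carNumbers []int, seatsPerCar int, tickets []string) map[string]string {
	assignments := make(map[string]string)
	i := 0
	for _, car := range carNumbers {
		for seat := 1; seat <= seatsPerCar && i < len(tickets); seat++ {
			assignments[tickets[i]] = fmt.Sprintf("car %d seat %d", car, seat)
			i++
		}
	}
	return assignments
}

func main() {
	tickets := []string{"T-100", "T-101", "T-102"}
	// Original consist: one coach with two seats; T-102 goes unassigned.
	fmt.Println(assignSeats([]int{11}, 2, tickets)["T-102"])
	// An extra coach shows up in the yard? Rerun, everything rebalances.
	fmt.Println(assignSeats([]int{11, 12}, 2, tickets)["T-102"]) // prints "car 12 seat 1"
}
```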

Problem: Common Concurrency, Purchasing, and Related Issues

There are also a number of issues a proper ticketing and reservation system would have to cover, such as managing for multiple people attempting to buy the same seat at relatively the same time. A locking and concurrency mechanism will be needed, something that’s been solved before, so appropriate planning around this will solve the issue.

There are of course timing issues too: once a ticket is locked, eventing within the system should unlock it appropriately. These event-based timers will be an interesting challenge too – solved already, but fun that they’ll need to be solved again specifically for this system!

Problem: Or Feature “See a Mountain”?

Aerial view of mount Rainier

Some other things I’ve pondered include selling some seats as choice preferences. For example, the Empire Builder, Coast Starlight, and Cascades each have specific views that are easier or harder to see depending on which side of the train the accommodations are on. If you’re facing west on the Coast Starlight, you get all of the ocean views in southern California. If you’re on the east side, you get views of all the mountains like Rainier (see above picture!) and even Shasta if there is a full moon. Depending on these views and related characteristics, I’d happily pay a few bucks more to ensure I get a specific assignment or get to pick one, so why not offer the ability to choose the seat for a specific fare?

The Puget Sound, traveling north out of Seattle on the Amtrak Cascades or Sound Transit Sounder north line.

Summary  & Next Steps

Summary – This is post one of many about the very distributed nature of purchasing tickets for one of the trains into and out of the city. Compared with my todo app, this will definitely provide a very real world application option indeed! As soon as I wrap up the initial todo app samples – just to get started and provide details on how to get started – I’m going to move on to building a real, real world application sample, so real that it could be implemented by Sound Transit, Brightline, Virgin Rail, SNCF’s TGV, Germany’s ICE, or even good ole’ Amtrak here in the United States.

Next Steps – Next up I’m going to finish up the todo applications, with the notion that they provide some starting points for people but also for this more complex real world application. I’ll also add some more details and thoughts, and would love to converse, discuss, take contributions, or co-hack on this project. Maybe you’ll join me – onward, and may you enjoy this flanged wheel ride and code slinging adventure!

The State of JavaScript Frameworks 2019 – A researched list of the top 5 appearing…

Quick SITREP

(If you just want the meat of the tech, skip to the “Framework Cookoff” or “Summary and Victor” section.)

I’m on vacation, so of course I’m writing and reviewing frameworks to write some code!

🤓

An Aside: Hey, if you’re on vacation and not writing code, more power to you. Gotta keep work and one’s personal life separated in as healthy a way as possible! My life however involves a lot of hobbies, that I could do professionally, and one of those hobbies I do indeed do professionally: write code, design, and understand domains to build and implement solutions for people and organizations of people!

A few days ago I decided I’d create a few reference applications for development against distributed database systems, using Apache Cassandra and whatever else. That left one of the first priorities to figure out which UI framework to use, or which to not use, or whether to use any at all. The application I intend to build out is a simple todo list, so nothing extravagant.

I sat down at Ballard Coffee Works today (that’s part of Seattle Coffee Works, they’re pretty rad) to get started on this. The first thing I needed to do however was figure out what the hell is going on with the state of the user interface realm of the JavaScript universe. What are my options? Are we all still just going on about React? Is Angular dead yet? What’s this Vue.js dealio? Does everybody use Bootstrap underneath all of this or has something else taken over?

Framework Cookoff!

Fire up the cooker, time to rip these frameworks apart and see which one is going to be the choice option for this task!

Bootstrap

URI: https://getbootstrap.com/

Project Self-Description:

Build responsive, mobile-first projects on the web with the world’s most popular front-end component library.

Bootstrap is an open source toolkit for developing with HTML, CSS, and JS. Quickly prototype your ideas or build your entire app with our Sass variables and mixins, responsive grid system, extensive prebuilt components, and powerful plugins built on jQuery.

My 2 Cents:

As I read the description on the website I immediately noticed “jQuery” and couldn’t recall how many times I’ve heard it’s dead, had people swear one ought not to use it today, and generally imply one shouldn’t use it. That leaves me curious whether Bootstrap really is still the most common open source toolkit for developing sites these days. Is it, or is it not?

The description overall is accurate however. It is indeed focused around providing an interface design standard with HTML, CSS, and JS (JavaScript). Being able to prototype with it is very fast, and setup is something that can be done quickly even with manual setup. In other words, I don’t have to do a magical npm install and hope that everything just sets itself up. That’s a plus in my book, since I like to know the actual working parts of what I intend to and want to use.

There’s also a ton of themes for Bootstrap, which I’m always excited about. Any way to get something I can reskin with ease is a huge plus one in my book. I keep interfaces pretty simple, and aim to keep user experience uncluttered, so being able to reskin an application quickly is always like an ice cream treat on a hot day for me.

Years ago I went through the Bootstrap introduction, and I reviewed it again for this article. It appears that it is as straightforward and barebones as it was years ago. Another plus in my book. Of course, going through the docs and getting some quality RTFM time in, I noticed there’s an npm install bootstrap these days. Why wouldn’t there be! The beautiful thing however was that I ran this (cuz’ of course I have node.js and npm installed!) and it set things up in a pretty standard way which I immediately understood from past experience. That builds confidence that the framework has been consistent over the years. Another advantage for what I’m aiming to do.

This framework might just be the simple thing I’m looking for. Maybe combined with Backbone (keep reading, I did indeed stumble back into good ole’ Backbone). I’ll return after a review of all the frameworks I dig up, with a victor at the end of this post!

Angular

URI: https://angular.io/

Project Self-Description:

Angular is a platform and framework for building client applications in HTML and TypeScript. Angular is written in TypeScript. It implements core and optional functionality as a set of TypeScript libraries that you import into your apps.

The basic building blocks of an Angular application are NgModules, which provide a compilation context for components. NgModules collect related code into functional sets; an Angular app is defined by a set of NgModules. An app always…

…ok, enough of that. Basically the Angular website seemed to not have a concise description anywhere and instead just leapt directly into descriptions of the architecture. Maybe I missed it, maybe I didn’t. Upon review however that was enough to lead me down a path of determination.

I was going to review it more significantly, but I see that greenfield use and existing application use have kind of just trailed off into something about turtles all the way down. With that, and the fact I’ve still never used it – somehow I managed to entirely skip that phase of JavaScript trends – I’ll just leave it where it rests and move on. It appears, based on trends, you may want to do the same thing.

React

URI: https://reactjs.org/

Project Self-Description:

Declarative – React makes it painless to create interactive UIs. Design simple views for each state in your application, and React will efficiently update and render just the right components when your data changes.

Declarative views make your code more predictable and easier to debug.

Component-Based – Build encapsulated components that manage their own state, then compose them to make complex UIs.

Since component logic is written in JavaScript instead of templates, you can easily pass rich data through your app and keep state out of the DOM.

Learn Once, Write Anywhere – We don’t make assumptions about the rest of your technology stack, so you can develop new features in React without rewriting existing code.

React can also render on the server using Node and power mobile apps using React Native.

My 2 Cents:

Ohhhhhh yeah, React is still ridiculously popular among polls and actual measured use on GitHub and other sites. But do I want to delve into this thing? Is it overkill for this project? Well, first off, since usage is one of the very important criteria – I hope it will continue to be a standard for some time – React is one of the top options. But I needed to research and confirm. After just a few minutes, reading about a half dozen articles, it was easy to confirm React still holds a mantle among the most popular frameworks. It appeared in numerous tutorials and in almost every recent post listing popular frameworks! So that’s a good start.

I do like various characteristics of React but getting it put into place on projects, it just seems like it starts to add a lot of unnecessary complexity for simple apps (especially like a todo app). In addition, it does require additional knowledge and routine RTFMing beyond the standard HTML, CSS, and JavaScript RTFMing. This could mean I might end up hacking through docs more than I do contributing to and putting together the actual todo application. But that user base which is familiar with this beast of a framework is wildly huge! So maybe I just buck up and put in the effort?

I do like a number of things about the framework, and even though I’m definitely all about the data, maybe this framework would be worth the investment in time to get better at? I checked out the tutorial in an effort to help make a decision.

First off, I gotta say I’m not a big fan – and I know the industry currently appears to be against me on this – of having XML and HTML jammed inside of JavaScript that’s jammed inside of HTML pages that’s jammed in well… ugh. So it’s turtles all the way down. This basically looks like Classic ASP (Wikipedia) from Microsoft circa 1998, which is NOT hyperbole! Ok, enough of that point.

As I worked through the tutorial you’ve got things like a class, with HTML shoved into it like this.


class ShoppingList extends React.Component {
  render() {
    return (
      <div className="shopping-list">
        <h1>Shopping List for {this.props.name}</h1>
        <ul>
          <li>Instagram</li>
          <li>WhatsApp</li>
          <li>Oculus</li>
        </ul>
      </div>
    );
  }
}

reactStuff.js

Now, this might be some hellish looking merging of code and configuration, but it is kind of slick too. You’ve got what is referred to as a class, which in a way also acts as the instantiated object; based on object oriented paradigms, it’s like a class that is also the actual instantiated object. It kind of simplifies creating a component – component being the word describing both the class and the object instantiation here, which by association refers to this being something that is rendered. Ok, all that sounds kind of complex, because conceptually it kind of is, but when you cram through enough examples it eventually starts to make sense.

If my description and comparison to object oriented programming paradigms is confusing, here’s the tutorial doc’s description, “Here, ShoppingList is a React component class, or React component type. A component takes in parameters, called props (short for “properties”), and returns a hierarchy of views to display via the render method.”

I’m not sure that helps more, being there are redundant descriptions of React component type and React component class being referenced as the same thing. Not sure I’d have made a naming decision like that, but one has to manage these definitions mentally when using a framework like this.

Weighing these types of things I’ll need to learn and also re-remember, plus the factor of large and continued industry use, React is one of the top options. But still not the clear victor at this point. Let’s move on and knock out some more reviews.

Backbone

URI: http://backbonejs.org/

Project Self-Description:

Backbone.js gives structure to web applications by providing models with key-value binding and custom events, collections with a rich API of enumerable functions, views with declarative event handling, and connects it all to your existing API over a RESTful JSON interface.

My 2 Cents:

Wow. I haven’t heard a peep out of anybody using Backbone since…

…recollecting memories… whiiir buzz bing pop pop, blurp, ping, ping pong.

…sometime around 2010! Wow, I can’t even believe it’s still in use, but I will add: it was solid then and it’s still rather solid now! I dig Backbone. I gotta say, though, among all my Google kung-fu I found this an interesting one to come up as a result among user interface reviews and such. It’s not really user interface focused; it’s model-based binding with events and such, per the description from the site itself! I gave it a good whirl in testing, but let’s move on to the last framework I gave a shot at today.

Vue

URI: https://vuejs.org/

Project Self-Description:

Vue (pronounced /vjuː/, like view) is a progressive framework for building user interfaces. Unlike other monolithic frameworks, Vue is designed from the ground up to be incrementally adoptable. The core library is focused on the view layer only, and is easy to pick up and integrate with other libraries or existing projects. On the other hand, Vue is also perfectly capable of powering sophisticated Single-Page Applications when used in combination with modern tooling and supporting libraries.

My 2 Cents:

This seems to be the hot, and I mean like blue flame hot, framework to build a user interface in JavaScript these days. It’s actually blown past React even! I guess my 1.5 year old JavaScript knowledge is just out of step at this point! Err ma gawd! I took a dive into it and made some rather pleasant discoveries!

The first thing I ended up strolling through was the modern tooling and supporting libraries links.

First off I noticed jQuery had reared up in this framework too. Funny, it keeps popping back up in spite of being declared dead by a lot of people. But meh. Check it out: I clicked on one component and it immediately gave me some insight into how I might build an application with Vue.js.

This data table component HTML looks like this. I got a little excited that it was just plain old HTML.


<datatable>
  <datatable-column id="datatable-column-1" label="Column 1"></datatable-column>
  <datatable-column id="datatable-column-2" label="Column 2"></datatable-column>
  <datatable-column id="datatable-column-3" label="Column 3"></datatable-column>
</datatable>

Then for this type of datatable the docs show I could create a viewmodel wired up to bind the data into the datatable.


new Vue({
  el: "#app",
  data: function() {
    return {
      columns: [
        { id: "datatable-column-1", label: "Column 1" },
        { id: "datatable-column-2", label: "Column 2" },
        { id: "datatable-column-3", label: "Column 3" }
      ]
    };
  }
});

vuehello.js

Then the datatable itself.


<datatable>
  <datatable-column v-for="column in columns" :id="column.id" :label="column.label"></datatable-column>
</datatable>

Ok, now that’s what I’m talking about! A viewmodel, and prospectively a model somewhere that binds it or populates it via a push. I didn’t know where that might be, but just looking at this it looked clean, with a clearer separation of concerns than some of the other libraries, and no injected HTML or other magic black box type stuff like React has.

Nice, very nice. I liked this so much, I can see where the popularity is coming from, so research continued. Next I stepped into the introduction on the vue.js site. The first bit of code shown was this Hello.vue.


<script src="https://unpkg.com/vue"></script>
<div id="app">
  <p id="hello">{{ message }} hi!</p>
</div>
<script>
new Vue({
  el: '#app',
  data: {
    message: 'Hello Vue.js!'
  }
})
</script>
<style>
#hello {
  font-family: 'Avenir', Helvetica, Arial, sans-serif;
  text-align: center;
  color: #2c3e50;
  margin-top: 60px;
}
</style>

DatHello.vue

There at the start was the template area. It seemed self-explanatory, with an HTML “p” tag as the singular element and {{ and }} signifying what I assumed is a bindable value or variable, message. Then below that was clearly some JavaScript embedded in this “DatHello.vue” file thing: a new Vue instance bound to the #app element, with a data object holding the message “Hello Vue.js!”. Alright, this seems pretty absurdly simple. Is it really a full framework? Finally wrapping up this file is some CSS. It doesn’t even particularly seem relevant, except that I’m guessing the CSS encapsulates the characteristics that detail how this thing should appear once it is rendered, however that happens!

Alright, that does it. It’s time to decide.

Summary and Victor

I looked at a number of articles on usage (all referenced articles are below). One can see from the data collected that React keeps climbing and climbing – never blowing up as suddenly as Backbone once did – though upon closer inspection that doesn’t really measure exactly how many uses or users, just that it gained popularity along these curves. Vue.js on the other hand is crawling along in that data, but based on other data has blown up this last year or so.

Knowing this, the usage and prospective usage of these frameworks put Vue.js and React in the lead. After my re-review of the tutorials and where these frameworks are today, both were still really close to victory, even though I had a happier time dealing with Vue.js and its respective components. One other underlying thought sitting in my mind was the fact React was just kind of a bulkier beast in concepts, and burdened with its history and its origins in Facebook. Not that I really cared, nor was concerned all that much. But just seeing Facebook involved as the origin, and knowing its seedy methods of extracting profits, just seems like a nasty mess of spilled sour, nasty smelling, bad food sitting on the floor while your lovely dog licks away at it. Ok, that was graphic, my apologies, but just leaving that idea with you for serious consideration.

There was one more combination that might bring things together, and that’s the ease of pulling Bootstrap into things and then applying Vue.js or React together with Bootstrap or something similar. What exactly is the underlying method by which these user interface frameworks can be reskinned? I did a quick follow up, a ~20 minute exploration of skinning React components, what existing skins are out there, looked at some purchasable themes, and looked at the same for Vue.js. One site of note that I kind of dug was vuetifyjs. It looked nice. But here I was stuck deciding between these two.

The tie breaker was simple however. I did a search for todo apps with each respective framework, “vue.js todo apps” and “react todo apps” respectively. I reviewed them and both seemed to have plusses and minuses but then one app on Codepen, as simple as it is, sold me on a choice.

The victor is Vue.js. Just look at that Codepen sample, and look at the docs sample! It’s clean, simple, and minimalistic but the interaction is just smooth. Sure, React samples were decent, but it just seemed like more work, especially as a project grows. I’m sure some of my past history in industry has biased me somewhat, but Vue.js will work out just fine for me from the looks of things. Unless someone in this last minute can convince me otherwise! If you think I’ve made a horrible mistake, please tweet a twit at me on the twitters @Adron or message me via the ole’ contact form!

Resources – Past Articles I’ve Written:

Resources – Framework Articles: