Coding, WTF Twitter, Twitch FTW, Getting Shit Done, Twitch Hacks, Tips, Tricks, and One Excellent Jazz Influenced Tune

WTF Twitter

I’ve been doing a lot more coding, thanks largely to the discipline that Twitch has brought to my day. That’s almost surprising to me at this point, because Twitch started out for me much the way Twitter did. You see, at first I thought Twitter was the dumbest thing that had happened in ages. Arguably it’s come full circle and I kind of feel the same way about Twitter now, but in the decade in between (yes, Twitter is over 10 years old!) Twitter has brought me connection, opportunities, and so much more. I couldn’t have imagined a lot of what I’ve been able to pull together because of Twitter. It’s still useful in many ways for this, albeit I, like all of us, am at risk of suffering the idiocy of today’s politics and political cronies, and the dog-piling trash heap that follows them onto Twitter.

I’m not leaving Twitter any time soon, but I’ve definitely put it on a very short leash and limited what impact it does or doesn’t have in my day to day flow.

Twitch FTW

Amazingly, however, a new tool that is both social and productive – not that it intended to be both – has come into being: coding on Twitch. Don’t get me wrong, I game, I just don’t game socially or on Twitch; what I do is code on Twitch, with a fair dose of hacking, breaking things, and then figuring out how to make them work. Along the way I, along with others, have created a pretty excellent developer community there on Twitch, and it seems to be growing all the time too. Twitch, at this point, has become a focal point that has the benefits of Twitter without all the annoying garbage Twitter brings these days, while adding the vast and hugely important fact that I can do things, be productive, chit chat, and generally get shit done all while I’m streaming.

VidStreamHacking

@ https://github.com/Adron/VidStreamHacking

With that, let’s talk about some of the recent notes and information I’ve been working on putting together to make Twitch even more useful. My first motive with this was to keep track of all the things I was doing, hardware I was putting together, and related things, but then another purpose grew out of all this note taking. It became obvious that this repository of information could be useful for other people. Here’s a survey of the things that I’ve added so far, hope they’re helpful to those of you digging into streaming out there!

I added some badges to identify various elements of information about the repo in the README.md.

badges

Is it maintained, yup, contributors, so far just me, zero issues filed but please feel free to add an issue or two, markdown yup, and there is indeed a Trello Board! The Trello Board is key to insight and inspection into what I’ve got going on in a number of my repositories. It’s where I’m keeping track of all the projects, what’s next, and what’s up in queue for the blog (this one right here) – at least in the context of the big code-heavy posts or video reviews of sessions with code, extra commentary, and related content. If you want to get involved in any of the repos just let me know and I’m happy to walk through whatever and even get you added to the Trello board so we can work together on code.

Streaming Gear

https://github.com/Adron/VidStreamHacking/blob/master/hardware.md

My main machine is now a Dell XPS 15, which I fought through getting Linux running on, and now that I have it’s been an absolutely stellar machine. I’ve also added additional monitor and port replicator/docking station gear to make it even more usable. The actual details are listed in the repo under the Dell XPS 15 item on the hardware page.

Along with the XPS 15 I wrote up coverage of the unboxing via video and blog entry. After a few weeks I also wrote up the conflict I had getting Linux running and removing Windows 10. In addition to the XPS 15 though I do use a MacBook from 2015 as my primary Mac machine, with an iMac from 2013 available as backup. Both machines are still resoundingly solid and performant enough to get the job done. Rounding out my fleet of machines is a Dell XPS 13 (covered here and here with the re-review).

For screens I have one at my office and one at home. They’re almost the same thing: ultra-widescreen, curved LG displays running at 3880x1440 resolution. These make keeping an eye on chat, OBS, and all sorts of other monitoring while coding, gaming, or whatever a breeze!

Ex 1: Just viewing a giant OBS view to get everything sorted out before starting a stream.
Ex 2: OBS w/ VM running w/ Twitch chat, dashboard etc to the right. This way I can work, see the stream, and see chat and such all at the same time.

The docking stations and/or port replicators, or whatever one calls these things these days, also bring all of this tech together for me. There’s a couple I have tried and retired already (unfortunately, cuz dammit that cost some money!) and others that I use in different scenarios.

My main docking station contraption – shout out to James and others for suggesting the CalDigit TS3. I got to this docking station by way of the Dell TB16, which for Linux, and kind of for Windows, is an unstable mess. Awesome potential if it worked, but it doesn’t, so I tried out this USB-C pluggable option (in the tweet) which had HDMI that was unfortunately limited in resolution. Having a widescreen made this one – albeit super compatible with Linux – unusable too. So I finally upgraded to the CalDigit TS3 and WOW, the CalDigit is super seriously wickedly bad ass. Extra USB-C ports, USB 2/3 ports, power, and more all rolled into one. It even supplies some power to the laptop, however I keep it plugged in since it’s kind of a power hog when the processor starts chomping!

After trying out this USB-C pluggable (the tweet) I got the CalDigit into play. It’s really, really good; here’s a shot of it from various angles with the extensive cables that I don’t have to plug into my laptop anymore. Out of this also runs a 28-port powered USB hub, no picture, but just know I’ve got a crazy number of devices I routinely like to use!

That’s my main configuration when using the ultra widescreens and all. Good setup there, very usable, and the 32GB of memory in the laptop really gets put to use in this regard. As for storage, that’s another thing. I’ve got 1 TB in my laptop and another 1 TB in a USB-C Thunderbolt Samsung drive which is practically as fast for most things. So much so that I attach it via the TS3 over USB-C and it’s screaming fast and adds that extra storage. So far I’ve primarily been using it to store all of my virtual machines or as video storage while I do edits.

There’s other gear too, check out the list, like the Rode Podcoster and other things. But that gear I’ll elaborate on some other time.

Meetup Streaming Gear

https://github.com/Adron/VidStreamHacking/blob/master/meetup-streaming-kit-gear.md

Another effort I’ve undertaken is recording meetups. To do this one needs to be able to stream things with several screens combined – i.e. picture in picture and all. To do this, one needs a camera that can focus on the speaker, ideally at least 1080p with at least some ability to work in less than ideal light. Then next to that, a splitter and capture card to get the slides! Once all those pieces come together, with a little OBS finesse one can get a pretty solid single pass recording of a meetup. An example of one of my better attempts was the last meetup “Does the Cloud Kill Open Source” with Richard Seroter. If you take a look at past talks in the Meetups Playlist you can see my iterative progress from one meetup to another!

Here’s the specific gear I’m using to get this done. At least, so far, and if and when it becomes financially reasonable I might upgrade some of the gear. It largely depends on what I can get more use out of beyond just streaming meetups.

Cords and Splitter – I picked up a selection of lengths and types so that I’d have wiring options for the particular environments the meetups would be located in. Generally speaking 25ft seems to be a safe maximum for HDMI. I’ve been meaning to check out the actual specification on it, but for now it’s more than enough regardless.

The splitter wasn’t expensive at all ($16.99), and kind of surprised me considering the costs of the cables. Picture to the right, or above, or somewhere depending on mobile layout.

I needed capture cards for this, one for the line out of the splitter that would capture the slides. The first I picked up based on suggestions focused on quality, and that was the Avermedia Extreme Cap HDMI to USB 3 Capture Card. It’s really solid for higher resolutions and related capabilities. The USB 3.0 HDMI HD Game Video Capture Card I picked up based on price (it’s almost 1/3rd of the price) rather than any particular focus on quality. However, now that I’ve used both, they’re both capable and seem fine, so I might have been able to just buy two of the cheaper options.

The camera – ideally I’d have a much higher quality one, but the Canon VIXIA HF R800 Camcorder has actually worked excellently. It’s a little less feature rich for audio out and related things, but it zooms in well and can record at the same time I’m getting the cam feed into the stream, so it’s always a nice way to have a backup of the talk.

The last, and one of the most important aspects is getting good audio.

Streaming Meetups

https://github.com/Adron/VidStreamHacking/blob/master/meetup.md

At first I made the mistake of thinking that just the gear would be enough, but holy smokes there were about a million other things I needed to write down. I created meetup.md to get the list going.

Jazz Influence Amidst the Heaviness!

As promised. Some music, not actually jazz, but heavily influenced by some jazz, progressive instrumentation, and esoteric, expansive, exquisite playing skills by the band. As always, be prepared. My music referrals aren’t always gentle! Happy code streaming!

Adding and Returning Value to the Community via Twitter, LinkedIn, and Twitch

Twitter

Goal: Grow our follower count and reach, entertain, laugh, make hot takes – as one does on Twitter, educate, and get value out of it ourselves.

Don’t!

  • Don’t buy followers (i.e. don’t pay anybody that promises X followers, market share, or whatever it is they’re selling). We can’t trust this method as it’s often just a pile of Russian bots or other garbage followers. This does nothing to increase visibility and penetration to those that want, are interested in, or need to communicate with us (i.e. customers and fans).
  • Do not just repost things via RT or use tooling to just post arbitrary things. People notice this and won’t follow or will unfollow you. It’s a sure fire way to be blacklisted as *marketing*, which effectively means zero eyeballs, even when the account statistics keep showing people see it.
  • Do not post identical or similar content one tweet after another. i.e. Don’t post a marketing blurb with one image, then post another marketing blurb with another image that’s exactly the same size, theme, and fill up the entire tweet stream this way. The followers you get will not be active, will not be who you actually want to speak with or interact with, and don’t really add value over time if this is all that is done. It’s similar to those blog theft sites that just re-post the exact RSS stream and then, by proxy, get blacklisted and erased from Google/search results.

Do!

  • Just make it about you. Grow your personal brand first and foremost, with tweets like “Dern this is a wicked awesome band.” or “Wow, best burger in the world”, and add pictures, content, and other interesting things for people. It doesn’t have to be “I just cured all diseases yo, check me out!” – people will follow based on honesty, integrity, taking a stance, being informative, and providing useful information of all sorts. But more than anything they’ll follow the person, not any specific *thing* you’re selling, pushing, or what not. So be yourself, share, and be involved with the network you create.
  • Build things you’re interested in, especially when they’re related in some way to products and services you like to use and find interesting – i.e. Apache Cassandra, DSE, databases, application development, etc. Build on these things via threading, via initiating discussion with others that are discussing these things, and among all this find valuable fellow Twitterers that you want to be connected to. This helps all involved: you, your network, the company, the people and companies you connect to, and more. Bringing the network wide with an on-point effort around topics will dramatically increase the collective opportunity for you and for everybody around you.
  • When retweeting, intersperse it among other things, and happily add content to RT’s. In other words don’t just make it endless retweets, but just throw in a few retweets for things you’re interested in or support, and then have your regular stream of tweets, links, and other content.
  • Use emoticons, use pictures, and definitely blurt memes out there. Aim to have fun with Twitter.

Examples of good Twitterers that really provide high value to followers, but also back to the Twitterers themselves in the way of speaking opportunities and all sorts of other things:

LinkedIn

Goal: Build an extensive professional network and return value to the LinkedIn Network of connections you have.

Don’t!

  • Don’t use LinkedIn like Facebook. This is obvious, but for some reason much of the world doesn’t seem to get this, so it feels like it needs to be stated for the LOL’s. i.e. Don’t hit on people, don’t ask people out on dates, just talk business. Ideally leave politics out of it too.
  • Ideally, don’t send droves of InMail messages to people unless that’s specifically the game being played on LinkedIn. For a more grassroots and non-marketing community focus, just interact directly with people you know, and don’t arbitrarily chase down people you do NOT actually know. This is another thing that decreases authenticity, and makes an individual appear – even if they’re not – like they’re shilling for something.

Do!

  • Post content regularly about what you’re working on, provide links, and provide well-researched content for the other mediums you might have like Medium, a blog, Twitter, and all that jazz.
  • Talk about your professional achievements and whatever else might come up related to your work or hobbies (provided there’s some business relation or it’s something you do or did professionally, i.e. like the music you play, or another hobby of sorts). Sometimes hobbies count too, so put that content into rotation now and again. But do remember, if it fits better on Facebook than LinkedIn, just don’t post it on LinkedIn.
  • Reach out if there is legitimate business that you are both involved in. Start that as a simple conversation, not a sale, not something pushy, just simple, friendly, curious conversation.

Examples of good LinkedIn accounts that their owners use for their own benefit but that also provide benefit, directly or indirectly, for all of us:

Twitch

Goal: Grow our follower count and increase our collective content and work material to show, teach, work, and hang out with viewers to build tomorrow’s best, most kick ass, wicked awesome applications, data science analysis, and more!

Don’t!

  • Not a whole lot here yet. Twitch is kind of wide open and not a lot of no no’s here. Don’t do illegal things is all I’ve got at the moment.

Do!

  • Set up your OBS or streaming process so that you have chat on screen, chat somewhere you can monitor it, code that is clear with readable fonts, and all the interaction content you can add for new follows (alerts), subscribes (alerts), and whatever else comes up.
  • When on stream, take your time, interact with people that follow, subscribe, or chat/whisper with you.
  • Don’t worry about making mistakes, just work through them, let the audience help if they offer it. Even if you know that they’re wrong, work through things with them and let them get involved. Then lead into the correct fix, etc. This is a great way to teach and build involvement on stream so that everybody gets a win, and you get an advocate to your own advocacy.
  • If you’re going to heavily curse or do anything even slightly liberal/conservative/religious/ideological etc it’s probably best to mark one’s stream as 13 or older (I think that’s the setting).

Some excellent Twitch streamers to reference for their involvement, OBS setup, configuration, and general awesomeness in the community.

That’s it for this post. Got more do’s or don’ts? Lemme know, will start a repo!

A little lagniappe for ya, that hygge feeling.

Twitz Coding Session in Go – Cobra + Viper CLI Wrap Up + Twitter Data Retrieval

Part 3 of 3 – Coding Session in Go – Cobra + Viper CLI for Parsing Text Files, Retrieval of Twitter Data, and Exports to various file formats.

UPDATED PARTS:

  1. Twitz Coding Session in Go – Cobra + Viper CLI for Parsing Text Files
  2. Twitz Coding Session in Go – Cobra + Viper CLI with Initial Twitter + Cassandra Installation
  3. Twitz Coding Session in Go – Cobra + Viper CLI Wrap Up + Twitter Data Retrieval (this post)

Updated links to each part will be posted at the bottom of this post when I publish them. For code, written walk through, and the like scroll down below the video and timestamps.

0:54 The thrashing introduction.
3:40 Getting started, with a recap of the previous sessions, but I haven’t got the sound on so ignore this until 5:20.
5:20 I notice and turn on the volume. Now I manage to do the recap, talking about some of the issues with the Twitter API. I step through setup of the app and getting the appropriate ID’s and such for the Twitter API keys and secrets.
9:12 I open up the code base, and review where the previous sessions got us to. Using Cobra w/ Go, parsing and refactoring that was previously done.
10:30 Here I talk about configuration again and the specifics of getting it setup for running the application.
12:50 Talking about the Go fatal panic I was getting. The dependency reference to GitHub for the application was different than what was in the application and didn’t point at the code that was actually executing. I show a quick fix and move on.
17:12 Back to the Twitter API use via the go-twitter library. Here I review the issue, and what the fix was, for another problem I was having in the previous session with getting the active token! I thought the library handled it, but that wasn’t the case!
19:26 Now I step through creating a function to get the active OAuth bearer token to use.
28:30 After deleting much of the code that doesn’t work from the last session, I go about writing the code around handling the retrieval of Twitter results for various passed in Twitter Accounts.

The bulk of the next section is where I work through a number of functions, a little refactoring, and answering some questions from the audience/Twitch Chat (working on a way to get it into the video!), fighting with some dependency tree issues, and a whole slew of silliness. Once that wraps up I get some things committed into the Github repo and wrap up the core functionality of the Twitz Application.

58:00 Reviewing some of the other examples in the go-twitter library repo. I also do a quick review of the other function calls from the library that take action against the Twitter API.
59:40 I review one of the PR’s I submitted to the project itself, which adds documentation and a build badge to the README.md, and merge it into the repo.
1:02:48 Here I add some more information about the configuration settings to the README.md file.

1:05:48 The Twitz page is now updated: https://adron.github.io/twitz/
1:06:48 Setup of the continuous integration for the project on Travis CI itself: https://travis-ci.org/Adron/twitz
1:08:58 Setup of the actual travis.yml file for Go. After this I go through a few stages of troubleshooting getting the build going, with some white space issues in the ole’ yaml file and such. Including also the famous casing issue! Ugh!
1:26:20 Here I start a wrap up of what is accomplished in this session.

NOTE: Yes, I realize I spaced and forgot the feature where I export it out to Apache Cassandra. Yes, I will indeed have a future stream where I build out the part that exports the responses to Apache Cassandra! So subscribe, stay tuned, and I’ll get that one done ASAP!!!

1:31:10 Further CI troubleshooting as one build is green and one build is yellow. More CI troubleshooting! Learn about the travis yaml here.
1:34:32 Finished, just the bad ass outtro now!

The Codez

In the previous posts I outlined two specific functions that were built out:

  • Part 1 – The config function for the twitz config command.
  • Part 2 – The parse function for the twitz parse command.

In this post I focused on updating both of these and adding additional functions for the bearer token retrieval for auth and ident against the Twitter API and other functionality. Let’s take a look at what the functions looked like and read like after this last session wrap up.

The config command basically ended up being 5 lines of fmt.Printf functions to print out pertinent configuration values and environment variables that are needed for the CLI to be used.


var configCmd = &cobra.Command{
	Use:   "config",
	Short: "A brief description of your command",
	Long: `A longer description that spans multiple lines and likely contains examples
and usage of using your command. For the custom example:
Cobra is a CLI library for Go that empowers applications.
This application is a tool to generate the needed files
to quickly create a Cobra application.`,
	Run: func(cmd *cobra.Command, args []string) {
		fmt.Printf("Twitterers File: %s\n", viper.GetString("file"))
		fmt.Printf("Export File: %s\n", viper.GetString("fileExport"))
		fmt.Printf("Export Format: %s\n", viper.GetString("fileFormat"))
		fmt.Printf("Consumer API Key: %s\n", viper.GetString("consumer_api_key")[0:6])
		fmt.Printf("Consumer API Secret: %s\n", viper.GetString("consumer_api_secret")[0:6])
	},
}

config.go
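
One small caveat worth noting on the config snippet above: slicing with viper.GetString("consumer_api_key")[0:6] will panic with an out-of-range error if that configuration value or environment variable isn’t set (an empty string) or is shorter than six characters. A tiny defensive variant – purely a sketch of mine, not something in the repo – could look like this:

func prefixOf(s string, n int) string {
	// prefixOf is a hypothetical helper (not in the twitz repo) that returns at
	// most n characters of s, so printing a truncated secret never panics when
	// the underlying config value is missing or shorter than n.
	if len(s) < n {
		return s
	}
	return s[:n]
}

// Hypothetical usage inside the Run function above:
// fmt.Printf("Consumer API Key: %s\n", prefixOf(viper.GetString("consumer_api_key"), 6))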

The parse command changed a small bit. A fair amount of the functionality I refactored out into the buildTwitterList(), exportFile, and rebuildForExport functions. The buildTwitterList() I put in the helpers.go file, which I’ll cover a little later. But in this file, which could still use some refactoring that I’ll get to, I have several pieces of functionality: the export-to-format functions, and the if/else if logic of the exportParsedTwitterList function.


var parseCmd = &cobra.Command{
	Use:   "parse",
	Short: "This command will extract the Twitter Accounts form a text file.",
	Long: `This command will extract the Twitter Accounts and clean up or disregard other characters
or text around the twitter accounts to create a simple, clean, Twitter Accounts only list.`,
	Run: func(cmd *cobra.Command, args []string) {
		completedTwittererList := buildTwitterList()
		fmt.Println(completedTwittererList)
		if viper.Get("fileExport") != nil {
			exportParsedTwitterList(viper.GetString("fileExport"), viper.GetString("fileFormat"), completedTwittererList)
		}
	},
}

func exportParsedTwitterList(exportFilename string, exportFormat string, twittererList []string) {
	if exportFormat == "txt" {
		exportTxt(exportFilename, twittererList, exportFormat)
	} else if exportFormat == "json" {
		exportJson(exportFilename, twittererList, exportFormat)
	} else if exportFormat == "xml" {
		exportXml(exportFilename, twittererList, exportFormat)
	} else if exportFormat == "csv" {
		exportCsv(exportFilename, twittererList, exportFormat)
	} else {
		fmt.Println("Export type unsupported.")
	}
}

func exportXml(exportFilename string, twittererList []string, exportFormat string) {
	fmt.Printf("Starting xml export to %s.", exportFilename)
	xmlContent, err := xml.Marshal(twittererList)
	check(err)
	header := xml.Header
	collectedContent := header + string(xmlContent)
	exportFile(collectedContent, exportFilename+"."+exportFormat)
}

func exportCsv(exportFilename string, twittererList []string, exportFormat string) {
	fmt.Printf("Starting txt export to %s.", exportFilename)
	collectedContent := rebuildForExport(twittererList, ",")
	exportFile(collectedContent, exportFilename+"."+exportFormat)
}

func exportTxt(exportFilename string, twittererList []string, exportFormat string) {
	fmt.Printf("Starting %s export to %s.", exportFormat, exportFilename)
	collectedContent := rebuildForExport(twittererList, "\n")
	exportFile(collectedContent, exportFilename+"."+exportFormat)
}

func exportJson(exportFilename string, twittererList []string, exportFormat string) {
	fmt.Printf("Starting %s export to %s.", exportFormat, exportFilename)
	collectedContent := collectContent(twittererList)
	exportFile(string(collectedContent), exportFilename+"."+exportFormat)
}

func collectContent(twittererList []string) []byte {
	collectedContent, err := json.Marshal(twittererList)
	check(err)
	return collectedContent
}

func rebuildForExport(twittererList []string, concat string) string {
	var collectedContent string
	for _, twitterAccount := range twittererList {
		collectedContent = collectedContent + concat + twitterAccount
	}
	if concat == "," {
		collectedContent = strings.TrimLeft(collectedContent, concat)
	}
	return collectedContent
}

func exportFile(collectedContent string, exportFile string) {
	contentBytes := []byte(collectedContent)
	err := ioutil.WriteFile(exportFile, contentBytes, 0644)
	check(err)
}

parse.go
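
As a side note on rebuildForExport above: it builds the string by manual concatenation and then trims the leading separator only for the "," case, which means the txt export ends up with a leading newline. A minimal alternative sketch – my suggestion here, not code from the repo – would be to lean on strings.Join, which only inserts the separator between elements:

func rebuildForExportJoin(twittererList []string, separator string) string {
	// strings.Join places the separator between elements only, so there is no
	// leading separator to trim afterward. Note this also drops the leading
	// newline the original "\n" (txt) export produced.
	return strings.Join(twittererList, separator)
}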

Next up after parse, it seems fitting to cover the helpers.go file code. First I have the check function, which simply wraps the routinely copied error handling code snippet. Check out the file directly for that. Then below that I have the buildTwitterList() function, which gets the config setting for the file name to open and parse for Twitter accounts. The code reads the file, splits the contents of the text file into fields, then steps through and parses out the Twitter accounts. This is done with a REGEX (I know, I know, now I have two problems, but hey, this is super simple!). It basically finds fields that start with an @, then strips out any characters that aren’t alphanumeric, an underscore, or the @ itself. It wraps all that up by putting the fields into a string slice and returning that slice to the calling code.


func buildTwitterList() []string {
	theFile := viper.GetString("file")
	theTwitterers, err := ioutil.ReadFile(theFile)
	check(err)
	stringTwitterers := string(theTwitterers[:])
	splitFields := strings.Fields(stringTwitterers)
	var completedTwittererList []string
	for _, aField := range splitFields {
		if strings.HasPrefix(aField, "@") && aField != "@" {
			reg, _ := regexp.Compile("[^a-zA-Z0-9_@]")
			processedString := reg.ReplaceAllString(aField, "")
			completedTwittererList = append(completedTwittererList, processedString)
		}
	}
	return completedTwittererList
}
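
To make the parsing behavior above concrete, here’s a quick standalone sketch – not code from the repo, just an illustration using the same field split and regex – showing what a sample line of text turns into:

package main

import (
	"fmt"
	"regexp"
	"strings"
)

func main() {
	// Sample input line; only fields starting with @ survive, and the regex
	// strips anything that isn't alphanumeric, an underscore, or the @ itself.
	sample := "Check out @Adron, @ThrashingCode! and maybe @not-valid."
	reg := regexp.MustCompile("[^a-zA-Z0-9_@]")
	for _, field := range strings.Fields(sample) {
		if strings.HasPrefix(field, "@") && field != "@" {
			fmt.Println(reg.ReplaceAllString(field, ""))
		}
	}
	// Prints:
	// @Adron
	// @ThrashingCode
	// @notvalid
}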

The next function in the helpers.go file is the getBearerToken function. This was a tricky bit of code. This function takes in the consumer key and secret from the Twitter app (check out the video at 5:20 for where to set it up). It returns a string and an error: an empty string if there’s an error, as shown below.

The code starts out by establishing a POST request against the Twitter API, asking for a token and passing the client credentials. It catches an error if that doesn’t work out, but if it can continue, the code then sets up the b64Token variable with the standard base64 encoding of the key and secret byte array. After that the request headers are set with the needed authorization and content-type values, then the request is made with http.DefaultClient.Do(req). The response is returned, or an error and an empty response (or nil? I didn’t check the exact function signature logic). Next up is the defer to ensure the response body is closed when everything is done.

Next the JSON result is parsed (unmarshalled) into the v struct, which, I now realize as I write this, I probably ought to rename to something that isn’t a single letter. But it works for now, and v has the pertinent AccessToken variable which is then returned.


func getBearerToken(consumerKey, consumerSecret string) (string, error) {
	req, err := http.NewRequest("POST", "https://api.twitter.com/oauth2/token",
		strings.NewReader("grant_type=client_credentials"))
	if err != nil {
		return "", fmt.Errorf("cannot create /token request: %+v", err)
	}
	b64Token := base64.StdEncoding.EncodeToString(
		[]byte(fmt.Sprintf("%s:%s", consumerKey, consumerSecret)))
	req.Header.Add("Authorization", "Basic "+b64Token)
	req.Header.Add("Content-Type", "application/x-www-form-urlencoded;charset=UTF-8")
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return "", fmt.Errorf("/token request failed: %+v", err)
	}
	defer resp.Body.Close()
	var v struct {
		AccessToken string `json:"access_token"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&v); err != nil {
		return "", fmt.Errorf("error parsing json in /token response: %+v", err)
	}
	if v.AccessToken == "" {
		return "", fmt.Errorf("/token response does not have access_token")
	}
	return v.AccessToken, nil
}

Wow, ok, that’s a fair bit of work. Up next, the findem.go file and its related function for twitz. Here I start off with a few informative prints to the console just to know where the CLI has gotten to at certain points. The Twitter list is put together, reusing that same function – yay code reuse right! Then the access token is retrieved. Next the HTTP client is built, the Twitter client is initialized with it, and the user lookup request is sent. Finally the users are printed out, and below that the count of users found.


var findemCmd = &cobra.Command{
	Use:   "findem",
	Short: "A brief description of your command",
	Long: `A longer description that spans multiple lines and likely contains examples
and usage of using your command. For example:
Cobra is a CLI library for Go that empowers applications.
This application is a tool to generate the needed files
to quickly create a Cobra application.`,
	Run: func(cmd *cobra.Command, args []string) {
		fmt.Println("Starting Twitter Information Retrieval.")
		completedTwitterList := buildTwitterList()
		fmt.Printf("Getting Twitter details for: \n%s", completedTwitterList)
		accessToken, err := getBearerToken(viper.GetString("consumer_api_key"), viper.GetString("consumer_api_secret"))
		check(err)
		config := &oauth2.Config{}
		token := &oauth2.Token{AccessToken: accessToken}
		// OAuth2 http.Client will automatically authorize Requests
		httpClient := config.Client(context.Background(), token)
		// Twitter client
		client := twitter.NewClient(httpClient)
		// users lookup
		userLookupParams := &twitter.UserLookupParams{ScreenName: completedTwitterList}
		users, _, _ := client.Users.Lookup(userLookupParams)
		fmt.Printf("\n\nUsers:\n%+v\n", users)
		howManyUsersFound := len(users)
		fmt.Println(howManyUsersFound)
	},
}

findem.go

I realized, just as I wrapped this up, that I completely spaced on the Apache Cassandra export. I’ll have that post coming soon and will likely do another refactor to get the output into a more usable state before I call this one done. But the core functionality, setup of the systemic environment needed for the tool, the pertinent data and API access, and other elements are done. For now, that’s a wrap. If you’re curious about the final refactor and the Apache Cassandra export then subscribe to my Twitch @adronhall and/or my YouTube channel ThrashingCode.

UPDATED SERIES PARTS

    1. Twitz Coding Session in Go – Cobra + Viper CLI for Parsing Text Files
    2. Twitz Coding Session in Go – Cobra + Viper CLI with Initial Twitter + Cassandra Installation
    3. Twitz Coding Session in Go – Cobra + Viper CLI Wrap Up + Twitter Data Retrieval (this post)

     

Twitz Coding Session in Go – Cobra + Viper CLI with Initial Twitter + Cassandra Installation

Part 2 of 3 – Coding Session in Go – Cobra + Viper CLI for Parsing Text Files, Retrieval of Twitter Data, Exports to various file formats, and export to Apache Cassandra.

UPDATED PARTS:

  1. Twitz Coding Session in Go – Cobra + Viper CLI for Parsing Text Files
  2. Twitz Coding Session in Go – Cobra + Viper CLI with Initial Twitter + Cassandra Installation (this post)
  3. Twitz Coding Session in Go – Cobra + Viper CLI Wrap Up + Twitter Data Retrieval

Updated links to each part will be posted at the bottom of this post when I publish them. For code, written walk through, and the like scroll down below the video and timestamps.

Hacking Together a CLI: Installing Cassandra, Setting Up the Twitter API, ENV Vars, etc.

0:04 Kick ass intro. Just the standard rocking tune.

3:40 A quick recap. Check out the previous write-up, “Twitz Coding Session in Go – Cobra + Viper CLI for Parsing Text Files”, in this series.

4:30 Beginning the completion of the twitz parse command for exporting out to XML, JSON, and CSV (the text export was already done in the previous session). This segment also includes a number of refactorings to clean up the functions, break out the control structures, and make the code more readable.

At the end of the refactoring, twitz parse came out like this. The completed list is put together by calling the buildTwitterList() function, which actually lives in the helpers.go file. It then prints that list out as is, and checks to see if a file export should be done. If there is a configuration setting for file export, that process starts with a call to exportParsedTwitterList(exportFilename string, exportFormat string, ... etc ... ). Then a simple, single-level if/else control structure determines which format to export the data to, and a call is made to the respective export function to do the actual export of the data and the writing of the file to the underlying system. There’s some more refactoring that could be done, but for now this is cleaned up pretty nicely considering the splattering of code I started with at first.


var parseCmd = &cobra.Command{
	Use:   "parse",
	Short: "This command will extract the Twitter Accounts form a text file.",
	Long: `This command will extract the Twitter Accounts and clean up or disregard other characters
or text around the twitter accounts to create a simple, clean, Twitter Accounts only list.`,
	Run: func(cmd *cobra.Command, args []string) {
		completedTwittererList := buildTwitterList()
		fmt.Println(completedTwittererList)
		if viper.Get("fileExport") != nil {
			exportParsedTwitterList(viper.GetString("fileExport"), viper.GetString("fileFormat"), completedTwittererList)
		}
	},
}

func exportParsedTwitterList(exportFilename string, exportFormat string, twittererList []string) {
	if exportFormat == "txt" {
		exportTxt(exportFilename, twittererList, exportFormat)
	} else if exportFormat == "json" {
		exportJson(exportFilename, twittererList, exportFormat)
	} else if exportFormat == "xml" {
		exportXml(exportFilename, twittererList, exportFormat)
	} else if exportFormat == "csv" {
		exportCsv(exportFilename, twittererList, exportFormat)
	} else {
		fmt.Println("Export type unsupported.")
	}
}

func exportXml(exportFilename string, twittererList []string, exportFormat string) {
	fmt.Printf("Starting xml export to %s.", exportFilename)
	xmlContent, err := xml.Marshal(twittererList)
	check(err)
	header := xml.Header
	collectedContent := header + string(xmlContent)
	exportFile(collectedContent, exportFilename+"."+exportFormat)
}

func exportCsv(exportFilename string, twittererList []string, exportFormat string) {
	fmt.Printf("Starting txt export to %s.", exportFilename)
	collectedContent := rebuildForExport(twittererList, ",")
	exportFile(collectedContent, exportFilename+"."+exportFormat)
}

func exportTxt(exportFilename string, twittererList []string, exportFormat string) {
	fmt.Printf("Starting %s export to %s.", exportFormat, exportFilename)
	collectedContent := rebuildForExport(twittererList, "\n")
	exportFile(collectedContent, exportFilename+"."+exportFormat)
}

func exportJson(exportFilename string, twittererList []string, exportFormat string) {
	fmt.Printf("Starting %s export to %s.", exportFormat, exportFilename)
	collectedContent := collectContent(twittererList)
	exportFile(string(collectedContent), exportFilename+"."+exportFormat)
}

func collectContent(twittererList []string) []byte {
	collectedContent, err := json.Marshal(twittererList)
	check(err)
	return collectedContent
}

func rebuildForExport(twittererList []string, concat string) string {
	var collectedContent string
	for _, twitterAccount := range twittererList {
		collectedContent = collectedContent + concat + twitterAccount
	}
	if concat == "," {
		collectedContent = strings.TrimLeft(collectedContent, concat)
	}
	return collectedContent
}

func exportFile(collectedContent string, exportFile string) {
	contentBytes := []byte(collectedContent)
	err := ioutil.WriteFile(exportFile, contentBytes, 0644)
	check(err)
}

parse.go
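
Since the exportParsedTwitterList above is a straight if/else if chain on the format string, one possible further refactor – just a sketch of a suggestion, not something done in the session – is to swap it for a switch, which reads a bit more idiomatically in Go for this kind of dispatch:

func exportParsedTwitterListSwitch(exportFilename string, exportFormat string, twittererList []string) {
	// Same behavior as the if/else chain above, expressed as a switch.
	switch exportFormat {
	case "txt":
		exportTxt(exportFilename, twittererList, exportFormat)
	case "json":
		exportJson(exportFilename, twittererList, exportFormat)
	case "xml":
		exportXml(exportFilename, twittererList, exportFormat)
	case "csv":
		exportCsv(exportFilename, twittererList, exportFormat)
	default:
		fmt.Println("Export type unsupported.")
	}
}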

50:00 I walk through a quick install of an Apache Cassandra single node that I’ll use for development later. I also quickly show how to start and stop it post-installation.

Reference: Apache Cassandra, Download Page, and Installation Instructions.

53:50 Choosing the go-twitter API library for Go. I look at a few real quick just to ensure that’s the library I want to use.

Reference: go-twitter library

56:35 At this point I go through how I set up a Twitter App within the API interface. This is a key part of the series where I take a look at the consumer keys, access token, and access token secret, where they’re at in the Twitter interface, and how one needs to reset them if they’ve just shown the keys on a stream (like I just did, shockers!).

57:55 Here I discuss and show where to set up the environment variables inside of the Goland IDE for building and executing the CLI. Once these are set up they’ll be the main mechanism I use in the IDE to test the CLI as I go through building out further features.

1:00:18 Updating the twitz config command to show the keys that we just added as environment variables. I set these up with some string parsing that cuts off the end of the secrets so that the whole variable value isn’t shown – just enough to confirm that it is indeed a set configuration or environment variable.

1:16:53 At this point I work through some additional refactoring of functions to clean up some of the code mess that exists. Using Goland’s extract method feature and other tooling I work through several refactoring efforts that clean up the code.

1:23:17 Copying a build configuration in Goland. A handy little thing to know you can do when you have a bunch of build configuration options.

1:37:32 At this part of the video I look at the app-auth example in the code library, but I’ve got to add the caveat that I run into problems using the exact example. I work through it and get to the first error messages that anybody would get, assuming they’re using the same examples. I get them fixed in the next session; this segment of the video, however, provides a basis for my pending PR’s and related work I’ll submit to the repo.

The remainder of the video is trying to figure out what is or isn’t exactly happening with the error.

I’ll include the working findem code in the next post in this series. Until then, watch the wrap up and enjoy!

1:59:20 Wrap up of video and upcoming stream schedule on Twitch.

UPDATED SERIES PARTS

    1. Twitz Coding Session in Go – Cobra + Viper CLI for Parsing Text Files
    2. Twitz Coding Session in Go – Cobra + Viper CLI with Initial Twitter + Cassandra Installation (this post)
    3. Twitz Coding Session in Go – Cobra + Viper CLI Wrap Up + Twitter Data Retrieval

 

Twitter for Developers, Cutting the Bullshit, Quelling the Trash Tire Fire

It’s been over a decade that Twitter has been an active part of the developer community. It’s grown in popularity from day one, and now holds the uneasy crown as the place for hot takes, trash from politicians, and the general tire fire that is the news. In many ways, that’s what they’ve aimed for. But then there’s us developers, people who make software, who make Twitter, who build all of this technology internet stuff, right? We’re still here using Twitter, even amid the backstabbing and the APIs for third-party UIs being yanked out from under us. They’ve of course also banned UIs in the past, and somehow here we are still using the service. I digress, though; Twitter’s wrongs against developers are numerous after we effectively built the service. In spite of all this, we developers are a large contingent of people on Twitter. It’s still an amazingly useful medium for software developers, and especially new software developers, to get involved with. It’s a very effective tool to strengthen our careers and continue conversations within the developer communities themselves. One just has to avoid the cruft, and that’s some of what I intend to tackle in this article.

This list I’ve put together is of things that I personally have learned, often by stumbling through and discovering myself. These activities on Twitter do have a net positive effect on your career and ability to communicate with the world and local developer communities. First I’ll cover positive use cases of Twitter that are immensely useful as a software developer. These are even compounded if you’re an advocate of open source, cool technologies and libraries, and other miscellaneous things.

1. Twitter as a Communication Tool

First and foremost, Twitter has been, and mostly continues to be, a communication tool. I make use of Twitter to connect with people for conference organizing, code projects, open source work, to have geek lunch, nerd brunch, and many other things that come up. It can and ought to be one of your primary communication mediums in that it connects many of the key active people within our overall communities – more so than email and other mediums by a large margin. If you intend to have a long term net effect and grow the presence and activities (conferences, meetups, coding groups, etc) you want to foster, Twitter has become the de facto medium to be active on.

2. Twitter as a Collector of People

Even though Twitter does seem to attract some of the most villainous scum (literally, not a figure of speech or hyperbole) and has some pretty horrifying problems (people calling in SWAT teams on people (extremely illegal), death threats, harassment), its net benefit in bringing people together within the community has far outclassed pretty much any other system out there. Hacker News doesn’t, Facebook doesn’t, Google+ is cancelled, and about every other social media platform has failed to bring together the developer community in an effective and useful way.

3. Twitter for Answers

Even though I don’t often go to Twitter to find answers, sometimes I do. Often it is a last resort. After all, Twitter is most efficient at providing a place for links, quick blurbs, bumbling and babbling threads from people, and of course cat pictures and hot takes.

Combining Twitter with other services, however, exponentially increases its ability to help with answers. For example, write up a solidly written question on Stack Overflow or one of its branched-out services and then post the question on Twitter, maybe asking for some retweets, and boom: a doubling, tripling, or greater multiplier of people looking at the question who can provide a prospective answer!

4. Twitter, Firestarter

One of the things I’ve also found Twitter good for is an outlet for pushing back on, and often straightening out, bad behavior in the community. Ever done something racist? Ever known someone to pull some misogynistic action? Yeah, unfortunately I know of these things happening too, and Twitter forces apologies and better behavior among people. But it also is a place where people can wreck themselves and be just as destructive as they can learn to better themselves, especially those of us who have poor behavior and disrespectful tendencies.

But just as much as individual behaviors among us, Twitter has been used to straighten out some pretty trash behaviors from corporations. Sure, they’re not really people, but the conflation of this notion – true or not – makes for pressure that can be applied to corporations through means other than the products and services they sell us individual humans, which too often are things we have to buy regardless. This medium provides us an avenue to induce better behaviors in spite of those purchases.

There are of course positives and negatives to this kind of forced societal pressure, which in many ways is improving corporate behavior throughout the world, but either way it’s here. Pressure from the people, often organized and started through Twitter – sometimes even against Twitter itself – is heavily rooted in activity right there on ole’ Twitter.

GSD Tactical Twitter

Alright, now to the meat of things. Twitter is great at all these things but how does one make the best use of it without it turning into an outright tire fire trash dump of distraction and stress? Well, it’s moderately easy, but one has to be careful.

1. Find Good and Entertaining People

My personal advice when starting on Twitter is to skip the companies. Don’t follow any of them. Same goes for organizations or any group account of sorts. The key to find good content, good common ground, and useful links, news, and related communities is to follow individuals that are involved in those things you want to be involved in already. The following are some specific examples, and for me, great people to follow.

2. Lift Up Others, Tweet to Others, Get Involved

When on Twitter, one can just lurk. It’s a completely valid thing to do. However lurking isn’t super high value. You just won’t get that much out of it. Instead, get involved. Find a link with something interesting, write up a tweet and post it. See something interesting someone else just tweeted, respond! See something that isn’t right, maybe tweet why it isn’t.

It’s always a good idea, regardless of the trash that is often on Twitter, to stay courteous, kind, and friendly. Remember, not everyone is from the mold you’ve come from, or has seen things the way you have, so tread lightly and friendly and things mostly work out real well. Overall, people are attuned to helping those that help themselves and helping those that run in the same social circles.

All in all, get involved, tweet at, with, and all around your fellow Twitterers. Your return will improve and in the process you’ll add more value for others too.

3. Follow & Prune the Firehose of Tweets

Alright, I’ve written to follow and lift up others. That’s groovy, but also you gotta bring the hammer down sometimes. When that firehose of tweets just gets a little overwhelming check out what tweets are helpful, rate them to yourself, and unfollow some people if it’s not the direction or the tweets you’re getting value from.

Even though it’s difficult when just starting out on Twitter, the ratio will lean toward more followed than followers for you. But as time goes forward and you get past 50 followers, 100, 500, 1000, you’ll want to keep the list of people you’ve followed roughly equal to or less than the number of people following you. It’ll help keep your feed manageable and also help you keep interactions beneficial for you, your followers, and those you follow.

4. The Down-Low on Conferences

If you’re looking to attend a conference, Twitter via hashtags is a great way to get information on conferences. Dig in, dig deep. Talk to people about the conference in particular. If need be, get into direct messages and invoke the whisper net. Sometimes conferences can be exponentially useful and sometimes they end up bothersome, cash-burning wastes of time. Figure out what you want from a prospective conference and dig in via Twitter; you’ll prevent wasting time and burning cash, and exponentially increase the positives you can get out of a conference.

5. Filter the Trash Fire

Ok, let’s get super serious. One way Twitter has become a trash fire for many or most people these days is because of the political trash dumped in. Much of Twitter for the general public is bot armies from Russia, crazies like the nutty Wohl kid, and other junk nut accounts. One way to notch this down to a minimal trash fire is to throw some filters (i.e. mute certain words) on your Twitter account. For example here’s my list:

filters-muted-words

Now as you’ve read that, remember that my goal has been to focus the stream on tech content with a little heavy metal, a few cats, and other entertainment here and there. For example, I’m fine with sports events like baseball and football, but I really don’t want to get distracted by them on my Twitter stream. On game day those events just overwhelm the tweets, and things that are useful get drowned out.

Now a lot of the other stuff in the list is the horrifying reality of the United States today, reflected on Twitter, and part of something that I don’t want distracting me either. Overall this has made Twitter dramatically more useful for me again.