
Coding, WTF Twitter, Twitch FTW, Getting Shit Done, Twitch Hacks, Tips, Tricks, and One Excellent Jazz Influenced Tune

WTF Twitter

I’ve been doing a lot more coding, thanks largely to the discipline that Twitch has brought to my day. That almost surprises me at this point, because Twitch started out for me much the way Twitter did. You see, at first I thought Twitter was the dumbest thing that had happened in ages. Arguably it’s come full circle and I kind of feel the same way about Twitter now, but during the decade in between (yes, Twitter is over 10 years old!) Twitter brought me connection, opportunities, and so much more. I couldn’t have imagined a lot of what I’ve been able to pull together because of Twitter. It’s still useful in many ways for this, albeit I, like all of us, am at risk of suffering the idiocy of today’s politics and political cronies, and the dog-piling trash pile that follows them onto Twitter.

I’m not leaving Twitter any time soon, but I’ve definitely put it on a very short leash and limited what impact it does or doesn’t have on my day to day flow.

Twitch FTW

Amazingly, however, a new social and productive tool has come into being, not that it intended to be both: coding on Twitch. Don’t get me wrong, I game, I just don’t game socially or on Twitch; what I do is code on Twitch, with a fair dose of hacking, breaking things, and then figuring out how to make them work. All the while I, along with others, have created a pretty excellent developers’ community there on Twitch, and it seems to be growing all the time. At this point Twitch has become a focal point that has the benefits without all the annoying garbage that Twitter brings these days, while adding the vast and hugely important fact that I can do things, be productive, chit chat, and generally get shit done all while I’m streaming.

VidStreamHacking

@ https://github.com/Adron/VidStreamHacking

With that, let’s talk about some of the recent notes and information I’ve been working on putting together to make Twitch even more useful. My first motive with this was to keep track of all the things I was doing, hardware I was putting together, and related things, but then another purpose grew out of all this note taking. It became obvious that this repository of information could be useful for other people. Here’s a survey of the things that I’ve added so far, hope they’re helpful to those of you digging into streaming out there!

I added some badges to identify various elements of information about the repo in the README.md.

(Image: the badges in the repository README)

Is it maintained? Yup. Contributors? So far just me. Zero issues filed, but please feel free to add an issue or two. Markdown? Yup. And there is indeed a Trello Board! The Trello Board is a key to insight, inspection, and what I’ve got going on in a number of my repositories. It’s where I’m keeping track of all the projects, what’s next, and what’s up in queue for the blog (this one right here), at least in the context of the big code-heavy posts or video reviews of sessions with code, extra commentary, and related content. If you want to get involved in any of the repos just let me know and I’m happy to walk through whatever and even get you added to the Trello board so we can work together on code.

Streaming Gear

https://github.com/Adron/VidStreamHacking/blob/master/hardware.md

My main machine is now a Dell XPS 15, which I fought through getting Linux running on, and now that I have it’s been an absolutely stellar machine. I’ve also added additional monitor and port replicator/docking station gear to make it even more usable. The details are listed in the repo under the Dell XPS 15 item on the hardware page.

Along with the XPS 15 I wrote up coverage of the unboxing via video and blog entry. After a few weeks I also wrote up the conflict I had getting Linux running and removing Windows 10. In addition to the XPS 15 though I do use a MacBook from 2015 as my primary Mac machine, with an iMac from 2013 available as backup. Both machines are still resoundingly solid and performant enough to get the job done. Rounding out my fleet of machines is a Dell XPS 13 (covered here and here with the re-review).

For screens I have one at my office and one at home. They’re almost the same thing: ultra-widescreen, curved displays from LG running 3880x1440 resolution. These make keeping an eye on chat, OBS, and all sorts of other monitoring while coding, gaming, or whatever a breeze!


Ex 1: Just viewing a giant OBS view to get everything sorted out before starting a stream.


Ex 2: OBS w/ VM running w/ Twitch chat, dashboard etc to the right. This way I can work, see the stream, and see chat and such all at the same time.

The docking stations and/or port replicators, or whatever one calls these things these days, also bring all of this tech together for me. There are a couple I’ve tried and retired already (unfortunately, cuz dammit that cost some money!) and others that I use in some scenarios but not others.

My main docking station contraption: shout out to James and others for suggesting the CalDigit TS3. I got to this docking station by way of the Dell TB16, which for Linux, and kind of for Windows, is an unstable mess. Awesome potential if it worked, but it doesn’t, so I tried out this USB-C pluggable option (in the tweet), which had HDMI that was unfortunately limited in resolution. Having a widescreen made this one, albeit super compatible with Linux, unusable too. So I finally upgraded to the CalDigit TS3 and WOW, the CalDigit is super seriously wickedly bad ass. Extra USB-C ports, USB 2/3 ports, power, and more all rolled into one. It even supplies some power to the laptop, though I keep it plugged in since it’s kind of a power hog when the processor starts chomping!

After trying out this USB-C pluggable (the tweet) I got the CalDigit into play. It’s really, really good; here’s a shot of it from various angles with the extensive cables that I don’t have to plug into my laptop anymore. Out of this also runs a 28-port powered USB hub. No picture of that, but just know I’ve got a crazy number of devices I routinely like to use!

That’s my main configuration when using the ultra widescreens and all. Good setup there, very usable, and the 32GB of memory in the laptop really gets put to use in this regard. As for storage, that’s another thing. I’ve got 1 TB in my laptop but another 1 TB in a USB-C Thunderbolt Samsung drive, which is practically as fast for most things. So much so that I attach it via the TS3 over USB-C and it’s screaming fast and adds that extra storage. So far I’ve primarily been using it to store all of my virtual machines or as video storage while I do edits.

There’s other gear too, check out the list, like the Rode Podcaster and other things. But that gear I’ll elaborate on some other time.

Meetup Streaming Gear

https://github.com/Adron/VidStreamHacking/blob/master/meetup-streaming-kit-gear.md

Another effort I’ve undertaken is recording meetups. To do this one needs to be able to stream several screens combined, i.e. picture-in-picture and all. That means a camera that can focus on the speaker, ideally at least 1080p with at least some ability to work in less-than-ideal light. Then, next to that, a splitter and capture card to get the slides! Once all those pieces come together, with a little OBS finesse one can get a pretty solid single-pass recording of a meetup. An example of one of my better attempts was the last meetup, “Does the Cloud Kill Open Source” with Richard Seroter. If you take a look at past talks in the Meetups Playlist you can see my iterative progress from one meetup to another!

Here’s the specific gear I’m using to get this done. At least, so far, and if and when it becomes financially reasonable I might upgrade some of the gear. It largely depends on what I can get more use out of beyond just streaming meetups.

Cords and Splitter – I picked up a selection of lengths and types so that I’d have wiring options for the particular environments the meetups would be located in. Generally speaking, 25ft seems to be a safe maximum for HDMI. I’ve been meaning to check the actual specification on it, but for now it’s more than enough regardless.

The splitter wasn’t expensive at all ($16.99), and kind of surprised me considering the costs of the cables. Picture to the right, or above, or somewhere depending on mobile layout.

I needed capture cards for this, one being for the line out of the splitter that would capture the slides. The first I picked up based on suggestions focused on quality: the Avermedia Extreme Cap HDMI to USB 3 Capture Card. It’s really solid for higher resolutions and related capabilities. The USB 3.0 HDMI HD Game Video Capture Card I picked up based on price (it’s almost a third of the price), not particularly focused on quality. However, now that I’ve used both, they’re both capable and seem fine, so I might have been able to just buy two of the cheaper option.

As for the camera, ideally I’d have a much higher quality one, but the Canon VIXIA HF R800 camcorder has actually worked excellently. It’s a little less feature-rich for audio out and related things, but it zooms in well and can record at the same time I’m getting the cam feed into the stream, so it’s always a nice way to have a backup of the talk.

The last, and one of the most important, aspects is getting good audio.

Streaming Meetups

https://github.com/Adron/VidStreamHacking/blob/master/meetup.md

At first I made the mistake of thinking that just the gear would be enough, but holy smokes, there were about a million other things I needed to write up. I created meetup.md to get the list going.

Jazz Influence Amidst the Heaviness!

As promised. Some music, not actually jazz, but heavily influenced by some jazz, progressive instrumentation, and esoteric, expansive, exquisite playing skills by the band. As always, be prepared. My music referrals aren’t always gentle! Happy code streaming!

Adding and Returning Value to the Community via Twitter, LinkedIn, and Twitch

Twitter

Goal: Grow our follower count and reach, entertain, laugh, make hot takes (as one does on Twitter), educate, and get value out of it ourselves.

Don’t!

  • Don’t buy followers (i.e. don’t pay anybody that promises X followers, market share, or whatever it is they’re selling). We can’t trust this method as it’s often just a pile of Russian bots or other garbage followers. This does nothing to increase visibility and penetration to those that want, are interested in, or need to communicate with us (i.e. customers and fans).
  • Do not just repost things via RT or use tooling to post arbitrary things. People notice this and won’t follow or will unfollow you. It’s a sure-fire way to be blacklisted as *marketing*, which means going to zero eyeballs even when the account statistics keep showing that people see it.
  • Do not post identical or similar content one tweet after another. i.e. Don’t post a marketing blurb with one image, then post another marketing blurb with another image that’s exactly the same size, theme, and fill up the entire tweet stream this way. The followers you get will not be active, will not be who you actually want to speak with or interact with, and don’t really add value over time if this is all that is done. It’s similar to those blog theft sites that just re-post the exact RSS stream and then, by proxy, get blacklisted and erased from Google/search results.

Do!

  • Just make it about you. Grow your personal brand first and foremost, with things like “Dern this is a wicked awesome band.” or “Wow, best burger in the world,” and add pictures, content, and other interesting things for people. It doesn’t have to be “I just cured all diseases yo, check me out!” People will follow based on honesty, integrity, taking a stance, being informative, and providing useful information of all sorts. But more than anything they’ll follow the person, not any specific *thing* you’re selling or pushing. So be yourself, share, and be involved with the network you create.
  • Build things you’re interested in, especially when they’re related in some way to products and services you like to use and find interesting – i.e. Apache Cassandra, DSE, databases, application development, etc. Build on these things via threading, via initiating discussion with others that are discussing them, and among all this find valuable fellow Twitterers that you want to be connected to. This helps all involved: you, your network, the company, and the people and companies you connect to. Casting the network wide with an on-point effort around topics will dramatically increase your collective opportunity, and that of everything and everybody around you.
  • When retweeting, intersperse it among other things, and happily add content to RT’s. In other words don’t just make it endless retweets, but just throw in a few retweets for things you’re interested in or support, and then have your regular stream of tweets, links, and other content.
  • Use emoticons, use pictures, and definitely blurt memes out there. Aim to have fun with Twitter.

Examples of good Twitterers that really provide high value to followers, but also back to the Twitterer themselves in the way of speaking opportunities and all sorts of other things:

LinkedIn

Goal: Build an extensive professional network and return value to the LinkedIn Network of connections you have.

Don’t!

  • Don’t use LinkedIn like Facebook. This is obvious, but for some reason much of the world doesn’t seem to get it, so it feels like it needs to be stated for the LOLs. I.e. don’t hit on people, don’t ask people out on dates, just talk business. Ideally leave politics out of it too.
  • Ideally, don’t send droves of InMail messages to people unless that’s specifically the game being played on LinkedIn. For a more grassroots and non-marketing community focus, just interact directly with people you know, and don’t arbitrarily chase down people you do NOT actually know. That’s another thing that decreases authenticity and makes an individual appear, even if they’re not, like they’re shilling for something.

Do!

  • Post content regularly about what you’re working on, provide links, and provide researched content for the other mediums you might have, like Medium, a blog, Twitter, and all that jazz.
  • Talk about your professional achievements and whatever else might come up related to your work or hobbies (provided there’s some business relation or something you do or did professionally, like the music you play or another hobby of sorts). Sometimes hobbies count too, so put that content into rotation now and again. But do remember: if it fits better on Facebook than LinkedIn, just don’t post it on LinkedIn.
  • Reach out if there is legitimate business that you are both involved in. Start that as a simple conversation, not a sale, not something pushy, just simple, friendly, curious conversation.

Examples of good LinkedIn Accounts, that use their accounts for benefit for themselves but also provide benefit directly or indirectly for all of us:

Twitch

Goal: Grow our follower count and increase our collective content and work material to show, teach, work, and hang out with viewers to build tomorrow’s best, most kick ass, wicked awesome applications, data science analysis, and more!

Don’t!

  • Not a whole lot here yet. Twitch is kind of wide open and there aren’t a lot of no-no’s here. Don’t do illegal things is all I’ve got at the moment.

Do!

  • Set up your OBS or streaming process so that you have chat on screen, chat somewhere you can monitor it, code that is clear with readable fonts, and all the interaction content you can add for new follows (alerts), subscribes (alerts), and whatever else comes up.
  • When on stream, take your time, interact with people that follow, subscribe, or chat/whisper with you.
  • Don’t worry about making mistakes, just work through them, and let the audience help if they offer it. Even if you know they’re wrong, work through things with them and let them get involved, then lead into the correct fix, etc. This is a great way to teach and build involvement on stream so that everybody gets a win, and you gain an advocate for your own advocacy.
  • If you’re going to curse heavily or do anything even slightly liberal/conservative/religious/ideological etc., it’s probably best to mark your stream as 13 or older (I think that’s the setting).

Some excellent Twitch streamers to reference for their involvement, OBS setup, configuration, and general awesomeness in the community.

That’s it for this post. Got more do’s or don’ts? Lemme know, will start a repo!

A little lagniappe for ya, that hygge feeling.

Baseline Collateral for Useful Video Streaming

I’m making this post to outline, and provide future reference for, a few things I’ve realized between this post about what you dear readers want, things that would make my video streams more useful, and what content would be more useful. I’ve taken that survey (which you can still fill out to add your two cents) and what I’ve learned from it, combined that with what I’ve learned doing some Twitch and YouTube streaming, and prepared to introduce better content, with more details and related samples, along with the streaming I’ve been doing. With that, here’s an outline of the basic collateral I’m going to produce for every single streaming session I do from now on.

Per Stream Per Video Collateral

  1. I’ll post updates with a content and objectives overview for each session well before the actual scheduled session, so that there’s a blog entry covering the next dozen or so sessions; plus the events will be listed on Twitch here. Subscribe to my Twitch channel @adronhall if you want to get updates when I go live!
  2. The full unedited video will be on YouTube here. Subscribe to get updates when the stream goes live on YouTube and to be notified when videos post. After each video is recorded I’ll provide a timeline of the topics covered when it goes live on YouTube, similar to the timeline in this session or this session.
  3. After each stream I’ll post a blog entry with the YouTube link, the timeline, and a write-up of the code, repositories, and any other collateral that goes along with the stream session. That way everybody will have a quick way to get caught back up, review streams they’ve watched, and reuse or refer to the material after the stream is over for implementation or other use.
  4. I’ll also endeavor to cut slices of the video into smaller, multi-minute segments for specific how-to videos on how I’ve put together certain things, written certain pieces of code, configured things, or otherwise useful little segments. These segments, of which I don’t have an example just yet, will run somewhere from 30 seconds to less than 5 minutes. Hopefully quickly watchable and useful.

I’m looking forward to these new sessions and the collateral I’m planning to put together. It ought to be good material and I hope you find it useful. Cheers! \m/

What could I do for you?

I’m in the process (with the team I work with) of trying to figure out what would be most useful to you, the community and its members in which we all work. Whether you’re a coder, working toward being a coder, a programmer, an engineer, or whatever it is you aim for, we want to know what would help you out. I myself produce a ton of material that I personally find entertaining and fun to produce, and hope it’s useful for people. So, if you would, take a moment and answer these few questions. Thanks and cheers!

Setting Up Nodes, Firewall, & Instances in Google Cloud Platform

Here’s the rundown of what I covered in the latest Thrashing Code Session (go subscribe here to the channel for future sessions, or on Twitch). The core focus of this session was making further progress on my Terraform project for getting a basic Apache Cassandra and DataStax Enterprise Apache Cassandra cluster up and running in Google Cloud Platform.

The code and configuration from the work is available on GitHub at terraform-fields, and a summary of code changes and other actions taken during the session is further along in this blog entry.

Streaming Session Video

In this session I worked toward completing a few key tasks for the Terraform project around creating a Cassandra cluster. Here’s a rundown of the time points where I tackle specific topics.

  • 3:03 – Welcome & objectives list: Working toward DataStax Enterprise Apache Cassandra Cluster and standard Apache Cassandra Cluster.
  • 3:40 – Review of what infrastructure exists from where we left off in the previous episode.
  • 5:00 – Found music to play that is copyright safe! \m/
  • 5:50 – Recap on where the project lives on Github in the terraformed-fields repo.
  • 8:52 – Adding a google_compute_address for use with the instances. Leads into determining static public and private google_compute_address resources. The idea being to know the IP for our cluster to make joining them together easier.
  • 11:44 – Working to get the access_config and related properties set on the instance to assign the google_compute_address resources that I’ve created. I run into a few issues but work through those.
  • 22:28 – Bastion server is set with the IP.
  • 37:05 – I set up some files, following a kind of “bad process” as I note, which I’ll refactor and clean up in a subsequent episode. But the bad process also limits the amount of resources I have in one file, so it’s a little easier to follow along.
  • 54:27 – Starting to look at provisioners to execute script files and commands before or after the instance creation. Super helpful, with the aim to use this feature to download and install the DataStax Enterprise Apache Cassandra or standard Apache Cassandra software.
  • 1:16:18 – Ah, a need for a firewall rule for SSH on port 22. I work through adding that and then end up with an issue that we’ll be resolving next episode!

Session Content

Starting Point: I started this episode from where I left off last session.

Work Done: In this session I added a number of resources to the project and worked through a number of troubleshooting scenarios, as one does.

Added firewall resources to open up port 22 and icmp (ping, etc).

resource "google_compute_firewall" "bastion-ssh" {
  name    = "gimme-bastion-ssh"
  network = "${google_compute_network.dev-network.name}"

  allow {
    protocol = "tcp"
    ports    = ["22"]
  }
}

resource "google_compute_firewall" "bastion-icmp" {
  name    = "gimme-bastion-icmp"
  network = "${google_compute_network.dev-network.name}"

  allow {
    protocol = "icmp"
  }
}

I also broke out the files so that each instance has its own IP addresses in the file specific to that instance. Later I’ll add context for why I let the project files bloat like this, when I refactor to use modules.

(Screenshot: the Terraform project files, broken out per instance)

I added each node resource as follows, incrementing the node number by one for each subsequent node, e.g. this node1_internal google_compute_address becomes node2_internal for the next node. Everything is also statically defined, which adds to my file and configuration bloat (there’s a rough sketch after this block of where the module refactor could take it).

resource "google_compute_address" "node1_internal" {
  name         = "node-1-internal"
  subnetwork   = "${google_compute_subnetwork.dev-sub-west1.self_link}"
  address_type = "INTERNAL"
  address      = "10.1.0.5"
}

resource "google_compute_instance" "node_frank" {
  name         = "frank"
  machine_type = "n1-standard-1"
  zone         = "us-west1-a"

  boot_disk {
    initialize_params {
      image = "ubuntu-minimal-1804-bionic-v20180814"
    }
  }

  network_interface {
    subnetwork = "${google_compute_subnetwork.dev-sub-west1.self_link}"
    address    = "${google_compute_address.node1_internal.address}"
  }

  service_account {
    scopes = [
      "userinfo-email",
      "compute-ro",
      "storage-ro",
    ]
  }
}
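
As noted above, this per-node duplication is exactly what I want to refactor away with modules. Purely as a sketch of where that could land, and not the actual refactor, a node module call might look something like the following; the module path, input variable names, and what the module wraps up are all hypothetical here.

module "cassandra_node_frank" {
  # Hypothetical module that would wrap the google_compute_address and
  # google_compute_instance pair shown above into one reusable unit.
  source = "./modules/cassandra-node"

  node_name   = "frank"
  internal_ip = "10.1.0.5"
  subnetwork  = "${google_compute_subnetwork.dev-sub-west1.self_link}"
  zone        = "us-west1-a"
}

Each node would then be a handful of lines instead of two full resources, which is the whole point of that future refactor.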

I also set up the bastion server so it looks like this, specifically designating a public IP so that I can connect via SSH.

resource "google_compute_address" "bastion_a" {
  name = "bastion-a"
}

resource "google_compute_instance" "a" {
  name         = "a"
  machine_type = "n1-standard-1"
  zone         = "us-west1-a"

  provisioner "file" {
    source      = "install-c.sh"
    destination = "install-c.sh"

    connection {
      type     = "ssh"
      user     = "root"
      password = "${var.root_password}"
    }
  }

  boot_disk {
    initialize_params {
      image = "ubuntu-minimal-1804-bionic-v20180814"
    }
  }

  network_interface {
    subnetwork = "${google_compute_subnetwork.dev-sub-west1.self_link}"
    access_config {
      nat_ip = "${google_compute_address.bastion_a.address}"
    }
  }

  service_account {
    scopes = [
      "userinfo-email",
      "compute-ro",
      "storage-ro",
    ]
  }
}
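
The file provisioner above only copies install-c.sh onto the box; actually running it is what I was poking at with provisioners in the timeline, with the aim of installing the standard Apache Cassandra or DataStax Enterprise software. A minimal sketch of that next step, assuming the same SSH connection details and that the script lands in the login user’s home directory (this isn’t in the repo yet), would be a remote-exec provisioner added inside the same resource:

  # Hypothetical follow-up to the "file" provisioner: execute the copied
  # install script over SSH once the instance is up.
  provisioner "remote-exec" {
    inline = [
      "chmod +x install-c.sh",
      "./install-c.sh"
    ]

    connection {
      type     = "ssh"
      user     = "root"
      password = "${var.root_password}"
    }
  }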

Plans for next session include getting the nodes set up so that the bastion server can work with and deploy or execute commands against them without the nodes being exposed publicly to the internet. We’ll talk more about that then, but here’s a rough sketch of one shape that could take.
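
Assuming the nodes stay on the 10.1.0.x subnetwork shown above (the CIDR, rule name, and the whole approach here are my assumptions, not settled code), the gist would be a firewall rule that only allows SSH from inside that subnetwork, so the bastion can reach the nodes while nothing is reachable from the public internet:

resource "google_compute_firewall" "nodes-internal-ssh" {
  name    = "gimme-nodes-internal-ssh"
  network = "${google_compute_network.dev-network.name}"

  allow {
    protocol = "tcp"
    ports    = ["22"]
  }

  # Only traffic originating inside the dev subnetwork (where the bastion
  # sits) can reach port 22; the nodes never need a public IP.
  source_ranges = ["10.1.0.0/24"]
}

For now, happy thrashing code!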

Thrashing Code Twitch Schedule September 19th-October 2nd

I’ve got everything queued back up with some extra Thrashing Code Sessions, and will have some on-the-rails travel streams too. Here’s what the schedule looks like so far.

Today at 3pm PST (UPDATED: Sep 19th 2018)

UPDATED: Video available at https://youtu.be/NmlzGKUnln4

I’m going to get back into the roll of things this session after the travels last week. In this session I’m aiming to do several things:

  1. Complete next steps toward getting a DataStax Enterprise Apache Cassandra cluster up and running via Terraform in Google Cloud Platform. My estimate is I’ll get to the point of having three instances that launch, with the installation of Cassandra automated on each. Later I’ll aim to expand this, but for now I’m just going to deploy 3 nodes and take it from there. Another future option is to bake the installation into a Packer-deployed image and use that for the Terraform execution. Tune in to find out the steps and what I decide to go with.
  2. I’m going to pull up the InteroperabilityBlackBox and start to flesh out some objects for our model. The idea is based on something I stumbled into last week during travels; the thread on that is here.

Friday (Today) @ 3pm PST

This Friday I’m aiming to cover some Go basics before moving further into the Colligere CLI app. Here are the highlights of the plan.

  1. I’m going to cover some of the topics around program structure including: type declarations, tuple assignment, variable lifetime, pointers, and other variable details.
  2.  I’m going to cover some basics on packages, initialization of packages, imports, and scope. This is an important aspect of ongoing development with Colligere since we’ll be pulling in a number of packages for generation of the data.
  3. Setting up configuration and schema for the Colligere application using Viper and related tooling.

Tuesday, October 2nd @ 3pm PST

This session I’m aiming to get some more Terraform work done around the spin up and shutdown of the cluster. I’ll dig into some more specific points depending on where I get to in the sessions before this one. But it’s on the schedule, so I’ll update this one in the coming days.

 

Thrashing Code Sessions via Twitch & Kick Ass Dis-Sys Meetup

Got some excellent coding and systems setup coming up in the next few days. There’s also a meetup on the 28th with Tim Kellogg and Alena Hall presenting on some interesting topics around distributed database data on Kubernetes and WebAssembly of the hot temperament type. And a new surprise guest is scheduled to swing into Valhalla on my Twitch channel and help build out a cluster and the respective needed DHCP, DNS, and related configuration for a setup on the metal!

Schedule