How to Reconnoiter a New Role!

“i.e. Starting a challenging new role!”

I’m stepping into a new role right now, which I announced recently in “Career Update: Back to Engineering!”. In that role there are a number of key topics and role-specific knowledge I need to attain. Most of this centers around the current state of the teams, the members of those teams, work in progress, the product, and service status. The following are some of the important steps I’ve taken to reconnoiter the current state of things and get up to speed as quickly as possible.

DSE6 + .NET v?

Project Repo: Interoperability Black Box

First steps. Let’s get .NET installed and set up. I’m running Ubuntu 18.04 for this setup and the start of the project. Installing .NET on Ubuntu involves a multi-command process of adding keys and package sources; fortunately Microsoft has made this almost easy by documenting the commands for the various Linux distributions. The commands I ran to get all this initial setup done are as follows.

[sourcecode language=”bash”]
wget -qO- https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.asc.gpg
sudo mv microsoft.asc.gpg /etc/apt/trusted.gpg.d/
wget -q https://packages.microsoft.com/config/ubuntu/18.04/prod.list
sudo mv prod.list /etc/apt/sources.list.d/microsoft-prod.list
sudo chown root:root /etc/apt/trusted.gpg.d/microsoft.asc.gpg
sudo chown root:root /etc/apt/sources.list.d/microsoft-prod.list
[/sourcecode]

After all this I could then install the .NET SDK. It’s been so long since I’d actually installed .NET on anything that I wasn’t sure whether I needed just the runtime, the SDK, or both. I assumed it would be safe to install the SDK and then install the runtime too.

[sourcecode language=”bash”]
sudo apt-get install apt-transport-https
sudo apt-get update
sudo apt-get install dotnet-sdk-2.1
[/sourcecode]

Then the runtime.

[sourcecode language=”bash”]
sudo apt-get install aspnetcore-runtime-2.1
[/sourcecode]
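Before firing up an IDE, a quick sanity check from the terminal shows what actually landed on the machine. A minimal sketch, nothing project-specific here:

[sourcecode language="bash"]
# List the SDKs and runtimes the dotnet CLI can see.
dotnet --info
[/sourcecode]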

Alright. Now with this installed, I wanted to see if JetBrains Rider would detect that .NET is now installed, or at least find out what I’d have to do for the IDE to detect it. So I opened up the IDE to see what the results would be. On the left hand side of the new solution dialog, if anything isn’t installed, Rider will usually display a message that whatever is missing needs to be installed. But it looked like everything showed up as installed, so yay for things working (at this point)!


Next up is to get a solution started with the pertinent projects for what I want to build.



For the next stage I created three projects; the equivalent dotnet CLI commands are sketched just after the list.

  1. InteroperabilityBlackBox – A basic class library that will be used by the console application or whatever other application or service needs access to the specific business logic.
  2. InteroperabilityBlackBox.Tests – An xUnit testing project for testing anything that needs some good ole' testing.
  3. InteroperabilityBlackBox.Cli – A console application (CLI) that I’ll use to interact with the class library and add capabilities to going forward.
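If you’d rather skip the IDE dialogs, here’s a rough sketch of the equivalent setup with the dotnet CLI; the solution and project names simply mirror the list above.

[sourcecode language="bash"]
# Create the solution and the three projects.
dotnet new sln -n InteroperabilityBlackBox
dotnet new classlib -o InteroperabilityBlackBox
dotnet new xunit -o InteroperabilityBlackBox.Tests
dotnet new console -o InteroperabilityBlackBox.Cli

# Wire the projects into the solution and reference the class library.
dotnet sln add InteroperabilityBlackBox/InteroperabilityBlackBox.csproj
dotnet sln add InteroperabilityBlackBox.Tests/InteroperabilityBlackBox.Tests.csproj
dotnet sln add InteroperabilityBlackBox.Cli/InteroperabilityBlackBox.Cli.csproj
dotnet add InteroperabilityBlackBox.Tests/InteroperabilityBlackBox.Tests.csproj reference InteroperabilityBlackBox/InteroperabilityBlackBox.csproj
dotnet add InteroperabilityBlackBox.Cli/InteroperabilityBlackBox.Cli.csproj reference InteroperabilityBlackBox/InteroperabilityBlackBox.csproj
[/sourcecode]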

Alright, now that all the basic projects are set up in the solution, I’ll go get the .NET DataStax Enterprise driver. Inside JetBrains Rider I can right click on the particular project I want to add or manage dependencies for. I did that and then put “dse” in the search box. The dialog pops up from the bottom of the IDE and you can add the package by clicking the plus sign at the bottom right of the description box on the right. Once you click the plus sign and the package installs, it becomes a little red x.
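For the terminal-inclined, the same dependency can be added with the dotnet CLI as well; a quick sketch, assuming the package id Dse that the NuGet search surfaced.

[sourcecode language="bash"]
# Add the DataStax Enterprise C# driver to the class library project.
dotnet add InteroperabilityBlackBox/InteroperabilityBlackBox.csproj package Dse
[/sourcecode]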


Alright. Now it’s almost time to get some code working. We need ourselves a database first, however. I’m going to set up a cluster in Google Cloud Platform (GCP), but feel free to use whatever cluster you’ve got; these instructions are reusable wherever your cluster is set up. I wrote up a walkthrough and instructions for the GCP Marketplace a few weeks ago and used that same offering to get this example cluster up and running. So, now back to getting the first snippets of code working.

Let’s write a test first.

[sourcecode language="csharp"]
[Fact]
public void ConfirmDatabase_Connects_False()
{
    var box = new BlackBox();
    Assert.Equal(false, box.ConfirmConnection());
}
[/sourcecode]

In this test, I named the class BlackBox and am planning on a parameterless constructor. But tests are fluid, or ought to be, and I may change it in the next iteration. I’m thinking, at least to get started, that I’ll have a method to test and confirm a connection for the CLI; I’ve named it ConfirmConnection for that purpose. Initially I’m going to test for false, but that’s primarily just to get started. Now, time to implement.

[sourcecode language="csharp"]
using System;
using Dse;
using Dse.Auth;

namespace InteroperabilityBlackBox
{
    public class BlackBox
    {
        public BlackBox()
        {}

        public bool ConfirmConnection()
        {
            return false;
        }
    }
}
[/sourcecode]

That gives a passing test and I move forward. For more of the run-through from this first step to the finished code, check out the full session.
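For anyone following along outside of Rider, running the suite from the terminal is a one-liner; a quick sketch:

[sourcecode language="bash"]
# Build and run the xUnit tests in the test project.
dotnet test InteroperabilityBlackBox.Tests
[/sourcecode]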

By the end of the coding session I had a few tests.

[sourcecode language="csharp"]
using Xunit;

namespace InteroperabilityBlackBox.Tests
{
    public class MakingSureItWorksIntegrationTests
    {
        [Fact]
        public void ConfirmDatabase_Connects_False()
        {
            var box = new BlackBox();
            Assert.Equal(false, box.ConfirmConnection());
        }

        [Fact]
        public void ConfirmDatabase_PassedValuesConnects_True()
        {
            var box = new BlackBox("cassandra", "", "");
            Assert.Equal(false, box.ConfirmConnection());
        }

        [Fact]
        public void ConfirmDatabase_PassedValuesConnects_False()
        {
            var box = new BlackBox("cassandra", "notThePassword", "");
            Assert.Equal(false, box.ConfirmConnection());
        }
    }
}
[/sourcecode]

The respective code for connecting to the database cluster, per the walkthrough I wrote up, looked like this at session end.

[sourcecode language="csharp"]
using System;
using Dse;
using Dse.Auth;

namespace InteroperabilityBlackBox
{
    public class BlackBox : IBoxConnection
    {
        public BlackBox(string username, string password, string contactPoint)
        {
            UserName = username;
            Password = password;
            ContactPoint = contactPoint;
        }

        public BlackBox()
        {
            UserName = "ConfigValueFromSecretsVault";
            Password = "ConfigValueFromSecretsVault";
            ContactPoint = "ConfigValue";
        }

        public string ContactPoint { get; set; }
        public string UserName { get; set; }
        public string Password { get; set; }

        public bool ConfirmConnection()
        {
            IDseCluster cluster = DseCluster.Builder()
                .AddContactPoint(ContactPoint)
                .WithAuthProvider(new DsePlainTextAuthProvider(UserName, Password))
                .Build();

            try
            {
                cluster.Connect();
                return true;
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
                return false;
            }
        }
    }
}
[/sourcecode]

With my interface providing the contract to meet.

[sourcecode language="csharp"]
namespace InteroperabilityBlackBox
{
    public interface IBoxConnection
    {
        string ContactPoint { get; set; }
        string UserName { get; set; }
        string Password { get; set; }
        bool ConfirmConnection();
    }
}
[/sourcecode]

Conclusions & Next Steps

After I wrapped up the session, two things stood out that needed to be fixed before the next session. I’ll be sure to add these as objectives for the next coding session at 3pm PST on Thursday.

  1. The tests really need to more resiliently confirm the integrations I was working to prove out. My plan at this point is to add some Docker images that would give the development integration tests something local to work against, which would remove the need for anything outside of the repository to exist and take that fragility away. (A rough sketch of this idea follows the list.)
  2. The application, in its “Black Box”, should do something. For the next session we’ll write up some feature requests we’d want. Or maybe someone has suggestions of functionality they’d like to see implemented in a CLI using .NET Core working against a DataStax Enterprise Cassandra database cluster? Feel free to leave a comment or three about a feature, and I’ll work on adding it during the next session.
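As a rough sketch of what item 1 might look like, assuming the datastax/dse-server image on Docker Hub (the image name, tag, and license environment variable are assumptions to verify there), the integration tests could point at a local container like this.

[sourcecode language="bash"]
# Run a single DSE node locally for the integration tests to connect to.
# Image name, tag, and the DS_LICENSE env var are assumptions to verify on Docker Hub.
docker run --name blackbox-test-db -d -p 9042:9042 \
  -e DS_LICENSE=accept datastax/dse-server:6.0.2
# Tail the logs until the node reports it is listening for CQL clients on port 9042.
docker logs -f blackbox-test-db
[/sourcecode]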

Documentation First w/ README.md && Project Tree Build

I’m sitting here trying to get the folder structure of my project into a kind of ASCII tree or something. I wasn’t going to do this manually; it would be insane, especially on any decent-sized enterprise project with an endless supply of folders and nested content. So I went digging for a better solution. On Linux I immediately found the tree utility, which was perfect.

Except I was on OS X.

The first option I gave a go was to build the thing myself, because I like to do things the hard way sometimes. First I needed to get the source, which is available from the project’s FTP site.

[sourcecode language="bash"]
curl -O ftp://mama.indstate.edu/linux/tree/tree-1.7.0.tgz
tar -xzf tree-1.7.0.tgz
[/sourcecode]

Once downloaded and extracted, open the Makefile in the source directory and find the section for the particular operating system you want to build the utility for. The OS settings section looked like this when I finished editing it.

[sourcecode language=”bash”]
# Uncomment options below for your particular OS:

# Uncomment for OS X:
CC=cc
CFLAGS=-O2 -Wall -fomit-frame-pointer -no-cpp-precomp
LDFLAGS=
MANDIR=/usr/share/man/man1
OBJS+=strverscmp.o
[/sourcecode]

Now build the binary.

[sourcecode language="bash"]
make
[/sourcecode]

Now let’s get tree into the executable path.

[sourcecode language=”bash”]
sudo mkdir -p /usr/local/bin
sudo cp tree /usr/local/bin/tree
[/sourcecode]

Make sure your ~/.bash_profile is set up right and includes this.

[sourcecode language=”bash”]
export PATH="/usr/local/bin:$PATH"
[/sourcecode]

Reload the shell and tree should be available as a command.
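To reload and verify in one go, something like this works.

[sourcecode language="bash"]
# Pick up the updated PATH and confirm the freshly built binary responds.
source ~/.bash_profile
tree --version
[/sourcecode]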

The other option, which is really simple if you don’t want to compile the code, is to just use brew to install it.

[sourcecode language=”bash”]
brew install tree
[/sourcecode]

So now you can use tree and do cool stuff like pipe its output to a file. If you’re running this against a Node.js project you may want to delete the node_modules directory and reinstall it after running the tree command, or skip that entirely with the exclude flag shown after the example below.

[sourcecode language="bash"]
tree > proj-tree.md
[/sourcecode]
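Alternatively, tree’s -I flag excludes anything matching a pattern, which skips the delete-and-reinstall dance for node_modules entirely.

[sourcecode language="bash"]
# Generate the same listing while ignoring the node_modules directory.
tree -I 'node_modules' > proj-tree.md
[/sourcecode]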

Then in your README.md file you can include the folder structure in the description of the project. Here’s a sample output!

[sourcecode language=”bash”]
.
├── README.md
├── client
│   └── README.md
├── package.json
├── proj-tree.md
├── server
│   ├── boot
│   │   ├── authentication.js
│   │   ├── explorer.js
│   │   ├── rest-api.js
│   │   └── root.js
│   ├── config.json
│   ├── datasources.json
│   ├── middleware.json
│   ├── model-config.json
│   └── server.js
└── test
    └── test_exists.js

4 directories, 14 files
[/sourcecode]

That’s a super easy way to offer better documentation that provides some real insight into what various parts of the project structure are actually for.

Plotting Good Things in Portland :: pdxbridge.js / WTF Databases /

Several people got together yesterday to start planning things for 2014 in PDX. It ranged from coding workshops to PDX Node to Node PDX to what kind of food to eat for lunch. Ya know, the daily tactical things that come along with the big picture items. 😉

bridge.js badge.

Two things that I want to bring up to the community out there. One is a workshop that I’ll likely lead efforts to organize, and the other is something I’ll just call pdxbridge.js for now. The workshop will cover which databases to use for which data and how to implement them. The pdxbridge.js project is about determining the raised or lowered state of the bridges here in Portland.

Some of the other projects, workshops, and topics we discussed included putting together a workshop around unit and integration testing, whether from a behavioral, test-driven development, or other approach. We don’t have anyone to teach this workshop yet, but we’d (ok, so I really really would love to attend a workshop on this) really like to find somebody willing to teach one of this sort, with a focus on JavaScript as the language. On that same topic, if you’re into Java, Erlang, Scala, Haskell, or others and would like to teach a TDD, BDD, or related testing workshop, please get in touch with me. We will work on making that happen! Ping me at adron at composite code dot com. 😉

Workshop: Intro to Databases & Data

(Relational, Key/Value, Distributed, Graph, Event Series, etc.)

This is a course I’ll lead, and others will work with me to put something extra useful together. We will then teach the workshop as a group, kind of a team pair-programming teaching workshop. If there is anything in particular you’d like to learn about, or any questions you have about data and its usage in applications, add your two cents in this blog entry’s comments. Over the next month we’ll be putting together the material and will have the course available sometime early this year. So if you’d like to attend, jump into the conversation at any time, or just keep reading here and I’ll have more information about the course as we get it put together.

Let’s Make pdxbridge.js Happen!

The pdxbridge.js project is all about determining if a bridge in Portland is up or down. Right now there are several bridges that matter: the five that actually raise and lower.

If we add other information to track about the bridges, we might add the other three that exist and the new bridge that is being built. However, those five are the only bridges that have a raised and lowered state, and in one case, the Steel Bridge, there’s a lowered, partially raised, and fully raised state, as shown on the pdxbridge.js badge I threw together (above).

To get involved with pdxbridge.js, go add your input on the issue I started to discuss our first meet, plan, and hack.

How to Build an NPM Package, Beginning the Symphonize Project

NPM has helped build on the massive popularity of Node.js and drive JavaScript from a simple scripting language in the web browser to a powerful and capable back-end server language. A quick refresher: NPM stands for Node.js Package Manager, and a package is any of the following (each form is something npm install can consume directly, as sketched after the list):

  1. a folder containing a program described by a package.json file,
  2. a gzipped tarball containing (1),
  3. a url that resolves to (2),
  4. a <name>@<version> that is published on the registry with (3),
  5. a <name>@<tag> that points to (4),
  6. a <name> that has a "latest" tag satisfying (5),
  7. a git url that, when cloned, results in (1).

Path structure view in the JetBrains WebStorm IDE.
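Here’s a quick sketch of what installing each of those forms looks like; the package names and URLs are hypothetical placeholders.

[sourcecode language="bash"]
npm install ./my-package                              # (1) a folder containing a package.json
npm install ./my-package-0.1.0.tgz                    # (2) a gzipped tarball of that folder
npm install https://example.com/my-package-0.1.0.tgz  # (3) a url that resolves to a tarball
npm install my-package@0.1.0                          # (4) a <name>@<version> from the registry
npm install my-package@latest                         # (5) a <name>@<tag>
npm install my-package                                # (6) a <name> with a "latest" tag
npm install git://github.com/someone/my-package.git   # (7) a git url that clones to (1)
[/sourcecode]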

With that basic understanding of what an NPM package is, let’s jump through the steps to build a module that provides some basic functionality. I won’t cover too many parts in detail yet, just the happy path to getting an NPM library running.

First let’s create an appropriate folder and file structure to get started with. Here are the commands I ran.

[sourcecode language=”bash”]
mkdir bin
mkdir lib
[/sourcecode]

With these two directories created, I then added the following files in the designated paths: in bin, the symphonize.js file, and in lib, a main.js file.

Now, I added the following code to the symphonize.js file.

[sourcecode language="javascript"]
exports.Coupling = function (searchThis, forThis) {
    var returnValue = 'no';
    if (searchThis.indexOf(forThis) > -1) {
        returnValue = 'yes';
    }
    return returnValue;
};
[/sourcecode]

In the main.js file I added the following.

[sourcecode language="javascript"]
(function () {
    var couple = require('../bin/symphonize');
    couple.Coupling("Sample text", "Sample");
}).call(this);
[/sourcecode]

There are a number of issues with this code, I know, but it’s just a sample of the minimal amount of code, folder structure, and package.json that I need to get this package installed and ready for iteration as I move forward with the actual code base and whatever functionality actually gets added. Speaking of the package.json file, I created one and added the following configuration settings to it.

[sourcecode language="javascript"]
{
    "author": "Adron Hall",
    "name": "symphonize",
    "description": "Prints out data to the console! Will be iterating soon for real functionality!",
    "version": "0.1.0",
    "repository": {
        "url": "git@github.com:Adron/symphonize.git"
    },
    "main": "./lib/main",
    "bin": {
        "replaceme": "./bin/symphonize"
    },
    "dependencies": {},
    "devDependencies": {},
    "optionalDependencies": {},
    "engines": {
        "node": "*"
    }
}
[/sourcecode]

That is now enough for me to at least get the module added to the public NPM registry, get things pointed back to GitHub appropriately, and move forward with actual coding. I might even set up some continuous builds and delivery at some point, since I’ve now got the endpoint for where the library will be going. The command to get a module uploaded to the NPM registry is as follows. This command of course assumes I’ve already added a user with npm adduser or created one via the web interface at https://npmjs.org/.

[sourcecode language=”bash”]
npm publish
[/sourcecode]

I’ve now got everything prepared and uploaded to NPM, so there is now a symphonize module library ready for use.
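To confirm the publish from a consumer’s point of view, here’s a quick sketch; the require path assumes the bin/symphonize.js layout shown above.

[sourcecode language="bash"]
# Install the published package into some other project and exercise Coupling.
npm install symphonize
node -e "console.log(require('symphonize/bin/symphonize').Coupling('Sample text', 'Sample'))"
[/sourcecode]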

My NPM page for Symphonize.

Here are a few quick references to where everything is: