Category Archives: Software Projects

DSE6 + .NET v?

Project Repo: Interoperability Black Box

First steps. Let’s get .NET installed and set up. I’m running Ubuntu 18.04 for this setup and the start of the project. To install .NET on Ubuntu one needs to go through a multi-command process of adding keys and package sources; fortunately Microsoft’s teams have made this almost easy by providing the commands for the various Linux distributions here. The commands I ran to get all this initial setup done are as follows.

wget -qO- https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.asc.gpg
sudo mv microsoft.asc.gpg /etc/apt/trusted.gpg.d/
wget -q https://packages.microsoft.com/config/ubuntu/18.04/prod.list
sudo mv prod.list /etc/apt/sources.list.d/microsoft-prod.list
sudo chown root:root /etc/apt/trusted.gpg.d/microsoft.asc.gpg
sudo chown root:root /etc/apt/sources.list.d/microsoft-prod.list

After all this I could then install the .NET SDK. It’s been so long since I actually installed .NET on anything that I wasn’t sure if I just needed the runtime, the SDK, or what I’d actually need. I just assumed it would be safe to install the SDK and then install the runtime too.

sudo apt-get install apt-transport-https
sudo apt-get update
sudo apt-get install dotnet-sdk-2.1

Then the runtime.

sudo apt-get install aspnetcore-runtime-2.1
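
If you want to double check that the install took, the dotnet CLI will report the SDK version it picked up.

dotnet --version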

Alright. Now with this installed, I wanted to see if Jetbrains Rider would detect that .NET is now installed – or at least find out what I’d have to do to get the IDE to detect it. So I opened up the IDE to see what the results would be. Over on the left hand side of the new solution dialog, if anything isn’t installed Rider will usually display a message that such-and-such needs to be installed. But it looked like everything was showing up as installed – yay for things working (at this point)!

Next up is to get a solution started with the pertinent projects for what I want to build.

For the next stage I created three projects.

  1. InteroperabilityBlackBox – A basic class library that will be used by a console application or whatever other application or service that may need access to the specific business logic or what not.
  2. InteroperabilityBlackBox.Tests – An xUnit testing project for testing anything that might need some good ole’ testing.
  3. InteroperabilityBlackBox.Cli – A console application (CLI) that I’ll use to interact with the class library and add capabilities going forward.
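
If you’d rather skip the Rider dialogs, the same three projects can be stubbed out with the dotnet CLI. Something along these lines should produce an equivalent solution layout – the project names here just mirror the list above.

dotnet new sln -n InteroperabilityBlackBox
dotnet new classlib -n InteroperabilityBlackBox
dotnet new xunit -n InteroperabilityBlackBox.Tests
dotnet new console -n InteroperabilityBlackBox.Cli
dotnet sln add InteroperabilityBlackBox/InteroperabilityBlackBox.csproj
dotnet sln add InteroperabilityBlackBox.Tests/InteroperabilityBlackBox.Tests.csproj
dotnet sln add InteroperabilityBlackBox.Cli/InteroperabilityBlackBox.Cli.csproj
dotnet add InteroperabilityBlackBox.Tests reference InteroperabilityBlackBox/InteroperabilityBlackBox.csproj
dotnet add InteroperabilityBlackBox.Cli reference InteroperabilityBlackBox/InteroperabilityBlackBox.csproj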

Alright, now that all the basic projects are set up in the solution, I’ll go out and see about the .NET DataStax Enterprise driver. Inside Jetbrains Rider I can right click on a particular project that I want to add or manage dependencies for. I did that and then put “dse” in the search box. The dialog pops up from the bottom of the IDE and you can add the package by clicking the plus sign at the bottom right of the description box to the right. Once you click the plus sign and the package installs, it becomes a little red x.

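If you prefer the command line to the Rider dialog, the NuGet package can presumably be added with the dotnet CLI as well – the Dse package id here is the same one from the search above.

dotnet add InteroperabilityBlackBox/InteroperabilityBlackBox.csproj package Dse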

Alright. Now it’s almost time to get some code working. We need ourselves a database first, however. I’m going to set up a cluster in Google Cloud Platform (GCP), but feel free to use whatever cluster you’ve got; these instructions will basically be reusable wherever you’ve got your cluster set up. I wrote up a walk through and instructions for the GCP Marketplace a few weeks ago, and I used the same offering to get this example cluster up and running. So, now back to getting the first snippets of code working.

Let’s write a test first.

[Fact]
public void ConfirmDatabase_Connects_False()
{
    var box = new BlackBox();
    Assert.Equal(false, box.ConfirmConnection());
}

In this test, I named the class BlackBox and am planning to have a parameterless constructor. But as things go, tests are very fluid, or ought to be, and I may change it in the next iteration. I’m thinking, at least to get started, that I’ll have a method to test and confirm a connection for the CLI. I’ve named it ConfirmConnection for that purpose. Initially I’m going to test for false, but that’s primarily just to get started. Now, time to implement.

using System;
using Dse;
using Dse.Auth;

namespace InteroperabilityBlackBox
{
    public class BlackBox
    {
        public BlackBox()
        {}

        public bool ConfirmConnection()
        {
            return false;
        }
    }
}

That gives a passing test and I move forward. For more of the run through of moving from this first step to the finished code, check out the full session.

By the end of the coding session I had a few tests.

using Xunit;

namespace InteroperabilityBlackBox.Tests
{
    public class MakingSureItWorksIntegrationTests
    {
        [Fact]
        public void ConfirmDatabase_Connects_False()
        {
            var box = new BlackBox();
            Assert.Equal(false, box.ConfirmConnection());
        }

        [Fact]
        public void ConfirmDatabase_PassedValuesConnects_True()
        {
            var box = new BlackBox("cassandra", "", "");
            Assert.Equal(false, box.ConfirmConnection());
        }

        [Fact]
        public void ConfirmDatabase_PassedValuesConnects_False()
        {
            var box = new BlackBox("cassandra", "notThePassword", "");
            Assert.Equal(false, box.ConfirmConnection());
        }
    }
}

The respective code for connecting to the database cluster, per the walk through I wrote about here, at session end looked like this.

using System;
using Dse;
using Dse.Auth;

namespace InteroperabilityBlackBox
{
    public class BlackBox : IBoxConnection
    {
        public BlackBox(string username, string password, string contactPoint)
        {
            UserName = username;
            Password = password;
            ContactPoint = contactPoint;
        }

        public BlackBox()
        {
            UserName = "ConfigValueFromSecretsVault";
            Password = "ConfigValueFromSecretsVault";
            ContactPoint = "ConfigValue";
        }

        public string ContactPoint { get; set; }
        public string UserName { get; set; }
        public string Password { get; set; }

        public bool ConfirmConnection()
        {
            IDseCluster cluster = DseCluster.Builder()
                .AddContactPoint(ContactPoint)
                .WithAuthProvider(new DsePlainTextAuthProvider(UserName, Password))
                .Build();

            try
            {
                cluster.Connect();
                return true;
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
                return false;
            }

        }
    }
}

With my interface providing the contract to meet.

namespace InteroperabilityBlackBox
{
    public interface IBoxConnection
    {
        string ContactPoint { get; set; }
        string UserName { get; set; }
        string Password { get; set; }
        bool ConfirmConnection();
    }
}

Conclusions & Next Steps

After I wrapped up the session, two things stood out that needed to be fixed for the next session. I’ll be sure to add these as objectives for the next coding session at 3pm PST on Thursday.

  1. The tests really need to more resiliently confirm the integrations that I was working to prove out. My plan at this point is to add some Docker images that would give the development integration tests a point to work against (a rough sketch of that idea follows after this list). This would alleviate the need for something outside of the actual project in the repository to exist, removing that fragility.
  2. The application, in its “Black Box”, should do something. For the next session we’ll write up some feature requests we’d want – or maybe someone has suggestions of functionality they’d like to see implemented in a CLI using .NET Core working against a DataStax Enterprise Cassandra Database Cluster? Feel free to leave a comment or three about a feature, and I’ll work on adding it during the next session.
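
For that first item, the Docker idea would probably look something along these lines. This is only a sketch – the image name, port, and license flag are assumptions to verify against the current DataStax documentation, not something from this session.

docker run -e DS_LICENSE=accept --name dse-integration -p 9042:9042 -d datastax/dse-server

The integration tests could then use localhost:9042 as their contact point instead of depending on a cluster that lives outside the repository.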

Documentation First w/ README.md && Project Tree Build

I’m sitting here trying to get the folder structure for my project into a kind of ASCII tree or something. I wasn’t going to do this manually; it would be insane, especially on any decent sized enterprise project with an endless supply of folders and nested content. I went digging to come up with a better solution. On Linux I immediately found the Tree utility, which was perfect.

Except I was on OS-X.

The first option I gave a go was building the thing, because I like to do things the hard way sometimes. First I needed to get the source, which is available here.

curl -O ftp://mama.indstate.edu/linux/tree/tree-1.7.0.tgz

Once downloaded, extract the source into a directory, open the Makefile, and find the section for the particular operating system you want to use the utility on. The section of OS settings looked like this when I finished editing it.

# Uncomment options below for your particular OS:

# Uncomment for OS X:
CC=cc
CFLAGS=-O2 -Wall -fomit-frame-pointer -no-cpp-precomp
LDFLAGS=
MANDIR=/usr/share/man/man1
OBJS+=strverscmp.o

Now build the tree binary itself.

./configure
make

Now let’s get tree into the executable path.

sudo mkdir -p /usr/local/bin
sudo cp tree /usr/local/bin/tree

Make sure your ~/.bash_profile is set up right; include this.

export PATH="/usr/local/bin:$PATH"

Reload the shell and tree should be available as a command.

The other option, which is really simple if you don’t want to compile the code, is to just use brew to install it.

brew install tree

So now you can use tree, and do cool stuff like pipe it out to a file. If you’re running this against a Node.js Project you may want to delete the node_modules directory and then just reinstall it after running the tree command.

tree > proj-tree.md
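
Alternatively, tree has an -I flag to exclude a pattern, so you can skip deleting node_modules entirely and just filter it out of the output.

tree -I node_modules > proj-tree.md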

Then in your README.md file you can include the folder structure in the description of the project. Here’s a sample output!

.
├── README.md
├── client
│   └── README.md
├── package.json
├── proj-tree.md
├── server
│   ├── boot
│   │   ├── authentication.js
│   │   ├── explorer.js
│   │   ├── rest-api.js
│   │   └── root.js
│   ├── config.json
│   ├── datasources.json
│   ├── middleware.json
│   ├── model-config.json
│   └── server.js
└── test
    └── test_exists.js

4 directories, 14 files

That’s a super easy way to offer better documentation that provides some real insight into what various parts of the project structure are actually for.

Plotting Good Things in Portland :: pdxbridge.js / WTF Databases /

Several people got together yesterday to start planning things for 2014 in PDX. It ranged from coding workshops to PDX Node to Node PDX to what kind of food to eat for lunch. Ya know, daily tactical things that come along with the big picture items. 😉

bridge.js badge.

Two things that I want to bring up to the community out there. One is a workshop that I’ll likely lead efforts to organize and the other is something I’ll just call pdxbridge.js for now. The workshop will cover the topics of which and what databases to use for what data and how to implement. The pdxbridge.js project is about determining the raised or lowered state of the bridges here in Portland.

Some of the other projects, workshops and topics we discussed included getting a workshop put together around unit and integration testing of code, whether from a behavioral, test driven development, or other approach. We don’t have anyone to teach this workshop yet, but we’d (ok, so I really really would love to attend a workshop on this) really like to find somebody who would be willing to teach a workshop of this sort, with a focus on JavaScript as the language. On that same topic however, if you’re into Java, Erlang, Scala, Haskell or others and would like to teach a TDD, BDD or related testing workshop, please get in touch with me. We will work on making that happen! Ping me at adron at composite code dot com. 😉

Workshop: Intro to Databases & Data

(Relational, Key/Value, Distributed, Graph, Event Series, etc.)

This is a course I’ll lead, and others will work with me to put something extra useful together. We will then teach the workshop as a group – kind of a team paired programming teaching workshop. If there is anything in particular that you’d like to learn about, or any questions that you have about data and its usage in applications, add your two cents in this blog entry’s comments. Over the next month we’ll be putting together the material and have the course available sometime early this year. So if you’d like to attend, jump into the conversation at any time or just keep reading here, and I’ll have more information about the course as we get it put together.

Let’s Make pdxbridge.js Happen!

The pdxbridge.js project is all about determining if a bridge in Portland is up or down. Right now there are five bridges that matter for this.

If we add other information to track about the bridges, we might add the other three that exist and the new bridge that is being built. However, the five that matter are the only bridges that have a raised and lowered state, and in one case – the Steel Bridge – there is a lowered, partially raised and fully raised state, as shown on the pdxbridge.js badge I threw together (shown above).
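
Nothing is designed yet, but just to make that state model concrete, here is one purely hypothetical way the states might be represented in the eventual JavaScript – the names below are made up for illustration and are not part of the project.

// Hypothetical sketch only; the real pdxbridge.js design is still up for discussion.
var BridgeState = {
    LOWERED: 'lowered',
    PARTIALLY_RAISED: 'partially raised', // so far only the Steel Bridge needs this
    RAISED: 'raised'
};

var steelBridge = { name: 'Steel Bridge', state: BridgeState.LOWERED };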

To get involved with pdxbridge.js go add your input on this issue I started to discuss our first meet, plan and hack.

How to Build an NPM Package, Beginning the Symphonize Project

NPM has helped to build on the massive popularity of Node.js and drive JavaScript from a simple scripting language in the web browser to a powerful and capable back-end server language. A quick refresher: NPM stands for Node.js Package Manager, and a package is any of the following:

  1. a folder containing a program described by a package.json file
  2. a gzipped tarball containing (1)
  3. a url that resolves to (2)
  4. a <name>@<version> that is published on the registry with (3)
  5. a <name>@<tag> that points to (4)
  6. a <name> that has a “latest” tag satisfying (5)
  7. a git url that, when cloned, results in (1)

Path structure view in Jetbrains Webstorm IDE.

With that basic understanding of what an NPM module is, let’s jump through the steps to build a module that provides some basic functionality. I won’t cover too many parts in detail yet, just the happy path to getting an NPM library running.

First let’s create an appropriate folder and file structure to get started with. Here are the commands I ran to get started.

mkdir bin
mkdir lib

With these two directories created I then created the following files in the designated paths. In bin I created the symphonize.js file and in lib I created a main.js file.

Now, I added the following code to the symphonize.js file.

// Returns 'yes' if forThis is found within searchThis, otherwise 'no'.
exports.Coupling = function (searchThis, forThis) {
    var returnValue = 'no';
    if (searchThis.indexOf(forThis) > -1) {
        returnValue = 'yes';
    }
    return returnValue;
};
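
As a quick sanity check of that logic, a hypothetical usage from the project root would behave like this – the snippet isn’t part of the package, it’s just for illustration.

var couple = require('./bin/symphonize');
console.log(couple.Coupling("Sample text", "Sample"));  // prints 'yes'
console.log(couple.Coupling("Sample text", "Missing")); // prints 'no'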

In the main.js file I added the following.

(function () {
    var couple = require('../bin/symphonize');
    // The result is discarded for now; this just proves the module wires up.
    couple.Coupling("Sample text", "Sample");
}).call(this);

There are a number of issues with this code, I know, but it’s just a sample of the minimal amount of code, folder structure, and package.json that I need to get this package installed and ready for iteration as I move forward with the actual code base and whatever functionality will actually be added. Speaking of the package.json file, I created one and added the following configuration settings to it.

{
    "author": "Adron Hall",
    "name": "symphonize",
    "description": "Prints out data to the console! Will be iterating soon for real functionality!",
    "version": "0.1.0",
    "repository": {
        "url": "git@github.com:Adron/symphonize.git"
    },
    "main": "./lib/main",
    "bin": {
        "replaceme": "./bin/symphonize"
    },
    "dependencies": {},
    "devDependencies": {},
    "optionalDependencies": {},
    "engines": {
        "node": "*"
    }
}

That is now enough for me to at least get the module added to the global NPM repository, get things pointed back to Github appropriately, and move forward with actual coding. I might even set up some continuous builds and delivery at some point, since I’ve now got the endpoint of where the libraries will be going. The command to get a module uploaded to the NPM repository is as follows. This command of course assumes I’ve already added a user with npm adduser or added one via the web site interface at https://npmjs.org/.

npm publish

I’ve now got everything prepared and uploaded to NPM; there is now a symphonize module library ready for use.
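
From there, anyone should be able to pull it into a project the usual way.

npm install symphonize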

My NPM Page for Symphonize. Click to go to the actual NPM page.

Here are a few quick references to where everything is: the symphonize repository on Github (the git URL is in the package.json above) and the symphonize module page on npmjs.org.

Orchestrate.io JavaScript Client Library

Today I’m starting a project working with Orchestrate.io’s API & open source software collaborations. More about the project in a moment; first let’s get up to speed on what I’ll be including in it. My main focus is to build a client library to access Orchestrate.io. While building this I’ll dive into the key value, graph and other storage mechanisms that the client library will provide. Beyond that, I’ll take a stroll through building an NPM library and the pertinent JavaScript for the library. So buckle up, we’re going on a code slinging, hash writing hacking session.

Over the course of putting this together, I’ll be posting most of the core material on Orchestrate.io’s blog, so subscribe for updates as they come out. Feedly is a good option – connect by searching for “orchestrate.io” or navigate over to the Orchestrate.io blog itself. 😉

Project Effort Context

While building the client I’ll take a dive into who, what, where, when, why and how to interact with the various data structures. I’ll aim for the client to follow the model of the existing Go client library that is available as the Orchestrate Go Client on Github. It follows a basic model, as shown below in Go.

    c := client.NewClient("Your API Key")
    // Get a value
    value, _ := c.Get("collection", "key")
    // Put a value
    c.Put("collection", "key", strings.NewReader("Some JSON"))
    // Search
    results, _ := c.Search("collection", "A Lucene Query")
    // Get Events
    events, _ := c.GetEvents("collection", "key", "kind")
    // Put Event
    c.PutEvent("collection", "key", "kind", strings.NewReader("Some JSON"))
    // Get Relations
    relations, _ := c.GetRelations("collection", "key", []string{"kind", "kind"})
    // Put Relation
    c.PutRelation("sourceCollection", "sourceKey", "kind", "sinkCollection", "sinkKey")
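
For comparison while designing the JavaScript client, here’s a rough sketch of what roughly equivalent usage might look like from Node.js. This is purely hypothetical – the module name, method names, and callback style are assumptions for illustration, not the actual library API.

// Hypothetical shape of the JavaScript client; nothing here is implemented yet.
var client = require('orchestrate-client')('Your API Key');

// Get a value
client.get('collection', 'key', function (err, value) { /* use value */ });

// Put a value
client.put('collection', 'key', { some: 'json' }, function (err) { /* handle err */ });

// Search
client.search('collection', 'A Lucene Query', function (err, results) { /* use results */ });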

I’ll be working on this client, but don’t hold back on me – feel free to jump in with some of your own code or to tell me I wrote some code wrong or whatever. I’d gladly accept any committers jumping in to help out. The more we all work together, the more useful information I can provide during this project.

Once this project has produced a workable client, and pending interest from the community, I’ll put together some material about where, how, and some of the best ways to use the client in your Node.js application. I may even prospectively build a JavaScript client side library for use with Angular or other popular client side libraries.

Junction Two Weeks on Tuesday on Friday Bi-weekly Review : Issue #003

…and another update on Junction.

The review slipped past me this week. A little food poisoning will do that to a person. But I’m kicking again.

A Quick Summary

The RSS/News Feed section of the app has been built and put into action. So if you pull the latest code, run the application, and navigate into the news section, you’ll get the Basho Blog feed. This definitely needs to be cleaned up a bit from the UI perspective, but the main elements are there.

Next steps are…

One of the things the team is aiming to knock out next is to get some MVVM (Model View ViewModel) architecture set up to build against, versus what we’ve started with, which is just the basic skeleton of things thrown together. It works, but it’ll be nice to have some clean architecture behind the application to work with.

In the coming week I’m aiming to put together a blog entry on troubleshooting the build server for Junction, and also a how-to on setting up the RSS/News Feed Reader section of the app. Subscribe to keep up with the latest in Junction news and all the other tidbits on the blog. Cheers!

Junction Two Weeks on Tuesday Bi-weekly Review : Issue #002

It’s time for another Tuesday bi-weekly review! We’ve been making some progress, and so far we’ve tackled a few elements of the project. The first big task was to get more information out there for the community & team working on the project. I’ve spent some time, along with the contributors on Github and via other means, making more information available about what the intent is and how people can contribute. So if you’re interested in helping with an entire domain space or merely a small element of the application, ping me and I’ll work with you to make it as easy as possible to contribute. With that, let’s jump into what’s what and what’s new. Cheers!

We Have a Build Server, More on This Soon, but for now…

I’ll have a post on how to set up Team City and a quick tour of what is set up for the Junction Project. So stay tuned – I’ll have that and other news posted as it happens this coming week, along with Team City & other tutorials related to the project itself. For a quick sneak peek, feel free to take a look at the build server located at http://teamcity.cascadiahacks.org/. Just log in with “guest” and no password.

More Items Listed and Working on First Feature Commits and Comments For…

We also got a conversation started among a few of us: “What would teams that use Riak like to see in a Riak Admin Application?” Jump in and add your two cents, regardless of whether you’re diving into the project or not.

Until later, happy coding!