Tag Archives: tdd

DSE6 + .NET v?

Project Repo: Interoperability Black Box

First steps. Let’s get .NET installed and set up. I’m running Ubuntu 18.04 for this setup and the start of the project. To install .NET on Ubuntu one needs to go through a multi-command process of keys and repository configuration; fortunately Microsoft’s teams have made this almost easy by providing the commands for the various Linux distributions here. The commands I ran to get all this initial setup done are as follows.

wget -qO- https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.asc.gpg
sudo mv microsoft.asc.gpg /etc/apt/trusted.gpg.d/
wget -q https://packages.microsoft.com/config/ubuntu/18.04/prod.list
sudo mv prod.list /etc/apt/sources.list.d/microsoft-prod.list
sudo chown root:root /etc/apt/trusted.gpg.d/microsoft.asc.gpg
sudo chown root:root /etc/apt/sources.list.d/microsoft-prod.list

After all this I could then install the .NET SDK. It’s been so long since I actually installed .NET on anything that I wasn’t sure whether I needed just the runtime, just the SDK, or both. I assumed it would be safe to install the SDK and then install the runtime too.

sudo apt-get install apt-transport-https
sudo apt-get update
sudo apt-get install dotnet-sdk-2.1

Then the runtime.

sudo apt-get install aspnetcore-runtime-2.1
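With both installed, it’s worth a quick sanity check before moving on. Assuming the packages above landed cleanly, the .NET CLI should report the SDK version it found.

dotnet --version

If that command comes back with a 2.1.x version number, everything needed for the rest of this setup is in place.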

Alright. Now with this installed, I wanted to see if JetBrains Rider would detect that .NET is now installed, or at least find out what I’d have to do to get the IDE to detect it. So I opened up the IDE to see what the results would be. On the left hand side of the new solution dialog, if anything isn’t installed Rider usually displays a message that such-and-such needs to be installed. But it looked like everything was showing up as installed, so yay for things working (at this point)!

Next up is to get a solution started with the pertinent projects for what I want to build.

For the next stage I created three projects (a CLI equivalent for creating them is sketched just after this list).

  1. InteroperabilityBlackBox – A basic class library that will be used by a console application or whatever other application or service that may need access to the specific business logic or what not.
  2. InteroperabilityBlackBox.Tests – An xUnit test project for testing anything that might need some good ole’ testing.
  3. InteroperabilityBlackBox.Cli – A console application (CLI) that I’ll use to interact with the class library and add capabilities going forward.
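As an aside, for anyone following along without Rider, a roughly equivalent setup can be sketched with the .NET CLI; the solution and project names below just mirror the list above, and the exact template names are worth double checking against the templates available on your install.

dotnet new sln -n InteroperabilityBlackBox
dotnet new classlib -n InteroperabilityBlackBox
dotnet new xunit -n InteroperabilityBlackBox.Tests
dotnet new console -n InteroperabilityBlackBox.Cli
dotnet sln add InteroperabilityBlackBox/InteroperabilityBlackBox.csproj
dotnet sln add InteroperabilityBlackBox.Tests/InteroperabilityBlackBox.Tests.csproj
dotnet sln add InteroperabilityBlackBox.Cli/InteroperabilityBlackBox.Cli.csproj
dotnet add InteroperabilityBlackBox.Tests/InteroperabilityBlackBox.Tests.csproj reference InteroperabilityBlackBox/InteroperabilityBlackBox.csproj
dotnet add InteroperabilityBlackBox.Cli/InteroperabilityBlackBox.Cli.csproj reference InteroperabilityBlackBox/InteroperabilityBlackBox.csproj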

Alright, now that all the basic projects are set up in the solution, I’ll go out and see about the .NET DataStax Enterprise driver. Inside JetBrains Rider I can right click on the particular project that I want to add or manage dependencies for. I did that and then put “dse” in the search box. The dialog pops up from the bottom of the IDE and you can add the package by clicking the plus sign at the bottom right of the description box to the right. Once you click the plus sign and the package is installed, it becomes a little red x.

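If you’d rather skip the UI, the same dependency can presumably be added from each project’s directory with the NuGet integration in the .NET CLI; “Dse” is the package id that the search above surfaces.

dotnet add package Dse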

Alright. Now it’s almost time to get some code working. We need ourselves a database first, however. I’m going to set up a cluster in Google Cloud Platform (GCP), but feel free to use whatever cluster you’ve got; these instructions will basically be reusable wherever your cluster is set up. I wrote up a walk-through and instructions for the GCP Marketplace a few weeks ago, and I used the same offering to get this example cluster up and running. So, now back to getting the first snippets of code working.

Let’s write a test first.

[Fact]
public void ConfirmDatabase_Connects_False()
{
    var box = new BlackBox();
    Assert.Equal(false, box.ConfirmConnection());
}

In this test, I named the class BlackBox and am planning to have a parameterless constructor. But as things go, tests are very fluid, or ought to be, and I may change it in the next iteration. I’m thinking, at least to get started, that I’ll have a method to test and confirm a connection for the CLI. I’ve named it ConfirmConnection for that purpose. Initially I’m going to test for false, but that’s primarily just to get started. Now, time to implement.

using System;
using Dse;
using Dse.Auth;

namespace InteroperabilityBlackBox
{
    public class BlackBox
    {
        public BlackBox()
        {}

        public bool ConfirmConnection()
        {
            return false;
        }
    }
}

That gives a passing test and I move forward. For more of the run-through of moving from this first step to the finished code session, check out this.

By the end of the coding session I had a few tests.

using Xunit;

namespace InteroperabilityBlackBox.Tests
{
    public class MakingSureItWorksIntegrationTests
    {
        [Fact]
        public void ConfirmDatabase_Connects_False()
        {
            var box = new BlackBox();
            Assert.Equal(false, box.ConfirmConnection());
        }

        [Fact]
        public void ConfirmDatabase_PassedValuesConnects_True()
        {
            var box = new BlackBox("cassandra", "", "");
            Assert.Equal(false, box.ConfirmConnection());
        }

        [Fact]
        public void ConfirmDatabase_PassedValuesConnects_False()
        {
            var box = new BlackBox("cassandra", "notThePassword", "");
            Assert.Equal(false, box.ConfirmConnection());
        }
    }
}
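With the solution laid out this way, the whole test suite can also be run from the repository root with the .NET CLI, which is handy between Rider sessions.

dotnet test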

The respective code for connecting to the database cluster, per the walk-through I wrote about here, looked like this at session end.

using System;
using Dse;
using Dse.Auth;

namespace InteroperabilityBlackBox
{
    public class BlackBox : IBoxConnection
    {
        public BlackBox(string username, string password, string contactPoint)
        {
            UserName = username;
            Password = password;
            ContactPoint = contactPoint;
        }

        public BlackBox()
        {
            UserName = "ConfigValueFromSecretsVault";
            Password = "ConfigValueFromSecretsVault";
            ContactPoint = "ConfigValue";
        }

        public string ContactPoint { get; set; }
        public string UserName { get; set; }
        public string Password { get; set; }

        public bool ConfirmConnection()
        {
            IDseCluster cluster = DseCluster.Builder()
                .AddContactPoint(ContactPoint)
                .WithAuthProvider(new DsePlainTextAuthProvider(UserName, Password))
                .Build();

            try
            {
                cluster.Connect();
                return true;
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
                return false;
            }

        }
    }
}

With my interface providing the contract to meet.

namespace InteroperabilityBlackBox
{
    public interface IBoxConnection
    {
        string ContactPoint { get; set; }
        string UserName { get; set; }
        string Password { get; set; }
        bool ConfirmConnection();
    }
}
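The console project’s code isn’t shown in this write-up, but a minimal sketch of how the CLI might call into the library could look something like the following; the argument handling here is purely hypothetical and just illustrates passing a username, password, and contact point through to BlackBox.

using System;

namespace InteroperabilityBlackBox.Cli
{
    class Program
    {
        static void Main(string[] args)
        {
            // Hypothetical argument handling: username, password, and contact point
            // passed on the command line, otherwise fall back to the parameterless
            // constructor and its config-sourced values.
            var box = args.Length >= 3
                ? new BlackBox(args[0], args[1], args[2])
                : new BlackBox();

            Console.WriteLine(box.ConfirmConnection()
                ? "Connection confirmed."
                : "Connection failed.");
        }
    }
}

This assumes the Cli project references the class library project, which Rider’s solution setup (or a dotnet add reference command) takes care of.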

Conclusions & Next Steps

After I wrapped up the session, two things stood out that needed to be fixed for the next session. I’ll be sure to add these as objectives for the next coding session at 3pm PST on Thursday.

  1. The tests really needed to more resiliently confirm the integrations that I was working to prove out. My plan at this point is to add some Docker images that would give the development integration tests something to work against (see the sketch just after this list). This would alleviate the need for anything outside of the actual project in the repository to exist, removing that fragility.
  2. The application, in its “Black Box”, should do something. For the next session we’ll write up some feature requests we’d want, or maybe someone has some suggestions of functionality they’d like to see implemented in a CLI using .NET Core working against a DataStax Enterprise Cassandra Database Cluster? Feel free to leave a comment or three about a feature, I’ll work on adding it during the next session.
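As a starting point for that first item, a throwaway DSE node for the integration tests could presumably be spun up from DataStax’s Docker image; the image name, tag, and license flag below are assumptions worth verifying against the DataStax Docker documentation before relying on them.

docker run -e DS_LICENSE=accept --name dse-integration -d -p 9042:9042 datastax/dse-server:6.0.0

With something like that running locally, the contact point in the tests could target 127.0.0.1 instead of a cluster living outside the repository.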

BDD Style Test Phrasing… What’s Your Poison?

I’ve had this question come up a few times recently, and I wanted to get everybody’s take on it…  when you write BDD style tests, what practice do you prefer?

…I’ll have a follow-up to this poll in a few days and explain some of my own reasoning on the whole situation.

Distributed Coding Prefunc: Ubuntu Erlang Dev & EUnit

After installing Erlang on OS X and then getting QuickCheck installed via Erlang, I wanted to expand the OS options I’m using to Ubuntu (i.e. Linux). So in this entry I’m going to cover the Erlang install, a quick EUnit bit, and then a QuickCheck install and sample. The first step in getting Erlang installed is deciding how you want to install it. Unlike Rails, where the method you pick matters quite a bit for how well things will work for you on Ubuntu, for Erlang all the methods get you started with a good, working version. The method I used was simply to install with apt-get.

sudo apt-get install erlang erlang-doc

After installing, it’s always a good idea to run things and make sure they’re all content and happy with your machine. Start up the Erlang shell.

erl

Then run some of the commands. One I’ve found that will present you with useful and interesting information is the erlang:system_info function with the appropriate parameter passed. The otp_release parameter will get the version of Erlang, cpu_topology shows you the processor layout of your machine (or in this case my virtual machine, with a single processor core allocated to it), and allocated_areas shows a bit about system memory allocations.

Eshell V5.9.1  (abort with ^G)
1> erlang:system_info(otp_release).
"R15B01"
2> erlang:system_info(cpu_topology).
[{processor,{logical,0}}]
3> erlang:system_info(allocated_areas).
[{sys_misc,80748},
 {static,1007616},
 {atom_space,98328,73387},
 {atom_table,95961},
 {module_table,9084},
 {export_table,50316},
 {export_list,240960},
 {register_table,180},
 {fun_table,3266},
 {module_refs,2048},
 {loaded_code,3437028},
 {dist_table,403},
 {node_table,227},
 {bits_bufs_size,0},
 {bif_timer,80200},
 {link_lh,0},
 {process_table,262144},
 {ets_misc,52504}]

Now that Erlang is effectively installed we can write a little sample code. To do this I created a directory called “TestingErlang” and in it placed an Erlang code file called “eunit_test.erl”. Note: I’m using Sublime 2 on Ubuntu, so exchange that for whatever text editor you’re using for your Erlang coding.

adron@ubuntu:~/Codez$ mkdir TestingErlang
adron@ubuntu:~/Codez$ cd TestingErlang
adron@ubuntu:~/Codez/TestingErlang$ sublime eunit_test.erl

Add the header file include.

-define(NOTEST, true).
-include_lib("eunit/include/eunit.hrl").

Adding the header file include will cause all the functions ending in _test() or _test_() to automatically be exported. An exported function test() is also created that can be used for running all of the unit tests. This will also include the preprocessor macros of EUnit for writing tests. So now throw a super simple test into the file.

You may, despite the automatic export of functions ending in _test() or _test_(), want to declare the export explicitly or use a different naming convention; in that case you’ll need to add a line at the top of your code file like this.

-export([reverse_test/0]).

After this, add the test function as shown below.

reverse_test() -> lists:reverse([1,2,3]).

The complete code file should look like this.

-module(eunit_test).
-define(NOTEST, true).
-include_lib("eunit/include/eunit.hrl").
-export([reverse_test/0]).

reverse_test() -> lists:reverse([1,2,3]).

Build it and call the function.

Eshell V5.9.1  (abort with ^G)
1> c(eunit_test).
{ok,eunit_test}
2> eunit_test:reverse_test().
[3,2,1]
3> 

BOOM! Passing. So now we know we have a good Erlang install and EUnit is set up and usable. In the following blog entries I have in the works, we’ll dive deeper into how Erlang works, from an extremely basic level all the way to some of the more complex features.

Distributed Coding Prefunc: Installing QuickCheck for Great Testing

A few weeks ago I kicked off this series with “Distributed Coding Prefunc: Up and Running with Erlang” and had wanted to keep up the momentum, but as life goes I had to tackle a few other things first. But now, it’s time to get back on track with some distributed computing. Since I intend to write tests with my samples, as I often do, I decided to take a stab at QuickCheck.

Before going forward, note that there is a QuickCheck for Haskell and there is a QuickCheck for Erlang. Since the point of this “Distributed Coding Prefunc” is to get started coding with Erlang from zero, I’ll be talking about the Erlang version here. This version was created by John Hughes and Koen Claessen, who started the Quviq company in 2006.

To download QuickCheck, choose the version you intend to use; I’ve chosen the commercial license version from the download page.

At the command prompt, install QuickCheck by launching Erlang and then running the install with these commands.

Launch Erlang:

$ erl
Erlang R15B01 (erts-5.9.1)  [smp:4:4] [async-threads:0] [hipe] [kernel-poll:false]

Eshell V5.9.1  (abort with ^G)
1>

Then execute the install:

1> eqc_install:install().

If the execution of the install displays this error, you’ll need to use sudo.

Installing ["pulse-1.27.7","eqc-1.27.7","eqc_mcerlang-1.27.7"].
Failed to copy pulse-1.27.7--copy returned {error,eacces}??
** exception exit: {{error,eacces},"pulse-1.27.7"}
     in function  eqc_install:'-copy_quickcheck/3-lc$^0/1-0-'/3 (../src/eqc_install.erl, line 63)
     in call from eqc_install:install2/4 (../src/eqc_install.erl, line 44)

Kill Erlang with a ctrl+c and restart Erlang with the sudo command.

$ sudo erl

Now when you install you should see the following result or something similar. You’ll be asked whether to proceed; select lowercase ‘y’ to continue. It may be different for some, but when I hit uppercase ‘Y’ (I suppose I got overzealous to install QuickCheck) it finished as if I’d answered no or something else.

1> eqc_install:install().
Installation program for "Quviq QuickCheck" version 1.27.7.
Installing in directory /usr/local/lib/erlang/lib.
This will delete conflicting versions of QuickCheck, namely
    []
Proceed? y
Installing ["pulse-1.27.7","eqc-1.27.7","eqc_mcerlang-1.27.7"].
Quviq QuickCheck is installed successfully.
Looking in "/Users/adronhall"...   .emacs not found
Could not find your .emacs file!
Try install("path-to-emacs-file") or install(new_emacs).
Bookmark the documentation at /usr/local/lib/erlang/lib/eqc-1.27.7/doc/index.html.
ok

You’ll note above that I don’t currently have emacs installed. The reason it looks for emacs is that QuickCheck has templates/ops mode for emacs. So if you use emacs you’re in luck. I, on the other hand, don’t, so I’ll just be using this from wherever I’m using it.

In addition to the lack of emacs, another important thing to note from the message is the link to the documentation. Once you get this link, open it up and check out the docs. They’re broken out into easily readable topics and are a good place to do initial reference checking while you’re writing up your specs.

If you have a license, it is important to note that if you’ve used sudo with your installation, you’ll need to kill your running Erlang session and start it anew without sudo. Otherwise you’ll run into issues down the road trying to use the libs (unless of course you want to go hack on your permissions manually). Once you’re ready to register the software it’s simply one command, where xxxxx is your license key.

eqc:registration("xxxxxxxxxxxx").

Alright, next time we’re on to next steps…

JavaScript Libraries Spilled EVERYWHERE! Series #002

In the last blog entry I wrote up vows.js for testing JavaScript; in this one I’ve tried out another testing framework called mocha. This framework is pretty extensive, as you can do the things you do with vows.js as well as a lot of other techniques. In addition to mocha I’ve added a few other things to the mix, as well as a few obvious points where I still need to RTFM about how mocha works.

mocha.js

Before falling off into a conversation about reading the manual, I’ll dive into a bit about mocha. Mocha is a project, hosted on GitHub as you might expect, that aims to provide a very feature-rich test framework that can run via Node or the browser. It also enables asynchronous and synchronous testing with some pretty sweet reporting.

The installation has the standard simplicity of a beautiful and elegant npm package for Node.js.

$ npm install -g mocha

Of course, depending on the way you’ve set up your machine, you may need to hit that command with sudo. The sample test on the main documentation & project page is pretty straightforward; I’ve copied it below for easy reference.

$ npm install -g mocha
$ mkdir test
$ $EDITOR test/test.js

Note that above, “$EDITOR” is whatever editor you’re using, such as Sublime 2, TextMate, WebStorm, or whatever.

var assert = require("assert")
describe('Array', function(){
  describe('#indexOf()', function(){
    it('should return -1 when the value is not present', function(){
      assert.equal(-1, [1,2,3].indexOf(5));
      assert.equal(-1, [1,2,3].indexOf(0));
    })
  })
})

Then run the test.

$  mocha

  .

  ✔ 1 test complete (1ms)

Overall, pretty sweet and simple. There are ways to set up mocha for TDD or BDD style also. What I’ve done for the basic name generator I’m building has started off super simple. One of the other additions below is the should.js library. I’ll add more about this library, and the intent behind using it, in the next blog article I write up.

The first tests I’ve put together are entered below.

require('should');
var factory = require('./../lib/NameFactory').nameFactory;

describe('when working with names using NameFactory', function(){
  var generateThisManyNames = 3;

  describe('generating names', function(){
    it('should return a name between 3 and 20 characters', function(){
        factory.generate_names().length.should.be.approximately(3,20);
    });
    it('should return correct number of names', function(){
        factory.generate_names(generateThisManyNames).length.should.equal(generateThisManyNames);
    });
  })
});

…and then I’ve implemented the absolute basics to get those tests to pass. You JavaScripters who have been at it for a while can likely detect my massive newbism in this code. I’d love feedback btw. 🙂 So here’s my basic implementation so far.

module.exports.nameFactory = {
    generate_names: function (count) {
        if (count > 1){
            return ['','',''];
        }
        else{
            return 'stuff';
        }
    }
};

I’m still sketchy on best practices around a number of object uses and creation patterns in JavaScript. For instance in my code above, I’m creating a module and setting up properties with functions, which leads to a number of questions… (maybe it is Stack Overflow time)

  • Is this an efficient way to create a JavaScript function for returning randomly created names?
  • What would your first tests be when creating a name generator? Any suggestions on some tests to add?
  • This shows some of my intent, such as returning a single result if no count is entered or it is less than 2, any suggestions on doing a kind of overloaded factory pattern method like that?
  • Now that I’ve gotten what I do above, I need some more tests to add to confirm that the returned content isn’t ['','',''] <- cuz’ obviously that’s useless and not random, it just returns the 3 string array that would prospectively have generated (or randomly selected) names. Ideas?
  • Is there another way to create a class or namespace that isn’t “module.export…” like I have above with “module.exports.nameFactory”? I’d love to just have a “NameFactory” or something. Not sure how or what way would be best in JavaScript land to put something like that together.

I’ve already moved past just this, but would love any feedback on the above questions and code snippets. I’ll post my results, along with the changes and additions from anybody who posts feedback and suggestions, in a subsequent post.  🙂

Cheers!

Me on TDD/BDD/Pairing and Jason Fried’s TED Talk and “why work isn’t done in the office…”

This talk is so right, but could it be so wrong at the same time?

Just watch this, that’s all I have to say. Jason is so right about this topic. Here’s a few quotes to convince you.

  • “I’m going to talk about work, and why people can’t seem to get things done at work…”
  • “If you ask people the question, ‘where do you go when you really need to get something done?’ you typically get three different types of answers; one is a kind of a place, a location or a room, another is a moving object, a third is a time…”
  • “The Train” <- That one caught my fancy; if you’ve ever talked to me about transit you know that one caught me…  🙂
  • “What you almost never hear people say is ‘the office’.”
  • “Managers and bosses will tell you the distractions at work are things like Facebook, Youtube…” “…and they’ll go so far as to ban it…” “…what is this, China?!”
  • “The real problem in the modern office is the M & Ms” <- Oh hell yeah, so very true.
  • “Managers’ jobs are really to interrupt people…” “…they don’t really do work so they have to interrupt you.”
  • “You would never see a spontaneous meeting of employees, no, managers do that…”

To summarize, do telecommuting right, and it will absolutely blow away anything that is ever accomplished “at the office”.

Oh my Adron, you’re such a hypocrite! You are always talking about TDD and BDD and Pair Programming and teams being together and…

YES! You have a point, so let me throw this prospective hypocriticalness of mine away and prevent any concern that I’ve missed a logical connection. I assure you, I haven’t.  🙂

I do support people working remotely. I also love to have a team close together with high communication (and here’s the catch) that is focused on the problem. This is what Jason is talking about! People generally don’t stay focused in cross-cut teams, pulled toward this focus and that focus, and then managers get thrown on top of that. The next thing you know, the dreaded M & Ms dramatically decrease any chance of work getting done.

If a team can be left to their work, especially if they have clear problems to attack, to pair on, to write tests against, and to implement, this is the precise example of why to work together. However, I’ve also seen successful, very successful teams working together remotely. Jason & the 37signals crew have done that before! They’re a prime example of it.

But How Does Remote Work, Work?

You have to be disciplined, you have to have check-in points, but take 2-4 hours at a chunk and do work! Use e-mail and instant messaging, as Jason points out. These are the keys to successfully getting things done! Where I currently work, we actually get this type of allowance. We even do remote pairing (albeit rarely, but it has been done)! It can work, and it can work very well. However, we often break away and have chunks of time where we don’t talk, but instead leap forward in our efforts to get work done. Sometimes we pair, sometimes we don’t; it generally depends on whether we’re writing code or just getting configs and databases put together to write code against. No reason to pair on a configuration file!  😉

So really, the key isn’t to be physically collocated, nor to be remote from each other. The key is to have communication, high levels of communication, but at the right moments in time! The communication must be focused and to the point. It must bring the information that is needed, not long, drawn-out meetings of vacuous boredom and emptiness. The work gets done when someone, or a pair, can focus on the problem at hand and find the solution to that problem, alone or with their pair. These are the keys to getting real work done!

Thanks TED Talks for getting me all fired up this morning!  🙂

Urban Lean Agile Tech Breakfast Meetup, Be There This Wednesday!

Are you hard core into technology and software development like Node.js, JavaScript, Ruby, Rails, .NET, Java, Clojure and more?

Do you like the ideas behind the agile manifesto, lean startup, kanban, and thinking outside of the box?

Are you digging that ASP.NET MVC Framework or waiting for the next ALT.NET meetup?

Loving the ease of Ruby on Rails to wow your user base, or using Sinatra to implement those clean JavaScript & jQuery enabled UX for your clients?

Want to talk shop, eat some grub, have a beverage, and get a nerd kick start in the morning?

In that case meet us for Urban Lean Agile Tech Breakfast Meetup at Mod Pizza @ 1302 6th Avenue @ 8 am on Wednesday, August 3rd.