Growing a development team is a lot like building a skyscraper. You start with a solid foundation (or so you’d hope), add floors with meticulous planning, and eventually reach the pinnacle with strong leadership guiding the entire structure. In software development, this process involves strategic hiring decisions at various stages. Let’s delve into why focusing on Senior and Mid-Level developers initially is crucial, the role of Junior developers in the maturation phase, and the point at which hiring Principal developers becomes essential.
Continue reading “A Guide Path for Strategic Growth and Leadership in Software Development”
Software Development: Getting Started by Getting Organized
A few weeks ago I wrote up the post on the tech I’ve decided to move forward with for my new project. This post is going to cover the collection of features, domain details (i.e. what is the use case, etc), and related project collateral. Instead of just slinging code like many of us programmers often do, I’m going to layout what I’m trying to build, what features I want, and how I’m going to put those features together before delving into actual code. This way, my hope is I’ll be able to keep track over time better, and if any of this turns into something I’ll then have something to keep working from instead of throwing a code base over the fence to other devs. A crazy action that happens all the time, but is something worth avoiding!
Continue reading “Software Development: Getting Started by Getting Organized”
How to Reconnoiter a New Role!
“i.e. Starting a challenging new role!”
I’m stepping into a new role right now, which I announced recently in “Career Update: Back to Engineering!“. In that role there are a number of key topics and role-specific pieces of knowledge I need to attain. Most of this centers around the current state of the teams, the members of those teams, work in progress, and product and service status. The following are some of the important steps I’ve taken to reconnoiter the current state of things and get up to speed as quickly as possible! Continue reading “How to Reconnoiter a New Role!”
DSE6 + .NET v?
Project Repo: Interoperability Black Box
First steps. Let’s get .NET installed and set up. I’m running Ubuntu 18.04 for this setup and the start of the project. Installing .NET on Ubuntu is a multi-command process of registering Microsoft’s signing key and package repository; fortunately Microsoft’s teams have made this almost easy by providing the commands for the various Linux distributions here. The commands I ran to get all of this initial setup done are as follows.
[sourcecode language=”bash”]
wget -qO- https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.asc.gpg
sudo mv microsoft.asc.gpg /etc/apt/trusted.gpg.d/
wget -q https://packages.microsoft.com/config/ubuntu/18.04/prod.list
sudo mv prod.list /etc/apt/sources.list.d/microsoft-prod.list
sudo chown root:root /etc/apt/trusted.gpg.d/microsoft.asc.gpg
sudo chown root:root /etc/apt/sources.list.d/microsoft-prod.list
[/sourcecode]
After all this I could then install the .NET SDK. It’s been so long since I actually installed .NET on anything that I wasn’t sure if I needed just the runtime, the SDK, or both. I assumed it would be safe to install the SDK and then install the runtime too.
[sourcecode language=”bash”]
sudo apt-get install apt-transport-https
sudo apt-get update
sudo apt-get install dotnet-sdk-2.1
[/sourcecode]
Then the runtime.
[sourcecode language=”bash”]
sudo apt-get install aspnetcore-runtime-2.1
[/sourcecode]
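With the SDK and runtime packages in place, a quick sanity check shows what the dotnet tooling actually picked up (these are standard dotnet CLI subcommands):

```shell
# List the installed SDK version, all SDKs, and all runtimes dotnet can see.
dotnet --version
dotnet --list-sdks
dotnet --list-runtimes
```

If version numbers show up here, the apt packages landed where the tooling expects them.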
Alright. Now with this installed, I wanted to see what, if anything, I’d have to do for JetBrains Rider to detect that .NET is now installed. So I opened up the IDE to see the results. On the left hand side of the new solution dialog, Rider will usually display a message if anything needs to be installed. But everything showed up as installed, so “yay for things working (at this point)!”

Next up is to get a solution started with the pertinent projects for what I want to build.


For the next stage I created three projects.
- InteroperabilityBlackBox – A basic class library that will be used by a console application, or whatever other application or service that may need access to the specific business logic and such.
- InteroperabilityBlackBox.Tests – An xUnit testing project for testing anything that might need some good ole’ testing.
- InteroperabilityBlackBox.Cli – A console application (CLI) that I’ll use to interact with the class library and add capabilities to going forward.
Alright, now that all the basic projects are set up in the solution, I’ll go find the .NET DataStax Enterprise driver. Inside JetBrains Rider I can right click on the particular project I want to add or manage dependencies for. I did that and then put “dse” in the search box. The dialog pops up from the bottom of the IDE and you can add the package by clicking the plus sign in the description box on the right. Once the package is installed, the plus sign becomes a little red x.
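If you’d rather skip the IDE dialogs, the same solution layout and the driver dependency can be sketched with the dotnet CLI. This mirrors the Rider steps above; the one assumption is the NuGet package id Dse, which is the DataStax Enterprise C# driver package:

```shell
# Create the solution and the three projects described above.
dotnet new sln -n InteroperabilityBlackBox
dotnet new classlib -n InteroperabilityBlackBox
dotnet new xunit -n InteroperabilityBlackBox.Tests
dotnet new console -n InteroperabilityBlackBox.Cli

# Wire the projects into the solution.
dotnet sln InteroperabilityBlackBox.sln add InteroperabilityBlackBox/InteroperabilityBlackBox.csproj
dotnet sln InteroperabilityBlackBox.sln add InteroperabilityBlackBox.Tests/InteroperabilityBlackBox.Tests.csproj
dotnet sln InteroperabilityBlackBox.sln add InteroperabilityBlackBox.Cli/InteroperabilityBlackBox.Cli.csproj

# Reference the class library from the tests and the CLI.
dotnet add InteroperabilityBlackBox.Tests/InteroperabilityBlackBox.Tests.csproj reference InteroperabilityBlackBox/InteroperabilityBlackBox.csproj
dotnet add InteroperabilityBlackBox.Cli/InteroperabilityBlackBox.Cli.csproj reference InteroperabilityBlackBox/InteroperabilityBlackBox.csproj

# Add the DSE driver package to the class library.
dotnet add InteroperabilityBlackBox/InteroperabilityBlackBox.csproj package Dse
```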

Alright. Now it’s almost time to get some code working. We need ourselves a database first, however. I’m going to set up a cluster in Google Cloud Platform (GCP), but feel free to use whatever cluster you’ve got; these instructions are basically reusable wherever your cluster is set up. I wrote up a walkthrough and instructions for the GCP Marketplace a few weeks ago and used the same offering to get this example cluster up and running. So, now back to getting the first snippets of code working.
Let’s write a test first.
[sourcecode language=”csharp”]
[Fact]
public void ConfirmDatabase_Connects_False()
{
    var box = new BlackBox();
    Assert.False(box.ConfirmConnection());
}
[/sourcecode]
In this test, I named the class BlackBox and am planning to have a parameterless constructor. But as things go, tests are very fluid, or ought to be, and I may change it in the next iteration. I’m thinking, at least to get started, that I’ll have a method to test and confirm a connection for the CLI. I’ve named it ConfirmConnection for that purpose. Initially I’m going to test for false, but that’s primarily just to get started. Now, time to implement.
[sourcecode language=”csharp”]
using System;
using Dse;
using Dse.Auth;

namespace InteroperabilityBlackBox
{
    public class BlackBox
    {
        public BlackBox()
        {
        }

        public bool ConfirmConnection()
        {
            return false;
        }
    }
}
[/sourcecode]
That gives a passing test and I move forward. For more of the run through of moving from this first step to the finished code, check out the full coding session.
By the end of the coding session I had a few tests.
[sourcecode language=”csharp”]
using Xunit;

namespace InteroperabilityBlackBox.Tests
{
    public class MakingSureItWorksIntegrationTests
    {
        [Fact]
        public void ConfirmDatabase_Connects_False()
        {
            var box = new BlackBox();
            Assert.False(box.ConfirmConnection());
        }

        [Fact]
        public void ConfirmDatabase_PassedValuesConnects_True()
        {
            var box = new BlackBox("cassandra", "", "");
            Assert.False(box.ConfirmConnection());
        }

        [Fact]
        public void ConfirmDatabase_PassedValuesConnects_False()
        {
            var box = new BlackBox("cassandra", "notThePassword", "");
            Assert.False(box.ConfirmConnection());
        }
    }
}
[/sourcecode]
The respective code for connecting to the database cluster, per the walkthrough I wrote about here, looked like this at session end.
[sourcecode language=”csharp”]
using System;
using Dse;
using Dse.Auth;

namespace InteroperabilityBlackBox
{
    public class BlackBox : IBoxConnection
    {
        public BlackBox(string username, string password, string contactPoint)
        {
            UserName = username;
            Password = password;
            ContactPoint = contactPoint;
        }

        public BlackBox()
        {
            UserName = "ConfigValueFromSecretsVault";
            Password = "ConfigValueFromSecretsVault";
            ContactPoint = "ConfigValue";
        }

        public string ContactPoint { get; set; }
        public string UserName { get; set; }
        public string Password { get; set; }

        public bool ConfirmConnection()
        {
            IDseCluster cluster = DseCluster.Builder()
                .AddContactPoint(ContactPoint)
                .WithAuthProvider(new DsePlainTextAuthProvider(UserName, Password))
                .Build();

            try
            {
                cluster.Connect();
                return true;
            }
            catch (Exception e)
            {
                Console.WriteLine(e);
                return false;
            }
        }
    }
}
[/sourcecode]
With my interface providing the contract to meet.
[sourcecode language=”csharp”]
namespace InteroperabilityBlackBox
{
    public interface IBoxConnection
    {
        string ContactPoint { get; set; }
        string UserName { get; set; }
        string Password { get; set; }
        bool ConfirmConnection();
    }
}
[/sourcecode]
Conclusions & Next Steps
After I wrapped up the session, two things stood out that need to be fixed before the next session. I’ll be sure to add these as objectives for the next coding session at 3pm PST on Thursday.
- The tests really need to more resiliently confirm the integrations I’m working to prove out. My plan at this point is to add some Docker images that give the development integration tests something to work against. That would alleviate the need for anything outside of the actual repository to exist, removing that fragility.
- The application, in its “Black Box”, should actually do something. For the next session I’ll write up some feature requests, or maybe someone has suggestions for functionality they’d like to see implemented in a CLI using .NET Core working against a DataStax Enterprise Cassandra database cluster? Feel free to leave a comment or three about a feature and I’ll work on adding it during the next session.
- Project Repo: https://github.com/Adron/InteroperabilityBlackBox
- File a Feature Request: https://github.com/Adron/InteroperabilityBlackBox/issues/new?template=feature_request.md
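As a sketch of the Docker idea in that first objective: the stock cassandra image on Docker Hub can stand in as a local node for the integration tests to hit (an assumption for illustration; DataStax also publishes DSE images, which is what the driver actually targets).

```shell
# Start a throwaway single-node Cassandra for the integration tests.
docker run --name blackbox-cassandra -d -p 9042:9042 cassandra:3.11

# Once the node finishes bootstrapping, confirm it answers CQL.
docker exec blackbox-cassandra cqlsh -e "DESCRIBE KEYSPACES"

# Tear it down when the test run is done.
docker rm -f blackbox-cassandra
```

With something like this in place, the tests no longer depend on a cluster living outside the repository.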
Documentation First w/ README.md && Project Tree Build
I’m sitting here trying to get the folder structure of my project into a kind of ASCII tree or something. I wasn’t going to do this manually; that would be insane, especially on any decent sized enterprise project with an endless supply of folders and nested content. I went digging for a better solution. On Linux I immediately found the tree utility, which was perfect.
Except I was on OS-X.
The first option I gave a go was building the thing from source, because I like to do things the hard way sometimes. First I needed to get the source, which is available here.
[sourcecode language=”bash”]curl -O ftp://mama.indstate.edu/linux/tree/tree-1.7.0.tgz[/sourcecode]
Once downloaded, extract the source into a directory and find the section in the Makefile for the particular operating system you want to use the utility on. The OS settings section looked like this when I finished editing it.
[sourcecode language=”bash”]
# Uncomment options below for your particular OS:
# Uncomment for OS X:
CC=cc
CFLAGS=-O2 -Wall -fomit-frame-pointer -no-cpp-precomp
LDFLAGS=
MANDIR=/usr/share/man/man1
OBJS+=strverscmp.o
[/sourcecode]
Now build the utility.
[sourcecode language=”bash”]
make
[/sourcecode]
Now let’s get tree into the executable path.
[sourcecode language=”bash”]
sudo mkdir -p /usr/local/bin
sudo cp tree /usr/local/bin/tree
[/sourcecode]
Make sure your ~/.bash_profile is set up right and includes this.
[sourcecode language=”bash”]
export PATH="/usr/local/bin:$PATH"
[/sourcecode]
Reload the shell and tree should be available as a command.
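Reloading can be done without opening a new terminal, assuming bash is your login shell:

```shell
# Re-read the profile in the current shell, then verify tree resolves.
source ~/.bash_profile
command -v tree
tree --version
```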
The other option, which is really simple if you don’t want to compile the code, is to just use brew to install it.
[sourcecode language=”bash”]
brew install tree
[/sourcecode]
So now you can use tree and do cool stuff like pipe the output to a file. If you’re running this against a Node.js project, you may want to delete the node_modules directory first and just reinstall it after running the tree command.
[sourcecode language=”bash”]
tree > proj-tree.md
[/sourcecode]
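Rather than deleting node_modules, tree’s -I flag excludes anything matching a pattern, so the generated file can skip the dependency folder entirely:

```shell
# Generate the tree but leave node_modules out of the listing.
tree -I 'node_modules' > proj-tree.md
```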
Then in your README.md file you can include the folder structure in the description of the project. Here’s a sample output!
[sourcecode language=”bash”]
.
├── README.md
├── client
│   └── README.md
├── package.json
├── proj-tree.md
├── server
│   ├── boot
│   │   ├── authentication.js
│   │   ├── explorer.js
│   │   ├── rest-api.js
│   │   └── root.js
│   ├── config.json
│   ├── datasources.json
│   ├── middleware.json
│   ├── model-config.json
│   └── server.js
└── test
    └── test_exists.js

4 directories, 14 files
[/sourcecode]
That’s a super easy way to offer better documentation that provides some real insight into what various parts of the project structure are actually for.