Today at Hasura we released Hasura v2.0! This is a pretty major release with a number of new features that will dramatically increase the capabilities of Hasura. For several of my projects, specifically the infrastructure-as-code projects terrazura (check out the previous blog post with video time points and more) and tenancy-bydata, I was able to get the upgrade to Hasura v2.0 done in moments! Since I don't have to pull backups or anything for these projects, it merely involved the following steps.
Upgrade the Hasura CLI. This is super easy: just issue the command hasura update-cli --version v2.0.0-alpha.1, which will download and update the CLI.
Next I updated the Terraform file so the container pulls the latest version: image = "hasura/graphql-engine:v2.0.0-alpha.1".
Next run an updated terraform apply command, which in the case of the terrazura project wraps up the upgrade.
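Pulled together, the steps look something like this; the plain terraform apply below stands in for whatever flags or var files your particular project actually needs.

[sourcecode language=”bash”]
# Upgrade the Hasura CLI itself.
hasura update-cli --version v2.0.0-alpha.1

# In the Terraform file, bump the container image:
#   image = "hasura/graphql-engine:v2.0.0-alpha.1"

# Apply the updated plan.
terraform apply
[/sourcecode]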
Boom! Everything is now updated to v2.0 and we’re ready for all the upcoming Twitch streams relating back to these particular projects!
For more, be sure to subscribe to the HasuraHQ Twitch Channel and my Twitch Channel Thrashing Code as I’ll be covering more of the new features in the coming days!
Review: In the last blog entry I went through more than a few examples of using cURL to issue GET requests against various endpoints built with Node.js & Restify. I also covered the basics on where to go to find cURL in case it isn't installed, and threw in a little bit of WebStorm info to boot. In this part of the series I'm going to dive into the HTTP verbs beyond GET.
POST
The standard practice for saving data via an HTTP verb is the POST. When you issue a POST via cURL, use -X followed by POST to designate the verb, then -H to set the Content-Type header. In this particular example I've set it to application/json since my data payload is in JSON format. Then add the actual data with the -d option.
[sourcecode language=”bash”]curl -X POST -H "Content-Type: application/json" -d '{"uuid":"79E5591A-1E54-4562-A276-AFC266F54390","webid":"56E62C3A-D6BC-4F4F-B72A-E6CE081190B6"}' http://localhost:3000/ident[/sourcecode]
Other data types can be sent, with the content type set appropriately for each, including json, script, text, or html. One example of this same command, issued with jQuery on the client side, would look like this.
[sourcecode language=”javascript”]
var data = {"uuid":"79E5591A-1E54-4562-A276-AFC266F54390","webid":"56E62C3A-D6BC-4F4F-B72A-E6CE081190B6"};
[/sourcecode]
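A jQuery call sending that data, matching the cURL example above, might look something like this; treat the exact call as a sketch rather than the original snippet.

[sourcecode language=”javascript”]
$.ajax({
  type: "POST",
  url: "http://localhost:3000/ident",
  contentType: "application/json",
  // Serialize the object so the body matches the -d payload above.
  data: JSON.stringify(data)
});
[/sourcecode]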
When building POST endpoints via Express, one of the things you may run into is the following message being displayed in the console.
[sourcecode language=”bash”]
/usr/local/bin/node app.js
connect.multipart() will be removed in connect 3.0
visit https://github.com/senchalabs/connect/wiki/Connect-3.0 for alternatives
connect.limit() will be removed in connect 3.0
[/sourcecode]
The immediate fix for this, until the changes land in connect 3.0, is to replace the bodyParser middleware line with the individual middleware calls that it wraps.
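Assuming the app is using the stock bodyParser middleware (and the usual app variable name), that change looks something like this:

[sourcecode language=”javascript”]
// Replace this:
app.use(express.bodyParser());

// ...with the individual middleware it wraps:
app.use(express.json());
app.use(express.urlencoded());
[/sourcecode]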
[sourcecode language=”javascript”]
var quotes = [
  { author : 'Audrey Hepburn', text : "Nothing is impossible, the word itself says 'I'm possible'!"},
  { author : 'Walt Disney', text : "You may not realize it when it happens, but a kick in the teeth may be the best thing in the world for you"},
  { author : 'Unknown', text : "Even the greatest was once a beginner. Don't be afraid to take that first step."},
  { author : 'Neale Donald Walsch', text : "You are afraid to die, and you're afraid to live. What a way to exist."}
];
[/sourcecode]
So often I end up putting together some RESTful services (or the intent is at least to build them with that premise, but we all know how that ends up). The API's URI routing gets put together and one wants to take a crack at the service as soon as possible. Here's a quick guide for using cURL to take some basic actions against the services and understand what you're getting back.
The first thing to do is make sure you can run JavaScript, which means you have a computer. The second thing is to get cURL, which means you're running some variant of Linux or UNIX. In most scenarios one would be running OS X. The easiest way to determine if it is installed on your computer is to just open up a terminal and type curl --help. You should get a result with all the switches, which is almost always a bit of overload.
[sourcecode language=”bash”]$ curl --help
Usage: curl [options...]
Options: (H) means HTTP/HTTPS only, (F) means FTP only
--anyauth Pick "any" authentication method (H)
-a, --append Append to target file when uploading (F/SFTP)
--basic Use HTTP Basic Authentication (H)
--cacert FILE CA certificate to verify peer against (SSL)
--capath DIR CA directory to verify peer against (SSL)
-E, --cert CERT[:PASSWD] Client certificate file and password (SSL)
--cert-type TYPE Certificate file type (DER/PEM/ENG) (SSL)
--ciphers LIST SSL ciphers to use (SSL)
--compressed Request compressed response (using deflate or gzip)
-K, --config FILE Specify which config file to read
--connect-timeout SECONDS Maximum time allowed for connection
-C, --continue-at OFFSET Resumed transfer offset
-b, --cookie STRING/FILE String or file to read cookies from (H)
-c, --cookie-jar FILE Write cookies to this file after operation (H)
--create-dirs Create necessary local directory hierarchy
--crlf Convert LF to CRLF in upload
--crlfile FILE Get a CRL list in PEM format from the given file
-d, --data DATA HTTP POST data (H)
--data-ascii DATA HTTP POST ASCII data (H)
--data-binary DATA HTTP POST binary data (H)
--data-urlencode DATA HTTP POST data url encoded (H)
--delegation STRING GSS-API delegation permission
--digest Use HTTP Digest Authentication (H)
--disable-eprt Inhibit using EPRT or LPRT (F)
--disable-epsv Inhibit using EPSV (F)
-D, --dump-header FILE Write the headers to this file
--egd-file FILE EGD socket path for random data (SSL)
--engine ENGINE Crypto engine (SSL). "--engine list" for list
-f, --fail Fail silently (no output at all) on HTTP errors (H)
-F, --form CONTENT Specify HTTP multipart POST data (H)
--form-string STRING Specify HTTP multipart POST data (H)
--ftp-account DATA Account data string (F)
--ftp-alternative-to-user COMMAND String to replace "USER [name]" (F)
--ftp-create-dirs Create the remote dirs if not present (F)
--ftp-method [MULTICWD/NOCWD/SINGLECWD] Control CWD usage (F)
--ftp-pasv Use PASV/EPSV instead of PORT (F)
-P, --ftp-port ADR Use PORT with given address instead of PASV (F)
--ftp-skip-pasv-ip Skip the IP address for PASV (F)
--ftp-pret Send PRET before PASV (for drftpd) (F)
--ftp-ssl-ccc Send CCC after authenticating (F)
--ftp-ssl-ccc-mode ACTIVE/PASSIVE Set CCC mode (F)
--ftp-ssl-control Require SSL/TLS for ftp login, clear for transfer (F)
-G, --get Send the -d data with a HTTP GET (H)…[/sourcecode]
Don't get intimidated! It goes on and on and on, but just know it's installed if you see all these goodies. If you don't get the results above, then installing cURL is the next step. I'll leave that to you; the downloads section of the official cURL site will get you started.
Next you'll of course need Node.js and Restify installed. I'll assume you have Node.js installed already. Create a directory, and in that directory run the following command.
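[sourcecode language=”bash”]
# Installs restify into ./node_modules in the current directory.
npm install restify
[/sourcecode]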
Next create a file called server.js in that directory you've just installed restify in. Here's the initial JavaScript code for that file that I've put together for the first few examples of using cURL.
[sourcecode language=”javascript”]
var restify = require('restify');

function respond(req, res, next) {
  res.send('hello ' + req.params.name);
  // Hand control back so the handler chain can continue.
  next();
}

var server = restify.createServer();
server.get('/hello/:name', respond);
server.head('/hello/:name', respond);

server.listen(8080, function() {
  console.log('%s listening at %s', server.name, server.url);
});
[/sourcecode]
Ok, now to run this with Node.js, just issue the command to launch node with the file that was just created.
[sourcecode language=”bash”]
node server.js
restify listening at http://0.0.0.0:8080
[/sourcecode]
Getting Get
Now the service is running on port 8080 against 0.0.0.0. To check out what a standard GET verb will do in a browser, open up a browser and navigate to http://0.0.0.0:8080.
Browsing the GET response via Chrome.
You’ll see this in the browser window. Just straight plain text too. If you look at source, this is all you get back. Now open up a terminal and run the following cURL command to execute a GET against the URI & port. This is the most basic cURL command one can make. It is simply issuing a GET request against the URI and will display the body of the response.
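[sourcecode language=”bash”]curl http://0.0.0.0:8080[/sourcecode]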
The response will be similar to this for the particular request.
[sourcecode language=”bash”]
{"code":"ResourceNotFound","message":"/ does not exist"}
[/sourcecode]
Your terminal will probably stick the subsequent prompt at the end of the result too, because the result doesn’t end in a newline. Beware of that, your prompt hasn’t disappeared. 😉
To get a little more information you can get the header of the response dumped into the terminal with a -i. The -i option stands for --include, as in include the header. Issue the command as either line shown below.
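[sourcecode language=”bash”]
curl -i http://0.0.0.0:8080
curl --include http://0.0.0.0:8080
[/sourcecode]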
The response will provide a little bit more about what is going on.
[sourcecode language=”bash”]
HTTP/1.1 404 Not Found
Content-Type: application/json
Content-Length: 56
Date: Wed, 27 Nov 2013 00:27:36 GMT
Connection: keep-alive
{"code":"ResourceNotFound","message":"/ does not exist"}
[/sourcecode]
With this response the actual status code is shown. In this case we have a 404, which points us to the problem with this curl request: the server has nothing to return for the root URI. If we look at the code, we can see that the 'get' route is set up as '/hello/:name', which means the server is only looking for requests at http://url_root/hello/someName in order to return a response.
[sourcecode language=”javascript”]
var server = restify.createServer();
server.get('/hello/:name', respond);
server.head('/hello/:name', respond);
[/sourcecode]
Issue a command against the server now with the following curl request.
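[sourcecode language=”bash”]curl -i http://0.0.0.0:8080/hello/Adron[/sourcecode]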
The response should now come back with actual content.
[sourcecode language=”bash”]
HTTP/1.1 200 OK
Content-Type: application/json
Content-Length: 13
Date: Wed, 27 Nov 2013 00:34:04 GMT
Connection: keep-alive
"hello Adron"
[/sourcecode]
Here the content is returned as “hello Adron” and the header returns a 200. The content type is application/json format with the length returned as 13. Note also the connection is set to keep-alive. Let’s dive into that.
If we change the connection type, which is important for many scenarios, we have to send extra header information to ask for the response to be returned accordingly. In order to do that we can pass the -H or –header option in with the curl request. If the command is issued with an -i and -H as shown below the result will be as follows.
[sourcecode language=”bash”]
curl -iH "connection: close" http://0.0.0.0:8080/hello/Adron
HTTP/1.1 200 OK
Content-Type: application/json
Content-Length: 13
Date: Wed, 27 Nov 2013 00:41:07 GMT
Connection: close
"hello Adron"
[/sourcecode]
If we take away the -i we'll just get the response body, which is "hello Adron", and wouldn't see the header, which now returns Connection: close in the response. By default the connection is keep-alive, but in order to make the request return right away the connection needs to be asked to close. By setting the -H or --header value of connection to close, we get the response immediately. With restify, it is also important to note that it checks whether the user agent is curl.
If it is curl, restify sets the connection header to close and removes the content-length header. However, I've experienced that restify doesn't do this in all circumstances, or that the user agent detection behaves differently in some of my usage. So don't always assume this will be the case. The safest bet is to set the connection to close yourself when done, by adding -H or --header with a value of "Connection: close".
Beyond Basic Get
Ok, so that’s a pretty solid use of GET with cURL. Let’s dive into some puts and deletes with a get or two thrown in for comparison. Change the executing code to the code shown in the server.js file below.
[sourcecode language=”javascript”]
var restify = require('restify');
[/sourcecode]
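That's just the opening require; as a rough sketch, the rest of the file looks something like the following. The put and del routes here are my assumption based on the verbs this entry covers, so treat them as illustrative rather than the exact original listing.

[sourcecode language=”javascript”]
var restify = require('restify');

// Echoes the :name parameter back to the caller.
function respond(req, res, next) {
  res.send('hello ' + req.params.name);
  next();
}

var server = restify.createServer();

// Handlers for the verbs discussed in this entry.
server.get('/hello/:name', respond);
server.head('/hello/:name', respond);
server.put('/hello/:name', respond);
server.del('/hello/:name', respond);

server.listen(8080, function() {
  console.log('%s listening at %s', server.name, server.url);
});
[/sourcecode]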
The respond function is set up to take req, res, and next. The req is the request, the res is the response, and next hands control back so the handler chain can continue. The next bit of code starts the server with restify.createServer(). Just below that, several handlers are set up.
Now at this point I got a little sidetracked writing this blog entry. But I thought to myself, “hell, I’m just figuring out some parts of Webstorm, I ought to blog a little about it!” So, here’s…
A Little Webstorm Love
Webstorm and cURL.
Before continuing on I wanted to cover a few tidbits of the JetBrains Webstorm IDE. I often switch back and forth between the Sublime/Terminal combo and the Webstorm IDE. The really cool thing about this IDE is that it actually has a terminal built in, color coding and autocompletion of code, refactoring, a file and folder viewer, and a whole slew of other features. In the image above there are four neon pointers displaying some of the key functionality that I'm using to work through this blog entry with cURL and Restify.
The arrows, from left to right, are pointing to the following IDE elements. The first is pointing to the JavaScript files storgie.js and starter.js, which I added specifically to show the git status colors. Each color reflects whether the file is new (green), has changes (light blue), or is committed with no changes (white). The second arrow is just pointing to the general folder structure. Here you can see the hidden .* files like the .gitignore and .npmignore, and it's also easy to dig through the node_modules directory. Webstorm also uses the node_modules directory to provide extra information and autocompletion as you work through your coding session. The next arrow is pointing out the terminal in the editor, which is where I'm working up the curl examples in this blog entry. Then of course there's the color coded starter.js file, which is one of the working examples. Webstorm, simply, is pretty sweet. I'm looking to do some more walkthroughs and work sessions with the editor in the near future. So if interested, be sure to keep reading and subscribe; I'll post links to wherever the material ends up right here.
Now, back to the cURLing. 😉
After I toyed around with Webstorm a bit to get it working in a way that was efficient for me to develop these APIs, I stumbled into an idea. I'd provide a page for the APIs that could be located at the root of the API service, such as http://api.blagh.com. The APIs would still follow a RESTful type schema like http://api.blagh.com/thing/create or http://api.blagh.com/thing/destroy, but at the very root would be a kind of docs. Maybe this could just be a status page even. Whatever the case, there needs to be something at http://api.blagh.com, so I decided right then and there I'd switch to express.js to build the rest of the API services. Restify is fine and all, but for this it seemed like express would have all of the pieces I need.
Just to boot, I then read a few articles about express being faster, such as this one. But then I read this issue on GitHub and almost thought, "maybe I should keep using restify," but then I thought, "dammit, just get it done the way you want it built," so it was back to express. It's easy enough to change later, so I just got back to coding, albeit with express now. So keep reading, and in the next day or two I'll have part two of this series on using cURL to hack at your APIs.
I'm a huge advocate for high quality code. I'll admit I don't always get to write, or am not always able to write, high quality code. But day in and day out I make my best effort at figuring out the best way to write solid, high quality, easy to maintain, easy to read code.
Over the last year or so I’ve been working with Windows Azure (Amazon Web Services and other Cloud/Utility Platforms & Infrastructure also). One of the largest gaps that I’ve experienced when working with Windows Azure is the gross disregard for unit testing and especially unit testing in a Test Driven Development style way. The design of the SDK doesn’t make unit testing a high priority, and instead focuses mostly on what one might call F5 & Run Development.
I'll be the first to stand up and point out why F5 Driven Development (for more on this, check out Jeff Schumacher's blog entry) is one of the slowest and most distracting ways to build high quality code. I'd also be one of the first to admit that F5 Development encourages poor design and development. A developer has to juggle far too many things to waste time hitting F5 every few seconds to assure that the build is running and that code changes, additions, or deletions have been made correctly. If a developer avoids running the application when forced into F5 Development, the tendency is to produce a lot of code, most likely not refactored or tested, during each run of the application. The list of reasons not to develop this way gets long pretty quick. A developer needs to be able to write a test, implement the code, and run the test without a framework launching the development fabric, or worse, being forced to skip the test and run code that launches the whole development fabric framework.
Now don’t get me wrong, the development fabric is freaking AWESOME!! It is one of the things that really sets Windows Azure apart from other platforms and infrastructure models that one can develop to. But the level of work and effort makes effectively, cleanly, and intelligently unit testing code against Windows Azure with the development fabric almost impossible.
But with that context, I'm on a search to find some effective ways, within the current SDK limitations and frustrations, to write unit tests and encourage test driven development (TDD) or behaviour driven development (BDD) against Windows Azure, preferably using the SDK.
So far I’ve found the following methods of doing TDD against Windows Azure.
Don't use the SDK. The easiest way to go TDD or BDD against Windows Azure without being tightly bound to the SDK & Development Fabric is to ignore the SDK altogether and make regular service calls against the Windows Azure service end points. The problem with this, however, is that it basically requires one to rewrite all the things that the SDK wraps (albeit with better design principles). This is very time consuming, but it truly gives one absolute control over what they're writing and also releases one from the issues/nuances that the Windows Azure SDK (1.3 comes to mind) has had.
Abstract, abstract, and abstract, with a lot of stubbing, mocking, more stubbing, and some more abstractions underneath all of that, to make sure the development fabric doesn't kick off every time the tests are run. I don't want to abstract something just to fake, stub, or mock it. The level of indirection needed gets a bit absurd because of the design issues with the SDK. The big problem with this design process for moving forward with TDD and BDD is that it requires the SDK to basically be rewrapped as a whole virtual stubbed, faked, and mocked layer; a minimal sketch of the idea follows below. Reminds me of many of the reasons the Entity Framework is so difficult to work with for testing (has the EF been cleaned up, opened up, and those nasty sealed classes removed yet??)
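As an illustration of that abstraction-and-stub approach, here's a minimal C# sketch. The interface and class names are mine, not from the Windows Azure SDK; the production implementation would wrap the real SDK calls.

[sourcecode language=”csharp”]
using System.Collections.Generic;

// A seam around storage access so unit tests never touch the
// development fabric or the SDK.
public interface IMessageStore
{
    void Save(string partitionKey, string rowKey, string payload);
}

// Production implementation; the Windows Azure SDK calls would live here.
public class AzureTableMessageStore : IMessageStore
{
    public void Save(string partitionKey, string rowKey, string payload)
    {
        // Table storage calls via the SDK go here.
    }
}

// Test double used by unit tests; records saves in memory.
public class FakeMessageStore : IMessageStore
{
    public readonly List<string> Saved = new List<string>();

    public void Save(string partitionKey, string rowKey, string payload)
    {
        Saved.Add(partitionKey + "/" + rowKey + ":" + payload);
    }
}
[/sourcecode]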
Now I’ll admit, sometimes I miss the obvious things and maybe there is a magic “build tests real easy right here” button for Windows Azure, but I haven’t found it. I’d love to hear what else people are doing to enable good design principles around Windows Azure’s SDK. Any thoughts, ideas, or things I ought to try would be absolutely great – I’d love to read them. Please do comment!
I've had the pleasure of working with WCF on three specific projects that have brought me to this blog entry. I haven't used WCF on only three projects; there are just three that have brought me to write this entry. I've used WCF a lot, since back when it was a beta. WCF is great when creating SOAP services and you aren't too worried about the extra overhead. WCF is great for what it does, for the ideas behind what it does.
But writing RESTful web services doesn’t seem to be its strong point. On two huge projects WCF has basically been dropped, or so scaled back one really can’t honestly say that WCF is used, and either an alternate framework has been used or a LOT of custom code ends up being written.
The first time I used WCF to implement RESTful services was at Webtrends. Admittedly, there is a single service that returns all types of awesome reporting goodness; however, to implement basic auth, logging, polling, and a whole host of other enterprise scale needs, we had to custom roll most of it. Keep in mind, when doing this the WCF REST capabilities were brand shiny and new, so there were a few issues to work out. Now, maybe WCF could be used and a lot of it would be built in. However, as it was, we easily spent 60% of the time writing custom bits because WCF just didn't have the right options with the right bindings.
But I digress; I recently implemented an architecture of RESTful services using WCF. But now I've come to find myself dropping WCF, because of all the back and forth, and going with ASP.NET MVC controller actions that return JSON instead. With that, here's to the lean mean controller actions rockin' the JSON. Here's what I've done to port everything from WCF to MVC.
To see what I had done, except on a smaller scale, check out my previous blog entry on ASP.NET MVC with a WCF project smack in the middle of it. This will give you an idea of what I was using the WCF services for, merely to provide JSON results via RESTful services to an ASP.NET MVC front end requesting data with jQuery.
This is how I’ve setup the controller to return JSON results via an action.
First start a new ASP.NET MVC Project and add a new controller. Cleanup the controller so that you have the following in the controller.
[sourcecode language=”csharp”]
using System.Web.Mvc;
namespace RestWebServicesWithMvc.Controllers
{
public class ServicesController : Controller
{
}
}
[/sourcecode]
Now create a testing project to create your test first. Remember to add the reference to the ASP.NET MVC project. From here we can create the first test.
[sourcecode language=”csharp”]
using Microsoft.VisualStudio.TestTools.UnitTesting;
using RestWebServicesWithMvc.Controllers;
namespace RestWebServicesWithMvc.Tests
{
[TestClass]
public class UnitTest1
{
[TestMethod]
public void TestMethod1()
{
var controller = new ServicesController();
var result = controller.GetBiz();
Assert.IsNotNull(result);
}
}
}
[/sourcecode]
Now fill out the basic skeleton of the action in the controller.
[sourcecode language=”csharp”]
using System;
using System.Web.Mvc;
namespace RestWebServicesWithMvc.Controllers
{
public class ServicesController : Controller
{
public ActionResult GetBiz()
{
throw new NotImplementedException();
}
}
}
[/sourcecode]
Now we should have a good red running on our test. Let’s create a business model class to return as our result next.
[sourcecode language=”csharp”]
namespace RestWebServicesWithMvc.Models
{
public class BizEntity
{
public string BizName { get; set; }
public string StartupDate { get; set; }
public int SalesThisMonth { get; set; }
}
}
[/sourcecode]
Now let's return that object with some fake data. First add [AcceptVerbs(HttpVerbs.Post)] to the action in the controller. Then return a serializable object from the action, as shown below.
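Here's a sketch of what that finished action might look like; the sample values are made up purely for illustration.

[sourcecode language=”csharp”]
using System.Web.Mvc;
using RestWebServicesWithMvc.Models;

namespace RestWebServicesWithMvc.Controllers
{
    public class ServicesController : Controller
    {
        [AcceptVerbs(HttpVerbs.Post)]
        public ActionResult GetBiz()
        {
            // Fake data for now; any serializable object will do.
            return Json(new BizEntity
            {
                BizName = "The Example Biz",
                StartupDate = "1/1/2011",
                SalesThisMonth = 42
            });
        }
    }
}
[/sourcecode]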
This is a quick starter. There are a few dozen other options around this capability including other verb usage. For many, this is all you need for your services, especially if their primary purpose is to communicate with a specific website and one doesn’t want the overhead of WCF.