Decoding JSON Data Using the BizTalk Server 2013 R2 JSONDecode Pipeline Component

This is the fourth in a series of posts exploring What’s New in BizTalk Server 2013 R2. It is also the second in a series of three posts covering the enhancements to BizTalk Server’s support for RESTful services in the 2013 R2 release.

In my last post, I wrote about the support for JSON Schemas in BizTalk Server 2013 R2. I started out with a small project that included a schema generated from the Yahoo Finance API and a unit test to verify the schema model. I was going to put together a follow-up article last week, but spent the week traveling, in the hospital, and then recovering from being sick.

However, I am back and ready to tear apart the next installment that already hit the github repo a few days back.

Pipeline Support for JSON Documents

In BizTalk Server 2013 R2, Microsoft approached the problem of dealing with JSON content in a way fairly similar to the approach that we used in the previous version with custom components – performing the JSON conversion as an operation in the Decode stage of the pipeline, thus requiring the Disassemble stage to include an XMLDisassemble component for property promotion.

The official component, Microsoft.BizTalk.Component.JsonDecoder, takes two properties, Root Node and Root Node Namespace, that help determine how the XML message will be created.

Finally, there isn’t a JSONReceive pipeline included in BizTalk Server 2013 R2 – only the pipeline component was included. In other words, in order to work with JSON, you will need a custom pipeline.


Creating a Pipeline for Receiving JSON Messages

Ultimately, I would like to create a pipeline that is going to be reusable so that I don’t have to create a new pipeline for each and every message that will be received. Since BizTalk message types are all about the target namespace and root node name, it’s not reasonable to set that value to be the same for every message – despite having different message bodies and content. As a result, it might be best to leave the value blank at design time and instead set it per instance at runtime.

This is also an interesting constraint, because if we are receiving this message as something other than a service response, we might end up needing to create a fairly flexible schema (i.e., one with a lot more choice groups) depending on the variety of inputs/responses that may be received – something that will not be explored within this blog post, but would be an excellent discussion to bring up during one of QuickLearn’s BizTalk Server Developer Deep Dive classes.

In order to make the pipeline behave in a way that will be consistent with typical BizTalk Server message processing, I decided to essentially take what we have in the XMLReceive pipeline and simply add a JsonDecoder in the Decode stage, with none of its properties set at design time.

[Screenshot: the JSONReceive pipeline, with a JsonDecoder added to the Decode stage]

Testing the JSONReceive Pipeline

In the same vein as my last post, I will be creating automated tests for the pipeline to verify its functionality. However, we cannot use the built-in support for testing pipelines in this case – because properties of the pipeline were left blank, and the TestablePipelineBase class does not support per-instance configuration. Luckily, the Winterdom PipelineTesting library does support per-instance configuration – and it has a nifty NuGet package as of June.

Unfortunately, the per-instance configuration is not really pretty. It requires an XML configuration file that resembles the per-instance pipeline configuration section of a bindings file. In other words, it’s not as simple as setting properties on the class instance in code. To get around that to some degree, and to be able to reuse the configuration file with different property values, I put together a template with tokens in place of the actual property values.

[Screenshot: per-instance configuration template with tokenized property values]

NOTE: If you’re copying this approach for some other pipeline components, the vt attribute is actually very important in ensuring your properties will be read correctly. See KB884981 for details.
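For reference, here’s a sketch of what that tokenized template might look like – the stage category ID shown is the standard Decode stage ID, the token values are my own invention, and the actual file in the repo may differ:

<Root xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <Stages>
    <Stage CategoryId="9d0e4103-4cce-4536-83fa-4a5040674ad6">
      <Components>
        <Component Name="Microsoft.BizTalk.Component.JsonDecoder">
          <Properties>
            <RootNode vt="8">$RootNode$</RootNode>
            <RootNodeNamespace vt="8">$RootNodeNamespace$</RootNodeNamespace>
          </Properties>
        </Component>
      </Components>
    </Stage>
  </Stages>
</Root>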

From there, the per-instance configuration is a matter of XML manipulation and use of the ReceivePipelineWrapper class’ ApplyInstanceConfig method:

private void configureJSONReceivePipeline(ReceivePipelineWrapper pipeline, string rootNode, string namespaceUri)
{
    string configPath = Path.Combine(TestContext.DeploymentDirectory, "pipelineconfig.xml");

    var configDoc = XDocument.Load(configPath);

    configDoc.Descendants("RootNode").First().SetValue(rootNode);
    configDoc.Descendants("RootNodeNamespace").First().SetValue(namespaceUri);

    configDoc.Save(configPath);

    pipeline.ApplyInstanceConfig(configPath);
}

The final test code includes a validation of the output against the schema from last week’s post. As a result, we’re really dealing with an integration test here rather than a unit test, but it’s a test nonetheless.

[TestMethod]
[DeploymentItem(@"Messages\sample.json")]
[DeploymentItem(@"Configuration\pipelineconfig.xml")]
public void JSONReceive_JSONMessage_CorrectValidXMLReturned()
{

    string rootNode = "ServiceResponse";
    string namespaceUri = "http://schemas.finance.yahoo.com/API/2014/08/";

    string sourceDoc = Path.Combine(TestContext.DeploymentDirectory, "sample.json");
    string schemaPath = Path.Combine(TestContext.DeploymentDirectory, "ServiceResponse.xsd");
    string outputDoc = Path.Combine(TestContext.DeploymentDirectory, "JSONReceive.out");

    var pipeline = PipelineFactory.CreateReceivePipeline(typeof(JSONReceive));

    configureJSONReceivePipeline(pipeline, rootNode, namespaceUri);

    using (var inputStream = File.OpenRead(sourceDoc))
    {
        pipeline.AddDocSpec(typeof(ServiceResponse));
        var result = pipeline.Execute(MessageHelper.CreateFromStream(inputStream));

        Assert.IsTrue(result.Count > 0, "No messages returned from pipeline.");

        using (var outputFile = File.OpenWrite(outputDoc))
        {
            result[0].BodyPart.GetOriginalDataStream().CopyTo(outputFile);
            outputFile.Flush();
        }

    }

    ServiceResponse schema = new ServiceResponse();
    Assert.IsTrue(schema.ValidateInstance(outputDoc, Microsoft.BizTalk.TestTools.Schema.OutputInstanceType.XML),
        "Output message failed validation against the schema");

    Assert.AreEqual("44.97", XDocument.Load(outputDoc).Descendants("Bid").First().Value, "Incorrect Bid amount in output file");

}

After giving it a run, it looks like we have a winner.

[Screenshot: passing test run]

Coming Up in the Next Installment

In the next installment of this series, I will actually put to use what we have here, and build out a more complete integration that allows us to experience sending JSON messages as well, using the new JsonEncoder component.

Take care until then!

If you would like to access sample code for this blog post, you can find it on github.

JSON Schemas in BizTalk Server 2013 R2

This is the third in a series of posts exploring What’s New in BizTalk Server 2013 R2. It is also the first in a series of three posts covering the enhancements to BizTalk Server’s support for RESTful services in the 2013 R2 release.

In my blog post series covering the last release of BizTalk Server 2013, I ran a 5 post series covering the support for RESTful services, with one of those 5 discussing how one might deal with JSON data. That effort yielded three separate executable components:

  1. JSON to XML Converter for use with the WFX Schema Generation tool
  2. JSON Decoder Pipeline Component (JSON –> XML)
  3. JSON Encoder Pipeline Component (XML –> JSON)

It also yielded some good discussion of the same over on the connectedcircuits blog, wherein some glitches in my sample code were addressed – many thanks for that!

All of that having been said, similar components in one form or another are now available out of the box with BizTalk Server 2013 R2 – and I must say the integrated VS 2013 tooling blows away a 5 minute WinForms app. In this post we will begin an in-depth examination of this improved JSON support by first exploring the support for JSON Schemas within a BizTalk Server 2013 R2 project.

How Does BizTalk Server Understand My Messages?

All BizTalk Server message translation occurs at the intersection between 2 components: (1) A declarative XSD file that defines the model of a given message, with optional inline parsing/processing annotations, and (2) an executable pipeline component (usually within the disassemble stage of a receive pipeline or assemble stage of the send pipeline) that reads the XSD file and uses any inline annotations necessary to parse the source document.

This is the case for XML documents, X12, EDIFACT, Flat-file, etc… It only logically follows then that this model could be extended for JSON. In fact, that’s exactly what the BizTalk Server team has done.

JSON is an interesting beast, however, as there already exists a schema format for specifying the shape of JSON data. BizTalk Server prefers working with XSD, and makes no exception for JSON. Surprisingly, this XSD looks no different than any other XSD, and contains no special annotations to reflect the message being typically represented as JSON content.

What Does a JSON Schema Look Like?

Let’s consider this JSON snippet, which represents the output of the Yahoo! Finance API performing a stock quote for MSFT:

[Screenshot: JSON returned by the Yahoo! Finance API for a MSFT quote]
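Since the screenshot itself is gone, here is a minimal stand-in that exhibits the same characteristics described below – the quote record and the null Ask are the important parts, and the Bid value matches the one asserted in the tests earlier in this series; everything else is illustrative:

{
  "quote": {
    "symbol": "MSFT",
    "Ask": null,
    "Bid": "44.97"
  }
}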

This is a pretty simple instance, and it is also an interesting case because it has a null property Ask, as well as a repeating record quote that does not actually repeat in this instance. I went ahead and saved this snippet to the worst place possible – my Desktop – as quote.json and then created a new Empty BizTalk Server Project in Microsoft Visual Studio 2013 (with the recently released Update 3).

From there I can generate a schema for this document by using the Add > New Item… context-menu item for the project within Solution Explorer, and then choosing JSON Schema Wizard in the Add New Item dialog:

[Screenshot: JSON Schema Wizard in the Add New Item dialog]

The wizard looks surprisingly like the Flat-file Schema Wizard, and it looks like quite a bit of that work might have been lifted and re-purposed for the JSON Schema Wizard. What’s nice about this wizard, though, is that this is really the only page requiring input (the previous page is the obligatory Welcome screen) – so you won’t be walking through the input document while also recursively walking through the wizard.

[Screenshot: JSON Schema Wizard configuration page]

Instead, the wizard makes some core assumptions about what the schema should look like (much like the WFX schema generator). In the case of this instance, it’s not looking so great. Aside from essentially every single element being optional in the schema, the quote record was not set as having a maxOccurs of unbounded – though this should really be expected, given that our input instance gave no indication of repetition. Then again, maybe the wizard could have been written to infer that upon noticing the record was a child of a record with a plural name – which would be an interesting option to see.

[Screenshot: schema generated by the wizard]

Next, the Ask element was typed as anyType instead of decimal – which, again, should be expected given that it was simply null in the input instance. However, maybe this could be an opportunity to add pages to the wizard asking for the proper type of any null items in the input instance.

Essentially, it may take some initial massaging to get everything in place and happy. After tweaking the minOccurs and maxOccurs, as well as types assigned to each node, I decided it would be a good time to ensure that my modifications would still yield a schema that would properly validate the input instance I provided to the wizard.
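As a sketch (element names taken from the instance above, exact structure assumed), the corrected portion of the schema would express something like this:

<xs:element name="quote" minOccurs="0" maxOccurs="unbounded">
  <xs:complexType>
    <xs:sequence>
      <xs:element name="symbol" type="xs:string" minOccurs="0" />
      <xs:element name="Ask" type="xs:decimal" minOccurs="0" />
      <xs:element name="Bid" type="xs:decimal" minOccurs="0" />
    </xs:sequence>
  </xs:complexType>
</xs:element>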

How do We Test These Schemas Or Validate Our JSON Instances?

Quite simply, you don’t. At least not using the typical Validate Instance option available in the Solution Explorer context-menu for the .xsd file. Instead, this will require a little bit of work in custom code.

Where am I writing that custom code? Well right now I’m on-site in Enterprise, Alabama teaching a class that involves a lot of automated testing. As a result, I’m in the mood for writing some unit tests for the schema – which also means updating the project properties so that the class generated for the schema derives from TestableSchemaBase and adds a method we can use to quickly validate an instance against the schema.

[Screenshot: project properties enabling unit testing]

It also means adding a new test project to the solution with a reference to the following assemblies:

  • System.Xml
  • C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\PublicAssemblies\Microsoft.BizTalk.TOM.dll
  • C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\PublicAssemblies\Microsoft.BizTalk.TestTools.dll
  • C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\PublicAssemblies\Microsoft.XLANGs.BaseTypes.dll
  • Newtonsoft.Json (via NuGet package)

That’s not all the setup required, unfortunately. I still have to add a new TestSettings file to the solution, ensure that deployment is enabled, that it deploys the Microsoft.BizTalk.TOM.dll assembly listed above, and that it is configured to run tests in a 32-bit host. From there, I need to click TEST > Test Settings > Select Test Settings File to select the added TestSettings file.

[Screenshots: TestSettings configuration steps]

With all the references in place and the solution all setup, I’ll want to bring in the message instance(s) to validate. In order to ensure that the test has access to these items at runtime, I will add the applicable DeploymentItem attribute to each test case that requires one.

using Microsoft.VisualStudio.TestTools.UnitTesting;
using Newtonsoft.Json;
using System.IO;
using System.Xml;

namespace QuickLearn.Finance.Messaging.Test
{
    [TestClass]
    public class ServiceResponseTests
    {

        [TestMethod]
        [DeploymentItem(@"Messages\sample.json")]
        public void ServiceResponse_ValidInstanceSingleResultNullAsk_ValidationSucceeds()
        {

            // Arrange
            ServiceResponse target = new ServiceResponse();
            string rootNode = "ServiceResponse";
            string namespaceUri = "http://schemas.finance.yahoo.com/API/2014/08/";
            string sourceDoc = Path.Combine(TestContext.DeploymentDirectory, "sample.json");
            string sourceDocAsXml = Path.Combine(TestContext.DeploymentDirectory, "sample.json.xml");

            ConvertJsonToXml(sourceDoc, sourceDocAsXml, rootNode, namespaceUri);

            // Act
            bool validationResult = target.ValidateInstance(sourceDocAsXml, Microsoft.BizTalk.TestTools.Schema.OutputInstanceType.XML);

            // Assert
            Assert.IsTrue(validationResult, "Instance {0} failed validation against the schema.", sourceDoc);

        }

        public void ConvertJsonToXml(string inputFilePath, string outputFilePath,
            string rootNode = "Root", string namespaceUri = "http://tempuri.org", string namespacePrefix = "ns0")
        {
            var jsonString = File.ReadAllText(inputFilePath);
            var rawDoc = JsonConvert.DeserializeXmlNode(jsonString, rootNode, true);

            // Here we are ensuring that the custom namespace shows up on the root node
            // so that we have a nice clean message type on the request messages
            var xmlDoc = new XmlDocument();
            xmlDoc.AppendChild(xmlDoc.CreateElement(namespacePrefix, rawDoc.DocumentElement.LocalName, namespaceUri));
            xmlDoc.DocumentElement.InnerXml = rawDoc.DocumentElement.InnerXml;

            xmlDoc.Save(outputFilePath);
        }

        public TestContext TestContext { get; set; }
    }
}

What Exactly Am I Looking At Here?

Here in the code we’re converting our JSON instance to XML using the Newtonsoft.Json library. Once it is in an XML format, it should (in theory at least) conform to the schema definition generated by the BizTalk JSON Schema Wizard. From there, we take the output XML and feed it into the ValidateInstance method of the schema to perform validation.

The nice thing about doing it this way is that you will not only get a copy of the file to use within the automated test itself, but you can also use the file generated within the test in concert with the Validate Input Instance option of the schema for performing quick manual verifications as well.

After updating the schema, it looks like it’s going to be in a usable state for consuming the service:

[Screenshot: the final schema]

Coming Up Next Week

Next week will be part 2 of the JSON series in which we will test and then use this schema in concert with the tools that BizTalk Server 2013 R2 provides for consuming JSON content.

If you would like to access sample code for this blog post, you can find it on github.

BizTalk Server 2013 Support for RESTful Services (Part 5/5)

This post is the seventeenth in a weekly series intended to briefly spotlight those things that you need to know about new features in BizTalk Server 2013. It is also the final part of a five-part series on REST support in BizTalk Server 2013.

I’m going to forego a long introduction this week by simply stating outright that we are going to see how it is possible to do OAuth with BizTalk Server 2013. That being said up front, I do want to take a few moments to recall where we have been, and what has led up to this post.

Re-cap of the RESTful Services Series

We’ve covered quite a bit of ground so far in this mini-series. If you have not yet read the parts leading up to this post, you will want to do so before we begin. You can find the previous posts here:

Giving BizTalk 2013 the Valet Keys

If you’ve been listening to the messaging and buzz around BizTalk Server 2013, it would appear that one of the major focuses of the 2013 version is all about enabling integration with cloud-based services. Most of these services expose APIs that make it possible to automate that integration, and a growing number of those are RESTful APIs. Each service provider can choose the methods with which they perform authentication and authorization. However, a fairly common choice1 is to use OAuth for authorization.

Let’s imagine that we’re going to use BizTalk Server 2013 to call the Salesforce API. Before we can make the API call, a call is made to request a special token (essentially a valet key) that will provide limited access to specific functionality within Salesforce for BizTalk Server’s use.

This means for each call that is made, a corresponding token request call must be made. How can we handle this? Ultimately, we could make BizTalk Server do this any number of ways. We could use a custom pipeline component in the Send pipeline to request the token before making the call. We could set up an orchestration to coordinate these two sends – or we could even call into custom .NET code which will get the token. However, as with the CORS headers last week, I would rather push this need closer to the transport (since that’s closer to where the issue originates).

Making WCF Do the Heavy Lifting

Since, in our fictional scenario here, we’re dealing with a RESTful API, the WCF-WebHttp adapter will be in play. As a WCF adapter, we are able to take advantage of the WCF extensibility model during configuration via the Behavior tab of the adapter configuration:

[Screenshot: Behavior tab of the adapter configuration]

What does this enable us to do? At the end of the day, it allows us to take full control of the message right before being transmitted over the wire (and even right after being received over the wire in the case of a reply). For the purposes of implementing OAuth, we can actually simply look to a portion of one of the tutorials that shipped as part of the product documentation for BizTalk Server 2013 on MSDN: Integrating BizTalk Server 2013 with Salesforce.

By reading the document linked above, you will find that the tutorial is taking advantage of the IClientMessageInspector interface to provide a last-minute hook into the communications process. During that last minute, in the BeforeSendRequest method, the OAuth token is retrieved through a specially constructed POST request to the OAuth authorization endpoint (which can differ depending on the instance to which you have been assigned):

    private void FetchOAuthToken()
    {
        if ((tokenExpiryTime_ == null) || (tokenExpiryTime_.CompareTo(DateTime.Now) <= 0))
        {
            StringBuilder body = new StringBuilder();
            body.Append("grant_type=password&")
                .Append("client_id=" + consumerKey_ + "&")
                .Append("client_secret=" + consumerSecret_ + "&")
                .Append("username=" + username_ + "&")
                .Append("password=" + password_);

            string result;

            try
            {
                result = HttpPost(SalesforceAuthEndpoint, body.ToString());
            }
            catch (WebException)
            {
                // do something
                return;
            }

            // Convert the JSON response into a token object
            JavaScriptSerializer ser = new JavaScriptSerializer();
            this.token_ = ser.Deserialize<SalesforceOAuthToken>(result);
            this.tokenExpiryTime_ = DateTime.Now.AddSeconds(this.sessionTimeout_);
        }
    }

NOTE: I do not advocate merely swallowing exceptions. You will want to do something with that WebException that is caught.

After the token is returned (decoded here using the JavaScriptSerializer, as opposed to taking a dependency on Json.NET for the JSON parsing), an Authorization header containing the OAuth token provided by the authorization endpoint is appended to the request message.

WebHeaderCollection headers = httpRequest.Headers;
FetchOAuthToken();

headers.Add(HttpRequestHeader.Authorization, "OAuth " + this.token_.access_token);
headers.Add(HttpRequestHeader.Accept, "application/xml");

Registering Extensions

While the tutorial provides instructions for registering extensions via the machine.config file, a ready alternative is available.

For any WCF-based adapter, each host can provide unique configuration data specifying extensions that are available for use by that adapter when executed by the host. Performing the configuration here can ease the configuration burden on those BizTalk groups containing more than a single machine.
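For reference, whether it lands in machine.config or in the per-host handler configuration, the registration itself is a standard WCF behavior-extension entry along these lines – the extension name and type here are hypothetical stand-ins for the tutorial’s message inspector behavior:

<system.serviceModel>
  <extensions>
    <behaviorExtensions>
      <add name="salesforceOAuthBehavior"
           type="Example.Salesforce.OAuthBehaviorExtensionElement, Example.Salesforce, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0123456789abcdef" />
    </behaviorExtensions>
  </extensions>
</system.serviceModel>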

[Screenshot: per-host WCF extension configuration in the adapter handler properties]

Ready to REST

I highly recommend that you read that tutorial, and spend some time replicating that solution so that you can see what BizTalk Server 2013 is bringing to the table with regards to integration with RESTful services.

I hope that this series has also accomplished the same – that you can see the possibilities, the potential extensibility points, and have a greater appreciation for BizTalk Server 2013 as the integration tool ready to tackle those more tricky hybrid on-premise meets cloud scenarios.

Until next week, take care!

1 Just because it’s common does not mean it is the least complex or best choice – but it does mean that it can end up being one you’re stuck with.

BizTalk Server 2013 Support for RESTful Services (Part 4/5)

Did you click a link from the newsletter expecting a post on ESB Toolkit 2.2? Head right over here. Otherwise, keep reading!

This post is the sixteenth in a weekly series intended to briefly spotlight those things that you need to know about new features in BizTalk Server 2013. It is also the fourth part of a five-part series on REST support in BizTalk Server 2013.

If you haven’t been following the REST series, stop reading right now, and then read from the beginning of the series. I’m not re-covering any already covered ground in this post.

In this post we will discuss dealing with incoming JSON requests, such that they become readable/map-able XML for which we can generate a schema. Ultimately there are two ways that this can be accomplished, and both have actually already been extensively documented. The first method is using a custom pipeline component (demonstrated here by Nick Heppleston); the second is extending WCF with a custom behavior (briefly discussed here by Saravana Kumar).

Both of the articles linked in the above paragraph really only deal with the receive side of the story. In this article I will set out to demonstrate both receiving JSON data, and responding with JSON data. In the process I will demonstrate two methods of dealing with cross-domain requests (CORS and JSONP), and build pipeline components and pipelines to handle both scenarios.

Getting Started with a Custom Pipeline Component

In order to get started with custom pipeline component development, I will be reaching again for the BizTalk Server Pipeline Component Wizard (the link points to the patch for BizTalk Server 2013, which requires InstallShield LE and an utterly maddening registration/download process). I am going to use the wizard to create two projects – one for a JsonDecoder pipeline component and one for a JsonEncoder pipeline component.

Each of these projects will be taking a dependency on Newtonsoft’s excellent Json.NET library for the actual JSON serialization/deserialization. This will ultimately be done through the official NuGet package for Json.NET.

Since we can have JSON data that starts as just a raw list of items, we likely want to wrap a named root item around that for the purpose of XML conversion. In order to choose the name for that root node, we will rely on the name of the operation that is being processed. However, in the case of a single operation with a lame name, we can expose a pipeline component property to allow overriding that behavior.

Another pipeline component property that would be nice to expose is a target namespace for the XML that will be generated.
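As a quick sketch, those two design-time properties come down to a pair of public properties on the component (persistence via IPersistPropertyBag is omitted here for brevity):

// Overrides the operation-name-based root node name when a value is supplied.
public string RootNode { get; set; }

// Target namespace stamped onto the root node of the generated XML.
public string Namespace { get; set; }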

Creating XML from JSON Data

Creating XML from JSON formatted data is actually supported out of the box with the JSON.NET library. We can do so with the following code:

var jsonString = jsonStream.Length == 0
    ? string.Empty
    : Encoding.GetEncoding(inmsg.BodyPart.Charset ?? Encoding.UTF8.WebName)
          .GetString(jsonStream.ToArray());

// Name the root node of the generated XML doc after the operation, unless
// a specific name has been specified via the RootNode property.
var rawDoc = JsonConvert.DeserializeXmlNode(jsonString,
    string.IsNullOrWhiteSpace(this.RootNode) ? operationName : this.RootNode,
    true);

// Here we are ensuring that the custom namespace shows up on the root node
// so that we have a nice clean message type on the request messages
var xmlDoc = new XmlDocument();
xmlDoc.AppendChild(
    xmlDoc.CreateElement(DEFAULT_PREFIX, rawDoc.DocumentElement.LocalName, this.Namespace));
xmlDoc.DocumentElement.InnerXml = rawDoc.DocumentElement.InnerXml;

In the above snippet, you are seeing the operationName variable out of context. That variable will contain the value of the current operation requested (which is defined by you within the adapter properties for the WCF-WebHttp adapter, and then matched at runtime to the HTTP Method and URL requested).

Another weird thing that we are doing here is making sure that we have full control over the root node, by allowing it to be generated by the JSON.NET library, and then replacing it with our own namespace-qualified root node.

Creating JSON from XML Data

Taking XML data and generating JSON formatted data is nearly as easy:

XmlDocument xmlDoc = new XmlDocument();
xmlDoc.Load(inmsg.BodyPart.Data);

// Remove the XML declaration node, if present, so that it does not
// end up represented in the JSON output
if (xmlDoc.FirstChild.LocalName == "xml")
    xmlDoc.RemoveChild(xmlDoc.FirstChild);

// Remove any root-level attributes added in the process of creating the XML
// (Think xmlns attributes that have no meaning in JSON)
xmlDoc.DocumentElement.Attributes.RemoveAll();

string jsonString = JsonConvert.SerializeXmlNode(xmlDoc, Newtonsoft.Json.Formatting.Indented, true);

There are a few lines of code there dedicated to dealing with some quirks in the process. Namely, if you do not remove both the potentially prepended “xml” processing instruction node and the namespace attributes on the root node, these will show up in the JSON output – yikes!

Creating Schemas for JSON Requests and Responses

Assuming that the data will be in XML form whenever BizTalk gets a hold of it, how can we generate a representative schema without much pain? Our best bet will be simply taking some JSON, throwing it through the same process above, and then generating a schema from the resulting XML.

If you haven’t already, you will need to install the Well Formed XML schema generator by heading over to the C:\Program Files (x86)\Microsoft BizTalk Server 2013\SDK\Utilities\Schema Generator folder and then running the InstallWFX.vbs script that you will find there:

[Screenshot: InstallWFX.vbs in the Schema Generator folder]

In order to make it easier to run some JSON formatted data through this process, I created a quick no-frills sample app that can do the conversion and save the result as an XML file:

[Screenshot: sample JSON-to-XML conversion utility]

Once we have the output of that application, we can run it through the schema wizard, and end up with a schema that looks somewhat like the following (for which much polishing is required with regards to the selection of data types):

[Screenshot: schema generated from the converted XML]

Dealing with Cross-Site Requests

Modern browsers will be quite leery of allowing for cross-domain requests (i.e., requests originating from a site on one domain which target a service on a different domain). That is not to say, however, that every instance of this is inherently bad.

One of the things that I wanted to make sure of is that I could publish a service which would allow for requests originating from a different domain. There are two common approaches for allowing this to happen.

One of them is handled at the HTTP level through specialized headers that are passed through a series of requests to ensure that a service is willing to be called from a different domain. This method is known as CORS, or Cross-Origin Resource Sharing.

The other method is much more hacky and really ought to be reserved for those cases where the browser can’t cope with CORS. This other method is JSONP (JSON with Padding).

Examining CORS

Let’s start this whole discussion by taking a look at CORS first. Assume that we have the following code:

var request = $.ajax({
    url: $("#serverUrl").val(),
    type: 'POST',
    contentType: 'application/json',
    data: JSON.stringify(data),
    processData: false,
    dataType: 'json'
});

Now assume that the item on the page with the id serverUrl contains a URL for a completely external site. If that is the case, and you’re using a modern browser that supports CORS, you will see something like this if you watch your HTTP traffic:

[Screenshot: CORS preflight traffic captured in Fiddler]

Before the actual request is sent (with the data posted in the body), a simple OPTIONS request is issued to determine what is possible with the resource that is about to be requested. It includes Access-Control-Request headers that indicate which HTTP method is desired, and which headers are desired to be sent. Here the server must respond in kind, saying that it allows requests from the origin domain (notice the Origin header calling out the site originating the request), and saying that it supports the method/headers that the origin site is interested in.
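A bare-bones version of that preflight request would look something like this (host names illustrative):

OPTIONS /service.svc/ItemAverageRequest HTTP/1.1
Host: biztalk.example.org
Origin: http://client.example.org
Access-Control-Request-Method: POST
Access-Control-Request-Headers: content-type, accept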

If it does not respond in the correct manner, the conversation is over, and errors explode all over the place.

Getting back to the BizTalk world for a second, if your REST Receive Location isn’t expecting this, you will see this lovely response:

[Screenshot: error response returned when the Receive Location is not expecting the OPTIONS request]

Technically, you won’t see any Access-Control headers at all in the response (I took the screenshot after un-configuring some things, but apparently I missed that one). Ultimately the error comes down to a contract mismatch. More specifically, it will read something like this:

The message with To ‘http://your.service.url.here.example.org/service.svc/operation’ cannot be processed at the receiver, due to an AddressFilter mismatch at the EndpointDispatcher.

This OPTIONS request actually needs to become part of your contract for each endpoint for which you wish to allow cross-domain communications. Additionally, you have to be ready to provide a happy and valid response for this message.

Configuring IIS to Send CORS Headers

Ultimately there are a few ways we could inject the headers required to make CORS happen. I’m going to go with having IIS inject those headers at the server level (rather than doing it at the adapter configuration level). I am going to do this at the level of the virtual directory created by the WCF Service Publishing Wizard when I published my REST endpoint. Examining the Features View for that virtual directory, we should find HTTP Response Headers under the IIS group:

[Screenshot: HTTP Response Headers feature in IIS]

From there, I can begin adding the headers for those things that I would like to allow:

[Screenshot: adding a custom HTTP response header]

Above, I’m setting a header which indicates that I wish to allow requests from any origin domain. After some mucking around with Fiddler to find out what my little test page would actually be requesting using IE, I ended up with this set of headers that seemed to make the world happy:

[Screenshot: final set of CORS response headers]
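Since that screenshot is also gone, a plausible reconstruction of that header set (based on the CORS spec and the preflight request shown earlier – treat the exact values as assumptions) would be:

Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: POST, GET, OPTIONS
Access-Control-Allow-Headers: Content-Type, Accept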

Pesky Empty Messages

Even with the correct headers being sent back in the response, the response code will still be 500 until we can properly deal with the incoming empty OPTIONS message, and ensure that we are routing a response back such that a 200 response code is returned.

In BizTalk Server the easiest way to get a message from the receive pipeline to route back to the send pipeline is through the RouteDirectToTP property. This is the same property that ensures that EDI Acknowledgements are routed directly back to the trading partner on the send pipeline of the receive port, and the same property used by the ESB Toolkit to create the Forwarder pipeline component (short circuiting a default instance subscription for messaging-only request/response service requests, and restoring it after multiple requests have been made).

So how are we going to use it here? Well, we’re going to detect if we’re dealing with a CORS message in the pipeline. If we are, then we are going to promote that property with a value of “true” to indicate that we want that empty message to flow right back to the response pipeline as the empty body of our lovely 200 OK response. The code for that looks a little something like this (use your imagination, or the GitHub repo, for the values of those constants):

#region Handle CORS Requests

// Detect if the incoming message is an HTTP CORS request
// http://www.w3.org/TR/cors/

// If it is, we will promote both the RouteDirectToTP property and the
// EpmRRCorrelationToken so that the request is immediately routed
// back to the send pipeline of this receive port

object httpMethod = null;
httpMethod = inmsg.Context.Read(HTTP_METHOD_PROPNAME, WCF_PROPERTIES_NS);

if (httpMethod != null && (httpMethod as string) == OPTIONS_METHOD)
{
    object curCorrToken = inmsg.Context.Read(EPM_RR_CORRELATION_TOKEN_PROPNAME, SYSTEM_PROPERTIES_NS);
    inmsg.Context.Promote(EPM_RR_CORRELATION_TOKEN_PROPNAME, SYSTEM_PROPERTIES_NS, curCorrToken);
    inmsg.Context.Promote(ROUTE_DIRECT_TO_TP_PROPNAME, SYSTEM_PROPERTIES_NS, true);

    var corsDoc = new XmlDocument();
    corsDoc.AppendChild(corsDoc.CreateElement(DEFAULT_PREFIX, CORS_MSG_ROOT, JSON_SCHEMAS_NS));
    writeMessage(inmsg, corsDoc);

    return inmsg;
}

#endregion

The reason that we are bothering to create an XML document out of this empty message is that this decode component will be followed by an XML disassembler component that will need to recognize the data. By the end of this post we will have two special-case XML documents for which we will create schemas. In the case of this CORS message, we will need to ensure that the JsonEncoder returns this XML as an empty response body. The following code takes care of the other side of the equation:

#region Handle CORS Requests

// Detect if the incoming message is an HTTP CORS request
// http://www.w3.org/TR/cors/

object httpMethod = null;
httpMethod = inmsg.Context.Read(HTTP_METHOD_PROPNAME, WCF_PROPERTIES_NS);

if (httpMethod != null && (httpMethod as string) == OPTIONS_METHOD)
{
    // Remove the message body before returning
    var emptyOutputStream = new VirtualStream();
    inmsg.BodyPart.Data = emptyOutputStream;

    return inmsg;
}

#endregion

At this point we should be able to receive the OPTIONS message through the receive pipeline, and have it properly route back through the response pipeline. Beyond that, we just need to make sure that we pair our actual resources, within the operation mapping of the adapter config, with mappings for the CORS request:

<Operation Name="CrossDomainCheck" Method="OPTIONS" Url="/ItemAverageRequest" />
<Operation Name="ItemAverageRequest" Method="POST" Url="/ItemAverageRequest" />

Examining JSONP

Well that was one way that we could allow for cross-domain requests – requests that allowed for the POSTing of lots of wonderful data. But what if we just want to get some data, we don’t want to waste HTTP requests, and we’re not using a browser that supports CORS (or maybe we just want to live dangerously)? There is an alternative – JSONP.

Essentially this is a cheap hack that got its own acronym to legitimize how terrible of an idea it actually is.

I think the best way to explain this one is to examine a sample code snippet, and then the conversation that we want to get out of it:

var request = $.getJSON($("#serverUrlJsonp").val() + "?callback=?");

Notice the ?callback=? portion of that snippet. It’s appending a querystring parameter named callback to the URL, with a value to be filled in by jQuery. So what will it fill in? A randomly generated function name for receiving data back from the server. But how is the server going to execute local code within the browser, you might ask? Let’s take a look:

[Screenshot: JSONP request and response captured in Fiddler]

jQuery seems to populate the callback querystring parameter with a suitably random value. Check out what happens in the response: the response is a function call to that randomly generated callback function. The way JSONP cheats the system to get a cross-domain request working is by injecting a script node into the DOM with the source set to the service (passing the callback name in the querystring). Then, as the service responds, the response is executed as JavaScript code. This is definitely one where you’re putting a lot of trust in the service.
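Concretely, if jQuery issued GET /ItemsList?callback=jQuery18204395872 (callback name generated by jQuery), the response body would be a script statement rather than bare JSON – something along these lines, with the item data illustrative:

jQuery18204395872({"items":[{"id":"1","amount":"42.00"}]});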

So how do we make it work with BizTalk?

Capturing the JSONP Callback Name on Receive

JSONP’s ability to work depends on the ability of the service to craft a call to the callback method named by the callback querystring parameter. The easiest way to capture this is through an operation mapping combined with variable mapping:

<Operation Name="GetItemListJsonp" Method="GET" Url="/ItemsList?callback={callback}" />

[Screenshot: variable mapping for the callback parameter]

Essentially what we’re doing here is relying on the adapter to promote the callback function’s name to the context of the message, so that it will be available as we generate the response JSON.

Pipeline Component Changes to Deal with JSONP

In order to deal properly with JSONP, we will have to make some minor modifications to both the JsonDecoder and the JsonEncoder pipeline components. Starting with the JsonDecode component which will be executing upon initial receipt of our request message, we will add the following code:

#region Handle JSONP Request

// Here we are detecting if there has been any value promoted to the jsonp callback property
// which will contain the name of the function that should be passed the JSON data returned
// by the service.

object jsonpCallback = inmsg.Context.Read(JSONP_CALLBACK_PROPNAME, JSON_SCHEMAS_NS);
string jsonpCallbackName = (jsonpCallback ?? (object)string.Empty) as string;

if (!string.IsNullOrWhiteSpace(jsonpCallbackName))
{
    var jsonpDoc = new XmlDocument();
    jsonpDoc.AppendChild(jsonpDoc.CreateElement(DEFAULT_PREFIX, JSONP_MSG_ROOT, JSON_SCHEMAS_NS));
    writeMessage(inmsg, jsonpDoc);

    return inmsg;
}

#endregion

This code is examining the callback context property to determine if there is any value promoted. If there is, it is assumed that this is a JSONP request. It also assumes that all JSONP requests will have an empty body, and as such will require a placeholder node in order to make it through the following XmlDisassembler.

On the response side, these are the modifications made to the JsonEncoder component:

#region Handle JSONP Request

// Here we are detecting if there has been any value promoted to the jsonp callback property
// which will contain the name of the function that should be passed the JSON data returned
// by the service.

object jsonpCallback = inmsg.Context.Read(JSONP_CALLBACK_PROPNAME, JSON_SCHEMAS_NS);
string jsonpCallbackName = (jsonpCallback ?? (object)string.Empty) as string;

if (!string.IsNullOrWhiteSpace(jsonpCallbackName))
    jsonString = string.Format("{0}({1});", jsonpCallbackName, jsonString);

#endregion

Here we’re jumping in at the last minute, before the JSON formatted data is returned, and wrapping the whole thing in a function call – if the callback function name exists in the context.

Building a Simple Proof of Concept Service

After burning an evening on these components, I decided it would only be worthwhile if I could actually use them to do something interesting.

The parameters I set out for myself were that I had to have a service that could be called cross-domain from a simple .html page with only client-side JavaScript goodness. As a result, I resurrected my dual-listener orchestration anti-pattern from one of my previous posts (a hijacked convoy pattern?) to create the following monstrosity:

[Screenshot: orchestration exposing the two resources]

So what’s going on here, and what does this service allow me to do? Well, it exposes two resources: (1) ItemsList and (2) ItemAverageRequest. They both deal in “items”. There really is no meaning behind what an item is; I just needed something to test with, and item was a nice fit.

The first resource, ItemsList is something that you can GET. As a result it will return a nice JSON formatted list of “items” each having an “amount” of some kind and a unique “id”.

The ItemAverageRequest resource is a place where you can POST a new ItemAverageRequest (really this is just a list of items exactly as returned by the first resource), and as a result you will receive the cumulative average “amount” for the items in the list.

The maps in the orchestration shown above ensure that these resources have the described behavior.

Testing the Service

So did all of this work pay off, and give us happy results? I built a simple test client to find out. Here’s what the JSONP request looks like against the ItemsList resource (the textarea in the background contains the raw response from the service):

[Screenshot: JSONP request against the ItemsList resource]

The raw HTTP conversation for this request was actually shown above in the Fiddler screenshot from the JSONP discussion.

Leaving that data in the textarea as input, and then invoking the POST against the ItemAverageRequest resource yields the following:

[Screenshot: result of the POST against the ItemAverageRequest resource]

The HTTP conversation for this request happened in two parts, represented below:

[Screenshots: the CORS preflight exchange and the POST request/response]

Summary

I really hope I was able to add something to the conversation around using JSON formatted data with the WCF-WebHttp adapter in BizTalk Server 2013. There’s certainly more to it than just getting the happy XML published to the MessageBox.

Was a pipeline component the right choice to handle protocol-level insanity like CORS? Probably not; that part of it is something that we should ultimately have implemented by extending WCF and having the adapter deal with it – especially since CORS really doesn’t have anything to do with JSON directly. I’ll leave that to the commenters and other BizTalk bloggers out there to consider.

JSONP, however, ultimately becomes an output-encoding issue, and since we had already dealt with one cross-domain communications issue in the pipeline, it was natural to handle CORS in the same fashion as well.

Before I sign off for the remainder of the week, I want to take this time to remind you that if you would like to take your BizTalk skills to the next level with BizTalk Server 2013, you should check out one of our upcoming BizTalk Server 2013 Deep Dive Classes, where you will learn all about fun stuff like custom pipeline component development and have an opportunity to get hands on with some of the new adapters!

If you would like to access sample code for this blog post, you can find it on github.

BizTalk Server 2013 Support for RESTful Services (Part 3/5)

This post is the fifteenth in a weekly series intended to briefly spotlight those things that you need to know about new features in BizTalk Server 2013. It is also the third part of a five-part series on REST support in BizTalk Server 2013.

Last week’s post worked through how one would use BizTalk Server to host a RESTful endpoint that would support multiple “operations” (i.e., expose multiple resources). This week we’re going to be doing quite the opposite – consuming a RESTful endpoint (using the POST verb) that represents only a single resource (and as such will not require operation mapping). In the process we will deal with formatting a form-encoded message, and customizing the outgoing message headers.

This Week’s Scenario

This week we will be working on a ridiculously fictional integration with a home-grown finance suite which exposes a RESTful API using an ASP.NET Web API service with an implementation that looks something like this.

Essentially the story goes that as new sales happen, the amounts of those sales are aggregated by POSTing them to the MonthlySales resource. Once the data has been collected, the aggregate information can be requested by issuing a GET request to the same resource indexed by month number.

While this week’s scenario is especially painfully contrived, it does serve to allow me to demonstrate simple POST requests and some of the fun that might surround them.

Dealing with Form Values

Whenever you POST data, it is contained in some form within the body of the HTTP request. In the case of ASP.NET Web API, it is expected that this data is URL-encoded. The end result of a request body should look something like this. In order to ensure that a sale message received by the BizTalk application can be translated into this format, we will need not only a map (to transform the message) but also a schema and pipeline (to translate the message from XML to a flat-file format).
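The original example didn’t survive, but for this scenario a plausible body would be a simple set of key/value pairs (field names are my assumption):

Month=8&Amount=599.99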

In order to translate to a flat-file format, I used the flat-file schema wizard to define a simple flat-file schema that splits first on the ampersand (&), and then on the equals sign (=) to create a message that resembles the following:

[Screenshot: flat-file schema for the form-encoded body]

The one thing that this schema does NOT take into account is the escaping of special characters. For those considerations, one would have to resort to logic within a map – which will not be something that we encounter this week.

After this schema has been defined, we need a pipeline that can create flat-file messages from any XML message that conforms to this schema (nothing special, just a text-book flat-file transmit pipeline):

[Screenshot: flat-file transmit pipeline]

In order to get the data from the sale message into the flat-file format called for, we need to use a map. Mapping back and forth between URL-encoded form data and XML ultimately feels like an EDI mapping involving code pairs – you’re dealing with data that is identified by a sibling node. As a result, for digging deep into advanced mapping techniques using a schema similar to this, I would recommend the discussion of EDI code pairs in the book Pro Mapping in BizTalk Server 2009 (ignore the version number in the title – the book is excellent), as it will provide a nice introduction to techniques that can be used to make this manageable.

For my map, it was fairly straightforward. We need to take a single value from the sale message, and map it over into key=value format:

[Screenshot: map from the sale message to the form-encoded schema]

Configuring the WCF-WebHttp Adapter to POST Form Data

We’re at the point now where we have a way to take an XML message and convert it into a format that would live nicely within the body of an http request as form data (readable quite nicely by an ASP.NET WebAPI service). However, we do need to make sure that when the data is sent, the format is identified. In order to do that, we need to ensure that the Content-Type header is transmitted alongside the body with a value of “application/x-www-form-urlencoded”.

So how do we make that happen? Through the Messages tab of the adapter configuration:

[Screenshot: Messages tab of the adapter configuration]

We can finish off the configuration on the General tab. Here I’m showing an operation mapping that only intends to ever call a single operation. Since all of the information is passed within the body, we don’t need to do any building out of the URL, we don’t have any variable mapping from the context into the URL, or anything like that. As such, configuration becomes very simple – we only need to specify the HTTP method in the mapping field:

[Screenshot: operation mapping on the General tab]
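Boiled down to the underlying BtsHttpUrlMapping, that configuration amounts to something like this (the operation name is illustrative):

<BtsHttpUrlMapping>
  <Operation Name="AddSale" Method="POST" />
</BtsHttpUrlMapping>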

Validating the Request

So what message was generated by our send port? Well after a quick tweak to the URL to ensure that Fiddler would capture it appropriately, we see the following:

[Screenshot: POST request captured in Fiddler]

We can see the POST verb is being used, the Content-Type has been specified correctly, and the body contains the urlencoded content of the original sale message. After submitting a few similar requests (to the correct URL), we can verify that the messages indeed are received and aggregated appropriately by executing a GET request against the resource:

[Screenshot: GET request verifying the aggregated result]

What’s next?

We’ve made it 3/5 of the way through this mini-series on the WCF-WebHttp adapter, and have quite a few new powers at our disposal, but there are still a few things missing that we will be tackling over the next few posts – namely the ability to read that wonderful JSON response (or any JSON data for that matter), and also the thing oft-neglected in sample code: authentication.

The end for now!

If you would like to access sample code for this blog post, you can find it on github.

BizTalk Server 2013 Support for RESTful Services (Part 2/5)

This post is the fourteenth in a weekly series intended to briefly spotlight those things that you need to know about new features in BizTalk Server 2013. It is also the second part of a five-part series on REST support in BizTalk Server 2013.

Last week we started out with a bit of an overview of the WCF-WebHttp adapter that is behind the REST support in BizTalk Server 2013. This week, we will be looking at some simple scenarios around receiving RESTful requests.

The Scenario

For this week’s post I didn’t want to propose a scenario so complex that it would warrant a complex diagram. I will be designing a fairly simple API that will (1) accept an “order” message and respond with an acknowledgement (an “order status” message) through the use of an orchestration. It will also (2) archive the order to a FILE location in a pure messaging fashion. Finally it will (3) support a lookup of an order (for which I will be providing a canned response), through a separate operation on the same orchestration1.

In terms of how I want the world to interface with my service:

  • I want my API to be rooted at http://localhost/services/orderprocessing/
  • I want to expose the resource of /api/orders
    • I want to be able to POST a new order to /api/orders
    • I want to be able to lookup an order by issuing a GET to /api/orders/{order id here}

Defining My Messages

I am going to begin by defining the messages that will be used throughout. Until later in the series, we will be dealing with plain old XML messages, so that’s what I am using here. I have two message schemas Order.xsd, and OrderStatus.xsd. Whenever a client submits an Order message (Order.xsd) they will receive as an acknowledgement an Order Status message (OrderStatus.xsd). Whenever someone issues a GET request to /api/orders/{order id here}, they will also need to receive an Order Status message (OrderStatus.xsd) as a reply. However, in the case of the GET request, there will be no message body. Instead we will capture the information contained in the URL within the message context via the Property Schema (PropertySchema.xsd):

[Screenshots: Order.xsd, OrderStatus.xsd, and PropertySchema.xsd]

You may have noticed that I have distinguished a few fields. This will be for easier access within the orchestration later (though I am not actually using the Purchaser field).

Publishing My Service

At this point, I have a pretty good idea of how I want the service to act, and I also have some messages defined. I am in a good position now to publish the service. Unlike other services that we may publish within BizTalk, RESTful services will not have any associated metadata published – which means I could have walked all the way through the WCF Service Publishing Wizard before even opening Visual Studio.

I should mention at this point that the WCF-WebHttp adapter operates only within an Isolated Host (i.e., it gets hosted in IIS instead of In-Process within a BizTalk Host Instance Service). If you want something different, you can still use WCF-Custom if you want to jump through hoops like crazy.

Let’s see what that experience is like for a RESTful service. It will start with the same dialog that you might be used to by now, but this time I am going to ensure that WCF-WebHttp is selected as the adapter:

[Screenshot: WCF Service Publishing Wizard with WCF-WebHttp selected]

Notice that the Enable on-premise metadata exchange option is grayed out. As mentioned before, there will be no metadata published, nor will there be a requirement for any tool to consume metadata. While consuming this RESTful service we will be working pretty close to the HTTP metal, and will not always have a lot of tooling at our disposal to “poof” out a happy ready-to-go client (depending on where we are consuming it from of course).

After selecting the message exchange pattern on the following page of the wizard (one-way / request-response) you will be taken to the last page in the wizard that matters:

[Screenshot: final page of consequence in the WCF Service Publishing Wizard]

No matter what you do on this page, you will end up with a service hosted at Service1.svc (yuck!) under the virtual directory that you have specified.

Making the Endpoint URL Pretty: Down with Service1.svc

I don’t know about you, but I will not be able to continue knowing that I have an endpoint named Service1.svc. So let’s make that pretty (and also align with our requirements set out above) before continuing.

In order to do that, I am going to be using the URL Rewrite 2.0 module for IIS. You can get it fairly easily through the Web Platform Installer:

[Screenshot: URL Rewrite 2.0 in the Web Platform Installer]

After getting that installed, I went over to my virtual directory in IIS, and clicked on the Features View tab and fired up the URL Rewrite settings:

[Screenshot: URL Rewrite settings for the virtual directory]

Next, I added a rule that would rewrite (imagine a redirect that is invisible to the client) a request aimed at *api* so that it would point at *Service1.svc*. Of course, this definition is using Regular Expressions instead of simple wildcards, so it looks a little something like this:

[Screenshots: URL Rewrite rule definition]
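As a sketch, the resulting rule in the site’s web.config would read roughly as follows – the exact pattern and rewrite target are assumptions based on the description above:

<rewrite>
  <rules>
    <rule name="RewriteApiToService" stopProcessing="true">
      <match url="^api/(.*)$" />
      <action type="Rewrite" url="Service1.svc/{R:1}" />
    </rule>
  </rules>
</rewrite>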

With IIS all configured (if you’re trying to follow along, make sure the App Pool is happy to do BizTalk processing first), I decided to move on to the configuration of the Receive Location.

Configuring the WCF-WebHttp Adapter for Receiving Messages

The WCF Service Publishing Wizard has already created a Receive Port and a Receive Location for the service. Diving into that Receive Location, and examining the adapter settings, you will find that the receive side looks fairly similar, in terms of the operation and variable mapping, to the send side. For the purposes of the scenario, I defined two separate operations – one uses a GET, while the other uses a POST:

[Screenshot: operation mapping with GET and POST operations]

The GET operation requires a variable to be included within the URL. In order to access that variable in a meaningful fashion, you can use the Variable Mapping section to define a mapping to a context property in the message:

[Screenshot: variable mapping for the OrderId]

Unfortunately, the data from the URL does not actually get Promoted into the context; it is instead Written to the context. This means that if you attempt to route based on the OrderId property, you might be faced with a routing error that reveals a message context looking something like this:

[Screenshot: message context showing OrderId written rather than promoted]

The other interesting thing about this GET request is that it will force us, at the moment, to use a PassThruReceive pipeline (since an XMLReceive will complain about the empty message body). This will prevent property promotion of data from our XML Order message above. For now, we will accept this and move on (as this will be dealt with later in this same series).

Subscribing to a REST Request (Messaging)

In order to archive the orders as they are received, I am going to use a very simple mechanism: a Send Port with a FILE adapter. That’s really it – a pure messaging solution to the problem.

In order to subscribe to the message, I will be using the BTS.OperationName property to specify that I am interested in messages where the Operation is POST_Order:

image
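Stripped of the designer, that filter boils down to a single predicate:

BTS.OperationName == POST_Order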

Subscribing to a REST Request (Orchestration)

On the Orchestration side, I made not the greatest choice¹ and doubled up both operations in a single Orchestration with hard-coded filters on the Receive shapes. You can see the resulting structure of the orchestration below:

image

The branch on the left of the Listen shape begins with a Receive shape listening for an order message. Since the message will not have been through an XMLReceive pipeline, the message bound to that Receive shape is the generic XmlDocument type (which will also happily accept a completely empty message). As a result, the first step in that branch is to take that generic XmlDocument and jam it into another message variable that is a strongly typed order message. From there we generate an Order Status message (via a Transform shape) indicating that the order is in the “Pending” status.
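In a Message Assignment shape, that “jamming” can be as simple as a direct assignment (the message names here are placeholders of my own):

// Copy the generic XmlDocument message into the strongly typed Order message
TypedOrderMessage = UntypedOrderMessage;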

The right branch is a little more interesting. This is what happens whenever a GET request comes in. To read the data from the request, we have to muck around with the context. And to create new messages, we can’t really use a transform, since we don’t have a meaningful source message; we are stuck with only the Message Assignment shape at our disposal.

Here’s the expression behind the magic in creating the (canned response) Order Status message:

image
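Since the actual expression lives in that screenshot, here is a sketch of what a Message Assignment along those lines might look like (the message, variable, schema, and property names are all my assumptions, not the code from the post):

// xmlDoc is an orchestration variable of type System.Xml.XmlDocument
xmlDoc = new System.Xml.XmlDocument();
xmlDoc.LoadXml(System.String.Format(
    "<ns0:OrderStatus xmlns:ns0=\"https://example.com/schemas/orderstatus\">" +
    "<OrderId>{0}</OrderId><Status>Pending</Status></ns0:OrderStatus>",
    GetRequestMessage(MyPropertySchema.OrderId)));
OrderStatusMessage = xmlDoc;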

Consuming the Service (GET Request)

Now that we have all of the pieces, let’s put it to the test! I’m going to do the first test using Internet Explorer. Why? Because it can issue a clean GET request without knowledge of the service or its specific implementation – it’s a good test to do in this case.

Issuing the request directly to the rewritten URL (e.g., http://localhost/services/orderprocessing/api/orders/12345, assuming the base address used in the POST example below) yields the following happy response:

image

Consuming the Service (POST Request)

Issuing the POST request is not going to be as simple as typing a URL into my browser. Additionally, the service is expecting a pretty specific message, so I will need to reference the schema in some way.

The approach I am taking here is to use the xsd.exe utility to generate a C# class from my schema, serialize an instance of that type, and then use the HttpClient class (found in the Microsoft.AspNet.WebApi.Client NuGet package) to POST the resulting string to the service.
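Generating the class is a one-liner from a Visual Studio command prompt (the schema file name here is assumed):

xsd.exe Order.xsd /classes /language:CS

That drops an Order.cs alongside the schema, which can then be added to the console project.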

Tossed into a console app, it looks a little something like this:

using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Xml.Serialization;

// Build a sample order using the xsd.exe-generated types
Order order = new Order();
order.Id = "12345";
order.Purchaser = "Alice Bobson";
order.Items = new OrderItem[] { new OrderItem() { Id = "12345", Qty = 1, Price = 5.95M } };

// XmlSerializer writes UTF-8 to a raw stream by default, so decode (and later
// send) the bytes as UTF-8; decoding them as UTF-16 would garble the payload
XmlSerializer ser = new XmlSerializer(typeof(Order));
string orderXml;
using (MemoryStream outputStream = new MemoryStream())
{
    ser.Serialize(outputStream, order);
    orderXml = Encoding.UTF8.GetString(outputStream.ToArray());
}

StringContent serializedOrder = new StringContent(orderXml, Encoding.UTF8, "application/xml");

Console.WriteLine("Submitting order...");

// POST the serialized order to the rewritten api/orders/ endpoint
HttpClient client = new HttpClient();
client.BaseAddress = new Uri("http://localhost/services/orderprocessing/");
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/xml"));

HttpResponseMessage response = client.PostAsync("api/orders/", serializedOrder).Result;

if (response.IsSuccessStatusCode)
{
    Console.WriteLine("Request was successful.");
    Console.WriteLine(response.Content.ReadAsStringAsync().Result);
}
else
{
    Console.WriteLine("Request failed.");
}

Console.ReadLine();

The end result of executing that code is this beauty of a response:

image

I ❤ rest

On a more serious note (not talking about HTTP, services, or anything of that nature), I really do enjoy resting – especially sleep. As a result, I’m going to call it quits for this week, and will pick back up with the other side of this equation (POSTing data to another service) next week.

While you’re here, check out our training calendar to see if there are any classes coming up that might be able to help you sharpen your skills.

If you would like to access sample code for this blog post, you can find it on github.

¹ This is purely because I am too lazy to create and document two full orchestrations right now.

    BizTalk Server 2013 Support for RESTful Services (Part 1/5)

    This post is the thirteenth in a weekly series intended to briefly spotlight those things that you need to know about new features in BizTalk Server 2013. It is also the first part of a five-part series on REST support in BizTalk Server 2013.

    BizTalk Server 2013 is trying to make BizTalk not only the application/integration server for the enterprise, but also for the cloud. Here we’re referring specifically to Software-as-a-Service solutions provided by 3rd-party organizations in an elastic fashion – usually exposing an API in the form of REST endpoints.

    In this post, we will begin to explore the REST support in BizTalk Server 2013 in an overview fashion. We will then continue our exploration with a series of posts that cover common scenarios (Receiving data, POST-ing data, JSON en/decoding, and finally OAuth authentication).

    How Do We Do REST in BizTalk Server 2013?

    We do REST through BizTalk’s new WCF-WebHttp adapter. This adapter really is just another WCF adapter, and, like the SFTP adapter, it is new in this release. It can be used on the Send or Receive side, in both one-way and two-way configurations.

    image

    In this case, though, it is really clear that WCF is the underlying technology. Like all other service operations within BizTalk, we are exposed to operation mapping (the mapping of operations within orchestrations to actions in the service). However, with the operation mappings in the WCF-WebHttp adapter, we don’t specify WCF actions so much as resources that we want to request (since it’s all just plain-old HTTP on the other end):

    image

    Do I Need a Dynamic Send Port To Do a GET?

    The operation mapping shown above raises the question of how a GET request like this would work. Here we’re showing some sort of weather service that provides temperature readings on a postal code basis. In order to request the readings, one would have to access a URL that changes on a per-postal-code basis (i.e., the endpoint address changes).

    Typically to accomplish something like sending to a dynamic address, one would use a dynamic send port. In the case of the WebHttp adapter, however, we get to specify variables within the path (which is shown in template form). In the screenshot above, we used the {postalCode} variable. Clicking the Edit button under Variable Mapping will lead us to the following screen:

    image

    Here we can see that in order to specify those variables, we simply pull from the context of the message traveling through the port. In this case we are grabbing the value for the {postalCode} part of the path from the Context Property named ZipCode.
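    Put together, the URL template and variable mapping for this weather example might be persisted something like this (the operation name and property namespace are my own placeholders; only {postalCode} and ZipCode come from the screenshots):

    <BtsHttpUrlMapping>
      <Operation Name="GET_Temps" Method="GET" Url="/temps/{postalCode}" />
    </BtsHttpUrlMapping>

    <BtsVariablePropertyMapping>
      <Variable Name="postalCode" PropertyName="ZipCode"
                PropertyNamespace="https://example.com/schemas/properties" />
    </BtsVariablePropertyMapping>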

    Since most of the time that we are using the GET method we don’t want to include a message body (really it’s a request for a resource, and the dynamic part is the URL, not the body of the request), we can suppress the body using the Suppress Body for Verbs property, specifying the methods for which we do not want to include a request body (e.g., GET).

    image

    What Don’t I Get?

    So far looking through the settings, it looks like we have a pretty good story around creating RESTful requests. However, we don’t have everything out of the box that we need to consume every single RESTful API out there.

    You will still have to provide things like the following:

    • Encoding/decoding support (e.g., for JSON encoded messages)
    • WCF extensions for performing authentication (e.g., OAuth)
    • Schemas representing the requests/responses

    These are some of the things that we will be covering within this series, so stay tuned over the next few weeks!

    Do I Need BizTalk Server 2013 To Make this Happen?

    While the adapter is new in BizTalk Server 2013, the underlying mechanism that makes it work is the WCF WebHttpBinding, which was already around during the BizTalk Server 2010 timeframe. I’ve actually had people complain while I was explaining the WCF-WebHttp adapter as a new feature because of this.

    Does that mean you can get it all up and running using something like the WCF-Custom adapter? Yes and no. You can technically make it function with some additional legwork (e.g., as seen here), but you’re not going to get as smooth an experience as you will see with BizTalk Server 2013 – it really feels cobbled together without having BizTalk Server 2013 at your disposal.

    Coming Up Next

    Next week I will be talking through the receive side of this story as we see how to host a RESTful endpoint in BizTalk Server 2013. Until then, take care (and take our classes)!