Decoding JSON Data Using the BizTalk Server 2013 R2 JSONDecode Pipeline Component

This is the fourth in a series of posts exploring What’s New in BizTalk Server 2013 R2. It is also the second in a series of three posts covering the enhancements to BizTalk Server’s support for RESTful services in the 2013 R2 release.

In my last post, I wrote about the support for JSON Schemas in BizTalk Server 2013 R2. I started out with a small project that included a schema generated from the Yahoo Finance API and a unit test to verify the schema model. I was going to put together a follow-up article last week, but spent the week traveling, in the hospital, and then recovering from being sick.

However, I am back and ready to dig into the next installment, which already hit the GitHub repo a few days back.

Pipeline Support for JSON Documents

In BizTalk Server 2013 R2, Microsoft approached the problem of dealing with JSON content in a way fairly similar to the approach that we used in the previous version with custom components – performing the JSON conversion as an operation in the Decode stage of the pipeline, thus requiring the Disassemble stage to include an XMLDisassemble component for property promotion.

The official component, Microsoft.BizTalk.Component.JsonDecoder, takes in two properties, Root Node and Root Node Namespace, that help determine how the XML message will be created.

Finally, there isn’t a JSONReceive pipeline included in BizTalk Server 2013 R2 – only the pipeline component was included. In other words, in order to work with JSON, you will need a custom pipeline.



Creating a Pipeline for Receiving JSON Messages

Ultimately, I would like to create a pipeline that is going to be reusable so that I don’t have to create a new pipeline for each and every message that will be received. Since BizTalk message types are all about the target namespace and root node name, it’s not reasonable to set those values to be the same for every message – despite having different message bodies and content. As a result, it might be best to leave the values blank at design time and set them per instance at runtime.

This is also an interesting constraint, because if we might receive this message as something other than just a service response, we could end up needing to create a fairly flexible schema (i.e., with a lot more choice groups) depending on the variety of inputs / responses that may be received – something that will not be explored within this blog post, but would be an excellent discussion to bring up during one of QuickLearn’s BizTalk Server Developer Deep Dive classes.

In order to make the pipeline behave in a way that will be consistent with typical BizTalk Server message processing, I decided to essentially take what we have in the XMLReceive pipeline and simply add a JsonDecoder in the Decode stage, with none of its properties set at design time.


Testing the JSONReceive Pipeline

In the same vein as my last post, I will be creating automated tests for the pipeline to verify its functionality. However, we cannot use the built-in support for testing pipelines in this case – because properties of the pipeline were left blank, and the TestablePipelineBase class does not support per-instance configuration. Luckily, the Winterdom PipelineTesting library does support per-instance configuration – and it has a nifty NuGet package as of June.

Unfortunately, the per-instance configuration is not really pretty. It requires an XML configuration file that resembles the corresponding section of a bindings file. In other words, it’s not as simple as setting properties on the class instance in code. To get around that to some degree, and to be able to reuse the configuration file with different property values, I put together a template with tokens in place of the actual property values.
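The template ends up looking roughly like the following sketch. The $rootNode$ and $namespaceUri$ token names are placeholders of my own choosing, and the stage ID shown is the GUID conventionally used for the receive pipeline’s Decode stage – verify both against your own generated bindings:

```xml
<Root xmlns:xsd="http://www.w3.org/2001/XMLSchema"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <Stages>
    <!-- Decode stage of the receive pipeline -->
    <Stage CategoryId="9d0e4103-4cce-4536-83fa-4a5040674ad6">
      <Components>
        <Component Name="Microsoft.BizTalk.Component.JsonDecoder">
          <Properties>
            <!-- vt="8" marks these values as strings (see the NOTE below) -->
            <RootNode vt="8">$rootNode$</RootNode>
            <RootNodeNamespace vt="8">$namespaceUri$</RootNodeNamespace>
          </Properties>
        </Component>
      </Components>
    </Stage>
  </Stages>
</Root>
```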


NOTE: If you’re copying this approach for some other pipeline components, the vt attribute is actually very important in ensuring your properties will be read correctly. See KB884981 for details.

From there, the per-instance configuration is a matter of XML manipulation and use of the ReceivePipelineWrapper class’ ApplyInstanceConfig method:

private void configureJSONReceivePipeline(ReceivePipelineWrapper pipeline, string rootNode, string namespaceUri)
{
    string configPath = Path.Combine(TestContext.DeploymentDirectory, "pipelineconfig.xml");

    var configDoc = XDocument.Load(configPath);

    // Swap the tokens in the template (the token names here are of my own
    // choosing) for the actual property values, then save the result back out
    string instanceConfig = configDoc.ToString()
        .Replace("$rootNode$", rootNode)
        .Replace("$namespaceUri$", namespaceUri);

    File.WriteAllText(configPath, instanceConfig);

    // Apply the resulting file as this pipeline's per-instance configuration
    pipeline.ApplyInstanceConfig(configPath);
}

The final test code includes a validation of the output against the schema from last week’s post. As a result, we’re really dealing with an integration test here rather than a unit test, but it’s a test nonetheless.

[TestMethod]
[DeploymentItem("sample.json")]
[DeploymentItem("ServiceResponse.xsd")]
[DeploymentItem("pipelineconfig.xml")]
public void JSONReceive_JSONMessage_CorrectValidXMLReturned()
{
    string rootNode = "ServiceResponse";
    string namespaceUri = "";

    string sourceDoc = Path.Combine(TestContext.DeploymentDirectory, "sample.json");
    string schemaPath = Path.Combine(TestContext.DeploymentDirectory, "ServiceResponse.xsd");
    string outputDoc = Path.Combine(TestContext.DeploymentDirectory, "JSONReceive.out");

    var pipeline = PipelineFactory.CreateReceivePipeline(typeof(JSONReceive));

    configureJSONReceivePipeline(pipeline, rootNode, namespaceUri);

    using (var inputStream = File.OpenRead(sourceDoc))
    {
        var result = pipeline.Execute(MessageHelper.CreateFromStream(inputStream));

        Assert.IsTrue(result.Count > 0, "No messages returned from pipeline.");

        // Write the body of the first output message to disk for validation
        using (var outputFile = File.OpenWrite(outputDoc))
        {
            result[0].BodyPart.GetOriginalDataStream().CopyTo(outputFile);
        }
    }

    ServiceResponse schema = new ServiceResponse();
    Assert.IsTrue(schema.ValidateInstance(outputDoc, Microsoft.BizTalk.TestTools.Schema.OutputInstanceType.XML),
        "Output message failed validation against the schema");

    Assert.AreEqual("44.97", XDocument.Load(outputDoc).Descendants("Bid").First().Value, "Incorrect Bid amount in output file");
}

After giving it a run, it looks like we have a winner.


Coming Up in the Next Installment

In the next installment of this series, I will actually put to use what we have here, and build out a more complete integration that allows us to experience sending JSON messages as well, using the new JsonEncoder component.

Take care until then!

If you would like to access sample code for this blog post, you can find it on github.

JSON Schemas in BizTalk Server 2013 R2

This is the third in a series of posts exploring What’s New in BizTalk Server 2013 R2. It is also the first in a series of three posts covering the enhancements to BizTalk Server’s support for RESTful services in the 2013 R2 release.

In my blog post series covering the last release, BizTalk Server 2013, I ran a 5-post series covering the support for RESTful services, with one of those five discussing how one might deal with JSON data. That effort yielded three separate executable components:

  1. JSON to XML Converter for use with the WFX Schema Generation tool
  2. JSON Decoder Pipeline Component (JSON –> XML)
  3. JSON Encoder Pipeline Component (XML –> JSON)

It also yielded some good discussion of the same over on the connectedcircuits blog, wherein some glitches in my sample code were addressed – many thanks for that!

All of that having been said, similar components in one form or another are now available out of the box with BizTalk Server 2013 R2 – and I must say the integrated VS 2013 tooling blows away a 5-minute WinForms app. In this post, we will begin an in-depth examination of this improved JSON support by first exploring the support for JSON Schemas within a BizTalk Server 2013 R2 project.

How Does BizTalk Server Understand My Messages?

All BizTalk Server message translation occurs at the intersection of two components: (1) a declarative XSD file that defines the model of a given message, with optional inline parsing/processing annotations, and (2) an executable pipeline component (usually within the Disassemble stage of a receive pipeline or the Assemble stage of a send pipeline) that reads the XSD file and uses any inline annotations necessary to parse the source document.

This is the case for XML documents, X12, EDIFACT, Flat-file, etc… It only logically follows then that this model could be extended for JSON. In fact, that’s exactly what the BizTalk Server team has done.

JSON is an interesting beast, however, as there already exists a schema format for specifying the shape of JSON data. BizTalk Server nonetheless prefers working with XSD, and makes no exception for JSON. Surprisingly, this XSD looks no different than any other XSD, and contains no special annotations to reflect that the message is typically represented as JSON content.

What Does a JSON Schema Look Like?

Let’s consider this JSON snippet, which represents the output of the Yahoo! Finance API performing a stock quote for MSFT:
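The snippet itself is reconstructed here from the details discussed below (a null Ask, a non-repeating quote record, and the Bid value validated later); the remaining field values are illustrative:

```json
{
  "query": {
    "count": 1,
    "created": "2014-08-03T22:00:00Z",
    "lang": "en-US",
    "results": {
      "quote": {
        "symbol": "MSFT",
        "Ask": null,
        "Bid": "44.97"
      }
    }
  }
}
```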


This is a pretty simple instance, and it is also an interesting case because it has a null property Ask, as well as a repeating record quote that does not actually repeat in this instance. I went ahead and saved this snippet to the worst place possible – my Desktop – as quote.json, and then created a new Empty BizTalk Server Project in Microsoft Visual Studio 2013 (with the recently released Update 3).

From there, I can generate a schema for this document by using the Add New Item… context-menu option for the project within Solution Explorer. In the Add New Item dialog, I can then choose JSON Schema Wizard:


The wizard looks surprisingly like the Flat-file schema wizard, and it appears quite a bit of that work might have been lifted and re-purposed for the JSON schema wizard. What’s nice about this wizard, though, is that this is really the only page requiring input (the previous page is the obligatory Welcome screen) – so you won’t be walking through the input document while also recursively walking through the wizard.


Instead, the wizard makes some core assumptions about what the schema should look like (much like the WFX schema generator). In the case of this instance, it’s not looking so great. Aside from essentially every single element being made optional in the schema, the quote record was not set with a maxOccurs of unbounded – though this should really be expected, given that our input instance gave no indication of repetition. Then again, maybe you’re of the opinion that the wizard could have been written to infer repetition upon noticing that the record is a child of a record with a plural name – which would be an interesting option to see.


Next, the Ask element was typed as anyType instead of decimal – which again should be expected, given that it was simply null in the input instance. Perhaps this could be an opportunity to add pages to the wizard asking for the proper type of any null items in the input instance.

Essentially, it may take some initial massaging to get everything in place and happy. After tweaking the minOccurs and maxOccurs, as well as types assigned to each node, I decided it would be a good time to ensure that my modifications would still yield a schema that would properly validate the input instance I provided to the wizard.
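For illustration, after those tweaks the portion of the schema covering the quote record ends up along these lines – this is a sketch rather than the exact generated schema, and elements beyond those discussed are omitted:

```xml
<!-- quote now repeats; Ask is an optional, nillable decimal -->
<xs:element name="quote" minOccurs="1" maxOccurs="unbounded">
  <xs:complexType>
    <xs:sequence>
      <xs:element name="symbol" type="xs:string" minOccurs="0" />
      <xs:element name="Ask" type="xs:decimal" nillable="true" minOccurs="0" />
      <xs:element name="Bid" type="xs:decimal" minOccurs="0" />
    </xs:sequence>
  </xs:complexType>
</xs:element>
```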

How do We Test These Schemas Or Validate Our JSON Instances?

Quite simply, you don’t. At least not using the typical Validate Instance option available in the Solution Explorer context-menu for the .xsd file. Instead this will require a little bit of work in custom-code.

Where am I writing that custom code? Well, right now I’m on-site in Enterprise, Alabama, teaching a class that involves a lot of automated testing. As a result, I’m in the mood for writing some unit tests for the schema – which also means updating the project properties so that the class generated for the schema derives from TestableSchemaBase and adds a method we can use to quickly validate an instance against the schema.


It also means adding a new test project to the solution with a reference to the following assemblies:

  • System.Xml
  • C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\PublicAssemblies\Microsoft.BizTalk.TOM.dll
  • C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\PublicAssemblies\Microsoft.BizTalk.TestTools.dll
  • C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\PublicAssemblies\Microsoft.XLANGs.BaseTypes.dll
  • Newtonsoft.Json (via Nuget Package)

That’s not all the setup required, unfortunately. I still have to add a new TestSettings file to the solution, ensure that deployment is enabled, that it is deploying the Microsoft.BizTalk.TOM.dll assembly listed above, and that it is configured to run tests in a 32-bit host. From there, I need to click TEST > Test Settings > Select Test Settings File to select the added TestSettings file.





With all the references in place and the solution all setup, I’ll want to bring in the message instance(s) to validate. In order to ensure that the test has access to these items at runtime, I will add the applicable DeploymentItem attribute to each test case that requires one.

using Microsoft.VisualStudio.TestTools.UnitTesting;
using Newtonsoft.Json;
using System.IO;
using System.Xml;

namespace QuickLearn.Finance.Messaging.Test
{
    [TestClass]
    public class ServiceResponseTests
    {
        [TestMethod]
        [DeploymentItem("sample.json")]
        public void ServiceResponse_ValidInstanceSingleResultNullAsk_ValidationSucceeds()
        {
            // Arrange
            ServiceResponse target = new ServiceResponse();
            string rootNode = "ServiceResponse";
            string namespaceUri = "";
            string sourceDoc = Path.Combine(TestContext.DeploymentDirectory, "sample.json");
            string sourceDocAsXml = Path.Combine(TestContext.DeploymentDirectory, "sample.json.xml");

            ConvertJsonToXml(sourceDoc, sourceDocAsXml, rootNode, namespaceUri);

            // Act
            bool validationResult = target.ValidateInstance(sourceDocAsXml, Microsoft.BizTalk.TestTools.Schema.OutputInstanceType.XML);

            // Assert
            Assert.IsTrue(validationResult, "Instance {0} failed validation against the schema.", sourceDoc);
        }

        public void ConvertJsonToXml(string inputFilePath, string outputFilePath,
            string rootNode = "Root", string namespaceUri = "", string namespacePrefix = "ns0")
        {
            var jsonString = File.ReadAllText(inputFilePath);
            var rawDoc = JsonConvert.DeserializeXmlNode(jsonString, rootNode, true);

            // Here we are ensuring that the custom namespace shows up on the root node
            // so that we have a nice clean message type on the request messages
            var xmlDoc = new XmlDocument();
            xmlDoc.AppendChild(xmlDoc.CreateElement(namespacePrefix, rawDoc.DocumentElement.LocalName, namespaceUri));
            xmlDoc.DocumentElement.InnerXml = rawDoc.DocumentElement.InnerXml;

            xmlDoc.Save(outputFilePath);
        }

        public TestContext TestContext { get; set; }
    }
}

What Exactly Am I Looking At Here?

Here in the code we’re converting our JSON instance to XML using the Newtonsoft.Json library. Once it is in XML format, it should (in theory, at least) conform to the schema definition generated by the BizTalk JSON Schema Wizard. From there, we take the output XML and feed it into the ValidateInstance method of the schema to perform validation.

The nice thing about doing it this way, is that you will not only get a copy of the file to use within the automated test itself, but you can also use the file generated within the test in concert with the Validate Input Instance option of the schema for performing quick manual verifications as well.

After updating the schema, it looks like it’s going to be in a usable state for consuming the service:

Screenshot of final code

Coming Up Next Week

Next week will be part 2 of the JSON series in which we will test and then use this schema in concert with the tools that BizTalk Server 2013 R2 provides for consuming JSON content.

If you would like to access sample code for this blog post, you can find it on github.

Getting Started with BizTalk Server 2013 R2’s Built-in Health Monitoring

This is the second in a series of posts exploring What’s New in BizTalk Server 2013 R2.

With the BizTalk Server 2013 R2 release, Microsoft has finally implemented a common request to have some level of built-in monitoring tool for a BizTalk Server installation. While this built-in option won’t replace things like the BizTalk Server 2013 Monitoring Management Pack for System Center Operations Manager, or come remotely close to the feature set of third-party options like BizTalk360 or AIMS for BizTalk, it does provide an out-of-the-box solution for performance monitoring, environment validation, and notifications.

Ultimately, this tool was built by the same project team that created the MsgBoxViewer tool (MBV), and represents an effort to more tightly integrate this stand-alone tool with the BizTalk Server Administration Console.

The first release supports the following features, with updates promised in the future:

  • Ability to monitor multiple BizTalk Server environments
  • MBV report generation and viewing
  • Dashboard view for overall BizTalk Server environments health
  • Scheduled report collection
  • Email notifications
  • Performance monitor integration with pre-loaded scenario-based performance counters
  • Report management

I Have BizTalk Server 2013 R2, But Where Is This Health Monitor?

Unfortunately, the Health Monitor is not registered for use out of the box, and doesn’t show up anywhere by default. Before making use of it, you’ll have to do some dirty work to get it prepared for use. The core files live under the BizTalk Server installation directory at \SDK\Utilities\Support Tools\BizTalkHealthMonitor.

BizTalk Server 2013 R2 Health Monitor Files

So what do we do here? We need to run InstallUtil.exe against the MBVSnapin.dll. In order to accomplish this, we can either drop to the command line, or drag and drop MBVSnapin.dll on InstallUtil.exe.
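From an elevated command prompt, that looks roughly like the following – the paths shown assume a default installation and the .NET Framework 4 tooling, so adjust them for your environment:

```
cd "C:\Program Files (x86)\Microsoft BizTalk Server 2013 R2\SDK\Utilities\Support Tools\BizTalkHealthMonitor"
"%WINDIR%\Microsoft.NET\Framework\v4.0.30319\InstallUtil.exe" MBVSnapin.dll
```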

Register BizTalk Server 2013 R2 Health Monitor

Once it is registered, you can add it to a Management Console alongside the BizTalk Server Administration Console for an all-up management experience.

In order to do that, run mmc /32.

Running MMC /32

After a nice clean and empty management console appears, press CTRL+M, and then add both the BizTalk Health Monitor and the Microsoft BizTalk Server Administration snap-ins to the console.

Adding BizTalk Server 2013 R2 Health Monitor to the Console

You end up with an Administration Console window containing the items shown in the screenshot below. This might be a good opportunity to add the Event Viewer snap-in for each of your runtime servers as well. At this point, you may want to save the console setup for later use.

BizTalk Server 2013 R2 Administration Console with Health Monitor

What Can I Do with This Thing?

If you expand the Performance node, and click Current Activity, you will be able to examine select performance counters across your BizTalk Server installations through an embedded perfmon display.


If you right-click each BizTalk Group within the Health Monitor, you have the ability to execute a set of rules that validate your installation while highlighting problem areas.

Running Analysis

Once you run the analysis, a node is added to the navigation pane labeled with the date and time of the analysis. This report contains the result of executing validation rules at a fixed point in time. This report can be sent via email, or opened in the browser for additional details.

Results of Analysis

Right now, it’s looking like my installation is throwing a pretty critical warning when it comes to the Jobs category. Let’s see what that might be.

Jobs Warning

It looks like the Backup BizTalk Server job hasn’t been enabled, and there isn’t any history indicating that this job has ever executed. That’s fairly concerning and problematic. It would be nice if we could have been notified about that in a more proactive manner.

Enabling Automatic Scans / Notifications

If I go back to the BizTalk Group within the Health Monitor, and click Settings, I will find myself at a screen that enables me to configure automatic analysis of my BizTalk Server Group as well as notifications of scan results.

BizTalk Health Monitor Settings Menu

Additionally, I can even configure the queries executed, rules evaluated, and the views on top of that information I want to include in each analysis.

Configuring Reporting Settings

If I want to enable notifications, I have a few different options. I can either configure email notifications, or if I want to essentially pipe these notifications into another tool that can consume event log entries, I can direct notifications to the event log instead.

Notification Settings

More to Come

As mentioned earlier, it sounds like the team is already well underway with an update to this tool, and it’s safe to say that there will likely be more to come. I would venture to guess that this will mean more features and deeper console integration (since there are still quite a few places where clicking an option launches the browser to show full details). We’ll keep this space updated.

In the meantime, if you’re just now moving to either BizTalk Server 2013, or BizTalk Server 2013 R2, and you want to keep your skills up to date, check out one of our BizTalk 2013 Developer Immersion classes or BizTalk 2013 Administrator Immersion classes. Just this last week, students in the developer class that I taught were able to see this functionality demonstrated live.

If you’re already a QuickLearn student, keep following the blog to get the latest bleeding edge information as it becomes available. The series will continue next week!

What’s New In BizTalk Server 2013 R2

This is the first in a series of posts exploring What’s New in BizTalk Server 2013 R2. It will also serve as the index of the series, and contain links to all of the posts to come.

This is a listing of all of the posts within the series:

  1. What’s New In BizTalk Server 2013 R2
    Shows Shared Access Signature (SAS) Authentication for Service Bus
  2. Getting Started with BizTalk Server 2013 R2’s Built-in Health Monitoring
    Demonstrates the installation and use of the BizTalk Health Monitor
  3. JSON Schemas in BizTalk Server 2013 R2 [Code Sample]
    Shows how to generate a JSON schema and write unit tests to validate instances
  4. Decoding JSON Data Using the BizTalk Server 2013 R2 JsonDecode Pipeline Component [Code Sample]
    Shows how to receive JSON messages and write integration tests to validate a configurable pipeline

We’ve been pretty busy over here at QuickLearn over the past few months, as many of you may have noticed. We’ve released our BizTalk Server 2013 Administrator Deep Dive class, and have been hard at work on our Azure BizTalk Services Fundamentals class (coming as soon as September 2014). Meanwhile, Microsoft has released BizTalk Server 2013 R2.

As a result, I am starting a series in a similar vein as my What’s New in BizTalk Server 2013 series, to uncover those new features in 2013 R2 that will make your life just a little bit easier. However, this time around it will be a weekly series that will occasionally take breaks to share time with posts about Azure BizTalk Services.

All of that having been said, I’m going to get upgraded, and then jump right in to check out one of the things I’m most excited about.


I Love Microsoft Azure Service Bus

I’ve got to admit that I’m a huge fan of Microsoft Azure Service Bus. Not only that, but I’m a big fan of the .NET API which really feels oh-so-right and makes allowances for multiple patterns for synchronous vs. asynchronous code.

That being said, a big pain point with Service Bus has been using the Access Control Service for fine-grained security – which really can be the opposite of intuitive – especially when the concept of an identified user isn’t really needed or important to your integration scenario.

Thankfully, ACS isn’t the only security model that Service Bus supports. We can also use Shared Access Signatures for authentication. SAS authentication allows you to generate a key for a specific fine-grained entity within Service Bus for which you want to authorize access (e.g., Listen access for a given Queue), and then from that key you can generate a signed token that expires after a period of time. This token need not be borne by a specific user; it need only be presented to be accepted.
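To make that concrete, here is a rough sketch of how such a token is built from a policy name and key – this mirrors what the client libraries and the adapter do on your behalf, and the class and parameter names are my own:

```csharp
using System;
using System.Globalization;
using System.Security.Cryptography;
using System.Text;

public static class SasTokenHelper
{
    // Builds a Service Bus SAS token: the HMAC-SHA256 signature covers the
    // URL-encoded resource URI plus the expiry (seconds since the Unix epoch)
    public static string CreateToken(string resourceUri, string keyName, string key, TimeSpan timeToLive)
    {
        string expiry = Convert.ToInt64(
            (DateTime.UtcNow.Add(timeToLive) - new DateTime(1970, 1, 1)).TotalSeconds)
            .ToString(CultureInfo.InvariantCulture);

        string stringToSign = Uri.EscapeDataString(resourceUri) + "\n" + expiry;

        using (var hmac = new HMACSHA256(Encoding.UTF8.GetBytes(key)))
        {
            string signature = Convert.ToBase64String(
                hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

            return string.Format(CultureInfo.InvariantCulture,
                "SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
                Uri.EscapeDataString(resourceUri),
                Uri.EscapeDataString(signature),
                expiry,
                keyName);
        }
    }
}
```

A token generated this way for, say, a queue URI, using the key from a policy with Listen rights, is what ultimately gets presented to Service Bus in place of an ACS-issued token.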

While all of that could be filed under mildly interesting, the important thing to note is that unless you have BizTalk Server 2013 R2 installed, you will be limited to using the ACS model for Service Bus authentication and authorization.

SAS-SY SB-Messaging Adapter

Thankfully, after upgrading to BizTalk Server 2013 R2, if you visit the Authentication tab of the SB-Messaging Transport Properties dialog, you will find the following option available:


Knowing that you can use Shared Access Signatures is one thing, being able to put that into practice is another. If you haven’t used SAS authentication/authorization before, you’re in for a treat.

Configuring SAS Authentication for Service Bus Entities

If you head over to the Microsoft Azure Management Portal, create some Service Bus entities, and then head to the Configure tab for the same, you will find a section titled Shared Access Policies.


This section allows you to define access policies (e.g., read-only access, or Listen permission for the queue shown in the screenshot), and then generate signing keys that can be used to generate access tokens granting that level of permission.


It’s nice to know that this can all be done through the web UI if needed, but nothing here seems to relate back to the first property that you may have noticed when examining the settings for the SB-Messaging adapter (i.e., the Shared Access Key Name property). In reality, it’s asking for what the UI calls the Policy Name.

So what would the adapter configuration look like for this Shared Access Policy?


Putting it to the Test

So let’s put the updated adapter to the test and see what we get out the other end. First, let’s whip up a quick console app that will give us a message that is simply the string “Hello World”.
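The gist of that console app, using the Microsoft.ServiceBus client library, looks something like this – the connection string and queue name are placeholders:

```csharp
using Microsoft.ServiceBus.Messaging;

class Program
{
    static void Main()
    {
        // SAS connection string for the namespace (placeholder values)
        const string connectionString =
            "Endpoint=sb://yournamespace.servicebus.windows.net/;"
            + "SharedAccessKeyName=SendPolicy;SharedAccessKey=<your-key>";

        // QueueClient directly, rather than a MessagingFactory,
        // since this is a short-lived test client
        var client = QueueClient.CreateFromConnectionString(connectionString, "myqueue");

        client.Send(new BrokeredMessage("Hello World"));
        client.Close();
    }
}
```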


Yes, the purists will note that I did not use a MessagingFactory. Why? Because this is not going to be a long-lived client, and it felt nicer to type. However, given a real-world example, MessagingFactory will usually be the correct choice.

So let’s run down what I have while that message sits in the queue. I have a one-way Receive Port with a single Receive Location. This Receive Location uses the SB-Messaging adapter pointed at myqueue and using SAS for authentication (per the settings in the screenshot above). I have a Send Port subscribing to all messages from this Receive Port. This Send Port is using the FILE adapter, because I’m writing this late at night and am running out of creativity.

With everything in place, you will see this glorious sight…


And opening the file reveals…


Am I impressed that this traveled through a Service Bus Queue to arrive on my file system? No. I’m just happy that it authenticated using SAS token along the way, and I didn’t have to touch ACS at all during the process.

One hope that I have for this new functionality is that it will see people moving beyond using owner for everything. Even though it’s something that I would find myself doing for demonstration purposes, it is also something that made me cringe to see in real life. It’s a lazy and, in some cases, even dangerous thing to do.

Just a Taste

This is really just a small flavor of what’s to come. There are some pretty big changes that aren’t getting a lot of fanfare at the moment, but I hope that will change as more and more people upgrade and discover what is waiting for them under the covers.

Until next week, take care!