This article was originally published on LinkedIn.
Introduction
Back in the days when Logic Apps Consumption was released, it was the starting point for moving your integration landscape from BizTalk Server to Azure.
To do so, it was important to support business data in the form of XML, XSLT, EDI, AS2, Agreements and Partners. That’s why the Azure Integration Account was introduced together with Logic Apps Consumption.
While the Integration Account empowered Logic Apps with many options for message validation and transformation, it was primarily focused on XML-based technology.
Currently, Logic Apps with Integration Account still have actions like XML Validation, Transform XML, AS2 encoding/decoding and several actions for Edifact messages. And a while ago Logic Apps got the Integration Account Artifact Lookup action to have more flexibility in retrieving artifacts from Integration Account.

Nevertheless, any Logic App developer will recognize the struggle when it comes to validating JSON messages in a workflow. After facing the same challenge over the years, and especially with Logic Apps Standard, I was determined to find a clever solution for this problem. Hopefully you will benefit from the solution I’ll explain in this article.
Context
Let’s face it: Logic App Consumption and Logic App Standard deviate in what functionality is available. One thing they still have in common is that validating JSON schemas can be challenging (depending on the perspective you look at it from). I will try to explain this first.
The problem: JSON validation and its troubles in Logic Apps
With XML
Validation and transformation of XML messages is natively supported with Integration Account and with Logic Apps Consumption and Standard through the following actions:
- XML validation: This action comes in two flavors. In Consumption you can make it point to an Integration Account and select your XML Schema Definition file, while Standard also offers the option to use a schema from the Artifacts folder of your Logic Apps Standard resource in Azure.

- Transform XML: The very same applies to the transform action, except that this action is used to transform a source XML message into a destination message.

So everything works out of the box, which is great!
With JSON
1) If you have a JSON schema definition file, you might want to upload it to the Integration Account so it can be used in your workflow. This won’t work, since uploading anything other than *.xsd files will result in the message shown below. It means that (especially for Consumption-based workflows) this is a dead end.

2) If you are on Logic App Standard, you can deploy the JSON file as part of your Logic App Standard solution through the Artifacts folder. This folder can contain Maps and Schemas as shown in the following picture.

While it’s great that these artifacts are part of your solution and Logic App Standard deployment, there is one big challenge: there is no corresponding action (like XML Validation) in the workflow designer to validate JSON messages, and you just can’t point the Parse JSON action to a schema in the Artifacts folder.
Both approaches cause challenges which I couldn’t solve until last week, when I found a clever hack that makes it possible for Logic Apps Standard to retrieve the JSON schema and use it in the Parse JSON action for validation.
Before going into the solution, let’s have a look at the options that many use for JSON validation:
1) Set the Request Body JSON Schema on the HTTP trigger
Pros
- You can directly validate the incoming message
- If the message validates, you will have typed properties in your workflow
Cons
- No additional measures possible when validation fails. The message is not accepted; end of workflow. In many cases you might want more granular control in handling incorrect messages.
- You can’t reuse your JSON schemas: you have to copy/paste them into every workflow that uses them, embedding the schemas in your workflow code. This brings its own maintainability challenges.
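To illustrate option 1: in the workflow code view, the schema ends up embedded directly in the Request trigger, roughly like this (the trigger name and the orderId/amount properties are just an illustration, not from a real workflow):

```json
"triggers": {
  "When_a_HTTP_request_is_received": {
    "type": "Request",
    "kind": "Http",
    "inputs": {
      "schema": {
        "type": "object",
        "properties": {
          "orderId": { "type": "string" },
          "amount": { "type": "number" }
        },
        "required": [ "orderId" ]
      }
    }
  }
}
```

Because the schema lives inside the trigger definition, every workflow that needs it carries its own copy.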

2) Put full JSON schema in the Parse JSON action
Pros
- You properly validate the incoming message.
- Better visibility and control within your workflow when validation fails (see picture below)
- Additional measures possible when validation fails: based on success/failure, you can handle validation errors in your own way or respond with custom messages to the caller
Cons
- You can’t reuse your JSON schemas: you have to copy/paste them into every workflow that uses them, embedding the schemas in your workflow code. This brings its own maintainability challenges (e.g. when the schema changes).

3) Store JSON schema in Storage Account and use the content in the Parse JSON action
With Logic App Consumption, this has been the way to go for me over the past years, since it gave me the possibility to centrally store a JSON schema and use a single schema definition in many workflows. The only downside is that you need a Storage Account and have to upload the JSON schema to blob storage.
The following images illustrate how to retrieve the JSON and hack it into the Parse JSON action:


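As a sketch, the storage account approach boils down to reading the blob first and then feeding its content to the schema field in code view (the action names and the `json(...)` conversion shown here are illustrative assumptions, not copied from a real workflow):

```json
"Parse_JSON": {
  "type": "ParseJson",
  "inputs": {
    "content": "@triggerBody()",
    "schema": "@json(body('Get_blob_content'))"
  },
  "runAfter": {
    "Get_blob_content": [ "Succeeded" ]
  }
}
```

The `Get_blob_content` action retrieves the schema file from blob storage, and its body is converted to a JSON object for the schema field.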
The three options mentioned above are all valid ways to do JSON validation. Nevertheless, it doesn’t feel right that you can deploy your JSON schemas to Logic App Standard without being able to use them. Let’s see how to fix that when there are no out-of-the-box actions available.
The solution – Get access to the Logic App Standard file system and read JSON file content!
The first thought that came to mind: isn’t there a way to access the Artifacts folder in Logic Apps Standard from within a workflow? To recap, it’s the folder where you store the maps and schemas that you can use in Logic Apps Standard without any dependency on an Azure Integration Account.

So the first thing I wanted to check was the Inline Code for JavaScript action, to see if I could access the Artifacts directory. With a bit of help from ChatGPT I created some code…only to find out that the action’s Node.js implementation does not expose fs, the file system module.
I asked ChatGPT about it…

It also explained that JavaScript, C# and PowerShell wouldn’t be able to access the file system of Logic App Standard. But I didn’t give up and decided to test the C# route anyway.
Guess what? It worked! I’ll provide you with the steps to achieve it.
Step 1: Create a variable for the schema name in your workflow

Step 2: Set variable schema name to a valid JSON schema in your Artifacts/schemas folder
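In code view, steps 1 and 2 come down to something like the following (the variable value "order-schema" is just an example; the action name Set_variable_schemaName matters, because the C# script in step 3 refers to it):

```json
"Initialize_variable_schemaName": {
  "type": "InitializeVariable",
  "inputs": {
    "variables": [
      { "name": "schemaName", "type": "string" }
    ]
  },
  "runAfter": {}
},
"Set_variable_schemaName": {
  "type": "SetVariable",
  "inputs": {
    "name": "schemaName",
    "value": "order-schema"
  },
  "runAfter": {
    "Initialize_variable_schemaName": [ "Succeeded" ]
  }
}
```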

Step 3: Here the magic happens with the CSharpScriptCode action
1) After setting the schema name, add the CSharpScriptCode action to your workflow and give it a meaningful name.

This will create a csx file (with the same action name) in the workflow folder, together with your workflow.json file. Next, add the following code to this action:
// Add the required libraries
#r "Newtonsoft.Json"
#r "Microsoft.Azure.Workflows.Scripting"
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Microsoft.Extensions.Logging;
using Microsoft.Azure.Workflows.Scripting;
using Newtonsoft.Json.Linq;

public static async Task<Result> Run(WorkflowContext context, ILogger log)
{
    // Retrieve the schema name from the Set variable action.
    JToken actionInputs = (await context.GetActionResults("Set_variable_schemaName").ConfigureAwait(false)).Inputs;
    var schemaName = actionInputs["value"]?.ToString();

    // Build the path to the schema file in the Artifacts folder.
    var home = Environment.GetEnvironmentVariable("HOME");
    var path = Path.Combine(home, "site", "wwwroot", "artifacts", "schemas", $"{schemaName}.json");

    // Read the schema file and return its parsed content.
    var json = await File.ReadAllTextAsync(path).ConfigureAwait(false);
    return new Result
    {
        Schema = JObject.Parse(json) // Logic Apps sees this as a JSON object
    };
}

public class Result
{
    public JObject Schema { get; set; }
}
In short, this code takes the schema name that was set in the variable action and uses File.ReadAllTextAsync to read the contents of the corresponding file from the Artifacts/schemas folder. It then parses the content and returns it as a result object to your workflow.
Step 4: Handle exception when schema cannot be found
In my next post I will use this approach to implement validation based on different schema versions, and handling non-existent schemas is needed for that. For now, keep in mind that it’s possible that the schema doesn’t exist in your artifacts folder. Therefore, just add a switch that runs both on success and failure of the C# action and handle failures properly.

The action expression contains the following check:
actions('Get_json_schema')?['code']
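As a sketch, such a switch could look like this in code view (the action and case names are illustrative, and the exact code value returned on success or failure depends on the runtime, so treat the case value as an assumption):

```json
"Switch_on_get_schema_result": {
  "type": "Switch",
  "expression": "@actions('Get_json_schema')?['code']",
  "cases": {
    "Schema_found": {
      "case": "NotSpecified",
      "actions": {}
    }
  },
  "default": {
    "actions": {}
  },
  "runAfter": {
    "Get_json_schema": [ "Succeeded", "Failed" ]
  }
}
```

The runAfter on both Succeeded and Failed is what makes the switch run even when the schema file could not be found.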
Step 5: Use the C# output in the Parse JSON action
Now that the content of your schema has been retrieved, it can be used in the Parse JSON action to validate the incoming JSON message.
Important notice: Just as with the storage account approach, the expression needs to be added from the workflow Code view; clicking into the schema field in the designer will cause an “Enter a valid JSON” message. So, leave this field alone in the designer 😉

The expression is:
@{outputs('Get_json_schema')['body']['schema']}
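In code view, the resulting Parse JSON action then looks roughly like this (the content expression and runAfter are illustrative assumptions; the schema expression is the one from above):

```json
"Parse_JSON": {
  "type": "ParseJson",
  "inputs": {
    "content": "@triggerBody()",
    "schema": "@outputs('Get_json_schema')['body']['schema']"
  },
  "runAfter": {
    "Get_json_schema": [ "Succeeded" ]
  }
}
```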
Step 6: Proof of the pudding
Now that everything has been set up, let’s see the proof that it’s possible to retrieve a JSON schema from the Artifacts directory of Logic Apps Standard.
1) Retrieving the JSON schema

2) Using it in the Parse JSON action

Conclusion
While it was already possible to deploy JSON schemas together with Logic Apps Standard in the Artifacts folder, it wasn’t possible to use those schema files for JSON validation in the Parse JSON action.
I have mentioned some alternative approaches, all of them with their own pros and cons. Therefore, I wanted to find an easy solution to use a centralized JSON schema in Logic App Standard to do JSON validation.
With a minor hack, it’s possible to retrieve the schema from the Logic App Standard file system and feed it to the Parse JSON action. This centralizes schema storage in the Logic App itself and improves maintainability.
In Part 2 of this series (next week) I will use this approach to support multiple versions of a JSON schema to do validation.

