This article was originally published on LinkedIn.
Introduction
Even though I have been working with Logic Apps since early 2016, it still surprises me when I bump into a platform limitation that I didn’t expect.
And so it happened to my team last week. To give some background: we’re designing and implementing some mission-critical integrations that will heavily read from and write to an internal SQL database. This database offers various stored procedures to read, write and update data internally.
Since we will be bulk inserting data (100K+ records) through stored procedures, these stored procedures use User Defined Table types as their input parameter – a choice made by the DBA team, so it’s a given that we need to integrate with these procedures as-is.
Note: Specifically for this article, I have created an implementation in its simplest form that reflects the challenge we will be tackling.
The problem
It’ll be easy, they said… Nevertheless, while running some connectivity tests, I ran into a strange error that I wasn’t able to pinpoint.

The first test we did
- Call a stored procedure that reads data from a SQL table.
- Use its output, which is an array of objects in JSON notation, e.g.
[
  {
    "Isbn": "9781593279509",
    "Title": "Eloquent JavaScript, Third Edition",
    "Author": "Marijn Haverbeke",
    "Publisher": "No Starch Press",
    "Published": "2018-12-04T00:00:00.000Z"
  },
  {
    "Isbn": "9781484200766",
    "Title": "Pro Git",
    "Author": "Scott Chacon and Ben Straub",
    "Publisher": "Apress; 2nd edition",
    "Published": "2014-11-18T00:00:00.000Z"
  }
]
- Followed by a stored procedure that takes a SQL User Defined Table type parameter as input, internally selects its content and writes it into another SQL table, e.g.
CREATE PROCEDURE StoreBooks
    @RegistrationDate DATETIME,
    @Data [dbo].[BooksUdt] READONLY
AS
BEGIN
    SET QUOTED_IDENTIFIER ON
    INSERT INTO [dbo].[Books] ([Isbn], [Title], [Author], [Published], [Publisher], [RegistrationDate])
    SELECT [Isbn], [Title], [Author], [Published], [Publisher], @RegistrationDate
    FROM @Data
END
Passing in a JSON array of objects resulted in the error message “Failed to convert parameter value from a JArray to IEnumerable’1.“. This means that an underlying .NET library isn’t able to convert the JSON array into the type that SQL expects.
A simple Google search left me nowhere (again), so we needed to dig a bit deeper.
Some further investigation made things clear…
I wanted to investigate further and decided to set up a representative environment in my own Azure subscription to see if I could find out more. So I created a dummy database and a test workflow in Logic Apps Standard that uses the “Execute stored procedure (V2)” action to call a stored procedure with a UDT parameter.
This ultimately resulted in the limitation of the connector being presented:

The image above clearly shows that – unfortunately – the SQL connector does not support User Defined Table Type parameters.
And since we’re bound to what the DBA team offers us, we needed to investigate our alternative options.
The options
During the next day, I worked on possible workarounds.
The good
- Azure Functions. If you bump into a limitation or some complex logic in Logic Apps, a great way to tackle it is some custom C# coding, since it provides both flexibility and scalability when needed. Simply offload it to a Function and handle it there. As mentioned in my previous post: pro- and low code go hand in hand.
The bad (less good ;-))
- Ask the DBA team to change the signature of the stored procedures: instead of a UDT, use JSON – NVARCHAR(MAX) – as the input parameter, and let the stored procedure use SQL’s OPENJSON functionality to loop through the data. SQL does support this, and it is a plausible way to send data. The issues I have with this solution are that it 1) would require the DBA team to change a lot of stored procedure signatures, and 2) would need complex logic to parse the JSON and insert the data into a table, which – in my opinion – is error-prone in many ways. You can read more about this option here.
The ugly
To be honest, I don’t even want to share the other options that came to mind. One can think of:
- Why not use ADF for this case? Yeah, but why would we?
- Why not build up a query and use “Execute a SQL query (V2)”?
While this might seem like a plausible option, I find it 1) bad practice to fire queries directly at a database just to have them call a stored procedure, 2) it requires additional rights in the database, and 3) if you have a lot of data (again, 100K+ records) going into the UDT, building up that query would impact the performance of your Logic App Standard to a possibly undesired level.

The solution – Logic Apps <3 Functions
Azure Functions are a great way to simplify the complex nature of workflows.
- If something becomes too complex from a workflow perspective, delegate it to an Azure Function!
- If a transformation would be easier in an Azure Function rather than in Liquid or XSLT, please go ahead!
- And if there is something that a Logic App workflow action can’t (yet) do, delegate it to a Function! And so we did.
The following section outlines the solution and fragments of the implementation.
The Books database
For this tutorial, I decided it would be a great idea to store a collection of books, through a stored procedure with a UDT parameter, into an Azure SQL database.
1) So here we have it, the database:

2) This database has a single table to store book records in, and it looks like this:

3) On top of this table, I have created a User Defined Table type that mostly reflects the table from the previous step. It only omits the RegistrationDate column, since I wanted to make that an input parameter for the stored procedure.

4) And we have a stored procedure that takes the Registration date and the Books UDT as input parameters.

And with this set up, we’re good to go to work on the Azure Function.
The Azure Function
Put simply, what the Azure Function needs to do is:
1) Receive an HTTP request with a JSON payload that looks like the following.
{
  "RegistrationDate": "2025-10-30",
  "Data": [
    {
      "Isbn": "9781593279509",
      "Title": "Eloquent JavaScript, Third Edition",
      "Author": "Marijn Haverbeke",
      "Publisher": "No Starch Press",
      "Published": "2018-12-04T00:00:00.000Z"
    },
    {
      "Isbn": "9781484200766",
      "Title": "Pro Git",
      "Author": "Scott Chacon and Ben Straub",
      "Publisher": "Apress; 2nd edition",
      "Published": "2014-11-18T00:00:00.000Z"
    }
  ]
}
2) Translate this JSON to a .NET typed object (model) to work with internally.
3) Use ADO.NET (or EF, or any other library of your choice) to map the object onto a SqlCommand and execute the stored procedure with the proper parameters.
A Visual Studio solution for Functions requires some initial setup and boilerplate, and even more to do proper object-oriented programming. It would be too much detail to outline it all here, so I took some code snippets to explain how such an Azure Function could work.
The function CreateBooks.cs
This is the entry point of the Function; it translates the JSON to a .NET object (a model) and sends it to the business logic further down the chain.
In order to translate the JSON to an object, we’ve written an extension method on the HttpRequest object. In essence, it takes the JSON body and deserializes it with JsonConvert.DeserializeObject&lt;T&gt;.
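The extension method itself isn’t shown in full in this post, so here is a minimal sketch of what it could look like – an assumed implementation, not the actual helper – assuming Newtonsoft.Json as the serializer:

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Newtonsoft.Json;

// Sketch of a GetJsonBodyAsync extension method (assumed implementation).
// It reads the HTTP request body as a string and deserializes it with
// Newtonsoft.Json into the requested model type.
public static class HttpRequestExtensions
{
    public static async Task<T?> GetJsonBodyAsync<T>(this HttpRequest request)
    {
        using var reader = new StreamReader(request.Body);
        var json = await reader.ReadToEndAsync().ConfigureAwait(false);
        return JsonConvert.DeserializeObject<T>(json);
    }
}
```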
public class CreateBooks
{
    private readonly ILogger<CreateBooks> _logger;
    private readonly ICreateEntityProcessor<Books> _processor;

    public CreateBooks(ILogger<CreateBooks> logger, ICreateEntityProcessor<Books> processor)
    {
        _logger = logger ?? throw new ArgumentNullException(nameof(logger));
        _processor = processor ?? throw new ArgumentNullException(nameof(processor));
    }

    [Function(nameof(CreateBooks))]
    public async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "v1/books/")]
        HttpRequest req)
    {
        ArgumentNullException.ThrowIfNull(req);

        var books = await req.GetJsonBodyAsync<Books>().ConfigureAwait(false);
        await _processor.Process(books, CancellationToken.None).ConfigureAwait(false);

        return new OkObjectResult(string.Empty);
    }
}
The .NET object (model) that the JSON is deserialized into
public record Books
{
    [JsonProperty(nameof(RegistrationDate))]
    public DateTime? RegistrationDate { get; set; }

    [JsonProperty(nameof(Data))]
    public IList<Book>? Data { get; set; }
}

public record Book
{
    [JsonProperty(nameof(Isbn))]
    public string? Isbn { get; set; }

    [JsonProperty(nameof(Title))]
    public string? Title { get; set; }

    [JsonProperty(nameof(Author))]
    public string? Author { get; set; }

    [JsonProperty(nameof(Publisher))]
    public string? Publisher { get; set; }

    [JsonProperty(nameof(Published))]
    public DateTime? Published { get; set; }
}
The CreateBooksProcessor.cs
The Processor (or whatever you want to name it) takes the object and can, for example, validate it before it is passed to the SQL repository. I’ve kept it to a minimum for demo purposes.
public class CreateBooksProcessor : ICreateEntityProcessor<Books>
{
    private readonly ISqlRepository<Books> _sqlRepository;

    public CreateBooksProcessor(ISqlRepository<Books> sqlRepository)
    {
        _sqlRepository = sqlRepository ?? throw new ArgumentNullException(nameof(sqlRepository));
    }

    public async Task Process(Books bookRequest, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(bookRequest);

        await _sqlRepository.CreateAsync(bookRequest, cancellationToken)
            .ConfigureAwait(false);
    }
}
The BookRepository.cs
This is actually the most interesting part of the whole solution. It uses the good old ADO.NET provider in C# to execute the stored procedure.
The CreateAsync method:
- Creates a DataTable for all properties of the UDT.
- Loops through the Data section of the object and adds all records to the DataTable.
- And executes the SQL command with the name of the stored procedure, the RegistrationDate parameter, and the Data (UDT) parameter.
In the full solution, the ExecuteCommandAsync method resides in the IDataContext, but to keep things understandable for this post, I moved it into this class. In essence, this method sets up the SQL connection and executes the stored procedure.
public class BookRepository : ISqlRepository<Books>
{
    private readonly IDataContext _dataContext;

    public BookRepository(IDataContext dataContext)
    {
        _dataContext = dataContext ?? throw new ArgumentNullException(nameof(dataContext));
    }

    public async Task CreateAsync(Books data, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(data);
        ArgumentNullException.ThrowIfNull(data.Data);

        // The sequence of the columns must match the sequence of columns in the UDT
        var table = new DataTable();
        table.Columns.Add("Isbn", typeof(string));
        table.Columns.Add("Title", typeof(string));
        table.Columns.Add("Author", typeof(string));
        table.Columns.Add("Published", typeof(DateTime));
        table.Columns.Add("Publisher", typeof(string));

        foreach (var dataRow in data.Data)
        {
            table.Rows.Add(dataRow.Isbn, dataRow.Title, dataRow.Author, dataRow.Published, dataRow.Publisher);
        }

        await ExecuteCommandAsync("dbo.StoreBooks", cmd =>
        {
            cmd.Parameters.Add(new SqlParameter("@RegistrationDate", SqlDbType.DateTime) { Value = data.RegistrationDate });

            // TypeName must be the name of the User Defined Table type itself
            var param = new SqlParameter("@Data", SqlDbType.Structured) { TypeName = "dbo.BooksUdt", Value = table };
            cmd.Parameters.Add(param);
        }, cancellationToken).ConfigureAwait(false);
    }

    private async Task ExecuteCommandAsync(string storedProcedure, Action<SqlCommand> parameterMapper, CancellationToken cancellationToken)
    {
        if (string.IsNullOrWhiteSpace(storedProcedure))
            throw new ArgumentException("A stored procedure name is required.", nameof(storedProcedure));

        ArgumentNullException.ThrowIfNull(parameterMapper);

        // CreateConnectionAsync returns an opened connection
        using var connection = await _dataContext.CreateConnectionAsync().ConfigureAwait(false);
        using var command = new SqlCommand(storedProcedure, connection);
        command.CommandType = CommandType.StoredProcedure;

        parameterMapper(command);
        await command.ExecuteNonQueryAsync(cancellationToken).ConfigureAwait(false);
    }
}
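The IDataContext itself is not part of the snippets above; a minimal sketch could look like the following. The class name DataContext and the constructor taking a connection string are assumptions on my part – only the IDataContext interface and CreateConnectionAsync come from the solution described in this post.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;

public interface IDataContext
{
    Task<SqlConnection> CreateConnectionAsync();
}

// Assumed implementation: hands out opened SqlConnection instances,
// which the repository disposes via its "using" declarations.
public class DataContext : IDataContext
{
    private readonly string _connectionString;

    public DataContext(string connectionString)
        => _connectionString = connectionString
            ?? throw new ArgumentNullException(nameof(connectionString));

    public async Task<SqlConnection> CreateConnectionAsync()
    {
        // Open the connection here so callers can execute commands right away.
        var connection = new SqlConnection(_connectionString);
        await connection.OpenAsync().ConfigureAwait(false);
        return connection;
    }
}
```

In the Function, this would typically be registered in the dependency injection container together with the processor and the repository.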
The result
Even though the code in the previous section may seem overwhelming, it is quite straightforward and easy to set up. Of course, using Functions takes some time initially, but after that it’s just writing clean, reusable, and extensible code.
In our case, implementing support for the next stored procedure is nothing more than 30 minutes of coding effort.
As a result, the workflow now looks like this, where the SQL action has simply been replaced with a Function call:

And running the workflow results in a successful workflow run and records being present in the database.



Important to keep in mind
One thing to keep in mind when using this approach is that the column sequence of the DataTable must exactly match the order of the columns in the UDT. If you don’t obey the ordering, it might result in a conversion exception or – even worse – data being written into the wrong column in the database.
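One way to reduce that risk is to declare the column metadata in exactly one place. Instead of a DataTable, a Structured parameter can also take a stream of SqlDataRecord rows (from the Microsoft.Data.SqlClient.Server namespace), which makes the column names and types explicit in code – the ordinal order still has to match the UDT, but there is now a single definition to keep in sync. A sketch, not part of the original solution; the NVARCHAR sizes are illustrative assumptions, not taken from the actual schema:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using Microsoft.Data.SqlClient.Server;

public static class BooksUdtMapper
{
    // Must match the column order of [dbo].[BooksUdt] exactly.
    private static readonly SqlMetaData[] Columns =
    {
        new SqlMetaData("Isbn", SqlDbType.NVarChar, 50),
        new SqlMetaData("Title", SqlDbType.NVarChar, 200),
        new SqlMetaData("Author", SqlDbType.NVarChar, 200),
        new SqlMetaData("Published", SqlDbType.DateTime),
        new SqlMetaData("Publisher", SqlDbType.NVarChar, 200),
    };

    // Book is the model shown earlier in this post.
    public static IEnumerable<SqlDataRecord> ToRecords(IEnumerable<Book> books)
    {
        foreach (var book in books)
        {
            var record = new SqlDataRecord(Columns);
            record.SetValues(
                (object?)book.Isbn ?? DBNull.Value,
                (object?)book.Title ?? DBNull.Value,
                (object?)book.Author ?? DBNull.Value,
                (object?)book.Published ?? DBNull.Value,
                (object?)book.Publisher ?? DBNull.Value);
            yield return record;
        }
    }
}
```

The @Data parameter then takes the enumerable directly: new SqlParameter("@Data", SqlDbType.Structured) { TypeName = "dbo.BooksUdt", Value = BooksUdtMapper.ToRecords(data.Data) }. For 100K+ records this also avoids materializing the whole DataTable in memory, since the rows are streamed to SQL Server as they are enumerated.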

Endnote
Even though Logic Apps (Standard) is very actively developed by the product team, sometimes you’ll encounter unexpected behavior or unimplemented functionality. In those cases it is important to know that Azure Functions is a great piece of tooling in your palette to overcome these kinds of challenges.
Hopefully you enjoyed this post, and feel free to give feedback or ask questions in the comments section of this post.
Thanks for reading!

