.NET 6: Benchmarking the performance of JsonSerializer.DeserializeAsyncEnumerable

This should have been part of my earlier post on System.Text.Json support for IAsyncEnumerable, but it slipped my mind. So here we are.

To understand the significance of this feature in .NET 6, one needs to understand the circumstances under which it might be useful. The first of those is, of course, that we can start consuming the data even while the rest of the JSON is yet to be deserialized.

The significance is further amplified when you are only interested in an earlier part of the data. You no longer need to deserialize the entire JSON (which could be huge), hold it all in buffers, and then use only a fraction of it. This can provide an immense performance boost to the application.

Let us compare and benchmark the performance of the various deserialization methods exposed by System.Text.Json and attempt to understand this better.

There are three methods which we will be placing under the hammer:

  • JsonSerializer.Deserialize<T>
  • JsonSerializer.DeserializeAsync<T>
  • JsonSerializer.DeserializeAsyncEnumerable<T>

Let us write some code to benchmark them.

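Before the benchmark methods themselves, here is a minimal sketch of the scaffolding they rely on. The Data type, the collection size, the class name, and the DATA_TO_CONSUME threshold are assumptions for illustration, since the setup code is not shown in this excerpt; the benchmarks are written for BenchmarkDotNet.

using System.Linq;
using System.Text;
using System.Text.Json;
using BenchmarkDotNet.Attributes;

public class Data
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class DeserializeBenchmarks // class name assumed for illustration
{
    // Assumed sizes: 100,000 items in the JSON, of which only a fraction is consumed.
    private const int TOTAL_ITEMS = 100_000;
    private const int DATA_TO_CONSUME = TOTAL_ITEMS / 5; // ~20% for Scenario 1

    private string serializedString = string.Empty;

    [GlobalSetup]
    public void Setup()
    {
        // Serialize a large collection once, so every benchmark deserializes the same JSON.
        var source = Enumerable.Range(1, TOTAL_ITEMS)
                               .Select(i => new Data { Id = i, Name = $"Item {i}" });
        serializedString = JsonSerializer.Serialize(source);
    }

    // The benchmark methods below are members of this class.
}
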
[Benchmark]
public void TestDeserialize()
{
    // Deserialize the entire JSON up front, then consume only part of it.
    foreach (var item in DeserializeWithoutStreaming().TakeWhile(x => x.Id < DATA_TO_CONSUME))
    {
        // DoSomeWork
    }
}

public IEnumerable<Data> DeserializeWithoutStreaming()
{
    var deserializedData = JsonSerializer.Deserialize<IEnumerable<Data>>(serializedString);
    return deserializedData;
}

[Benchmark]
public async Task TestDeserializeAsync()
{
    // DeserializeAsync still materializes the whole collection before we can consume it.
    foreach (var item in (await DeserializeAsync()).TakeWhile(x => x.Id < DATA_TO_CONSUME))
    {
        // DoSomeWork
    }
}

public async Task<IEnumerable<Data>> DeserializeAsync()
{
    using var memStream = new MemoryStream(Encoding.UTF8.GetBytes(serializedString));
    var deserializedData = await JsonSerializer.DeserializeAsync<IEnumerable<Data>>(memStream);
    return deserializedData;
}



[Benchmark]
public async Task TestDeserializeAsyncEnumerable()
{
    // TakeWhile over IAsyncEnumerable comes from the System.Linq.Async package.
    await foreach (var item in DeserializeWithStreaming().TakeWhile(x => x.Id < DATA_TO_CONSUME))
    {
        // DoSomeWork
    }
}

public async IAsyncEnumerable<Data> DeserializeWithStreaming()
{
    using var memStream = new MemoryStream(Encoding.UTF8.GetBytes(serializedString));

    // Items are yielded as they are deserialized, instead of waiting for the whole payload.
    await foreach (var item in JsonSerializer.DeserializeAsyncEnumerable<Data>(memStream))
    {
        yield return item;
    }
}
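
With BenchmarkDotNet, the benchmarks are executed through its runner from the program entry point. A minimal sketch, assuming the class name used in the scaffolding above:

using BenchmarkDotNet.Running;

// Runs every [Benchmark] method in the class and prints the results table.
BenchmarkRunner.Run<DeserializeBenchmarks>();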

Scenario 1: Consuming only the first 20% of the JSON data

The first scenario to consider is when only a fairly small portion of the JSON data is consumed, say the first 20%. Deserialize<T> and DeserializeAsync<T> need to deserialize the entire JSON even though the client consumes only the first 20% of that data; DeserializeAsyncEnumerable<T>, on the other hand, deserializes on demand. This is evident in the benchmark results as well, where DeserializeAsyncEnumerable performs about three times better.

Method                          Mean      Error      StdDev
TestDeserialize                 4.810 ms  0.0952 ms  0.2573 ms
TestDeserializeAsync            5.166 ms  0.1008 ms  0.1161 ms
TestDeserializeAsyncEnumerable  1.531 ms  0.0305 ms  0.0825 ms

Scenario 2: Consuming about 80% of the JSON data

In the second scenario, we consider the case where the client consumes 80% of the data. As one could expect, a larger part of the JSON now has to be deserialized, and hence the performance margin decreases.

Method                          Mean      Error      StdDev
TestDeserialize                 4.960 ms  0.0974 ms  0.1877 ms
TestDeserializeAsync            5.238 ms  0.0997 ms  0.1297 ms
TestDeserializeAsyncEnumerable  4.851 ms  0.0859 ms  0.0804 ms

This is expected too: as more of the JSON has to be deserialized, the performance difference becomes hardly significant, if not non-existent. Still, there is an advantage to using DeserializeAsyncEnumerable – you do not have to wait for the entire JSON to be deserialized; the on-demand streaming approach allows you to consume the data as soon as parts of the JSON are deserialized.
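
That streaming advantage is most visible when the JSON arrives over the network rather than from an in-memory string. Here is a hedged sketch, assuming a hypothetical endpoint that returns a large JSON array of Data items:

using System.Net.Http;
using System.Text.Json;

var client = new HttpClient();

// The response body is read incrementally; items are yielded as soon as enough
// of the JSON array has arrived to deserialize them, without buffering the whole payload.
await using var stream = await client.GetStreamAsync("https://example.com/api/data"); // hypothetical endpoint
await foreach (var item in JsonSerializer.DeserializeAsyncEnumerable<Data>(stream))
{
    // DoSomeWork
}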

I feel this is a huge improvement, especially when the JSON concerned is significantly large. Like many others, I am excited to see the improvements in .NET in recent years and am looking forward to the release of .NET 6.
