Starting fun with Durable Functions

Not so long ago I started developing a simple side project related to game development. Since it's just a proof of concept, I've decided to present sneak peeks of the work here. There are two main parts to this project:

  • cloud backend based on serverless architecture
  • desktop client built with Electron and HTML5

As I said, it's more of a concept than a real product, so bleeding-edge technologies will be used just to confirm that "it's possible" :)

Starting with Durable Functions

Orchestrating workflows in Azure Functions is a fairly new feature, still in preview. The good news is that it's at least partially documented (not very well, but still better than nothing) and is drawing a growing audience to FaaS built with Azure components. You can find all the required details regarding installation and the concepts behind this functionality here. We'll go straight to the implementation.

Consider the following diagram:

If you're familiar with the Durable Task Framework, these names should ring a bell. We're telling the framework to start an orchestration, which will schedule activities in the proper order and perform some work. Note that communication between the Client and the Orchestration is not bidirectional. The idea here is to orchestrate work, not to wait for it. What we're given, however, is a link to the orchestration itself, where we can check its current status and result.

The important thing here is that the Orchestration aggregates data from Activity 1 and Activity 2 before going further. This allows for easy decoupling of responsibilities, so ordering the tasks poses no problem.
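The "schedule both activities, aggregate, then continue" shape can be sketched in plain Python with asyncio. This is purely illustrative (the real orchestrator relies on the Durable Task Framework's replay machinery, and `activity_1`/`activity_2` are hypothetical stand-ins for the activities in the diagram):

```python
import asyncio

# Hypothetical stand-ins for Activity 1 and Activity 2 from the diagram.
async def activity_1() -> str:
    return "coords"

async def activity_2() -> str:
    return "name"

async def orchestration() -> str:
    # Fan out: run both activities concurrently, then aggregate their
    # results before moving on -- the same shape as Task.WhenAll in the
    # Durable Functions orchestrator shown later.
    coords, name = await asyncio.gather(activity_1(), activity_2())
    return f"created {name} at {coords}"

print(asyncio.run(orchestration()))
```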

Let's look at a real-world example.


I converted the above diagram to the corresponding functions:

public static class Create
{
	[FunctionName("StartOrchestration")]
	public static async Task<HttpResponseMessage> StartOrchestration(
		[HttpTrigger(AuthorizationLevel.Function, methods: "post", Route = "orchestrators/run")] HttpRequestMessage req,
		[OrchestrationClient] DurableOrchestrationClient starter,
		TraceWriter log)
	{
		// Function input comes from the request content.
		dynamic eventData = await req.Content.ReadAsAsync<object>();
		string instanceId = await starter.StartNewAsync("Galaxy_Create", eventData);

		log.Info($"Started orchestration with ID = '{instanceId}'.");

		return starter.CreateCheckStatusResponse(req, instanceId);
	}

	[FunctionName("Galaxy_Create")]
	public static async Task<string> RunImpl([OrchestrationTrigger] DurableOrchestrationContext context)
	{
		// Fan out to both utility activities and wait for their results.
		var result = await Task.WhenAll(
			context.CallActivityAsync<string>("Utility_Coords"),
			context.CallActivityAsync<string>("Utility_Name"));

		await context.CallActivityAsync("Galaxy_Create_Impl", new CreateGalaxyContext(result[1], result[0]));

		return "Galaxy created!";
	}

	[FunctionName("Galaxy_Create_Impl")]
	public static async Task CreateGalaxy(
		[ActivityTrigger] CreateGalaxyContext context,
		[Table("galaxies")] IAsyncCollector<GalaxyDataEntity> galaxies)
	{
		await galaxies.AddAsync(new GalaxyDataEntity(context.Name, context.Coords));
	}
}

If I start an orchestration, I'll get a new galaxy added to my table in Table Storage like so:

Let's analyze each function one by one.


It's our Orchestration Client. It acts as an interface to the orchestration, so we can start it. As you can see, it works as an HttpTrigger - you can use whichever trigger you like, such as Event Hub or Event Grid. It returns a simple payload, which allows you to interact with the running orchestration:

    "id": "8d5c8fdf0cf54d87839d9d17223268bf",
    "statusQueryGetUri": "http://localhost:7071http://localhos/admin/extensions/DurableTaskExtension/instances/8d5c8fdf0cf54d87839d9d17223268bf?taskHub=SampleHubVS&connection=Storage&code=YgC1NtQXBBRJGO9sGbNWVA03nVjedvjPwPUWZ/lH3vIBVcngQIDYjg==",
    "sendEventPostUri": "http://localhost:7071/admin/extensions/DurableTaskExtension/instances/8d5c8fdf0cf54d87839d9d17223268bf/raiseEvent/{eventName}?taskHub=SampleHubVS&connection=Storage&code=YgC1NtQXBBRJGO9sGbNWVA03nVjedvjPwPUWZ/lH3vIBVcngQIDYjg==",
    "terminatePostUri": "http://localhost:7071/admin/extensions/DurableTaskExtension/instances/8d5c8fdf0cf54d87839d9d17223268bf/terminate?reason={text}&taskHub=SampleHubVS&connection=Storage&code=YgC1NtQXBBRJGO9sGbNWVA03nVjedvjPwPUWZ/lH3vIBVcngQIDYjg=="


The actual orchestration. It holds logic responsible for scheduling all tasks as needed.


The actual implementation of a worker, responsible for inserting data into a storage table. It's unaware of the other functions - it simply expects a particular context to be passed in, which holds the necessary data.

There are two other functions in this solution (Utility_Coords and Utility_Name), but they're just standard ActivityTrigger functions that generate random data.


As you can see, Durable Functions greatly ease the development of organized workflows. Without this feature we'd have to use e.g. queues or perform HTTP requests to gather all the data. What's more, it moves a solution towards event-driven architecture and simplifies the overall structure. In the next post we'll convert this solution into a passive application, with Event Grid taking the role of the commander. Stay tuned!

What's Azure Event Grid all about?

Azure Event Grid is one of the newest products in the Azure cloud stack. Since it's still in preview, we're not offered the full functionality (e.g. only two regions can be selected, and not all event publishers have been added yet). However, with all the goodness this component provides, we can start thinking about "reactive programming in the cloud" - at least that's what the documentation tells us. Let's dive deeper into Event Grid and find out why it's so special.


Event Grid is all about events. You may ask how it differs from similar products like Event Hub or Service Bus. If you take a look at the basic architecture, you'll find very similar concepts, like topics and subscribers (well, at least for Service Bus). So why do I need Event Grid (which will complicate my architecture even more) when I can easily connect e.g. my Azure Functions to a topic and achieve the same functionality with ease? Well, that's only partially true.

Event Grid functional model

The downside of the other solutions is the need for polling - the details don't matter for now; the point is that you have to implement some way of communication between your app and an event publisher. It can be long-polling, event sourcing, WebSockets - whatever works. So even if you establish a persistent connection, you have to talk to the other side and await messages. You're not passive in this model - that's why you cannot "react" to events passed to you. You only parse them and pass them further.

Event Grid allows you to make your components "passive" - they sit somewhere in the cloud and are only interested in the data sent to them. They don't have to maintain any connections - it's up to Event Grid to distribute messages and deliver them to the configured subscribers. Microsoft states that this approach is well suited to serverless scenarios, and I agree - you can make the underlying infrastructure even more abstract and control the flow of events from a single point. For me, the ability to configure the connection between Event Hub and several Azure Functions in the Azure Portal (so I don't have to pass an EH connection string to each individual component) is definitely a big YES for Event Grid.
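To give a feel for the "passive" side, here's a sketch of what a webhook subscriber does when Event Grid first talks to it: before delivering real events, Event Grid POSTs a subscription validation event, and the endpoint must echo back the validation code. The handler below is a simplified, framework-free illustration of that handshake (the sample event is made up):

```python
# Event type Event Grid sends to validate a new webhook subscription.
VALIDATION_EVENT = "Microsoft.EventGrid.SubscriptionValidationEvent"

def handle_event_grid_post(events: list) -> dict:
    for event in events:
        if event.get("eventType") == VALIDATION_EVENT:
            # Echo the code back so Event Grid trusts this endpoint.
            return {"validationResponse": event["data"]["validationCode"]}
        # Otherwise we'd "react" to the event; here we just acknowledge.
    return {"status": "processed"}

sample = [{"eventType": VALIDATION_EVENT, "data": {"validationCode": "abc-123"}}]
print(handle_event_grid_post(sample))
```

The key point: the subscriber never polls anything; it only answers what is pushed to it.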

Should I go for it?

I still think that, although Event Grid simplifies and improves working with serverless architecture (what am I saying - it actually enables you to start thinking about serverless at all...), you can't just grab it, write a couple of Functions and say "this is how we build applications in our company today". It still requires proper planning, it's still not suitable for each and every application (with Event Hub, Event Grid and Azure Functions, you may only assume that an event will reach its destination... at some point in time), and it forces you to change your mindset to a "reactive" one (which is sometimes a challenge in itself).

Event Grid as the "man-in-the-middle" in serverless architecture

On the other hand, I like how smoothly it integrates with the cloud - for now only a few publishers are available, but we've been promised that this will change soon. I treat it as a serverless orchestrator - the centre of my architecture, where I can separate concerns seamlessly. Combine that with the negligible cost ($0.60 per million operations, with the first 100k free) and an easy learning curve, and ask yourself why you haven't tested it yet.
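The pricing quoted above is easy to sanity-check with a few lines of arithmetic (a sketch based only on the numbers in this post; real billing tiers may differ in detail):

```python
# Pricing as quoted above: $0.60 per million operations,
# with the first 100k operations free.
FREE_OPERATIONS = 100_000
PRICE_PER_MILLION = 0.60

def event_grid_cost(operations: int) -> float:
    billable = max(0, operations - FREE_OPERATIONS)
    return billable / 1_000_000 * PRICE_PER_MILLION

print(event_grid_cost(100_000))    # entirely within the free tier
print(event_grid_cost(5_000_000))
```

Even at five million operations a month, the bill stays under three dollars.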