Reactive Durable Functions

Durable Functions are a big topic on their own and I'll come back to them soon. In the previous post I created a simple Function App which inserts a row into Table Storage. All activities were orchestrated and it was really easy to schedule more work. Today I want to show you how easily you can transform such an active architecture into a passive one using Event Grid. This will be a fairly easy episode, so let's start!

Function App

There's no need to change anything in the Function App since for now we'll use an HttpTrigger. Unfortunately we cannot host it locally (because of Event Grid), so we need to publish it to the cloud.

Event Grid

You can easily deploy Event Grid from the marketplace. For now there's nothing special regarding its installation, so I won't go into details.

Event Grid chosen from Marketplace - still in preview though

Combining it all together

Once we have the components deployed, we can configure Event Grid so it passes events to the chosen endpoint. In our case it'll be our Galaxy_Create_Start function (the one we used to start the orchestration). We'll need one thing - the function URL. To get it, go to the Function App you deployed, find the function and click Get Function URL.

Once you have it, go to Event Grid and create a new subscriber.

What we need now is a new subscription. This feature allows you to orchestrate the flow of events, so each subscriber can be subscribed to a particular event type. With this configured we can centralize the way multiple services built with Azure Functions integrate with event producers.

My subscriber configured and subscribed to nebula.galaxy event type

When you configure a new subscriber, it will be added to the list of subscribers. That's everything we needed to integrate Event Grid with Azure Functions, so let's test it now.

Working example

There are two important things we need to test our solution - the Event Grid endpoint and an access key. The former is available on the main screen - copy it so you know where to post your messages. Access keys can be found under the Settings section in the Event Grid main menu.

Since we don't have any real producer yet, we'll try to simulate one. For this purpose I used Postman, but the tool is up to you. The only thing we have to do is to post an HTTP request to the mentioned endpoint. Here's an example:

POST /api/events HTTP/1.1
Host: your.eventgrid.azure.net
aeg-sas-key: YOUR_KEY
Content-Type: application/json
Cache-Control: no-cache

[
    {
        "id": "2",
        "eventType": "nebula.galaxy",
        "subject": "nebula/galaxy/create",
        "eventTime": "2017-11-08T13:25:00+01:00",
        "data":{
        }
    }
]
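
If you'd rather simulate the producer from code than from Postman, here's a minimal C# sketch of the same request using HttpClient. The endpoint URL and YOUR_KEY are placeholders you need to replace with your own values; it needs the System.Net.Http and System.Text namespaces.

// A minimal sketch of posting the same event from C# instead of Postman.
// Replace the endpoint and YOUR_KEY placeholders with your own values.
using (var client = new HttpClient())
{
	// Event Grid authenticates requests to a custom topic via the aeg-sas-key header.
	client.DefaultRequestHeaders.Add("aeg-sas-key", "YOUR_KEY");

	var payload = @"[{""id"": ""2"", ""eventType"": ""nebula.galaxy"", ""subject"": ""nebula/galaxy/create"", ""eventTime"": ""2017-11-08T13:25:00+01:00"", ""data"": {}}]";
	var content = new StringContent(payload, Encoding.UTF8, "application/json");

	var response = await client.PostAsync("https://your.eventgrid.azure.net/api/events", content);
	response.EnsureSuccessStatusCode();
}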

As you can see the payload has a specific schema, which is validated on the Event Grid side. In fact it should be self-explanatory. What is important here is that this payload is passed on to the function. If you change the main function a little bit:

[FunctionName("Galaxy_Create_Start")]
public static async Task<HttpResponseMessage> StartOrchestration(
	[HttpTrigger(AuthorizationLevel.Function, "post", Route = "orchestration/start")] HttpRequestMessage req,
	[OrchestrationClient] DurableOrchestrationClient starter,
	TraceWriter log)
{
	// Function input comes from the request content.
	string instanceId = await starter.StartNewAsync("Galaxy_Create", null);

	var payload = await req.Content.ReadAsStringAsync();
	log.Info($"Started orchestration with ID = '{instanceId}'.");
	log.Info($"The payload is: {payload}");

	return starter.CreateCheckStatusResponse(req, instanceId);
}

You'll see the following result when you go to the function and check the console:

2017-11-08T12:36:54.565 Function started (Id=ecb2655e-912e-435b-b916-f21b65729716)
2017-11-08T12:36:55.144 Started orchestration with ID = 'f7d2ad0001204ff0a381d61b448ef8b7'.
2017-11-08T12:36:55.144 The payload is: [{
  "id": "3",
  "eventType": "nebula.galaxy",
  "subject": "nebula/galaxy/create",
  "eventTime": "2017-11-08T12:25:00+00:00",
  "data": {},
  "topic": "/SUBSCRIPTIONS/____________/RESOURCEGROUPS/NEBULA-EUW-DEV-RG2/PROVIDERS/MICROSOFT.EVENTGRID/TOPICS/NEBULA-EUW-DEV-EVENTGRID"
}]
2017-11-08T12:36:55.173 Function completed (Success, Id=ecb2655e-912e-435b-b916-f21b65729716, Duration=596ms)

Of course you can easily deserialize it and incorporate it into your flow.
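
For example, here's a minimal sketch of such deserialization with Json.NET (which Azure Functions already references). GridEvent is a hypothetical POCO I made up to mirror the fields shown above - it's not a type from any SDK.

// GridEvent is a hypothetical class mirroring the payload fields logged above.
public class GridEvent
{
	public string Id { get; set; }
	public string EventType { get; set; }
	public string Subject { get; set; }
	public DateTime EventTime { get; set; }
	public JObject Data { get; set; }
	public string Topic { get; set; }
}

// Inside Galaxy_Create_Start, after reading the request content:
var events = JsonConvert.DeserializeObject<GridEvent[]>(payload);
foreach (var gridEvent in events)
{
	log.Info($"Received '{gridEvent.EventType}' for subject '{gridEvent.Subject}'");
}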

Summary

As you can see, using Event Grid as a gateway to the underlying architecture is a piece of cake. In fact we didn't need any change in our code - the whole integration went seamlessly. In the next episode I'll try to show you how to integrate a producer (Event Hub), so we don't have to post messages directly to Event Grid.

Starting fun with Durable Functions

Not so long ago I started developing a simple side project related to game development. Since it's just a proof of concept, I've decided to present sneak peeks of the work done here. There are two main parts of this project:

  • cloud backend based on serverless architecture
  • desktop client built with Electron and HTML5

As I said, it's more a concept than a real product, so all the bleeding-edge technologies will be used just to confirm that "it's possible" :)

Starting with Durable Functions

Orchestrating workflows in Azure Functions is a pretty new feature, which is still in the preview stage. The good thing is that it's already somewhat documented (not very well, but still better than nothing) and it draws a bigger and bigger audience to FaaS built with Azure components. You can find all the required details regarding installation and the concepts behind this functionality here. We'll go straight to the implementation.

Consider the following diagram:

If you're familiar with the Durable Task Framework, those names should ring a bell. We're telling the framework to start an orchestration, which will schedule activities in the proper order and perform some work. Note that communication between the Client and the Orchestration is not bidirectional. The idea here is to orchestrate work and not wait for it. What we're given, however, is a link to the orchestration itself, where we can check the current status and result.

The important thing here is that the Orchestration aggregates data from Activity 1 and Activity 2 before going further. This makes it easy to decouple responsibilities, so ordering the tasks is not a problem.

Let's check a real-world example.

Implementation

I converted the above diagram into the corresponding functions:

public static class Create
{
	[FunctionName("Galaxy_Create_Start")]
	public static async Task<HttpResponseMessage> StartOrchestration(
		[HttpTrigger(AuthorizationLevel.Function, methods: "post", Route = "orchestrators/run")] HttpRequestMessage req,
		[OrchestrationClient] DurableOrchestrationClient starter,
		TraceWriter log)
	{
		// Function input comes from the request content.
		dynamic eventData = await req.Content.ReadAsAsync<object>();
		string instanceId = await starter.StartNewAsync("Galaxy_Create", eventData);

		log.Info($"Started orchestration with ID = '{instanceId}'.");

		return starter.CreateCheckStatusResponse(req, instanceId);
	}

	[FunctionName("Galaxy_Create")]
	public static async Task<string> RunImpl([OrchestrationTrigger] DurableOrchestrationContext context)
	{
		// Fan out: run both utility activities in parallel and wait for their results.
		var result = await Task.WhenAll(context.CallActivityAsync<string>("Utility_Coords"),
			context.CallActivityAsync<string>("Utility_Name"));

		// Fan in: pass the aggregated results (name and coords) to the worker activity.
		await context.CallActivityAsync("Galaxy_Create_Impl", new CreateGalaxyContext(result[1], result[0]));

		return "Galaxy created!";
	}

	[FunctionName("Galaxy_Create_Impl")]
	public static async Task CreateGalaxy(
		[ActivityTrigger] CreateGalaxyContext context,
		[Table("galaxies")] IAsyncCollector<GalaxyDataEntity> galaxies)
	{
		await galaxies.AddAsync(new GalaxyDataEntity(context.Name, context.Coords));
	}
}
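
The CreateGalaxyContext and GalaxyDataEntity types are not shown above. Here's a rough sketch of how they could look - these shapes are my assumption, not necessarily what the project uses; the only real requirement is that GalaxyDataEntity is a valid Table Storage entity (TableEntity comes from Microsoft.WindowsAzure.Storage.Table).

// Assumed shapes of the supporting types - the real project may differ.
public class CreateGalaxyContext
{
	public CreateGalaxyContext(string name, string coords)
	{
		Name = name;
		Coords = coords;
	}

	public string Name { get; set; }
	public string Coords { get; set; }
}

// A Table Storage entity needs PartitionKey and RowKey, hence deriving from TableEntity.
public class GalaxyDataEntity : TableEntity
{
	public GalaxyDataEntity() { }

	public GalaxyDataEntity(string name, string coords)
	{
		PartitionKey = "galaxy";
		RowKey = name;
		Name = name;
		Coords = coords;
	}

	public string Name { get; set; }
	public string Coords { get; set; }
}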

If I start an orchestration, I'll get a new galaxy added to my table in Table Storage like so:

Let's analyze each function one by one.

Galaxy_Create_Start

It's our Orchestration Client. It acts as an interface to the orchestration, so we can start it. As you can see it works as an HttpTrigger, but you can use whichever trigger you like, such as Event Hub or Event Grid. It returns a simple payload, which allows you to interact with the running orchestration:

{
    "id": "8d5c8fdf0cf54d87839d9d17223268bf",
    "statusQueryGetUri": "http://localhost:7071http://localhos/admin/extensions/DurableTaskExtension/instances/8d5c8fdf0cf54d87839d9d17223268bf?taskHub=SampleHubVS&connection=Storage&code=YgC1NtQXBBRJGO9sGbNWVA03nVjedvjPwPUWZ/lH3vIBVcngQIDYjg==",
    "sendEventPostUri": "http://localhost:7071/admin/extensions/DurableTaskExtension/instances/8d5c8fdf0cf54d87839d9d17223268bf/raiseEvent/{eventName}?taskHub=SampleHubVS&connection=Storage&code=YgC1NtQXBBRJGO9sGbNWVA03nVjedvjPwPUWZ/lH3vIBVcngQIDYjg==",
    "terminatePostUri": "http://localhost:7071/admin/extensions/DurableTaskExtension/instances/8d5c8fdf0cf54d87839d9d17223268bf/terminate?reason={text}&taskHub=SampleHubVS&connection=Storage&code=YgC1NtQXBBRJGO9sGbNWVA03nVjedvjPwPUWZ/lH3vIBVcngQIDYjg=="
}
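
If you want to check the result programmatically, a minimal sketch of polling statusQueryGetUri with HttpClient could look like this - the status endpoint returns the orchestration's runtime status and output as JSON.

// Polls the status endpoint returned by CreateCheckStatusResponse.
// statusQueryGetUri is the value taken from the payload above.
public static async Task<string> GetOrchestrationStatusAsync(string statusQueryGetUri)
{
	using (var client = new HttpClient())
	{
		var response = await client.GetAsync(statusQueryGetUri);
		response.EnsureSuccessStatusCode();

		// The body contains fields such as runtimeStatus and output.
		return await response.Content.ReadAsStringAsync();
	}
}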

Galaxy_Create

The actual orchestration. It holds the logic responsible for scheduling all tasks as needed.

Galaxy_Create_Impl

The actual implementation of a worker, which is responsible for inserting data into a storage table. It's unaware of the other functions - it just expects a particular context to be passed to it, which holds the necessary data.

There are two other functions in this solution (Utility_Coords and Utility_Name), but they're just standard ActivityTrigger functions which generate random data, sketched below.
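
A rough sketch of how such an activity could look - the actual generators in the project may build the random data differently; the only contract is that each returns a string.

// Assumed implementation of the utility activities.
[FunctionName("Utility_Name")]
public static string GenerateName([ActivityTrigger] DurableActivityContext context)
{
	var random = new Random();
	return $"Galaxy-{random.Next(1000, 9999)}";
}

[FunctionName("Utility_Coords")]
public static string GenerateCoords([ActivityTrigger] DurableActivityContext context)
{
	var random = new Random();
	return $"{random.Next(0, 360)}:{random.Next(-90, 90)}";
}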

Conclusion

As you can see, Durable Functions greatly ease developing organized flows. Without this feature we'd have to use e.g. queues or perform HTTP requests to gather all the data. What's more, it moves a solution towards event-driven architectures and simplifies the overall structure. In the next post we'll convert this solution into a passive application, with Event Grid taking the role of the commander. Stay tuned!