Serverless framework with Azure Functions

Recently an update to Azure Functions was published that adds proxies and integration with the Serverless framework. In a nutshell, Serverless is a tool for building serverless architectures. Don't let the name fool you - in fact there's a server behind the whole functionality, and all the usual principles of web development still apply. What it really gives you is the ability to focus only on the core functionality of your application. There are no pipelines, layers or areas - you ensure that each component follows the single responsibility principle and can be deployed separately.

We're going to build a very simple application using Azure Functions and Serverless to present the current capabilities of the integration.

Provisioning your environment

I decided that the easiest solution here is to obtain a pre-configured Ubuntu VM with Node.js installed - you can easily get one from the marketplace in the portal. Once you have it, connect to it (as usual in my case, PuTTY seems like the best option from Windows) and we can start. Type in the following commands:

sudo npm install -g serverless
serverless install --url https://github.com/azure/boilerplate-azurefunctions --name ServerlessExample

Those two will:

  • install Serverless globally
  • install a new service locally under the name ServerlessExample

Once it's finished, you can go to the ServerlessExample directory - you'll see the files which form the core of our application. In fact, this boilerplate gives us everything we need to get started, as the example is fully functional.
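
If you open serverless.yml in that directory, you'll find the definition of the service. The fragment below is only an illustrative sketch of what such a file can look like for the Azure provider - the names and values are mine, not necessarily the exact boilerplate contents:

```yaml
service: ServerlessExample

provider:
  name: azure
  location: West US

plugins:
  - serverless-azure-functions

functions:
  httpjs:
    handler: httpjs/index.handler
    events:
      - http: true
```

The important parts are the provider name (azure) and the plugin entry, which is what wires the Serverless CLI to Azure Functions.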

Deploying a function

Before you're able to deploy your function to Azure, you have to provide credentials for the deployment. The whole process is pretty well described here, however I'm including it with my comments because in my case it didn't go as flawlessly as I expected:

bitnami@nodejsblog:~$ sudo npm i -g azure-cli
bitnami@nodejsblog:~$ azure login
bitnami@nodejsblog:~$ azure account show
bitnami@nodejsblog:~$ azure login
bitnami@nodejsblog:~$ azure ad app create -n service_name --home-page http://www.my-app.org --identifier-uris https://www.my-app.org/example -p pass
bitnami@nodejsblog:~$ azure ad sp create -a application_id
bitnami@nodejsblog:~$ azure role assignment create --objectId object_id  -o Contributor
bitnami@nodejsblog:~$ export azureSubId='<subscriptionId>'
bitnami@nodejsblog:~$ export azureServicePrincipalTenantId='<tenantId>'
bitnami@nodejsblog:~$ export azureServicePrincipalClientId='<servicePrincipalName>'
bitnami@nodejsblog:~$ export azureServicePrincipalPassword='<password>'

Note that I additionally used the azure ad app create command - in my case it was not possible to simply create a service principal using only a name and a password.

You can verify that you've added the environment variables using the printenv command. Once you've configured your credentials, go to the directory of your service and just run:
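
As a quick sanity check - a small sketch using only the variable names exported above - you can loop over the four variables and flag any that are missing:

```shell
# Report which of the required credential variables are set
for v in azureSubId azureServicePrincipalTenantId \
         azureServicePrincipalClientId azureServicePrincipalPassword; do
  if printenv "$v" > /dev/null; then
    echo "$v is set"
  else
    echo "$v is MISSING"
  fi
done
```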

serverless deploy

You should see the following output:

Serverless: Packaging service...
Serverless: Logging in to Azure
Serverless: Creating resource group: ServerlessExampleKamz-rg
Serverless: Creating function app: ServerlessExampleKamz
Serverless: Waiting for Kudu endpoint...
Serverless: Parsing Azure Functions Bindings.json...
Serverless: Building binding for function: httpjs event: httpTrigger
Serverless: Packaging function: httpjs
Serverless: Building binding for function: queuejs event: queueTrigger
Serverless: Building binding for function: queuejs event: blob
Serverless: Packaging function: queuejs
Serverless: Syncing Triggers....Response statuscode: 200
Serverless: Running Kudu command del package.json...
Serverless: Running Kudu command npm install --production...
Serverless: Successfully created Function App

Now when you go to the portal, you'll see a new resource group created and the function deployed.

In the next post we'll try to do more advanced configuration and consider involving VSTS and incorporating the framework into our build/release pipeline.

EventStore on Azure and Ubuntu - it's a piece of cake! #1

EventStore is a well-known, open-source and solid database designed to be the very foundation of event-driven systems. What is great about it is that it can be built for both Windows and Ubuntu, which widens the technology stack it can be used with. If you prefer Linux solutions and would like to build an event-sourced system based on ES, there's nothing to stop you. In this short series of posts I will show how to quickly install, configure and manage EventStore using Ubuntu VMs in Azure.

Getting VM

You can obtain an Ubuntu 14.04 VM from the marketplace in the Azure Portal. There's nothing special about its configuration or size - for testing purposes it can be whatever you like and are comfortable with. Once you fill in all the fields and provision the environment, we can connect to the machine and install the database.

Installation

To connect to the VM you need an SSH client and the credentials you provided during the VM installation process. I personally recommend PuTTY in a Windows environment since it's lightweight and completely free. Once you're logged in, we can start installing the EventStore instance.

First, run the following command:

curl -s https://packagecloud.io/install/repositories/EventStore/EventStore-OSS/script.deb.sh | sudo bash

Once the EventStore package repository is configured, you can install it:

sudo apt-get install eventstore-oss=3.9.3

You can choose any version you like; in this particular post I selected 3.9.3 since it was the most recent one available.

Once EventStore is installed, we can run it using this command:

sudo service eventstore start

and use curl to send a test event to make sure everything is all right. To make things easier, take the following JSON from the documentation:

[
  {
    "eventId": "fbf4a1a1-b4a3-4dfe-a01f-ec52c34e16e4",
    "eventType": "event-type",
    "data": {
      "a": "1"
    }
  }
]

and use the following command to send the event:

vi event.txt
curl -i -d @event.txt "http://127.0.0.1:2113/streams/newstream" -H "Content-Type:application/vnd.eventstore.events+json"

Note that we're using vi to quickly create the event.txt file with the JSON from above. When you execute the command, you should receive an HTTP 201 Created response:

HTTP/1.1 201 Created
Access-Control-Allow-Methods: POST, DELETE, GET, OPTIONS
Access-Control-Allow-Headers: Content-Type, X-Requested-With, X-Forwarded-Host, X-PINGOTHER, Authorization, ES-LongPoll, ES-ExpectedVersion, ES-EventId, ES-EventType, ES-RequiresMaster, ES-HardDelete, ES-ResolveLinkTo
Access-Control-Allow-Origin: *
Access-Control-Expose-Headers: Location, ES-Position, ES-CurrentVersion
Location: http://127.0.0.1:2113/streams/newstream/0
Server: Mono-HTTPAPI/1.0
Date: Wed, 22 Feb 2017 08:09:26 GMT
Content-Type: text/plain
Content-Length: 0
Keep-Alive: timeout=15,max=100
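
If you'd rather not open an editor at all, the same event.txt file can be created non-interactively - a small sketch using a heredoc plus Python's built-in json.tool to sanity-check the payload before posting it:

```shell
# Create event.txt without opening vi
cat > event.txt <<'EOF'
[
  {
    "eventId": "fbf4a1a1-b4a3-4dfe-a01f-ec52c34e16e4",
    "eventType": "event-type",
    "data": { "a": "1" }
  }
]
EOF

# Fail early if the file is not well-formed JSON
python3 -m json.tool event.txt > /dev/null && echo "event.txt is valid JSON"
```

The curl command from above then works unchanged against the generated file.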

Note that the configuration file used by EventStore is located at /etc/eventstore/eventstore.conf and, since it's read-only, you will have to use sudo to change anything in it. For now, leave it as it is.
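
To give you an idea of what we'll be touching later: the file is in YAML format, and a fragment like the one below (illustrative values only, don't apply them yet) is the kind of change needed to accept connections from outside the machine:

```yaml
---
# /etc/eventstore/eventstore.conf (illustrative fragment)
ExtIp: 0.0.0.0      # listen on all interfaces instead of loopback only
ExtHttpPort: 2113   # default HTTP port
ExtTcpPort: 1113    # default TCP port
```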

What's next?

In the next posts I will show how to access EventStore from your local computer and what to change to be able to send and receive messages from it. We'll end this series by running a simple cluster of EventStore instances on 3 different Ubuntu machines.