Building a serverless blog on Azure Functions


Posted on 11/29/2020


Introduction

In this blog post, I will go through the steps of developing a serverless blog engine.

Why did I write my own blog engine? Aren’t there plenty of them?

Firstly, I simply wanted to build one myself. Secondly, I did not find many good examples of a serverless blog.

So why would you want to go serverless at all?

  • No server management necessary
  • You are charged only for what you use (e.g. per request)
  • Scales extremely well
  • Event-based
  • Reduced time-to-market

Architecture

The application consists of the following primary components:

  • Engine (for managing the posts)
  • Frontend (presenting the HTML content)

Architecture Overview

The goal of this project is to make the blog as cost-efficient as possible while ensuring scalability.

Frontend

The frontend will be served by Azure Functions. Functions integrate well with Azure Blob Storage, which makes loading content from blob storage easy. We will use this to store and load metadata and blog posts. To reduce the load on Functions and to lower latency and loading times, we will use Azure CDN to cache the content generated by Azure Functions.

Azure CDN allows us to scale and deliver content as needed while only paying for the usage of the CDN (per GB of traffic).
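For the CDN to actually cache the generated pages, the Function should emit a Cache-Control header on its responses. A minimal sketch of what that could look like (the function name and the max-age of one hour are my own example values, not requirements of this project):

    // Sketch: return rendered HTML with a Cache-Control header so that
    // Azure CDN (and browsers) may cache the response. The max-age value
    // is an arbitrary example; tune it to how often your content changes.
    public static IActionResult ServePage(HttpRequest req, string html)
    {
        req.HttpContext.Response.Headers["Cache-Control"] = "public, max-age=3600";
        return new ContentResult
        {
            Content = html,
            ContentType = "text/html",
            StatusCode = 200
        };
    }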

Engine

The engine is a separate Function app which normal visitors will not use at all. It is used to create, edit, and process blog posts. Blog posts are saved to Azure Blob Storage as Markdown and then converted into HTML files by Azure Functions.

Azure Active Directory is used to authenticate requests to the engine.
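When App Service Authentication ("Easy Auth") is enabled with Azure Active Directory, authenticated requests carry the caller's identity in the X-MS-CLIENT-PRINCIPAL-NAME header. A simple guard inside an engine function could look like this (a sketch; the helper name is my own):

    // Sketch: Easy Auth injects X-MS-CLIENT-PRINCIPAL-NAME for signed-in
    // users, so a missing header means the request was not authenticated.
    private static bool IsAuthenticated(HttpRequest req)
    {
        return req.Headers.ContainsKey("X-MS-CLIENT-PRINCIPAL-NAME");
    }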

Infrastructure

The central part of the application is the frontend. It consists of an Azure Function and the static content on blob storage (posts, images, etc.). The frontend will be served through Azure CDN.

Azure Content Delivery Network (Azure CDN)

A CDN is a network of content/cache servers distributed around the world that serves content as close to the users as possible. This results in lower latency, as users connect to the nearest point of presence (POP) instead of a central server.

Azure CDN also manages TLS certificates. We can either provide our own certificate or let Azure CDN create and manage one for us (for free).

Azure Storage Account

Azure Storage comprises several sub-services:

  • Blob storage
  • Table storage
  • Queue storage
  • File storage (SMB)
  • Azure Disks

We will use all of them in this project except file and disk storage.

Azure Storage is a completely managed storage platform that is encrypted at rest by default. The storage API is accessible from anywhere in the world (by default) using HTTP or HTTPS. Client libraries are available for various languages (e.g. .NET, Java, Python).

Azure Storage comes with different redundancy options we can choose from.

Blob storage

Blob storage is used to host the blog posts.

Table storage

Table storage is used to store metadata (e.g. post publish date, tags, etc).
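The metadata can be modeled as a table entity. A hypothetical sketch using the TableEntity base class from the classic storage SDK (the property names are my own, not taken from this project):

    // Hypothetical post metadata entity; PartitionKey and RowKey are
    // inherited from TableEntity. A fixed partition key keeps all posts
    // in one partition, which is fine for a small blog.
    public class PostEntity : TableEntity
    {
        public PostEntity() { }

        public PostEntity(string slug)
        {
            PartitionKey = "posts";
            RowKey = slug; // the slug uniquely identifies a post
        }

        public string Title { get; set; }
        public DateTime Published { get; set; }
        public string Tags { get; set; } // e.g. comma-separated
    }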

Instead of table storage, we could also use Azure Cosmos DB, which offers better scalability and additional premium features.

Queue storage

Queue storage is used to provide basic messaging functionality (e.g. when a new post is published).

Instead of queue storage, we could also use Azure Service Bus, which supports, for example, message ordering (FIFO) and different receive modes. As this project does not rely heavily on messaging (yet), we are fine with queue storage.

Azure Functions

Azure Functions is a Function as a Service (FaaS) offering which provides (almost) endless scale while you pay only for execution. On top of that, Functions has an integrated programming model based on triggers and bindings, which helps respond to events and integrate with other Azure platform services (e.g. Blob Storage, Cosmos DB).
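To illustrate the trigger/binding model, the following sketch reads a blob through an input binding when an HTTP request arrives. The route, container name, and function name are hypothetical examples, not part of this project:

    // Sketch: HTTP trigger + blob input binding. The Functions runtime
    // resolves {name} from the route and injects the blob's content as
    // a string; no storage SDK calls are needed in the function body.
    [FunctionName("GetGreeting")]
    public static IActionResult GetGreeting(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = "greet/{name}")] HttpRequest req,
        [Blob("samples/{name}.txt", FileAccess.Read, Connection = "AzureStorageConnection")] string content)
    {
        return new OkObjectResult(content);
    }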

Costs

Calculating the cost is a bit difficult as we don't know the exact usage of our blog. The following estimate should give you an idea of the bill that we could get at the end of the month.

Service                            Cost                                   Amount        Sum
Azure Blob Storage (LRS) data      €0.00069/GB per month                  1 GB          €0.00069
Azure Blob Storage transactions    €0.0037 per 10,000 read operations     10,000        €0.0037
Azure CDN                          €0.0684 per GB (Zone 1)                4 GB          €0.2736
Azure Functions execution time     €0.000014/GB-s (400,000 GB-s free)     150,000 GB-s  €0.00
Azure Functions executions         €0.169 per million (1 million free)    75,000        €0.00

This is just a rough estimation which should give you an idea. Don't take this for granted ;-) Review your costs regularly (or set up budget alerts)!

Setup

The whole source code can be found on GitHub.

Setting up the infrastructure

If you don't have an Azure subscription yet, you can sign up for an account here.

If you already have an account, you are good to go. You can either deploy the infrastructure using the following method or fork the repository and set up your own CI/CD pipelines (recommended).

Deploy to Azure

After the infrastructure is deployed, you should be able to create your first post. Open the blog-engine function in the Azure Portal and open the function's URL: https://(yourfunctionname).azurewebsites.net/Add

How it works

The frontend

The user flows can be simplified like this:

User visits index -> Function call to retrieve blog posts -> displays index page with blog posts.

  • Azure Functions reads the post metadata from table storage
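Fetching the index metadata could look roughly like this, using the classic table SDK (the helper name is my own, and DynamicTableEntity is used to avoid depending on a concrete entity class):

    // Sketch: read all post metadata entities from a table, following
    // continuation tokens so that large result sets are fully drained.
    private static async Task<List<DynamicTableEntity>> LoadPostMetadataAsync(CloudTable table)
    {
        var results = new List<DynamicTableEntity>();
        var query = new TableQuery<DynamicTableEntity>();
        TableContinuationToken token = null;
        do
        {
            var segment = await table.ExecuteQuerySegmentedAsync(query, token);
            results.AddRange(segment.Results);
            token = segment.ContinuationToken;
        } while (token != null);
        return results;
    }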

User selects post -> Function call to load blog post -> displays blog post page with content.

  • Azure Functions loads the content from Azure Blob Storage and the metadata from table storage
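Serving a single post can again rely on bindings: the rendered HTML is bound directly from the published container. A sketch (the route and function name are my own assumptions):

    // Sketch: serve a rendered post. The {slug} route parameter selects
    // the matching HTML blob; the binding injects null if it is missing.
    [FunctionName("GetPost")]
    public static IActionResult GetPost(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "post/{slug}")] HttpRequest req,
        [Blob("published/{slug}.html", FileAccess.Read, Connection = "AzureStorageConnection")] string html)
    {
        if (html == null)
        {
            return new NotFoundResult();
        }
        return new ContentResult { Content = html, ContentType = "text/html", StatusCode = 200 };
    }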

The engine

The user flows can be simplified like this:

The content editor creates or edits a post -> the Function saves the post as Markdown to blob storage and the metadata to table storage -> Azure Functions converts the Markdown to HTML.

  • Azure Functions saves to blob storage and table storage, and creates a queue message
  • Azure Functions reacts to the queue message and converts the Markdown to HTML

When a blog post is saved, the "Save" Function saves the post as Markdown to a blob container. It also creates a message in the queue, to which the "RenderPost" Function reacts.

    [FunctionName(nameof(Save))]
    public static async Task<IActionResult> Save(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)] HttpRequest req,
        [Queue("created", Connection = "AzureStorageConnection")] CloudQueue queue,
        [Blob("posts", FileAccess.ReadWrite, Connection = "AzureStorageConnection")] CloudBlobContainer container)
    {
        string slug = req.Query["slug"];
        if (string.IsNullOrWhiteSpace(slug))
        {
            return new BadRequestObjectResult("slug cannot be empty");
        }

        // Save the raw Markdown from the request body to the "posts" container.
        var blobRef = container.GetBlockBlobReference(slug + ".md");
        await blobRef.UploadFromStreamAsync(req.Body);
        blobRef.Properties.ContentType = "text/markdown";
        await blobRef.SetPropertiesAsync();

        // Notify the renderer that a new or updated post is available.
        await queue.AddMessageAsync(new CloudQueueMessage(slug));

        return new OkObjectResult(slug);
    }

The following code then converts the Markdown into HTML and uploads it to blob storage. I am using Markdig for this. The function is triggered as soon as a message arrives in the "created" queue.

    [FunctionName("RenderPost")]
    public static async Task RenderPost(
        [QueueTrigger("created", Connection = "AzureStorageConnection")] string slug,
        [Blob("posts/{queueTrigger}.md", FileAccess.Read, Connection = "AzureStorageConnection")] string postContent,
        [Blob("published", FileAccess.Write, Connection = "AzureStorageConnection")] CloudBlobContainer container,
        ILogger log)
    {
        log.LogInformation($"Processing post: {slug}");

        // Convert the Markdown to HTML using Markdig.
        MarkdownPipeline pipeline = new MarkdownPipelineBuilder().UsePipeTables().UseBootstrap().Build();
        string html = Markdown.ToHtml(postContent, pipeline);

        // Upload the rendered HTML to the "published" container.
        var blobRef = container.GetBlockBlobReference(slug + ".html");
        await blobRef.UploadTextAsync(html).ConfigureAwait(false);

        blobRef.Properties.ContentType = "text/html";
        await blobRef.SetPropertiesAsync().ConfigureAwait(false);
    }

Further enhancements

The backend doesn't support uploading media files at the moment and offers few editor features so far.

The frontend code isn't too nice either at the moment. Especially with a large number of posts, the index page will load a lot of posts and might become slow. Pagination would probably solve this issue. Additionally, the frontend should offer the option to create comments under a post.

With Azure Cognitive Services, we could also add text-to-speech to offer audio versions of the posts.

We could also automatically create a tweet whenever a new blog post is published.

Closing thoughts

Serverless is a cool technology that allows you to innovate fast while being extremely cost-efficient compared to, for example, web apps or virtual machines.