Implement Azure functions

Last Updated on August 12, 2021 by Alam Mohammed

Implement Azure functions is part of the Develop Azure compute solutions topic area, which carries a weight of 25-30% in the exam. This training post is designed to give readers a better understanding of the topic.

Disclaimer: This is not a training article to help complete the Microsoft Azure AZ-204, but it provides a good insight into the areas within these topics. Labs and hands-on work are essential to passing most Microsoft Azure exams.

Implement Azure functions:
Azure Functions Overview

Microsoft Azure Functions is a powerful solution for processing bulk data, integrating systems, working with the Internet of Things (IoT), and building simple APIs and microservices.

Implement Azure functions: Azure Functions

Azure Functions is a solution for easily running small pieces of code, or “functions,” in the cloud. You can write just the code you need for the problem at hand, without worrying about a whole application or the infrastructure to run it. Functions can make development even more productive, and you can use your development language of choice, such as C#, F#, Node.js, Java, or PHP. Pay only for the time your code runs and trust Azure to scale as needed. Azure Functions lets you develop serverless applications on Microsoft Azure.

Functions is a great solution for processing data, integrating systems, working with the internet-of-things (IoT), and building simple APIs and microservices. Consider Functions for tasks such as image or order processing, file maintenance, or any tasks that you want to run on a schedule.
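As a minimal sketch of how little code a function needs (assuming the C# in-process programming model; the function name and route are illustrative), an HTTP-triggered function can be a single method:

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class HelloFunction
{
    // Runs whenever an HTTP GET or POST request reaches /api/HelloFunction.
    [FunctionName("HelloFunction")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("HTTP trigger received a request.");
        string name = req.Query["name"];
        return new OkObjectResult($"Hello, {name ?? "Azure Functions"}!");
    }
}

Azure handles the hosting and scaling; you supply, and pay for, only the method body.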

Function integrations

Azure Functions integrates with various Azure and third-party services. These services can trigger your function and start execution, or they can serve as input and output for your code.

Implement Azure functions:
implement input and output bindings for a function

Input and Output Bindings

Input and output bindings provide a declarative way to connect to data from within your code. Bindings are optional and an Azure Function can have multiple input and output bindings.

Bindings

Triggers and bindings let you avoid hard-coding the details of the services that you’re working with. Your function receives data (for example, the content of a queue message) in function parameters. You send data (for example, to create a queue message) by using the return value of the function. In C# and C# script, alternative ways to send data are out parameters and collector objects.
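For example (a hedged C# sketch, not from the original article; the queue name "orders" is an assumption), a function can send data to a storage queue simply by returning a value through an output binding:

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class EnqueueOrder
{
    // The return value is written to the "orders" storage queue by the output binding.
    [FunctionName("EnqueueOrder")]
    [return: Queue("orders")]
    public static async Task<string> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        // The request body becomes the queue message payload.
        using var reader = new StreamReader(req.Body);
        return await reader.ReadToEndAsync();
    }
}

In C#, the same message could instead be written through an out string parameter or an ICollector<string>/IAsyncCollector<string> collector object.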

Triggers

A trigger defines how a function is invoked. A function must have exactly one trigger. Triggers have associated data, which is usually the payload that triggers the function.

There are many types of triggers for Azure services including:

  • HTTPTrigger – Trigger the execution of your code by using an HTTP request.
  • TimerTrigger – Execute cleanup or other batch tasks on a predefined schedule.
  • GitHub webhook – Respond to events that occur in your GitHub repositories.
  • Generic webhook – Process webhook HTTP requests from any service that supports webhooks.
  • CosmosDBTrigger – Process Azure Cosmos DB documents when they are added or updated in collections in a NoSQL database.
  • BlobTrigger – Process Azure Storage blobs when they are added to containers. You might use this function for image resizing.
  • QueueTrigger – Respond to messages as they arrive in an Azure Storage queue.
  • EventHubTrigger – Respond to events delivered to an Azure Event Hub. Particularly useful in application instrumentation, user experience, or workflow processing, and Internet of Things (IoT) scenarios.
  • ServiceBusQueueTrigger – Connect your code to other Azure services or on-premises services by listening to message queues.
  • ServiceBusTopicTrigger – Connect your code to other Azure services or on-premises services by subscribing to topics.

Trigger and Bindings example
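A hedged example combining a trigger with an output binding (the container and queue names are assumptions): the function runs whenever a blob is added to the uploads container and records the blob name on a queue for downstream processing.

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessUpload
{
    // Trigger: runs when a new blob lands in the "uploads" container.
    // Output binding: writes a message to the "processed-files" queue.
    [FunctionName("ProcessUpload")]
    public static void Run(
        [BlobTrigger("uploads/{name}")] Stream blob,
        string name,
        [Queue("processed-files")] out string message,
        ILogger log)
    {
        log.LogInformation($"Processing blob {name} ({blob.Length} bytes)");
        message = name; // queued for the next stage
    }
}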

 

Integrating with Azure Virtual Network

Under the Premium plan, Azure Functions has the same hosting capabilities as the Web Apps in Azure App Service. This means that Azure Functions also has the Virtual Network Integration feature.

When Virtual Network Integration is used with an Azure Virtual Network in the same region as your app, it requires a delegated subnet with at least 32 addresses, and that subnet cannot be used for anything else. Outbound calls from your app are made from addresses in the delegated subnet, so they originate from within your Virtual Network.

Best practices

Avoid long running functions

Large, long-running functions can cause unexpected time-out issues. A function can become large due to many Node.js dependencies. Importing dependencies can also cause increased load times that result in unexpected time-outs. Dependencies are loaded both explicitly and implicitly. A single module loaded by your code may load its own additional modules.

Cross function communication

Durable Functions and Azure Logic Apps are built to manage state transitions and communication between multiple functions. If you are not using Durable Functions or Logic Apps to integrate multiple functions, it is generally a best practice to use storage queues for cross-function communication. The main reason is that storage queues are less costly and much easier to provision.
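As a hedged sketch of that practice (queue and function names are illustrative), one function can hand work to another through a storage queue instead of calling it directly:

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class OrderHandOff
{
    // Producer: receives an order over HTTP and queues it for the next function.
    [FunctionName("AcceptOrder")]
    public static async Task AcceptOrder(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        [Queue("orders-to-fulfil")] IAsyncCollector<string> queue)
    {
        using var reader = new StreamReader(req.Body);
        await queue.AddAsync(await reader.ReadToEndAsync());
    }

    // Consumer: picks up each queued order on its own schedule, with built-in retries.
    [FunctionName("FulfilOrder")]
    public static void FulfilOrder(
        [QueueTrigger("orders-to-fulfil")] string order,
        ILogger log)
    {
        log.LogInformation($"Fulfilling order: {order}");
    }
}

The queue decouples the two functions, so each can scale and retry independently.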

Write functions to be stateless

Functions should be stateless and idempotent if possible. Associate any required state information with your data. For example, an order being processed would likely have an associated state member. A function could process an order based on that state while the function itself remains stateless.
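To illustrate (the types and queue names below are hypothetical), the state travels with the message rather than living in the function, and a state check keeps a re-delivered message from being processed twice:

using Microsoft.Azure.WebJobs;

public enum OrderState { Received, PaymentTaken, Shipped }

public class Order
{
    public string Id { get; set; }
    public OrderState State { get; set; }   // state lives with the data
}

public static class TakePayment
{
    // The function holds no state of its own; if the same message is delivered
    // twice, the State check makes the second run a no-op (idempotent).
    [FunctionName("TakePayment")]
    [return: Queue("orders-paid")]
    public static Order Run([QueueTrigger("orders-received")] Order order)
    {
        if (order.State == OrderState.Received)
        {
            // charge the customer here...
            order.State = OrderState.PaymentTaken;
        }
        return order;
    }
}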

Write defensive functions

Assume that your function could encounter an exception at any time. Design your functions with the ability to continue from a previous fail point during the next execution. 

Implement Azure functions:
implement function triggers by using data operations, timers, and webhooks

Durable Function scenario – Monitoring

The monitor pattern refers to a flexible recurring process in a workflow—for example, polling until certain conditions are met. A regular timer-trigger can address a simple scenario, such as a periodic cleanup job, but its interval is static and managing instance lifetimes becomes complex. Durable Functions enables flexible recurrence intervals, task lifetime management, and the ability to create multiple monitor processes from a single orchestration.

An example would be reversing the async HTTP API scenario described below. Instead of exposing an endpoint for an external client to monitor a long-running operation, the long-running monitor consumes an external endpoint, waiting for some state change.

Using Durable Functions, multiple monitors that observe arbitrary endpoints can be created in a few lines of code. The monitors can end execution when some condition is met, or be terminated by the DurableOrchestrationClient, and their wait interval can be changed based on some condition (for example, exponential backoff). The code below implements a basic monitor.

Durable Function scenario – Monitoring code
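A hedged reconstruction of a basic monitor, using the Durable Functions 1.x API referenced elsewhere in this post (the activity names GetJobStatus and SendAlert, and the timings, are assumptions):

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class JobMonitor
{
    [FunctionName("MonitorJobStatus")]
    public static async Task Run(
        [OrchestrationTrigger] DurableOrchestrationContext ctx)
    {
        int jobId = ctx.GetInput<int>();
        DateTime expiryTime = ctx.CurrentUtcDateTime.AddHours(6);

        while (ctx.CurrentUtcDateTime < expiryTime)
        {
            // Poll the external endpoint through an activity function.
            string status = await ctx.CallActivityAsync<string>("GetJobStatus", jobId);
            if (status == "Completed")
            {
                await ctx.CallActivityAsync("SendAlert", jobId);
                break;
            }

            // Sleep until the next check; widening this interval on each pass
            // would give exponential backoff.
            DateTime nextCheck = ctx.CurrentUtcDateTime.AddMinutes(5);
            await ctx.CreateTimer(nextCheck, CancellationToken.None);
        }
    }
}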

Durable Function scenario – Async HTTP APIs

This pattern is all about the problem of coordinating the state of long-running operations with external clients. A common way to implement this pattern is by having the long-running action triggered by an HTTP call, and then redirecting the client to a status endpoint that they can poll to learn when the operation completes.

Durable Functions provides built-in APIs that simplify the code you write for interacting with long-running function executions. After an instance is started, the extension exposes webhook HTTP APIs that query the Orchestrator function status. The following example shows the REST commands to start an Orchestrator and to query its status. For clarity, some details are omitted from the example.
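As a hedged illustration of the shape of those calls (the host name, starter route, and instance ID are placeholders, and authentication query parameters are omitted), a client starts the orchestration and then polls the status URL returned in the 202 response:

POST https://myfunctionapp.azurewebsites.net/api/orchestrators/DoWork
HTTP/1.1 202 Accepted
Location: https://myfunctionapp.azurewebsites.net/runtime/webhooks/durabletask/instances/abc123

GET https://myfunctionapp.azurewebsites.net/runtime/webhooks/durabletask/instances/abc123
HTTP/1.1 202 Accepted
{ "runtimeStatus": "Running", "lastUpdatedTime": "..." }

GET https://myfunctionapp.azurewebsites.net/runtime/webhooks/durabletask/instances/abc123
HTTP/1.1 200 OK
{ "runtimeStatus": "Completed", "output": "...", "lastUpdatedTime": "..." }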

A related pattern is human interaction with a time-out, such as an approval workflow: the durable timer is created by calling ctx.CreateTimer, the approval notification is received by ctx.WaitForExternalEvent, and Task.WhenAny decides whether to escalate (the time-out fires first) or to process the approval (the approval arrives before the time-out).
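A hedged sketch of that pattern follows (the 72-hour window and the activity names RequestApproval, ProcessApproval, and Escalate are assumptions):

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class ApprovalWorkflow
{
    [FunctionName("ApprovalWorkflow")]
    public static async Task Run(
        [OrchestrationTrigger] DurableOrchestrationContext ctx)
    {
        await ctx.CallActivityAsync("RequestApproval", null);

        using (var cts = new CancellationTokenSource())
        {
            // Durable timer: fires if nobody responds within 72 hours.
            DateTime dueTime = ctx.CurrentUtcDateTime.AddHours(72);
            Task durableTimeout = ctx.CreateTimer(dueTime, cts.Token);

            // External event: raised when a person approves or rejects the request.
            Task<bool> approvalEvent = ctx.WaitForExternalEvent<bool>("ApprovalEvent");

            // Whichever task completes first decides the outcome.
            if (approvalEvent == await Task.WhenAny(approvalEvent, durableTimeout))
            {
                cts.Cancel();
                await ctx.CallActivityAsync("ProcessApproval", approvalEvent.Result);
            }
            else
            {
                await ctx.CallActivityAsync("Escalate", null);
            }
        }
    }
}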

Implement Azure functions:
implement Azure Durable Functions

Durable Functions

Durable Functions is an extension of Azure Functions and Azure WebJobs that lets you write stateful functions in a serverless environment. The extension manages state, checkpoints, and restarts for you.

The extension lets you define stateful workflows in a new type of function called an Orchestrator function. Here are some of the advantages of Orchestrator functions:

  • They define workflows in code. No JSON schemas or designers are needed.
  • They can call other functions synchronously and asynchronously. Output from called functions can be saved to local variables.
  • They automatically checkpoint their progress whenever the function awaits. Local state is never lost if the process recycles or the virtual machine (VM) reboots.

Note: Durable Functions is an advanced extension for Azure Functions that is not appropriate for all applications. The rest of this section assumes that you have a strong familiarity with Azure Functions concepts and the challenges involved in serverless application development.

The primary use case for Durable Functions is simplifying complex, stateful coordination problems in serverless applications. The following sections describe some typical application patterns that can benefit from Durable Functions.

Durable Function scenario – Chaining

Function chaining refers to executing a sequence of functions in a particular order. Often, the output of one function needs to be applied to the input of another function.

Durable Function scenario – Chaining code
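A hedged sketch matching the description below (F1 through F4 stand for activity functions defined elsewhere in the function app):

using System;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class ChainingExample
{
    [FunctionName("Chaining")]
    public static async Task<object> Run(
        [OrchestrationTrigger] DurableOrchestrationContext ctx)
    {
        try
        {
            // Each output feeds the next call; progress is checkpointed at every await.
            var x = await ctx.CallActivityAsync<object>("F1", null);
            var y = await ctx.CallActivityAsync<object>("F2", x);
            var z = await ctx.CallActivityAsync<object>("F3", y);
            return await ctx.CallActivityAsync<object>("F4", z);
        }
        catch (Exception)
        {
            // Error handling or compensation logic goes here.
            throw;
        }
    }
}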

The values “F1”, “F2”, “F3”, and “F4” are the names of other functions in the function app. Control flow is implemented by using normal imperative coding constructs. That is, code executes top down and can involve existing language control flow semantics, like conditionals and loops. Error handling logic can be included in try/catch/finally blocks.

The ctx parameter (DurableOrchestrationContext) provides methods for invoking other functions by name, passing parameters, and returning function output. Each time the code calls await, the Durable Functions framework checkpoints the progress of the current function instance. If the process or VM recycles midway through the execution, the function instance resumes from the previous await call.  We will cover more on this restart behavior later.

More topics on Develop Azure compute solutions:

Create Azure App Service Web Apps

Implement IaaS solutions

Microsoft Azure AZ-204 exam topics:

If you have covered the current topics in Develop Azure compute solutions, then you can have a look at the other topic areas:

View the full documentation: Microsoft Azure AZ-204 exam content from Microsoft

 
