Event-Driven App with Dapr & Azure Event Hubs


In this blog post, I will walk through how to integrate Dapr (Distributed Application Runtime) with Azure Event Hubs on Kubernetes.

What is Dapr?

Dapr is a portable, event-driven runtime that makes it easy for developers to build resilient, stateless and stateful microservice applications that run on the cloud and edge.

Dapr consists of a set of building blocks, accessed over standard HTTP or gRPC APIs, that can be called from any programming language. These building blocks provide the capabilities needed for building microservice applications, such as service invocation, state management, pub/sub, event-driven resource bindings, virtual actors, and distributed tracing.

{% github dapr/dapr no-readme %}

Key features of Dapr and its building blocks

  • Pluggable component model
  • Standard APIs using HTTP and gRPC
  • Portability across different supported infrastructures
  • Extensibility
  • Ease of integration
  • Runs anywhere, as a process or in a Docker container
  • Dynamically choose between different implementations

Dapr provides a dedicated CLI for easy debugging, and clients for Java, .NET, Go, JavaScript, Python, Rust, and C++.


In event-driven applications, your app can handle events from external systems or trigger events in external systems. Input and Output bindings help you to receive and send events through Dapr runtime abstraction.

Bindings remove the complexity of integrating with external systems such as queues and message buses, and help developers focus on business logic. You can switch between bindings at runtime and keep the code free of vendor SDKs or libraries, creating portable applications.
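To make the abstraction concrete, here is a minimal sketch (not from the original repo) of how an app could invoke a Dapr output binding over plain HTTP, with no vendor SDK involved. It assumes the Dapr sidecar's default HTTP port of 3500 and a hypothetical binding named `send-event`:

```javascript
// Build the HTTP request that targets Dapr's bindings API on the sidecar.
// Dapr exposes output bindings at POST /v1.0/bindings/<name>.
function bindingRequest(bindingName, data, daprPort = 3500) {
    return {
        url: `http://localhost:${daprPort}/v1.0/bindings/${bindingName}`,
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        // "operation" tells the binding what to do; "create" sends an event.
        body: JSON.stringify({ data, operation: 'create' }),
    };
}

// Example: send an order event through a binding named "send-event".
const req = bindingRequest('send-event', { msg: 'Order received' });
console.log(req.url); // http://localhost:3500/v1.0/bindings/send-event
```

Because the app only speaks HTTP to its local sidecar, swapping the binding component (Event Hubs, Kafka, a queue) requires no application code change.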


Getting started with Dapr on Kubernetes

Install Dapr on the Kubernetes cluster using the Helm chart. You can install it on minikube or on a managed Kubernetes cluster such as AKS.

  • Make sure Helm 3 is installed on your machine.

  • Add the daprio Azure Container Registry as a Helm repo.

    $ helm repo add dapr https://daprio.azurecr.io/helm/v1/repo
    $ helm repo update
  • Create the dapr-system namespace on your Kubernetes cluster.

    $ kubectl create namespace dapr-system
  • Install the Dapr chart on your cluster in the dapr-system namespace

    $ helm install dapr dapr/dapr --namespace dapr-system
    $ kubectl get pods -n dapr-system -w   # verify the pods are created

Setup Kafka Enabled Azure Event Hubs

The first step is to create the Azure Event Hubs namespace and an event hub in the Azure Portal, and get the Event Hub connection string.

After you complete these steps from the Microsoft Docs, you will have:

  1. Event Hub Connection String
  2. Event Hub Namespace Name

You need these to configure the input bindings.
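The namespace name is also embedded in the connection string itself, so you can extract it instead of copying it separately. A small sketch, assuming the standard `Endpoint=sb://<namespace>.servicebus.windows.net/;...` connection-string format (the sample string below is purely illustrative):

```javascript
// Pull the namespace name out of an Event Hubs connection string.
function parseEventHubNamespace(connectionString) {
    const match = connectionString.match(/Endpoint=sb:\/\/([^.]+)\.servicebus\.windows\.net/);
    if (!match) {
        throw new Error('Unrecognized Event Hubs connection string');
    }
    return match[1];
}

// Hypothetical connection string for illustration:
const ns = parseEventHubNamespace(
    'Endpoint=sb://my-namespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=abc123'
);
console.log(ns); // my-namespace
```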


In this demo, we are going to create an application that pushes events to Azure Event Hubs and a microservice that receives those events through a Dapr input binding.

The event hub consumer microservice is deployed into the Kubernetes cluster. The Dapr input binding with the Azure Event Hubs spec triggers the consumer's HTTP endpoint whenever an event arrives from Event Hubs.

To demo the portability, we are going to change the Azure Event Hubs input binding spec to the Kafka input binding spec.

{% github ksivamuthu/dapr-demo no-readme %}

git clone https://github.com/ksivamuthu/dapr-demo.git

Event Hub Producer

The Event Hub producer application, eventhub-producer, uses the Azure Event Hubs Node.js SDK to connect to Azure Event Hubs.

Configure the connection string and event hub name in a .env file or in your environment variables so the Node.js process can load them.
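A minimal sketch of loading and validating those settings up front, so a missing value fails fast instead of deep inside the SDK. The variable names `EVENTHUB_CONNECTION_STRING` and `EVENTHUB_NAME` are illustrative assumptions, not necessarily the keys used in the original repo:

```javascript
// Validate the two settings the producer needs before creating the client.
function loadEventHubConfig(env) {
    const connectionString = env.EVENTHUB_CONNECTION_STRING;
    const eventHubName = env.EVENTHUB_NAME;
    if (!connectionString || !eventHubName) {
        throw new Error('Set EVENTHUB_CONNECTION_STRING and EVENTHUB_NAME');
    }
    return { connectionString, eventHubName };
}

// In the real app this would be loadEventHubConfig(process.env);
// a sample object is used here for illustration.
const config = loadEventHubConfig({
    EVENTHUB_CONNECTION_STRING: 'Endpoint=sb://my-namespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=abc123',
    EVENTHUB_NAME: 'orders'
});
console.log(config.eventHubName); // orders
```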


Then, create an EventHubProducerClient to send individual events to Azure Event Hubs, with a two-second delay between events for this demo.

const { EventHubProducerClient } = require('@azure/event-hubs');

// Promise-based sleep used to space out the demo events.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function main() {
    // connectionString and eventHubName are loaded from the environment (see above).
    const producer = new EventHubProducerClient(connectionString, eventHubName);

    const events = [
        'Order received',
        'Being prepared',
        'On the way',
        'Delivered'
    ];

    for (let index = 0; index < events.length; index++) {
        console.log(`Sending event: ${events[index]}`);
        const batch = await producer.createBatch();
        batch.tryAdd({ body: { msg: events[index] } });
        await producer.sendBatch(batch);
        await delay(2000);
    }

    await producer.close();
}

main().catch(console.error);

Event Hub Consumer

Event Hub Consumer is the microservice application that is going to be deployed in the Kubernetes cluster.

It's a simple Express application with a POST endpoint that logs the data it receives.

const express = require('express')
const app = express()
const port = 3000

app.use(express.json())

app.post('/receive-event', (req, res) => {
    console.log(req.body)
    res.status(200).send()
})

app.listen(port, () => console.log(`Kafka consumer app listening on port ${port}!`))

Deploy the app in Kubernetes

To build the container image for the eventhub-consumer application:

cd bindings/eventhub-consumer
docker build -t eventhub-consumer:latest .

Deploy the app using a YAML file that contains the Deployment and Service objects.

kubectl apply -f bindings/yamls/eventhub-consumer.yaml

In the deployment YAML file, note the annotations the Dapr runtime requires to know which app and port the input binding should call.

apiVersion: apps/v1
kind: Deployment
metadata:
  name: eventhub-consumer
  labels:
    app: eventhub-consumer
spec:
  replicas: 1
  selector:
    matchLabels:
      app: eventhub-consumer
  template:
    metadata:
      labels:
        app: eventhub-consumer
      annotations:
        dapr.io/enabled: "true"
        dapr.io/id: "eventhub-consumer"
        dapr.io/port: "3000"
    spec:
      containers:
      - name: node
        image: eventhub-consumer:latest
        ports:
        - containerPort: 3000
        imagePullPolicy: IfNotPresent

Azure Event Hub Input Binding

Now it's time to integrate the producer and consumer through Dapr runtime. There are different specs available for input binding. In this demo, we are going to create the input binding with Azure Event Hub spec.

Update the connection string in the below YAML with the Azure Event Hubs connection string. The metadata.name matches the POST endpoint route of the eventhub-consumer application.

Apply this YAML to your Kubernetes cluster.

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: receive-event
spec:
  type: bindings.azure.eventhubs
  metadata:
  - name: connectionString
    value: CONNECTION_STRING # replace

Run the eventhub-producer application to send the events, and watch the logs of the eventhub-consumer pod to see the received events.

cd eventhub-producer
node index.js

# and in another tab
kubectl logs -f <pod_name> -c <container_name>

You should see output like this:

// eventhub-producer
Sending event: Order received
Sending event: Being prepared
Sending event: On the way
Sending event: Delivered
// eventhub-consumer kubernetes logs
Kafka consumer app listening on port 3000!
{ msg: 'Order received' }
{ msg: 'Being prepared' }
{ msg: 'On the way' }
{ msg: 'Delivered' }

Kafka Input Binding

This shows how Dapr abstracts the runtime binding: the eventhub-consumer is not aware of any vendor libraries, and everything is abstracted behind HTTP/gRPC endpoints.

Now, we are going to switch the input binding from Azure Event Hubs to Kafka. For the brevity of this demo, the same Azure Event Hub is used, but this time through the Kafka API of Azure Event Hubs.

So what will change? - Only the input binding.

Delete the Azure Event Hub Input binding spec in kubernetes.

kubectl delete -f yamls/eventhub-input-binding.yaml

Create the Kafka input binding spec in Kubernetes. Use the below YAML to configure the Kafka binding with the Azure Event Hubs connection string and namespace, replacing these placeholders:

  • BROKER_URL - <namespace>.servicebus.windows.net:9093, the Kafka endpoint of your Event Hubs namespace
  • EVENTHUB_NAMESPACE - the Event Hub namespace name
  • CONNECTION_STRING - the Event Hub connection string

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: receive-event
spec:
  type: bindings.kafka
  metadata:
  - name: brokers
    value: BROKER_URL # replace
  - name: topics
    value: EVENTHUB_NAMESPACE # replace
  - name: consumerGroup
    value: "$Default"
  - name: authRequired
    value: "true"
  - name: saslUsername
    value: "$ConnectionString"
  - name: saslPassword
    value: CONNECTION_STRING # replace

Run the eventhub-producer application again and verify that you get the same output, now flowing through the Kafka input binding.

You should see output like this:

// eventhub-producer
Sending event: Order received
Sending event: Being prepared
Sending event: On the way
Sending event: Delivered
// eventhub-consumer kubernetes logs
Kafka consumer app listening on port 3000!
{ msg: 'Order received' }
{ msg: 'Being prepared' }
{ msg: 'On the way' }
{ msg: 'Delivered' }


In this post, you saw how to use Dapr bindings to connect Azure Event Hubs and the portability of using different binding specs without changing the application code.

Stay tuned for more posts on Dapr and Kubernetes. Like, follow, and comment, or DM me on Twitter @ksivamuthu. Your feedback is most welcome.