
Event Subscriptions

This guide shows you how to add Azure Service Bus event subscription capabilities to your existing .NET application.

Aspire Publish Alternative

Starting in Forge 3.3, Azure Functions subscription projects are auto-detected by Aspire publish when using the SAIF.Platform.Aspire.Hosting package. Add them with AddSubscriptionService<TProject>() in your AppHost and pipeline YAML is generated automatically. See Aspire Publish for details.
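For reference, the Aspire publish wiring might look roughly like the sketch below. This is a sketch, not a definitive implementation: it assumes the SAIF.Platform.Aspire.Hosting package (Forge 3.3+), and the project type name Projects.MyApp_Subscriptions is illustrative.

```csharp
// Sketch only — assumes the SAIF.Platform.Aspire.Hosting package (Forge 3.3+).
// The project type name Projects.MyApp_Subscriptions is illustrative.
var builder = DistributedApplication.CreateBuilder(args);

var backend = builder.AddApi();

// Auto-detected by Aspire publish; the subscription pipeline YAML is generated for you.
builder.AddSubscriptionService<Projects.MyApp_Subscriptions>();

builder.Build().Run();
```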

📋 Prerequisites

  • ✅ Existing SAIF API project
  • ✅ .NET 10.0 SDK
  • ✅ Docker Desktop
  • ✅ Azure DevOps access

🚀 Quick Start

1. Add the Feature

Using SAIF CLI:

saif new saif-feature-event-subscription

Using .NET Template:

dotnet new saif-feature-event-subscription -n MyApp -o . \
  --application_name "MyApp" \
  --owner "Platform" \
  --project_id "it-api-exp-myapp" \
  --tenant "corporate"

This adds:

  • YourApp.Subscriptions - Functions project
  • YourApp.ServiceBus.Seed - Test data seeder
  • Infrastructure and pipeline files

2. Wire Up AppHost

โš ๏ธ Manual step required: Add to src/YourApp.AppHost/Program.cs:

var subscription = builder.AddSubscription();

Complete example:

var builder = DistributedApplication.CreateBuilder(args);

var backend = builder.AddApi();
var subscription = builder.AddSubscription();  // 👈 Add this

builder.Build().Run();

⚙️ Configure Subscriptions

Edit infra/sub/vars.yml with the topics and subscriptions you want to create:

application_name: YourApp
owner: Platform
project_id: it-api-exp-yourapp
function_app_project_id: it-func-yourapp
events:
  - topic: newuser # Topic name from the event service
    subscription: it-func-yourapp-newuser-subscription
  - topic: policy # Add as many topics as you need
    subscription: it-func-yourapp-policy-subscription
  - topic: accountupdated
    subscription: it-func-yourapp-accountupdated-subscription

💡 Note: Subscription names must be unique within each topic. Prefix subscription names with your function app project ID (e.g., it-func-yourapp-*); the project ID is guaranteed to be unique, so this avoids conflicts with other applications subscribing to the same topics.

🎯 Create Event Handlers

You need to create a model and trigger function for each event type in your vars.yml.

Option A: Manual POCOs (Plain Old CLR Objects)

1. Define Event Models

Create one model per event type:

src/YourApp.Subscriptions/Models/NewUserEvent.cs:

using System.Text.Json.Serialization;

namespace YourApp.Subscriptions.Models;

public class NewUserEvent
{
  [JsonPropertyName("eventId")]
  public string EventId { get; set; } = string.Empty;

  [JsonPropertyName("userId")]
  public string UserId { get; set; } = string.Empty;

  [JsonPropertyName("email")]
  public string Email { get; set; } = string.Empty;
}

💡 Tip: Use [JsonPropertyName] to match the exact property names from the event publisher.
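To sanity-check the mapping, you can deserialize a sample payload with System.Text.Json. This snippet assumes the NewUserEvent model above; the JSON keys are only examples of what a publisher might send:

```csharp
using System.Text.Json;
using YourApp.Subscriptions.Models;

// camelCase JSON keys map onto PascalCase properties via [JsonPropertyName]
var json = """{"eventId":"abc-123","userId":"user123","email":"john@example.com"}""";
var evt = JsonSerializer.Deserialize<NewUserEvent>(json);

Console.WriteLine($"{evt?.UserId} / {evt?.Email}"); // user123 / john@example.com
```

Without the attributes, the default (case-sensitive) deserialization would leave the properties empty, since the publisher's keys are camelCase.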

2. Create Trigger Functions

Create one trigger per event type:

src/YourApp.Subscriptions/Triggers/NewUserTrigger.cs:

using Azure.Messaging.ServiceBus;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using System.Text.Json;
using YourApp.Subscriptions.Models;

namespace YourApp.Subscriptions.Triggers;

public class NewUserTrigger(ILogger<NewUserTrigger> logger)
{
  [Function("NewUserTrigger")]
  public void Run(
      [ServiceBusTrigger("newuser", "it-func-yourapp-newuser-subscription", Connection = "sbnamespace")]
      ServiceBusReceivedMessage message)
  {
    logger.LogInformation("Processing user event: {MessageId}", message.MessageId);

    try
    {
      var userEvent = JsonSerializer.Deserialize<NewUserEvent>(message.Body.ToString());

      if (userEvent == null)
        throw new InvalidOperationException("Invalid message format");

      // YOUR BUSINESS LOGIC HERE
      logger.LogInformation("User: {UserId}, Email: {Email}",
        userEvent.UserId, userEvent.Email);
    }
    catch (Exception ex)
    {
      logger.LogError(ex, "Error processing message");
      throw; // Re-throw for retry
    }
  }
}

Key points:

  • Topic and subscription names must match vars.yml
  • Connection is always "sbnamespace" (unless using a custom namespace and implementation)
  • Throw exceptions to trigger retries
  • Repeat this pattern for each event type
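For example, the same pattern applied to the policy topic from vars.yml above might look like this. PolicyEvent is a hypothetical model you would define yourself, like NewUserEvent above:

```csharp
using Azure.Messaging.ServiceBus;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using System.Text.Json;
using YourApp.Subscriptions.Models;

namespace YourApp.Subscriptions.Triggers;

public class PolicyTrigger(ILogger<PolicyTrigger> logger)
{
  [Function("PolicyTrigger")]
  public void Run(
      [ServiceBusTrigger("policy", "it-func-yourapp-policy-subscription", Connection = "sbnamespace")]
      ServiceBusReceivedMessage message)
  {
    logger.LogInformation("Processing policy event: {MessageId}", message.MessageId);

    // PolicyEvent is a hypothetical model; define it like NewUserEvent above.
    var policyEvent = JsonSerializer.Deserialize<PolicyEvent>(message.Body.ToString())
        ?? throw new InvalidOperationException("Invalid message format");

    // YOUR BUSINESS LOGIC HERE
  }
}
```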

Option B: Generate Models with Kiota

Instead of manually creating POCOs, you can generate type-safe models from OpenAPI specifications using Kiota.

Benefits

  • Type-safe models automatically generated from OpenAPI specs
  • Reduced manual maintenance and errors
  • Built-in JSON serialization/deserialization
  • Keeps models in sync with event schemas
  • Automatic regeneration on build when OpenAPI spec changes

Configure Kiota Reference

The template includes a commented-out KiotaReference in the .csproj file. To enable it:

  1. Open src/YourApp.Subscriptions/YourApp.Subscriptions.csproj

  2. Uncomment the KiotaReference section at the bottom of the file:

<ItemGroup>
  <KiotaReference Include="KiotaGenerated" OpenApi="https://openapi.saif.com/{event-service-project-id}/{environment}/{openapi-spec-file-name}.yaml">
    <NamespaceName>YourApp.Subscriptions</NamespaceName>
  </KiotaReference>
</ItemGroup>

Replace:

  • {event-service-project-id} with the event service's project ID (e.g., it-api-sys-testeventing)
  • {environment} with test, qa, uat or prod
  • {openapi-spec-file-name} with the spec filename (commonly openapi or openapi.v1)

💡 Tip: You can find the correct OpenAPI URL by checking the event service's documentation or asking the event service team.

  3. Remove or comment out the pre-created Models files:

The template creates a Models folder by default. Since Kiota will generate models in the KiotaGenerated/Models folder, you can either:

  • Delete the src/YourApp.Subscriptions/Models folder, or
  • Keep it for any custom models you want to create manually alongside the generated ones

  4. Build the project to generate models:

cd src/YourApp.Subscriptions
dotnet build

The models will be automatically generated in the KiotaGenerated/Models folder during build.

Use Generated Models in Triggers

Create one trigger per event type:

src/YourApp.Subscriptions/Triggers/NewUserTrigger.cs:

using Azure.Messaging.ServiceBus;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using Microsoft.Kiota.Serialization.Json;
using YourApp.Subscriptions.Models;

namespace YourApp.Subscriptions.Triggers;

public class NewUserTrigger(ILogger<NewUserTrigger> logger)
{
  [Function("NewUserTrigger")]
  public void Run(
      [ServiceBusTrigger("newuser", "it-func-yourapp-newuser-subscription", Connection = "sbnamespace")]
      ServiceBusReceivedMessage message)
  {
    logger.LogInformation("Processing user event: {MessageId}", message.MessageId);

    try
    {
      // Deserialize using Kiota's JsonParseNodeFactory
      using var stream = new MemoryStream(message.Body.ToArray());
      var parseNode = new JsonParseNodeFactory().GetRootParseNode("application/json", stream);
      var userEvent = parseNode.GetObjectValue(NewUserEvent.CreateFromDiscriminatorValue);

      if (userEvent == null)
        throw new InvalidOperationException("Invalid message format");

      // YOUR BUSINESS LOGIC HERE
      logger.LogInformation("User: {UserId}, Email: {Email}",
        userEvent.UserId, userEvent.Email);
    }
    catch (Exception ex)
    {
      logger.LogError(ex, "Error processing message");
      throw; // Re-throw for retry
    }
  }
}

Updating Models

When your OpenAPI spec changes, the models will be automatically regenerated on the next build. Simply run dotnet build and the KiotaReference in your .csproj file ensures models stay in sync with the OpenAPI spec without manual intervention.

💡 Best Practice: Always use the latest published OpenAPI spec URL (e.g., from https://openapi.saif.com/...) to ensure your models match the current event schema.


Comparison: Manual vs. Kiota

| Aspect | Manual POCOs | Kiota Generated (KiotaReference) |
|---|---|---|
| Setup Time | Fast (write class) | Fast (uncomment config in .csproj) |
| Maintenance | Manual updates needed for each model | Automatic regeneration on build |
| Type Safety | Basic | Enhanced with factories |
| Schema Validation | None | OpenAPI validation |
| Multiple Events | Write each model separately | All models generated automatically |
| Workflow | Manual coding | Build triggers generation |
| Best For | Simple events, prototyping | Multiple events, production, complex schemas |

Integrating with Cosmos DB

If you need to interact with the event data in your application, storing the data in the API's Cosmos DB may be a viable solution. Here's a brief example of how to do that.

  1. Add Cosmos DB to your project if not already done. Follow the Add CosmosDB NoSQL guide to get it set up before proceeding to step 2.

  2. In the AppHost builder, make sure the subscription variable is created before the database variable, then add subscription as an argument to .AddCosmosDb():

...
var subscription = builder.AddSubscription();

var (cosmosdb, database) = builder.AddCosmosDb(backend, subscription);
...
  3. In the Subscription builder, add the Cosmos DB context for dependency injection:
...
builder
  .AddAzureDefaults()
  .AddServiceDefaults()
  .AddCosmosDbServices(); // 👈 Add this
...
  4. Create mappers to convert event models to the existing Cosmos DB entities. Here is a hand-written mapper example (you can also use libraries like AutoMapper, Mapperly, etc.):
public static class NewUserEventMapper
{
  public static Data.NewUserEvent ToEntity(this Models.NewUserEvent userEvent, string partitionKey)
  {
    return new Data.NewUserEvent
    {
      Id = userEvent.EventId,
      EventId = userEvent.EventId,
      UserId = userEvent.UserId,
      Email = userEvent.Email,
      PartitionKey = partitionKey
    };
  }
}
  5. In your trigger, inject the Cosmos DB context and use it in the function to save the event data:
public class NewUserEventTrigger(ILogger<NewUserEventTrigger> logger, YourAppContext dbContext)
{
  [Function("NewUserTrigger")]
  public async Task Run(
      [ServiceBusTrigger("newuser", "it-func-yourapp-newuser-subscription", Connection = "sbnamespace")]
      ServiceBusReceivedMessage message, CancellationToken cancellationToken)
  {
    ...
    try {
      using var stream = new MemoryStream(message.Body.ToArray());
      var parseNode = await new JsonParseNodeFactory().GetRootParseNodeAsync("application/json", stream);
      var newUserEvent = parseNode.GetObjectValue(Models.NewUserEvent.CreateFromDiscriminatorValue);

      if (newUserEvent == null)
        throw new InvalidOperationException("Invalid message format");

      // Add to DbContext and save to Cosmos DB
      var entity = newUserEvent.ToEntity(partitionKey: newUserEvent.Email);
      dbContext.NewUserEvents.Add(entity);

      logger.LogInformation("Saving changes to Cosmos DB...");
      await dbContext.SaveChangesAsync(cancellationToken);

      logger.LogInformation("Successfully stored new user event {EventId} to Cosmos DB", newUserEvent.EventId);

      // Other business logic
      ...
    }
    ...
  }
}

🧪 Test Locally

  1. Run the AppHost (F5 in Visual Studio or dotnet run in a terminal).

  2. The Aspire dashboard opens showing:

  • Service Bus emulator
  • Your subscription function
  • Service Bus seeder (runs automatically)
  • Cosmos DB emulator (if integrating)
  • Cosmos DB seeder (if integrating; may need to be run manually depending on setup)

  3. Check logs for message processing:

[servicebus-seed] Seeded 5 messages to 'newuser' topic
[subscriptions] Processing user event: abc-123
[subscriptions] User: user123, Email: john@example.com

If integrating Cosmos DB, check logs for writing changes to the database:

[subscriptions] Saving changes to Cosmos DB...
[subscriptions] Executed CreateItem (284.6743 ms, 7.43 RU) ActivityId='52a0be67-604e-4a52-9133-173b942c1bb9', Container='NewUserEvents', Id='?', Partition='?'
[subscriptions] Successfully stored new user event abc-123 to Cosmos DB

Optional: Customize Test Data

Edit src/YourApp.ServiceBus.Seed/DataSeed.cs to send custom test messages. Add a seeding method for each event type you configured in vars.yml.
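A seeding method for an additional event type might look roughly like the sketch below. This uses the Azure.Messaging.ServiceBus SDK directly; the template's actual seeder structure, the 'policy' topic, and the payload fields are assumptions for illustration:

```csharp
using Azure.Messaging.ServiceBus;
using System.Text.Json;

public static class PolicySeeder
{
    // Sketch: sends one test message to the 'policy' topic.
    // 'client' would come from the seeder's configured ServiceBusClient.
    public static async Task SeedPolicyEventsAsync(ServiceBusClient client)
    {
        await using var sender = client.CreateSender("policy");

        var payload = JsonSerializer.Serialize(new
        {
            eventId = Guid.NewGuid().ToString(),
            policyNumber = "P-0001" // illustrative field
        });

        await sender.SendMessageAsync(new ServiceBusMessage(payload)
        {
            ContentType = "application/json"
        });
    }
}
```

Call one such method per event type configured in vars.yml so every trigger receives test data locally.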

🚀 Deploy

  1. Commit and push:

git add .
git commit -m "Add event subscription feature"
git push

  2. Run pipeline in Azure DevOps:

  • Pipeline name: {project-id}-sub
  • Example: it-api-exp-myapp-sub

The pipeline deploys:

  • Azure Function App
  • Service Bus subscriptions
  • Application Insights

  3. Verify in Azure Portal:

  • Function App is running
  • Service Bus subscriptions exist
  • Functions appear in Function App

  4. Monitor in Dynatrace:

  • All logs are automatically sent to Dynatrace
  • View structured logs with your event data
  • Monitor function performance and errors
  • Access at: Dynatrace Portal

💡 Tip: Use structured logging (as shown in the examples) for better Dynatrace integration. Log properties appear as filterable fields in Dynatrace.

🔧 Troubleshooting

Messages not being received

  • ✅ Verify subscription exists in Azure Portal
  • ✅ Check Function App is running (view in Dynatrace)
  • ✅ Ensure topic names match exactly

Deserialization errors

  • ✅ Check JsonPropertyName attributes match the publisher schema
  • ✅ View error logs in Dynatrace with full stack traces
  • ✅ Log the raw message to see the actual JSON:
    logger.LogInformation("Raw: {Body}", message.Body.ToString());
    

Local testing: Service Bus not starting

  • ✅ Ensure Docker Desktop is running
  • ✅ Restart Docker Desktop
  • ✅ Re-run the AppHost

Deployment fails

  • ✅ Check pipeline logs in Azure DevOps
  • ✅ Verify vars.yml syntax is correct
  • ✅ Ensure all required variables are set

Messages in dead letter queue

  • ✅ Check Dynatrace for exception logs and traces
  • ✅ Verify the event model matches the message schema
  • ✅ Filter Dynatrace logs by MessageId for detailed investigation

Viewing Logs and Monitoring

All function logs are automatically sent to Dynatrace:

  • Filter by function name, severity, or custom properties
  • View distributed traces across services
  • Set up alerts for errors or performance issues
  • Access: Dynatrace Portal

📚 Resources