Event Subscriptions
This guide shows you how to add Azure Service Bus event subscription capabilities to your existing .NET application.
Aspire Publish Alternative
Starting in Forge 3.3, Azure Functions subscription projects are auto-detected by Aspire publish when using the SAIF.Platform.Aspire.Hosting package. Add them with AddSubscriptionService<TProject>() in your AppHost and pipeline YAML is generated automatically. See Aspire Publish for details.
📋 Prerequisites
- ✅ Existing SAIF API project
- ✅ .NET 10.0 SDK
- ✅ Docker Desktop
- ✅ Azure DevOps access
🚀 Quick Start
1. Add the Feature
Using SAIF CLI:
Using .NET Template:
```shell
dotnet new saif-feature-event-subscription -n MyApp -o . \
    --application_name "MyApp" \
    --owner "Platform" \
    --project_id "it-api-exp-myapp" \
    --tenant "corporate"
```
This adds:
- `YourApp.Subscriptions` - Functions project
- `YourApp.ServiceBus.Seed` - Test data seeder
- Infrastructure and pipeline files
2. Wire Up AppHost¶
⚠️ Manual step required: Add to src/YourApp.AppHost/Program.cs:
Complete example:
```csharp
var builder = DistributedApplication.CreateBuilder(args);

var backend = builder.AddApi();
var subscription = builder.AddSubscription(); // 👈 Add this

builder.Build().Run();
```
⚙️ Configure Subscriptions
Edit infra/sub/vars.yml with the topics and subscriptions you want to create:
```yaml
application_name: YourApp
owner: Platform
project_id: it-api-exp-yourapp
function_app_project_id: it-func-yourapp
events:
  - topic: newuser # Topic name from the event service
    subscription: it-func-yourapp-newuser-subscription
  - topic: policy # Add as many topics as you need
    subscription: it-func-yourapp-policy-subscription
  - topic: accountupdated
    subscription: it-func-yourapp-accountupdated-subscription
```
💡 Note: Subscription names must be unique within each topic. To avoid conflicts with other applications subscribing to the same topics, prefix subscription names with your function app project ID (e.g., it-func-yourapp-*), which is guaranteed to be unique.
🎯 Create Event Handlers
You need to create a model and trigger function for each event type in your vars.yml.
Option A: Manual POCOs (Plain Old CLR Objects)
1. Define Event Models
Create one model per event type:
src/YourApp.Subscriptions/Models/NewUserEvent.cs:
```csharp
using System.Text.Json.Serialization;

namespace YourApp.Subscriptions.Models;

public class NewUserEvent
{
    [JsonPropertyName("eventId")]
    public string EventId { get; set; } = string.Empty;

    [JsonPropertyName("userId")]
    public string UserId { get; set; } = string.Empty;

    [JsonPropertyName("email")]
    public string Email { get; set; } = string.Empty;
}
```
💡 Tip: Use [JsonPropertyName] to match the exact property names from the event publisher.
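As a quick illustration of why the attributes matter, consider a hypothetical payload (the field names below are illustrative, not a real event):

```csharp
// Hypothetical payload; the publisher's actual field names may differ.
var json = """{"eventId":"abc-123","userId":"user123","email":"john@example.com"}""";

var evt = System.Text.Json.JsonSerializer.Deserialize<NewUserEvent>(json);

// Without [JsonPropertyName], System.Text.Json's default case-sensitive
// matching would leave EventId/UserId/Email as empty strings, because
// "eventId" does not match "EventId".
```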
2. Create Trigger Functions
Create one trigger per event type:
src/YourApp.Subscriptions/Triggers/NewUserTrigger.cs:
```csharp
using Azure.Messaging.ServiceBus;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using System.Text.Json;
using YourApp.Subscriptions.Models;

namespace YourApp.Subscriptions.Triggers;

public class NewUserTrigger(ILogger<NewUserTrigger> logger)
{
    [Function("NewUserTrigger")]
    public void Run(
        [ServiceBusTrigger("newuser", "it-func-yourapp-newuser-subscription", Connection = "sbnamespace")]
        ServiceBusReceivedMessage message)
    {
        logger.LogInformation("Processing user event: {MessageId}", message.MessageId);

        try
        {
            var userEvent = JsonSerializer.Deserialize<NewUserEvent>(message.Body.ToString());
            if (userEvent == null)
                throw new InvalidOperationException("Invalid message format");

            // YOUR BUSINESS LOGIC HERE
            logger.LogInformation("User: {UserId}, Email: {Email}",
                userEvent.UserId, userEvent.Email);
        }
        catch (Exception ex)
        {
            logger.LogError(ex, "Error processing message");
            throw; // Re-throw for retry
        }
    }
}
```
Key points:
- Topic and subscription names must match `vars.yml`
- Connection is always `"sbnamespace"` (unless using a custom namespace and implementation)
- Throw exceptions to trigger retries
- Repeat this pattern for each event type
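For example, repeating the pattern for the `policy` topic from `vars.yml` might look like the sketch below (same usings as NewUserTrigger.cs; `PolicyEvent` is a hypothetical model you would define alongside `NewUserEvent`):

```csharp
// Hypothetical second trigger: one per event type, same shape as NewUserTrigger.
public class PolicyTrigger(ILogger<PolicyTrigger> logger)
{
    [Function("PolicyTrigger")]
    public void Run(
        [ServiceBusTrigger("policy", "it-func-yourapp-policy-subscription", Connection = "sbnamespace")]
        ServiceBusReceivedMessage message)
    {
        logger.LogInformation("Processing policy event: {MessageId}", message.MessageId);

        var policyEvent = JsonSerializer.Deserialize<PolicyEvent>(message.Body.ToString())
            ?? throw new InvalidOperationException("Invalid message format");

        // YOUR BUSINESS LOGIC HERE
    }
}
```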
Option B: Generate Models with Kiota
Instead of manually creating POCOs, you can generate type-safe models from OpenAPI specifications using Kiota.
Benefits
- Type-safe models automatically generated from OpenAPI specs
- Reduced manual maintenance and errors
- Built-in JSON serialization/deserialization
- Keeps models in sync with event schemas
- Automatic regeneration on build when OpenAPI spec changes
Configure Kiota Reference
The template includes a commented-out KiotaReference in the .csproj file. To enable it:
1. Open `src/YourApp.Subscriptions/YourApp.Subscriptions.csproj`.
2. Uncomment the `KiotaReference` section at the bottom of the file:
```xml
<ItemGroup>
  <KiotaReference Include="KiotaGenerated" OpenApi="https://openapi.saif.com/{event-service-project-id}/{environment}/{openapi-spec-file-name}.yaml">
    <NamespaceName>YourApp.Subscriptions</NamespaceName>
  </KiotaReference>
</ItemGroup>
```
Replace:
- `{event-service-project-id}` with the event service's project ID (e.g., `it-api-sys-testeventing`)
- `{environment}` with `test`, `qa`, `uat`, or `prod`
- `{openapi-spec-file-name}` with the spec filename (commonly `openapi` or `openapi.v1`)
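Putting the example values above together, a filled-in reference might look like this (the project ID and spec name are illustrative, not a real endpoint):

```xml
<ItemGroup>
  <!-- Illustrative values only: substitute your event service's actual project ID,
       environment, and spec filename. -->
  <KiotaReference Include="KiotaGenerated" OpenApi="https://openapi.saif.com/it-api-sys-testeventing/test/openapi.v1.yaml">
    <NamespaceName>YourApp.Subscriptions</NamespaceName>
  </KiotaReference>
</ItemGroup>
```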
💡 Tip: You can find the correct OpenAPI URL by checking the event service's documentation or asking the event service team.
3. Remove or comment out the pre-created Models files:
   The template creates a `Models` folder by default. Since Kiota will generate models in the `KiotaGenerated/Models` folder, you can either:
   - Delete the `src/YourApp.Subscriptions/Models` folder, or
   - Keep it for any custom models you want to create manually alongside the generated ones
4. Build the project to generate models:
   The models will be automatically generated in the `KiotaGenerated/Models` folder during build.
Use Generated Models in Triggers
Create one trigger per event type:
src/YourApp.Subscriptions/Triggers/NewUserTrigger.cs:
```csharp
using Azure.Messaging.ServiceBus;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using Microsoft.Kiota.Serialization.Json;
using YourApp.Subscriptions.Models;

namespace YourApp.Subscriptions.Triggers;

public class NewUserTrigger(ILogger<NewUserTrigger> logger)
{
    [Function("NewUserTrigger")]
    public void Run(
        [ServiceBusTrigger("newuser", "it-func-yourapp-newuser-subscription", Connection = "sbnamespace")]
        ServiceBusReceivedMessage message)
    {
        logger.LogInformation("Processing user event: {MessageId}", message.MessageId);

        try
        {
            // Deserialize using Kiota's JsonParseNodeFactory
            using var stream = new MemoryStream(message.Body.ToArray());
            var parseNode = new JsonParseNodeFactory().GetRootParseNode("application/json", stream);
            var userEvent = parseNode.GetObjectValue(NewUserEvent.CreateFromDiscriminatorValue);

            if (userEvent == null)
                throw new InvalidOperationException("Invalid message format");

            // YOUR BUSINESS LOGIC HERE
            logger.LogInformation("User: {UserId}, Email: {Email}",
                userEvent.UserId, userEvent.Email);
        }
        catch (Exception ex)
        {
            logger.LogError(ex, "Error processing message");
            throw; // Re-throw for retry
        }
    }
}
```
Updating Models
When your OpenAPI spec changes, the models will be automatically regenerated on the next build. Simply run dotnet build and the KiotaReference in your .csproj file ensures models stay in sync with the OpenAPI spec without manual intervention.
💡 Best Practice: Always use the latest published OpenAPI spec URL (e.g., from https://openapi.saif.com/...) to ensure your models match the current event schema.
Comparison: Manual vs. Kiota
| Aspect | Manual POCOs | Kiota Generated (KiotaReference) |
|---|---|---|
| Setup Time | Fast (write class) | Fast (uncomment config in .csproj) |
| Maintenance | Manual updates needed for each model | Automatic regeneration on build |
| Type Safety | Basic | Enhanced with factories |
| Schema Validation | None | OpenAPI validation |
| Multiple Events | Write each model separately | All models generated automatically |
| Workflow | Manual coding | Build triggers generation |
| Best For | Simple events, prototyping | Multiple events, production, complex schemas |
Integrating with Cosmos DB
If you need to interact with the event data in your application, storing the data in the API's Cosmos DB may be a viable solution. Here's a brief example of how to do that.
1. Add Cosmos DB to your project if you haven't already. Follow the Add CosmosDB NoSQL guide to set it up before proceeding to step 2.
2. In the AppHost builder, make sure the `subscription` variable is created before the database variable, then pass `subscription` as an argument to `.AddCosmosDb()`:

```csharp
...
var subscription = builder.AddSubscription();
var (cosmosdb, database) = builder.AddCosmosDb(backend, subscription);
...
```
3. In the Subscription builder, add the Cosmos DB context for dependency injection:
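The exact wiring depends on your SAIF template version. As a rough sketch only (the context name `YourAppContext`, the connection string name `cosmosdb`, and the database name are all assumptions, not part of the template), the registration in the subscription project's Program.cs might look like:

```csharp
// Hypothetical sketch: adjust the context name and configuration to your project.
var builder = FunctionsApplication.CreateBuilder(args);

builder.Services.AddDbContext<YourAppContext>(options =>
    options.UseCosmos(
        builder.Configuration.GetConnectionString("cosmosdb")!, // assumed connection name
        databaseName: "YourAppDb"));                            // assumed database name

builder.Build().Run();
```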
4. Create mappers to convert event models to the existing Cosmos DB entities. Here is a hand-written mapper example (you can also use libraries like AutoMapper, Mapperly, etc.):
```csharp
public static class NewUserEventMapper
{
    public static Data.NewUserEvent ToEntity(this Models.NewUserEvent userEvent, string partitionKey)
    {
        return new Data.NewUserEvent
        {
            Id = userEvent.EventId,
            EventId = userEvent.EventId,
            UserId = userEvent.UserId,
            Email = userEvent.Email,
            PartitionKey = partitionKey
        };
    }
}
```
5. In your trigger, inject the Cosmos DB context and use it in the function to save the event data:

```csharp
public class NewUserEventTrigger(ILogger<NewUserEventTrigger> logger, YourAppContext dbContext)
{
    [Function("NewUserTrigger")]
    public async Task Run( // async Task (not void) so the awaits below compile
        [ServiceBusTrigger("newuser", "it-func-yourapp-newuser-subscription", Connection = "sbnamespace")]
        ServiceBusReceivedMessage message, CancellationToken cancellationToken)
    {
        ...
        try
        {
            using var stream = new MemoryStream(message.Body.ToArray());
            var parseNode = await new JsonParseNodeFactory().GetRootParseNodeAsync("application/json", stream);
            var newUserEvent = parseNode.GetObjectValue(Models.NewUserEvent.CreateFromDiscriminatorValue);

            // Add to DbContext and save to Cosmos DB
            var entity = newUserEvent.ToEntity(partitionKey: newUserEvent.Email);
            dbContext.NewUserEvents.Add(entity);

            logger.LogInformation("Saving changes to Cosmos DB...");
            await dbContext.SaveChangesAsync(cancellationToken); // SaveChangesAsync lives on the context, not the DbSet
            logger.LogInformation("Successfully stored new user event {EventId} to Cosmos DB", newUserEvent.EventId);

            // Other business logic
            ...
        }
        ...
    }
}
```
🧪 Test Locally
1. Run the AppHost (F5 in Visual Studio or `dotnet run` in a terminal).
2. The Aspire dashboard opens, showing:
   - Service Bus emulator
   - Your subscription function
   - Service Bus seeder (runs automatically)
   - Cosmos DB emulator (if integrating)
   - Cosmos DB seeder (if integrating; may need to be run manually depending on setup)
3. Check the logs for message processing:

```
[servicebus-seed] Seeded 5 messages to 'newuser' topic
[subscriptions] Processing user event: abc-123
[subscriptions] User: user123, Email: john@example.com
```
If integrating Cosmos DB, check the logs for writes to the database:

```
[subscriptions] Saving changes to Cosmos DB...
[subscriptions] Executed CreateItem (284.6743 ms, 7.43 RU) ActivityId='52a0be67-604e-4a52-9133-173b942c1bb9', Container='NewUserEvents', Id='?', Partition='?'
[subscriptions] Successfully stored new user event abc-123 to Cosmos DB
```
Optional: Customize Test Data
Edit src/YourApp.ServiceBus.Seed/DataSeed.cs to send custom test messages. Add a seeding method for each event type you configured in vars.yml.
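A seeding method for an additional event type might look like the following sketch (the `ServiceBusClient` parameter, the method shape, and the sample payload are assumptions; adapt them to the DataSeed.cs the template generated for you):

```csharp
// Hypothetical sketch: adapt to the structure of the generated DataSeed.cs.
private static async Task SeedPolicyEventsAsync(ServiceBusClient client)
{
    // Topic name must match an entry in infra/sub/vars.yml.
    await using var sender = client.CreateSender("policy");

    var body = """{"eventId":"evt-001","policyNumber":"P-12345"}"""; // sample payload
    await sender.SendMessageAsync(new ServiceBusMessage(body)
    {
        ContentType = "application/json"
    });
}
```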
🚀 Deploy
1. Commit and push:
2. Run the pipeline in Azure DevOps:
   - Pipeline name: `{project-id}-sub`
   - Example: `it-api-exp-myapp-sub`
The pipeline deploys:
- Azure Function App
- Service Bus subscriptions
- Application Insights

3. Verify in Azure Portal:
   - Function App is running
   - Service Bus subscriptions exist
   - Functions appear in the Function App
4. Monitor in Dynatrace:
   - All logs are automatically sent to Dynatrace
   - View structured logs with your event data
   - Monitor function performance and errors
   - Access at: Dynatrace Portal
💡 Tip: Use structured logging (as shown in the examples) for better Dynatrace integration. Log properties appear as filterable fields in Dynatrace.
🔧 Troubleshooting
Messages not being received
- ✅ Verify subscription exists in Azure Portal
- ✅ Check Function App is running (view in Dynatrace)
- ✅ Ensure topic names match exactly
Deserialization errors
- ✅ Check `JsonPropertyName` attributes match the publisher schema
- ✅ View error logs in Dynatrace with full stack traces
- ✅ Log raw message to see actual JSON:
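For the last point, a minimal way to dump the raw payload is a single log line inside the trigger, before deserialization (this assumes the `logger` and `message` variables from the trigger examples above):

```csharp
// Log the raw body before attempting to deserialize it.
logger.LogInformation("Raw message body: {Body}", message.Body.ToString());
```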
Local testing: Service Bus not starting
- ✅ Ensure Docker Desktop is running
- ✅ Restart Docker Desktop
- ✅ Re-run AppHost
Deployment fails
- ✅ Check pipeline logs in Azure DevOps
- ✅ Verify `vars.yml` syntax is correct
- ✅ Ensure all required variables are set
Messages in dead letter queue
- ✅ Check Dynatrace for exception logs and traces
- ✅ Verify event model matches message schema
- ✅ Filter Dynatrace logs by MessageId for detailed investigation
Viewing Logs and Monitoring
All function logs are automatically sent to Dynatrace:
- Filter by function name, severity, or custom properties
- View distributed traces across services
- Set up alerts for errors or performance issues
- Access: Dynatrace Portal