| Attribute | Details |
| --- | --- |
| Dapr runtime version | v1.14.4 |
| Dapr .NET SDK | v1.14.0 |
| Language | C# |
| Environment | Local |
This sample demonstrates how to integrate a Kafka producer using the Confluent Kafka SDK with a Dapr-powered consumer in .NET applications. The producer publishes messages directly to Kafka, while the consumer uses Dapr's pub/sub building block to receive them. Unlike Dapr's default behaviour when publishing and subscribing, these messages are not wrapped as CloudEvents.
You can find more details about publishing & subscribing messages without CloudEvents here.
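To make the raw-payload flow concrete, the producer side can be as simple as the sketch below: a plain Confluent.Kafka producer sending a JSON string straight to the `messages` topic, with no CloudEvent envelope. The broker address and message shape here are illustrative assumptions, not necessarily the sample's exact code.

```csharp
using System.Text.Json;
using Confluent.Kafka;

// Minimal sketch of the direct-to-Kafka producer (assumed broker and payload).
var config = new ProducerConfig { BootstrapServers = "localhost:9092" };
using var producer = new ProducerBuilder<Null, string>(config).Build();

// The payload is plain JSON; because it bypasses Dapr, it is not wrapped
// as a CloudEvent, which is why the subscriber needs isRawPayload = true.
var payload = JsonSerializer.Serialize(new { Id = Guid.NewGuid(), Text = "hello" });
await producer.ProduceAsync("messages", new Message<Null, string> { Value = payload });
```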
- Clone the repository
- Navigate to the solution folder:

  ```bash
  cd pubsub-raw-payload
  ```

- Start Kafka using Docker Compose:

  ```bash
  docker-compose up -d
  ```

- Start the Dapr Subscriber (the `./components` directory passed to `--resources-path` should contain a Kafka pub/sub component; see the sketch after these steps):

  ```bash
  dapr run --app-id subscriber \
    --app-port 5001 \
    --dapr-http-port 3501 \
    --resources-path ./components \
    -- dotnet run --project src/Subscriber/Subscriber.csproj
  ```

- In a new terminal, start the Kafka Publisher:

  ```bash
  dotnet run --project src/Publisher/Publisher.csproj
  ```
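As noted above, the `./components` directory needs a Dapr pub/sub component named `pubsub` (the name referenced by the subscription) that points at the Kafka broker. A minimal sketch, assuming a local broker on `localhost:9092` and an illustrative consumer group, might look like:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: pubsub
spec:
  type: pubsub.kafka
  version: v1
  metadata:
    - name: brokers
      value: "localhost:9092"
    - name: consumerGroup
      value: "subscriber-group"
    - name: authType
      value: "none"
```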
The subscriber uses a programmatic subscription, configured in code:
```csharp
app.MapGet("/dapr/subscribe", () =>
{
    var subscriptions = new[]
    {
        new
        {
            pubsubname = "pubsub",
            topic = "messages",
            route = "/messages",
            metadata = new Dictionary<string, string>
            {
                { "isRawPayload", "true" }
            }
        }
    };
    return Results.Ok(subscriptions);
});
```
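The route declared above also needs an endpoint to receive the delivered messages. A minimal, hypothetical handler is sketched below; it simply reads the request body and logs it, leaving any deserialization of the raw JSON to the application:

```csharp
// Hypothetical handler for the "/messages" route declared in the subscription.
// It reads the raw request body as a string and writes it to the console.
app.MapPost("/messages", async (HttpRequest request) =>
{
    using var reader = new StreamReader(request.Body);
    var body = await reader.ReadToEndAsync();
    Console.WriteLine($"Received message: {body}");
    return Results.Ok();
});
```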
Alternatively, create a `subscription.yaml` in your components directory:
```yaml
apiVersion: dapr.io/v2alpha1
kind: Subscription
metadata:
  name: message-subscription
spec:
  topic: messages
  routes:
    default: /messages
  pubsubname: pubsub
  metadata:
    isRawPayload: "true"
```
When using declarative subscriptions:

- Remove the `/dapr/subscribe` endpoint from your subscriber application
- Place the `subscription.yaml` file in your components directory
- The subscription is loaded automatically when you start your application
To publish a message:
```bash
curl -X POST http://localhost:5000/publish
```
The subscriber will display received messages in its console output.
- Stop the running applications using Ctrl+C in each terminal
- Stop Kafka:

  ```bash
  docker-compose down
  ```
- The `isRawPayload` attribute is required for receiving raw JSON messages in .NET applications
- The publisher uses the Confluent.Kafka client directly to publish messages to Kafka
- The subscriber uses Dapr's pub/sub building block to consume messages
- Make sure your Kafka broker is running before starting the applications