Kafka streaming
Every committed transaction in EvidentSource is available as a Kafka record. You can consume this stream with any standard Kafka client.
Endpoints
Built-in (Milena)
Milena, EvidentSource’s Kafka-compatible broker, is built in and backed by SlateDB. It exposes:
- Sandbox Milena endpoint: kafkas://sandbox.evidentsource.com:9093
- Your deployment: typically port 9092 or 9093 behind your ALB
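The choice between the two endpoints above is usually just configuration. A minimal sketch, assuming a hypothetical `EVIDENTSOURCE_BOOTSTRAP` environment variable for your own deployment (the variable name is an assumption, not part of EvidentSource):

```python
import os

# Public sandbox endpoint, from the list above.
SANDBOX_BOOTSTRAP = "sandbox.evidentsource.com:9093"

def bootstrap_servers() -> str:
    """Return the Kafka bootstrap address: your deployment's endpoint
    if configured, otherwise the public sandbox.

    EVIDENTSOURCE_BOOTSTRAP is a hypothetical variable name; use
    whatever convention your deployment already follows."""
    return os.environ.get("EVIDENTSOURCE_BOOTSTRAP", SANDBOX_BOOTSTRAP)

print(bootstrap_servers())
```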
External bridge
If you’re bridging to an external Kafka (MSK, Confluent), configure the bridge container — see Kafka prerequisites.
Topic convention
- Per-database topic: evidentsource.<database>.events
- Keyed by <stream>:<subject>
- Values are CloudEvents JSON (default) or Avro/Protobuf (configurable)
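The naming convention above is mechanical enough to capture in two small helpers. This is a sketch of the convention as documented; the example stream and subject names (`todos`, `todo-42`) are illustrative, not part of EvidentSource:

```python
def topic_for(database: str) -> str:
    """Per-database topic name: evidentsource.<database>.events."""
    return f"evidentsource.{database}.events"

def record_key(stream: str, subject: str) -> str:
    """Record key: <stream>:<subject>."""
    return f"{stream}:{subject}"

print(topic_for("todomvc"))            # evidentsource.todomvc.events
print(record_key("todos", "todo-42"))  # todos:todo-42
```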
Consuming
Use any Kafka client library. Bootstrap servers = the Milena or bridged endpoint. No special configuration beyond standard TLS/SASL for your environment.
```python
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "evidentsource.todomvc.events",
    bootstrap_servers="sandbox.evidentsource.com:9093",
    security_protocol="SSL",
)
for msg in consumer:
    print(msg.key, msg.value)
```

Exactly-once semantics
EvidentSource commits transactions atomically to both the event store and the Milena log; Kafka consumers see exactly the events that were committed, in commit order.
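The broker guarantees the log contains exactly the committed events, but your consumer still manages its own offsets: if it crashes after processing a record and before committing its offset, it will see that record again on restart. A common pattern is to disable auto-commit and deduplicate by `(partition, offset)` so reprocessing stays idempotent. This is a sketch, not an EvidentSource API; `handle_event` and the in-memory `processed` set are placeholders (production code would persist the marker alongside the read model):

```python
# In production this would be durable storage, e.g. a column in your
# read-model database written in the same transaction as the update.
processed = set()

def apply_once(partition: int, offset: int, handler, message) -> bool:
    """Run handler(message) unless this (partition, offset) was already
    applied. Returns True if the handler ran, False if it was a duplicate."""
    marker = (partition, offset)
    if marker in processed:
        return False
    handler(message)
    processed.add(marker)
    return True

# Consumer loop (network-dependent, shown for shape only):
# consumer = KafkaConsumer(
#     "evidentsource.todomvc.events",
#     bootstrap_servers="sandbox.evidentsource.com:9093",
#     security_protocol="SSL",
#     enable_auto_commit=False,   # commit only after processing
# )
# for msg in consumer:
#     apply_once(msg.partition, msg.offset, handle_event, msg.value)
#     consumer.commit()
```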
When to subscribe via Kafka vs gRPC
| Use Kafka when | Use gRPC SubscribeEvents when |
|---|---|
| Existing Kafka tooling / stream processors | Language has a good gRPC client |
| Long-lived pipelines | Short-lived consumers or browser clients |
| Integration with Kafka Connect / Streams | Filtering by stream / subject / type |
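The table lists filtering by stream/subject/type as a gRPC strength, but because records are keyed `<stream>:<subject>` (see Topic convention above), you can approximate stream and subject filtering on the Kafka side by matching on the key client-side. A sketch, assuming UTF-8 keys in that convention; `matches` is an illustrative helper, not part of any client library:

```python
from typing import Optional

def matches(key: bytes, stream: Optional[str] = None,
            subject: Optional[str] = None) -> bool:
    """Client-side filter on a record key of the form <stream>:<subject>."""
    k_stream, _, k_subject = key.decode("utf-8").partition(":")
    if stream is not None and k_stream != stream:
        return False
    if subject is not None and k_subject != subject:
        return False
    return True

# Usage inside the consumer loop:
# for msg in consumer:
#     if matches(msg.key, stream="todos"):
#         handle(msg.value)
```

Note this still delivers every record to the client before filtering; if bandwidth matters, the gRPC subscription's server-side filtering is the better fit.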