Welcome to our Microsoft Fabric Tutorial Series! Eventstream in Fabric: a complete guide for beginners to professionals. Learn real-time data streaming, transformations, automation, and best practices, with practical code examples and visual insights.
1️⃣ What is Eventstream in Fabric?
Eventstream is a no-code platform offered within Microsoft Fabric for ingesting, transforming, and routing streaming event data. It’s built for real-time scenarios—IoT, app telemetry, business transactions—allowing seamless integration, live analytics, and workflow automation.
Use-cases: IoT monitoring, operational dashboards, anomaly alerts, unified event processing.
- Unified ingress: Connects cloud, on-prem, and IoT streams.
- No-code pipeline: Drag-and-drop modeling, data enrichment, complex joins/aggregations.
- Multi-destination: Send processed events to Lakehouse, KQL DB, Eventhouse, REST, or Data Activator for automation.

2️⃣ Key Concepts & Architecture
Eventstream pipelines consist of four main building blocks, working together for real-time analytics and actions:
| Component | Description |
|---|---|
| Source | Where event data originates (e.g., Azure Event Hubs, IoT devices, CDC-enabled DBs, streaming APIs, 3rd-party cloud brokers) |
| Stream | Continuous, live event flow for processing |
| Transform/Processor | Visual blocks for filtering, enriching, aggregating, joining, or splitting the data streams |
| Destination | Output sinks: Lakehouse (Parquet), KQL DB, Eventhouse, Data Activator, custom API/webhooks |
Supported Data Sources:
- Azure Event Hubs/Service Bus
- Azure IoT Hub
- Amazon Kinesis & Google Pub/Sub
- Databases via CDC (SQL, Cosmos DB, etc.)
- External MQTT Brokers, Kafka, Solace
- App and Web telemetry (via REST/Kafka endpoints)
3️⃣ Prerequisites
- Microsoft Fabric workspace with Contributor+ permissions
- An assigned Fabric capacity SKU with streaming support (a trial capacity works for testing)
- Access to the source and destination services (credentials/keys as needed)
- Basic understanding of your target data scenario
4️⃣ Step-by-Step: Build a Complete Eventstream in Fabric
4.1 Create an Eventstream Pipeline
- Log in to the Fabric Portal.
- Select your workspace. Click + New → Eventstream.
- Enter a name (e.g., `IoTMonitoringStream`) and click Create.
4.2 Add Data Sources
- On the Eventstream canvas, click Add Source.
- Pick source type (Event Hub, IoT Hub, Service Bus, CDC for DBs, Kafka, REST Endpoint, etc.).
- Paste the required credentials/connection details when prompted.
- For Custom Apps: Copy the Event Hub name and connection string for code integration (see below).
4.3 Apply Stream Transformations (Drag-and-Drop)
| Transformation | Details | Example Scenario |
|---|---|---|
| Filter | Route only events matching logical conditions | `temperature > 35` (only critical IoT events forwarded) |
| Manage Fields | Rename, drop, or type-cast fields for downstream compatibility | Remove unnecessary log fields; convert location to string |
| Aggregate | Calculate sum, min, max, or average within a time window (sliding/tumbling) | Compute hourly average sensor readings per device |
| Group By | Aggregate events grouped by key(s) | Summarize event count per region in real time |
| Join | Match/merge two streams on keys/timestamps | Join sensor readings with device metadata |
| Union | Merge multiple streams (same schema) for unified output | Combine event traffic from multiple factories/countries |
| Expand Array | Create a row per array element within an event | Decompose a batched sensor packet into per-reading records |
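For readers who think in code, the Filter, Group By, and Aggregate blocks above can be pictured as plain JavaScript over an in-memory batch of events. This is illustrative only: in Eventstream these operations are configured visually and run continuously over live streams.

```javascript
// Illustrative only: what Filter + Group By + Aggregate compute,
// expressed over an in-memory batch of events.
const events = [
  { deviceId: "sensor-101", temperature: 22, ts: "2024-01-01T10:05:00Z" },
  { deviceId: "sensor-103", temperature: 41, ts: "2024-01-01T10:07:00Z" },
  { deviceId: "sensor-103", temperature: 39, ts: "2024-01-01T10:42:00Z" },
];

// Filter: forward only critical readings (temperature > 35).
const critical = events.filter((e) => e.temperature > 35);

// Group By deviceId + Aggregate: average temperature per device.
function averageByDevice(batch) {
  const sums = {};
  for (const e of batch) {
    const s = (sums[e.deviceId] ??= { total: 0, count: 0 });
    s.total += e.temperature;
    s.count += 1;
  }
  return Object.fromEntries(
    Object.entries(sums).map(([id, s]) => [id, s.total / s.count])
  );
}

console.log(critical.length);           // 2
console.log(averageByDevice(critical)); // { "sensor-103": 40 }
```

In the real pipeline, the Aggregate block also needs a window (sliding or tumbling) because the stream never "ends" the way an array does.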
4.4 Add Event Destinations
- On the canvas, click Add Destination.
- Select:
- Lakehouse (Delta/Parquet for storage; supports batch and analytics)
- KQL Database (for instant querying and Power BI dashboards)
- Eventhouse (optimized for high-velocity clickstream data)
- Data Activator (to trigger notifications, workflows, or integrations on events)
- Custom API or Webhook (trigger third-party apps and workflows)
- Configure additional partitioning, latency windows, or retention as needed.
5️⃣ Eventstream & Code: Custom App Integration
Integrate Eventstream with Your Code (Node.js Example)
Send events from an application using Azure Event Hubs SDK:
```javascript
const { EventHubProducerClient } = require("@azure/event-hubs");

// Values from the Eventstream custom app source connection UI.
const connectionString = "<Fabric stream connection string>";
const eventHubName = "<Fabric stream name>";
const client = new EventHubProducerClient(connectionString, eventHubName);

// Create a batch, add the event, and send it to the Eventstream source.
async function sendEvent(data) {
  const batch = await client.createBatch();
  batch.tryAdd({ body: data });
  await client.sendBatch(batch);
}

sendEvent({ temperature: 41, deviceId: "sensor-103", status: "hot" })
  .catch(console.error)
  .finally(() => client.close());
```
- Replace the placeholders with the values shown in the Fabric Eventstream source connection UI.
- Test sending a sample event as JSON.
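Building on the snippet above, it can help to separate event construction from sending so payloads can be sanity-checked before they hit the stream. The helper names below are illustrative, not a Fabric API; `producer` stands for an `EventHubProducerClient` like the one created earlier.

```javascript
// Illustrative helpers (not a Fabric API): shape raw sensor readings into
// Event Hubs event envelopes, then send them as one batch.
function buildEvents(readings) {
  return readings.map((r) => ({
    body: {
      deviceId: r.deviceId,
      temperature: r.temperature,
      status: r.temperature > 35 ? "hot" : "ok",
    },
  }));
}

// `producer` is an EventHubProducerClient instance.
async function sendReadings(producer, readings) {
  const batch = await producer.createBatch();
  for (const event of buildEvents(readings)) {
    // tryAdd returns false once the batch hits its size limit; a
    // production app would send and start a new batch at that point.
    if (!batch.tryAdd(event)) throw new Error("batch full");
  }
  await producer.sendBatch(batch);
}
```

Keeping `buildEvents` pure makes it trivial to unit-test the payload shape without a live connection.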
6️⃣ Advanced Techniques & Automation
- Real-time analytics: Route events into KQL DB for streaming SQL-like queries and immediate dashboarding.
- Data Activator: Define conditions (e.g., `humidity > 80`) to auto-trigger Teams/Slack alerts, Azure Logic Apps, or custom notifications without writing code.
- Historical replay: For CDC/DB log sources, reprocess past data as live events via Lakehouse/Eventhouse storage sinks.
- Combining sources: Use Unions/Joins to orchestrate complex multi-source pipelines (e.g., merge app telemetry + IoT + transaction logs into a single unified analytics stream).
- Monitoring & diagnostics: Monitor pipeline health, throughput, and error metrics in Fabric’s native dashboard. Proactively set up alerts for failures or stalling pipelines.
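As a mental model for the "Combining sources" bullet, a union of same-schema streams and a key-based join can be pictured in plain JavaScript. This is illustrative only; Eventstream performs the equivalent continuously over live streams via its Union and Join blocks.

```javascript
// Illustrative union + join over in-memory arrays (sample data is made up).
const factoryA = [{ deviceId: "a-1", temperature: 30 }];
const factoryB = [{ deviceId: "b-9", temperature: 44 }];

// Union: merge streams that share a schema into one flow.
const combined = [...factoryA, ...factoryB];

// Join: enrich each reading with device metadata matched on deviceId.
const metadata = { "a-1": { site: "Lisbon" }, "b-9": { site: "Oslo" } };
const enriched = combined.map((e) => ({ ...e, ...metadata[e.deviceId] }));

console.log(enriched);
// → [ { deviceId: "a-1", temperature: 30, site: "Lisbon" },
//     { deviceId: "b-9", temperature: 44, site: "Oslo" } ]
```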
7️⃣ Monitoring, Troubleshooting & Optimization
- Use Fabric’s built-in tools to view stream health, event lag, dropped data, and throughput.
- Enable diagnostic logging for long-term monitoring/troubleshooting.
- Partition data in destinations for scalable storage and query performance (especially for high-volume/lakehouse scenarios).
- Design granularity (e.g., hourly/daily files) based on expected analytics and retention needs.
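To make the partitioning and granularity bullets concrete, the sketch below derives a per-device partition key (keeping each device's events ordered within a partition) and an hourly folder path for a storage sink. The folder layout is an assumed convention for illustration, not a Fabric default.

```javascript
// Illustrative: choose a partition key and an hourly storage path per event.
function partitionKeyFor(event) {
  // Keying by device keeps one device's events ordered within a partition.
  return event.deviceId;
}

function hourlyPathFor(event) {
  const d = new Date(event.ts);
  const pad = (n) => String(n).padStart(2, "0");
  // Assumed year=/month=/day=/hour= layout, common for hourly file granularity.
  return `events/year=${d.getUTCFullYear()}/month=${pad(d.getUTCMonth() + 1)}/` +
         `day=${pad(d.getUTCDate())}/hour=${pad(d.getUTCHours())}`;
}

const e = { deviceId: "sensor-103", temperature: 41, ts: "2024-01-01T10:07:00Z" };
console.log(partitionKeyFor(e)); // "sensor-103"
console.log(hourlyPathFor(e));   // "events/year=2024/month=01/day=01/hour=10"
```

Coarser granularity (daily files) means fewer, larger files; hourly suits high-volume streams queried over short windows.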
8️⃣ Best Practices for Production Eventstream in Fabric
- Enable Enhanced Capabilities: Unlock advanced transformations and flexible routing.
- Keep transformations modular: Isolate heavy logic for maintainability and performance.
- Document your flows: Use the built-in description fields and maintain diagrams for team clarity.
- Start with staging: Build/test in a non-production workspace before moving to critical data flows.
- Leverage automation: Use Data Activator for hands-off, event-driven workflows across the organization.
9️⃣ Real-World Case Study: Logistics IoT Monitoring
Scenario: A logistics provider needs to monitor truck temperature and route violations.
Design:
- Source: Azure IoT Hub streaming truck sensor feeds
- Transform: Filter readings where `temperature > 38 °C`
- Destination: Real-time KQL query for dashboard; Data Activator for instant SMS/Teams alerts
Outcome: Operational dashboard, live anomaly detection, and automated incident response, all in real time.
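The filter condition and alert payload from the design above can be sanity-checked locally with a sketch like this. Field names and the message format are assumptions for illustration; in the actual pipeline the threshold lives in the Filter block and the notification in Data Activator.

```javascript
// Illustrative: the case study's filter threshold and the alert payload an
// automation like Data Activator might act on. Field names are assumptions.
const THRESHOLD_C = 38;

function toAlert(reading) {
  if (reading.temperature <= THRESHOLD_C) return null; // below threshold: no alert
  return {
    truckId: reading.truckId,
    temperature: reading.temperature,
    message: `Truck ${reading.truckId} at ${reading.temperature}°C exceeds ${THRESHOLD_C}°C`,
  };
}

console.log(toAlert({ truckId: "TRK-7", temperature: 41 }).message);
// → "Truck TRK-7 at 41°C exceeds 38°C"
console.log(toAlert({ truckId: "TRK-8", temperature: 30 })); // → null
```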