Eventstream in Fabric – Microsoft Fabric Tutorial Series 2025

Welcome to our Microsoft Fabric Tutorial Series! This installment covers Eventstream in Fabric: a complete guide for beginners through professionals on real-time data streaming, transformations, automation, and best practices, with practical examples, code, and visual insights.

1️⃣ What is Eventstream in Fabric?

Eventstream is a no-code platform offered within Microsoft Fabric for ingesting, transforming, and routing streaming event data. It’s built for real-time scenarios—IoT, app telemetry, business transactions—allowing seamless integration, live analytics, and workflow automation.

Use-cases: IoT monitoring, operational dashboards, anomaly alerts, unified event processing.

  • Unified ingress: Connects cloud, on-prem, and IoT streams.
  • No-code pipeline: Drag-and-drop modeling, data enrichment, complex joins/aggregations.
  • Multi-destination: Send processed events to Lakehouse, KQL DB, Eventhouse, REST, or Data Activator for automation.

2️⃣ Key Concepts & Architecture

Eventstream pipelines consist of four main building blocks, working together for real-time analytics and actions:

Component | Description
--- | ---
Source | Where event data originates (e.g., Azure Event Hubs, IoT devices, CDC-enabled DBs, streaming APIs, 3rd-party cloud brokers)
Stream | Continuous, live event flow for processing
Transform/Processor | Visual blocks for filtering, enriching, aggregating, joining, or splitting the data streams
Destination | Output sinks: Lakehouse (Parquet), KQL DB, Eventhouse, Data Activator, custom API/webhooks

Supported Data Sources:

  • Azure Event Hubs/Service Bus
  • Azure IoT Hub
  • Amazon Kinesis & Google Pub/Sub
  • Databases via CDC (SQL, Cosmos DB, etc.)
  • External MQTT Brokers, Kafka, Solace
  • App and Web telemetry (via REST/Kafka endpoints)

3️⃣ Prerequisites

  • Microsoft Fabric workspace with Contributor+ permissions
  • Assigned streaming capacity/SKU (a trial capacity works for testing)
  • Access to the source and destination services (credentials/keys as needed)
  • Basic understanding of your target data scenario

4️⃣ Step-by-Step: Build a Complete Eventstream in Fabric

4.1 Create an Eventstream Pipeline

  1. Log in to the Fabric Portal.
  2. Select your workspace, then click + New → Eventstream.
  3. Enter a name (e.g., IoTMonitoringStream) and click Create.

4.2 Add Data Sources

  1. On the Eventstream canvas, click Add Source.
  2. Pick source type (Event Hub, IoT Hub, Service Bus, CDC for DBs, Kafka, REST Endpoint, etc.).
    • Paste the required credentials/connection details when prompted.
    • For Custom Apps: Copy the Event Hub name and connection string for code integration (see below).

4.3 Apply Stream Transformations (Drag-and-Drop)

Transformation | Details | Example Scenario
--- | --- | ---
Filter | Route only events matching logical conditions | temperature > 35 (only critical IoT events forwarded)
Manage Fields | Rename, drop, or type-cast fields for downstream compatibility | Remove unnecessary log fields; convert location to string
Aggregate | Calculate sum, min, max, or average within a time window (sliding/tumbling) | Compute hourly average sensor readings per device
Group By | Aggregate events grouped by key(s) | Summarize event count per region in real time
Join | Match/merge two streams on keys/timestamps | Join sensor readings with device metadata
Union | Merge multiple streams (same schema) for unified output | Combine event traffic from multiple factories/countries
Expand Array | Create a row per array element within an event | Decompose a batched sensor packet into per-reading records
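As a mental model, the Filter and Aggregate transforms above behave like the plain JavaScript below over a batch of sample events. This is illustrative only: in Eventstream these run continuously over the live stream via drag-and-drop blocks, not over in-memory arrays, and the sample device IDs and readings are invented.

```javascript
// Sample IoT events (deviceId, temperature, timestamp in ms) -- invented data.
const events = [
  { deviceId: "sensor-101", temperature: 22, ts: 0 },
  { deviceId: "sensor-103", temperature: 41, ts: 1000 },
  { deviceId: "sensor-103", temperature: 39, ts: 61000 },
  { deviceId: "sensor-101", temperature: 36, ts: 62000 },
];

// Filter: keep only critical readings (temperature > 35).
const critical = events.filter((e) => e.temperature > 35);

// Aggregate: average temperature per device within tumbling windows.
function tumblingAverage(stream, windowMs) {
  const buckets = new Map();
  for (const e of stream) {
    const win = Math.floor(e.ts / windowMs) * windowMs;
    const key = `${e.deviceId}|${win}`;
    const b = buckets.get(key) ?? { deviceId: e.deviceId, windowStart: win, sum: 0, count: 0 };
    b.sum += e.temperature;
    b.count += 1;
    buckets.set(key, b);
  }
  return [...buckets.values()].map((b) => ({
    deviceId: b.deviceId,
    windowStart: b.windowStart,
    avgTemperature: b.sum / b.count,
  }));
}

const averages = tumblingAverage(critical, 60000);
console.log(critical.length);
console.log(averages);
```

A tumbling window assigns each event to exactly one fixed-size, non-overlapping bucket; a sliding window would instead let an event contribute to several overlapping buckets.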

4.4 Add Event Destinations

  1. On the canvas, click Add Destination.
  2. Select:
    • Lakehouse (Delta/Parquet for storage; supports batch and analytics)
    • KQL Database (for instant querying and Power BI dashboards)
    • Eventhouse (optimized for high-velocity clickstream data)
    • Data Activator (to trigger notifications, workflows, or integrations on events)
    • Custom API or Webhook (trigger third-party apps and workflows)
  3. Configure additional partitioning, latency windows, or retention as needed.
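Conceptually, attaching several destinations amounts to a routing rule per event. The sketch below (hypothetical rules in plain JavaScript, not Fabric's actual internals) shows the kind of split Eventstream performs when different destinations receive different slices of the stream:

```javascript
// Hypothetical routing rule: critical events go to an alerting destination,
// everything else lands in the Lakehouse for batch analytics.
function routeEvent(event) {
  if (event.temperature > 38) {
    return { destination: "data-activator", event }; // trigger alerts
  }
  return { destination: "lakehouse", event };        // storage/analytics
}

const routed = [
  { deviceId: "sensor-101", temperature: 22 },
  { deviceId: "sensor-103", temperature: 41 },
].map(routeEvent);

console.log(routed.map((r) => r.destination));
```

In the Fabric UI this split is expressed by placing a Filter transform in front of each destination rather than by writing code.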

5️⃣ Eventstream & Code: Custom App Integration

Integrate Eventstream with Your Code (Node.js Example)

Send events from an application using Azure Event Hubs SDK:

```javascript
const { EventHubProducerClient } = require("@azure/event-hubs");

const connectionString = "<Fabric stream connection string>";
const eventHubName = "<Fabric stream name>";
const client = new EventHubProducerClient(connectionString, eventHubName);

async function sendEvent(data) {
  const batch = await client.createBatch();
  if (!batch.tryAdd({ body: data })) {
    throw new Error("Event too large for the batch");
  }
  await client.sendBatch(batch);
}

sendEvent({ temperature: 41, deviceId: "sensor-103", status: "hot" })
  .catch(console.error)
  .finally(() => client.close());
```
  • Replace the placeholders with the values shown in the Fabric Eventstream source connection UI.
  • Test sending a sample event as JSON.
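Before wiring in real devices, it helps to fabricate sample readings locally. The helper below is a sketch with invented field values; each generated payload could then be passed to a sender like the sendEvent function shown above:

```javascript
// Generate a plausible test payload for a given device (illustrative values).
function makeReading(deviceId) {
  const temperature = Math.round((20 + Math.random() * 25) * 10) / 10;
  return {
    deviceId,
    temperature,
    status: temperature > 35 ? "hot" : "normal",
    timestamp: new Date().toISOString(),
  };
}

// Event bodies must be JSON-serializable; verify the round-trip locally.
const reading = makeReading("sensor-103");
const roundTrip = JSON.parse(JSON.stringify(reading));
console.log(roundTrip);
```

Checking the JSON round-trip locally catches non-serializable values (Dates left as objects, undefined fields) before they reach the stream.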

6️⃣ Advanced Techniques & Automation

  • Real-time analytics: Route events into KQL DB for streaming SQL-like queries and immediate dashboarding.
  • Data Activator: Define conditions (e.g., humidity > 80) to auto-trigger Teams/Slack alerts, Azure Logic Apps, or custom notifications without writing code.
  • Historical replay: For CDC/DB log sources, reprocess past data as live events via Lakehouse/Eventhouse storage sinks.
  • Combining sources: Use Unions/Joins to orchestrate complex multi-source pipelines (e.g., merge app telemetry + IoT + transaction logs into a single unified analytics stream).
  • Monitoring & diagnostics: Monitor pipeline health, throughput, and error metrics in Fabric’s native dashboard. Proactively set up alerts for failures or stalling pipelines.
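The Data Activator condition above (humidity > 80) is defined entirely in the UI, but its semantics amount to a threshold check per event. The hypothetical sketch below also debounces, firing only when the condition first becomes true rather than on every event above the threshold, to avoid alert storms:

```javascript
// Hypothetical threshold alert with rising-edge debouncing.
function makeAlerter(threshold) {
  let active = false;
  return function check(event) {
    const breach = event.humidity > threshold;
    const fire = breach && !active; // fire only on the rising edge
    active = breach;
    return fire;
  };
}

const check = makeAlerter(80);
const fired = [
  { humidity: 75 },
  { humidity: 85 }, // crosses threshold: alert
  { humidity: 90 }, // still above: no duplicate alert
  { humidity: 60 },
  { humidity: 82 }, // crosses again: alert
].map(check);

console.log(fired);
```

In practice Data Activator handles this logic without code; the sketch only shows what the no-code rule computes.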

7️⃣ Monitoring, Troubleshooting & Optimization

  • Use Fabric’s built-in tools to view stream health, event lag, dropped data, and throughput.
  • Enable diagnostic logging for long-term monitoring/troubleshooting.
  • Partition data in destinations for scalable storage and query performance (especially for high-volume/lakehouse scenarios).
  • Design granularity (e.g., hourly/daily files) based on expected analytics and retention needs.
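For example, hour-level granularity maps each event timestamp to a partition folder like the one computed below. The year=/month=/day=/hour= scheme is an illustrative Hive-style layout, not a path Fabric prescribes:

```javascript
// Map an event timestamp to an hourly partition path (illustrative scheme).
function partitionPath(isoTimestamp) {
  const d = new Date(isoTimestamp);
  const pad = (n) => String(n).padStart(2, "0");
  return `year=${d.getUTCFullYear()}/month=${pad(d.getUTCMonth() + 1)}` +
         `/day=${pad(d.getUTCDate())}/hour=${pad(d.getUTCHours())}`;
}

console.log(partitionPath("2025-03-07T14:42:00Z"));
```

Coarser granularity (daily files) means fewer, larger files and cheaper listings; finer granularity (hourly) speeds up time-bounded queries at the cost of more small files.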

8️⃣ Best Practices for Production Eventstream in Fabric

  • Enable Enhanced Capabilities: Unlock advanced transformations and flexible routing.
  • Keep transformations modular: Isolate heavy logic for maintainability and performance.
  • Document your flows: Use the built-in description fields and maintain diagrams for team clarity.
  • Start with staging: Build/test in a non-production workspace before moving to critical data flows.
  • Leverage automation: Use Data Activator for hands-off, event-driven workflows across the organization.

9️⃣ Real-World Case Study: Logistics IoT Monitoring

Scenario: A logistics provider needs to monitor truck temperature and route violations.

Design:

  • Source: Azure IoT Hub streaming truck sensor feeds
  • Transform: Filter readings where temperature > 38°C
  • Destination: Real-time KQL query for dashboard; Data Activator for instant SMS/Teams alerts

Outcome: Operational dashboard, live anomaly detection, and automated incident response—all real time.
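The design above can be sketched as a filter-then-alert pipeline. The JavaScript below is conceptual only (truck IDs, readings, and the alert message format are invented); in Fabric, each stage is a no-code block:

```javascript
// Conceptual version of the logistics pipeline: filter -> alert.
const TEMP_LIMIT = 38;

function processTruckEvents(events) {
  const violations = events.filter((e) => e.temperature > TEMP_LIMIT);
  const alerts = violations.map((e) => ({
    channel: "teams", // Data Activator would fan out to SMS/Teams
    message: `Truck ${e.truckId}: ${e.temperature}°C exceeds ${TEMP_LIMIT}°C`,
  }));
  return { violations, alerts };
}

const { violations, alerts } = processTruckEvents([
  { truckId: "TRK-7", temperature: 36.5 },
  { truckId: "TRK-9", temperature: 41.2 },
]);
console.log(alerts);
```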

