
Power BI Enhancements You Need to Know – Part 1: The New Fabric Workspace

With the launch of Microsoft Fabric, the Power BI Workspace has transformed from a basic reporting repository into a unified data experience platform. If you haven't explored your workspace lately, you're in for a surprise!


Then vs. Now: Power BI Workspace Evolution

Before Microsoft Fabric, a Power BI workspace mainly contained:

  • Reports – Interactive visuals and insights
  • Dashboards – Consolidated KPIs on a single canvas
  • Datasets – Published data models
  • Dataflows – ETL pipelines for reuse

Simple. Neat. But siloed.


Now with Fabric Integration – A Full-Fledged Data Hub

The Fabric-powered workspace now brings end-to-end data capabilities directly to Power BI users, enabling them to not only visualize but also ingest, store, transform, analyze, and automate – all in one place.


Here's what's new.

Microsoft Fabric – Key Capabilities Categorized

Microsoft Fabric offers a wide range of tools and capabilities to help you build end-to-end data solutions. These capabilities are categorized into the following major sections:

  • Visualize data
  • Get data
  • Store data
  • Prepare data
  • Analyze and train data
  • Track data
  • Develop data
  • Others

Visualize data

Present your data as rich visualizations and insights that can be shared with others.


  • Dashboard: Build a single-page data story.
  • Exploration (preview): Use lightweight tools to analyze your data and uncover trends.
  • Paginated Report (preview): Display tabular data in a report that's easy to print and share.
  • Real-Time Dashboard: Visualize key insights to share with your team.
  • Report: Create an interactive presentation of your data.
  • Scorecard: Define, track, and share key goals for your organization.

Get data

Ingest batch and real-time data into a single location within your Fabric workspace.


  • Copy job: Makes it easy to copy data in Fabric. Includes full copy, incremental copy, and event-based copy modes.
  • Data pipeline: Ingest data at scale and schedule data workflows.
  • Dataflow Gen1 / Gen2: Prep, clean, and transform data.
  • Eventstream: Capture, transform, and route real-time event streams to various destinations in desired formats with a no-code experience.
  • Mirrored databases (Azure Cosmos DB, PostgreSQL, SQL Database, SQL Managed Instance, Snowflake, SQL Server): Easily replicate data from existing sources into an analytics-friendly format.
  • Notebook: Explore, analyze, and visualize data and build ML models. Supports Apache Spark, Python, T-SQL, and more (see the ingestion sketch after this list).
  • Spark Job Definition: Define, schedule, and manage your Apache Spark jobs for big data processing.

Store data

Organize, query, and store your ingested data in an easily retrievable format.


  • Eventhouse: Rapidly load structured, unstructured, and streaming data for querying.
  • Lakehouse: Store big data for cleaning, querying, reporting, and sharing (a query sketch follows this list).
  • Sample warehouse: Start a new warehouse with sample data already loaded.
  • Semantic model: Combine data sources in a semantic model to visualize or share.
  • SQL database (preview): Build modern cloud apps that scale on an intelligent, fully managed database.
  • Warehouse: Provide strategic insights across your entire business from multiple data sources.
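
Once data is stored, you can query it straight from a notebook with Spark SQL. A quick sketch, reusing the hypothetical sales_raw table from the ingestion example above:

```python
# Query a Lakehouse table with Spark SQL; the table and column names are examples.
top_customers = spark.sql("""
    SELECT CustomerId, SUM(Amount) AS TotalAmount
    FROM sales_raw
    GROUP BY CustomerId
    ORDER BY TotalAmount DESC
    LIMIT 10
""")
top_customers.show()
```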

Prepare data

Clean, transform, extract, and load your data for analysis and modeling tasks.


  • Apache Airflow job: Simplifies the creation and management of Apache Airflow environments for end-to-end data pipelines.
  • Copy job: Includes full copy, incremental copy, and event-based copy modes.
  • Data pipeline: Ingest data at scale and schedule data workflows.
  • Dataflow Gen1 / Gen2: Prep, clean, and transform data.
  • Eventstream: Capture, transform, and route real-time streams.
  • Notebook: Explore, analyze, and build ML models (a cleaning sketch follows this list).
  • Spark Job Definition: Define and manage Spark jobs.
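
As a rough illustration of the Prepare stage, here is a notebook-style PySpark cleaning step. The table and column names are placeholders carried over from the earlier sketches:

```python
from pyspark.sql import functions as F

raw = spark.read.table("sales_raw")

clean = (
    raw.dropDuplicates(["OrderId"])                      # remove duplicate orders
       .filter(F.col("Amount").isNotNull())              # drop rows with no amount
       .withColumn("OrderDate", F.to_date("OrderDate"))  # normalize the date column
)

# Write the cleaned output back to the Lakehouse as a new Delta table.
clean.write.mode("overwrite").format("delta").saveAsTable("sales_clean")
```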

Analyze and train data

Propose hypotheses, train models, and explore your data to make decisions and predictions.


  • Environment: Set up Spark compute settings and resources for notebooks.
  • Experiment: Create, run, and track multiple models to validate hypotheses (see the MLflow sketch after this list).
  • ML model: Use machine learning to predict outcomes and detect anomalies.
  • Notebook: Analyze data and build ML models.
  • Spark Job Definition: Manage Spark jobs for big data processing.
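
Fabric notebooks come with MLflow tracking preconfigured, so logging a run into an Experiment takes only a few lines. The sketch below is illustrative; the experiment name, table, and feature columns are hypothetical:

```python
import mlflow
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Log runs into a Fabric Experiment (created if it doesn't exist yet).
mlflow.set_experiment("sales-forecast-experiment")

# Pull a (hypothetical) cleaned table into pandas for a quick model.
pdf = spark.read.table("sales_clean").toPandas()
X = pdf[["Quantity", "UnitPrice"]]
y = pdf["Amount"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    model = RandomForestRegressor(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    mlflow.log_metric("mae", mae)             # metric appears in the Experiment UI
    mlflow.sklearn.log_model(model, "model")  # model artifact, usable as an ML model item
```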

Track data

Monitor your streaming or near real-time data and take action on insights.


  • Activator: Monitor datasets, queries, and streams to trigger alerts.
  • Copy job: Includes event-based monitoring capabilities.
  • Eventhouse: Query streaming and structured data.
  • Eventstream: Real-time data transformation and routing.
  • KQL Queryset: Run queries to generate shareable insights (a Python example follows this list).
  • Scorecard: Define and track organizational goals.
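
An Eventhouse KQL database can also be queried from Python using the standard Kusto client, which is handy for automating checks outside the KQL Queryset editor. A hedged sketch, assuming the azure-kusto-data package is installed; the query URI, database, and table are placeholders you would copy from your own Eventhouse:

```python
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Placeholder query URI; copy the real one from your Eventhouse / KQL database details.
cluster_uri = "https://<your-eventhouse-query-uri>"
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(cluster_uri)
client = KustoClient(kcsb)

# A simple KQL query over a hypothetical events table.
response = client.execute("MyKqlDatabase", "Events | where Level == 'Error' | take 10")
for row in response.primary_results[0]:
    print(row)
```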

Develop data

Create and build software, applications, and data solutions.


  • API for GraphQL: Connect apps to Fabric data sources via GraphQL (see the example after this list).
  • Environment: Set up Spark resources for development.
  • Notebook: Develop analytical and ML solutions using various languages.
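
To give a feel for the API for GraphQL item, the sketch below posts a query from Python. It is illustrative only: the endpoint URL, the token scope, and the GraphQL schema (a hypothetical sales query) are placeholders; copy the real endpoint from the item in your workspace:

```python
import requests
from azure.identity import DefaultAzureCredential

# Placeholder endpoint; use the one shown on your API for GraphQL item.
endpoint = "https://<your-graphql-endpoint>/graphql"

# Acquire a token for the Fabric API (the scope shown here is an assumption).
token = DefaultAzureCredential().get_token("https://api.fabric.microsoft.com/.default").token

query = """
query {
  sales(first: 5) {
    items { OrderId Amount }
  }
}
"""

resp = requests.post(
    endpoint,
    json={"query": query},
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
print(resp.json())
```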

Others

Find unique or industry-specific functionality that extends Fabric’s capabilities.


  • Healthcare data solutions: Leverage AI to improve healthcare insights and patient outcomes.
  • Retail solutions: Scale and analyze retail data to enhance customer experiences.
  • Streaming dataset: Build visuals directly from real-time data streams (a push example follows this list).
  • Sustainability solutions: Unify ESG data for disclosures and analytics.
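
A streaming dataset, for example, exposes a push URL that any client can POST rows to. A minimal sketch, assuming the dataset was defined with matching column names; the URL is a placeholder copied from the dataset's API info panel:

```python
import requests
from datetime import datetime, timezone

# Placeholder push URL; copy the real one from the streaming dataset's "API info".
push_url = "https://api.powerbi.com/beta/<tenant-id>/datasets/<dataset-id>/rows?key=<key>"

rows = [{
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "sensor": "line-1",
    "temperature": 72.4,
}]

resp = requests.post(push_url, json=rows)
resp.raise_for_status()  # 200 OK means the row is available to real-time visuals
```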

Why It Matters

This unified approach breaks silos. You no longer have to juggle different tools for data storage, preparation, modeling, and visualization. Instead, Power BI Workspaces in Fabric act as your single pane of glass for the entire data lifecycle.


“From data ingestion to actionable dashboards – everything lives in one workspace.”


Proud to be a Microsoft Fabric community super user


Let's Connect on LinkedIn


Subscribe to my YouTube channel for Microsoft Fabric and Power BI updates.