September 11, 2025
TLDR
This guide demonstrates how to connect Microsoft Fabric Warehouse to Mage Pro for automated data pipelines. The integration requires creating an Azure service principal for authentication, granting it access to your Fabric workspace, and configuring Mage with the connection credentials. Once set up, you can build data pipelines that pull data from external APIs (like weather services) and automatically load it into your Fabric warehouse.
Table of contents
Introduction
Understanding the integration
Azure authentication and Fabric workspace setup
Configure Mage Pro for Fabric integration
Creating Fabric-connected data pipelines in Mage Pro
Conclusion
Introduction
Modern data teams need efficient ways to connect cloud data warehouses with pipeline orchestration tools. This guide demonstrates how to integrate Microsoft Fabric Warehouse with Mage Pro, creating automated data workflows that can pull from both warehouse sources and external APIs.
Understanding the integration
Microsoft Fabric provides a unified analytics platform that combines data warehousing capabilities with other data services. Mage Pro serves as a data pipeline tool that simplifies ETL operations through a visual interface and code-based blocks.
The integration between these platforms relies on Azure Active Directory (Microsoft Entra ID) service principals for authentication. This approach provides secure, programmatic access without requiring individual user credentials in your data pipelines.
Azure authentication and Fabric workspace setup
Step 1: Create the service principal: Create a new app registration in Microsoft Entra ID through Azure Portal, giving it a descriptive name and setting it to "Accounts in this organizational directory only" for proper tenant access control.
Step 2: Copy the Azure Client ID: Obtain the Application (client) ID from the app registration overview page for your authentication configuration.
Step 3: Create a client secret: Generate a client secret with an appropriate expiration period to serve as your authentication credential. Make sure to copy the client secret value; you will need to store it as a secret referenced in your Mage io_config.yml file.
Step 4: Grant Fabric workspace access: Click "Add people or groups" in your Fabric workspace access management, search for your service principal using its name or application ID, and add it to the workspace.
Step 5: Assign workspace permissions: Grant either "Admin" or "Contributor" permissions to enable your service principal to connect to warehouse resources and execute queries through the Mage integration.

For more detailed setup instructions, see the Azure documentation for creating an Azure service principal and granting Fabric workspace access.
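If you prefer scripting the app registration, the Azure CLI can produce the same result. This is a minimal sketch; the service principal name is just an example, and you will still need to grant workspace access through the Fabric UI (steps 4 and 5 above).

```bash
# Sign in to the tenant that hosts your Fabric workspace
az login

# Create the app registration, service principal, and client secret in one step
# ("mage-fabric-sp" is an example name)
az ad sp create-for-rbac --name "mage-fabric-sp"

# The JSON output maps to the values you need:
#   appId    -> Azure client ID (step 2)
#   password -> client secret (step 3)
#   tenant   -> Azure tenant ID
```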
Configure Mage Pro for Fabric integration
Complete the integration by configuring Mage with your Azure authentication credentials and updating the io_config.yml file with your Fabric connection details. This configuration enables secure communication between Mage and your Fabric warehouse.
Step 1: Create a data pipeline: Create a new pipeline in Mage by navigating to the pipelines page and clicking “New pipeline.” Next, select “Start from Scratch” and then choose “Batch data pipeline.” Give the pipeline a name and click “Create new pipeline.” You will then be taken to the pipeline editor page, where you can begin your integration with Microsoft Fabric.

Step 2: Enter your secrets: Add your Azure service principal credentials and Fabric warehouse connection details to Mage's secret management system for secure storage. From the pipeline editor page, hover over the right popout navigation menu and select “Secrets.” Add each key-value pair in the text box and press Enter.
Step 3: Complete the Mage integration with Fabric: Define the warehouse connection parameters MICROSOFT_FABRIC_WAREHOUSE_NAME, MICROSOFT_FABRIC_WAREHOUSE_ENDPOINT, and MICROSOFT_FABRIC_WAREHOUSE_SCHEMA in your io_config.yml file. Use Mage's secret interpolation to reference your stored credentials securely. Your configuration should follow the format shown below.
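A sketch of what that section of io_config.yml could look like follows. The three MICROSOFT_FABRIC_WAREHOUSE_* keys are the parameters named above; the AZURE_* credential keys and the secret names passed to mage_secret_var are assumptions based on Mage's Azure conventions and the secrets you created in step 2, so match them to your own setup.

```yaml
version: 0.1.1
default:
  # Fabric warehouse connection details
  MICROSOFT_FABRIC_WAREHOUSE_NAME: my_warehouse  # example warehouse name
  MICROSOFT_FABRIC_WAREHOUSE_ENDPOINT: "{{ mage_secret_var('fabric_sql_endpoint') }}"
  MICROSOFT_FABRIC_WAREHOUSE_SCHEMA: dbo
  # Service principal credentials (key names assumed; check your Mage Pro template)
  AZURE_CLIENT_ID: "{{ mage_secret_var('azure_client_id') }}"
  AZURE_CLIENT_SECRET: "{{ mage_secret_var('azure_client_secret') }}"
  AZURE_TENANT_ID: "{{ mage_secret_var('azure_tenant_id') }}"
```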
Creating Fabric-connected data pipelines in Mage Pro
Once your authentication and workspace configuration are complete, building data pipelines in Mage involves adding data loader, transformer, and exporter blocks to handle the flow from external sources to your Fabric warehouse.
Step 1: Add a data loader block: Click the "Add block" button in your pipeline editor and select "Data loader." Choose "Generic (no template)" to create a custom loader for external APIs. This block will pull real-time data from sources like weather APIs, replacing static datasets with dynamic, up-to-date information. Copy the code below into your newly created data loader block, click the “Run” button, and confirm that data appears in the block's output.
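Here is a minimal example loader using the free Open-Meteo forecast API, which requires no API key; the coordinates and column names are placeholders you would swap for your own source.

```python
import pandas as pd
import requests

if 'data_loader' not in globals():
    from mage_ai.data_preparation.decorators import data_loader
if 'test' not in globals():
    from mage_ai.data_preparation.decorators import test


@data_loader
def load_weather_data(*args, **kwargs):
    """Pull hourly temperatures from the Open-Meteo API (example source)."""
    response = requests.get(
        'https://api.open-meteo.com/v1/forecast',
        params={
            'latitude': 40.71,   # example: New York City
            'longitude': -74.01,
            'hourly': 'temperature_2m',
        },
        timeout=30,
    )
    response.raise_for_status()
    hourly = response.json()['hourly']

    # Flatten the parallel time/temperature arrays into a DataFrame
    return pd.DataFrame({
        'time': pd.to_datetime(hourly['time']),
        'temperature_2m': hourly['temperature_2m'],
    })


@test
def test_output(output, *args) -> None:
    assert output is not None, 'The output is undefined'
    assert len(output) > 0, 'The API returned no rows'
```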
Step 2: Add a data exporter block: Create a data exporter by selecting "Data exporter" from the Blocks menu and choosing "Microsoft Fabric Warehouse" from the Data warehouse selection. This block handles writing your processed data to the Fabric warehouse using the authentication credentials you configured earlier.
Step 3: Configure the export settings: Specify your target schema (typically 'dbo'), table name, and export behavior in the exporter block. Set the if_exists parameter to 'append' for incremental data loading or 'replace' for full refreshes. The index=False setting prevents the pandas index column from being included in your warehouse table.
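Mage Pro generates the exporter boilerplate when you pick the Microsoft Fabric Warehouse template, so prefer the imports and class name in your generated block. The sketch below follows the pattern Mage uses for its other warehouse exporters; the MicrosoftFabricWarehouse class and module names, plus the table name, are assumptions.

```python
from os import path

from mage_ai.io.config import ConfigFileLoader
from mage_ai.settings.repo import get_repo_path
# Assumed import -- use whatever your generated Fabric template imports
from mage_ai.io.microsoft_fabric_warehouse import MicrosoftFabricWarehouse

if 'data_exporter' not in globals():
    from mage_ai.data_preparation.decorators import data_exporter


@data_exporter
def export_to_fabric(df, **kwargs):
    schema_name = 'dbo'            # target schema in the Fabric warehouse
    table_name = 'weather_hourly'  # example table name
    config_path = path.join(get_repo_path(), 'io_config.yaml')  # Mage's default config filename
    config_profile = 'default'

    MicrosoftFabricWarehouse.with_config(
        ConfigFileLoader(config_path, config_profile)
    ).export(
        df,
        schema_name,
        table_name,
        index=False,         # don't write the pandas index as a column
        if_exists='append',  # 'append' for incremental loads, 'replace' for full refreshes
    )
```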

The data flows seamlessly from external APIs through Mage's transformation capabilities into your Fabric warehouse, creating an automated pipeline that can run on scheduled intervals. This pattern scales to multiple data sources and supports complex transformation logic between the loader and exporter blocks.
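If you need that transformation step, a minimal transformer block placed between the loader and exporter might look like the following sketch; the unit conversion and column names are examples tied to the weather data above.

```python
import pandas as pd

if 'transformer' not in globals():
    from mage_ai.data_preparation.decorators import transformer


@transformer
def transform(df, *args, **kwargs):
    # Example: convert Celsius to Fahrenheit and stamp rows with the load time
    df['temperature_f'] = df['temperature_2m'] * 9 / 5 + 32
    df['loaded_at'] = pd.Timestamp.now(tz='UTC')
    return df
```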
Conclusion
This integration provides a foundation for building sophisticated data workflows that combine Microsoft Fabric's enterprise warehousing capabilities with Mage's flexible pipeline orchestration. The service principal authentication model ensures secure, scalable access while maintaining proper access controls.
The combination enables teams to build automated data pipelines that pull from multiple sources, transform data according to business requirements, and deliver insights through various output channels. With proper setup and monitoring, this integration supports reliable, production-scale data operations.