Connect dbt models to Mage Pro blocks (Step-by-step tutorial)

Learn how to seamlessly connect your dbt models with Mage Pro pipelines to automate data workflows and load data into BigQuery. This step-by-step tutorial guides you through creating batch pipelines, configuring YAML files, and running dbt transformations in Mage Pro.
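To give a sense of the YAML configuration the tutorial covers, a dbt profiles.yml targeting BigQuery typically looks like the sketch below; the profile name, project, dataset, and keyfile path are placeholders, so adapt them to your own setup.

```yaml
# Hypothetical dbt profiles.yml for a BigQuery target (all values are placeholders)
my_dbt_project:
  target: prod
  outputs:
    prod:
      type: bigquery
      method: service-account              # authenticate with a GCP service account key
      project: your-gcp-project-id         # placeholder GCP project ID
      dataset: analytics                   # placeholder BigQuery dataset for dbt models
      keyfile: /path/to/service_account.json
      threads: 4
```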
Migrate your dbt Cloud project to Mage Pro

Migrations are usually scary, but they don’t have to be. Moving your dbt Cloud project to Mage Pro is almost as simple as cloning your dbt project into Mage Pro and running a pipeline. You can be fully migrated in about 10 minutes and start connecting your dbt models to Mage Pro blocks, making for a smooth transition.
How to Integrate Mage Pro with VS Code: Step-by-Step Guide

This guide demonstrates how to integrate your Mage Pro environment with Visual Studio Code and Cursor. Using a Tailscale VPN and a Remote SSH connection, you can develop and manage your data pipelines directly within your preferred code editor, improving your development workflow and productivity. The integration gives you full access to your Mage Pro features while leveraging the advanced editing capabilities of modern code editors.
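As a minimal sketch of the Remote SSH side, once the Mage Pro host is reachable over Tailscale you typically add an entry to your local ~/.ssh/config; the host alias, address, and username below are placeholders.

```
# Hypothetical ~/.ssh/config entry; alias, address, and user are placeholders.
Host mage-pro
    # Tailscale IP or MagicDNS name of the Mage Pro machine
    HostName 100.x.y.z
    # Assumed remote username
    User mage
    Port 22
```

VS Code's or Cursor's Remote SSH extension can then connect to the mage-pro alias directly.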
Mastering Data Transformation with Mage SQL Blocks: A Technical Guide for Data Engineers

SQL Blocks let you connect to data warehouses like BigQuery with minimal setup while keeping the flexibility to write custom SQL. You can run either raw SQL (for multiple statements) or simple SQL queries, create and merge tables, and monitor execution output. Key best practices include organizing code, testing queries before deployment, and using version control. Common issues involve connection errors, SQL syntax problems, and data export failures, all of which can be resolved by checking configurations and logs.
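For context, Mage SQL blocks read warehouse credentials from an io_config.yaml profile; a BigQuery profile might look like the sketch below, assuming service account key authentication (verify the exact key names against your Mage version).

```yaml
# Sketch of an io_config.yaml profile for BigQuery (path and location are placeholders)
version: 0.1.1
default:
  GOOGLE_SERVICE_ACC_KEY_FILEPATH: /path/to/service_account_key.json
  GOOGLE_LOCATION: US   # optional BigQuery location
```

A SQL block then points at this profile through its connection settings and writes plain SQL against the configured warehouse.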
Step-by-step guide to connecting Mage Pro SQL blocks with Databricks

This step-by-step tutorial shows how to integrate Mage Pro SQL blocks with Databricks to build automated data pipelines. It walks through configuring Databricks credentials in Mage Pro, creating a pipeline, adding a data loader block to fetch API data (using golf rankings as an example), securely storing API keys, creating SQL blocks to move the data into Databricks, and verifying the data with queries in Databricks. The integration streamlines data workflows by eliminating manual processes and creating a seamless connection between data sources and Databricks for analysis.
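To give a flavor of the data loader step, a sketch along the following lines is typical in Mage; the API URL, secret name, and response shape are placeholders, and get_secret_value is Mage's helper for reading stored secrets (check the import path against your version).

```python
import pandas as pd
import requests

if 'data_loader' not in globals():
    from mage_ai.data_preparation.decorators import data_loader
from mage_ai.data_preparation.shared.secrets import get_secret_value


@data_loader
def load_golf_rankings(*args, **kwargs):
    # Placeholder endpoint and secret name; swap in the real API and stored key.
    api_key = get_secret_value('GOLF_API_KEY')
    response = requests.get(
        'https://api.example.com/golf/rankings',   # hypothetical URL
        headers={'Authorization': f'Bearer {api_key}'},
        timeout=30,
    )
    response.raise_for_status()
    # Assume the API returns a JSON list of ranking records.
    return pd.DataFrame(response.json())
```

A downstream SQL block configured with the Databricks connection can then load this output into a table for querying.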
Azure Blob Storage file transfer using Mage Pro’s dynamic blocks

Companies are always looking for better ways to manage and process their data. In this blog post, we’ll explore a proof of concept developed for a leading beauty brand, demonstrating how Mage Pro can turn complicated data tasks into smooth operations. Whether you’re a data engineer, analyst, or simply interested in data management, this article will offer valuable insights into building effective data pipelines using Mage Pro, Azure Blob Storage, and Snowflake.
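As a hint at the dynamic block pattern the post relies on, a loader that lists files and fans out one child block run per file might look like the sketch below; the file names are placeholders standing in for a real Azure Blob Storage listing, and the two-list return value follows Mage's dynamic block convention.

```python
import os

if 'data_loader' not in globals():
    from mage_ai.data_preparation.decorators import data_loader


@data_loader
def list_blobs(*args, **kwargs):
    # Placeholder: a real pipeline would list blobs in an Azure Blob Storage
    # container here instead of using a hard-coded list.
    file_names = ['sales_2024_01.csv', 'sales_2024_02.csv', 'sales_2024_03.csv']

    # A dynamic block returns two lists: the data items (one child block run
    # per item) and per-item metadata used to label the spawned runs.
    items = [{'file_name': name} for name in file_names]
    metadata = [{'block_uuid': os.path.splitext(name)[0]} for name in file_names]
    return [items, metadata]
```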