
How Lodgify built an AI-powered ticket classification system in 2 weeks using Mage Pro
December 9, 2025
When David Cahill, a data engineer at Lodgify, set out to solve his customer success team's manual ticket classification problem, he had one primary goal: build something fast that actually works. David and his team went from initial concept to a fully deployed AI-powered classification system in just two weeks, powered by Mage Pro.
"This was our first use of AI within Mage," David explained during a recent Learn with Mage session. "Some people on the team probably wouldn't want me to show this because it was an MVP, but I'm doing the presentation, so here we are. It's the first one we did, and we're basing all of our classifications on this now."
That MVP has since evolved into five different classification systems across Lodgify, fundamentally transforming how the vacation rental management platform handles customer support at scale.
The manual classification bottleneck
Lodgify, a SaaS platform for vacation rental management, processes thousands of customer success tickets as their platform continues to grow. Like many fast-scaling companies, they faced a challenge that was eating away at team productivity and customer satisfaction.
The customer success team was drowning in manual work. Every customer success ticket required a human agent to read through the content, understand the issue, classify it by category and module, determine its urgency, and then route it to the appropriate team member. As the platform scaled, this manual process became unsustainable.
"The customer success team was overwhelmed by a manual ticketing classification system," David noted. "Team members needed to read each customer success ticket and then classify it for their solution and response."
The problems compounded quickly:
Bottlenecks during high-volume periods meant customers waited longer for responses
Delayed routing prevented urgent issues from being addressed immediately
No visibility into trends made it impossible to proactively address recurring problems
Resource drain kept skilled support agents from focusing on complex, high-value customer interactions
For a company that prides itself on helping property managers streamline their operations, having their own support team bogged down in manual processes was ironic.
Why Mage Pro for AI orchestration?
Lodgify has been using Mage for data orchestration for nearly 3 years, managing approximately 80 pipelines in production across their entire data ecosystem. They've built everything from dynamic dataflow orchestrators replicating 91 core database tables to complex data science models predicting customer churn.
When it came time to solve the ticket classification problem, Mage was the natural choice for several reasons:
Speed of development: Mage's modular block architecture allows teams to build, test, and iterate rapidly. Individual components can be developed and tested independently before connecting them into a complete workflow.
Flexibility for AI workflows: Unlike rigid orchestration tools, Mage makes it easy to integrate LLM calls, manage prompts, and handle the dynamic nature of AI-powered pipelines.
Collaborative prompt engineering: By storing prompts in Google Sheets and loading them as data into Mage pipelines, even non-technical team members could participate in refining the classification logic.
Production-ready infrastructure: Moving from MVP to production required no architectural changes—the same pipeline structure that worked for testing scaled seamlessly to handle all customer tickets.

Building the classification pipeline: A modular approach
The beauty of Lodgify's solution lies in its simplicity. The pipeline consists of just four clearly defined steps, each handling a specific responsibility:
1. Prompt loading from Google Sheets: The team stores their classification prompts in Google Sheets, enabling non-technical team members to refine the AI's logic without touching code. This collaborative approach meant customer success managers could iterate on prompts based on real-world results while engineers focused on pipeline maintenance.
2. Ticket ingestion from BigQuery: The pipeline connects to BigQuery and pulls unprocessed tickets by tracking the last update timestamp. This incremental approach keeps the pipeline efficient and avoids unnecessary reprocessing.
3. LLM-powered classification: The pipeline combines prompts and ticket content into a structured message for the LLM, which returns three key outputs: category classification, module assignment, and urgency score. "We collate everything together as a general message for the prompt, we send it, and the LLM does the rest," David explained.
4. Multi-dimensional export: Classified tickets are exported to two separate BigQuery tables, one organized by module, another by category. This dual structure feeds directly into customer success dashboards, giving teams multiple views to prioritize their work in real-time.
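The four steps above can be sketched as plain Python functions. This is a minimal, illustrative sketch, not Lodgify's actual code: the function names and ticket fields are hypothetical, and the Google Sheets read, BigQuery queries, and LLM call are replaced with in-memory stand-ins (in a real Mage pipeline, each function would live in its own loader, transformer, or exporter block).

```python
from datetime import datetime, timezone

# --- 1. Prompt loading (stand-in for the Google Sheets read) ---
def load_prompts(sheet_rows):
    """Turn sheet rows into a name -> template lookup. In production the
    rows would come from the Google Sheets API via a data loader block."""
    return {row["name"]: row["template"] for row in sheet_rows}

# --- 2. Incremental ticket ingestion (stand-in for BigQuery) ---
def load_unprocessed_tickets(tickets, last_run_at):
    """Keep only tickets updated since the previous run, mirroring the
    last-update-timestamp tracking described above."""
    return [t for t in tickets if t["updated_at"] > last_run_at]

# --- 3. LLM-powered classification ---
def classify_tickets(tickets, prompts, call_llm):
    """Collate the prompt and ticket body into one message; the injected
    call_llm function returns category, module, and an urgency score."""
    classified = []
    for ticket in tickets:
        message = f"{prompts['classification']}\n\nTicket:\n{ticket['body']}"
        classified.append({**ticket, **call_llm(message)})
    return classified

# --- 4. Multi-dimensional export ---
def group_by_dimension(rows, dimension):
    """Group classified tickets by 'module' or 'category', matching the
    two BigQuery export tables that feed the dashboards."""
    tables = {}
    for row in rows:
        tables.setdefault(row[dimension], []).append(row)
    return tables
```

Because the LLM call is injected as a parameter, the whole flow can be exercised offline with a fake `call_llm`, which is what makes sample-data testing of each block cheap.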
The two-week sprint to production
The timeline is what makes this project remarkable: from conception to production deployment, the MVP took just two weeks.
"We got the brief: let's build an MVP. We built it in about two weeks, and that was with all the testing for the engineering side," David said. "Obviously the prompts took a bit longer to fine-tune, but that was the cool thing, we could iterate and work really fast on it."
This rapid development was possible because of Mage's modular approach. The team could:
Build each component independently
Test with sample data at every step
Iterate on prompts without redeploying code
Validate results before connecting to production data sources
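That last point, validating results before they touch production data, is straightforward to make concrete. The sketch below is a hypothetical schema check on the LLM's output, not Lodgify's implementation: the field names and allowed category values are illustrative assumptions.

```python
# Illustrative schema check for classified rows before export.
# Field names and allowed values are hypothetical, not Lodgify's.
REQUIRED_FIELDS = {"category", "module", "urgency"}
KNOWN_CATEGORIES = {"billing", "bug", "how-to", "feature-request"}

def validate_classification(row):
    """Return a list of problems with one classified row; an empty list
    means the row is safe to export."""
    problems = []
    for field in sorted(REQUIRED_FIELDS - row.keys()):
        problems.append(f"missing field: {field}")
    if "category" in row and row["category"] not in KNOWN_CATEGORIES:
        problems.append(f"unknown category: {row['category']!r}")
    urgency = row.get("urgency")
    if urgency is not None and not (isinstance(urgency, int) and 1 <= urgency <= 5):
        problems.append(f"urgency out of range: {urgency!r}")
    return problems
```

A check like this catches the failure mode specific to LLM-powered pipelines: the model returning a plausible-looking but out-of-vocabulary label that would silently pollute downstream dashboards.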
"Because of Mage's modular approach, it was a really cool experiment," David reflected. "This is one of our early AI adoptions within Mage, and I thought it was like a really cool piece of tech to show."

Scaling beyond the MVP
What started as a single classification system quickly proved its value, leading Lodgify to expand the approach across the organization. They now run five different AI-powered classification pipelines, each adapted for specific use cases and departments.
The original architecture scaled effortlessly. "We wanted to avoid somebody going in and having to classify tickets by hand," David explained. "With this setup, we give it the prompts that are already defined, we read the message, and the LLM classifies."
More sophisticated versions of the original pipeline now include:
Enhanced urgency detection with multi-level prioritization
Automated routing to specific team members based on expertise
Trend analysis to identify recurring issues
Integration with other Lodgify systems for context-aware classification
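To make the expertise-based routing and urgency escalation ideas concrete, here is a minimal sketch of one plausible policy. It is an assumption-laden illustration, not Lodgify's routing logic: the roster structure, a five-point urgency scale, and the tie-break by queue load are all hypothetical.

```python
def route_ticket(ticket, roster):
    """Escalate max-urgency tickets to the on-call lead; otherwise prefer
    agents whose expertise covers the ticket's module, breaking ties by
    the lightest current queue."""
    if ticket["urgency"] >= 5:
        return roster["on_call_lead"]
    experts = [a for a in roster["agents"] if ticket["module"] in a["expertise"]]
    pool = experts or roster["agents"]  # no expert available: use the whole team
    return min(pool, key=lambda a: a["open_tickets"])
```

Because the classifier already emits module and urgency, a routing rule like this becomes a pure function over its output, easy to test and easy for the customer success team to tune.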
Results that matter
The impact of automated ticket classification extends far beyond just saving time:
Immediate operational improvements: Manual classification that once took minutes per ticket now happens in seconds. Tickets are routed to the right team member on first assignment, eliminating the back-and-forth of incorrect routing.
Better customer experience: Urgent issues are identified and escalated automatically, ensuring customers with critical problems get faster responses. The team can now focus their energy on solving problems rather than categorizing them.
Data-driven insights: With consistent, automated classification, Lodgify gains visibility into ticket trends and patterns. They can identify recurring issues, allocate resources more effectively, and make product improvements based on support data.
Team empowerment: Less-technical team members can now refine classification logic by editing prompts in Google Sheets. This democratization of AI means the system improves continuously based on front-line feedback.
Want to learn more about building AI-powered data pipelines with Mage? Check out our documentation at docs.mage.ai or try Mage Pro free for 14 days.











