
AI and agentic coding

Using AI-powered workflows to build better data products.


Make your AI better at data work with dbt's agent skills

14 min read
Joel Labes
Staff Developer Experience Advocate at dbt Labs
Jason Ganz
Director of Community, Developer Experience & AI at dbt Labs

Community-driven creation and curation of best practices is perhaps the driving factor behind the rise of dbt and analytics engineering: transferable workflows and processes enable everyone to create and disseminate organizational knowledge. In the early days, the dbt_style_guide.md from dbt Labs (then Fishtown Analytics) contained foundational guidelines for anyone adopting the dbt viewpoint for the first time.

Today we released a collection of dbt agent skills so that AI agents (like Claude Code, OpenAI's Codex, Cursor, Factory, or Kilo Code) can follow the same dbt best practices you would expect of any collaborator in your codebase. This matters because skills extend an agent's baseline capabilities, transforming generalist coding agents into highly capable data agents.

dbt agent skills allow you to transform generalist coding agents into highly capable data agents

These skills encapsulate a broad swathe of hard-won knowledge from the dbt Community and the dbt Labs Developer Experience team. Collectively, they represent dozens of hours of focused work by dbt experts, backed by years of using dbt.
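If you haven't worked with agent skills before: a skill is typically a folder containing a SKILL.md file, with YAML frontmatter that tells the agent when to load it and markdown instructions for how to do the work. As a minimal sketch (the skill name and instructions below are illustrative, not the contents of the published dbt skills):

```
---
name: dbt-analytics-engineering
description: Conventions for building, testing, and documenting dbt models
---

# Analytics engineering with dbt

When creating or modifying dbt models:

1. Follow the staging -> intermediate -> marts layering convention.
2. Import upstream models with ref() in CTEs at the top of the file.
3. Add data tests (unique, not_null) for every primary key.
4. Run `dbt build` on modified models and fix failures before finishing.
```

Because the agent only loads a skill's full instructions when a task calls for them, skills can add deep expertise without permanently consuming context.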

With access to skills, agents like Claude take a systematic approach to tasks

Modernizing the Semantic Layer Spec

5 min read
Dave Connors
Product Manager at dbt Labs

New engine, who dis?

If you're reading this blog, you've almost certainly heard about the new dbt Fusion engine. It's been the talk of the data town since last January, culminating in Elias's legendary live demo at Coalesce 2025 of the incredible capabilities that native SQL comprehension in dbt can unlock. If you attended Coalesce, or have upgraded your project to Fusion already, you've likely also heard about the changes we've made to the authoring layer of dbt (the literal code you write in your project). As part of the major version upgrade, we took the opportunity to simplify and standardize dbt's configuration language so it's built to scale into the next era of analytics engineering.

In particular, we wanted to reevaluate how metrics are defined in the dbt Semantic Layer. We've heard from numerous community members over the years that defining metrics was just plain hard. In conversation with internal and external users and our newest pals from SDF, we've come up with a redesigned YAML spec that is simpler, more integrated with the dbt configuration experience we've come to know and love, and built for the future.
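For context on why this needed rethinking: under the pre-Fusion spec, even a simple metric was defined in its own top-level block, referencing a measure declared separately in a semantic model. A minimal, simplified example (the metric and measure names are illustrative):

```yaml
# Pre-Fusion Semantic Layer spec, simplified: the metric lives in its
# own block and points at a measure defined in a separate semantic model.
metrics:
  - name: total_revenue
    label: Total Revenue
    type: simple
    type_params:
      measure: revenue
```

The redesigned spec aims to reduce that indirection by integrating metric definitions with the rest of your dbt configuration; see the full post for the details.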

Building the Remote dbt MCP Server

7 min read
Devon Fulcher
Senior Software Engineer at dbt Labs

In April, we released the local dbt MCP (Model Context Protocol) server as an open source project to connect AI agents and LLMs with direct, governed access to trusted dbt assets. The dbt MCP server provides a universal, open standard for bridging AI systems with your structured context, keeping your agents accurate, governed, and trustworthy. Learn more in About dbt Model Context Protocol.
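To make the "local" part concrete: running the local server means launching it as a process on your own machine, typically from your MCP client's configuration. A minimal sketch (the environment variable names here are illustrative; check the dbt MCP documentation for the exact settings your setup requires):

```json
{
  "mcpServers": {
    "dbt": {
      "command": "uvx",
      "args": ["dbt-mcp"],
      "env": {
        "DBT_HOST": "cloud.getdbt.com",
        "DBT_PROD_ENV_ID": "123456",
        "DBT_TOKEN": "<your-service-token>"
      }
    }
  }
}
```

Every engineer who wants to use it ends up managing this process and its credentials themselves, which is exactly the friction described below.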

Since releasing the local dbt MCP server, the dbt community has been applying it in incredible ways, including agentic conversational analytics, data catalog exploration, and dbt project refactoring. However, a key piece of feedback we received from AI engineers was that the local dbt MCP server isn't easy to deploy or host for multi-tenant workloads, making it difficult to build applications on top of it.

This is why we are excited to announce a new way to integrate with dbt MCP: the remote dbt MCP server. The remote dbt MCP server doesn’t require installing dependencies or running the dbt MCP server in your infrastructure, making it easier than ever to build and run agents. It is available today in public beta for users with dbt Starter, Enterprise, or Enterprise+ plans, ready for you to start building AI-powered applications.
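Because the server is hosted, connecting is just pointing your MCP client at an endpoint rather than running a process. A rough sketch of the shape of such a configuration (the URL is a placeholder, not the actual endpoint, and header conventions vary by client; see the beta documentation for the real address and authentication scheme):

```json
{
  "mcpServers": {
    "dbt-remote": {
      "url": "https://<your-dbt-platform-host>/<remote-mcp-endpoint>",
      "headers": {
        "Authorization": "token <your-service-token>"
      }
    }
  }
}
```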

Introducing the dbt MCP Server – Bringing Structured Data to AI Workflows and Agents

16 min read
Jason Ganz
Director of Community, Developer Experience & AI at dbt Labs

dbt is the standard for creating governed, trustworthy datasets on top of your structured data. MCP is showing increasing promise as the standard for providing context to LLMs, allowing them to function at a high level in real-world, operational scenarios.

Today, we are open sourcing an experimental version of the dbt MCP server. We expect that over the coming years, structured data is going to become heavily integrated into AI workflows and that dbt will play a key role in building and provisioning this data.