---
title: "Rethinking SQL ETL for modern data platforms"
source_name: "Databricks"
original_url: "https://www.databricks.com/blog/rethinking-sql-etl-modern-data-platforms"
canonical_url: "https://www.traeai.com/articles/3d70ce6e-561a-40cd-b0f2-a407303ee885"
content_type: "article"
language: "English"
score: 6
tags: ["ETL","SQL","data platforms"]
published_at: "2026-04-29T16:45:00+00:00"
created_at: "2026-04-29T23:33:03.666959+00:00"
---

# Rethinking SQL ETL for modern data platforms

Canonical URL: https://www.traeai.com/articles/3d70ce6e-561a-40cd-b0f2-a407303ee885
Original source: https://www.databricks.com/blog/rethinking-sql-etl-modern-data-platforms

## Summary

This article describes how Databricks rethinks SQL ETL for modern data platforms. The content is fairly high-level and light on deep technical detail.

## Key Takeaways

- Databricks proposes a new approach to SQL ETL suited to modern data platforms.
- The article emphasizes the importance of unified data management and governance.
- It mentions some new tools and practices, but offers few concrete implementation details.

## Content



Table of contents

*   [Run and operate SQL ETL on one platform](http://www.databricks.com/blog/rethinking-sql-etl-modern-data-platforms#section-1)
*   [Support how teams actually build SQL pipelines](http://www.databricks.com/blog/rethinking-sql-etl-modern-data-platforms#section-2)
*   [Build SQL pipelines that evolve with your workloads](http://www.databricks.com/blog/rethinking-sql-etl-modern-data-platforms#section-3)
*   [Why SQL ETL should shape your data platform strategy](http://www.databricks.com/blog/rethinking-sql-etl-modern-data-platforms#section-4)
*   [Conclusion](http://www.databricks.com/blog/rethinking-sql-etl-modern-data-platforms#section-5)


[Industries](http://www.databricks.com/blog/category/industries) · April 29, 2026

# Rethinking SQL ETL for modern data platforms

Reduce cost and complexity by unifying fragmented SQL pipelines on a single platform

by [Matt Jones](http://www.databricks.com/blog/author/matt-jones) and [Shanelle Roman](http://www.databricks.com/blog/author/shanelle-roman)

Summary

*   Fragmented SQL ETL drives hidden cost, brittle pipelines, and slow incident resolution
*   Running ETL across warehouses, orchestrators, and tools creates operational drag that scales with every pipeline
*   A unified platform for all SQL ETL removes coordination overhead and lets teams ship faster on one governed system

SQL is the foundation of modern data work. It’s how analytics engineers define transformations, how data warehouse engineers manage pipelines, and how analysts explore and refine data.

But while SQL itself is standardized, the systems used to run SQL ETL are anything but.

In most organizations, SQL pipelines are spread across a combination of tools: a data warehouse for execution, a transformation framework for modeling, an orchestrator for scheduling, and separate systems for monitoring, lineage, and data quality. Each layer addresses a specific need, but together they create a fragmented environment that is difficult to operate and increasingly difficult to scale.

As data teams scale, this fragmentation starts to show up in day-to-day operations. Pipelines fail across multiple systems, dependencies are difficult to trace, and resolving issues often requires jumping between tools that were never designed to work together. At the same time, expectations increase. Teams are asked to deliver fresher data, support more use cases, and move faster, without adding operational overhead.

This is where many data platform strategies begin to break down. Even as organizations invest in modern infrastructure, SQL ETL often remains distributed across multiple systems, carrying forward the same complexity and constraints.

The challenge isn’t SQL itself - it’s how SQL ETL is implemented.

If SQL ETL were designed from the ground up for how teams actually work today, it would look very different. In practice, it would mean:

*   A single platform for ETL
*   Support for every SQL practitioner
*   Open, future-ready pipelines

Together, these principles define a simpler and more durable approach to SQL ETL - one that reduces fragmentation today while supporting how data workloads evolve over time.

## Run and operate SQL ETL on one platform

The challenge in SQL ETL isn’t writing transformations - it’s operating pipelines as they span multiple systems.

In practice, this means coordinating execution in the data warehouse, orchestration in a separate system, and observability layered on afterward. Keeping pipelines running requires stitching these pieces together - tracking dependencies, diagnosing failures, and managing retries across tools that don’t share context.

As pipelines grow in number and importance, this coordination becomes a significant operational burden.

A unified platform simplifies this model by bringing these capabilities together. When execution, orchestration, observability, and governance are part of the same system, pipelines become easier to manage by design. Dependencies are tracked automatically, and issues can be identified and resolved more quickly because the relevant context is available in one place.

On Databricks, SQL ETL is defined and executed within a single platform. Pipelines run with built-in orchestration, while lineage and observability are captured automatically across each stage. Data quality checks and governance controls are integrated directly into pipeline execution rather than managed through separate tools.
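As an illustration of this model, a pipeline's transformations and quality rules can be declared together in SQL, with orchestration inferred from the queries themselves. The snippet below is a minimal sketch in declarative pipeline SQL; the table names, columns, and storage path (`raw_orders`, `amount`, `/Volumes/demo/landing/orders`) are hypothetical, and exact syntax may vary by release.

```sql
-- Minimal declarative pipeline sketch (hypothetical schema and paths).
-- Ingest raw files incrementally into a streaming table.
CREATE OR REFRESH STREAMING TABLE raw_orders
AS SELECT *
FROM STREAM read_files('/Volumes/demo/landing/orders', format => 'json');

-- Downstream table with a data quality expectation enforced at execution time:
-- rows with a non-positive amount are dropped and surfaced in pipeline metrics.
CREATE OR REFRESH MATERIALIZED VIEW clean_orders (
  CONSTRAINT valid_amount EXPECT (amount > 0) ON VIOLATION DROP ROW
)
AS SELECT order_id, customer_id, CAST(amount AS DECIMAL(10, 2)) AS amount
FROM raw_orders;
```

Because execution, orchestration, and lineage live in one system, the dependency of `clean_orders` on `raw_orders` is derived from the query itself rather than declared separately in an external scheduler.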

This approach is further strengthened by serverless infrastructure and AI-driven optimization. Performance tuning, resource management, and scaling are handled automatically, allowing teams to focus on delivering reliable data rather than operating systems.

> After transitioning our Databricks pipelines to serverless compute, HP realized cloud savings of over 32% and decreased the combined runtime of jobs by 36%. The effortless infrastructure management provided by serverless made this decision an obvious and strategic choice. — Luis Alonso, Head of Data Strategy & Engineering at HP Marketing

The result is a more streamlined and dependable foundation for SQL ETL - one that reduces operational overhead while improving performance and reliability at scale.

## Support how teams actually build SQL pipelines

SQL ETL is fragmented not just because of tools, but because teams don’t all build pipelines the same way.

Analytics engineers - who focus on defining business logic in SQL - often want a way to build pipelines without managing the underlying infrastructure, with testing, version control, and dependencies handled automatically. Data warehouse engineers tend to rely on SQL scripts and stored procedures, often within tightly controlled execution environments. Analysts may create transformations directly within no-code tools or lightweight SQL interfaces.

Many platforms implicitly favor one of these approaches. As organizations grow, they often introduce additional systems to support other personas, resulting in parallel environments that are difficult to standardize and maintain.

A more effective approach is to standardize the platform rather than the interface.

Databricks supports a range of SQL ETL workflows within the same environment. Teams can run existing [dbt workflows](https://docs.databricks.com/aws/en/integrations/dbt-core-tutorial) directly on the platform, lift and shift warehouse-style SQL into scripts and [stored procedures](https://www.databricks.com/blog/introducing-sql-stored-procedures-databricks), accelerate BI workloads with [Materialized Views in Databricks SQL](https://docs.databricks.com/aws/en/ldp/dbsql/materialized), define [declarative pipelines](https://docs.databricks.com/aws/en/ldp/) that simplify production workflows, or use [no-code tools](https://www.databricks.com/blog/announcing-public-preview-lakeflow-designer) for business analysts built on the same platform. Although these approaches differ in how pipelines are authored, they share the same execution engine, governance model, and observability framework.
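For example, warehouse-style procedural SQL can often run largely as-is. The sketch below assumes hypothetical `gold.daily_sales` and `silver.orders` tables; it follows the general shape of the SQL stored procedure syntax linked above, though parameter and security clauses may differ in detail.

```sql
-- Hypothetical idempotent daily rebuild in warehouse-style procedural SQL.
CREATE OR REPLACE PROCEDURE refresh_daily_sales(run_date DATE)
LANGUAGE SQL
AS BEGIN
  -- Remove any prior results for the day, then recompute them.
  DELETE FROM gold.daily_sales WHERE sale_date = run_date;

  INSERT INTO gold.daily_sales (sale_date, revenue)
  SELECT o.sale_date, SUM(o.amount)
  FROM silver.orders AS o
  WHERE o.sale_date = run_date
  GROUP BY o.sale_date;
END;

-- Invoked on a schedule or ad hoc:
CALL refresh_daily_sales(DATE '2026-04-28');
```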

![Image 4: SQL ETL workflows](https://www.databricks.com/sites/default/files/inline-images/2026-04-blog-rethinking-sql-etl-for-modern-data-platforms-inline-960x490.png)


This consistency allows organizations to support multiple development styles without introducing fragmentation in how pipelines are run. Teams can work at the level of abstraction that fits their needs, while still benefiting from shared lineage, monitoring, and operational controls.

It also ensures that existing warehouse-style SQL scripts and newer approaches can coexist on the same foundation. Teams do not need to choose between maintaining what they have and adopting new patterns—they can do both within a single system.

Each of these workflows is reflected in a dedicated authoring experience.

1. For **data warehouse engineers** running SQL scripts and stored procedures:

**SQL Editor for Stored Procedures & Materialized Views**

![Image 5: Simple SQL Editor for warehouse-style ETL](https://www.databricks.com/sites/default/files/inline-images/sql-editor-for-warehouse.png)


Simple SQL Editor for warehouse-style ETL

2. For **analytics engineers** building production pipelines with SQL:

**Spark Declarative Pipelines Editor**

![Image 6: IDE purpose-built for modernized, declarative SQL ETL](https://www.databricks.com/sites/default/files/inline-images/IDE-purpose-built-for-modernized-declarative-SQL-ETL.png)


IDE purpose-built for modernized, declarative SQL ETL

3. For **analysts and business users** preparing data without code:

**Lakeflow Designer**

![Image 7: Drag-and-drop canvas for no-code data prep](https://www.databricks.com/sites/default/files/inline-images/lakeflow-designer_0.png)


Natural language or drag-and-drop canvas for no-code data prep

The result is a more cohesive environment for SQL ETL, where collaboration improves and operational complexity does not increase with scale.

## Build SQL pipelines that evolve with your workloads

As new data sources, real-time use cases, and AI workloads emerge, teams are often forced to introduce additional systems or rewrite existing pipelines - adding complexity and cost over time.

Many SQL ETL solutions introduce these constraints through proprietary formats, tightly coupled execution models, or assumptions about how data will be processed. These constraints may not be immediately apparent, but they tend to surface as organizations expand into new workloads, require fresher data, or support a broader set of use cases.

A future-ready approach to SQL ETL prioritizes openness and flexibility from the outset.

Databricks builds SQL ETL on [open table formats](https://docs.databricks.com/aws/en/tables/#storage-formats) and ANSI SQL, helping ensure that pipelines remain portable and interoperable across systems. This reduces the risk of lock-in and allows organizations to retain control over their data and logic as their architecture evolves.

At the same time, Databricks provides a unified SQL model that supports both batch and [real-time](https://docs.databricks.com/aws/en/ldp/streaming-tables) analytics use cases. Rather than requiring separate systems for different workloads, the same SQL-based approach can be applied across a wide range of use cases.
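To make that concrete, the same style of SQL can target either a materialized view (refreshed in batch) or a streaming table (processed incrementally as rows arrive). The following is a hypothetical sketch; table and column names are illustrative.

```sql
-- Batch: a materialized view, recomputed or incrementally refreshed on a schedule.
CREATE OR REFRESH MATERIALIZED VIEW gold.revenue_by_day
AS SELECT order_date, SUM(amount) AS revenue
FROM silver.orders
GROUP BY order_date;

-- Real-time: a streaming table applying the same SQL idioms to new rows
-- incrementally, rather than through a separate streaming system.
CREATE OR REFRESH STREAMING TABLE silver.large_orders
AS SELECT order_id, customer_id, amount
FROM STREAM bronze.orders
WHERE amount > 1000;
```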

This flexibility allows pipelines to evolve alongside the organization. Teams can continue to run existing SQL workflows while adopting more advanced patterns - such as [incremental processing](https://docs.databricks.com/aws/en/optimizations/incremental-refresh#optimize-materialized-views) or declarative pipelines - when they are needed.

> The conversion to Materialized Views has resulted in a drastic improvement in query performance, with the execution time decreasing from 8 minutes to just 3 seconds. This enables our team to work more efficiently and make quicker decisions based on the insights gained from the data. Plus, the added cost savings have really helped. — Karthik Venkatesan, Security Software Engineering Sr. Manager, Adobe

By avoiding rigid architectural constraints, this approach provides a stable foundation that can support both current requirements and future demands without requiring disruptive changes.

## Why SQL ETL should shape your data platform strategy

Data platform discussions often focus on where data is stored and how queries are executed. In practice, however, the effectiveness of a platform depends just as much on how data pipelines are built and maintained, and whether they are defined in open, interoperable ways that avoid long-term lock-in.

If SQL ETL remains fragmented across multiple systems, organizations are likely to carry forward the same operational complexity and inefficiencies, even after adopting a new platform. Over time, this limits the value of the platform and makes it more difficult to scale data operations.

A more effective approach is to evaluate how well a platform supports SQL ETL across its full lifecycle - from development and execution to monitoring and governance. This includes the ability to support different working styles, reduce operational overhead, and adapt to evolving requirements without introducing additional systems.

Databricks addresses these needs by combining SQL execution, pipeline management, governance, and optimization within a single platform. This unified approach allows teams to build and operate SQL pipelines more efficiently while maintaining the flexibility to support a wide range of workloads.

## Conclusion

SQL will continue to play a central role in how organizations work with data.

As a result, the way SQL ETL is implemented has a direct impact on the effectiveness of the overall data platform. Fragmented approaches introduce complexity and slow teams down, while unified approaches simplify operations and improve scalability.

For organizations evaluating how to evolve their data platforms, SQL ETL is a core consideration. Databricks provides a model for unified, future-proof SQL ETL that brings together execution, pipeline management, and governance within a single platform, while remaining open and adaptable as requirements evolve.

In practice, most organizations aren’t starting from scratch. SQL ETL modernization often stalls because the cost and risk of rewriting production pipelines are too high. Rather than forcing a disruptive rebuild, a more effective approach is to evolve incrementally - running existing pipelines first, consolidating systems over time, and modernizing step by step.

This is how teams can reduce fragmentation today while building toward a more unified, future-proof data platform over time. We’ll dive into this approach in more detail in a future post. In the meantime, you can read more about building, running, and scaling SQL pipelines on a unified lakehouse platform in this ebook, [_A Guide to Building ETL Pipelines with SQL_](https://www.databricks.com/resources/ebook/guide-building-etl-pipelines-sql).

*   [University Alliance](http://www.databricks.com/university)
*   [Databricks Academy Login](https://www.databricks.com/learn/training/login)

Events

*   [Data + AI Summit](http://www.databricks.com/dataaisummit)
*   [Data + AI World Tour](http://www.databricks.com/dataaisummit/worldtour)
*   [AI Days](https://www.databricks.com/ai-days)
*   [Event Calendar](http://www.databricks.com/events)

Blog and Podcasts

*   [Databricks Blog](http://www.databricks.com/blog)
*   [AI Blog](http://www.databricks.com/blog/category/ai)
*   [Data Brew Podcast](http://www.databricks.com/discover/data-brew)
*   [Champions of Data & AI Podcast](http://www.databricks.com/discover/champions-of-data-and-ai)

About

Company
*   [Who We Are](http://www.databricks.com/company/about-us)
*   [Our Team](http://www.databricks.com/company/leadership-team)
*   [Databricks Ventures](http://www.databricks.com/databricks-ventures)
*   [Contact Us](http://www.databricks.com/company/contact)

Careers
*   [Open Jobs](http://www.databricks.com/company/careers/open-positions)
*   [Working at Databricks](http://www.databricks.com/company/careers)

Press
*   [Awards and Recognition](http://www.databricks.com/company/awards-and-recognition)
*   [Newsroom](http://www.databricks.com/company/newsroom)

[Security and Trust](http://www.databricks.com/trust)

About

Company

*   [Who We Are](http://www.databricks.com/company/about-us)
*   [Our Team](http://www.databricks.com/company/leadership-team)
*   [Databricks Ventures](http://www.databricks.com/databricks-ventures)
*   [Contact Us](http://www.databricks.com/company/contact)

Careers

*   [Open Jobs](http://www.databricks.com/company/careers/open-positions)
*   [Working at Databricks](http://www.databricks.com/company/careers)

Press

*   [Awards and Recognition](http://www.databricks.com/company/awards-and-recognition)
*   [Newsroom](http://www.databricks.com/company/newsroom)

Security and Trust

[![Image 10: databricks logo](https://www.databricks.com/sites/default/files/2023-08/databricks-default.png?v=1712162038)](https://www.databricks.com/)

Databricks Inc.

 160 Spear Street, 15th Floor

 San Francisco, CA 94105

 1-866-330-0121

*   [](https://www.linkedin.com/company/databricks)
*   [](https://www.facebook.com/pages/Databricks/560203607379694)
*   [](https://twitter.com/databricks)
*   [](https://www.databricks.com/feed)
*   [](https://www.glassdoor.com/Overview/Working-at-Databricks-EI_IE954734.11,21.htm)
*   [](https://www.youtube.com/@Databricks)

![Image 12](https://www.databricks.com/sites/default/files/2021/02/telco-icon-2.png?v=1715274112)

[See Careers](https://www.databricks.com/company/careers)

[at Databricks](https://www.databricks.com/company/careers)

*   [](https://www.linkedin.com/company/databricks)
*   [](https://www.facebook.com/pages/Databricks/560203607379694)
*   [](https://twitter.com/databricks)
*   [](https://www.databricks.com/feed)
*   [](https://www.glassdoor.com/Overview/Working-at-Databricks-EI_IE954734.11,21.htm)
*   [](https://www.youtube.com/@Databricks)

© Databricks 2026. All rights reserved. Apache, Apache Spark, Spark, the Spark Logo, Apache Iceberg, Iceberg, and the Apache Iceberg logo are trademarks of the [Apache Software Foundation](https://www.apache.org/).
