This section contains reference material and instructions for Lakeflow Declarative Pipelines developers.
Data loading and transformations are implemented in Lakeflow Declarative Pipelines by queries that define streaming tables and materialized views. To implement these queries, Lakeflow Declarative Pipelines supports SQL and Python interfaces. Because these interfaces provide equivalent functionality for most data processing use cases, pipeline developers can choose the interface that they are most comfortable with.
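As a sketch of what these defining queries look like, the SQL below creates a streaming table that incrementally ingests files and a materialized view computed from it. The volume path, file format, and column names are illustrative assumptions, not values from this article:

```sql
-- Streaming table: incrementally ingests new files as they arrive.
-- The volume path and JSON format are hypothetical examples.
CREATE OR REFRESH STREAMING TABLE raw_orders
AS SELECT * FROM STREAM read_files(
  '/Volumes/examples/raw/orders/',
  format => 'json'
);

-- Materialized view: kept up to date from the streaming table above.
CREATE OR REFRESH MATERIALIZED VIEW daily_order_counts
AS SELECT order_date, count(*) AS order_count
FROM raw_orders
GROUP BY order_date;
```

The same tables can be defined in Python with the `dlt` module, as covered in the Python topics in this section.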
Python development
Create Lakeflow Declarative Pipelines using Python code.
| Topic | Description |
|---|---|
| Develop pipeline code with Python | An overview of developing Lakeflow Declarative Pipelines in Python. |
| Lakeflow Declarative Pipelines Python language reference | Python reference documentation for the `dlt` module. |
| Manage Python dependencies for Lakeflow Declarative Pipelines | Instructions for managing Python libraries with Lakeflow Declarative Pipelines. |
| Import Python modules from Git folders or workspace files | Instructions for using Python modules that you have stored in Azure Databricks. |
SQL development
Create Lakeflow Declarative Pipelines using SQL code.
| Topic | Description |
|---|---|
| Develop pipeline code with SQL | An overview of developing Lakeflow Declarative Pipelines in SQL. |
| Lakeflow Declarative Pipelines SQL language reference | Reference documentation for SQL syntax for Lakeflow Declarative Pipelines. |
| Use Lakeflow Declarative Pipelines in Databricks SQL | Use Databricks SQL to work with Lakeflow Declarative Pipelines. |
Other development topics
The following topics describe other ways to develop Lakeflow Declarative Pipelines.
| Topic | Description |
|---|---|
| Convert Lakeflow Declarative Pipelines into a Databricks Asset Bundle project | Convert an existing pipeline to a bundle, which lets you manage your data processing configuration in a source-controlled YAML file for easier maintenance and automated deployments to target environments. |
| Create Lakeflow Declarative Pipelines with dlt-meta | Use the open source dlt-meta library to automate the creation of Lakeflow Declarative Pipelines with a metadata-driven framework. |
| Develop Lakeflow Declarative Pipelines code in your local development environment | An overview of options for developing Lakeflow Declarative Pipelines code locally. |