Get started tutorials on Azure Databricks

The tutorials in this section introduce core features and guide you through the basics of working with the Azure Databricks platform.

For information about online training resources, see Get free Databricks training.

If you do not have an Azure Databricks account, sign up for a trial.

| Tutorial | Description |
| --- | --- |
| Query and visualize data | Use a Databricks notebook to query sample data stored in Unity Catalog and then visualize the query results in the notebook. |
| Import and visualize CSV data from a notebook | Use a Databricks notebook to import data from a CSV file at https://health.data.ny.gov into your Unity Catalog volume. |
| Create a table | Create a table and grant privileges in Azure Databricks using the Unity Catalog data governance model. |
| Explore dashboards and query data in Databricks One | Navigate the Databricks One interface designed for business users, view dashboards, and discover assets shared with you. |
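To give a flavor of the first tutorial, a notebook SQL cell can query sample data directly. This is a minimal sketch, assuming your workspace includes the built-in `samples` catalog that ships with many Databricks workspaces; the catalog, schema, and table names are assumptions, not part of this page:

```sql
-- Hypothetical query against the built-in samples catalog.
-- Adjust the three-level name (catalog.schema.table) to match your workspace.
SELECT pickup_zip, COUNT(*) AS trip_count
FROM samples.nyctaxi.trips
GROUP BY pickup_zip
ORDER BY trip_count DESC
LIMIT 10;
```

Running a cell like this in a notebook returns a result table that you can turn into a visualization from the output pane.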

Data engineering

| Tutorial | Description |
| --- | --- |
| Build an ETL pipeline using Lakeflow Spark Declarative Pipelines | Create and deploy an ETL (extract, transform, and load) pipeline for data orchestration using Lakeflow Spark Declarative Pipelines and Auto Loader. |
| Build an ETL pipeline using Apache Spark | Develop and deploy your first ETL (extract, transform, and load) pipeline for data orchestration with Apache Spark™. |
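The declarative-pipeline tutorial centers on defining tables rather than writing imperative jobs. As a rough sketch only, a streaming table that ingests files incrementally can be declared in SQL; the volume path and table name here are hypothetical placeholders, and the tutorial itself covers the exact syntax and setup:

```sql
-- Hypothetical declarative-pipeline definition: ingest raw files
-- incrementally from a Unity Catalog volume into a streaming table.
CREATE OR REFRESH STREAMING TABLE raw_orders
AS SELECT *
FROM STREAM read_files(
  '/Volumes/my_catalog/my_schema/landing/orders/',  -- placeholder path
  format => 'json'
);
```

The pipeline engine tracks which files have already been processed, so reruns pick up only new data.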

AI and machine learning

| Tutorial | Description |
| --- | --- |
| Train and deploy an ML model | Build a machine learning classification model using the scikit-learn library on Databricks to predict whether a wine is considered "high-quality". This tutorial also illustrates the use of MLflow to track the model development process, and Hyperopt to automate hyperparameter tuning. |
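The core of that tutorial is an ordinary scikit-learn workflow. The sketch below shows only the classification step on a synthetic stand-in dataset, assuming scikit-learn is available (it ships with Databricks Runtime ML); the MLflow tracking and Hyperopt tuning that the tutorial adds are omitted here:

```python
# Minimal classification sketch with a synthetic stand-in for the
# wine-quality data; feature values and labels are generated, not real.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic features and a binary "high-quality" style label.
X, y = make_classification(n_samples=200, n_features=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train a random forest classifier and evaluate held-out accuracy.
model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
```

In the full tutorial, the training run would additionally be wrapped in an MLflow run so that parameters, metrics, and the fitted model are logged automatically.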

Get help

  • If you have questions about setting up Azure Databricks and need live help, email onboarding-help@databricks.com.

  • If your organization does not have an Azure Databricks support subscription, or if you are not an authorized contact for your company's support subscription, you can get answers from the Databricks Community.