If you're a data analyst who works primarily with SQL queries and your favorite BI tools, Databricks SQL provides an intuitive environment for running ad-hoc queries and creating dashboards on data stored in your data lake. These articles can help you get started.
Note
Databricks SQL Serverless is not available in Azure operated by 21Vianet.
To start, familiarize yourself with some basic Databricks SQL concepts. See Databricks SQL concepts.
Then, learn how to import and use sample dashboards from the Dashboard Samples Gallery that visualize queries. See Tutorial: Use sample dashboards.
Next, use dashboards to explore data and create a dashboard that you can share. See Dashboards.
Next, use the SQL task type in an Azure Databricks job to create, schedule, operate, and monitor workflows that include Databricks SQL objects such as queries, legacy dashboards, and alerts. See SQL task for jobs.
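For example, a simple saved query like the one below is the kind of Databricks SQL object a SQL task could run on a schedule against a SQL warehouse. This is only an illustrative sketch; the catalog, schema, and table names are placeholders, not objects that exist in your workspace.

-- Hypothetical saved query that a SQL task could run on a recurring schedule.
SELECT order_date, SUM(amount) AS daily_revenue
FROM main.sales.orders
GROUP BY order_date
ORDER BY order_date DESC
LIMIT 30;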
You can also attach a notebook to a SQL warehouse. See Notebooks and SQL warehouses for more information and limitations.
Next, learn how to use COPY INTO in Databricks SQL. See Tutorial: Use COPY INTO with Databricks SQL.
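As a minimal sketch of what COPY INTO looks like, the following statements create an empty Delta table and incrementally load CSV files into it from cloud storage. The table name and storage path are placeholders; the tutorial above walks through a complete end-to-end example.

-- Create an empty Delta table; its schema is inferred from the loaded files.
CREATE TABLE IF NOT EXISTS main.default.sales_raw;

-- Incrementally load new CSV files from cloud storage.
-- Files that have already been loaded are skipped on later runs.
COPY INTO main.default.sales_raw
FROM 'abfss://container@storageaccount.dfs.core.windows.net/raw/sales/'
FILEFORMAT = CSV
FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true')
COPY_OPTIONS ('mergeSchema' = 'true');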
To create a SQL warehouse, see Configure SQL warehouse.