How can I use PyCharm with Azure Databricks?
PyCharm by JetBrains is a dedicated Python integrated development environment (IDE) that provides a wide range of essential tools for Python developers, tightly integrated to create a convenient environment for productive Python, web, and data science development. You can use PyCharm on your local development machine to write, run, and debug Python code that runs in remote Azure Databricks workspaces.
The following Databricks tools enable functionality for working with Azure Databricks from PyCharm:
| Name | Description |
|---|---|
| PyCharm Databricks plugin | Configure a connection to a remote Databricks workspace and run files on Databricks clusters from PyCharm. This plugin is developed and provided by JetBrains in partnership with Databricks. |
| Databricks Connect in PyCharm with Python | Write, run, and debug local Python code on a remote Azure Databricks workspace from PyCharm. |
| Databricks Asset Bundles | Programmatically define, deploy, and run Azure Databricks jobs, Delta Live Tables pipelines, and MLOps Stacks using CI/CD best practices and workflows from PyCharm. |
| Databricks CLI | Work with Azure Databricks from the command line using the built-in Terminal in PyCharm. |
| Databricks SDK for Python | Write, run, and debug Python code that works with Azure Databricks in PyCharm. |
| Databricks SQL Connector for Python | Write, run, and debug Python code that works with Databricks SQL warehouses in remote Azure Databricks workspaces. |
| Provision infrastructure | Provision Azure Databricks infrastructure with Terraform and follow infrastructure-as-code (IaC) best practices using the Terraform and HCL plugin for PyCharm. Write and deploy Python definitions of Azure Databricks infrastructure in PyCharm through third-party offerings such as the Cloud Development Kit for Terraform (CDKTF) and Pulumi. |
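For example, Databricks Connect lets the Python code you write in PyCharm execute Spark operations on a remote cluster. The following is a minimal sketch, not an official example; it assumes the `databricks-connect` package is installed and that authentication is already configured (for example, through a Databricks configuration profile):

```python
# Minimal sketch: run a Spark query on a remote Azure Databricks cluster
# through Databricks Connect. Assumes the `databricks-connect` package is
# installed and authentication is configured (for example, via a
# Databricks configuration profile).

def count_sample_trips() -> int:
    # Imported inside the function so the sketch can be read, and the
    # function defined, without the package installed locally.
    from databricks.connect import DatabricksSession

    # Build a Spark session that targets the remote cluster configured
    # in your Databricks profile.
    spark = DatabricksSession.builder.getOrCreate()

    # `samples.nyctaxi.trips` is one of the built-in Databricks sample tables.
    return spark.read.table("samples.nyctaxi.trips").count()

# Usage (requires a reachable workspace):
# print(count_sample_trips())
```

Because the session is built by Databricks Connect rather than local Spark, you can step through this code with the PyCharm debugger while the query itself runs on the cluster.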
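The Databricks SDK for Python mentioned above wraps the Databricks REST APIs. As a hedged illustration (assuming the `databricks-sdk` package is installed and default authentication is set up), listing cluster names might look like this:

```python
# Minimal sketch: list cluster names in a workspace with the
# Databricks SDK for Python. Assumes the `databricks-sdk` package is
# installed and credentials are available to the client.

def list_cluster_names() -> list:
    # Imported inside the function so the sketch is readable, and the
    # function defined, without the package installed locally.
    from databricks.sdk import WorkspaceClient

    # WorkspaceClient() picks up credentials from environment variables
    # or a Databricks configuration profile by default.
    w = WorkspaceClient()
    return [c.cluster_name for c in w.clusters.list()]

# Usage (requires a reachable workspace):
# print(list_cluster_names())
```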
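The Databricks SQL Connector for Python provides a DB-API-style interface to SQL warehouses. A minimal sketch, assuming the `databricks-sql-connector` package is installed; the connection parameters are placeholders you supply from your warehouse's connection details:

```python
# Minimal sketch: run a query against a Databricks SQL warehouse with
# the Databricks SQL Connector for Python. All connection parameters
# are placeholders supplied by the caller.

def run_query(server_hostname: str, http_path: str, access_token: str):
    # Imported inside the function so the sketch defines cleanly
    # without the package installed locally.
    from databricks import sql

    # Open a connection to the SQL warehouse, run a trivial query,
    # and fetch all result rows.
    with sql.connect(
        server_hostname=server_hostname,
        http_path=http_path,
        access_token=access_token,
    ) as connection:
        with connection.cursor() as cursor:
            cursor.execute("SELECT 1 AS ok")
            return cursor.fetchall()

# Usage (requires your warehouse's connection details):
# rows = run_query(server_hostname, http_path, access_token)
```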