API reference

For most read and write operations on Delta tables, you can use the Apache Spark reader and writer APIs. For examples, see Table batch reads and writes and Table streaming reads and writes.
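As a minimal PySpark sketch of the standard reader and writer APIs, assuming a Spark session with Delta Lake configured and a hypothetical table path `/tmp/events` (this requires a running Spark environment and is not meant as a copy-paste recipe):

```python
from pyspark.sql import SparkSession

# Assumes Delta Lake is already configured on this Spark session
# (for example, on Databricks or via the delta-spark package).
spark = SparkSession.builder.appName("delta-example").getOrCreate()

# Write a DataFrame as a Delta table using the standard writer API.
df = spark.range(0, 5)
df.write.format("delta").mode("overwrite").save("/tmp/events")

# Read it back using the standard reader API.
events = spark.read.format("delta").load("/tmp/events")
events.show()
```

The same `format("delta")` pattern applies to Structured Streaming via `spark.readStream` and `df.writeStream`.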

However, some operations are specific to Delta Lake, and for these you must use the Delta Lake APIs. For examples, see Table utility commands.
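As a sketch of Delta Lake-specific operations that have no Spark reader/writer equivalent, assuming the `delta-spark` package is available and `/tmp/events` is an existing Delta table (the path and retention value are illustrative):

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# DeltaTable is part of the Delta Lake API, not core Spark.
delta_table = DeltaTable.forPath(spark, "/tmp/events")

# Inspect the table's transaction history.
delta_table.history().show()

# Remove data files no longer referenced by the table and older
# than the retention threshold (168 hours = 7 days, the default).
delta_table.vacuum(retentionHours=168)
```

Utility commands like `history` and `vacuum` operate on the Delta transaction log, which is why they live in the Delta Lake API rather than the Spark DataFrame API.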

Note

Some Delta Lake APIs are still evolving and are indicated with the Evolving qualifier in the API docs.

Azure Databricks ensures binary compatibility between the Delta Lake project and Delta Lake in Databricks Runtime. To view the Delta Lake API version packaged in each Databricks Runtime version, along with links to the API documentation, see the Delta Lake API compatibility matrix.