API reference

For most common read and write operations on Delta tables, you can use the Apache Spark reader and writer APIs (see Table batch reads and writes and Table streaming reads and writes). However, some operations are specific to Delta Lake, and for those you must use the Delta Lake programmatic APIs. This article describes these programmatic APIs.

Note

Some programmatic APIs are still evolving and are indicated with the Evolving qualifier in the API docs.

Azure Databricks ensures binary compatibility between the Delta Lake project and Delta Lake in Databricks Runtime. The compatibility matrix lists the Delta Lake API version packaged in each Databricks Runtime version, with a link to the respective API documentation.