ALTER TABLE … COLUMN clause
Applies to: Databricks SQL Databricks Runtime
Adds, modifies, or drops a column in a table or a field in a column in a Delta Lake table.
Required permissions
If you use Unity Catalog, you must have MODIFY permission to:
- ALTER COLUMN
- ADD COLUMN
- DROP COLUMN
All other operations require ownership of the table.
Syntax
ALTER TABLE table_name
{ ADD COLUMN clause |
ALTER COLUMN clause |
DROP COLUMN clause |
RENAME COLUMN clause }
ADD COLUMN clause
This clause is not supported for JDBC data sources.
Adds one or more columns to the table, or fields to existing columns in a Delta Lake table.
Note
When you add a column to an existing Delta table, you cannot define a DEFAULT value. All columns added to Delta tables are treated as NULL for existing rows. After adding a column, you can optionally define a default value for the column, but the default is applied only to new rows inserted into the table. Use the following syntax:
ALTER TABLE table_name ALTER COLUMN column_name SET DEFAULT default_expression
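For example, with a hypothetical Delta table named events, you could add a column and then give it a default for future inserts (setting a default may require the table feature for column defaults to be enabled):

```sql
-- Add a new column; existing rows read it as NULL
ALTER TABLE events ADD COLUMN (event_source STRING);

-- Define a default that applies only to rows inserted from now on
ALTER TABLE events ALTER COLUMN event_source SET DEFAULT 'web';
```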
Syntax
{ ADD [ COLUMN | COLUMNS ]
( { { column_identifier | field_name } data_type
[ DEFAULT clause ] [ COMMENT comment ] [ FIRST | AFTER identifier ]
[ MASK clause ] } [, ...] ) }
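A sketch of the clause, using hypothetical table and column names:

```sql
-- Add a single column with a comment, positioned after an existing column
ALTER TABLE customers ADD COLUMN (middle_name STRING COMMENT 'Optional middle name' AFTER first_name);

-- Add a field to an existing struct column (Delta Lake); the path address must already exist
ALTER TABLE customers ADD COLUMN (address.zip_code STRING);

-- Add several columns at once
ALTER TABLE customers ADD COLUMNS (loyalty_tier STRING, signup_date DATE);
```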
Parameters
column_identifier

The name of the column to be added. The name must be unique within the table.

Unless FIRST or AFTER name are specified, the column or field is appended at the end.

field_name

The fully qualified name of the field to be added to an existing column. All components of the path to the nested field must exist, and the field name itself must be unique.
DEFAULT default_expression
Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above
Defines a DEFAULT value for the column which is used on INSERT and MERGE ... INSERT when the column is not specified.

If no default is specified, DEFAULT NULL is implied for nullable columns.

default_expression may be composed of literals, and built-in SQL functions or operators except:

- Aggregate functions
- Analytic window functions
- Ranking window functions
- Table valued generator functions

default_expression must not contain any subquery.

DEFAULT is supported for CSV, JSON, PARQUET, and ORC sources.

data_type

Specifies the data type of the column or field. Not all data types supported by Azure Databricks are supported by all data sources.
COMMENT comment
An optional STRING literal describing the added column or field.

FIRST

If specified, the column is added as the first column of the table, or the field is added as the first field in the containing struct.

AFTER identifier

If specified, the column or field is added immediately after the field or column identifier.

MASK clause
Applies to: Databricks SQL Databricks Runtime 12.2 LTS and above Unity Catalog only
Important
This feature is in Public Preview.
Adds a column mask function to anonymize sensitive data. All future queries from that column will receive the result of evaluating that function over the column in place of the column's original value. This can be useful for fine-grained access control purposes wherein the function can inspect the identity and/or group memberships of the invoking user in order to decide whether to redact the value.
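As a sketch, a mask could be a SQL UDF that inspects the caller's group membership; the function, table, and column names below are hypothetical:

```sql
-- Returns the real value only for members of the admins group
CREATE FUNCTION ssn_mask(ssn STRING) RETURNS STRING
  RETURN CASE WHEN is_account_group_member('admins') THEN ssn
              ELSE '***-**-****' END;

-- Add a column with the mask attached
ALTER TABLE users ADD COLUMN (ssn STRING MASK ssn_mask);
```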
ALTER COLUMN clause
Applies to: Databricks SQL Databricks Runtime
Changes a property or the location of a column.
Syntax
{ { ALTER | CHANGE } [ COLUMN ] { column_identifier | field_name }
{ COMMENT comment |
{ FIRST | AFTER column_identifier } |
{ SET | DROP } NOT NULL |
TYPE data_type |
SET DEFAULT clause |
DROP DEFAULT |
SYNC IDENTITY |
SET { MASK clause } |
DROP MASK |
SET TAGS clause |
UNSET TAGS clause } }
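A few common forms of the clause, using hypothetical table and column names:

```sql
ALTER TABLE customers ALTER COLUMN email COMMENT 'Primary contact email';
ALTER TABLE customers ALTER COLUMN email SET NOT NULL;    -- Delta Lake only
ALTER TABLE customers ALTER COLUMN email AFTER last_name; -- reposition (Delta Lake only)
```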
Parameters
column_identifier

The name of the column to be altered.

field_name

The fully qualified name of the field to be altered. All components of the path to the nested field must exist.
COMMENT comment
Changes the description of the column_name column. comment must be a STRING literal.

FIRST or AFTER identifier

Moves the column from its current position to the front (FIRST) or immediately AFTER the identifier. This clause is only supported if table_name is a Delta table.

TYPE data_type

Applies to: Databricks SQL Databricks Runtime 15.2 and above
Changes the data type of the column_name column. This clause is only supported if table_name is a Delta table.

The following type changes are supported for all Delta tables:
- Increasing the size of a VARCHAR column, for example, from VARCHAR(5) to VARCHAR(10)
- Changing the type of a CHAR column to a VARCHAR, for example, from CHAR(5) to VARCHAR(5)
- Changing the type of a CHAR or VARCHAR column to STRING, for example, from VARCHAR(10) to STRING
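For example, assuming a Delta table named products with suitably typed columns:

```sql
ALTER TABLE products ALTER COLUMN sku TYPE VARCHAR(20);  -- widen VARCHAR(10) to VARCHAR(20)
ALTER TABLE products ALTER COLUMN category TYPE STRING;  -- CHAR/VARCHAR to STRING
```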
The following type changes are supported for Delta tables with delta.enableTypeWidening set to true:

Important

This feature is in Public Preview in Databricks Runtime 15.2 and above.
| Source type | Supported wider types |
| --- | --- |
| BYTE | SHORT, INT, BIGINT, DECIMAL, DOUBLE |
| SHORT | INT, BIGINT, DECIMAL, DOUBLE |
| INT | BIGINT, DECIMAL, DOUBLE |
| BIGINT | DECIMAL, DOUBLE |
| FLOAT | DOUBLE |
| DECIMAL | DECIMAL with greater precision and scale |
| DATE | TIMESTAMP_NTZ |
For more detailed information on type widening, see Type widening.
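A sketch of enabling type widening and then widening a column, assuming a hypothetical Delta table named metrics with an INT column:

```sql
ALTER TABLE metrics SET TBLPROPERTIES ('delta.enableTypeWidening' = 'true');
ALTER TABLE metrics ALTER COLUMN hit_count TYPE BIGINT;  -- INT to BIGINT
```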
SET NOT NULL or DROP NOT NULL

Changes the domain of valid column values to exclude nulls (SET NOT NULL) or include nulls (DROP NOT NULL). This option is only supported for Delta Lake tables. Delta Lake ensures the constraint is valid for all existing and new data.

SYNC IDENTITY
Applies to: Databricks SQL Databricks Runtime 10.4 LTS and above
Synchronizes the metadata of an identity column with the actual data. When you write your own values to an identity column, the values might not comply with the metadata. This option evaluates the state and updates the metadata to be consistent with the actual data. After this command, the next automatically assigned identity value starts from start + (n + 1) * step, where n is the smallest value that satisfies start + n * step >= max() (for a positive step).

This option is only supported for identity columns on Delta Lake tables.
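For example, after manually inserting explicit values into an identity column of a hypothetical orders table:

```sql
-- Realign the identity metadata with the data actually present
ALTER TABLE orders ALTER COLUMN order_id SYNC IDENTITY;
```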
DROP DEFAULT
Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above
Removes the default expression from the column. For nullable columns this is equivalent to SET DEFAULT NULL. For columns defined with NOT NULL you need to provide a value on every future INSERT operation.

SET DEFAULT default_expression
Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above
Defines a DEFAULT value for the column which is used on INSERT and MERGE ... INSERT when the column is not specified.

If no default is specified, DEFAULT NULL is implied for nullable columns.

default_expression may be composed of literals, and built-in SQL functions or operators except:

- Aggregate functions
- Analytic window functions
- Ranking window functions
- Table valued generator functions

default_expression must not contain any subquery.

DEFAULT is supported for CSV, JSON, ORC, and PARQUET sources.

When you define the default for a newly added column, the default applies to all pre-existing rows. If the default includes a non-deterministic function such as rand or current_timestamp, the value is computed once when the ALTER TABLE is executed and applied as a constant to pre-existing rows. For newly inserted rows, the default expression runs once per row.

When you set a default using ALTER COLUMN, existing rows are not affected by that change.

SET MASK clause

Applies to: Databricks SQL Databricks Runtime 12.2 LTS and above Unity Catalog only
Important
This feature is in Public Preview.
Adds a column mask function to anonymize sensitive data. All future queries from that column will receive the result of evaluating that function over the column in place of the column's original value. This can be useful for fine-grained access control purposes wherein the function can inspect the identity and/or group memberships of the invoking user in order to decide whether to redact the value.
DROP MASK
Applies to: Unity Catalog only
Important
This feature is in Public Preview.
Removes the column mask for this column, if any. Future queries from this column will receive the column's original values.
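For example, assuming a previously created mask function named ssn_mask and a hypothetical users table:

```sql
-- Attach or replace a mask on an existing column
ALTER TABLE users ALTER COLUMN ssn SET MASK ssn_mask;

-- Remove the mask; queries see original values again
ALTER TABLE users ALTER COLUMN ssn DROP MASK;
```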
SET TAGS ( { tag_name = tag_value } [, ...] )
Applies to: Databricks SQL Databricks Runtime 13.3 LTS and above
Applies tags to the column. You need to have APPLY TAG permission to add tags to the column.

tag_name

A literal STRING. The tag_name must be unique within the table or column.

tag_value

A literal STRING.
UNSET TAGS ( tag_name [, ...] )
Applies to: Databricks SQL Databricks Runtime 13.3 LTS and above
Removes tags from the column. You need to have APPLY TAG permission to remove tags from the column.

tag_name

A literal STRING. The tag_name must be unique within the table or column.
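For example, tagging and then untagging a column; the tag names and values below are arbitrary strings:

```sql
ALTER TABLE customers ALTER COLUMN email SET TAGS ('pii' = 'true', 'owner' = 'data-team');
ALTER TABLE customers ALTER COLUMN email UNSET TAGS ('owner');
```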
DROP COLUMN clause
Important
This feature is in Public Preview.
Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above
Drop one or more columns or fields in a Delta Lake table.
When you drop a column or field, you must drop dependent check constraints and generated columns.
For requirements, see Rename and drop columns with Delta Lake column mapping.
Syntax
DROP [ COLUMN | COLUMNS ] [ IF EXISTS ] ( { column_identifier | field_name } [, ...] )
Parameters
IF EXISTS
When you specify IF EXISTS, Azure Databricks ignores an attempt to drop columns that do not exist. Otherwise, dropping non-existing columns causes an error.

column_identifier

The name of the existing column.

field_name

The fully qualified name of an existing field.
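For example, assuming Delta column mapping is enabled on a hypothetical customers table:

```sql
ALTER TABLE customers DROP COLUMN legacy_score;

-- Drop several columns or fields; IF EXISTS suppresses errors for missing ones
ALTER TABLE customers DROP COLUMNS IF EXISTS (legacy_score, address.zip_code);
```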
RENAME COLUMN clause
Important
This feature is in Public Preview.
Applies to: Databricks SQL Databricks Runtime 11.3 LTS and above
Renames a column or field in a Delta Lake table.
When you rename a column or field, you also need to change dependent check constraints and generated columns. Any primary keys and foreign keys using the column are dropped. In the case of foreign keys, you must own the table on which the foreign key is defined.
For requirements, see Rename and drop columns with Delta Lake column mapping.
Syntax
RENAME COLUMN { column_identifier TO to_column_identifier |
                field_name TO to_field_identifier }
Parameters
column_identifier

The existing name of the column.

to_column_identifier

The new column identifier. The identifier must be unique within the table.

field_name

The existing fully qualified name of a field.

to_field_identifier

The new field identifier. The identifier must be unique within the local struct.
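For example, assuming Delta column mapping is enabled on a hypothetical customers table:

```sql
ALTER TABLE customers RENAME COLUMN e_mail TO email;

-- Rename a nested field; the new identifier is unqualified
ALTER TABLE customers RENAME COLUMN address.zip TO zip_code;
```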
Examples
See ALTER TABLE examples.