Cannot resolve <sqlExpr> due to data type mismatch:
Input to <functionName> should have been <dataType> followed by a value with the same element type, but it's [<leftType>, <rightType>].
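As a hypothetical illustration (the function and literals here are mine, not from this page), passing a value whose type cannot be reconciled with the array's element type typically triggers this mismatch:

```sql
-- The array's element type is INT, but the searched value is an
-- ARRAY<INT>, so the element types cannot be reconciled.
SELECT array_position(array(1, 2, 3), array(1));
```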
Input to function <functionName> should have been two <arrayType> with the same element type, but it's [<leftType>, <rightType>].
The left and right operands of the binary operator have incompatible types (<left> and <right>).
The binary operator requires the input type <inputType>, not <actualDataType>.
The Bloom filter binary input to <functionName> should be either a constant value or a scalar subquery expression, but it's <actual>.
Input to function <functionName> should have been <expectedLeft> followed by a value with <expectedRight>, but it's [<actual>].
Unable to convert column <name> of type <type> to JSON.
Cannot drop all fields in struct.
The function <functionName> parameter <parameterName> at position <pos> requires <requiredType>. The argument given is <argumentType>.
Cannot cast <srcType> to <targetType>.
Cannot cast <srcType> to <targetType> with ANSI mode on. If you have to cast <srcType> to <targetType>, you can set <config> as <configVal>.
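For instance (a sketch of one plausible case, not taken from this page), with ANSI mode enabled some casts are rejected at analysis time, and the error message names the configuration that would relax the check:

```sql
-- With ANSI mode on, a cast such as DATE to INT is rejected at
-- analysis time; the message suggests the config to relax this.
SET spark.sql.ansi.enabled = true;
SELECT CAST(DATE'2024-01-01' AS INT);
-- The suggested workaround would be: SET spark.sql.ansi.enabled = false;
```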
Cannot cast <srcType> to <targetType>. To convert values from <srcType> to <targetType>, you can use the functions <functionNames> instead.
The given keys of function <functionName> should all be the same type, but they are <dataType>.
The given values of function <functionName> should all be the same type, but they are <dataType>.
Only foldable STRING expressions are allowed to appear at odd position, but they are <inputExprs>.
Input to <functionName> should all be the same type, but it's <dataType>.
Filter expression <filter> of type <type> is not a boolean.
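For example (an illustrative query of my own, assuming a Spark SQL session), a non-boolean WHERE predicate is rejected rather than coerced:

```sql
-- The literal 1 has type INT, not BOOLEAN, so the filter is invalid.
SELECT * FROM VALUES (1), (2) AS t(x) WHERE 1;
```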
Input to the function <functionName> cannot contain elements of the "MAP" type. In Spark, equal maps may have different hash codes, so hash expressions are prohibited on "MAP" elements. To restore the previous behavior, set "spark.sql.legacy.allowHashOnMapType" to "true".
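A hypothetical query of my own that would typically hit this restriction, together with the legacy flag the message mentions:

```sql
-- hash() over a MAP value is prohibited because equal maps need not
-- produce equal hash codes.
SELECT hash(map('a', 1));
-- Opting back into the legacy behavior (per the message above):
SET spark.sql.legacy.allowHashOnMapType = true;
```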
Input to the function <functionName> cannot contain elements of the "VARIANT" type yet.
Length of <exprName> should be 1.
The <inputName> value must be a <requireType> literal of <validValues>, but got <inputValue>.
Input schema <schema> can only contain STRING as a key type for a MAP.
Input schema <schema> must be a struct, an array, a map or a variant.
The key of map cannot be/contain <keyType>.
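As an illustration (a query of my own, not from this page), a map keyed by another map is rejected:

```sql
-- A MAP key may not itself be (or contain) a MAP, so this fails.
SELECT map(map(1, 2), 'value');
```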
The <functionName> does not support ordering on type <dataType>.
<errors>
The parameter value of the "apiKey" argument to the ai_generate_text function cannot be a constant <inputExpr>. Recommended usages include the secret(scope, key) function or a SELECT ... subquery.
Input schema <schema> can only contain STRING as a key type for a MAP.
The data type of one or more elements in the left hand side of an IN subquery is not compatible with the data type of the output of the subquery. Mismatched columns: [<mismatchedColumns>], left side: [<leftType>], right side: [<rightType>].
The number of columns in the left hand side of an IN subquery does not match the number of columns in the output of the subquery. Left hand side columns (length: <leftLength>): [<leftColumns>], right hand side columns (length: <rightLength>): [<rightColumns>].
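A minimal sketch of my own showing the column-count mismatch:

```sql
-- Left side has one column (a), but the subquery returns two,
-- so the IN comparison cannot be resolved.
SELECT * FROM VALUES (1) AS t(a)
WHERE a IN (SELECT 1, 2);
```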
The <functionName> should all be of type map, but it's <dataType>.
Input to <functionName> should have been <dataType> followed by a value with the same key type, but it's [<leftType>, <rightType>].
Input to the <functionName> should have been two maps with compatible key types, but it's [<leftType>, <rightType>].
The input <inputName> should be a foldable <inputType> expression; however, got <inputExpr>.
Parameter <paramIndex> must be an array of string literals.
all arguments must be strings.
Null typed values cannot be used as arguments of <functionName>.
The <leftExprName>(<leftExprValue>) must be <constraint> the <rightExprName>(<rightExprValue>).
The data type <orderSpecType> used in the order specification does not support the data type <valueBoundaryType> which is used in the range frame.
A range window frame with value boundaries cannot be used in a window specification with multiple order by expressions: <orderSpec>.
A range window frame cannot be used in an unordered window specification.
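A hypothetical window query of my own that would typically trip this rule:

```sql
-- A RANGE frame with a value boundary (1 PRECEDING) needs a single
-- ORDER BY expression in the window; without one it is rejected.
SELECT sum(x) OVER (RANGE BETWEEN 1 PRECEDING AND CURRENT ROW)
FROM VALUES (1), (2) AS t(x);
```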
The input parameter: <paramName>, value: <paramValue> cannot be used to construct a valid remote URL because <reason>.
The input parameter names are not correct. Missing required parameters: <missingValues>, unrecognized parameters: <unknownValues>.
<functionName> uses the wrong parameter type. The parameter type must conform to:
- The start and stop expressions must resolve to the same type.
- If start and stop expressions resolve to the <startType> type, then the step expression must resolve to the <stepType> type.
- Otherwise, if start and stop expressions resolve to the <otherStartType> type, then the step expression must resolve to the same type.
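An illustrative call of my own that would typically violate these rules for the sequence function:

```sql
-- start and stop are DATEs, so the step must be an interval
-- (e.g. INTERVAL '1' DAY); a plain integer step only suits numbers.
SELECT sequence(DATE'2024-01-01', DATE'2024-01-10', 3);
```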
Window frame bounds <lower> and <upper> do not have the same type: <lowerType> <> <upperType>.
Window frame upper bound <upper> does not follow the lower bound <lower>.
The data type of the <location> bound <exprType> does not match the expected data type <expectedType>.
Window frame <location> bound <expression> is not a literal.
The lower bound of a window frame must be <comparison> to the upper bound.
The data types of the column (<columnIndex>) do not match: <leftType> (<leftParamIndex>) <> <rightType> (<rightParamIndex>).
<msg> <hint>.
Class <className> not found.
The <paramIndex> parameter requires the <requiredType> type, however <inputSql> has the type <inputType>.
The <exprName> must not be null.
The <functionName> requires a return type of <expectedType>, but the actual return type is <actualType>.
Cannot find a static method <methodName> that matches the argument types in <className>.
The input of <functionName> cannot be of type <dataType>.
UDFs do not support '<dataType>' as an input data type.
UDFs do not support '<dataType>' as an output data type.
The <exprName> must be between <valueRange> (current value = <currentValue>).
The expression requires <expectedNum> argument types but the actual number is <actualNum>.
The number of endpoints must be >= 2 to construct intervals but the actual number is <actualNumber>.