element_at(array, index) |
Returns the element of the array at the given (1-based) index. If the index is 0,
Spark throws an error. If index < 0, accesses elements from the last to the first.
The function returns NULL if the index exceeds the length of the array and
`spark.sql.ansi.enabled` is set to false.
If `spark.sql.ansi.enabled` is set to true, it throws ArrayIndexOutOfBoundsException
for invalid indices. |
element_at(map, key) |
Returns the value for the given key. The function returns NULL
if the key is not contained in the map and `spark.sql.ansi.enabled` is set to false.
If `spark.sql.ansi.enabled` is set to true, it throws NoSuchElementException instead. |
map(key0, value0, key1, value1, ...) |
Creates a map with the given key/value pairs. |
map_concat(map, ...) |
Returns the union of all the given maps. |
map_contains_key(map, key) |
Returns true if the map contains the key. |
map_entries(map) |
Returns an unordered array of all entries in the given map. |
map_from_arrays(keys, values) |
Creates a map with a pair of the given key/value arrays. All elements
in keys should not be null. |
map_from_entries(arrayOfEntries) |
Returns a map created from the given array of entries. |
map_keys(map) |
Returns an unordered array containing the keys of the map. |
map_values(map) |
Returns an unordered array containing the values of the map. |
str_to_map(text[, pairDelim[, keyValueDelim]]) |
Creates a map after splitting the text into key/value pairs using delimiters. Default delimiters are ',' for `pairDelim` and ':' for `keyValueDelim`. Both `pairDelim` and `keyValueDelim` are treated as regular expressions. |
try_element_at(array, index) |
Returns the element of the array at the given (1-based) index. If the index is 0,
Spark throws an error. If index < 0, accesses elements from the last to the first.
The function always returns NULL if the index exceeds the length of the array. |
try_element_at(map, key) |
Returns the value for the given key. The function always returns NULL
if the key is not contained in the map. |
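The indexing rules shared by `element_at` and `try_element_at` can be sketched in plain Python (this is an illustrative model of the semantics described above, not Spark code): 1-based indexing, negative indices counting from the end, index 0 always an error, and NULL (`None`) for out-of-range lookups in the non-ANSI / `try_` variants.

```python
def try_element_at(arr, index):
    """Return arr[index] using 1-based SQL indexing, or None if out of range."""
    if index == 0:
        # Index 0 always raises, in every mode.
        raise ValueError("SQL array indices start at 1")
    if index > 0:
        pos = index - 1          # 1-based -> 0-based
    else:
        pos = len(arr) + index   # negative: count back from the last element
    if 0 <= pos < len(arr):
        return arr[pos]
    return None                  # NULL when the index exceeds the array length

print(try_element_at([10, 20, 30], 1))   # first element -> 10
print(try_element_at([10, 20, 30], -1))  # last element -> 30
print(try_element_at([10, 20, 30], 5))   # out of range -> None
```

Under `spark.sql.ansi.enabled=true`, `element_at` would raise instead of returning `None` for the out-of-range case; `try_element_at` keeps the NULL behavior regardless of the setting.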