pyspark.sql.functions.cbrt
- pyspark.sql.functions.cbrt(col)
 Computes the cube-root of the given value.
New in version 1.4.0.
Changed in version 3.4.0: Supports Spark Connect.
- Parameters
 - col : Column or column name
  target column to compute on.
- Returns
 Column
  the column for computed results.
Examples
Example 1: Compute the cube-root
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([(-8,), (0,), (8,)], ["value"])
>>> df.select("*", sf.cbrt(df.value)).show()
+-----+-----------+
|value|CBRT(value)|
+-----+-----------+
|   -8|       -2.0|
|    0|        0.0|
|    8|        2.0|
+-----+-----------+
Example 2: Compute the cube-root of invalid values
>>> from pyspark.sql import functions as sf
>>> spark.sql(
...     "SELECT * FROM VALUES (FLOAT('NAN')), (NULL) AS TAB(value)"
... ).select("*", sf.cbrt("value")).show()
+-----+-----------+
|value|CBRT(value)|
+-----+-----------+
|  NaN|        NaN|
| NULL|       NULL|
+-----+-----------+
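As the examples above show, the result column is named CBRT(value) by default. A small usage sketch (not one of the upstream examples): the result can be renamed with Column.alias; the one-row DataFrame and the name cube_root below are illustrative assumptions.

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([(27,)], ["value"])  # hypothetical one-row frame
>>> df.select(sf.cbrt("value").alias("cube_root")).show()
+---------+
|cube_root|
+---------+
|      3.0|
+---------+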