pyspark.sql.functions.unix_micros

pyspark.sql.functions.unix_micros(col)

Returns the number of microseconds since 1970-01-01 00:00:00 UTC for the timestamp value in col.

New in version 3.5.0.

Examples

>>> from pyspark.sql.functions import unix_micros, to_timestamp
>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> df = spark.createDataFrame([('2015-07-22 10:00:00',)], ['t'])
>>> df.select(unix_micros(to_timestamp(df.t)).alias('n')).collect()
[Row(n=1437584400000000)]
>>> spark.conf.unset("spark.sql.session.timeZone")
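
For comparison, a minimal sketch of the companion epoch helpers, assuming the same session time zone and input as above: unix_seconds and unix_millis (also new in 3.5.0) return the same instant at second and millisecond precision, so the three results differ only by factors of 1000.

>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> from pyspark.sql.functions import unix_seconds, unix_millis, unix_micros, to_timestamp
>>> df = spark.createDataFrame([('2015-07-22 10:00:00',)], ['t'])
>>> ts = to_timestamp(df.t)
>>> df.select(unix_seconds(ts).alias('s'), unix_millis(ts).alias('ms'), unix_micros(ts).alias('us')).collect()
[Row(s=1437584400, ms=1437584400000, us=1437584400000000)]
>>> spark.conf.unset("spark.sql.session.timeZone")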