pyspark.sql.functions.second
pyspark.sql.functions.second(col)
Extract the seconds of a given date/timestamp/time as an integer.
New in version 1.5.0.
Changed in version 3.4.0: Supports Spark Connect.
Changed in version 4.1.0: Added support for time type.
- Parameters
- col : Column or column name
    Target date/time/timestamp column to work on.
- Returns
- Column
    Seconds part of the timestamp as an integer.
Examples
Example 1: Extract the seconds from a string column representing a timestamp
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([('2015-04-08 13:08:15',), ('2024-10-31 10:09:16',)], ['ts'])
>>> df.select("*", sf.typeof('ts'), sf.second('ts')).show()
+-------------------+----------+----------+
|                 ts|typeof(ts)|second(ts)|
+-------------------+----------+----------+
|2015-04-08 13:08:15|    string|        15|
|2024-10-31 10:09:16|    string|        16|
+-------------------+----------+----------+
Example 2: Extract the seconds from a timestamp column
>>> import datetime
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([
...     (datetime.datetime(2015, 4, 8, 13, 8, 15),),
...     (datetime.datetime(2024, 10, 31, 10, 9, 16),)], ['ts'])
>>> df.select("*", sf.typeof('ts'), sf.second('ts')).show()
+-------------------+----------+----------+
|                 ts|typeof(ts)|second(ts)|
+-------------------+----------+----------+
|2015-04-08 13:08:15| timestamp|        15|
|2024-10-31 10:09:16| timestamp|        16|
+-------------------+----------+----------+
Example 3: Extract the seconds from a time column
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([
...     ("13:08:15",),
...     ("10:09:16",)], ['t']).withColumn("t", sf.col("t").cast("time"))
>>> df.select("*", sf.typeof('t'), sf.second('t')).show()
+--------+---------+---------+
|       t|typeof(t)|second(t)|
+--------+---------+---------+
|13:08:15|  time(6)|       15|
|10:09:16|  time(6)|       16|
+--------+---------+---------+