SPRKPY1041
pyspark.sql.functions.explode_outer
Description
This issue appears when a usage of pyspark.sql.functions.explode_outer is detected in the source code. In PySpark, explode_outer returns a new row for each element of the given array or map column; unlike explode, it also produces a row (with NULL values) when the array or map is NULL or empty.

Scenario
The following PySpark code builds a DataFrame with an array column and a map column, then applies explode_outer to the map column:
from pyspark.sql import SparkSession
from pyspark.sql.functions import explode_outer

spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame(
    [(1, ["foo", "bar"], {"x": 1.0}),
     (2, [], {}),
     (3, None, None)],
    ("id", "an_array", "a_map")
)
# explode_outer keeps rows 2 and 3, emitting NULL key/value for the empty/NULL map.
df.select("id", "an_array", explode_outer("a_map")).show()

Additional recommendations
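In Snowflake, similar behavior can be obtained with the FLATTEN table function using OUTER => TRUE, which keeps rows whose input is NULL or empty and returns NULL KEY/VALUE columns for them. The sketch below is a minimal illustration, not the tool's official workaround; it assumes an already configured Snowpark Python session, a hypothetical table MY_TABLE with columns ID, AN_ARRAY, and a semi-structured (VARIANT/OBJECT) column A_MAP, and uses DataFrame.flatten from the Snowpark Python API:

from snowflake.snowpark import Session

# connection_parameters is assumed to be defined elsewhere with your account settings.
session = Session.builder.configs(connection_parameters).create()

# Hypothetical source table with columns ID, AN_ARRAY, A_MAP.
df = session.table("MY_TABLE")

# outer=True mimics explode_outer: rows with a NULL or empty A_MAP are kept,
# with NULL values in the KEY and VALUE columns produced by FLATTEN.
exploded = df.flatten("A_MAP", outer=True).select("ID", "AN_ARRAY", "KEY", "VALUE")
exploded.show()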