The steps to create a DataFrame from a complex JSON string using Spark Scala are as follows:
import org.apache.spark.sql.{SparkSession, DataFrame}
import org.apache.spark.sql.functions._

val spark = SparkSession.builder()
  .appName("JSON to DataFrame")
  .getOrCreate()
val jsonString = """
{
  "name": "John",
  "age": 30,
  "address": {
    "street": "123 Main St",
    "city": "New York",
    "state": "NY"
  },
  "hobbies": ["reading", "traveling"],
  "education": [
    {
      "degree": "Bachelor",
      "major": "Computer Science"
    },
    {
      "degree": "Master",
      "major": "Data Science"
    }
  ]
}
"""
import spark.implicits._  // required for Seq(...).toDS()
val df = spark.read.json(Seq(jsonString).toDS())
df.printSchema()
df.show()
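Schema inference is convenient for a one-off sample, but for production data you may prefer to declare the schema explicitly so malformed records fail fast and the read skips the inference pass. A minimal sketch, with field names mirroring the sample JSON above:

```scala
import org.apache.spark.sql.types._

// Explicit schema matching the sample JSON; spark.read will not infer types
val schema = StructType(Seq(
  StructField("name", StringType),
  StructField("age", LongType),
  StructField("address", StructType(Seq(
    StructField("street", StringType),
    StructField("city", StringType),
    StructField("state", StringType)
  ))),
  StructField("hobbies", ArrayType(StringType)),
  StructField("education", ArrayType(StructType(Seq(
    StructField("degree", StringType),
    StructField("major", StringType)
  ))))
))

val typedDF = spark.read.schema(schema).json(Seq(jsonString).toDS())
```

With an explicit schema, fields missing from the input simply come back null instead of changing the inferred structure between runs.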
val flattenedDF = df.select(
  col("name"),
  col("age"),
  col("address.street").alias("street"),
  col("address.city").alias("city"),
  col("address.state").alias("state"),
  // explode turns the hobbies array into one row per hobby;
  // only one explode is allowed per select
  explode(col("hobbies")).alias("hobby"),
  // education is an array of structs, so these two remain array columns
  col("education.degree").alias("degrees"),
  col("education.major").alias("majors")
)
flattenedDF.show()
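Because a single select allows only one explode, the education fields above stay as arrays. To flatten them into one row per education entry as well, you can chain a second explode in a follow-up step. A sketch, reusing the df from above:

```scala
// Two-step flatten: one explode per step
val fullyFlattenedDF = df
  .select(
    col("name"),
    col("age"),
    col("address.street").alias("street"),
    col("address.city").alias("city"),
    col("address.state").alias("state"),
    explode(col("hobbies")).alias("hobby"),
    col("education")
  )
  .withColumn("edu", explode(col("education")))
  .select(
    col("name"), col("age"), col("street"), col("city"), col("state"),
    col("hobby"),
    col("edu.degree").alias("degree"),
    col("edu.major").alias("major")
  )
fullyFlattenedDF.show()
```

Note that chaining two explodes produces the cross product of the arrays (here 2 hobbies x 2 education entries = 4 rows per person), so this shape suits analysis of one array at a time.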
The above are the basic steps for creating a DataFrame from a complex JSON string using Spark Scala. Depending on your business requirements, you can go on to transform, filter, and aggregate the DataFrame. To learn more about Spark's features and usage, see Tencent Cloud's Spark product documentation: Spark Product Introduction.
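As one hedged example of such follow-up processing (column names taken from the flattened DataFrame built above; the filter threshold is arbitrary):

```scala
// Filter and aggregate on the flattened DataFrame:
// for people over 25, count distinct hobbies per city
flattenedDF
  .filter(col("age") > 25)
  .groupBy(col("city"))
  .agg(countDistinct(col("hobby")).alias("hobby_count"))
  .show()
```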