Spark: reading multiple specified files at the same time


spark.read
  .text("hdfs://192.168.40.51:9000/user/test/cxb/aa/aa.txt",
        "hdfs://192.168.40.51:9000/user/test/cxb/bb/bb.txt")
  .toDF("str")
  .as[A]
  .rdd.collect().foreach(println)
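
For context, spark.read.text accepts a varargs list of paths, so several files can be loaded into one DataFrame in a single call. Below is a minimal self-contained sketch of the same idea; the case class A (with a single String field matching the renamed column), the object name, and the import of spark.implicits._ are assumptions added so the snippet compiles and runs, while the HDFS paths are taken from the question.

import org.apache.spark.sql.SparkSession

// Assumed case class: one String field so .as[A] can map the "str" column.
case class A(str: String)

object ReadMultipleFiles {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ReadMultipleFiles")
      .getOrCreate()
    import spark.implicits._ // needed for the .as[A] encoder

    // Read both files at once; text() returns a single-column DataFrame
    // whose default column name is "value", renamed here to "str".
    val ds = spark.read
      .text(
        "hdfs://192.168.40.51:9000/user/test/cxb/aa/aa.txt",
        "hdfs://192.168.40.51:9000/user/test/cxb/bb/bb.txt")
      .toDF("str")
      .as[A]

    // Collect to the driver and print each record (fine for small test files).
    ds.collect().foreach(println)

    spark.stop()
  }
}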

 

