rdd - How to load data from a saved file with Spark


Spark provides the method saveAsTextFile, which can easily store an RDD[T] to disk or HDFS.

T is an arbitrary serializable class.

I want to do the reverse operation. I wonder whether there is a loadFromTextFile that can load a file back into an RDD[T]?

Let me make it clear:

    class A extends Serializable { ... }

    val path: String = "hdfs..."
    val d1: RDD[A] = create_a

    d1.saveAsTextFile(path)

    val d2: RDD[A] = a_load_function(path) // the function I want
    // d2 should be the same as d1

Try using d1.saveAsObjectFile(path) to store it and val d2 = sc.objectFile[A](path) to load it back.

I don't think you can use saveAsTextFile and read it back as an RDD[A] without transforming it from an RDD[String] yourself.
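Here is a minimal sketch of the object-file round trip. The Record case class, the local /tmp path, and the sample data are assumptions standing in for A and the HDFS path from the question:

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical serializable class standing in for A.
    case class Record(id: Int, name: String) extends Serializable

    object ObjectFileRoundTrip {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("object-file-round-trip").setMaster("local[*]"))

        // Assumed local path; an hdfs:// path works the same way.
        val path = "/tmp/records"

        val d1 = sc.parallelize(Seq(Record(1, "a"), Record(2, "b")))

        // Serializes each element (Java serialization) into a SequenceFile.
        d1.saveAsObjectFile(path)

        // Reads the SequenceFile back and deserializes into an RDD[Record].
        val d2 = sc.objectFile[Record](path)

        // d2 holds the same elements as d1 (partitioning/order may differ).
        println(d2.collect().toSeq)

        sc.stop()
      }
    }

Note that saveAsObjectFile relies on Java serialization, so it is convenient but not portable across class changes; for text output you would instead write d1.map(_.toString).saveAsTextFile(path) and parse each line back into A when loading.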

