Spark DataFrame to paired RDD in Scala


I am new to Spark and want to convert a DataFrame to a paired RDD. My DataFrame looks like:

    tagname,value,minute
    tag1,13.87,5
    tag2,32.50,10
    tag3,35.00,5
    tag1,10.98,2
    tag5,11.0,5

I want a paired RDD of (tagname, value). I tried:

    val byKey: Map[String, Long] = winowFiveRdd.map({ case (tagName, value) => (tagName) -> value })

I am getting the following error:

    error: constructor cannot be instantiated to expected type

Any help is appreciated. Thanks in advance.

I'd use Dataset.as: a DataFrame is a Dataset[Row], and a Row cannot be pattern-matched as a tuple (tagname, value), which is why the compiler complains. Casting value to double and converting with as[(String, Double)] gives you the paired RDD directly:

    import org.apache.spark.rdd.RDD
    import spark.implicits._  // assumes a SparkSession named spark is in scope (as in spark-shell)

    val df = Seq(
      ("tag1", "13.87", "5"), ("tag2", "32.50", "10"), ("tag3", "35.00", "5"),
      ("tag1", "10.98", "2"), ("tag5", "11.0", "5")
    ).toDF("tagname", "value", "minute")

    val pairedRDD: RDD[(String, Double)] = df
      .select($"tagname", $"value".cast("double"))
      .as[(String, Double)].rdd
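Once you have the pair RDD, the usual next step is a key-wise aggregation such as reduceByKey. Here is a minimal sketch of those pairing-and-reducing semantics using plain Scala collections (a local stand-in for illustration, not the Spark API, using the same sample rows):

```scala
// Local sketch of what pairedRDD.reduceByKey(_ + _) would compute,
// using plain Scala collections instead of a Spark cluster.
val rows = Seq(
  ("tag1", "13.87", "5"), ("tag2", "32.50", "10"), ("tag3", "35.00", "5"),
  ("tag1", "10.98", "2"), ("tag5", "11.0", "5")
)

// Pair step: keep (tagname, value), casting value to Double.
val pairs: Seq[(String, Double)] = rows.map { case (tag, value, _) => (tag, value.toDouble) }

// Reduce step: sum values per key, analogous to reduceByKey(_ + _).
val summed: Map[String, Double] =
  pairs.groupBy(_._1).map { case (k, vs) => (k, vs.map(_._2).sum) }

println(summed("tag1"))  // ~24.85 (13.87 + 10.98)
```

On a real cluster you would call pairedRDD.reduceByKey(_ + _) instead; groupBy-then-sum here only mimics the result on a single machine.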
