Spark DataFrame to pair RDD in Scala


I'm new to Spark and want to convert a DataFrame to a pair RDD. The DataFrame looks like this:

tagname,value,minute
tag1,13.87,5
tag2,32.50,10
tag3,35.00,5
tag1,10.98,2
tag5,11.0,5

I want a pair RDD of (tagname, value). I tried:

val byKey: Map[String, Long] = winowFiveRdd.map({ case (tagname, value) => tagname -> value })

I am getting the following error:

error: constructor cannot be instantiated to expected type

Any help is appreciated. Thanks in advance.

I'd use Dataset.as:

import org.apache.spark.rdd.RDD

val df = Seq(
  ("tag1", "13.87", "5"), ("tag2", "32.50", "10"), ("tag3", "35.00", "5"),
  ("tag1", "10.98", "2"), ("tag5", "11.0", "5")
).toDF("tagname", "value", "minute")

val pairedRDD: RDD[(String, Double)] = df
  .select($"tagname", $"value".cast("double"))
  .as[(String, Double)].rdd
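For reference, the original snippet fails because a DataFrame's map hands you Row objects, and the pattern `case (tagname, value) => ...` only compiles when the element type is a Tuple2 — hence "constructor cannot be instantiated to expected type". A minimal sketch in plain Scala collections (no Spark, sample data assumed) of the tuple-based mapping that `.as[(String, Double)]` makes possible:

```scala
// Plain Scala stand-in for the pair RDD: elements are real Tuple2s,
// so the tuple pattern match compiles and runs.
val rows: Seq[(String, Double)] = Seq(("tag1", 13.87), ("tag2", 32.50))

// This is the shape the question's map was aiming for; it works here
// because each element is a (String, Double) tuple, not a Row.
val byKey: Map[String, Double] =
  rows.map { case (tagname, value) => tagname -> value }.toMap
```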
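With the pair RDD in hand, the usual key-based operations apply (reduceByKey, groupByKey, join, and so on). As a sketch of the semantics in plain Scala collections (sample data assumed, not Spark), summing values per tag looks like:

```scala
// Sample pairs mirroring the question's data
val pairs: Seq[(String, Double)] = Seq(
  ("tag1", 13.87), ("tag2", 32.50), ("tag3", 35.00),
  ("tag1", 10.98), ("tag5", 11.0)
)

// Collection equivalent of pairedRDD.reduceByKey(_ + _):
// group by tag name, then sum each group's values
val summedByTag: Map[String, Double] = pairs
  .groupBy(_._1)
  .map { case (tag, kvs) => tag -> kvs.map(_._2).sum }
```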
