java - Error in "Eclipse Plugin for Scala" while compiling a Spark class
I am using CDH 5.1.0 for some simple Spark programming. I also have Eclipse Juno (which comes with the VM) and installed the Scala IDE plugin 2.10.0. I am getting the following error in the IDE:
bad symbolic reference. A signature in SparkContext.class refers to term io in package org.apache.hadoop which is not available. It may be completely missing from the current classpath, or the version on the classpath might be incompatible with the version used when compiling SparkContext.class. SimpleApp.scala /myscalaproject/src/com/test/spark1 line 10 Scala Problem
Code:
package com.test.spark1

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/home/desktop/scala/sparktest.txt" // should be some file on your system
    val conf = new org.apache.spark.SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}
I get the same error at line 10 (val conf = new org.apache.spark.SparkCon...) and at line 15 (println...).
My project build path contains /usr/lib/spark/assembly/lib/spark-assembly-1.0.0-cdh5.1.0-hadoop2.3.0-cdh5.1.0.jar, and I checked that all the classes needed for this simple Scala program are there.
The compilation error went away once I added the following jar to the build path:
hadoop-common-2.3.0-cdh5.1.0.jar
So there was a missing internal dependency that was causing this error.
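As an alternative to adding individual jars to the Eclipse build path by hand, a build tool can resolve Spark's transitive Hadoop dependencies for you. Below is a minimal build.sbt sketch for this setup; the Scala version, the CDH artifact versions, and the Cloudera repository URL are assumptions based on the CDH 5.1.0 environment described above, so adjust them to match your cluster.

// build.sbt -- minimal sketch, assuming Scala 2.10 and CDH 5.1.0 artifacts
name := "simple-spark-app"

version := "0.1"

scalaVersion := "2.10.4"

// Cloudera publishes CDH-flavored artifacts in its own repository (assumed URL).
resolvers += "Cloudera" at "https://repository.cloudera.com/artifactory/cloudera-repos/"

libraryDependencies ++= Seq(
  // spark-core for the SparkContext / SparkConf classes used in SimpleApp
  "org.apache.spark" %% "spark-core" % "1.0.0-cdh5.1.0" % "provided",
  // hadoop-client pulls in hadoop-common, the jar that had to be added manually above
  "org.apache.hadoop" % "hadoop-client" % "2.3.0-cdh5.1.0" % "provided"
)

With the dependencies declared this way, an Eclipse classpath can be regenerated from the build definition (for example with the sbteclipse plugin), so hadoop-common no longer needs to be tracked down and added by hand.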