java - Getting error importing Spark dependencies in IntelliJ IDEA
I'm using IntelliJ IDEA with Maven integration, and I'm getting errors on the following lines:
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.api.java.function.Function;
I'm trying to run the following example:
    package com.spark.hello;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.api.java.function.Function;

    public class Hello {
        public static void main(String[] args) {
            String logFile = "f:\\spark\\a.java";
            SparkConf conf = new SparkConf().setAppName("Simple Application");
            JavaSparkContext sc = new JavaSparkContext(conf);
            JavaRDD<String> logData = sc.textFile(logFile).cache();

            long numAs = logData.filter(new Function<String, Boolean>() {
                public Boolean call(String s) { return s.contains("a"); }
            }).count();

            long numBs = logData.filter(new Function<String, Boolean>() {
                public Boolean call(String s) { return s.contains("b"); }
            }).count();

            System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
        }
    }
Please help me solve this issue, or is there another way to run this kind of project?
Without seeing the exact error, I'm guessing the IDE is telling you the imports can't be resolved, so be sure to double-check your dependencies and their versions.
The Alt + Enter shortcut has resolved many of these issues for me.
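Unresolvable Spark imports usually mean the Spark core artifact is missing from the Maven classpath. A minimal sketch of the dependency block for pom.xml, assuming Spark 1.x built for Scala 2.10 (adjust the artifact suffix and version to match your setup):

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.3</version>
</dependency>
```

After editing pom.xml, reimport the Maven project so IntelliJ IDEA downloads the dependency and the imports resolve.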
Tags: java, apache-maven, intellij-idea