java – Running an example Flink program in local mode

I am trying to execute an example program in Apache Flink in local mode.
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;


public class WordCountExample {
    public static void main(String[] args) throws Exception {
        final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        DataSet<String> text = env.fromElements(
            "Who's there?", "I think I hear them. Stand, ho! Who's there?");
        //DataSet<String> text1 = env.readTextFile(args[0]);

        DataSet<Tuple2<String,Integer>> wordCounts = text
            .flatMap(new LineSplitter())
            .groupBy(0)
            .sum(1);

        wordCounts.print();
        // A single execute() call runs the program; calling it twice is redundant.
        env.execute("Word Count Example");
    }

    public static class LineSplitter implements FlatMapFunction<String,Tuple2<String,Integer>> {
        @Override
        public void flatMap(String line,Collector<Tuple2<String,Integer>> out) {
            for (String word : line.split(" ")) {
                out.collect(new Tuple2<String,Integer>(word,1));
            }
        }
    }
}

It gives me the following exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/InputFormat
    at WordCountExample.main(WordCountExample.java:10)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.InputFormat
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 1 more

What am I doing wrong?

I have also used the correct jars:
flink-java-0.9.0-milestone-1.jar
flink-clients-0.9.0-milestone-1.jar
flink-core-0.9.0-milestone-1.jar

Solution

Adding the three Flink jar files as dependencies in your project is not enough, because they have further transitive dependencies, for example on Hadoop.

The easiest way to get a working setup for developing (and locally executing) Flink programs is to follow the quickstart guide, which uses a Maven archetype to configure a Maven project. This Maven project can then be imported into your IDE.
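If you prefer to manage the dependencies by hand instead of using the archetype, a minimal sketch of the relevant `pom.xml` dependency section might look like the following (the version string is taken from the jars listed above; Maven then resolves the transitive dependencies, such as Hadoop, automatically):

```xml
<dependencies>
  <!-- DataSet API: ExecutionEnvironment, DataSet, Tuple2, ... -->
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>0.9.0-milestone-1</version>
  </dependency>
  <!-- Required to actually run programs, including local execution -->
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients</artifactId>
    <version>0.9.0-milestone-1</version>
  </dependency>
</dependencies>
```

With these two dependencies declared, Maven pulls in `flink-core` and the Hadoop classes (such as `org.apache.hadoop.mapreduce.InputFormat` from the stack trace) as transitive dependencies, so the `NoClassDefFoundError` no longer occurs.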
