
Flink Environment Setup and Usage

2025/2/25 9:32:03  Source: https://blog.csdn.net/ABU009/article/details/143509173

Create a Maven project in IDEA, import the Flink dependencies, create the Flink execution environment in code, and then write your program.

If you don't want to hunt down the Flink dependencies yourself, the Flink website provides a Maven command that quickly generates a Flink project on your machine; you can copy the dependency configuration directly from that project's pom.xml.
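As a rough sketch, the quickstart command the Flink site provides is a Maven archetype invocation along these lines (the `flink-quickstart-java` coordinates and version below are assumptions matching the 1.15.4 version used in this article; verify against the official quickstart page):

```shell
# Generate a skeleton Flink project; adjust groupId/artifactId to taste.
mvn archetype:generate \
  -DarchetypeGroupId=org.apache.flink \
  -DarchetypeArtifactId=flink-quickstart-java \
  -DarchetypeVersion=1.15.4
```

Maven will prompt interactively for your project's groupId, artifactId, and package, then create a ready-to-build project whose pom.xml contains the dependencies shown in the next section.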

1. Environment Setup

Dependencies to import in pom.xml:

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <flink.version>1.15.4</flink.version>
    <target.java.version>1.8</target.java.version>
    <scala.binary.version>2.12</scala.binary.version>
    <maven.compiler.source>${target.java.version}</maven.compiler.source>
    <maven.compiler.target>${target.java.version}</maven.compiler.target>
    <log4j.version>2.17.1</log4j.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-slf4j-impl</artifactId>
        <version>${log4j.version}</version>
        <scope>runtime</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-api</artifactId>
        <version>${log4j.version}</version>
        <scope>runtime</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>${log4j.version}</version>
        <scope>runtime</scope>
    </dependency>
</dependencies>


2. Using Flink

Take WordCount as an example:

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.KeyedStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;

public class Demo1WordCount {
    public static void main(String[] args) throws Exception {
        // 1. Create the Flink execution environment
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Set the parallelism; each parallel instance corresponds to one task
        env.setParallelism(2);

        // Adjust how long (ms) data is buffered before being sent downstream
        env.setBufferTimeout(2000);

        /*
         * Unbounded stream
         */

        // 2. Read the data
        // Start the socket source first: nc -lk 8888
        DataStream<String> linesDS = env.socketTextStream("master", 8888);

        // Split each line into multiple words
        DataStream<String> wordsDS = linesDS.flatMap(new FlatMapFunction<String, String>() {
            @Override
            public void flatMap(String line, Collector<String> out) throws Exception {
                for (String word : line.split(",")) {
                    // Emit each word downstream
                    out.collect(word);
                }
            }
        });

        // Convert to key-value format
        DataStream<Tuple2<String, Integer>> kvDS = wordsDS.map(new MapFunction<String, Tuple2<String, Integer>>() {
            @Override
            public Tuple2<String, Integer> map(String word) throws Exception {
                // Return a 2-tuple: (word, 1)
                return Tuple2.of(word, 1);
            }
        });

        // Group by word (hash-partitioned under the hood)
        KeyedStream<Tuple2<String, Integer>, String> keyByDS = kvDS.keyBy(new KeySelector<Tuple2<String, Integer>, String>() {
            @Override
            public String getKey(Tuple2<String, Integer> kv) throws Exception {
                return kv.f0;
            }
        });

        // Count occurrences per word
        DataStream<Tuple2<String, Integer>> countDS = keyByDS.reduce(new ReduceFunction<Tuple2<String, Integer>>() {
            @Override
            public Tuple2<String, Integer> reduce(Tuple2<String, Integer> kv1,
                                                  Tuple2<String, Integer> kv2) throws Exception {
                int count = kv1.f1 + kv2.f1;
                return Tuple2.of(kv1.f0, count);
            }
        });

        // Print the results
        countDS.print();

        // 3. Start the Flink job
        env.execute("wc");
    }
}
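The keyBy + reduce stage above keeps a running count per word and emits an updated total for every incoming record. To make those rolling semantics concrete, here is a plain-Java sketch of the same computation (no Flink required; the `RollingWordCount` class and its `run` method are illustrative names invented for this example, not part of the Flink API):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class RollingWordCount {
    // Mimics the pipeline: split each line on ',', map every word to (word, 1),
    // then keep a per-key running sum, emitting the updated count for each
    // record -- the same behavior as KeyedStream.reduce in the Flink job.
    public static List<String> run(List<String> lines) {
        Map<String, Integer> counts = new HashMap<>();
        List<String> emitted = new ArrayList<>();
        for (String line : lines) {
            for (String word : line.split(",")) {
                int c = counts.merge(word, 1, Integer::sum); // running sum per key
                emitted.add("(" + word + "," + c + ")");     // emit updated total
            }
        }
        return emitted;
    }

    public static void main(String[] args) {
        // Two "socket lines" arriving in order
        System.out.println(run(Arrays.asList("hello,flink", "hello")));
        // prints [(hello,1), (flink,1), (hello,2)]
    }
}
```

This also explains the output you see from countDS.print(): for an unbounded stream there is no final total, only a stream of ever-updating per-word counts.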
