12. Flink Basics: Source - Reading from a File

Table of Contents

  • 1. Preparing the File
  • 2. Preparing the Program
  • 3. Running the Flink Program
  • References

1. Preparing the File

sensor.txt

sensor_1 1547718199 35.8
sensor_6 1547718201 15.4
sensor_7 1547718202 6.7
sensor_10 1547718205 38.1

Upload the file to HDFS:

hadoop fs -copyFromLocal sensor.txt /user/hive/warehouse
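Each record in sensor.txt consists of a sensor id, a Unix timestamp in seconds, and a temperature value. As a minimal sketch (not shown in the original post), these fields could be modeled as a simple POJO, similar to the SensorReading class used elsewhere in this tutorial series; the field names and constructor below are assumptions for illustration:

package org.example;

// Minimal sketch of a POJO for the records in sensor.txt.
// Field names and types are assumptions for illustration only.
public class SensorReading {
    private String id;          // e.g. "sensor_1"
    private Long timestamp;     // Unix timestamp in seconds, e.g. 1547718199
    private Double temperature; // e.g. 35.8

    // An empty constructor plus getters/setters keeps the class usable as a Flink POJO type
    public SensorReading() {}

    public SensorReading(String id, Long timestamp, Double temperature) {
        this.id = id;
        this.timestamp = timestamp;
        this.temperature = temperature;
    }

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
    public Long getTimestamp() { return timestamp; }
    public void setTimestamp(Long timestamp) { this.timestamp = timestamp; }
    public Double getTemperature() { return temperature; }
    public void setTemperature(Double temperature) { this.temperature = temperature; }

    @Override
    public String toString() {
        return "SensorReading{id='" + id + "', timestamp=" + timestamp + ", temperature=" + temperature + "}";
    }
}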

2. Preparing the Program

SourceTest2_File

package org.example;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

/**
 * @author  只是甲
 * @date    2021-08-30
 * @remark  Flink Source: reading from a file
 */

public class SourceTest2_File {

    public static void main(String[] args) throws Exception {

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);

        // Read from the file on HDFS
        DataStream<String> dataStream = env.readTextFile("hdfs://hp1:8020/user/hive/warehouse/sensor.txt");

        // Print the output
        dataStream.print();

        env.execute();
    }
}
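The job above only prints the raw text lines. If the lines should be parsed into objects, a map operator can convert each line into the SensorReading sketch from section 1. This is only a sketch of that extension; the class name SourceTest2_FileToPojo and the whitespace-splitting logic are assumptions, not part of the original program:

package org.example;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

/**
 * Sketch only: parse the raw text lines into SensorReading objects.
 * Assumes the SensorReading class from the file-preparation section
 * and space-separated fields as in sensor.txt above.
 */
public class SourceTest2_FileToPojo {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);

        // Read the same file from HDFS as a stream of text lines
        DataStream<String> lines = env.readTextFile("hdfs://hp1:8020/user/hive/warehouse/sensor.txt");

        // Split each line on whitespace and build a SensorReading from the three fields
        DataStream<SensorReading> readings = lines.map(new MapFunction<String, SensorReading>() {
            @Override
            public SensorReading map(String line) throws Exception {
                String[] fields = line.trim().split("\\s+");
                return new SensorReading(fields[0], Long.parseLong(fields[1]), Double.parseDouble(fields[2]));
            }
        });

        readings.print();

        env.execute();
    }
}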

3. Running the Flink Program

Run command:

flink run -m yarn-cluster -c org.example.SourceTest2_File FlinkStudy-1.0-SNAPSHOT.jar
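Here `-m yarn-cluster` submits the job to YARN (per-job mode) and `-c` names the entry class inside the jar; if the jar's manifest already declared SourceTest2_File as its main class, the `-c` flag could be omitted.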

Screenshot of the run (image omitted).

References:

1. https://www.bilibili.com/video/BV1qy4y1q728
2. https://ashiamd.github.io/docsify-notes/#/study/BigData/Flink/%E5%B0%9A%E7%A1%85%E8%B0%B7Flink%E5%85%A5%E9%97%A8%E5%88%B0%E5%AE%9E%E6%88%98-%E5%AD%A6%E4%B9%A0%E7%AC%94%E8%AE%B0?id=_521-%e4%bb%8e%e9%9b%86%e5%90%88%e8%af%bb%e5%8f%96%e6%95%b0%e6%8d%ae
3. https://www.pianshen.com/article/26011976902/
