
Big Data: Maven Dependencies After Upgrading Flink to 1.17

码客说 · Published 2023-09-01

Preface

Environment versions used in this article:

  • zookeeper 3.8.2
  • hadoop 2.10.2
  • hive 2.3.9
  • hbase 2.5.5
  • flink 1.17.1
  • scala 2.12
  • phoenix 5.1.3

Official documentation

https://nightlies.apache.org/flink/flink-docs-release-1.17/docs/dev/table/overview/

https://nightlies.apache.org/flink/flink-docs-release-1.17/docs/connectors/table/hive/overview/

Dependency search

https://central.sonatype.com/

Class search

https://www.classnotfound.com.cn/

Version definitions

<properties>
    <maven.compiler.source>8</maven.compiler.source>
    <maven.compiler.target>8</maven.compiler.target>
    <hadoop.version>2.10.2</hadoop.version>
    <hive.version>2.3.9</hive.version>
    <flink.version>1.17.1</flink.version>
    <scala.binary.version>2.12</scala.binary.version>
    <!-- Not every version is available; 2.5.0 is compatible with 2.5.6 -->
    <hbase.version>2.5.0</hbase.version>
    <phoenix.version>5.1.3</phoenix.version>

    <!-- Encoding used when copying files -->
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
    <!-- Encoding used when compiling -->
    <maven.compiler.encoding>UTF-8</maven.compiler.encoding>
</properties>

Hive

Could not find artifact org.pentaho:pentaho-aggdesigner-algorithm:pom:5.1.5-jhyde

This error appears when importing the Maven dependencies for connecting Spark to Hive, e.g. hive-exec:

<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>${hive.version}</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.calcite</groupId>
            <artifactId>calcite-core</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.calcite</groupId>
            <artifactId>calcite-avatica</artifactId>
        </exclusion>
        <exclusion>
            <artifactId>hadoop-hdfs</artifactId>
            <groupId>org.apache.hadoop</groupId>
        </exclusion>
        <exclusion>
            <artifactId>commons-lang</artifactId>
            <groupId>commons-lang</groupId>
        </exclusion>
        <exclusion>
            <artifactId>commons-collections</artifactId>
            <groupId>commons-collections</groupId>
        </exclusion>
        <exclusion>
            <artifactId>protobuf-java</artifactId>
            <groupId>com.google.protobuf</groupId>
        </exclusion>
        <exclusion>
            <artifactId>slf4j-api</artifactId>
            <groupId>org.slf4j</groupId>
        </exclusion>

    </exclusions>
</dependency>

It occurs because this artifact is not hosted in the Aliyun public Maven mirror, so an extra mirror repository has to be added. Modify Maven's settings.xml:

<!-- Add this mirror before the Aliyun public repository (mirror ids must be unique) -->
<mirror>
  <id>aliyun-spring-plugin</id>
  <mirrorOf>*</mirrorOf>
  <name>spring-plugin</name>
  <url>https://maven.aliyun.com/repository/spring-plugin</url>
</mirror>
<mirror>
  <id>aliyunmaven</id>
  <mirrorOf>*</mirrorOf>
  <name>Aliyun public repository</name>
  <url>https://maven.aliyun.com/repository/public</url>
</mirror>

Then re-import the dependencies. If you are developing in IDEA, restart IDEA and refresh the dependencies.

Alternatively (recommended), declare the repositories in the project's pom.xml:

<repositories>
    <repository>
        <id>alimaven1</id>
        <name>aliyun maven1</name>
        <url>https://maven.aliyun.com/repository/spring-plugin</url>
    </repository>
    <repository>
        <id>alimaven2</id>
        <name>aliyun maven2</name>
        <url>https://maven.aliyun.com/repository/public</url>
    </repository>
</repositories>

Connecting to Kafka

The old artifact (with a Scala suffix, used through Flink 1.14):
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
    <exclusions>
        <exclusion>
            <artifactId>slf4j-api</artifactId>
            <groupId>org.slf4j</groupId>
        </exclusion>
    </exclusions>
</dependency>

New version (the artifact no longer carries a Scala suffix):

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka</artifactId>
    <version>1.17.1</version>
</dependency>

Test

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.util.Properties;

public class KafkaConsumerExample {
    public static void main(String[] args) throws Exception {
        final String bootstrapServers = "localhost:9092";
        final String groupId = "flink-kafka-consumer";
        final String topicName = "test-topic";

        // Create the Flink execution environment
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Configure the Kafka consumer properties
        final Properties props = new Properties();
        props.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.setProperty(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        props.setProperty(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        // Build a data stream using Flink's Kafka consumer
        final FlinkKafkaConsumer<String> kafkaConsumer = new FlinkKafkaConsumer<>(topicName, new SimpleStringSchema(), props);
        final DataStream<String> stream = env.addSource(kafkaConsumer);

        // Print the stream
        stream.print();

        // Launch the Flink job
        env.execute();
    }
}
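
Note that FlinkKafkaConsumer is deprecated; the source API recommended for Flink 1.17 is KafkaSource. A minimal sketch of the equivalent consumer (broker address and topic reused from above):

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KafkaSourceExample {
    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // KafkaSource is the replacement for the deprecated FlinkKafkaConsumer
        final KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("test-topic")
                .setGroupId("flink-kafka-consumer")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        // A source (unlike the legacy consumer) is attached via fromSource()
        env.fromSource(source, WatermarkStrategy.noWatermarks(), "Kafka Source").print();
        env.execute();
    }
}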

Flink

Creating a project

mvn archetype:generate -DarchetypeGroupId=org.apache.flink -DarchetypeArtifactId=flink-quickstart-java -DarchetypeVersion=1.17.1

The archetype adds the following dependencies by default:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
</dependency>
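
To confirm the generated project is wired up correctly, a minimal sketch of a job (the class name is my own):

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class QuickstartCheck {
    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // A tiny bounded stream, just enough to prove the classpath works
        env.fromElements("hello", "flink").print();
        env.execute("quickstart-check");
    }
}

Because both dependencies are scoped provided, running inside IDEA requires enabling "Add dependencies with 'provided' scope to classpath" in the run configuration.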

Table+DataStream

APIs you want to use → Dependency you need to add:

  • DataStream → flink-streaming-java
  • DataStream with Scala → flink-streaming-scala_2.12
  • Table API → flink-table-api-java
  • Table API with Scala → flink-table-api-scala_2.12
  • Table API + DataStream → flink-table-api-java-bridge
  • Table API + DataStream with Scala → flink-table-api-scala-bridge_2.12
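
For example, with flink-table-api-java-bridge on the classpath you can move between the two APIs. A minimal round-trip sketch (it also needs a planner, e.g. flink-table-planner, at runtime):

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class BridgeExample {
    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        final StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // DataStream -> Table -> DataStream round trip
        DataStream<String> ds = env.fromElements("a", "b", "c");
        Table table = tEnv.fromDataStream(ds);
        tEnv.toDataStream(table).print();

        env.execute();
    }
}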

In the Flink 1.12 project below, the corresponding artifacts still carried a Scala suffix and relied on the separate Blink planner:

  • flink-clients_2.12
  • flink-table-api-java-bridge_2.12
  • flink-table-planner-blink_2.12

Flink's useBlinkPlanner option enabled the Blink planner for both batch and stream processing. Blink is Flink's newer query engine, offering higher performance and better stability; setting useBlinkPlanner told Flink to parse and optimize query plans with it, yielding better query performance and more advanced optimizations. The method no longer exists in recent versions: the Blink-based planner is now the default and only query engine.
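
A minimal before/after sketch of creating a TableEnvironment:

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class PlannerExample {
    public static void main(String[] args) {
        // Flink 1.12 style (useBlinkPlanner() has since been removed):
        // EnvironmentSettings settings = EnvironmentSettings.newInstance()
        //         .useBlinkPlanner().inStreamingMode().build();

        // Flink 1.17 style: the Blink-based planner is the only one, so no flag is needed
        final EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .inStreamingMode()
                .build();
        final TableEnvironment tEnv = TableEnvironment.create(settings);
    }
}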

Required JARs

<!-- Flink core APIs -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-scala_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-common</artifactId>
    <version>${flink.version}</version>
</dependency>

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-scala_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-scala-bridge_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-json</artifactId>
    <version>${flink.version}</version>
</dependency>
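
Of these, flink-json provides the json table format. A sketch of where it comes in (the topic name, broker address, and schema are placeholders, and the Kafka table connector must also be on the classpath):

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class JsonFormatExample {
    public static void main(String[] args) {
        final TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        // 'format' = 'json' is what flink-json contributes
        tEnv.executeSql(
                "CREATE TABLE events (" +
                "  id STRING," +
                "  ts BIGINT" +
                ") WITH (" +
                "  'connector' = 'kafka'," +
                "  'topic' = 'events'," +
                "  'properties.bootstrap.servers' = 'localhost:9092'," +
                "  'scan.startup.mode' = 'earliest-offset'," +
                "  'format' = 'json'" +
                ")");

        tEnv.executeSql("SELECT * FROM events").print();
    }
}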

Connecting Flink to Hive

<!-- Hive connector support -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-hive_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
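
Registering a Hive catalog then looks roughly like this (the catalog name and the hive-site.xml directory are placeholders; hive-exec must also be on the classpath, as in the full POMs below):

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public class HiveCatalogExample {
    public static void main(String[] args) {
        final TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inBatchMode().build());

        // Third argument: directory containing hive-site.xml (placeholder path)
        final HiveCatalog hive = new HiveCatalog("myhive", "default", "/opt/hive/conf");
        tEnv.registerCatalog("myhive", hive);
        tEnv.useCatalog("myhive");

        tEnv.executeSql("SHOW TABLES").print();
    }
}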

Flink 1.12 version backup

The dependencies of the old version are recorded here for reference.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.xhkjedu</groupId>
    <artifactId>yxzt-data-tcs</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>

    <properties>
        <maven.compiler.source>8</maven.compiler.source>
        <maven.compiler.target>8</maven.compiler.target>
        <hadoop.version>2.7.7</hadoop.version>
        <hive.version>2.1.0</hive.version>
        <flink.version>1.12.7</flink.version>
        <scala.binary.version>2.12</scala.binary.version>
        <!-- Not every version is available; 2.1.6 is compatible with 2.1.10 -->
        <hbase.version>2.1.6</hbase.version>
        <phoenix.version>5.1.2</phoenix.version>

        <!-- Encoding used when copying files -->
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <!-- Encoding used when compiling -->
        <maven.compiler.encoding>UTF-8</maven.compiler.encoding>
    </properties>

    <repositories>
        <repository>
            <id>alimaven</id>
            <name>aliyun maven</name>
            <url>https://maven.aliyun.com/repository/public</url>
        </repository>
    </repositories>


    <dependencies>
        <dependency>
            <groupId>jdk.tools</groupId>
            <artifactId>jdk.tools</artifactId>
            <version>1.8</version>
            <scope>system</scope>
            <systemPath>${env.JAVA_HOME}/lib/tools.jar</systemPath>
        </dependency>

        <!-- Flink core APIs -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-java</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <exclusions>
                <exclusion>
                    <artifactId>flink-shaded-guava</artifactId>
                    <groupId>org.apache.flink</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-scala_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-common</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala-bridge_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
            <exclusions>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <!-- Hive connector support -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-hive_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <exclusions>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <version>${hive.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>org.apache.calcite</groupId>
                    <artifactId>calcite-core</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.calcite</groupId>
                    <artifactId>calcite-avatica</artifactId>
                </exclusion>
                <exclusion>
                    <artifactId>hadoop-hdfs</artifactId>
                    <groupId>org.apache.hadoop</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>commons-lang</artifactId>
                    <groupId>commons-lang</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>commons-collections</artifactId>
                    <groupId>commons-collections</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>protobuf-java</artifactId>
                    <groupId>com.google.protobuf</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>

            </exclusions>
        </dependency>

        <!-- Reading Hadoop files -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>${hadoop.version}</version>
            <exclusions>
                <exclusion>
                    <artifactId>commons-lang</artifactId>
                    <groupId>commons-lang</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>commons-collections</artifactId>
                    <groupId>commons-collections</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>protobuf-java</artifactId>
                    <groupId>com.google.protobuf</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>

            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-hadoop-compatibility_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>


        <!-- HBase access -->
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>2.1.10</version>
            <exclusions>
                <exclusion>
                    <artifactId>protobuf-java</artifactId>
                    <groupId>com.google.protobuf</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>metrics-core</artifactId>
                    <groupId>io.dropwizard.metrics</groupId>
                </exclusion>
            </exclusions>
        </dependency>


        <!-- JSON utilities -->
        <dependency>
            <groupId>com.alibaba.fastjson2</groupId>
            <artifactId>fastjson2</artifactId>
            <version>2.0.22</version>
        </dependency>

        <!--Hive JDBC-->
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-jdbc</artifactId>
            <version>${hive.version}</version>
            <exclusions>
                <exclusion>
                    <artifactId>protobuf-java</artifactId>
                    <groupId>com.google.protobuf</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>metrics-core</artifactId>
                    <groupId>io.dropwizard.metrics</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>metrics-core</artifactId>
                    <groupId>com.yammer.metrics</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>com.codahale.metrics</groupId>
            <artifactId>metrics-core</artifactId>
            <version>3.0.2</version>
        </dependency>

        <!-- PostgreSQL connection -->
        <dependency>
            <groupId>org.postgresql</groupId>
            <artifactId>postgresql</artifactId>
            <version>42.2.2</version>
        </dependency>

        <!--Phoenix-->
        <dependency>
            <groupId>org.apache.phoenix</groupId>
            <artifactId>phoenix-core</artifactId>
            <version>${phoenix.version}</version>
            <exclusions>
                <exclusion>
                    <artifactId>hadoop-auth</artifactId>
                    <groupId>org.apache.hadoop</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hadoop-common</artifactId>
                    <groupId>org.apache.hadoop</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hadoop-yarn-api</artifactId>
                    <groupId>org.apache.hadoop</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-shaded-protobuf</artifactId>
                    <groupId>org.apache.hbase.thirdparty</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-shaded-miscellaneous</artifactId>
                    <groupId>org.apache.hbase.thirdparty</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-protocol</artifactId>
                    <groupId>org.apache.hbase</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-protocol-shaded</artifactId>
                    <groupId>org.apache.hbase</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hadoop-mapreduce-client-core</artifactId>
                    <groupId>org.apache.hadoop</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-client</artifactId>
                    <groupId>org.apache.hbase</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-common</artifactId>
                    <groupId>org.apache.hbase</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-hadoop-compat</artifactId>
                    <groupId>org.apache.hbase</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-hadoop2-compat</artifactId>
                    <groupId>org.apache.hbase</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>antlr-runtime</artifactId>
                    <groupId>org.antlr</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>commons-codec</artifactId>
                    <groupId>commons-codec</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>commons-io</artifactId>
                    <groupId>commons-io</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>commons-lang3</artifactId>
                    <groupId>org.apache.commons</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jackson-databind</artifactId>
                    <groupId>com.fasterxml.jackson.core</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jcodings</artifactId>
                    <groupId>org.jruby.jcodings</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>joni</artifactId>
                    <groupId>org.jruby.joni</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jsr305</artifactId>
                    <groupId>com.google.code.findbugs</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>libthrift</artifactId>
                    <groupId>org.apache.thrift</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>zookeeper</artifactId>
                    <groupId>org.apache.zookeeper</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>metrics-core</artifactId>
                    <groupId>io.dropwizard.metrics</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.phoenix</groupId>
            <artifactId>phoenix-hbase-compat-${hbase.version}</artifactId>
            <version>${phoenix.version}</version>
        </dependency>



        <dependency>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
            <version>2.5.0</version>
            <scope>provided</scope>
        </dependency>
        <!-- Oracle connection -->
        <dependency>
            <groupId>com.oracle</groupId>
            <artifactId>ojdbc6</artifactId>
            <version>11.2.0.3</version>
        </dependency>
        <!-- SQL Server connection -->
        <dependency>
            <groupId>com.microsoft.sqlserver</groupId>
            <artifactId>sqljdbc4</artifactId>
            <version>4.0</version>
        </dependency>
        <!-- MySQL access -->
        <dependency>
            <groupId>cn.psvmc</groupId>
            <artifactId>mysql8</artifactId>
            <version>1.1</version>
        </dependency>

        <!--mongodb-->
        <dependency>
            <groupId>org.mongodb</groupId>
            <artifactId>bson</artifactId>
            <version>3.12.11</version>
        </dependency>
        <dependency>
            <groupId>org.mongodb</groupId>
            <artifactId>mongodb-driver</artifactId>
            <version>3.12.11</version>
        </dependency>
        <dependency>
            <groupId>org.mongodb</groupId>
            <artifactId>mongodb-driver-core</artifactId>
            <version>3.12.11</version>
        </dependency>


        <!-- Utilities -->
        <dependency>
            <groupId>cn.hutool</groupId>
            <artifactId>hutool-all</artifactId>
            <version>5.5.9</version>
        </dependency>
        <dependency>
            <groupId>com.yxkj</groupId>
            <artifactId>yxdp-algorithm</artifactId>
            <version>1.0.0</version>
        </dependency>

        <dependency>
            <groupId>com.rabbitmq</groupId>
            <artifactId>amqp-client</artifactId>
            <version>4.1.0</version>
        </dependency>
    </dependencies>


    <build>
        <plugins>
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.4.6</version>

                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>3.0.0</version>
                <configuration>
                    <encoding>UTF-8</encoding>
                    <archive>
                        <manifest>
                            <mainClass>com.xhkjedu.TCS</mainClass>
                        </manifest>
                    </archive>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                        <!-- Extra configuration -->
                        <configuration>
                            <!-- Path to assembly.xml -->
                            <descriptors>
                                <descriptor>src/assembly/assembly.xml</descriptor>
                            </descriptors>
                        </configuration>
                    </execution>
                </executions>
            </plugin>

        </plugins>
    </build>

</project>

Flink 1.17 version configuration

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.xhkjedu</groupId>
    <artifactId>yxzt-data-tcs</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>

    <properties>
        <maven.compiler.source>8</maven.compiler.source>
        <maven.compiler.target>8</maven.compiler.target>
        <hadoop.version>2.10.2</hadoop.version>
        <hive.version>2.3.9</hive.version>
        <flink.version>1.17.1</flink.version>
        <scala.binary.version>2.12</scala.binary.version>
        <!-- Not every version is available; 2.5.0 is compatible with 2.5.6 -->
        <hbase.version>2.5.0</hbase.version>
        <phoenix.version>5.1.3</phoenix.version>

        <!-- Encoding used when copying files -->
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <!-- Encoding used when compiling -->
        <maven.compiler.encoding>UTF-8</maven.compiler.encoding>
    </properties>

    <repositories>
        <repository>
            <id>alimaven1</id>
            <name>aliyun maven1</name>
            <url>https://maven.aliyun.com/repository/spring-plugin</url>
        </repository>
        <repository>
            <id>alimaven2</id>
            <name>aliyun maven2</name>
            <url>https://maven.aliyun.com/repository/public</url>
        </repository>
    </repositories>


    <dependencies>
        <dependency>
            <groupId>jdk.tools</groupId>
            <artifactId>jdk.tools</artifactId>
            <version>1.8</version>
            <scope>system</scope>
            <systemPath>${env.JAVA_HOME}/lib/tools.jar</systemPath>
        </dependency>

        <!-- Flink core APIs -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients</artifactId>
            <version>${flink.version}</version>
            <exclusions>
                <exclusion>
                    <artifactId>flink-shaded-guava</artifactId>
                    <groupId>org.apache.flink</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-common</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java-bridge</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-scala-bridge_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <!-- Hive connector support -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-hive_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <version>${hive.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>org.apache.calcite</groupId>
                    <artifactId>calcite-core</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.calcite</groupId>
                    <artifactId>calcite-avatica</artifactId>
                </exclusion>
                <exclusion>
                    <artifactId>hadoop-hdfs</artifactId>
                    <groupId>org.apache.hadoop</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>commons-lang</artifactId>
                    <groupId>commons-lang</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>commons-collections</artifactId>
                    <groupId>commons-collections</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>protobuf-java</artifactId>
                    <groupId>com.google.protobuf</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>

            </exclusions>
        </dependency>

        <!-- Reading Hadoop files -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>${hadoop.version}</version>
            <exclusions>
                <exclusion>
                    <artifactId>commons-lang</artifactId>
                    <groupId>commons-lang</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>commons-collections</artifactId>
                    <groupId>commons-collections</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>protobuf-java</artifactId>
                    <groupId>com.google.protobuf</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>

            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-hadoop-compatibility_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>


        <!-- HBase access -->
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>${hbase.version}</version>
            <exclusions>
                <exclusion>
                    <artifactId>protobuf-java</artifactId>
                    <groupId>com.google.protobuf</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>metrics-core</artifactId>
                    <groupId>io.dropwizard.metrics</groupId>
                </exclusion>
            </exclusions>
        </dependency>


        <!-- JSON utilities -->
        <dependency>
            <groupId>com.alibaba.fastjson2</groupId>
            <artifactId>fastjson2</artifactId>
            <version>2.0.22</version>
        </dependency>

        <!--Hive JDBC-->
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-jdbc</artifactId>
            <version>${hive.version}</version>
            <exclusions>
                <exclusion>
                    <artifactId>protobuf-java</artifactId>
                    <groupId>com.google.protobuf</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>metrics-core</artifactId>
                    <groupId>io.dropwizard.metrics</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>metrics-core</artifactId>
                    <groupId>com.yammer.metrics</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>com.codahale.metrics</groupId>
            <artifactId>metrics-core</artifactId>
            <version>3.0.2</version>
        </dependency>

        <!-- PostgreSQL connection -->
        <dependency>
            <groupId>org.postgresql</groupId>
            <artifactId>postgresql</artifactId>
            <version>42.2.2</version>
        </dependency>

        <!--Phoenix-->
        <dependency>
            <groupId>org.apache.phoenix</groupId>
            <artifactId>phoenix-core</artifactId>
            <version>${phoenix.version}</version>
            <exclusions>
                <exclusion>
                    <artifactId>hadoop-auth</artifactId>
                    <groupId>org.apache.hadoop</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hadoop-yarn-api</artifactId>
                    <groupId>org.apache.hadoop</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-shaded-protobuf</artifactId>
                    <groupId>org.apache.hbase.thirdparty</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-shaded-miscellaneous</artifactId>
                    <groupId>org.apache.hbase.thirdparty</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-protocol</artifactId>
                    <groupId>org.apache.hbase</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-protocol-shaded</artifactId>
                    <groupId>org.apache.hbase</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hadoop-mapreduce-client-core</artifactId>
                    <groupId>org.apache.hadoop</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-client</artifactId>
                    <groupId>org.apache.hbase</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-common</artifactId>
                    <groupId>org.apache.hbase</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-hadoop-compat</artifactId>
                    <groupId>org.apache.hbase</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hbase-hadoop2-compat</artifactId>
                    <groupId>org.apache.hbase</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>antlr-runtime</artifactId>
                    <groupId>org.antlr</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>commons-codec</artifactId>
                    <groupId>commons-codec</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>commons-io</artifactId>
                    <groupId>commons-io</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>commons-lang3</artifactId>
                    <groupId>org.apache.commons</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jackson-databind</artifactId>
                    <groupId>com.fasterxml.jackson.core</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jcodings</artifactId>
                    <groupId>org.jruby.jcodings</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>joni</artifactId>
                    <groupId>org.jruby.joni</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jsr305</artifactId>
                    <groupId>com.google.code.findbugs</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>libthrift</artifactId>
                    <groupId>org.apache.thrift</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>slf4j-api</artifactId>
                    <groupId>org.slf4j</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>zookeeper</artifactId>
                    <groupId>org.apache.zookeeper</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>metrics-core</artifactId>
                    <groupId>io.dropwizard.metrics</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.phoenix</groupId>
            <artifactId>phoenix-hbase-compat-${hbase.version}</artifactId>
            <version>${phoenix.version}</version>
        </dependency>



        <dependency>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
            <version>2.5.0</version>
            <scope>provided</scope>
        </dependency>
        <!-- Oracle connection -->
        <dependency>
            <groupId>com.oracle</groupId>
            <artifactId>ojdbc6</artifactId>
            <version>11.2.0.3</version>
        </dependency>
        <!-- SQL Server connection -->
        <dependency>
            <groupId>com.microsoft.sqlserver</groupId>
            <artifactId>sqljdbc4</artifactId>
            <version>4.0</version>
        </dependency>
        <!-- MySQL access -->
        <dependency>
            <groupId>cn.psvmc</groupId>
            <artifactId>mysql8</artifactId>
            <version>1.1</version>
        </dependency>

        <!--mongodb-->
        <dependency>
            <groupId>org.mongodb</groupId>
            <artifactId>bson</artifactId>
            <version>3.12.11</version>
        </dependency>
        <dependency>
            <groupId>org.mongodb</groupId>
            <artifactId>mongodb-driver</artifactId>
            <version>3.12.11</version>
        </dependency>
        <dependency>
            <groupId>org.mongodb</groupId>
            <artifactId>mongodb-driver-core</artifactId>
            <version>3.12.11</version>
        </dependency>


        <!-- Utilities -->
        <dependency>
            <groupId>cn.hutool</groupId>
            <artifactId>hutool-all</artifactId>
            <version>5.5.9</version>
        </dependency>
        <dependency>
            <groupId>com.yxkj</groupId>
            <artifactId>yxdp-algorithm</artifactId>
            <version>1.0.0</version>
        </dependency>

        <dependency>
            <groupId>com.rabbitmq</groupId>
            <artifactId>amqp-client</artifactId>
            <version>4.1.0</version>
        </dependency>
    </dependencies>


    <build>
        <plugins>
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.4.6</version>

                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>3.0.0</version>
                <configuration>
                    <encoding>UTF-8</encoding>
                    <archive>
                        <manifest>
                            <mainClass>com.xhkjedu.TCS</mainClass>
                        </manifest>
                    </archive>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                        <!-- Extra configuration -->
                        <configuration>
                            <!-- Path to assembly.xml -->
                            <descriptors>
                                <descriptor>src/assembly/assembly.xml</descriptor>
                            </descriptors>
                        </configuration>
                    </execution>
                </executions>
            </plugin>

        </plugins>
    </build>
</project>

Submit the packaged job to YARN in application mode (with Kerberos credentials):
flink run-application -t yarn-application \
  -yD security.kerberos.login.keytab=/data/kerberos/kerberos.keytab \
  -yD security.kerberos.login.principal=hdfs/hdp01@HADOOP.COM \
  /root/yxzt-data-tcs-1.0-SNAPSHOT-jar-with-dependencies.jar -job /root/zjhome/test.json

Or submit with a plain flink run:
flink run \
  -yD security.kerberos.login.keytab=/data/kerberos/kerberos.keytab \
  -yD security.kerberos.login.principal=hdfs/hdp01@HADOOP.COM \
  /root/yxzt-data-tcs-1.0-SNAPSHOT-jar-with-dependencies.jar -job /root/zjhome/test.json