[Solved] Flink JDBC error: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.

Symptom: a Flink streaming job that writes to MySQL through plain JDBC fails immediately at startup with the exception chain below.

Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet successfully received from the server was 1,102 milliseconds ago.  The last packet sent successfully to the server was 1,095 milliseconds ago.
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	Caused by: javax.net.ssl.SSLHandshakeException: No appropriate protocol (protocol is disabled or cipher suites are inappropriate)
	
	

Environment:

Flink 1.13.6
MySQL 5.7.27
JDK 8
Hadoop 3.1.4
IDE: IntelliJ IDEA 2020

Maven dependencies (pom.xml)

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>flink-test</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <flink.version>1.13.6</flink.version>
    </properties>

    <dependencies>
        <!--        JDBC            -->
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>8.0.21</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.38</version>
        </dependency>

        <!--        Flink           -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-java</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_2.12</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-hadoop-2-uber</artifactId>
            <version>2.7.5-10.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-jdbc_2.12</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.12</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <!--            Kafka           -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka_2.12</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>0.11.0.0</version>
        </dependency>
        <!--            Log4j         -->
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
        </dependency>


    </dependencies>

</project>
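One thing worth noticing in this POM: mysql-connector-java is declared twice (8.0.21 and 5.1.38). When the same groupId/artifactId appears twice in one POM, Maven keeps only the last declaration, which is why the runtime classpath below contains the 5.1.38 jar and not the 8.0.21 one; that old driver is exactly what triggers the TLS handshake failure. A cleaner sketch keeps a single, recent declaration (5.1.49 was the final 5.1.x release):

```xml
<!-- keep exactly one driver declaration instead of two -->
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.49</version>
</dependency>
```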

Error output

C:\Users\Lenovo\Documents\jdk\bin\java.exe -Dfile.encoding=UTF-8 -classpath <JDK and Maven-repository jars trimmed for readability; the classpath contains mysql-connector-java-5.1.38.jar and no 8.0.21 jar, confirming which driver was actually loaded> day10610.FlinkJDBC
log4j:WARN No appenders could be found for logger (org.apache.flink.api.java.ClosureCleaner).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
	at org.apache.flink.runtime.minicluster.MiniClusterJobClient.lambda$getJobExecutionResult$3(MiniClusterJobClient.java:137)
	at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
	at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
	at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
	at org.apache.flink.runtime.rpc.akka.AkkaInvocationHandler.lambda$invokeRpc$0(AkkaInvocationHandler.java:237)
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
	at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
	at org.apache.flink.runtime.concurrent.FutureUtils$1.onComplete(FutureUtils.java:1081)
	at akka.dispatch.OnComplete.internal(Future.scala:264)
	at akka.dispatch.OnComplete.internal(Future.scala:261)
	at akka.dispatch.japi$CallbackBridge.apply(Future.scala:191)
	at akka.dispatch.japi$CallbackBridge.apply(Future.scala:188)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
	at org.apache.flink.runtime.concurrent.Executors$DirectExecutionContext.execute(Executors.java:73)
	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:68)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:284)
	at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:284)
	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:284)
	at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:573)
	at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:22)
	at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:21)
	at scala.concurrent.Future.$anonfun$andThen$1(Future.scala:532)
	at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:29)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:29)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
	at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:91)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:81)
	at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:91)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:44)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
	at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
	at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
	at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:216)
	at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:206)
	at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:197)
	at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:682)
	at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:79)
	at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:435)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:123)
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
	at akka.actor.Actor.aroundReceive(Actor.scala:517)
	at akka.actor.Actor.aroundReceive$(Actor.scala:515)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	... 4 more
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet successfully received from the server was 1,102 milliseconds ago.  The last packet sent successfully to the server was 1,095 milliseconds ago.
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at com.mysql.jdbc.Util.handleNewInstance(Util.java:404)
	at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:981)
	at com.mysql.jdbc.ExportControlled.transformSocketToSSLSocket(ExportControlled.java:164)
	at com.mysql.jdbc.MysqlIO.negotiateSSLConnection(MysqlIO.java:4801)
	at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1643)
	at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1215)
	at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2255)
	at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2286)
	at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2085)
	at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:795)
	at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at com.mysql.jdbc.Util.handleNewInstance(Util.java:404)
	at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:400)
	at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:327)
	at java.sql.DriverManager.getConnection(DriverManager.java:664)
	at java.sql.DriverManager.getConnection(DriverManager.java:247)
	at day10610.FlinkJDBC$MySQLSink.open(FlinkJDBC.java:35)
	at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:34)
	at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:102)
	at org.apache.flink.streaming.api.operators.StreamSink.open(StreamSink.java:46)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain.initializeStateAndOpenOperators(OperatorChain.java:442)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.restoreGates(StreamTask.java:585)
	at org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$1.call(StreamTaskActionExecutor.java:55)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.executeRestore(StreamTask.java:565)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.runWithCleanUpOnFail(StreamTask.java:650)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.restore(StreamTask.java:540)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:759)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:566)
	at java.lang.Thread.run(Thread.java:750)
Caused by: javax.net.ssl.SSLHandshakeException: No appropriate protocol (protocol is disabled or cipher suites are inappropriate)
	at sun.security.ssl.HandshakeContext.<init>(HandshakeContext.java:171)
	at sun.security.ssl.ClientHandshakeContext.<init>(ClientHandshakeContext.java:106)
	at sun.security.ssl.TransportContext.kickstart(TransportContext.java:238)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:410)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:389)
	at com.mysql.jdbc.ExportControlled.transformSocketToSSLSocket(ExportControlled.java:149)
	... 30 more

Process finished with exit code 1

The code

package day10610;

import org.apache.flink.api.common.RuntimeExecutionMode;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class FlinkJDBC {

    public static void main(String[] args) throws Exception {
        // 1. env - set up the execution environment
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setRuntimeMode(RuntimeExecutionMode.AUTOMATIC);
        // 2. source - load the data
        DataStream<Student> studentDS = env.fromElements(new Student(0, "lucy", 19));
        // 3. transformation - none needed here
        // 4. sink - write the data out
        studentDS.addSink(new MySQLSink());
        // 5. execute the job
        env.execute();
    }

    private static class MySQLSink extends RichSinkFunction<Student> {
        Connection conn = null;
        PreparedStatement ps = null;

        // open the JDBC connection once, when the sink instance starts
        @Override
        public void open(Configuration parameters) throws Exception {
            conn = DriverManager.getConnection("jdbc:mysql://hadoop10:3306/yangyulin?useSSL=true", "root", "0000");
            ps = conn.prepareStatement("INSERT INTO `t_student` (`id`, `name`, `age`) VALUES (999, ?, ?)");
        }

        @Override
        public void invoke(Student value, Context context) throws Exception {
            // bind the ? placeholders
            ps.setString(1, value.getName());
            ps.setInt(2, value.getAge());
            // run the INSERT against MySQL
            ps.executeUpdate();
        }

        // release JDBC resources once, when the sink shuts down
        @Override
        public void close() throws Exception {
            // close the statement before the connection that created it
            if (ps != null) ps.close();
            if (conn != null) conn.close();
        }
    }
}
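The Student class is not shown in the post; a minimal POJO consistent with the calls above (new Student(0, "lucy", 19), getName(), getAge()) would look like this. The field names are inferred from the INSERT statement, and the public no-arg constructor plus getters/setters are what Flink's POJO serializer expects:

```java
import java.io.Serializable;

// Minimal Student POJO; fields inferred from how FlinkJDBC uses it.
public class Student implements Serializable {
    private int id;
    private String name;
    private int age;

    // Flink's POJO rules require a public no-arg constructor
    public Student() {}

    public Student(int id, String name, int age) {
        this.id = id;
        this.name = name;
        this.age = age;
    }

    public int getId() { return id; }
    public void setId(int id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
}
```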

The fix

Change

conn = DriverManager.getConnection("jdbc:mysql://hadoop10:3306/yangyulin?useSSL=true", "root", "0000");

to

conn = DriverManager.getConnection("jdbc:mysql://hadoop10:3306/yangyulin?useSSL=false", "root", "0000");

Changing useSSL=true to useSSL=false is enough to make the job run. The reason is in the innermost exception: javax.net.ssl.SSLHandshakeException: No appropriate protocol. The old 5.1.38 driver offers only TLSv1/TLSv1.1 during the handshake, and recent JDK 8 builds (8u291 and later) disable those protocols by default, so the SSL handshake fails before authentication even begins. useSSL=false skips the handshake entirely, which is acceptable on a trusted test network; for anything else, upgrade the driver or force TLSv1.2 instead.
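For reference, the connection-URL variants can be written out side by side. Host hadoop10 and database yangyulin come from the original code; enabledTLSProtocols is a real Connector/J option, but note it only exists in later 5.1.x releases (roughly 5.1.44+), not in 5.1.38 itself:

```java
// Sketch of the URL options for working around the handshake failure.
public class JdbcUrlOptions {

    // Option 1: disable SSL entirely (the fix used in this post;
    // fine for a trusted test network).
    public static final String NO_SSL =
            "jdbc:mysql://hadoop10:3306/yangyulin?useSSL=false";

    // Option 2: keep SSL but pin a protocol the JDK still enables
    // (requires a driver new enough to know enabledTLSProtocols).
    public static final String FORCE_TLS12 =
            "jdbc:mysql://hadoop10:3306/yangyulin?useSSL=true&enabledTLSProtocols=TLSv1.2";
}
```

Option 3, not shown as code, is upgrading to mysql-connector-java 8.x (and deleting the 5.1.38 declaration from the POM), since the 8.x driver negotiates TLSv1.2 on its own.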


2023-06-28 update:
"Job execution failed." is just Flink's generic wrapper around whatever runtime exception actually killed the job. The useful information is in the "Caused by" chain and the task logs, so read the log and check the logic in your own job code.
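Since the actionable error only surfaces in the log, make sure logging is actually wired up: the "log4j:WARN No appenders could be found" lines at the top of the output mean the job's own log output is being dropped. With the log4j 1.2.17 dependency from the POM above, a minimal src/main/resources/log4j.properties (a sketch; adjust the level to taste) turns console logging on:

```properties
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1} - %m%n
```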
