Error when writing a DataFrame (DF) to ES: org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version


Fixing the error thrown when Spark SQL writes to ES (specifically, the error that appears after authentication/user permissions are enabled on the cluster)

org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'

The full error message and stack trace:

Exception in thread "main" org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.elasticsearch.hadoop.rest.InitializationUtils.discoverClusterInfo(InitializationUtils.java:340)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.elasticsearch.spark.sql.EsSparkSQL$.saveToEs(EsSparkSQL.scala:97)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.elasticsearch.spark.sql.ElasticsearchRelation.insert(DefaultSource.scala:620)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.elasticsearch.spark.sql.DefaultSource.createRelation(DefaultSource.scala:107)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at com.pukka.bigdata.dealwith.toes.SparkToEs$.main(SparkToEs.scala:112)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at com.pukka.bigdata.dealwith.toes.SparkToEs.main(SparkToEs.scala)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at java.lang.reflect.Method.invoke(Method.java:498)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15-11-2022 14:43:56 CST playrecordtoes INFO - Caused by: org.elasticsearch.hadoop.rest.EsHadoopInvalidRequest: org.elasticsearch.hadoop.rest.EsHadoopRemoteException: security_exception: missing authentication credentials for REST request [/]
15-11-2022 14:43:56 CST playrecordtoes INFO - null
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.elasticsearch.hadoop.rest.RestClient.checkResponse(RestClient.java:469)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:426)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:388)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:392)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.elasticsearch.hadoop.rest.RestClient.get(RestClient.java:168)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.elasticsearch.hadoop.rest.RestClient.mainInfo(RestClient.java:735)
15-11-2022 14:43:56 CST playrecordtoes INFO - 	at org.elasticsearch.hadoop.rest.InitializationUtils.discoverClusterInfo(InitializationUtils.java:330)

Resolution

Although the top-level exception only says "Cannot detect ES version", the Caused by clause in the stack trace shows the real cause: the cluster has authentication enabled and rejects the unauthenticated request (security_exception: missing authentication credentials for REST request [/]). The fix is to supply the ES credentials in the connector options by adding "es.net.http.auth.user" and "es.net.http.auth.pass":

import org.apache.spark.sql.SaveMode

// year, month, dataFrame and spark are defined earlier in the job
val options = Map(
  // create the index automatically if it does not exist yet
  "es.index.auto.create" -> "true",
  // connect only through the listed nodes (WAN/cloud-style access)
  "es.nodes.wan.only" -> "true",
  // ES node addresses
  "es.nodes" -> "192.168.20.131,192.168.20.130",
  // ES HTTP port
  "es.port" -> "9200",
  // credentials for the secured cluster -- this is the actual fix
  "es.net.http.auth.user" -> "elastic",
  "es.net.http.auth.pass" -> "test@2022!"
)

// target index play_record_$year$month, document type _doc
val esPath = s"play_record_$year$month/_doc"

dataFrame
  .write
  .format("org.elasticsearch.spark.sql")
  .options(options)
  .mode(SaveMode.Append)
  .save(esPath)

spark.stop()

That concludes the fix: once the "es.net.http.auth.user" and "es.net.http.auth.pass" options are supplied, the "Cannot detect ES version" error when writing the DataFrame to ES goes away.
