
Problem Description

The ES deployment is a cluster. The network from the ETL server to every ES node is reachable, and ES data can be queried from the ETL server with curl. Nevertheless, executing the ES data source node fails with the following error:

2024-04-24 16:59:21.448  [882] ERROR node.GenericNode.handleExecuteError:149 [tid=697e65be22fbc42e] - Node execution failed.(id:b032fc7852bc1234a4c00c9d0e7424e6,name:ES_NODE)
smartbix.datamining.engine.execute.exception.ExecuteException: Connection exception: check that the node address is correct and reachable; see the log for the full error details!
	at smartbix.datamining.engine.execute.node.datasource.EsDataSourceNode.execute(EsDataSourceNode.java:86) ~[EngineCommonNode-1.0.jar:?]
	at smartbix.datamining.engine.execute.node.GenericNode.start(GenericNode.java:118) ~[EngineCore-1.0.jar:?]
	at smartbix.datamining.engine.agent.execute.executor.DefaultNodeExecutor.execute(DefaultNodeExecutor.java:54) ~[EngineAgent-1.0.jar:?]
	at smartbix.datamining.engine.agent.execute.launcher.DefaultLauncher.run(DefaultLauncher.java:79) ~[EngineAgent-1.0.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112]
	at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_112]
Caused by: org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'
	at org.elasticsearch.hadoop.rest.InitializationUtils.discoverClusterInfo(InitializationUtils.java:403) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.spark.sql.ElasticsearchRelation.cfg$lzycompute(DefaultSource.scala:234) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.spark.sql.ElasticsearchRelation.cfg(DefaultSource.scala:231) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.spark.sql.ElasticsearchRelation.lazySchema$lzycompute(DefaultSource.scala:238) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.spark.sql.ElasticsearchRelation.lazySchema(DefaultSource.scala:238) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.spark.sql.ElasticsearchRelation.$anonfun$schema$1(DefaultSource.scala:242) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at scala.Option.getOrElse(Option.scala:189) ~[scala-library-2.12.17.jar:?]
	at org.elasticsearch.spark.sql.ElasticsearchRelation.schema(DefaultSource.scala:242) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:434) ~[spark-sql_2.12-3.4.1.jar:3.4.1]
	at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:229) ~[spark-sql_2.12-3.4.1.jar:3.4.1]
	at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:211) ~[spark-sql_2.12-3.4.1.jar:3.4.1]
	at scala.Option.getOrElse(Option.scala:189) ~[scala-library-2.12.17.jar:?]
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211) ~[spark-sql_2.12-3.4.1.jar:3.4.1]
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:172) ~[spark-sql_2.12-3.4.1.jar:3.4.1]
	at smartbix.datamining.engine.execute.node.datasource.EsDataSourceNode.execute(EsDataSourceNode.java:81) ~[EngineCommonNode-1.0.jar:?]
	... 8 more
Caused by: org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed; tried [[10.11.110.97:9200, 10.11.110.103:9200]] 
	at org.elasticsearch.hadoop.rest.NetworkClient.execute(NetworkClient.java:160) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:442) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:438) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:406) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.hadoop.rest.RestClient.mainInfo(RestClient.java:755) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.hadoop.rest.InitializationUtils.discoverClusterInfo(InitializationUtils.java:393) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.spark.sql.ElasticsearchRelation.cfg$lzycompute(DefaultSource.scala:234) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.spark.sql.ElasticsearchRelation.cfg(DefaultSource.scala:231) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.spark.sql.ElasticsearchRelation.lazySchema$lzycompute(DefaultSource.scala:238) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.spark.sql.ElasticsearchRelation.lazySchema(DefaultSource.scala:238) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.elasticsearch.spark.sql.ElasticsearchRelation.$anonfun$schema$1(DefaultSource.scala:242) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at scala.Option.getOrElse(Option.scala:189) ~[scala-library-2.12.17.jar:?]
	at org.elasticsearch.spark.sql.ElasticsearchRelation.schema(DefaultSource.scala:242) ~[elasticsearch-spark-30_2.12-8.11.1.jar:8.11.1]
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:434) ~[spark-sql_2.12-3.4.1.jar:3.4.1]
	at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:229) ~[spark-sql_2.12-3.4.1.jar:3.4.1]
	at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:211) ~[spark-sql_2.12-3.4.1.jar:3.4.1]
	at scala.Option.getOrElse(Option.scala:189) ~[scala-library-2.12.17.jar:?]
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211) ~[spark-sql_2.12-3.4.1.jar:3.4.1]
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:172) ~[spark-sql_2.12-3.4.1.jar:3.4.1]
	at smartbix.datamining.engine.execute.node.datasource.EsDataSourceNode.execute(EsDataSourceNode.java:81) ~[EngineCommonNode-1.0.jar:?]
	... 8 more
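The problem description notes that curl succeeds from the ETL server, yet the Spark connector still reports that all nodes failed. Before changing connector settings, the same reachability claim can be spot-checked from the engine host itself. A minimal sketch (the function name, ports, and timeout are illustrative, not part of the product):

```python
import socket


def es_node_reachable(host: str, port: int = 9200, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        # create_connection performs DNS resolution plus a TCP connect.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and unreachable hosts.
        return False


if __name__ == "__main__":
    # IPs taken from the error log above; adjust for your environment.
    for node in ("10.11.110.97", "10.11.110.103"):
        print(node, es_node_reachable(node))
```

If this check passes but the node execution still fails, the issue is usually not raw connectivity but the connector's node-discovery behavior, which is what the solution below addresses.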

Solution

As the error message suggests, add the setting es.nodes.wan.only=true under "More Settings" of the ES data source. With this option enabled, the elasticsearch-hadoop connector talks only to the nodes declared in es.nodes and skips cluster node discovery, which avoids the failure that occurs when the addresses the cluster publishes internally are not routable from the ETL/engine host.
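In the data source's advanced settings this is a plain key=value entry:

```
es.nodes.wan.only=true
```

Note that in WAN-only mode all traffic goes through the declared nodes, so every address listed in es.nodes must itself be reachable from the engine host.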
