
(This document is for reference only.)

Problem Description

A StarRocks data source is connected in the system and used as the relational target table of an ETL job. When the job writes data to it, a connection timeout is reported. BI and the data mining engine are deployed on the same machine; BI connects to the database successfully, but the ETL job fails with a connection timeout:

2024-09-22 22:01:29.754 [745165] ERROR node.GenericNode.handleExecuteError:149 - Node execution failed.(id:ad2f9f28c9afd47a2de28a5e15d46e07,name:JDBC_DATATARGER_OVERWRITE)
smartbix.datamining.engine.execute.exception.ExecuteException: 写入数据失败,详情请查看日志!
	at smartbix.datamining.engine.execute.node.datasource.handler.SelectDBHandler.writeDataByStreamLoad(SelectDBHandler.java:85) ~[EngineCommonNode-1.0.jar:?]
	at smartbix.datamining.engine.execute.node.datasource.handler.SelectDBHandler.executor(SelectDBHandler.java:51) ~[EngineCommonNode-1.0.jar:?]
	at smartbix.datamining.engine.execute.node.datatarget.JdbcDataTargetNode.execute(JdbcDataTargetNode.java:181) ~[EngineCommonNode-1.0.jar:?]
	at smartbix.datamining.engine.execute.node.GenericNode.start(GenericNode.java:118) ~[EngineCore-1.0.jar:?]
	at smartbix.datamining.engine.agent.execute.executor.DefaultNodeExecutor.execute(DefaultNodeExecutor.java:50) ~[EngineAgent-1.0.jar:?]
	at smartbix.datamining.engine.agent.execute.launcher.DefaultLauncher.run(DefaultLauncher.java:79) ~[EngineAgent-1.0.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_202]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_202]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_202]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_202]
	at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_202]
Caused by: smartbix.datamining.engine.execute.exception.ExecuteException: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 74455.0 failed 1 times, most recent failure: Lost task 0.0 in stage 74455.0 (TID 62126) (10.0.0.5 executor driver): org.apache.http.conn.HttpHostConnectException: Connect to 203.176.93.159:8040 [/203.176.93.159] failed: 连接超时 (Connection timed out)
	at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:156)
	at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:376)
	at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:393)
	at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
	at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
	at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
	at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
	at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
	at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
	at smartbix.datamining.engine.execute.bean.StreamLoader.sendData(StreamLoader.java:189)
	at smartbix.datamining.engine.execute.bean.StreamLoader.access$200(StreamLoader.java:31)
	at smartbix.datamining.engine.execute.bean.StreamLoader$1.call(StreamLoader.java:124)
	at org.apache.spark.sql.Dataset.$anonfun$foreachPartition$2(Dataset.scala:3370)
	at org.apache.spark.sql.Dataset.$anonfun$foreachPartition$2$adapted(Dataset.scala:3370)
	at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2(RDD.scala:1009)
	at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2$adapted(RDD.scala:1009)
	at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2303)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:92)
	at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:161)
	at org.apache.spark.scheduler.Task.run(Task.scala:139)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:554)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1529)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:557)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.ConnectException: 连接超时 (Connection timed out)
	at java.net.PlainSocketImpl.socketConnect(Native Method)
	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
	at java.net.Socket.connect(Socket.java:589)
	at org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket(PlainConnectionSocketFactory.java:75)
	at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
	... 26 more


Solution

The stack trace shows that the ETL engine cannot reach port 8040 on the StarRocks server. When StarRocks is used as a target data source, two ports must be reachable from the ETL engine: 9030, the port used for querying data, and 8040, the port used for writing data (Stream Load). BI only needs 9030, which is why BI connections succeed while ETL writes time out. Open port 8040 in the firewall or security-group rules between the ETL machine and the StarRocks server.
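To confirm which of the two ports is blocked before changing firewall rules, a simple TCP connectivity check can be run from the ETL machine. This is a minimal sketch; the host IP below is taken from the error log and the timeout value is an assumption:

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Ports StarRocks needs when used as an ETL target:
    #   9030 - query port (BI/JDBC connections)
    #   8040 - write port (Stream Load, the one failing in the log)
    host = "203.176.93.159"  # StarRocks server from the error log
    for port in (9030, 8040):
        status = "reachable" if port_reachable(host, port) else "BLOCKED"
        print(f"{host}:{port} {status}")
```

If 9030 is reachable but 8040 is not, the result matches the symptom in this document: queries succeed while Stream Load writes time out.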
