To get a faster response, please provide the following information and describe the problem clearly.
【DM version】:
【Operating system】:
【CPU】:
【Problem description】*: A Java program runs a Kettle job that bulk-inserts data into a DM (Dameng) database. The insert breaks off partway through: after 194,000 rows have been written, no further rows go through.
Error log:
org.pentaho.di.core.exception.KettleDatabaseException:
2025/05/19 18:34:35 - 表输出.0 - Error performing rollback on connection
2025/05/19 18:34:35 - 表输出.0 - Connection is colsed or not build
2025/05/19 18:34:35 - 表输出.0 -
2025/05/19 18:34:35 - 表输出.0 - at org.pentaho.di.core.database.Database.rollback(Database.java:902)
2025/05/19 18:34:35 - 表输出.0 - at org.pentaho.di.core.database.Database.rollback(Database.java:880)
2025/05/19 18:34:35 - 表输出.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.dispose(TableOutput.java:621)
2025/05/19 18:34:35 - 表输出.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:96)
2025/05/19 18:34:35 - 表输出.0 - at java.lang.Thread.run(Thread.java:750)
2025/05/19 18:34:35 - 表输出.0 - Caused by: dm.jdbc.driver.DMException: Connection is colsed or not build
2025/05/19 18:34:35 - 表输出.0 - at dm.jdbc.driver.DBError.throwException(DBError.java:678)
2025/05/19 18:34:35 - 表输出.0 - at dm.jdbc.driver.DmdbConnection.checkClosed(DmdbConnection.java:906)
2025/05/19 18:34:35 - 表输出.0 - at dm.jdbc.driver.DmdbConnection.do_rollback(DmdbConnection.java:836)
2025/05/19 18:34:35 - 表输出.0 - at dm.jdbc.driver.DmdbConnection.rollback(DmdbConnection.java:1415)
2025/05/19 18:34:35 - 表输出.0 - at org.pentaho.di.core.database.Database.rollback(Database.java:893)
2025/05/19 18:34:35 - 表输出.0 - ... 4 more
2025/05/19 18:34:35 - 表输出.0 - Finished processing (I=0, O=194000, R=194500, W=194000, U=0, E=1)
2025/05/19 18:34:35 - T_GZ45_VB_SECURITY - Transformation detected one or more steps with errors.
2025/05/19 18:34:35 - T_GZ45_VB_SECURITY - Transformation is killing the other steps!
2025/05/19 18:34:35 - DB_SOURCE - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Error committing:
2025/05/19 18:34:35 - DB_SOURCE -
2025/05/19 18:34:35 - DB_SOURCE - Error comitting connection
2025/05/19 18:34:35 - DB_SOURCE - No more data to read from socket
The connection was dropped. A few directions to check:
1. Whether the database server was still alive when the import stopped.
2. Whether a connection timeout is configured in Kettle.
3. Whether the Kettle process crashed from excessive memory use at the point of interruption.
4. Whether a user connection idle timeout is configured in the database.
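As a defensive measure on the Java side, the batch-insert loop can validate the connection before each commit and reopen it if it has gone away. The sketch below is generic, not DM- or Kettle-specific: `isAlive` and `reopen` are hypothetical stand-ins for `conn.isValid(5)` (standard JDBC 4.0) and `DriverManager.getConnection(url, user, pass)`, and the simulated "connection" here simply drops every third batch.

```java
import java.util.function.BooleanSupplier;

// Minimal sketch: before committing each batch, verify the connection is
// still alive and reopen it if not. With a real JDBC connection, isAlive
// would be conn.isValid(5) and reopen would call DriverManager.getConnection.
public class KeepAliveSketch {
    static int insertAll(int totalRows, int batchSize,
                         BooleanSupplier isAlive, Runnable reopen) {
        int written = 0;
        while (written < totalRows) {
            if (!isAlive.getAsBoolean()) {
                reopen.run();               // reconnect before this batch
            }
            int n = Math.min(batchSize, totalRows - written);
            written += n;                   // stands in for executeBatch() + commit()
        }
        return written;
    }

    public static void main(String[] args) {
        int[] reconnects = {0};
        int[] tick = {0};
        // Simulated connection that drops on every third liveness check.
        BooleanSupplier alive = () -> (++tick[0] % 3) != 0;
        int written = insertAll(194500, 500, alive, () -> reconnects[0]++);
        System.out.println(written + " rows, " + reconnects[0] + " reconnects");
        // prints: 194500 rows, 129 reconnects
    }
}
```

This only masks the symptom, though; if the server or a user-level idle timeout is closing sessions mid-load, fixing that setting (direction 4 above) is the real cure.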