Previewing a DB2 table failed: org.pentaho.di.core.exception.KettleDatabaseException - pentaho-data-integration

Previewing a DB2 table through Pentaho failed with the following errors; any advice would be appreciated:
2017/09/27 11:27:18 - Carte - Installing timer to purge stale objects after 1440 minutes.
2017/09/27 14:01:59 - C:\Projects\lovebuy\bbb.ktr : bbb - Dispatching started for transformation [C:\Projects\lovebuy\bbb.ktr : bbb]
2017/09/27 14:02:02 - Table input.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Unexpected error
2017/09/27 14:02:02 - Table input.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
2017/09/27 14:02:02 - Table input.0 - An error occurred executing SQL:
2017/09/27 14:02:02 - Table input.0 - SELECT * FROM TB_CUSTOMER
2017/09/27 14:02:02 - Table input.0 - DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=DB2ADMIN.TB_CUSTOMER, DRIVER=3.68.61
2017/09/27 14:02:02 - Table input.0 -
2017/09/27 14:02:02 - Table input.0 - at org.pentaho.di.core.database.Database.openQuery(Database.java:1764)
2017/09/27 14:02:02 - Table input.0 - at org.pentaho.di.trans.steps.tableinput.TableInput.doQuery(TableInput.java:236)
2017/09/27 14:02:02 - Table input.0 - at org.pentaho.di.trans.steps.tableinput.TableInput.processRow(TableInput.java:140)
2017/09/27 14:02:02 - Table input.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2017/09/27 14:02:02 - Table input.0 - at java.lang.Thread.run(Thread.java:745)
2017/09/27 14:02:02 - Table input.0 - Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=DB2ADMIN.TB_CUSTOMER, DRIVER=3.68.61
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.am.gd.a(gd.java:749)
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.am.gd.a(gd.java:66)
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.am.gd.a(gd.java:135)
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.am.uo.c(uo.java:2780)
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.am.uo.d(uo.java:2768)
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.am.uo.a(uo.java:2217)
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.t4.bb.h(bb.java:141)
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.t4.bb.b(bb.java:41)
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.t4.p.a(p.java:32)
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.t4.vb.i(vb.java:145)
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.am.uo.ib(uo.java:2186)
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.am.uo.a(uo.java:3267)
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.am.uo.a(uo.java:708)
2017/09/27 14:02:02 - Table input.0 - at com.ibm.db2.jcc.am.uo.executeQuery(uo.java:687)
2017/09/27 14:02:02 - Table input.0 - at org.pentaho.di.core.database.Database.openQuery(Database.java:1753)
2017/09/27 14:02:02 - Table input.0 - ... 4 more
2017/09/27 14:02:02 - Table input.0 - Finished reading query, closing connection.
2017/09/27 14:02:02 - Table input.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)
2017/09/27 14:02:02 - C:\Projects\lovebuy\bbb.ktr : bbb - Transformation detected one or more steps with errors.
2017/09/27 14:02:02 - C:\Projects\lovebuy\bbb.ktr : bbb - Transformation is killing the other steps!

Solved by correcting the schema qualifier: SELECT * FROM db_virtualbusiness.TB_SAL instead of SELECT * FROM db2admin.TB_SALE. SQLCODE=-204 (SQLSTATE 42704) means DB2 could not find the named object; here the table existed under a different schema than the one being used to qualify it.
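If it's unclear which schema actually owns the table, the DB2 catalog can be queried directly. Below is a minimal sketch, assuming a DB2 LUW server and the standard jcc JDBC driver; the host, database, and credentials are placeholders:

import java.sql.*;

public class FindTableSchema {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details; substitute your own.
        String url = "jdbc:db2://dbhost:50000/MYDB";
        try (Connection c = DriverManager.getConnection(url, "db2admin", "secret");
             PreparedStatement ps = c.prepareStatement(
                     "SELECT tabschema, tabname FROM syscat.tables WHERE tabname = ?")) {
            ps.setString(1, "TB_CUSTOMER");
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // Prints e.g. DB_VIRTUALBUSINESS.TB_CUSTOMER
                    System.out.println(rs.getString(1).trim() + "." + rs.getString(2));
                }
            }
        }
    }
}

Alternatively, setting the default schema on the Pentaho connection (or issuing SET CURRENT SCHEMA) avoids having to qualify every table name in the Table input step.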

Related

Error when running StartSonar.bat: "Failed to create table schema_migrations"

After running startSonar.bat I encountered the following error:
ERROR web[][o.s.s.p.PlatformImpl] Web server startup failed
java.lang.IllegalStateException: Failed to create table schema_migrations
...
Caused by:
org.h2.jdbc.JdbcSQLSyntaxErrorException: Syntax error in SQL statement "CREATE TABLE SCHEMA_MIGRATIONS (VERSION VARCHAR (۲۵۵[*]) NOT NULL) "; expected "long"; SQL statement:
CREATE TABLE schema_migrations (version VARCHAR (۲۵۵) NOT NULL) [42001-199]
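The giveaway here is the Eastern Arabic digits (۲۵۵) in the generated DDL: H2 expects an ASCII number for the VARCHAR length, but a JVM whose default locale formats digits that way can produce unparseable SQL. A minimal demonstration of the effect (the exact output depends on the JDK and its locale data):

import java.text.NumberFormat;
import java.util.Locale;

public class DigitDemo {
    public static void main(String[] args) {
        // Under a Persian locale, number formatting may emit
        // Eastern Arabic digits instead of ASCII ones.
        NumberFormat fa = NumberFormat.getInstance(Locale.forLanguageTag("fa"));
        System.out.println(fa.format(255)); // e.g. ۲۵۵ rather than 255
    }
}

Forcing an English locale on the SonarQube JVM (for example, -Duser.language=en -Duser.country=US in its JVM options) is the commonly reported fix.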

Not able to run a job (.kjb) in Pentaho Data Integration 9.2

I have tried running a job using Pentaho 9.2, but it is not picking up the job when run via the kitchen command:
./kitchen.sh -rep=repo1 -file=/var/lib/jenkins/project/path/etl/Job1.kjb
Job1.kjb contains multiple sub-jobs. Pentaho 9.2 is not picking up the sub-job and shows the following error.
Error Log:
2022/03/02 05:00:28 - Job1 - Start of job execution
2022/03/02 05:00:28 - Job1 - Starting entry [sub_job_1]
2022/03/02 05:00:28 - sub_job_1 - ERROR (version 9.2.0.0-290, build 9.2.0.0-290 from 2021-06-02 06.36.08 by buildguy) : Error running job entry 'job' :
2022/03/02 05:00:28 - sub_job_1 - ERROR (version 9.2.0.0-290, build 9.2.0.0-290 from 2021-06-02 06.36.08 by buildguy) : org.pentaho.di.core.exception.KettleException:
2022/03/02 05:00:28 - sub_job_1 - Unexpected error during job metadata load
2022/03/02 05:00:28 - sub_job_1 -
2022/03/02 05:00:28 - sub_job_1 - Unable to load the job from XML file [/var/lib/jenkins/project/path/etl/sub_job_1.kjb.kjb]
2022/03/02 05:00:28 - sub_job_1 -
2022/03/02 05:00:28 - sub_job_1 - File [file:///var/lib/jenkins/project/path/etl/sub_job_1.kjb.kjb] does not exists.
2022/03/02 05:00:28 - sub_job_1 -
2022/03/02 05:00:28 - sub_job_1 -
2022/03/02 05:00:28 - sub_job_1 -
2022/03/02 05:00:28 - sub_job_1 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMeta(JobEntryJob.java:1467)
2022/03/02 05:00:28 - sub_job_1 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMeta(JobEntryJob.java:1385)
2022/03/02 05:00:28 - sub_job_1 - at org.pentaho.di.job.entries.job.JobEntryJob.execute(JobEntryJob.java:695)
2022/03/02 05:00:28 - sub_job_1 - at org.pentaho.di.job.Job.execute(Job.java:693)
2022/03/02 05:00:28 - sub_job_1 - at org.pentaho.di.job.Job.execute(Job.java:834)
2022/03/02 05:00:28 - sub_job_1 - at org.pentaho.di.job.Job.execute(Job.java:503)
2022/03/02 05:00:28 - sub_job_1 - at org.pentaho.di.job.Job.run(Job.java:389)
2022/03/02 05:00:28 - sub_job_1 - Caused by: org.pentaho.di.core.exception.KettleXMLException:
2022/03/02 05:00:28 - sub_job_1 - Unable to load the job from XML file [/var/lib/jenkins/project/path/etl/sub_job_1.kjb.kjb]
You are not providing much detail, but maybe the problem is that in your Job1.kjb job you have not correctly defined the location of the sub-job. The log says it cannot find the file:
/var/lib/jenkins/project/path/etl/sub_job_1.kjb.kjb
Note the doubled extension: your sub-job is probably the file /var/lib/jenkins/project/path/etl/sub_job_1.kjb, and the job entry's filename setting likely already includes .kjb, so the extension ends up appended twice.

HikariCP connection broken/unavailable woes

I am unable to understand the reason behind intermittent HikariCP "Connection is not available" errors.
From the logs, it doesn't look like a connection leak. A bigger problem is that I am unable to reproduce the error predictably. The following is a sample log trace showing where the error starts; the gist contains it through to the end.
2017-12-12T19:31:55.958Z DEBUG <> [HikariPool-1 housekeeper] com.zaxxer.hikari.pool.HikariPool - HikariPool-1 - Pool stats (total=10, active=1, idle=9, waiting=0)
2017-12-12T19:31:57.052Z WARN <> [main] c.zaxxer.hikari.pool.ProxyConnection - HikariPool-1 - Connection org.postgresql.jdbc.PgConnection#1de5f0ef marked as broken because of SQLSTATE(08P01), ErrorCode(0)
org.postgresql.util.PSQLException: Expected command status BEGIN, got EMPTY.
at org.postgresql.core.v3.QueryExecutorImpl$1.handleCommandStatus(QueryExecutorImpl.java:515)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2180)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:288)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:430)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:356)
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:168)
at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:116)
at com.zaxxer.hikari.pool.ProxyPreparedStatement.executeQuery(ProxyPreparedStatement.java:52)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeQuery(HikariProxyPreparedStatement.java)
at org.hibernate.engine.jdbc.internal.ResultSetReturnImpl.extract(ResultSetReturnImpl.java:70)
at org.hibernate.id.SequenceGenerator.generateHolder(SequenceGenerator.java:116)
at org.hibernate.id.SequenceHiLoGenerator.generate(SequenceHiLoGenerator.java:62)
at org.hibernate.event.internal.AbstractSaveEventListener.saveWithGeneratedId(AbstractSaveEventListener.java:101)
at org.hibernate.jpa.event.internal.core.JpaPersistEventListener.saveWithGeneratedId(JpaPersistEventListener.java:67)
... more details in the gist
2017-12-12T19:31:57.067Z ERROR <> [main] c.o.r.s.t.RAUTwitterUserService - populateFromPayload: id=128, twitter_handle=non_local, exception=org.springframework.transaction.TransactionSystemException: Could not roll back JPA transaction; nested exception is javax.persistence.PersistenceException: unexpected error when rollbacking
2017-12-12T19:31:57.067Z INFO <> [main] c.o.r.s.t.TwitterCollectorService - RAU from RAU service: RAU(id=129, twitter=non_global)
2017-12-12T19:31:57.089Z DEBUG <> [HikariPool-1 connection adder] com.zaxxer.hikari.pool.HikariPool - HikariPool-1 - Added connection org.postgresql.jdbc.PgConnection#419dcf40
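One way to firm up the "not a leak" hypothesis: HikariCP can log a stack trace whenever a connection is held past a configurable threshold. A minimal sketch, assuming a plain HikariCP setup against the same Postgres database (URL and credentials are placeholders):

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public class PoolSetup {
    public static HikariDataSource newPool() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:postgresql://dbhost:5432/app"); // hypothetical URL
        config.setUsername("app");
        config.setPassword("secret");
        config.setMaximumPoolSize(10); // matches the pool stats above (total=10)
        // Log a stack trace for any connection held longer than 30 seconds.
        config.setLeakDetectionThreshold(30_000);
        return new HikariDataSource(config);
    }
}

If leak detection stays quiet, the "marked as broken" warning above points instead at connections being invalidated mid-transaction (SQLSTATE 08P01 is a protocol violation), with the pool evicting and replacing them each time.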

Pentaho: Not able to write data from Pentaho to BigQuery

I am using Starschema's JDBC driver to connect Pentaho to BigQuery. I can successfully fetch data from BigQuery into Pentaho; however, I am not able to write data from Pentaho into BigQuery. An exception is thrown while inserting rows into BigQuery, and it seems the operation may not be supported. How do I solve this?
Error message:
2017/10/30 14:27:43 - Table output 2.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Because of an error, this step can't continue:
2017/10/30 14:27:43 - Table output 2.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : org.pentaho.di.core.exception.KettleException:
2017/10/30 14:27:43 - Table output 2.0 - Error inserting row into table [TableID] with values: [A], [I], [G], [1], [2016-02-18], [11], [2016-02-18-12.00.00.123456], [GG], [CB], [132], [null], [null], [null]
2017/10/30 14:27:43 - Table output 2.0 -
2017/10/30 14:27:43 - Table output 2.0 - Error inserting/updating row
2017/10/30 14:27:43 - Table output 2.0 - executeUpdate()
2017/10/30 14:27:43 - Table output 2.0 -
2017/10/30 14:27:43 - Table output 2.0 -
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:385)
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:125)
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2017/10/30 14:27:43 - Table output 2.0 - at java.lang.Thread.run(Unknown Source)
2017/10/30 14:27:43 - Table output 2.0 - Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
2017/10/30 14:27:43 - Table output 2.0 - Error inserting/updating row
2017/10/30 14:27:43 - Table output 2.0 - executeUpdate()
2017/10/30 14:27:43 - Table output 2.0 -
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.core.database.Database.insertRow(Database.java:1321)
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:262)
2017/10/30 14:27:43 - Table output 2.0 - ... 3 more
2017/10/30 14:27:43 - Table output 2.0 - Caused by: net.starschema.clouddb.jdbc.BQSQLFeatureNotSupportedException: executeUpdate()
2017/10/30 14:27:43 - Table output 2.0 - at net.starschema.clouddb.jdbc.BQPreparedStatement.executeUpdate(BQPreparedStatement.java:317)
2017/10/30 14:27:43 - Table output 2.0 - at org.pentaho.di.core.database.Database.insertRow(Database.java:1288)
2017/10/30 14:27:43 - Table output 2.0 - ... 4 more
2017/10/30 14:27:43 - BigQuery_rwa-tooling - Statement canceled!
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Something went wrong while trying to stop the transformation: org.pentaho.di.core.exception.KettleDatabaseException:
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Error cancelling statement
2017/10/30 14:27:43 - Simple Read Write from csv to txt - cancel()
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Error cancelling statement
2017/10/30 14:27:43 - Simple Read Write from csv to txt - cancel()
2017/10/30 14:27:43 - Simple Read Write from csv to txt -
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.core.database.Database.cancelStatement(Database.java:750)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.core.database.Database.cancelQuery(Database.java:732)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.steps.tableinput.TableInput.stopRunning(TableInput.java:299)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.Trans.stopAll(Trans.java:1889)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.step.BaseStep.stopAll(BaseStep.java:2915)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:139)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at java.lang.Thread.run(Unknown Source)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Caused by: net.starschema.clouddb.jdbc.BQSQLFeatureNotSupportedException: cancel()
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at net.starschema.clouddb.jdbc.BQStatementRoot.cancel(BQStatementRoot.java:113)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - at org.pentaho.di.core.database.Database.cancelStatement(Database.java:744)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ... 7 more
2017/10/30 14:27:43 - Table output 2.0 - Signaling 'output done' to 0 output rowsets.
2017/10/30 14:27:43 - BigQuery_prID - No commit possible on database connection [BigQuery_prID]
It looks like you may be trying to do this via legacy SQL, which has no support for DML statements (INSERT/UPDATE/DELETE).
Standard SQL does support DML, but it exists largely to support bulk table manipulations rather than row-oriented insertion; ingesting data via individual DML INSERTs is not recommended. See the quotas in the DML reference documentation for more details.
You're better off using either BigQuery streaming or bulk ingestion via a load job, but as these mechanisms sit outside the query language you may need to move beyond a JDBC driver.
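For the streaming route, here is a minimal sketch using the google-cloud-bigquery Java client rather than the JDBC driver; the dataset name and column names are hypothetical, and credentials are assumed to come from the default application environment:

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.InsertAllRequest;
import com.google.cloud.bigquery.InsertAllResponse;
import com.google.cloud.bigquery.TableId;

import java.util.Map;

public class StreamRows {
    public static void main(String[] args) {
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
        // Hypothetical dataset and column names; substitute your own.
        TableId table = TableId.of("my_dataset", "TableID");
        InsertAllRequest request = InsertAllRequest.newBuilder(table)
                .addRow(Map.of("status", "A", "qty", 1))
                .build();
        InsertAllResponse response = bigquery.insertAll(request);
        if (response.hasErrors()) {
            response.getInsertErrors().forEach((row, errors) ->
                    System.err.println("row " + row + ": " + errors));
        }
    }
}

For larger volumes, a load job from files in Cloud Storage is the other non-DML path; in Pentaho terms that usually means writing a file and triggering the load outside the Table output step.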

Spoon PDI Data Validator Error

I am trying to validate that an assignment is correct. I can't say much; however, we have internal and external users. I have a SQL script that looks for anything other than internal on an internal assignment (the result should be 0 rows), and I place the result in a SQL table. After that, I have a statement that calculates whether there is an assignment error, and I store that in a variable. Based on this, I try to validate the data with the 'Data Validator' step. Running the code manually, it should pass; however, Spoon PDI gives me the following error:
2015/05/04 13:03:19 - Data Validator.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Unexpected error
2015/05/04 13:03:19 - Data Validator.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : org.pentaho.di.core.exception.KettleException:
2015/05/04 13:03:19 - Data Validator.0 - Correct Group/Dashboard Assignment
2015/05/04 13:03:19 - Data Validator.0 - Correct Group/Dashboard Assignment
2015/05/04 13:03:19 - Data Validator.0 -
2015/05/04 13:03:19 - Data Validator.0 - at org.pentaho.di.trans.steps.validator.Validator.processRow(Validator.java:159)
2015/05/04 13:03:19 - Data Validator.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2015/05/04 13:03:19 - Data Validator.0 - at java.lang.Thread.run(Unknown Source)
2015/05/04 13:03:19 - Data Validator.0 - Caused by: org.pentaho.di.trans.steps.validator.KettleValidatorException: Correct Group/Dashboard Assignment
2015/05/04 13:03:19 - Data Validator.0 - at org.pentaho.di.trans.steps.validator.Validator.validateFields(Validator.java:258)
2015/05/04 13:03:19 - Data Validator.0 - at org.pentaho.di.trans.steps.validator.Validator.processRow(Validator.java:130)
2015/05/04 13:03:19 - Data Validator.0 - ... 2 more
2015/05/04 13:03:19 - Data Validator.0 - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)
2015/05/04 13:03:19 - transformation_group_dashboard_validator - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Errors detected!
2015/05/04 13:03:19 - Spoon - The transformation has finished!!
2015/05/04 13:03:19 - transformation_group_dashboard_validator - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Errors detected!
2015/05/04 13:03:19 - transformation_group_dashboard_validator - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Errors detected!
2015/05/04 13:03:19 - transformation_group_dashboard_validator - Transformation detected one or more steps with errors.
2015/05/04 13:03:19 - transformation_group_dashboard_validator - Transformation is killing the other steps!
Is there any way I can fix this?
It looks like the validator is rejecting your input(s), and according to that line in the source code, it isn't handling errors, so all you get is an exception. Try creating another step linked to the validator, then right-click the validator, choose "Define error handling...", and set up some error-related fields for the step to fill in. Also, double-click the Data Validator step and make sure the "Report all errors" and "...concatenate all errors" checkboxes are selected; that ensures each row gets a full list of any validation errors that occurred.
This often happens when the validation conditions are not set the way the user intended, so rows are rejected when they "should be" selected :)
I managed to fix my problem by deleting my Data Validator step and re-adding a fresh one. I've noticed this with Spoon PDI a lot: the end outcome can sometimes be unpredictable, and an occasional rebuild of a step fixes the issue.
