A non-read-only mapping must be defined for the sequence number field - toplink

I am getting the following error from TopLink when I start my application. I am trying to add two new tables to our application.
EXCEPTION [TOPLINK-41] (TopLink - 9.0.3.7 (Build 440)): oracle.toplink.exceptions.DescriptorException
EXCEPTION DESCRIPTION: A non-read-only mapping must be defined for the sequence number field.
DESCRIPTOR: Descriptor(icis.cr.common.db.entities.ClerkReviewTask --> [DatabaseTable(CREV_TASK)])
I have compared the mappings to a class that works and haven't noticed any differences. I also compared the new class in the TopLink Workbench and don't see any missing mappings. The sequence appears to be mapped correctly. Does anyone have any suggestions?
The descriptor has the following for the TASK_ID field:
<primaryKeyFieldHandles>
    <FieldHandle>
        <table>CREV_TASK</table>
        <fieldName>TASK_ID</fieldName>
    </FieldHandle>
</primaryKeyFieldHandles>
<sequenceNumberName>SEQ_CREV_TASK_ID</sequenceNumberName>
<sequenceNumberFieldHandle>
    <FieldHandle>
        <table>CREV_TASK</table>
        <fieldName>TASK_ID</fieldName>
    </FieldHandle>
</sequenceNumberFieldHandle>
<Mapping>
    <descriptor>icis.cr.common.db.entities.ClerkReviewTask.ClassDescriptor</descriptor>
    <usesMethodAccessing>false</usesMethodAccessing>
    <inherited>false</inherited>
    <readOnly>false</readOnly>
    <getMethodHandle>
        <MethodHandle emptyAggregate="true">
        </MethodHandle>
    </getMethodHandle>
    <setMethodHandle>
        <MethodHandle emptyAggregate="true">
        </MethodHandle>
    </setMethodHandle>
    <instanceVariableName>id</instanceVariableName>
    <defaultFieldNames>
        <defaultFieldName>direct field=</defaultFieldName>
    </defaultFieldNames>
    <fieldHandle>
        <FieldHandle>
            <table>CREV_TASK</table>
            <fieldName>TASK_ID</fieldName>
        </FieldHandle>
    </fieldHandle>
    <classIndicator>BldrDirectToFieldMapping</classIndicator>
</Mapping>

I was able to fix this by right-clicking my project in the TopLink Mapping Workbench and selecting Export Project to Java Source. My generated project source was out of date, which caused this error as well as the following:
EXCEPTION [TOPLINK-110] (TopLink - 9.0.3.7 (Build 440)): oracle.toplink.exceptions.DescriptorException
EXCEPTION DESCRIPTION: Descriptor is missing for class [icis.cr.common.db.entities.ClerkReviewCaseTask].
MAPPING: oracle.toplink.mappings.OneToManyMapping[caseTasks]
DESCRIPTOR: Descriptor(icis.cr.common.db.entities.ClerkReviewTask --> [DatabaseTable(CREV_TASK)])
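For anyone hitting this: TOPLINK-41 means that at login the descriptor's sequence number field (TASK_ID here) has no writable mapping registered. Below is a hedged sketch of what the relevant part of the regenerated project source should contain, assuming the TopLink 9.0.x project-class API; the class, table, and field names are taken from the descriptor above, everything else is illustrative, not the actual exported file.

import oracle.toplink.mappings.DirectToFieldMapping;
import oracle.toplink.publicinterface.Descriptor;

public class ClerkReviewProjectSketch {

    // Sketch of a generated descriptor method.
    public Descriptor buildClerkReviewTaskDescriptor() {
        Descriptor descriptor = new Descriptor();
        descriptor.setJavaClass(icis.cr.common.db.entities.ClerkReviewTask.class);
        descriptor.addTableName("CREV_TASK");
        descriptor.setPrimaryKeyFieldName("CREV_TASK.TASK_ID");

        // Sequencing: TOPLINK-41 is raised at login when this field
        // has no non-read-only mapping backing it.
        descriptor.setSequenceNumberFieldName("CREV_TASK.TASK_ID");
        descriptor.setSequenceNumberName("SEQ_CREV_TASK_ID");

        // The writable direct-to-field mapping that satisfies the check;
        // direct mappings are writable unless explicitly marked read-only.
        DirectToFieldMapping idMapping = new DirectToFieldMapping();
        idMapping.setAttributeName("id");
        idMapping.setFieldName("CREV_TASK.TASK_ID");
        descriptor.addMapping(idMapping);

        return descriptor;
    }
}

If the exported source is stale, as in this case, the mapping above can be missing even though the Workbench XML looks correct, which is why re-exporting fixed it.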

Feature type rename failed after 9

I'm trying to upload shapefiles to GeoServer, and I'm getting this error when I upload the same feature type ten times into different datastores.
First I get this warning:
WARN [rest.catalog] - Feature type surface_zone_line-line already exists in namespace MyWorkSpace, attempting to rename
and on the next line this error appears:
ERROR java.lang.RuntimeException: java.lang.IllegalArgumentException: Resource named 'surface_zone_line-line9' already exists in namespace: 'MyWorkSpace'
The renaming worked up to nine feature types, but it failed on the tenth.
Please help!
GeoServer Version 2.14.1
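Not a fix for the rename bug itself, but since the feature type name is derived from the shapefile's base name, giving each upload a uniquely named shapefile sidesteps the collision entirely. A minimal sketch of the REST file upload, assuming a local GeoServer with default credentials; the workspace, datastore, and zip names below are placeholders:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

public class ShapefileUpload {
    public static void main(String[] args) throws Exception {
        // Placeholders: adjust workspace, datastore, credentials, and zip path.
        String endpoint = "http://localhost:8080/geoserver/rest/workspaces/MyWorkSpace"
                + "/datastores/store_10/file.shp";
        byte[] zip = Files.readAllBytes(Paths.get("surface_zone_line-line_10.zip"));

        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("PUT");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/zip");
        String auth = Base64.getEncoder()
                .encodeToString("admin:geoserver".getBytes("UTF-8"));
        conn.setRequestProperty("Authorization", "Basic " + auth);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(zip);
        }
        System.out.println("GeoServer responded: " + conn.getResponseCode());
    }
}

Renaming the .shp inside each zip before uploading keeps GeoServer from ever reaching its "attempting to rename" path.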

How do I find out which schema script spring-boot is running?

I have an application based on spring-boot 1.4 with spring-jdbc.
I've added Flyway, which works in the application itself, but I get errors for my JdbcDAO test cases.
Question: I know Spring Boot overrides 'dataSource' with an embedded HSQLDB data source for tests, but I have no idea where it finds the SQL scripts it uses to populate the empty database when I run tests.
The documentation says it looks for 'schema.sql' or 'data.sql' on the classpath. I've renamed all of the script files in both the main and test resource paths, but I still get this error:
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'flywayInitializer' defined in class path resource [org/springframework/boot/autoconfigure/flyway/FlywayAutoConfiguration$FlywayConfiguration.class]: Invocation of init method failed;
nested exception is org.flywaydb.core.api.FlywayException: Found non-empty schema(s) "PUBLIC" without schema history table! Use baseline() or set baselineOnMigrate to true to initialize the schema history table.
This error leads me to believe that Spring Boot created a schema definition BEFORE the Flyway scripts were applied.
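If that is the diagnosis, one possible workaround is to run Flyway explicitly in the test context and let it baseline the already-populated schema. A sketch against the Flyway 4.x API that ships with Spring Boot 1.4; the config class and bean below are assumptions for illustration, not code from the original project:

import javax.sql.DataSource;
import org.flywaydb.core.Flyway;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FlywayTestConfig {

    // Runs Flyway against whatever DataSource the test context provides
    // (the embedded HSQLDB here) before the DAO tests run.
    @Bean(initMethod = "migrate")
    public Flyway flyway(DataSource dataSource) {
        Flyway flyway = new Flyway();
        flyway.setDataSource(dataSource);
        // Tolerate a schema that was populated before Flyway ran,
        // which is exactly what the FlywayException above complains about.
        flyway.setBaselineOnMigrate(true);
        return flyway;
    }
}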
UPDATE: I'm getting a different error now; this one is a bad SQL grammar error:
org.springframework.jdbc.BadSqlGrammarException: StatementCallback; bad SQL grammar [INSERT INTO BLUECOST_SSCDATA (SSCDATAID,PROCESSGROUPID,COSTINGAMOUNT,CHRG_TYP_CD,CONTROL_GROUP_CD,ACCOUNT_ID,CHRGHS_END_DT,ORIG_LOC_CD,SERVICE_TYP_CD,SERVICE_CD,SERVICE_GROUP_ID,SERVICE_ENV_CD,SERVICE_ADDER_CD,SERVICE_RESTYP_CD,RATECLAS_CD,PRICELST_UP_AMT,CHRGHS_USAGE_QTY,EMP_FA_CD,EMP_DIV_CD,EMP_DPT_CD,EMP_COUNTRY_CD,EMP_COMPANY_CD,EMP_NUM,EMP_INITS,EMP_LASTNAME,USER_ID,ADJUSTMENT_ID,CUST_REFERENCE_ID,ORIG_DIV_CD,ORIG_DPT_ID,ORIG_COUNTRY_CD,ORIG_COMPANY_CD,LOCAL_FIELD_1,LOCAL_FIELD_2,LOCAL_FIELD_3,LOCAL_FIELD_4,LOCAL_FIELD_5,LOCAL_FIELD_6,CREATETIME,PROCESSTIME,LAST_ALTER_TMS,TRX_TYP_CD,FILENAME) VALUES (100,null,368.60,'CTA','EMEA ','D286148 ','2018-03-19','SLR','SLR','SLIC','BASE',null,null,null,'OGS',null,null,null,null,null,null,null,null,null,'IBM SLIC BV',null,null,null,null,null,'653','SOFTLAYR','INVCE ID','X91927','ACCNT ID','FILENAME1.XLS',null,null,{ts '2018-04-22 01:30:21.437000'},null,{ts '2018-03-22 01:32:21.437000'},'I','FILENAME1.XLS')]; nested exception is java.sql.SQLSyntaxErrorException: requires either DEFAULT keyword or OVERRIDING clause
at com.ibm.cio.cloud.cost.spreadsheet.dao.UTJdbcBluecostSSCDataDAOTest.testDeleteBluecostSSCDataByLocalField2AndLocalField4(UTJdbcBluecostSSCDataDAOTest.java:265)
Caused by: java.sql.SQLSyntaxErrorException: requires either DEFAULT keyword or OVERRIDING clause
at com.ibm.cio.cloud.cost.spreadsheet.dao.UTJdbcBluecostSSCDataDAOTest.testDeleteBluecostSSCDataByLocalField2AndLocalField4(UTJdbcBluecostSSCDataDAOTest.java:265)
Caused by: org.hsqldb.HsqlException: requires either DEFAULT keyword or OVERRIDING clause
at com.ibm.cio.cloud.cost.spreadsheet.dao.UTJdbcBluecostSSCDataDAOTest.testDeleteBluecostSSCDataByLocalField2AndLocalField4(UTJdbcBluecostSSCDataDAOTest.java:265)
This gives me a clue: it looks like my latest Flyway script is not applied to the schema when running the 'mvn test' goal.
How do I turn on debug logging to see what Flyway is doing during the mvn test goal?
Fixed - retracting this question now. Adding a Flyway logger to logback.xml showed what was happening:
<logger name="org.flywaydb" level="DEBUG">
    <appender-ref ref="STDOUT"/>
</logger>

PXF JSON plugin error

Using HDP 2.4 and HAWQ 2.0.
I want to read JSON data kept in an HDFS path into a HAWQ external table.
I followed the steps below to add the new JSON plugin to PXF and read the data:
Download the plugin "json-pxf-ext-3.0.1.0-1.jar" from
https://bintray.com/big-data/maven/pxf-plugins/view#
Copy the plugin into /usr/lib/pxf.
Create the external table:
CREATE EXTERNAL TABLE ext_json_mytestfile ( created_at TEXT,
id_str TEXT, text TEXT, source TEXT, "user.id" INTEGER,
"user.location" TEXT,
"coordinates.type" TEXT,
"coordinates.coordinates[0]" DOUBLE PRECISION,
"coordinates.coordinates[1]" DOUBLE PRECISION)
LOCATION ('pxf://localhost:51200/tmp/hawq_test.json'
'?FRAGMENTER=org.apache.hawq.pxf.plugins.hdfs.HdfsDataFragmenter'
'&ACCESSOR=org.apache.hawq.pxf.plugins.json.JsonAccessor'
'&RESOLVER=org.apache.hawq.pxf.plugins.json.JsonResolver'
'&ANALYZER=org.apache.hawq.pxf.plugins.hdfs.HdfsAnalyzer')
FORMAT 'CUSTOM' (FORMATTER='pxfwritable_import')
LOG ERRORS INTO err_json_mytestfile SEGMENT REJECT LIMIT 10 ROWS;
When I execute the above DDL, the table is created successfully. After that I try to execute a select query:
select * from ext_json_mytestfile;
But I get this error:
ERROR: remote component error (500) from 'localhost:51200': type Exception report message java.lang.ClassNotFoundException: org.apache.hawq.pxf.plugins.json.JsonAccessor description The server encountered an internal error that prevented it from fulfilling this request. exception javax.servlet.ServletException: java.lang.ClassNotFoundException: org.apache.hawq.pxf.plugins.json.JsonAccessor (libchurl.c:878) (seg4 sandbox.hortonworks.com:40000 pid=117710) (dispatcher.c:1801)
DETAIL: External table ext_json_mytestfile
Any help would be much appreciated.
It seems the referenced jar file has the old com.pivotal.* package name. The JSON PXF extension is still incubating; the jar pxf-json-3.0.0.jar is built for JDK 1.7 (since the single-node HDB VM uses JDK 1.7) and has been uploaded to Dropbox:
https://www.dropbox.com/s/9ljnv7jiin866mp/pxf-json-3.0.0.jar?dl=0
Echoing the details of the comments above so that the steps are performed correctly and the PXF service recognizes the jar file. The steps below assume that HAWQ/HDB is managed by Ambari; if not, the manual steps mentioned in the previous updates should work.
Copy pxf-json-3.0.0.jar to /usr/lib/pxf/ on all of your HAWQ nodes (master and segments).
In Ambari-managed PXF, add the line below by going through Ambari Admin -> PXF -> Advanced pxf-public-classpath:
/usr/lib/pxf/pxf-json-3.0.0.jar
In Ambari-managed PXF, append this snippet to the end of your PXF profiles XML by going through Ambari Admin -> PXF -> Advanced pxf-profiles:
<profile>
    <name>Json</name>
    <description>JSON Accessor</description>
    <plugins>
        <fragmenter>org.apache.hawq.pxf.plugins.hdfs.HdfsDataFragmenter</fragmenter>
        <accessor>org.apache.hawq.pxf.plugins.json.JsonAccessor</accessor>
        <resolver>org.apache.hawq.pxf.plugins.json.JsonResolver</resolver>
    </plugins>
</profile>
Restart PXF service via Ambari
Did you add the jar file location to /etc/pxf/conf/pxf-public.classpath?
Did you try:
copying PXF JSON jar file to /usr/lib/pxf
updating /etc/pxf/conf/pxf-profiles.xml to include the Json plug-in profile if not already present
(per comment above) updating the /etc/pxf/conf/pxf-public.classpath
restarting the PXF service either via Ambari or command line (sudo service pxf-service restart)
You likely didn't add the JSON jar to the classpath.
The CREATE EXTERNAL TABLE DDL will always succeed, since it is just a definition;
only when you run queries does HAWQ check the runtime jar dependencies.
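One quick way to verify this on a PXF node is a throwaway classpath probe; the class name comes from the DDL above, and the -cp value shown is an assumption about where PXF looks:

public class PxfClassCheck {
    // Run on a PXF node with the service's classpath, e.g.:
    //   java -cp "/usr/lib/pxf/*" PxfClassCheck
    // A ClassNotFoundException here means the jar is not where PXF looks either.
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hawq.pxf.plugins.json.JsonAccessor");
        System.out.println("JsonAccessor is on the classpath");
    }
}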
Yes, the jar "json-pxf-ext-3.0.1.0-1.jar" from https://bintray.com/big-data/maven/pxf-plugins/view# has the old com.pivotal.* package name. The previous update has been edited with details on downloading the correct jar from Dropbox.

Getting an error trying to select a Hive table using HCatalog from HAWQ

I am using the Hortonworks (HDP) sandbox with HAWQ 2.0 installed on top.
I'm trying to select a Hive table using HCatalog but am not able to access Hive tables from HAWQ. I am executing the steps below, as mentioned in the Pivotal docs:
postgres=# SET pxf_service_address TO "localhost:51200";
SET
postgres=# select count(*) from hcatalog.default.sample_07;
ERROR: remote component error (500) from 'localhost:51200': type Exception report message Internal server error. Property "METADATA" has no value in current request description The server encountered an internal error that prevented it from fulfilling this request. exception java.lang.IllegalArgumentException: Internal server error. Property "METADATA" has no value in current request (libchurl.c:878)
LINE 1: select count(*) from hcatalog.default.sample_07;
I think there's a missing property in pxf-profiles.xml.
Check whether you have the <metadata> property under the Hive profile;
it is a newly added property, and a legacy build might not have it:
<profile>
    <name>Hive</name>
    <description>This profile is suitable for using when connecting to Hive</description>
    <plugins>
        <fragmenter>org.apache.hawq.pxf.plugins.hive.HiveDataFragmenter</fragmenter>
        <accessor>org.apache.hawq.pxf.plugins.hive.HiveAccessor</accessor>
        <resolver>org.apache.hawq.pxf.plugins.hive.HiveResolver</resolver>
        <metadata>org.apache.hawq.pxf.plugins.hive.HiveMetadataFetcher</metadata>
    </plugins>
</profile>
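Once the profile is in place and PXF has been restarted, the query path can be exercised from any PostgreSQL-compatible client, since HAWQ speaks the PostgreSQL wire protocol. A hedged sketch over JDBC, using the two statements from the question; host, port, database, and credentials are placeholders for the sandbox:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HcatalogQuery {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details for the sandbox setup.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/postgres", "gpadmin", "");
             Statement st = conn.createStatement()) {
            // Same session setting as in the question, single-quoted for SQL.
            st.execute("SET pxf_service_address TO 'localhost:51200'");
            try (ResultSet rs = st.executeQuery(
                    "select count(*) from hcatalog.default.sample_07")) {
                rs.next();
                System.out.println("rows: " + rs.getLong(1));
            }
        }
    }
}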

Subreport Data Source Expression for XML document

I'm trying to get a subreport working in a report that uses an XML document as its data source.
When I sort the main report data, the subreport can no longer requery the XML document: the expression receives a sorted data source (net.sf.jasperreports.engine.fill.SortedDataSource, per the stack trace below), not a JRXmlDataSource.
What am I doing wrong?
I used the following data source expression:
$P{REPORT_DATA_SOURCE}).subDataSource("/person/phone")
The stack trace:
Error filling print... Error evaluating expression : Source text : $P{REPORT_DATA_SOURCE}.subDataSource("/person/phone")
net.sf.jasperreports.engine.fill.JRExpressionEvalException: Error evaluating expression :
Source text : $P{REPORT_DATA_SOURCE}.subDataSource("/person/phone")
at net.sf.jasperreports.engine.fill.JREvaluator.evaluate(JREvaluator.java:203)
at net.sf.jasperreports.engine.fill.JRCalculator.evaluate(JRCalculator.java:591)
at net.sf.jasperreports.engine.fill.JRCalculator.evaluate(JRCalculator.java:559)
at net.sf.jasperreports.engine.fill.JRFillElement.evaluateExpression(JRFillElement.java:966)
at net.sf.jasperreports.engine.fill.JRFillSubreport.evaluateSubreport(JRFillSubreport.java:384)
at net.sf.jasperreports.engine.fill.JRFillSubreport.evaluate(JRFillSubreport.java:286)
at net.sf.jasperreports.engine.fill.JRFillElementContainer.evaluate(JRFillElementContainer.java:259)
at net.sf.jasperreports.engine.fill.JRFillBand.evaluate(JRFillBand.java:459)
at net.sf.jasperreports.engine.fill.JRVerticalFiller.fillColumnBand(JRVerticalFiller.java:2044)
at net.sf.jasperreports.engine.fill.JRVerticalFiller.fillDetail(JRVerticalFiller.java:778)
at net.sf.jasperreports.engine.fill.JRVerticalFiller.fillReportStart(JRVerticalFiller.java:288)
at net.sf.jasperreports.engine.fill.JRVerticalFiller.fillReport(JRVerticalFiller.java:151)
at net.sf.jasperreports.engine.fill.JRBaseFiller.fill(JRBaseFiller.java:909)
at net.sf.jasperreports.engine.fill.JRFiller.fill(JRFiller.java:126)
at net.sf.jasperreports.engine.JasperFillManager.fill(JasperFillManager.java:464)
at net.sf.jasperreports.engine.JasperFillManager.fill(JasperFillManager.java:300)
at net.sf.jasperreports.engine.JasperFillManager.fillReport(JasperFillManager.java:757)
at com.jaspersoft.ireport.designer.compiler.IReportCompiler.run(IReportCompiler.java:1003)
at org.openide.util.RequestProcessor$Task.run(RequestProcessor.java:572)
at org.openide.util.RequestProcessor$Processor.run(RequestProcessor.java:997) Caused by: groovy.lang.MissingMethodException:
No signature of method: net.sf.jasperreports.engine.fill.SortedDataSource.subDataSource()
is applicable for argument types: (java.lang.String) values: [/person/phone]
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:54)
at org.codehaus.groovy.runtime.callsite.PojoMetaClassSite.call(PojoMetaClassSite.java:46)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:40)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:124)
at TestParam_1362739351228_895383.evaluate(calculator_TestParam_1362739351228_895383:223)
at net.sf.jasperreports.engine.fill.JREvaluator.evaluate(JREvaluator.java:190)
Your data source expression, $P{REPORT_DATA_SOURCE}).subDataSource("/B/C"), has a syntax error: there is an unmatched ) right after the parameter.
Then:
If it's expecting a JRSortableDataSource, use this expression:
((net.sf.jasperreports.engine.data.JRSortableDataSource)$P{REPORT_DATA_SOURCE}).subDataSource("DefineThat")
If it's expecting a JRXmlDataSource, use this expression:
((net.sf.jasperreports.engine.data.JRXmlDataSource)$P{REPORT_DATA_SOURCE}).subDataSource("DefineThat")
$P{REPORT_DATA_SOURCE}).subDataSource("Xpath")
When somebody uses the above expression to build a subreport in Jasper from an XML data source, the report language needs to be set to Groovy instead of Java. subDataSource() is not declared on the JRDataSource interface but on JRXmlDataSource, so Groovy resolves the call dynamically at runtime; the Java equivalent is the explicitly cast JRXmlDataSource expression shown above.
I had the same issue and found this solution in the Jaspersoft community:
Capture the original data source in a parameter:
<parameter name="MyDataSource" class="net.sf.jasperreports.engine.JRDataSource" isForPrompting="false">
    <defaultValueExpression><![CDATA[$P{REPORT_DATA_SOURCE}]]></defaultValueExpression>
</parameter>
The same in the iReport GUI:
Create a new parameter in the main report:
Name: "MyDataSource"
Parameter Class: "net.sf.jasperreports.engine.JRDataSource"
Use as a prompt: No
Default value Expression: "$P{REPORT_DATA_SOURCE}"
Now use the MyDataSource parameter in the subreport's data source expression:
<dataSourceExpression><![CDATA[((net.sf.jasperreports.engine.data.JRXmlDataSource)$P{MyDataSource}).dataSource("/Subreport/path")]]></dataSourceExpression>
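For context, here is a minimal fill sketch showing the object that $P{REPORT_DATA_SOURCE} (and therefore MyDataSource) captures; the file names and XPath are placeholders, not from the original post:

import java.util.HashMap;
import java.util.Map;
import net.sf.jasperreports.engine.JasperFillManager;
import net.sf.jasperreports.engine.JasperPrint;
import net.sf.jasperreports.engine.data.JRXmlDataSource;

public class FillXmlReport {
    public static void main(String[] args) throws Exception {
        // This JRXmlDataSource is what $P{REPORT_DATA_SOURCE} starts out as;
        // the MyDataSource parameter captures it before the fill wraps it
        // in a SortedDataSource for the main report's sorting.
        JRXmlDataSource ds = new JRXmlDataSource("persons.xml", "/person");

        Map<String, Object> params = new HashMap<String, Object>();
        JasperPrint print = JasperFillManager.fillReport("main.jasper", params, ds);
        System.out.println("Filled " + print.getPages().size() + " page(s)");
    }
}

This is why the parameter trick works: the default value expression is evaluated against the original XML data source, so the subreport can still call dataSource()/subDataSource() on it even after the main report's records have been re-sorted.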
