Converting JSON to POJO using JSON string as report parameter - BIRT

We are trying to convert JSON to a POJO and then use this POJO data source in BIRT. We are passing the JSON string as a report parameter and have linked it to the corresponding dataset parameter.
Conversion of the JSON string to Java objects works when run individually in Eclipse. However, when we run the report, we get the exception below:
org.eclipse.birt.report.engine.api.EngineException: Cannot execute the
statement. org.eclipse.datatools.connectivity.oda.OdaException ;
java.lang.reflect.InvocationTargetException
Due to this exception we are unable to view the report in BIRT. Has anyone faced such an issue before? If yes, please let us know the resolution. Any pointers to solving this exception would be really helpful.
Thanks in advance.

I was facing the same issue too.
Then I realized I had not added the supporting jars to the BIRT POJO Data Source.
Along with the main POJO jar file, add all the supporting libraries or jar files that the classes in the POJO jar refer to into the POJO Data Source.
For example, if a class needs the GSON jar, add that jar file to the POJO Data Source.
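For reference, this is the kind of conversion code that typically runs inside the POJO data set and therefore needs GSON on the data source's classpath. A minimal sketch only; the Customer class, its fields, and the data set class are hypothetical:

    import com.google.gson.Gson;
    import com.google.gson.reflect.TypeToken;
    import java.util.List;

    // Hypothetical POJO matching the JSON passed as the report parameter.
    public class Customer {
        private String name;
        private int age;
        public String getName() { return name; }
        public int getAge() { return age; }
    }

    // Somewhere in the POJO data set class, the report parameter is parsed:
    public class CustomerDataSet {
        public List<Customer> parse(String jsonParam) {
            // This line compiles and runs fine in Eclipse, but inside BIRT it
            // surfaces as an InvocationTargetException if gson.jar is missing
            // from the POJO Data Source.
            return new Gson().fromJson(jsonParam,
                    new TypeToken<List<Customer>>() {}.getType());
        }
    }

If code like this works standalone but fails inside BIRT, a missing supporting jar in the POJO Data Source is the usual cause.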

Related

How to split GraphQL schema on the server side (spring boot application)

I want to logically separate all my GraphQL schemas into different entities. I am doing this in a Spring Boot project.
I do not want to use Apollo or any other 3rd-party libraries.
E.g.
book.graphql
author.graphql
I tried the solution provided in Spring GraphQL multiple schemas with Query per file, and I got no compile error, but I am also not getting results from the query anymore.
Can someone please point out if I am doing something incorrectly?
The solution provided in the question linked above works as is.
The reason I was getting errors was that I had not implemented the endpoint.
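For anyone hitting the same symptom: with spring-boot-starter-graphql, all schema files under classpath:graphql/ are merged into one schema by default (note the default file extensions are .graphqls and .gqls, not .graphql), so the split itself needs no extra configuration; what it does need is a handler for each query. A minimal sketch, with hypothetical Book/Author types and file names:

    // src/main/resources/graphql/book.graphqls
    //   type Query { bookById(id: ID!): Book }
    //   type Book { id: ID! title: String }
    //
    // src/main/resources/graphql/author.graphqls
    //   extend type Query { authorById(id: ID!): Author }
    //   type Author { id: ID! name: String }

    import org.springframework.graphql.data.method.annotation.Argument;
    import org.springframework.graphql.data.method.annotation.QueryMapping;
    import org.springframework.stereotype.Controller;

    @Controller
    public class AuthorController {
        // Without a @QueryMapping handler the schema still loads,
        // but the query silently returns null - the symptom above.
        @QueryMapping
        public Author authorById(@Argument String id) {
            return new Author(id, "Jane Doe"); // stub; replace with a real lookup
        }

        public record Author(String id, String name) {}
    }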

Mule - DataWeave throwing error during MUnit test: There are two transformers that are an exact match for input

The transformation (XML to JSON) works completely fine during a normal run, but when the same flow is tested using MUnit it fails at DataWeave with the error message:
There are two transformers that are an exact match for input: interface org.mule.api.transport.OutputHandler, output: class java.lang.String.
Transformers are: ObjectToAtomString(class org.mule.transformer.simple.ObjectToString) and ObjectToString(class org.mule.transformer.simple.ObjectToString$$EnhancerByMUNIT$$99111c4f)
I have added the input MIME type in the data transformation as well.
I tried the transformation in another demo project to test it with MUnit, and it passed DataWeave successfully without any error.
I have the DataMapper 3.7.3 jar included in the project and in the pom file.
We faced a similar issue with DataWeave in Mule 3.8.2 under MUnit, though the normal run was working fine. We resolved it with the following workaround:
We had a Byte Array to String transformer that was throwing this exception. We changed it to Byte Array to Object and set the object's return class to java.lang.String.
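In Mule 3 XML config the swap looks roughly like this (both elements are from the Mule 3 core schema; the doc:name values are illustrative):

    <!-- before: ambiguous once MUnit enhances the transformers -->
    <byte-array-to-string-transformer doc:name="Byte Array to String"/>

    <!-- after: an explicit target class removes the ambiguity -->
    <byte-array-to-object-transformer returnClass="java.lang.String"
                                      doc:name="Byte Array to Object"/>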

Spring Batch and ElasticSearch

Here is my scenario:
I need to read a CSV file and store the output in Elasticsearch. I am using Spring Batch to read the CSV file. Can anyone give me an example of how to save to Elasticsearch using Spring Batch or Spring Batch Extensions?
It's an old question and you have probably found an answer by now, but here it goes...
To work with Elasticsearch you need Spring Data, and you simply write items from your writer as you normally would, but through a repository instance - e.g. repository.save(list), where list is the List of items passed from the Spring Batch processor to the writer.
Here repository is an ElasticsearchRepository from Spring Data; you need to define a repository for each of your item types.
You also need to enable repository scanning with @EnableElasticsearchRepositories, pointing it at the actual repository package of your project, and define the persistence layer as done here.
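A minimal sketch of such a writer, assuming a hypothetical Person document and PersonRepository (the write signature below is the pre-5.0 Spring Batch one that takes a List; on older Spring Data versions the bulk method is save rather than saveAll):

    import org.springframework.batch.item.ItemWriter;
    import org.springframework.data.annotation.Id;
    import org.springframework.data.elasticsearch.annotations.Document;
    import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
    import java.util.List;

    @Document(indexName = "people")
    class Person {
        @Id
        private String id;
        private String name;
        // getters and setters omitted for brevity
    }

    interface PersonRepository extends ElasticsearchRepository<Person, String> {}

    class PersonItemWriter implements ItemWriter<Person> {
        private final PersonRepository repository;

        PersonItemWriter(PersonRepository repository) {
            this.repository = repository;
        }

        @Override
        public void write(List<? extends Person> items) {
            repository.saveAll(items); // bulk-indexes the whole chunk
        }
    }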
Hope it helps !!
Actually, I worked on a similar project, but instead of importing data from a CSV file I imported it from a MySQL relational database, reading and filtering the data with Spring Batch and writing it into Elasticsearch. This is the link to the project on GitHub; read the readme.md file carefully and you will find all the required configuration:
the github project link

Spring boot createJarFileFromFileEntry

I have a Spring Boot app that I start with -Dloader.path=. One of the jar files is hive-exec.jar, which has a jar file bundled inside it called minlog-1.2.jar. If I specify this file in -Dloader.path, I get an error:
java.lang.IllegalStateException: Unable to open nested entry 'minlog-1.2.jar'. It has been compressed and nested jar files must be stored without compression. Please check the mechanism used to create your executable jar file
at org.springframework.boot.loader.jar.JarFile.createJarFileFromFileEntry(JarFile.java:378)
at org.springframework.boot.loader.jar.JarFile.createJarFileFromEntry(JarFile.java:355)
at org.springframework.boot.loader.jar.JarFile.getNestedJarFile(JarFile.java:341)
at org.springframework.boot.loader.archive.JarFileArchive.getNestedArchive(JarFileArchive.java:108)
at org.springframework.boot.loader.archive.JarFileArchive.getNestedArchives(JarFileArchive.java:92)
at org.springframework.boot.loader.PropertiesLauncher.getClassPathArchives(PropertiesLauncher.java:445)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:60)
at org.springframework.boot.loader.PropertiesLauncher.main(PropertiesLauncher.java:564)
However, if I copy this file into a folder and add that folder in -Dloader.path, I don't get any error.
What am I missing?
Thank You,
I am trying to manage the database driver as an external jar instead of as one of the project's Maven dependencies. The application uses JPA, and we wanted to be able to switch SQL database implementations from one environment to another (for example, H2 in DEV, Oracle in production), so I had to manage the H2 database driver jar as an external dependency. While loading it with the -Dloader.path command line option, I came across the same problem you describe.
Looking at the org.springframework.boot.loader.jar.JarFile source code, the class manages a folder entry and a jar entry differently.
The getNestedArchives method ends up invoking the createJarFileFromFileEntry method, which throws this exception.
There must be a good reason for it; if someone knows what it is, any comment is welcome!
When jar dependencies are loaded from a directory, the java.util.jar.JarFile constructor is used instead and does not seem to throw any exception.
In the createJarFileFromFileEntry method, instead of throwing an exception, wouldn't it be possible to only log a warning?
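As the exception message says, Spring Boot's loader can only open nested jars that are STORED (uncompressed) inside the outer jar. Besides extracting the nested jar to a folder as the question notes, one workaround is to repack the outer jar with its nested jars stored. A rough sketch in plain Java (the file names are taken from the question; this is not production-hardened):

    import java.io.FileOutputStream;
    import java.util.Enumeration;
    import java.util.zip.CRC32;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipFile;
    import java.util.zip.ZipOutputStream;

    public class JarRepacker {
        public static void main(String[] args) throws Exception {
            try (ZipFile in = new ZipFile("hive-exec.jar");
                 ZipOutputStream out = new ZipOutputStream(
                         new FileOutputStream("hive-exec-repacked.jar"))) {
                Enumeration<? extends ZipEntry> entries = in.entries();
                while (entries.hasMoreElements()) {
                    ZipEntry e = entries.nextElement();
                    byte[] data = in.getInputStream(e).readAllBytes();
                    ZipEntry copy = new ZipEntry(e.getName());
                    if (e.getName().endsWith(".jar")) {
                        // STORED entries must declare size and CRC up front
                        copy.setMethod(ZipEntry.STORED);
                        copy.setSize(data.length);
                        CRC32 crc = new CRC32();
                        crc.update(data);
                        copy.setCrc(crc.getValue());
                    }
                    out.putNextEntry(copy);
                    out.write(data);
                    out.closeEntry();
                }
            }
        }
    }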

SOAP UI: Can I add assertion to validate with local XML Schema file?

I have created a Web Service (with Java, Axis). Depending on the content of the request, the response may have a different XSD. So the WSDL only specifies that the response is of a generic XSD, and the responses comply with XSDs that import and extend the generic XSD.
Unfortunately, the Schema assertion fails because the XSD specified in the WSDL can only be the generic one. Is there a way to manually specify which XSD I want the assertion to use? For instance, depending on the request I prepare, I know the specific XSD of the response. So it would be perfect if I could tell SoapUI to assert the response against that XSD, which I can store either locally or at a URL.
So, is there a way to make a schema assertion using a locally (or remotely) stored XML schema?
Thanks,
Markos
What I did in the end is that I created a simple class to do this in Java. I exported the project as a jar, imported it in Groovy, and just called it.
This works because Java and Groovy both run on the JVM, so it is perfectly normal to call classes and methods from one another.
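A minimal sketch of such a class, using the standard javax.xml.validation API (the class and method names here are illustrative, not the original poster's):

    import javax.xml.XMLConstants;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.validation.Schema;
    import javax.xml.validation.SchemaFactory;
    import javax.xml.validation.Validator;
    import java.io.File;
    import java.io.StringReader;

    public class XsdValidator {
        // Validates the given XML against a local XSD file;
        // throws SAXException if the document does not conform.
        public static void validate(String xml, String xsdPath) throws Exception {
            SchemaFactory factory =
                    SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Schema schema = factory.newSchema(new File(xsdPath));
            Validator validator = schema.newValidator();
            validator.validate(new StreamSource(new StringReader(xml)));
        }
    }

From a SoapUI Script assertion you can then call something like XsdValidator.validate(messageExchange.responseContent, "/path/to/specific.xsd") once the jar is on SoapUI's classpath (e.g. in its bin/ext folder).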
