IDEA: Invalid bound statement (not found) - Maven

I am using IDEA 2021 with a Spring Boot project that uses Maven and MyBatis.
I often run into this problem:
Whenever I modify a MyBatis SQL XML file (e.g. booking.xml), I redeploy the project (I have to redeploy so that changes under src/main/resources take effect).
After that, accessing any statement in the modified XML file throws Invalid bound statement (not found): xxx (e.g. if I modified the content of updateBooking, it complains invalid bound statement (not found): selectBooking). I am sure the bound statement exists in that XML file, and everything worked before the modification.
I checked the target directory of the IDEA project and found no booking.xml; it seems this SQL XML file is removed from target after I modify it and redeploy.
To work around it, I have to run mvn clean package for the project and then redeploy.
This only seems to happen in IDEA; I never ran into it with Eclipse.
How can I fix this problem permanently?

I use the following configuration every time I have this problem.
<resources>
  <resource>
    <directory>src/main/java</directory>
    <includes>
      <include>**/*.xml</include>
      <!-- ... -->
    </includes>
  </resource>
</resources>
Then run mvn clean package and restart IDEA.
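For context, a minimal sketch of how the full <build> section might look; note that declaring <resources> explicitly replaces Maven's defaults, so src/main/resources usually has to be re-declared next to the mapper XMLs kept under src/main/java (the exact layout here is an assumption based on a typical MyBatis setup):

<build>
  <resources>
    <resource>
      <!-- pick up mapper XMLs that live next to the Java mapper interfaces -->
      <directory>src/main/java</directory>
      <includes>
        <include>**/*.xml</include>
      </includes>
    </resource>
    <resource>
      <!-- re-declare the default resource directory so it is not dropped -->
      <directory>src/main/resources</directory>
    </resource>
  </resources>
</build>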

Related

How to override include pattern of resources from CLI

I am using the Maven resources plugin and want to override the include pattern from the CLI. Is there a way?
I tried this:
<configuration>
  <outputDirectory>${project.build.outputDirectory}</outputDirectory>
  <resources>
    <resource>
      <directory>${basedir}/src/main/content/jcr_root</directory>
      <filtering>false</filtering>
      <includes>
        <include>${include.val}</include>
      </includes>
    </resource>
  </resources>
</configuration>
The command from the CLI is: mvn clean install -Dinclude.val=/components/content/anchor/
But my use case is that I may have multiple values to include, and the set of values is dynamic, so I cannot declare an XML include element for every possible include.
As far as I have read, the include pattern does not take comma-separated values. Is there any other way, or another plugin?
Not accepting comma-separated values feels like a shortcoming of this plugin. And although it accepts patterns, they are Ant-style patterns rather than regular expressions, so I cannot express this as a single pattern either.
Any pointers would help.
Use case: atomic packaging.
We have multiple files in the source directory.
Suppose only 2 files were modified by the developer, and suppose we know exactly which 2 files those are.
I want only those 2 modified files to be included in the final package after the Maven build.
If I had a command that could take comma-separated include patterns, this would be easy to achieve.
I read up on this and found some useful resources, but could not get it working:
https://blog.sonatype.com/2011/03/configuring-plugin-goals-in-maven-3/
I wanted to know if anyone has gotten this working or found an alternative.
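One possible workaround, purely a sketch and not something from this thread: since -D system properties take precedence over POM properties, you can declare a fixed number of include "slots" as properties, each defaulting to a pattern that matches nothing, and fill in as many slots as needed from the CLI. The property names (include.val1, include.val2) and the no-match default are assumptions for illustration:

<properties>
  <!-- defaults point at a directory that does not exist, so unused slots match nothing -->
  <include.val1>__none__/**</include.val1>
  <include.val2>__none__/**</include.val2>
</properties>

and in the plugin configuration:

<includes>
  <include>${include.val1}</include>
  <include>${include.val2}</include>
</includes>

The invocation would then look like mvn clean install -Dinclude.val1=components/content/anchor/** -Dinclude.val2=components/content/button/** (the second path is invented for the example). This caps the number of includes at however many slots you declare, but avoids needing a comma-separated value.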

PigUnit--how to get access to pig scripts under Maven project structure?

I'm trying to use PigUnit with Maven. My Pig scripts live under the module's root directory at <module>/src/main/pig/<scriptName>.pig.
All the PigUnit tutorials either hard-code the absolute path of the Pig script (which obviously won't work when the build runs anywhere but my machine) or use a relative path that just "magically" works. But when I pass either the script name directly or src/main/pig/<scriptName>.pig to PigTest, the script can't be found when running mvn test.
The test line (using Scala + ScalaTest):
val pigTest = new PigTest("src/main/pig/calcProductVectors.pig", args)
Results in:
- Script does something *** FAILED ***
java.io.FileNotFoundException: src/main/pig/calcProductVectors.pig (No such file or directory)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at org.apache.pig.pigunit.PigTest.readFile(PigTest.java:296)
at org.apache.pig.pigunit.PigTest.readFile(PigTest.java:292)
How do I get src/main/pig to be on the path when mvn test runs?
Okay, so I found some tests in our Java projects that just add the Pig directory to project.build.testResources in Maven:
<testResources>
  <testResource>
    <!-- Pig is included in the test resources so that we can access it with PigUnit -->
    <directory>src/main/pig</directory>
    <filtering>true</filtering>
  </testResource>
  <testResource>
    <directory>src/test/resources</directory>
    <filtering>true</filtering>
  </testResource>
</testResources>
Then the JUnit tests would pass src/main/pig/<scriptName>.pig to PigTest.
This didn't work in my project; I'm not sure whether that is because of ScalaTest or something else. I could see the contents of the directory being copied to <module>/target/test-classes, but the scripts still weren't found, either by their bare name (which would seem to make more sense, since they are copied without the src/main/pig prefix) or by the full path that works for the Java projects.
Eventually, from this answer, I found out where the tests were actually running from on the file system; the path just needed the module name prefixed, so I adjusted it like so and it worked:
val pigTest = new PigTest("<module>/src/main/pig/calcProductVectors.pig", args)
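As an alternative to baking the module name into the path, here is a sketch (an assumption, not from the original answer) for the case where the tests are executed through maven-surefire-plugin; a ScalaTest-specific runner would need its own equivalent setting. It pins the working directory of the forked tests to the module root so that src/main/pig/... resolves no matter where the reactor build is started from:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- run forked tests from the module directory so relative script paths resolve -->
    <workingDirectory>${basedir}</workingDirectory>
  </configuration>
</plugin>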

cxf-codegen-plugin to exclude XSD files

I use cxf-codegen-plugin to generate a series of WS clients with Maven at build time. The WSDLs reference some XSD schema definitions with a relative path like this: ../someService/schema.xsd
When I trigger a build from Eclipse this works fine, since my XSD files sit at the right path.
But when I launch a build job from Jenkins, it fails because it apparently uses the Jenkins workspace as the root of the build.
I don't even know whether that Jenkins behavior can be changed, and since I have no control over the Jenkins instance, what I would like is for cxf-codegen-plugin to skip XSD processing altogether and then generate those classes explicitly in a different execution phase with a different plugin.
I've read you can do it like this:
<defaultOptions>
  <extraargs>
    <extraarg>-nexclude</extraarg>
    <extraarg>http://*.ws.cntxes.emprego.xunta.es</extraarg>
  </extraargs>
</defaultOptions>
But this assumes I know those namespaces before the build, which I don't (the WSDL files are pulled from external dependencies with the Maven dependency plugin).
I also tried:
<wsdlRoot>${basedir}/src/main/resources/wsdl</wsdlRoot>
<includes>
  <include>**/*.wsdl</include>
</includes>
<excludes>
  <exclude>*.xsd</exclude>
</excludes>
But this does not work; the plugin keeps parsing the XSD files and generating the related classes.
Is there another way I'm missing to prevent the XSD files from being parsed and process only the WSDL definitions?
EDIT: this is the error Jenkins is giving me:
[ERROR] Failed to execute goal org.apache.cxf:cxf-codegen-plugin:2.7.3:wsdl2java (generate-sources-wsclient-cxf) on project my-project: Execution generate-sources-wsclient-cxf of goal org.apache.cxf:cxf-codegen-plugin:2.7.3:wsdl2java failed: org.apache.cxf.wsdl11.WSDLRuntimeException: Fail to create wsdl definition from : file:/var/lib/jenkins/workspace/MYPROJECT/myproject-webservice/src/main/resources/wsdl/Descriptor/serviceDescriptor.wsdl
[ERROR] Caused by : WSDLException (at /definitions/types/xsd:schema): faultCode=PARSER_ERROR: Problem parsing '../xsd/schema.xsd'.: java.io.FileNotFoundException: /var/lib/jenkins/workspace/xsd/actividadFormativa.xsd (No such file or directory)
It is looking in the root of the Jenkins workspace instead of /var/lib/jenkins/workspace/MYPROJECT/myproject-webservice/src/main/resources/wsdl/xsd/schema.xsd.
I had the same problem (only with the WSDL files). After a lot of digging I figured out it was a case-sensitivity issue between Windows (local CLI and Eclipse builds) and the Linux/Unix Hudson/Jenkins build environment.
The problematic WSDL was referenced with an upper-case S:
<wsdlOption>
  <wsdl>${basedir}/src/main/resources/Some.wsdl</wsdl>
</wsdlOption>
But on the file system the file was named some.wsdl.
So it was not a path issue (.../workspace/...) as I had also initially assumed.
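In other words, the fix was simply to reference the file with its exact on-disk casing, roughly:

<wsdlOption>
  <!-- lower-case name matches the file as it actually exists on the Linux build agent -->
  <wsdl>${basedir}/src/main/resources/some.wsdl</wsdl>
</wsdlOption>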

Maven jetty plugin using wrong resource file URL

I am running a project inside a Vagrant box set up over my Eclipse workspace. I am setting up a new Maven project, but I am having problems with the plugin using Windows paths instead of the Vagrant path. My Windows workspace lives in C:\Dev, so when I 'vagrant up', my entire workspace is available inside the VM; in other words, /vagrant in the VM contains the contents of C:\Dev.
When I execute mvn jetty:run, everything starts up fine and all paths use the Vagrant versions (/vagrant/mvn_project/target...). However, once the plugin starts scanning the project for changes, it throws the following error:
2014-02-26 01:18:53.756:WARN:oejw.WebAppContext:Scanner-0: Failed startup of context o.e.j.m.p.JettyWebAppContext#591f46d8{/,[file:/vagrant/mvn_project/web, file:/vagrant/mvn_project/target/webapps/ROOT/],STARTING}{/ROOT/]}
javax.servlet.UnavailableException: Malformed URL 'file://C:\\Dev\\mvn_project/target/webapps/ROOT/WEB-INF/dtd/web-app_2_3.dtd' : For input string: "\\Dev\\mvn_project"
at org.apache.struts.action.ActionServlet.init(ActionServlet.java:402)
at javax.servlet.GenericServlet.init(GenericServlet.java:244)
at org.eclipse.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:561)
at org.eclipse.jetty.servlet.ServletHolder.initialize(ServletHolder.java:351)
at org.eclipse.jetty.servlet.ServletHandler.initialize(ServletHandler.java:840)
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:300)
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1347)
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:745)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:492)
at org.eclipse.jetty.maven.plugin.JettyWebAppContext.doStart(JettyWebAppContext.java:282)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:69)
at org.eclipse.jetty.maven.plugin.JettyRunMojo.restartWebApp(JettyRunMojo.java:532)
at org.eclipse.jetty.maven.plugin.JettyRunMojo$1.filesChanged(JettyRunMojo.java:485)
at org.eclipse.jetty.util.Scanner.reportBulkChanges(Scanner.java:681)
at org.eclipse.jetty.util.Scanner.reportDifferences(Scanner.java:539)
at org.eclipse.jetty.util.Scanner.scan(Scanner.java:391)
at org.eclipse.jetty.util.Scanner$1.run(Scanner.java:329)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
Is this a bug in the plugin, or is there a configuration setting I can use to set this value?
Edit: a little more context. The problem seems to have something to do with filtered resources:
<resources>
  <resource>
    <directory>${project.basedir}/src/main/filtered-resources</directory>
    <filtering>true</filtering>
    <targetPath>${project.basedir}/target/webapps/ROOT</targetPath>
  </resource>
</resources>
Changing ${project.basedir} to /vagrant/mvn_project seems to fix the problem, but this is clearly a workaround rather than a solution (it won't work in CI, for example).
UPDATE: It turns out Eclipse is to blame. Eclipse occasionally builds the project, and when it does, ${project.basedir} refers to C:\Dev\mvn_project instead of /vagrant/mvn_project. Is there a way to override ${project.basedir} without hard-coding?
Simple answer: disable builds in Eclipse. Uncheck Project -> Build Automatically and always run builds from inside Vagrant (mvn compile). JSP hot changes still work automatically.
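If you really want to avoid the literal ${project.basedir}, one hedged alternative (not part of the answer above, and it only helps for builds you launch yourself, since Eclipse auto-builds would still use the default) is to route the path through a POM property, because -D values override POM properties. The property name webapp.output.dir is invented here:

<properties>
  <!-- default for normal builds; override with -Dwebapp.output.dir=... inside the VM -->
  <webapp.output.dir>${project.basedir}/target/webapps/ROOT</webapp.output.dir>
</properties>

and in the resource declaration:

<targetPath>${webapp.output.dir}</targetPath>

Inside the VM you would then run something like mvn jetty:run -Dwebapp.output.dir=/vagrant/mvn_project/target/webapps/ROOT.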

How to update folders automatically with IntelliJ

I have a Maven configuration that copies my web resources to a directory under target, from where Jetty reads them. What I want (and what Eclipse always did for me) is for the target/web directory to be updated whenever something in src/main/webapp changes. I can't get IntelliJ to do the same.
The resource configuration looks like this:
<resource>
  <directory>src/main/webapp</directory>
  <excludes>
    <exclude>less/</exclude>
  </excludes>
  <filtering>false</filtering>
  <targetPath>${project.build.directory}/web</targetPath>
</resource>
Right now I have to run Generate Sources and Update Folders every time I make a change. Can't IntelliJ detect this automatically?
Notes:
I do not build a war but a folder distribution.
I already tried moving it to target/generated-sources/web but that makes no difference.
The target/web is not marked as excluded in the module configuration.
The folder is marked as a resource folder. I tried marking it as a source folder but that made no difference.
The problem can be solved with the File Watchers plug-in. It doesn't ship with IntelliJ by default, but it is very useful: you can watch your *.less/html/js files and regenerate the output when they change. In my case I run the appropriate Maven goals, but you could also call the LESS compiler directly if you wanted to.
In the watcher configuration, set "Output paths to refresh" to the custom directory you are using (in my case $OutputPath$/web). After that, any change should be picked up automatically.
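For example (an assumption about the wiring, not something stated in the answer), the watcher's program could simply be Maven re-running the resources goal, which copies src/main/webapp into target/web according to the resource configuration above:

mvn resources:resources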
I think so, yes: press Ctrl+Shift+A, type "Import Maven", and tick the "Import Maven project automatically" checkbox. This enables auto-import, which copies resources as well.
