I am porting my old Eclipse RESTful project to Spring Boot 2. So far I am happy that my project is ready to deploy to production, BUT...
When I run "mvn clean install" the process fails because it tries to validate a connection to an internal IP address for my production DB server.
Current condition: I work from home and I don't need to test the connectivity on my computer, since I have no access to the internal network; I have to RDP in to deploy the project.
Question: In Eclipse you can deploy any project without being forced to test the connection pool. Can I do the same with Spring Boot 2? Can I bypass this initialization from Hikari?
Thanks in advance for any info.
mvn install runs your test cases by default, and as part of that it brings up your Spring Boot app to run those tests. Even if you disable the Hikari connection checks, many other things will likely fail afterwards without a proper database connection.
Is there a dev DB server at work you can test against? (Or can you set up your tests to run against an in-memory DB like HSQLDB?)
If you're very confident that you don't need to re-run the tests, you can disable them during install with:
mvn install -DskipTests
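For the in-memory option mentioned above, here is a minimal sketch, assuming HSQLDB is added as a test-scoped dependency and that a test-only properties file at src/test/resources/application.properties fits your setup:
# src/test/resources/application.properties (test classpath only)
# Assumes the org.hsqldb:hsqldb dependency is declared with test scope.
spring.datasource.url=jdbc:hsqldb:mem:testdb
spring.datasource.driver-class-name=org.hsqldb.jdbc.JDBCDriver
spring.datasource.username=sa
spring.datasource.password=
That way the tests never touch the internal IP, and mvn clean install can run end to end from home.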
I'm facing a very weird issue while integrating Flyway DB migration with a Spring Boot application.
When I run the application from the executable WAR on the command line, it creates a new DB at application start-up.
Now, if I switch to running the application from the IDE (i.e. from STS), it fires all the scripts from my db/migration folder again. I can see the installed_on column timestamp change every time I switch between these two run modes. I have tried enabling the baselineOnMigrate property, but it had no effect.
Do you think it's something related to Spring Boot's embedded Tomcat? Both run modes create their own embedded Tomcat instance.
Please find my Spring Boot application.properties below:
mssql.dbname=issueDB
mssql.password=password
mssql.dbserver=localhost
mssql.port=1501
spring.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.datasource.url=jdbc:sqlserver://${mssql.dbserver}:${mssql.port};databaseName=${mssql.dbname}
spring.datasource.username=user
spring.datasource.password=${mssql.password}
spring.flyway.baselineOnMigrate=true
spring.flyway.locations=classpath:db/migration/testissue
spring.flyway.out-of-order=true
spring.flyway.baseline-version=1.3
spring.flyway.placeholder-prefix=$
spring.flyway.placeholder-suffix=$
spring.flyway.mixed=true
spring.flyway.cleanOnValidationError=true
I suppose it could be caused by this property: spring.flyway.cleanOnValidationError=true. According to the docs:
Whether to automatically call clean or not when a validation error occurs.
This is exclusively intended as a convenience for development. Even though we strongly recommend not to change migration scripts once they have been checked into SCM and run, this provides a way of dealing with this case in a smooth manner. The database will be wiped clean automatically, ensuring that the next migration will bring you back to the state checked into SCM.
It may be that you hit some validation errors because you are running your application in different ways against the same database, and Flyway simply cleaned your database and re-applied the current scripts.
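If that is the case, a safer setup for any database you don't want wiped would look something like this (property names per Spring Boot 2; values are just a suggestion):
# Don't clean automatically when validation fails
spring.flyway.clean-on-validation-error=false
# Baseline instead of re-running everything against an existing schema
spring.flyway.baseline-on-migrate=true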
I have a multi-module Vert.x application deployed on OpenShift. For integration-testing purposes, I would like to deploy a database container with pre-defined data and destroy it when the tests are finished.
How can I achieve this?
My application uses JUnit and the fabric8 Maven plugin to deploy containers on OpenShift.
This is something that could be done relatively easily using arquillian-cube, which does support Kubernetes and OpenShift.
What arquillian-cube can do for you is (optionally) create an ephemeral project, deploy everything you need for your test and, once everything is up and running, start your tests. At the end it can also do the cleanup for you.
It is quite flexible, so according to your needs and requirements it can work with either ephemeral or fixed projects. There are also plenty of configuration options when it comes to cleaning up.
Last but not least, it plays quite nicely with the fabric8 Maven plugin.
https://github.com/arquillian/arquillian-cube/blob/master/docs/kubernetes.adoc
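As a rough starting point, something along these lines in your test dependencies should pull in the OpenShift support; the coordinates and version vary between arquillian-cube releases, so treat these as placeholders and check the linked docs for your version:
<!-- Placeholder coordinates; verify against the arquillian-cube docs -->
<dependency>
  <groupId>org.arquillian.cube</groupId>
  <artifactId>arquillian-cube-openshift</artifactId>
  <version>${arquillian.cube.version}</version>
  <scope>test</scope>
</dependency>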
I'm writing a JUnit test with the @RunWith(Arquillian.class) annotation as described in https://docs.jboss.org/author/display/ARQ/Drone
While writing the test I would like to run it without having to package the WAR, start Tomcat, deploy the WAR and stop Tomcat each time.
I run the tests inside Eclipse, and I'd like to start Tomcat with my web application once and then run the tests multiple times from the IDE.
Is there any parameter to make Arquillian use an already deployed and running application, without changing the source of my test class?
Nope.
Arquillian is about creating a deployable archive and testing it inside a real server container.
By the way: if you are looking for ways to speed up your development, take a look at the Tomcat remote adapter. With a remote adapter there is no server startup for each test launch; you just start the server manually once.
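As a rough sketch (the property names below are from memory, so verify them against the Tomcat remote adapter docs), the arquillian.xml for a remote Tomcat 7 would look something like:
<arquillian xmlns="http://jboss.org/schema/arquillian">
  <container qualifier="tomcat-remote" default="true">
    <configuration>
      <!-- Points at a Tomcat you started manually, with the manager app enabled -->
      <property name="host">localhost</property>
      <property name="httpPort">8080</property>
      <property name="user">tomcat-manager-user</property>
      <property name="pass">tomcat-manager-pass</property>
    </configuration>
  </container>
</arquillian>
Combined with the remote adapter artifact on the test classpath, your tests then deploy to the already running server instead of starting a new one.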
I created a web application using Jersey with this Maven command:
mvn archetype:generate -DarchetypeArtifactId=jersey-quickstart-webapp \
-DarchetypeGroupId=org.glassfish.jersey.archetypes -DinteractiveMode=false \
-DgroupId=com.example -DartifactId=simple-service-webapp -Dpackage=com.example \
-DarchetypeVersion=2.4.1
I am using Tomcat v7 as my Java server. When I finish writing some code, I use Maven's package command to generate a .war file, copy this file to the /webapps folder and then start Tomcat to run my application and test it in a browser. But I think I waste lots of time doing these things, so I want to ask if there is an easier way to test my code in a browser. How do you guys run your web applications, especially Jersey apps, on your server?
I am also using IntelliJ IDEA; does it have features that help me build and run Jersey apps, or other J2EE apps? How do I use them?
In IntelliJ IDEA you can create a Tomcat Run/Debug configuration. In that configuration you can specify "Before Launch" tasks/options, including running a Maven goal. So by running the Tomcat configuration, IDEA will run the Maven goal, deploy your code to Tomcat, start the Tomcat server and, if desired, open your web browser to a specified page.
JetBrains has a Getting Started with Spring MVC, Hibernate and JSON tutorial. What you want to do is very similar. The main difference is that you will need to remove the default "Make" option in the "Before Launch" section at the bottom of the run/debug configuration and instead have it run your Maven goal.
There is also the Creating a simple Web application and deploying it to Tomcat tutorial. It's a little older and some of the options in the run/debug dialog have changed, but at its core it's still valid. Combined with the above, you should be in good shape.
Finally take a look at the Run/Debug Configuration: Tomcat page in the help guide (also available in the online webhelp).
How can Maven be configured to support this type of workflow?
One-time setup: invoke Maven to do one-time setup of a developer's machine, such as:
Create a custom version of Tomcat configured for this application
Create a local Postgres database on the developer's machine
Load sample data into the database
Run a JUnit test to configure other resources needed to run the application
Integration tests: invoke Maven to run integration tests, which should do the following:
Create an integration-test DB
Set up the DB
Run command-line integration tests against the DB
Run a test version of Tomcat with the application in it
Run command-line JUnit tests that exercise the RESTful services exposed by the application
Release build: invoke Maven to do a release build of the system:
Do all the steps for an integration test
Generate resources and configurations that are used on the server rather than production
Deposit the end result in a Git repo, commit, and push the changes to production
Test build: invoke Maven to do a test build of the system:
Do all the steps of a release build, but configure the test release package with the test server configuration
The main thing I am struggling with is that Maven has a single build lifecycle with a well-defined sequence of phases; I'm not sure whether the workflow I want to build is a good fit for Maven.
Can Maven be configured for this type of workflow? If so, what are the key features of Maven that allow for the different configurations of the four main ways I want to use it?
Update: what I mean by this workflow is that I want to be able to do something like
mvn setup
mvn integration
mvn prod-release
mvn test-release
I know the above example looks like Ant; I am a long-time Ant user and a total noob with Maven.
You could set up Maven to do all that...
You probably would use (shock horror) profiles to achieve some of this...
BUT you don't want to do that
You are following ANT-style thinking... if you like that style of thinking, then use ANT or Gradle and be happy.
If you want to follow the Maven way, then you will solve the problem differently.
Coming from the Maven way, here are my thoughts:
Why do you need one-time setup? I usually have a run profile that dynamically provisions the correct application server and starts it with the app deployed, tearing everything down afterwards when I hit ^C. Typically this involves starting up a database server or two... hence things I have developed like the cassandra-maven-plugin. That way, when I am working on a different project (which could be in 10 minutes' time), I don't have to worry about background database servers eating up all my laptop's RAM.
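To make that concrete, here is a stripped-down sketch of such a run profile; the plugin choice and goal names are illustrative, so substitute whatever provisions your own database and app server:
<profile>
  <id>run</id>
  <build>
    <plugins>
      <!-- Illustrative: start a throwaway database before the app comes up -->
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>cassandra-maven-plugin</artifactId>
        <executions>
          <execution>
            <id>start-db</id>
            <goals>
              <goal>start</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <!-- ...followed by an app-server plugin (tomcat7-maven-plugin, jetty, etc.)
           running the app in the foreground so ^C tears everything down -->
    </plugins>
  </build>
</profile>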
Integration tests are actually trivial when you have the above working... in fact I created the Maven Failsafe Plugin to make it easy to have plugin execution tied to the appropriate phases for integration testing. The Maven convention is to have a profile called run-its for running integration tests.
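A minimal sketch of such a run-its profile, wiring the Failsafe Plugin's integration-test and verify goals into the build:
<profile>
  <id>run-its</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-failsafe-plugin</artifactId>
        <executions>
          <execution>
            <goals>
              <!-- integration-test runs the ITs; verify fails the build afterwards -->
              <goal>integration-test</goal>
              <goal>verify</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>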
Release builds being different from test builds... ugh! You should be building environment-agnostic artifacts. Have them pick up their configuration from the environment they are deployed in. That removes the worry that something has changed between the "test" build and the "production" build. If you really need to bundle the config, then I would usually resort to a separate module that takes the agnostic artifact and rebundles it with the required configuration. That way it is easy to prove that you have a reproducible transformation and that nothing has changed in between what went to QA vs. what is going to Ops.
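As a tiny illustration of the environment-agnostic idea (the property and variable names here are made up, not a convention), the artifact can resolve its JDBC URL at runtime instead of baking it in:
// Hypothetical sketch: the same artifact reads its JDBC URL from a system
// property or environment variable, so no per-environment rebuild is needed.
public class DataSourceConfig {

    public static String jdbcUrl() {
        String fromProperty = System.getProperty("app.jdbc.url");
        if (fromProperty != null) {
            return fromProperty;
        }
        String fromEnv = System.getenv("APP_JDBC_URL");
        // Fall back to an in-memory database for local development and tests.
        return fromEnv != null ? fromEnv : "jdbc:hsqldb:mem:devdb";
    }
}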
I always make the release builds include the integration testing.
So typically I have my projects such that
$ mvn -Prun
will fire up the application starting from zero. Hitting ^C will tear everything back down again, and mvn clean (or, in extreme situations where I have a more complex setup process and need some caching, mvn post-clean, think "really clean") will remove anything that the run profile put into play.
To run the integration tests I typically do
$ mvn -Prun-its verify
To make a release I typically do
$ mvn release:prepare release:perform -B
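One prerequisite worth noting: release:prepare and release:perform need the project's SCM information in the pom, roughly like the following (the URLs are placeholders for your own repository):
<!-- Placeholder URLs; point these at your actual repo -->
<scm>
  <connection>scm:git:https://example.com/your/repo.git</connection>
  <developerConnection>scm:git:https://example.com/your/repo.git</developerConnection>
  <tag>HEAD</tag>
</scm>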
That is (in my view) the ideal way of handling the above steps you need.
HTH.
BTW, I have not had to use PostgreSQL specifically (typically my integration tests and run profile can get away with a pure-Java database such as Derby or HSQLDB, and because the artifacts are environment agnostic it is easy to have the integration-test/dev flyweight app server inject the correct JDBC URL), so you may hit some issues with regard to PostgreSQL.