I really did not understand the concept here, so I am trying to clarify what happens at build time.
My application is built with clean package, and I have different profiles based on the environment. My default profile does not contain DB connection details. When the Jenkins build runs, I do not expect it to connect to the database or to any AWS services, as most of those settings are defined in the respective profile files. Yet the build keeps failing because it requires all of those settings to be defined.
As for the JUnit tests, all dependencies are mocked and they work as expected.
Another question related to profiles: if I set a profile while building, do I need to change it for every environment?
The error I get:
[ERROR] [error]: Build step io.quarkus.hibernate.orm.deployment.HibernateOrmProcessor#configurationDescriptorBuilding threw an exception: io.quarkus.runtime.configuration.ConfigurationException: Model classes are defined for the default persistence unit, but no default datasource was found. The default EntityManagerFactory will not be created. To solve this, configure the default datasource. Refer to https://quarkus.io/guides/datasource for guidance.
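One pattern that addresses both points (a sketch only; the db-kind, hosts and credentials below are placeholder assumptions, not values from this project): keep the build-time part of the datasource configuration, i.e. the database kind, in the default profile so that the Hibernate ORM build step can create the persistence unit, and move the actual connection details into profile-prefixed runtime properties. The build then never needs a live database, and the same artifact can be pointed at a different database per environment by activating a profile or overriding the runtime properties at start-up, so the profile chosen at build time does not have to be rebuilt for every environment.

# build time: tells the Hibernate ORM build step which database to target, no connection needed
quarkus.datasource.db-kind=postgresql

# runtime, only used when the prod profile is active
%prod.quarkus.datasource.jdbc.url=jdbc:postgresql://prod-db:5432/appdb
%prod.quarkus.datasource.username=appuser
%prod.quarkus.datasource.password=changeme

# runtime, only used in dev mode
%dev.quarkus.datasource.jdbc.url=jdbc:postgresql://localhost:5432/appdb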
Related
I'm facing a very weird issue while integrating Flyway DB migration with a Spring Boot application.
When I run the application from the executable WAR on the command line, it creates a new DB at application start-up.
Now, if I switch to running the application from the IDE (i.e. from STS), it fires all the scripts from my db/migration folder again. I can see the timestamp in the installed_on column change every time I switch between these two run modes. I have tried enabling the baselineOnMigrate property, but it had no effect.
Do you think it is something related to the Spring Boot embedded Tomcat? Each run creates its own embedded Tomcat instance.
Please find my Spring Boot application.properties below:
mssql.dbname=issueDB
mssql.password=password
mssql.dbserver=localhost
mssql.port=1501
spring.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.datasource.url=jdbc:sqlserver://${mssql.dbserver}:${mssql.port};databaseName=${mssql.dbname}
spring.datasource.username=user
spring.datasource.password=${mssql.password}
spring.flyway.baselineOnMigrate=true
spring.flyway.locations=classpath:db/migration/testissue
spring.flyway.out-of-order=true
spring.flyway.baseline-version=1.3
spring.flyway.placeholder-prefix=$
spring.flyway.placeholder-suffix=$
spring.flyway.mixed=true
spring.flyway.cleanOnValidationError=true
I suppose it could be caused by the property spring.flyway.cleanOnValidationError=true. According to the docs:
Whether to automatically call clean or not when a validation error occurs.
This is exclusively intended as a convenience for development. Even though we strongly recommend not to change migration scripts once they have been checked into SCM and run, this provides a way of dealing with this case in a smooth manner. The database will be wiped clean automatically, ensuring that the next migration will bring you back to the state checked into SCM.
It may be that you get some validation problems because you are running your application in different ways against the same database, and Flyway simply cleans your database and overwrites it with the current state of the scripts.
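If that is what is happening, a minimal sketch of a safer configuration (same property names as in your file) is to turn the clean behaviour off, so that a checksum mismatch between the two run modes surfaces as a validation error instead of silently recreating the database:

spring.flyway.cleanOnValidationError=false
spring.flyway.baselineOnMigrate=true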
I have a @Service class where I'm caching some table data. I don't want those queries to run while building with mvn install. Is there a way to ignore the file during the build so that it only executes when I start the server?
It's a Spring Boot application.
Here is the background of my issue. I initialized the Spring Boot app from http://start.spring.io/, which adds a dummy application test file with the @SpringBootTest annotation and a default contextLoads() method annotated with @Test, with the intention of initializing and executing all test cases, which in turn initializes and executes the whole code base. In my opinion this is not required, as we can have dedicated test classes per controller/manager, which gives a more controlled environment to hook up your test setups and executions.
I have removed the default application test file and added dedicated test classes for code coverage and quality. This way my beans are executed only at server startup rather than at build time.
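For illustration, a minimal sketch of such a dedicated test; CacheService, TableDataRepository and refreshCache() are hypothetical names standing in for your @Service and its dependency. Because it uses plain Mockito rather than @SpringBootTest, no application context is started and no queries run during mvn install:

import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import java.util.Collections;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;   // org.mockito.runners.MockitoJUnitRunner on older Mockito versions

@RunWith(MockitoJUnitRunner.class)
public class CacheServiceTest {

    @Mock
    private TableDataRepository repository;   // hypothetical dependency, mocked so no real query is executed

    @InjectMocks
    private CacheService cacheService;        // hypothetical service under test

    @Test
    public void cachesTableDataWithoutStartingTheContext() {
        when(repository.findAll()).thenReturn(Collections.emptyList());

        cacheService.refreshCache();

        verify(repository).findAll();
    }
}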
I am trying to find the best way to configure my Spring Boot web application so that I can easily switch between the following data sources for both local testing and deployment.
H2 in-memory DB. Local testing only.
Dev Oracle. Local testing and deployment.
Prod Oracle. Deployment only.
By local testing, I mean testing in the IDE (Eclipse). The dev and prod Oracle databases are set up on two remote servers.
After some research, there are different ways to switch from one data source to another.
Use Spring profiles. See Using H2 and Oracle with Spring Boot. Set up the following files on the classpath: application.properties, application-h2.properties and application-dev.properties. While the connections for H2 and dev are defined in the corresponding properties files, spring.profiles.active is set in application.properties. My understanding is that this property can be overridden during the build by specifying spring.profiles.active; however, it seems to be a JVM variable, so how do I set it when running Maven? (See the sketch after these options.)
Maven profiles. Create multiple profiles in the POM and a filter pointing to the application properties files. The profile specified with the -P option during the Maven build determines which application properties file is used. However, according to maven application with multi environment configuration can't deploy on tomcat, this generates different WARs for different deployments, so method 1 is preferred. Besides, it does not help with switching data sources while testing locally.
Persistence units. Define different persistence units for different data sources in persistence.xml and obtain an EntityManager for a specific unit. A variation of this method is having a variable in the unit names which is resolved from application.properties.
JNDI lookup. Set up a JNDI name in application.properties with spring.datasource.jndi-name. The actual database information, including URL and credentials, is specified in context.xml in the Tomcat folder where the WAR is deployed.
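As a sketch of what method 1 looks like on disk (hosts, service names and credentials below are placeholders, not real values):

# application.properties - only the profile selection lives here
spring.profiles.active=h2

# application-h2.properties
spring.datasource.url=jdbc:h2:mem:testdb
spring.datasource.driver-class-name=org.h2.Driver

# application-dev.properties
spring.datasource.url=jdbc:oracle:thin:@//dev-host:1521/DEVSVC
spring.datasource.username=devuser
spring.datasource.password=devpassword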
For the local testing environment my mind is set: I am going with method 1. Switching between in-memory H2 and Oracle is easy, just by changing the property in application.properties. Since testing is usually done in the IDE, a WAR does not need to be generated, although answers are still welcome on running mvn install with spring.profiles.active.
As for deployment, JNDI is definitely the way to go. However, I am concerned that the two properties in application.properties, spring.profiles.active and spring.datasource.jndi-name, may conflict with each other. If I have spring.profiles.active=h2 and then deploy the WAR to the prod server, does it try to connect to H2 based on the Spring profile or to the prod DB based on jndi-name? What is the best practice to accommodate all scenarios with enough flexibility?
Also, is an explicit configuration class for the DataSource required, as in Configure Mutiple DataSource in Spring Boot with JNDI? My understanding is that application.properties and a Spring profile should be enough to handle it, right?
Definitely use Spring profiles.
You don't want to use Maven profiles, as they create different artifacts. Ask your QA/release engineers how they feel about having different artifacts for different environments :). They wouldn't be happy.
H2 is what you want to use in CI server integration testing as well. Such integration testing is fast and easy.
Instead of changing the profile in application.properties, consider defining the profile via a command-line parameter, so that no configuration file changes are required to run your application under different profiles.
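For example (a sketch; the profile names are the ones from the question, and the exact plugin property depends on the Spring Boot version):

# run locally through the Spring Boot Maven plugin
mvn spring-boot:run -Dspring-boot.run.profiles=h2      # -Drun.profiles=h2 on the 1.x plugin

# run the packaged artifact with a different profile
java -jar app.war --spring.profiles.active=dev

# the same system property is typically picked up by the tests during the build
mvn clean install -Dspring.profiles.active=h2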
I've been looking for someone else doing this same thing, but haven't seen a scenario that's quite like this so I thought I'd see if anyone here has any good ideas on how to accomplish this.
My group builds and maintains an open-source neuroimaging data archive tool called XNAT. Previous versions of our application have always required users to run a builder application that took in a build.properties file and used it to initialize the database server configuration, among other things. We're really trying to get down to a single installable WAR file that we can make available on the NeuroDebian repository. In order to do this, we need to be able to start the application WITHOUT any database configuration information, run through a configuration wizard (a la WordPress or Drupal installations) in which the user enters the database configuration, and finally store this configuration information SOMEWHERE and restart or re-initialize the application context so that its data source gets started up, Hibernate entity scans run, all autowired or injected dependencies that require the data source or Hibernate transaction manager are resolved, services are scanned for @Transactional annotations, and so on.
I can easily see how we can use the new Spring Framework WebApplicationInitializer to detect whether the user has already set up the database configuration and initialize the app properly based on that:
If the database has not been configured, create a servlet that just supports the UI for the initialization wizard.
If the database has been configured, create the regular application context.
The problem in the first case is what happens once the user has completed the initialization wizard? We can store the database configuration somewhere and now we're ready to go but... how do we get the regular application context working? Can we just take the code that we'd call in the already initialized scenario and call that? Will that initialize the application properly then, with component scans and so on all being handled or...?
The only solution we have currently is to have the user restart the server manually (it's usually Tomcat) or use the server manager application to restart just our application. That's not very aesthetically pleasing though.
My end goal here will be to write a simple test app that takes in the database credentials and then tries to initialize everything else afterwards, but I'm hoping to see if anyone's thought about this particular issue and/or tried it and has any advice on how to handle it. Any help would be greatly appreciated!
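For reference, the detection branch described above could look roughly like this. This is only a sketch: DatabaseConfigLocator, SetupWizardServlet, RootConfig and WebMvcConfig are hypothetical names, and it deliberately leaves the restart question open.

import javax.servlet.ServletContext;
import javax.servlet.ServletException;
import javax.servlet.ServletRegistration;

import org.springframework.web.WebApplicationInitializer;
import org.springframework.web.context.ContextLoaderListener;
import org.springframework.web.context.support.AnnotationConfigWebApplicationContext;
import org.springframework.web.servlet.DispatcherServlet;

public class XnatWebAppInitializer implements WebApplicationInitializer {

    @Override
    public void onStartup(ServletContext servletContext) throws ServletException {
        if (!DatabaseConfigLocator.isConfigured()) {
            // no database configuration yet: register only the wizard UI
            ServletRegistration.Dynamic wizard =
                    servletContext.addServlet("setupWizard", new SetupWizardServlet());
            wizard.addMapping("/*");
            wizard.setLoadOnStartup(1);
            return;
        }

        // database configuration found: bootstrap the full Spring context
        AnnotationConfigWebApplicationContext rootContext = new AnnotationConfigWebApplicationContext();
        rootContext.register(RootConfig.class);           // data source, Hibernate, transaction manager, ...
        servletContext.addListener(new ContextLoaderListener(rootContext));

        AnnotationConfigWebApplicationContext webContext = new AnnotationConfigWebApplicationContext();
        webContext.register(WebMvcConfig.class);           // MVC configuration, component scans
        ServletRegistration.Dynamic dispatcher =
                servletContext.addServlet("dispatcher", new DispatcherServlet(webContext));
        dispatcher.addMapping("/");
        dispatcher.setLoadOnStartup(1);
    }
}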
I'm just trying to use the new features of the Spring 3.2 testing framework in order to have real integration tests on the web layer.
I'm running across a problem when triggering the tests with the "SpringJUnit4ClassRunner" in Spring 3.2 because the template engine complains about not being able to resolve the template name:
2013-06-28 09:29:18,372 ERROR TemplateEngine - [THYMELEAF][main] Exception processing template "mobile/index": Error resolving template "mobile/index", template might not exist or might not be accessible by any of the configured Template Resolvers
Of course, the engine is searching for the resource at /WEB-INF/views/mobile/index.html, which is correct in normal execution mode, but in the test execution environment the template is looked up on the classpath (src/test/resources in a Maven-based project), where /WEB-INF/views/mobile/index.html does not exist.
Is there any way to make the engine get the resources from the "real path" so that I don't have to maintain a copy of each HTML view in the test classpath?
Thanks in advance,
One solution is to keep all the views under "/src/main/resources/views" and use the
org.thymeleaf.templateresolver.ClassLoaderTemplateResolver
This way you don't have any dependency on the file system (unlike with the "FileTemplateResolver"), and you also don't need duplicated Spring configuration files for the test execution context.
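A sketch of the corresponding resolver configuration, assuming Java config and the thymeleaf-spring3 integration (bean names are arbitrary; the prefix matches the "views" folder mentioned above):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.thymeleaf.spring3.SpringTemplateEngine;
import org.thymeleaf.templateresolver.ClassLoaderTemplateResolver;

@Configuration
public class ThymeleafConfig {

    @Bean
    public ClassLoaderTemplateResolver templateResolver() {
        // resolves templates from the classpath, so src/main/resources/views
        // is visible both in the packaged application and in the test context
        ClassLoaderTemplateResolver resolver = new ClassLoaderTemplateResolver();
        resolver.setPrefix("views/");
        resolver.setSuffix(".html");
        resolver.setTemplateMode("HTML5");
        return resolver;
    }

    @Bean
    public SpringTemplateEngine templateEngine() {
        SpringTemplateEngine engine = new SpringTemplateEngine();
        engine.setTemplateResolver(templateResolver());
        return engine;
    }
}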
Still, it feels a bit weird to me to have the views in there, but... why not?
Any comments or suggestions on having the views under the resources folder?