How to test two Infinispan instances with JUnit? - spring

I want to create a test to make sure two instances of the Infinispan cache are communicating properly.
As a first step I create two application contexts using two different application-test.properties files.
In the logs I can see that two instances of the cache are created.
In debug I can also see two different instances of CacheManager / DefaultCacheManager.
Everything looks fine, but when I add some values to one instance, the second Cache (Infinispan) instance is not notified about that.
Any advice?

Currently you can use NoSQLUnit (https://github.com/lordofthejars/nosql-unit#infinispan-engine), which provides support for testing and managing the lifecycle of Infinispan.
In the next few weeks we are going to integrate this into Arquillian APE as well.
If you have any questions don't hesitate to ping me, my Twitter is @alexsotob

If you have problems starting two Infinispan caches on a local machine, try to use the real host name or IP instead of 'localhost' or '127.0.0.1'.
If you have problems with multiple JUnit tests and Infinispan caches, try to stop the transport after each test, like:
@After
public void tearDown() {
    applicationContext.getBean(CacheManager.class).getTransport().stop();
}
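For completeness, here is a minimal sketch of how two cache managers are usually made to see each other's writes: both need a clustered transport and a replicated (or distributed) cache mode. This is not taken from the original setup; the cluster name, cache name and keys are placeholders, and the exact builder methods can differ between Infinispan versions.
import org.infinispan.Cache;
import org.infinispan.configuration.cache.CacheMode;
import org.infinispan.configuration.cache.ConfigurationBuilder;
import org.infinispan.configuration.global.GlobalConfigurationBuilder;
import org.infinispan.manager.DefaultCacheManager;

public class ClusteredCacheSketch {
    public static void main(String[] args) {
        // Enable the JGroups transport so the two managers can form a cluster.
        GlobalConfigurationBuilder global = GlobalConfigurationBuilder.defaultClusteredBuilder();
        global.transport().clusterName("test-cluster");

        // Replicate the cache so a put on one node becomes visible on the other.
        ConfigurationBuilder replicated = new ConfigurationBuilder();
        replicated.clustering().cacheMode(CacheMode.REPL_SYNC);

        DefaultCacheManager manager = new DefaultCacheManager(global.build());
        manager.defineConfiguration("replicated-cache", replicated.build());

        Cache<String, String> cache = manager.getCache("replicated-cache");
        cache.put("key", "value"); // visible on the second manager once it joins the cluster
    }
}
Running the same code from a second JVM (or with a second manager instance) should show both nodes in the cluster view and the same entries in both caches.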

Create a file infinispan.xml:
<infinispan
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="urn:infinispan:config:5.1 http://www.infinispan.org/schemas/infinispan-config-5.1.xsd"
    xmlns="urn:infinispan:config:5.1">
    <namedCache name="xml-configured-cache">
        <eviction strategy="LIRS" maxEntries="10" />
    </namedCache>
</infinispan>
Initialize the cache from the file configuration:
Cache c = new DefaultCacheManager("infinispan.xml").getCache("xml-configured-cache");
That's all!

Related

Enables Master/Replica operations with spring-boot-starter-data-redis-reactive

I'm using spring-boot-starter-data-redis-reactive and the @SpringBootApplication annotation to auto-configure the Redis connection. I have set up a Redis cluster with 1 master and 2 slaves, and I have the following config in the application.properties file:
spring.redis.cluster.nodes=master-node:6379,slave1-node:6379,slave2-node:6379
I want to configure it so that all writes go to the master and all reads go to the slaves (slave preferred).
I found that it is using the Lettuce driver under the hood. To achieve this, I need to add .readFrom(SLAVE_PREFERRED) to the LettuceClientConfiguration. Looking at org.springframework.boot.autoconfigure.data.redis.LettuceConnectionConfiguration, I don't see a way to add this config. Any idea how to achieve this?
You need to use a LettuceClientConfigurationBuilderCustomizer, registered as a bean:
@Bean
public LettuceClientConfigurationBuilderCustomizer lettuceClientConfigurationBuilderCustomizer() {
    return builder -> builder.readFrom(ReadFrom.REPLICA);
}
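A fuller sketch of where that customizer can live, assuming Spring Boot 2.1+ (where LettuceClientConfigurationBuilderCustomizer is available); ReadFrom.REPLICA_PREFERRED matches the "slave preferred" behaviour asked about in the question, and the class name is hypothetical:
import io.lettuce.core.ReadFrom;
import org.springframework.boot.autoconfigure.data.redis.LettuceClientConfigurationBuilderCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class RedisReadFromConfig {

    // Boot applies this customizer to the auto-configured LettuceClientConfiguration
    // before it creates the connection factory, so there is no need to copy
    // LettuceConnectionConfiguration.
    @Bean
    public LettuceClientConfigurationBuilderCustomizer readFromReplicaPreferred() {
        return builder -> builder.readFrom(ReadFrom.REPLICA_PREFERRED);
    }
}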

SpringApplicationBuilder for specified profile over -Dspring.profile.active=local

I have an integration test which needs to spin up two applications, one at 8181 and one at 8185. Both applications need to behave exactly the same; the only difference is that they listen on different ports.
I pass -Dspring.profile.active=local for the 8181 server, and for the other server I do
applicationContext = new SpringApplicationBuilder(springConfigs)
    .profiles("abc")
    .run();
But it looks like even though I am specifying abc as a profile, the other server starts with the local profile, and hence port 8181.
If I don't specify -Dspring.profile.active=local and use @ActiveProfiles then all works fine, but since I cannot change the -Dspring.profile.active=local piece I have to come up with an alternate route. Is it possible to force SpringApplicationBuilder to use the profile I specify?
Thanks in advance
Based on the order of precedence for Spring Boot properties given here, command line is among the top. But above that are @TestPropertySource and the properties attribute of @SpringBootTest. The latter takes an array of strings of the form key=value.
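For example, a test can pin the profile and port through that attribute; this is only a sketch, and the test class name is hypothetical:
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

// @SpringBootTest properties rank above JVM system properties in Boot's
// property precedence, so this context should start with the "abc" profile
// and its own port regardless of the -D flag.
@RunWith(SpringRunner.class)
@SpringBootTest(properties = {"spring.profiles.active=abc", "server.port=8185"})
public class SecondInstanceIT {
    // test methods go here
}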

Multiple Embedded HSQLDB databases in jUnit errors during build

I'm working on a new Spring Batch (3.0.3.RELEASE) application where there will be multiple databases accessed during the jobs. For testing we are using HSQLDB (2.3.2) as the embedded database.
In my application context I have the following:
<jdbc:embedded-database id="dataSource">
</jdbc:embedded-database>
<jdbc:embedded-database id="proDataSource">
    <jdbc:script location="classpath:script-tables.sql" />
    <jdbc:script location="classpath:script-constraints.sql" />
</jdbc:embedded-database>
<jdbc:embedded-database id="altDataSource">
    <jdbc:script location="classpath:script-alt-tables.sql" />
</jdbc:embedded-database>
When I run a single test in Eclipse, things are fine. When I build from the command line, after the first test I get errors:
Failed to execute SQL script statement at line 3 of resource class path resource [script-promrkt-promo.sql]
object name already exists: PROMRKT
It appears to me that the population process in EmbeddedDatabaseFactory is receiving an already populated database. From what I can tell, after each test there is no SHUTDOWN being executed, and HSQLDB is leaving the already populated database in memory.
I have re-reviewed the documentation, and a Spring doc does show an explicit shutdown command. But if Spring starts up the embedded database when my test starts, why doesn't it shut it down when the test completes?
Is it expected that the embedded databases will remain after each unit test for the same application context?
In what order does Spring start up an embedded database, and when is the transactional context initialized?
Do I need to use a database cleaner?
Can the populate step be updated to only populate when the database is first started, and roll back to the original script configuration when my test is complete (kinda like how AbstractTransactionalSpringContextTests worked)?
Do I need some transactional markers? Spring Batch's JobRepo is properly being populated and destroyed between each test. Why are my custom dataSources not?
The script the log message is complaining about isn't in your configuration. I presume it's being executed somewhere else? If that's the case, you'll probably need to add @DirtiesContext to your tests so that Spring doesn't cache the context (I'm assuming you're using SpringJUnit4ClassRunner with @ContextConfiguration, but I can't be sure since your actual test isn't in the question).
If my assumption is correct, Spring caches the context in an effort to improve performance over the running of a test suite. If your test modifies the context in a way that can impact other tests (like running scripts in one test that need to be run again in others), mark the tests with @DirtiesContext and Spring won't cache the context. You can use the annotation at either the method or class level. You can read more about the annotation here: http://docs.spring.io/spring/docs/current/javadoc-api/org/springframework/test/annotation/DirtiesContext.html
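A minimal sketch of the suggested annotations (the test class and context file names are hypothetical):
import org.junit.runner.RunWith;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

// Rebuild the context (and therefore the embedded databases) after each
// test method instead of reusing the cached context.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml")
@DirtiesContext(classMode = DirtiesContext.ClassMode.AFTER_EACH_TEST_METHOD)
public class BatchJobTest {
    // tests that hit the embedded HSQLDB data sources go here
}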
I spent a lot of time looking at this and reading the Spring Framework documentation (gasp!) and tracing through code. There are some interesting changes in Spring core 4.1, especially in testing.
I found out that ApplicationContexts are now cached at the JVM level. If a second test asks for a context, the TestContext framework looks first in its cache to see if some other test has already asked for the identical configuration.
I have profiles for some of my tests. A test with a different profile but the same @ContextConfiguration causes that context to be reloaded with the profile applied. When the "bean loader" arrives at creating the embedded databases, the EmbeddedDatabaseFactory does not take into consideration that the embedded database (in-memory HSQLDB) may have already been created or cached by previous tests and does not need to be re-initialized.
Therefore I added some logic to EmbeddedDatabaseFactory.initDatabase(), checking whether the database already exists before re-initializing it and running the DatabasePopulator.
List existingDataBases = org.hsqldb.DatabaseManager.getDatabaseURIs();
boolean isExisting = false;
String localDBName = StringUtils.lowerCase(this.databaseName);
// Check whether HSQLDB already holds an in-memory database with this name
for (Object object : existingDataBases) {
    if (object.toString().contains(localDBName)) {
        isExisting = true;
        break;
    }
}
// Only populate the database if it was created for the first time
if (!isExisting && this.databasePopulator != null) {
(of course this isn't quite kosher for what Spring would need, but it gets the point across)
In my opinion it looks like an issue partially with the EmbeddedDatabaseFactory and partially with the TestContext caching mechanism. My jdbc:embedded-database definitions do not have any profiles associated with them. Why does the cache need to re-create them instead of loading them from the existing cached beans?
You can try to force the creation of a new embedded database by setting a unique name with generateUniqueName(true) each time a new object is created.
Here is an example:
embeddedDatabase = new EmbeddedDatabaseBuilder()
.setType(EmbeddedDatabaseType.H2)
.generateUniqueName(true)
.addScripts("db/sql/create-db.sql", "db/sql/insert-data.sql")
.build();

Access to h2 web console while running junit test in a Spring application

I'm building a Spring application and I need to inspect my H2 in-memory database from a web browser while I'm running my JUnit tests.
In my Spring configuration I have a bean which is responsible for creating my database schema and populating it with some data which will be used within my JUnit tests. I've also added a bean in my test context which creates a web server where I will eventually look for my data.
<bean id="org.h2.tools.Server-WebServer" class="org.h2.tools.Server"
factory-method="createWebServer" init-method="start" lazy-init="false">
<constructor-arg value="-web,-webAllowOthers,-webPort,11111" />
</bean>
Everything seems OK because the database is populated properly, since I can access its data from my JUnit tests, and the H2 server only runs while I'm in my test phase (I can tell, because if I try to access my_ip:11111 before debugging my tests I cannot connect, but I can connect once I've started my tests).
Anyway, if I open the H2 console from a web browser, no schema is shown in it. Any ideas?
Many thanks!!
As this is probably going to be a test-debugging feature, you can add it at runtime in your @BeforeAll:
import org.h2.tools.Server;
/* Initialization logic here */
@BeforeAll
public static void initTest() throws SQLException {
    Server.createWebServer("-web", "-webAllowOthers", "-webPort", "8082")
          .start();
}
And then connect to http://localhost:8082/
Note: unless you need this to run as part of your CI build, you'll need to remove this code when you're finished debugging
For future reference, here's another way to do it:
Start the database and web servers (the version can differ):
$ cd .../maven_repository/com/h2database/h2/1.4.194
$ java -cp h2-1.4.194.jar org.h2.tools.Server -tcp -web -browser
TCP server running at tcp://169.254.104.55:9092 (only local connections)
Web Console server running at http://169.254.104.55:8082 (only local connections)
Set the database URL for tests in code to jdbc:h2:tcp://localhost:9092/mem:mytest.
Run or debug the tests.
Click Connect in the browser window which opened in step 1.
The jar file for H2 can be downloaded at https://mvnrepository.com/artifact/com.h2database/h2.
The server can be started via @Before in the test file like in snovelli's answer, but only if the connection to the database is established afterwards, which might be a problem.
I guess the problem is that you are connecting to the H2 database directly from your application, not through the server you are launching with the bean. Because of this, your app and the H2 web interface can't share one in-memory database.
You should change the jdbcUrl in tests to something like jdbc:h2:tcp://localhost/mem:my_DB;DB_CLOSE_DELAY=-1;MODE=Oracle, and in the browser you should connect to the same URL.
With JDBC URLs like jdbc:h2:tcp://localhost/... all connections go through the H2 server, and you can view the database state in the browser.
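For instance, if the tests pick up their configuration from a Spring Boot style properties file, the test datasource could point at the TCP server like this (the property keys assume Spring Boot's standard datasource configuration; adjust for a plain Spring setup):
spring.datasource.url=jdbc:h2:tcp://localhost/mem:my_DB;DB_CLOSE_DELAY=-1;MODE=Oracle
spring.datasource.driver-class-name=org.h2.Driver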
If you have defined the JDBC URL as something like jdbc:h2:mem:db in your properties, the database actually gets a somewhat longer name when it is created.
Add an @Autowired DataSource dataSource to your test class, set a debug point somewhere, inspect that datasource with dataSource.getConnection() and look at the url property. In the case I'm running right this moment, it is
jdbc:h2:mem:43ed83d6-97a1-4515-a925-a8ba53cd322c
Plugging that into the web console shows everything I'm expecting.
It isn't the most straightforward way, but it does work.
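If you prefer not to rely on the debugger, the same information can be printed from a test; this is a sketch using plain JDBC metadata (the test class name is hypothetical):
import java.sql.Connection;
import java.sql.SQLException;
import javax.sql.DataSource;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
public class DataSourceUrlTest {

    @Autowired
    private DataSource dataSource;

    @Test
    public void printJdbcUrl() throws SQLException {
        try (Connection connection = dataSource.getConnection()) {
            // Prints the effective URL, e.g. jdbc:h2:mem:<generated-uuid>
            System.out.println(connection.getMetaData().getURL());
        }
    }
}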
snovelli's answer above is good.
To debug a particular test case in your IDE, add an infinite loop at the end of the test case, then go to the browser, launch the console, and you can query the data.
Something like below:
import org.h2.tools.Server;
/* Initialization logic here */
@BeforeAll
public static void initTest() throws SQLException {
    Server.createWebServer("-web", "-webAllowOthers", "-webPort", "8082")
          .start();
}

@Test
void testMyDBOperation() {
    // some db operations like save and get
    while (true) {
    }
}
Now you can go to the browser and launch the console at http://localhost:8082/.
Of course, delete the above two changes after debugging.
This is not an answer, but a debugging tip.
When you finally access the H2 console at http://127.0.0.1:8082/ you may notice that database changes are not shown.
This is because the test cases are not transactional and the data is not committed. Although this behaviour is good, since each test case must run in a predefined environment, it is not good if you want to debug and see database changes.
To achieve this, add the @Commit annotation above the test case and put a dummy line in an @AfterAll annotated method, to pause the test and let you see the H2 console (the H2 server stops as soon as the test finishes).
@AfterAll
public static void finalizeTest() throws Exception {
    System.out.print("Just put a break point here");
}

@Test
@Commit
void should_store_an_article() {
    // Your test here
}

Acceptance testing preloading of data into GAE dev server datastore

In my application I have a set of DAOs which I inject into my application layer. For an acceptance test I'm writing, I want to preload the dev_server datastore with data, so I use the same Spring config in my JUnit test (using the @ContextConfiguration annotation) to inject an instance of the relevant DAO into my test. When I actually go to store some data, e.g.:
dao.add(entity)
I get the dreaded "No API environment is registered for this thread."
Caused by: java.lang.NullPointerException: No API environment is registered for this thread.
at com.google.appengine.api.datastore.DatastoreApiHelper.getCurrentAppId(DatastoreApiHelper.java:108)
at com.google.appengine.api.datastore.DatastoreApiHelper.getCurrentAppIdNamespace(DatastoreApiHelper.java:118)
....
This is probably because my test case hasn't read in the GAE appengine-web.xml with the app details (although I'm guessing here, I could really be wrong), so it doesn't know to write to the same datastore that the app running on the dev_server is reading from and writing to.
How can I get my test to "point" to the same datastore as the app? Is there some "datasource" mechanism that I can inject into both the app and the test? Is there a way to force the datastore API in my test to read the needed config?
Here is a page that talks about how to write unit tests that connect to a dev datastore. Is this the kind of thing you're looking for? Basically it covers two classes, LocalServiceTestHelper and LocalDatastoreServiceTestConfig, that you can use to set up an environment for testing. While the example given is for unit tests, I believe it will also work for your situation.
You can then configure things like whether the dev datastore is written to disk or just kept in memory (for faster tests). If you want this data to go to the same place as your dev server, you will probably want to adjust this, as I think the default is the "in memory" option. If you look at the javadoc there is a setBackingStoreLocation method where you can point to whatever file you want.
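A sketch of that setup with the App Engine testing classes (the backing store path and test class name are placeholders; setNoStorage and setBackingStoreLocation are the relevant LocalDatastoreServiceTestConfig methods):
import com.google.appengine.tools.development.testing.LocalDatastoreServiceTestConfig;
import com.google.appengine.tools.development.testing.LocalServiceTestHelper;
import org.junit.After;
import org.junit.Before;

public class DatastorePreloadTest {

    // Write the dev datastore to disk instead of keeping it only in memory,
    // so the data can be shared with a dev_server pointing at the same file.
    private final LocalServiceTestHelper helper = new LocalServiceTestHelper(
            new LocalDatastoreServiceTestConfig()
                    .setNoStorage(false)
                    .setBackingStoreLocation("target/datastore/local_db.bin"));

    @Before
    public void setUp() {
        helper.setUp();
    }

    @After
    public void tearDown() {
        helper.tearDown();
    }
}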
I've found the solution!!!!
For some reason the Namespace, AppId and AuthDomain fields of the test datastore have to match those of the dev_server; only then can the dev_server see the entities inserted by the test.
You can see the values for the environment (dev_server or test code) with the following statements:
System.out.println(NamespaceManager.get());
System.out.println(ApiProxy.getCurrentEnvironment().getAppId());
System.out.println(ApiProxy.getCurrentEnvironment().getAuthDomain());
In your instance of LocalServiceTestHelper (e.g. gaeHelper), you can set the values for the test environment:
// the NamespaceManager is thread local.
NamespaceManager.set(NamespaceManager.getGoogleAppsNamespace());
gaeHelper.setEnvAppId(<the name of your app in appengine-web.xml>);
gaeHelper.setEnvAuthDomain("gmail.com");
Then the dev_server will see your entities. However, because of synchronisation issues, if the test writes to the datastore after the dev_server has been started, the dev_server won't see it unless it can be forced to re-read the file (which I haven't figured out yet). Otherwise the server has to be restarted.
I've found a workaround, although it's not very nice because each test method doesn't clean up the Datastore, as explained in the article Local Unit Testing for Java. However, the Datastore starts clean each time the test class is run, so it's not so bad, provided that you're careful about that.
The problem is that when using SpringJUnit4ClassRunner, the Spring environment is created before @Before methods can run. The solution is to use @BeforeClass and a static variable for LocalServiceTestHelper, so that it is created before the Spring environment is set up.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:META-INF/spring/context-test.xml")
@Transactional
public class MyTest {

    @Inject
    private MyService myService;

    private static final LocalServiceTestHelper helper =
            new LocalServiceTestHelper(new LocalDatastoreServiceTestConfig());

    @BeforeClass
    public static void beforeClass() {
        helper.setUp();
    }

    @AfterClass
    public static void afterClass() {
        helper.tearDown();
    }
}
If anyone has a better solution, I'll be glad to hear it!
