I need to execute integration tests using DbUnit. I have created two datasets (before and after the test) and compare them using the @DatabaseSetup and @ExpectedDatabase annotations. During the test one new database row is created (it is present in the after-test dataset, which I specify with @ExpectedDatabase). The problem is that the row id is generated automatically (I am using Hibernate), so the id changes on every run. As a result my test passes only once, and after that I have to change the id in the after-test dataset, which is not what I want. Can you please suggest any solutions for this issue, if it can be resolved with DbUnit?
Solution A:
Use an assigned-id strategy and a separate query to retrieve the next value in your business logic. That way you can always assign a known id in your persistence tests, with some appropriate database cleanup. Note that this only works if you're using an Oracle sequence.
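A rough sketch of what that could look like, assuming a JdbcTemplate is available in the service layer and an Oracle sequence named ENTITY_GROUP_SEQ exists (the sequence name and the EntityGroup entity are made-up examples, not from the original answer):
// fetch the next id explicitly so the test always knows which id to expect
public Long nextEntityGroupId() {
    return jdbcTemplate.queryForObject(
            "select ENTITY_GROUP_SEQ.nextval from dual", Long.class);
}

// the entity then uses an assigned id instead of a database-generated one
EntityGroup group = new EntityGroup();
group.setId(nextEntityGroupId());
entityManager.persist(group);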
Solution B:
If I'm not mistaken, there are methods like assertEqualsIgnoreCols() in org.dbunit.Assertion, so you can skip the assertion on the id column if you don't mind. Usually I compensate for that with a not-null check on the id. Maybe there are some options in @ExpectedDatabase as well, but I'm not sure.
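A minimal sketch of that approach with plain DbUnit, assuming an open IDatabaseConnection called dbUnitConnection and the table/column names used further down in this thread (all of them illustrative), inside a test method that declares throws Exception:
IDataSet expected = new FlatXmlDataSetBuilder()
        .build(new File("src/test/resources/expected_data.xml"));
IDataSet actual = dbUnitConnection.createDataSet();

// compare the table while ignoring the generated ID column
Assertion.assertEqualsIgnoreCols(expected, actual,
        "company.entity_group", new String[] { "ID" });

// compensate with a not-null check on the generated id
assertNotNull(actual.getTable("company.entity_group").getValue(0, "ID"));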
Solution C:
I'd like to know if there is a better solution, because solution A introduces some performance overhead while solution B sacrifices a little test coverage.
What version of DbUnit are you using, by the way? I have never seen these annotations in 2.4.9 and below; they look easier to use.
This workaround has been saving my skin so far:
I implemented an AbstractDataSetLoader with a replacement feature:
import java.io.InputStream;
import java.sql.Timestamp;
import java.util.Date;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.ReplacementDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
import org.springframework.core.io.Resource;

import com.github.springtestdbunit.dataset.AbstractDataSetLoader;

public class ReplacerDataSetLoader extends AbstractDataSetLoader {

    private Map<String, Object> replacements = new ConcurrentHashMap<>();

    @Override
    protected IDataSet createDataSet(Resource resource) throws Exception {
        FlatXmlDataSetBuilder builder = new FlatXmlDataSetBuilder();
        builder.setColumnSensing(true);
        try (InputStream inputStream = resource.getInputStream()) {
            return createReplacementDataSet(builder.build(inputStream));
        }
    }

    /**
     * Prepare some replacements.
     *
     * @param dataSet the dataset loaded from the XML resource
     * @return the dataset wrapped with the configured replacements
     */
    private ReplacementDataSet createReplacementDataSet(FlatXmlDataSet dataSet) {
        ReplacementDataSet replacementDataSet = new ReplacementDataSet(dataSet);
        // Configure the replacement dataset to replace '[null]' strings with null.
        replacementDataSet.addReplacementObject("[null]", null);
        replacementDataSet.addReplacementObject("[NULL]", null);
        replacementDataSet.addReplacementObject("[TODAY]", new Date());
        replacementDataSet.addReplacementObject("[NOW]", new Timestamp(System.currentTimeMillis()));
        for (Map.Entry<String, Object> entry : replacements.entrySet()) {
            replacementDataSet.addReplacementObject("[" + entry.getKey() + "]", entry.getValue());
        }
        replacements.clear();
        return replacementDataSet;
    }

    public void replace(String replacement, Object value) {
        replacements.put(replacement, value);
    }
}
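The loader then has to be exposed as a Spring bean under the name referenced by @DbUnitConfiguration below; a minimal sketch, assuming Java-based test configuration is available (the configuration class name is made up):
@Configuration
public class DbUnitTestConfig {

    // bean name "replacerDataSetLoader" matches the dataSetLoaderBean attribute used below
    @Bean
    public ReplacerDataSetLoader replacerDataSetLoader() {
        return new ReplacerDataSetLoader();
    }
}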
With this you can track the ids you need and replace them in your tests:
@DatabaseSetup(value = "/test_data_user.xml")
@DbUnitConfiguration(dataSetLoaderBean = "replacerDataSetLoader")
public class ControllerITest extends WebAppConfigurationAware {

    // reference my test db connection so I can get the last id using a regular query
    @Autowired
    DatabaseDataSourceConnection dbUnitDatabaseConnection;

    // reference my dataset loader so I can interact with it
    @Autowired
    ReplacerDataSetLoader datasetLoader;

    private static Number lastid = Integer.valueOf(15156);

    @Before
    public void setup() {
        System.out.println("setting " + lastid);
        datasetLoader.replace("emp1", lastid.intValue() + 1);
        datasetLoader.replace("emp2", lastid.intValue() + 2);
    }

    @After
    public void tearDown() throws SQLException, DataSetException {
        ITable table = dbUnitDatabaseConnection.createQueryTable("ids",
                "select max(id) as id from company.entity_group");
        lastid = (Number) table.getValue(0, "id");
    }

    @Test
    @ExpectedDatabase(value = "/expected_data.xml", assertionMode = DatabaseAssertionMode.NON_STRICT)
    public void test1() throws Exception {
        // run your test logic
    }

    @Test
    @ExpectedDatabase(value = "/expected_data.xml", assertionMode = DatabaseAssertionMode.NON_STRICT)
    public void test2() throws Exception {
        // run your test logic
    }
}
And my expected dataset needs the emp1 and emp2 replacements:
<?xml version='1.0' encoding='UTF-8'?>
<dataset>
    <company.entity_group ID="15155" corporate_name="comp1"/>
    <company.entity_group ID="15156" corporate_name="comp2"/>
    <company.entity_group ID="[emp1]" corporate_name="comp3"/>
    <company.entity_group ID="[emp2]" corporate_name="comp3"/>
    <company.ref_entity ID="1" entity_group_id="[emp1]"/>
    <company.ref_entity ID="2" entity_group_id="[emp2]"/>
</dataset>
Use DatabaseAssertionMode.NON_STRICT and delete the 'id' column from your expected XML.
DbUnit will then ignore this column.
I have a requirement whereby I need to advise all delete and save methods and send the deleted/saved record somewhere else.
I am using JpaRepository, which has
6 x delete methods
3 x save methods
Basically I need to advise all of these methods. The trouble is that each of them has a different signature and return type, sometimes accepting a Long, an Object or a List. I am considering using aspects to achieve this, but it seems it would get nasty: I currently have 4 objects I need to audit, which comes to 4 x 9 = 36 different pointcuts. There are more of these to come, so this would soon run into the hundreds.
Is there a better way?
I got it working as @sheltem suggested: I used EntityListeners. In my case I needed access to a Spring bean and was able to do it this way:
@Component
public class PublishEntityListener {

    private static PublishingService publishingService;

    @Autowired(required = true)
    public void setPublishingService(PublishingService publishingService) {
        PublishEntityListener.publishingService = publishingService;
    }

    @PostConstruct
    public void init() {
        // Allow the static dependency to be set up post-construct, as @EntityListeners are not Spring managed
    }

    @PostPersist
    public void postPersist(DomainObject<?> entity) {
        publishingService.publish(getTopicName(entity), HttpMethod.POST, entity);
    }

    @PostUpdate
    public void postUpdate(DomainObject<?> entity) {
        publishingService.publish(getTopicName(entity), HttpMethod.PUT, entity);
    }

    @PostRemove
    public void onDelete(DomainObject<?> entity) {
        publishingService.publish(getTopicName(entity), HttpMethod.DELETE, entity);
    }
}
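For reference, the listener still needs to be attached to each audited entity; a minimal sketch, where the Customer entity and the DomainObject<Long> type argument are assumptions based on the snippet above:
@Entity
@EntityListeners(PublishEntityListener.class)
public class Customer implements DomainObject<Long> {

    @Id
    @GeneratedValue
    private Long id;

    private String name;

    // getters and setters omitted
}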
I have my entities with a @Version column, DAOs, and a JUnit test.
How can I induce an optimistic lock exception in the JUnit test case, to see that it's handled correctly?
I am using Spring transaction management, which I think makes this more complicated.
1. Open a transaction from a JUnit test method and read one row from a certain table.
2. Create a new thread and open another database transaction which reads the same row.
3. Update it and save it to the database from that second transaction.
4. Pause the main thread used by the JUnit test method so that the second transaction can commit.
5. Modify the data read at the beginning and try updating the row. As a result, an optimistic lock exception should be thrown.
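A rough single-threaded variant of the steps above, using two separate persistence contexts instead of a second thread (the Account entity and the entityManagerFactory field are assumptions; with Hibernate the failure typically surfaces as a RollbackException wrapping an OptimisticLockException on commit):
EntityManager em1 = entityManagerFactory.createEntityManager();
EntityManager em2 = entityManagerFactory.createEntityManager();

em1.getTransaction().begin();
Account stale = em1.find(Account.class, 1L);   // loaded at version 1

// a second persistence context updates the same row and bumps the version
em2.getTransaction().begin();
Account fresh = em2.find(Account.class, 1L);
fresh.setName("updated elsewhere");
em2.getTransaction().commit();                 // row is now at version 2

// now try to commit the stale copy
stale.setName("updated with a stale version");
try {
    em1.getTransaction().commit();
    fail("expected an optimistic locking failure");
} catch (RollbackException e) {
    assertTrue(e.getCause() instanceof OptimisticLockException);
} finally {
    em1.close();
    em2.close();
}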
In my current project we have to handle the OptimisticLockException and wrap it in a customized exception. We are using Hibernate rather than Spring, but maybe this approach will help you.
For my solution you need an OpenEJB container in your test:
@LocalClient
public class ExampleClassTest {

    // a self-written class to bootstrap the OpenEJB container
    private static OpenEjbContainerStarter openEjbStarter;
    private static Context context;

    @Resource
    private UserTransaction userTransaction;

    @EJB
    private ExampleWithSaveActionFacade facade;

    @EJB
    private ExampleDAO exampleDataAccessObject;

    @BeforeClass
    public static void setUpBefore() throws Exception {
        openEjbStarter = new OpenEjbContainerStarter();
        context = openEjbStarter.startOpenEJB();
    }

    @AfterClass
    public static void shutdown() throws NamingException, SQLException {
        openEjbStarter.destroyOpenEJB();
    }

    @Before
    public void before() throws Exception {
        context.bind("inject", this);
    }

    @Test(expected = OptimisticLockException.class)
    public void testSaveSomethingWithException() {
        // load the first object you will manipulate; its current version is 1
        ExampleModel example = exampleDataAccessObject.findById(1L);
        example.setSomeData("test");

        // load the second object that will cause the exception; its current version is also 1
        ExampleModel exampleOldObject = exampleDataAccessObject.findById(1L);

        // the returned value now has version 2
        ExampleModel returnValue = facade.persistOrUpdateUser(example);

        // trying to save the object that still has version 1 throws the OptimisticLockException
        facade.persistOrUpdateUser(exampleOldObject);
    }
}
I'm upgrading a Spring 3 application to Spring 4. My @Repository has ParameterizedRowMapper objects to map the SQL results to objects. But since Spring 4 that interface has been deprecated "in favor of the regular SingleColumnRowMapper". However, I use mappers to map multiple columns. How am I meant to map multiple columns using SingleColumnRowMapper? Or am I meant to be doing something completely different?
For example, here is the kind of code I have now:
private static final ParameterizedRowMapper<Thing> THING_ENTRY_MAPPER = new ParameterizedRowMapper<Thing>() {
    @Override
    public Thing mapRow(ResultSet rs, int rowNum) throws SQLException {
        return new Thing(rs.getLong(1), rs.getLong(2), rs.getInt(3));
    }
};

@Override
public List<Thing> getThings(ID id, long start, long end) {
    final Map<String, Object> params = new HashMap<String, Object>(4);
    putIDParams(params, id);
    putTimeRangeParams(params, start, end);
    return getNamedParameterJdbcTemplate().query(QUERY_THING, params, THING_ENTRY_MAPPER);
}
How should I be implementing that kind of functionality now?
The Javadoc seems to be wrong. The Spring Framework designers most likely intend the regular RowMapper<Thing> interface to replace ParameterizedRowMapper<Thing>; that is, use the base interface.
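For example, the mapper from the question could simply be declared against RowMapper, and the query(...) call in getThings() should not need to change, since the template methods accept the base interface:
private static final RowMapper<Thing> THING_ENTRY_MAPPER = new RowMapper<Thing>() {
    @Override
    public Thing mapRow(ResultSet rs, int rowNum) throws SQLException {
        return new Thing(rs.getLong(1), rs.getLong(2), rs.getInt(3));
    }
};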
I'm trying to test my DAO, which uses the jdbcTemplate.batchUpdate method under the hood.
My tests run against a real data source, and all methods are performed in transactions marked as rollback-only (any changes are rolled back after the test). The test transactions are managed by a PlatformTransactionManager.
The issue is that jdbcTemplate.batchUpdate seems to be executed in a separate transaction started by DataSourceTransactionManager, and thus I can't see the changes made by jdbcTemplate.
My test:
@Transactional
@TransactionConfiguration(transactionManager = "txManager", defaultRollback = true)
@RunWith(SpringJUnit4ClassRunner.class)
public abstract class AbstractDbUnitTest
...

@Test
public void removeSpecific() {
    myDao.removeSpecific(myDao.findAllAliasedItems());
    Assert.assertEquals(0, myDao.findAllAliasedItems().size());
}
DAO:
@Override
public void removeSpecific(final List<? extends Item> items) {
    jdbcTemplate.batchUpdate("delete from ITEM where item_type = ? and item_id = ?",
            new BatchPreparedStatementSetter() {
                @Override
                public void setValues(PreparedStatement ps, int i) throws SQLException {
                    ps.setString(1, items.get(i).getType().name());
                    ps.setString(2, items.get(i).getId());
                }

                @Override
                public int getBatchSize() {
                    return items.size();
                }
            });
}
Is there any way to test the batchUpdate method without actually altering the data?
Thanks in advance.
I'm sorry for misleading you all. The issue was not caused by a nested transaction.
This question was the result of my total misunderstanding of the hierarchy of transaction-related classes in Java and of how batch statements are implemented in JDBC.
The test was failing because subsequent calls to
myDao.findAllAliasedItems()
were cached. The Spring JDBC template worked just fine, as one might expect.
I'm new to JMock, trying to develop a Spring controller test. Here is my test method:
@Test
public void testList() {
    context.checking(new Expectations() {{
        Student student = new Student(767001);
        oneOf(studentService).getByNumber(767001);
        will(returnValue(student));
    }});

    ModelMap model = new ModelMap();
    Student student = new Student(767001);
    model.addAttribute("student", student);

    CourseRightController instance = new CourseRightController();
    request.setMethod("GET");

    Assert.assertEquals(studentService.getByNumber(767001), model.get(student));
}
The question is: how can I test that the model contains the right object and object values? ModelMap is not as flexible as e.g. ModelAndView. I can't get access to the model attributes, so the last code line here is not how it should be.
I usually use the Model interface, and then in a test superclass I have code which allows me to get at things in the Model:
@Ignore
public abstract class SpringControllerTestCase {

    /**
     * Spring Model object - initialised in @Before method.
     */
    private Model model;

    /**
     * Initialise fields before each test case.
     */
    @Before
    public final void setUpAll() {
        model = new ExtendedModelMap();
    }

    public final Model getModel() {
        return model;
    }

    @SuppressWarnings("unchecked")
    public <T> T getModelValue(final String key, final Class<T> clazz) {
        return (T) getModel().asMap().get(key);
    }
}
then in a test I can do
assertEquals("someValue", getModelValue("bean", String.class));
or
assertTrue(getModelValue("student", Student.class).getId() == 767001);
Note this is all just shorthand for code like this:
Student student = (Student) model.asMap().get("student");
assertEquals(767001, student.getId());
You can use an ExtendedModelMap instead for more flexibility, and you should declare references using the interface, not the implementation.
There is also this package, to be included in Spring 3.2, which may help: https://github.com/SpringSource/spring-test-mvc
However, I have always been fine using ExtendedModelMap and plain old hash maps.
In your example, have you implemented equals (and hashCode) correctly? If you have not overridden these methods, the assertEquals will be testing whether the objects are the same reference.
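For completeness, a rough sketch of such an override on the Student class from the question, assuming its id is an int (as the assertEquals(767001, student.getId()) above suggests):
@Override
public boolean equals(Object other) {
    if (this == other) {
        return true;
    }
    if (!(other instanceof Student)) {
        return false;
    }
    return getId() == ((Student) other).getId();
}

@Override
public int hashCode() {
    return getId();
}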