SonarQube exclusion on test data - Spring

Fellow Members,
I am trying to configure SonarQube on my service.
For tests, I have extracted the test data (setup) into a separate class, testData.java.
My question is: how should I analyse testData.java using Sonar? Sonar mandates that all files under the test package end with *Test.java, so if I try to rename the file to, say, testDataTest.java, it asks me to add a test to the class.
Since my class is a final class and does not contain any tests, I have to resort to a hack and add a meaningless @Test.
I would like to understand what the best practices are here and how I could improve this.
An Example of my class:
@TestConfiguration
@Import({ TestSecurityConfig.class, TestAuthentication.class })
public class ConfigurationSetupForTest {
    @Test
    @DisplayName("This is a dumb test so Sonar will not give inclusion errors")
    void dumbTest() {
        assertTrue(true);
    }
}
Thanks
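One commonly suggested alternative to the dummy-test hack is SonarQube's exclusion properties, which can keep such setup classes out of the analysis entirely. A minimal sketch, assuming analysis is driven by a sonar-project.properties file (the path pattern is illustrative; the same property can also be passed to the Maven or Gradle Sonar plugins as an analysis parameter):

# sonar-project.properties
# Keep the test-data setup class out of the test-file analysis
sonar.test.exclusions=**/ConfigurationSetupForTest.java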

Related

Writing load tests for a JUnit Spring Boot application

I have written the following JUnit test.
@SpringBootTest
public class abc {
    @Autowired
    private ActorService actorService;

    @Test
    @Transactional
    public void testCorrectIdHandlingForNewInsertedActors() {
        int elementsInDb = 0;
        for (Actor a : this.actorService.findAll()) {
            elementsInDb++;
        }
        Actor actor = this.actorService.saveAndSetId(new Actor());
        assertEquals(elementsInDb + 1, actor.getId());
    }
}
Now I want to write some load tests for performance testing, but I don't know which tools I can use within my Spring application. I am using Gradle as my build tool. Any tutorial would be appreciated.
PS: I have already tried zerocode, but it does not work for me.
You have some useful features out of the box, such as @RepeatedTest and @Timeout (see the JUnit 5 annotations reference here), which respectively allow you to repeat a specific test method n times and to set a maximum time limit after which a test fails automatically.
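A minimal sketch of those two annotations together (the class and method names are illustrative, not from the question):

import java.util.concurrent.TimeUnit;

import org.junit.jupiter.api.RepeatedTest;
import org.junit.jupiter.api.Timeout;

class ActorServiceLoadTest {

    @RepeatedTest(50) // run this test body 50 times
    @Timeout(value = 200, unit = TimeUnit.MILLISECONDS) // fail any run that exceeds 200 ms
    void findAllCompletesQuickly() {
        // exercise the code under load here, e.g. actorService.findAll()
    }
}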
Other than that, for more complete and meaningful load testing you should consider relying on a full-fledged solution such as Apache JMeter or Gatling, rather than unit tests.
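For what it's worth, a JMeter test plan can be run headless from a Gradle or CI task once a .jmx plan exists (the plan file name here is a placeholder):

# Non-GUI run: -n = no GUI, -t = test plan, -l = results file
jmeter -n -t load-test-plan.jmx -l results.jtl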

How to run multiple tests with Spring Boot

With Spring Boot 1.5, how can I run multiple tests which are in different classes?
E.g.
I have `Service1` tests in `Service1test.java`;
I have `Service2` tests in `Service2test.java`;
I need to run both in one go.
What I have done is as follows:
In the main class
@RunWith(Suite.class)
@Suite.SuiteClasses({
    PostServiceTest.class,
    UserServiceTest.class
})
public class DataApplicationTests {
    @Test
    public void contextLoads() {
    }
}
In the PostServiceTest I have
@RunWith(SpringRunner.class)
@SpringBootTest
@Transactional
public class PostServiceTest {
    @Autowired
    IPostService postService;

    @Before
    public void initiate() {
        System.out.println("Initiating the before steps");
    }

    @Test
    public void testFindPosts() {
        List<Post> posts = postService.findPosts();
        Assert.assertNotNull("failure - expected Not Null", posts);
    }
}
The second class, UserServiceTest, has a similar structure.
When I run DataApplicationTests, it runs both classes.
I will assume you are using IntelliJ, but the same applies to all IDEs.
Gradle and Maven have a standardized project structure, which means all test classes placed under the test root will be run by mvn test (to just run the tests) or during the build (to check whether the code behaves correctly; in that case, if a test fails, the build fails too).
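For example (a short sketch; the class names are taken from the question, and the -Dtest parameter is Maven Surefire's standard way to select test classes):

# run every test under the test root
mvn test
# run only the two service test classes
mvn test -Dtest=PostServiceTest,UserServiceTest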
In IntelliJ, the test root directory is marked green in the project tree.
Your IDE should also allow you to run specific tests, test suites or classes separately, without the need to type out any command. IntelliJ provides gutter icons (next to the line numbers) that let you run that specific test or class: the green play buttons.
Be careful with creating test suites, though. Unless you manually configure which tests should be run, you will get duplicate runs, because the build tools run all the test suites independently and then all the tests. (That means that if test suite A contains test suites B and C, B and C are going to be run twice: once each from A, and once each independently. The same applies to standalone test classes.) One common fix is to exclude the suite classes from the build's test run, as sketched below.
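A sketch of that exclusion for Maven's Surefire plugin (the suite class name is taken from the question above; adjust the pattern to your build):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <excludes>
            <!-- keep the suite out of the default run so its members run only once -->
            <exclude>**/DataApplicationTests.java</exclude>
        </excludes>
    </configuration>
</plugin>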

Can I exclude logging in JUnit tests on a ZK controller using Log4j, and not look for a file?

I need some help on logging and unit testing. The class under test is a ZK GenericForwardComposer, and I want to exclude concrete logging and logging configuration from the test. I'm following a kind of TDD and have a test failure because of logging. I've posted the class under test and the test. My test doesn't have any configuration for Log4j because I want as pure a unit test as possible, and as simple as possible.
The test failure:
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /log/t2-console.log (No such file or directory)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:212)
at java.io.FileOutputStream.<init>(FileOutputStream.java:136)
at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
at org.apache.log4j.RollingFileAppender.setFile(RollingFileAppender.java:207)
at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
at org.apache.log4j.xml.DOMConfigurator.parseAppender(DOMConfigurator.java:295)
at org.apache.log4j.xml.DOMConfigurator.findAppenderByName(DOMConfigurator.java:176)
at org.apache.log4j.xml.DOMConfigurator.findAppenderByReference(DOMConfigurator.java:191)
at org.apache.log4j.xml.DOMConfigurator.parseChildrenOfLoggerElement(DOMConfigurator.java:523)
at org.apache.log4j.xml.DOMConfigurator.parseCategory(DOMConfigurator.java:436)
at org.apache.log4j.xml.DOMConfigurator.parse(DOMConfigurator.java:1004)
at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:872)
at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:778)
at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
at org.apache.log4j.Logger.getLogger(Logger.java:117)
at com.t2.integration.controller.IntegrationSearchController.<clinit>(IntegrationSearchController.java:60)
at com.t2.integration.controller.IntegrationSearchControllerTest.doesSomeCalling(IntegrationSearchControllerTest.java:14)
I haven't configured the unit test for Log4j and I want to follow Red-Green-Refactor. I think I could handle the logging call in the test but want to find a way to exclude logging entirely if that's possible.
public class IntegrationSearchControllerTest {
    @Test
    public void doesSomeCalling() {
        IntegrationSearchController searchController = new IntegrationSearchController();
    }
}
I don't want any ZK context or ZK integration testing components to leak into my unit tests. And I want the tests to be as simple as possible. Is it AOP, interfaces, dependency injection or refactoring?
The class under test:
package ...
import org.apache.log4j.Logger;
public class IntegrationSearchController extends IntegrationBaseController {
private static final Logger LOGGER = Logger.getLogger(IntegrationSearchController.class);
...
The controller is ZK-managed.
As your Logger is static (which is normal), you will need to use a mocking framework that can mock statics. This is much more difficult to do (from a framework perspective, as it usually involves manipulating byte code). Anyway, it has been done and you have options.
Here's how it would look using PowerMock with Mockito:
import static org.mockito.Matchers.any;
import static org.mockito.Mockito.mock;
import static org.powermock.api.mockito.PowerMockito.mockStatic;
import static org.powermock.api.mockito.PowerMockito.when;

import org.apache.log4j.Logger;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;

@RunWith(PowerMockRunner.class)
@PrepareForTest(Logger.class)
public class TestIntegrationSearchController {
    @Before
    public void initMockLogger() {
        mockStatic(Logger.class);
        when(Logger.getLogger(any(Class.class))).thenReturn(mock(Logger.class));
    }

    @Test
    public void test() {
        IntegrationSearchController controller = new IntegrationSearchController();
        // controller.LOGGER is a Mockito-mocked Logger
    }
}
Note: you don't need to set up the mocking in an @Before, but I find it reads more easily.
Now, all that said, I think you're solving the wrong problem here. Rather than mocking logging in every test, consider alternatives that obviate the need. For example, you could use a different Log4j configuration in tests, one which logs to STDOUT.
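A sketch of such a test-only configuration, assuming a Maven/Gradle layout where src/test/resources precedes the main resources on the test classpath (your stack trace shows an XML config in use, so the override here is also XML):

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<!-- src/test/resources/log4j.xml: log to the console instead of a file -->
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
    <appender name="stdout" class="org.apache.log4j.ConsoleAppender">
        <layout class="org.apache.log4j.PatternLayout">
            <param name="ConversionPattern" value="%d [%t] %-5p %c - %m%n"/>
        </layout>
    </appender>
    <root>
        <priority value="INFO"/>
        <appender-ref ref="stdout"/>
    </root>
</log4j:configuration>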

Using Spring Framework's @Autowired for instantiating and injecting the SUT (System Under Test) in the test fixture

I have seen developers using Spring's @Autowired feature to make the Spring framework responsible for instantiating and injecting the SUT (System Under Test) or CUT (Class Under Test) in the test class or fixture. The following snippet shows @Autowired being used in the test fixture:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.annotation.ExpectedException;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.testng.AbstractTestNGSpringContextTests;
import org.testng.annotations.Test;

interface BackingStore
{
    // Some methods
}

class Logger
{
    public Logger(BackingStore backingStore)
    {
        // Capture input parameter into member variables
    }

    public void Log(String message)
    {
        // Some logic
    }
}

@ContextConfiguration(locations = { "classpath:someconfig.xml" })
public class LoggerTests extends AbstractTestNGSpringContextTests
{
    @Autowired
    private Logger _logger;

    @Test
    @ExpectedException(NullPointerException.class)
    public void Log_WhenPassedNull_ShouldThrowException()
    {
        _logger.Log(null);
    }
}
All the dependencies (recursively) required by the SUT are specified in the Spring configuration XML file. I do not like this approach. I like all tests (unit or integration) to read like a story (I heard Kent Beck say the same thing :)). I like instantiating the SUT/CUT in the test case itself, even if it is complex. This gives a clear picture of the test case.
I have a few concerns regarding @Autowired, or any auto-injection mechanism, being used to inject the SUT into the test fixture:
1. It reduces test code readability. @Autowired appears like magic. The Arrange of AAA (Arrange-Act-Assert) moves out of the test code into the XML file.
2. It reduces test code maintainability, as a consequence of #1.
3. It cannot effectively verify that the constructor of the SUT/CUT throws an exception in exceptional cases. I am not 100% sure about this; I do not know whether the Spring framework has an answer for it.
4. It seems overkill for unit or integration tests.
I ask the experts for their two cents on this subject.
Thanks.
It only reduces test code readability if you don't know where to look for the @Autowired objects. I would advise using the SpringJUnit4ClassRunner, which defines the test application context at the top of the unit test class.
Using dependency injection in your test cases allows you to easily test with different objects.
E.g. if your code requires a third-party service, you can use dependency injection (e.g. autowiring a Spring bean) to inject a mocked service for your unit tests and the real service for the application, as in the sketch below.
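A minimal sketch of that idea, reusing the BackingStore and Logger types from the question (the configuration class name is illustrative):

import org.mockito.Mockito;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LoggerTestConfig {

    // In tests, a Mockito mock stands in for the real backing store
    @Bean
    public BackingStore backingStore() {
        return Mockito.mock(BackingStore.class);
    }

    // The SUT is still wired by Spring, but against the mock
    @Bean
    public Logger logger(BackingStore backingStore) {
        return new Logger(backingStore);
    }
}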
So, for this reason, it definitely doesn't decrease test code maintainability; it actually encourages loose coupling between the code you're testing and any external objects.
It may be overkill to inject all objects in such a way, but it is definitely not overkill for unit tests/integration tests in general.

Maven report on all tests at the method level

I want to generate a report after Maven runs the tests that displays a list of all tests that were run. I don't want the tests to be grouped by class; I just want a list of executed tests. For example, if I have
class Test1 {
    @Test
    public void test1() {
    }
}

class Test2 {
    @Test
    public void test2() {
    }
}
After running the tests I want a report that looks like:
Test1.test1 PASSED time:2sec
Test2.test2 FAILED time:1sec
Is there a Maven report plugin that does this, or any other way this can be done?
I've looked at the Maven test report format, but it seems you have to click class by class to see the failing methods.
I would appreciate any suggestions.
I researched the documentation for both Surefire and Cobertura, and there does not appear to be a built-in report in the format you seek.
However, the TEST-TestSuite.xml file generated by Surefire contains an entry for every individual test case at the method level. So your best bet is to develop a trivial transform that converts this file into the format you prefer, as sketched below.
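A short sketch of such a transform in plain Java (the report file name is an example; Surefire reports use <testcase> elements with classname, name and time attributes, and nested <failure>/<error> elements on failure):

import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class SurefireReportFlattener {
    public static void main(String[] args) throws Exception {
        // e.g. target/surefire-reports/TEST-TestSuite.xml
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new File(args[0]));
        NodeList cases = doc.getElementsByTagName("testcase");
        for (int i = 0; i < cases.getLength(); i++) {
            Element tc = (Element) cases.item(i);
            boolean failed = tc.getElementsByTagName("failure").getLength() > 0
                    || tc.getElementsByTagName("error").getLength() > 0;
            // prints e.g. "Test1.test1 PASSED time:2sec"
            System.out.printf("%s.%s %s time:%ssec%n",
                    tc.getAttribute("classname"),
                    tc.getAttribute("name"),
                    failed ? "FAILED" : "PASSED",
                    tc.getAttribute("time"));
        }
    }
}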
