This function queries the database to search for a specific category. The test case I wrote for it should cover the entire method, but when I check the coverage in Eclipse with EclEmma, one specific line is still shown in red. Can someone help me fix this?
@Override
public List<Services> searchCategory(String name) throws CategoryNameNotFoundException {
    logger.info("{}.{}", new ServicesBoImpl().getClass().getPackageName(), new ServicesBoImpl().getClass().getName());
    logger.info("Function: searchCategory(), Information: querying the database for the search categories");
    List<Services> searchCategory = jdbcTemplate.query(env.getProperty("searchCategory"), new PreparedStatementSetter() {
        @Override
        public void setValues(PreparedStatement ps) throws SQLException {
            ps.setString(1, name + "%");
        }
    }, new SearchCategoryRowMapper());
    if (searchCategory.size() == 0) {
        logger.info("Function: searchCategory(), Information: Throwing CategoryNameNotFoundException because the particular category is not found");
        throw new CategoryNameNotFoundException("Category Not Found");
    }
    return searchCategory;
}
What logic are you trying to cover with this unit test? No logic of the method under test is executed inside it...
Assuming that serviceBoImpl is a mock:
@Test(expected = CategoryNameNotFoundException.class)
public void testIfSearchCategoryThrowsException() {
    Mockito.doThrow(new CategoryNameNotFoundException("Category Not Found")).when(serviceBoImpl).searchCategory("a!#");
    // ...here goes the execution that calls this method...
}
Remember that this test will not cover the logic of the searchCategory method itself, since the mocked call throws the exception as soon as it is executed.
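If the goal is to cover the red line inside searchCategory itself, the usual approach is to mock its collaborators and drive the method into the empty-result branch. Below is a rough sketch, assuming the class under test is ServicesBoImpl and that its JdbcTemplate and Environment can be injected (the names and wiring are guesses based on the snippet, and the usual Mockito/JUnit 5 static imports are assumed):
@ExtendWith(MockitoExtension.class)
class ServicesBoImplTest {

    @Mock JdbcTemplate jdbcTemplate;
    @Mock Environment env;
    @Mock PreparedStatement preparedStatement;

    @InjectMocks ServicesBoImpl servicesBoImpl;

    @Test
    void searchCategoryThrowsWhenNoRowsMatch() {
        when(env.getProperty("searchCategory")).thenReturn("select ..."); // the real SQL is not needed here
        when(jdbcTemplate.query(anyString(), any(PreparedStatementSetter.class), any(RowMapper.class)))
                .thenAnswer(invocation -> {
                    // Execute the PreparedStatementSetter so ps.setString(1, name + "%") is covered,
                    // then return an empty list so the exception branch runs.
                    PreparedStatementSetter setter = invocation.getArgument(1);
                    setter.setValues(preparedStatement);
                    return Collections.emptyList();
                });

        assertThrows(CategoryNameNotFoundException.class, () -> servicesBoImpl.searchCategory("unknown"));
        verify(preparedStatement).setString(1, "unknown%");
    }
}
The thenAnswer is what makes the anonymous PreparedStatementSetter actually execute; without it, that line stays red.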
I am trying to mock the queryForObject method using Mockito. The unit test passes, but the lines inside the row-mapping lambda are not covered.
The code to get the People object is below:
jdbcTemplate.queryForObject(GET_PEOPLE,
(rs, rowNum) -> new People()
.setId(rs.getInt("id"))
.setFirstName(rs.getString("first_name"))
.setLastName(rs.getString("last_name")),
department, position);
FYI: GET_PEOPLE is a static constant containing the SQL query.
and the unit test is:
People people = new People();
people.setId(1);
people.setFirstName("John");
people.setLastName("Doe");
when(jdbcTemplate.queryForObject(any(), (RowMapper<Object>) any(), any())).thenReturn(people);
Can anyone let me know how to set up the mock so that these lines are fully covered? Thanks in advance.
You are not getting coverage because you never execute that code.
You need to invoke your RowMapper:
@ExtendWith(MockitoExtension.class)
public class MyTest {

    @Mock
    JdbcTemplate jdbcTemplate;

    @Mock
    ResultSet resultSet;

    @Test
    public void myTest() throws SQLException {
        when(resultSet.getInt(1)).thenReturn(1);
        ...
        when(jdbcTemplate.queryForObject(any(), (RowMapper<People>) any(), any())).thenAnswer(new Answer<People>() {
            @Override
            public People answer(InvocationOnMock invocationOnMock) throws Throwable {
                RowMapper<People> rowMapper = invocationOnMock.getArgument(1);
                return rowMapper.mapRow(resultSet, 1);
            }
        });
        ...
    }
}
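One detail worth noting: the ResultSet stubs should use the same column names the lambda reads, otherwise the mapped object comes back empty. A minimal sketch of the remaining pieces (dao.getPeople is a placeholder for whichever of your methods wraps the queryForObject call):
when(resultSet.getInt("id")).thenReturn(1);
when(resultSet.getString("first_name")).thenReturn("John");
when(resultSet.getString("last_name")).thenReturn("Doe");

People people = dao.getPeople("IT", "developer"); // hypothetical wrapper around queryForObject
assertEquals("John", people.getFirstName());      // proves the lambda actually ran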
I'm unit testing a Spring Boot web app with Mockito. One of my methods returns void, and when I try to test it I get compilation errors.
This is the test I wrote:
public void testDeleteActor()throws NetflixException {
when(actorRepository.findById(1L)).thenReturn(Optional.of(Mockito.any(Actor.class)));
assertEquals(null, service.deleteActorById(Mockito.anyLong());
}
And this is the method I'm trying to test:
@Override
public void deleteActorById(Long id) throws NetflixException {
    Actor actor = actorRepository
            .findById(id)
            .orElseThrow(() -> new NotFoundException("Actor id not found - " + id));
    actorRepository.delete(actor);
}
The error is reported on the assertEquals() statement.
In your code, your method isn't actually returning null; it returns nothing (void). That means you can't write assertions based on what the method returns, which is why the assertEquals() statement gives you an error.
Instead of testing what the method returns, you can test the expected behaviour of the method. In this example, there are three things we expect:
The method should retrieve the actor by its ID.
The method should throw an exception if no actor was found with the given ID.
The method should delete the actor if it was found.
To implement these tests, you can use Mockito's verify() and AssertJ's assertThatExceptionOfType(). For example:
@Test
void deleteActorById_retrievesActorByID() {
    Actor actor = new Actor();
    when(actorRepository.findById(1L)).thenReturn(Optional.of(actor));
    service.deleteActorById(1L);
    verify(actorRepository).findById(1L);
}

@Test
void deleteActorById_throwsExceptionIfIDNotFound() {
    assertThatExceptionOfType(NotFoundException.class)
        .isThrownBy(() -> service.deleteActorById(1L))
        .withMessage("Actor id not found - 1");
}

@Test
void deleteActorById_deletesActorIfIDFound() {
    Actor actor = new Actor();
    when(actorRepository.findById(1L)).thenReturn(Optional.of(actor));
    service.deleteActorById(1L);
    verify(actorRepository).delete(actor);
}
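If you also want to make the negative path explicit, a small extra test along the same lines (a sketch using the same mocks as above) can check that nothing is deleted when the actor is missing:
@Test
void deleteActorById_doesNotDeleteWhenIDNotFound() {
    when(actorRepository.findById(1L)).thenReturn(Optional.empty());
    assertThatExceptionOfType(NotFoundException.class)
        .isThrownBy(() -> service.deleteActorById(1L));
    verify(actorRepository, never()).delete(any());
}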
I have created JUnit test cases for an Elasticsearch CRUD operation; the code is given below. During code review my team pointed out that I have covered all the positive scenarios but none of the negative ones, and I am not sure how to handle the negative cases.
@Test
void findById() throws Exception {
    EmployeeInformation EmployeeGet = Eservice.findById("elcrud", "2");
    assertNotNull(EmployeeGet.getId());
    assertNotNull(EmployeeGet.getFirstName());
    assertNotNull(EmployeeGet.getLastName());
}

@Test
void deleteProfileDocument() throws Exception {
    String Result = Eservice.deleteProfileDocument("elcrud", "3");
    System.out.println(Result);
    assertEquals(Result, "DELETED");
}

@Test
void search() throws Exception {
    List<EmployeeInformation> Emp = Eservice.searchByTechnology("Lucidworks", "elcrud");
    System.out.println(Emp.size());
    int Result = Emp.size();
    assertTrue(Result >= 0);
}

@Test
void searchByName() throws Exception {
    List<EmployeeInformation> Emp = Eservice.findProfileByName("junit", "elcrud");
    System.out.println(Emp.size());
    int Result = Emp.size();
    assertTrue(Result >= 0);
}
Could someone help me implement the negative-scenario JUnit test cases for the code above?
I think your tech team wants tests for when the ES operation fails, gives no results, or hits some other unexpected scenario.
One example could be the deletion of a profile document:
You already cover the case where the delete operation succeeds, but you don't have a test for when it fails or is not handled successfully.
@Test
void deleteProfileDocument() throws Exception {
    // here you delete a profile which is NOT in the index
    String Result = Eservice.deleteProfileDocument("elcrud", "3");
    System.out.println(Result);
    // and here you assert the negative result (not sure which result will come back)
    assertEquals(Result, "NOT_FOUND");
}
I also see that your methods throw an exception in case of an error. That could be another good negative test scenario: if an operation sent to ES throws an exception, you can write a test that expects this exception.
For JUnit 4 you could use the following:
@Test(expected = YourExpectedException.class)
For JUnit 5 you could use this:
Exception exception = assertThrows(YourExpectedException.class, () -> {
    Eservice.findProfileByName(/* a name that triggers the exception */, "elcrud");
});
String expectedMessage = "expected message";
String actualMessage = exception.getMessage();
assertTrue(actualMessage.contains(expectedMessage));
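A concrete application of that pattern to the service above could look like the sketch below; the exception type and message depend on how Eservice wraps Elasticsearch errors, so adjust both to what your implementation actually throws:
@Test
void deleteProfileDocumentFailsForMissingIndex() {
    Exception exception = assertThrows(Exception.class, () -> {
        // "no_such_index" is a hypothetical index name that does not exist
        Eservice.deleteProfileDocument("no_such_index", "3");
    });
    assertNotNull(exception.getMessage());
}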
Edit, in response to the additional questions:
If you have a positive and a negative test, I would suggest keeping them in separate test methods, so there is no need to comment things in and out.
In your case it could look like this:
Positive:
@Test
void testUpdateItem_POSITIVE() throws Exception {
    ....
}
Negative:
@Test
void testUpdateItem_NEGATIVE() throws Exception {
    ....
}
@Test(expected = NotFound.class)
void shouldThrowExceptionWhenProfileIsNotExist() throws Exception {
    Eservice.findById("elcrud", "some_id");
}
Instead of NotFound, use whichever exception is thrown when you search for a profile with an id that does not exist.
Is there a way to get the reason a HystrixCommand failed when using the @HystrixCommand annotation within a Spring Boot application? It looks like if you implement your own HystrixCommand, you have access to the getFailedExecutionException, but how can you get access to this when using the annotation? I would like to be able to do different things in the fallback method based on the type of exception that occurred. Is this possible?
I saw a note about HystrixRequestContext.initializeContext() but the HystrixRequestContext doesn't give you access to anything, is there a different way to use that context to get access to the exceptions?
Simply add a Throwable parameter to the fallback method and it will receive the exception which the original command produced.
From https://github.com/Netflix/Hystrix/tree/master/hystrix-contrib/hystrix-javanica
@HystrixCommand(fallbackMethod = "fallback1")
User getUserById(String id) {
    throw new RuntimeException("getUserById command failed");
}

@HystrixCommand(fallbackMethod = "fallback2")
User fallback1(String id, Throwable e) {
    assert "getUserById command failed".equals(e.getMessage());
    throw new RuntimeException("fallback1 failed");
}
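Building on that, the fallback can branch on the type of the Throwable it receives, which is what the question is ultimately after. A rough sketch (userClient and User.unknown() are placeholders, and the exact timeout exception type may vary by Hystrix version):
@HystrixCommand(fallbackMethod = "getUserByIdFallback")
public User getUserById(String id) {
    return userClient.fetch(id); // placeholder remote call
}

public User getUserByIdFallback(String id, Throwable cause) {
    if (cause instanceof HystrixTimeoutException) {
        // the command timed out
    } else if (cause instanceof RuntimeException) {
        // the command itself failed
    }
    return User.unknown(); // placeholder default value
}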
I haven't found a way to get the exception with Annotations either, but creating my own Command worked for me like so:
public static class DemoCommand extends HystrixCommand<String> {

    protected DemoCommand() {
        super(HystrixCommandGroupKey.Factory.asKey("Demo"));
    }

    @Override
    protected String run() throws Exception {
        throw new RuntimeException("failed!");
    }

    @Override
    protected String getFallback() {
        System.out.println("Events (so far) in Fallback: " + getExecutionEvents());
        return getFailedExecutionException().getMessage();
    }
}
Hopefully this helps someone else as well.
As said in the documentation (Hystrix-documentation), the getFallback() method will be invoked when:
Whenever a command execution fails: when an exception is thrown by construct() or run()
When the command is short-circuited because the circuit is open
When the command’s thread pool and queue or semaphore are at capacity
When the command has exceeded its timeout length.
So you can easily find out what caused your fallback to be called by assigning the execution exception to a Throwable object.
Assuming your HystrixCommand returns a String
public class ExampleTask extends HystrixCommand<String> {
//Your class body
}
do as follows:
@Override
protected String getFallback() {
    Throwable t = getExecutionException();
    if (circuitBreaker.isOpen()) {
        // Log or something
    } else if (t instanceof RejectedExecutionException) {
        // Log and get the thread pool name, could be useful
    } else {
        // Maybe something else happened
    }
    return "A default String"; // Avoid making any HTTP request here, or you will need to wrap it in a HystrixCommand as well
}
More info here
I couldn't find a way to obtain the exception with the annotations either, but I found HystrixPlugins; with that you can register a HystrixCommandExecutionHook and get the exact exception in it, like this:
HystrixPlugins.getInstance().registerCommandExecutionHook(new HystrixCommandExecutionHook() {
    @Override
    public <T> void onFallbackStart(final HystrixInvokable<T> commandInstance) {
    }
});
The command instance is a GenericCommand.
Most of the time just using getFailedExecutionException().getMessage() gave me null values.
Exception errorFromThrowable = getExceptionFromThrowable(getExecutionException());
String errMessage = (errorFromThrowable != null) ? errorFromThrowable.getMessage() : null;
This gives me better results all the time.
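Put together inside a command's getFallback(), that looks roughly like the sketch below (getExceptionFromThrowable and getExecutionException are the same helpers used above):
@Override
protected String getFallback() {
    // Unwrap the real cause before reading its message
    Exception cause = getExceptionFromThrowable(getExecutionException());
    String message = (cause != null) ? cause.getMessage() : "unknown failure";
    return "fallback: " + message;
}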
I am using JMeter with Java Request samplers. These call Java classes I have written which return a SampleResult object containing the timing metrics for the use case. SampleResult is a tree and can have child SampleResult objects (the SampleResult.addSubResult method). I can't seem to find a good way in JMeter to track the sub results, so I can only easily get the results for the parent SampleResult.
Is there a listener in JMeter that allows me to see statistics/graphs for sub results (for instance, the average time across all sub results with the same name)?
I have just succeeded in doing this and wanted to share it. If you follow the instructions here, it will work for you as well. I did this for the Summary Table listener, on Windows, using Eclipse.
Steps:
Go to JMeter's web site and download the source code. You can find that here, for version 3.0.
http://jmeter.apache.org/download_jmeter.cgi
Once there, I clicked the option to download the Zip file for the source.
Then, on that same page, download the binary for version 3.0, if you have not already done so. Then, extract that zip file onto your hard drive.
Once you've extracted the zip file to your hard drive, grab the file "SummaryReport.java". It can be found here: "\apache-jmeter-3.0\src\components\org\apache\jmeter\visualizers\SummaryReport.java"
Create a new class in Eclipse and copy/paste all of that code into it. Then rename your class from "SummaryReport" to a different name, and everywhere in the code replace "SummaryReport" with the new name of your class.
I am using Java 8, and there was one line of code that would not compile for me. It's the line below.
private final Map tableRows = new ConcurrentHashMap<>();
In my setup it only compiled after I removed the <> on that line.
There was one more line that gave a compile error. It was the one below.
CSVSaveService.saveCSVStats(StatGraphVisualizer.getAllTableData(model, FORMATS), writer,
        saveHeaders.isSelected() ? StatGraphVisualizer.getLabels(COLUMNS) : null);
Firstly, it wasn't finding the source for class StatGraphVisualizer. So, I imported it, as below.
import org.apache.jmeter.visualizers.StatGraphVisualizer;
Secondly, it wasn't finding the method "getLabels" in "StatGraphVisualizer.getLabels". Here is what this line of code looked like after I fixed it:
CSVSaveService.saveCSVStats(StatGraphVisualizer.getAllTableData(model, FORMATS),writer);
That compiles. That method doesn't need the second argument.
Now, everything should compile.
Find this method below. This is where you will begin adding your customizations.
@Override
public void add(final SampleResult res) {
You need to create an array of all of your sub results, as I did below. The line wrapped in ** is the new code. (All new code is marked with **.)
public void add(final SampleResult res) {
final String sampleLabel = res.getSampleLabel(); // useGroupName.isSelected());
**final SampleResult[] theSubResults = res.getSubResults();**
Then, create a String for each label for your sub results objects, as seen below.
**final String writesampleLabel = theSubResults[0].getSampleLabel(); // (useGroupName.isSelected());
final String readsampleLabel = theSubResults[1].getSampleLabel(); // (useGroupName.isSelected());**
Next, go to the method below.
JMeterUtils.runSafe(false, new Runnable() {
    @Override
    public void run() {
The new code added is below, marked with **.
JMeterUtils.runSafe(false, new Runnable() {
    @Override
    public void run() {
        Calculator row = null;
        **Calculator row1 = null;
        Calculator row2 = null;**
        synchronized (lock) {
            row = tableRows.get(sampleLabel);
            **row1 = tableRows.get(writesampleLabel);
            row2 = tableRows.get(readsampleLabel);**
            if (row == null) {
                row = new Calculator(sampleLabel);
                tableRows.put(row.getLabel(), row);
                model.insertRow(row, model.getRowCount() - 1);
            }
            **if (row1 == null) {
                row1 = new Calculator(writesampleLabel);
                tableRows.put(row1.getLabel(), row1);
                model.insertRow(row1, model.getRowCount() - 1);
            }
            if (row2 == null) {
                row2 = new Calculator(readsampleLabel);
                tableRows.put(row2.getLabel(), row2);
                model.insertRow(row2, model.getRowCount() - 1);
            }**
        } // close lock
        /*
         * Synch is needed because multiple threads can update the counts.
         */
        synchronized (row) {
            row.addSample(res);
        }
        **synchronized (row1) {
            row1.addSample(theSubResults[0]);
        }**
        **synchronized (row2) {
            row2.addSample(theSubResults[1]);
        }**
That is all that needs to be customized.
In Eclipse, export your new class into a jar file. Then place it inside the lib/ext folder of the JMeter binary that you extracted in Step 1 above.
Start up Jmeter, as you normally would.
In your Java sampler, add a new Listener. You will now see two "Summary Table" listeners. One of these will be the new one that you have just created. Once you have brought that new one into your Java Sampler, rename it to something unique. Then run your test and look at your new "Summary Table" listener. You will see summary results/stats for all of your sample results.
My next step is to perform these same steps for all of the other Listeners that I would like to customize.
I hope that this post helps.
Here is some of my plugin code, which you can use as a starting point for writing your own plugin. I can't really post everything as there are dozens of classes. A few things to know:
my plugin, like all visualizer plugins, extends the JMeter class AbstractVisualizer
you need the following jars in Eclipse to compile: jfxrt.jar, ApacheJMeter_core.jar
you need Java 1.8 for JavaFX (the jar file ships with the JDK)
if you compile a plugin you need to put it in jmeter/lib/ext.
You also need to put the jars from the previous bullet in jmeter/lib
there is a method called "add(SampleResult)" in my class. This gets called by the JMeter framework every time a Java sample completes, and the SampleResult is passed in as a parameter. Assuming you have your own Java sampler classes that extend AbstractJavaSamplerClient, your class will have a method called runTest which returns a SampleResult. That same return object will be passed into your plugin's add method.
my plugin puts all the sample results into a buffer and only updates the screen every 5 results.
Here is the code:
import java.awt.BorderLayout;
import java.util.ArrayList;
import java.util.List;
import javafx.application.Platform;
import javafx.embed.swing.JFXPanel;
import javax.swing.border.Border;
import javax.swing.border.EmptyBorder;
import org.apache.jmeter.samplers.SampleResult;
import org.apache.jmeter.testelement.TestStateListener;
import org.apache.jmeter.visualizers.gui.AbstractVisualizer;
public class FxVisualizer extends AbstractVisualizer implements TestStateListener {
int currentId = 0;
/**
*
*/
private static final long serialVersionUID = 1L;
private static final int BUFFER_SIZE = 5;
@Override
public String getName()
{
return super.getName();//"George's sub result viewer.";
}
@Override
public String getStaticLabel()
{
return "Georges FX Visualizer";
}
@Override
public String getComment()
{
return "George wrote this plugin. There are many plugins like it but this one is mine.";
}
static Long initCount = new Long(0);
public FxVisualizer()
{
init();
}
private void init()
{
//LoggingUtil.debug("in FxVisualizer init()");
try
{
FxTestListener.setListener(this);
this.setLayout(new BorderLayout());
Border margin = new EmptyBorder(10, 10, 5, 10);
this.setBorder(margin);
//this.add(makeTitlePanel(), BorderLayout.NORTH);
final JFXPanel fxPanel = new JFXPanel();
add(fxPanel);
//fxPanel.setScene(getScene());
Platform.runLater(new Runnable() {
@Override
public void run() {
initFX(fxPanel);
}
});
}
catch(Exception e)
{
e.printStackTrace();
}
}
static FxVisualizerScene fxScene;
private static void initFX(JFXPanel fxPanel) {
// This method is invoked on the JavaFX thread
fxScene = new FxVisualizerScene();
fxPanel.setScene(fxScene.getScene());
}
final List <Event> bufferedEvents = new ArrayList<Event>();
@Override
public void add(SampleResult result)
{
final List <Event> events = ...;//here you need to take the result.getSubResults() parameter and get all the children events.
final List<Event> eventsToAdd = new ArrayList<Event>();
synchronized(bufferedEvents)
{
for (Event evt : events)
{
bufferedEvents.add(evt);
}
if (bufferedEvents.size() >= BUFFER_SIZE)
{
eventsToAdd.addAll(bufferedEvents);
bufferedEvents.clear();
}
}
if (eventsToAdd.size() > 0)
{
Platform.runLater(new Runnable() {
@Override
public void run() {
updatePanel(eventsToAdd);
}
});
}
}
public void updatePanel(List <Event> events )
{
for (Event evt: events)
{
fxScene.addEvent(evt);
}
}
@Override
public void clearData()
{
synchronized(bufferedEvents)
{
Platform.runLater(new Runnable() {
@Override
public void run() {
bufferedEvents.clear();
fxScene.clearData();
}
});
}
}
@Override
public String getLabelResource() {
return "Georges Java Sub FX Sample Listener";
}
Boolean isRunning = false;
@Override
public void testEnded()
{
final List<Event> eventsToAdd = new ArrayList<Event>();
synchronized(bufferedEvents)
{
eventsToAdd.addAll(bufferedEvents);
bufferedEvents.clear();
}
if (eventsToAdd.size() > 0)
{
Platform.runLater(new Runnable() {
@Override
public void run() {
updatePanel(eventsToAdd);
fxScene.testStopped();
}
});
}
}
Long testCount = new Long(0);
@Override
public void testStarted() {
synchronized(bufferedEvents)
{
Platform.runLater(new Runnable() {
@Override
public void run() {
updatePanel(bufferedEvents);
bufferedEvents.clear();
fxScene.testStarted();
}
});
}
}
@Override
public void testEnded(String arg0)
{
//LoggingUtil.debug("testEnded 2:" + arg0);
testEnded();
}
int registeredCount = 0;
@Override
public void testStarted(String arg0) {
//LoggingUtil.debug("testStarted 2:" + arg0);
testStarted();
}
}
OK, so I just decided to write my own JMeter plugin, and it is dead simple. I'll share the code for posterity when it is complete. Just write a class that extends AbstractVisualizer, compile it into a jar, then drop it into the JMeter lib/ext directory. That plugin will show up in the listeners section of JMeter when you go to add visualizers.
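For anyone who wants the bare minimum to start from in the meantime, here is a hedged skeleton (the class name, labels, and resource key are placeholders; it only prints sub-result labels and times):
import java.awt.BorderLayout;
import org.apache.jmeter.samplers.SampleResult;
import org.apache.jmeter.visualizers.gui.AbstractVisualizer;

public class SubResultVisualizer extends AbstractVisualizer {

    public SubResultVisualizer() {
        setLayout(new BorderLayout());
        add(makeTitlePanel(), BorderLayout.NORTH);
    }

    @Override
    public String getLabelResource() {
        return "sub_result_visualizer"; // placeholder resource key
    }

    @Override
    public String getStaticLabel() {
        return "Sub Result Visualizer";
    }

    @Override
    public void add(SampleResult result) {
        // Called by JMeter for every parent sample; walk the children here.
        for (SampleResult sub : result.getSubResults()) {
            System.out.println(sub.getSampleLabel() + ": " + sub.getTime() + " ms");
        }
    }

    @Override
    public void clearData() {
        // Reset any accumulated state when the user clears the results.
    }
}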