Hibernate will not remove item in delete test method in Spring app

I'm working on integration testing of a Spring application,
and I'm trying to test a simple delete method.
I first add an item (in my case a sector) and then test its deletion.
However, when the test method runs, the delete is called but the item is not actually removed, and I do not know why.
Location and Sector are in a one-to-many / many-to-one relationship, where one location has many sectors.
Currently I have 7 sectors in the test database, and the newly added one becomes the 8th, with id 8 because the id is generated as an identity column.
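For reference, a minimal sketch of the mapping as described; the annotations, column names, and field names below are assumptions, since the entities are not shown in the question:

import java.util.*;
import javax.persistence.*;

// Location.java - one location has many sectors
@Entity
public class Location {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @OneToMany(mappedBy = "location")
    private List<Sector> sectors = new ArrayList<>();
}

// Sector.java - many sectors belong to one location
@Entity
public class Sector {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @ManyToOne
    @JoinColumn(name = "location_id")
    private Location location;
}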
This is my test method (the test class has @Transactional on it):
@Test
public void deleteSectorTest() {
    Sector sector = sectorService.addSector(SectorConst.newDtoToAdd());
    int sizeBeforeDel = sectorRepository.findAll().size();
    sectorService.deleteSector(8L);
    int sizeAfterDel = sectorRepository.findAll().size();
    assertEquals(sizeBeforeDel - 1, sizeAfterDel);
}
These are my add and delete sector methods in the sector service:
public Sector addSector(SectorDto sd) {
    Sector sector = new Sector();
    if (locationService.findOneLocation(sd.getLocationId()) == null) {
        throw new LocationNotFoundException("Location with id : " + sd.getLocationId() + " not found.");
    }
    Location l = locationService.findOneLocation(sd.getLocationId());
    sector = mapFromDto(sd);
    sector.setLocation(l);
    l.getSectors().add(sector);
    sectorRepository.save(sector);
    return sector;
}

public void deleteSector(Long id) {
    sectorRepository.deleteById(id);
}
And here is my error in console:
If anyone knows what the problem is I would appreciate it, thanks!
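As a side note, one variant of the test that is sometimes used to rule out flush timing: flush explicitly after the delete, and use the generated id instead of hard-coding 8L. This is only a sketch under the assumption that SectorRepository extends JpaRepository (which exposes flush()), not a confirmed fix:

@Test
public void deleteSectorTest() {
    Sector sector = sectorService.addSector(SectorConst.newDtoToAdd());
    int sizeBeforeDel = sectorRepository.findAll().size();

    sectorService.deleteSector(sector.getId()); // use the generated id instead of assuming 8L
    sectorRepository.flush();                   // push the pending DELETE to the database now

    int sizeAfterDel = sectorRepository.findAll().size();
    assertEquals(sizeBeforeDel - 1, sizeAfterDel);
}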

Related

Spring data Page/Pageable returns duplicates on large data sets?

When operating on large data sets, Spring Data offers two abstractions: Stream and Page. We've been using Stream for a while with no issues, but recently I wanted to try a paginated approach and ran into a reliability issue.
Consider the following:
@Entity
public class MyData {
}

public interface MyDataRepository extends JpaRepository<MyData, UUID> {
}

@Component
public class MyDataService {

    private MyDataRepository repository;

    // Bridge between a Reactive service and a transactional / non-reactive database call
    @Transactional
    public void getAllMyData(final FluxSink<MyData> sink) {
        final Pageable firstPage = PageRequest.of(0, 500);
        Page<MyData> page = repository.findAll(firstPage);
        while (page != null && page.hasContent()) {
            page.getContent().forEach(sink::next);
            if (page.hasNext()) {
                page = repository.findAll(page.nextPageable());
            } else {
                page = null;
            }
        }
        sink.complete();
    }
}
Using two Postgres 9.5 databases, the source database had close to 100,000 rows while the destination was empty. The example code was then used to copy from the source to the destination. At the end I would find that my destination database had a far smaller row count than the source.
Run as a Spring Boot app
The flux doing the copy was using 4-6 threads in parallel (for speed)
Total run time of at least an hour (max was 2 hours)
As it turns out, I was eventually processing the same rows multiple times (and missing other rows as a result). This led me to a fix that others had already run into: you should provide a Sort.by(...) argument.
After changing the service to use:
// Make our pages sorted by the PKEY
final Pageable firstPage = PageRequest.of(0, 500, Sort.by("id"));
I found that while it GREATLY helped, I would still process some rows more than once (going from losing about half the rows to only seeing ~12 duplicates). When I use a Stream instead, I have no issues.
Does anyone have any explanation for what is going on? I don't seem to have any duplicates come through until the test has been running for at least 10-15 minutes, which almost leads me to believe that there is some kind of session or other timeout happening (either in the client or on the database) that causes the hiccups. But I'm really far out of my knowledge area for troubleshooting it further, heh.
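For comparison, a minimal sketch of the Stream-based variant mentioned above as working reliably; the query method name is an assumption, not taken from the question:

import java.util.stream.Stream;

public interface MyDataRepository extends JpaRepository<MyData, UUID> {
    // Streams results instead of paging; must be consumed inside an open transaction.
    @Query("select m from MyData m")
    Stream<MyData> streamAll();
}

// In the service:
@Transactional(readOnly = true)
public void getAllMyData(final FluxSink<MyData> sink) {
    try (Stream<MyData> stream = repository.streamAll()) {
        stream.forEach(sink::next);
    }
    sink.complete();
}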

Long running Spring Service is locking DB table

I have a Spring Service that goes through multiple items in a list and, for each one, makes an extra WS call to external services. The Service is called by a Job on a fixed time interval.
As a first step, the Service saves the status of the Job (STARTED) in a JOB_CONTROL table, then it iterates through the list, and at the end it saves the status as (FINISHED).
There are 2 issues:
the JOB_CONTROL table doesn't get updated gradually - only the "FINISHED" value is saved, never "STARTED"
if the flush method is used to force the commit, the table gets locked, e.g. no other select can be made on it until the Service finishes
@Service
public class PromotionSchedulerService implements Runnable {

    @Autowired
    GeofencingAreaDAO storeDao;
    @Autowired
    PromotionsWSClient promotionsWSClient;
    @Autowired
    private JobControlDAO jobControlDAO;

    public void run() {
        JobControl job = jobControlDAO.findByClassName(this.getClass().getSimpleName());
        job.setState(JobControlStateTypes.RUNNING.getStateType());
        job.setLastRunDate(new Date());
        // THE LINE BELOW DOES NOT GET COMMITTED TO THE DB
        jobControlDAO.save(job);

        List<GeofencingArea> stores = storeDao.findAllStores();
        for (GeofencingArea store : stores) {
            /** Call WS **/
            GetActivePromotionsResponse rsp = null;
            try {
                rsp = promotionsWSClient.getpromotions();
            } catch (Exception e) {
                e.printStackTrace();
                job.setState(JobControlStateTypes.FAILED.getStateType());
                job.setLastRunStatus("There was an error calling promagic promotions");
                jobControlDAO.save(job);
                return;
            }
            List<PromotionBean> promos = rsp.getReturn();
            for (PromotionBean promo : promos) {
                BackendPromotionPOJO backendPromotionsPOJO = new BackendPromotionPOJO();
                backendPromotionsPOJO.setDescription(promo.getDescription());
            }
        }
        // ONLY THIS JOB STATE GOES TO THE DB. IT ACTUALLY SEEMS TO OVERWRITE THE "RUNNING" VALUE SET ABOVE
        job.setLastRunStatus("COMPLETED");
        job.setState(JobControlStateTypes.SUCCESS.getStateType());
        jobControlDAO.save(job);
    }
}
I would like to force the commit after changing the job state, without locking the table when doing so.
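One common approach, sketched here only as an illustration, is to commit the status update in its own short transaction. The sketch assumes the writer lives in a separate Spring bean (so the transactional proxy is not bypassed by self-invocation), and the class and method names are made up:

@Service
public class JobControlWriter {

    @Autowired
    private JobControlDAO jobControlDAO;

    // REQUIRES_NEW opens and commits its own transaction, so the RUNNING state
    // becomes visible to other sessions as soon as this method returns, and the
    // row lock is released before the long WS loop starts.
    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void saveState(JobControl job, String state) {
        job.setState(state);
        jobControlDAO.save(job);
    }
}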

Lazy Hibernate Initialization No Session, Could not initialize Proxy

I am creating an e-prescribing module that will allow several orders to be created and added to a list before being confirmed (saved in the DB all at once). I have two forms - one where the order is created, and a second with a link to add more orders, whose submit button will store all the orders (ArrayList items) in the DB. That is, each order upon creation gets added to an ArrayList [createNewDrugOrder() method below].
When I try to save each order inside this method it works fine, but my requirement is to have the facility to store the orders as a draft before confirming them (storing them in the DB).
public class DrugordersPageController {

    SessionFactory sessionfactory;

    public void controller() {
        if ("addDraftOrder".equals(action)) {
            createNewDrugOrder();
        }
        if ("confirmOrder".equals(action)) {
            saveOrder();
        }
    }

    void createNewDrugOrder() {
        DrugOrder order = new DrugOrder();
        order.setDose(1);
        order.setDuration(1);
        order.setQuantity(1);
        ConfirmOrderClass.orderToConfirm.add(order);
    }

    void saveOrder() {
        for (DrugOrder order : ConfirmOrderClass.getDrugOrderMain()) {
            order = (DrugOrder) Context.getOrderService().saveOrder(order, null); // Here is where the error is thrown
        }
    }
}

class ConfirmOrderClass {
    public static ArrayList<DrugOrder> orderToConfirm = new ArrayList<DrugOrder>();

    public static ArrayList<DrugOrder> getDrugOrderMain() {
        return orderToConfirm;
    }
}
I understand this is a case of the object being detached from the session, but I am unable to get it fixed. Calling session.save or session.update in the second form doesn't help; neither does Hibernate.initialize(object) or adding a @Transactional annotation.
When I call Session s = sessionfactory.getCurrentSession() or openSession(), I still get a null pointer error, although I am able to retrieve the values of order.getDose(), order.getDuration(), etc.
Please help!

ArrayListModel will not sync with JList

I have combed through SO and have found many questions on the topic of my problem, but none that answer it.
I am setting up an MVC, and I have set things up correctly to the best of my knowledge, but I cannot get the Controller to show in my view. I am working on an assignment that is essentially a program for a Video Rental Store.
First, in a class called RentalStoreGUI, I set up my panels, and everything looks good when I run.
RentalStoreEngine model = new RentalStoreEngine();
JList<DVD> list = new JList<DVD>();
list.setModel(model);
list.setSelectionMode(ListSelectionModel.SINGLE_SELECTION);
list.setVisible(true);
list.setSelectedIndex(0);
jScrollPane = new JScrollPane(list);
add(jScrollPane, BorderLayout.CENTER);
add(buttonPanel, BorderLayout.SOUTH);
As you can see, I set the model for the list based on another class called RentalStoreEngine, which extends AbstractListModel. The list model works when I do class-specific testing, and all of the necessary methods are implemented. Here is an example of my add method from that class:
public void add(DVD d) {
    if (d != null) {
        rentals.add(d); // rentals is an ArrayList<DVD> instantiated earlier
        fireIntervalAdded(this, rentals.size() - 1, rentals.size() - 1);
    }
}
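For context, an AbstractListModel subclass also has to provide getSize() and getElementAt(), which the JList calls to repaint itself after fireIntervalAdded. A minimal sketch of how RentalStoreEngine might look (the rentals field name comes from the add method above; everything else is assumed):

import java.util.ArrayList;
import javax.swing.AbstractListModel;

public class RentalStoreEngine extends AbstractListModel<DVD> {

    private final ArrayList<DVD> rentals = new ArrayList<>();

    @Override
    public int getSize() {
        return rentals.size(); // the JList asks the model how many rows to draw
    }

    @Override
    public DVD getElementAt(int index) {
        return rentals.get(index); // and which DVD belongs in each row
    }
}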
Here is the actionPerformed method; it runs DVD_Dialog, which simply gets some input from the user and creates a new DVD object from that.
public void actionPerformed(ActionEvent event) {
    if (event.getSource() == rentDVD) {
        DVD_Dialog = new RentDVDDialog(this, null);
        DVD_Dialog.clear();
        DVD_Dialog.setVisible(true);
        dvd = new DVD(DVD_Dialog.getTitleText(), DVD_Dialog.getRenterText(),
                DVD_Dialog.getRentedOnText(), DVD_Dialog.getDueBackText());
        if (DVD_Dialog.closeStatus() == true) {
            model.add(dvd);
        }
    }
}
Eclipse gives me no errors until I run it; I then receive a NullPointerException at the line model.add(dvd). Based on all my research, the list.setModel(model) call and the fireIntervalAdded line should update the JList on their own, but they do not. And as I said, class-specific testing for both the GUI and the model produces the desired results, but when it comes to integrating them I am at a loss.

Non-Blocking Endpoint: Returning an operation ID to the caller - Would like to get your opinion on my implementation?

Boot Pros,
I recently started programming in Spring Boot and I stumbled upon a question where I would like to get your opinion.
What I try to achieve:
I created a Controller that exposes a GET endpoint, named nonBlockingEndpoint. This nonBlockingEndpoint executes a pretty long operation that is resource heavy and can run between 20 and 40 seconds (in the attached code it is mocked by a Thread.sleep()).
Whenever the nonBlockingEndpoint is called, the Spring application should register that call and immediately return an operation ID to the caller.
The caller can then use this ID to query the status of that operation on another endpoint, queryOpStatus. At the beginning it will be STARTED, and once the controller is done serving the request it will change to a code such as SERVICE_OK. The caller then knows that his request was successfully completed on the server.
The solution that I found:
I have the following controller (note that it is explicitly not tagged with @Async)
It uses an APIOperationsManager to register that a new operation was started
I use the CompletableFuture Java construct to run the long running code as a new async task, using CompletableFuture.supplyAsync(() -> {...})
I immediately return a response to the caller, telling them that the operation is in progress
Once the async task has finished, I use cf.thenRun() to update the operation status via the APIOperationsManager
Here is the code:
@GetMapping(path = "/nonBlockingEndpoint")
public @ResponseBody ResponseOperation nonBlocking() {

    // Register a new operation
    APIOperationsManager apiOpsManager = APIOperationsManager.getInstance();
    final int operationID = apiOpsManager.registerNewOperation(Constants.OpStatus.PROCESSING);

    ResponseOperation response = new ResponseOperation();
    response.setMessage("Triggered non-blocking call, use the operation id to check status");
    response.setOperationID(operationID);
    response.setOpRes(Constants.OpStatus.PROCESSING);

    CompletableFuture<Boolean> cf = CompletableFuture.supplyAsync(() -> {
        try {
            // Here we would do the long running work (mocked by a sleep)
            Thread.sleep(10000L);
        } catch (InterruptedException e) {}
        // whatever the return value was
        return true;
    });

    cf.thenRun(() -> {
        // We are done with the super long process, so update our Operations Manager
        APIOperationsManager a = APIOperationsManager.getInstance();
        boolean asyncSuccess = false;
        try {
            asyncSuccess = cf.get();
        } catch (Exception e) {}
        if (true == asyncSuccess) {
            a.updateOperationStatus(operationID, Constants.OpStatus.OK);
            a.updateOperationMessage(operationID, "success: The long running process has finished and this is your result: SOME RESULT");
        } else {
            a.updateOperationStatus(operationID, Constants.OpStatus.INTERNAL_ERROR);
            a.updateOperationMessage(operationID, "error: The long running process has failed.");
        }
    });

    return response;
}
Here is also the APIOperationsManager.java for completeness:
public class APIOperationsManager {

    private static APIOperationsManager instance = null;
    private Vector<Operation> operations;
    private int currentOperationId;
    private static final Logger log = LoggerFactory.getLogger(Application.class);

    protected APIOperationsManager() {}

    public static APIOperationsManager getInstance() {
        if (instance == null) {
            synchronized (APIOperationsManager.class) {
                if (instance == null) {
                    instance = new APIOperationsManager();
                    instance.operations = new Vector<Operation>();
                    instance.currentOperationId = 1;
                }
            }
        }
        return instance;
    }

    public synchronized int registerNewOperation(OpStatus status) {
        cleanOperationsList();
        currentOperationId = currentOperationId + 1;
        Operation newOperation = new Operation(currentOperationId, status);
        operations.add(newOperation);
        log.info("Registered new Operation to watch: " + newOperation.toString());
        return newOperation.getId();
    }

    public synchronized Operation getOperation(int id) {
        for (Iterator<Operation> iterator = operations.iterator(); iterator.hasNext();) {
            Operation op = iterator.next();
            if (op.getId() == id) {
                return op;
            }
        }
        Operation notFound = new Operation(-1, OpStatus.INTERNAL_ERROR);
        notFound.setCrated(null);
        return notFound;
    }

    public synchronized void updateOperationStatus(int id, OpStatus newStatus) {
        iteration:
        for (Iterator<Operation> iterator = operations.iterator(); iterator.hasNext();) {
            Operation op = iterator.next();
            if (op.getId() == id) {
                op.setStatus(newStatus);
                log.info("Updated Operation status: " + op.toString());
                break iteration;
            }
        }
    }

    public synchronized void updateOperationMessage(int id, String message) {
        iteration:
        for (Iterator<Operation> iterator = operations.iterator(); iterator.hasNext();) {
            Operation op = iterator.next();
            if (op.getId() == id) {
                op.setMessage(message);
                log.info("Updated Operation status: " + op.toString());
                break iteration;
            }
        }
    }

    private synchronized void cleanOperationsList() {
        Date now = new Date();
        for (Iterator<Operation> iterator = operations.iterator(); iterator.hasNext();) {
            Operation op = iterator.next();
            if ((now.getTime() - op.getCrated().getTime()) >= Constants.MIN_HOLD_DURATION_OPERATIONS) {
                log.info("Removed operation from watchlist: " + op.toString());
                iterator.remove();
            }
        }
    }
}
The questions that I have:
Is that concept a valid one that also scales? What could be improved?
Will I run into concurrency issues / race conditions?
Is there a better way to achieve the same thing in Spring Boot that I just haven't found yet? (maybe with the @Async directive?)
I would be very happy to get your feedback.
Thank you so much,
Peter P
It is a valid pattern to submit a long running task with one request, returning an id that allows the client to ask for the result later.
But there are some things I would suggest reconsidering:
do not use an Integer as the id, as it allows an attacker to guess ids and get the results for those ids. Instead use a random UUID.
if you need to restart your application, all ids and their results will be lost. You should persist them to a database.
Your solution will not work in a cluster with many instances of your application, as each instance would only know its 'own' ids and results. This could also be solved by persisting them to a database or a Redis store.
The way you are using CompletableFuture gives you no control over the number of threads used for the asynchronous operation. It is possible to do this with standard Java, but I would suggest using Spring to configure the thread pool.
Annotating the controller method with @Async is not an option; it does not work that way. Instead, put all asynchronous operations into a simple service and annotate that with @Async (see the sketch after this list). This has some advantages:
You can use this service also synchronously, which makes testing a lot easier
You can configure the thread pool with Spring
The /nonBlockingEndpoint should not return just the id, but a complete link to queryOpStatus, including the id. The client can then use this link directly without any additional information.
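To make these points concrete, a rough sketch only; the class, bean, and method names are illustrative and not part of the question's code:

// AsyncConfig.java - give @Async an explicitly configured thread pool
@Configuration
@EnableAsync
public class AsyncConfig {
    @Bean
    public Executor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(4);
        executor.setMaxPoolSize(8);
        executor.setQueueCapacity(100);
        executor.initialize();
        return executor;
    }
}

// LongRunningService.java - the heavy work lives in a service, not the controller
@Service
public class LongRunningService {

    @Autowired
    private OperationStore store; // Map-based store, sketched after the next list

    @Async // runs on the pool configured above, not on the request thread
    public void process(UUID operationId) {
        try {
            Thread.sleep(10_000L); // stand-in for the 20-40 second operation
            store.update(operationId, Constants.OpStatus.OK, "success: SOME RESULT");
        } catch (Exception e) {
            store.update(operationId, Constants.OpStatus.INTERNAL_ERROR, "error: " + e.getMessage());
        }
    }
}

// In the controller: register a random UUID, kick off the work, return a link.
UUID operationId = UUID.randomUUID();
store.register(operationId, Constants.OpStatus.PROCESSING);
longRunningService.process(operationId); // returns immediately
response.setMessage("In progress, poll /queryOpStatus?id=" + operationId);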
Additionally there are some low-level implementation issues which you may also want to change:
Do not use Vector; it synchronizes on every operation. Use a List instead. Iterating over a List is also much easier - you can use for-loops or streams.
If you need to look up a value, do not iterate over a Vector or List; use a Map instead.
APIOperationsManager is a singleton. That makes no sense in a Spring application. Make it a normal POJO, create a bean of it, and get it autowired into the controller; Spring beans are singletons by default (a sketch follows this list).
You should avoid doing complicated operations in a controller method. Instead, move everything into a service (which may be annotated with @Async). This makes testing easier, as you can test the service without a web context.
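And the operation store as a plain Spring-managed bean (a singleton by default), keyed by UUID in a map instead of scanning a Vector. This sketch assumes the Operation class is adapted to UUID ids:

@Component
public class OperationStore {

    // ConcurrentHashMap gives thread-safe lookups by key, no manual iteration needed
    private final Map<UUID, Operation> operations = new ConcurrentHashMap<>();

    public void register(UUID id, Constants.OpStatus status) {
        operations.put(id, new Operation(id, status));
    }

    public void update(UUID id, Constants.OpStatus status, String message) {
        Operation op = operations.get(id);
        if (op != null) {
            op.setStatus(status);
            op.setMessage(message);
        }
    }

    public Optional<Operation> find(UUID id) {
        return Optional.ofNullable(operations.get(id));
    }
}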
Hope this helps.
Do I need to make database access transactional?
As long as you write/update only one row, there is no need to make this transactional, as this is indeed 'atomic'.
If you write/update many rows at once, you should make it transactional to guarantee that either all rows are updated or none.
However, if two operations (maybe from two clients) update the same row, the last one will always win.
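A minimal sketch of the multi-row case (repository and entity names are illustrative):

@Service
public class OperationResultService {

    @Autowired
    private OperationResultRepository repository;

    // Either every row is written or, if one save fails, the whole batch rolls back.
    @Transactional
    public void saveResults(List<OperationResult> results) {
        results.forEach(repository::save);
    }
}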
