How do I get a Mono to wait till a dependent fetch method has run - spring-boot

I am trying to implement an export-to-Excel function via a web service that uses WebFlux; the other APIs and controllers work well. My problem is that the function that generates the Excel file is called after retrieving data from the repository as a Flux (no problem there). I have sorted the results and am trying to call a populate method via flatMap, and I am having a number of issues getting this to work and making sure that the code in the flatMap runs before the code in the web service returns the file.
Below is the code for the webservice:
@GetMapping(API_BASE_PATH + "/download")
public ResponseEntity<byte[]> download() {
    Mono<Void> createExcel = excelExport.createDocument(false);
    Mono.when(createExcel).log("Excel Created").then();
    Workbook workbook = excelExport.getWb();
    OutputStream outputStream = new ByteArrayOutputStream();
    try {
        workbook.write(outputStream);
    } catch (IOException e) {
        e.printStackTrace();
    }
    byte[] media = ((ByteArrayOutputStream) outputStream).toByteArray();
    HttpHeaders headers = new HttpHeaders();
    headers.setCacheControl(CacheControl.noCache().getHeaderValue());
    headers.setContentType(MediaType.valueOf("text/html"));
    headers.set("Content-disposition", "attachment; filename=filename.xlsx");
    ResponseEntity<byte[]> responseEntity = new ResponseEntity<>(media, headers, HttpStatus.OK);
    return responseEntity;
}
And the code for the excelExport class:
public Mono<Void> createDocument(boolean all) {
    InputStream inputStream = new ClassPathResource("Timesheet Template.xlsx").getInputStream();
    try {
        wb = WorkbookFactory.create(inputStream);
        Sheet sheet = wb.getSheetAt(0);
        Row row = sheet.getRow(1);
        Cell cell = row.getCell(3);
        if (cell == null)
            cell = row.createCell(3);
        cell.setCellType(CellType.STRING);
        cell.setCellValue("a test");
        log.info("Created document");
        Flux<TimeKeepingEntry> entries = service.findByMonth(LocalDate.now().getMonth().getDisplayName(TextStyle.FULL, Locale.ENGLISH)).log("Excel Export - retrievedMonths");
        entries.subscribe();
        return entries.groupBy(TimeKeepingEntry::getDateOfMonth).flatMap(Flux::collectList).flatMap(timeKeepingEntries -> this.populateEntry(sheet, timeKeepingEntries)).then();
    } catch (IOException e) {
        log.error("Error Creating Document", e);
    }
    //should never get here
    return Mono.empty();
}
private void populateEntry(Sheet sheet, List<TimeKeepingEntry> timeKeepingEntries) {
    int rowNum = 0;
    for (int i = 0; i < timeKeepingEntries.size(); i++) {
        TimeKeepingEntry timeKeepingEntry = timeKeepingEntries.get(i);
        if (i == 0) {
            rowNum = calculateFirstRow(timeKeepingEntry.getDay());
        }
        LocalDate date = timeKeepingEntry.getFullDate();
        Row row2 = sheet.getRow(rowNum);
        Cell cell2 = row2.getCell(1);
        cell2.setCellValue(date.toString());
        if (timeKeepingEntry.getDay().equals(DayOfWeek.FRIDAY.getDisplayName(TextStyle.FULL, Locale.ENGLISH))) {
            rowNum = +2;
        } else {
            rowNum++;
        }
    }
}
The workbook is never updated because populateEntry is never executed. As I said, I have tried a number of different methods, including Mono.just and Mono.when, but I can't seem to get the correct combination to have it processed before the web service method tries to return the file.
Any help would be great.
Edit 1: Shows the ideal createDocument method.
public Mono<Void> createDocument(boolean all) {
    try {
        InputStream inputStream = new ClassPathResource("Timesheet Template.xlsx").getInputStream();
        wb = WorkbookFactory.create(inputStream);
        Sheet sheet = wb.getSheetAt(0);
        log.info("Created document");
        if (all) {
            //all entries
        } else {
            service.findByMonth(currentMonthName).log("Excel Export - retrievedMonths").collectSortedList(Comparator.comparing(TimeKeepingEntry::getDateOfMonth)).doOnNext(timeKeepingEntries -> {
                this.populateEntry(sheet, timeKeepingEntries);
            });
        }
    } catch (IOException e) {
        log.error("Error Importing File", e);
    }
    return Mono.empty();
}

There are several problems in the implementation of your webservice.
When to subscribe
First off, in reactive programming you must generally try to build a single processing pipeline (by calling Mono and Flux operators and returning the end result as a Mono or Flux). In any case, you should either let the framework do the subscribing, or at least subscribe only once, at the end of that pipeline.
Here instead you are mixing two approaches: your createDocument method correctly returns a Mono, but it also subscribes itself. Even worse, the subscription is done on an intermediate step, and nothing subscribes to the whole pipeline in the webservice method.
So in effect, nobody sees the second half of the pipeline (starting with groupBy) and thus it never gets executed (this is a lazy Flux, also called a "cold" Flux).
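As an illustration, here is a minimal sketch of "one pipeline, no intermediate subscribe", reusing the names from your snippets (wb, sheet, populateEntry, currentMonthName); this is just the shape, not a tested drop-in fix:

public Mono<Void> createDocument(boolean all) {
    // assumes wb was already loaded from the template, as in your code
    Sheet sheet = wb.getSheetAt(0);
    // one pipeline, returned as-is; no subscribe() inside this method
    return service.findByMonth(currentMonthName)
            .log("Excel Export - retrievedMonths")
            .groupBy(TimeKeepingEntry::getDateOfMonth)
            .flatMap(Flux::collectList)
            .doOnNext(entries -> populateEntry(sheet, entries)) // side effect: fills the workbook
            .then();                                            // Mono<Void> that completes when all groups are done
}

Whoever calls this - ideally the framework - should be the only thing subscribing, exactly once, to the returned Mono.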
Mixing synchronous and asynchronous
The other problem is again an issue of mixing two approaches: your Flux pipelines are lazy and asynchronous, but your webservice is written in an imperative and synchronous style.
So the code starts an asynchronous Flux from the DB, immediately returns to the controller, and tries to load the file data from disk.
Option 1: Making the controller more Flux-oriented
If you use Spring MVC, you can still write these imperative-style controllers yet sprinkle in some WebFlux. In that case, you can return a Mono or Flux and Spring MVC will translate that to the correct asynchronous Servlet construct. But that would mean that you must turn the OutputStream and byte handling into a Mono, chained onto the document-writing Mono using something like then/flatMap/etc. It is a bit more involved.
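A rough sketch of that approach, assuming createDocument() returns the reactive pipeline and getWb() exposes the POI Workbook as in your code (names kept from your snippets; just the shape, not a tested implementation):

@GetMapping(API_BASE_PATH + "/download")
public Mono<ResponseEntity<byte[]>> download() {
    return excelExport.createDocument(false)             // document-writing pipeline
            .then(Mono.fromCallable(() -> {               // runs only after that pipeline completes
                ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
                excelExport.getWb().write(outputStream);
                HttpHeaders headers = new HttpHeaders();
                headers.set("Content-disposition", "attachment; filename=filename.xlsx");
                return new ResponseEntity<>(outputStream.toByteArray(), headers, HttpStatus.OK);
            }));
}

Spring subscribes to the returned Mono for you, so the bytes are only produced once the workbook has actually been populated.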
Option 2: Turning the Flux into imperative blocking code
The other option is to go back to imperative and blocking style by calling block() on the createDocument() Mono. This will subscribe to it and wait for it to complete. After that, the rest of your imperative code should work fine.
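A minimal sketch of that variant, reusing your existing controller body and assuming it is served by Spring MVC on a servlet thread (never call block() on a WebFlux event-loop thread):

@GetMapping(API_BASE_PATH + "/download")
public ResponseEntity<byte[]> download() throws IOException {
    // block() subscribes to the pipeline and waits for it to complete,
    // so populateEntry has run by the time the workbook is read
    excelExport.createDocument(false).block();
    ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
    excelExport.getWb().write(outputStream);
    HttpHeaders headers = new HttpHeaders();
    headers.setCacheControl(CacheControl.noCache().getHeaderValue());
    headers.set("Content-disposition", "attachment; filename=filename.xlsx");
    return new ResponseEntity<>(outputStream.toByteArray(), headers, HttpStatus.OK);
}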
Side Note
groupBy has a limitation where if it results in more than 256 open groups it can hang. Here the groups cannot close until the end of the file has been reached, but fortunately since you only process data for a single month, the Flux wouldn't exceed 31 groups.

Thanks to @SimonBasie for the pointers, my working code is now as follows.
@GetMapping(value = API_BASE_PATH + "/download", produces = "application/vnd.ms-excel")
public Mono<Resource> download() throws IOException {
    Flux<TimeKeepingEntry> createExcel = excelExport.createDocument(false);
    return createExcel.then(Mono.fromCallable(() -> {
        ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
        excelExport.getWb().write(outputStream);
        return new ByteArrayResource(outputStream.toByteArray());
    }));
}
public Flux<TimeKeepingEntry> createDocument(boolean all) {
    Flux<TimeKeepingEntry> entries = null;
    try {
        InputStream inputStream = new ClassPathResource("Timesheet Template.xlsx").getInputStream();
        wb = WorkbookFactory.create(inputStream);
        Sheet sheet = wb.getSheetAt(0);
        log.info("Created document");
        if (all) {
            //all entries
        } else {
            entries = service.findByMonth(currentMonthName).log("Excel Export - retrievedMonths").sort(Comparator.comparing(TimeKeepingEntry::getDateOfMonth)).doOnNext(timeKeepingEntry -> {
                this.populateEntry(sheet, timeKeepingEntry);
            });
        }
    } catch (IOException e) {
        log.error("Error Importing File", e);
    }
    return entries;
}

Related

Java stream is doing weird things when generating a CSV file in Spring Boot

I'm generating a CSV file for download in my Spring Boot app. I'm using streams, but there is a problem I can't pin down: some rows are written with all their columns, while the next row only gets some of its columns and the leftover columns are written below, as if they were a new row. I hope you understand what I mean and can give me a hand; thank you in advance.
This code below is the controller
.....
@RequestMapping(value="/stream/csv/{grupo}/{iduser}", method = RequestMethod.GET)
public void generateCSVUsingStream(@PathVariable("grupo") String grupo,
        @PathVariable("iduser") String userId, HttpServletResponse response) {
    response.addHeader("Content-Type", "application/csv");
    response.addHeader("Content-Disposition", "attachment; filename=\""+userId+"_Reporte_PayCash"+grupo.replaceAll("\\s", "")+".csv");
    response.setCharacterEncoding("UTF-8");
    try (Stream<ReportePayCashDTO> streamPaycashdatos = capaDatosDao.ReportePayCashStream(userId, grupo); PrintWriter out = response.getWriter();) {
        //PrintWriter out = response.getWriter();
        out.write(String.join(",", "Cuenta" , "Referencia", "Referencia_paycash","Distrito","Plaza","Cartera"));
        out.write("\n");
        streamPaycashdatos.forEach(streamdato -> {
            out.write(streamdato.getAccount()+","+streamdato.getReferencia()+","+streamdato.getReferenciapaycash()
                    +","+streamdato.getCartera()+","+streamdato.getState()+","+streamdato.getCity());
            out.append("\r\n");
        });
        out.flush();
        out.close();
        streamPaycashdatos.close();
    } catch (IOException ix) {
        throw new RuntimeException("There is an error while downloading file", ix);
    }
}
The method on the DAO is this:
...
@Override
public Stream<ReportePayCashDTO> ReportePayCashStream(String userId, String grupo) {
    // TODO Auto-generated method stub
    Stream<ReportePayCashDTO> stream = null;
    String query = "";
    //more code
    try {
        stream = getJdbcTemplate().queryForStream(query, (rs, rowNum) -> {
            return new ReportePayCashDTO(Utils.valnull(rs.getString("account")),
                    Utils.valnull(rs.getString("reference")),
                    Utils.valnull(rs.getString("referencepaycash")),
                    Utils.valnull(rs.getString("state")),
                    Utils.valnull(rs.getString("city")),
                    Utils.valnull(rs.getString("cartera"))
            );
        });
    } catch (Exception e) {
        e.printStackTrace();
        logger.error(e.getMessage());
    }
    return stream;
}
Example: this is what I hoped would be written into the CSV file
55xxxxx02,88xxxx153,1170050202662,TAMAULIPAS,TAMPICO,AmericanExpre
58xxxxx25,88xxx899,1170050202662,TAMAULIPAS,TAMPICO,AmericanClasic
but some rows were written like this
55xxxxx02,88xxxx153,1170050202662
,TAMAULIPAS,TAMPICO,AmericanExpre
58xxxxx25,88xxx899,1170050202662
,TAMAULIPAS,TAMPICO,AmericanClasic

Nifi Processor gets triggered twice for single Input flow file

I am new to Apache NiFi and still exploring it.
I made a custom processor that fetches data from a server with pagination.
I pass in an input flow file which contains the attribute "url".
Finally, I transfer the response to output flow files: since I fetch data with pagination, I create a new output flow file for each page and transfer it to the SUCCESSFUL relationship.
Below is the code part:
@Override
public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    FlowFile incomingFlowFile = session.get();
    String api = null;
    if (incomingFlowFile == null) {
        logger.info("empty input flow file");
        session.commit();
        return;
    } else {
        api = incomingFlowFile.getAttribute("url");
    }
    session.remove(incomingFlowFile);
    if (api == null) {
        logger.warn("API url is null");
        session.commit();
        return;
    }
    int page = Integer.parseInt(context.getProperty(PAGE).getValue());
    while (page < 3) {
        try {
            String url = api + "&curpg=" + page;
            logger.info("input url is: {}", url);
            HttpResponse response = httpGetApiCall(url, 10000);
            if (response == null || response.getEntity() == null) {
                logger.warn("response null");
                session.commit();
                return;
            }
            String resp = EntityUtils.toString(response.getEntity());
            InputStream is = new ByteArrayInputStream(StandardCharsets.UTF_16.encode(resp).array());
            FlowFile outFlowFile = session.create();
            outFlowFile = session.importFrom(is, outFlowFile);
            session.transfer(outFlowFile, SUCCESSFUL);
        } catch (IOException e) {
            logger.warn("IOException :{}", e.getMessage());
            return;
        }
        ++page;
    }
    session.commit();
}
The issue I am facing is that for a single input flow file this processor gets triggered twice, and so it generates 4 flow files for a single input flow file.
I am not able to figure out where I have gone wrong.
Please help in this issue.
Thanks in advance.
======================================================================
processor group 1 (Nifi_Parvin)
processor group 2 (News_Point_custom)

WebApi: Reading errors

I've got a simple Web API which is consumed from an MVC project. I keep getting 'Response status code does not indicate success' and was wondering how I would get the response body from the error; I can see the error within a REST viewer but can't navigate through to it. This is the code within the MVC app:
public ActionResult Index()
{
    try
    {
        var uri = "http://localhost:57089/api/values";
        using (var client = new HttpClient())
        {
            Task<string> response = client.GetStringAsync(uri);
            object result = JsonConvert.DeserializeObject(response.Result);
            return (ActionResult) result;
        }
    }
    catch (Exception ex)
    {
        return Content(ex.ToString());
    }
    return View();
}
Within the API controller I'm sending a bad request; here's the code:
public IHttpActionResult Get()
{
    return BadRequest("this is a very bad request " + System.DateTime.Now.ToUniversalTime());
}
I've tried catching WebException and HttpRequestException, with no luck.
I can see the response body within the REST viewer.
I want to be able to navigate to the error message so I can pass it to the client (which later will be changed to a GUID).
[EDITED]
I've got a solution without using GetStringAsync, but wanted to use that if possible.
Here's the solution
var httpClient = new HttpClient();
httpClient.BaseAddress = new Uri(url);
HttpResponseMessage responseMessage = httpClient.GetAsync("").Result;
if (responseMessage.IsSuccessStatusCode) return Content(responseMessage.ToString());
var a = responseMessage.Content.ReadAsStringAsync().Result;
var result = JsonConvert.DeserializeObject<HttpError>(a);
object value = "";
return Content(result.TryGetValue("ErrorMessage", out value) ? value.ToString() : responseMessage.ToString());
Is there a better way?
Using WebException you should be able to get to the ResponseStream and the custom error message like this:
catch (WebException e)
{
    var message = e.Message;
    using (var reader = new StreamReader(e.Response.GetResponseStream()))
    {
        var content = reader.ReadToEnd();
    }
}
Hope that helps.

Populating a WP7 List from a webservice causes 'Invalid cross-thread access.'

Sorry if this is an easy question, but I am totally new to WP7.
I have a REST service that I am trying to consume; however, I get the error 'Invalid cross-thread access.'
This is my code
public ObservableCollection<TransactionViewModel> Transactions { get; private set; }

public MainViewModel()
{
    this.Transactions = new ObservableCollection<TransactionViewModel>();
}

public void LoadTransactions(string id)
{
    var req = (HttpWebRequest)WebRequest.Create(string.Format("http://domain.com/Transactions?Id={0}", id));
    req.Method = "POST";
    req.ContentType = "application/json; charset=utf-8";
    // call async
    req.BeginGetResponse(new AsyncCallback(jsonGetRequestStreamCallback), req);
    this.IsDataLoaded = true;
}

void jsonGetRequestStreamCallback(IAsyncResult asynchronousResult)
{
    WebResponse response = ((HttpWebRequest)asynchronousResult.AsyncState).EndGetResponse(asynchronousResult);
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        string responseString = reader.ReadToEnd();
        reader.Close();
        var s = JsonConvert.DeserializeObject<List<TransactionViewModel>>(responseString);
        foreach (var t in s)
        {
            Transactions.Add(new TransactionViewModel()
            {
                .........
            }
        }
Have I done something really stupid here?
When you come back from the request you are no longer on the UI thread. So you need to switch control back to the UI thread before performing any actions that will affect the UI.
You are updating an ObservableCollection, which will be bound on the UI and therefore the update is going to affect the UI.
There are a number of approaches; the simplest for your purposes will be:
Deployment.Current.Dispatcher.BeginInvoke(() => {
    foreach (var t in s) {
        Transactions.Add(new TransactionViewModel());
    }
});
Edit: Also, if you want to read a little more about this, I have a blog post about it here: http://csainty.blogspot.com/2010/10/windows-phone-7asynchronous-programming.html. It starts from code like yours that looks reasonable and should work, explains a few of the gotchas, and shows how to get it working.

WP7 WebClient DownloadStringAsync and Map

I'm using a WebClient object to poll some data from a server.
It works well and updates the text block fine, until I use a Map on the same page. When I add a map, only one request gets completed and data is retrieved only once.
This is the code for getting messages:
public MessagesPage()
{
    InitializeComponent();
    new System.Threading.Timer(messagePolling, null, 0, 5000); // every 5 seconds
}

void messagePolling(object state)
{
    getMessages(Const.GET_MESSAGES_URL + uuid);
}

private void getMessages(string uri)
{
    WebClient webClient = new WebClient();
    webClient.DownloadStringAsync(new Uri(uri));
    webClient.DownloadStringCompleted += new DownloadStringCompletedEventHandler(messagesResponseCompleted);
}

void messagesResponseCompleted(object sender, DownloadStringCompletedEventArgs e)
{
    lock (this)
    {
        try
        {
            string s = e.Result;
            if (s.Length > 0)
            {
                List<Message> messagesResult = JSONHelper.Deserialize<List<Message>>(s);
                foreach (Message m in messagesResult)
                {
                    tbMessages.Text += m.message + "\n";
                }
            }
            else
            {
                tbMessages.Text += "No new messages #: " + System.DateTime.Now + "\n";
            }
        }
        catch (System.Net.WebException we)
        {
            MessageBox.Show(we.Message);
        }
    }
}
Anyone?
The WebClient response is processed on the UI thread - so you don't need the lock that you have in your event handler.
For your particular problem - is this just occurring in the emulator? I've seen quite a few timer issues with the emulator - but never anything similar on the real phone.
As an aside, I believe it's generally better to use HttpWebRequest rather than WebClient - see the explanation of WebClient using the UI thread in "Silverlight Background Thread using WebClient" - though for your particular code I don't think this will be a problem.
Using

System.Windows.Threading.DispatcherTimer myDispatcherTimer = new System.Windows.Threading.DispatcherTimer();
myDispatcherTimer.Interval = new TimeSpan(0, 0, 0, 0, 5000);
myDispatcherTimer.Tick += new EventHandler(messagePolling);
myDispatcherTimer.Start();

instead of

new System.Threading.Timer(messagePolling, null, 0, 5000); // every 5 seconds

works fine =)
