I'm using ObjectMapper to serialize posts in my system to JSON. These posts come from all over the world and contain UTF-8 characters. The problem is that the ObjectMapper doesn't seem to be handling these characters properly. For example, the string "Musée d'Orsay" gets serialized as "Mus?©e d'Orsay".
Here's my code that's doing the serialization:
public static String toJson(List<Post> posts) {
    ObjectMapper objectMapper = new ObjectMapper()
            .configure(Feature.USE_ANNOTATIONS, true);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    try {
        objectMapper.writeValue(out, posts);
    } catch (JsonGenerationException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    } catch (JsonMappingException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    return new String(out.toByteArray());
}
Interestingly, the exact same List<Post> posts gets serialized just fine when I return it via a request handler using @ResponseBody with the following configuration:
public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
    ObjectMapper m = new ObjectMapper()
            .enable(Feature.USE_ANNOTATIONS)
            .disable(Feature.FAIL_ON_UNKNOWN_PROPERTIES);
    MappingJacksonHttpMessageConverter c = new MappingJacksonHttpMessageConverter();
    c.setObjectMapper(m);
    converters.add(c);
    super.configureMessageConverters(converters);
}
Any help greatly appreciated!
Aside from conversions, how about simplifying the process to:
return objectMapper.writeValueAsString(posts);
which speeds up the process (no need to go from POJO to a byte array and then decode the bytes back into chars to build a String) and, more importantly, shortens the code.
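For reference, a minimal sketch of the simplified method, using the same assumed Post type and Jackson setup as in the question; writeValueAsString produces a String directly, so no charset handling is involved:

public static String toJson(List<Post> posts) {
    ObjectMapper objectMapper = new ObjectMapper()
            .configure(Feature.USE_ANNOTATIONS, true);
    try {
        // Serializes straight to a String, avoiding the byte-array round trip
        return objectMapper.writeValueAsString(posts);
    } catch (IOException e) {
        throw new RuntimeException("Failed to serialize posts", e);
    }
}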
Not 10 minutes later, I found the problem. The issue wasn't with the ObjectMapper, it was with how I was turning the ByteArrayOutputStream into a String. I changed the code as follows and everything started working:
try {
    return out.toString("utf-8");
} catch (UnsupportedEncodingException e) {
    return out.toString();
}
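For completeness, the root cause is that new String(byte[]) uses the platform default charset. A sketch of the same fix without the checked exception, assuming Java 7+ (the rest of the method is unchanged):

import java.nio.charset.StandardCharsets;

// ...
// Decode the bytes explicitly as UTF-8 instead of relying on the platform default charset,
// which avoids both the mangled output and the UnsupportedEncodingException.
return new String(out.toByteArray(), StandardCharsets.UTF_8);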
The method I am testing (the method setEventHubDataPayload throws JSONException and JsonProcessingException):
public class EventHubMapper {
    // inits

    public byte[] toEventDataJsonByteArray(UserRecord inbound) {
        EventHubDto ehDto = new EventHubDto();
        ehDto.setEventTypeVersion(inbound.getVersion());
        ehDto.setEventId(inbound.getNotificationId());
        JSONObject eventJson = new JSONObject(ehDto);
        try {
            eventJson.put("data", setEventHubDataPayload(ehDto, inbound));
        } catch (JSONException e) {
            analytics.trackError(AnalyticsConstants.EventHub.JSON_MAPPING_ERROR, e.toString());
        } catch (JsonProcessingException e) {
            analytics.trackError(AnalyticsConstants.EventHub.JSON_PROCESSING_ERROR, e.toString());
        }
        return eventJson.toString().getBytes();
    }
}
unit test code:
@Test
public void toEventDataByteArray_JsonException() throws JSONException, JsonProcessingException {
    EventHubMapper ehmMock = Mockito.spy(eventHubMapper);
    doThrow(new JSONException("blah")).when(ehmMock).setEventHubDataPayload(any(), any());
    eventHubMapper.toEventDataJsonByteArray(setUpMockUserRecord());
    verify(analytics, times(1)).trackError(AnalyticsConstants.EventHub.JSON_MAPPING_ERROR, new JSONException("blah").toString());
}
I've tried using more specific matchers ... ex: any(EventHubDto.class) or any(UserRecord.class) and got the same result:
Wanted but not invoked:
analytics.trackError(
    "EventHub_Publish_Error",
    ""
);
and also
Actually, there were zero interactions with this mock.
What is going on here?
I think you need to call the method on the spy, like below, while testing:
ehmMock.toEventDataJsonByteArray(setUpMockUserRecord());
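To spell that out, here is a sketch of the corrected test, reusing the names from the question; the only change is that both the stubbing and the invocation go through the spy ehmMock, so the stubbed setEventHubDataPayload actually throws:

@Test
public void toEventDataByteArray_JsonException() throws JSONException, JsonProcessingException {
    EventHubMapper ehmMock = Mockito.spy(eventHubMapper);
    doThrow(new JSONException("blah")).when(ehmMock).setEventHubDataPayload(any(), any());

    // Call the spy, not the original eventHubMapper, so the stubbed method is used
    ehmMock.toEventDataJsonByteArray(setUpMockUserRecord());

    verify(analytics, times(1)).trackError(
            AnalyticsConstants.EventHub.JSON_MAPPING_ERROR,
            new JSONException("blah").toString());
}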
I am setting up a Java-based GraphQL app and find graphql-java-tools really convenient. The problem is that while it is pretty straightforward to make field resolvers async with plain graphql-java, I couldn't find a way to do it using graphql-java-tools.
I tried:
@Bean
public ExecutionStrategy executionStrategy() {
    return new AsyncExecutionStrategy();
}
Here are the resolvers I use in order to test:
@Component
public class VideoResolver implements GraphQLResolver<Video> {

    public Episode getEpisode(Video video) {
        Episode result = new Episode();
        result.setTitle("episodeTitle");
        result.setUuid("EpisodeUuid");
        result.setBrand("episodeBrand");
        try {
            Thread.sleep(2000L);
            System.out.println(Thread.currentThread().getName());
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return result;
    }

    public List<Images> getImages(Video video) {
        Images image = new Images();
        image.setFileName("Image FileName1");
        List<Images> imageList = new ArrayList<>();
        imageList.add(image);
        try {
            Thread.sleep(2000L);
            System.out.println(Thread.currentThread().getName());
        } catch (InterruptedException e) {
            System.out.println("Exxxxxxxxxx");
        }
        return imageList;
    }
}
I was assuming this should run in about 2 seconds and print two different thread names, but it takes 4 seconds and everything prints from the same thread.
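For illustration only (this is an assumption on my part, not something the question confirms): one common way to make an individual field resolver asynchronous is to return a CompletableFuture from the resolver method and run the work on your own executor. A minimal sketch rewriting getEpisode on that assumption, with a hypothetical executor field:

// Sketch, assuming graphql-java-tools resolves CompletableFuture return types;
// the resolverExecutor field is illustrative and not part of the question's code.
private final Executor resolverExecutor = Executors.newFixedThreadPool(4);

public CompletableFuture<Episode> getEpisode(Video video) {
    return CompletableFuture.supplyAsync(() -> {
        Episode result = new Episode();
        result.setTitle("episodeTitle");
        result.setUuid("EpisodeUuid");
        result.setBrand("episodeBrand");
        return result;
    }, resolverExecutor);
}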
I am trying to read data from the database and run a process on each object concurrently.
My config is as below:
@Bean
public Job job() {
    return jobBuilderFactory.get("job").incrementer(new RunIdIncrementer()).listener(new Listener(videoDao))
            .flow(step1()).end().build();
}

@Bean
public Step step1() {
    return stepBuilderFactory.get("step1")
            .<VideosDTO, VideosDTO>chunk(3)
            .reader(databaseVideoItemReader(null))
            .processor(new Processor())
            .writer(new Writer(videoDao))
            .build();
}

@Bean
@StepScope
ItemReader<VideosDTO> databaseVideoItemReader(@Value("#{jobParameters[userId]}") String userId) {
    logger.info("Fetching videos for userId:" + userId);
    JdbcCursorItemReader<VideosDTO> databaseReader = new JdbcCursorItemReader<>();
    databaseReader.setDataSource(dataSource);
    databaseReader.setSql("SELECT * FROM voc.t_videos where user_id=" + userId + " AND job_success_ind='N'");
    databaseReader.setRowMapper(new BeanPropertyRowMapper<>(VideosDTO.class));
    // databaseReader.open(new ExecutionContext());
    ExecutionContext executionContext = new ExecutionContext();
    executionContext.size();
    databaseReader.open(executionContext);
    return databaseReader;
}
My item processor is as below:
@Override
public VideosDTO process(VideosDTO videosDTO) throws Exception {
    log.info("processing........" + videosDTO.getVideoUrl());
    try {
        Process p = Runtime.getRuntime()
                .exec("C:\\Program Files\\Git\\bin\\bash.exe " + "D:\\DRM\\script.sh " + videosDTO.getVideoUrl());
        // .exec("D:\\PortableGit\\bin\\bash.exe
        // D:\\Vocabimate_Files\\script.sh "+videosDTO.getVideoUrl());
        // Thread.sleep(1000);
        Thread.sleep(1000);
        p.destroy();
        try {
            p.waitFor();
        } catch (Exception e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        try (InputStream is = p.getErrorStream()) {
            int in = -1;
            while ((in = is.read()) != -1) {
                System.out.print((char) in);
            }
        }
        try (InputStream is = p.getInputStream()) {
            int in = -1;
            while ((in = is.read()) != -1) {
                System.out.print((char) in);
            }
        }
    } catch (IOException e2) {
        // TODO Auto-generated catch block
        e2.printStackTrace();
    }
    return videosDTO;
}
The writer is as below:
@Override
public void write(List<? extends VideosDTO> videosList) throws Exception {
    for (VideosDTO vid : videosList) {
        log.info("writting...." + vid.getVideoUrl());
    }
}
Suppose there are 3 objects fetched from the DB. This code first completes the process on the first object, then the second, then the third, and only then starts writing. I want to run the process on the three objects concurrently and then perform the writing operation.
Is there any way to do this?
Using a multi-threaded step as suggested by @dimitrisli is the way to go. In addition to that, another way is to use the AsyncItemProcessor (in combination with an AsyncItemWriter).
A similar use case (calling a rest endpoint asynchronously from the processor) can be found here: https://stackoverflow.com/a/52309260/5019386 where I gave some more details.
Hope this helps.
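To make that concrete, here is a minimal sketch of how the question's existing Processor and Writer could be wrapped with AsyncItemProcessor/AsyncItemWriter from spring-batch-integration; the taskExecutor() bean and the Future-typed chunk are assumptions based on the question's step1() config, not code from the question:

@Bean
public AsyncItemProcessor<VideosDTO, VideosDTO> asyncItemProcessor() {
    AsyncItemProcessor<VideosDTO, VideosDTO> asyncProcessor = new AsyncItemProcessor<>();
    asyncProcessor.setDelegate(new Processor());     // existing processor does the actual work
    asyncProcessor.setTaskExecutor(taskExecutor());  // each item is processed on its own thread
    return asyncProcessor;
}

@Bean
public AsyncItemWriter<VideosDTO> asyncItemWriter() {
    AsyncItemWriter<VideosDTO> asyncWriter = new AsyncItemWriter<>();
    asyncWriter.setDelegate(new Writer(videoDao));   // unwraps the Futures before writing
    return asyncWriter;
}

@Bean
public Step step1() {
    return stepBuilderFactory.get("step1")
            .<VideosDTO, Future<VideosDTO>>chunk(3)
            .reader(databaseVideoItemReader(null))
            .processor(asyncItemProcessor())
            .writer(asyncItemWriter())
            .build();
}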
Without getting into the details of your custom Reader/Processor/Writer, I think what you're looking for is a multi-threaded Step.
As also described in the linked documentation, in order to make your step multi-threaded (meaning each chunk is read/processed/written in a separate thread), you first need to register a SimpleAsyncTaskExecutor:
@Bean
public TaskExecutor taskExecutor() {
    return new SimpleAsyncTaskExecutor("myAsyncTaskExecutor");
}
and then register this task executor in your Step's builder:
@Bean
public Step step1() {
    return stepBuilderFactory.get("step1")
            .<VideosDTO, VideosDTO>chunk(3)
            .reader(databaseVideoItemReader(null))
            .processor(new Processor())
            .writer(new Writer(videoDao))
            // making the Step multi-threaded
            .taskExecutor(taskExecutor())
            .build();
}
I want to bind a connection to a thread, and use that connection for any JdbcTemplate calls, to finally commit the changes or do a rollback.
I'm declaring all statements from a Groovy script, so I can't control how many SQL queries will be called; that's why I have to use this method and not a TransactionTemplate or something like that. The script will call a helper class that will use that connection and JdbcTemplate; let's call that class SqlHelper.
Right now my not-working-as-needed solution is to call that SqlHelper from the Groovy script to initialize a transaction:
initTransaction(ecommerce)
which calls
public void initTransaction(DataSource dataSource) {
    DefaultTransactionDefinition transactionDefinition = new DefaultTransactionDefinition();
    transactionDefinition.setReadOnly(false);
    transactionDefinition.setIsolationLevel(TransactionDefinition.ISOLATION_SERIALIZABLE);
    // dataSourceTransactionManager.setDataSource(dataSource);
    // dataSourceTransactionManager.setRollbackOnCommitFailure(true);
    // dataSourceTransactionManager.setTransactionSynchronization(TransactionSynchronization.STATUS_COMMITTED);
    // dataSourceTransactionManager.setDataSource(dataSource);
    try {
        Connection connection = DataSourceUtils.getConnection(dataSource);
        connection.setAutoCommit(false);
        DataSourceUtils.prepareConnectionForTransaction(connection, transactionDefinition);
    } catch (CannotGetJdbcConnectionException e) {
        throw new RuntimeException(e);
    } catch (SQLException e) {
        throw new RuntimeException(e);
    }
}
After that the script will call some SQL operations, like
sqlUpdate(ecommerce, insertSentence, insertParams)
which calls
public Integer update(DataSource dataSource, String sql, Map<String, Object> paramMap) {
    return new NamedParameterJdbcTemplate(dataSource).update(sql, paramMap);
}
Finally I want to finish the transaction committing the changes using
commitTransaction(dataSource)
which calls
public void commitTransaction(DataSource dataSource) {
    Connection connection = DataSourceUtils.getConnection(dataSource);
    try {
        connection.commit();
    } catch (Exception e) {
        rollbackTransaction(dataSource);
    }
    // DataSourceUtils.resetConnectionAfterTransaction(connection, TransactionDefinition.ISOLATION_DEFAULT);
    // SimpleTransactionStatus transactionStatus = new SimpleTransactionStatus(false);
    // try {
    //     dataSourceTransactionManager.commit(transactionStatus);
    //     jta.commit(transactionStatus);
    // } catch (TransactionException e) {
    //     dataSourceTransactionManager.rollback(transactionStatus);
    //     throw new RuntimeException(e);
    // }
}

private void rollbackTransaction(DataSource dataSource) {
    Connection connection = DataSourceUtils.getConnection(dataSource);
    try {
        connection.rollback();
    } catch (SQLException e) {
        throw new RuntimeException(e);
    }
    DataSourceUtils.resetConnectionAfterTransaction(connection, TransactionDefinition.ISOLATION_DEFAULT);
}
I left in some commented-out blocks to show you what approaches I tried. I don't know very well how Spring transactions work, so I'm just trying different things and trying to learn how all this stuff works... I will provide more information if you want ;)
Thank you in advance.
UPDATE
As M. Denium suggested, that's what I have for now:
I declared the variables, keeping the TransactionStatus in a ThreadLocal, and finally solved it as:
private DataSourceTransactionManager dataSourceTransactionManager = null;
private DefaultTransactionDefinition transactionDefinition = null;
private static final ThreadLocal<TransactionStatus> transactionStatus = new ThreadLocal<TransactionStatus>() {
    @Override
    protected TransactionStatus initialValue() {
        return null;
    }
};
And then using the same call from Groovy script, using the helper methods:
public void initTransaction(DataSource dataSource) {
    dataSourceTransactionManager = new DataSourceTransactionManager();
    transactionDefinition = new DefaultTransactionDefinition();
    transactionDefinition.setReadOnly(false);
    transactionDefinition.setIsolationLevel(TransactionDefinition.ISOLATION_SERIALIZABLE);
    dataSourceTransactionManager.setRollbackOnCommitFailure(true);
    dataSourceTransactionManager.setTransactionSynchronization(TransactionSynchronization.STATUS_UNKNOWN);
    dataSourceTransactionManager.setDataSource(dataSource);
    transactionStatus.set(dataSourceTransactionManager.getTransaction(null));
}

public void commitTransaction() {
    try {
        LOG.info("Finishing transaction...");
        dataSourceTransactionManager.commit(transactionStatus.get());
        dataSourceTransactionManager.getDataSource().getConnection().close();
        LOG.info("Done.");
    } catch (Throwable e) {
        dataSourceTransactionManager.rollback(transactionStatus.get());
        throw new RuntimeException("An exception during transaction. Rolling back.");
    }
}
You are trying to reimplement things that are already implemented by Spring's transaction abstraction. Simply use the proper PlatformTransactionManager (you can probably grab it from the ApplicationContext), keep a reference to the TransactionStatus instead of a DataSource, and use that to commit/rollback.
public TransactionStatus initTransaction() {
    return transactionManager.getTransaction(null);
}

public void commit(TransactionStatus status) {
    transactionManager.commit(status);
}
Instead of passing the TransactionStatus around you could also store it in a ThreadLocal and retrieve it in the commit method. This would ease the pain.
Another tip: you shouldn't be creating JdbcTemplates and NamedParameterJdbcTemplates on every call; those are heavy objects to create. Upon construction they consult a connection to determine the database and version, which is needed for exception conversion, so performance-wise this isn't a smart thing to do. Create a single instance and reuse it; the templates are thread-safe, so a single instance is all you need.
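As a small sketch of that tip (the SqlHelper class name comes from the question; the constructor injection is an assumption), the template is created once and reused for every call:

public class SqlHelper {

    // Created once; NamedParameterJdbcTemplate is thread-safe, so it can be shared
    private final NamedParameterJdbcTemplate jdbcTemplate;

    public SqlHelper(DataSource dataSource) {
        this.jdbcTemplate = new NamedParameterJdbcTemplate(dataSource);
    }

    public Integer update(String sql, Map<String, Object> paramMap) {
        return jdbcTemplate.update(sql, paramMap);
    }
}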
However, I would strongly argue that you should actually be using Groovy and not try to work around it. Groovy has the Sql class that can help you. You already have access to the DataSource, so doing something like this would be all that is needed:
def sql = new Sql(dataSource)
sql.withTransaction {
    sql.execute "INSERT INTO city (name, state, founded_year) VALUES ('Minneapolis', 'Minnesota', 1867)"
    sql.execute "INSERT INTO city (name, state, founded_year) VALUES ('Orlando', 'Florida', 1875)"
    sql.execute "INSERT INTO city (name, state, founded_year) VALUES ('Gulfport', 'Mississippi', 1887)"
}
This is plain Groovy, no need to develop additional classes or to write extensive documentation to get it working. Just Groovy...
I created a servlet to download a file from a server. In GWT I created a FormPanel and I am able to download a file.
The problem is that I want to fire an event when the file is ready. I tried to use the onSubmitComplete event, but it isn't firing.
I found a suggestion to change the ContentType to "text/html", but still no luck. I found that the problem lies in writing to the OutputStream: when that is commented out, the event is fired.
Here is my servlet code
public void handleRequest(HttpServletRequest request,
        HttpServletResponse response) throws ServletException, IOException {
    response.setContentType("text/html");
    response.setHeader("Content-Disposition", "attachment; filename=File.xls");
    HSSFWorkbook workbook = new HSSFWorkbook();
    try {
        workbook = fileCreator.getWorkbook();
    } catch (ClassNotFoundException e) {
        e.printStackTrace();
    } catch (SQLException e) {
        e.printStackTrace();
    } catch (ParseException e) {
        e.printStackTrace();
    }
    OutputStream out = response.getOutputStream();
    workbook.write(out);
    out.close();
    response.setStatus(HttpServletResponse.SC_OK);
    response.getWriter().print("something");
    response.flushBuffer();
}
So the file is downloaded successfully, but the event is not triggered. Even when I just get the OutputStream and close it (without writing to it), the event stops working.
When I remove the whole "writing-to-output-stream" code, the event works like a charm.
Any suggestions?
UPDATE
Here is the code for the FormPanel and its handlers, maybe the problem is there?
Form:
downloadFileFormPanel.setEncoding(FormPanel.ENCODING_URLENCODED);
downloadFileFormPanel.setMethod(FormPanel.METHOD_POST);
VerticalPanel panel = new VerticalPanel();
panel.setWidth(UIConstatns.SIZE_100percent);
downloadFileFormPanel.setWidget(panel);
downloadFileButton = new Button(messages.EXPORT_LIMITS());
downloadFileButton.setWidth(UIConstatns.SIZE_100percent);
downloadFileButton.addStyleName("navigation-button");
panel.add(downloadFileButton);
Handlers:
private void registerExportLimitsHandler() {
    registerHandler(getView().getDownloadFileButton().addClickHandler(new ClickHandler() {
        @Override
        public void onClick(ClickEvent event) {
            getView().showLoadingDialog();
            getDownloadFileForm().submit();
        }
    }));
}

private void registerFormSubmitCompleteHandler() {
    getView().getDownloadFileForm().addSubmitCompleteHandler(new SubmitCompleteHandler() {
        public void onSubmitComplete(SubmitCompleteEvent event) {
            Window.alert("download complete");
            getView().hideLoadingDialog();
        }
    });
}
According to the Javadoc of ServletResponse#getWriter() you can either use response.getOutputStream() or response.getWriter() to write the body of the response, but not both. Furthermore it is better to set the status code of the response before writing the body. Please try the following:
// ...
response.setStatus(HttpServletResponse.SC_OK);
OutputStream out = response.getOutputStream();
workbook.write(out);
out.flush();
out.close();
// response.getWriter().print("something");
// response.flushBuffer();
You did not post the line where you create your FormPanel so I'm not sure if this was your problem:
Looks like the FormPanel(String target) constructor does not work with the SubmitCompleteHandler. With the default constructor it seems to work.
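A small sketch of what that would look like; the servlet URL and variable name are placeholders, not from the question:

// Use the no-arg constructor so SubmitCompleteHandler events are delivered,
// and point the form at the download servlet via setAction().
FormPanel downloadFileFormPanel = new FormPanel();
downloadFileFormPanel.setAction(GWT.getModuleBaseURL() + "downloadService"); // placeholder URL
downloadFileFormPanel.setEncoding(FormPanel.ENCODING_URLENCODED);
downloadFileFormPanel.setMethod(FormPanel.METHOD_POST);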