I am currently working in Enterprise Java and I'm a newbie. I am trying to create a method that should delete a selected item from a data table. My project contains graphical user interface elements from PrimeFaces ("http://www.primefaces.org/showcase/").
The deletion is made through a web service.
This is the method I have created so far:
public boolean delete(String articleId) {
    Client client = ClientBuilder.newClient();
    WebTarget target = client.target(DELETE_URL); // DELETE_URL is a String
    // TODO call ws method delete
    try {
        target.request()....;
    } catch (Exception ex) {
        LOGGER.error("Delete Article Error ", ex);
    }
    return true;
}
Could you tell me how I can handle the deletion in an appropriate way?
All the best!
In your case the following should do the trick.
target.request().delete(Response.class)
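If you also want to check whether the call actually succeeded (instead of always returning true), a minimal sketch could look like the following, assuming DELETE_URL points at the article resource so the id can be appended as a path segment; adjust the path to your actual web-service layout:

import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.Response;

public boolean delete(String articleId) {
    Client client = ClientBuilder.newClient();
    try {
        // Append the article id as a path segment and issue the HTTP DELETE
        Response response = client.target(DELETE_URL)
                .path(articleId)
                .request()
                .delete();
        // Any 2xx status counts as success
        boolean deleted = response.getStatusInfo().getFamily() == Response.Status.Family.SUCCESSFUL;
        response.close();
        return deleted;
    } catch (Exception ex) {
        LOGGER.error("Delete Article Error ", ex);
        return false;
    } finally {
        client.close();
    }
}

Returning false on failure lets your backing bean decide whether to actually remove the row from the PrimeFaces data table.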
I am aware of the TransferManager and its .uploadFileList() and .uploadFileDirectory() methods; however, they accept java.io.File types as arguments. I have a collection of byte array input streams containing JPEG image data, and I don't want to create in-memory files to store this data before uploading it either.
So what I need is essentially what the S3 client's PutObjectRequest does, but for a collection of InputStream objects. Also, if one upload fails, I want to abort the whole thing and not upload anything, much like a database transaction reverses its changes if something goes wrong along the way.
Is this possible with the Java SDK?
Before I share an answer, please consider upgrading: FYI, the TransferManager constructors are deprecated in favor of TransferManagerBuilder in the AWS SDK for Java, so consider moving to it if TransferManagerBuilder suits your needs.
Now, since you asked about TransferManager, you could either 1) copy the code below and replace the functionality/arguments with your own in-memory handling of the input stream inside your custom function, or 2) try the second sample further below as-is.
GitHub source, modified to work with an InputStream; the related issue is listed here:
private def uploadFile(is: InputStream, s3ObjectName: String, metadata: ObjectMetadata) = {
  try {
    val putObjectRequest = new PutObjectRequest(bucketName, s3ObjectName, is, metadata)
    // TransferManager supports asynchronous uploads and downloads
    val upload = transferManager.upload(putObjectRequest)
    upload.addProgressListener(ExceptionReporter.wrap(UploadProgressListener(putObjectRequest)))
  } catch {
    case e: Exception => throw new RuntimeException(e)
  }
}
As a bonus, here is a nice custom answer using SequenceInputStream:
public void combineFiles() {
    List<String> files = getFiles();
    long totalFileSize = files.stream()
            .map(this::getContentLength)
            .reduce(0L, (f, s) -> f + s);
    try {
        try (InputStream partialFile = new SequenceInputStream(getInputStreamEnumeration(files))) {
            ObjectMetadata resultFileMetadata = new ObjectMetadata();
            resultFileMetadata.setContentLength(totalFileSize);
            s3Client.putObject("bucketName", "resultFilePath", partialFile, resultFileMetadata);
        }
    } catch (IOException e) {
        LOG.error("An error occurred while combining files. {}", e);
    }
}

private Enumeration<? extends InputStream> getInputStreamEnumeration(List<String> files) {
    return new Enumeration<InputStream>() {

        private Iterator<String> fileNamesIterator = files.iterator();

        @Override
        public boolean hasMoreElements() {
            return fileNamesIterator.hasNext();
        }

        @Override
        public InputStream nextElement() {
            try {
                return new FileInputStream(Paths.get(fileNamesIterator.next()).toFile());
            } catch (FileNotFoundException e) {
                System.err.println(e.getMessage());
                throw new RuntimeException(e);
            }
        }
    };
}
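If the goal is strictly "upload a collection of InputStreams and upload nothing if any of them fails", keep in mind S3 has no real transactions, so the usual workaround is to delete whatever was already uploaded when a later upload throws. A rough sketch with the v1 SDK (the bucket name, the key-to-stream map and the metadata handling here are assumptions, adjust them to your setup):

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public void uploadAllOrNothing(AmazonS3 s3Client, String bucketName,
                               Map<String, InputStream> streamsByKey) {
    List<String> uploadedKeys = new ArrayList<>();
    try {
        for (Map.Entry<String, InputStream> entry : streamsByKey.entrySet()) {
            ObjectMetadata metadata = new ObjectMetadata();
            // If you know each image's byte count, set it so the SDK does not
            // have to buffer the stream to determine the length:
            // metadata.setContentLength(knownLength);
            s3Client.putObject(new PutObjectRequest(bucketName, entry.getKey(), entry.getValue(), metadata));
            uploadedKeys.add(entry.getKey());
        }
    } catch (RuntimeException e) {
        // "Rollback": remove everything that made it to S3 before the failure
        for (String key : uploadedKeys) {
            s3Client.deleteObject(bucketName, key);
        }
        throw e;
    }
}

This is only best-effort consistency: if the process dies between the failed put and the cleanup loop, partial data can still be left behind.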
Greetings, my dear fellows.
I would appreciate some help, since I've spent two days googling to find out why my code is not working. My webapp is a Spring application running on a WebLogic server under Eclipse. By the way, apologies for my spelling (I am not a native English speaker).
Straight from my webapp, the following controller works flawlessly:
@RequestMapping(value = "/sendFile")
public ModelAndView vistaEnvioFicheros() throws myCustomException {
    ModelAndView model = null;
    try {
        getLog().debug("Setting model for sending a file");
        model = new ModelAndView("/content/sendFileWeb");
    } catch (Exception ex) {
        getLog().error("Shxx happens: ", ex);
        throw new myCustomException(ex.getMessage(), ex);
    }
    return model;
}
This controller loads a JSP with a file-browser button and a file-upload button, which also work great.
So when I click on the upload button, the following controller is triggered:
@RequestMapping(value = "/uploadFile", method = RequestMethod.POST)
public Object subirFichero(@RequestParam(value = "file") MultipartFile file) throws myCustomException {
    ModelAndView model = null;
    if (file.isEmpty()) {
        try {
            getLog().debug("File is empty");
            model = new ModelAndView("/content/errorWeb");
        } catch (Exception ex) {
            getLog().error("Shxx happens again ", ex);
            throw new myCustomException(ex.getMessage(), ex);
        }
        return model;
    }
    ...
}
The problem is: when I upload an empty file, the errorWeb JSP should be shown in my web browser, but nothing happens. If I debug the controller I can see the code runs properly up to the return model statement; nevertheless, the errorWeb JSP is not loaded.
Could anyone give me a hint about why the first controller loads its JSP view but the second one doesn't? Also, I don't get a single error message in any console.
Thank you very much
OK, I got it solved! This was just a newbie thing (I am a newbie web developer).
The problem was on the client side, where my webapp was sending the request that triggers the second controller not through a standard href or a form submit (or whatever), but through a custom JavaScript function.
This custom JavaScript function first validates the file and then uploads it to the server. After that, the client-side code stops listening for the response from the server. That's why the new ModelAndView is ignored.
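For anyone hitting the same thing: once the upload goes through an AJAX call, returning a ModelAndView only helps if the JavaScript does something with the response. One possible fix (just a sketch, not the code I ended up using; ResponseEntity and HttpStatus come from org.springframework.http) is to return plain data that the upload callback can inspect:

@RequestMapping(value = "/uploadFile", method = RequestMethod.POST)
public ResponseEntity<String> subirFichero(@RequestParam("file") MultipartFile file) {
    if (file.isEmpty()) {
        // The JavaScript success/error callback can read this status and
        // redirect to /content/errorWeb (or show a message) itself.
        return new ResponseEntity<>("EMPTY_FILE", HttpStatus.BAD_REQUEST);
    }
    // ... process the file ...
    return new ResponseEntity<>("OK", HttpStatus.OK);
}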
Thank you for your time to all of you who read the post
In the Spring Framework we have @Cacheable to cache data. Now my requirement is that I want to retrieve all data from the database using a GET method.
Controller
@RequestMapping(value = "/getUploadData", method = RequestMethod.GET)
public ResponseEntity<List<Ticket>> getUploadFileData() throws IOException {
    return new ResponseEntity<>(ticketBookingService.getFileUploadData(), HttpStatus.OK);
}
Service
@Cacheable(value="ticketsCache")
public List<Ticket> getFileUploadData() {
    List<Ticket> listOfData = (List<Ticket>) ticketBookingDao.findAll();
    return listOfData;
}
output:
http://localhost:8080/api/tickets/getUploadData
[{"ticketId":1,"passengerName":"Sean","bookingDate":1502649000000,"sourceStation":"Pune","destStation":"Mumbai","email":"sean.s2017#yahoo.com"},{"ticketId":2,"passengerName":"Raj","bookingDate":1502476200000,"sourceStation":"Chennai","destStation":"Mumbai","email":"raj.s2007#siffy.com"},{"ticketId":3,"passengerName":"Martin","bookingDate":1502735400000,"sourceStation":"Delhi","destStation":"Mumbai","email":"martin.s2001#xyz.com"},{"ticketId":4,"passengerName":"John","bookingDate":1503253800000,"sourceStation":"Chennai","destStation":"Mumbai","email":"john.s2011#yahoo.com"}]
Now I will do GET and PUT operations by ticketId.
Get:
Controller:
@GetMapping(value="/ticket/{ticketId}")
public Ticket getTicketById(@PathVariable("ticketId") Integer ticketId) {
    return ticketBookingService.getTicketById(ticketId);
}
Service:
@Cacheable(value="ticketsCache", key="#ticketId", unless="#result==null")
public Ticket getTicketById(Integer ticketId) {
    return ticketBookingDao.findOne(ticketId);
}
http://localhost:8080/api/tickets/ticket/1
{"ticketId":1,"passengerName":"Sean","bookingDate":1502649000000,"sourceStation":"Pune","destStation":"Mumbai","email":"sean.s2017#yahoo.com"}
Now I update the email by ticketId:
PUT: Controller:
@PutMapping(value="/ticket/{ticketId}/{newEmail:.+}")
public Ticket updateTicket(@PathVariable("ticketId") Integer ticketId, @PathVariable("newEmail") String newEmail) {
    return ticketBookingService.updateTicket(ticketId, newEmail);
}
Service:
@CachePut(value="ticketsCache", key="#ticketId")
public Ticket updateTicket(Integer ticketId, String newEmail) {
    Ticket updatedTicket = null;
    Ticket ticketFromDb = ticketBookingDao.findOne(ticketId);
    if (ticketFromDb != null) {
        ticketFromDb.setEmail(newEmail);
        updatedTicket = ticketBookingDao.save(ticketFromDb);
    }
    return updatedTicket;
}
http://localhost:8080/api/tickets/ticket/1/abcd@yahoo.com
{
"ticketId": 1,
"passengerName": "Sean",
"bookingDate": 1502649000000,
"sourceStation": "Pune",
"destStation": "Mumbai",
"email": "abcd#yahoo.com"
}
Now when I get the data by ID, the change is reflected:
http://localhost:8080/api/tickets/ticket/1
{"ticketId":1,"passengerName":"Sean","bookingDate":1502649000000,"sourceStation":"Pune","destStation":"Mumbai","email":"abcd#yahoo.com"}
Now my question is: if I try to get all data using the first URL above, my changes are not reflected.
http://localhost:8080/api/tickets/getUploadData
[{"ticketId":1,"passengerName":"Sean","bookingDate":1502649000000,"sourceStation":"Pune","destStation":"Mumbai","email":"sean.s2017#yahoo.com"},{"ticketId":2,"passengerName":"Raj","bookingDate":1502476200000,"sourceStation":"Chennai","destStation":"Mumbai","email":"raj.s2007#siffy.com"},{"ticketId":3,"passengerName":"Martin","bookingDate":1502735400000,"sourceStation":"Delhi","destStation":"Mumbai","email":"martin.s2001#xyz.com"},{"ticketId":4,"passengerName":"John","bookingDate":1503253800000,"sourceStation":"Chennai","destStation":"Mumbai","email":"john.s2011#yahoo.com"}]
Please suggest how to resolve this issue.
You cannot bulk update the cache with Spring.
Please check the following issue - closed with status declined:
Thanks for creating the issue but I am not keen to add this extra complexity to the cache abstraction. It is not meant to manage state for you (the next logical step if we allow this is that we have to keep the returned list in sync with each item). And if we don't we are inconsistent and we merely provide a way to talk to the cache using annotations. That's not very helpful.
Back to your example, this is typically what a second level cache is meant to do for you. This is not in the scope of the cache abstraction.
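A common workaround (outside of what the cache abstraction will do for you) is to keep the "all tickets" result in its own cache and evict that cache whenever a ticket changes, so the next getUploadData call goes back to the database. A sketch, assuming you add a second cache named ticketsListCache to your cache configuration:

@Cacheable(value = "ticketsListCache")
public List<Ticket> getFileUploadData() {
    return (List<Ticket>) ticketBookingDao.findAll();
}

// Update the per-ticket entry and invalidate the cached list in one go
@Caching(
    put = @CachePut(value = "ticketsCache", key = "#ticketId"),
    evict = @CacheEvict(value = "ticketsListCache", allEntries = true)
)
public Ticket updateTicket(Integer ticketId, String newEmail) {
    Ticket ticketFromDb = ticketBookingDao.findOne(ticketId);
    if (ticketFromDb != null) {
        ticketFromDb.setEmail(newEmail);
        return ticketBookingDao.save(ticketFromDb);
    }
    return null;
}

You pay one extra database hit on the first list request after each update, but the list stays consistent with the individual tickets.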
I have some kind of problem with bulk inserts into my Oracle DB. I need to insert a couple of thousand objects, so I decided to use EF.BulkInsert.Oracle (added via NuGet), which is an Oracle extension of EF6.BulkInsert.
private IOracleDbContext _context; // class property

// method body:
EF6.BulkInsert.ProviderFactory.Register<EF6.BulkInsert.Providers.OracleBulkInsertProvider>("BulkInsertProvider");

using (var context = (OracleDbContext)_context)
{
    using (var dbContextTransaction = context.Database.BeginTransaction())
    {
        try
        {
            // Preparing list of objects
            var opt = new EF6.BulkInsert.BulkInsertOptions();
            opt.Connection = context.Database.Connection;
            await context.BulkInsertAsync<ObjectType>(ObjectList, opt);
            await context.SaveChangesAsync();
            dbContextTransaction.Commit();
            stopwatch.Stop();
        }
        catch (Exception ex)
        {
            dbContextTransaction.Rollback();
            throw ex;
        }
    }
}
Without adding opt (the BulkInsertOptions object) as a parameter to BulkInsert, it tries to connect to SQL Server (which doesn't exist, so I get a connection failure). After adding the BulkInsertOptions with the connection, I get an exception saying the connection is already part of a transaction :/
The traditional way (_context.TableName.Add()) of course works, but it takes an unacceptable amount of time.
Any idea what I did wrong here?
I found a better way (BulkInsert still would not cooperate): I used Array Binding, mentioned here.
It reduced the insert time from ~6 minutes to ~1-1.5 seconds :D (7770 records)
In the SDK Javadoc, the Community class does not have a setParentCommunity method, but the CommunityList class does have a getSubCommunities method, so there must be a programmatic way to set a parent community's Uuid when creating a new Community. The REST API mentions a rel="http://www.ibm.com/xmlns/prod/sn/parentcommunity" element. While looking for clues, I checked an existing subcommunity's XmlDataHandler nodes and found a link element. I tried getting the XmlDataHandler of a newly created Community and adding a link node with href, rel and type attributes similar to those in the existing community, but when trying to update or re-save the Community I got a bad request error. Actually, even when I called dataHandler.setData(n), where n was set as Node n = dataHandler.getData() without any changes, and then called updateCommunity or save, I got the same error, so it appears that manipulating the XmlDataHandler XML is not valid.
What is the recommended way to specify a parent Community when creating a new Community, so that it is created as a subcommunity?
The correct way to create a subcommunity programmatically is to modify the POST request body for community creation - here is the link to the Connections 4.5 InfoCenter: http://www-10.lotus.com/ldd/appdevwiki.nsf/xpDocViewer.xsp?lookupName=IBM+Connections+4.5+API+Documentation#action=openDocument&res_title=Creating_subcommunities_programmatically_ic45&content=pdcontent
We do not have support in the SBT SDK to do this using the CommunityService APIs. We need to use the low-level Java APIs, using the Endpoint and ClientService classes to call the REST APIs directly with the appropriate request body.
I'd go ahead and extend the CommunityService class, then add a method based on createCommunity (see line 605 of CommunityService.java):
https://github.com/OpenNTF/SocialSDK/blob/master/src/eclipse/plugins/com.ibm.sbt.core/src/com/ibm/sbt/services/client/connections/communities/CommunityService.java
public String createCommunity(Community community) throws CommunityServiceException {
    if (null == community) {
        throw new CommunityServiceException(null, Messages.NullCommunityObjectException);
    }
    try {
        Object communityPayload;
        try {
            communityPayload = community.constructCreateRequestBody();
        } catch (TransformerException e) {
            throw new CommunityServiceException(e, Messages.CreateCommunityPayloadException);
        }
        String communityPostUrl = resolveCommunityUrl(CommunityEntity.COMMUNITIES.getCommunityEntityType(), CommunityType.MY.getCommunityType());
        Response requestData = createData(communityPostUrl, null, communityPayload, ClientService.FORMAT_CONNECTIONS_OUTPUT);
        community.clearFieldsMap();
        return extractCommunityIdFromHeaders(requestData);
    } catch (ClientServicesException e) {
        throw new CommunityServiceException(e, Messages.CreateCommunityException);
    } catch (IOException e) {
        throw new CommunityServiceException(e, Messages.CreateCommunityException);
    }
}
You'll want to change your communityPostUrl to match
https://greenhouse.lotus.com/communities/service/atom/community/subcommunities?communityUuid=2fba29fd-adfa-4d28-98cc-05cab12a7c43
where the communityUuid is the parent community's Uuid.
I followed @PaulBastide's recommendation and created a SubCommunityService class, currently containing only a method for creation. It wraps the CommunityService rather than subclassing it, since I found that preferable. Here's the code in case you want to reuse it:
public class SubCommunityService {

    private final CommunityService communityService;

    public SubCommunityService(CommunityService communityService) {
        this.communityService = communityService;
    }

    public Community createCommunity(Community community, String superCommunityId) throws ClientServicesException {
        Object constructCreateRequestBody = community.constructCreateRequestBody();
        ClientService clientService = communityService.getEndpoint().getClientService();
        String entityType = CommunityEntity.COMMUNITY.getCommunityEntityType();
        Map<String, String> params = new HashMap<>();
        params.put("communityUuid", superCommunityId);
        String postUrl = communityService.resolveCommunityUrl(entityType,
                CommunityType.SUBCOMMUNITIES.getCommunityType(), params);
        String newCommunityUrl = (String) clientService.post(postUrl, null, constructCreateRequestBody,
                ClientService.FORMAT_CONNECTIONS_OUTPUT);
        String communityId = newCommunityUrl.substring(newCommunityUrl.indexOf("communityUuid=")
                + "communityUuid=".length());
        community.setCommunityUuid(communityId);
        return community;
    }
}
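For completeness, usage then looks roughly like this (a hypothetical sketch: the "connections" endpoint name, the community fields and the parent Uuid are placeholders, so adjust them to your environment):

// Hypothetical usage sketch; endpoint name, fields and parent Uuid are placeholders.
CommunityService communityService = new CommunityService("connections");
SubCommunityService subCommunityService = new SubCommunityService(communityService);

Community child = new Community(communityService, "");
child.setTitle("My child community");
child.setContent("Created under the parent community");

Community created = subCommunityService.createCommunity(child,
        "2fba29fd-adfa-4d28-98cc-05cab12a7c43"); // parent community Uuid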
}