Anonymous caller does not have storage.objects.get access to the GCS object. Permission 'storage.objects.get' denied on resource - spring

I have a Spring Boot application that stores many files in Google Cloud Storage. Everything works except deleting a file: when I press the DELETE button in my Thymeleaf UI, it says Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object. Permission 'storage.objects.get' denied on resource (or it may not exist). I can upload video files into my bucket without any issue, but when I press the delete button I get the error above. Where is my mistake?
Service Layer
public VideoLesson uploadFile(MultipartFile file) {
    log.debug("Start file uploading service");
    VideoLesson inputFile = new VideoLesson();
    String originalFileName = file.getOriginalFilename();
    if (originalFileName == null) {
        throw new BadRequestException("Original file name is null");
    }
    Path path = new File(originalFileName).toPath();
    try {
        String contentType = Files.probeContentType(path);
        VideoLessonDto fileDto = dataBucketUtil.uploadVideo(file, originalFileName, contentType);
        if (fileDto != null) {
            inputFile.setName(fileDto.getFileName());
            inputFile.setFileUrl(fileDto.getFileUrl());
            videoLessonRepository.save(inputFile);
            log.debug("File uploaded successfully, file name: {} and url: {}", fileDto.getFileName(), fileDto.getFileUrl());
        }
    } catch (Exception e) {
        log.error("Error occurred while uploading. Error: ", e);
        throw new GCPFileUploadException("Error occurred while uploading");
    }
    log.debug("File details successfully saved in the database");
    return inputFile;
}
DataBucketUtil.java
@Component
@Slf4j
public class DataBucketUtil {

    @Value("${gcp.config.file}")
    private String gcpConfigFile;

    @Value("${gcp.project.id}")
    private String gcpProjectId;

    @Value("${gcp.bucket.id}")
    private String gcpBucketId;

    public VideoLessonDto uploadVideo(MultipartFile multipartFile, String fileName, String contentType) {
        try {
            log.debug("Start file uploading process on GCS");
            byte[] fileData = FileUtils.readFileToByteArray(convertFile(multipartFile));
            InputStream inputStream = new ClassPathResource(gcpConfigFile).getInputStream();
            StorageOptions options = StorageOptions.newBuilder().setProjectId(gcpProjectId)
                    .setCredentials(GoogleCredentials.fromStream(inputStream)).build();
            Storage storage = options.getService();
            Bucket bucket = storage.get(gcpBucketId, Storage.BucketGetOption.fields());
            RandomString id = new RandomString(6, ThreadLocalRandom.current());
            Blob blob = bucket.create(fileName + checkFileExtension(fileName), fileData, contentType);
            if (blob != null) {
                log.debug("File successfully uploaded to GCS");
                return new VideoLessonDto(blob.getName(), blob.getMediaLink());
            }
        } catch (Exception e) {
            log.error("An error occurred while uploading data. Exception: ", e);
            throw new GCPFileUploadException("An error occurred while storing data to GCS");
        }
        return null;
    }

    private File convertFile(MultipartFile file) {
        try {
            if (file.getOriginalFilename() == null) {
                throw new BadRequestException("Original file name is null");
            }
            File convertedFile = new File(file.getOriginalFilename());
            FileOutputStream outputStream = new FileOutputStream(convertedFile);
            outputStream.write(file.getBytes());
            outputStream.close();
            log.debug("Converting multipart file : {}", convertedFile);
            return convertedFile;
        } catch (Exception e) {
            throw new FileWriteException("An error has occurred while converting the file");
        }
    }

    private String checkFileExtension(String fileName) {
        if (fileName != null && fileName.contains(".")) {
            String extension = ".mp4";
            log.debug("Accepted file type : {}", extension);
            return extension;
        }
        log.error("Not a permitted file type");
        throw new InvalidFileTypeException("Not a permitted file type");
    }
}
Controller Layer
@PostMapping(value = "/video_lesson/upload")
public String uploadFile(@RequestParam("file") MultipartFile file, Model model) {
    String message;
    try {
        videoLessonService.uploadFile(file);
        // "The video was uploaded to the database successfully: <file name>"
        message = "Video bazaya müvəfəqiyyətlə yükləndi: " + file.getOriginalFilename();
        model.addAttribute("message", message);
        Thread.sleep(4000);
    } catch (Exception e) {
        // "Attention, you must select a video!"
        message = "Diqqət bir video seçməlisiniz!";
        model.addAttribute("message", message);
    }
    return "redirect:/video_lesson/files";
}
And this is the error output:
Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object. Permission 'storage.objects.get' denied on resource (or it may not exist).
I tried granting full access on my bucket in the permissions (GRANT ACCESS) section, but the same error occurred.
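The upload works because uploadVideo builds its Storage client from the service-account key, but the delete code isn't shown here, and the error message is the key clue: "Anonymous caller" means the delete request reached Google Cloud Storage with no credentials at all. That typically happens when the code (or the Thymeleaf page) calls the stored mediaLink URL directly, or builds a Storage client without setCredentials. Granting roles in the bucket's GRANT ACCESS section can't fix that, because GCS never learns who the caller is (short of making the bucket public to allUsers). A minimal sketch of an authenticated delete, assuming a hypothetical deleteVideo method added to DataBucketUtil:

public boolean deleteVideo(String objectName) {
    try {
        // Reuse the same service-account key as the upload so the call is authenticated.
        InputStream inputStream = new ClassPathResource(gcpConfigFile).getInputStream();
        Storage storage = StorageOptions.newBuilder()
                .setProjectId(gcpProjectId)
                .setCredentials(GoogleCredentials.fromStream(inputStream))
                .build()
                .getService();
        // Delete by bucket + object name; returns false if the object does not exist.
        return storage.delete(BlobId.of(gcpBucketId, objectName));
    } catch (Exception e) {
        log.error("An error occurred while deleting data from GCS. Exception: ", e);
        return false;
    }
}

The delete handler would pass the object name stored in the database (the blob name saved from fileDto.getFileName()), not the file URL.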

Related

Spring Boot using Apache POI streaming workbook - How do I release the memory taken up

I am writing a simple Spring Boot app that reads from the DB and writes the data out to an Excel file using Apache POI. The generated file can contain up to 100K rows and is around 8-10 MB in size.
My controller class:
public ResponseEntity<Resource> getExcelData(@RequestBody ExcelRequest request) {
    InputStreamResource file = new InputStreamResource(downloadService.startExcelDownload(request));
    return ResponseEntity.ok()
            .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=myFile.xlsx")
            .contentType(MediaType.parseMediaType("application/vnd.ms-excel"))
            .body(file);
}
Service class:
public ByteArrayInputStream startExcelDownload(ExcelRequest request) {
    /** Apache POI code using SXSSFWorkbook **/
    SXSSFWorkbook workbook = new SXSSFWorkbook(1000);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    try {
        // Excel generation logic here
        ...
        workbook.write(out);
        return new ByteArrayInputStream(out.toByteArray());
    } catch (IOException | ParseException e) {
        throw new RuntimeException("fail to import data to Excel file: " + e.getMessage());
    } finally {
        try {
            out.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Here is what I see in VisualVM, and in the heap dump:
byte[]    151,521,152 B (41.2% of heap size)    3,100,020 instances (30.7% of all instances)
Is there something I have missed? Should the byte[] continue to take up memory after the response has been returned? The memory goes down once I manually run the GC.
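On the question itself: the byte[] data is unreachable once the response has been written, but the JVM only reclaims it when a GC actually runs, which matches the memory dropping after a manual GC. The larger cost is that each download materializes the whole file in memory at least twice (the ByteArrayOutputStream's internal buffer plus the copy made by toByteArray()), and the SXSSFWorkbook's temporary files are never cleaned up because dispose()/close() is never called. A hedged sketch that streams the workbook straight to the servlet response with StreamingResponseBody instead of buffering it; buildWorkbook is a hypothetical service method returning the populated SXSSFWorkbook, and the mapping shown is illustrative:

@PostMapping("/excel")
public ResponseEntity<StreamingResponseBody> getExcelData(@RequestBody ExcelRequest request) {
    StreamingResponseBody body = outputStream -> {
        // Hypothetical service method that builds and returns the populated workbook.
        SXSSFWorkbook workbook = downloadService.buildWorkbook(request);
        try {
            workbook.write(outputStream);  // rows are streamed to the client, never buffered in a byte[]
        } finally {
            workbook.dispose();            // delete the SXSSF temporary files on disk
            workbook.close();
        }
    };
    return ResponseEntity.ok()
            .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=myFile.xlsx")
            // xlsx content type (the original "application/vnd.ms-excel" is the legacy .xls type)
            .contentType(MediaType.parseMediaType("application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"))
            .body(body);
}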

Extra characters are added to a video file downloaded via a Spring #RestController

I am downloading a file via a Spring @RestController returning the bytes of the video in the OutputStream of the ResponseEntity object.
@GetMapping(value = "/video/{id}/data")
@ResponseBody
public ResponseEntity<OutputStream> getVideoData(@PathVariable("id") Long id, HttpServletResponse response) {
    try {
        for (Video video : videos) {
            if (id == video.getId()) {
                if (videoDataMgr == null) {
                    videoDataMgr = VideoFileManager.get();
                }
                videoDataMgr.copyVideoData(video, response.getOutputStream());
                return new ResponseEntity<OutputStream>(response.getOutputStream(), HttpStatus.OK);
            }
        }
        response.sendError(HttpServletResponse.SC_NOT_FOUND);
    } catch (IOException ioe) {
        logger.log(Level.WARNING, ioe.toString());
    }
    return null;
}
If I play the video being served and the downloaded video they both show the same content. However, the following test fails when comparing byte by byte the contents of the 2 MP4 files:
@Test
public void testAddVideoData() throws Exception {
    Video received = videoSvc.addVideo(video);
    VideoStatus status = videoSvc.setVideoData(received.getId(),
            new TypedFile(received.getContentType(), testVideoData));
    assertEquals(VideoState.READY, status.getState());

    Response response = videoSvc.getData(received.getId());
    assertEquals(200, response.getStatus());

    InputStream videoData = response.getBody().in();
    byte[] originalFile = IOUtils.toByteArray(new FileInputStream(testVideoData));
    byte[] retrievedFile = IOUtils.toByteArray(videoData);
    assertTrue(Arrays.equals(originalFile, retrievedFile));
}
Comparing the two files, the videos are almost the same except for some extra characters at the end of the downloaded file:
Where is {"ready": false} coming from?

Web3J to generate Java wrapper automatically

I am trying to generate the wrapper on the fly using web3j and invoke a method on the generated class using reflection, but I get a ClassNotFoundException on the first try. (When I stop the server and re-run, it works because the class is already present.)
Does Java support generating classes on the fly (while the server is running)?
private void createContractClass(String contractFileNameWithoutExtension) {
    try {
        String command = "web3j solidity generate -b " +
                contractLocation + contractFileNameWithoutExtension + ".bin -a " + contractLocation + contractFileNameWithoutExtension + ".abi" +
                " -o " + sourceCodeLocation + " -p generated";
        LOG.info("Executing {}", command);
        Process p = Runtime.getRuntime().exec(command);
        int exitCode = p.waitFor();
        if (exitCode != 0) {
            LOG.error("Error {}", p.getOutputStream());
        }
    } catch (IOException | InterruptedException ex) {
        ex.printStackTrace();
        throw new FormatException(ValidationMessages.FAILED_TO_DEPLOY_CONTRACT);
    }
}
private String invokeDeployment(String password, String walletFileName, String contractFileName) {
    try {
        Credentials credentials = WalletUtils.loadCredentials(password, walletLocation + "/" + walletFileName);
        Class classz = Class.forName("generated." + StringUtils.capitalize(contractFileName));
        Method method = classz.getDeclaredMethod("deploy", Web3j.class, Credentials.class, BigInteger.class, BigInteger.class);
        RemoteCall<?> invoke = (RemoteCall<?>) method.invoke(classz, web3JClient.getClient(), credentials, DefaultGasProvider.GAS_PRICE, DefaultGasProvider.GAS_LIMIT);
        Contract contract = (Contract) invoke.send();
        return contract.getContractAddress();
    } catch (Exception e) {
        e.printStackTrace();
        throw new FormatException(ValidationMessages.FAILED_TO_DEPLOY_CONTRACT);
    }
}
It was because the class was not on the classpath. Compiling the generated source and loading the class onto the classpath resolved the issue.
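For anyone hitting the same thing: web3j solidity generate writes a .java source file into the -o directory, so on the first run there is nothing compiled for Class.forName to find; once the class is present (as after the restart), it works. Java can do this at runtime. A minimal sketch, assuming the application runs on a JDK and using illustrative directory names, that compiles the generated source with the JDK's JavaCompiler and loads it through a URLClassLoader:

import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public final class GeneratedWrapperLoader {

    // Compile generated/<Contract>.java and load the resulting class without restarting the server.
    public static Class<?> compileAndLoad(File sourceDir, File classesDir, String fqcn) throws Exception {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();   // null on a plain JRE
        if (compiler == null) {
            throw new IllegalStateException("Run on a JDK so the system compiler is available");
        }
        File source = new File(sourceDir, fqcn.replace('.', File.separatorChar) + ".java");
        int result = compiler.run(null, null, null,
                "-classpath", System.getProperty("java.class.path"),    // so the web3j types resolve
                "-d", classesDir.getAbsolutePath(),
                source.getAbsolutePath());
        if (result != 0) {
            throw new IllegalStateException("Compilation failed for " + fqcn);
        }
        URLClassLoader loader = new URLClassLoader(
                new URL[]{classesDir.toURI().toURL()},
                Thread.currentThread().getContextClassLoader());
        return Class.forName(fqcn, true, loader);
    }
}

invokeDeployment would then call deploy on the Class<?> returned here instead of calling Class.forName against the application classpath.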

Using Elasticsearch with jetty jersey

I am using Elasticsearch and it works well, but not when I try to use it from a web service built with Jetty and Jersey.
Here is an example of a function that I want to use :
public boolean insertUser(RestHighLevelClient client, User user) throws IOException {
    java.util.Map<String, Object> jsonMap = new HashMap<String, Object>();
    jsonMap.put("username", user.username);
    jsonMap.put("password", user.password);
    jsonMap.put("mail", user.mail);
    jsonMap.put("friends", user.friends);
    jsonMap.put("maps", user.maps);
    System.out.println("insertUser");
    IndexRequest indexRequest = new IndexRequest("users", "doc", user.username)
            .source(jsonMap);
    try {
        IndexResponse indexResponse = client.index(indexRequest);
        System.out.println("insertUser 222");
        if (indexResponse.getResult() == DocWriteResponse.Result.CREATED) {
            System.out.println("user " + user.username + " créé");
        } else if (indexResponse.getResult() == DocWriteResponse.Result.UPDATED) {
            System.out.println("user " + user.username + " update dans insertUser (pas normal)");
        }
    } catch (IOException e) {
        e.printStackTrace();
        return false;
    }
    return true;
}
This function works well when I call it from a test class. But if I start my server like this:
Server server = new Server();

// Add a connector
ServerConnector connector = new ServerConnector(server);
connector.setHost("0.0.0.0");
connector.setPort(8081);
connector.setIdleTimeout(30000);
server.addConnector(connector);

// Note: this configures the Elasticsearch client against port 8081, which is the Jetty port (see the fix below)
DAO.ClientConnection("0.0.0.0", 8081);

// Configure Jersey
ResourceConfig rc = new ResourceConfig();
rc.packages(true, "com.example.jetty_jersey.ws");
rc.register(JacksonFeature.class);

// Add a servlet handler for web services (/ws/*)
ServletHolder servletHolder = new ServletHolder(new ServletContainer(rc));
ServletContextHandler handlerWebServices = new ServletContextHandler(ServletContextHandler.SESSIONS);
handlerWebServices.setContextPath("/ws");
handlerWebServices.addServlet(servletHolder, "/*");

// Add a handler for resources (/*)
ResourceHandler handlerPortal = new ResourceHandler();
handlerPortal.setResourceBase("src/main/webapp/temporary-work");
handlerPortal.setDirectoriesListed(false);
handlerPortal.setWelcomeFiles(new String[] { "homepage.html" });
ContextHandler handlerPortalCtx = new ContextHandler();
handlerPortalCtx.setContextPath("/");
handlerPortalCtx.setHandler(handlerPortal);

// Activate handlers
ContextHandlerCollection contexts = new ContextHandlerCollection();
contexts.setHandlers(new Handler[] { handlerWebServices, handlerPortalCtx });
server.setHandler(contexts);

// Start server
server.start();
And when I submit a form, this web service is called:
@POST
@Path("/signup")
@Produces(MediaType.APPLICATION_JSON)
// @Consumes(MediaType.APPLICATION_FORM_URLENCODED)
public SimpleResponse signup(@Context HttpServletRequest httpRequest,
        @FormParam("username") String username,
        @FormParam("email") String email,
        @FormParam("password") String password,
        @FormParam("passwordConfirm") String passwordConfirm) {
    System.out.println("k");
    //if (httpRequest.getSession().getAttribute("user") != null) { //httpRequest.getUserPrincipal() == null) {
    try {
        if (password.equals(passwordConfirm)) {
            User user = new User("jeanOknewmail@gmail.com", "abc");
            user.username = "jeanok";
            user.maps = new ArrayList<String>();
            user.friends = new ArrayList<String>();
            System.out.println(user);
            System.out.println("avant insert");
            DAO.getActionUser().createIndexUser();
            //System.out.println(DAO.getActionUser().getOneUser(DAO.client, "joe"));
            System.out.println("rdctfygbhunji,k");
            DAO.getActionUser().insertUser(DAO.client, user);
            System.out.println("après insert");
            return new SimpleResponse(true);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    //}
    return new SimpleResponse(false);
}
I get lots of errors:
javax.servlet.ServletException: ElasticsearchStatusException[Unable to parse response body]; nested: ResponseException[method [PUT], host [http://0.0.0.0:8081], URI [/users/doc/jeanok?timeout=1m], status line [HTTP/1.1 404 Not Found]|];
...
Caused by:
ElasticsearchStatusException[Unable to parse response body]; nested: ResponseException[method [PUT], host [http://0.0.0.0:8081], URI [/users/doc/jeanok?timeout=1m], status line [HTTP/1.1 404 Not Found]|];
at org.elasticsearch.client.RestHighLevelClient.parseResponseException(RestHighLevelClient.java:598)
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:501)
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:474)
at org.elasticsearch.client.RestHighLevelClient.index(RestHighLevelClient.java:335)
at DAO.UserDAO.insertUser(UserDAO.java:160)
Do you have any idea why the behaviour of my function isn't the same when I launch my server? And why this error? Thanks for your help
I wasn't connected to Elasticsearch: my client was configured with the wrong port. Now it works.
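For reference, the host in the ResponseException (http://0.0.0.0:8081) is the Jetty connector's port, so the PUT was going to the web server, which is why it answered 404 with a body the client couldn't parse. A sketch of what the DAO wiring could look like against Elasticsearch's own HTTP port (9200 by default), assuming a 6.x RestHighLevelClient like the one used in insertUser; this is illustrative, not the actual DAO class:

import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;

public class DAO {

    public static RestHighLevelClient client;

    // Connect the high-level client to Elasticsearch itself, not to the Jetty connector.
    public static void ClientConnection(String host, int port) {
        client = new RestHighLevelClient(
                RestClient.builder(new HttpHost(host, port, "http")));
    }
}

The server bootstrap would then call DAO.ClientConnection("localhost", 9200) (or wherever Elasticsearch is listening) while Jetty keeps listening on 8081.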

How to transfer *.pgp files using SFTP spring Integration

We are developing a generic automated application which downloads *.pgp files from an SFTP server.
The application works fine with *.txt files, but when we try to pull *.pgp files we get the exception below.
2016-03-18 17:45:45 INFO jsch:52 - SSH_MSG_SERVICE_REQUEST sent
2016-03-18 17:45:46 INFO jsch:52 - SSH_MSG_SERVICE_ACCEPT received
2016-03-18 17:45:46 INFO jsch:52 - Next authentication method: publickey
2016-03-18 17:45:48 INFO jsch:52 - Authentication succeeded (publickey).
sftpSession org.springframework.integration.sftp.session.SftpSession@37831f
files size158
java.io.IOException: inputstream is closed
at com.jcraft.jsch.ChannelSftp.fill(ChannelSftp.java:2884)
at com.jcraft.jsch.ChannelSftp.header(ChannelSftp.java:2908)
at com.jcraft.jsch.ChannelSftp.access$500(ChannelSftp.java:36)
at com.jcraft.jsch.ChannelSftp$2.read(ChannelSftp.java:1390)
at com.jcraft.jsch.ChannelSftp$2.read(ChannelSftp.java:1340)
at org.springframework.util.StreamUtils.copy(StreamUtils.java:126)
at org.springframework.util.FileCopyUtils.copy(FileCopyUtils.java:109)
at org.springframework.integration.sftp.session.SftpSession.read(SftpSession.java:129)
at com.sftp.test.SFTPTest.main(SFTPTest.java:49)
Java code:
public class SFTPTest {

    public static void main(String[] args) {
        ApplicationContext applicationContext = new ClassPathXmlApplicationContext("beans.xml");
        DefaultSftpSessionFactory defaultSftpSessionFactory = applicationContext.getBean("defaultSftpSessionFactory", DefaultSftpSessionFactory.class);
        System.out.println(defaultSftpSessionFactory);
        SftpSession sftpSession = defaultSftpSessionFactory.getSession();
        System.out.println("sftpSession " + sftpSession);
        String remoteDirectory = "/";
        String localDirectory = "C:/312421/temp/";
        OutputStream outputStream = null;
        List<String> fileAtSFTPList = new ArrayList<String>();
        try {
            String[] fileNames = sftpSession.listNames(remoteDirectory);
            for (String fileName : fileNames) {
                boolean isMatch = fileCheckingAtSFTPWithPattern(fileName);
                if (isMatch) {
                    fileAtSFTPList.add(fileName);
                }
            }
            System.out.println("files size" + fileAtSFTPList.size());
            for (String fileName : fileAtSFTPList) {
                File file = new File(localDirectory + fileName);
                /*InputStream ipstream = sftpSession.readRaw(fileName);
                FileUtils.writeByteArrayToFile(file, IOUtils.toByteArray(ipstream));
                ipstream.close();*/
                outputStream = new FileOutputStream(file);
                sftpSession.read(remoteDirectory + fileName, outputStream);
                outputStream.close();
            }
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } finally {
            try {
                if (outputStream != null)
                    outputStream.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    public static boolean fileCheckingAtSFTPWithPattern(String fileName) {
        Pattern pattern = Pattern.compile(".*\\.pgp$");
        Matcher matcher = pattern.matcher(fileName);
        if (matcher.find()) {
            return true;
        }
        return false;
    }
}
Please suggest how to sort out this issue.
Thanks
The file type is irrelevant to Spring Integration - it looks like the server is closing the connection while reading the preamble - before the data is being fetched...
at com.jcraft.jsch.ChannelSftp.header(ChannelSftp.java:2908)
at com.jcraft.jsch.ChannelSftp.access$500(ChannelSftp.java:36)
at com.jcraft.jsch.ChannelSftp$2.read(ChannelSftp.java:1390)
at com.jcraft.jsch.ChannelSftp$2.read(ChannelSftp.java:1340)
The data itself is not read until later (line 1442 in ChannelSftp).
So it looks like a server-side problem.
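One way to confirm that it is server-side (hedged, since the server configuration isn't visible here) is to take Spring Integration out of the loop and fetch the same file with raw JSch; if the transfer still dies inside ChannelSftp, the problem is on the server (for example a policy applied to *.pgp files). A minimal sketch with hypothetical host, key, and path values:

import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;

public class RawJschCheck {

    public static void main(String[] args) throws Exception {
        JSch jsch = new JSch();
        jsch.addIdentity("/path/to/private_key");            // hypothetical key location
        Session session = jsch.getSession("user", "sftp.example.com", 22);
        session.setConfig("StrictHostKeyChecking", "no");    // test-only; verify host keys in real code
        session.connect();

        ChannelSftp channel = (ChannelSftp) session.openChannel("sftp");
        channel.connect();
        try {
            // Same file that fails through SftpSession.read(...)
            channel.get("/somefile.pgp", "C:/temp/somefile.pgp");
        } finally {
            channel.disconnect();
            session.disconnect();
        }
    }
}

If this raw transfer fails the same way, it confirms the server-side diagnosis above; if it succeeds, comparing the JSch and Spring Integration configurations would be the next step.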
