How to upload large files to a 99designs/gqlgen backend - go

I want to upload large files to my backend which I created with gqlgen.
To do that, I want to use multipart requests so that I keep only one endpoint.
(Client implementation: e.g. apollo-upload-client.)
There is an example, plus documentation, on how to upload files with gqlgen using multipart requests. However, this only works for small files. When I try to upload large files (only 500 MB in this case), I get a connection reset error. (I get the same error with both my own implementation and the example gqlgen provides.)
Does anyone know a solution for this?

Related

How to return an image file (Byte[]) as a compressed file with Spring API?

From the company's file server, I get each file as a byte[] using the image path and an authentication key.
(This server is not accessible to me directly.)
What I want to do is: when the user downloads the selected files, compress those files and serve them as a single compressed file.
Since the company's file server does not have a download API for multiple files, I think I need to call the API once per file, with a for statement in my service API.
In other words, it seems that I need to collect a List<byte[]> and compress that list.
Is there something wrong with my method?
And can I pass the result as JSON after compression? (I confirmed that the image files are passed as JSON.)
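For the compression step itself, here is a minimal sketch using java.util.zip, assuming the files have already been fetched as byte arrays (the class and entry names are placeholders):

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipService {

    // Compresses the already-fetched files into a single in-memory zip archive.
    public static byte[] zipFiles(List<String> names, List<byte[]> files) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ZipOutputStream zos = new ZipOutputStream(bos)) {
            for (int i = 0; i < files.size(); i++) {
                zos.putNextEntry(new ZipEntry(names.get(i))); // entry name, e.g. "image_1.jpg" (placeholder)
                zos.write(files.get(i));
                zos.closeEntry();
            }
        }
        return bos.toByteArray();
    }
}

Note that returning the archive inside JSON would force Base64 encoding and roughly a 33% size increase; sending the zip bytes directly with Content-Type: application/zip is usually the better choice.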

Uploading unique files at concurrent load using JMeter

We have a use case where we need to call an API that uploads its respective category of unique files.
For every API call we need to use a unique file name; a file once used in an API call should not be used again.
For example:
CarAPI will be called by uploading a file name from a list of files (CarAP_1.xml to CarAP_1000.xml).
Once CarAP_1.xml has been used in an API call, it should not be used again in the next call.
BikeAPI will be called by uploading a file name from a list of files (BikeAP_1.xml to BikeAP_1000.xml).
Again, a file once used in an API call should not be used again.
Any thoughts or inputs on how we can achieve this using JMeter?
You can put these filenames
either into a CSV file and use the HTTP Simple Table Server; its READ endpoint has a KEEP=FALSE mode, so once an entry has been used it is removed from memory, and you avoid duplicate requests
or into Redis and use the Redis Data Set Config, which also provides the possibility of removing an entry from the list once it has been used.
Both plugins can be installed using the JMeter Plugins Manager.

Uploading images to Spring Boot and S3 all In-Memory

I have an Angular webapp that uses a Spring Boot REST service as its backing web service.
I am adding a "Profiles" feature for users, and as part of this I want to stand up an endpoint that allows users to upload profile images for themselves and immediately upload those files to S3 (where I will host all the images from).
Looking at several Spring Boot/file upload tutorials:
http://www.mkyong.com/spring-boot/spring-boot-file-upload-example/
I update avatar image and display it but the avatar does not change in Spring Boot, why?
Many others
It seems that the standard way of handling such file uploads is to expose a controller endpoint that accepts MultipartFiles, like so:
@RestController
@RequestMapping("/v1/profiles")
public class ProfileController {

    @PostMapping("/photo")
    public ResponseEntity<?> uploadProfilePhoto(@RequestParam("mpf") MultipartFile mpf) {
        // ...
    }
}
Looking at all this code, I can't tell whether the MultipartFile instance is held in memory or whether Spring writes it to some location (perhaps under /tmp?) on disk.
Looking at the AWS S3 Java SDK tutorial, it seems the standard way to upload a disk-based File is like so:
File file = new File(uploadFileName);
s3client.putObject(new PutObjectRequest(bucketName, keyName, file));
So it looks like I must have a File on disk in order to upload to S3.
I'm wondering if there is a way to keep everything in memory, or whether this is a bad idea and I should stick to disks/File instances!
Is there a way to keep the entire profile image (MultipartFile) in-memory inside the controller method?
Is there a way to feed (maybe via serialization?!) a MultipartFile instance to S3's PutObjectRequest?
Or is this all a terrible idea (if so, why?!)?
Is there a way to keep the entire profile image (MultipartFile) in-memory inside the controller method?
No, there is NO way to keep an image File in memory, because a File object in Java represents a path in the file system.
Is there a way to feed (maybe via serialization?!) a MultipartFile instance to S3's PutObjectRequest?
No. From S3's API documentation, there is no way for S3 to deserialize the image file for you during or after the upload.
Or is this all a terrible idea (if so, why?!)?
It depends on your specific case, but it is generally not preferred.
If there are not many users uploading images at the same time, your memory is probably enough to handle it.
Otherwise, you can easily run into out-of-memory problems.
If you insist on doing so, the S3 API can upload from an InputStream (if I remember correctly), and you can convert your MultipartFile to an InputStream.
This SO thread talks about uploading to S3 with an InputStream.
You can also take a look at File.createTempFile() to create a temp file.
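A minimal sketch of that InputStream route with the v1 AWS Java SDK, assuming an existing AmazonS3 client (the bucket name and key scheme are placeholders):

import java.io.IOException;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;
import org.springframework.web.multipart.MultipartFile;

public class S3UploadService {

    // Streams the multipart body to S3 without writing it to disk first.
    public void upload(AmazonS3 s3client, MultipartFile mpf) throws IOException {
        ObjectMetadata metadata = new ObjectMetadata();
        // Supplying the length up front lets the SDK stream the body instead of
        // buffering the whole file in memory to compute it.
        metadata.setContentLength(mpf.getSize());
        metadata.setContentType(mpf.getContentType());
        s3client.putObject(new PutObjectRequest(
                "my-bucket",                             // placeholder bucket
                "profiles/" + mpf.getOriginalFilename(), // placeholder key scheme
                mpf.getInputStream(), metadata));
    }
}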
I have been looking at the same thing. Basically you want a user to be able to upload a photo album and have those photos served from S3, and probably have them secured so only that user can upload/delete/etc.
I believe the simpler answer is for Spring Boot to get a pre-signed URL from S3: https://docs.aws.amazon.com/AmazonS3/latest/dev/PresignedUrlUploadObjectJavaSDK.html
This basically gives you a token defining the bucket and object key ("/bobs_profile/smiling_bob.jpg") and a time limit for the image to be uploaded.
Give that to your Angular app (or Ionic app) to upload the image to that location.
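A minimal sketch of generating such a URL with the v1 AWS Java SDK (the bucket name, key, and 15-minute expiry are placeholder choices):

import java.net.URL;
import java.util.Date;
import com.amazonaws.HttpMethod;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;

public class PresignService {

    // Returns a URL the browser can PUT the image to directly; no bytes pass through the backend.
    public URL presignUpload(AmazonS3 s3client) {
        Date expiration = new Date(System.currentTimeMillis() + 15 * 60 * 1000); // 15-minute window
        GeneratePresignedUrlRequest request =
                new GeneratePresignedUrlRequest("my-bucket", "bobs_profile/smiling_bob.jpg") // placeholders
                        .withMethod(HttpMethod.PUT)
                        .withExpiration(expiration);
        return s3client.generatePresignedUrl(request);
    }
}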
That should do it, but someone let me know if I'm wrong.
The only issue that I see is if Bob wants to upload "bobs_nude_photo.jpg" and only wants people logged in via Spring Security to be able to see it... well, I'm sure there is an S3 solution for that??

What's the fastest way to upload an image to a webserver?

I am building an application that will allow users to upload images. Mostly, it will be used from mobile browsers with slow internet connections. I was wondering if there are best practices for this. Is doing some encryption, then the transfer, then decoding on the server a trick worth trying? Or something else?
You would want something with resumable uploads, preferably. Since your connections are slow, you'd need something that can be resumed where you left off. A module I've come across over the years is the nginx upload module:
http://www.grid.net.ru/nginx/upload.en.html
According to the site:
The module parses request body storing all files being uploaded to a directory specified by upload_store directive. The files are then being stripped from body and altered request is then passed to a location specified by upload_pass directive, thus allowing arbitrary handling of uploaded files. Each of file fields are being replaced by a set of fields specified by upload_set_form_field directive. The content of each uploaded file then could be read from a file specified by $upload_tmp_path variable or the file could be simply moved to ultimate destination. Removal of output files is controlled by directive upload_cleanup. If a request has a method other than POST, the module returns error 405 (Method not allowed). Requests with such methods could be processed in alternative location via error_page directive.

Ajax file upload in node.js

I want to implement an Ajax file upload that uses XHR to send the file data.
On the client I'm using this:
http://valums.com/ajax-upload/
How will I accept this data on Node and save the file to the server with Node.js? Which module do I need to use?
I've created an uploader with a progress bar using the formidable module; it's really easy to use and provides a lot of useful callbacks.
Have a look here:
https://github.com/felixge/node-formidable (scroll down for the docs)
http://debuggable.com/posts/parsing-file-uploads-at-500-mb-s-with-node-js:4c03862e-351c-4faa-bb67-4365cbdd56cb
Due to the lack of an example file in Valums' ajax-uploader, I've just created one.
It catches the XHR upload if possible, alternatively falling back to the old form-based method.
It is built to work with Valums' ajax-uploader.
https://github.com/aldipower/file-uploader/blob/master/server/nodejs.js
Maybe Valums will accept the pull request some time and the sample file will get merged into the standard repository.
