How to reduce image file size after updating it in NestJS

I have a problem with the image update routes in my NestJS app: users can upload images of any size, which is very memory consuming. I don't want to hard-limit the size of uploads, though; I'd rather have something like a "resize image pipe" that does that automatically for me. I saw something like that here, but the logic is kinda messy in my opinion and doesn't fit my code in any way, so if anyone has a tip for me, I'd appreciate it A LOT.
By the way, my code looks like this:
@Post('upload/banner/:postId')
@UseInterceptors(
  FileInterceptor('file', {
    storage: diskStorage({
      destination: './files/posts/banners',
    }),
    fileFilter: imageFileFilter,
  }),
)
async uploadAnonymousPostBanner(
  @UploadedFile() file: Express.Multer.File,
  @CurrentUserCookie('userId') userId: string,
  @Param('postId') postId: string,
  @Query('postCode') postCode: string,
) {
  console.log(file);
  const uploadImgFile = await this.postsService.uploadPostBannerImg(
    userId,
    postId,
    file.filename,
    postCode,
  );
  return uploadImgFile;
}
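
For what it's worth, here is a minimal sketch of the kind of "resize image pipe" described above, using the sharp library (an assumption on my part; the pipe name, the 1080px cap, and the 'resized-' prefix are all made up for illustration). With diskStorage, file.path points at the stored file, so a pipe can resize it after the interceptor runs:

import { Injectable, PipeTransform } from '@nestjs/common';
import * as path from 'path';
import * as sharp from 'sharp';

@Injectable()
export class ResizeImagePipe implements PipeTransform<Express.Multer.File> {
  async transform(file: Express.Multer.File): Promise<Express.Multer.File> {
    // Write a resized copy next to the original, capped at 1080px wide.
    // (sharp cannot overwrite its own input file, hence the new name.)
    const resizedName = 'resized-' + file.filename;
    const resizedPath = path.join(file.destination, resizedName);
    await sharp(file.path)
      .resize({ width: 1080, withoutEnlargement: true })
      .toFile(resizedPath);
    // Hand the route handler the resized copy instead of the original.
    return { ...file, path: resizedPath, filename: resizedName };
  }
}

The pipe could then be applied per parameter, e.g. @UploadedFile(ResizeImagePipe) file: Express.Multer.File, which keeps the controller logic above unchanged.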

Related

Parsing formdata from React using Serverless and API Gateway

I'm trying to upload a file and send data from a React frontend to a S3 bucket using an API Gateway/ Lambda function setup using the Serverless framework and I've been struggling with it for the last couple of days.
From the frontend I am using axios, creating a FormData object to send a POST request to the API like the following:
let formData = new FormData();
formData.append('imageFile', selectedImage);
formData.append('itemId', clubIdRef.current.value);
formData.append('itemDescription', itemDescRef.current.value);
axios.post(
  baesURL + "/item/create", formData,
  { headers: {
    'Content-Type': 'multipart/form-data'
  }}
).then((response) => {
  console.log("response" + response)
  console.log("response.data" + response.data)
})
Appending string attributes to the formdata feels off but the only way I could find to send data and an image at the same time was like the above.
Then to receive this data in the backend I've been using lambda-multipart-parser like the following:
const createItem = async (event) => {
  const result = await multipartParser.parse(event);
  const imageFile = result.imageFile;
  const itemDescription = result.itemDescription;
where the result console logs as:
{
  files: [],
  imageFile: '[object File]',
  itemId: '12',
  itemDescription: "Description"
}
I can then store the imageFile successfully in S3 and generate the URL. Next, I create an Item object with the S3 URL, id, and description to store in DynamoDB. Everything works fine, but when I open the S3 URL, the file is corrupted and just opens as a grey box instead of the actual image I uploaded.
This is how I am uploading the file using the S3 SDK:
const AWS = require("aws-sdk");
const s3 = new AWS.S3();
const params = {
  Bucket: BUCKET_NAME,
  Key: `images/${directoryPath}/${id}.png`,
  Body: imageFile,
  ContentType: "image/png",
  ACL: "public-read"
}
uploadResult = await s3.putObject(params).promise();
These are the things I've tried, but I still don't have any success uploading the correct image to my S3 bucket:
- Looking at and changing the BinaryMediaType of the API Gateway, but I can't find the setting under the API...
- Using aws-lambda-multipart-parser, but I still wasn't able to add the multipart/form-data binary media type and parse the full form data correctly
I know that I could instead get a pre-signed URL using the aws-sdk in React, upload the image directly from the frontend to S3 with it, and then make a POST request to my API Gateway and simply parse event.body without a multipart form parser, but I want to avoid sending multiple requests and handle everything in the backend.
Any suggestions would be highly appreciated!
It is quite hard to understand where the problem is with the given context: we have no idea which image format you are uploading, or how you store the image to S3. My answer will try to cover this missing information, as it relates to a common mistake with S3 uploads.
S3 files are stored and returned with the given ContentType.
You can check your S3 file's ContentType in the AWS console:
Console > S3 > Select object (image) > Metadata > ContentType
I will suppose that the image format is PNG and that the image data is correct and can be posted to S3 as-is (judging from result).
S3Service.ts
import AWS, { S3 } from "aws-sdk";
import { PutObjectOutput, PutObjectRequest } from "aws-sdk/clients/s3";

AWS.config.update({ region: 'eu-west-3' });
const s3: S3 = new AWS.S3();

export class S3Service {
  public static async putImage(key: string, data: string, contentType: string): Promise<PutObjectOutput> {
    const s3Params: PutObjectRequest = {
      Bucket: process.env.S3_BUCKET,
      Key: key,
      Body: data,
      ContentType: contentType // <== I draw your attention here
    }
    return await s3.putObject(s3Params).promise()
  }
}
index.ts
import { S3Service } from "service/aws/s3-service";
await S3Service.putImage(result.itemId + ".png", result.imageFile, "image/png");
A common mistake, which I assume might be the cause of your problem, is forgetting the content type, which results in an incorrect download format.

How to work with both mocked graphql API and an externally served GraphQL endpoint

I'm hoping to hear some inputs from the experts here.
I'm currently working on a NextJS project, and my GraphQL layer runs on mocked data that is set up in another repo.
Now that the backend is being built by other devs, we're slowly moving away from mocked data to the real thing.
They've given me an endpoint to the backend where I'm supposed to query data.
So the goal is to make the mocked GraphQL data and the real backend data work side by side, at least until we fully remove the mocked data.
So far I've seen 2 ways of doing it, but I was looking for a way where I could still use hooks like useQuery and useMutation.
Way #1
require('isomorphic-fetch');
fetch('https://graphql.api....', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: `
    query {
      popularBrands(storefront: "bax-shop.nl", limit: 10, page: 1) {
        totalCount
        items {
          id
          logo
          name
          image
        }
      }
    }`
  }),
})
  .then(res => res.json())
  .then(res => console.log(res.data));
Way #2
const client = new ApolloClient({
  uri: 'https://api.spacex.land/graphql/',
  cache: new InMemoryCache()
});

async function test() {
  const { data: Data } = await client.query({
    query: gql`
      query GetLaunches {
        launchesPast(limit: 10) {
          id
          mission_name
          launch_date_local
          launch_site {
            site_name_long
          }
          links {
            article_link
            video_link
            mission_patch
          }
          rocket {
            rocket_name
          }
        }
      }
    `
  });
  console.log(Data)
}
Pseudo code:
Query the real data first.
Check if it's empty; if it is, query the mock data.
If both are empty, then it's really an empty result set.
You can write a wrapper around the hooks you use that does this for you, so you don't have to repeat yourself in every component (see the sketch below). When you're ready to remove the mocked data, you just remove the check for the second data set.
This is a common technique when switching to a new database.
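
As a rough illustration, a wrapper along these lines could do it while keeping the usual useQuery call shape (a minimal sketch assuming Apollo Client; useQueryWithMockFallback, mockClient, and the default emptiness check are all made-up names, not part of any API):

import { OperationVariables, QueryHookOptions, useQuery } from '@apollo/client';
import { DocumentNode } from 'graphql';
import { mockClient } from './mock-client'; // hypothetical ApolloClient pointed at the mocked API

// Query the real API first; if its result settles empty, re-run the query against the mock client.
export function useQueryWithMockFallback<TData, TVars extends OperationVariables>(
  query: DocumentNode,
  options?: QueryHookOptions<TData, TVars>,
  isEmpty: (data?: TData) => boolean = (data) => data == null
) {
  const real = useQuery<TData, TVars>(query, options);
  const mock = useQuery<TData, TVars>(query, {
    ...options,
    client: mockClient,
    // Only hit the mock API once the real query has settled and come back empty.
    skip: real.loading || !isEmpty(real.data),
  });
  return !real.loading && isEmpty(real.data) ? mock : real;
}

Removing the mocked data later is then just a matter of swapping call sites back to plain useQuery.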

Uploading image data from CKEditor 5 to Firebase Storage creates a malformed image

I'm uploading form data containing an image using an XMLHttpRequest with CKEditor 5. I'm receiving the Buffer correctly, and I've retrieved the content type successfully:
const data = new FormData()
data.append('upload', fileObject)
myXhrHttpRequest.send(data)
I retrieve the image data by accessing the body of the HTTP POST request (a Buffer), and then I upload it to Firebase Storage:
app.post('/save-image', async ({ query: { imageId, contentType }, body: data }, res) => {
  storage
    .file(`images/${imageId}`)
    .save(data, {
      public: true,
      metadata: {
        contentType,
        metadata: {
          firebaseStorageDownloadTokens: token
        }
      }
    })
  // send back results, etc...
})
Unfortunately, the image is corrupt. Any ideas about what I might be doing wrong? This is an example of one of the uploaded images:
https://firebasestorage.googleapis.com/v0/b/memorize-ai.appspot.com/o/deck-assets%2Fsample_deck_id%2FHq7vZ8oEgFqlLdSfWYBl?alt=media&token=cfa99560-5618-48e1-8772-4ffd9d45f789
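
One plausible cause (an assumption on my part, not something the question confirms) is that body here is the raw multipart/form-data payload, boundaries included, rather than the image bytes alone, so saving it as-is produces an invalid image. A minimal sketch of extracting the file part with busboy before saving (illustrative only; busboy 1.x API):

const busboy = require('busboy');

app.post('/save-image', ({ headers, query: { imageId, contentType }, body }, res) => {
  const bb = busboy({ headers });
  const chunks = [];
  bb.on('file', (_name, file) => {
    // Collect only the file part's bytes, without the multipart boundaries.
    file.on('data', (chunk) => chunks.push(chunk));
  });
  bb.on('close', async () => {
    await storage.file(`images/${imageId}`).save(Buffer.concat(chunks), {
      public: true,
      metadata: { contentType },
    });
    res.sendStatus(200);
  });
  // body is the already-buffered raw request payload; feed it to the parser.
  bb.end(body);
});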

Nativescript send camera capture to server

In my NativeScript application I would like to capture an image using the camera module and then send the bytes directly to the server via an HTTP call.
Here is my code (incomplete for brevity):
var cameraModule = require("camera");
var http = require("http");
...
cameraModule.takePicture().then(function (img) {
  // how to extract the actual bytes from img???
  http.request({
    url: some_url,
    method: "POST",
    headers: { "Content-Type": "application/octet-stream" },
    content: ???
  });
});
Is there a way to do that?
I was looking at nativescript-background-http, and it seems to fit my requirements exactly, but the example only shows a file being loaded from a path. I did not have any luck making this work on iOS.
Any help is greatly appreciated.
Thank you.
A couple of things:
"img" is actually an image source.
At this point the built-in HTTP module does not support direct binary transfers, so we need to convert the image to something that can be sent over the wire. Base64 is a text representation that can carry binary data, and it is a common encoding/decoding method.
Since we already have an image source, we can use the handy toBase64String method, which gives us the Base64 data.
So here is basically how I would do it (tested under Android):
var cameraModule = require('camera');
var http = require('http');
var some_url = "http://somesite";

// img is an image source
cameraModule.takePicture().then(function (img) {
  // You can use "jpeg" or "png". Apparently "png" doesn't work in some
  // cases on iOS.
  var imageData = img.toBase64String("jpeg");
  http.request({
    url: some_url,
    method: "POST",
    headers: { "Content-Type": "application/base64" },
    content: imageData
  }).then(function () {
    console.log("Woo Hoo, we sent our image up to the server!");
  }).catch(function (e) {
    console.log("Uh oh, something went wrong", e);
  });
});
There are a few ways to do this. If your backend can take a base64 string, you can use the image-source class and manipulate the data. I'm on my phone or I'd mock up a sample. It really depends what you expect on the server, to be honest, but most options are possible with NativeScript using the image-source and ui/image components.
http://docs.nativescript.org/api-reference/classes/_image_source_.imagesource.html#tobase64string
Going from memory here, but try this when you get the (img) back:
var data = img.toBase64String("png");
That should give you a base64 string of the image.
Just found this awesome sample from another question: https://stackoverflow.com/a/37815237/1893557
This will work to send the file after you save it locally, and it uses the background-http plugin.

Getting binary file content instead of UTF-escaped using file.get

I'd like to know if it's possible to get exact binary data using a callback with the drive.files.get method of the NodeJS Google API. I know that the object returned by calling this API endpoint is a normal request object that can, for example, be piped like this:
drive.files.get({
  fileId: fileId,
  alt: 'media'
}).pipe(fs.createWriteStream('test'));
However, I would like to know if it's possible to get the binary data from within the callback using this syntax:
drive.files.get({
  fileId: fileId,
  alt: 'media'
}, function (err, data) {
  // Here I have binary data exposed
});
As far as I know, it should be possible to get that kind of data from request at creation time by passing {encoding: null} in the request options object, like this:
var requestSettings = {
  method: 'GET',
  url: url,
  encoding: null // This is the important part
};
request(requestSettings, function (err, data) { /* ... */ });
However, it seems that Google obscures this configuration object in its library.
So my question is: is it possible to do this without interfering with or hacking the library?
OK, so I found an answer that could be useful for others :)
The aforementioned drive.files.get method returns a Stream object, so it can be handled directly with the proper event handlers. The buffer chunks can then be concatenated and passed back through a callback like this:
var stream = drive.files.get({
  fileId: fileId,
  alt: 'media'
});

// Build buffer
var chunks = [];
stream.on('data', (chunk) => {
  chunks.push(chunk);
});
stream.on('end', () => {
  return cb(null, Buffer.concat(chunks));
});
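
For completeness, the same snippet wrapped in a reusable function (getFileBuffer is a made-up name; the drive call is exactly the one from the answer above):

function getFileBuffer(fileId, cb) {
  var stream = drive.files.get({
    fileId: fileId,
    alt: 'media'
  });
  var chunks = [];
  stream.on('data', (chunk) => chunks.push(chunk));
  stream.on('end', () => cb(null, Buffer.concat(chunks)));
  // Keep a failed download from leaving the callback hanging.
  stream.on('error', (err) => cb(err));
}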
