How to upload an image file to Amazon S3 using aws-sdk and NativeScript

I'm trying to upload a local PNG image to Amazon S3 using aws-sdk, but the file that arrives in the bucket is roughly double the size and corrupted.
I'm able to list files, create buckets, and perform other operations.
Nativescript: 5.2.4
npm: 6.4.1
node: 10.15.3
I tried reading the file using file.readText():
await file.readText().then(
    (content) => {
        this.binarySource = content;
    });
Initializing Amazon S3:
AWS.config.update({
    region: "us-east-1",
    credentials: {
        accessKeyId: this.chaveAmazonS3,
        secretAccessKey: this.tokenAmazonS3
    }
});
const s3 = new AWS.S3({
    apiVersion: "2006-03-01"
});
this.amazonS3 = s3;
Getting the file and uploading it with putObject:
let artefato = '/data/user/0/org.nativescript.aixlab/files/midias/41_000014_1558633914086.png';
let diretorioS3 = 'dirbucket/';
let filename = artefato.substring(artefato.lastIndexOf(separator) + 1);
filename = diretorioS3 + filename;

const file: File = File.fromPath(artefato);
this.binarySource = await file.readTextSync();
// await file.readText().then(
//     (content) => {
//         this.binarySource = content;
//     });
console.log("filename", filename, file.size, this.binarySource.length, artefato);

let params = {
    Bucket: this.bucketAmazonS3,
    Key: filename,
    Body: this.binarySource,
    ContentLength: this.binarySource.length,
    ContentType: 'image/png',
};

try {
    // let options = {partSize: 10 * 1024 * 1024, queueSize: 1};
    let response = await this.amazonS3.putObject(params, function (err, data) {
        console.log("PUT", err, data);
    });
} catch (error) {
    console.error("Erro putObjectBucket", error);
    return false;
}
Result in the console:
JS:
JS: PUT null {
JS: "ETag": "\"913b7bda195f7bebfdaff5e5b10138a0\""
JS: }
These are the values from the File module and the return of file.readTextSync():
file.size = 193139
this.binarySource.length = 182599
Object size in the Amazon bucket:
342.6 KB
I expect the PNG file in the Amazon S3 bucket to be in its original binary format.
I cannot save it in base64 format because of other legacy applications.
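The numbers above (193,139 bytes on disk versus 182,599 characters from readTextSync(), and a 342.6 KB object in the bucket) are consistent with the bytes being decoded as text and re-encoded as UTF-8 on upload. For comparison only, here is a minimal plain Node.js sketch of the same putObject call with a raw Buffer as Body instead of a string; the bucket name is a placeholder, and in NativeScript the equivalent would be reading the file's raw bytes rather than its text.

// Plain Node.js sketch (assumptions: aws-sdk v2, placeholder bucket name).
// The key point is that Body is a Buffer of the original bytes, not a decoded string.
const fs = require("fs");
const AWS = require("aws-sdk");

const s3 = new AWS.S3({ apiVersion: "2006-03-01" });

async function uploadPng(localPath, key) {
    const body = fs.readFileSync(localPath); // Buffer with the original bytes, no text decoding
    return s3.putObject({
        Bucket: "my-bucket", // placeholder
        Key: key,
        Body: body,
        ContentType: "image/png",
    }).promise();
}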

Related

ffmpeg using NestJS on ubuntu

I'm trying to use ffmpeg to generate a thumbnail for an input video.
I'm using NestJS and hosting it on an AWS EC2 instance (ubuntu).
This is the generation and upload method I'm using. It works perfectly well when I run it on localhost, but on the Ubuntu server it doesn't even generate the thumbnail and I don't know why.
getVideoThumbnail(sourcePath: string): string {
    const sourceName = path.parse(sourcePath).name + '_thumb.png';
    ffmpeg({ source: sourcePath })
        .on('end', () => {
            this.logger.log("Thumbnail generated and uploaded successfully");
            const uploadFile = () => {
                const filePath = './thumbnails/' + sourceName; // file to upload
                const params = {
                    Bucket: process.env.BUCKET_NAME,
                    Key: sourceName,
                    Body: fs.createReadStream(filePath),
                    ContentType: 'image/png',
                    ACL: 'public-read',
                };
                this.s3.upload(params, function (s3Err, data) {
                    if (s3Err) throw s3Err;
                    if (data) {
                        // Unlink file - remove it from local storage
                    }
                });
            };
            uploadFile();
        })
        .on('error', function (err, stdout, stderr) {
            this.logger.error("Thumbnail generation failed:\n", err);
        })
        .thumbnail({
            filename: sourceName,
            count: 1,
        }, './thumbnails/');
    return `https://${process.env.BUCKET_NAME}.s3.amazonaws.com/${sourceName}`;
}
}
I installed the ffmpeg package on Ubuntu, but the generation still fails, and even the error log is empty, so I can't figure out what is missing.
Do you guys have an idea?
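Two things worth checking, offered only as a sketch: fluent-ffmpeg needs to be able to find the ffmpeg binary on the server (it can be pointed at it explicitly), and the error handler in the snippet above is a plain function, so `this` inside it is not the NestJS service, which can hide the real failure. The paths and filenames below are assumptions; use whatever `which ffmpeg` reports on the box.

// Sketch, assuming fluent-ffmpeg; paths are assumptions - check `which ffmpeg` on the server.
const ffmpeg = require('fluent-ffmpeg');

ffmpeg.setFfmpegPath('/usr/bin/ffmpeg');   // wherever `which ffmpeg` points
ffmpeg.setFfprobePath('/usr/bin/ffprobe'); // wherever `which ffprobe` points

ffmpeg({ source: '/path/to/video.mp4' })   // placeholder input
    .on('error', (err, stdout, stderr) => {
        // Arrow function keeps `this` usable inside a class method;
        // stderr usually carries the actual ffmpeg message.
        console.error('Thumbnail generation failed:', err.message, stderr);
    })
    // the ./thumbnails/ directory must already exist and be writable
    .thumbnail({ filename: 'video_thumb.png', count: 1 }, './thumbnails/');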

lambda is giving Error: wbuffer_write: write failed unix error: No space left on device vips2png: unable to write to target target

I am using Lambda to generate a thumbnail for an image using sharp.
I have written the following code.
const AWS = require("aws-sdk");
const util = require("util");
const Sharp = require("sharp");
const fs = require("fs");

// get reference to S3 client
const s3 = new AWS.S3({
    accessKeyId: "AKIAU2UQ3GXUBWCGAIWB",
    secretAccessKey: "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"
});

async function resizeBMP(statusObj, params, destparams, width) {
}

async function resize(statusObj, params, destparams, width) {
    try {
        // Download the image from the S3 source bucket.
        let orignImage = await s3.getObject(params).promise();
        // Use the sharp module to resize the image and save in a buffer.
        const buffer = await Sharp(orignImage.Body, { limitInputPixels: false }).resize(width).toBuffer();
        console.log("passed");
        Object.assign(destparams, { Body: buffer });
        // Upload the thumbnail image to the destination bucket
        const putResult = await s3.putObject(destparams).promise();
        console.log("passed");
        // fs.unlinkSync(temp_path);
        Object.assign(statusObj, { status: 2, previewUrl: destparams.Key });
        console.log(JSON.stringify(putResult));
    } catch (error) {
        console.log("Image resize failed due to", error);
        throw new Error(error);
    }
}

exports.handler = async (event, context, callback) => {
    try {
        util.inspect(event, { depth: 5 });
        // Read options from the event parameter.
        const body = JSON.parse(event.Records[0].body);
        const srcBucket = body.bucket;
        const srcKey = body.key;
        const dstBucket = srcBucket;
        const filePath = body.filePath;
        const extension = body.extension;
        const dstKey = `${filePath}resized${extension}`;
        const statusObj = {
            status: 3,
        };
        const width = 200;
        const params = {
            Bucket: srcBucket,
            Key: srcKey,
        };
        const destparams = {
            Bucket: dstBucket,
            Key: dstKey,
            ContentType: "image",
        };
        // Resize the image and upload it.
        if (extension.toUpperCase() == '.BMP') {
            await resizeBMP(statusObj, params, destparams, width);
        } else {
            await resize(statusObj, params, destparams, width);
        }
        console.log(
            "Successfully resized " + srcBucket + "/" + srcKey +
            " and uploaded to " + dstBucket + "/" + dstKey
        );
    } catch (err) {
        console.log(err);
    }
};
But for images over 100 MB I get the following error:
Image resize failed due to [Error: wbuffer_write: write failed
unix error: No space left on device
vips2png: unable to write to target target
]
I have set the env variable VIPS_DISC_THRESHOLD=750mb,
and the memory is also set to around 2040 MB.
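One possible limit being hit here (a guess, not a confirmed diagnosis): when a decoded image exceeds VIPS_DISC_THRESHOLD, libvips spills it to temporary files on disk, and a Lambda function's /tmp volume defaults to 512 MB, which a 100 MB compressed image can easily exceed once decoded. Assuming a recent aws-sdk that exposes EphemeralStorage and a placeholder function name, the ephemeral storage and memory can be raised like this:

// Sketch only: raise /tmp (EphemeralStorage, in MB) and memory for the function.
// "thumbnail-fn" and the region are placeholders; requires an aws-sdk version with EphemeralStorage support.
const AWS = require("aws-sdk");
const lambda = new AWS.Lambda({ region: "us-east-1" });

lambda.updateFunctionConfiguration({
    FunctionName: "thumbnail-fn",
    MemorySize: 3008,
    EphemeralStorage: { Size: 2048 }, // default is 512 MB
}).promise().then(console.log).catch(console.error);

Alternatively, raising VIPS_DISC_THRESHOLD above the decoded image size keeps the decode entirely in memory, at the cost of more RAM.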

Image uploaded incorrectly to s3 using amplify Storage (React Native - Expo - Amplify)

I'm trying to upload an image to S3 with the Amplify Storage API.
I use the Expo ImagePicker to load the image from the phone into React Native, and it's displayed correctly.
Then I upload it to S3 with Amplify's Storage.put, and it reaches the correct folder in S3, with the proper filename I gave it and with the public access I provided (to the image and the bucket itself).
BUT if I try to open the photo URI in S3, it doesn't display; when I inspect the browser, it shows an error in the console.
If I paste the URI I get from Storage.get into the browser, I get an error as well.
My code is the following (I followed this tutorial: https://blog.expo.io/how-to-build-cloud-powered-mobile-apps-with-expo-aws-amplify-2fddc898f9a2):
_handleImagePicked = async (pickerResult) => {
    const s3path = 'fotos_cdcc/' + '186620' + '/'
    const imageName = '186620_4' + '.jpeg'
    const key = s3path + imageName
    const fileType = pickerResult.type;
    const access = { level: "public", contentType: 'image/jpg' };
    const imageData = await global.fetch(pickerResult.uri)
    const blobData = await imageData.blob()
    try {
        await Storage.put(
            key,
            blobData,
            access,
            fileType
        ).then(result => console.log(result))
    } catch (err) {
        console.log('error: ', err)
    }
}
The pickerResult has this form:
Object {
    "cancelled": false,
    "height": 750,
    "type": "image",
    "uri": "file:///var/mobile/Containers/Data/Application/021D6288-9E88-4080-8FBF-49F5195C2073/Library/Caches/ExponentExperienceData/%2540anonymous%252Fcaballos-app-dd2fa92e-4625-47e7-940e-1630e824347a/ImagePicker/D9EE1F24-D840-457F-B884-D208FFA56892.jpg",
    "width": 748,
}
I tried making the bucket and the photo public in s3, but the error persists.
Any ideas?
Thanks in advance!
Maybe this is not an issue with aws-amplify but rather with fetch and blob. Can you try this?
function urlToBlob(url) {
    return new Promise((resolve, reject) => {
        var xhr = new XMLHttpRequest();
        xhr.onerror = reject;
        xhr.onreadystatechange = () => {
            if (xhr.readyState === 4) {
                resolve(xhr.response);
            }
        };
        xhr.open('GET', url);
        xhr.responseType = 'blob'; // convert type
        xhr.send();
    })
}
Then instead of
const imageData = await global.fetch(pickerResult.uri)
const blobData = await imageData.blob()
Try
const blobData = await urlToBlob(pickerResult.uri)
Please find the full discussion at https://github.com/expo/firebase-storage-upload-example/issues/13
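As a side note, Storage.put takes its options as a single config object (key, data, config), so passing fileType as a fourth argument has no effect. A minimal sketch of the call with the blob and an explicit contentType, reusing the same key and the urlToBlob helper above:

// Sketch: the third argument is one config object; contentType should match the actual file.
const blobData = await urlToBlob(pickerResult.uri);
await Storage.put(key, blobData, {
    level: 'public',
    contentType: 'image/jpeg',
});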

How to parse binary data ("multipart/form-data") in KOA?

If I send a POST query with text options, everything is OK.
Query from the front end:
const request = require("request")
const options = {
    method: 'POST',
    url: 'http://localhost:4000/user',
    form: { data: '12345' }
}
On the server side (Koa) I can get the parsed data of the above query:
ctx.request.method: "POST"
ctx.request.originalUrl: "user"
ctx.request.body.data: "12345"
But if I send a POST query with binary data (file):
const fs = require("fs");
const request = require("request");
const options = {
    method: 'POST',
    url: 'http://localhost:4000/user',
    headers: {
        'content-type': 'multipart/form-data'
    },
    formData: {
        '': {
            value: fs.createReadStream("F:\\image.jpg"),
            options: {
                filename: 'F:\\image.jpg',
                contentType: null
            }
        }
    }
};
I don't know how I can access this binary data ("image.jpg") on the server side (Koa); does ctx.request have any field with this data?
You can use busboy for this. I wrote a gist for doing this, but I'm going to embed it here with some comments.
Let's create a helper for parsing out the file in a promise-friendly way.
// parse.js
import Busboy from 'busboy'

/**
 * Parses a single file from a Node request.
 *
 * @param {http.IncomingMessage} req
 * @return {Promise<{ file: Stream, filename: string }>}
 */
export default function parse (req) {
    return new Promise((resolve, reject) => {
        const busboy = new Busboy({
            headers: req.headers,
            limits: {
                files: 1 // allow only a single upload at a time.
            }
        })
        busboy.once('file', _onFile)
        busboy.once('error', _onError)
        req.pipe(busboy)

        function _cleanup () {
            busboy.removeListener('file', _onFile)
            busboy.removeListener('error', _onError)
        }

        function _onFile (fieldname, file, filename) {
            _cleanup()
            resolve({ file, filename })
        }

        function _onError (err) {
            _cleanup()
            reject(err)
        }
    })
}
Now we need to use it. Let's assume you want to upload to AWS S3.
import Koa from 'koa'
import parse from './parse'
import AWS from 'aws-sdk'

const app = new Koa()
const s3 = new AWS.S3({
    params: { Bucket: 'myBucket' }
})

// Assuming this is a route handler.
app.use(async (ctx) => {
    const { file, filename } = await parse(ctx.req)
    // `file` is a Stream. Pass this to S3, Azure Blob Storage or whatever you want.
    // `filename` is the file name specified by the client.
    const result = await s3.upload({
        Key: filename,
        Body: file
    }).promise()
    ctx.body = result
})
For brevity's sake, this is how you upload the file using axios on the client.
// `file` is a DOM File object.
function upload (file) {
    const data = new window.FormData()
    data.append('file', file, file.name)
    return axios.post('/upload', data)
}

node heroku error: connect ECONNREFUSED

I have this CRUD app on Heroku connected to mLab. I have two different routes that submit (from a form) data to mLab and an image to an S3 bucket. One route works completely fine. The other route, which is very similar to the first minus some text inputs, throws an error on Heroku whenever you submit. It actually works fine locally, though. If I remove all of the logic associated with the image upload, it works too. I'm not sure where I'm going wrong here. I can't figure out why it would work on one almost identical route but not the other.
This is my app.js code that connects Mongoose and the routers:
var router = express.Router();
var eventRoutes = require('./routes/event');
var adminRoutes = require('./routes/admin');
var venueRoutes = require('./routes/venue');

app.use('/', router);
app.use(eventRoutes);
app.use(adminRoutes);
app.use(venueRoutes);

mongoose.connect(process.env.HOTNIGHTDATABASEURL);

app.listen(process.env.PORT || 3000, function(){
    var date = new Date()
    console.log('Server Spinning Up at ' + date.toLocaleTimeString('en-US',
        {hour12: true,}));
});
This is the S3 config and the POST route for the route that works:
// Multer-S3 and AWS S3 setup
var s3 = new aws.S3({
    accessKeyId: process.env.HOTNIGHTEVENTSBUCKETID,
    secretAccessKey: process.env.HOTNIGHTEVENTSBUCKETSECRET,
    region: 'us-east-2'
});

var upload = multer({
    storage: multerS3({
        s3: s3,
        acl: 'public-read',
        bucket: 'hot-night-events',
        metadata: function (req, file, cb) {
            cb(null, {fieldName: file.fieldname});
        },
        key: function (req, file, cb) {
            cb(null, Date.now().toString())
        }
    })
});

router.post('/events', upload.single('image'), function(req, res){
    var event = new Event({
        // ... data collected from the form left out for brevity
    });
    event.save(function(err, data){
        if(err){
            console.log(err);
        } else {
            res.redirect('/');
        }
    });
});
This is the other route (in a different file) that doesn't work:
// Multer-S3 and AWS S3 setup
var s3 = new aws.S3({
    accessKeyId: process.env.HOTNIGHTEVENTSBUCKETID,
    secretAccessKey: process.env.HOTNIGHTEVENTSBUCKETSECRET,
    region: 'us-east-2'
});

var upload = multer({
    storage: multerS3({
        s3: s3,
        acl: 'public-read',
        bucket: 'hot-night-events',
        metadata: function (req, file, cb) {
            cb(null, {fieldName: file.fieldname});
        },
        key: function (req, file, cb) {
            cb(null, Date.now().toString())
        }
    })
});

router.post('/venues', upload.single('venueImage'), function(req, res){
    var venue = new Venue({
        // ... data collected from the form left out for brevity
    });
    venue.save(function(err, data){
        if(err){
            console.log(err);
        } else {
            res.redirect('/');
        }
    });
});
Any help would be greatly appreciated.
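Since it only fails on Heroku and only on the route that touches S3, one quick check (just a sketch, using the config var names from the code above) is whether those variables are actually set on the Heroku app; a value that exists in a local .env but was never set with `heroku config:set` is a common source of deploys that fail only in production.

// Sketch: log which expected config vars are missing at startup.
// Names taken from the code above; add any others the venue route relies on.
['HOTNIGHTDATABASEURL', 'HOTNIGHTEVENTSBUCKETID', 'HOTNIGHTEVENTSBUCKETSECRET']
    .forEach(function (name) {
        if (!process.env[name]) {
            console.warn('Missing config var: ' + name);
        }
    });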
