Using ffmpeg with NestJS on Ubuntu

I'm trying to use ffmpeg to generate a thumbnail for an input video.
I'm using NestJS and hosting it on an AWS EC2 instance (Ubuntu).
This is the generation-and-upload method I'm using. It works perfectly when I run it on localhost, but on the Ubuntu server it doesn't even generate the thumbnail and I don't know why.
getVideoThumbnail(sourcePath: string): string {
  const sourceName = path.parse(sourcePath).name + '_thumb.png';
  ffmpeg({ source: sourcePath })
    .on('end', () => {
      this.logger.log("Thumbnail generated and uploaded successfully");
      const uploadFile = () => {
        const filePath = './thumbnails/' + sourceName; // file to upload
        const params = {
          Bucket: process.env.BUCKET_NAME,
          Key: sourceName,
          Body: fs.createReadStream(filePath),
          ContentType: 'image/png',
          ACL: 'public-read',
        };
        this.s3.upload(params, function (s3Err, data) {
          if (s3Err) throw s3Err;
          if (data) {
            // Unlink file - remove it from local storage
          }
        });
      };
      uploadFile();
    })
    .on('error', function (err, stdout, stderr) {
      this.logger.error("Thumbnail generation failed:\n", err);
    })
    .thumbnail({
      filename: sourceName,
      count: 1,
    }, './thumbnails/');
  return `https://${process.env.BUCKET_NAME}.s3.amazonaws.com/${sourceName}`;
}
I installed the ffmpeg package on Ubuntu, but the generation keeps failing, and even the error log is empty, so I can't figure out what is missing.
Does anyone have an idea?
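For anyone hitting the same wall, here is a minimal sketch (assuming fluent-ffmpeg on top of the apt-installed binary; the path, method name and logger are placeholders, not the asker's code) of the checks worth adding on the server: create the output folder up front, point fluent-ffmpeg at the binary explicitly, and use an arrow function for the error handler. With a plain function, this is not the Nest service, so this.logger is undefined, the handler itself throws, and the error log stays empty.
// Hedged sketch of a method on the same NestJS service (this.logger assumed to exist).
import * as fs from 'fs';
import * as path from 'path';
import ffmpeg = require('fluent-ffmpeg');

generateThumbnail(sourcePath: string): void {
  const outDir = './thumbnails';
  fs.mkdirSync(outDir, { recursive: true }); // ffmpeg will not create the folder itself

  // If the binary is not on the Node process's PATH, set it explicitly.
  // '/usr/bin/ffmpeg' is where the Ubuntu package usually installs it; verify with `which ffmpeg`.
  ffmpeg.setFfmpegPath('/usr/bin/ffmpeg');

  const thumbName = path.parse(sourcePath).name + '_thumb.png';

  ffmpeg(sourcePath)
    .on('end', () => this.logger.log('Thumbnail generated'))
    .on('error', (err, stdout, stderr) => {
      // Arrow function keeps `this` bound to the service, so these lines actually run.
      this.logger.error(`Thumbnail generation failed: ${err.message}`);
      this.logger.error(stderr);
    })
    .thumbnail({ filename: thumbName, count: 1 }, outDir);
}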

Related

Wait for promise to finish before continuing

I know there have been many topics about promises and callbacks. I have tried many ways, but I still haven't managed to solve this.
What I want is to edit a file locally, save it, then upload it to S3; after that, another function is called to read the file and display its contents as a list.
Unfortunately I get an error because the read/display function is called while the file is still being written and saved to S3, as you can see in my [terminal][1].
The file itself is edited and uploaded to S3 correctly.
1- I tried promises, using then to execute one call after another:
static async edit_product(req: any, res: any) {
  console.log('edit_product param request', req.body)
  try {
    ExcelFile.EditFile(prod.product_code, prod.product_name)
      .then(rs => res.status(200).json({ success: true }))
      .catch((err) => {
        console.error(err);
      })
    console.log('test')
  } catch (err) { console.error(err) }
}
2- using await and then
static async edit_product(req: any, res: any) {
  console.log('edit_product param request', req.body)
  try {
    await ExcelFile.EditFile(prod.product_code, prod.product_name).then(rs => rs)
    console.log('test')
    res.status(200).json({ 'success product edit': prod.product_code })
  } catch (err) { console.error(err) }
}
3-to upload file to S3
static async UploadFileS3() {
  const file = config._path + config._foldername + config._filename
  var s3 = new aws.S3({ accessKeyId: config._ACCESS_KEY_ID, secretAccessKey: config._SECRET_ACCESS_KEY });
  var newversionId: string = ''
  const params = {
    Bucket: config._BUCKET_NAME,
    Key: config._filename // File name you want to save as in S3
  };
  return s3.putObject(params, function (err, data) {
    if (err) { console.log(err) }
    newversionId = data.VersionId!
    console.log("Successfully uploaded data ", newversionId);
  });
};
4-edit file
const stream = new Stream.PassThrough();
var dataFile = wb.xlsx.readFile(file).then(rs => {
  var sh = rs.getWorksheet(config._sheetname);
  for (let i = 2; i <= sh.rowCount; i++) {
    let currRow = sh.getRow(i);
    if (currRow.getCell(1).text == product_code) {
      currRow.getCell(2).value = product_name
      currRow.commit();
      break
    }
  }
  console.log('edit ')
  // save locally
  wb.xlsx.writeFile(file).then(rs => { console.log('edit filed successfully') });
  const param = {
    Key: config._filename,
    Bucket: config._BUCKET_NAME,
    Body: stream,
    ContentType: 'CONTENT_TYPE_EXCEL'
  }
  // save to s3
  wb.xlsx.write(stream).then(() => {
    s3.upload(param, function (err, data) {
      if (err) { console.log("Error", err); }
      console.log("Upload Success", data.ETag);
      ExcelFile.getAwsVersion().then(rs => ExcelFile.saveFileBucketVersion(rs))
    })
  })
})
return dataFile // return promise
How can I make it respect the order: edit the file first, then return res.status(200).json({'success'})?
[1]: https://i.stack.imgur.com/SsWhu.png
Your EditFile function seems not to be waiting for writeFile to finish. The for loop starts the writeFile call, but it is never awaited there. The possible solutions are:
Move the write call out of the for loop. It looks odd anyway that you are potentially saving the changes multiple times.
If the write really has to happen inside the loop, use a promise-compatible loop there (e.g. Bluebird.each).
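In either case, every step needs to be awaited before the route responds. Here is a hedged sketch of that shape, assuming exceljs behind wb, the same config object, and an AWS SDK v2 s3 client as in the question (writeBuffer is used instead of the PassThrough stream only to keep the sketch short):
// Hedged sketch, not the original code: wb, s3 and config are assumed to be set up as in the question.
static async EditFile(product_code: string, product_name: string): Promise<void> {
  const file = config._path + config._foldername + config._filename;

  const workbook = await wb.xlsx.readFile(file);        // 1. read
  const sheet = workbook.getWorksheet(config._sheetname);

  for (let i = 2; i <= sheet.rowCount; i++) {
    const row = sheet.getRow(i);
    if (row.getCell(1).text === product_code) {
      row.getCell(2).value = product_name;
      row.commit();
      break;
    }
  }

  await wb.xlsx.writeFile(file);                        // 2. save locally and wait for it
  const body = await wb.xlsx.writeBuffer();             // 3. serialize for S3
  await s3.upload({
    Bucket: config._BUCKET_NAME,
    Key: config._filename,
    Body: body,
  }).promise();                                         // 4. wait for the upload too
}

// The route handler then awaits the whole chain before answering:
static async edit_product(req: any, res: any) {
  try {
    await ExcelFile.EditFile(req.body.product_code, req.body.product_name);
    res.status(200).json({ 'success product edit': req.body.product_code });
  } catch (err) {
    console.error(err);
    res.status(500).json({ error: 'edit failed' });
  }
}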

How to take a photo and upload it to the server with nativescript-camera

I am new to NativeScript Vue development. I am trying to take a photo and send it to the server. My code works fine on Android, but when I run it on iOS, errors occur: the image doesn't even appear on the page and doesn't upload to the server.
import * as camera from "nativescript-camera";
import * as bghttp from "nativescript-background-http";
const firebase = require("nativescript-plugin-firebase");

var session = bghttp.session("image-upload");

takePicture() {
  camera.requestPermissions()
    .then(() => {
      camera.takePicture({ width: 300, height: 300, keepAspectRatio: true, saveToGallery: true })
        .then(imageAsset => {
          this.img = imageAsset.android;
        })
        .catch(e => {
          console.log('error:', e);
        });
    })
    .catch(e => {
      console.log('Error requesting permission');
    });
}
upload() {
  var file = this.img;
  var url = "https://bocorp.ru/assets/mobileNewOrder.php";
  var name = file.substr(file.lastIndexOf("/") + 1);

  // upload configuration
  var bghttp = require("nativescript-background-http");
  var session = bghttp.session("image-upload");
  var request = {
    url: url,
    method: "POST",
    headers: {
      "Content-Type": "application/octet-stream",
      "File-Name": name,
    },
    content: JSON.stringify({
      Title: title
    }),
    description: "Uploading " + name
  };
  var task = session.uploadFile(file, request);
}
I understand that different code should be used instead of this.img = imageAsset.android;, but I don't understand how to get the photo from the iPhone camera. I would be glad for any pointers.
We save our images to the device and then upload them later as a multipart upload. You might be able to skip the file-saving part, but it keeps us from having to read the entire image into memory for the upload later in our app flow (if you already have the image source for display, you could reuse it for the upload on the same page).
Hope you find this helpful.
const imageSource = require('tns-core-modules/image-source')
// ...
camera.takePicture(cameraOpts)
  .then(imageAsset => {
    return imageSource.fromAsset(imageAsset)
  })
  .then(imageSource => {
    let pathDest = '/path/on/device' // you define
    console.log(`Created image source with width=${imageSource.width} height=${imageSource.height} at ${pathDest}`)
    imageSource.saveToFile(pathDest, 'jpg', 50)
    return pathDest // save this to look up later
  })
Then when we need to upload
const mime = require('mime-types')
import * as bghttp from 'nativescript-background-http'
...
let session = bghttp.session('image-upload')
let request = {
  url: 'https://yourendpoint.com/here',
  method: 'POST',
  androidAutoDeleteAfterUpload: true,
  headers: {
    'Content-Type': 'application/octet-stream',
  }
}
// photoPath is known somehow. We use Vuex, but somehow it makes it to this page
let params = [
  { name: 'photo1', filename: photoPath, mimeType: mime.lookup(photoPath) }
]
return new Promise((resolve, reject) => {
  let task = session.multipartUpload(params, request)
  task.on('error', (e) => {
    reject(e)
  })
  task.on('complete', res => {
    resolve()
  })
})
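For completeness, a hedged sketch of how that Promise-returning fragment might be consumed, assuming it is wrapped in a helper called uploadPhoto(photoPath) (the helper name is an assumption, not part of the answer above):
// Hedged usage sketch on a Vue component that has photoPath and the uploadPhoto helper.
async submitPhoto() {
  try {
    await this.uploadPhoto(this.photoPath) // resolves on 'complete', rejects on 'error'
    console.log('upload finished')
  } catch (e) {
    console.log('upload failed', e)
  }
}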

How to upload image file to Amazon S3 using aws-sdk and Nativescript

I'm trying to upload a local PNG image to Amazon S3 using aws-sdk, but the file that arrives in the bucket is double the size in bytes and corrupted.
I'm able to list files, create buckets, among other operations.
Nativescript: 5.2.4
npm: 6.4.1
node: 10.15.3
I tried reading the file using the file.readText() method:
await file.readText().then(
  (content) => {
    this.binarySource = content;
  });
Init Amazon S3
AWS.config.update({
  region: "us-east-1",
  credentials: {
    accessKeyId: this.chaveAmazonS3,
    secretAccessKey: this.tokenAmazonS3
  }
});
const s3 = new AWS.S3({
  apiVersion: "2006-03-01"
});
this.amazonS3 = s3;
Get file and upload using putObject
let artefato = '/data/user/0/org.nativescript.aixlab/files/midias/41_000014_1558633914086.png';
let diretorioS3 = 'dirbucket/';
let filename = artefato.substring(artefato.lastIndexOf(separator) + 1);
filename = diretorioS3 + filename;

const file: File = File.fromPath(artefato);
this.binarySource = await file.readTextSync();
// await file.readText().then(
//   (content) => {
//     this.binarySource = content;
//   });

console.log("filename", filename, file.size, this.binarySource.length, artefato);

let params = {
  Bucket: this.bucketAmazonS3,
  Key: filename,
  Body: this.binarySource,
  ContentLength: this.binarySource.length,
  ContentType: 'image/png',
};

try {
  //let options = {partSize: 10 * 1024 * 1024, queueSize: 1};
  let response = await this.amazonS3.putObject(params, function (err, data) {
    console.log("PUT", err, data);
  });
} catch (error) {
  console.error("Erro putObjectBucket", error);
  return false;
}
Result in console
JS:
JS: PUT null {
JS: "ETag": "\"913b7bda195f7bebfdaff5e5b10138a0\""
JS: }
These are the values from the File module and the return of file.readTextSync:
file.size = 193139
this.binarySource.length = 182599
Object in the Amazon bucket: 342.6 KB
I expect the PNG file in the Amazon S3 bucket to be in binary format.
I cannot save it in base64 format because of other legacy applications.
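A doubled, corrupted object is what you get when a binary PNG is read through a text decoder and the decoded string is re-encoded on upload; the Body needs to be the raw bytes. Below is a hedged sketch of the idea in plain Node.js terms (the bucket name and path are placeholders; inside the NativeScript runtime you would need a binary read of the file rather than readText/readTextSync):
// Minimal sketch: upload raw bytes, not decoded text.
const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3({ apiVersion: '2006-03-01' });

// readFileSync with no encoding returns a Buffer, so the PNG bytes go up unchanged;
// reading the file as text decodes and re-encodes them and corrupts the image.
const body = fs.readFileSync('/path/to/image.png');

s3.putObject({
  Bucket: 'my-bucket',        // placeholder
  Key: 'dirbucket/image.png', // placeholder
  Body: body,
  ContentType: 'image/png',
  ContentLength: body.length,
}, (err, data) => console.log('PUT', err, data));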

node heroku error: connect ECONNREFUSED

I have this CRUD app on Heroku connected to mLab. I have two different routes that submit (from a form) data to mLab and an image to an S3 bucket. One route works completely fine. The other route, which is very similar to the first minus some text inputs, throws an error on Heroku whenever you submit; it actually works fine locally, though. If I remove all of the logic associated with the image upload, it works too. I'm not sure where I'm going wrong here. I can't figure out why it would work on one almost identical route but not the other.
This is my app.js code that connects mongoose and the router:
var router = express.Router();
var eventRoutes = require('./routes/event');
var adminRoutes = require('./routes/admin');
var venueRoutes = require('./routes/venue');

app.use('/', router);
app.use(eventRoutes);
app.use(adminRoutes);
app.use(venueRoutes);

mongoose.connect(process.env.HOTNIGHTDATABASEURL);

app.listen(process.env.PORT || 3000, function () {
  var date = new Date();
  console.log('Server Spinning Up at ' + date.toLocaleTimeString('en-US',
    { hour12: true, }));
});
This is the S3 config and the POST handler for the route that works:
// Multer-S3 and AWS S3 setup
var s3 = new aws.S3({
  accessKeyId: process.env.HOTNIGHTEVENTSBUCKETID,
  secretAccessKey: process.env.HOTNIGHTEVENTSBUCKETSECRET,
  region: 'us-east-2'
});
var upload = multer({
  storage: multerS3({
    s3: s3,
    acl: 'public-read',
    bucket: 'hot-night-events',
    metadata: function (req, file, cb) {
      cb(null, { fieldName: file.fieldname });
    },
    key: function (req, file, cb) {
      cb(null, Date.now().toString());
    }
  })
});

router.post('/events', upload.single('image'), function (req, res) {
  var event = new Event({
    // ... for brevity, left out data collected from the form
  });
  event.save(function (err, data) {
    if (err) {
      console.log(err);
    } else {
      res.redirect('/');
    }
  });
});
This is the other route (in a different file) that doesn't work:
// Multer-S3 and AWS S3 setup
var s3 = new aws.S3({
  accessKeyId: process.env.HOTNIGHTEVENTSBUCKETID,
  secretAccessKey: process.env.HOTNIGHTEVENTSBUCKETSECRET,
  region: 'us-east-2'
});
var upload = multer({
  storage: multerS3({
    s3: s3,
    acl: 'public-read',
    bucket: 'hot-night-events',
    metadata: function (req, file, cb) {
      cb(null, { fieldName: file.fieldname });
    },
    key: function (req, file, cb) {
      cb(null, Date.now().toString());
    }
  })
});

router.post('/venues', upload.single('venueImage'), function (req, res) {
  var venue = new Venue({
    // ... for brevity, left out data collected from the form
  });
  venue.save(function (err, data) {
    if (err) {
      console.log(err);
    } else {
      res.redirect('/');
    }
  });
});
Any help would be greatly appreciated.
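One cheap check, since the failure only shows up on Heroku: confirm that the config vars these routes depend on are actually set in the Heroku environment (heroku config lists them). A hedged diagnostic sketch, with the variable names taken from the code above:
// Log any missing environment variables at startup (diagnostic only).
['HOTNIGHTDATABASEURL', 'HOTNIGHTEVENTSBUCKETID', 'HOTNIGHTEVENTSBUCKETSECRET'].forEach(function (name) {
  if (!process.env[name]) {
    console.warn('Missing environment variable: ' + name);
  }
});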

What is the correct field for the --staging-location parameter of a Dataflow job in Node.js?

I wonder if I've hit a bug. I wrote a piece of Node.js code to trigger a "GCS Text to PubSub" Dataflow job. The function is triggered upon file upload into a GCS bucket.
But it never executes successfully: "textPayload: "problem running dataflow template, error was: { Error: Invalid JSON payload received. Unknown name "staging_location": Cannot find field." It is an issue with the syntax with which I specify the staging location for the job. I have tried "staginglocation", "stagingLocation", etc.; none of them have worked.
Here's my code. Thanks for your help.
var { google } = require('googleapis');

exports.moveDataFromGCStoPubSub = (event, callback) => {
  const file = event.data;
  const context = event.context;

  console.log(`Event ${context.eventId}`);
  console.log(` Event Type: ${context.eventType}`);
  console.log(` Bucket: ${file.bucket}`);
  console.log(` File: ${file.name}`);
  console.log(` Metageneration: ${file.metageneration}`);
  console.log(` Created: ${file.timeCreated}`);
  console.log(` Updated: ${file.updated}`);

  google.auth.getApplicationDefault(function (err, authClient, projectId) {
    if (err) {
      throw err;
    }
    console.log(projectId);
    const dataflow = google.dataflow({ version: 'v1b3', auth: authClient });
    console.log(`gs://${file.bucket}/${file.name}`);
    dataflow.projects.templates.create({
      projectId: projectId,
      resource: {
        parameters: {
          inputFile: `gs://${file.bucket}/${file.name}`,
          outputTopic: `projects/iot-fitness-198120/topics/MemberFitnessData`,
        },
        jobName: 'CStoPubSub',
        gcsPath: 'gs://dataflow-templates/latest/GCS_Text_to_Cloud_PubSub',
        stagingLocation: 'gs://fitnessanalytics-tmp/tmp'
      }
    }, function (err, response) {
      if (err) {
        console.error("problem running dataflow template, error was: ", err);
      }
      console.log("Dataflow template response: ", response);
      callback();
    });
  });

  callback();
};
I don't think this is actually possible.
Looking at the documentation for the Dataflow API itself, there's nothing like a staging location in the parameter section, and the library you're using is basically a wrapper for this API.
I'm a bit surprised it changes the name of the parameter though.
So I finally got this to work. It was indeed a syntax issue in the parameters section. The code below works like a charm:
var { google } = require('googleapis');

exports.moveDataFromGCStoPubSub = (event, callback) => {
  const file = event.data;
  const context = event.context;

  console.log(`Event ${context.eventId}`);
  console.log(` Event Type: ${context.eventType}`);
  console.log(` Bucket: ${file.bucket}`);
  console.log(` File: ${file.name}`);
  console.log(` Metageneration: ${file.metageneration}`);
  console.log(` Created: ${file.timeCreated}`);
  console.log(` Updated: ${file.updated}`);

  google.auth.getApplicationDefault(function (err, authClient, projectId) {
    if (err) {
      throw err;
    }
    console.log(projectId);
    const dataflow = google.dataflow({ version: 'v1b3', auth: authClient });
    console.log(`gs://${file.bucket}/${file.name}`);
    dataflow.projects.templates.create({
      gcsPath: 'gs://dataflow-templates/latest/GCS_Text_to_Cloud_PubSub',
      projectId: projectId,
      resource: {
        parameters: {
          inputFilePattern: `gs://${file.bucket}/${file.name}`,
          outputTopic: 'projects/iot-fitness-198120/topics/MemberFitnessData2'
        },
        environment: {
          tempLocation: 'gs://fitnessanalytics-tmp/tmp'
        },
        jobName: 'CStoPubSub',
        //gcsPath: 'gs://dataflow-templates/latest/GCS_Text_to_Cloud_PubSub',
      }
    }, function (err, response) {
      if (err) {
        console.error("problem running dataflow template, error was: ", err);
      }
      console.log("Dataflow template response: ", response);
      callback();
    });
  });

  callback();
};
