Store and access data files in lambda function - aws-lambda

How do I store a custom file (a .json file in my case) in a Lambda layer so that I can access it like my npm modules? I use Node.js as the runtime. My current layer folder structure looks like this:
My modules stored in node_modules are visible and can be accessed as:
const { Client } = require('pg');
const knex = require('knex');
But when I try to list available files, I don't see my service-account-file.json file:
fs.readdir('./', (err, files) => {
  files.forEach(file => {
    console.log('#file');
    console.log(file); // Returns only index.js
  });
});

Here's what you need to do to use stored data files in a Lambda function. Create a folder named data_files and put your service-account-file.json file in it.
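For reference, a minimal sketch of the resulting layout (assuming the handler lives in index.js at the function root):

.
├── index.js
├── data_files/
│   └── service-account-file.json
└── node_modules/

Then load the file with a helper like this: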
const path = require('path');
const fs = require("fs");

const loadDataFile = (file) => {
  // create the filename including the path
  const fileName = `./data_files/${file}`;
  // set up the variable
  let resolved;
  // if we have a Lambda environment then resolve against the task root
  if (process.env.LAMBDA_TASK_ROOT) {
    // this creates an absolute path
    resolved = path.resolve(process.env.LAMBDA_TASK_ROOT, fileName);
  } else {
    // otherwise resolve against the local path
    resolved = path.resolve(__dirname, fileName);
  }
  try {
    // get the text data as a string
    let data = fs.readFileSync(resolved, 'utf8');
    // convert to a JS object
    let parsedData = JSON.parse(data);
    // TO DO - work the data if required
    // then return the data
    return parsedData;
  } catch (error) {
    // if there's an error in the data fetch then we need a report
    console.log(error.message);
  }
};
const data = loadDataFile('service-account-file.json');
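Note that this snippet assumes the JSON is bundled with the function package itself; LAMBDA_TASK_ROOT points at the function code, not at layers. If the file really ships in a layer, as the question asks, layer contents are extracted under /opt in the execution environment, so a variant of the path resolution inside loadDataFile could probe both locations (a sketch, assuming the layer also uses a data_files folder):

// sketch: check the function package first, then the layer mount point
const candidates = [
  path.resolve(process.env.LAMBDA_TASK_ROOT || __dirname, 'data_files', file),
  path.resolve('/opt', 'data_files', file), // layers are extracted under /opt
];
const resolved = candidates.find((p) => fs.existsSync(p));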

Related

Wait for promise to finish before continuing

I know there have been many topics about promises and callbacks. I tried many ways, but I still haven't managed to solve this.
What I want is to edit a file locally, save it, then upload it to S3. Then another function is called that reads from the file and displays it as a list.
Unfortunately I am getting an error, because the function that reads and displays the file is called before the edit has finished writing and saving to S3, as you can see in my [terminal][1].
The file itself is properly edited and uploaded to S3.
1- I tried promises, using then to execute one step after another:
static async edit_product(req: any, res: any) {
  console.log('edit_product param request', req.body)
  try {
    ExcelFile.EditFile(prod.product_code, prod.product_name)
      .then(rs => res.status(200).json({ 'success product edit': prod.product_code }))
      .catch((err) => {
        console.error(err);
      });
    console.log('test')
  } catch (err) {
    console.error(err);
  }
}
2- Using await together with then:
static async edit_product(req: any, res: any) {
  console.log('edit_product param request', req.body)
  try {
    await ExcelFile.EditFile(prod.product_code, prod.product_name).then(rs => rs)
    console.log('test')
    res.status(200).json({ 'success product edit': prod.product_code })
  } catch (err) {
    console.error(err);
  }
}
3- The function that uploads the file to S3:
static async UploadFileS3() {
  const file = config._path + config._foldername + config._filename
  var s3 = new aws.S3({ accessKeyId: config._ACCESS_KEY_ID, secretAccessKey: config._SECRET_ACCESS_KEY });
  var newversionId: string = ''
  const params = {
    Bucket: config._BUCKET_NAME,
    Key: config._filename // File name you want to save as in S3
  };
  return s3.putObject(params, function (err, data) {
    if (err) { console.log(err) }
    newversionId = data.VersionId!
    console.log("Successfully uploaded data ", newversionId);
  });
};
4- The edit function:
const stream = new Stream.PassThrough();
var dataFile = wb.xlsx.readFile(file).then(rs => {
  var sh = rs.getWorksheet(config._sheetname);
  for (let i = 2; i <= sh.rowCount; i++) {
    let currRow = sh.getRow(i);
    if (currRow.getCell(1).text == product_code) {
      currRow.getCell(2).value = product_name
      currRow.commit();
      break
    }
  }
  console.log('edit ')
  // save locally
  wb.xlsx.writeFile(file).then(rs => { console.log('edit filed successfully') });
  const param = {
    Key: config._filename,
    Bucket: config._BUCKET_NAME,
    Body: stream,
    ContentType: 'CONTENT_TYPE_EXCEL'
  }
  // save to s3
  wb.xlsx.write(stream).then(() => {
    s3.upload(param, function (err, data) {
      if (err) { console.log("Error", err); }
      console.log("Upload Success", data.ETag);
      ExcelFile.getAwsVersion().then(rs => ExcelFile.saveFileBucketVersion(rs))
    })
  })
})
return dataFile // return the promise
How can I make it respect the order: edit first, then return res.status(200).json({'success'})?
[1]: https://i.stack.imgur.com/SsWhu.png
Your EditFile function seems not to wait for the end of writeFile. The writeFile call is started inside the readFile handler, but it is never awaited there. The possible solutions are:
Move the write function out of the for loop. It looks weird anyway that you are potentially saving the changes multiple times.
If the write should be in the loop, then use a promise-compatible loop there (e.g. Bluebird.each). A sketch of the first option follows.
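A minimal sketch of an EditFile that awaits each step before starting the next (an assumption-laden rewrite reusing the wb, s3, and config names from the question; it swaps the PassThrough stream for exceljs's writeBuffer and uses aws-sdk v2's .promise()):

static async EditFile(product_code, product_name) {
  const file = config._path + config._foldername + config._filename;
  const rs = await wb.xlsx.readFile(file);
  const sh = rs.getWorksheet(config._sheetname);
  // edit the matching row
  for (let i = 2; i <= sh.rowCount; i++) {
    const currRow = sh.getRow(i);
    if (currRow.getCell(1).text == product_code) {
      currRow.getCell(2).value = product_name;
      currRow.commit();
      break;
    }
  }
  // wait for the local save to finish before touching S3
  await wb.xlsx.writeFile(file);
  // serialize the workbook to a buffer and wait for the upload too
  const body = await wb.xlsx.writeBuffer();
  await s3.upload({
    Key: config._filename,
    Bucket: config._BUCKET_NAME,
    Body: body,
  }).promise();
}

With that, await ExcelFile.EditFile(...) in edit_product only resolves after both the local write and the S3 upload have completed, so res.status(200) fires at the right time.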

Attempt to invoke interface method java.lang.String com.facebook.react.bridge.ReadableMap etc. on a null object reference

I'm getting this error after uploading an image to Firebase Storage. I am using:
"react-native": "0.55.4",
"react-native-fetch-blob": "^0.10.8",
"react-native-image-picker": "^0.26.10",
"firebase": "^5.0.4",
This is my code for uploading the image:
// Prepare Blob support
const Blob = RNFetchBlob.polyfill.Blob;
const fs = RNFetchBlob.fs;
window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
window.Blob = Blob;
uploadImage = (uri, imageName, mime = "image/jpg") => {
  return new Promise((resolve, reject) => {
    const uploadUri =
      Platform.OS === "ios" ? uri.replace("file://", "") : uri;
    let uploadBlob = null;
    const imageRef = db
      .storage()
      .ref("images/")
      .child(imageName);
    fs.readFile(uploadUri, "base64")
      .then(data => {
        return Blob.build(data, { type: `${mime};BASE64` });
      })
      .then(blob => {
        uploadBlob = blob;
        alert("blob is " + JSON.stringify(blob));
        return imageRef.put(blob, { contentType: mime });
      })
      .then(() => {
        uploadBlob.close();
        return imageRef.getDownloadURL();
      })
      .then(url => {
        resolve(url);
      })
      .catch(error => {
        reject(error);
      });
  });
};
Attempt to invoke interface method 'java.lang.String com.facebook.react.bridge.ReadableMap.getString(java.lang.String)' on a null object reference
readAsText FileReaderModule.java:43
invoke Method.java
invoke JavaMethodWrapper.java:372
invoke JavaModuleWrapper.java:160
run NativeRunnable.java
handleCallback Handler.java:790
dispatchMessage Handler.java:99
dispatchMessage MessageQueueThreadHandler.java:29
loop Looper.java:164
run MessageQueueThreadImpl.java:192
run Thread.java:764
I faced the same error. The solution is to do a 'fetch replacement', as the official documentation explains:
Since we are not implementing a FileReader polyfill, you might run into this error in some cases.
If you're facing this problem with the Blob polyfill in Debug Mode, replacing window.fetch with the fetch replacement should fix it.
And:
If you have existing code that uses whatwg-fetch, now you don't have
to change existing code after 0.9.0, just use fetch replacement. The
difference between Official fetch and fetch replacement is that,
official fetch uses WHATWG-fetch js library which wraps XMLHttpRequest
polyfill under the hood, and our implementation simply wraps
RNFetchBlob.fetch.
Basically, you just have to add this to your code, just below your window.Blob = Blob; line:
const Fetch = RNFetchBlob.polyfill.Fetch
// replace built-in fetch
window.fetch = new Fetch({
  // enable this option so that the response data conversion is handled automatically
  auto: true,
  // when receiving response data, the module will match its Content-Type header
  // with the strings in this array. If it contains any one of them,
  // the response body will be considered binary data and will be stored
  // in the file system instead of in memory.
  // By default, it only stores response data to the file system when Content-Type
  // contains the string `application/octet`.
  binaryContentTypes: [
    'image/',
    'video/',
    'audio/',
    'foo/',
  ]
}).build()
Documentation:
https://github.com/wkh237/react-native-fetch-blob/wiki/Trouble-Shooting#failed-to-execute-readastext-on-filereader
I am running into the same problem. It has something to do with these prep statements:
const Blob = RNFetchBlob.polyfill.Blob;
const fs = RNFetchBlob.fs;
window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
window.Blob = Blob;
The error does not occur if I comment them out.
I have solved this by removing the package entirely, because the error still appeared even with the fetch replacement. I think it is triggered by
window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
so I used the old-fashioned approach:
const uriToBlob = (uri) => {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.onload = function () {
      resolve(xhr.response);
    };
    xhr.onerror = function () {
      reject(new Error('uriToBlob failed'));
    };
    xhr.responseType = 'blob';
    xhr.open('GET', uri, true);
    xhr.send(null);
  });
}
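A usage sketch to tie this back to the original upload flow (assuming the same imageRef as in the question; the Firebase Web SDK's put() accepts a native Blob directly):

uriToBlob(uri)
  .then(blob => imageRef.put(blob, { contentType: 'image/jpg' }))
  .then(() => imageRef.getDownloadURL())
  .then(url => console.log('download URL:', url))
  .catch(err => console.error(err));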

Parse Cloud Code Save Issue

I wrote some backend code for a Parse.com mobile app a couple of years ago and have just been asked to add a feature. However, I found that after a small tweak the code no longer succeeded. So I rolled back to the working copy, downloaded it, and deployed that again, and it wouldn't work either! I wonder if this is due to a change in the Parse software?
The code is failing at the save method, as all the logs are fine until then. The log for the error case shows 'No message provided'. If I don't use the message attribute it just shows '{}', so I presume it's empty. I have put the promise resolution in the error case to stop the job timing out while I debug. One thing I have never understood is why I have to make two Seed objects and piggy-back off one to save correctly. If I did a.save(null, ...) it wouldn't work.
Any help would be fantastic. Thanks!
PS: Apologies for the indenting below - it is correct in my file.
function flush() {
  // Clear the previous records from the class.
  var Seed = Parse.Object.extend("Seeds");
  var _ = require("underscore");
  var arr = [];
  var query = new Parse.Query(Seed);
  return query.find().then(function(oldSeeds) {
    _.each(oldSeeds, function(oldSeed) {
      arr.push(oldSeed.destroy());
    });
    return Parse.Promise.when(arr);
  });
}
Parse.Cloud.job("fetchjson", function(request, status) {
  var url = 'someurl';
  flush().then(function() {
    Parse.Cloud.httpRequest({url: url}).then(function(httpResponse) {
      var Seed = Parse.Object.extend("Seeds");
      var jsonobj = JSON.parse(httpResponse.text);
      var _ = require("underscore");
      var results = [];
      // do NOT iterate arrays with `for... in` loops
      _.each(jsonobj.seeds, function(s) {
        var p = new Parse.Promise();
        results.push(p); // Needs to be done here or when() will execute immediately with no promises.
        var seed = new Seed();
        var a = new Seed(s);
        var image_url = a.get("image")
        // Get the JSON.
        Parse.Cloud.httpRequest({url: image_url}).then(function(response) {
          console.log("Fetching image at URL: " + image_url);
          // Create a new image object and save, passing the ref through the promise.
          var file = new Parse.File('thumb.jpg', { base64: response.buffer.toString('base64', 0, response.buffer.length) });
          return file.save();
        }).then(function(thumb) {
          console.log("Attaching thumb to object");
          // Set the image ref as an object attribute.
          a.set("imageFile", thumb);
          console.log("Parsing views into viewsint");
          // Save the decimal string as an int into another attribute.
          a.set("viewsInt", parseInt(a.get("views")));
          console.log("Parsing description into descriptionarray");
          // Save the string as an array into another attribute.
          var dar = new Array(1);
          //dar[0] = a.get("description")
          a.set("descriptionarray", [a.get("description")]);
        }, function(error) {
          console.log("Error occurred :(");
        }).then(function() {
          console.log("Saving object");
          // Save the object and resolve the promise so we can stop.
          seed.save(a, {
            success: function(successData) {
              console.log(successData);
              p.resolve(successData);
            },
            error: function(error) {
              console.log(error.message);
              p.resolve(error);
            }
          });
        });
      });
      // .when waits for all promises to be resolved. This is async baby!
      Parse.Promise.when(results).then(function(data) {
        console.log("All objects saved");
        status.success("Updated Succesfully");
      });
    }, function(error) {
      // Oh noes :'(
      console.error('Request failed with response code ' + httpResponse.status);
      status.error("Update Failed");
    });
  });
});
I changed your code a bit and put some comments to explain:
// DEFINE THESE AT THE TOP. NO NEED TO REPEAT.
var _ = require("underscore");
var Seed = Parse.Object.extend("Seeds");

function flush() {
  // Clear the previous records from the class.
  var arr = [];
  var query = new Parse.Query(Seed);
  return query.find().then(function(oldSeeds) {
    _.each(oldSeeds, function(oldSeed) {
      arr.push(oldSeed.destroy());
    });
    return Parse.Promise.when(arr);
  });
}
Parse.Cloud.job("fetchjson", function(request, status) {
  var url = 'someurl';
  flush().then(function() {
    Parse.Cloud.httpRequest({url: url}).then(function(httpResponse) {
      var jsonobj = JSON.parse(httpResponse.text);
      var results = [];
      _.each(jsonobj.seeds, function(s) {
        // ONE SEED OBJECT WITH THE INITIAL SET OF DATA FROM THE JSON
        var seed = new Seed(s);
        var image_url = seed.get("image")
        // A SERIAL PROMISE FOR EACH SEED
        var promise = Parse.Cloud.httpRequest({url: image_url}).then(function(response) {
          console.log("Fetching image at URL: " + image_url);
          // Create a new image object and save, passing the ref through the promise.
          var file = new Parse.File('thumb.jpg', { base64: response.buffer.toString('base64', 0, response.buffer.length) });
          return file.save();
        }).then(function(thumb) {
          // SETTING MORE PROPERTIES
          // Set the image ref as an object attribute.
          console.log("Attaching thumb to object");
          seed.set("imageFile", thumb);
          // Save the decimal string as an int into another attribute.
          console.log("Parsing views into viewsint");
          seed.set("viewsInt", parseInt(seed.get("views")));
          // Save the string as an array into another attribute.
          console.log("Parsing description into descriptionarray");
          seed.set("descriptionarray", [seed.get("description")]);
          // SAVING THE OBJECT
          console.log("Saving object");
          return seed.save();
        });
        // PUSH THIS PROMISE TO THE ARRAY TO PERFORM IN PARALLEL
        results.push(promise);
      });
      Parse.Promise.when(results).then(function(data) {
        console.log("All objects saved");
        status.success("Updated Succesfully");
      });
    }, function(error) {
      console.error('Request failed with response code ' + httpResponse.status);
      status.error("Update Failed");
    });
  });
});
Thanks knshn. I had refactored the code a lot since that version (including several of the changes you made), but I had posted the version identical to the one that was working fine before. Your changes let me see the right error. For some reason the simple single-object implementation didn't work for me originally, hence the nasty workaround. It works now, though.
I have now found the culprit - the Seed class had an attribute called 'id'. With the old version this worked fine, but when I deployed that code now it gave error 101: 'object not found for update'. This must be because the new Parse code is mixing that up with the internal objectId and getting confused that the id is different from what it expects. I wonder how that could still work with the rollback, though. Perhaps that version was pinned to the older Parse code.
My fix was to use a different name for the id - 'seed_id'.
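A hedged sketch of that rename, applied where the JSON rows are turned into Seed objects (assuming each element of jsonobj.seeds carries the conflicting id key):

_.each(jsonobj.seeds, function(s) {
  // move the incoming id into a non-reserved attribute name
  // so it cannot collide with Parse's internal objectId
  s.seed_id = s.id;
  delete s.id;
  var seed = new Seed(s);
  // ... continue as in the answer above
});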

Dropzone.js and full path for each file

I'm trying to recreate the folder structure of dropped files/folders with Dropzone.js.
Is there a way to have access to the full path of each file, so that the directory structure can be re-created on the php side?
This is a simple way to additionally send the full path of each file that arrived inside a dropped folder:
dropzone.on("sending", function(file, xhr, data) {
// if file is actually a folder
if(file.fullPath){
data.append("fullPath", file.fullPath);
}
});
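On the server, that extra fullPath field is what lets you rebuild the directory tree. The question asks about PHP, but as a minimal sketch of the idea, here is the receiving side in Node (an assumption-laden stand-in using Express and multer; a PHP handler would do the same with $_POST['fullPath'] and mkdir):

const express = require('express');
const multer = require('multer');
const path = require('path');
const fs = require('fs');

const app = express();
const upload = multer({ dest: 'tmp/' });

app.post('/upload', upload.single('file'), (req, res) => {
  // fall back to the bare filename when no folder was dropped
  const relative = req.body.fullPath || req.file.originalname;
  // NOTE: sanitize 'relative' in real code to prevent path traversal
  const target = path.join(__dirname, 'uploads', relative);
  // recreate the directory tree, then move the file into place
  fs.mkdirSync(path.dirname(target), { recursive: true });
  fs.renameSync(req.file.path, target);
  res.sendStatus(200);
});

app.listen(3000);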
You can use a FileReader for it. I did it in Angular 5:
onFilesAdded(files: File[]) {
  console.log(files);
  this.dropzone.reset();
  files.forEach(file => {
    const reader = new FileReader();
    let content;
    reader.onload = (e: ProgressEvent) => {
      content = (e.target as FileReader).result;
      console.log(content);
    };
    reader.readAsDataURL(file); // start the read; onload fires when it completes
    this.previewImage(file).then(response => {
      const imageItem = new FileItem(
        response,
        content,
      );
      let imagesComponentItems = this.store.value.imagesComponentItems;
      imagesComponentItems = [...imagesComponentItems, imageItem];
      this.store.set("imagesComponentItems", imagesComponentItems);
      this.hasImages();
    });
  });
}

parse.com: Cache results as a json file

For my app I have a long-running task that generates stats. I would like the stats to be available to my users as a JSON file. Is there a way, using Parse, to store the results of the task as JSON that my users can retrieve? It would be great if there were some server-side caching that I could take advantage of as well.
UPDATE
The following code takes my array of dictionaries and saves it as a file. I am now trying to store the file as part of a table called Cache. However, the result is:
Input: {}
Result: undefined
The code snippet looks like this:
var Buffer = require('buffer').Buffer;
...
var json = <array of dictionaries>;
var jsonStr = JSON.stringify(json);
var buf = new Buffer(jsonStr, 'utf8');
var json64 = buf.toString('base64');
var parseFile = new Parse.File('result', { "base64": json64 }, 'application/json');
parseFile.save().then(function(parseFile) {
  // The file has been saved to Parse.
  var cache = new Cache();
  var attr = {};
  attr['file'] = parseFile;
  console.log(parseFile.url());
  cache.save(attr, {
    success: function(result) {
      status.success("Cached");
    },
    error: function(result, error) {
      status.error("Error: " + error.message);
    }
  });
}, function(error) {
  console.log(error);
  status.error(error.message);
});
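Once the Cache row saves, clients can read the stats back. A hedged sketch of the retrieval side (assuming the Cache class and file column from the snippet above):

// fetch the most recent cache entry and pull the JSON out of the file
var query = new Parse.Query('Cache');
query.descending('createdAt');
query.first().then(function(cache) {
  var file = cache.get('file');
  // file.url() is a plain HTTPS endpoint that clients can also GET directly
  return Parse.Cloud.httpRequest({ url: file.url() });
}).then(function(response) {
  var stats = JSON.parse(response.text);
  console.log(stats);
});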
