Unable to PGP Encrypt/Decrypt Excel file using OpenPGP.js - openpgp

I tried encrypting and decrypting Excel files the same way I do for CSV/TXT files, but it does not work.
Encryption
var rawMessage = fs.readFileSync(path)
await openpgp.initWorker({ path: 'openpgp.worker.js' }); // set the relative web worker path
const publicKeyArmored = readFile(publicKeyPath);
const { data: encrypted } = await openpgp.encrypt({
    message: openpgp.message.fromBinary(new Uint8Array(rawMessage)), // input as Message object
    publicKeys: (await openpgp.key.readArmored(publicKeyArmored)).keys, // for encryption
    armor: true,
    compression: openpgp.enums.compression.zip
});
Decryption
var rawMessage = fs.readFileSync(path)
const privateKeyArmored = readFile(privateKeyPath);
const { keys: [privateKey] } = await openpgp.key.readArmored(privateKeyArmored);
await privateKey.decrypt(passphrase);
const { data: decrypted } = await openpgp.decrypt({
    message: await openpgp.message.readArmored(encryptedData), // parse armored message
    privateKeys: [privateKey] // for decryption
});
Looking for a quick resolution.

Since the file is an Excel workbook, its contents are binary (zip-compressed), so a text output will not help. By default, OpenPGP.js returns the decrypted output as text. The solution is to request the decrypted output as binary and then save that to the file.
const { data: decrypted } = await openpgp.decrypt({
    message: await openpgp.message.readArmored(encryptedData), // parse armored message
    privateKeys: [privateKey], // for decryption
    format: 'binary'
});


Wait for promise to finish before continuing

I know there have been many topics about promises and callbacks. I have tried many ways, but I still cannot solve this.
What I want is to edit a file locally, save it, then upload it to S3; then another function is called to read from the file and display it as a list.
Unfortunately I am getting an error because the display/read function is called before the file has finished being written and saved to S3, as you can see in my [terminal][1].
The file itself is properly edited and uploaded to S3.
1- I tried promises, using then to execute one step after another:
static async edit_product(req: any, res: any) {
    console.log('edit_product param request', req.body)
    try {
        ExcelFile.EditFile(prod.product_code, prod.product_name)
            .then(rs => res.status(200).json({ success: prod.product_code }))
            .catch((err) => {
                console.error(err);
            })
        console.log('test')
    } catch (err) {
        console.error(err)
    }
}
2- Using await and then:
static async edit_product(req: any, res: any) {
    console.log('edit_product param request', req.body)
    try {
        await ExcelFile.EditFile(prod.product_code, prod.product_name)
        console.log('test')
        res.status(200).json({ 'success product edit': prod.product_code })
    } catch (err) {
        console.error(err)
    }
}
3- To upload the file to S3:
static async UploadFileS3() {
    const file = config._path + config._foldername + config._filename
    var s3 = new aws.S3({ accessKeyId: config._ACCESS_KEY_ID, secretAccessKey: config._SECRET_ACCESS_KEY });
    var newversionId: string = ''
    const params = {
        Bucket: config._BUCKET_NAME,
        Key: config._filename // File name you want to save as in S3
    };
    return s3.putObject(params, function (err, data) {
        if (err) { console.log(err) }
        newversionId = data.VersionId!
        console.log("Successfully uploaded data ", newversionId);
    });
};
4- Edit file:
const stream = new Stream.PassThrough();
var dataFile = wb.xlsx.readFile(file).then(rs => {
    var sh = rs.getWorksheet(config._sheetname);
    for (let i = 2; i <= sh.rowCount; i++) {
        let currRow = sh.getRow(i);
        if (currRow.getCell(1).text == product_code) {
            currRow.getCell(2).value = product_name
            currRow.commit();
            break
        }
    }
    console.log('edit ')
    // save locally
    wb.xlsx.writeFile(file).then(rs => { console.log('edit filed successfully') });
    const param = {
        Key: config._filename,
        Bucket: config._BUCKET_NAME,
        Body: stream,
        ContentType: 'CONTENT_TYPE_EXCEL'
    }
    // save to s3
    wb.xlsx.write(stream).then(() => {
        s3.upload(param, function (err, data) {
            if (err) { console.log("Error", err); }
            console.log("Upload Success", data.ETag);
            ExcelFile.getAwsVersion().then(rs => ExcelFile.saveFileBucketVersion(rs))
        })
    })
})
return dataFile // return promise
How can I make it respect the order: edit the file first, then return res.status(200).json({'success'})?
[1]: https://i.stack.imgur.com/SsWhu.png
Your EditFile function does not seem to wait for the end of writeFile. The loop starts the writeFile function, but it is never awaited there. The possible solutions are:
Move the write call out of the for loop; it looks odd anyway that you are potentially saving the changes multiple times.
If the write should stay in the loop, use a promise-compatible loop there (e.g. Bluebird.each).
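The ordering fix can be sketched with stand-in async functions (editLocally and saveToS3 are hypothetical names, not the asker's API): every step is awaited, so the response is only sent after both the edit and the upload have completed, regardless of how long each takes.

```javascript
// Stand-ins for the real ExcelJS write and S3 upload; each resolves after a
// delay and records when it finished. Note the "slow" step comes first.
const editLocally = (log) => new Promise(r => setTimeout(() => { log.push('edit'); r(); }, 30));
const saveToS3 = (log) => new Promise(r => setTimeout(() => { log.push('upload'); r(); }, 10));

async function editProduct() {
    const log = [];
    await editLocally(log);   // finish writing the workbook first
    await saveToS3(log);      // then upload the saved file
    log.push('respond');      // only now send res.status(200).json(...)
    return log;
}

editProduct().then(result => console.log(result.join(' -> ')));
// prints "edit -> upload -> respond" even though saveToS3 is faster
```

The same shape applies to the real handler: return (or await) the promise from writeFile before starting the S3 upload, and await the upload before responding.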

How to upload image file to Amazon S3 using aws-sdk and Nativescript

I'm trying to upload a local PNG image to Amazon S3 using aws-sdk, but the file that arrives in the bucket is nearly double the size and corrupted.
I'm able to list files, create Buckets among others.
Nativescript: 5.2.4
npm: 6.4.1
node: 10.15.3
I tried reading the file using the method file.readText():
await file.readText().then(
    (content) => {
        this.binarySource = content;
    });
Init Amazon S3
AWS.config.update({
    region: "us-east-1",
    credentials: {
        accessKeyId: this.chaveAmazonS3,
        secretAccessKey: this.tokenAmazonS3
    }
});
const s3 = new AWS.S3({
    apiVersion: "2006-03-01"
});
this.amazonS3 = s3;
Get file and upload using putObject
let artefato = '/data/user/0/org.nativescript.aixlab/files/midias/41_000014_1558633914086.png';
let diretorioS3 = 'dirbucket/';
let filename = artefato.substring(artefato.lastIndexOf(separator) + 1);
filename = diretorioS3 + filename;
const file: File = File.fromPath(artefato);
this.binarySource = await file.readTextSync();
// await file.readText().then(
//     (content) => {
//         this.binarySource = content;
//     });
console.log("filename", filename, file.size, this.binarySource.length, artefato);
let params = {
    Bucket: this.bucketAmazonS3,
    Key: filename,
    Body: this.binarySource,
    ContentLength: this.binarySource.length,
    ContentType: 'image/png',
};
try {
    // let options = { partSize: 10 * 1024 * 1024, queueSize: 1 };
    let response = await this.amazonS3.putObject(params, function (err, data) {
        console.log("PUT", err, data);
    });
} catch (error) {
    console.error("Erro putObjectBucket", error);
    return false;
}
Result in console
JS:
JS: PUT null {
JS: "ETag": "\"913b7bda195f7bebfdaff5e5b10138a0\""
JS: }
These are the values from the File module and the return of file.readTextSync:
file.size = 193139
this.binarySource.length = 182599
Object in Amazon Bucket
342.6 KB
I expect the PNG file to arrive in the Amazon S3 bucket in binary form.
I cannot save it in base64 format because of other legacy applications.

How to parse binary data ("multipart/form-data") in KOA?

If I send a POST query with plain text fields, everything is OK.
Query from the front end:
const request = require("request")
const options = {
    method: 'POST',
    url: 'http://localhost:4000/user',
    form: { data: '12345' }
}
On the server side (KOA) I can get the parsed data of the above query:
ctx.request.method: "POST"
ctx.request.originalUrl: "user"
ctx.request.body.data: "12345"
But if I send a POST query with binary data (file):
const fs = require("fs");
const request = require("request");
const options = {
    method: 'POST',
    url: 'http://localhost:4000/user',
    headers: {
        'content-type': 'multipart/form-data'
    },
    formData: {
        '': {
            value: fs.createReadStream("F:\\image.jpg"),
            options: {
                filename: 'F:\\image.jpg',
                contentType: null
            }
        }
    }
};
I don't know how I can access this binary data ("image.jpg") on the server side (KOA); does ctx.request have any field with this data?
You can use busboy for this. I wrote a gist for doing this, but I'm going to embed it here with some comments.
Let's create a helper for parsing out the file in a promise-friendly way.
// parse.js
import Busboy from 'busboy'
/**
 * Parses a single file from a Node request.
 *
 * @param {http.IncomingMessage} req
 * @return {Promise<{ file: Stream, filename: string }>}
 */
export default function parse (req) {
  return new Promise((resolve, reject) => {
    const busboy = new Busboy({
      headers: req.headers,
      limits: {
        files: 1 // allow only a single upload at a time.
      }
    })
    busboy.once('file', _onFile)
    busboy.once('error', _onError)
    req.pipe(busboy)

    function _cleanup () {
      busboy.removeListener('file', _onFile)
      busboy.removeListener('error', _onError)
    }

    function _onFile (fieldname, file, filename) {
      _cleanup()
      resolve({ file, filename })
    }

    function _onError (err) {
      _cleanup()
      reject(err)
    }
  })
}
Now we need to use it. Let's assume you want to upload to AWS S3.
import Koa from 'koa'
import parse from './busboy'
import AWS from 'aws-sdk'

const app = new Koa()
const s3 = new AWS.S3({
  params: { Bucket: 'myBucket' }
})

// Assuming this is a route handler.
app.use(async (ctx) => {
  const { file, filename } = await parse(ctx.req)
  // `file` is a Stream. Pass this to S3, Azure Blob Storage or whatever you want.
  // `filename` is the file name specified by the client.
  const result = await s3.upload({
    Key: filename,
    Body: file
  }).promise()
  ctx.body = result
})
For brevity's sake, this is how you upload the file using axios on the client.
// `file` is a DOM File object.
function upload (file) {
  const data = new window.FormData()
  data.append('file', file, file.name)
  return axios.post('/upload', data)
}

Attempt to invoke interface method java.lang.String com.facebook.react.bridge.ReadableMap etc on a null object reference

I'm getting this error after uploading an image to Firebase Storage. I am using:
"react-native": "0.55.4",
"react-native-fetch-blob": "^0.10.8",
"react-native-image-picker": "^0.26.10",
"firebase": "^5.0.4",
This is my code for uploading the image.
// Prepare Blob support
const Blob = RNFetchBlob.polyfill.Blob;
const fs = RNFetchBlob.fs;
window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
window.Blob = Blob;
uploadImage = (uri, imageName, mime = "image/jpg") => {
  return new Promise((resolve, reject) => {
    const uploadUri =
      Platform.OS === "ios" ? uri.replace("file://", "") : uri;
    let uploadBlob = null;
    const imageRef = db
      .storage()
      .ref("images/")
      .child(imageName);
    fs.readFile(uploadUri, "base64")
      .then(data => {
        return Blob.build(data, { type: `${mime};BASE64` });
      })
      .then(blob => {
        uploadBlob = blob;
        alert("blob is " + JSON.stringify(blob));
        return imageRef.put(blob, { contentType: mime });
      })
      .then(() => {
        uploadBlob.close();
        return imageRef.getDownloadURL();
      })
      .then(url => {
        resolve(url);
      })
      .catch(error => {
        reject(error);
      });
  });
};
Attempt to invoke interface method 'java.lang.String com.facebook.react.bridge.ReadableMap.getString(java.lang.String)' on a null object reference
    readAsText        FileReaderModule.java:43
    invoke            Method.java
    invoke            JavaMethodWrapper.java:372
    invoke            JavaModuleWrapper.java:160
    run               NativeRunnable.java
    handleCallback    Handler.java:790
    dispatchMessage   Handler.java:99
    dispatchMessage   MessageQueueThreadHandler.java:29
    loop              Looper.java:164
    run               MessageQueueThreadImpl.java:192
    run               Thread.java:764
I faced the same error. The solution is to do a 'fetch replacement', as the official documentation explains:
Since we are not implementing a FileReader polyfill, you might run into this error in some cases.
If you're facing this problem with the Blob polyfill in Debug Mode, replacing window.fetch with the fetch replacement should fix it.
And:
If you have existing code that uses whatwg-fetch, you don't have to change it after 0.9.0; just use the fetch replacement. The difference between the official fetch and the fetch replacement is that the official fetch uses the WHATWG-fetch js library, which wraps an XMLHttpRequest polyfill under the hood, while our implementation simply wraps RNFetchBlob.fetch.
Basically, you just have to add this to your code, just below your window.Blob = Blob; line:
const Fetch = RNFetchBlob.polyfill.Fetch
// replace built-in fetch
window.fetch = new Fetch({
    // enable this option so that the response data conversion is handled automatically
    auto: true,
    // when receiving response data, the module will match its Content-Type header
    // with the strings in this array. If it contains any one of them, the response
    // body will be considered binary data and will be stored in the file system
    // instead of in memory.
    // By default, it only stores response data to the file system when Content-Type
    // contains the string `application/octet`.
    binaryContentTypes: [
        'image/',
        'video/',
        'audio/',
        'foo/',
    ]
}).build()
Documentation:
https://github.com/wkh237/react-native-fetch-blob/wiki/Trouble-Shooting#failed-to-execute-readastext-on-filereader
I am running into the same problem. It has something to do with these prep statements:
const Blob = RNFetchBlob.polyfill.Blob;
const fs = RNFetchBlob.fs;
window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
window.Blob = Blob;
The error does not occur if I comment them out.
I have solved this by removing the whole package, because the error kept appearing even with the fetch replacement. I think it is triggered by
window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
so I used the old-fashioned approach:
const uriToBlob = (uri) => {
    return new Promise((resolve, reject) => {
        const xhr = new XMLHttpRequest();
        xhr.onload = function () {
            resolve(xhr.response);
        };
        xhr.onerror = function () {
            reject(new Error('uriToBlob failed'));
        };
        xhr.responseType = 'blob';
        xhr.open('GET', uri, true);
        xhr.send(null);
    });
}

parse.com: Cache results as a json file

For my app I have a long-running task that generates stats. I would like the stats to be available to my users as a JSON file. Is there a way, using Parse, to store the results of the task as JSON that my users can retrieve? It would be great if there were some server-side caching that I could take advantage of as well.
UPDATE
The following code takes my array of dictionaries and saves it as a file. I am now trying to store the file as part of a table called Cache. However, the result is:
Input: {}
Result: undefined
The code snippet looks like:
var Buffer = require('buffer').Buffer;
...
var json = <array of dictionaries>;
var jsonStr = JSON.stringify(json);
var buf = new Buffer(jsonStr, 'utf8');
var json64 = buf.toString('base64');
var parseFile = new Parse.File('result', { "base64": json64 }, 'application/json');
parseFile.save().then(function(parseFile) {
    // The file has been saved to Parse.
    var cache = new Cache();
    attr = {};
    attr['file'] = parseFile;
    console.log(parseFile.url());
    cache.save(attr, {
        success: function(result) {
            status.success("Cached");
        },
        error: function(result, error) {
            status.error("Error: " + error.message);
        }
    });
}, function(error) {
    console.log(error);
    status.error(error.message);
});
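The Buffer/base64 step above can be checked in isolation from Parse. A minimal Node.js sketch (the sample stats array is made up for illustration; Buffer.from replaces the deprecated new Buffer constructor):

```javascript
// Encode an array of dictionaries as base64 JSON, as done before creating
// the Parse.File, then decode it back to verify the round trip.
const stats = [{ user: 'a', score: 1 }, { user: 'b', score: 2 }]; // sample data

const jsonStr = JSON.stringify(stats);
const json64 = Buffer.from(jsonStr, 'utf8').toString('base64');

// A client downloading the saved file would simply JSON.parse the body;
// decoding the base64 here simulates that.
const decoded = JSON.parse(Buffer.from(json64, 'base64').toString('utf8'));
console.log(JSON.stringify(decoded) === jsonStr); // true
```

If "Input: {} / Result: undefined" persists even though this round trip is sound, the problem is likely in how the Cache row or the cloud function's status object is handled, not in the encoding.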
