I know many topics have already covered promises and callbacks. I tried many approaches but still can't solve my problem.
What I want is to edit a file locally, save it, then upload it to S3. After that, another function is called to read from the file and display its contents as a list.
Unfortunately I get an error, because the function that reads and displays the file is called before the file has finished being written and saved to S3, as you can see in my [terminal][1].
The file itself is properly edited and uploaded to S3.
1- I tried chaining promises with then to execute one step after another:
static async edit_product(req: any, res: any) {
  console.log('edit_product param request', req.body)
  const prod = req.body // product data from the request
  try {
    ExcelFile.EditFile(prod.product_code, prod.product_name)
      .then(rs => res.status(200).json({ 'success': prod.product_code }))
      .catch((err) => {
        console.error(err);
      })
    console.log('test')
  } catch (err) {
    console.error(err)
  }
}
2- I tried using await combined with then:
static async edit_product(req: any, res: any) {
  console.log('edit_product param request', req.body)
  const prod = req.body // product data from the request
  try {
    await ExcelFile.EditFile(prod.product_code, prod.product_name).then(rs => rs)
    console.log('test')
    res.status(200).json({ 'success product edit': prod.product_code })
  } catch (err) {
    console.error(err)
  }
}
3- the function that uploads the file to S3:
static async UploadFileS3() {
  const file = config._path + config._foldername + config._filename
  var s3 = new aws.S3({ accessKeyId: config._ACCESS_KEY_ID, secretAccessKey: config._SECRET_ACCESS_KEY });
  var newversionId: string = ''
  const params = {
    Bucket: config._BUCKET_NAME,
    Key: config._filename // File name you want to save as in S3
  };
  return s3.putObject(params, function (err, data) {
    if (err) { console.log(err) }
    newversionId = data.VersionId!
    console.log("Successfully uploaded data ", newversionId);
  });
};
4- the EditFile function that edits the file:
const stream = new Stream.PassThrough();
var dataFile = wb.xlsx.readFile(file).then(rs => {
  var sh = rs.getWorksheet(config._sheetname);
  for (let i = 2; i <= sh.rowCount; i++) {
    let currRow = sh.getRow(i);
    if (currRow.getCell(1).text == product_code) {
      currRow.getCell(2).value = product_name
      currRow.commit();
      break
    }
  }
  console.log('edit')
  // save locally
  wb.xlsx.writeFile(file).then(rs => { console.log('file edited successfully') });
  const param = {
    Key: config._filename,
    Bucket: config._BUCKET_NAME,
    Body: stream,
    ContentType: 'CONTENT_TYPE_EXCEL'
  }
  // save to s3
  wb.xlsx.write(stream).then(() => {
    s3.upload(param, function (err, data) {
      if (err) { console.log("Error", err); }
      console.log("Upload Success", data.ETag);
      ExcelFile.getAwsVersion().then(rs => ExcelFile.saveFileBucketVersion(rs))
    })
  })
})
return dataFile // return promise
How can I make it respect the order: edit and save the file first, and only then return the res.status(200).json success response?
[1]: https://i.stack.imgur.com/SsWhu.png
Your EditFile function does not seem to wait for the end of writeFile. The writeFile call is started inside the readFile callback, but its promise is never awaited or returned, so the promise returned by EditFile resolves before the file is actually saved and uploaded. The possible solutions are:
Keep the write out of the for loop (it would look odd anyway to potentially save the changes multiple times) and return or await its promise, so the chain only resolves once the write, and the subsequent S3 upload, are done.
If the write really has to happen inside the loop, use a promise-compatible loop there (e.g. Bluebird.each).
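For illustration, here is a rough sketch (not the exact original code) of an EditFile that awaits both the local write and the S3 upload before resolving; wb, s3 and config are the objects from the question, and writeBuffer/.promise() assume the ExcelJS and AWS SDK v2 clients already used above:

static async EditFile(product_code: string, product_name: string) {
  const file = config._path + config._foldername + config._filename
  const rs = await wb.xlsx.readFile(file)
  const sh = rs.getWorksheet(config._sheetname)
  for (let i = 2; i <= sh.rowCount; i++) {
    const currRow = sh.getRow(i)
    if (currRow.getCell(1).text == product_code) {
      currRow.getCell(2).value = product_name
      currRow.commit()
      break
    }
  }
  // save locally and wait for the write to finish
  await wb.xlsx.writeFile(file)
  // then upload the edited workbook to S3 and wait for the upload too
  const buffer = await wb.xlsx.writeBuffer()
  await s3.upload({
    Key: config._filename,
    Bucket: config._BUCKET_NAME,
    Body: buffer
  }).promise()
  // only now does the promise returned by EditFile resolve,
  // so edit_product can safely send res.status(200) after awaiting it
}

With EditFile resolving only after everything has finished, variant 2 from the question (await ExcelFile.EditFile(...) followed by res.status(200)) runs the steps in the intended order.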
Related
I have this interceptor, which is responsible for the pagination of my application. Its first part used to check whether the HTTP response was OK:
try {
  //next.handle().pipe(map((data) => ({ data })));
  const data = await next.handle().toPromise();
  return data.status < 300
    ? this.mountPagination(data)
    : this.getResponseStatus(data, context);
} catch (error) {
  return throwError(this.createHttpException(error));
}
}
If it was OK, it moved on and built the pagination.
After toPromise was deprecated, I haven't been able to replace it with lastValueFrom.
UPDATE
try {
  const data = next.handle().pipe(map((data) => ({ data })));
  return this.mountPagination(data);
} catch (error) {
  return throwError(this.createHttpException(error));
}
}
This is what I'm trying to correct, but with this version data is an observable, so I can't read the status from it.
To convert from toPromise() to firstValueFrom/lastValueFrom, change
const data = await next.handle().toPromise();
to
const data = await lastValueFrom(next.handle());
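Applied to the original interceptor, the try block could then look roughly like this (a sketch only; mountPagination, getResponseStatus and createHttpException are the methods from the question, and lastValueFrom is imported from rxjs):

import { lastValueFrom } from 'rxjs';

// inside intercept(context, next)
try {
  // resolves with the last value emitted by next.handle()
  const data = await lastValueFrom(next.handle());
  return data.status < 300
    ? this.mountPagination(data)
    : this.getResponseStatus(data, context);
} catch (error) {
  return throwError(this.createHttpException(error));
}

Note that, unlike toPromise, lastValueFrom rejects if the observable completes without emitting a value, which is usually what you want for an HTTP handler.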
Hope this helps
I have a question about return values from functions that use try-catch in TypeScript. I want to make an HTTP request and then return the data. In case an error occurs, I can either return an empty array or throw the error. The function below is then called by another function that processes the data. What is the best approach for this? As written, I get an error about the return type of getData().
static getData = async (): Promise<number[]> => {
  let data: number[] = [];
  try {
    axios
      .get(`endpoint/getData`)
      .then((res) => {
        data = res.data;
        return data;
      });
  } catch (error) {
    return data;
  }
};
//Example
static getUnixTime = async (): number => {
  const data = this.getData();
  return data[0] / 1000;
}
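One way to make the return type work, for what it's worth, is to await the request inside the try and await the result in the caller, roughly like this (assuming endpoint/getData really returns a number[] and that falling back to an empty array on error is acceptable):

static getData = async (): Promise<number[]> => {
  try {
    // await the request so a network failure is caught by the try/catch
    const res = await axios.get<number[]>(`endpoint/getData`);
    return res.data;
  } catch (error) {
    // or rethrow here and let the caller decide what to do
    return [];
  }
};

//Example
static getUnixTime = async (): Promise<number> => {
  const data = await this.getData();
  return data[0] / 1000;
};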
I'm using the ondata event to add data to the formData, but I need the relative file path in there (in case someone uploads a folder, so I can recreate the same structure on the server after upload, e.g. /myFolder/fileuploaded.jpg).
How does one get _relativePath in the ondata event?
FilePond.setOptions({
  server: {
    url: 'http://192.168.0.100',
    timeout: 7000,
    process: {
      url: './process',
      ondata: (formData) => {
        let fullPath = ''; // Need _relativePath here
        formData.append('Hello', 'World');
        return formData;
      }
    },
  }
});
Never mind, I didn't see in the docs that there is a more advanced process function:
FilePond.setOptions({
  server: {
    process: (fieldName, file, metadata, load, error, progress, abort, transfer, options) => {
      // fieldName is the name of the input field
      // file is the actual file object to send
      const formData = new FormData();
      formData.append(fieldName, file, file.name);

      const request = new XMLHttpRequest();
      request.open('POST', 'url-to-api');

      // Should call the progress method to update the progress to 100% before calling load
      // Setting computable to false switches the loading indicator to infinite mode
      request.upload.onprogress = (e) => {
        progress(e.lengthComputable, e.loaded, e.total);
      };

      // Should call the load method when done and pass the returned server file id
      // this server file id is then used later on when reverting or restoring a file
      // so your server knows which file to return without exposing that info to the client
      request.onload = function () {
        if (request.status >= 200 && request.status < 300) {
          // the load method accepts either a string (id) or an object
          load(request.responseText);
        } else {
          // Can call the error method if something is wrong, should exit after
          error('oh no');
        }
      };

      request.send(formData);

      // Should expose an abort method so the request can be cancelled
      return {
        abort: () => {
          // This function is entered if the user has tapped the cancel button
          request.abort();
          // Let FilePond know the request has been cancelled
          abort();
        }
      };
    }
  }
});
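To get the relative path into the request, one option is to append it to the FormData inside this process function. The sketch below assumes the file object exposes the browser's webkitRelativePath property (populated only for folder uploads and possibly an empty string otherwise); that is an assumption, not a documented FilePond guarantee:

process: (fieldName, file, metadata, load, error, progress, abort) => {
  const formData = new FormData();
  // webkitRelativePath is set by the browser for folder uploads; fall back to the file name
  const relativePath = file.webkitRelativePath || file.name;
  formData.append('relativePath', relativePath);
  formData.append(fieldName, file, file.name);
  // ... send the request exactly as in the snippet above
}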
Well, my Lambda function works fine according to the logs, but it never gets marked as completed in the CodePipeline stage. I have already given the role permission to notify the pipeline ("codepipeline:PutJobSuccessResult", "codepipeline:PutJobFailureResult") and even set the maximum timeout to 20 seconds, but it is still not working (execution actually ends at around 800 ms).
const axios = require('axios')
const AWS = require('aws-sdk');

const url = 'www.exampleurl.com'

exports.handler = async (event, context) => {
  const codepipeline = new AWS.CodePipeline();
  const jobId = event["CodePipeline.job"].id;
  const stage = event["CodePipeline.job"].data.actionConfiguration.configuration.UserParameters;

  const putJobSuccess = function (message) {
    var params = {
      jobId: jobId
    };
    codepipeline.putJobSuccessResult(params, function (err, data) {
      if (err) { context.fail(err); }
      else { context.succeed(message); }
    });
  };

  const putJobFailure = function (message) {
    var params = {
      jobId: jobId,
      failureDetails: {
        message: JSON.stringify(message),
        type: 'JobFailed',
        externalExecutionId: context.invokeid
      }
    };
    codepipeline.putJobFailureResult(params, function (err, data) {
      if (err) console.log(err)
      context.fail(message);
    });
  };

  try {
    await axios.post(url, { content: stage })
    putJobSuccess('all fine')
  } catch (e) {
    putJobFailure(e)
  }
};
The root issue
codepipeline.putJobSuccessResult is an asynchronous call, and nothing in the handler waits for it. The Lambda function therefore finishes its execution before codepipeline.putJobSuccessResult has a chance to complete.
The solution
Await codepipeline.putJobSuccessResult so that it is forced to complete before the handler returns its response.
const putJobSuccess = function (id) {
  console.log("Telling CodePipeline the test passed for job: " + id)
  var params = {
    jobId: id
  };
  // returning the promise lets the handler await the API call
  return codepipeline.putJobSuccessResult(params).promise()
    .then(data => console.log(data))
    .catch(err => console.error(err));
};

exports.lambdaHandler = async (event, context) => {
  ...
  await putJobSuccess(jobId)
  return response
};
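For symmetry, the failure path can be awaited the same way. This is a sketch based on the question's putJobFailure, not an official snippet, and it assumes jobId, context and codepipeline are in scope as in the question's handler:

const putJobFailure = function (message) {
  const params = {
    jobId: jobId,
    failureDetails: {
      message: JSON.stringify(message),
      type: 'JobFailed',
      externalExecutionId: context.invokeid
    }
  };
  // return the promise so the handler's catch block can await it
  return codepipeline.putJobFailureResult(params).promise();
};

In the handler's catch block you would then await putJobFailure(e) before returning.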
Whenever I see this issue, most of the time it is because 'PutJobSuccessResult' is never actually invoked. The best way to check this is to go to CloudTrail > 'Event History' and look for 'Event name' = 'PutJobSuccessResult' during the time range in which you expect the Lambda function to call this API. If you do not find a 'PutJobSuccessResult' event, have another look at the code and at the Lambda execution logs in CloudWatch.
I'm using the following API for a World Cup Laravel app - http://api.football-data.org/docs/v1/index.html#_fixture
This brings me back today's fixtures, as I'm using this code (config just holds my API key):
const todaysMatches = new Vue({
  el: '#todaysMatches',
  data: {
    todaysMatches: [],
    flags: []
  },
  methods: {
    loadData: function () {
      axios.get("http://api.football-data.org/v1/competitions/467/fixtures/?timeFrame=p1", config)
        .then(response => { this.todaysMatches = response.data });
    }
  },
  mounted: function () {
    this.loadData();
  }
});
This brings back the following data structure:
Inside each fixture you get a set of _links, which you can see in the screenshot below:
Now, what I would like to do is query both the awayTeam and homeTeam APIs, because each of their responses includes a crestUrl that points to the country's flag.
You can see that inside my data I've set an array prop called flags, so I was thinking of running additional calls inside my loadData method and populating that array for each fixture, but I don't think that's a clean way of doing it.
Can anyone suggest the best way to approach this?
I have used the async/await pattern to achieve your requirement, as below:
loadData: async function () {
  const response = await axios.get(
    "http://api.football-data.org/v1/competitions/467/fixtures/?timeFrame=p1",
    config
  );
  this.todaysMatches = response.data;
  let arr = this.todaysMatches.fixtures.map(fixture => {
    const _links = fixture._links;
    return [
      axios.get(_links.awayTeam.href, config),
      axios.get(_links.homeTeam.href, config)
    ];
  });
  arr.forEach(async item => {
    const away = await item[0];
    const home = await item[1];
    this.flags.push({
      awayFlag: away.data.crestUrl,
      homeFlag: home.data.crestUrl
    });
  });
}
Explanation:
After fetching todaysMatches, a new array arr is created that consists of pairs of promises returned by the GET requests to each team's URL: [[getAwayTeamInfo, getHomeTeamInfo], [getAwayTeamInfo, getHomeTeamInfo], ...]
We loop through this array and await each promise to get the crestUrl.
Each crestUrl pair is pushed into the flags array as an object:
{
  awayFlag: away.data.crestUrl,
  homeFlag: home.data.crestUrl
}
Update
Adding the flag URLs directly to the this.todaysMatches.fixtures array:
loadData: async function () {
  const response = await axios.get(
    "http://api.football-data.org/v1/competitions/467/fixtures/?timeFrame=p1",
    config
  );
  this.todaysMatches = response.data;
  const fixtures = this.todaysMatches.fixtures;
  let arr = fixtures.map(fixture => {
    const _links = fixture._links;
    return [
      axios.get(_links.awayTeam.href, config),
      axios.get(_links.homeTeam.href, config)
    ];
  });
  arr.forEach(async (item, index) => {
    const away = await item[0];
    const home = await item[1];
    this.$set(fixtures, index, {
      ...fixtures[index],
      awayFlag: away.data.crestUrl,
      homeFlag: home.data.crestUrl
    });
  });
}
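If you prefer to wait until every flag has been fetched before touching the component state, Promise.all is an alternative to the forEach loop above. This is a sketch only, using the same fixture and config names as the answer:

loadData: async function () {
  const response = await axios.get(
    "http://api.football-data.org/v1/competitions/467/fixtures/?timeFrame=p1",
    config
  );
  this.todaysMatches = response.data;
  // fetch both teams of every fixture in parallel and wait for all of them
  this.flags = await Promise.all(
    this.todaysMatches.fixtures.map(async fixture => {
      const [away, home] = await Promise.all([
        axios.get(fixture._links.awayTeam.href, config),
        axios.get(fixture._links.homeTeam.href, config)
      ]);
      return {
        awayFlag: away.data.crestUrl,
        homeFlag: home.data.crestUrl
      };
    })
  );
}

This way the flags array is assigned once, in the same order as the fixtures.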