Passing HTML data from AWS Lambda and API Gateway using aws-serverless-express

I have a use case where a request to my API Gateway endpoint invokes a Lambda function. The Lambda uses aws-serverless-express.
The Lambda is responsible for fetching data from a URL. Let's say this URL is a shortened Amazon link (a.co).
My code to fetch the data is
function (req, res) {
  var request1 = require('request');
  res.setHeader('Content-Type', 'text/html; charset=utf-8');
  request1.get('http://a.co/d/dBpaoQo')
    .on('response', response => {
      response.pipe(res);
    });
}
However, the data I see at the client end looks like this:
��P_k�0}ϧH}� U�e輹{X�`����
(not pasting the whole content; the rest of the response is similarly garbled binary data).
If I run the server locally, it returns the data in the correct format. It seems to be API Gateway that is mangling the data. I even applied
res.setHeader('Content-Type', 'text/html; charset=utf-8')
in my code, as suggested in a Stack Overflow answer.
Can anyone point out what I am doing wrong?

OK, this turned out to be an issue with the Lambda code and not with API Gateway. The data I was fetching was coming back compressed.
I had to add a header to the outgoing request (the request that fetches the URL for me):
Accept-Encoding: 'none'
However, I still do not understand why it works locally and not on Lambda. When I run the same code locally I get the correct file.
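For reference, a minimal sketch of the fix, using the same request module inside the aws-serverless-express app (the /fetch route name is just illustrative; the URL is the one from the question):

var express = require('express');
var request = require('request');
var app = express();

app.get('/fetch', function (req, res) {
  res.setHeader('Content-Type', 'text/html; charset=utf-8');
  request
    .get({
      url: 'http://a.co/d/dBpaoQo',
      // Ask the origin not to compress the response, so plain HTML
      // is piped through aws-serverless-express / API Gateway
      headers: { 'Accept-Encoding': 'none' }
    })
    .on('response', function (response) {
      response.pipe(res);
    })
    .on('error', function (err) {
      res.status(502).send('Upstream request failed: ' + err.message);
    });
});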

Related

adding security http headers to aws lambda function

We have a simple application structure: our ReactJS front end makes requests to API Gateway, which uses a proxy integration with a Lambda function. Since API Gateway passes requests through without modification, and does the same when returning responses to the client, the place to add HTTP security headers would be the Lambda function itself. I have done some research on how this can be achieved, but all the answers I found on Google mention Lambda@Edge + CloudFront, similar to this post, which we do not use at all. Does that mean we have to change our structure by adding those two things? Thanks.
The article you reference assumes the backend is static (e.g. S3) and cannot set headers itself. That's why Lambda@Edge is used there.
It sounds like your current setup should work without any changes... Did you try adding headers in the code?
I have this code working perfectly for the APIGW + Lambda (proxy integration) combo.
exports.handler = async function (event) {
  var response = {
    statusCode: 200,
    headers: {
      'Content-Type': 'application/json; charset=utf-8',
      'X-My-Header': 'whatever'
    },
    body: JSON.stringify({ status: 'OK' })
  };
  return response;
};
Add HSTS header in AWS Lambda.
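For an HSTS header specifically, the same proxy-integration response shape works; a minimal sketch (the max-age value is just an example):

exports.handler = async function (event) {
  return {
    statusCode: 200,
    headers: {
      'Content-Type': 'application/json; charset=utf-8',
      // HSTS: instruct browsers to use HTTPS for subsequent requests (example value)
      'Strict-Transport-Security': 'max-age=31536000; includeSubDomains'
    },
    body: JSON.stringify({ status: 'OK' })
  };
};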

Youtube Data Api Search Endpoint 401 Error

I'm trying to use the YouTube search endpoint, which does not seem to require an OAuth token. I've read a few tutorials, and they only pass an API key. I'm getting a "Failed to load resource: the server responded with a status of 401 ()" error in my console in Chrome Dev Tools. Specifically, in the Network tab I can see the error response.
I notice it says there is an error: "invalid_token". But I pass the API key, so they must be talking about the OAuth token? I'm confused, because it shouldn't need one, especially since I'm just querying public data. Even the "Try this API" portion of the endpoint documentation does not need one. Most importantly, my call in Postman works, and just pasting the endpoint into my browser directly works. Why doesn't it work here? This is an axios call from a ReactJS frontend.
const apiKey = 'MY_API_KEY';
const url = 'https://www.googleapis.com/youtube/v3/search';
const response = await axios.get(url, {
  params: {
    part: 'snippet',
    maxResults: 5,
    q: songName,
    key: apiKey
  }
});
What was happening was that I was setting axios.defaults.headers.common['Authorization'] = `Bearer ${params.access_token}`; for calls to another API. This makes every axios request carry that access token by default! So what I did for now is delete axios.defaults.headers.common['Authorization']; before calling the search endpoint. The solution is pretty obvious in hindsight: make sure you send no extra Authorization header, because Search is not an OAuth endpoint!
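For what it's worth, a minimal sketch of that workaround (songName and the key are placeholders from the question):

const axios = require('axios');

// Set elsewhere in the app for another API:
// axios.defaults.headers.common['Authorization'] = `Bearer ${params.access_token}`;

// Remove the global default before calling the non-OAuth search endpoint
delete axios.defaults.headers.common['Authorization'];

const response = await axios.get('https://www.googleapis.com/youtube/v3/search', {
  params: { part: 'snippet', maxResults: 5, q: songName, key: 'MY_API_KEY' }
});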

AWS API Gateway works well but I can't call it from JavaScript AJAX

I made a Python function with AWS Lambda and connected it to API Gateway.
I then tested the API and it worked well: the test in the API Gateway console was successful.
Next I tried to call this API from JavaScript with an AJAX (jQuery) request.
However, the result was
"jquery-3.4.1.js:9837 GET https://9i1jhuewmj.execute-api.ap-northeast-2.amazonaws.com/test/transaction?jpgname=image.jpg net::ERR_FAILED"
How can I solve this problem? Thank you.
I think there are a few things going on. The content-type header being returned is application/json, but the response body is not JSON.
The main problem, though, is that the HTTP status being returned is 301. This tells the browser that the resource has moved, and the browser then expects the response to contain information about where it moved to so that it can redirect.
I suspect that if you change your configuration so that a more normal response code (i.e. 200) is returned, this will work better.
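Assuming the Lambda is behind a proxy integration, the returned object carries the status and headers itself. A rough sketch of that shape, shown in Node.js for brevity (a Python handler would return an equivalent dict; the body here is a placeholder):

exports.handler = async function (event) {
  const jpgname = event.queryStringParameters && event.queryStringParameters.jpgname;
  return {
    statusCode: 200,                              // not 301, so the browser does not try to redirect
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ jpgname: jpgname })    // body actually matches the Content-Type
  };
};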

Create function in Parse Cloud Code that does not require authorisation

I have my own instance of Parse Server running on AWS, and until now Cloud Functions have been working great, with one caveat: they cannot be successfully called publicly, i.e. they require an authorisation key to be sent in the REST request headers.
I want to set up a Slack Slash Command against my server, and it has to be able to POST a payload without any custom headers or extra parameters. As a result, my requests are currently unauthorised (returning 403 statuses).
Is there a way to get granular control over a Parse Cloud Function's authorisation (i.e. whether it requires the master-key header or not)? If not, is there a way of forwarding the request while still going through the Parse server, or even a way of manipulating the headers of a Slack request? I would rather not have to use another service just for request forwarding.
Thanks!
Two options:
Pass the master key in the client request, which should bypass authorization. It's a blunt approach but might be okay in your case (without knowing more details).
Or run a new Express endpoint alongside Parse and, from there, call the Parse Cloud Function using the master key.
var express = require('express');
var unirest = require('unirest');

var api = new ParseServer(...); // Parse Server config elided as in the original
var app = express();
app.use('/parse', api);

app.get('/api/slack', function (req, res) {
  // Call the cloud function, passing the master key
  // as the X-Parse-Master-Key HTTP header
  unirest.post('http://myhost.com:1337/parse/functions/mycloudfunction')
    .headers({ 'X-Parse-Master-Key': MASTER_KEY })
    .end(function (response) {
      res.json(response.body); // relay the cloud function result to the caller
    });
});

How do I receive an http request with a Node.js server and then pass that request along to another server?

There's a lot going on here so I'll simplify this into a pseudo example. Forget about security and whatnot for a minute here. The point is to understand the functionality.
Let's say I'm running a local web server with node.js to dev a website. In the website, the user should be able to create a new account. The account information will be submitted via ajax to the node server. I then need the node server to take the incoming request and pass it along to another server that gives me access to a database. CouchDB, for example.
So here's a pseudo example of what I'd like to happen.
In the client's browser:
$.ajax({
  url: './database_stuff/whatever', // points to the node web server
  method: 'POST',
  data: { name: 'Billy', age: 24 }
});
In the Node web server:
var http = require('http'),
    dbServer = 'http://127.0.0.1:5984/database_url';

http.createServer(function (req, res) {
  /* figure out that we need to access the database then... */
  // magically pass the request on to the db server
  http.magicPassAlongMethod(req, dbServer, function (dbResponse) {
    // pass the db server's response back to the client
    dbResponse.on('data', function (chunk) {
      res.end(chunk);
    });
  });
}).listen(8888);
Make sense? Basically what's the best way to pass the original request along to another server and then pass the response back to the client?
If the server at the dbServer URL supports streaming, you could do something like
var request = require('request');
req.pipe(request.post(dbServer)).pipe(res);
where request is a module; for more info see https://github.com/mikeal/request
This is quite readable and easy to implement. If for whatever reason you cannot do this, you could take what you need from the request and manually POST it, then take the response and res.send it to the client.
Sorry if there's an error in my code; I haven't tested it, but my point should be clear. If it's not, ask away.
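To illustrate the manual fallback mentioned above, a rough sketch using Node's built-in http module (the /database_stuff path and CouchDB address are taken from the pseudo example; untested):

var http = require('http');

http.createServer(function (req, res) {
  if (req.url.indexOf('/database_stuff') === 0 && req.method === 'POST') {
    // Forward the incoming request to CouchDB, copying method and headers
    var proxyReq = http.request({
      host: '127.0.0.1',
      port: 5984,
      path: '/database_url',
      method: req.method,
      headers: req.headers
    }, function (dbResponse) {
      // Relay CouchDB's status, headers and body back to the client
      res.writeHead(dbResponse.statusCode, dbResponse.headers);
      dbResponse.pipe(res);
    });

    proxyReq.on('error', function (err) {
      res.writeHead(502);
      res.end('Upstream error: ' + err.message);
    });

    // Stream the client's request body straight through to CouchDB
    req.pipe(proxyReq);
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8888);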
