I'm hosting an Express.js app on Vercel. At one point it uses the Google Sheets API to store a generated product key. The function looks like this:
async function addActivationCode(activationCode)
{
    await googleSheets.spreadsheets.values.append({
        auth: googleAuth,
        spreadsheetId: sheetID,
        range: "ActivationCodes!A:C",
        valueInputOption: "USER_ENTERED",
        resource: {
            values: [
                [activationCode, "1", "0"]
            ]
        }
    })
}
This function works perfectly fine when run locally, but in Vercel's Realtime Logs I get a FetchError: "Client network socket disconnected before secure TLS connection was established".
The try-catch block that calls this function is the following:
try
{
    await addActivationCode(generateActivationCode())
}
catch(err)
{
    console.log(err)
    console.log("Failed to add new activation code.")
    return
}
Note that this is not the only function that sends an append request to the Sheets API. There's another function used to add users, and it works fine. Here it is:
async function addUser(username, password)
{
    // Generate salt and hash the password in one command
    const hashedPassword = await bcrypt.hash(password, 10)
    // Add user to the Users sheet
    await googleSheets.spreadsheets.values.append({
        auth: googleAuth,
        spreadsheetId: sheetID,
        range: "Users!A2:B",
        valueInputOption: "USER_ENTERED",
        resource: {
            values: [
                [username, hashedPassword]
            ]
        }
    })
}
I found another post discussing this issue, but not on Vercel; it was related to proxies. My knowledge of proxies is limited at the moment, so I didn't quite understand how to apply the fix to Vercel.
Any help is appreciated.
I found where the problem is coming from: Vercel's limits. On the free plan, a function can't take more than 10 seconds to respond to a request. For instance, the function in my app that's responsible for registering users does five things:
1. It checks if the username already exists by retrieving the list of users from the Google Sheet.
2. It checks if the given product key is valid.
3. It adds the user.
4. It deletes the product key used during registration.
5. It generates a new product key and appends it to the table of keys.
That's five separate requests to the Sheets API! That definitely takes more time than the 10-second limit to complete.
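One way to shave time off, assuming the delete and append steps really are independent of each other, is to run them concurrently with Promise.all instead of awaiting them back to back. This is only a sketch; deleteActivationCode is a hypothetical helper standing in for whatever removes the used key:

// Sketch: run the two independent Sheets calls in parallel to save wall-clock time.
// deleteActivationCode is a hypothetical helper; addActivationCode and
// generateActivationCode are the functions from the app above.
async function finishRegistration(usedCode)
{
    await Promise.all([
        deleteActivationCode(usedCode),              // remove the key used during registration
        addActivationCode(generateActivationCode())  // append a freshly generated key
    ])
}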
Related
I am building an application using the Twitter API and Netlify (AWS Lambda functions).
The authentication flow requires these steps:
1. When the user goes to my /auth function, a link to the Twitter authentication page is created.
2. Once the user clicks that link, he is redirected to Twitter, where a pop-up asks to allow my app to connect.
3. Once the user approves, he is redirected to my /auth function again, but this time authCode is set rather than undefined. This authCode is used to instantiate the Twitter client class and authorize it.
4. A new instance of the Twitter client is created and authorized. This instance allows me to query tweets.
Steps 1, 2 and 3 work. However, the authorized instance only lives inside the /auth function. How can I pass it to different functions without losing its instantiation?
How can I pass this instance to different serverless functions?
client = new Client(OAuthClient) is what I want to pass around.
I tried using a middleware with little success; it seems the Twitter client class gets re-instantiated (so without authorization) for every serverless function.
https://playful-salmiakki-67d50e.netlify.app/.netlify/functions/auth
import Client from 'twitter-api-sdk';

// OAuthClient (the OAuth2 auth client), middy and myMiddleware come from
// elsewhere in the original code and are not shown here.
let client: Client;

const auth = async (event, context, callback) => {
  const authCode = event.queryStringParameters ? event.queryStringParameters.code : undefined;
  const authUrl = OAuthClient.generateAuthURL({
    state: 'STATE',
    code_challenge: 'challenge',
  });
  console.log('HERE LINK:');
  console.log(authUrl);
  if (authCode) {
    await OAuthClient.requestAccessToken(authCode as string);
    client = new Client(OAuthClient); // <-- THIS IS WHAT I WANT TO PASS TO DIFFERENT FUNCTIONS
  }
  return {
    statusCode: 200,
    body: JSON.stringify({ message: 'Auth, go to the url displayed terminal' }),
    myClient: client
  };
};

exports.handler = middy().use(myMiddleware()).handler(auth);
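For what it's worth, serverless functions don't share memory between invocations, so an in-memory client can't simply be handed from one function to another. A common pattern is to persist only the token that requestAccessToken produced (in a cookie, a database, or another store) and rebuild the client inside each function. The following is only a sketch, under the assumption that twitter-api-sdk's auth.OAuth2User accepts a previously obtained token in its options; the env variable names are placeholders:

// Sketch only: rebuild an authorized client from a stored token on every invocation.
import Client, { auth } from 'twitter-api-sdk';

function clientFromStoredToken(storedToken) {
  const oauthClient = new auth.OAuth2User({
    client_id: process.env.TWITTER_CLIENT_ID,        // placeholder env vars
    client_secret: process.env.TWITTER_CLIENT_SECRET,
    callback: process.env.TWITTER_CALLBACK_URL,
    scopes: ['tweet.read', 'users.read'],
    token: storedToken, // the token object saved by the /auth function
  });
  return new Client(oauthClient);
}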
Problem
I was trying to make an 'aws-amplify' GET API request with query parameters on the client side, but it failed with "Request failed with status code 403", and the response showed:
"message":"The request signature we calculated does not match the signature you provided. Check your AWS Secret Access Key and signing method. Consult the service documentation for details."
Note: React.js on the front end, JavaScript on the back end.
My code
Front-end
function getData() {
  const apiName = 'MyApiName';
  const path = '/path';
  const content = {
    body: {
      data: 'myData',
    },
  };
  return API.get(apiName, path, content);
}
Back-end
try {
  const result = await dynamoDbLib.call("query", params);
} catch (e) {
  return failure({ status: false });
}
What I did to debug
The GET lambda function works fine in Amazon Console (Tested)
If I change the backend lambda function so that the frontend request can be made without parameters, i.e. return API.get(apiName, path), then no error shows up.
My question
How can I make this GET request with query parameters work?
I changed GET to POST (return API.post()), and everything works fine now.
If anyone can provide a more detailed explanation, it would be very helpful.
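A possible explanation (not verified against this exact setup): Amplify signs requests with SigV4, and including a body object on a GET can make the signature the gateway computes differ from the one the client calculated, hence the 403. If the request should stay a GET, Amplify's REST API also accepts queryStringParameters instead of body; here is a minimal sketch using the same placeholder API name and path as above:

import { API } from 'aws-amplify';

function getData() {
  // send the parameter in the query string instead of a request body
  return API.get('MyApiName', '/path', {
    queryStringParameters: {
      data: 'myData', // read it in the Lambda from the query string parameters
    },
  });
}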
Apollo is not applying the header I set on the query dynamically.
pages/index.vue
methods: {
  fetchCars() {
    const token = Cookies.get('XSRF-TOKEN')
    console.log(token) // 🟢 Token is shown in console
    this.$apollo.query({
      query: gql`
        query {
          cars {
            uuid
            name
          }
        }
      `,
      headers: {
        'X-XSRF-TOKEN': token, // ❌ Fetch without header
      },
    })
  },
},
Is there a way to set the header value new for every Apollo request?
I have a separate frontend and backend. For the frontend I am using Nuxt.js with Apollo. I want session-based communication with my server, so I need to send the CSRF token with every request.
Now the problem: on the first load of the page there is no cookie set in the browser, so I do a GET request on every initialization of my Nuxt application.
plugins/csrf.js
fetch('http://127.0.0.1:8000/api/csrf-cookie', {
  credentials: 'include',
})
Now I have a valid cookie set on my side and want to communicate with the GraphQL server, but the header is not set dynamically in the query. My Laravel backend now throws a 419 TokenMismatch exception because I did not send a CSRF token with my request. Does anyone know how I can solve this?
Link to the repository: https://github.com/SuddenlyRust/session-based-auth
[SOLVED] Working solution: https://github.com/SuddenlyRust/session-based-auth/commit/de8fb9c18b00e58655f154f8d0c95a677d9b685b Thanks to the help of kofh in the Nuxt Apollo discord channel 🎉
In order to accomplish this, we need to access the code that runs every time a fetch happens. This code lives inside your Apollo client's HttpLink. While the @nuxtjs/apollo module gives us many options, we can't quite configure this at such a high level.
Step 1: Creating a client plugin
As noted in the setup section of the Apollo module's docs, we can supply a path to a plugin that will define a clientConfig:
// nuxt.config.js
{
  apollo: {
    clientConfigs: {
      default: '~/plugins/apollo-client.js'
    }
  }
}
This plugin should export a function which receives the Nuxt context. It should return the configuration to be passed to vue-cli-plugin-apollo's createApolloClient utility. You don't need to worry about that file, but it is how @nuxtjs/apollo creates the client internally.
Step 2: Creating the custom httpLink
In createApolloClient's options, we see we can disable defaultHttpLink and instead supply our own link. link needs to be the output of Apollo's official createHttpLink utility, docs for which can be found here. The option we're most interested in is the fetch option which as the docs state, is
a fetch compatible API for making a request
In practice, this means a function that takes uri and options parameters and returns a Promise representing the network interaction.
Step 3: Creating the custom fetch method
As stated above, we need a function that takes uri and options and returns a promise. This function will be a simple passthrough to the standard fetch method (you may need to add isomorphic-fetch to your dependencies and import it here depending on your setup).
We'll extract your cookie the same as you did in your question, and then set it as a header. The fetch function should look like this:
(uri, options) => {
  const token = Cookies.get('XSRF-TOKEN')
  options.headers['X-XSRF-TOKEN'] = token
  return fetch(uri, options)
}
Putting it all together
Ultimately, your ~/plugins/apollo-client.js file should look something like this:
import { createHttpLink } from 'apollo-link-http'
import fetch from 'isomorphic-fetch'
import Cookies from 'js-cookie' // provides Cookies.get, same as in the component above

export default function (context) {
  return {
    defaultHttpLink: false,
    link: createHttpLink({
      uri: '/graphql',
      credentials: 'include',
      fetch: (uri, options) => {
        const token = Cookies.get('XSRF-TOKEN')
        options.headers['X-XSRF-TOKEN'] = token
        return fetch(uri, options)
      }
    })
  }
}
I am using an AWS Lambda function for a Google smart home Action, with AWS API Gateway as the fulfillment URL to reach the Lambda. I can successfully handle the Google Assistant's intents with the code below:
const {smarthome} = require('actions-on-google');
const app = smarthome();

app.onExecute((body, headers) => {
  return {
    requestId: 'ff36...',
    payload: {
      // ...
    },
  };
});

app.onQuery((body, headers) => {
  return {
    requestId: 'ff36...',
    payload: {
      // ...
    },
  };
});

app.onSync((body, headers) => {
  console.log("body: " + JSON.stringify(body));
  console.log("headers: " + JSON.stringify(headers));
  return {
    requestId: 'ff36...',
    payload: {
      // ...
    },
  };
});

exports.handler = app;
When I hard-code device details in this function, they show up correctly in the Google Home app. But to get the user's actual devices I need the OAuth token from the "SYNC" intent, and all I get from this code is this output:
body: {"inputs":[{"intent":"action.devices.SYNC"}],"requestId":"5604033533610827657"}
headers: {}
Unlike "Discover Directive" of Alexa's skill, which contains token in request.directive.endpoint.scope.token, google's intent doesn't seems to carry it. For O Auth, I am using AWS Cognito which works fine with Alexa Account linking and for google home too it can successfully link the account and show devices which I hardcode in lambda function.
As per this answer, the token is in
headers.authorization.substr(7)
I've tried that and got nothing. It shows
"Cannot read property 'substr' of undefined".
The lambda handler in the Actions on Google client library assumes that the request headers are present at event.headers within the input event parameter of a Lambda Proxy Integration. If you have a custom Lambda integration or have otherwise modified the input mapping, you may need to edit your mapping template to ensure the headers are placed where the client library expects.
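If the headers do end up somewhere else in the event (for example under event.params.header with the default method-request passthrough template of a custom integration), one illustrative option, rather than remapping in API Gateway, is to wrap the app and normalize the event before handing it over. This is only a sketch, not a documented API, and the lookup must match your actual mapping template:

// Instead of exports.handler = app, wrap the app so that whatever location your
// mapping template put the headers in is copied to event.headers, where the
// Actions on Google Lambda handler expects them.
exports.handler = (event, context, callback) => {
  const normalized = {
    ...event,
    headers: event.headers || (event.params && event.params.header) || {},
  };
  return app(normalized, context, callback);
};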
I can't seem to find any documentation on how to restrict the login to my web application (which uses OAuth2.0 and Google APIs) to only accept authentication requests from users with an email on a specific domain name or set of domain names. I would like to whitelist as opposed to blacklist.
Does anyone have suggestions on how to do this, documentation on the officially accepted method, or an easy, secure workaround?
For the record, I do not know any info about the user until they attempt to log in through Google's OAuth authentication. All I receive back is the basic user info and email.
So I've got an answer for you. In the OAuth request you can add hd=example.com and it will restrict authentication to users from that domain (I don't know if you can do multiple domains). You can find the hd parameter documented here.
I'm using the Google API libraries from here: http://code.google.com/p/google-api-php-client/wiki/OAuth2 so I had to manually edit the /auth/apiOAuth2.php file to this:
public function createAuthUrl($scope) {
  $params = array(
    'response_type=code',
    'redirect_uri=' . urlencode($this->redirectUri),
    'client_id=' . urlencode($this->clientId),
    'scope=' . urlencode($scope),
    'access_type=' . urlencode($this->accessType),
    'approval_prompt=' . urlencode($this->approvalPrompt),
    'hd=example.com'
  );
  if (isset($this->state)) {
    $params[] = 'state=' . urlencode($this->state);
  }
  $params = implode('&', $params);
  return self::OAUTH2_AUTH_URL . "?$params";
}
I'm still working on this app and found this, which may be the more correct answer to this question. https://developers.google.com/google-apps/profiles/
Client Side:
Using the auth2 init function, you can pass the hosted_domain parameter to restrict the accounts listed on the signin popup to those matching your hosted_domain. You can see this in the documentation here: https://developers.google.com/identity/sign-in/web/reference
Server Side:
Even with a restricted client-side list you will need to verify that the id_token matches the hosted domain you specified. For some implementations this means checking the hd attribute you receive from Google after verifying the token.
Full Stack Example:
Web Code:
gapi.load('auth2', function () {
  // init auth2 with your hosted_domain
  // only matching accounts will show up in the list or be accepted
  var auth2 = gapi.auth2.init({
    client_id: "your-client-id.apps.googleusercontent.com",
    hosted_domain: 'your-special-domain.example'
  });

  // setup your signin button
  auth2.attachClickHandler(yourButtonElement, {});

  // when the current user changes
  auth2.currentUser.listen(function (user) {
    // if the user is signed in
    if (user && user.isSignedIn()) {
      // validate the token on your server,
      // your server will need to double check that the
      // `hd` matches your specified `hosted_domain`;
      validateTokenOnYourServer(user.getAuthResponse().id_token)
        .then(function () {
          console.log('yay');
        })
        .catch(function (err) {
          auth2.then(function () { auth2.signOut(); });
        });
    }
  });
});
Server code (using Google's Node.js library):
If you're not using Node.js you can view other examples here: https://developers.google.com/identity/sign-in/web/backend-auth
const GoogleAuth = require('google-auth-library');
const Auth = new GoogleAuth();
const authData = JSON.parse(fs.readFileSync(your_auth_creds_json_file));
const oauth = new Auth.OAuth2(authData.web.client_id, authData.web.client_secret);

const acceptableISSs = new Set(
  ['accounts.google.com', 'https://accounts.google.com']
);

const validateToken = (token) => {
  return new Promise((resolve, reject) => {
    if (!token) {
      return reject();
    }
    oauth.verifyIdToken(token, null, (err, ticket) => {
      if (err) {
        return reject(err);
      }
      const payload = ticket.getPayload();
      const tokenIsOK = payload &&
        payload.aud === authData.web.client_id &&
        new Date(payload.exp * 1000) > new Date() &&
        acceptableISSs.has(payload.iss) &&
        payload.hd === 'your-special-domain.example';
      return tokenIsOK ? resolve() : reject();
    });
  });
};
When defining your provider, pass in a hash at the end with the 'hd' parameter. You can read up on that here. https://developers.google.com/accounts/docs/OpenIDConnect#hd-param
E.g., for config/initializers/devise.rb
config.omniauth :google_oauth2, 'identifier', 'key', {hd: 'yourdomain.com'}
Here's what I did using Passport in Node.js. profile is the user attempting to log in.
// passed, stringified email login
var emailString = String(profile.emails[0].value);
// the domain you want to whitelist
var yourDomain = '@google.com';
// check the last characters, including and after the @ symbol, of the passed user login.
// This means '@google.com' must be the final set of characters in the attempted login
var domain = emailString.substr(emailString.length - yourDomain.length);
// I send the user back to the login screen if the domain does not match
if (domain != yourDomain)
  return done(null, false); // no error, but authentication fails
Then just create logic to look for multiple domains instead of just one. I believe this method is secure because: 1. the '@' symbol is not a valid character in the first or second part of an email address, so I could not trick the function by creating an email address like mike@fake@google.com; 2. in a traditional login system I could, but such an email address could never exist in Google, and if it's not a valid Google account, you can't log in.
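A rough sketch of that multiple-domain check (the domain list is a placeholder, and the done signature should match your Passport strategy callback):

// Hypothetical extension of the single-domain check above to a whitelist.
var allowedDomains = ['@google.com', '@example.com'];
var emailString = String(profile.emails[0].value).toLowerCase();

var isAllowed = allowedDomains.some(function (allowed) {
  // same idea as before: the allowed domain must be the final characters of the login
  return emailString.substr(emailString.length - allowed.length) === allowed;
});

if (!isAllowed)
  return done(null, false); // reject logins outside the whitelist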
Since 2015 the library has had a function to set this without needing to edit its source, as in the workaround by aaron-bruce.
Before generating the URL, just call setHostedDomain on your Google client:
$client->setHostedDomain("HOSTED DOMAIN")
For login with Google using Laravel Socialite
https://laravel.com/docs/8.x/socialite#optional-parameters
use Laravel\Socialite\Facades\Socialite;

return Socialite::driver('google')
    ->with(['hd' => 'pontomais.com.br'])
    ->redirect();