Get OAuth 2.0 token for Google service accounts - Ajax

Short explanation
I want to get an OAuth 2.0 token to access some APIs in my Google Cloud Platform project.
Context
At the moment I have a WordPress page that has to make the connection. Temporarily I will make a JavaScript connection from the client via Ajax (when everything works I will do this another way, for example with a PHP server in the middle).
The process that has to run in our GCP project doesn't need the user to log in with their Google account, so we will use a Google service account for server-to-server connections. Everything executed through the API will be logged as being executed by this service account, which isn't owned by any real person.
When I make the Ajax request to get the token, it is sent to the following URL:
https://oauth2.googleapis.com/token
I send it encoded as a JWT.
The encoded message is generated with this JavaScript code:
`
var unixHour = Math.round((new Date()).getTime() / 1000);

var header = {
  "alg": "RS256",
  "typ": "JWT"
};

var data = {
  "iss": "nombreoculto@swift-firmament-348509.iam.gserviceaccount.com",
  "scope": "https://www.googleapis.com/auth/devstorage.read_only",
  "aud": "https://oauth2.googleapis.com/token",
  "exp": (unixHour + 3600),
  "iat": unixHour
};

var secret = "MIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQCkhZH7TuaNO4XBVVVcE2P/hvHSsGXNu1D/FcCaMrW56BF/nbOlxAtbp07TCIOyrR1FEcJb+to66olSFnUVUWhWUB9zLbzKpULQoFmYECSWppUbCZd+bp271AFYZpxXFduziWuaG9BNxV2cmWTjLLlZI7FoIYFwLgPZHPWndY0E99lGEjmnH";

function base64url(source) {
  // Encode in classical base64
  var encodedSource = CryptoJS.enc.Base64.stringify(source);
  // Remove padding equal characters
  encodedSource = encodedSource.replace(/=+$/, '');
  // Replace characters according to base64url specifications
  encodedSource = encodedSource.replace(/\+/g, '-');
  encodedSource = encodedSource.replace(/\//g, '_');
  return encodedSource;
}

var stringifiedHeader = CryptoJS.enc.Utf8.parse(JSON.stringify(header));
var encodedHeader = base64url(stringifiedHeader);
//document.getElementById("header").innerText = encodedHeader;
console.log(encodedHeader);

var stringifiedData = CryptoJS.enc.Utf8.parse(JSON.stringify(data));
var encodedData = base64url(stringifiedData);
//document.getElementById("payload").innerText = encodedData;
console.log(encodedData);

var signature = encodedHeader + "." + encodedData;
signature = CryptoJS.HmacSHA256(signature, secret);
signature = base64url(signature);
console.log(signature);
//document.getElementById("signature").innerText = signature;

var jwt = encodedHeader + "." + encodedData + "." + signature;
console.log(jwt);

$.ajax({
  url: 'https://oauth2.googleapis.com/token',
  type: 'POST',
  data: { "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer", "assertion": jwt },
  contentType: 'application/x-www-form-urlencoded; charset=utf-8',
  success: function (response) {
    alert(response.status);
  },
  error: function () {
    alert("error");
  }
});
`
Console output: [screenshot]
The problem
The Ajax request generated by the script returns "Invalid JWT signature".
[screenshots: the request sent to the API and its response]
According to the Google documentation, this error is caused either by a badly encoded message or by an incorrect secret key.
You can see the code that generates the encoded message in the previous script.
As for the secret key, maybe I am not selecting the correct key for this task; here are the steps I follow:
[screenshot: GCP credentials page]
Inside the service account, I create a key in the "Keys" section:
[screenshot: the "Keys" section in GCP]
As a result, this downloads the following file:
[screenshot: the downloaded key file]
I tried using the "private_key" content of this file as the secret key, and I also tried deleting its line breaks (\n) and trying again.
Is that correct, or am I not using the correct key?
Or maybe I am encoding it incorrectly?
*There is no problem with sharing the key and account ID, because the key was disabled before this thread was shared and the project is only for testing purposes.

Related

Does Google Apps Script have an equivalent to Python's Session object?

I have this Python script and I want to write the Google Apps Script equivalent, but I do not know how to "pass" whatever needs to be passed between subsequent GET or POST requests once I log in.
import requests
import json

# login
session = requests.session()
data = {
    'LoginName': 'name',
    'Password': 'password'
}
session.post('https://www.web.com/en-CA/Login/Login', data=data)
session.get('https://www.web.com//en-CA/Redirect/?page=Dashboard')

# get customer table
data = {
    'page': '1',
    'pageSize': '100'
}
response = session.post('https://www.web.com/en-CA/Reporting', data=data)
print(response.json())
I wonder if there is an equivalent to the .session() object from Python's requests module. I searched Google but could not find any working example. I am not a coder, so I don't know exactly what that .session() object does. Would it be enough to pass headers from the response when making a new request?
UPDATE
I read in another question that Google might use a different IP for every single UrlFetchApp.fetch call, so the login and cookies might not work, I guess.
I believe your goal is as follows.
You want to reproduce your Python script with Google Apps Script.
Issue and workaround:
If my understanding is correct, when session() of Python is used, multiple requests can be made while keeping the cookie. In order to achieve this with Google Apps Script, for example, I thought that the cookie could be retrieved from the 1st request and included in the request header of the 2nd request, because, at the current stage, UrlFetchApp has no method for directly keeping a cookie and using it in the next request.
Given the above, when your script is converted to Google Apps Script, it becomes as follows.
Sample script:
function myFunction() {
  const url1 = "https://www.web.com/en-CA/Login/Login";
  const url2 = "https://www.web.com//en-CA/Redirect/?page=Dashboard";
  const url3 = "https://www.web.com/en-CA/Reporting";

  // 1st request
  const params1 = {
    method: "post",
    payload: {LoginName: "name", Password: "password"},
    followRedirects: false
  };
  const res1 = UrlFetchApp.fetch(url1, params1);
  const headers1 = res1.getAllHeaders();
  if (!headers1["Set-Cookie"]) throw new Error("No cookie");

  // 2nd request
  const params2 = {
    headers: {Cookie: JSON.stringify(headers1["Set-Cookie"])},
    followRedirects: false
  };
  const res2 = UrlFetchApp.fetch(url2, params2);
  const headers2 = res2.getAllHeaders();

  // 3rd request
  const params3 = {
    method: "post",
    payload: {page: "1", pageSize: "100"},
    headers: {Cookie: JSON.stringify(headers2["Set-Cookie"] ? headers2["Set-Cookie"] : headers1["Set-Cookie"])},
    followRedirects: false
  };
  const res3 = UrlFetchApp.fetch(url3, params3);
  console.log(res3.getContentText());
}
With this sample script, the cookie can be retrieved from the 1st request and used for the next request.
Unfortunately, I have no information about your actual server and cannot test against your actual URLs, so I'm not sure whether this sample script works directly for your server.
Also, I'm not sure whether followRedirects: false is required in each request. If an error occurs, please remove it and test again.
As for how the cookie is included in the request header, JSON.stringify might not be required. But I'm not sure about this for your server.
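If JSON.stringify turns out not to be accepted by your server, a common alternative (just a sketch, untested against your URLs; buildCookieHeader is a hypothetical helper) is to keep only the name=value part of each Set-Cookie entry and join them with "; ":
// Hypothetical helper: build a Cookie header value from the Set-Cookie
// header(s) returned by UrlFetchApp. Set-Cookie may be a string or an array.
function buildCookieHeader(setCookie) {
  const cookies = Array.isArray(setCookie) ? setCookie : [setCookie];
  // Keep only "name=value", dropping attributes like Path, Expires or HttpOnly.
  return cookies.map(function (c) { return String(c).split(";")[0]; }).join("; ");
}

// Usage with the sample script above:
// const params2 = {
//   headers: {Cookie: buildCookieHeader(headers1["Set-Cookie"])},
//   followRedirects: false
// };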
Reference:
Class UrlFetchApp
You might want to try this:
var nl = getNewLine()

function getNewLine() {
  var agent = navigator.userAgent
  if (agent.indexOf("Win") >= 0)
    return "\r\n"
  else if (agent.indexOf("Mac") >= 0)
    return "\r"
  return "\r"
}

pagecode = `import requests
import json
# login
session = requests.session()
data = {
    'LoginName': 'name',
    'Password': 'password'
}
session.post('https://www.web.com/en-CA/Login/Login', data=data)
session.get('https://www.web.com//en-CA/Redirect/?page=Dashboard')
# get customer table
data = {
    'page': '1',
    'pageSize': '100'
}
response = session.post('https://www.web.com/en-CA/Reporting', data=data)
print(response.json())`
document.write(pagecode);
I used this program

Gmail API for managing multiple signatures

Google recently released an update to Gmail to bring support for multiple signatures. Ref: https://support.google.com/mail/answer/8395.
I do not see anything in the API documentation at https://developers.google.com/gmail/api/v1/reference that talks about how to manage those multiple signatures. How can I:
create a new signature
update a specific existing signature
associate a signature to an email address - both the "for new emails use" and "on reply/forward use"
Is there any documentation on this?
It is a bit hidden: the signature(s) need to be created with Users.settings.sendAs: create or update.
As specified for the resource, signature is one of the parameters that can be modified. You do not need to create a new alias; you can also apply this method to your primary email:
Settings associated with a send-as alias, which can be either the
primary login address associated with the account or a custom "from"
address.
Important: access is restricted to service accounts that have been delegated domain-wide authority.
Sample with Apps Script:
function createAlias() {
  var alias = 'your primary email';
  var signature = 'Your signature';
  var service = getOAuthService();
  service.reset();
  if (service.hasAccess()) {
    var url = 'https://www.googleapis.com/gmail/v1/users/me/settings/sendAs';
    var headers = {
      "Authorization": 'Bearer ' + service.getAccessToken(),
      "Accept": "application/json",
      "Content-Type": "application/json",
    };
    var resource = {
      sendAsEmail: alias,
      signature: signature,
    };
    var options = {
      'headers': headers,
      'method': 'POST',
      'payload': JSON.stringify(resource),
      'muteHttpExceptions': true
    };
    var response = UrlFetchApp.fetch(url, options);
    Logger.log(response.getContentText());
  }
}
Necessary scope:
https://www.googleapis.com/auth/gmail.settings.sharing
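To update an existing signature rather than create one, a sketch along the same lines (assuming the same getOAuthService() helper and scope as above; updateSignature is just an illustrative name) would send a PATCH to the specific sendAs address:
// Sketch only: updates the signature of an existing send-as address.
// Assumes the same getOAuthService() helper and scope as above.
function updateSignature() {
  var sendAsEmail = 'your primary email';   // the alias whose signature you want to change
  var newSignature = 'Your updated signature';
  var service = getOAuthService();
  if (service.hasAccess()) {
    var url = 'https://www.googleapis.com/gmail/v1/users/me/settings/sendAs/' +
        encodeURIComponent(sendAsEmail);
    var options = {
      method: 'patch',                       // partial update of the sendAs resource
      headers: {Authorization: 'Bearer ' + service.getAccessToken()},
      contentType: 'application/json',
      payload: JSON.stringify({signature: newSignature}),
      muteHttpExceptions: true
    };
    Logger.log(UrlFetchApp.fetch(url, options).getContentText());
  }
}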

Google Cloud Platform: Unable to upload a new file version in Storage via API

I wrote a script that uploads a file to a bucket in Google Cloud Storage:
Ref: https://cloud.google.com/storage/docs/json_api/v1/objects/insert
function submitForm(bucket, accessToken) {
  console.log("Fetching the file...");
  var input = document.getElementsByTagName('input')[0];
  var name = input.files[0].name;
  var uploadUrl = 'https://www.googleapis.com/upload/storage/v1/b/' +
      bucket + '/o?uploadType=media&access_token=' + accessToken + '&name=' + name;
  event.preventDefault();
  fetch(uploadUrl, {
    method: 'POST',
    body: input.files[0]
  }).then(function(res) {
    console.log(res);
    location.reload();
  }).catch(function(err) {
    console.error('Got error:', err);
  });
}
It works perfectly fine when uploading a new file.
However, I get a 403 status code in the API response body while trying to replace an existing file with a new version.
Please note that:
The OAuth 2.0 scope for Google Cloud Storage is: https://www.googleapis.com/auth/devstorage.read_write
I did enable versioning for the destination bucket
Could someone help me by pointing out what I did wrong?
Update I:
As suggested, I am trying to invoke the rewrite function as follows:
const input = document.getElementsByName('uploadFile')[0];
const name = input.files[0].name;
const overwriteObjectUrl = 'https://www.googleapis.com/storage/v1/' +
    'b/' + bucket +
    '/o/' + name +
    '/rewriteTo/b/' + bucket +
    '/o/' + name;
fetch(overwriteObjectUrl, {
  method: 'POST',
  body: input.files[0]
})
However, I am getting a 400 (Bad Request) error:
{"error":{"errors":[{"domain":"global","reason":"parseError","message":"Parse Error"}],"code":400,"message":"Parse Error"}}
Could you explain to me what I am doing wrong?
Update II:
By replacing body: input.files[0] with body: input.files[0].data I made it work... theoretically!
I get a positive response body:
{
  "kind": "storage#rewriteResponse",
  "totalBytesRewritten": "43",
  "objectSize": "43",
  "done": true,
  "resource": {
    "kind": "storage#object",
    "id": "mybuck/README.txt/1520085847067373",
    "selfLink": "https://www.googleapis.com/storage/v1/b/mybuck/o/README.txt",
    "name": "README.txt",
    "bucket": "mybuck",
    "generation": "1520085847067373",
    "metageneration": "1",
    "contentType": "text/plain",
    "timeCreated": "2018-03-03T14:04:07.066Z",
    "updated": "2018-03-03T14:04:07.066Z",
    "storageClass": "MULTI_REGIONAL",
    "timeStorageClassUpdated": "2018-03-03T14:04:07.066Z",
    "size": "43",
    "md5Hash": "UCQnjcpiPBEzdl/iWO2e1w==",
    "mediaLink": "https://www.googleapis.com/download/storage/v1/b/mybuck/o/README.txt?generation=1520085847067373&alt=media",
    "crc32c": "y4PZOw==",
    "etag": "CO2VxYep0NkCEAE="
  }
}
With a new generation number as well (versioning is enabled).
However, the file content has not been updated: I appended new strings, but they do not show up in the file. Do you have any idea?
Thanks a lot in advance.
Based on the information available it's difficult to diagnose this issue with certainty; however, I would check the roles assigned to the user or service account you are using for this operation.
As you have been able to upload a file, but not overwrite a file, this sounds like you may have assigned the user or service account that is attempting to perform this task the 'Storage Object Creator' role.
Users/service accounts with the Storage Object Creator role can create new objects in buckets but not overwrite existing ones (you can see this mentioned here).
If this is the case, you could try assigning the user/service account the role of 'Storage Object Admin' which allows users full control over bucket objects.
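One way to check which permissions your token actually has on the bucket is the buckets testIamPermissions endpoint. A sketch, reusing the bucket and accessToken variables from the question (I have not run this against your bucket):
// Sketch: ask Cloud Storage which of these permissions the caller holds.
// Overwriting an existing object needs storage.objects.delete in addition
// to storage.objects.create.
const permUrl = 'https://www.googleapis.com/storage/v1/b/' + bucket +
    '/iam/testPermissions' +
    '?permissions=storage.objects.create&permissions=storage.objects.delete';

fetch(permUrl, {
  headers: {'Authorization': 'Bearer ' + accessToken}
}).then(function (res) { return res.json(); })
  .then(function (result) { console.log('Granted permissions:', result.permissions); });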
"insert" is only to be used to create new objects per the Methods section of the API's documentation, so you'll need to use "rewrite" to rewrite an existing object.

IBM Watson WebSocket connection failure: "HTTP Authentication failed; no valid credentials available"

I'm doing the tutorial for IBM Watson Speech-to-text. In the section "Using the WebSocket interface", subsection "Opening a connection and passing credentials", I copied the following code:
var token = watsonToken;
console.log(token); // token looks good
var wsURI = 'wss://stream.watsonplatform.net/speech-to-text/api/v1/recognize?watson-token=' +
token + '&model=es-ES_BroadbandModel';
var websocket = new WebSocket(wsURI);
websocket.onopen = function(evt) { onOpen(evt) };
websocket.onclose = function(evt) { onClose(evt) };
websocket.onmessage = function(evt) { onMessage(evt) };
websocket.onerror = function(evt) { onError(evt) };
I'm using Angular so I made a value for the token:
app.value('watsonToken', 'Ln%2FV...');
I get back an error message:
WebSocket connection to 'wss://stream.watsonplatform.net/speech-to-text/api/v1/recognize?watson-toke...&model=es-ES_BroadbandModel' failed: HTTP Authentication failed; no valid credentials available
I tried hardcoding the token:
var wsURI = 'wss://stream.watsonplatform.net/speech-to-text/api/v1/recognize?watson-token=Ln%2FV2...&model=es-ES_BroadbandModel';
Same error message.
IBM's documentation on tokens says that an expired or invalid token will return a 401 error, which I didn't get, so I presume that my token is neither expired nor invalid. Any suggestions?
I think you can see the Official Example from IBM Developers here.
The error occurs because the authentication does not succeed before you send the recognize request. Try to follow the same steps as in this repository, for example:
const QUERY_PARAMS_ALLOWED = ['model', 'X-Watson-Learning-Opt-Out', 'watson-token', 'customization_id'];

/**
 * pipe()-able Node.js Readable/Writeable stream - accepts binary audio and emits text in it's `data` events.
 * Also emits `results` events with interim results and other data.
 * Uses WebSockets under the hood. For audio with no recognizable speech, no `data` events are emitted.
 * @param {Object} options
 * @constructor
 */
function RecognizeStream(options) {
  Duplex.call(this, options);
  this.options = options;
  this.listening = false;
  this.initialized = false;
}
util.inherits(RecognizeStream, Duplex);

RecognizeStream.prototype.initialize = function() {
  const options = this.options;
  if (options.token && !options['watson-token']) {
    options['watson-token'] = options.token;
  }
  if (options.content_type && !options['content-type']) {
    options['content-type'] = options.content_type;
  }
  if (options['X-WDC-PL-OPT-OUT'] && !options['X-Watson-Learning-Opt-Out']) {
    options['X-Watson-Learning-Opt-Out'] = options['X-WDC-PL-OPT-OUT'];
  }
  const queryParams = extend({ model: 'en-US_BroadbandModel' }, pick(options, QUERY_PARAMS_ALLOWED));
  const queryString = Object.keys(queryParams)
    .map(function(key) {
      return key + '=' + (key === 'watson-token' ? queryParams[key] : encodeURIComponent(queryParams[key])); // our server chokes if the token is correctly url-encoded
    })
    .join('&');
  const url = (options.url || 'wss://stream.watsonplatform.net/speech-to-text/api').replace(/^http/, 'ws') + '/v1/recognize?' + queryString;
  const openingMessage = extend(
    {
      action: 'start',
      'content-type': 'audio/wav',
      continuous: true,
      interim_results: true,
      word_confidence: true,
      timestamps: true,
      max_alternatives: 3,
      inactivity_timeout: 600
    },
    pick(options, OPENING_MESSAGE_PARAMS_ALLOWED)
  );
This code is from IBM Developers; I'm using it for my project and it works perfectly.
You can see in the code, around line #53, that listening is set to true; otherwise the connection will eventually time out and close automatically. The inactivity_timeout applies when you're sending audio with no speech in it, not when you aren't sending any data at all.
There is another example; see this example from IBM Watson - Watson Developer Cloud using JavaScript for Speech to Text.
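As a rough illustration of that handshake (a sketch only, reusing the watsonToken value from the question), the client sends a start message on open and waits for the server's {"state": "listening"} reply before streaming audio:
// Sketch: minimal opening handshake against the Watson STT WebSocket interface.
// "watsonToken" is assumed to hold a valid authorization token, as in the question.
var wsURI = 'wss://stream.watsonplatform.net/speech-to-text/api/v1/recognize' +
    '?watson-token=' + watsonToken + '&model=es-ES_BroadbandModel';
var ws = new WebSocket(wsURI);

ws.onopen = function () {
  // Ask the service to start a recognition session.
  ws.send(JSON.stringify({
    action: 'start',
    'content-type': 'audio/wav',
    interim_results: true,
    inactivity_timeout: 600
  }));
};

ws.onmessage = function (evt) {
  var msg = JSON.parse(evt.data);
  if (msg.state === 'listening') {
    // Only now is the service ready to receive binary audio frames.
    console.log('Server is listening - start sending audio');
  } else if (msg.results) {
    console.log('Transcription:', JSON.stringify(msg.results));
  }
};

ws.onerror = function (evt) {
  console.error('WebSocket error', evt);
};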
Elementary, my dear Watson! There are three or four things to pay attention to with IBM Watson tokens.
First, you won't get a token if you use your IBMid and password. You have to use the username and password that were provided for a project. That username is a string of letters and numbers with hyphens.
Second, the documentation for tokens gives you code for getting a token:
curl -X GET --user {username}:{password}
--output token
"https://stream.watsonplatform.net/authorization/api/v1/token?url=https://stream.watsonplatform.net/text-to-speech/api"
Part of that code is hidden on the webpage, specifically the part that says /text-to-speech/. You need to change that to the Watson product or service you want to use, e.g., /speech-to-text/. Tokens are for specific projects and specific services.
Third, tokens expire in one hour.
Lastly, I had to put in backslashes to get the code to run in my terminal:
curl -X GET --user s0921i-s002d-dh9328d9-hd923:wy928ye98e \
--output token \
"https://stream.watsonplatform.net/authorization/api/v1/token?url=https://stream.watsonplatform.net/speech-to-text/api"

Restrict Login Email with Google OAuth2.0 to Specific Domain Name

I can't seem to find any documentation on how to restrict the login to my web application (which uses OAuth2.0 and Google APIs) to only accept authentication requests from users with an email on a specific domain name or set of domain names. I would like to whitelist as opposed to blacklist.
Does anyone have suggestions on how to do this, documentation on the officially accepted method of doing so, or an easy, secure work around?
For the record, I do not know any info about the user until they attempt to log in through Google's OAuth authentication. All I receive back is the basic user info and email.
So I've got an answer for you. In the OAuth request you can add hd=example.com and it will restrict authentication to users from that domain (I don't know if you can do multiple domains). You can find the hd parameter documented here.
I'm using the Google API libraries from here: http://code.google.com/p/google-api-php-client/wiki/OAuth2 so I had to manually edit the /auth/apiOAuth2.php file to this:
public function createAuthUrl($scope) {
  $params = array(
    'response_type=code',
    'redirect_uri=' . urlencode($this->redirectUri),
    'client_id=' . urlencode($this->clientId),
    'scope=' . urlencode($scope),
    'access_type=' . urlencode($this->accessType),
    'approval_prompt=' . urlencode($this->approvalPrompt),
    'hd=example.com'
  );
  if (isset($this->state)) {
    $params[] = 'state=' . urlencode($this->state);
  }
  $params = implode('&', $params);
  return self::OAUTH2_AUTH_URL . "?$params";
}
I'm still working on this app and found this, which may be the more correct answer to this question. https://developers.google.com/google-apps/profiles/
Client Side:
Using the auth2 init function, you can pass the hosted_domain parameter to restrict the accounts listed on the signin popup to those matching your hosted_domain. You can see this in the documentation here: https://developers.google.com/identity/sign-in/web/reference
Server Side:
Even with a restricted client-side list you will need to verify that the id_token matches the hosted domain you specified. For some implementations this means checking the hd attribute you receive from Google after verifying the token.
Full Stack Example:
Web Code:
gapi.load('auth2', function () {
  // init auth2 with your hosted_domain
  // only matching accounts will show up in the list or be accepted
  var auth2 = gapi.auth2.init({
    client_id: "your-client-id.apps.googleusercontent.com",
    hosted_domain: 'your-special-domain.example'
  });
  // setup your signin button
  auth2.attachClickHandler(yourButtonElement, {});
  // when the current user changes
  auth2.currentUser.listen(function (user) {
    // if the user is signed in
    if (user && user.isSignedIn()) {
      // validate the token on your server,
      // your server will need to double check that the
      // `hd` matches your specified `hosted_domain`;
      validateTokenOnYourServer(user.getAuthResponse().id_token)
        .then(function () {
          console.log('yay');
        })
        .catch(function (err) {
          auth2.then(function() { auth2.signOut(); });
        });
    }
  });
});
Server Code (using Google's Node.js library):
If you're not using Node.js you can view other examples here: https://developers.google.com/identity/sign-in/web/backend-auth
const fs = require('fs');
const GoogleAuth = require('google-auth-library');
const Auth = new GoogleAuth();
const authData = JSON.parse(fs.readFileSync(your_auth_creds_json_file));
const oauth = new Auth.OAuth2(authData.web.client_id, authData.web.client_secret);
const acceptableISSs = new Set(
  ['accounts.google.com', 'https://accounts.google.com']
);
const validateToken = (token) => {
  return new Promise((resolve, reject) => {
    if (!token) {
      reject();
    }
    oauth.verifyIdToken(token, null, (err, ticket) => {
      if (err) {
        return reject(err);
      }
      const payload = ticket.getPayload();
      const tokenIsOK = payload &&
        payload.aud === authData.web.client_id &&
        new Date(payload.exp * 1000) > new Date() &&
        acceptableISSs.has(payload.iss) &&
        payload.hd === 'your-special-domain.example';
      return tokenIsOK ? resolve() : reject();
    });
  });
};
When defining your provider, pass in a hash at the end with the 'hd' parameter. You can read up on that here. https://developers.google.com/accounts/docs/OpenIDConnect#hd-param
E.g., for config/initializers/devise.rb
config.omniauth :google_oauth2, 'identifier', 'key', {hd: 'yourdomain.com'}
Here's what I did using Passport in Node.js. profile is the user attempting to log in.
// passed, stringified email login
var emailString = String(profile.emails[0].value);
// the domain you want to whitelist
var yourDomain = '@google.com';
// check the x amount of characters including and after the @ symbol of the passed user login.
// This means '@google.com' must be the final set of characters in the attempted login
var domain = emailString.substr(emailString.length - yourDomain.length);
// I send the user back to the login screen if the domain does not match
if (domain != yourDomain)
  return done(err);
Then just create logic to look for multiple domains instead of just one. I believe this method is secure because: 1. The '@' symbol is not a valid character in the first or second part of an email address; I could not trick the function by creating an email address like mike@fake@google.com. 2. In a traditional login system I could, but such an email address could never exist in Google, and if it's not a valid Google account, you can't log in.
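Extending that idea to several domains might look like this (a sketch only; the whitelist and the isWhitelisted helper are hypothetical):
// Sketch: whitelist several domains instead of one.
// Replace the hypothetical list below with your own domains.
var allowedDomains = ['@google.com', '@example.org'];

function isWhitelisted(email) {
  var emailString = String(email).toLowerCase();
  // true if the email ends with any of the whitelisted '@domain' suffixes
  return allowedDomains.some(function (domain) {
    return emailString.substr(emailString.length - domain.length) === domain;
  });
}

// Usage inside the Passport verify callback:
// if (!isWhitelisted(profile.emails[0].value)) return done(err);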
Since 2015 there has been a function in the library to set this without needing to edit the source of the library as in the workaround by aaron-bruce.
Before generating the URL, just call setHostedDomain on your Google client:
$client->setHostedDomain("HOSTED DOMAIN")
For login with Google using Laravel Socialite
https://laravel.com/docs/8.x/socialite#optional-parameters
use Laravel\Socialite\Facades\Socialite;
return Socialite::driver('google')
->with(['hd' => 'pontomais.com.br'])
->redirect();

Resources