Spotify application requests authorization - AJAX

I am trying to get an access token from Spotify with the following code:
var encoded = btoa(client_id + ':' + client_secret);

function myOnClick() {
  console.log('clicked!');
  $.ajax({
    url: 'https://accounts.spotify.com/api/token',
    type: 'POST',
    data: {
      grant_type: 'client_credentials',
      'Content-Type': 'application/x-www-form-urlencoded'
    },
    headers: {
      Authorization: 'Basic ' + encoded
    },
    dataType: 'json'
  }).always((data) => console.log(data));
}
However, I keep getting errors:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading
the remote resource at https://accounts.spotify.com/api/token.
(Reason: CORS header ‘Access-Control-Allow-Origin’ missing).
and
readyState: 0, status: 0

Arielle from Spotify here.
Looks like you're using the Client Credentials flow, which is one of three authorization flows you can use with the Spotify API. (You can check out all three here.)
Client Credentials is meant for server-side use only and should not be used on the front end, as it requires a client secret, which you shouldn't be exposing!
You should use the Implicit Grant flow instead, which is made for use in the browser. It's easy to get up and running, too!
// Get the hash of the url
const hash = window.location.hash
  .substring(1)
  .split('&')
  .reduce(function (initial, item) {
    if (item) {
      var parts = item.split('=');
      initial[parts[0]] = decodeURIComponent(parts[1]);
    }
    return initial;
  }, {});
window.location.hash = '';

// Set token
let _token = hash.access_token;

const authEndpoint = 'https://accounts.spotify.com/authorize';

// Replace with your app's client ID, redirect URI and desired scopes
const clientId = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';
const redirectUri = 'http://localhost:8888';
const scopes = [
  'user-read-birthdate',
  'user-read-email',
  'user-read-private'
];

// If there is no token, redirect to Spotify authorization
if (!_token) {
  window.location = `${authEndpoint}?client_id=${clientId}&redirect_uri=${redirectUri}&scope=${scopes.join('%20')}&response_type=token`;
}
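Once Spotify redirects back and _token is set, you can call the Web API with it. A quick usage sketch (the /v1/me endpoint is just an illustration that pairs with the user-read-private scope above):
// Example: fetch the logged-in user's profile with the implicit-grant token.
fetch('https://api.spotify.com/v1/me', {
  headers: { Authorization: 'Bearer ' + _token }
})
  .then((res) => res.json())
  .then((profile) => console.log(profile.display_name));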
Gist: https://gist.github.com/arirawr/f08a1e17db3a1f65ada2c17592757049
And here's an example on Glitch, that you can "Remix" to make a copy and start making your app: https://glitch.com/edit/#!/spotify-implicit-grant
Hope that helps - happy hacking! 👩🏼‍💻

If you do need the Client Credentials flow, it has to run server-side. For reference, here is the same token request done with axios in Node.js (this snippet comes from a class, so this.apiLoginUrl, this.clientId and this.clientSecret are fields of that class):
const result = await axios({
  url: this.apiLoginUrl, // https://accounts.spotify.com/api/token
  method: 'post',
  data: 'grant_type=client_credentials',
  headers: {
    Authorization: `Basic ${Buffer.from(this.clientId + ':' + this.clientSecret).toString('base64')}`,
  },
});
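The token then comes back in result.data; a short usage sketch (the artist ID is just the example one from Spotify's docs, and any Web API call works the same way):
// result.data looks like: { access_token: '...', token_type: 'Bearer', expires_in: 3600 }
const accessToken = result.data.access_token;

// Use it as a Bearer token on any Web API request:
const artist = await axios.get('https://api.spotify.com/v1/artists/0OdUWJ0sBjDrqHygGUXeCF', {
  headers: { Authorization: `Bearer ${accessToken}` },
});
console.log(artist.data.name);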

Related

Extra information gets lost when updating user context

I've been fighting with this issue for days now and I just can't solve it. My app is built on React and Django Rest Framework. I'm authenticating users with JWT - when the user logs into the app, the React Auth context gets updated with some info about the tokens and I include some extra information in the context (namely the user email and some profile information) so that I have it easily accessible.
I do this by overriding TokenObtainPairSerializer from simplejwt:
class MyTokenObtainPairSerializer(TokenObtainPairSerializer):
    @classmethod
    def get_token(cls, user):
        token = super().get_token(user)
        # Add custom claims
        token["email"] = user.email
        token["information"] = Profile.objects.get(user=user).information
        return token
On the frontend in my AuthContext.js:
const loginUser = async (email, password, firstLogin = false) => {
  const response = await fetch(`${baseUrl}users/token/`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-CSRFToken": csrfToken,
    },
    body: JSON.stringify({
      email,
      password,
    }),
  });
  const data = await response.json();
  if (response.status === 200) {
    setAuthTokens(data);
    setUser(jwt_decode(data.access));
    localStorage.setItem("authTokens", JSON.stringify(data));
    if (firstLogin) {
      history.push("/profile");
    } else {
      history.push("/");
    }
  } else {
    return response;
  }
};
Up to this point it works perfectly fine, and React DevTools shows me that the AuthContext has all the data.
Now to the issue - once the access token has expired, the next API call the user makes gets intercepted to update the token. I do this in my axiosInstance:
const useAxios = () => {
  const { authTokens, setUser, setAuthTokens } = useContext(AuthContext);
  const csrfToken = getCookie("csrftoken");

  const axiosInstance = axios.create({
    baseURL,
    headers: {
      Authorization: `Bearer ${authTokens?.access}`,
      "X-CSRFToken": csrfToken,
      "Content-Type": "application/json",
    },
  });

  axiosInstance.interceptors.request.use(async (req) => {
    const user = jwt_decode(authTokens.access);
    const isExpired = dayjs.unix(user.exp).diff(dayjs()) < 1;
    if (!isExpired) return req;

    const response = await axios.post(`${baseURL}users/token/refresh/`, {
      refresh: authTokens.refresh,
    });

    // need to add user info to context here
    localStorage.setItem("authTokens", JSON.stringify(response.data));
    setAuthTokens(response.data);
    setUser(jwt_decode(response.data.access));

    req.headers.Authorization = `Bearer ${response.data.access}`;
    return req;
  });

  return axiosInstance;
};

export default useAxios;
But the extra information is not there. I tried to override TokenRefreshSerializer from simplejwt the same way I did with TokenObtainPairSerializer, but it just doesn't add the information:
class MyTokenRefreshSerializer(TokenRefreshSerializer):
    @classmethod
    def get_token(cls, user):
        token = super().get_token(user)
        token["email"] = user.email
        token["information"] = Profile.objects.get(user=user).information
        print(token)
        return token
It doesn't even print the token in my console but I have no clue what else I should try here.
Before anyone asks, yes I specified that the TokenRefreshView should use the custom serializer.
class MyTokenRefreshView(TokenRefreshView):
    serializer_class = MyTokenRefreshSerializer
However, after a while of being logged into the application, the email and information key-value pairs disappear from the context.
Any idea about how this can be solved will be much appreciated!
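One client-side workaround while the serializer question is open: since the interceptor rebuilds the context user purely from the refreshed access token, you can merge the previously decoded claims with the new ones so nothing disappears. A minimal sketch (assuming setUser is a plain React state setter, so it accepts a functional update):
// In the interceptor, instead of setUser(jwt_decode(response.data.access)):
const refreshed = jwt_decode(response.data.access);
setUser((prev) => ({
  ...prev,      // keeps email/information from the previous token
  ...refreshed, // standard claims (exp, user_id, ...) overwrite with fresh values
}));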

GET request from NetSuite to Oracle EPM fails with "Authorization Required - You are not authorized to access the requested resource"

Error: "Authorization Required - You are not authorized to access the requested resource. Check the supplied credentials (e.g., username and password)."
Using the exact same headers and URL, I am able to make the request successfully via Postman and PowerShell. But when making the call via SuiteScript, I get the auth error. I am thinking it may have something to do with how I am constructing the headers.
Here is the code I used via NetSuite Debugger:
require(['N/https', 'N/encode'], function (https, encode) {
  function fetchCSVdata() {
    var authObj = encode.convert({
      string: "username:password",
      inputEncoding: encode.Encoding.UTF_8,
      outputEncoding: encode.Encoding.BASE_64
    });
    var psswd = 'Basic ' + authObj;
    var headerObj = { 'Authorization': psswd };
    var response = https.get({
      url: 'https://<bleep>.pbcs.us6.oraclecloud.com/interop/rest/11.1.2.3.600/applicationsnapshots/DemandPlan_ExportItemPlan.csv/contents',
      headers: headerObj
    });
    return response.body;
  }

  var x = fetchCSVdata();
  log.debug("error", x);
});
Looking at some working code of mine, it is different from yours, but I don't see the error:
var authstring = encode.convert({
  string: 'username:password',
  inputEncoding: encode.Encoding.UTF_8,
  outputEncoding: encode.Encoding.BASE_64
});
var headerObj = { Authorization: 'Basic ' + authstring };
var response = https.get({ url: 'https://webservices.XXX.com', headers: headerObj });

Authorize Gmail API with JWT

I'm trying to send an email through the Gmail API from a Node.js application. I had this working, following the documentation and using the node-mailer package. However, I noticed that when we change our organization's password, the connection is no longer good (which makes sense). I'm therefore trying to authorize with a JWT instead.
The JWT is correctly generated and posted to https://oauth2.googleapis.com/token. This request then returns an access_token.
When it comes time to write and send the email, I tried to simply adapt the code that was previously working (at the time with a client_secret, client_id and redirect_uris):
const gmail = google.gmail({ version: 'v1', auth: access_token });
gmail.users.messages.send(
  {
    userId: 'email',
    resource: {
      raw: encodedMessage
    }
  },
  (err, result) => {
    if (err) {
      return console.log('NODEMAILER - The API returned: ' + err);
    }
    console.log('NODEMAILER Sending email reply from server: ' + result.data);
  }
);
The API keeps returning Error: Login Required.
Does anyone know how to solve this?
EDIT
I've modified my code and authentication to add the client_id and client_secret:
const oAuth2Client = new google.auth.OAuth2(
  credentials.gmail.client_id,
  credentials.gmail.client_secret,
  credentials.gmail.redirect_uris[0]
);
oAuth2Client.credentials = {
  access_token: access_token
};
const gmail = google.gmail({ version: 'v1', auth: oAuth2Client });
gmail.users.messages.send(
  {
    userId: 'email',
    resource: {
      raw: encodedMessage
    }
  },
  (err, result) => {
    if (err) {
      return console.log('NODEMAILER - The API returned: ' + err);
    }
    console.log('NODEMAILER Sending email reply from server: ' + result.data);
  }
);
But now the error is even less precise: Error: Bad Request
Here's the final authorization code that worked for me:
// Assumed imports for this snippet: jsonwebtoken, querystring, axios,
// and nodemailer's MailComposer (the original omitted the require lines).
const jwt = require('jsonwebtoken');
const querystring = require('querystring');
const axios = require('axios');
const mailComposer = require('nodemailer/lib/mail-composer');

var credentials = require('../../credentials');
const privKey = credentials.gmail.priv_key.private_key;

var jwtParams = {
  iss: credentials.gmail.priv_key.client_email,
  scope: 'https://www.googleapis.com/auth/gmail.send',
  aud: 'https://oauth2.googleapis.com/token',
  exp: Math.floor(new Date().getTime() / 1000 + 120),
  iat: Math.floor(new Date().getTime() / 1000),
  sub: [INSERT EMAIL THAT WILL BE SENDING (not the service email, the one that has granted delegated access to the service account)]
};

var gmail_token = jwt.sign(jwtParams, privKey, {
  algorithm: 'RS256'
});

var params = {
  grant_type: 'urn:ietf:params:oauth:grant-type:jwt-bearer',
  assertion: gmail_token
};
var params_string = querystring.stringify(params);

axios({
  method: 'post',
  url: 'https://oauth2.googleapis.com/token',
  data: params_string,
  headers: {
    'Content-Type': 'application/x-www-form-urlencoded'
  }
}).then(response => {
  let mail = new mailComposer({
    to: [ARRAY OF RECIPIENTS],
    text: [MESSAGE CONTENT],
    subject: subject,
    textEncoding: 'base64'
  });
  mail.compile().build((err, msg) => {
    if (err) {
      return console.log('Error compiling mail: ' + err);
    }
    const encodedMessage = Buffer.from(msg)
      .toString('base64')
      .replace(/\+/g, '-')
      .replace(/\//g, '_')
      .replace(/=+$/, '');
    sendMail(encodedMessage, response.data.access_token, credentials);
  });
});
So that code segment above uses a private key to create a JSON Web Token (JWT), where: iss is the service account to be used, scope is the endpoint of the gmail API being accessed (this must be preauthorized), aud is the google API oAuth2 endpoint, exp is the expiration time, iat is the time created and sub is the email the service account is acting for.
The token is then signed and a POST request is made to the Google oAuth2 endpoint. On success, I use the mailComposer component of NodeMailer to build the email, with an array of recipients, a message, a subject and an encoding. That message is then encoded.
And here's my sendMail() function:
const oAuth2Client = new google.auth.OAuth2(
  credentials.gmail.client_id,
  credentials.gmail.client_secret,
  credentials.gmail.redirect_uris[0]
);
oAuth2Client.credentials = {
  access_token: access_token
};
const gmail = google.gmail({ version: 'v1', auth: oAuth2Client });
gmail.users.messages.send(
  {
    userId: 'me',
    resource: {
      raw: encodedMessage
    }
  },
  (err, result) => {
    if (err) {
      return console.log('NODEMAILER - The API returned: ' + err);
    }
    console.log('NODEMAILER Sending email reply from server: ' + result.data);
  }
);
In this function, I am creating a new googleapis OAuth2 object using the credentials of the service account (here stored in an external file for added security). I then pass in the access_token (generated in the auth script with the JWT). The message is then sent.
Pay attention to the userId: 'me' in the sendMail() function, this was critical for me.
This is how I was able to use only the googleapis package (instead of axios + googleapis) with your service account. You will need domain-wide authority for this account, with the scope used below associated with it. Follow this guide to set that up: https://support.google.com/a/answer/162106?hl=en
You can also use the mailComposer example above to create the email. Here, keys is the service_credentials.json file you get when creating the service account:
const { google } = require('googleapis');

const scope = ["https://www.googleapis.com/auth/gmail.send"];
const client = new google.auth.JWT({
  email: keys.client_email,
  key: keys.private_key,
  scopes: scope,
  subject: "emailToSendFrom@something.com",
});
await client.authorize();

const gmail = google.gmail({ version: 'v1', auth: client });

const subject = '🤘 Hello 🤘';
const utf8Subject = `=?utf-8?B?${Buffer.from(subject).toString('base64')}?=`;
const messageParts = [
  'From: Someone <emailToSendFrom@something.com>', // same email as above
  'To: Someone <whoever@whoever.com>',
  'Content-Type: text/html; charset=utf-8',
  'MIME-Version: 1.0',
  `Subject: ${utf8Subject}`,
  '',
  'This is a message just to say hello.',
  'So... <b>Hello!</b> 🤘❤️😎',
];
const message = messageParts.join('\n');

// The body needs to be base64url encoded.
const encodedMessage = Buffer.from(message)
  .toString('base64')
  .replace(/\+/g, '-')
  .replace(/\//g, '_')
  .replace(/=+$/, '');

const res = await gmail.users.messages.send({
  userId: 'me',
  requestBody: {
    raw: encodedMessage,
  },
});
console.log(res.data);

CORS Issue with AWS Lambda and Ajax

So I have been working on a serverless configuration that calls a Lambda function through Ajax. I enabled CORS through API Gateway, and I have made sure of the domain I specified. This domain works when calling other Lambda functions within the same API.
Now for the weird stuff.
I send a POST request to my API (I am trying to upload a file through Ajax, Lambda, and S3). If I configure the Access-Control-Allow-Origin header so that it points to the domain WITHOUT the http:// in front of it (e.g. example.com), I get:
Failed to load https://m562ogkc1l.execute-api.us-east-1.amazonaws.com/test/upload: Response to preflight request doesn't pass access control check: The 'Access-Control-Allow-Origin' header has a value 'example.com' that is not equal to the supplied origin. Origin 'http://example.com' is therefore not allowed access.
OK, fine, that is expected, since it's not the proper origin. But when I add the http:// (http://example.com) to the API's CORS configuration, I get:
Failed to load https://m562ogkc1l.execute-api.us-east-1.amazonaws.com/test/upload: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://example.com' is therefore not allowed access. The response had HTTP status code 400.
This makes it seem like the issue lies elsewhere, but I don't know where.
I have made sure the data I pass as parameters of the Ajax call is stringified (JSON.stringify()), and I am NOT using an AWS Lambda proxy integration, which means I shouldn't be configuring responses on the Lambda side of things.
This all really confuses me and I wish AWS had better documentation and examples since they really want to push these serverless services.
Further code is here:
Ajax:
$('#submitButton').on('click', function () {
  // console.log(document.getElementById('fileUpload').value.substring(12)); // strips the "C:\fakepath\" prefix (length 12) from the filename
  $.ajax({
    type: 'POST',
    url: 'https://m562ogkc1l.execute-api.us-east-1.amazonaws.com/test/upload',
    data: JSON.stringify({
      "id": id,
      "name": document.getElementById('fileUpload').value.substring(12),
      "body": document.getElementById('fileUpload').files[0]
    }),
    contentType: "application/json",
    success: function (data) {
      console.log(data);
      //location.reload();
    }
  });
  return false;
});
Lambda:
const AWS = require('aws-sdk');
var s3 = new AWS.S3();

exports.handler = async (event) => {
  let encodedImage = JSON.parse(event.body);
  let decodedImage = Buffer.from(encodedImage, 'base64');
  var filePath = event.id + '/' + event.name;
  var params = {
    "Body": decodedImage,
    "Bucket": "repository.example.com",
    "Key": filePath
  };
  return await new Promise((resolve, reject) => {
    s3.upload(params, function (err, data) {
      if (err) {
        let response = {
          "statusCode": 200,
          "headers": {
            "Access-Control-Allow-Origin": "http://example.com"
          },
          "body": JSON.stringify(err),
          "isBase64Encoded": false
        };
        resolve(response);
      } else {
        let response = {
          "statusCode": 200,
          "headers": {
            "Access-Control-Allow-Origin": "http://example.com"
          },
          "body": JSON.stringify(data),
          "isBase64Encoded": false
        };
        resolve(response);
      }
    });
  });
};
(Yes I threw in some response configuration for the function, I just wanted to see if it would work)
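One thing that stands out, for what it's worth: JSON.stringify serializes a File object to {}, so the "body" field in the Ajax call above never contains real file data, while the Lambda expects a base64 string for Buffer.from(..., 'base64'). A sketch of reading the file as base64 before posting (variable names follow the question's code; the Lambda would then need to read the body field of the parsed JSON):
// Read the selected file as base64, then POST it in the JSON body.
var fileInput = document.getElementById('fileUpload');
var file = fileInput.files[0];
var reader = new FileReader();
reader.onload = function () {
  // reader.result is a data URL like "data:image/png;base64,iVBOR..."
  var base64Body = reader.result.split(',')[1];
  $.ajax({
    type: 'POST',
    url: 'https://m562ogkc1l.execute-api.us-east-1.amazonaws.com/test/upload',
    data: JSON.stringify({ id: id, name: file.name, body: base64Body }),
    contentType: 'application/json',
    success: function (data) { console.log(data); }
  });
};
reader.readAsDataURL(file);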

POST binary data from browser to JFrog / Artifactory server without using form-data

So we get a file (an image file) in the front-end like so:
//html
<input type="file" ng-change="onFileChange">

//javascript
$scope.onFileChange = function (e) {
  e.preventDefault();
  let file = e.target.files[0];
  // I presume this is just a binary file
  // I want to HTTP POST this file to a server
  // without using form-data
};
What I want to know is: is there a way to POST this file to a server without including it as form-data? The problem is that the server I am sending the HTTP POST request to doesn't really know how to store form-data when it receives a request.
I believe this is the right way to do it, but I am not sure.
fetch('www.example.net', { // Your POST endpoint
  method: 'POST',
  headers: {
    "Content-Type": "image/jpeg"
  },
  body: e.target.files[0] // the file
})
  .then(
    response => response.json() // if the response is a JSON object
  )
You can directly attach the file to the request body. Artifactory doesn't support form uploads (and it doesn't look like they plan to).
You'll still need to proxy the request somehow to avoid CORS issues, and if you're using user credentials, you should be cautious in how you treat them. Also, you could use a library like http-proxy-middleware to avoid having to write/test/maintain the proxy logic.
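For example, a minimal Express proxy sketch using http-proxy-middleware (API as of v1+; the target URL and credentials here are placeholders, not real values):
const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

// Forward /artifactory/* to the Artifactory server, adding the Authorization
// header server-side so credentials never reach the browser.
app.use('/artifactory', createProxyMiddleware({
  target: 'https://artifactory.example.com', // placeholder target
  changeOrigin: true,
  headers: {
    Authorization: 'Basic ' + Buffer.from('user:password').toString('base64'), // placeholder credentials
  },
}));

app.listen(3000);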
<input id="file-upload" type="file" />
<script>
  function upload(data) {
    var file = document.getElementById('file-upload').files[0];
    var xhr = new XMLHttpRequest();
    xhr.open('PUT', 'https://example.com/artifactory-proxy-avoiding-cors');
    xhr.send(file);
  }
</script>
Our front-end could not HTTP POST directly to the JFrog/Artifactory server, so we ended up using a Node.js server as a proxy, which is not ideal.
Front-end:
// in an AngularJS controller:
$scope.onAcqImageFileChange = function (e) {
  e.preventDefault();
  let file = e.target.files[0];
  $scope.acqImageFile = file;
};

// in an AngularJS service:
createNewAcqImage: function (options) {
  let file = options.file;
  return $http({
    method: 'POST',
    url: '/proxy/image',
    data: file,
    headers: {
      'Content-Type': 'image/jpeg'
    }
  });
},
Back-end:
const express = require('express');
const http = require('http');   // assumed import; the original snippet used http
const uuid = require('uuid');   // assumed import; the original snippet used uuid
const router = express.Router();

router.post('/image', function (req, res, next) {
  const filename = uuid.v4();
  const proxy = http.request({
    method: 'PUT',
    hostname: 'engci-maven.nabisco.com',
    path: `/artifactory/cdt-repo/folder/${filename}`,
    headers: {
      'Authorization': 'Basic ' + Buffer.from('cdt-deployer:foobar').toString('base64'),
    }
  }, function (resp) {
    resp.pipe(res).once('error', next);
  });
  req.pipe(proxy).once('error', next);
});

module.exports = router;
Note that we had to use a PUT request to send the image to Artifactory, not POST; that has to do with how Artifactory works (the engci-maven.nabisco.com server is an Artifactory server). As I recall, I got CORS issues when trying to post directly from our front-end to the other server, so we had to use our own server as a proxy, which is something I'd rather avoid, but oh well for now.