I'm trying to use the YouTube search endpoint, which does not seem to require an OAuth token. I've read a few tutorials, and they only pass an API key. I'm getting a "Failed to load resource: the server responded with a status of 401 ()" error in my console in Chrome Dev Tools. Specifically, looking at the Network tab I get this:
I notice it says there is an "invalid_token" error, but I pass the API key, so it must be referring to an OAuth token? I'm confused, because it shouldn't need one, especially since I'm just doing a query for public data. Even the "Try this API" portion of the endpoint documentation does not need one. Most importantly, my call works in Postman, and pasting the endpoint directly into my browser works too. Why doesn't it work here? I'm making the call with axios from a ReactJS frontend.
const apiKey = 'MY_API_KEY';
const url = 'https://www.googleapis.com/youtube/v3/search';
const response = await axios.get(url, {
    params: {
        part: 'snippet',
        maxResults: 5,
        q: songName,
        key: apiKey
    }
});
What was happening was that I was using axios.defaults.headers.common['Authorization'] = `Bearer ${params.access_token}`; in other calls for another API. This caused every request to send that access token by default! So what I did for now is delete axios.defaults.headers.common["Authorization"]; -- the solution is pretty obvious: just make sure you have no extra headers, because Search is not an OAuth endpoint!
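Since the collision comes only from the global axios default, a minimal workaround sketch based on the snippets above (params.access_token is just whatever variable holds the other API's token) is to drop the default header before the YouTube call and re-set it afterwards:

// Drop the global OAuth header that was set for the other API
delete axios.defaults.headers.common['Authorization'];

const response = await axios.get('https://www.googleapis.com/youtube/v3/search', {
    params: {
        part: 'snippet',
        maxResults: 5,
        q: songName,
        key: apiKey   // the API key alone is enough for the Search endpoint
    }
});

// Re-set the default header if the other API still needs it
axios.defaults.headers.common['Authorization'] = `Bearer ${params.access_token}`;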
I've recently started using Postman, and I am testing an API where I get a CSRF token and then log in, but I always get a CSRF token mismatch. I am including the X-XSRF-TOKEN header, but I think the issue is that the cookies are not being stored correctly.
I am calling a Laravel Sanctum endpoint to get a CSRF token, and when I look in the console I can see Set-Cookie response headers.
However, when I look in the Cookies tab it says no cookies were received from the server, yet when I look in the cookie store they are listed for my test domain (home.local).
Because of this, when I send a request to the login endpoint, the session cookies are not sent with the request, as shown in the console for that request.
I can do this fine using the Insomnia REST client, so I know the API is working as expected; I am, however, trying to replace Insomnia with Postman.
I've tried Google, but I've only found reports of bugs from around 2016 that seemed to cause something similar.
Update
I managed to get Postman working fine against production using a pre-request script to get the CSRF token and set an environment variable, using the below:
const url = pm.environment.get('base_url');
const referer = pm.environment.get('Referer');

pm.sendRequest({
    url: `${url}/sanctum/csrf-cookie`,
    method: 'GET',
}, function (err, response, {cookies}) {
    if (!err) {
        console.log("cookies", cookies);
        pm.environment.set('xsrf_token', cookies.get('XSRF-TOKEN'));
    }
});
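For completeness, the login request itself then sends the captured value back as the header mentioned above, e.g. as an entry in that request's Headers tab (assuming the header name from the question):

X-XSRF-TOKEN: {{xsrf_token}}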
Although this worked on production and successfully did the POST request, on my local dev PC, I was still getting the CSRF mismatch.
Although the request/response looked the same between dev and prod, for some reason I had the idea to change my dev URL from my-app.home.local to my-app.home.com, and now the cookies are received and sent in the next request to the login endpoint without a CSRF token mismatch.
There's clearly an issue with Postman here, but I'm not sure if it's something I'm doing or a bug in Postman. Does .local mean something different?
I'm not getting a token back from my Epic app.
I'm calling my app (PFI_app, non-prod, client ID: [my_client_id]) from a browser script:
FHIR.oauth2.authorize({
    'client_id': [my_client_id],
    'scope': 'openid, fhirUser,PATIENT.READ, PATIENT.SEARCH, OBSERVATION.READ, OBSERVATION.SEARCH',
    'redirect_uri': [my_redirect_uri],
    'state': 'abc123',
    'aud': 'https://fhir.epic.com/interconnect-fhir-oauth/api/fhir/r4'
});
I get prompted to log in at signin.epic.com, and I use the credentials FHIR (username) and EpicFhir11! (password), which I got from this page: https://fhir.epic.com/Documentation?docId=testpatients.
At my redirect URL page I use the following to get the access token:
FHIR.oauth2.ready()
    .then(function(client) {
        myapp.smart = client;
        console.log(client);
    });
But I keep getting the following error message:
Failed to load resource: the server responded with a status of 400 (Bad Request)
app.html:39 https://fhir.epic.com/interconnect-fhir-oauth/oauth2/token
I get another message saying: URL: https://fhir.epic.com/interconnect-fhir-oauth/oauth2/token
unauthorized_client
This leads me to believe that I logged in with an improper user who isn't authorized.
Ultimately, I don't get a token. Any idea why? Is it because I'm using improper login credentials and therefore that user doesn't have access to get a token?
Also, I'm using fhir-client.js, not fhir-client-v2.js; is that a problem?
UPDATE:
So I just waited, and the token issue resolved itself. Perhaps there is a waiting period after changing my Epic FHIR app information at fhir.epic.com. I changed the "Application Audience" from "patients" to "clinicians and administrative users." I had been logging in to Epic as an admin for many hours before I wrote this post, and I can't think of anything I changed in my code. I just waited.
Now my last remaining problem is that when I try to search for patients in the sandbox with this code:
var obs = await fetch(myapp.smart["state"]["serverUrl"] + "/Patient?address=123%20Main%20St.&address-city=Madison&address-postalcode=53703&address-state=Wisconsin&family=Mychart&gender=Female&given=Allison&telecom=608-123-4567", {
    headers: {
        "Accept": "application/json+fhir",
        "Authorization": "Bearer" + myapp.smart["state"]["tokenResponse"]["access_token"]
    }
}).then(function(data) {
    return data;
});
var response = await obs.json();
console.log(response);
I get another "unauthorized" message:
Failed to load resource: the server responded with a status of 401 (Unauthorized) https://fhir.epic.com/interconnect-fhir-oauth/api/FHIR/R4/Patient?address=123%20Main%20St.&address-city=Madison&address-postalcode=53703&address-state=Wisconsin&family=Mychart&gender=Female&given=Allison&telecom=608-123-4567
This is where I got the syntax for structuring this call to the Patient.search resource:
https://fhir.epic.com/Sandbox?api=932
Any ideas why I'm unauthorized to make this call? Again, I'm logged in using the provider-facing app user credentials listed here: https://fhir.epic.com/Documentation?docId=testpatients (username: FHIR).
UPDATE:
So I changed the FHIR.oauth2.ready call to include the request, and it worked. I'm not sure why I couldn't include the provided token as a Bearer token in fetch, but the following worked:
var req = "/Patient?address=123%20Main%20St.&address-city=Madison&address-postalcode=53703&address-state=Wisconsin&family=Mychart&gender=Female&given=Allison&telecom=608-123-4567";

FHIR.oauth2.ready(client => client.request(req)).then(function(output) {
    console.log(output); /* should include search results for the patient */
});
Thanks for any help.
To summarize, I changed the FHIR.oauth2.ready call to include the request, and it worked:
var req = "/Patient?address=123%20Main%20St.&address-city=Madison&address-postalcode=53703&address-state=Wisconsin&family=Mychart&gender=Female&given=Allison&telecom=608-123-4567";

FHIR.oauth2.ready(client => client.request(req)).then(function(output) {
    console.log(output); /* should include search results for the patient */
});
In addition, I had to wait a period of time, possibly because I had made some changes to my Epic FHIR app.
I am using the Google My Business API and trying to get the location list.
I am going through their documentation and using the project from the link below as a basis:
https://developers.google.com/my-business/content/implement-oauth
Using that project I am able to successfully retrieve the accounts list.
The problem occurs when I try to retrieve the locations list using the accountId. Here is a link to their documentation:
https://developers.google.com/my-business/content/manage-locations
According to the documentation, to get the location list for a specific account, I should use the following request:
GET
https://mybusinessbusinessinformation.googleapis.com/v1/{accountId}/locations
Authorization: Bearer <access_token>
This is the code snippet that I added to their sample project:
function retrieveGoogleMyBusinessLocations(accessToken) {
    $.ajax({
        type: 'GET',
        url: 'https://mybusinessbusinessinformation.googleapis.com/v1/{accID}/locations',
        headers: {
            'Authorization': 'Bearer ' + accessToken
        },
        success: function(returnedData) {
            var e = document.createElement("pre");
            e.innerHTML = JSON.stringify(returnedData, undefined, 2);
            document.body.appendChild(e);
        }
    });
}
When I make this request, it gives a "CORS error".
The error in the console is the following:
Access to XMLHttpRequest at 'https://mybusinessbusinessinformation.googleapis.com/v1/xxx/locations' from origin 'http://localhost:8001' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource.
GET https://mybusinessbusinessinformation.googleapis.com/v1/xxx/locations net::ERR_FAILED
From the error message it seems that the server does not accept requests from localhost:8001, but if I make a request to a different endpoint it returns a result.
For example, if I change the URL from
https://mybusinessbusinessinformation.googleapis.com/v1/{accID}/locations
to
https://mybusinessbusinessinformation.googleapis.com/v1/accounts/{accID}/locations?readMask=categories
With the second URL it returns a successful result.
I am confused about why it allows requests to one endpoint and blocks requests to the other.
Can anyone help with this problem?
Have you tried making direct API requests via the Google Developers OAuth 2.0 Playground?
I tried to reproduce your error but got the expected 404 Not Found response for the unsupported endpoint pattern.
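Based on the URL that does work in the question, the failing request was likely just hitting an unsupported path (missing the accounts/ prefix and the required readMask), and the browser then surfaces the error response as a CORS failure. A hedged sketch of the question's snippet using that working URL pattern (readMask=categories is taken from the question; accountId here is the bare account ID):

function retrieveGoogleMyBusinessLocations(accessToken, accountId) {
    $.ajax({
        type: 'GET',
        // note the accounts/ prefix and the readMask query parameter
        url: 'https://mybusinessbusinessinformation.googleapis.com/v1/accounts/'
            + accountId + '/locations?readMask=categories',
        headers: {
            'Authorization': 'Bearer ' + accessToken
        },
        success: function(returnedData) {
            var e = document.createElement("pre");
            e.innerHTML = JSON.stringify(returnedData, undefined, 2);
            document.body.appendChild(e);
        }
    });
}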
I'm making requests to the Google Time Zone API using Ajax, following this tutorial:
var apicall = 'https://maps.googleapis.com/maps/api/timezone/json?location=39.6034810,-119.6822510&timestamp=1331161200&key=YOUR_API_KEY'

var xhr = new XMLHttpRequest()   // create new XMLHttpRequest object
xhr.open('GET', apicall)         // open GET request
xhr.onload = function(){
    if (xhr.status === 200){     // if Ajax request successful
        var output = JSON.parse(xhr.responseText)   // convert returned JSON string to JSON object
        console.log(output.status)                  // log API return status for debugging purposes
        if (output.status == 'OK'){                 // if API reports everything was returned successfully
            // do something
        }
    }
    else{
        alert('Request failed. Returned status of ' + xhr.status)
    }
}
xhr.send()   // send request
I've replaced the key parameter with the API key I generated in the Google Console, and the request works. The problem is that I can't seem to restrict access to the API key by either HTTP referrer or server IP in the Google Console; specifying my server's domain or IP doesn't work. My guess is that Ajax doesn't send referrer or server IP information with the request for Google to determine whether it's a valid request? At the moment I'm stuck using no API key restrictions, though this is of course not a good idea.
Does anyone have experience restricting Google API key access when making calls to Google APIs via Ajax?
Just a reminder: do not embed API keys directly in your code.
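One way to follow that advice is to keep the key on a server you control and have the page call your own endpoint instead; that also makes an IP restriction on the key meaningful, since requests to Google then come from your server rather than from each visitor's browser. A minimal sketch using Express (the /api/timezone route and the TIMEZONE_API_KEY environment variable are made-up names):

// Hypothetical Express proxy: the browser calls /api/timezone and the key stays server-side.
const express = require('express');
const app = express();

app.get('/api/timezone', async (req, res) => {
    const { location, timestamp } = req.query;
    const url = 'https://maps.googleapis.com/maps/api/timezone/json'
        + '?location=' + encodeURIComponent(location)
        + '&timestamp=' + encodeURIComponent(timestamp)
        + '&key=' + process.env.TIMEZONE_API_KEY;   // the key never reaches the browser

    const apiRes = await fetch(url);                // Node 18+ global fetch assumed
    res.json(await apiRes.json());
});

app.listen(3000);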
I have my own instance of Parse Server running on AWS, and until now Cloud Functions have been working great, but with one caveat: they cannot be successfully called publicly, i.e. they require an authorisation key to be sent in the REST request header.
I want to set up a Slack Slash Command to my server, and it has to be able to POST a payload without any headers or extra parameters. As a result, my requests are currently unauthorised (returning 403 statuses).
Is there a way to create granular control over a Parse Cloud Function's authorisation (i.e. if it requires master-key header or not), and if not — is there a way of forwarding the request but still through the Parse server?—Or even a way of manipulating the headers of a Slack request? I would rather not have to use another service just for request forwarding.
Thanks!
Two options:
Pass in the master key on the client request, which should bypass authorization. It's a blunt approach but might be okay in your case (without knowing more details).
Or run a new Express endpoint alongside Parse and from there call the Parse cloud function using the master key:
var ParseServer = require('parse-server').ParseServer;
var express = require('express');
var unirest = require('unirest');

var api = new ParseServer(...);
var app = express();
app.use('/parse', api);

app.get('/api/slack', function(req, res) {
    // call the cloud function, passing in the master key
    // as the X-Parse-Master-Key HTTP header
    unirest.post("http://myhost.com:1337/parse/functions/mycloudfunction")
        .headers({'X-Parse-Master-Key': MASTER_KEY})
        .end(function(response) {
            res.json(response.body);   // relay the cloud function's result back to the caller
        });
});