I am using the Node.js library for the BigQuery Data Transfer API to create a scheduled query.
I get this error message:
BigQuery Data Transfer API has not been used in project xxxxxxxxx before or it is disabled. Enable it by visiting....
I visited the link, but the project is not even my project: I do not have access to it, and the project number it shows is different from the project in my request.
I am using a user access token in the Authorization: Bearer header for OAuth access.
Has anyone experienced this issue before?
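For what it's worth, when a Google Cloud REST API is called with only a user access token, quota and the "API enabled" check are attributed to the project the OAuth client was created in, which can explain an unfamiliar project number in the error. A minimal sketch of pinning the quota project explicitly (all names are placeholders; the x-goog-user-project header is assumed to apply here as it does across Google Cloud REST APIs):

```python
# Sketch: calling the BigQuery Data Transfer REST API directly with a user
# OAuth token. With only "Authorization: Bearer", Google attributes quota to
# the OAuth client's project, which may not be yours. Passing the quota
# project explicitly via x-goog-user-project is one common fix.
# access_token and quota_project are placeholders.

def build_headers(access_token: str, quota_project: str) -> dict:
    """Headers for a Google API call that pin the quota project."""
    return {
        "Authorization": f"Bearer {access_token}",
        "x-goog-user-project": quota_project,  # should match the project in the URL
        "Content-Type": "application/json",
    }

def transfer_config_url(project_id: str, location: str = "us") -> str:
    """Endpoint for creating a transfer config (scheduled query) via POST."""
    return (
        "https://bigquerydatatransfer.googleapis.com/v1/"
        f"projects/{project_id}/locations/{location}/transferConfigs"
    )
```

Note that the token's user also needs permission to use the quota project (the serviceusage.services.use permission) for the header to be accepted.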
I'm building an application and I need to retrieve all management groups hierarchically, along with their subscription IDs and resource information.
I looked in the Azure SDK for Java (https://github.com/Azure/azure-sdk-for-java) but can't find anything related to my topic.
I have gone through tons of documentation but still haven't found any class that can help.
Refer to MSDOC: you can retrieve the management group list from Azure by calling the Management Groups - List REST API.
To access the above API, we need the following details:
Client ID
Client Secret
Tenant ID
With the above credentials, we can generate an access token for calling the Azure APIs.
Below is the endpoint used to generate the access token:
https://login.microsoftonline.com/<your-tenant-id>/oauth2/token
Hit the API using Postman.
Use the generated token as a bearer header in the API below to access the management groups:
https://management.azure.com/providers/Microsoft.Management/managementGroups?api-version=2020-05-01
I have implemented the above scenario using Java and Spring Boot.
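The Spring Boot code itself is not reproduced here; a minimal stdlib-only sketch of the same two calls (tenant ID, client ID, and secret are placeholders) might look like:

```python
# Sketch of the two REST calls described above, using only the standard
# library. tenant_id, client_id and client_secret are placeholders for the
# app-registration credentials.
import json
import urllib.parse
import urllib.request

def token_request(tenant_id: str, client_id: str, client_secret: str) -> urllib.request.Request:
    """Build the client-credentials token request against the v1 token endpoint."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://management.azure.com/",  # v1 endpoint uses "resource"
    }).encode()
    return urllib.request.Request(
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
        data=body,
        method="POST",
    )

def list_management_groups(access_token: str) -> dict:
    """Call the list API with the bearer token and return the parsed JSON."""
    req = urllib.request.Request(
        "https://management.azure.com/providers/Microsoft.Management"
        "/managementGroups?api-version=2020-05-01",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```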
Output:
In my case, I have only one management group, so I'm getting a single management group in the response.
I'm working on a Laravel site where users can connect their Google account and manage their Google Calendars directly from within the site. I found https://github.com/spatie/laravel-google-calendar, but I'm not sure that it really meets my needs.
That package doesn't seem to follow the authentication flow (OAuth 2) I'm used to with other APIs. It uses service accounts and stores credentials in JSON files where I usually save access and refresh tokens in my users table.
So am I missing something, or is that package not made for that kind of site?
I want to build a web application in Go. I'm using Google App Engine for deployment combined with Identity Aware Proxy (IAP) to authenticate user access.
I want to know how to access the authentication result to get the user's email, which I can link to app data stored in a back-end database. Essentially, I want to avoid my users logging in and then having to authenticate again to get their profiles from the back end.
I have looked into the IAP documentation and I can see it uses JWT headers, but that is where my knowledge runs out. My guess is that the incoming request carries those headers, and the email can be read from them.
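For reference, IAP forwards the verified identity on each proxied request as headers: X-Goog-Authenticated-User-Email carries the email prefixed with an accounts.google.com: namespace, and X-Goog-IAP-JWT-Assertion carries a signed JWT that should be verified server-side. A minimal sketch of parsing the email header (shown here in Python; the header names are the same whatever language handles the request):

```python
# Sketch: IAP forwards the caller's identity as request headers. The email
# arrives in X-Goog-Authenticated-User-Email as "accounts.google.com:<email>".
# In production the signed JWT in X-Goog-IAP-JWT-Assertion should also be
# verified; this sketch only parses the email header.

def email_from_iap_header(header_value: str) -> str:
    """Strip the "accounts.google.com:" prefix that IAP prepends."""
    prefix, _, email = header_value.partition(":")
    if prefix != "accounts.google.com":
        raise ValueError("unexpected IAP identity namespace: " + prefix)
    return email
```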
How can I get a Google Analytics report on my SPA without authorization by Google?
Without authorizing as a participant of the project, I get this error:
403 PERMISSION_DENIED
403 PERMISSION_DENIED
This means that you do not have permission to do what you are trying to do.
How can I get Google Analytics report on my SPA without authorization by Google?
You can't; you must always be authenticated in order to access Google Analytics data.
In order to access private user data, you must have permission to access it. Your Google Analytics data is private; therefore, you must be authorized through Google in order to access it.
If you are looking for a way of doing it without requesting access from the user (for example, displaying data from your personal Google Analytics account to others without requiring that they have access to it), you could use a service account. Service accounts are preauthorized, so your code can access the data without having to ask a user for access.
Don't make Analytics API requests for your own data on the client side.
What you should do is have a server-side job that requests data from the API every day and caches the result. Then you can serve the cached result to clients.
The API has limits in place that prevent more than 10k requests per day; doing this on the client side means you will hit that limit. The limit exists exactly to discourage use cases like yours.
Doing this on the client side also means exposing your credentials there, which would likely allow users to query data you don't intend to share, or maybe even change settings in your account, depending on which scopes are authorized.
You have to rethink your design so that this job is not done on the client side.
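The daily server-side job described above might be sketched like this (the cache path and refresh interval are placeholders, and fetch_report stands in for whatever authenticated Analytics API call the server makes):

```python
# Sketch of the server-side pattern described above: one scheduled job pulls
# the report, writes it to a cache file, and clients are served the cached
# copy. fetch_report is a placeholder for whatever Analytics API call the
# server makes with its own (e.g. service-account) credentials.
import json
import time
from pathlib import Path

CACHE_FILE = Path("analytics_report.json")  # placeholder location
MAX_AGE_SECONDS = 24 * 60 * 60              # refresh at most once a day

def get_report(fetch_report) -> dict:
    """Serve the cached report, refreshing it when older than MAX_AGE_SECONDS."""
    if CACHE_FILE.exists() and time.time() - CACHE_FILE.stat().st_mtime < MAX_AGE_SECONDS:
        return json.loads(CACHE_FILE.read_text())
    report = fetch_report()          # the only place API quota is spent
    CACHE_FILE.write_text(json.dumps(report))
    return report
```

This keeps credentials entirely on the server and means every client request after the first is served from the cache, so the daily API quota is spent once rather than once per visitor.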
Since Parse is mainly about cross-platform management of data in the cloud, and the security of that data is managed through class-level and object-level (ACL) controls, one is told to confidently pass SDK/REST keys to the client side as long as the data security levels are properly set. For example, file uploads through the Parse REST API require specific headers like
X-Parse-Application-Id
X-Parse-REST-API-Key
sent to the https://api.parse.com/1/files/ endpoint. Since we have exposed those credentials to the client side, isn't it possible for anyone to abuse this endpoint and upload countless irrelevant files to the application's file storage on the Parse platform? Yes, one can secure the data by setting the security levels properly, but what about the file storage? The file storage quota of an application can be exploited, can it not?
The main question is: do those file uploads count as API requests, and do uploaded files count against storage before they're linked to any object in the application? If they do, isn't this open to exploitation?
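To make the exposure concrete, here is a sketch of the upload request described above; everything it needs ships with the client app, so any holder of the two keys can construct it (all values are placeholders):

```python
# Sketch of the Parse REST file upload described in the question: the only
# things the endpoint needs are the two headers, which are shipped to the
# client, so any holder of those keys can build this same request.
# app_id, rest_key, filename and data are placeholders.
import urllib.request

def build_file_upload(app_id: str, rest_key: str, filename: str, data: bytes) -> urllib.request.Request:
    """Build the POST request that uploads a file to Parse's files endpoint."""
    return urllib.request.Request(
        f"https://api.parse.com/1/files/{filename}",
        data=data,
        headers={
            "X-Parse-Application-Id": app_id,
            "X-Parse-REST-API-Key": rest_key,
            "Content-Type": "text/plain",
        },
        method="POST",
    )
```

Nothing in the request ties the upload to an ACL-protected object, which is exactly the concern raised above.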