I'm trying to connect my server to Google's API but I keep getting the following error.
google.auth.exceptions.RefreshError: ('invalid_scope: h is not a valid audience string.', u'{\n "error" : "invalid_scope",\n "error_description" : "h is not a valid audience string."\n}')
I've looked around, but I just can't figure out why Google's supplied code is giving me that error. I think it's a problem with my service.json, but I can't pinpoint what it is.
This is the code, which is pretty much swiped from Google with very limited changes.
from google.oauth2 import service_account
import googleapiclient.discovery

SCOPES = 'https://www.googleapis.com/auth/drive.metadata.readonly'
SERVICE_ACCOUNT_FILE = 'service.json'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)
drive = googleapiclient.discovery.build('drive', 'v3', credentials=credentials)
response = drive.files().list(
    pageSize=10, fields="nextPageToken, files(id, name)").execute()
print(response)
What I'm looking to do is automatically download a spreadsheet locally via Google's API, maybe once an hour, without user verification.
It looks like scopes is expected to be iterable, so when a single string is given, the library processes each letter separately (the first being 'h').
Try changing the SCOPES assignment to make it a list:
SCOPES = ['https://www.googleapis.com/auth/drive.metadata.readonly']
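The failure mode is easy to reproduce: a plain string is itself an iterable of single characters, so when the library walks over `scopes` it treats each letter as a separate scope, starting with 'h'. A minimal sketch of the difference:

```python
# A bare string iterates as characters, so each letter is seen as a "scope".
scopes_wrong = 'https://www.googleapis.com/auth/drive.metadata.readonly'
# Wrapping it in a list yields one scope, as the library expects.
scopes_right = ['https://www.googleapis.com/auth/drive.metadata.readonly']

print(list(scopes_wrong)[:4])  # ['h', 't', 't', 'p'] -- 'h' is the bogus first scope
print(len(scopes_right))       # 1 -- the single intended scope URL
```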
As the title states, I want to find an old friend of mine, but the only thing I know is his username, no tag number.
No mutual servers or anything, unfortunately, and the Discord search feature didn't help.
So my idea was to send a friend request to all 9999 possible tags with that username.
Doing it manually would take far too long. I could try to set up a macro, but I'd like to know if there's a way I could write a program or Discord bot using the Discord API to accomplish this faster.
I would then just need to check the pending friends list at the end, and it should be done.
It's odd nobody has done such a thing yet; lots of people would benefit from a tool like this.
This code creates a range from 0 to 9999 and sends a friend request to every discriminator in that range for the given username.
import requests

headers = {
    'Authorization': 'YOUR_TOKEN_HERE',
}
data = {
    'username': 'YOUR_FRIEND_USERNAME',
    'discriminator': 0,
}
link = 'https://discord.com/api/v9/users/@me/relationships'

for i in range(10000):
    data['discriminator'] = i
    r = requests.post(link, headers=headers, json=data).text
    print(str(i) + ":" + str(r))
But bear in mind the legal caveats:
Self-bots are against Discord's Terms of Service.
You are not allowed to send that many friend requests; Discord will almost certainly restrict, expire, or ban your token.
I have trained a CNN using Tensorflow/Keras and successfully deployed it to Sagemaker using the saved_model format. It answers pings and the dashboard shows it is running.
I now need to be able to send it images and get back inferences. I have already successfully deployed an ANN to Sagemaker and gotten predictions back, so most of the "plumbing" is already working.
The Ruby performing the request is as follows:
def predict
  sagemaker = Aws::SageMakerRuntime::Client.new(
    access_key_id: Settings.sagemaker_key_id,
    secret_access_key: Settings.sagemaker_secret,
    region: Settings.sagemaker_aws_region
  )
  response = sagemaker.invoke_endpoint(
    endpoint_name: Settings.sagemaker_endpoint_name,
    content_type: 'application/x-image',
    body: File.open('developer/ai/caox_test_128.jpg', 'rb')
  )
  response[:body].string
end
(For now, I simply hardcoded a known file for testing.)
When I fire this, I get back this error: Aws::SageMakerRuntime::Errors::ModelError: Received client error (400) from model with message "{ "error": "JSON Parse error: Invalid value. at offset: 0" }"
It's almost as if the model is expecting more in the body than just the image, but I can't tell what. AWS's documentation has an example for Python using boto:
import boto3
import json

endpoint = '<insert name of your endpoint here>'
runtime = boto3.Session().client('sagemaker-runtime')

# Read image into memory
with open(image, 'rb') as f:
    payload = f.read()

# Send image via InvokeEndpoint API
response = runtime.invoke_endpoint(EndpointName=endpoint, ContentType='application/x-image', Body=payload)

# Unpack response
result = json.loads(response['Body'].read().decode())
As far as I can tell, they are simply opening a file and sending it directly to SageMaker with no additional pre-processing. And, as best I can tell, I'm doing exactly what they are doing, just in Ruby using 'aws-sdk'.
I've looked through Amazon's documentation, and for examples on Google, but there is scant mention of doing anything special before sending the file, so I'm scratching my head.
What else do I need to consider when sending a file to a Sagemaker endpoint running a TensorFlow/Keras CNN to get it to respond with a prediction?
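One likely culprit (an assumption, since the serving container setup isn't shown): the stock TensorFlow Serving container behind a SageMaker endpoint parses the request body as JSON in TensorFlow Serving's `{"instances": ...}` format, so a raw JPEG body fails to parse at offset 0 unless the model was deployed with a custom input handler (e.g. an `inference.py`) that accepts `application/x-image`. A hedged Python sketch of building such a JSON payload from already-decoded pixel data (the array shape and normalization here are placeholders; they must match what the model was trained on):

```python
import json

# Hypothetical 2x2 RGB "image" already decoded to nested lists; in practice you
# would decode the JPEG (e.g. with Pillow) and normalize exactly as in training.
pixels = [[[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]],
          [[0.7, 0.8, 0.9], [0.0, 0.1, 0.2]]]

# TensorFlow Serving's REST predict API expects {"instances": [<one input each>]}.
payload = json.dumps({'instances': [pixels]})

# The invoke_endpoint call would then use content_type 'application/json'
# instead of 'application/x-image' (endpoint call itself omitted here).
print(payload[:30])
```

The alternative, if you want to keep sending raw image bytes, is to repackage the model with an input handler that converts `application/x-image` requests into tensors server-side.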
I am attempting to write some code in WebDNA to connect to the Google Drive API. Using a service account seems to be the best solution for the given problem. From what I have read, the process is: create the JWT, send the JWT to Google to get a token response, then use the token response to call API methods. I believe my issue is with the private key.
I build and encrypt the header:
[text]header={"alg":"RS256","typ":"JWT"}[/text]
[text]header=[encrypt method=Base64][header][/encrypt][/text]
[text]header=[db_base64URL varName=header][/text][!]custom function to deal with special characters[/!]
Next build and encrypt the claim(added white space for readability):
[text]claim={
"iss":"xxx",
"scope":"https://www.googleapis.com/auth/drive",
"aud":"https://www.googleapis.com/oauth2/v4/token",
"exp":[Math][cTime]+3600[/Math],
"iat":[cTime]
}[/text]
[text]claim=[encrypt method=Base64][claim][/encrypt][/text]
[text]claim=[db_base64URL varName=claim][/text]
Those sections seem to be correct, now to build the signature:
[text]p_key=-----BEGIN PRIVATE KEY-----xxxx-----END PRIVATE KEY-----\n[/text]
[text]sig=[encrypt method=SHA256][header].[claim].[p_key][/encrypt][/text]
[text]sig=[encrypt method=Base64][sig][/encrypt][/text]
[text]sig=[db_base64URL varName=sig][/text]
I have tried moving the [p_key] around, outside the SHA256 encryption and inside, with and without the '.'. I don't get an error until I try to send it to Google using [TCPConnect] and [TCPSend] here:
[text show=T]response=[!]
[/!][TCPconnect host=accounts.google.com&SSL=T&port=443][!]
[/!][TCPsend skipheader=T]POST /o/oauth2/token HTTP/1.1[crlf][!]
[/!]Host: accounts.google.com[crlf][!]
[/!]Content-Type: application/x-www-form-urlencoded[crlf][!]
[/!]Content-Length: [countchars][sendData][/countChars][crlf][!]
[/!]Connection: close[crlf][!]
[/!][crlf][!]
[/!][sendData][crlf][!]
[/!][/TCPsend][!]
[/!][/TCPconnect][/text]
When the response is shown it is displayed as:
{
"error": "invalid_grant",
"error_description": "Invalid JWT Signature."
}
This error message is less than helpful; from what I have read it could mean one (or more) of any number of things, and Google's documentation on this is not exactly helpful. If anyone has any experience using the Google APIs through WebDNA, I would appreciate any help you could give!
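For comparison, here is a sketch (in Python, since WebDNA's crypto primitives vary by install) of how the JWT signing input is supposed to be built: the header and claim are each base64url-encoded without padding and joined with a '.', and that exact byte string must then be signed with RSA-SHA256 using the private key. Taking a plain SHA-256 hash of header, claim, and key concatenated, as the snippet above effectively does, is not an RSA signature, which would explain the "Invalid JWT Signature" rejection. The `iss` value is elided, as in the question:

```python
import base64, json, time

def b64url(data: bytes) -> str:
    # Base64url without '=' padding, as required by the JWT spec (RFC 7515).
    return base64.urlsafe_b64encode(data).rstrip(b'=').decode('ascii')

header = {'alg': 'RS256', 'typ': 'JWT'}
now = int(time.time())
claim = {
    'iss': 'xxx',  # service-account email, elided as in the question
    'scope': 'https://www.googleapis.com/auth/drive',
    'aud': 'https://www.googleapis.com/oauth2/v4/token',
    'iat': now,
    'exp': now + 3600,
}

signing_input = b64url(json.dumps(header).encode()) + '.' + b64url(json.dumps(claim).encode())

# The signature must be RSASSA-PKCS1-v1_5 over SHA-256 of `signing_input`,
# computed with the private key via an actual RSA sign operation (e.g. a
# crypto library), not a plain hash of the key text:
# jwt = signing_input + '.' + b64url(rsa_sha256_sign(private_key, signing_input.encode()))
print(signing_input.count('.'))  # 1
```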
Whenever I call the Bing Translation API [HTTP] to translate some text, the first time it works fine, but from the second request onwards it gives me a 'bad request' (status code 400) error. If I wait for 10 or so minutes and then try again, the first request is successful, but from the second one onwards it's the same story. I have a free account (2 million chars translation) with the Bing Translation APIs; are there any other limitations on calling this API?
Thanks, Madhu
Answer:
Hi, I had missed subscribing to the Microsoft Translator data set subscription. Once I did that, things were solved. i.e., once I signed up at https://datamarket.azure.com/dataset/bing/microsofttranslator, things started working.
I was generating the access_token correctly, so that was not the issue.
Thanks, Madhu
As a note to anyone else having problems, I figured out that the service only allows the token to be used once when using the free subscription. You have to have a paid subscription to call the Translate service more than once with each token. This limitation is, of course, undocumented.
I don't know if you can simply keep getting new tokens -- I suspect not.
And regardless of subscription, the tokens do expire every 10 minutes, so ensure you track when you receive a token and get a new one if needed, e.g. (not thread-safe):
private string _headerValue;
private DateTime _headerValueCreated = DateTime.MinValue;

public string headerValue {
    get {
        if (_headerValueCreated < DateTime.Now.AddMinutes(-9)) {
            var admAuth = new AdmAuthentication("myclientid", "mysecret");
            _headerValue = "Bearer " + admAuth.GetAccessToken();
            _headerValueCreated = DateTime.Now;
        }
        return _headerValue;
    }
}
Has anyone battled 500 errors with the Google spreadsheet API for google domains?
I have copied the code in this post (2-legged OAuth): http://code.google.com/p/google-gdata/source/browse/trunk/clients/cs/samples/OAuth/Program.cs, substituted in my domain's API id and secret and my own credentials, and it works.
So it appears my domain setup is fine (at least for the contacts/calendar apis).
However, swapping the code out for a new Spreadsheet service/query instead, it reverts to type: the remote server returned an internal server error (500).
var ssq = new SpreadsheetQuery();
ssq.Uri = new OAuthUri("https://spreadsheets.google.com/feeds/spreadsheets/private/full", "me", "mydomain.com");
ssq.OAuthRequestorId = "me@mydomain.com"; // can do this instead of using OAuthUri for queries
var feed = ssservice.Query(ssq); //boom 500
Console.WriteLine("ss:" + feed.Entries.Count);
I am befuddled.
I had to make sure to use the "correct" class:
not
//using SpreadsheetQuery = Google.GData.Spreadsheets.SpreadsheetQuery;
but
using SpreadsheetQuery = Google.GData.Documents.SpreadsheetQuery;
stinky-malinky
It seems you need the Google Docs API to query for spreadsheets, but the Spreadsheet API to query inside a spreadsheet, and nowhere on the internet until now will you find this undeniably important titbit. Google sucks hard on that one.