When I call a Google OAuth library method, it fails without error - no amount of try/catch-ing traps any error message.
I am trying to get an identity token, much as I would by executing gcloud auth print-identity-token from the command line using the gcloud CLI.
The reason for wanting the identity token is that another Cloud Function service requires it as Authorization: Bearer [token], and indeed works correctly when I stuff a manually generated identity token into my code. That is not a suitable solution for development or production.
The code snippet I wrote, cobbled from numerous sources, to procure an identity token is this:
using (var stream = new FileStream(credentialsFilePath, FileMode.Open, FileAccess.Read))
{
    var credentials = GoogleCredential.FromStream(stream);
    if (credentials.IsCreateScopedRequired)
    {
        credentials = credentials.CreateScoped(scopes);
    }

    OidcToken oidcToken = await credentials.GetOidcTokenAsync(
        OidcTokenOptions
            .FromTargetAudience(scopes[0])
            .WithTokenFormat(OidcTokenFormat.Standard));

    // this line bombs immediately, jumping out of this method and the calling method
    string token = await oidcToken.GetAccessTokenAsync();
    return token;
}
In the above code, scopes[0] is leftover from a previous attempt; it contains the endpoint of the Cloud Function service. https://subdomain.cloudfunctions.net/cloud-function/v1/ is the general form of the Cloud Function endpoint I am calling as part of a web API.
Is this a valid and reasonable way to get the equivalent of gcloud auth print-identity-token? If so, why the epic failure?
I need to use a Google service account for service-to-service authentication. The development environment is Visual Studio 2019, .NET Core 3.1, and Docker/Linux.
PS - the service account has the Cloud Functions Invoker role on the cloud function.
PPS - the issue seems to be related to Docker and a set of error messages I get when starting my project in Docker. I had ignored them because, until now, they were not impairing functionality:
at System.Net.Http.CurlHandler.ThrowIfCURLEError(CURLcode error)
at System.Net.Http.CurlHandler.MultiAgent.FinishRequest(StrongToWeakReference`1 easyWrapper, CURLcode messageResult)
Running the code on Windows works.
The penultimate problem was that I needed to make an upstream method asynchronous and add an await; now the code above works every time. That change led me to the ultimate problem, whose solution was some code refactoring in ConfigureServices() related to the AddHttpClient() setup.
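Roughly, the fix looked like the hypothetical sketch below; the class, field and method names are illustrative (not my original code), and the essential point is that the upstream caller is now async and awaits the token helper:
// Hypothetical sketch only - CloudFunctionCaller, _httpClient, _cloudFunctionUrl and
// GetIdentityTokenAsync are illustrative names, not the original code.
public class CloudFunctionCaller
{
    private readonly HttpClient _httpClient;      // injected via the AddHttpClient() setup
    private readonly string _cloudFunctionUrl;    // e.g. https://subdomain.cloudfunctions.net/cloud-function/v1/

    public CloudFunctionCaller(HttpClient httpClient, string cloudFunctionUrl)
    {
        _httpClient = httpClient;
        _cloudFunctionUrl = cloudFunctionUrl;
    }

    public async Task<string> CallCloudFunctionAsync()
    {
        // The caller must be async and await the token helper; calling it without
        // await is what made the failure vanish without any catchable exception.
        string token = await GetIdentityTokenAsync();

        using var request = new HttpRequestMessage(HttpMethod.Get, _cloudFunctionUrl);
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", token);

        HttpResponseMessage response = await _httpClient.SendAsync(request);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }

    private Task<string> GetIdentityTokenAsync()
    {
        // The OIDC token retrieval code from the question would live here.
        throw new NotImplementedException();
    }
}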
The curl exception was due to trying to add loggerFactory.AddGoogle(…) with a bad configuration. This has been a bad hair day.
This question is also an example of what not to do - i.e. I used too much minimalism to describe the problem.
I'm trying to post/get scores from Google Play leaderboards.
I added all the meta tags from the documentation, including my client ID:
<meta name="google-signin-client_id" content="XXXXXX-YYYYYYYYYYY.apps.googleusercontent.com" />
I have also set up the Google Sign-In system and all is fine; however, when I try to call the leaderboards API I get the error message: The requested application with ID xxxxxx was not found
I am calling the API as mentioned in the docs:
gapi.client.request({
    path: '/games/v1/leaderboards/LEADERBOARD-ID',
    params: { maxResults: 3 },
    callback: function(response) {
        console.log(response);
    }
});
I am not sure if the problem is a missing argument needed to execute the request.
Try using this API request method:
gapi.client.Request
An object encapsulating an HTTP request. This object is not instantiated directly; rather, it is returned by gapi.client.request. There are two ways to execute a request. We recommend that you treat the object as a promise and use the then method, but you can also use the execute method and pass in a callback.
You can refer to this GitHub post for additional reference.
This message:
W/AchievementAgent( 3558): {"code":404,"errors":[{"message":"The requested application with ID 571707973781 was not found.","domain":"global","reason":"notFound "}]}
is a little cryptic, but it points to a mismatch between the auth configuration on the console and the application.
You'll want to double-check the SHA1 fingerprint of the keystore you signed the app with against the one configured in the dev console.
It could also be the bundle ID, but that is hard to mess up since it is part of the resource data used when running Setup for the plugin.
Also, it could be that the player is not a tester for this game.
For anyone having the same issue, you need to publish the beta version of the game to be able to interact with the game scoreboard.
Note: In the beta version, only tester accounts added to the game can access the scoreboard.
I have secured my API app and I have successfully tested my ADB2C flow with the sample app I found here: https://github.com/Azure-Samples/active-directory-b2c-xamarin-native. Using that structure, I can trigger the sign-in process, and then access my protected API calls.
However, I wanted to also use the WindowsAzure.Mobile SDK as a convenience. It is hinted here: https://cgillum.tech/2016/08/10/app-service-auth-and-azure-ad-b2c-part-2/ that you can trigger the B2C flow from LoginAsync in that class, but it does nothing when I call it that way.
I also found https://azure.microsoft.com/en-us/documentation/articles/app-service-mobile-dotnet-how-to-use-client-library/ (scroll to "Authenticate users with the Active Directory Authentication Library"), where I substituted the MSAL calls for getting the token. This triggers the sign-on flow; I get a good token and claims back, then I put it into some JSON and pass it like so:
AuthenticationResult ar = await App.PCApplication.AcquireTokenSilentAsync(
    App.Scopes, "", App.Authority, App.SignUpSignInpolicy, false);

JObject payload = new JObject();
payload["access_token"] = ar.AccessToken;

user = await App.MobileService.LoginAsync(
    MobileServiceAuthenticationProvider.WindowsAzureActiveDirectory, payload);
This call to LoginAsync throws
{Microsoft.WindowsAzure.MobileServices.MobileServiceInvalidOperationException: You do not have permission to view this directory or page.
at Microsoft.WindowsAzure.MobileServices.MobileServiceHttpClient+<ThrowInvalidResponse>d__18.MoveNext () [0x0022f] in <filename unknown>:0
--- End of stack trace from previous location where exception was thrown ---
(snip)
Are they not designed to work together? Are those different kinds of tokens? The reason I'm using B2C is that I really don't WANT to know all that OAuth stuff :)
In the case of B2C, you are actually getting back an ID token instead of an access token, and I believe the ar.AccessToken property would be null. This property also seems to go away in the latest versions of MSAL.
I suspect you just need to use ar.IdToken instead. I am not sure whether you can continue to use the "access_token" key in the payload; if not, try "authenticationToken" instead.
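For example, a minimal sketch of the adjusted call, assuming the backend accepts an "authenticationToken" key (if it still expects "access_token", keep that key and only swap in the ID token):
// Sketch only: assumes ar.IdToken is populated by the B2C policy and that the
// App Service backend accepts the "authenticationToken" payload key.
AuthenticationResult ar = await App.PCApplication.AcquireTokenSilentAsync(
    App.Scopes, "", App.Authority, App.SignUpSignInpolicy, false);

JObject payload = new JObject();
payload["authenticationToken"] = ar.IdToken;

user = await App.MobileService.LoginAsync(
    MobileServiceAuthenticationProvider.WindowsAzureActiveDirectory, payload);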
One of the features of our Marketplace app makes use of accessing the user's Gmail account via IMAP. We are using the google-api-java-client and google-oauth-java-client libraries and code similar to this example in the java-gmail-imap project as follows:
GoogleCredential credential = new GoogleCredential.Builder()
    .setTransport(HTTP_TRANSPORT)
    .setJsonFactory(JSON_FACTORY)
    .setServiceAccountId(SERVICE_ACCOUNT_ID)
    .setServiceAccountScopes(Arrays.asList(GMAIL_SCOPE))
    .setServiceAccountPrivateKey(PRIVATE_KEY)
    .setServiceAccountUser(emailAddress)
    .build();

credential.refreshToken();
We are then using code based on the examples at https://code.google.com/p/google-mail-oauth2-tools to make the IMAP connection e.g.
IMAPStore imapStore = OAuth2Authenticator.connectToImap("imap.googlemail.com",
993, emailAddress, credential.getAccessToken(), false);
The majority of the time this appears to work correctly, however we are seeing that for a small but significant number of requests the call to Google made by refreshToken() fails with an HTTP 500 error and an HTML response where the JSON would normally be returned e.g.
<p class="large"><b>500.</b> <ins>That's an error.</ins></p>
<p class="large">The server could not process your request.
<ins>That's all we know.</ins></p>
We were advised by a developer advocate at Google that refresh tokens are not supported for service accounts and that we should be using an approach like the one in this example.
However, it seems that without the call to refreshToken() the access token is not populated on the credential object, which then results in a NullPointerException when we call OAuth2Authenticator.connectToImap.
From the source for GoogleCredential, it did seem that executeRefreshToken() is overridden to handle service accounts, i.e. instead of performing a refresh it simply requests a new token, and this bit of code in Credential then handles populating the access token:
TokenResponse tokenResponse = executeRefreshToken();
if (tokenResponse != null) {
setFromTokenResponse(tokenResponse); ....
We were unsure whether we need to enclose our call to refreshToken() in a retry loop to work around the intermittent 500 errors or whether we need to make other changes to our code to follow the recommended approach for this scenario.
Can anyone advise?
I use the java-gmail-imap example code in production (but it is only used to display an inbox in our University portal; there isn't much interaction that would require me to reuse the same refresh token, for instance).
Depending on your usage, I wonder if in your case some kind of throttling is coming into play (I've read in places that Gmail can occasionally throttle access).
Elsewhere I've seen Google APIs talk about making retries using an exponential backoff algorithm.
You have to be a little careful when comparing the usage of OAuth 2.0 with the other Google service APIs and with Gmail. Gmail is special in that it uses XOAUTH2. That said, I've seen other Google APIs that appear to need the refreshToken call. The documentation is a bit unclear and says things like "Refresh the access token, if necessary" (as you say, it doesn't seem to work without this step, but I haven't done any experimentation with re-using refresh tokens via credential.setRefreshToken(String refreshToken)).
I'd be interested to hear how you get on.
I had originally posted a question about which API to use to let a SharePoint 2010 timer job access the Twitter API. I chose the Spring Social .NET API and have run into another roadblock.
I cannot get the OAuth handshake or 'dance' to work.
I have the consumer key and secret linked to my account, as well as an access token and secret, but any time I initialize a TwitterServiceProvider object and attempt to query, I get a 401 error.
The console/MVC and WP7.1 examples provided don't give much insight into how I can get this code (which should run with no human involvement) to work.
Does anyone have any good resources regarding this?
Thanks in advance
If you already have access token value and secret, you can do something like that:
ITwitter twitter = new TwitterTemplate("consumerKey", "consumerSecret", "accessTokenValue", "accessTokenSecret");
// twitter.UserOperations.GetUserProfile();
That is equivalent to:
TwitterServiceProvider serviceProvider = new TwitterServiceProvider("consumerKey", "consumerSecret");
ITwitter twitterApi = serviceProvider.GetApi("accessTokenValue", "accessTokenSecret");
// twitterApi.UserOperations.GetUserProfile();
How do you get the access token secret and value?
Haven't seen many Geneva related questions yet, I have posted this question in the Geneva Forum as well...
I'm working on a scenario where we have a WinForms app with a wide install base, which will be issuing frequent calls to various services hosted by us centrally throughout its operation.
The services are all using the Geneva Framework and all clients are expected to call our STS first to be issued with a token to allow access to the services.
Out of the box, using the ws2007FederationHttpBinding, the app can be configured to retrieve a token from the STS before each service call, but obviously this is not the most efficient way as we're almost duplicating the effort of calling the services.
Alternatively, I have implemented the code required to retrieve the token "manually" from the app, and then pass the same pre-retrieved token when calling operations on the services (based on the WSTrustClient sample and help on the forum); that works well and so we do have a solution, but I believe it's not very elegant as it requires building the WCF channel in code, moving away from the wonderful WCF configuration.
I much prefer the ws2007FederationHttpBinding approach where by the client simply calls the service like any other WCF service, without knowing anything about Geneva, and the bindings takes care of the token exchange.
Then someone (Jon Simpson) gave me [what I think is] a great idea: add a service, hosted in the app itself, to cache locally retrieved tokens.
The local cache service would implement the same contract as the STS; when receiving a request it would check whether a cached token exists, and if so would return it; otherwise it would call the 'real' STS, retrieve a new token, cache it and return it.
The client app could then still use ws2007FederationHttpBinding, but instead of having the STS as the issuer it would have the local cache.
This way I think we can achieve the best of both worlds - caching of tokens without service-specific custom code; our cache should be able to handle tokens for all RPs.
I have created a very simple prototype to see if it works, and - somewhat unsurprisingly, unfortunately - I am slightly stuck.
My local service (currently a console app) gets the request and - first time around - calls the STS to retrieve the token, caches it and successfully returns it to the client, which subsequently uses it to call the RP. All works well.
Second time around, however, my local cache service tries to use the same token again, but the client side fails with a MessageSecurityException -
"Security processor was unable to find a security header in the message. This might be because the message is an unsecured fault or because there is a binding mismatch between the communicating parties. This can occur if the service is configured for security and the client is not using security."
Is there something preventing the same token from being used more than once? I doubt it, because when I reused the token as per the WSTrustClient sample it worked well. What am I missing? Is my idea possible? Is it a good one?
Here are the (very basic, at this stage) main code bits of the local cache:
static LocalTokenCache.STS.Trust13IssueResponse cachedResponse = null;

public LocalTokenCache.STS.Trust13IssueResponse Trust13Issue(LocalTokenCache.STS.Trust13IssueRequest request)
{
    if (TokenCache.cachedResponse == null)
    {
        Console.WriteLine("cached token not found, calling STS");

        // create a proxy for the real STS
        STS.WSTrust13SyncClient sts = new LocalTokenCache.STS.WSTrust13SyncClient();

        // set credentials for the STS
        sts.ClientCredentials.UserName.UserName = "Yossi";
        sts.ClientCredentials.UserName.Password = "p#ssw0rd";

        // call Issue on the real STS
        STS.RequestSecurityTokenResponseCollectionType stsResponse = sts.Trust13Issue(request.RequestSecurityToken);

        // create the result object - this is a container type for the response returned and is what we need to return
        TokenCache.cachedResponse = new LocalTokenCache.STS.Trust13IssueResponse();

        // assign the STS response to the return value...
        TokenCache.cachedResponse.RequestSecurityTokenResponseCollection = stsResponse;
    }
    else
    {
        // a cached token was found - fall through and return it as-is
    }

    // ...and return
    return TokenCache.cachedResponse;
}
This is almost embarrassing, but thanks to Dominick Baier on the forum I now realise I've missed a huge point (I knew it didn't make sense! honestly! :-) ) -
A token gets retrieved once per service proxy, assuming it hasn't expired, and so all I needed to do was reuse the same proxy, which I planned to do anyway but, rather stupidly, didn't do in my prototype.
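In other words, something like this minimal sketch, where MyServiceClient and DoWork stand in for whichever generated proxy and operation are configured with ws2007FederationHttpBinding (the names are illustrative, not my actual code):
// Minimal sketch: create the proxy once and reuse it, so the federation binding only
// calls the STS on the first request and reuses the cached issued token on subsequent
// calls (until it expires). MyServiceClient and DoWork are placeholder names.
public class RelyingPartyCaller
{
    private readonly MyServiceClient _proxy = new MyServiceClient();

    public string CallService(string input)
    {
        // Every call on the same proxy instance reuses the token issued for its channel.
        return _proxy.DoWork(input);
    }
}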
In addition - I found a very interesting sample on the MSDN WCF samples - Durable Issued Token Provider, which, if I understand it correctly, uses a custom endpoint behaviour on the client side to implement token caching, which is very elegant.
I will still look at this approach as we have several services and so we could achieve even more efficiency by re-using the same token between their proxies.
So - two solutions, pretty much in front of my eyes; hope my stupidity helps someone at some point!
I've provided a complete sample for caching the token here: http://blogs.technet.com/b/meamcs/archive/2011/11/20/caching-sts-security-token-with-an-active-web-client.aspx