Sudden Failure Requests on Braintree Sandbox API: 'Billing state format is invalid' - braintree

We're running a UK Magento store hooked up to Braintree. Everything had been running smoothly for months; then suddenly we were no longer able to complete an order on any of our staging or local test environments, which are hooked up to the Braintree Sandbox.
At checkout, a request is made to the 3D Secure endpoint, and if we have entered a UK-based county, we get the following response:
Endpoint:
https://api.sandbox.braintreegateway.com/merchants/xxx/client_api/v1/payment_methods/xxx/three_d_secure/lookup
Request billing part:
"additionalInfo": {
  "billingCity": "Leeds",
  "billingCountryCode": "GB",
  "billingGivenName": "John",
  "billingLine1": "50 Upton Road",
  "billingPhoneNumber": "07733222111",
  "billingPostalCode": "LE6 7TH",
  "billingState": "Yorkshire",
  "billingSurname": "Smith"
},
Response:
{
  "error": {
    "message": "Billing state format is invalid."
  },
  "threeDSecureInfo": {
    "liabilityShiftPossible": false,
    "liabilityShifted": false
  }
}
If we remove the county field from the checkout (and ultimately the 'billingState' from the request), the response is valid and we can check out fine.
- This has only started happening recently.
- The same codebase works fine on production Braintree.
- I simulated a request with the exact same params on production and it worked OK.
- We raised a ticket with Braintree but have had no response.
- I am able to check out if I use a two-digit US state code in the county field.
Anyone have any ideas?

I did finally get an answer from Braintree about this. Apparently 3DS2 is now enforced on the Sandbox, and it requires the state or county to be sent as a two-digit code.
On production, if the full name is sent, it will (currently) gracefully degrade to 3DS1 and complete.
In an attempt to push people towards 3DS2, the Sandbox does not fall back to 3DS1 and returns the error instead.
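In case it helps anyone hit by the same change, one way to work around it on the client is to sanitise the billing info before building the 3DS lookup request. This is only a sketch under my own assumptions (the helper name is mine, and Braintree's validation is simply "two characters"); for GB addresses, omitting the county is the safest option:

```javascript
// Hypothetical helper: 3DS2 expects billingState to be a two-character code
// (e.g. a US state abbreviation). UK counties have no such scheme, so we
// simply omit the field when the value would fail the format check.
function normalizeBillingInfo(info) {
  const out = { ...info };
  if (out.billingState && !/^[A-Za-z]{2}$/.test(out.billingState)) {
    delete out.billingState; // "Yorkshire" etc. would trigger the format error
  }
  return out;
}
```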

Today I encountered the same problem with 3D Secure in Braintree.
First of all, I made sure I was using the latest versions of the drop-in, client, and data-collector scripts, which (at the time of writing this response) are:
<script src="https://js.braintreegateway.com/web/3.71.0/js/client.min.js"></script>
<script src="https://js.braintreegateway.com/web/3.71.0/js/data-collector.min.js"></script>
<script src="https://js.braintreegateway.com/web/dropin/1.25.0/js/dropin.min.js"></script>
Then I modified/renamed two of the "threeDSecure" billing address properties: "locality" -> "city" and "region" -> "state".
dropin.requestPaymentMethod({
  threeDSecure: {
    amount: '10.01',
    email: 'me@mydomain.com',
    billingAddress: {
      givenName: 'John',
      surname: 'Smith',
      streetAddress: '51 East Street',
      extendedAddress: 'na',
      city: 'Colchester',
      state: 'Essex',
      postalCode: 'CO1 2QY',
      countryCodeAlpha2: 'GB'
    }
  }
}, function (err, payload) {
  if (err) {
    console.log('tokenization error:', err);
    dropin.clearSelectedPaymentMethod();
    return;
  }
  if (!payload.liabilityShifted) {
    console.log('Liability did not shift');
    return;
  }
  console.log('verification success');
  console.log(payload.nonce);
});
I hope this will help you as it works fine for me in the Sandbox environment.

Related

Mastercard Hosted Checkout Integration always "Payment Unsuccessful" at the end with no error info

I am implementing Mastercard's Hosted Checkout Integration. Detail see below.
https://ap.gateway.mastercard.com/api/documentation/integrationGuidelines/hostedCheckout/integrationModelHostedCheckout.html
For checkout.js, I am using interaction.operation: "PURCHASE"
Checkout.configure({
  merchant: <xxxx>,
  order: {
    amount: xxxx,
    currency: "AUD",
    description: <xxxxx>,
    id: <my own order id, unique>,
  },
  session: {
    id: <i can successfully get the session id>,
  },
  interaction: {
    merchant: {
      name: <xxxx>,
      logo: logo_url_secure
    },
    operation: "PURCHASE",
    displayControl: {
      billingAddress: "HIDE",
    },
  },
});
Checkout.showLightbox();
The checkout lightbox loads with no problem. I filled in the test card info.
The "ACS Emulator" shows up as normal (see screenshot below).
After clicking "Submit" on the "ACS Emulator" page, it always gives a "Payment Unsuccessful" result with no error message (see below).
I checked the API call to the bank:
https://xxxxxxxxxx.ap.gateway.mastercard.com/checkout/api/performPayment/SESSIONxxxxxxxxxxxxxxxxxxxxx?charset=UTF-8
The response is 200, but not successful:
{
  "merchantReturnUrl": false,
  "success": false,
  "threeDsRequired": false,
  "alreadySuccessfullyProcessed": false,
  "sessionId": "SESSIONxxxxxxxxxxxxx",
  "receiptUrl": "/checkout/receipt/SESSIONxxxxxxxx",
  "receiptData": {
    "paymentMethod": null,
    "paymentDetail1": null,
    "orderDate": "4/09/21 7:34 PM"
  },
  "transactionId": "2"
}
Did I do anything wrong here?
I suspect my test account might not have been set up properly.
Any hints are very much appreciated.

GoogleJsonResponseException: API call to classroom.courses.courseWork.create failed with error: Internal error encountered

I can't understand why this piece of code suddenly returns an internal error.
It was working like a charm; after a while it started throwing the error in the title, and nothing I do fixes it.
var courseWork = {
  'title': 'Reglamento',
  'description': 'Por favor, leer los documentos adjuntos.',
  'materials': [
    {'link': { 'url': linkDocumento }},
    {'link': { 'url': "https://drive.google.com/file/d/1eESDk5HA3vR2IEbV0UglaR0F6UysuEmd/view?usp=sharing" }}
  ],
  'workType': 'ASSIGNMENT',
  // 'state': 'DRAFT',
  'state': 'PUBLISHED',
  'topicId': idTema
};
Classroom.Courses.CourseWork.create(courseWork, passa.idClasse);
Answer from the comments on the question:
Actually, the problem was the document I was sharing. As I mentioned before, each month I generate 120 classrooms, and in each one I post a "Course Rules" document. This document is shared from my Drive account (sharing setting: "Everyone that has the link can see"). Each classroom that shares the document attaches its own link to it, and I suppose there is a limit on the number of links a document can receive.
With a new copy of the document, there is no more error. The workaround will be to copy the document into the classroom's Drive folder and share it from there.
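The workaround above can be sketched in Apps Script-style code. This is a hedged illustration: `freshCopyUrl` is a name of my own, and the Drive service is injected as a parameter so the logic can be exercised outside Apps Script (in a real script you would pass the global `DriveApp`):

```javascript
// Sketch of the workaround: attach a fresh copy of the "Course Rules" doc to
// each classroom, so no single Drive file accumulates links from all 120.
// `drive` follows DriveApp's shape: getFileById(id) -> File, File.makeCopy(name).
function freshCopyUrl(drive, templateFileId, copyName) {
  var copy = drive.getFileById(templateFileId).makeCopy(copyName);
  return copy.getUrl(); // use this URL in the courseWork materials instead
}
```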

How to use a nested request with insights in the Facebook API

I am trying to include insights in a single call to Facebook ads API using Ruby. I researched this and got the following call:
params = {
  'time_range': {
    'since': '2019-08-01',
    'until': '2019-08-31',
  }
}
ad_account.campaigns(
  fields: [
    'adsets{
      id,
      insights{spend, impressions, clicks, ctr, cpc},
      adcreatives{id, object_story_spec, image_url, object_type}
    }'
  ],
  params: params
).to_json
My response:
[
  {
    "adsets" => {
      "data" => [
        {
          "id" => "xxxxxxxxx",
          "adcreatives" => {
            "data" => [
              {
                "id" => "xxxxxxxxxxxxxx",
                "object_story_spec" => {
                  ...
                },
                "image_url" => "https://scontent.xx.fbcdn.net/v/xxxxxx",
                "object_type" => "SHARE"
              }
            ],
            "paging" => {
              "cursors" => {
                "before" => "xxxxxx",
                "after" => "xxxxxxx"
              }
            }
          }
        }
      ],
      ...
So, by level, I am able to get all the campaigns, then the adsets inside them, and the adcreatives inside the adsets, but not the insights.
Am I doing something wrong? Does anyone have any experience with this?
So, surprisingly, Facebook DOES return insights with my call. Sometimes. There is no explanation why, but I tried with two accounts and got insights on one and not the other.
Maybe it has something to do with the insights returned by this call having a shorter lifespan, so you can't get insights for older ads. Not sure; not going to read through the millions of pages of docs.
My solution, for whoever is interested:
I made a call to the ad account's insights directly and specified the level as 'ad', so it returns insights for all the account's ads. I also requested the 'ad_id'. Later I was able to match the results returned by this call to the previous call I mentioned by comparing the ad ID, and merge the results.
ad_account.insights(
fields: ['ad_id', 'spend', 'impressions', 'clicks', 'ctr', 'cpc'],
level: 'ad',
time_range: {
since: date_since,
until: date_until
}
).to_json
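The matching step at the end can be sketched as follows. This is a hedged illustration in JavaScript, not the poster's actual code; the record shapes (`id` on the ad, `ad_id` on the insights row) are assumptions based on the two calls above:

```javascript
// Sketch: merge per-ad insights (from the account-level insights call) into
// the ads obtained from the nested campaigns/adsets call, keyed by ad ID.
function mergeInsightsById(ads, insightsRows) {
  const byId = new Map(insightsRows.map(row => [row.ad_id, row]));
  // Attach the matching insights row to each ad; null when no row was returned.
  return ads.map(ad => ({ ...ad, insights: byId.get(ad.id) || null }));
}
```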

Stripe Connect: Charging an existing customer against a "connected" (Standalone) account

If I attempt to charge a customer record (which has an associated credit card) via a connected account, I get an error claiming "No such customer: cus_xxxx" -- even though charging the exact same customer works fine when not using a "connected" account (i.e., when charging via the platform account).
For example, consider the following Ruby code, assuming we have a "connected" (Standalone) account with ID acct_ABC123:
# Use the (secret) API key for the "platform" or base account.
Stripe.api_key = 'sk_[...]'
customer = Stripe::Customer.create(email: 'customer@example.com')
# Associate a credit card with the customer.
token = # Generate a token (e.g., using Stripe Checkout).
customer.sources.create(source: token)
# Attempt to charge the card via the connected account...
Stripe::Charge.create({ amount: 150, currency: 'usd', customer: customer.id,
                        application_fee: 25 }, stripe_account: 'acct_ABC123')
The last line there leads to a Stripe::InvalidRequestError exception, with the "No such customer" error mentioned above. However, the same charge will go through fine if we just try to run it on the "platform" account (without the stripe_account parameter and no application_fee)...
Stripe::Charge.create({ amount: 150, currency: 'usd', customer: customer.id })
For some (confusing and slightly bizarre) reason, you must add the intermediate step of creating a new token when making charges against "Shared Customers" (customers that will be charged through one or more connected accounts). So, assuming we've already created the customer with an associated credit-card (as per the question), the working code ends up looking something like this...
token = Stripe::Token.create({ customer: customer.id },
                             { stripe_account: 'acct_ABC123' })
Stripe::Charge.create({ amount: 150, currency: 'usd', source: token.id,
                        application_fee: 25 }, stripe_account: 'acct_ABC123')
As an aside, I would consider Stripe's error message ("No such customer") to be a bug and the fact that this extra step (generating a token) is required only for "Stripe Connect" charges a confusing quirk.
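For anyone on Node rather than Ruby, the same two-step flow looks roughly like the sketch below. This is a hedged illustration: the function name is mine, and it follows the stripe-node client shape (`stripe.tokens.create` / `stripe.charges.create` with a per-request `stripeAccount` option), which may differ in detail from your SDK version:

```javascript
// Sketch: charge a "shared customer" through a connected account by first
// minting a connected-account token for the platform-level customer, then
// charging that token as the source on the connected account.
async function chargeSharedCustomer(stripe, customerId, connectedAccountId, amountCents) {
  // Step 1: create a token for this customer, scoped to the connected account.
  const token = await stripe.tokens.create(
    { customer: customerId },
    { stripeAccount: connectedAccountId }
  );
  // Step 2: charge via the connected account using that token as the source.
  return stripe.charges.create(
    { amount: amountCents, currency: 'usd', source: token.id, application_fee_amount: 25 },
    { stripeAccount: connectedAccountId }
  );
}
```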

Can't write acl rules to primary calendar in google service account

So I have set up a Google service account for one of my apps. My intention is to keep a Google Calendar associated with the admin portal that all of the admins can post events to. I have the JWT auth working: I can post events to the calendar and perform other API actions. However, for some reason I cannot change the access control rules on the primary calendar. It is initialized with a single ACL rule (role: owner, scope: {type: user, value: service_account_id}), and when I try to add public read access (role: reader, scope: {type: default}) like so:
POST https://www.googleapis.com/calendar/v3/calendars/primary/acl
Authorization: Bearer my_jwt_here
{
  "role": "reader",
  "scope": {
    "type": "default"
  }
}
I get the following error:
{
  "error": {
    "errors": [
      {
        "domain": "calendar",
        "reason": "cannotRemoveLastCalendarOwnerFromAcl",
        "message": "Cannot remove the last owner of a calendar from the access control list."
      }
    ],
    "code": 403,
    "message": "Cannot remove the last owner of a calendar from the access control list."
  }
}
This doesn't make any sense to me because this request shouldn't be removing any access control rules. When I create a secondary calendar and do this, I have no issues. When I do this with the primary calendar of my personal Google account, I have no issues. Is this some behavior specific to service accounts that I am not familiar with? I could settle for using a non-primary calendar, but it bothers me that this isn't working. Any advice is appreciated.
So I found a weird workaround for this issue, and I'm posting it here because I could not find SQUAT to help resolve it; hopefully this saves others some hassle.
I will also post some common problems I found when creating an organization-wide calendar (whether or not this is your use case, I believe these tips will be helpful). Jump to the bottom for the solution to this particular error.
First I needed to set up authentication with google calendar:
const { google } = require("googleapis");
const calendar = google.calendar("v3");
const scopes = [
"https://www.googleapis.com/auth/admin.directory.resource.calendar",
"https://www.googleapis.com/auth/calendar",
"https://www.googleapis.com/auth/admin.directory.user",
];
const path = require("path");
const key = require(path.join(__dirname, "google-cal-api.json"));
I created a service account, allowed it domain-wide delegation with the above listed scopes, then downloaded the key. Now, if you want to perform actions like creating calendar events FOR users within this domain, you have to generate a JWT client that 'impersonates' the user whose calendar you wish to interact with, like so:
const generateImpersonationKey = (email) => {
  var jwtClient = new google.auth.JWT(
    key.client_email,
    null,
    key.private_key,
    scopes,
    email
  );
  return jwtClient;
};
To set up a JWT client for the service account itself, you just replace the email with null and it defaults to itself instead of 'impersonating' someone within the domain-wide org. This is what you want for a calendar people can subscribe to; in our case it was a Google calendar showing who's on leave within the workplace, so a single calendar that everyone can subscribe to and toggle on/off was ideal.
Creating events is simple; follow the Google Calendar API docs. Which auth token you use determines where the event is created.
JUMP HERE FOR THE IMMEDIATE SOLUTION TO THE ABOVE
To resolve the issue you pointed out, what I did was set my personal account's email as an owner of the service account's calendar with the following NodeJS code:
var request = await calendar.acl.insert({
  auth,
  calendarId: "primary",
  resource: {
    role: "owner",
    scope: {
      type: "user",
      value: "callum@orgdomain.com",
    },
  },
});
I set myself as an owner, then went to the Google Calendar API docs > Acl: patch ("Try it"), filled in the calendarId as the service account whose calendar I'm trying to restrict, and set the rule ID to the G Suite domain, domain:orgdomain.com. The body should be:
{
  "role": "reader",
  "scope": {
    "type": "domain",
    "value": "orgdomain.com"
  }
}
And that's how I was able to restrict people within our G Suite domain from deleting or editing custom calendar events. This solution comes from the perspective of someone who originally inserted the domain ACL as:
var request = await calendar.acl.insert({
  auth,
  calendarId: "primary",
  resource: {
    role: "owner",
    scope: { type: "domain", value: "orgdomain.com" },
  },
});
Once the domain was added as an owner like that, trying to change it to a 'reader' directly messes with the service account's ownership and the API won't allow anything but owner; hence the extra step of adding a personal owner first.
Hope this has been helpful.
Callum
