Unable to get Keycloak client-initiated account linking to work - Laravel

The request to start the client-initiated account linking fails.
The Keycloak console shows a WARN of type CLIENT_INITIATED_ACCOUNT_LINKING_ERROR with error invalid_token.
The URL was generated by the PHP backend as described here: https://www.keycloak.org/docs/latest/server_development/#client-initiated-account-linking.
I also made sure to use UTF-8 encoding when generating the hash.
All prerequisites described in that section have been fulfilled.
I'm using Keycloak 15.0.2 and Laravel with Socialite to authenticate users.
This is how the hash is generated:
$keycloack_user = Socialite::driver('keycloak')->user();
$bearerToken = $keycloack_user->token;
$tokenParts = explode(".", $bearerToken);
$tokenHeader = base64_decode($tokenParts[0]);
$tokenPayload = base64_decode($tokenParts[1]);
$jwtHeader = json_decode($tokenHeader);
$jwtPayload = json_decode($tokenPayload);
$client_id = $jwtPayload->azp;
$host = $jwtPayload->iss;
$session_state = $jwtPayload->session_state;
$nonce = Str::random(20);
$provider = "google";
$input = $nonce . $session_state . $client_id . $provider;
$utf8encoded = utf8_encode($input);
$hashed = hash('sha256', $utf8encoded);
$encoded = rtrim(strtr(base64_encode($hashed), '+/', '-_'), '=');
Then the linking URL is constructed as shown below:
$redirect_uri = urlencode(...);
$full_url = $host . "/broker/". $provider ."/link?client_id=". $client_id ."&redirect_uri=". $redirect_uri ."&nonce=". $nonce ."&hash=" . $encoded;
I'm currently testing on my local machine, without using HTTPS for any of the applications. Logging in works fine, and when inspecting the JWT token the needed role mappings are present:
"account": {
"roles": [
"manage-account",
"manage-account-links",
"view-profile"
]
}
But when accessing the URL, it says "Invalid request" and the Keycloak console indicates the token is invalid.
Update: the solution was to return the result of the hash() call as raw binary data:
$hashed = hash('sha256', $utf8encoded, true);
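Putting it together, a minimal consolidated sketch of the working hash and URL generation (same variables as above; the only functional change from the original code is the third argument to hash(), which returns the raw digest bytes instead of a hex string):
// Build the input exactly as Keycloak expects: nonce + session_state + client_id + provider
$input = $nonce . $session_state . $client_id . $provider;
$utf8encoded = utf8_encode($input); // effectively a no-op for the ASCII values involved, kept from the original
// The fix: request the raw binary digest (third argument true), then base64url-encode without padding
$hashed = hash('sha256', $utf8encoded, true);
$encoded = rtrim(strtr(base64_encode($hashed), '+/', '-_'), '=');
$full_url = $host . "/broker/" . $provider . "/link"
    . "?client_id=" . $client_id
    . "&redirect_uri=" . $redirect_uri
    . "&nonce=" . $nonce
    . "&hash=" . $encoded;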

I had to work on the same task recently, but with the client implemented in JavaScript. I was also stuck for quite a while until I realized how unusually Keycloak expects the hash to be encoded. You need to consider the following two points:
Base64-encode the raw hash bytes, not the hex digest string (in the snippet below the hex digest is converted back to bytes before the base64 step)
Make the result URL-safe: replace + with - and / with _, and strip the trailing = padding
Below you will find a working snippet written in JS:
import sjcl from "sjcl";
// Convert a hex digest to base64 by turning each hex pair into a raw byte first
function hexToBase64(hexstring) {
  return btoa(hexstring.match(/\w{2}/g).map(function (a) {
    return String.fromCharCode(parseInt(a, 16));
  }).join(""));
}
// Assume nonce, session_state, clientId, provider to be given
var data = nonce + session_state + clientId + provider;
var myBitArray = sjcl.hash.sha256.hash(data);
var hashedData = sjcl.codec.hex.fromBits(myBitArray);
var base64HashedData = hexToBase64(hashedData);
// Make it URL-safe and strip the padding
base64HashedData = base64HashedData.replaceAll('+', '-').replaceAll('/', '_').replaceAll('=', '');
base64HashedData is then what you need to pass as the hash query parameter to Keycloak's link endpoint.

Related

DocuSign / Ruby - unsupported_grant_type when trying to obtain token

We are moving away from using the DocuSign::Esign gem and are trying to make the API calls following the "How to get an access token with JWT Grant authentication" instructions. Consent was already granted for this application when we originally set it up with the DocuSign::Esign gem.
I am getting the following error:
{"error"=>"invalid_grant", "error_description"=>"unsupported_grant_type"}
I am using Ruby and am running this code in the console
config = Padrino.config.docusign
current_time = Time.now.utc
header = {
typ: 'JWT',
alg: 'RS256'
}
body = {
iss: config.integrator_key,
sub: config.user_id,
iat: current_time.to_i,
exp: (current_time + 1.hour).to_i,
aud: config.host,
scope: 'signature impersonation'
}
key_file = "-----BEGIN RSA PRIVATE KEY-----
MIIEogIBAAKCAQEAiOMDM5jdGYTEOC/nFVUTQ3+5U2TCUpEKyUD+mByldDbgvT9q
. . .
jDjfX6L15x8JcY9eiXvCvZNF6Za2dg8cagK+ff5d6KLodmVFD5o=
-----END RSA PRIVATE KEY-----"
private_key = OpenSSL::PKey::RSA.new(key_file)
token = JWT.encode(body, private_key, 'RS256')
uri = 'https://account-d.docusign.com/oauth/token'
data = {
grant_type: 'urn:ietf:params:oauth:grant-type:jwt-bearer',
assertion: token
}
auth_headers = {content_type: 'application/x-www-form-urlencoded'}
However, when I call the API, I get a RestClient::BadRequest error.
irb(main):352:0> begin
irb(main):353:1> RestClient.post(uri, data.to_json, auth_headers)
irb(main):354:1> rescue RestClient::BadRequest => e
irb(main):355:1> JSON.parse(e.http_body)
irb(main):356:1> end
=> {"error"=>"invalid_grant", "error_description"=>"unsupported_grant_type"}
I am not sure what I am doing wrong. The JWT decodes correctly when I check it in https://jwt.io/. I am using the grant_type exactly as provided in the documentation.
Hmmm,
The scope claim only needs to be signature (impersonation is implied since you're using the JWT grant flow).
For the aud claim, what is config.host? It should be account-d.docusign.com for the developer system (do not include https://).
Your main error is that you are sending the data hash in JSON format. That's wrong; it must be sent as URL-encoded form data. Try
RestClient.post(uri, data, auth_headers)
instead. (Don't convert the data to JSON.)

Error when using Solr to add data - Solr HTTP error: OK (409) (HttpException)

I have been trying to figure this out for quite some time now, and I have even googled a lot.
I am getting this error while trying to add data into Solr using Solarium in Laravel:
(1/1) HttpException
Solr HTTP error: OK (409)
{
"responseHeader":{
"status":409,
"QTime":3},
"error":{
"metadata":[
"error-class","org.apache.solr.common.SolrException",
"root-error-class","org.apache.solr.common.SolrException"],
"msg":"version conflict for 12 expected=12435421423451 actual=-1",
"code":409}}
in Result.php line 106
at Result->__construct(object(Client), object(Query), object(Response)) in Client.php line 753
This is my function in EmployeeController.php
public function enterDataSolr()
{
$update = $this->client->createUpdate();
$doc1 = $update->createDocument();
$doc1->Gender = "M";
$doc1->Salary = 199999;
$doc1->SSN = "0050-03-10T21:00:00Z";
$doc1->City = "Mumbai";
$doc1->State = "Maharastra";
$doc1->Zip = 119973;
$doc1->Region = "Navi Mumbai";
$doc1->Password = "21435t34tgsd";
$doc1->id = 12;
$doc1->_Emp_ID = 1234546;
$doc1->Name_Prefix = "Mr.";
$doc1->First_Name = "Kant";
$doc1->Middle_Initial = "S";
$doc1->Last_Name = "Bhat";
$doc1->E_Mail = "nav@gmail.com";
$doc1->Father_s_Name = "Mant";
$doc1->Mother_s_Name = "Vandana";
$doc1->Mother_s_Maiden_Name = "vandana";
$doc1->Date_of_Birth = "12/2/1998";
$doc1->Time_of_Birth = "12:24";
$doc1->Age_in_Yrs = 21;
$doc1->Weight_in_Kgs = 56;
$doc1->Date_of_Joining = "2/2/2020";
$doc1->Quarter_of_Joining = "Q1";
$doc1->Half_of_Joining = "1st";
$doc1->Year_of_Joining = 2020;
$doc1->Month_of_Joining = 2;
$doc1->Month_Name_of_Joining = "February";
$doc1->Short_Month = "Feb";
$doc1->Day_of_Joining = 2;
$doc1->DOW_of_Joining = "Tuesday";
$doc1->Short_DOW = "Tues";
$doc1->Age_in_Company__Years_ = 2.4;
$doc1->Last___Hike = 2;
$doc1->Phone_No = 8906986022;
$doc1->Place_Name = "Delhi";
$doc1->User_Name = "kant";
$doc1->_version_ = 12435421423451;
$doc1->score = 1;
$doc2 = $update->createDocument();
$doc2->Gender = "F";
$doc2->Salary = '200000';
$doc2->SSN = "0050-03-10T00:00:00Z";
$doc2->City = "Purcellville";
$doc2->State = "VA";
$doc2->Zip = 20134;
$doc2->Region = "South";
$doc2->Password = "1";
$doc2->id = "2a69b460-2299-46a6-84b6-cf16938a1997";
$doc2->_Emp_ID = 520092;
$doc2->Name_Prefix = "Mrs.";
$doc2->First_Name = "Mary";
$doc2->Middle_Initial = "Watson";
$doc2->Last_Name = "Jane";
$doc2->E_Mail = "janemarie@hotmail.com";
$doc2->Father_s_Name = "Spder";
$doc2->Mother_s_Name = "May";
$doc2->Mother_s_Maiden_Name = "may";
$doc2->Date_of_Birth = "10/1/1921";
$doc2->Time_of_Birth = "12:02";
$doc2->Age_in_Yrs = 99;
$doc2->Weight_in_Kgs = 61;
$doc2->Date_of_Joining = "2/27/2020";
$doc2->Quarter_of_Joining = "Q2";
$doc2->Half_of_Joining = "Q1";
$doc2->Year_of_Joining = "Q4";
$doc2->Month_of_Joining = "2";
$doc2->Month_Name_of_Joining = "February";
$doc2->Short_Month = "Feb";
$doc2->Day_of_Joining = 27;
$doc2->DOW_of_Joining = "Tuesday";
$doc2->Short_DOW = "Tues";
$doc2->Age_in_Company__Years_ = 1.7;
$doc2->Last___Hike = "11%";
$doc2->Phone_No = 852489628962;
$doc2->Place_Name = "Purcellville";
$doc2->User_Name = "llwoods";
$doc2->_version_ = 1658322049611851997;
$doc2->score = 1;
$update->addDocuments(array($doc1, $doc2));
$update->addCommit();
$result = $this->client->update($update);
echo '<b>Update query executed</b><br/>';
echo 'Query status: ' . $result->getStatus(). '<br/>';
echo 'Query time: ' . $result->getQueryTime();
}
The connection is made properly, as the ping() function returns status OK.
The search function is working properly as well.
This is the constructor
public function __construct(EmployeeRepository $emp_repository, Client $client)
{
$this->emp_repository = $emp_repository;
$this->client = $client;
//dd('Solarium library version: ' . Client::VERSION . ' - ');
}
and I have imported the class as well:
use Solarium\Client;
Optimistic Concurrency is a feature of Solr that can be used by client applications which update/replace documents, to ensure that the document they are replacing/updating has not been concurrently modified by another client application.
If there is a version conflict (HTTP error code 409), the client starts the process over.
This feature works by requiring a _version_ field on all documents in the index, and comparing that to a version specified as part of the update command.
By default, Solr’s Schema includes a _version_ field, and this field is automatically added to each new document.
$ curl -X POST -H 'Content-Type: application/json' 'http://localhost:8983/solr/techproducts/update?_version_=1632740120218042368&versions=true&commit=true&omitHeader=true' --data-binary '
[{ "id" : "aaa",
"foo_s" : "update attempt with correct existing version" }]'
The example above sends an update with a value for _version_ that matches the value in the index, so it succeeds. Because versions=true was included in the update request, the response contains the new value of the _version_ field.
If an update is sent with a _version_ value (embedded in the document itself or passed as a request parameter) that does not match the version in the index, the request fails because you have specified the wrong version.
Below is the error for that case:
{
"error":{
"metadata":[
"error-class","org.apache.solr.common.SolrException",
"root-error-class","org.apache.solr.common.SolrException"],
"msg":"version conflict for aaa expected=100 actual=1632740462042284032",
"code":409
}
}
Please refer to the Solr documentation for more details.
The actual=-1 in your error means that Solr could not find an existing document with that id to compare against, so the _version_ you specified can never match.
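A minimal sketch of the fix in your controller (reusing the $this->client and createUpdate() calls from your own code): stop assigning _version_ yourself and let Solr manage it, or set it to 0 to state explicitly that no version check should happen.
$update = $this->client->createUpdate();
$doc = $update->createDocument();
$doc->id = 12;
$doc->First_Name = "Kant";
$doc->Last_Name = "Bhat";
// ...the remaining fields as in your code, but with no $doc->_version_ assignment
// $doc->_version_ = 0; // optional: 0 tells Solr not to check the version at all
$update->addDocuments([$doc]);
$update->addCommit();
$result = $this->client->update($update);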
I would also suggest trying to send one of the documents to Solr by hand from the Solr admin UI. Select your core/collection, then click the Documents link, and you'll be on a page where you can submit the document update to Solr.
[Screenshot: Solr admin UI, Documents page]

Laravel issue: Credentials are required to create a Client

I need to test sending an SMS to a mobile number, but I get a "Credentials are required to create a Client" error with the code below.
.env
TWILIO_ACCOUNT_SID=AC15...................
TWILIO_AUTH_TOKEN=c3...................
TWILIO_NUMBER=+1111...
config/app.php
'twilio' => [
'TWILIO_AUTH_TOKEN' => env('TWILIO_AUTH_TOKEN'),
'TWILIO_ACCOUNT_SID' => env('TWILIO_ACCOUNT_SID'),
'TWILIO_NUMBER' => env('TWILIO_NUMBER')
],
Controller
$accountSid = env('TWILIO_ACCOUNT_SID');
$authToken = env('TWILIO_AUTH_TOKEN');
$twilioNumber = env('TWILIO_NUMBER');
$client = new Client($accountSid, $authToken);
try {
$client->messages->create(
'0020109.....',
[
"body" => 'test',
"from" => $twilioNumber
// On US phone numbers, you could send an image as well!
// 'mediaUrl' => $imageUrl
]
);
Log::info('Message sent to ' . $twilioNumber);
} catch (TwilioException $e) {
Log::error(
'Could not send SMS notification.' .
' Twilio replied with: ' . $e
);
}
Twilio developer evangelist here.
A quick read over the environment config documentation for Laravel suggests to me that you can use the env method within your config files, as you are doing, but it's not reliably available in application code (once the configuration is cached, env() returns null outside the config files). Since you are loading your environment variables into the config, I think you need to use the config method instead. Because you placed the twilio array in config/app.php, the keys are looked up as app.twilio.*:
$accountSid = config('app.twilio.TWILIO_ACCOUNT_SID');
$authToken = config('app.twilio.TWILIO_AUTH_TOKEN');
$twilioNumber = config('app.twilio.TWILIO_NUMBER');
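For completeness, a minimal sketch of the rest of the controller using those values (assuming the twilio array stays in config/app.php as in the question; $toNumber is a placeholder for the destination number):
use Twilio\Rest\Client;
use Twilio\Exceptions\TwilioException;
use Illuminate\Support\Facades\Log;

$client = new Client($accountSid, $authToken);
try {
    // Send the test message from the configured Twilio number
    $client->messages->create(
        $toNumber,
        ['body' => 'test', 'from' => $twilioNumber]
    );
    Log::info('Message sent from ' . $twilioNumber);
} catch (TwilioException $e) {
    Log::error('Could not send SMS notification. Twilio replied with: ' . $e);
}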
Let me know if that helps at all.

Google Vault API HttpError 500 "Internal error encountered."

I'm getting the following error when trying to create a hold using the Google Vault API:
HttpError 500 when requesting
https://vault.googleapis.com/v1/matters/{matterId}/holds?alt=json
returned "Internal error encountered."
from google.oauth2 import service_account
import googleapiclient.discovery
SCOPES = ['https://www.googleapis.com/auth/ediscovery']
SERVICE_ACCOUNT_FILE = './serviceaccount.json'
credentials = service_account.Credentials.from_service_account_file(SERVICE_ACCOUNT_FILE, scopes=SCOPES)
delegated_credentials = credentials.with_subject('delegateuser@example.com')
client = googleapiclient.discovery.build('vault', 'v1', credentials=delegated_credentials)
data = { 'name': 'test', 'accounts': [{'email': 'testuser@example.com' }], 'corpus': 'MAIL', 'query': { 'mailQuery': {'terms': 'to:ceo@company.com'} }}
results = client.matters().holds().create(matterId='{matterId}', body=data).execute()
I've replaced the actual matterId string with {matterId}.
Creating matters, listing matters and listing holds work just fine.
I've tried different combinations of fields to include in the request body but the docs are not clear as to which are required...
It turns out you can't use 'email' in holds().create() - you must use accountId, i.e. the numeric 'id' of the Gmail user.
You can use emails to create holds
https://developers.google.com/vault/guides/holds#create_a_hold_for_mail_on_specific_user_accounts_with_a_search_query

Setting ACL for a Pre Signed Object URL with Fog

I'm generating a fog pre-signed URL for AWS using the following snippet:
bucket = "..."
object = "demo.jpg"
expires = Integer(Time.now + 4.hours)
headers = {}
options = { path_style: true }
fog.put_object_url(bucket, object, expires, headers, options)
This works great - except that the uploaded objects aren't accessible to the public. How can a public-read access control list (ACL) be applied to the upload path?
You have to list these extra parameters (e.g. x-amz-acl, Content-Type) under the "query" key of the options hash.
So your example would be:
bucket = "..."
object = "demo.jpg"
expires = Integer(Time.now + 4.hours)
headers = {}
query = {"x-amz-acl" => "public-read"}
options = { path_style: true, query: query }
fog.put_object_url(bucket, object, expires, headers, options)
You have probably solved this by now, but I'm posting this in case anyone else is stuck, since the lack of surrounding documentation does not make it very straightforward to implement.
