Unable to execute OData calls using the S/4HANA SDK in a Cloud Foundry environment with OAuth2SAMLBearerAssertion authentication - s4sdk

I'm trying to connect to an S/4HANA system using the S/4HANA Cloud SDK. While executing calls via the .execute() method in the Cloud Foundry environment, I see the following error in the logs:
Caused by: com.sap.cloud.sdk.cloudplatform.connectivity.exception.DestinationAccessException: Failed to get authentication headers. Destination service returned error: Missing private and public key for subaccount ******-****-****-***-*******.
Note: I have already configured trust between the subaccount and the S/4HANA system and created the corresponding communication and business users. The authentication method used in the destination is OAuth2SAMLBearerAssertion. The call executes fine in both the local and the Cloud Foundry environment with basic authentication.
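For reference, the failing call is issued roughly as in the sketch below. This is a minimal example assuming SAP Cloud SDK 2.x, the Business Partner OData VDM, and a placeholder destination name "MyS4Destination"; on SDK 3.x the equivalent would be executeRequest with a destination resolved via DestinationAccessor.

```java
import java.util.List;

import com.sap.cloud.sdk.odatav2.connectivity.ODataException;
import com.sap.cloud.sdk.s4hana.connectivity.ErpConfigContext;
import com.sap.cloud.sdk.s4hana.datamodel.odata.namespaces.businesspartner.BusinessPartner;
import com.sap.cloud.sdk.s4hana.datamodel.odata.services.DefaultBusinessPartnerService;

public class BusinessPartnerRead {

    // Placeholder: name of the destination configured with OAuth2SAMLBearerAssertion
    private static final String DESTINATION_NAME = "MyS4Destination";

    public List<BusinessPartner> getBusinessPartners() throws ODataException {
        return new DefaultBusinessPartnerService()
                .getAllBusinessPartner()
                .select(BusinessPartner.BUSINESS_PARTNER, BusinessPartner.LAST_NAME)
                .top(10)
                // The destination and its authentication headers are resolved here;
                // this is where the DestinationAccessException from the log surfaces.
                .execute(new ErpConfigContext(DESTINATION_NAME));
    }
}
```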
Can someone please suggest what is wrong here?

As correctly pointed out by #Dennis H, there was a problem in the trust configuration between my subaccount and the S/4HANA system. The following was wrong in my case:
-> The certificate I downloaded for the trust was taken from this URL:
https://.authentication.eu10.hana.ondemand.com/saml/metadata
This is incorrect; the certificate has to be downloaded via the "Download Trust" button in the Destinations tab at the subaccount level.
-> The provider name was incorrect in the communication system.

We are developing a side-by-side extension app and deploying it to Cloud Foundry. The app tries to connect to an S/4HANA Cloud system using OAuth2SAMLBearerAssertion, but we are facing issues while doing so and are getting the error below in the logs. Please note that we are able to connect to S/4HANA Cloud using basic authentication.
com.sap.cloud.sdk.cloudplatform.connectivity.exception.DestinationAccessException: Failed to access the configuration of destination
Our destination parameters are shown in the attached screenshot.
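To narrow down where it breaks, it can help to resolve the destination on its own, separately from any OData call. A minimal sketch, assuming SAP Cloud SDK 3.x and a placeholder destination name:

```java
import com.sap.cloud.sdk.cloudplatform.connectivity.DestinationAccessor;
import com.sap.cloud.sdk.cloudplatform.connectivity.HttpDestination;

public class DestinationCheck {

    public static void check() {
        // Placeholder name; use the destination maintained in the subaccount's Destinations tab.
        // Reading the configuration fails here if the destination cannot be accessed at all.
        HttpDestination destination = DestinationAccessor.getDestination("MyS4Destination").asHttp();

        // Requesting the headers triggers the OAuth2SAMLBearerAssertion token exchange,
        // which is where a broken trust configuration typically shows up.
        destination.getHeaders(destination.getUri())
                .forEach(header -> System.out.println(header.getName()));
    }
}
```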
Thank you.

Related

ADF Oracle Service Cloud connector - correct endpoint

In Azure Data Factory, I'm trying to create a linked service by using the Oracle Service Cloud (Preview) connector to connect to my organisation's Oracle HCM instance. I'm generally following this guidance, using the copy data tool, which should be straightforward: https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle-service-cloud?tabs=data-factory
I have tried the following host names...
https://xxxx.xx.xxx.oraclecloud.com/
https://xxxx.xx.xxx.oraclecloud.com/hcmRestApi
https://xxxx.xx.xxx.oraclecloud.com/hcmRestApi/resources/11.13.18.05/grades
https://xxxx.xx.xxx.oraclecloud.com:443/hcmRestApi/resources/11.13.18.05/grades
... but all of them generate the following error...
Error code 9603
ERROR [HY000] [Microsoft][OSvC] (20) Error while attempting to use REST API: Couldn't resolve host name
Activity ID: 590c5007-ec6f-4729-9eb2-d05ef779dc0e.
I'm using a username and password that has been tested on Oracle, and have tried various combinations of using encrypted endpoints, host verification and peer verification as true or false.
I believe I'm using the correct endpoints, based on Oracle's guidance:
Oracle REST endpoints
https://docs.oracle.com/en/cloud/saas/human-resources/22c/farws/rest-endpoints.html
I'm not sure what else to try to get this connector to work. Has anybody else got it working, or perhaps noticed something I'm doing wrong with the host name?
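Since the error is a name-resolution failure rather than an authentication one, it may be worth verifying outside ADF (ideally from the machine or network where the integration runtime runs) that the host resolves and the REST endpoint answers. A small stand-alone sketch, assuming Java 11+ and placeholder host and credentials:

```java
import java.net.InetAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class OracleHcmHostCheck {

    public static void main(String[] args) throws Exception {
        // Placeholder pod host, e.g. xxxx.xx.xxx.oraclecloud.com
        String host = "xxxx.xx.xxx.oraclecloud.com";

        // 1. Does the host name resolve at all? This mirrors the "Couldn't resolve host name" error.
        System.out.println("Resolved to: " + InetAddress.getByName(host).getHostAddress());

        // 2. Does the grades endpoint answer with the same credentials used in the linked service?
        String credentials = Base64.getEncoder().encodeToString("user:password".getBytes());
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://" + host + "/hcmRestApi/resources/11.13.18.05/grades"))
                .header("Authorization", "Basic " + credentials)
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("HTTP status: " + response.statusCode());
    }
}
```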

Connect to Dataverse with MFA - Error AADSTS50076

I'm trying to connect to the Dataverse API for my Power Platform environment in C#. When following the samples provided by Microsoft, I've been unable to produce any output, as the application terminates with the error: "AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access".
Is there a way to get around this without disabling MFA?

Go storage client not able to access GCP bucket

I have a Golang service which exposes an API through which we upload a CSV to a GCP bucket. On my local host, I set the environment variable GOOGLE_APPLICATION_CREDENTIALS
and point this variable to the file path of the service account JSON. But when deploying to an actual GCP instance, I'm getting the error below while trying to access this API. Ideally, the service should talk to the GCP metadata server, fetch the credentials, and then store them in a JSON file. So there are two problems here:
The service is not querying the metadata server to get the credentials.
If the file is present (I created it manually), it cannot be accessed due to permission issues.
Any help would be appreciated.
Error while initializing storage Client:dialing: google: error getting credentials using well-known file (/root/.config/gcloud/application_default_credentials.json): open /root/.config/gcloud/application_default_credentials.json: permission denied
Finally, after long debugging and searching the web, I found out that there is already an open issue for the Go OAuth2 client: https://github.com/golang/oauth2/issues/337. I had to make a few changes in the code using this method: https://pkg.go.dev/golang.org/x/oauth2/google#ComputeTokenSource, which basically fetches the token explicitly from the metadata server before calling the subsequent cloud APIs.
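For what it's worth, ComputeTokenSource ultimately asks the instance metadata server for an access token; the raw request looks roughly like the sketch below (shown in Java only to illustrate the flow; in the Go service itself, golang.org/x/oauth2/google.ComputeTokenSource handles this):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class MetadataTokenFetch {

    public static void main(String[] args) throws Exception {
        // Documented GCE metadata endpoint for the default service account's access token.
        // It is only reachable from inside a GCP instance, and the
        // "Metadata-Flavor: Google" header is mandatory.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token"))
                .header("Metadata-Flavor", "Google")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The body is JSON containing access_token, expires_in and token_type.
        System.out.println(response.body());
    }
}
```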

"Unable to fetch the metadata: Failed to execute OData Metadata request." error after deploying s4sdk to Cloud Foundry

I am getting the response below while calling the /businesspartners API after deploying the s4sdk app to Cloud Foundry.
I was able to test this app by deploying it locally (after setting ALLOW_MOCKED_AUTH_HEADER: true), so I deployed the application to Cloud Foundry and tried using the destination service (instead of environment variables). Below are the steps that I followed:
Step 1: Set up the Cloud Connector.
Step 2: Create service instances of xsuaa and destination.
Step 3: Reference these in the application's YAML (manifest) file.
Step 4: Push the package to the cloud (mvn clean package; cf push).
Step 5: Now I went ahead and configured the destinations (app >> service instance >> destinations).
Here I couldn't test the connectivity; when I pressed "Check Connection" I got the error below:
How do I test this connection?
Step 6: Believing that the connection was working, I went ahead and restarted the app and started testing the APIs. The app was up, but I was getting the error shown above (fig. 1).
Step 7: I looked at the logs and noticed the two errors below.
Could anyone help resolve this issue?
Tried setting "ALLOW_MOCKED_AUTH_HEADER" -> same two issues.
Tried removing properties in the destination, but got the same errors.
Could not try the destination environment variable in CF, as our S/4 system is not publicly exposed; tried it locally and it works.
Tried with Neo; it works both locally and in the cloud.
But after pushing to CF, I couldn't get it to run.
Thanks,
Girish
You additionally need to bind your application to an instance of the connectivity service on Cloud Foundry to communicate via the Cloud Connector. This is mentioned, albeit a bit hidden, in the error message "Failed to get connectivity service credentials: no service binding found".
Create a service instance with cf create-service connectivity lite my-connectivity.
Add the name of this instance to the services section in your manifest.yml file (see the sketch at the end of this answer).
If you still face issues afterwards, please also try to remove the proxyPort and proxyHost properties from your destination. Those should not be required.
For more details about on-premise connectivity on SAP Cloud Platform Cloud Foundry, consult the following blog post.
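For illustration, the services section in manifest.yml would then look roughly like this (the instance names are placeholders; keep whatever names you already use for your destination and xsuaa instances):

```yaml
applications:
  - name: my-app            # placeholder application name
    services:
      - my-destination      # existing destination service instance
      - my-xsuaa            # existing xsuaa service instance
      - my-connectivity     # connectivity instance created with "cf create-service connectivity lite my-connectivity"
```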

Issue after informatica installation

After a successful installation of the Informatica 9.6.1 server and client on Windows 8.1,
I am facing the error below while configuring the domain in the IPC client Repository Manager tool:
Unable to save information for domain Domain_Hostname.
[PCSF_46026] Unable to find valid TrustStore certificate in PEM format
[ERROR: Cannot connect to Integration Service [xx].]
Thank you #user5468563; the solution in the mentioned link didn't work, but the idea and keywords in the link helped me find the solution.
At last I found the solution for this question.
I actually faced this issue because I enabled secure communication for the domain while installing the INFA 9.6.1 server.
The solution: after a successful installation, log in to the Administrator console.
After successfully creating the Repository Service and the Integration Service, go to
Integration Service --> Edit Advanced Properties --> Trust Store
and add a value for Trust Store. Enter the value using the following syntax (the directory path followed by the keystore file name), for example:
C:\Informatica\9.6.1\server\samples\WebServices\ssl/wsh.keystore
Click OK and restart the service.
Now we can configure the domain in Repository Manager and connect to the repository under the respective domain.
