IBM App Connect Enterprise: Could not locate JDBC Provider policy

I have created a JDBCProvider policy called CLINIC in IBM App Connect Enterprise (ACE v11) on Windows; CLINIC is also the name of the database. I have a Mapping node where I'm trying to select from or insert into the Oracle database. I deployed the policy to the integration node after setting credentials on the node:
JDBCProviders
CLINIC
connectionUrlFormat='jdbc:oracle:thin:[user]/[password]#[serverName]:[portNumber]:[connectionUrlFormatAttr1]'
connectionUrlFormatAttr1='XE'
connectionUrlFormatAttr2=''
connectionUrlFormatAttr3=''
connectionUrlFormatAttr4=''
connectionUrlFormatAttr5=''
databaseName='CLINIC'
databaseSchemaNames='useProvidedSchemaNames'
databaseType='Oracle'
databaseVersion='11.2'
description=''
environmentParms=''
jarsURL='C:\oraclexe\app\oracle\product\11.2.0\server\jdbc\lib'
jdbcProviderXASupport='TRUE'
maxConnectionPoolSize='0'
portNumber='1521'
securityIdentity='mySecIdentity'
serverName='localhost'
type4DatasourceClassName='oracle.jdbc.xa.client.OracleXADataSource'
type4DriverClassName='oracle.jdbc.OracleDriver'
useDeployedJars='FALSE'
Then when I test the message flow, I always get this error:
Exception. BIP2230E: Error detected whilst processing a message in node 'MappSelect.Mapping'. : C:\ci\product-build\WMB\src\DataFlowEngine\PluginInterface\jlinklib\ImbJniNode.cpp: 433: ImbJniNode::evaluate: ComIbmMSLMappingNode: MappSelect#FCMComposite_1_3
BIP6253E: Error in node: 'Mapping'. Could not locate JDBC Provider policy ''XE'', which was given for the data source name property for this node. : JDBCCommon.java: 575: JDBCDatabaseManager::constructor: :
So what am I missing? Any help, please?

I don't know what you did in your Mapping node, but the cause should be one of the following:
You specified the wrong resource name: you have to reference CLINIC as the data source name in your Mapping node (the error shows it is currently set to XE).
You did not restart the integration server after applying this configuration.
https://www.ibm.com/support/pages/ibm-app-connect-enteprise-bip6253e-error-node-java-compute-could-not-locate-jdbc-provider-policy-mypoliciesmypolicy
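For completeness, the securityIdentity named in the policy is normally bound to database credentials with mqsisetdbparms before restarting the node. A sketch, assuming an integration node named NODE and placeholder credentials:

```shell
# Associate the security identity from the policy with the database credentials
mqsisetdbparms NODE -n jdbc::mySecIdentity -u dbUser -p dbPassword

# Restart the node so the policy and credentials are picked up
mqsistop NODE
mqsistart NODE
```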

Related

Unable to connect to SQL Server using the quarkus-hibernate-reactive due to incorrect reactive.url

I have worked with Quarkus connecting to Postgres, but this is the first time I am trying to connect to SQL Server, which is the default server in my current project. I am following this guide to create a database component.
The properties file contains the following:
quarkus.datasource.db-kind=mssql
quarkus.datasource.username=<user-id>
quarkus.datasource.password=<pwd>
quarkus.datasource.reactive.url=sqlserver://localhost:1433/<db-name>?currentSchema=<schema-name>
quarkus.datasource.reactive.max-size=20
hibernate.default_schema=<schema-name>
The application starts fine, but when I make a request to the Resource that internally uses the repository, I get the following error:
Internal Server Error
Error id f0a959d2-3201-4015-bfd7-6628ae9914d1-1, io.vertx.mssqlclient.MSSQLException: {number=208, state=1, severity=16, message='Invalid object name ''.', serverName='<sql-instance>', lineNumber=1, additional=[io.vertx.mssqlclient.MSSQLException: {number=8180, state=1, severity=16, message='Statement(s) could not be prepared.', serverName='<sql-instance>', lineNumber=1}]}
This means my application is able to connect to the database, but it cannot find the table. The table exists in a schema, and I am unable to pass the schema name, which may be the cause of the issue. As the properties file shows, I have tried two options:
Adding 'currentSchema' as a query parameter
Adding the property 'hibernate.default_schema'
But neither option works. There is no SQL Server documentation that helps me provide the right configuration to the Quarkus application. Please help.
The correct property is quarkus.hibernate-orm.database.default-schema. You can check all the available configuration properties at https://quarkus.io/guides/hibernate-orm#hibernate-configuration-properties
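Applying that, the properties file from the question would become the following (placeholders kept as in the question; the currentSchema query parameter is dropped, since it did not work):

```properties
quarkus.datasource.db-kind=mssql
quarkus.datasource.username=<user-id>
quarkus.datasource.password=<pwd>
quarkus.datasource.reactive.url=sqlserver://localhost:1433/<db-name>
quarkus.datasource.reactive.max-size=20
quarkus.hibernate-orm.database.default-schema=<schema-name>
```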

Desktop Neo4j Error: Cannot create property 'transport-class' on number '0'

I have spun up a Neo4j instance in Heroku and created a DB with a few nodes and relationships, but I have trouble connecting to the same graph from the Neo4j Desktop browser with the steps below:
Step 1: Created a project.
Step 2: Added the Remote Graph and entered the bolt URL, username, and password.
Step 3: Successfully connected to the graph, which shows the counts of all the nodes and relationships I created.
But here is the problem: when I execute any query, it pops up the error below, and I could not find any solution on the net. I think no one else has faced this issue :(
ERROR: Cannot create property 'transport-class' on number '78'
I was having the same issue. You need to update the server connection using the :server connect command and provide the connect URL as bolt://YOUR_SERVER_IP:7687. Your username and password will be the same as you set them. This is working for me!
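In the Neo4j Browser command bar, that sequence looks roughly like this (YOUR_SERVER_IP is a placeholder for the Heroku host):

```
:server disconnect
:server connect
```

After :server connect, enter bolt://YOUR_SERVER_IP:7687 as the connect URL along with your existing username and password.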

Ms Integration Runtime data factory

I'm kind of new to the integration runtime.
I had a pipeline running with no issues, but recently we had an AD upgrade and the local on-premises SQL DB changed my user from 'bluecompany\joe' to 'redcompany\joe'.
This has caused my Data Factory to stop working properly, as it can't connect to the on-premises SQL Server.
I can't seem to find the place where I can update this change.
Error:
Copy activity encountered a user error at Source side: Integration Runtime (Self-hosted) Node Name=ORG200016,ErrorCode=UserErrorFailedToConnectToSqlServer,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Cannot connect to SQL Server: 'org200016.bluecompany.com.au', Database: 'GroupRisk', User: 'bluecompany\joe'.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.ComponentModel.Win32Exception,Message=This user can't sign in because this account is currently disabled,Source=Microsoft.DataTransfer.ClientLibrary,'.
any ideas would be very welcomed. Thank you
As your login account has changed, I think you will need to update the account in the corresponding linked service, where you previously entered your credentials for this database.
Be sure the test connection succeeds after you edit the linked service. Then the pipeline should be able to connect to your database again.
Depending on which version of ADF you're using, there are different ways to update your linked service:
Log in to https://portal.azure.com/ and find your data factory (if you don't have an account to log in to the portal, you need to find the admin who created this linked service and ask them to update it for you).
If you're using a v1 data factory, open "Author and Deploy", where you should be able to find the linked service corresponding to your on-premises SQL Server.
If you're using a v2 data factory, open "Author and Monitor" and click on the pen logo; you should be able to find your linked service on the "Connections" tab, which will allow you to edit it.
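For reference, a v2 SqlServer linked service with explicit credentials has roughly this JSON shape (the names, server, and integration runtime reference here are hypothetical; in practice the password is usually a secure string or an Azure Key Vault reference):

```json
{
  "name": "OnPremSqlServer",
  "properties": {
    "type": "SqlServer",
    "typeProperties": {
      "connectionString": "Server=org200016.bluecompany.com.au;Database=GroupRisk;Integrated Security=False;",
      "userName": "redcompany\\joe",
      "password": {
        "type": "SecureString",
        "value": "<password>"
      }
    },
    "connectVia": {
      "referenceName": "SelfHostedIR",
      "type": "IntegrationRuntimeReference"
    }
  }
}
```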
Thanks,
Eva

pwdChangedTime which is not allowed by any of the objectclasses defined in that entry in OpenDj LDAP

I installed a new LDAP server (OpenDJ 2.4.6) and am trying to enable replication against an existing server, but I am getting the issue below.
I ran the replication command on the existing (first) server. Can you please help with or suggest something for the issue below?
Establishing connections ..... Done.
Checking registration information .....
Error updating registration information. Details: Registration information
error. Error type: 'ERROR_UNEXPECTED'. Details:
javax.naming.directory.SchemaViolationException: [LDAP: error code 65 - Entry
cn=admin,cn=Administrators,cn=admin data violates the Directory Server schema
configuration because it includes attribute pwdChangedTime which is not
allowed by any of the objectclasses defined in that entry]; remaining name
'cn=admin,cn=Administrators,cn=admin data'
See /tmp/opends-replication-6304872164983350730.log for a detailed log of this
operation.
A schema violation indicates that the entry is not compliant with the schema defined on the directory server.
Since pwdChangedTime is by default defined in the schema as an operational attribute, and the error occurs with the dsreplication command (which is known to produce valid data), this probably indicates that you have altered the default schema in a non-standard, incompatible way.
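One way to check whether the default definition is still present is to read the server's schema entry and look for the attribute type. A sketch, assuming the default OpenDJ LDAP port and a Directory Manager bind (adjust host, port, and credentials to your deployment):

```shell
# Read the attributeTypes from the schema entry and look for pwdChangedTime
ldapsearch -h localhost -p 1389 -D "cn=Directory Manager" -w <password> \
  -b "cn=schema" -s base "(objectClass=*)" attributeTypes | grep -i pwdChangedTime
```

If the attribute type is missing or its definition differs from the stock one, restoring the default schema files should let dsreplication complete.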

Configuration Issue for IBM Filenet 5.2

I installed IBM FileNet Content Engine 5.2 on my machine. I am getting a problem while configuring the GCD data sources for a new profile.
Let me first explain the steps I did, then I will describe the problem I am getting.
First, I created the GCD database in DB2, then I created the data sources required for configuring the profile in the WAS Admin Console. I created a J2C authentication alias for the user which has access to the GCD database and configured it with the data sources. The test database connection succeeds, but when I run the task of configuring the GCD data sources, it fails with the following error:
Starting to run Configure GCD JDBC Data Sources
Configure GCD JDBC Data Sources ******
Finished running Configure GCD JDBC Data Sources
An error occurred while running Configure GCD JDBC Data Sources
Running the task failed with the following message: The data source configuration failed:
WASX7209I: Connected to process "server1" on node Poonam-PcNode01 using SOAP connector; The type of process is: UnManagedProcess
testing Database connection
DSRA8040I: Failed to connect to the DataSource. Encountered java.sql.SQLException: [jcc][t4][2013][11249][3.62.56] Connection authorization failure occurred. Reason: User ID or Password invalid. ERRORCODE=-4214, SQLSTATE=28000 DSRA0010E: SQL State = 28000, Error Code = -4,214.
It looks like a simple error of an invalid user ID and password, but I am using the same alias for other data sources as well and they are working fine, so I am not sure why I am getting this error. I have also tried changing the scope of the data sources, but with no success. Can somebody please help?
Running the "Configure GCD JDBC Data Sources" task in FileNet Configuration Manager will create everything needed in WAS (including the alias); do not create it manually beforehand.
I suspect it had an issue with the existing JDBC data sources or a differently named alias.
It seems from your message that you are running this from FileNet Configuration Manager. Could you please double-check in your database whether the user ID is authorized to execute queries in the GCD database? It may well be a permission issue.
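To verify the credentials and permissions outside WAS, you can try connecting to the GCD database directly with the same user from a DB2 command window. A sketch, assuming a database named GCDDB and a user ceadmin (both hypothetical placeholders for your own names):

```shell
# Connect as the same user the J2C alias holds, then run a trivial query
db2 CONNECT TO GCDDB USER ceadmin USING <password>
db2 "SELECT COUNT(*) FROM SYSCAT.TABLES"
db2 CONNECT RESET
```

If the direct connection fails with the same SQLSTATE 28000, the problem is with the account itself (password, lock-out, or DB2 authentication setup) rather than with the data source configuration.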
