I would like some help with the JDBC driver and configuring a ColdFusion datasource. When I save the datasource by clicking the submit button, it generates this error:
Connection verification failed for data source: mydtsrcName java.sql.SQLException: No suitable driver found for jdbc:jtds:sybase://127.0.0.1:1313/test.db The root cause was that: java.sql.SQLException: No suitable driver found for jdbc:jtds:sybase://127.0.0.1:1313/test.db
Can anyone explain the problem? How can I install the JDBC driver in ColdFusion?
Here are the settings for the DSN that I configured in the ColdFusion Administrator:
CF Data Source : my_dtsrc
JDBC URL : jdbc:sybase:Tds:127.0.0.1:3939
Driver Class : com.sybase.jdbc3.jdbc.SybDriver
Driver Name : SybDriver
User name : myusername
Password : mypwd
Did I miss something?
(Extended from comments ...)
Did you add the driver jar to the CF class path and restart the CF server first? When the CF server starts, it only checks specific locations for jars/classes. Collectively, those locations are referred to as the "CF class path". Your driver jar must be placed somewhere within the CF class path, or it will not be detected. Hence the error message "No suitable driver found".
There are several locations CF checks automatically when it starts, such as:
{cf_root}\lib
{cf_root}\WEB-INF\lib
The simplest option is to just drop your jar in one of those directories. Then restart the service so CF detects the jar. Afterward, CF will be able to locate the driver class and you can create your "Other" datasource. (Note, the driver class name is case-sensitive)
NB: Technically you can place a jar anywhere, as long as it is accessible to the CF server and you add it to the class path in jvm.config. (See this blog entry for details. It is old, but still relevant). But again, it is simpler to just drop it in one of the directories CF checks automatically. Then there is no need to muck around with the jvm.config file.
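For reference only, if you do go the jvm.config route, the change is typically just appending the jar to the existing java.class.path line. A rough sketch (the jar name and location are placeholders, and the existing entries and separator on that line vary by CF version, so append to what is already there rather than replacing it):

# {cf_root}/bin/jvm.config - sketch only; keep your existing entries
java.class.path={application.home}/lib,{application.home}/gateway/lib,C:/jdbc/jtds-1.3.1.jar

After editing jvm.config, restart the CF service, just as you would after dropping the jar into {cf_root}\lib.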
Related
I have worked with Quarkus connecting to Postgres before, but this is the first time I am trying to connect to SQL Server, which is the default database in my current project. I am following this guide to create a database component.
The properties file contains the following:
quarkus.datasource.db-kind=mssql
quarkus.datasource.username=<user-id>
quarkus.datasource.password=<pwd>
quarkus.datasource.reactive.url=sqlserver://localhost:1433/<db-name>?currentSchema=<schema-name>
quarkus.datasource.reactive.max-size=20
hibernate.default_schema=<schema-name>
The application starts fine, but when I make a request to the Resource that internally uses the repository, I get the following error:
Internal Server Error
Error id f0a959d2-3201-4015-bfd7-6628ae9914d1-1, io.vertx.mssqlclient.MSSQLException: {number=208, state=1, severity=16, message='Invalid object name ''.', serverName='<sql-instance>', lineNumber=1, additional=[io.vertx.mssqlclient.MSSQLException: {number=8180, state=1, severity=16, message='Statement(s) could not be prepared.', serverName='<sql-instance>', lineNumber=1}]}
This means my application is able to connect to the database, but it cannot find the table. The table exists in a schema, and I am unable to pass the schema name, which may be the cause of the issue. As you can see in the properties file, I have tried two options:
Adding 'currentSchema' as a query param
Adding the property 'hibernate.default_schema'
Neither of the two options works. I could not find any documentation for SQL Server that helps me provide the right configuration to the Quarkus application. Please help.
The correct property is quarkus.hibernate-orm.database.default-schema. You can check all the available configuration properties at this URL: https://quarkus.io/guides/hibernate-orm#hibernate-configuration-properties
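As a sketch, the properties file from the question would then become something like the following (placeholders kept from the original; the currentSchema query parameter and the bare hibernate.default_schema line are dropped because they are not recognized here):

quarkus.datasource.db-kind=mssql
quarkus.datasource.username=<user-id>
quarkus.datasource.password=<pwd>
quarkus.datasource.reactive.url=sqlserver://localhost:1433/<db-name>
quarkus.datasource.reactive.max-size=20
quarkus.hibernate-orm.database.default-schema=<schema-name>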
I'm new to NiFi. I was able to install NiFi and see it in the web browser. Now, as a next step, I want to connect to SQL Server, but it seems I have to install a JDBC driver as well, and here is my issue: the tutorials I find all reference something called "docker" and advise installing the JDBC driver from there. When I go into cmd and type docker, cmd does not recognize it. Can anyone tell me how to install it and what it is?
There is no need for Docker in this use case.
All you have to do is download and install SQL Server from the official downloads page, if you don't already have a server set up.
Installation guide - https://learn.microsoft.com/en-us/sql/database-engine/install-windows/install-sql-server?view=sql-server-2017
You also need to download the jar file that contains the JDBC driver - https://learn.microsoft.com/en-us/sql/connect/jdbc/microsoft-jdbc-driver-for-sql-server?view=sql-server-2017
In NiFi, you can use the PutDatabaseRecord processor to insert/update/delete rows in a table. This processor internally uses the DBCPConnectionPool controller service to get database connections.
The DBCPConnectionPool controller service requires the following properties to be set:
Database connection url - jdbc:sqlserver://localhost:1433;databaseName=dbname
Driver class name - com.microsoft.sqlserver.jdbc.SQLServerDriver
Driver (jar) location - /tmp/sqlserver.jar (example only)
PutDatabaseRecord Processor
DBCPConnectionPool controller service
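If you want to sanity-check the driver jar and connection URL outside of NiFi first, a minimal Java sketch like the one below can help (host, database name, and credentials are placeholders, and it assumes the Microsoft JDBC jar is on the classpath; newer driver versions may also need an explicit encrypt setting in the URL):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SqlServerJdbcCheck {
    public static void main(String[] args) throws Exception {
        // Fails fast with ClassNotFoundException if the driver jar is not on the classpath
        Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
        // Same URL form as the DBCPConnectionPool setting above; credentials are placeholders
        String url = "jdbc:sqlserver://localhost:1433;databaseName=dbname";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            rs.next();
            System.out.println("Connected; SELECT 1 returned " + rs.getInt(1));
        }
    }
}

If this runs cleanly with the same jar, URL, and credentials, the DBCPConnectionPool controller service should be able to connect with them as well.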
I think you may want to google how to install Docker and what it is; it's already explained in many places.
I like SQuirreL, but I am having a hell of a time getting the driver set up! I am trying to connect to an InterSystems Caché database. I have downloaded the CacheBD.jar file, but SQuirreL continually gives me the error:
Could not find class CacheBD in neither the Java class path nor the Extra class path of the Intersystems Cache driver definition:
java.lang.ClassNotFoundException: CacheBD
I have put the jar file in several places, added a CLASSPATH variable, and I cannot get it to work.
Can someone please give a step-by-step-for-dummies guide to installing and adding a driver to SQuirreL?
Thanks for any guidance.
Leslie
I needed to select the 5.2_CacheDB jar instead. Once I did that, the correct class name was populated in the driver setup and it turned green.
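For anyone else stuck at the same point, the driver definition that ends up working typically looks something like this (the class name and URL format are my assumptions based on the standard InterSystems Caché JDBC driver; host, port, and namespace are placeholders):

Driver class name: com.intersys.jdbc.CacheDriver
Example URL:       jdbc:Cache://localhost:1972/USER
Extra class path:  full path to the 5.2_CacheDB jar selected above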
I've just integrated Pentaho's Design Studio into the BI server. Does anyone know how to add the MySQL JDBC driver? I need to connect in order to define the relational action process.
In my research I found:
http://wiki.bizcubed.com.au/xwiki/bin/view/Pentaho%20Tutorial/Install%20Pentaho%20Design%20Studio#Comments
which specifies selecting JDBC Driver, Edit, Extra Class Path from Preferences, but no such preference exists,
http://forums.pentaho.com/showthread.php?85148-Design-Studio-xaction-database-connection-dropdown-list-empty&highlight=add+jdbc+driver+to+design+studio
which resulted in me creating a jdbc folder under plugins\org.pentaho.designstudio.editors.actionsequence_4.0.0.stable\lib\ and placing the drivers there, but, just like the author of that thread, I'm stuck.
http://forums.pentaho.com/showthread.php?53303-Create-a-new-datasource&highlight=add+jdbc+driver+to+design+studio
suggests that:
3. If you are using the Pentaho Design Studio you have to copy your JDBC JAR files to the plugins directory (in the Pentaho plugin) so you can develop, deploy and run your applications. This also applies to the Eclipse plugin (if you are using Eclipse).
That resulted in me placing the jar files in the plugins directory, to no avail.
http://forums.pentaho.com/showthread.php?53715-Can-t-add-new-datasource-GA-version&highlight=add+jdbc+driver+to+design+studio
talks of a directory, rdw, which does not exist.
Any form of assistance will be greatly appreciated.
You have to configure the datasource by adding a Relational Process Action to your .xaction in the Pentaho Design Studio, where you can specify the JDBC driver, username, password, and database URL. But first you have to put your MySQL JAR file in your lib folder: /path/to/biserver-ce/tomcat/lib
You will also have to save your *.xaction file(s) in the pentaho-solutions folder (/path/to/biserver-ce/pentaho-solutions) so that they can connect to the database you assigned in your Relational Process Action.
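As an illustration, the connection settings inside that Relational Process Action would look something like this (host, database name, and credentials are placeholders; the driver class assumes the MySQL Connector/J 5.x jar you dropped into tomcat/lib):

JDBC Driver:   com.mysql.jdbc.Driver
Database URL:  jdbc:mysql://localhost:3306/mydatabase
Username:      myuser
Password:      mypassword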
I encountered the same problem and solved it as follows:
Place mysql-connector-java-5.1.17.jar under the (BI server path)\tomcat\lib\ folder.
Start the Pentaho Administration Console (PAC) at http://127.0.0.1:8099 with
user: admin
password: password
and add a connection there.
Use the name of the connection you just created as the JNDI name in the action sequence.
That solved the problem for me.
I am having some problems with a Hibernate Criteria query causing an out-of-index error at the driver level. I am pretty sure the problem is in the driver, as I have been debugging and everything seems OK, but to be sure and to be able to report the error I need to enable traces, and I have not been able to do so.
I have added ojdbc5_g.jar as a new JDBC provider and created a new data source with it, renamed the JNDI name of the old one so the new debug data source uses the same name, and tried using -Doracle.jdbc.Trace=true -Djava.util.logging.config.file=ConfigFile.properties in the server JVM configuration (using both an absolute path for the file and a relative one).
The data source and the app work, but no log appears. Then I found the data source custom properties, so I modified the level and the filename, and now the file is there but empty.
The JVM Configuration should include:
-Doracle.jdbc.Trace=true -Djava.util.logging.config.file=/jdbc/OracleLog.properties
and the mentioned properties file should include something like:
.level=SEVERE
oracle.jdbc.level=ALL
oracle.jdbc.handlers=java.util.logging.ConsoleHandler
java.util.logging.ConsoleHandler.level=INFO
java.util.logging.ConsoleHandler.formatter=java.util.logging.SimpleFormatter
You can set the logging for the following targets, depending on the nature of your problem:
oracle.jdbc
oracle.jdbc.driver
oracle.jdbc.pool
oracle.jdbc.rowset
oracle.jdbc.xa
oracle.sql
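Since in your case the trace file is created but stays empty, it may also help to route the oracle.jdbc logger to a FileHandler and open its level all the way up. A minimal sketch of such a properties file (the output path is a placeholder, and this assumes the _g debug jar is the one actually being loaded at runtime):

# /jdbc/OracleLog.properties - example only
.level=SEVERE
oracle.jdbc.level=ALL
oracle.jdbc.handlers=java.util.logging.FileHandler
java.util.logging.FileHandler.pattern=/tmp/oracle-jdbc-trace-%u.log
java.util.logging.FileHandler.level=ALL
java.util.logging.FileHandler.formatter=java.util.logging.SimpleFormatter
java.util.logging.FileHandler.limit=10000000
java.util.logging.FileHandler.count=3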