JDBC connection to MySQL 8.0 data format problem

A value of 100 becomes 100.0 when I read FLOAT data from MySQL 8 via JDBC, but it shows as 100 when I use Navicat or the mysql command line.

Change the type of the f variable from float to int and use:
f = rs.getInt("f");
It will work.
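A minimal sketch of that change, assuming a hypothetical table t with a FLOAT column f; the connection URL and credentials are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class ReadFloatColumn {
    public static void main(String[] args) throws SQLException {
        // Placeholder URL and credentials for a local MySQL 8 instance
        String url = "jdbc:mysql://localhost:3306/test";
        try (Connection conn = DriverManager.getConnection(url, "user", "pass");
             Statement st = conn.createStatement();
             // Hypothetical table t with a FLOAT column f
             ResultSet rs = st.executeQuery("SELECT f FROM t")) {
            while (rs.next()) {
                // rs.getFloat("f") would give 100.0; getInt() returns 100
                int f = rs.getInt("f");
                System.out.println(f);
            }
        }
    }
}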

Related

Very slow connection to Snowflake from Databricks

I am trying to connect to Snowflake using R in Databricks. My connection works and I can run queries and retrieve data successfully; however, it can take more than 25 minutes simply to connect, although once connected all my queries are quick thereafter.
I am using the sparklyr function 'spark_read_source', which looks like this:
query <- spark_read_source(
  sc = sc,
  name = "query_tbl",
  memory = FALSE,
  overwrite = TRUE,
  source = "snowflake",
  options = append(sf_options, client_Q)
)
where 'sf_options' is a list of connection parameters which looks similar to this:
sf_options <- list(
  sfUrl = "https://<my_account>.snowflakecomputing.com",
  sfUser = "<my_user>",
  sfPassword = "<my_pass>",
  sfDatabase = "<my_database>",
  sfSchema = "<my_schema>",
  sfWarehouse = "<my_warehouse>",
  sfRole = "<my_role>"
)
and my query is a string appended to the 'options' argument, e.g.
client_Q <- 'SELECT * FROM <my_database>.<my_schema>.<my_table>'
I can't understand why it is taking so long; if I run the same query from RStudio using a local Spark instance and 'dbGetQuery', it is instant.
Is spark_read_source the problem? Is it an issue between Snowflake and Databricks? Or something else? Any help would be great. Thanks.

Sphinx with Oracle does not work with special (Turkish) characters

Everything is fine with Sphinx; I only have a problem with Turkish characters.
I am using sphinx-2.2.11 (2.3 also does not work).
Oracle 11g connection in sphinx.conf:
source db{
    Driver={Oracle in OraClient11g_home2};Dbq=ABC-DATABASE-XX:1521/ibbcbs;Uid=ibbcbs;
    type = odbc
    odbc_dsn = DSN=dsn_ABC; Pwd=ABC;Dbq:ABC
    sql_host = XX
    sql_user = XX
    sql_pass = XX
    sql_db = XX
    sql_port = 1521
}
The query looks like:
select
    1000000 + objectid as GID,
    TO_CHAR(NAME) as NAME,
    SDO_UTIL.TO_WKTGEOMETRY(SHAPE) as SHAPE_WKT
from MAHALLE
I tried several different charset tables for Turkish in sphinx.conf:
charset_table = A->a, B->b, C->c, U+C7->U+E7, D..G->d..g, U+011E->U+011F, H->h, U+49->U+131, U+130->i, J..O->j..o, U+D6->U+F6, P->p, R..U->r..u, U+15E->U+15F, U+DC->U+FC, X->x, W->w, V->v, Y->y, Z->z, a, b, c, U+E7, d..g, U+11F, h, U+131, i..o, U+F6, p, r..u, U+15F, U+FC, x, w, v, y, z
Original Data: ALANİÇİ
But indexed in Sphinx: ALANIÇI
İ is converted to I somehow. Even if I search for the same text (ALANIÇI), Sphinx does not return any results.
The issue is that the Oracle client is not connecting to the database as UTF8, so Sphinx Search is not getting the data correctly.
To fix this, set the language for the Oracle client to TURKISH_TURKEY.UTF8.
On Windows, this can be done by editing the NLS_LANG registry value under Computer\HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\KEY_OraClient11g_home1. The path may be different depending on the Oracle client you are using.
The same can be achieved by setting an environment variable as described in https://docs.oracle.com/cd/E12102_01/books/AnyInstAdm784/AnyInstAdmPreInstall18.html under the heading "For Windows:".
On Unix it can be fixed with:
setenv NLS_LANG TURKISH_TURKEY.UTF8
After the above changes, reindex the data and all should be good.

db2 Invalid parameter: Unknown column name SERVER_POOL_NAME . ERRORCODE=-4460, SQLSTATE=null

I am using an SQL select to access a DB2 table with schemaname.tablename as follows:
select 'colname' from schemaname.tablename
The table definitely has 'colname' = SERVER_POOL_NAME, yet I get the following error:
"Invalid parameter: Unknown column name SERVER_POOL_NAME. ERRORCODE=-4460, SQLSTATE=null"
I am using DB2 v10.1 FP0 with JDBC driver version 3.63.123 (JDBC 3.0 spec).
The application is run as the DB2 administrator and also as a Windows 2008 admin.
I saw a discussion about this issue at: db2jcc4.jar Invalid parameter: Unknown column name
But I do not know where the connection parameter 'useJDBC4ColumnNameAndLabelSemantics' should be set (to the value 2).
I saw the parameter should appear in com.ibm.db2.jcc.DB2BaseDataSource (see: http://publib.boulder.ibm.com/infocenter/db2luw/v9r5/index.jsp?topic=%2Fcom.ibm.db2.luw.apdv.java.doc%2Fsrc%2Ftpc%2Fimjcc_r0052607.html)
But I do not find this file in my DB2 installation; maybe it is packed in a .jar file.
Any advice?
There is a link on the page you're referring to, showing you the ways to set properties. Specifically, you can populate a Properties object with desired values and supply it to the getConnection() call:
String url = "jdbc:db2://host:50000/yourdb";
Properties props = new Properties();
props.setProperty("useJDBC4ColumnNameAndLabelSemantics", "2");
// set other required properties
Connection c = DriverManager.getConnection(url, props);
Alternatively, you can embed property name/value pairs in the JDBC URL itself:
String url = "jdbc:db2://host:50000/yourdb:useJDBC4ColumnNameAndLabelSemantics=2;";
// set other required properties
Connection c = DriverManager.getConnection(url);
Note that each name/value pair must be terminated by a semicolon, even the last one.

Error while uploading image to database

When I try to upload an image using the code below, I get the following error: java.sql.SQLException: ORA-01460: unimplemented or unreasonable conversion requested
File image = new File("D:/" + fileName);
preparedStatement = connection.prepareStatement(query);
preparedStatement.setString(1, "Ayush");
fis = new FileInputStream(image);
preparedStatement.setBinaryStream(2, (InputStream) fis, (int) image.length());
int s = preparedStatement.executeUpdate();
if (s > 0) {
    System.out.println("Uploaded successfully !");
    flag = true;
}
else {
    System.out.println("unsucessfull to upload image.");
    flag = false;
}
Please help me out.
DB script:
CREATE TABLE ESTMT_SAVE_IMAGE
(
    NAME  VARCHAR2(50),
    IMAGE BLOB
)
The primary cause of this error is an incompatible conversion, but after seeing your DB script, I assume you are not doing any conversion in your script.
There are other reported causes of the ORA-01460 as well:
Incompatible character sets can cause an ORA-01460
Using SQL Developer, attempting to pass a string to a bind variable value in excess of 4000 bytes can result in an ORA-01460
With ODP, users moving from the 10.2 client and 10.2 ODP to the 11.1 client and 11.1.0.6.10 ODP reported an ORA-01460 error. This was a bug that should be fixed by patching ODP to the most recent version.
Please see this
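For reference, here is a self-contained sketch of the upload from the question; the INSERT statement is inferred from the DDL above, and the connection URL, credentials, and file name are placeholders, so treat this as an illustration rather than a confirmed fix:

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class UploadImage {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details
        String url = "jdbc:oracle:thin:@//localhost:1521/ORCL";
        // INSERT statement inferred from the ESTMT_SAVE_IMAGE DDL in the question
        String query = "INSERT INTO ESTMT_SAVE_IMAGE (NAME, IMAGE) VALUES (?, ?)";
        File image = new File("D:/photo.jpg"); // illustrative file name

        try (Connection connection = DriverManager.getConnection(url, "user", "pass");
             InputStream fis = new FileInputStream(image);
             PreparedStatement ps = connection.prepareStatement(query)) {
            ps.setString(1, "Ayush");
            // Stream the file contents into the BLOB column, passing the exact length
            ps.setBinaryStream(2, fis, (int) image.length());
            int rows = ps.executeUpdate();
            System.out.println(rows > 0 ? "Uploaded successfully!" : "Failed to upload image.");
        }
    }
}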

unable to use RODM to connect to Oracle database from R

I am trying to connect to an Oracle database from R.
I used RODM_open_dbms_connection(dsn, uid = "", pwd = ""), but it doesn't work. I am not sure what kind of error it is.
Here is the error screen from R.
> library(RODM)
Loading required package: RODBC
> DB <- RODM_open_dbms_connection(dsn="****", uid="****", pwd="****")
Error in typesR2DBMS[[driver]] <<- value[c("double", "integer", "character", :
  cannot change value of locked binding for 'typesR2DBMS'
Have you tried ROracle? After you get the instant client installed on your machine, connecting and fetching records from R looks like this:
library(ROracle)
library(data.table)  # needed for data.table() below
con <- dbConnect(dbDriver("Oracle"), username = "username", password = "password", dbname = "dbname")
res <- dbSendQuery(con, "select * from schema.table")
dt <- data.table(fetch(res, n = -1))
I explored RODM_open_dbms_connection and commented out the setSqlTypeInfo() part. After that I didn't receive that error.
Install the RODM package from source; only then can you edit the package.
