I am trying to write a SQL query in ClickHouse like the one below:
SELECT *
FROM Student
WHERE (:fname IS NULL OR FirstName = :fname)
AND (:lname IS NULL OR LastName = :lname)
but I am getting an exception: DB::Exception: Cannot compare std::string with DB::Null
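One possible workaround, sketched under the assumption that FirstName and LastName are not Nullable columns, is to avoid comparing the column with NULL at all and fall back to the column itself:
SELECT *
FROM Student
WHERE FirstName = coalesce(:fname, FirstName)
  AND LastName = coalesce(:lname, LastName)
When a parameter is bound as NULL, coalesce returns the column value and the predicate is trivially true; when it is set, the predicate behaves like the original equality check.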
I have Kafka integration objects:
CREATE TABLE topic_kafka
(
topic_data String
) ENGINE = Kafka()
SETTINGS
kafka_broker_list = 'kafka:9092',
kafka_topic_list = 'topic',
kafka_group_name = 'clickhouse_group',
kafka_format = 'JSONAsString',
kafka_num_consumers = 1;
CREATE TABLE topic
(
time DateTime64(3),
user_id Int32 NOT NULL,
version String
) ENGINE = MergeTree()
ORDER BY (user_id, time);
CREATE MATERIALIZED VIEW topic_consumer
TO topic AS
SELECT
JSONExtract(topic_data, 'time', 'Int64') as time,
toInt32(JSON_VALUE(topic_data, '$.data.user_id')) as user_id,
JSON_VALUE(topic_data, '$.data.version') as version
FROM topic_kafka;
And a Kafka topic of JSON data with nested objects, like this:
{"time":1639387657456,"data":{"user_id":42,"version":"1.2.3"}}
The problem is that time ends up with values like 2282-12-31 00:00:00.000 in the topic table.
It can also be reproduced with the following query:
select cast (1639387657456 as DateTime64(3)) as dt
But for the DML query below the implicit date conversion works fine, as the documentation states:
insert into topic (time, user_id) values ( 1640811600000, 42)
I've found that such cast works fine too:
select cast (1639387657.456 as DateTime64(3)) as dt
It looks like I've missed something in the documentation.
What is the problem with the topic_consumer view above? Is it OK to divide the milliseconds by 1000 to convert them to DateTime explicitly?
Use fromUnixTimestamp64Milli:
https://clickhouse.com/docs/en/sql-reference/functions/type-conversion-functions/#tounixtimestamp64nano
select fromUnixTimestamp64Milli(toInt64(1640811600000));
┌─fromUnixTimestamp64Milli(toInt64(1640811600000))─┐
│ 2021-12-29 21:00:00.000 │
└──────────────────────────────────────────────────┘
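Applied to the materialized view from the question, the conversion might look like the sketch below (the JSON paths are simply the ones from the question, not verified against the real topic):
DROP VIEW IF EXISTS topic_consumer;
CREATE MATERIALIZED VIEW topic_consumer
TO topic AS
SELECT
    fromUnixTimestamp64Milli(JSONExtract(topic_data, 'time', 'Int64')) AS time,
    toInt32(JSON_VALUE(topic_data, '$.data.user_id')) AS user_id,
    JSON_VALUE(topic_data, '$.data.version') AS version
FROM topic_kafka;
Dividing the milliseconds by 1000 and casting would also work (as the cast of 1639387657.456 above shows), but the dedicated function keeps the intent explicit.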
This is my native SELECT query in my repository:
@Modifying
@Query(value = "SELECT * FROM tasks WHERE title LIKE '%Java%' ORDER BY id DESC", nativeQuery = true)
List<Task> listAllTasks();
This works fine, but when I use a specific column name instead of *, like this:
@Modifying
@Query(value = "SELECT title FROM tasks WHERE title LIKE '%Java%' ORDER BY id DESC", nativeQuery = true)
List<Task> listAllTasks();
I get this error:
org.postgresql.util.PSQLException: The column name id was not found in this ResultSet.
Any help?
The result set doesn't contain the "id" column; you have to provide it.
You should change how you declare your SQL:
SELECT t.title, t.id FROM tasks t WHERE t.title LIKE '%Java%' ORDER BY t.id DESC
Check out this short example of native queries:
Select * from Entity -> returns a List of Entity
Example:
@Query(value = "SELECT * FROM tasks", nativeQuery = true)
List<Task> findAllTasks();
Select a single column from an entity -> returns a List of that column's type.
Example:
@Query(value = "SELECT t.title FROM tasks t", nativeQuery = true)
List<String> findTitle_AllTasks();
title is of the type String
Select multiple columns from an entity -> returns a List of Object[] holding the data.
Example:
@Query(value = "SELECT t.id, t.title FROM tasks t", nativeQuery = true)
List<Object[]> findIdTitle_AllTasks();
So you are retrieving String data (title) but asking for a List of Task, and that is what causes the problem. You can check the Hibernate docs on HQL and JPQL to understand this.
Also, you are doing a SELECT (a DQL operation). @Modifying is redundant here, as it is meant for DML operations (UPDATE/DELETE) in Spring Data JPA.
I have my model as:
type Report struct {
    ID    int     `json:"id,omitempty" gorm:"primary_key"`
    Title *string `json:"title" gorm:"not null"`
}
I have initialized a variable as var report Report. I have successfully auto-migrated this model as a database table and populated it (SQL INSERT) using GORM's db.Create(&report).
The problem I am facing is with query commands. Every query command supported by GORM, such as db.Find(&report) or db.First(&report, 1), results in queries such as the following:
SELECT * FROM "reports" WHERE "reports"."deleted_at" IS NULL AND ((id = $1))
SELECT * FROM "reports" WHERE "reports"."deleted_at" IS NULL AND ((id = $1))
SELECT * FROM reports WHERE (reports.deleted_at IS NULL) AND ((id = $1))
SELECT * FROM reports WHERE (reports.deleted_at IS NULL) AND ((id = $1))
SELECT 0 done
I am unable to query the database. I am using GORM with CockroachDB. This works fine when using the Go pq driver and raw SQL commands.
The deleted_at column is a part of GORM's base gorm.Model struct and its soft delete feature. Are you using gorm.Model somewhere that we can't see in this example? This isn't supposed to happen unless you either define a field named DeletedAt or embed a gorm.Model in your model struct.
Since the model has the deleted_at field, GORM applies its soft-delete behavior automatically. You can use Unscoped:
db.Unscoped().Find(&reports)
which is the same as running the raw query:
db.Raw("SELECT * FROM reports").Scan(&reports)
I am using Spring JdbcTemplate and I want to update some of the columns in a table.
Example: consider a Student table with this record:
id --> 1007
name --> Krish
age --> 25
tel --> 0112538956536
This is an existing record. I want to update only some of the fields (which fields need updating changes from time to time); the others should keep their existing values. How can I achieve this with Spring JdbcTemplate? Any suggestions would be very helpful.
Thanks.
You can use JdbcTemplate to update the table from the application.
Here is a simple example:
String name = "asdads";
int age = 12;
String tel = "+905655465465";
int id = 1;
String SQL = "update Student set name = ?, age= ? ,tel=?, last_updated = sysdate() where id = ?";
jdbcTemplate.update(SQL,name, age, tel, id);
Or maybe just like this:
jdbcTemplate.update(
    "update Student set last_updated = now(), some_field = ? where id = ?",
    "some value", 1007);
I'm trying to do a native query on an Oracle database involving dates, but I'm just not getting something.
I have a table with a number of rows, and I know that the record with oid=1234 has the latest updatetimeutc value, where updatetimeutc is a DATE field. So I do the following:
Query query1 = entityManager.createNativeQuery("select updatetimeutc from TABLE where oid=1234");
List<Object[]> resultList = query1.getResultList();
Object[] os = resultList.get(0);
Date date = (Date) os[0];
Query query2 = entityManager.createNativeQuery("select oid, updatetimeutc from TABLE where updatetimeutc > :d");
query2.setParameter("d", date);
resultList = query2.getResultList();
At this point, I'm expecting the resultList.size() == 0, but that's not what's happening. I'm getting the same row returned to me as in query1.
Can someone explain why this is happening or what I'm doing wrong?
Thanks
The solution is to make sure you are using a version of Oracle's JDBC driver that produces a java.sql.Timestamp for native queries that return a DATE field.
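To see why the same row came back: if the driver drops the time-of-day when it maps DATE to java.sql.Date, the bound value becomes midnight of that day, and the comparison effectively turns into something like this (illustrative only, using the placeholder table name from the question):
SELECT oid, updatetimeutc
FROM TABLE
WHERE updatetimeutc > TRUNC(updatetimeutc) -- midnight of the same day, so any row with a time component still matches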