Using Where condition with Upsert Query in Gorm GoLang

I am trying to use a WHERE condition with an upsert query in GORM (Go), but the upsert query is not being formed correctly: it just updates the values without the WHERE condition.
Here is my code -
onConflict := clause.OnConflict{
	Where:     clause.Where{Exprs: []clause.Expression{clause.Lte{Column: "timestamp", Value: time.Now()}}},
	DoUpdates: clause.AssignmentColumns([]string{"first_name", "last_name", "timestamp", "updated_at"}),
}
insert := gormSQLDB.Clauses(onConflict).Create(&Users)
The MySQL query formed from the above code is:
INSERT INTO `users` (`first_name`,`last_name`,`timestamp`,`updated_at`,`id`) VALUES ('Ram','Kumar','2022-05-03 03:59:16','2022-06-07 14:45:22.631','5befa85e-e642-11ec-89a5-acde48001122') ON DUPLICATE KEY UPDATE `first_name`=VALUES(`first_name`),`last_name`=VALUES(`last_name`),`timestamp`=VALUES(`timestamp`),`updated_at`=VALUES(`updated_at`)
My requirement is to update the row with the same id, with a WHERE condition that its timestamp must be lower than the current timestamp. But the WHERE clause is not being applied. Can anybody tell me what is wrong in my code?
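For reference, MySQL's INSERT ... ON DUPLICATE KEY UPDATE syntax has no WHERE clause at all, which is presumably why the condition is dropped here. A conditional upsert of this shape is expressible on dialects with ON CONFLICT; a sketch of what the same statement would look like on PostgreSQL (same columns as above, not a drop-in fix for MySQL):

INSERT INTO users (first_name, last_name, timestamp, updated_at, id)
VALUES ('Ram', 'Kumar', '2022-05-03 03:59:16', '2022-06-07 14:45:22.631', '5befa85e-e642-11ec-89a5-acde48001122')
ON CONFLICT (id) DO UPDATE
SET first_name = excluded.first_name,
    last_name  = excluded.last_name,
    timestamp  = excluded.timestamp,
    updated_at = excluded.updated_at
WHERE users.timestamp <= now();  -- the clause.Lte condition would be rendered here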

Related

Delete element from jsonb array in CockroachDB

I have a field with jsonb tags: [{"value": "tag1"}]
I need to do something like update table1 set tags = tags - '{"value": "tag1"}', but this doesn't work.
What query should I execute to delete an element from the array?
Assuming your table looks like
CREATE TABLE public.hasjsonb (
    id INT8 NOT NULL,
    hash JSONB NULL,
    CONSTRAINT hasjsonb_pkey PRIMARY KEY (id ASC)
)
you can do this with the following statement:
INSERT INTO hasjsonb(id, hash)
(SELECT id, array_to_json(array_remove(array_agg(json_array_elements(hash->'tags')), '{"value": "tag1"}'))
 FROM hasjsonb
 GROUP BY id
)
ON CONFLICT(id) DO UPDATE SET hash = jsonb_set(hasjsonb.hash, array['tags'], excluded.hash);
The actual json operation here is straightforward, if long-winded. We're nesting the following functions:
hash->'tags' -- extract the json value for the "tags" key
json_array_elements -- treat the elements of this json array like rows in a table
array_agg -- just kidding, treat them like a regular SQL array
array_remove -- remove the problematic tag
array_to_json -- convert it back to a json array
What's tricky is that json_array_elements isn't allowed in the SET part of an UPDATE statement, so we can't just do SET hash = jsonb_set(hash, array['tags'], <that function chain>). Instead, my solution uses it in a SELECT statement, where it is allowed, then inserts the result of the select back into the table. Every attempted insert will hit the ON CONFLICT clause, so we get to do that UPDATE SET using the already-computed json array.
Another approach here could be to use string manipulation, but that's fragile as you need to worry about commas appearing inside objects nested in your json.
You can use json_remove_path to remove the element if you know its index statically, by passing it as an integer.
Otherwise, we can do a simpler subquery to filter the array elements and then json_agg to build a new array.
create table t (tags jsonb);
insert into t values ('[{"value": "tag2"}, {"value": "tag1"}]');
Then we can remove the tag which has {"value": "tag1"} like:
UPDATE t
SET tags = (
    SELECT json_agg(tag)
    FROM (
        SELECT *
        FROM ROWS FROM (json_array_elements(tags)) AS d (tag)
    )
    WHERE tag != '{"value": "tag1"}'
);
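As a quick check against the seed row above, reading the column back should leave only the other element:

SELECT tags FROM t;
-- expected: [{"value": "tag2"}]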

How to create an update query with TypeORM and an Oracle JSON column?

I have an Oracle table with a JSON column that I want to update, and I am using TypeORM with JavaScript. I need to access the JSON column in the WHERE clause. Following is the raw SQL query and what I am attempting with TypeORM:
The query updates the date to the current date where the key's value (inside the JSON column) is 123.
entityManager.query(`UPDATE TABLE_NAME T
  SET DATE = CURRENT_DATE
  WHERE T.JSON_COLUMN.key = '123'`)
The query with createQueryBuilder:
tableRepository.createQueryBuilder()
  .update('TABLE_NAME')
  .set({ DATE: '2021-07-23 10:07:10' })
  .where('JSON_COLUMN.key = :key', { key: '123' })
  .execute();
I am not sure how to access the JSON column's key in the WHERE clause. Ideally, I would use the dot operator in SQL to access the JSON column's key-value pairs, like so:
JSON_Column_Name.key = value, but I cannot find a way to implement it with Oracle.
Any help would be appreciated.
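One avenue worth trying (an assumption on my part, not verified against TypeORM's Oracle driver): Oracle's JSON_VALUE function, which extracts a scalar at a JSON path and works in an ordinary WHERE clause. Using the table and column names from the question:

-- JSON_VALUE is standard Oracle SQL (12c and later)
UPDATE TABLE_NAME T
SET DATE = CURRENT_DATE
WHERE JSON_VALUE(T.JSON_COLUMN, '$.key') = '123';

With the query builder, the same expression would presumably be passed as a raw string, e.g. .where("JSON_VALUE(JSON_COLUMN, '$.key') = :key", {key: '123'}).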

How to get NULL records from an Oracle database if the column is NUMBER type and nullable

I have an Oracle database table with three columns, i.e. Id, RTOName, VehicleCode.
RTOName is of VARCHAR2 type, and VehicleCode is NUMBER(2,0) and is a nullable field.
Some records have a VehicleCode and some have NULL, and I want to fetch both kinds. The table design is already done and changing it would impact a lot in my application. I have a JPA native query that I use like this, and I want it to also fetch the records with NULL values.
Query query = createNativeQuery("select RTOName,VehicleCode from tbl_vehiclecodes WHERE VehicleCode=#vCode");
query.setParameter("vCode", vehicleCode);
From the above query I will get only non-null-valued records. E.g. for vCode parameter 61 I will get
Marathalli,61. If my vCode is null I have a problem and I won't get any record.
How do I achieve this in a native query?
I know that we can use IS NULL in the WHERE clause, but since I have numbers here in my case, how do I solve this? Any help is appreciated.
Thanks
We can use OR here.
The following query will give you the records matching the vCode parameter along with the rows having NULL; in case vCode is null, you get only the records with NULL values.
Query query = createNativeQuery("select RTOName,VehicleCode from tbl_vehiclecodes WHERE (VehicleCode is null or VehicleCode=#vCode)");
Edit: considering the doubts from @Ranagal.
If, in case a null value is passed for vCode, you want all the records having a value in VehicleCode and also those with NULL, then we need to change the query like this:
Query query = createNativeQuery("select RTOName,VehicleCode from tbl_vehiclecodes WHERE (VehicleCode is null or VehicleCode=coalesce(#vCode,VehicleCode))");
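A walk-through of how the COALESCE form behaves (the query is the same as above, only annotated):

-- vCode = 61:   VehicleCode IS NULL matches the NULL rows, and
--               VehicleCode = COALESCE(61, VehicleCode) matches the rows equal to 61
-- vCode = NULL: VehicleCode = COALESCE(NULL, VehicleCode) reduces to
--               VehicleCode = VehicleCode, i.e. every non-NULL row,
--               and VehicleCode IS NULL picks up the rest -> all rows
SELECT RTOName, VehicleCode
FROM tbl_vehiclecodes
WHERE (VehicleCode IS NULL OR VehicleCode = COALESCE(#vCode, VehicleCode));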

NiFi throwing 'None of the fields in the record map to the columns defined by [table name]'

I am trying to execute a SQL query on an Oracle database and insert the result into another table. For my trial I am just performing a simple query such as
SELECT 1 AS count
FROM dual
and trying to insert that into a single-column table whose column is named COUNT.
The content of the record in NiFi seems to be as follows:
[
  {
    "COUNT" : "1"
  }
]
but the logs keep throwing the error:
due to java.sql.SQLDataException:
None of the fields in the record map to the columns defined by
the schema_name.table_name table:
Any ideas?
I believe you get that same error message if your table name doesn't match. The Translate Field Names property only translates the fields (columns), not the table name. Try specifying the schema/table in uppercase to match what Oracle is expecting.
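For context on the uppercase advice (standard Oracle behavior, not NiFi-specific): identifiers created without quotes are folded to uppercase in the data dictionary, so tools doing an exact-match metadata lookup must use the uppercase name:

-- created unquoted, the name is stored as MY_COUNTS
CREATE TABLE my_counts (cnt NUMBER);
-- metadata lookups are exact and case-sensitive
SELECT table_name FROM user_tables WHERE table_name = 'MY_COUNTS';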

Oracle Update and Return a Value

I have an UPDATE statement on a large-volume table.
It updates only one row at a time.
Update MyTable
Set Col1 = Value
where primary key filters
When this update statement gets executed, I also want a value back in return, to avoid a SELECT query on the same table and save resources.
What will be my syntax to achieve this?
You can use the RETURNING keyword.
Update MyTable
Set Col1 = Value
where primary key filters
returning column1,column2...
into variable1,variable2...
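A minimal runnable sketch in PL/SQL, assuming a hypothetical table with an id primary key, a col1 to update, and a col2 to read back:

-- run with SET SERVEROUTPUT ON in SQL*Plus/SQLcl to see the output
DECLARE
  v_col2 my_table.col2%TYPE;    -- hypothetical table/column names
BEGIN
  UPDATE my_table
     SET col1 = 'Value'
   WHERE id = 42                -- the primary-key filter
  RETURNING col2 INTO v_col2;   -- fetch the value without a separate SELECT
  DBMS_OUTPUT.PUT_LINE(v_col2);
END;
/

This avoids the round trip of a follow-up SELECT, which is exactly what RETURNING ... INTO is for.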
