How to fix an error in your SQL syntax - mysql-error-1064

I have checked a lot of posts but still cannot find the answer to an error in one of the tables in my database.
When I run a query in SQL I get the following:
Error
SQL query:
0 SQL = INSERT INTO yurzk_user_usergroup_map( user_id, group_id )
VALUES ( 140 ) , ( 8 )
MySQL said: Documentation
1064 - You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '0 SQL=INSERT INTO yurzk_user_usergroup_map (user_id,group_id) VALUES (140),(8)' at line 1

If user_id is an auto-increment field and 140 and 8 are two different group_id values:
INSERT INTO yurzk_user_usergroup_map (`user_id`, `group_id`) VALUES (NULL, '140'), (NULL, '8')
Or, if 140 is the user_id and 8 the group_id:
INSERT INTO yurzk_user_usergroup_map (`user_id`, `group_id`) VALUES ('140', '8')

The VALUES part shouldn't look like
VALUES ( 140 ) , ( 8 )
instead it should look like
VALUES ( 140 , 8 )

INSERT INTO yurzk_user_usergroup_map (user_id,group_id) VALUES (140,8);
This is the correct syntax; check it out.
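The column-count rule above can be demonstrated with a minimal sketch using SQLite via Python's sqlite3 as a stand-in for MySQL (the table and column names mirror the question; both engines apply the same rule: each parenthesised group in VALUES is one whole row).

```python
import sqlite3

# In-memory database; table mirrors the question.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE yurzk_user_usergroup_map (user_id INTEGER, group_id INTEGER)")

# VALUES (140), (8) means two rows of ONE column each,
# which does not match the two-column target list.
try:
    conn.execute("INSERT INTO yurzk_user_usergroup_map (user_id, group_id) VALUES (140), (8)")
except sqlite3.OperationalError as e:
    print("rejected:", e)

# VALUES (140, 8) is one row supplying both columns, so it succeeds.
conn.execute("INSERT INTO yurzk_user_usergroup_map (user_id, group_id) VALUES (140, 8)")
print(conn.execute("SELECT user_id, group_id FROM yurzk_user_usergroup_map").fetchall())
```

The same shape failure is what MySQL reports as error 1064 in the question.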

Related

Oracle query to concat a column data having two rows into one

I have an Oracle column with the LONG datatype. As the column has a 4000-character limit for our inserts, we tend to split the data across two rows or more. Now we want to display the 2 rows of data as one. I tried, but I get errors such as "char to long" and so on; could you please help?
LOGMESSAGE:
Id  Series  Serialized
1   1       abc
1   1       def
3   2       gdf
LOGENTRY:
Id  Series
1   1
1   1
3   2
Expected output:
Id  Series  Serialized
1   1       abcdef
3   2       gdf
Here are the table scripts:
CREATE TABLE "LOGMESSAGE"
( "ID" VARCHAR2(36 BYTE) NOT NULL ENABLE,
"SERIALIZEDMESSAGE" LONG,
"SERIES" NUMBER(10,0)
)
CREATE TABLE "LOGENTRY"
( "ID" VARCHAR2(36 BYTE) NOT NULL ENABLE
)
insert into "LOGMESSAGE" values(
'd8dcd593-af52-425a-8bf2-d93f78a601c6','{"name":"miogetadapter001.xml","type":"miogetadapterresponse","id":"56e125af-4202-44cb-bf90-58df03776793","time":"2021-10-05T06:27:04.8595987Z","source":"miochannelsource","adapter":"miotestadapter","channel":"miotestchannel","original":"<RESTTestResponse TestTimestamp=\"2021/10/05 11:57:05\">\r\n <Name>miotestadapter</Name>\r\n <IsActive>true</IsActive>\r\n <Description>Added during MIO install for confirmation test. Version=3.0.0.0</Description>\r\n <type>ChannelAdapterConfiguration</type>\r\n <AdapterConfigurationTypeName>FileAdapterConfiguration</AdapterConfigurationTypeName>\r\n <AdapterType>FileAdapter</AdapterType>\r\n <AdapterVersion>1</AdapterVersion>\r\n <AdapterKind>default</AdapterKind>\r\n <ContentFormat>Xml</ContentFormat>\r\n <ChannelSourceName>miochannelsource</ChannelSourceName>\r\n <MessageChannelName>miotestchannel</MessageChannelName>\r\n <MomConnectionName></MomConnectionName>\r\n <OutboundNameFormat>{messagename}</OutboundNameFormat>\r\n <InboundNameFormat>{messagename}</InboundNameFormat>\r\n <DoNotSend>false</DoNotSend>\r\n <DoNotSendOutbound>false</DoNotSendOutbound>\r\n <ShutdownMaxTime>15</ShutdownMaxTime>\r\n <IsXml>true</IsXml>\r\n <WakeUpInterval>-1</WakeUpInterval>\r\n <UserName></UserName>\r\n <Password></Password>\r\n <DomainName></DomainName>\r\n <VerifyUnique>true</VerifyUnique>\r\n <UniqueIncludesTimestamp>true</UniqueIncludesTimestamp>\r\n <UniqueCacheExpiration>0</UniqueCacheExpiration>\r\n <ClearOriginalContents>true</ClearOriginalContents>\r\n <BufferSettings>\r\n <Description></Description>\r\n <Kind>Persistent</Kind>\r\n <ConnectionName></ConnectionName>\r\n <ConnectionString>folder name=%AppData%</ConnectionString>\r\n <Interval>2000</Interval>\r\n <Expiration>-1</Expiration>\r\n <MaxCount>-1</MaxCount>\r\n </BufferSettings>\r\n <InProcessExpiration>120</InProcessExpiration>\r\n <InboundFilters />\r\n <OutboundFilters />\r\n <OutboundFailPlugin>LogAndDiscard</OutboundFailPlugin>\r\n 
<OutboundFailConfiguration>LogAndDiscard</OutboundFailConfiguration>\r\n <OutboundRetryInterval>15000</OutboundRetryInterval>\r\n <OutboundMaxRetries>5</OutboundMaxRetries>\r\n <SendMessageMaxRetries>10</SendMessageMaxRetries>\r\n <SendMessageRetryInterval>10000</SendMessageRetryInterval>\r\n <InboundUri>C:\\Program Files\\Opcenter Connect MOM\\Channel Adapter Host\\inbound</InboundUri>\r\n <OutboundUri>C:\\Program Files\\Opcenter Connect MOM\\Channel Adapter Host\\outbound</OutboundUri>\r\n <ErrorUri></ErrorUri>\r\n <InboundDriveToMap></InboundDriveToMap>\r\n <InboundFilenameFilter>*.*</InboundFilenameFilter>\r\n <OutboundDriveToMap></OutboundDriveToMap>\r\n <MaxReadRetries>10</MaxReadRetries>\r\n <RetryDelay>15000</RetryDelay>\r\n <DeleteInterval>15</DeleteInterval>\r\n <EncodingName>UTF-8</EncodingName>\r\n</RESTTestResponse>","contents":"<RESTTestResponse TestTimestamp=\"2021/10/05 11:57:05\">\r\n <Name>miotestadapter</Name>\r\n <IsActive>true</IsActive>\r\n <Description>Added during MIO install for confirmation test. Version=3.0.0.0</Description>\r\n <type>ChannelAdapterConfiguration</type>\r\n <AdapterConfigurationTypeName>FileAdapterConfiguration</AdapterConfigurationTypeName>\r\n <AdapterType>FileAdapter</AdapterType>\r\n <AdapterVersion>1</AdapterVersion>\r\n <AdapterKind>default</AdapterKind>\r\n <ContentFormat>Xml</ContentFormat>\r\n <ChannelSourceName>miochannelsource</ChannelSourceName>\r\n <MessageChannelName>miotestchannel</MessageChannelName>\r\n <MomConnectionName></MomConnectionName>\r\n <OutboundNameFormat>{messagename}</OutboundNameFormat>\r\n <InboundNameFormat>{messagename}</InboundNameFormat>\r\n <DoNotSend>false</DoNotSend>\r\n <DoNotSendOutbound>false</DoNotSendOutbound>\r\n <ShutdownMaxTime>15</ShutdownMaxTime>\r\n <IsXml>true</IsXml>\r\n <WakeUpInterval>-1</WakeUpInterval>\r\n <UserName></UserName>\r\n <Password></Password>\r\n <DomainName></DomainName>\r\n <VerifyUnique>true</VerifyUniqu',
0)
insert into "LOGMESSAGE" values(
'd8dcd593-af52-425a-8bf2-d93f78a601c6','e>\r\n <UniqueIncludesTimestamp>true</UniqueIncludesTimestamp>\r\n <UniqueCacheExpiration>0</UniqueCacheExpiration>\r\n <ClearOriginalContents>true</ClearOriginalContents>\r\n <BufferSettings>\r\n <Description></Description>\r\n <Kind>Persistent</Kind>\r\n <ConnectionName></ConnectionName>\r\n <ConnectionString>folder name=%AppData%</ConnectionString>\r\n <Interval>2000</Interval>\r\n <Expiration>-1</Expiration>\r\n <MaxCount>-1</MaxCount>\r\n </BufferSettings>\r\n <InProcessExpiration>120</InProcessExpiration>\r\n <InboundFilters />\r\n <OutboundFilters />\r\n <OutboundFailPlugin>LogAndDiscard</OutboundFailPlugin>\r\n <OutboundFailConfiguration>LogAndDiscard</OutboundFailConfiguration>\r\n <OutboundRetryInterval>15000</OutboundRetryInterval>\r\n <OutboundMaxRetries>5</OutboundMaxRetries>\r\n <SendMessageMaxRetries>10</SendMessageMaxRetries>\r\n <SendMessageRetryInterval>10000</SendMessageRetryInterval>\r\n <InboundUri>C:\\Program Files\\Opcenter Connect MOM\\Channel Adapter Host\\inbound</InboundUri>\r\n <OutboundUri>C:\\Program Files\\Opcenter Connect MOM\\Channel Adapter Host\\outbound</OutboundUri>\r\n <ErrorUri></ErrorUri>\r\n <InboundDriveToMap></InboundDriveToMap>\r\n <InboundFilenameFilter>*.*</InboundFilenameFilter>\r\n <OutboundDriveToMap></OutboundDriveToMap>\r\n <MaxReadRetries>10</MaxReadRetries>\r\n <RetryDelay>15000</RetryDelay>\r\n <DeleteInterval>15</DeleteInterval>\r\n <EncodingName>UTF-8</EncodingName>\r\n</RESTTestResponse>","empty":false,"contentsformat":"Xml","contentshash":"64882DB2ED9382F3075C94A0657C3034","hash":"135D1EEE963EC392868AF2AFF7DE5711","priority":1,"outbound":true,"request":false,"response":true,"requestid":"06da6532-bfd0-4a77-9637-b4da6f48b23a","events":[],"attributes":{"filename":"miogetadapter001.xml","restcommand":"/api/channeladapters","httpverb":"GET","querystring":"name=miotestadapter"},"tagdata":"","adaptertagdata":{"properties":"{\r\n \"encoding\": \"UTF-8\",\r\n \"source\": 
\"miochannelsource\",\r\n \"adapter\": \"miotestadapter\",\r\n \"directory\": \"C:\\\\Program Files\\\\Opcenter Connect MOM\\\\Channel Adapter Host\\\\inbound\"\r\n}"},"status":"","correlationid":"","maxretrycount":10,"retrycount":0,"automapped":false,"express":false,"inhibitEvent":false,"eventonly":false,"eventttl":0,"check":{"timestamp":"2021-10-05T06:27:04.8626024Z","stages":[{"name":"Created","machineName":"VM-VDIP49-03","timestamp":"2021-10-05T06:27:04.8626024Z","time":0,"duration":0},{"name":"Adapter Host Added","machineName":"VM-VDIP49-03","timestamp":"2021-10-05T06:27:04.8645993Z","time":1,"duration":1},{"name":"Adapter Host Dispatch","machineName":"VM-VDIP49-03","timestamp":"2021-10-05T06:27:04.869603Z","time":7,"duration":6,"dispatcher":"BrokerBalanced"},{"name":"Broker Received","machineName":"VM-VDIP49-03","timestamp":"2021-10-05T06:27:04.8905926Z","time":27,"duration":20},{"name":"Broker Added","machineName":"VM-VDIP49-03","timestamp":"2021-10-05T06:27:04.8925959Z","time":29,"duration":2},{"name":"Broker Dispatch","machineName":"VM-VDIP49-03","timestamp":"2021-10-05T06:27:04.9285978Z","time":65,"duration":36,"availThreads":24,"dispatcher":"fifowithpredecessors"},{"name":"Broker Added","machineName":"VM-VDIP49-03","timestamp":"2021-10-05T06:27:05.0966056Z","time":234,"duration":169},{"name":"Broker Dispatch","machineName":"VM-VDIP49-03","timestamp":"2021-10-05T06:27:05.1066034Z","time":244,"duration":10,"availThreads":24,"dispatcher":"fifowithpredecessors"},{"name":"Adapter Host Received","machineName":"VM-VDIP49-03","timestamp":"2021-10-05T06:27:05.1216049Z","time":259,"duration":15},{"name":"Adapter Host Send","machineName":"VM-VDIP49-03","timestamp":"2021-10-05T06:27:05.1486105Z","time":286,"duration":27}]}}',
1)
insert into "LOGENTRY"
values ('d8dcd593-af52-425a-8bf2-d93f78a601c6')
**Tried Query**
SELECT M.Id,RTRIM(XMLAGG(XMLELEMENT(E,M.SerializedMessage,',').EXTRACT('//text()') ORDER BY M.SerializedMessage).GetClobVal(),',') AS LIST1,M.Series
from LOGENTRY E INNER JOIN LOGMESSAGE M ON E.Id = M.Id group by M.Id,M.series;
**Error**
ORA-00932: inconsistent datatypes: expected CHAR got LONG
00932. 00000 - "inconsistent datatypes: expected %s got %s"
The LONG datatype has many restrictions and is very difficult to work with in general. There's a reason it's deprecated; you're just giving yourself extra work finding ways around it instead of using CLOBs.
But to answer your question, this is, I think, probably what you wanted:
SELECT listagg(a.linetext, '') within group (order by rownum)
FROM
(
SELECT 'a' linetext from dual
UNION ALL
SELECT 'a' linetext from dual
UNION ALL
SELECT '2' linetext from dual
UNION ALL
SELECT 'b' linetext from dual
) a
Replace the subquery with a query of your table and it will concatenate your results, which you currently get on separate lines.
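The same stitch-the-rows-back-together idea can be sketched with SQLite's group_concat (a portable analogue of Oracle's LISTAGG) via Python's sqlite3; the table and column names mirror the question, and the LONG column is modelled as plain TEXT.

```python
import sqlite3

# Test data shaped like the question: one message split across two rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logmessage (id TEXT, series INTEGER, serializedmessage TEXT)")
conn.executemany(
    "INSERT INTO logmessage VALUES (?, ?, ?)",
    [("1", 1, "abc"), ("1", 1, "def"), ("3", 2, "gdf")],
)

# group_concat with '' as separator concatenates the pieces per (id, series) group.
rows = conn.execute(
    "SELECT id, series, group_concat(serializedmessage, '') "
    "FROM logmessage GROUP BY id, series ORDER BY id"
).fetchall()
print(rows)
```

Note that group_concat, like LISTAGG without an explicit ordering, does not guarantee the order in which pieces are joined; in Oracle you would add a sequence/ordering column and use it in the WITHIN GROUP (ORDER BY ...) clause.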

QUERY Data for array of set of Conditions in SELECT Oracle

I have a query something like this.
Set of columns: { user_id, user_details, date }
I have a set of conditional values like: { 1, AAA, 09-03-2021 }, { 2, BBB, 08-02-2021 }
I am trying to add the conditions to the same SELECT query, as I need to get the data at the same time.
I have tried the query below:
SELECT * FROM USERS WHERE ( user_id , user_details , date ) in ( select 1 , AAA , 09-03-2021 from dual ) ;
The above was working properly. When I use the below, I couldn't fetch data. Can someone please tell me if there is any provision to fetch data for all the sets of conditions?
SELECT * FROM USERS WHERE ( user_id , user_details , date ) in (( select 1 , AAA , 09-03-2021 from dual ) , ( select 2 , BBB , 08-02-2021 from dual ) );
You are using the wrong syntax, even for a single tuple comparison. The two cases should look like this:
where (user_id, user_details, date) in ( (1 , 'AAA' , to_date('09-03-2021')) )
and
where (user_id, user_details, date) in ( (1 , 'AAA' , to_date('09-03-2021')),
(2 , 'BBB' , to_date('08-02-2021')) )
There is no need to select anything from DUAL in the IN lists. The IN condition can also be used when the values on the right-hand side come from a table (in which case you would select three columns from that table, not from DUAL), but that's not what you are using here - here you are using hard-coded values, and the syntax doesn't require SELECT in the IN list.
Note that in your question you use values like 1, AAA, 09-03-2021. I hope you know that's wrong. 1 is fine, it's a number. AAA not enclosed in single quotes is not a string - it will be interpreted as a column name, and an error will be thrown because your table doesn't have a column named AAA. And 09-03-2021 is seen as a simple arithmetic expression involving three numbers, with minus signs (subtractions) between them. If this confuses you, that's a big problem.
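The tuple-IN pattern from the answer can be sketched with SQLite row values via Python's sqlite3 (SQLite wants the right-hand list written as a VALUES clause, whereas Oracle accepts the bare parenthesised list shown above; the table and column names mirror the question, with dates kept as ISO strings for simplicity).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (user_id INTEGER, user_details TEXT, dt TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?, ?)", [
    (1, "AAA", "2021-03-09"),
    (2, "BBB", "2021-02-08"),
    (3, "CCC", "2021-01-01"),
])

# Each row value on the right is one candidate tuple; note the quoted
# strings and proper date literals, as the answer stresses.
rows = conn.execute(
    "SELECT * FROM users WHERE (user_id, user_details, dt) IN "
    "(VALUES (1, 'AAA', '2021-03-09'), (2, 'BBB', '2021-02-08'))"
).fetchall()
print(rows)  # the first two inserted rows match; (3, 'CCC', ...) does not
```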

Insert issue while trying with not exist operator in oracle

I am trying to insert values if a particular column value does not exist in the table.
I have tried with a subquery in the WHERE clause:
INSERT
INTO ANIMALDATA VALUES
(
( SELECT MAX(first)+1 FROM ANIMALDATA
)
,
'Animals',
'Lion',
10,
'',
'13-06-2019',
'STOP'
)
where not exists
(select NAMES from ANIMALDATA where NAMES='Lion');
If the lion does not exist, then the INSERT statement should run.
Give me an idea of what I am missing, as I am a beginner with Oracle queries. Help me to proceed further; thanks in advance.
Since you have a condition, I think you need to do an INSERT INTO...SELECT:
(UPDATE: the CREATE TABLE statement is there to provide simple test data. It is not part of the solution).
create table animaldata(first, kingdom, names, num, nl, dte, s) as
select 1, 'Animals', 'Tiger', 11, 'a', '13-06-2019', 'STOP' from dual;
INSERT
INTO ANIMALDATA select
( SELECT MAX(first)+1 FROM ANIMALDATA
)
,
'Animals',
'Lion',
10,
'',
'13-06-2019',
'STOP'
from dual
where not exists
(select NAMES from ANIMALDATA where NAMES='Lion');
Best regards,
Stew Ashton
Please try the query below (note: Oracle does not allow an alias on the INSERT target, and a correlated condition like a.NAMES = b.NAMES has no valid scope here, so the NOT EXISTS condition only needs the literal). Thanks,
INSERT
INTO ANIMALDATA
select
( SELECT MAX(first)+1 FROM ANIMALDATA
)
,
'Animals',
'Lion',
10,
'',
'13-06-2019',
'STOP'
from dual
where not exists
(select 1 from ANIMALDATA b where b.NAMES='Lion');
First off, don't use max(<value>) + 1 to come up with new values for a column - that does not play well with concurrent sessions.
Instead, you should create a sequence and use that in your inserts.
Next, if you are trying to do an upsert (update the row if it exists or insert if it doesn't), you could use a MERGE statement. In this case, you're trying to insert a row if it doesn't already exist, so you don't need the update part.
Therefore you should be doing something like:
CREATE SEQUENCE animaldata_seq
START WITH <find MAX VALUE OF animaldata.first>
INCREMENT BY 1
MAXVALUE 9999999999999999
CACHE 20
NOCYCLE;
MERGE INTO animaldata tgt
USING (SELECT 'Animals' category,
'Lion' animal,
10 num_animals,
NULL unknown_col,
TRUNC(SYSDATE) date_added,
'STOP' action
FROM dual) src
ON (tgt.animal = src.animal)
WHEN NOT MATCHED THEN
INSERT (<list of animaldata columns>)
VALUES (animaldata_seq.nextval,
src.category,
src.animal,
src.num_animals,
src.unknown_col,
src.date_added,
src.action);
Note that I have tried to specify the columns being inserted into - that's good practice! Insert statements that don't list the target columns are prone to errors should someone add a column to the table.
I have also assumed that the column you're adding the date into is of DATE datatype; I have used SYSDATE (truncated to remove the time part) as the value to insert, but you may wish to use a specific date, in which case you should use to_date(<string date>, '<string date format>')
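The insert-if-absent part of the pattern (without the sequence, which SQLite lacks) can be sketched with Python's sqlite3; the table is cut down to three illustrative columns, and running the statement twice shows that the second run is a no-op.

```python
import sqlite3

# Cut-down version of the question's table with seed data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE animaldata (first INTEGER, kingdom TEXT, names TEXT)")
conn.execute("INSERT INTO animaldata VALUES (1, 'Animals', 'Tiger')")

# INSERT ... SELECT ... WHERE NOT EXISTS: the SELECT produces either one
# row (Lion absent) or zero rows (Lion present). Oracle would need FROM dual.
insert_if_absent = """
INSERT INTO animaldata
SELECT (SELECT MAX(first) + 1 FROM animaldata), 'Animals', 'Lion'
WHERE NOT EXISTS (SELECT 1 FROM animaldata WHERE names = 'Lion')
"""
conn.execute(insert_if_absent)   # inserts the Lion row
conn.execute(insert_if_absent)   # no-op: Lion already exists
print(conn.execute("SELECT COUNT(*) FROM animaldata WHERE names = 'Lion'").fetchone())  # (1,)
```

As the answer notes, MAX(first) + 1 is used here only to match the question; a sequence is the safe choice under concurrency.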

H2 Database Command Centre: CSVREAD skips loading the first(!) CSV line of data

Folks,
H2 skips/drops the FIRST line of the following CSV dataset,
and I couldn't find a solution or workaround.
I have already looked through the various H2 tutorials and of course skimmed the internet.
Am I the only one (a newbie - my "home" is the IBM mainframe)
who has such a problem inserting into an H2 database using CSVREAD?
In this example I expected the CSVREAD utility to insert 5 (five!) lines
into the created table "VL01T098".
!!! There is no column-header line in the CSV dataset - I get the data this way only !!!
AJ52B1;999;2013-01-04;2014-03-01;03Z;A
AJ52C1;777;2012-09-03;2012-08-19;03Z;
AJ52B1;;2013-01-04;2014-03-01;;X
AJ52B1;321;2014-05-12;;03Z;Y
AJ52B1;999;;2014-03-01;03Z;Z
And here is my SQL (from the H2 job output):
DROP TABLE IF EXISTS VL01T098;
Update count: 0
(0 ms)
CREATE TABLE VL01T098 (
MODELL CHAR(6)
, FZG_STAT CHAR(3)
, ABGABE_DATUM DATE
, VERSAND_DATUM DATE
, FZG_GRUPPE CHAR(3)
, AV_KZ CHAR(1))
AS SELECT * FROM
CSVREAD
('D:\VL01D_Test\LOAD-csv\T098.csv',
null,
'charset=UTF-8 fieldSeparator=; lineComment=#');
COMMIT;
select count(*) from VL01T098;
select * from VL01T098;
MODELL FZG_STAT ABGABE_DATUM VERSAND_DATUM FZG_GRUPPE AV_KZ
AJ52C1 777 2012-09-03 2012-08-19 03Z null
AJ52B1 null 2013-01-04 2014-03-01 null X
AJ52B1 321 2014-05-12 null 03Z Y
AJ52B1 999 null 2014-03-01 03Z Z
(4 rows, 0 ms)
Where has just the first CSV line gone, and why was it lost?
Could you please help an H2 newbie with some IBM DB2 experience?
Many thanks in advance,
Achim
You didn't specify a column list in the CSVREAD call. That means the column list is read from the file, as documented:
"If the column names are specified (a list of column names separated with the fieldSeparator), those are used, otherwise (or if they are set to NULL) the first line of the file is interpreted as the column names."
Pass the column names as the second argument to CSVREAD and the first data line will no longer be consumed as a header.
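The mechanism is easy to reproduce outside H2: any CSV reader that defaults to treating the first row as a header will silently eat one data row from a headerless file. A small sketch with Python's csv module (illustrative column names only):

```python
import csv
import io

# Two headerless data lines, semicolon-separated like the question's file.
data = "AJ52B1;999;2013-01-04\nAJ52C1;777;2012-09-03\n"

# No fieldnames given: the first line is interpreted as column names
# (this is what H2's CSVREAD does when the column list is NULL).
rows_default = list(csv.DictReader(io.StringIO(data), delimiter=";"))
print(len(rows_default))  # 1 -- the first data line was consumed as a header

# Supplying the column names explicitly keeps every data line
# (the analogue of passing a column list to CSVREAD).
rows_named = list(csv.DictReader(
    io.StringIO(data),
    fieldnames=["MODELL", "FZG_STAT", "ABGABE_DATUM"],
    delimiter=";",
))
print(len(rows_named))  # 2
```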

Update Query issue for multiple rows

I have the update query below, which sets some values to control the data flow. But I am getting the error "too many values" from the condition (subquery) when I execute it.
UPDATE MTB ----- TABLE NAME
SET MTB_EXTR_FLAG='N',
MTB_ALOC_PROCESS='DC1'
WHERE MTB_I IN --- PRIMARY KEY
(
SELECT * FROM
(
SELECT MTB_I ,ROW_NUMBER() OVER (ORDER BY ROWID) AS RN
FROM MTB
)
WHERE RN BETWEEN 100 AND 500
)
My intention here is to select a different set of data for each run of one job.
I want to set MTB_EXTR_FLAG='N', MTB_ALOC_PROCESS='DC1' each time before running the job, with a different set of data.
Can someone please help me resolve the error or propose a different query?
Thank you.
I think this is just a matter of the number of columns not matching (2 - MTB_I and RN - instead of 1 - MTB_I):
UPDATE MTB
SET MTB_EXTR_FLAG='N',
MTB_ALOC_PROCESS='DC1'
WHERE MTB_I IN --- PRIMARY KEY
(
SELECT MTB_I FROM -- Else RN will be taken !!
(
SELECT MTB_I ,ROW_NUMBER() OVER (ORDER BY ROWID) AS RN
FROM MTB
)
WHERE RN BETWEEN 100 AND 500
)
You can't use WHERE x IN (...) with a subquery that returns more columns than expected.
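The corrected pattern can be sketched with Python's sqlite3 (SQLite's rowid plays the role of Oracle's ROWID here; table and column names are shortened from the question). The key point is that the outer subquery projects only the key column, even though the inner one also computes RN.

```python
import sqlite3

# Ten rows, all flagged 'Y' to start.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mtb (mtb_i INTEGER PRIMARY KEY, mtb_extr_flag TEXT)")
conn.executemany("INSERT INTO mtb VALUES (?, 'Y')", [(i,) for i in range(1, 11)])

# Number the rows, then update only the slice rn 3..5; the IN subquery
# returns exactly one column (mtb_i), so the comparison is well-formed.
conn.execute("""
UPDATE mtb SET mtb_extr_flag = 'N'
WHERE mtb_i IN (
    SELECT mtb_i FROM (
        SELECT mtb_i, ROW_NUMBER() OVER (ORDER BY rowid) AS rn FROM mtb
    )
    WHERE rn BETWEEN 3 AND 5
)
""")
print(conn.execute("SELECT COUNT(*) FROM mtb WHERE mtb_extr_flag = 'N'").fetchone())  # (3,)
```

Changing the BETWEEN bounds per run selects a different batch, which is the job-slicing behaviour the question describes.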
