PL/SQL: turn a CLOB containing JSON into a table - Oracle

Is there a way to turn a CLOB containing a JSON object into a table?
For example, I have a CLOB containing [{"a":"1","b":"1"; "a":"2", "b":"2"; "a":"2","b":"2"}]
I want to turn this into a table so I can join it with other tables in my database.
Is there a way to do it?
Thank you!

Your JSON is definitely not well formatted. However, once that is cleaned up, you can use JSON_TABLE to achieve your goals:
WITH test_data (json) AS
(
    SELECT '{"rows":[{"a":"1","b":"1"},{"a":"2", "b":"2"},{"a":"2","b":"2"}]}' FROM DUAL
)
SELECT jt.*
  FROM test_data td,
       JSON_TABLE(td.json,
                  '$.rows[*]'
                  COLUMNS (row_number FOR ORDINALITY,
                           a INTEGER PATH '$.a',
                           b INTEGER PATH '$.b')) jt
Produces the following results:
row_number | a | b
-----------+---+---
         1 | 1 | 1
         2 | 2 | 2
         3 | 2 | 2
Here is a DBFiddle showing how this works (Link)
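Once JSON_TABLE exposes the values as rows, they can be joined like any other row source. A minimal sketch, assuming Oracle 12c or later; the table other_table and its column a are hypothetical names used only for illustration:

```sql
-- Join the JSON rows to an ordinary table on column "a".
-- "other_table" is a hypothetical table, assumed for this example.
SELECT ot.*, jt.b
  FROM other_table ot
  JOIN JSON_TABLE('{"rows":[{"a":"1","b":"1"},{"a":"2","b":"2"}]}',
                  '$.rows[*]'
                  COLUMNS (a INTEGER PATH '$.a',
                           b INTEGER PATH '$.b')) jt
    ON ot.a = jt.a
```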

See if this can help:
PLSQL looping through JSON object
It answers more or less what you are asking, although I'm not sure it can handle not knowing the column names, figuring them out, and creating a table from them.
Otherwise you could probably do some REGEXP parsing to figure out the distinct column names first, then either go through it with the JSON package, or just loop through it manually.

Related

Error: In ARRAY_LITERAL, an Array Literal was missing values for one or more rows

I am trying to combine 3 tables, each on a separate Google Sheet, and each table has the same 3 columns and has its header in row 1. Other than the actual data, the tables are all exactly the same. But I am getting the error:
'In ARRAY_LITERAL, an Array Literal was missing values for one or more rows'
I have browsed other posts with this same error, and they are all about situations where the tables did not have the same number of columns. But my tables DO have the same number of columns.
Here is my formula:
=query({IMPORTRANGE("1673t-7ZEzpZenbeK_HjCYyo824th4YE8Vt9YAn1J0nU","Sheet1!A1:C4");IMPORTRANGE("1J5VRivxfxToObBikLAKfNDNX4ibxvIWphcjohw0B8Ms","Sheet1!A2:C4");IMPORTRANGE("1DRSAyjDN7v6UNjFiBtMvrfzIewV8_iE-GjrGkNBPwIU","Sheet1!A2:C4")},"Select Col1, Col2, Col3",1)
Since I want headers to appear in my query result, I am including row 1 from the first sheet. I therefore reference range A1:C4 for that table but then only reference range A2:C4 for the next 2 tables.
Any input on why I am getting this error? Thanks
Try each IMPORTRANGE separately first and grant authorization for each spreadsheet; the error can come from a lack of coupling (authorization) between the spreadsheets.
Try this:
=QUERY({IMPORTRANGE("1673t-7ZEzpZenbeK_HjCYyo824th4YE8Vt9YAn1J0nU","Sheet1!A1:C1");IMPORTRANGE("1673t-7ZEzpZenbeK_HjCYyo824th4YE8Vt9YAn1J0nU","Sheet1!A2:C4");IMPORTRANGE("1J5VRivxfxToObBikLAKfNDNX4ibxvIWphcjohw0B8Ms","Sheet1!A2:C4");IMPORTRANGE("1DRSAyjDN7v6UNjFiBtMvrfzIewV8_iE-GjrGkNBPwIU","Sheet1!A2:C4")},"Select * where Col1 is not null ")
Explanation
IMPORTRANGE the header row first, then the other tables.
Since you want all columns, set the query to "Select * where Col1 is not null".

Iterative search of a Teradata clob

We have an accountnumber stored in a Clob field in a table...we'll call it tbl_accountdetail. I need to pull back all records from tbl_accountdetail if the account numbers are in the results from another query...we'll call it sourcequery.
I can do this individually for each account number with:
Select * from Tbl_accountdetail where REGEXP_INSTR(CLOB,'accountnumber')>0
Naturally, my first thought was to do a cursor and loop through each account number from the sourcequery.
Declare @accountnumber varchar(30);
Declare Err_Cursor Cursor For
    Select accountnumber From ErrorTable;
Open Err_Cursor;
Fetch Next From Err_Cursor Into @accountnumber;
While @@Fetch_Status = 0
Begin
    Select * From Tbl_accountdetail Where REGEXP_INSTR(CLOB, @accountnumber) > 0;
    Fetch Next From Err_Cursor Into @accountnumber;
End;
Close Err_Cursor;
Deallocate Err_Cursor;
The more I read the more I'm confused about the best/most efficient way to get my desired results.
References to cursors all seem to require them to live in a stored procedure, and given the simplicity here, you wouldn't think this needs a stored procedure. References to macros all seem to be macros that update/insert, etc., which I don't need. All I need is to return the rows from Tbl_accountdetail that have the accountnumber somewhere in the CLOB.
I'm new to Teradata and Clobs. Can someone help me with the best way to search the clob? And to do so for a list of values?
Any help/suggestions greatly appreciated.
How is your CLOB data structured? Is the accountnumber field stored in a way that you can extract it using a searchable pattern -- i.e. accountnumber=<10-digit-#>?
If you want to search for multiple accountnumber values, then I think the best way is to extract the accountnumber value(s) from each CLOB and then search on them. Something like:
SELECT *
  FROM Tbl_accountdetail
 WHERE <extracted_accountnumber> IN (
       SELECT account_number
         FROM account_table
       )
You are right that cursors are only used in stored procedures. The main purpose for them is to process each row of a result set individually and perform any additional logic, which you don't seem to need in your case. You could either put the SQL into a macro or just run it as-is.
Update
Assuming you only have one accountnum value in your CLOB field with a format of "accountnum": "123456789", you should be able to do something like this:
SELECT *
  FROM Tbl_accountdetail
 WHERE REGEXP_SUBSTR(myclob, '"accountnum":\s+"([0-9]+)"', 1, 1, NULL) IN (
       SELECT account_number
         FROM account_table
       )
This should extract the first accountnumber match in your CLOB field and see if that value also exists in the IN sub-query.
I don't have a TD system to test on, so you may need to fiddle with the arguments a bit. Just replace myclob with the name of your CLOB field and update the sub-query in the IN(). Give that a try and let me know.
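If this search needs to run repeatedly, the query could also be wrapped in a Teradata macro rather than a stored procedure. A sketch under the same assumptions (the macro name, and the myclob and account_table names, are hypothetical):

```sql
-- Hypothetical macro wrapping the CLOB search; EXEC runs it on demand.
CREATE MACRO find_account_details AS (
    SELECT *
    FROM Tbl_accountdetail
    WHERE REGEXP_SUBSTR(myclob, '"accountnum":\s+"([0-9]+)"', 1, 1, NULL) IN (
        SELECT account_number
        FROM account_table
    );
);

EXEC find_account_details;
```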
SQL Fiddle (Oracle)
Regexp Tester
Teradata - REGEXP_SUBSTR

How MAX of a concatenated column in oracle works?

In Oracle, while trying to concatenate two columns that are both of NUMBER type and then take the MAX of the result, I have a question.
Columns A and B are both of NUMBER data type.
Select MAX(A||B) from table
Table data:
A        B
20150501 95906
20150501 161938
When I run the query Select MAX(A||B) from table
O/P - 2015050195906
Ideally 20150501161938 should be the output????
When I format column B like TO_CHAR(B,'FM000000') and execute, I get the expected output:
Select MAX(A || TO_CHAR(B,'FM000000')) FROM table
O/P - 20150501161938
Why is 2015050195906 considered as MAX in the first case?
Presumably, column A is a date and column B is a time.
If that's true, treat them as such:
select max(to_date(to_char(a)||to_char(b,'FM000000'),'YYYYMMDDHH24MISS')) from your_table;
That will add leading zeros to the time component (if necessary), then concatenate the columns into a string, which is passed to the to_date function; the max function will then treat the value as a DATE datatype, which is presumably what you want.
PS: The real solution here is to fix your data model. Don't store dates and times as numbers. In addition to sorting issues like this, the optimizer can get confused. (If you store a date as a number, how can the optimizer know that '20141231' will immediately be followed by '20150101'?)
You should convert to number;
select MAX(TO_NUMBER(A||B)) from table
Concatenation results in a character/text output. As such, it sorts alphabetically, so '9' appears after '16'.
In the second case, you are specifying a format to pad the number to six digits. That works, because 095906 will now sort before 161938.
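The character-vs-numeric ordering can be demonstrated directly; a minimal sketch against DUAL:

```sql
-- As strings, comparison is character by character, so '9...' sorts after '1...';
-- as numbers, 161938 is the larger value.
SELECT CASE WHEN '95906' > '161938' THEN 'string: 95906 sorts higher' END AS string_cmp,
       CASE WHEN 161938 > 95906 THEN 'number: 161938 is greater' END AS number_cmp
  FROM dual;
```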

Executing triggers in Oracle for copying the old values to a Mirror table

We are trying to copy the current row of a table to a mirror table using a trigger before delete/update. Below is the working trigger body:
BEFORE UPDATE OR DELETE
ON CurrentTable FOR EACH ROW
BEGIN
    INSERT INTO MirrorTable
        ( EMPFIRSTNAME,
          EMPLASTNAME,
          CELLNO,
          SALARY )
    VALUES
        ( :old.EMPFIRSTNAME,
          :old.EMPLASTNAME,
          :old.CELLNO,
          :old.SALARY );
END;
But the problem is we have more than 50 columns in the current table and don't want to mention all those column names. Is there a way to select all columns, like
:old.*
SELECT * INTO MirrorTable FROM CurrentTable
Any suggestions would be helpful.
Thanks,
Realistically, no. You'll need to list all the columns.
You could, of course, dynamically generate the trigger code pulling the column names from DBA_TAB_COLUMNS. But that is going to be dramatically more work than simply typing in 50 column names.
If your table happens to be an object table, :new would be an instance of that object so you could insert that. But it would be rather rare to have an object table.
If your 'current' and 'mirror' tables have EXACTLY the same structure you may be able to use something like
INSERT INTO MirrorTable
SELECT *
  FROM CurrentTable
 WHERE CurrentTable.primary_key_column = :old.primary_key_column
Honestly, I think that this is a poor choice and wouldn't do it, but it's a more-or-less free world and you're free (more or less :-) to make your own choices.
Share and enjoy.
For what it's worth, I've been writing the same stuff and used this to generate the code:
SQL> set pagesize 0
SQL> select ':old.'||COLUMN_NAME||',' from all_tab_columns where table_name='BIGTABLE' and owner='BOB';
:old.COL1,
:old.COL2,
:old.COL3,
:old.COL4,
:old.COL5,
...
If you feed all columns, no need to mention them twice (and you may use NULL for empty columns):
INSERT INTO bigtable VALUES (
:old.COL1,
:old.COL2,
:old.COL3,
:old.COL4,
:old.COL5,
NULL,
NULL);
people writing tables with that many columns should have no desserts ;-)

Good way to deal with comma separated values in oracle

I am getting comma-separated values passed to a stored procedure in Oracle. I want to treat these values as a table so that I can use them in a query like:
select * from tabl_a where column_b in (<csv values passed in>)
What is the best way to do this in 11g?
Right now we are looping through the values one by one and inserting them into a GTT (global temporary table), which I think is inefficient.
Any pointers?
This solves exactly the same problem:
Ask Tom
Oracle does not come with a built-in tokenizer. But it is possible to roll your own using SQL types and PL/SQL. I have posted a sample solution in this other SO thread.
That would enable a solution like this:
select * from tabl_a
where column_b in ( select *
from table (str_to_number_tokens (<csv values passed in>)))
/
In 11g you can use the "occurrence" parameter of REGEXP_SUBSTR to select the values directly in SQL:
select regexp_substr(<csv values passed in>,'[^,]+',1,level) val
from dual
connect by level < regexp_count(<csv values passed in>,',')+2;
But since REGEXP_SUBSTR is somewhat expensive, I am not sure it is the fastest approach.
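With literal values plugged in, the split can be sanity-checked on DUAL ('10,20,30' stands in for the passed-in CSV):

```sql
-- Each level of the CONNECT BY picks the next comma-delimited token.
select regexp_substr('10,20,30', '[^,]+', 1, level) as val
  from dual
connect by level < regexp_count('10,20,30', ',') + 2;
-- returns three rows: 10, 20, 30
```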
