Inserting values into a created table with a while loop - Oracle

I'm working on a WHILE loop exercise in Oracle. I have created a table with two columns.
What I want to do is insert values into the first column as a sequence from 1 to 1 million (1, 2, 3, 4, 5, ..., 1000000).
I've tried:

DECLARE
    a int := 0;
BEGIN
    WHILE a < 1000000 LOOP
        a := a + 1;
    END LOOP;
END;

insert into Schema_name.table_name
(column_1)
values ('a')
P.S.: I'm working in Toad 12.9.
Could you give me a hand with this?

Just insert values(a); when you write 'a' you insert the character 'a', not the variable a:
DECLARE
    a int := 0;
BEGIN
    WHILE a < 1000000 LOOP
        a := a + 1;
        insert into Schema_name.table_name (column_1)
        values (a);
    END LOOP;
END;
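As a side note, a million single-row inserts in a PL/SQL loop will be slow. If the goal is just to fill the column with 1..1000000, a single set-based INSERT does the same job in one statement (a sketch against the same table):

INSERT INTO Schema_name.table_name (column_1)
SELECT LEVEL
FROM dual
CONNECT BY LEVEL <= 1000000;
COMMIT;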

Related

Oracle: Update Every Row in a Table based off an Array

So I'm trying to create some seed data for a database that uses zip codes. I've created an array of 22 arbitrary zip code strings, and I'm trying to loop through the array and assign one of the zips to every row in a table. Based on what I read and tried (I'm a 1st year, so I'm probably missing something), this should work, and it does when I just output the array value based on the count of the table. The issue is in the rowid subquery. When I run it in my console, it doesn't throw any errors, but it never completes, and I think it's stuck in an infinite loop. How can I adjust this so that it updates the field and doesn't get stuck?
declare
    t_count NUMBER;
    TYPE zips IS VARRAY(22) OF CHAR(5);
    set_of_zips zips;
    i NUMBER;
    j NUMBER := 1;
BEGIN
    SELECT count(*) INTO t_count FROM T_DATA;
    set_of_zips := zips('72550', '71601', '85920', '85135', '95451', '90021',
                        '99611', '99928', '35213', '60475', '80451', '80023',
                        '59330', '62226', '27127', '28006', '66515', '27620',
                        '66527', '15438', '32601', '00000');
    FOR i IN 1 .. t_count LOOP
        UPDATE T_DATA
        SET T_ZIP = set_of_zips(j)
        ---
        WHERE rowid IN (
            SELECT ri FROM (
                SELECT rowid AS ri
                FROM T_DATA
                ORDER BY T_ZIP
            )
        ) = i;
        ---
        j := j + 1;
        IF j > 22 THEN
            j := 1;
        END IF;
    END LOOP;
    COMMIT;
end;
You don't need PL/SQL for this.
UPDATE t_data
   SET t_zip = DECODE(MOD(ROWNUM, 22) + 1,
                      1,  '72550',
                      2,  '71601',
                      3,  '85920',
                      4,  '85135',
                      5,  '95451',
                      6,  '90021',
                      7,  '99611',
                      8,  '99928',
                      9,  '35213',
                      10, '60475',
                      11, '80451',
                      12, '80023',
                      13, '59330',
                      14, '62226',
                      15, '27127',
                      16, '28006',
                      17, '66515',
                      18, '27620',
                      19, '66527',
                      20, '15438',
                      21, '32601',
                      22, '00000');
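If you do want to keep the varray and a PL/SQL loop, the fix is to drive one UPDATE per rowid instead of comparing an IN-list to i, which is not valid SQL (a minimal sketch, not tested against your schema):

DECLARE
    TYPE zips IS VARRAY(22) OF CHAR(5);
    set_of_zips zips := zips('72550', '71601', '85920', '85135', '95451',
                             '90021', '99611', '99928', '35213', '60475',
                             '80451', '80023', '59330', '62226', '27127',
                             '28006', '66515', '27620', '66527', '15438',
                             '32601', '00000');
    j PLS_INTEGER := 1;
BEGIN
    -- one UPDATE per row, cycling through the 22 zip codes
    FOR r IN (SELECT rowid AS ri FROM t_data) LOOP
        UPDATE t_data SET t_zip = set_of_zips(j) WHERE rowid = r.ri;
        j := j + 1;
        IF j > set_of_zips.COUNT THEN
            j := 1;
        END IF;
    END LOOP;
    COMMIT;
END;
/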

Inserting of data not happening in Oracle

I am new to PL/SQL. I have a table where I need to insert some dummy data, so I thought I'd use a PL/SQL block with a FOR loop to insert the data automatically. The PL/SQL block runs successfully, but the data is stored as empty. The block I tried is:
declare
    v_number1 number;
    v_number2 number;
    v_number3 number;
begin
    FOR Lcntr IN 2..17 LOOP
        v_number1 := v_number1 + 1;
        v_number2 := v_number2 + 2;
        v_number3 := v_number3 + 3;
        Insert into stu.result (res_id, stu_id, eng, maths, science)
        values (stu.seq_no.NEXTVAL, Lcntr, v_number1, v_number2, v_number3);
    END LOOP;
end;
But my table is loaded like this (please ignore the first two rows of data, I inserted them manually):
The data for eng, maths, and science is not being inserted. Why is this happening?
That's because your variables are NULL, and NULL + 1 is NULL as well.
If you modify the declarations to
v_number1 number := 0;
v_number2 number := 0;
v_number3 number := 0;
something might happen.
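You can watch the NULL propagation in isolation with a tiny block (SET SERVEROUTPUT ON assumed):

DECLARE
    x NUMBER;    -- no initializer, so x starts out NULL
BEGIN
    x := x + 1;  -- NULL + 1 evaluates to NULL
    dbms_output.put_line(NVL(TO_CHAR(x), 'x is still NULL'));
END;
/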

ROWNUM works only for one row. How to fetch rows after that?

I want to extract BLOB data and write it out to a file (Excel, txt, or any other format). A single cell has up to 60k characters. I want to write a script that reads the whole table of BLOB data and writes it to a file. In the code below, ROWNUM works only for one row; what's the alternative? Or is there another script that can help me achieve my ultimate objective of reading the BLOBs and writing a file?
SET SERVEROUTPUT ON;
DECLARE
    TotalRows   NUMBER;
    TotalChar   NUMBER;
    CharCounter NUMBER;
BEGIN
    SELECT count(*) INTO TotalRows FROM <TableName>;
    --dbms_output.Put_line(RC);
    --END;
    FOR RC IN 1..TotalRows LOOP
        ----------------- Code for Rows starts -----------------
        dbms_output.Put_line('Row '||RC||' Started.');
        SELECT Length(<ColumnWithBLOBDataType>) INTO TotalChar
        FROM <TableName> WHERE Rownum = RC;  -- the problem line
        dbms_output.Put_line('Crossed Char counting query. TotalChar='||TotalChar);
        CharCounter := TotalChar/2000 + 1;
        dbms_output.Put_line('Loop will run these many times= '||CharCounter||' and Total Chars='||TotalChar);
        FOR CC IN 1..CharCounter LOOP
            dbms_output.Put_line('Trip: '||CC);
        END LOOP;
        ----------------- Code for Rows Ends -----------------
        TotalChar := 0;
        dbms_output.Put_line('Row '||RC||' Done. TotalChar='||TotalChar);
    END LOOP;
    dbms_output.Put_line('Exited loop 1.');
END;
You normally don't use ROWNUM to select specific rows from a table; it's not safe and not necessary. Normally you can do it with a single cursor FOR loop:
DECLARE
    part   VARCHAR2(30000);
    offset NUMBER;
BEGIN
    FOR r IN (SELECT c,
                     rownum AS rc,
                     dbms_lob.getlength(c) AS totalchar
              FROM mytable)
    LOOP
        ----------------- Code for Rows starts -----------------
        dbms_output.put_line('Row '||r.rc||' Started.');
        dbms_output.put_line('Crossed Char counting query. TotalChar='||r.totalchar);
        offset := 1;
        WHILE (offset <= r.totalchar) LOOP
            -- assumes c is a CLOB; for a real BLOB, dbms_lob.substr returns RAW
            part := dbms_lob.substr(r.c, 20000, offset);
            offset := offset + length(part);
            dbms_output.put(part);  -- beware: one dbms_output line is capped at 32767 bytes
        END LOOP;
        dbms_output.put_line('');
    END LOOP;
END;
/
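Since the ultimate objective is to write the data to a file rather than to dbms_output, a minimal server-side sketch with UTL_FILE could look like the following. It assumes a directory object (here called EXPORT_DIR, a placeholder) exists and that the column is a CLOB; for a real BLOB you would read with dbms_lob.read and write the raw chunks directly:

DECLARE
    f      UTL_FILE.FILE_TYPE;
    part   VARCHAR2(32767);
    offset PLS_INTEGER;
    len    PLS_INTEGER;
BEGIN
    -- 'wb' (byte mode) avoids the 32767-byte line-length cap of text mode,
    -- which matters here since a single value can be ~60k characters
    f := UTL_FILE.FOPEN('EXPORT_DIR', 'lob_export.txt', 'wb', 32767);
    FOR r IN (SELECT c FROM mytable) LOOP
        offset := 1;
        len    := DBMS_LOB.GETLENGTH(r.c);
        WHILE offset <= len LOOP
            part   := DBMS_LOB.SUBSTR(r.c, 8000, offset);
            UTL_FILE.PUT_RAW(f, UTL_RAW.CAST_TO_RAW(part));
            offset := offset + LENGTH(part);
        END LOOP;
        UTL_FILE.PUT_RAW(f, UTL_RAW.CAST_TO_RAW(CHR(10)));  -- row separator
    END LOOP;
    UTL_FILE.FCLOSE(f);
END;
/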

ORA-01555 error when updating 200 million rows with BULK COLLECT

I have the following PL/SQL code, which updates one column of each row in a table with about 200 million rows. I use BULK COLLECT to repeatedly fetch 150,000 rows from the table and update the rows. I do a commit after 50,000 updates.
DECLARE
    CURSOR jobs_cursor IS
        SELECT e.ID, e.PTI, e.CAT, e.JOBNAME, e.JOBDATE, e.WORK_DESCRIPTION
        FROM JOB e
        WHERE length(e.WORK_DESCRIPTION) > 1000;
    TYPE JOBS_TYPE IS TABLE OF jobs_cursor%ROWTYPE;
    v_jobs JOBS_TYPE;
    fetch_jobs_limit PLS_INTEGER := 150000;
    trimmed_work_description VARCHAR2(2000 CHAR);
    sub_string_work_description_left VARCHAR2(1000 CHAR);
    sub_string_work_description_right VARCHAR2(1000 CHAR);
    update_counter NUMBER := 0;
    commit_counter NUMBER := 50000;
BEGIN
    OPEN jobs_cursor;
    LOOP
        FETCH jobs_cursor BULK COLLECT INTO v_jobs LIMIT fetch_jobs_limit;
        EXIT WHEN v_jobs.COUNT = 0;
        FOR idx IN 1..v_jobs.COUNT LOOP
            trimmed_work_description := ' ';
            IF v_jobs(idx).WORK_DESCRIPTION IS NOT NULL THEN
                trimmed_work_description := TRIM(TRAILING ' ' FROM v_jobs(idx).WORK_DESCRIPTION);
            END IF;
            IF length(trimmed_work_description) <= 1000 THEN
                UPDATE JOBS j SET j.WORK_DESCRIPTION = trimmed_work_description WHERE j.ID = v_jobs(idx).ID;
                update_counter := update_counter + 1;
                IF mod(update_counter, commit_counter) = 0 THEN
                    COMMIT;
                    update_counter := 0;
                END IF;
                CONTINUE;
            ELSIF length(trimmed_work_description) > 1000 THEN
                sub_string_work_description_left := SUBSTR(trimmed_work_description, 1, 1000);
                sub_string_work_description_right := SUBSTR(trimmed_work_description, 1001, 2000);
            END IF;
            UPDATE JOBS j SET j.WORK_DESCRIPTION = sub_string_work_description_left WHERE j.ID = v_jobs(idx).ID;
            INSERT INTO JOBS j VALUES ("SEQUENCE_JOBS".NEXTVAL, j.PTI, j.CAT, j.JOBNAME, j.JOBDATE, sub_string_work_description_right);
            update_counter := update_counter + 1;
            IF mod(update_counter, commit_counter) = 0 THEN
                COMMIT;
                update_counter := 0;
            END IF;
        END LOOP;
    END LOOP;
    COMMIT;
    CLOSE jobs_cursor;
END;
The code runs for several hours, but then Oracle raises ORA-01555: snapshot too old: rollback segment number 14 with name "xxxx" too small.
Could you please tell me what is wrong with my PL/SQL? I have already done some research and found threads saying this error can be avoided by expanding the UNDO tablespace, but that is not an option in my case. Thus, I need to modify the PL/SQL code.
At first glance I don't see any reason why you do the update in a loop; it should be possible with single statements. (The ORA-01555 itself comes from committing inside the loop while the cursor is still open: the long-running fetch needs undo to stay read-consistent, and your own commits allow that undo to be overwritten. With the loop gone and a single COMMIT at the end, that risk goes away.) It would be similar to this (not verified/tested); note that the INSERT has to run before the UPDATE, because the UPDATE truncates the very text the INSERT still needs:
INSERT INTO JOBS
SELECT SEQUENCE_JOBS.NEXTVAL, j.PTI, j.CAT, j.JOBNAME, j.JOBDATE,
       SUBSTR(TRIM(TRAILING ' ' FROM j.WORK_DESCRIPTION), 1001, 2000)
  FROM JOBS j
 WHERE length(TRIM(TRAILING ' ' FROM j.WORK_DESCRIPTION)) > 1000;

UPDATE JOBS j
   SET j.WORK_DESCRIPTION = SUBSTR(TRIM(TRAILING ' ' FROM j.WORK_DESCRIPTION), 1, 1000)
 WHERE length(j.WORK_DESCRIPTION) > 1000;

Given two scripts for deleting records in a table (6 million rows), which script is better and why?

I have a table with 6 million records and am running an archival script to delete around 5 million of them.
Script 1 does the deletion, but the DBA said script 1 incurs more buffer gets and
recommended the approach in script 2. I am confused about how script 2 is better than script 1.
Please review the scripts and tell me which approach is best and why.
Script 1:
PROCEDURE archival_charging_txn(p_no_hrs     IN  NUMBER,
                                p_error_code OUT NUMBER,
                                p_error_msg  OUT VARCHAR2) IS
    v_sysdate DATE := SYSDATE - p_no_hrs / 24;
    TYPE t_txn_id IS TABLE OF scg_charging_txn.txn_id%TYPE INDEX BY PLS_INTEGER;
    v_txn_id t_txn_id;
    CURSOR c IS
        SELECT txn_id FROM scg_charging_txn WHERE req_time < v_sysdate; /* non-unique index */
BEGIN
    OPEN c;
    LOOP
        FETCH c BULK COLLECT INTO v_txn_id LIMIT 10000;
        IF v_txn_id.COUNT > 0 THEN
            FORALL i IN v_txn_id.FIRST .. v_txn_id.LAST
                DELETE FROM scg_charging_txn WHERE txn_id = v_txn_id(i); /* primary-key based */
        END IF;
        COMMIT;
        EXIT WHEN c%NOTFOUND;
    END LOOP;
    CLOSE c;
    COMMIT;
    p_error_code := 0;
EXCEPTION
    WHEN OTHERS THEN
        p_error_code := 1;
        p_error_msg := substr(SQLERRM, 1, 200);
END archival_charging_txn;
Script 2:
PROCEDURE archival_charging_txn_W(p_no_hrs     IN  NUMBER,
                                  p_error_code OUT NUMBER,
                                  p_error_msg  OUT VARCHAR2) IS
BEGIN
    DELETE FROM scg_charging_txn WHERE req_time < SYSDATE - p_no_hrs / 24;
    COMMIT;
    p_error_code := 0;
EXCEPTION
    WHEN OTHERS THEN
        p_error_code := 1;
        p_error_msg := substr(SQLERRM, 1, 200);
END archival_charging_txn_W;
The 1st script reads the table entries through the cursor, hence the extra buffer gets, while the 2nd script just deletes them.
I would prefer the 2nd script. In case you get into trouble by locking the table for too long a time, loop a DELETE ... AND ROWNUM <= 10000; COMMIT; until SQL%NOTFOUND, as sketched below.
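That batched variant would look roughly like this (a sketch using the same predicate as script 2, placed inside the procedure so p_no_hrs is still in scope):

BEGIN
    LOOP
        DELETE FROM scg_charging_txn
        WHERE req_time < SYSDATE - p_no_hrs / 24
        AND ROWNUM <= 10000;
        EXIT WHEN SQL%NOTFOUND;  -- stop once a pass deletes no rows
        COMMIT;
    END LOOP;
    COMMIT;
END;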
If you're deleting 5 million of 6 million rows, performance will suck.
You're better off doing CTAS (CREATE TABLE ... AS SELECT).
Something like:
create table new_scg_charging_txn nologging AS
SELECT * FROM scg_charging_txn
WHERE req_time >= SYSDATE - p_no_hrs / 24;
If downtime is unacceptable, you may be able to do something similar, but wrap it with DBMS_REDEFINITION.
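The rest of the CTAS swap would roughly be the following (sketch only; it requires downtime, and you must re-create indexes, constraints, triggers, and grants on the new table yourself):

-- after the CTAS above has built the table holding the rows to keep:
DROP TABLE scg_charging_txn;
RENAME new_scg_charging_txn TO scg_charging_txn;
-- now re-create indexes, constraints, triggers, and grants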
