Google Apps Script sync with Oracle

I am trying to fetch data from Oracle using Google Apps Script. I am able to fetch data with simple queries such as
Select * from EMP_TABLE
and
Select * from EMP_TABLE where EMP_NAME = "Dhananjay"
but I am not able to fetch data with the query below:
Select * from EMP_TABLE where EMP_ID = 1100
In the case above, all the other relational operators work fine, but the equality operator does not work for any numeric field.
I need help fetching data from Oracle with Google Apps Script.
The statement that I am using is:
var rs = stmt.executeQuery('SELECT * FROM EMP_TABLE WHERE EMP_ID = 1100');
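One thing worth trying is a prepared statement, so that the EMP_ID value is bound as a number instead of being embedded in the SQL text. This is only a minimal sketch, assuming the Apps Script Jdbc service; the connection URL, credentials, and the EMP_NAME column are placeholders:
// Hypothetical connection details -- replace with your own.
var conn = Jdbc.getConnection('jdbc:oracle:thin:@//db.example.com:1521/ORCL', 'scott', 'tiger');
// Bind EMP_ID as a number rather than concatenating it into the query.
var stmt = conn.prepareStatement('SELECT * FROM EMP_TABLE WHERE EMP_ID = ?');
stmt.setInt(1, 1100);
var rs = stmt.executeQuery();
while (rs.next()) {
  Logger.log(rs.getString('EMP_NAME'));  // EMP_NAME is assumed; use your own columns
}
rs.close();
stmt.close();
conn.close();
If EMP_ID is not actually a NUMBER column (for example CHAR or VARCHAR2), binding it as a string with stmt.setString(1, '1100') is also worth checking.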


Spring Auth Server JDBC implementation - error with Oracle DB BLOB

I have a query I am trying to run, but I keep running into an error. I am trying to write a WHERE clause that compares the data column (a BLOB) to :var2, which is a BLOB bind variable.
Here is my code.
SELECT max(id)
INTO :var1
FROM table_name
where data = :var2;
Any suggestions as to why I am getting this ORA-00932 error?
I am comparing a BLOB to a BLOB column; shouldn't that be fine?
Thanks
BLOBs aren't simple scalar types, and equality is not defined for them (hence the ORA-00932 "inconsistent datatypes" error); you need to use a function to compare them.
SELECT max(id)
INTO :var1
FROM table_name
where dbms_lob.compare(data,:var2) = 0;

cx_Oracle query JSON CLOB with 'LIKE'

I'm exploring cx_Oracle's JSON features, with the JSON stored in a CLOB. I have an index on the table that allows me to query for direct equality:
SELECT * FROM mytable m WHERE m.jsonclob.jsonattribute = 'foo';
I'd like to be able to do the same thing with a LIKE statement.
SELECT * FROM mytable m WHERE m.jsonclob.jsonattribute LIKE 'foo.%';
This works for me with Oracle DB 12.2:
SQL> CREATE TABLE j_purchaseorder_b (po_document CLOB CHECK (po_document IS JSON)) LOB (po_document) STORE AS (CACHE);
Table created.
SQL> INSERT INTO j_purchaseorder_b VALUES ('{"userId":2,"userName":"Bob","location":"USA"}');
1 row created.
SQL> SELECT pob.po_document.location FROM j_purchaseorder_b pob where pob.po_document.location LIKE 'US%';
LOCATION
--------------------------------------------------------------------------------
USA
For reference check the Oracle JSON manual chapter Query JSON Data.
A side note: the Oracle JSON team recommends BLOB storage for performance reasons; see the documentation for details.
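For completeness, the same LIKE query could be run from Python. This is only a sketch, with made-up connection details, reusing the j_purchaseorder_b example table from above:
import cx_Oracle

# Hypothetical credentials/DSN -- replace with your own.
conn = cx_Oracle.connect("scott", "tiger", "db.example.com/orclpdb1")
cur = conn.cursor()

# Simplified dot-notation access into the JSON CLOB, with the LIKE pattern as a bind variable.
cur.execute(
    "SELECT pob.po_document.location FROM j_purchaseorder_b pob "
    "WHERE pob.po_document.location LIKE :pat",
    {"pat": "US%"})
for (location,) in cur:
    print(location)

cur.close()
conn.close()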

Declaring and using variables in PL-SQL

I am new to PL-SQL. I do not understand why I am getting the error "PLS-00428: an INTO clause is expected in this SELECT statement"
What I'm trying to accomplish is to create a variable c_limit and load its value. I then want to use that variable later to filter data.
Basically I am playing around in the demo db to see what I can/can't do with PL-SQL.
The code worked until I added "select * from demo_orders where CUSTOMER_ID = custID;".
declare
c_limit NUMBER(9,2);
custID INT;
BEGIN
custID := 6;
-- Save the credit limit
select credit_limit INTO c_limit
from demo_customers cust
where customer_id = custID;
select * from demo_orders where CUSTOMER_ID = custID;
dbms_output.Put_line(c_limit);
END;
If you are using a SQL SELECT statement within an anonymous block (in PL/SQL, between the BEGIN and END keywords), you must SELECT INTO something so that PL/SQL has a variable to hold the result of your query. It is important to note that if you are selecting multiple columns (which you are with "SELECT *"), you must specify multiple variables, or a record, to receive the results of your query.
for example:
SELECT 1
INTO v_dummy
FROM dual;
SELECT 1, 2
INTO v_dummy, v_dummy2
FROM dual;
It is also worth pointing out that if your SELECT * FROM ... returns multiple rows, PL/SQL will throw an error. You should only expect to retrieve one row of data from a SELECT INTO.
Looks like the error is from the second select query.
select * from demo_orders where CUSTOMER_ID = custID;
PL/SQL does not allow a standalone SQL SELECT query inside a block.
http://pls-00428.ora-code.com/
You need to do something with the result of the second SELECT query: select it INTO variables, or loop over it with a cursor, as in the sketch below.
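Putting both answers together, here is a sketch of the block with the second query handled by a cursor FOR loop (the demo_orders column order_id is an assumption; use whatever columns the table actually has):
declare
  c_limit NUMBER(9,2);
  custID  INT := 6;
BEGIN
  -- Single row expected, so SELECT ... INTO is fine here
  select credit_limit
    into c_limit
    from demo_customers
   where customer_id = custID;
  dbms_output.put_line('Credit limit: ' || c_limit);

  -- The second query can return many rows, so loop over it instead of selecting INTO
  for r in (select * from demo_orders where customer_id = custID) loop
    dbms_output.put_line('Order id: ' || r.order_id);  -- order_id is assumed
  end loop;
END;
/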

Oracle query to get latest table or stored procedure schema change?

I'm trying to write a query to get the latest schema changes to a table or stored procedure on Oracle.
This is how to do this on Sybase:
select top 10 name from sysobjects where type = 'U' order by crdate desc
(I accept that this is based on the created date and not the modified date. I'd appreciate it if anyone can show me how the modified date works for tables in Sybase, but what I'm looking for right now is the Oracle schema change date.)
My question is: What is the Oracle query to get latest table or stored procedure schema change?
select * from
(SELECT * FROM user_objects ORDER BY last_ddl_time DESC)
where rownum <= 10;
user_objects contains all the objects owned by the current user (= current schema objects)
all_objects contains all the objects on which the user has any privileges
dba_objects contains all the DB objects (requires some special privileges to access).
The all_ and dba_ views have an additional owner column.
You may want to read does-rebuilding-an-index-update-the-last-ddl-time ...
From ROWNUM Pseudocolumn
For each row returned by a query, the ROWNUM pseudocolumn returns a
number indicating the order in which Oracle selects the row from a
table or set of joined rows. The first row selected has a ROWNUM of 1,
the second has 2, and so on.
You can use ROWNUM to limit the number of rows returned by a query,...
If you want to restrict the results to tables or procedures, you can filter as below.
With an 11g database:
select * from
(SELECT * FROM user_objects where OBJECT_TYPE in ('TABLE','PROCEDURE') ORDER BY last_ddl_time DESC)
where rownum <= 10;
The above gives the latest changed objects among tables and procedures.
With a 12c database there is no need for a subquery:
SELECT * FROM user_objects
where OBJECT_TYPE in ('TABLE','PROCEDURE')
ORDER BY last_ddl_time DESC
FETCH FIRST 10 ROWS ONLY;

Hive Timestamp aggregation

I have two Hive tables. One table (table1) is updated on an hourly basis by the Java API team. I now have to aggregate the newly loaded data and store it into another table, table2 (only the newly loaded data, because the older data has already been aggregated and stored). For that I have used the query below:
set maxtime = select max(lastactivitytimestamp) from table2;
insert into table2 select * from table1 where lastactivitytimestamp > unix_timestamp('${hivevar:maxtime}');
I am not getting any results. But when I provide the timestamp value manually, I do get data, like below:
insert into table2 select * from table1 where lastactivitytimestamp > unix_timestamp('2014-08-18 15:23:26.754');
Is it possible to pass dynamic values in unix_timestamp?
Try removing the single quotes from around the variable in the unix_timestamp() call, like this:
insert into table2 select * from table1 where lastactivitytimestamp > unix_timestamp(${hivevar:maxtime});
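If the substitution still returns nothing, keep in mind that set maxtime = select ... does not actually execute the subquery; ${hivevar:maxtime} is replaced purely textually. A common workaround, sketched here assuming the hive CLI and a script file named aggregate.hql, is to compute the value in the shell and pass it in so the variable expands to the same literal string that worked manually:
# Compute the latest timestamp outside Hive, then pass it in as a hivevar.
maxtime=$(hive -S -e "select max(lastactivitytimestamp) from table2;")
hive --hivevar maxtime="$maxtime" -f aggregate.hql
where aggregate.hql contains the original insert statement:
insert into table2 select * from table1 where lastactivitytimestamp > unix_timestamp('${hivevar:maxtime}');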
