We're using a stored procedure provided by a remote system. For testing purposes, I call this procedure from my development machine. The problem is that when I call the procedure from Toad, everything is OK, but when I call it using SQL Developer, an error occurs.
I debugged and debugged and found out this: in the procedure, an expiry date is generated and passed to a web service (don't ask me why).
Here are the lines responsible for generating the date:
vt_User.EXPDATE := TO_DATE('01.01.2025', 'dd.mm.yyyy');
vs_Value := to_char(vt_User.EXPDATE, 'YYYY-MM-DD"T"HH24:MI:SSTZR');
When called from Toad, vs_Value comes out as:
2025-01-01T00:00:00+02:00
But when I call it from SQL Developer, it comes out as:
2025-01-01T00:00:00EUROPE/ATHENS
Everything except these lines is exactly the same. I tried many different approaches, such as setting NLS_LANG and altering the session, but with no result.
I need to solve this because the same thing happens when I call the procedure from Java code, and that is the main issue.
I connect to the remote database using TNS for Toad and SQL Developer, and the thin driver for the Java code.
In Oracle you have a TIME_ZONE setting for the database, but you can change it per session.
In this case the database time zone is set in the absolute-offset-from-UTC format, which is what you want.
SQL Developer probably opened the session with a time zone region name instead, which you can check with:
select sessiontimezone, dbtimezone from dual;
So altering the session to match the database time zone may help:
ALTER SESSION SET TIME_ZONE=dbtimezone;
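For example, assuming the +02:00 offset from the question, you can verify the effect like this (CURRENT_TIMESTAMP carries the session time zone, so TZR shows exactly what the session uses):
alter session set time_zone = '+02:00';  -- explicit offset instead of a region name
select to_char(current_timestamp, 'YYYY-MM-DD"T"HH24:MI:SSTZR') from dual;
-- now renders an offset such as +02:00 instead of EUROPE/ATHENS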
Another solution is to use the TZH and TZM format elements instead of TZR:
select to_char(current_timestamp, 'yyyy-mm-dd"T"hh24:mi:sstzh:tzm') from dual;
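Applied to the assignment from the question, that would look roughly like this (assuming EXPDATE is declared as a TIMESTAMP WITH TIME ZONE, which the TZR output suggests):
vs_Value := to_char(vt_User.EXPDATE, 'YYYY-MM-DD"T"HH24:MI:SSTZH:TZM');
TZH:TZM always renders a numeric offset, so the output no longer depends on whether the session time zone is a region name or an absolute offset.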
Dear Techies,
Our application triggers queries like the one below very frequently.
select name,emp_id,prod_id,prod_name, .... from appuser.table where emp_id=:1 and prod_id=:2;
We usually spend a considerable amount of time finding the SQL_ID when we receive the problematic SQL_TEXT. For named variables like emp_id, prod_id, etc., SQL*Plus gives us an option like this:
VARIABLE emp_id NUMBER
EXEC :emp_id := 101;
However, our queries use :1 and :2 as bind variable names, which can't be set before running the SQL because they are just numbers (although Oracle treats them as bind variables). We can't ask the application vendor to rebuild all the queries to remove these numbered bind variables.
So, I was looking for one of the following options in SQL*Plus:
How do we declare/define such bind variables (:1, :2, etc.) before running the SQL?
Can we somehow bypass the bind values and still get this SQL into the cursor cache in Oracle? It looks difficult, but I still wanted to ask.
Can we pass the values of these bind variables (:1, :2, etc.) at runtime, as we do in Toad and SQL Developer? That way we can track down the correct SQL_ID in the cursor cache (v$sql).
I have been trying and searching for various options but haven't found anything specific to my case. Any help in this regard would be greatly appreciated. A solution for any version of Oracle Database that addresses this concern would be fine.
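For reference, this is roughly how we hunt for the statement today, by matching the text in v$sql (the LIKE pattern is just an illustration):
select sql_id, child_number, sql_text
from   v$sql
where  sql_text like 'select name,emp_id,prod_id,prod_name%';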
I'm using Oracle's SQL Developer to export data into CSVs. I found that Oracle spits out the dates as dd-MMM-yy. When I bulk insert these files into SQL Server, it interprets some of the dates incorrectly. How do I change that?
I'm an Oracle neophyte, so I might be approaching this whole thing incorrectly. I need to transfer a lot of tables/rows from Oracle to SQL Server. I have a linked server set up in SQL Server pointing to Oracle, but transferring the data that way takes a really long time (about 18 hours, even though both databases are on the same server), although it does get the dates correct.
I haven't found any good way to accomplish this other than a couple of PL/SQL scripts I couldn't get to work. Is it really that rare for data to be migrated from Oracle to MS SQL?
Well, dates across database vendors are just hell.
The default date format can be set in SQL Developer Preferences > Database > NLS > Date Format. You can also set it in the session as #belayer has commented.
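If you go the session route, it would be something along these lines (the format mask here is just an example):
alter session set nls_date_format = 'YYYY-MM-DD HH24:MI:SS';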
For writing CSV files or for migration projects, I would always try to control the format directly, like
SELECT id, TO_CHAR(my_date,'YYYY-MM-DD') AS my_date, my_column
FROM my_table;
Having said that, there should be a better way to move the data out of Oracle into SQL Server...
I created a temporary table in Oracle SQL Developer, but I forgot to save the script, and now I want to reuse the query but don't remember the code I used. Is there a way to get the query that was used to create the temp table?
You can use dbms_metadata.get_ddl()
select dbms_metadata.get_ddl('TABLE', 'YOUR_TABLE_NAME_HERE')
from dual;
The result is a CLOB with the complete DDL. You might need to adjust the display in SQL Developer to make the content of that value fully visible (I don't use SQL Developer, so I don't know if that is necessary and, if so, what you would need to do).
Edit:
It seems SQL Developer can't display the result of this query properly unless you use the "Run Script" option. And with that, you need to issue SET LONG 60000 (or some other big number) before running it in order to see the complete source code.
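For example (YOUR_TABLE_NAME_HERE is the same placeholder as above; you can pass the owner as a third argument if the table lives in another schema):
set long 60000
select dbms_metadata.get_ddl('TABLE', 'YOUR_TABLE_NAME_HERE') from dual;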
I'm running a query from Oracle across a database link to a Sybase server.
In its WHERE clause is a restriction on a date, and I want it tied to SYSDATE, so something like this:
select * from some_remote_view where some_numeric_key = 1 and
some_date > sysdate+2
The problem is, when I run EXPLAIN PLAN, only the condition some_numeric_key = 1 shows up in the SQL that actually gets sent to the Sybase server. Oracle expects to apply the date filter on its own side.
This is causing a performance nightmare; I need that date filter pushed across for this query to run quickly.
Even if I try something like casting SYSDATE to a character string, like this:
to_char(sysdate-2,'YYYY-MM-DD')
It still does not remote it.
Is there anything I can do to get Oracle to remote this date filter across the db link to Sybase?
When doing integration between Oracle and other platforms, I often run into this problem, not just with SYSDATE but with other non-standard functions as well.
There are two methods to work around the issue, the first being the most reliable in my experience.
First, you can create a view on the remote database that applies the filters you need; on the Oracle side you then just select from that view without additional filters.
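A rough sketch of that approach, with hypothetical names and assuming Sybase's T-SQL-style date functions on the remote side:
-- on the Sybase side: a view that already applies the date filter
create view v_recent_rows as
select *
from   some_table
where  some_date > dateadd(day, 2, getdate());

-- on the Oracle side: no date condition left to push across the link
select *
from   v_recent_rows@db_link
where  some_numeric_key = 1;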
Second, if you are not allowed to create objects on the remote side, try using bind variables (of the correct data type!) in your Oracle SELECT statement, e.g.:
declare
  -- capture the cutoff once so it is sent across the link as a plain bind value
  v_some_date constant date := sysdate + 2;
begin
  insert into oracle_table (...)
  select ...
  from   remote_table@db_link t
  where  t.some_numeric_key = 1
  and    t.some_date > v_some_date;

  commit;
end;
/
I just created a stored procedure in an MS SQL database using Toad.
What it does is accept an ID that some records are associated with, then insert those records into a table.
The next part of the stored procedure uses the input ID to search the table where the items were inserted and returns those rows as a result set, just to confirm to the user that the information was inserted.
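For illustration, the procedure is roughly of this shape (made-up names, T-SQL):
CREATE PROCEDURE dbo.copy_and_confirm @p_id INT
AS
BEGIN
    -- part 1: copy the records associated with the ID into the target table
    INSERT INTO dbo.target_table (id, item_name)
    SELECT id, item_name
    FROM   dbo.source_table
    WHERE  id = @p_id;

    -- part 2: return the inserted rows so the caller can confirm them
    SELECT id, item_name
    FROM   dbo.target_table
    WHERE  id = @p_id;
END;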
In Toad, it does what is expected: it inserts the data and returns the information using just the stored procedure.
In Oracle SQL Developer, however, it does the insert and ends there. It seems not to execute the second part of the stored procedure, which is a SELECT statement.
I have a feeling this is because of the JDBC adapter. The reason I'm asking is that I'm using the reporting tool Pentaho Report Designer, and it would make things much easier if I could do both things at the same time. Pentaho Report Designer also uses JDBC adapters; maybe that's not a coincidence?
But if there is anything else I can tweak, I'd really appreciate it.
This is a guess, but worth considering...
There are things called "batches", which are sets of SQL statements that are all sent to the server at once and executed by the server as one set of statements, within a single server-side session. Sending a set of SQL statements to the server as a batch will often produce different results than sending them one at a time, where each statement is executed in its own session.
I haven't used Toad (or Oracle) in a while, but as I recall, it dealt with batches differently than the other IDE I used. If the second statement in your set relies on being in the same session as the first, and in one IDE it ends up in a separate session, that might explain what is happening.