I am facing a strange problem: when I connect to Oracle from SSIS and run the query below, it does not apply the filter criteria.
SELECT * FROM Table_Schema.Table_Name where trunc(Date_Column) >='01-APR-14'
It returns data older than 01 April as well. However, when I run the same query directly in Oracle it works absolutely fine.
What is wrong here?
Have you tried applying a format mask to the date you are passing in? You should always do this by default. You could also substitute sysdate (or getdate() in SQL Server) for your date literal and see if the filter behaves correctly. Also try removing the trunc.
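A minimal sketch of that suggestion, keeping the table and column placeholders from the question; the explicit mask avoids relying on the session's NLS_DATE_FORMAT (and assumes an English date language for 'APR'):
-- explicit format mask instead of an implicit string-to-date conversion
SELECT *
FROM Table_Schema.Table_Name
WHERE Date_Column >= TO_DATE('01-APR-2014', 'DD-MON-YYYY')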
Looking into a query in WebIntelligence: after it runs, the prompts are replaced by the values provided by the user (for instance dates).
When I run the same query on Oracle (the database my universe is built on) I get an error related to the dates. Dates in the query (in BO) are just strings,
like StartDate = '30-06-2020 00:00:00'. When I run the query generated by WebIntelligence on Oracle I get:
ORA-01843: not a valid month
01843. 00000 - "not a valid month"
To fix this I have to use, for instance, the to_date function, and then it works fine. My question is: how are dates parsed in WebIntelligence when it runs a query, so that the mentioned error does not occur?
I get the same error as you when I run a query directly against Oracle in SQL Developer that works in Web Intelligence. According to this, BusinessObjects makes a call to set the date format.
So you can do that either in the preferences of SQL Developer (or presumably whatever database query tool you are using) or by setting it explicitly with the alter session command:
alter session set nls_date_format = 'DD-MM-YYYY HH24:MI:SS';
select...[the rest of your query]
Both options are shown in the answer to How can I set a custom date time format in Oracle SQL Developer?.
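As an alternative to changing the session setting, the string that WebIntelligence generates could be wrapped in TO_DATE with a matching mask. A sketch only; some_date_column stands in for whatever column the StartDate prompt filters on:
-- parse the prompt string explicitly instead of relying on NLS_DATE_FORMAT
WHERE some_date_column >= TO_DATE('30-06-2020 00:00:00', 'DD-MM-YYYY HH24:MI:SS')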
I'm trying to pull data from SQL Server using GenerateTableFetch. When I use a MySQL database instead of SQL Server for the same GenerateTableFetch it works as expected. Whenever I connect to SQL Server I get the error below.
GenerateTableFetch[id=07bed292-0162-1000-0000-00004bc12345] failed to process session due to java.lang.IllegalArgumentException: Order by clause cannot be null or empty when using row paging: Order by clause cannot be null or empty when using row paging
SQL Server Version: 2016
I went through the link below and learned that there is a bug in GenerateTableFetch for SQL Server. However, I'm not sure whether the bug has been fixed.
https://github.com/apache/nifi/pull/1510
NiFi version I'm using: 1.5
Could someone please let me know whether the bug has been fixed? If not, is there a workaround for it?
Here is my flow.
Edit:
GenerateTableFetch:
This is a bug in some of the DatabaseAdapters in NiFi when GenerateTableFetch is used with no Max-value Column set. In this case there's a workaround: you can use the 2008 driver, then a ReplaceText processor to replace "ORDER BY asc" with "ORDER BY newid() asc". I'm trying to find everywhere this could be an issue and will write up a Jira to cover all the cases. The general symptom is OFFSET/LIMIT clauses without an ORDER BY clause.
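For context, here is a minimal illustration of that symptom on the SQL Server side (plain T-SQL, not NiFi configuration; some_table is a placeholder). SQL Server only accepts OFFSET/FETCH paging as part of an ORDER BY clause, which is why the generated paging statement fails until an ORDER BY is injected:
-- rejected: OFFSET/FETCH is only valid as part of an ORDER BY clause
SELECT * FROM some_table
OFFSET 0 ROWS FETCH NEXT 10000 ROWS ONLY;

-- accepted: roughly what the ReplaceText workaround produces
SELECT * FROM some_table
ORDER BY newid()
OFFSET 0 ROWS FETCH NEXT 10000 ROWS ONLY;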
After upgrading from SSRS 2012 to 2016, we've had to rewrite all of our reports because of an issue with SSRS giving ORA-01830: date format picture ends before converting entire input string.
The code that causes the issue is below:
WHERE (
trunc(date_processed) BETWEEN NVL(:start_date,:subscription_start_date) AND NVL(:end_date,:subscription_end_date)
)
:start_date and :end_date are both null at the beginning of the report execution; :subscription_start_date and :subscription_end_date are NEVER null and are always set. To make things even more frustrating, the following works fine:
WHERE (
trunc(date_processed) BETWEEN NVL(:start_date ,'01-JAN-1848') AND NVL(:end_date,'31-DEC-2039')
and trunc(date_processed) BETWEEN :subscription_start_date and :subscription_end_date
)
The issue, however, is that now the :start_date parameter cannot override the subscription date parameter when it is set by the user.
This did not occur on previous versions, and it is happening to ALL of our reports, which is not a small number.
Setting the variable parameters in SSRS to text and using TO_DATE resolved similar situations for me using SSRS with an Oracle source.
WHERE ( trunc(date_processed) BETWEEN NVL(TO_DATE(:start_date,'mm/dd/yyyy'), TO_DATE(:subscription_start_date,'mm/dd/yyyy')) AND NVL(TO_DATE(:end_date,'mm/dd/yyyy'), TO_DATE(:subscription_end_date,'mm/dd/yyyy')) )
We were able to get around this issue by using expressions to format the date parameters in the Dataset Parameter Properties, with the following formulas:
=Format(Parameters!BeginDate.Value, "dd/MMM/yyyy")
=Format(Parameters!EndDate.Value, "dd/MMM/yyyy")
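On the Oracle side the query can then parse those strings with a matching mask. A sketch under the assumption that the dataset binds are named after the report parameters (:BeginDate and :EndDate); Format(..., "dd/MMM/yyyy") emits values like 15/Apr/2016, which 'DD/MON/YYYY' parses:
-- explicit masks matching the formatted parameter values
WHERE trunc(date_processed) BETWEEN TO_DATE(:BeginDate, 'DD/MON/YYYY')
                                AND TO_DATE(:EndDate, 'DD/MON/YYYY')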
We have a legacy application we cannot modify that connects to Oracle 11g and at some point runs a query and returns a result. The application however is using the "generated" column name from Oracle to read the result.
Consider the following query:
select nvl(1,0.0) from DUAL;
As this query does not specify an alias, the generated column name would be "nvl(1,0.0)".
However, on another server the generated column name is "nvl(1,0)" (notice 0 rather than 0.0) and the application fails.
Is there a configuration that can be changed for Oracle? I've searched for formatting and locale configurations and they are equal on both servers.
Any help would be appreciated.
It turns out there's a parameter called cursor_sharing that was set to FORCE instead of EXACT, presumably because FORCE rewrites the literals in the statement before the column name is derived.
select nvl(1,0.0) from DUAL;
The query above produces the following generated column name, depending on the value of the parameter:
FORCE: NVL(1,0)
EXACT: NVL(1,0.0)
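One way to compare and align the two servers, as a sketch (SHOW PARAMETER is a SQL*Plus / SQL Developer command; the ALTER SYSTEM call needs the appropriate privilege, and SCOPE=BOTH assumes an spfile is in use):
-- check the current value, then set it back to EXACT
SHOW PARAMETER cursor_sharing
ALTER SYSTEM SET cursor_sharing = EXACT SCOPE = BOTH;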
I'm having one of those throw-the-computer-out-the-window days.
I am working on a problem involving Crystal Reports (Version 10) and an Oracle Database (11g).
I am taking a view from the database that returns a string (varchar2(50)) which is actually a number; when a basic SELECT * query is run on this view I get the number back in the format 000000000000100.00.
When this view is then used in Crystal Reports I can view the field data, but I can't sum the data as it is not a number.
I began by attempting to use ToNumber on the field, to which Crystal's response was that the string was not numeric text. Fair enough; I went back to the view and applied TO_NUMBER, but when the view was then used in Crystal it did not return any results. I also attempted to apply TO_CHAR in the view so that I could hopefully import the field as text and then perform a ToNumber, yet just as with TO_NUMBER no records were displayed.
I've started new reports and new views. To no avail.
This seems to have something to do with how I am retrieving the data for the view.
In simple terms, I'm pulling data from a table, looking at two fields: a foreign key and a value field.
SELECT PRIMARY_KEY,
NVL(MAX(DECODE(FOREIGN_KEY, FOREIGN_KEY_OF_VALUE_I_NEED, VALUE_FIELD)), 0)
FROM MY_TABLE
GROUP BY PRIMARY_KEY
When I attempted to modify the result using TO_NUMBER or TO_CHAR, I applied it both around VALUE_FIELD itself and around the entire expression; either way works when run as a plain SQL statement. However, any TO_NUMBER or TO_CHAR modification to the statement returns no results in Crystal Reports when the view is used.
This whole problem smacks of something that is a tick box or equivalent that I have overlooked.
Any suggestions of how to solve this issue or where I could go to look for an answer would be greatly appreciated.
I ran this query in SQL Developer:
SELECT xxx, to_number(xxx) yyy
FROM (
SELECT '000000000000100.00' XXX FROM DUAL
)
Which resulted in:
XXX YYY
000000000000100.00 100
If your field is truly numeric, you could create a SQL Expression field to do the conversion:
-- {%NUMBER_FIELD}
TO_NUMBER(TABLE.VALUE_FIELD)
This turned out to be an issue with how Crystal Reports deals with queries from a database. All I needed to do was wrap my SQL statement inside another SELECT and apply TO_NUMBER to the column in that outer query, so that Crystal Reports would recognize the column values as numbers.
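A minimal sketch of that wrapping, reusing the query from the question; VALUE_COLUMN is just an illustrative alias, and the other names are the question's placeholders:
-- outer SELECT converts the string column so Crystal sees a number
SELECT PRIMARY_KEY,
       TO_NUMBER(VALUE_COLUMN) AS VALUE_COLUMN
FROM (
      SELECT PRIMARY_KEY,
             NVL(MAX(DECODE(FOREIGN_KEY, FOREIGN_KEY_OF_VALUE_I_NEED, VALUE_FIELD)), 0) AS VALUE_COLUMN
      FROM MY_TABLE
      GROUP BY PRIMARY_KEY
     )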
Hopefully this helps someone out, as this was a terrible waste of an afternoon.