In Pentaho Business Analyzer, I am using just the date in my dimension table.
The dates are in yyyy-MM-dd format, for example 2016-10-01.
When I use SQL to pull dates out of my date_dimension table, it keeps appending a timestamp.
It doesn't do this in the analyzer reports, only when I use SQL for the prompts.
Why is this?
I talked to Pentaho's support team.
It turned out there was nothing I could do to fix this, as it was a bug on their end.
I ended up changing the column to a string data type, and now everything works fine.
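For anyone hitting the same bug, the workaround above amounts to returning the date as a plain string from the prompt query so no time component can be appended. A minimal sketch, with hypothetical table and column names and ANSI-style syntax (adjust to your warehouse's dialect):

```sql
-- Hypothetical prompt query: cast the date to a string so the
-- prompt receives 'yyyy-MM-dd' text with no timestamp attached.
-- date_dimension / date_column are assumed names.
SELECT CAST(date_column AS VARCHAR(10)) AS date_string
FROM   date_dimension
ORDER  BY date_column;
```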
We've had some success with removing LOB fields and avoiding row-by-row processing, but with Hadoop we can't seem to get around this. In some cases the fields in question are fewer than 10 characters, yet SSIS sees them as LOBs. Is this an issue with Hadoop, the ODBC driver, or SSIS? What steps can we take to make a determination? Help me, Obi-Wan Kenobi. You're our last hope.
Hi, if you can identify the columns that you need to convert, add a Data Conversion step for those columns, point them to the destination columns,
and set ValidateExternalMetadata to False.
If that step doesn't work either, go into the advanced properties of the ODBC source and change the types back to something like DT_WSTR.
I recently wrote a tool that extracts certain data from our DBs. It runs as a PL/SQL script in SQL Developer (either in a worksheet or as an extension plugin) and writes its output to the SQL Developer log window.
This all worked fine on my system, but I ran into an issue when users are on systems with a different language, or more specifically with different default date/time/timestamp formats than the machine on which I developed and tested it.
Now I am not sure: is the format of dates, times, and timestamps controlled by the DB or by SQL Developer? In my understanding, these PL/SQL scripts are sent to the DB for execution and their output is sent back to SQL Developer. That would mean the output format depends solely on the DB (or the system on which the DB runs). Or are the NLS settings of the client (SQL Developer) somehow involved here?
To make my tool auto-adjust to these settings, I need to be able to query these formats, either from the DB in use (Oracle 12.2 or Oracle XE 18/19 in our case) or from SQL Developer.
Assuming it's the DB: is there a table that contains the default format strings used for SELECT results?
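For what it's worth, Oracle does expose the format masks in effect for the current session through a data-dictionary view, and SQL Developer applies its own NLS preferences to the session at connect time, so this reflects the client-side settings too. A sketch:

```sql
-- Query the date/time format masks in effect for this session.
-- SQL Developer sets these from Tools > Preferences > Database > NLS
-- when it connects, so the values reflect the client configuration.
SELECT parameter, value
FROM   nls_session_parameters
WHERE  parameter IN ('NLS_DATE_FORMAT',
                     'NLS_TIMESTAMP_FORMAT',
                     'NLS_TIMESTAMP_TZ_FORMAT');
```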
Note: the point is NOT how to format dates etc. as strings, but the other way round:
I get the query results as strings in the log window. These contain dates and timestamps, and I need a hint from the DB system to figure out how to interpret them. E.g., when I get a date such as '10-11-12', does it mean Nov. 10th, 2012, or Nov. 12th, 2010?
Hope I could make myself clear...
I am migrating data from Sybase to Oracle using Talend. I am using tSybaseInput for the input and tOracleOutput for the output DB. In some jobs I map them through tMap, in others directly.
After running the job, the row order is not maintained, i.e. the order in which the data comes from Sybase is not the same as what ends up in Oracle. I need the order to be the same so that I can validate the data later by exporting both databases to CSVs and comparing them (right now I am sorting them with Unix sort, but that seems wrong).
Please suggest a way to maintain the row order of the input DB in the output DB.
Also, is my method of validation correct, or should I try something else?
The character sets and sort orders of the two vendors may differ slightly, which is probably why you are seeing a change in the order. You may want to add a numeric key value to your tables in the Sybase DB, which can then be used to force a particular order once the data is imported into Oracle.
As for validation, if you are already on the Unix command line, once you have a key value you should be able to use diff to compare the two CSV files without involving Excel. Alternatively, you can add both Sybase and Oracle as data sources in Excel and query the data directly into your worksheets instead of generating CSVs.
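A sketch of the key-value approach, with hypothetical table and column names; the idea is to generate a monotonically increasing key on the Sybase side, carry it through the Talend job, and export both sides ordered by it before diffing:

```sql
-- Hypothetical: add a surrogate ordering key on the Sybase source
-- table (src_table / row_key are assumed names).
ALTER TABLE src_table ADD row_key NUMERIC(10, 0) IDENTITY;

-- After the Talend load, export deterministically from Oracle
-- using the same key, so both CSVs come out in an identical order.
SELECT *
FROM   tgt_table
ORDER  BY row_key;
```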
I have to change the data source in quite a few reports. It's easy when the original data source uses a table, but it's more complicated when it instead has a SQL command (practically a view, but made in the report, not in the original database).
Let's say the report originally has a command such as:
SELECT nbr FROM equipment WHERE equipment.owner='ABC'
I know that in the new database Equipment.nbr is now called Items.ID, so I can easily map this. But what about the rest of the command, the WHERE part? In the new database there is obviously no Equipment.owner, and there might not even be an Items.owner. Does Crystal simply drop this part? I know how to recreate it by adding a selection formula to the report, but first I need to know what happens to the WHERE condition, and after such a mapping I can no longer preview the SQL command in the data source.
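For illustration only, if the owner attribute does survive in the new schema, the remapped command might look like the following; Items.ID comes from the mapping described above, while Items.owner is an assumed column name that would need to be verified against the new database:

```sql
-- Hypothetical remapped command text: Items.ID is the known
-- rename of Equipment.nbr; Items.owner is an assumed column.
SELECT Items.ID
FROM   Items
WHERE  Items.owner = 'ABC'
```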
Confirm where you are using this query.
If it is in the data source, then there is no issue and the report will work.
If it is in a formula, then I doubt Crystal Reports will accept this format, even in Basic syntax mode.
I have some reports written in Crystal 2008 using business views. These reports have a date parameter set up and I have a selection on the date defined in the select expert. However, when I run the report it appears to retrieve all the data from the database and only then filter out based on the date. As you can imagine this slows down the report quite a bit. I also clicked on Database-Show SQL Query and confirmed that the date parameter did not appear in the SQL Query. This behavior seems very strange to me. This did not use to happen to me when I used Crystal 8.5 with dictionaries. Is this a limitation using business views?
I did some searching and found that I can create a report using a database command. This improved performance on one of my reports, but when I tried something similar on a different report, even though I was using the database command, it still did not appear to be doing the selection on the database before retrieving the data, and the report took forever to run. I also didn't see the selection in the SQL Query.
Do I need to add the parameter to the database command? Will I be able to prompt the user to enter the value when they run the report?
I hope there is a way to do this properly using business views because otherwise I'll have to rewrite all my reports to use another method.
Any ideas or advice are welcome. Thank you very much!
I had a similar problem. I used the command, but my report was still taking longer to run than I had hoped, so I added a WHERE clause to the command to start checking dates from 2009 onward. That sped up my report a little.
You may want to consider creating a stored procedure if you think you are pushing CR to the limit. That may also help speed up the report.
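On the parameter question: a command can define its own parameters, and Crystal prompts for them when the report runs. A hypothetical command text (table and column names assumed; {?StartDate} is a parameter defined in the command editor alongside the SQL):

```sql
-- Hypothetical Crystal command: {?StartDate} is a command parameter,
-- so the date filter is pushed into the WHERE clause sent to the DB
-- instead of being applied after all rows are retrieved.
SELECT o.order_id,
       o.order_date,
       o.amount
FROM   orders o
WHERE  o.order_date >= {?StartDate}
```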
I figured out what the problem is. My business view had fields in it that were formulas. If you use selection criteria on a formula field, Crystal does not add the criteria to the WHERE clause of the SQL query. Luckily, I was able to find fields in the business view other than the formulas to do the selection on.