SSIS 0x80040E37 Oracle data flow saying that a table doesn't exist in 2016 (but works fine in 2017) - Oracle

I have a table composed of varchar2 columns of ascending length, ranging from varchar2(1911) to varchar2(2865).
I use an OLE DB connection to the database, and in the Data Flow source editor, under 'Table or view', I am able to see the table. However, when I select that table I get the following error:
Error Code: 0x80040e37
Error at DataflowTask[Ole DB Source[1]]: Opening a rowset for "inserttablename" failed. Check that the object exists in the database.
I do not think it is a data type length issue, as it is able to find a table with columns ranging from varchar2(3993) to varchar2(4000). Maybe it is the cumulative declared size of the table's columns?
What's stranger is that it works fine when targeting SQL Server 2017; it is only with 2016 and 2014 that it does not appear to work. Any insight would be greatly appreciated.
I am using Visual Studio 2017.
It also works with ADO.NET connections as well as ODBC connections.
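To test the cumulative-size theory, the declared widths can be summed from the Oracle data dictionary. This is a minimal sketch; MY_TABLE is a placeholder for the table that fails to open:

    -- Sum the declared widths of the table's VARCHAR2 columns.
    -- MY_TABLE is a placeholder for the table the OLE DB source cannot open.
    SELECT SUM(data_length) AS total_declared_bytes
    FROM   user_tab_columns
    WHERE  table_name = 'MY_TABLE'
    AND    data_type  = 'VARCHAR2';

Comparing that total between a table that opens and one that fails would confirm or rule out the cumulative-size theory.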

Related

How to load a flat file into an Oracle database

I am trying to import data from a CSV file into an Oracle database. I am using Visual Studio 2017, and I have downloaded all the required components, such as SSDT 15.8 and the Attunity connectors version 5.0.
I was wondering if someone could please guide me on how I can load the flat file from CSV into an Oracle table.
So far, I have dragged and dropped the Flat File Source and connected it to an Oracle Destination; however, even after mapping, the red cross on the Oracle Destination is still there.
And when I tried to "Start" the process, I got an error message.
If somebody could please help me, that would be great. [The original post included screenshots of the error and of the column mapping.]
Thank you
Thank you for your help, everyone. I resolved it by ensuring that the data type in the source matches the data type in the destination. Also, a major thing to consider is to ensure that the user you are logged in as has permission to write to the database; the user I was logged in as did not have permission, which is why I was getting that error.
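As a rough sketch of the permissions fix, run from a privileged account on the Oracle side (etl_user and target_schema.target_table are placeholders for the SSIS login and the destination table):

    -- Run as a privileged user (e.g. the schema owner or a DBA).
    -- etl_user and target_schema.target_table are placeholder names.
    GRANT INSERT, UPDATE ON target_schema.target_table TO etl_user;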

SQL Server 2014: can't export Varchar(1000) in a query to Excel 2007. Works for Excel 97-2003

I am working with SQL Server 2014. I have applied the Microsoft Access Database Engine 2010 Redistributable to my server. I have tried both the 32-bit and 64-bit drivers. With the 32-bit driver, I can export to Excel 97-2003 without a problem. If I try to export to Excel 2007, my Varchar(1000) column fails with the following error:
Error 0xc0202009: Data Flow Task 1:
SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred.
Error code: 0x00040EDA.
An OLE DB record is available.
Source: "Microsoft Access Database Engine"
Hresult: 0x80040E21
Description: "The field is too small to accept the amount of data you attempted to add. Try inserting or pasting less data.".
(SQL Server Import and Export Wizard)
If I go with Varchar(255) instead of Varchar(1000), it works fine in both versions of Excel. The column maps from Varchar(1000) to LongText, and I set it to ignore truncation errors.
I have run into this with various projects; this project requires the full length of the column, and truncation is not acceptable. I have tried running the wizard from SSMS and from the Start menu (both 32-bit and 64-bit, although it seems SSMS runs the 32-bit version if the 32-bit driver is installed).
I could probably stick with Excel 97-2003 for a while, but eventually the output will go over 65,536 rows and will have to go to Excel 2007. I cannot export to CSV, as my client's requirement is that the data be placed into an Excel file. They would rather have .xlsx files, but will accept .xls files for the time being.
The export needs to be run on a regular basis, so I need to save the package as a .dtsx file.
Does anyone have any ideas on how to fix this?
One of the workarounds I found on a website says:
You have to include the account you use to execute the package in the performance counters group.
But I don't know how to apply this in SSIS; I will look into it and try to update my question.
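One thing worth verifying (a hedged suggestion, not a confirmed fix): the wizard creates the worksheet with a Jet/ACE-style CREATE TABLE statement, and any column declared there as VarChar is capped at 255 characters, while LongText (the memo type) is not. The generated statement can be inspected and edited on the wizard's mapping page. Sheet and column names below are placeholders:

    -- Jet/ACE DDL the wizard issues against the workbook. A VarChar column
    -- here is capped at 255 characters; LongText is the memo type and is not.
    -- `Export$`, Id, and Notes are placeholder names.
    CREATE TABLE `Export$` (
        `Id` Long,
        `Notes` LongText
    )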

SSIS value does not fall within the expected range with OLE DB Datasource

I'm using Visual Studio 2013 with Update 3, and a colleague of mine has Update 4 installed. We are using the Data Tools for SQL Server 2014.
I've created a few DTS packages, which my colleague updated; so far it worked without problems.
But all of a sudden I get a "value does not fall within the expected range" warning from the data source and can't edit columns there. I needed to recreate the data source for the message to disappear.
My question here is: could the appearance of additional columns in the table which the data source accesses be the cause of this problem? (I've seen out-of-sync warnings for data destinations whenever a destination table gained or lost columns, but this is the first time something changed for a source table.)
Or could the problem have a completely different cause?
It has been a long while since I've worked on an SSIS project, but I do recall seeing this error as well.
My experience was that it was caused by the metadata of the input being out of date in a certain way, and what you describe as your suspicion fits with this.
The solution I found to avoid this was to be very specific in all my input components, selecting the exact columns I wanted rather than selecting all of them. I think in the end I actually changed them all to use hand-written SQL queries rather than the GUI column selector.
Also, I don't remember if this was the same error, but a similar one: sometimes after a schema change, trying to open a component would throw an error in the GUI, but when I tried again the error would have resolved itself.
Sorry I couldn't be more definitive in my answer, but hopefully this information helps point you in the right direction.
I used a simple method and it is working fine. In the OLE DB Source Editor, while retaining the same connection manager, I changed the data access mode from "Table or view" to "SQL command" and used a SQL command to select the required columns. The error message no longer appeared and I could see the column values.
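As a minimal sketch of that approach (the table and column names are placeholders), the "SQL command" text simply names the columns explicitly:

    -- Explicit column list used as the OLE DB Source "SQL command".
    -- dbo.customers and the column names are placeholder examples.
    SELECT customer_id,
           customer_name,
           created_date
    FROM   dbo.customers;

With an explicit list, columns later added to the table do not change the shape of the rowset the component sees, so its metadata stays in sync.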

Crystal Reports 2013 with Oracle DB

I want to build a Crystal report using Crystal Reports 2013 with an Oracle database.
I've created an ODBC connection named "abc" to connect to this database.
Then, from the Database Expert in Crystal Reports, I created a new connection, clicked on ODBC, and chose "abc".
The problem is that only the stored procedures are shown; I cannot find any tables from the database, only the stored procedures.
I don't know why. Can anyone help me, please?
Right-click on the connection, go to Properties, and find the Options. Most likely, tables are not selected to be shown in the tree.

Crystal Reports - Browse Data Shows Nothing

I just created a new table and filled it with data. When I run a simple select query, I can see all the data.
But when I try to build a report in Crystal with this table, I get no data. It doesn't matter if I have other tables included or not, so it isn't a linking issue.
If I right click a field and choose "browse data", I get nothing, which tells me that somehow Crystal can't read the data at all.
I created the table with the same user name/password that I used when connecting to the database from Crystal Reports.
Any ideas?
(If it matters, I'm using Oracle 10g Release 1 and Crystal Reports XI Release 2. We use a direct Oracle connection, not an ODBC.)
I found the problem. I hadn't yet committed the data on the database.
This taught me something about Oracle. I rarely use the COMMIT command, because when I have created tables and filled them with data, it has been via a CREATE TABLE ... AS SELECT statement, which is DDL and commits implicitly. Those tables have always worked. This time I filled the table via the INSERT ALL command, which is ordinary DML, so I needed an explicit COMMIT before other sessions (such as Crystal Reports) could see the data!
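A minimal sketch of the difference, using placeholder table and column names:

    -- CREATE TABLE ... AS SELECT is DDL and commits implicitly, so the rows
    -- are immediately visible to other sessions such as Crystal Reports:
    CREATE TABLE report_copy AS
    SELECT * FROM source_table;

    -- INSERT ALL is ordinary DML; without an explicit COMMIT the rows are
    -- only visible inside the inserting session:
    INSERT ALL
        INTO report_copy (id, name) VALUES (1, 'first')
        INTO report_copy (id, name) VALUES (2, 'second')
    SELECT * FROM dual;

    COMMIT;  -- required before other sessions can see the inserted data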
