Data area access through SQL - db2-400

I am looking for advice on how to use QSYS2.DATA_AREA_INFO in a particular situation. I have created a few views that select data from multiple tables, and I am trying to fetch data from a data area as well, using the DATA_AREA_INFO table function.
The views need to be installed in a number of data libraries. The CREATE VIEW SQL statement does not have any libraries hardcoded; the tables to pull data from are resolved against the default library set in iSeries Navigator when the views are created. So once a view is created, it permanently points to the tables in that default data library. (I hope this is correct?)
The issue is with fetching the data from the data area:
SELECT DATA_AREA_VALUE
FROM TABLE(QSYS2.DATA_AREA_INFO(
DATA_AREA_NAME => 'TESTDA1',
DATA_AREA_LIBRARY => '*LIBL'))
Writing the statement as above results in the view selecting the data area found in the library list.
But the jobs from which the views will be executed might not have a library list set up, so I can't rely on DATA_AREA_LIBRARY => '*LIBL'.
Is there a way to make the view always point to the same data library (the same way the tables do)?

You could wrap up the data area access in a (service) program that accesses the *DTAARA via ILE. The advantage is that you are able to reuse the program in several ways, both inside and outside an SQL context. You can find information about this technique here:
Scott Klement Powerpoint
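A minimal sketch of how that approach might be surfaced to SQL, assuming an ILE service program DTAARASRV in library MYLIB exporting a procedure GETDTAARA that reads the data area and returns its value (all names here are placeholders, not part of the original answer):

-- Register the ILE procedure as an SQL scalar function
CREATE OR REPLACE FUNCTION MYLIB.GET_TESTDA1 ()
RETURNS CHAR(100)
LANGUAGE RPGLE
NOT DETERMINISTIC
NO SQL
NO EXTERNAL ACTION
EXTERNAL NAME 'MYLIB/DTAARASRV(GETDTAARA)'
PARAMETER STYLE GENERAL

-- The views can then call the function instead of relying on *LIBL, e.g.
SELECT MYLIB.GET_TESTDA1() AS DATA_AREA_VALUE FROM SYSIBM.SYSDUMMY1

Because the library that owns the service program is fixed in the function definition, every view that calls it resolves to the same data area regardless of the job's library list.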

Related

Database diagram from Microsoft Access that is connected to Oracle via ODBC connection

I have Microsoft Access connected to an Oracle database via an ODBC connection. In Access I can now access tables and data from Oracle.
How can I view/construct the data diagram from within Access? I don't have access to the Oracle database itself.
Many thanks.
You can certainly use the diagram tools in Access to lay out and create a relationships diagram. However, that diagram, while "pretty", will of course NOT affect, enforce, or change any of the attributes on the server side. In fact, even when using an Access front end linked to an Access back-end file, you are free to create the diagram in the front end, but you CANNOT change the data structures or set the relationships that exist.
Of course, the above assumes you have first linked all the tables you are going to work with from the Oracle database into Access. Once you have all the tables you plan to work with linked to the Access front end, you can freely launch the relationships tool/window in Access and then proceed to drop in the tables and draw relationships between them. Access will not automatically "pull in" the relationships in the front end, and it will not draw the connecting relationships between the tables for you. However, you can most certainly lay out the tables and draw relationship lines between those linked tables. As noted, any changes you make in this diagram will NOT be reflected in the back-end database.
So while you are free to draw and lay out the tables in Access when using linked tables from an Access database, a SQL Server database, or in this case an Oracle database, such relationship DESIGN changes in ALL CASES need to be made with the tools provided by the back-end database system you have chosen to use with Access as the front end.
To be clear, even with linked tables to an Access database, using the diagram tools in the front end will not make structure or relationship changes to the database. Of course, with an Access back end you could open the back-end database directly, and then yes, you can modify tables and relationships from the relationships window.
So when using linked tables, no data structure or relationship changes can or will occur to the back-end database. However, as noted, you are free to draw a "pretty" diagram with the Access diagram tool and print it out when using an Oracle back-end database. In effect, the relationships tool in Access becomes a diagram tool without the ability to make changes to the linked database in question.

Windows Forms: Best way to store table lookups

I'm developing a new C# .NET 4.5 Windows Forms application and I want to code it "right". I'm developing a couple of user controls that are shared across several tabs. On the controls are some common drop-down boxes that are populated with the same SQL Server table data (one or two columns). I want to read the DB once and have the lookup data available during the entire user session. The app will be used by many users. What's the best way to store this data in my new code? Example code appreciated. A cache? A static list? Help! Thanks!
A simple global DataTable (or DataSet) would do. Or, if you want control over the contents of the list, a SortedDictionary containing your own custom class for each row would suffice.
The custom class is a tidy way of holding a cache (for the data you want from each row), as you can override the ToString function and populate the user controls easily.
Sharing this cache amongst many users is not easy and could prove more trouble than it's worth. With either of the two approaches above, each user running a separate copy of the program would have their own copy of the cache (and the user controls will also contain subsets of this cache). Each program instance needs to load the user controls anyway, so sharing across multiple instances is probably moot.

BIRT Scripted Data Source using existing JDBC DataSource

I know that my overall problem is generally approached using two of the more common solutions: a joint data set, or a sub-table/sub-report. I have looked at those and I am not sure they will work effectively.
Background:
The JDBC data source has local data that includes a series of IDs referencing records in a master data repository, which is accessed via a web service. This is where the need for a scripted data source arises. The data can be filtered on attributes within the local JDBC data and/or the extended data from the web service. The complication is that my only interface to the web service is its ID argument.
Ideal Solution:
Aside from creating a reporting table or other truly desirable scenarios, I am looking to create a unified data source through a single scripted data source that handles all the complexity. This leaves report generation and parameter creation a bit cleaner, hopefully. The idea is to leverage the JDBC query as well as the web-service queries in the scripted data source, doing the filtering and joins there to create that single unified view.
I tried using the following code as a reference for using the existing JDBC connection in the BIRT report definition to execute the query. However, my breakdown of what should go in open vs. fetch may be giving me errors, given that the reference code came from beforeFactory and was written for a completely different purpose. The truth is I see no errors; it just returns 0 records.
a link
I have also found a code snippet that dynamically loads a JDBC connection, but that seems a bit obtuse and a ton of overhead for what I need to do. a link
In short: how in all-that-is-holy do you simply run a query against a database within a scripted data source, if you wanted to? The merit of doing so is another issue, but technically, how?
Thanks in Advance!

How to access data in Dynamics CRM?

What is the best way in terms of speed of the platform and maintainability to access data (read only) on Dynamics CRM 4? I've done all three, but interested in the opinions of the crowd.
Via the API
Via the webservices directly
Via DB calls to the views
...and why?
My thoughts normally center around DB calls to the views but I know there are purists out there.
Given both requirements I'd say you want to call the views. Properly crafted SQL queries will fly.
Going through the API is required if you plan to modify data, but it isn't the fastest approach around because it doesn't allow deep loading of entities. For instance, if you want to look at customers and their orders, you'll have to load both up individually and then join them manually, whereas a SQL query will already have the data joined.
Never mind that the TDS stream is a lot more efficient than the SOAP messages used by the API and web services.
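As a rough sketch of that kind of joined read against the CRM 4 SQL views (the view and column names FilteredAccount, FilteredSalesOrder, ordernumber, and totalamount are assumptions and may differ in your organization):

SELECT a.name, o.ordernumber, o.totalamount
FROM FilteredAccount a
JOIN FilteredSalesOrder o ON o.customerid = a.accountid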
UPDATE
I should point out, in regard to the views and the CRM database in general: CRM does not optimize the indexes on the tables or views for custom entities (how could it?). So if you have a truckload entity that you look up by destination all the time, you'll need to add an index for that property. Depending on your application it could make a huge difference in performance.
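For example, something along these lines (the table and column names new_truckloadExtensionBase and new_Destination are hypothetical; use whatever names your custom entity actually generated):

-- Add a supporting index for the frequent lookup by destination
CREATE NONCLUSTERED INDEX IX_truckload_destination
ON dbo.new_truckloadExtensionBase (new_Destination)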
I'll add to Jake's comment by saying that querying against the tables directly instead of the views (*Base and *ExtensionBase) will be even faster (see the sketch after the list below).
In order of speed it'd be:
direct table query
view query
filtered view query
API call
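To illustrate the "direct table query" at the top of that list: in CRM 4 an entity's record is split across a base table and an extension table that have to be joined manually, which is exactly the work the views otherwise do for you. A sketch (new_SomeCustomField is a hypothetical custom attribute):

SELECT b.Name, e.new_SomeCustomField
FROM AccountBase b
JOIN AccountExtensionBase e ON e.AccountId = b.AccountId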
Direct table updates:
I disagree with Jake that all updates must go through the API. The correct statement is that going through the API is the only supported way to do updates. There are in fact several instances where directly modifying the tables is the most reasonable option:
One time imports of large volumes of data while the system is not in operation.
Modification of specific fields across large volumes of data.
I agree that this sort of direct modification should only be a last resort when the performance of the API is unacceptable. However, if you want to modify a boolean field on thousands of records, doing a direct SQL update to the table is a great option.
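A sketch of what that kind of one-off fix might look like (new_IsMigrated is a hypothetical custom boolean field on the extension table; this is unsupported, so back up and test first):

UPDATE AccountExtensionBase
SET new_IsMigrated = 1
WHERE new_IsMigrated = 0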
Relative Speed
I agree with XVargas as far as relative speed.
Unfiltered Views vs Tables: I have not found the performance advantage to be worth the hassle of manually joining the base and extension tables.
Unfiltered views vs. filtered views: I was recently working with a complicated query that took about 15 minutes to run using the filtered views. After switching to the unfiltered views, the same query ran in about 10 seconds. Looking at the respective query plans, the raw query had 8 operations while the query against the filtered views had over 80.
Unfiltered Views vs API: I have never compared querying through the API against querying views, but I have compared the cost of writing data through the API vs inserting directly through SQL. Importing millions of records through the API can take several days, while the same operation using insert statements might take several minutes. I assume the difference isn't as great during reads but it is probably still large.

Can Core Data content be edited directly?

I've been using Core Data for about a week now, and really loving it, but one minor issue is that setting default values requires setting up a temporary interface to load the data, which I then do away with once I have the data seeded. Is there any way to edit values in a table, like how you can use phpMyAdmin to manipulate values in a MySQL database? Alternatively, is there a way to write a function to import seed values from something like a Numbers spreadsheet if it doesn't detect the storedata XML file?
For your first question, you could edit the file directly but it's highly recommended you don't. How to edit it depends entirely on the store type you selected.
Regarding importing or setting up pre-generated data, of course: you can write code to manually insert entity instances into your Managed Object Context. There's a dedicated section on this very topic in the documentation. Further, if you have a lot of data to import, there's even a section on how to do this efficiently.
Is there any way to edit values in a table, like how you can use phpMyAdmin to manipulate values in a MySQL database?
Xcode has a means of creating a quick-and-dirty interface for a data model. You just drag the data model file into a window in Interface Builder and it auto-generates an interface for you. This lets you view the data without having to have your entire app up and running.
