Is there a way to externalize report queries for BIRT reports? We need to support multiple database engines, so our queries differ depending on the underlying database. I would like to use a config parameter to tell the BIRT report to use a specific query file.
Sure you can. If you code up some JavaScript in the report itself, it can read files from disk to retrieve the query text and modify the query before it's executed.
The event you need to code for is beforeOpen on the data source. We actually use this for wildcarding parameters: we detect whether they're set to "*" and dynamically adjust the SQL query, changing it from:
select A from B where C = ?
to:
select A from B where ((C = ?) or (1=1))
The ugly modification is just so we don't have to worry about changing the positional parameters.
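For illustration, here is a minimal sketch of that kind of beforeOpen script. The parameter name pC and the "*" marker are assumptions; queryText is the same query string the file-reading snippet below also modifies:

// Sketch only (assumed parameter name "pC"): if the user asked for "*",
// neutralize the corresponding WHERE condition by OR-ing in an always-true clause.
if (params["pC"].value == "*") {
    queryText = queryText.replace(
        "C = ?",
        "((C = ?) or (1=1))");
}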
You can read a line from a disk file and change the query with something like:
try {
    // Open the file that holds the query text
    var fip0 = new Packages.java.io.FileInputStream("/query.txt");
    try {
        var fip1 = new Packages.java.io.DataInputStream(fip0);
        try {
            // Use the first line of the file as the query
            queryText = fip1.readLine() + "";
        } catch (e1) {}
        fip1.close();
    } catch (e2) {}
    fip0.close();
} catch (e3) {}
although you should probably have better error checking than that :-) I removed it as it's (1) somewhat large; and (2) somewhat proprietary.
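To tie this back to the original question, the path to the query file does not have to be hard-coded; it can come from a report parameter (or a value passed in via the app context). A minimal sketch, assuming a report parameter named queryFile:

// Sketch (assumed report parameter "queryFile"): read the whole query from the
// file named by the parameter, then hand it to BIRT.
var fileName = params["queryFile"].value;
var reader = new Packages.java.io.BufferedReader(
        new Packages.java.io.FileReader(fileName));
var line, sql = "";
while ((line = reader.readLine()) != null) {
    sql += line + "\n";
}
reader.close();
queryText = sql;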
I do not know of a way to do this out of the box. You could probably dream up some fairly complex scripting to fire off on the onLoad event of the Data Set.
How about placing the same stored procedure in each DB? Then you can parameterize the data connection information (on the Edit page for the report Data Source) and dynamically direct your report to query a specific data source. As long as the stored proc is on all instances, you will get the correct data from the correct source.
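If you go this route, the connection itself can also be switched at run time from the data source's beforeOpen script. A hedged sketch, assuming the standard JDBC ODA property names and report parameters pUrl, pUser and pPassword (all names are assumptions):

// Sketch: point the JDBC data source at whichever database this run should hit.
this.setExtensionProperty("odaURL", params["pUrl"].value);
this.setExtensionProperty("odaUser", params["pUser"].value);
this.setExtensionProperty("odaPassword", params["pPassword"].value);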
In the following Microsoft documentation:
https://learn.microsoft.com/en-us/sql/ado/guide/appendixes/microsoft-data-shaping-service-for-ole-db-ado-service-provider?view=sql-server-2017
this feature is being removed and the suggestion is to use XML instead. Has anyone done this? I'm wondering what they mean: rebuilding the structure that MSDataShape produces by using XML, or just using XML objects?
TIA
I believe this is referring to the FOR XML clause of T-SQL, which performs much the same job as MSDataShape in that it returns hierarchically nested data.
Port your MSDataShape queries to FOR XML queries and change the client to parse the results instead of using the MSDataShape OLEDB provider.
At the client side, SAX or pull parsing would be the best fit to port code that previously used MSDataShape (which also had a move-through-the-records cursor based model).
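For example, a parent/child MSDataShape query can typically be expressed as a nested FOR XML PATH query; a sketch with assumed table and column names:

SELECT  c.CustomerID,
        c.Name,
        (SELECT o.OrderID, o.OrderDate
         FROM   Orders o
         WHERE  o.CustomerID = c.CustomerID
         FOR XML PATH('Order'), TYPE)
FROM    Customers c
FOR XML PATH('Customer'), ROOT('Customers');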
Here is a bit of code that may be helpful. My MSDataShape code still works, so I suggest using it to generate your XML as a template, then using that template going forward to load the recordsets:
Dim objShapeMaker As clsShapeMaker
Dim rsoTemp As ADODB.Recordset
Dim strXMLTemplate As String

' Template file
strXMLTemplate = "C:\Temp\Template_GI.xml"

' Create the MSDataShape and save it to XML
Set objShapeMaker = New clsShapeMaker
Set rsoTemp = objShapeMaker.CreateGI()
rsoTemp.Save strXMLTemplate, adPersistXML

' Now that we have the XML in a file, load it into my recordset going forward
Set rsoTemp = New ADODB.Recordset
rsoTemp.Open strXMLTemplate, , , , adCmdFile

' Cleanup
Set rsoTemp = Nothing
Set objShapeMaker = Nothing
If you don't like the idea of generating XML template files to maintain, you could do this via .NET and expose it to COM to use in your VB6/VBA application as mentioned here.
I have made a .NET application, similar to the blog listed, that can generate these XML files from a few simple lines of code, should anyone want it going forward; unlike that version, it handles child recordsets with relationships.
EDIT 1: This works great if you have schema set-ups without returning data. As far as I can tell, to populate these effectively it's better to write code that loads the structure first and then populates it from separate recordsets (which is slower!).
EDIT 2: This is the approach we are taking with a replacement in a .NET interop. Initially we are looking at bringing XML back from SQL and parsing it as required. It could also be brought back into a DataSet and parsed into the target recordset from there, but then the relationships between the tables in the resulting DataSet need to be set in code rather than in one place in the T-SQL with XML output.
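If you take the DataSet route mentioned above, one option (a sketch only; table, column and variable names are assumptions) is to stream the FOR XML result straight into a DataSet and let ReadXml build the tables:

// Sketch: requires System.Data, System.Data.SqlClient and System.Xml.
// connectionString is a placeholder for your own connection string.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
        "SELECT CustomerID, Name FROM Customers FOR XML PATH('Customer'), ROOT('Customers')",
        conn))
{
    conn.Open();
    using (System.Xml.XmlReader reader = cmd.ExecuteXmlReader())
    {
        var ds = new DataSet();
        ds.ReadXml(reader);   // infers DataTables (and relations, if nested) from the XML
    }
}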
I am trying to do string replace on entries of a column inside a db table. So far, I have reached till here:
$misa = DB::table('mis')->pluck('name');
for($i=0;;$i++)
{
$misa[$i] = substr_replace("$misa[$i]","",-3);
}
The error I am getting is "Undefined offset:443".
P.S. I am not a full-fledged programmer. Only trying to develop a few simple programs for my business. Thank You.
Since it's a collection, use the transform() collection method to transform it and avoid this kind of error. You can also use the str_before() helper to trim each string:
$misa = DB::table('mis')->pluck('name');

$misa->transform(function ($i) {
    return str_before($i, ':ut');
});
There are a few ways to make this query prettier and FASTER! The beauty of Laravel is that you have Eloquent for pretty queries and Collections to manage the data in a user-friendly way. So, first let's clean up the query. You can use a raw select and do all of the string replacing in the query itself, like so:
$misa = DB::table('mis')->select(DB::raw("REPLACE(name, ':ut', '') as name"))->get();
Now we have a collection containing only the name column, with ':ut' replaced by an empty string, all within the MySQL query itself.
Surprise! That's it. No further PHP manipulation is required, making this process much faster (it will be noticeable in large data sets - trust me).
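For example, the cleaned names can then be read straight off the result objects (a minimal usage sketch):

foreach ($misa as $row) {
    echo $row->name;
}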
Cheers!
Every time my page comes up it executes the table query. I tried doing this,
https://blogs.oracle.com/shay/entry/preventing_queries_when_page_f
but it still happens. How can I fix this? I've tried setting the refresh condition to never, but then I can't get any data back. I'm using JDev 11g.
Thanks.
I can share how I did this in JDeveloper 12.1. I am not sure whether it works in 11g. Nor am I sure it's the best way to do it. But it worked for me.
The idea is to not prevent the initial query execution, but, rather, to make sure it happens quickly and returns no data. The approach is to set the initial query criteria in the View Instance of your Application Module.
This approach assumes you have at least one bind variable in your view object. If you do not, you will need to add one just for this.
Double click on the Application Module
Click on the "Overview" tab at the bottom
Click on the "Data Model" tab on the left, so that you are viewing the Data Model Components of your Application Module
In the "Data Model" tree on the right, click once on the View Object Usage ("MyObjectVO1", e.g.).
Click the "Edit..." in the upper-right corner of the Data Model tree.
Here you can specify view criteria to be used initially for the page. So:
Set an initial parameter value that will (A) execute quickly and (B) return no data.
Then, later, programmatically reset the bind variables to correct values that will return data, when you are ready to let your query run (see the sketch below).
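When that time comes, a managed-bean method can swap in the real bind value and re-run the query. A minimal sketch; the iterator binding name MyObjectVO1Iterator and the bind variable pCustomerId are placeholders, not names from the question:

// Sketch only: binding and bind-variable names are assumptions.
import oracle.adf.model.BindingContext;
import oracle.adf.model.binding.DCBindingContainer;
import oracle.adf.model.binding.DCIteratorBinding;
import oracle.jbo.ViewObject;

public void showRealData() {
    DCBindingContainer bindings =
            (DCBindingContainer) BindingContext.getCurrent().getCurrentBindingsEntry();
    DCIteratorBinding iter = bindings.findIteratorBinding("MyObjectVO1Iterator");
    ViewObject vo = iter.getViewObject();

    // Replace the "return nothing quickly" default with a value that returns real rows
    vo.setNamedWhereClauseParam("pCustomerId", 1234);
    vo.executeQuery();
}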
The simplest way I know is to correctly set up the "Refresh" attribute on the iterator in the page definition.
To do that, set Refresh="ifNeeded", then set RefreshCondition to an expression-language value, something like this: #{viewScope.yourBean.refresh}
public class YourBeanClass {

    private boolean refresh = false;

    public void someAction(ActionEvent actionEvent) {
        // some custom logic that decides the model should now be refreshed
        refresh = true;
    }

    public boolean isRefresh() {
        return refresh;
    }
}
With this code you can simply manage the moment when you need to refresh your model.
While these solutions will work to prevent the query from executing, I would ask what is the use case that a user sees a table appear on a web page without doing a search first?
ADF is built to query automatically because a web page will normally have an af:query or af:quickQuery component on it to let the user enter a query. Once the query is executed, the results are returned and populate the table or form (depending on the page design). Including either of these search components on the page will prevent the page from executing the query until the search is executed. You can also execute a query on entry to the page by using a task flow and adding an Execute operation from the data control to the navigation into (or prior to reaching) the page.
Note that Shay's blog post is from 2009 and covers the use case of not using a search component and instead uses Execute with Params. Is this what you are using? Saying more about your use case would be helpful.
I have several BIRT reports that I am trying to set up to run on a cron job that will email PDFs of the reports every morning. Everything is working fine as far as the generation and emailing goes; the only issue I am stuck on is this: if there is nothing to report, a PDF with just the report title is generated and emailed (a blank report, basically). I'd like to stop this report from being generated at all, so that if the PDF file does not exist I can skip the emailing.
I have been all over Google for two days now, and the closest I can find is this: http://www.eclipse.org/forums/index.php/t/458779/ in which someone was trying to solve a similar problem and received a push in the right direction, but not a complete solution.
It appears as if this can be done during the beforeRender script... but how?
I know I need to:
set a persistent global variable in onCreate if there is indeed data to report,
get the persistent global variable in the beforeRender script, and
send the magic "don't generate the report" command.
I'm doing all of the generating and emailing from a PHP script, not Java, so I can't send commands like IEngineTask.cancel() (or can I???).
Yes, I know I can add a row to the report that says "No data to report", but that's not what my users want.
And yes, I could query the database outside of the report to determine whether there is valid data to report, but I'd prefer not to.
And maybe I could even open and read the PDF programmatically to see if there is anything there, but that sounds like more of a hassle than it's worth...
So, how do I do this?
Thanks.
My answer is a little bit late, but I'm doing it like this in a framework that is working for hundreds of reports, probably it could be simplified for a single report:
Note that all the code is written from memory (not copied from our framework), so maybe it contains some errors.
Add an external Javascript file myframework.js to your report.
In this file, define an object myframework like this:
if (typeof myframework == "undefined") {
    myframework = {
        dataFound: false,

        afterReport: function () {
            // Write the flag to the appContext.
            // Using Java, you could read it after the
            // runAndRenderTask is done.
            reportContext.getAppContext().put("dataFound", this.dataFound);

            // But since you probably cannot read the appContext
            // (don't like coding Java?), the report has to
            // tell the world some other way...
            var txt = "dataFound=" + (this.dataFound ? "true" : "false");
            var fw = new java.io.FileWriter("c:\\reportcontext.out");
            fw.write(txt);
            fw.close();
        }
    };
}
Add the JS file to your report's resources.
In your report, at a place where you decide that the report has found something (e.g. typically in a data set's onFetch event), tell the framework so by calling
myframework.dataFound = true;
In the report's afterFactory or afterRender event, call
myframework.afterReport();
Then your report should create an output file c:\reportcontext.out which contains the information you need.
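On the PHP side, the cron script could then read that flag file before deciding whether to email. A minimal sketch; the file paths and the mail_report() helper are placeholders for your existing code:

// Sketch: only email the PDF when the report said it found data.
$flag = @file_get_contents('c:\\reportcontext.out');
if ($flag !== false && strpos($flag, 'dataFound=true') !== false) {
    mail_report('/path/to/report.pdf');   // your existing emailing routine
} else {
    @unlink('/path/to/report.pdf');       // optionally discard the blank PDF
}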
I see articles on using SPMetal to generate the .cs file that allows LINQ to work properly. The file I'm talking about inherits from the Microsoft.SharePoint.Linq.DataContext class. How can I use LINQ without recompiling on my production environment, since I would need to regenerate this file using SPMetal on my production environment? I suspect the answer is going to be "can't do it".
I guess I'll use a CAML query instead unless there is some easier way to use LINQ that I am missing.
If the objective is just to query lists using LINQ and you want to avoid such recompilations, do not use SPMetal.
LINQ can be used directly on an SPListItemCollection, e.g.:
var FindCustomer = from SPListItem Item in Customers.Items
                   where (int)Item["Orders"] == 5
                   select Item;
                   // or: select new { Title = Item["Title"] }
This does not use hard-coded entities but is more flexible. As long as your list column names remain the same, the code can be deployed on any environment, even if other lists are changing.
Also, you can choose to retrieve data for only a few chosen fields instead of retrieving data for all the fields every time.
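For example, a hedged sketch of limiting the columns fetched before the LINQ part runs (the list, field and variable names are assumptions):

// Sketch: fetch only Title and Orders, then filter with LINQ-to-Objects.
SPQuery query = new SPQuery();
query.ViewFields = "<FieldRef Name='Title'/><FieldRef Name='Orders'/>";
query.ViewFieldsOnly = true;

SPListItemCollection items = Customers.GetItems(query);
var bigCustomers = from SPListItem item in items
                   where (int)item["Orders"] == 5
                   select new { Title = item["Title"] };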
There is no problem, I guess. Personally, I have been using LINQ for a good amount of time and I never generated the .cs file specifically for production. Is your site structure different across environments?
I'm not sure if I'm missing the point or not, but the DataContext object takes the URL as part of its constructor, so you should retrieve the URL from config somewhere (e.g. a database):
DataContext teamSite = new DataContext("http://MarketingServer/SalesTeam");
OR use the SPContext object if your code has a SharePoint context, e.g. in a web part:
DataContext teamSite = new DataContext(SPContext.Current.Web.Url);
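Either way, the entity classes SPMetal generated once keep working as long as the list schema matches; only the URL changes per environment. A hedged sketch, assuming an appSettings key named SiteUrl and an SPMetal-generated Customer entity with an Orders property (all of these names are assumptions):

// Sketch: reuse the compiled SPMetal entities; only the site URL comes from config.
string siteUrl = System.Configuration.ConfigurationManager.AppSettings["SiteUrl"];
using (DataContext teamSite = new DataContext(siteUrl))
{
    EntityList<Customer> customers = teamSite.GetList<Customer>("Customers");
    var results = from c in customers
                  where c.Orders == 5
                  select c;
}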