Accessing SQL array element from Libreoffice Basic - libreoffice-base

I have a PostgreSQL database that contains program data. In LibreOffice Calc, I have Basic macros that interact with the PostgreSQL database and use Calc as the user client. One of the PostgreSQL tables has an array column, and I can't index into that array directly from Basic.
Here is the table setup, as shown in pgAdmin:
sq_num integer,
year_start integer,
id serial NOT NULL,
"roleArray" text[]
Say I want to SELECT roleArray[50]. Every attempt to do this from Basic results in the entire array being returned. I can certainly split the array myself and get the element I'm after, but I was using SQL arrays to help automate this stuff.
My Basic code uses a LibreOffice Base file for the connection to the PostgreSQL database. In the Base file itself, I cannot create a query that selects an individual element rather than the entire array unless I select the button "Run SQL command directly" and run this query:
SELECT "roleArray"['50'] FROM myTableThatHasArrays
Then I get element 50 from every record as intended.
I believe there is a bug report describing this: the Base query parser can't handle indexing into an array. My question is: what is the best way to work around this?
Ideally, I would be able to index an element of the SQL array directly from Basic.

It sounds like you used XRow.getString, which (sensibly enough) retrieves the array as a single large string. Instead, use XRow.getArray and then XArray.getArray. Here is a working example:
sSQL = "SELECT id, ""roleArray""[2] FROM mytablethathasarrays;"
oResult = oStatement.executeQuery(sSQL)
s = ""
Do While oResult.next()
sql_array = oResult.getArray(2)
basic_array = sql_array.getArray(Null)
s = s & oResult.getInt(1) & " " & basic_array(1) & CHR$(10)
Loop
MsgBox s

Related

Can I use expression builder to return a SQL query result to a variable?

I am using a piece of software, pc/mrp, which appears to have a built-in Visual FoxPro report editor for FRX files. It also makes external use of an EF file. Based on some Googling, the report designer seems to be the standard one, not custom; the EF file usage may be a custom thing. Now, I need to find a way to get access to a value from a SQL statement inside the report. The statement needs to run per line in the report.
EF:
This file has sections:
~in~
~out~
In these sections, I can run code, but if there is a ~perline~ type section, I don't know how to access it. I can use the ~in~ to try to create a relationship between the databases, as shown in the following example:
~IN~
THISAREA = SELECT()
USE PARTMAST ORDER BYPARTNO IN 0
SELECT (THISAREA)
SET RELATION TO PARTNO INTO PARTMAST ADDITIVE
GO TOP
~OUT~
USE IN SELECT("SALES")
But for this I don't know how to join the databases. I have two databases (A, B) that I need to connect based on two fields (pono, line). If (A.pono and A.line) = (B.pono and B.line), then they would be linked. Is this possible?
Report Designer:
The other way I see this working is to do the query inside the report designer. Inside report properties is a variable tab. I can use this to assign to variables using expressions. I need:
SELECT field from B where B.pono = pono and B.line = line; INTO ARRAY varArray;
But it gives me an error, likely because this is trying to create a new variable as opposed to actually assigning to the variable in the report. I tried editing a field inside the designer to use the preceding code as well, but that also failed.
Is there a way using the report designer or the ef file to grab the data I need per line?
The sample code you show is doing something like a join with the SET RELATION command. To use SET RELATION, there has to be an index on the relevant field (expression) in the child table. So, if your table B has an index on PONO + LINE (or, if those are numeric, STR(PONO, length) + STR(LINE, length)), you can SET RELATION TO PONO + LINE INTO B, again, using the more complicated expression if necessary.
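If it helps to see the link written as a query instead of a relation, a plain two-column join expresses the same condition. This is only a generic SQL sketch using the placeholder names from the question (A, B, pono, line, field), not part of the SET RELATION approach above:
SELECT A.pono, A.line, B.field
FROM A
INNER JOIN B
    ON B.pono = A.pono
    AND B.line = A.line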

SSRS reports in Dynamics CRM: pass the current record id as a parameter

Hi, for my SSRS report in Dynamics CRM I require that when the report is run against a certain record, the record passes its record id so that only the relevant results are displayed.
How is this possible? Steps provided would be great also.
Reports are created using SQL.
Run against campaign entity
SSRS reports for CRM have special parameters that enable this for you. To filter by selected records (or the current record you have open) you can utilise a hidden parameter called "CRM_Filtered[Entity]" where entity is the relevant entity you are linking the report to.
In your case, i.e. for the campaign entity, this hidden parameter will be called CRM_FilteredCampaign. For a SQL report this will be a text parameter and will be set to something like this (set by CRM when you run the report):
select campaign0.* from FilteredCampaign as "campaign0"
I do not have a report to hand to check exactly what the SQL will contain, so it might not be exact. But you get the idea. There are several ways to embed this in your report, but you could do so in a rudimentary fashion like this in a dataset:
declare @sql as nVarchar(max)
set @sql = 'SELECT c.campaignid FROM (' + @CRM_FilteredCampaign + ') as c'
exec(@sql)
Expanding on this: rather than executing text SQL in your main dataset, you can simplify the usage by creating a dataset/parameter combo based on the text. In effect, convert the SQL text to a list of values instead.
So add the above SQL to its own DataSet (for this example called DS_FilteredCampaign).
Once you have created DS_FilteredCampaign, make sure you click the Refresh Fields button. For the parameter value, type in the following instead of <null>:
select c.* from FilteredCampaign as c
Once that comes back click on Ok to save the DataSet.
Next, create another hidden text parameter (e.g. Int_FilteredCampaign) and tell it to get its default value from a DataSet (not its available values, its default value). Point the values at DS_FilteredCampaign, and you should be able to select campaignid as the value field. This in effect makes the parameter an array of ids that you can reference in your main DataSet.
Now it's much more usable as you can reference it in your SQL something like this in your main DataSet:
select c.*
from FilteredCampaign c
inner join ActivityPointer ap on ...
inner join FilteredAccount a on ...
where c.campaignid in (@Int_FilteredCampaign)
The important piece being where c.campaignid in (@Int_FilteredCampaign)
Summary Steps:
You have a main DataSet called something like dsMain
Create a new parameter called CRM_FilteredCampaign
Create a DataSet (DS_FilteredCampaign) that executes the SQL passed into CRM_FilteredCampaign
Refresh Fields on the data set to get the campaignid field
Create a text parameter (Int_FilteredCampaign) that retrieves its default value using the new dataset (DS_FilteredCampaign) using campaignid for the value
Reference this new parameter in your dsMain dataset

Parsing large txt files in ruby taking a lot of time?

Below is the code to download a txt file (approx. 9000 lines) from the internet and populate the database. I have tried a lot, but it takes more than 7 minutes. I am using Windows 7 64-bit and Ruby 1.9.3. Is there a way to do it faster?
require 'open-uri'
require 'dbi'

dbh = DBI.connect("DBI:Mysql:mfmodel:localhost", "root", "")
# file = open('http://www.amfiindia.com/spages/NAV0.txt')
file = File.open('test.txt', 'r')
lines = file.lines
2.times { lines.next }

curSubType = ''
curType = ''
curCompName = ''

lines.each do |line|
  line.strip!
  if line[-1] == ')'
    curType, curSubType = line.split('(')
    curSubType.chop!
  elsif line[-4..-1] == 'Fund'
    curCompName = line.split(" Mutual Fund")[0]
  elsif line == ''
    next
  else
    sCode, isin_div, isin_re, sName, nav, rePrice, salePrice, date = line.split(';')
    sCode = Integer(sCode)
    sth = dbh.prepare "call mfmodel.populate(?,?,?,?,?,?,?)"
    sth.execute curCompName, curSubType, curType, sCode, isin_div, isin_re, sName
  end
end

dbh.do "commit"
dbh.disconnect
file.close
106799;-;-;HDFC ARBITRAGE FUND RETAIL PLAN DIVIDEND OPTION;10.352;10.3;10.352;29-Jun-2012
This is the format of the data to be inserted into the table. Now there are 8000 such lines; how can I do an insert by combining all of that and calling the procedure just once? Also, does MySQL support arrays and iteration to do such a thing inside the routine? Please give your suggestions. Thanks.
EDIT
I have to make insertions into the tables depending on whether the rows already exist or not, and I also need to make use of conditional comparisons before inserting into the table. I definitely can't write plain SQL statements for these, so I wrote SQL stored procedures. Now I have a list @the_data; how do I pass that to the procedure and then iterate through it all on the MySQL side? Any ideas?
insert into mfmodel.company_masters (company_name) values
#{@the_data.map {|str| "('#{str[0]}')"}.join(',')}
This makes 100 insertions, but 35 of them are redundant, so I need to search the table for existing entries before doing an insertion.
Any ideas? Thanks.
From your comment, it looks like you are spending all your time executing DB queries. On a recent Ruby project, I also had to optimize some slow code which was importing data from CSV files into the database. I got about a 500x performance increase by importing all the data with a single bulk INSERT query, rather than one query for each row of the CSV file. I accumulated all the data in an array, and then built a single SQL query using string interpolation and Array#join.
From your comments, it seems that you may not know how to build and execute dynamic SQL for a bulk INSERT. First get your data in a nested array, with the fields to be inserted in a known order. Just for an example, imagine we have data like this:
some_data = [['106799', 'HDFC FUND'], ['112933', 'SOME OTHER FUND']]
You seem to be using Rails and MySQL, so the dynamic SQL will have to use MySQL syntax. To build and execute the INSERT, you can do something like:
ActiveRecord::Base.connection.execute(<<SQL)
INSERT INTO some_table (a_column, another_column)
VALUES #{some_data.map { |num,str| "(#{num},'#{str}')" }.join(',')};
SQL
You said that you need to insert data into 2 different tables. That's not a problem; just accumulate the data for each table in a different array, and execute 2 dynamic queries, perhaps inside a transaction. 2 queries will be much faster than 9000.
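If you do put the two bulk inserts inside a transaction, in MySQL terms that is simply the following (the table and column names below are invented for illustration; only company_masters comes from the question):
START TRANSACTION;
INSERT INTO company_masters (company_name) VALUES ('HDFC'), ('SOME OTHER FUND');
INSERT INTO scheme_masters (scheme_code, scheme_name) VALUES (106799, 'HDFC ARBITRAGE FUND');
COMMIT;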
Again, you said in the comments that you may need to update some records rather than inserting. That was also the case in the "CSV import" case which I mentioned above. The solution is only slightly more complicated:
# sometimes code speaks more eloquently than prose
require 'set'

already_imported = Set.new
MyModel.select("unique_column_which_also_appears_in_imported_files").each do |x|
  already_imported << x.unique_column_which_also_appears_in_imported_files
end

to_insert, to_update = [], []
imported_data.each do |row|
  # for the following line, don't let different data types
  # (like String vs. Numeric) get ya
  # if you need to convert the imported data to match correctly against what's
  # already in the DB, do it!
  if already_imported.include? row[index_of_unique_column]
    to_update << row
  else
    to_insert << row
  end
end
Then you must build a dynamic INSERT and a dynamic UPDATE for each table involved. Google for UPDATE syntax if you need it, and go wild with all your favorite string processing functions!
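One alternative the answer above does not cover: if the target table has a UNIQUE key on the column you are deduplicating on, MySQL can do the insert-or-update in a single statement with INSERT ... ON DUPLICATE KEY UPDATE. A sketch, reusing the company_masters example from the question and assuming company_name is unique:
INSERT INTO mfmodel.company_masters (company_name)
VALUES ('HDFC'), ('SOME OTHER FUND')
ON DUPLICATE KEY UPDATE company_name = VALUES(company_name);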
Going back to the sample code above, note the difference between numeric and string fields. If it is possible that the strings may contain single quotes, you will have to make sure that all the single quotes are escaped. The behavior of String#gsub may surprise you when you try to do this: it assigns a special meaning to \'. The best way I have found so far to escape single quotes is: string.gsub("'") { "\\'" }. Perhaps other posters know a better way.
If you are inserting dates, make sure they are converted to MySQL's date syntax.
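For dates in the 29-Jun-2012 format shown in the sample line, one option (my suggestion, not from the answer) is to let MySQL do the conversion with STR_TO_DATE:
SELECT STR_TO_DATE('29-Jun-2012', '%d-%b-%Y');
-- returns 2012-06-29, i.e. MySQL's native DATE format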
Yes, I know that "roll-your-own" SQL sanitization is very iffy. There may even be security bugs with the above approach; if so, I hope my better-informed peers will set me straight. But the performance gains are just too great to ignore. Again, if this can be done using a prepared query with placeholders, and you know how, please post!
Looking at your code, it looks like you are inserting the data using a stored procedure (mfmodel.populate). Even if you do want to use a stored procedure for this, why do you have dbh.prepare in the loop? You should be able to move that line outside of lines.each.
You might want to try exporting the data as csv and loading it with 'load data infile... replace'. It seems cleaner/easier than trying to construct bulk insert queries.
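A rough sketch of that approach; the file name, staging table, and column list are hypothetical, and REPLACE only helps if the table has a PRIMARY or UNIQUE key to detect duplicates:
LOAD DATA LOCAL INFILE 'nav0.txt'
REPLACE INTO TABLE mfmodel.nav_import
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\n'
(scheme_code, isin_div, isin_re, scheme_name, nav, repurchase_price, sale_price, nav_date);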

SQLITE3 strings in where clauses seem confused

I'm wondering if anyone has any clarification on the difference between the following statements using sqlite3 gem with ruby 1.9.x:
#db.execute("INSERT INTO table(a,b,c) VALUES (?,?,?)",
some_int, other_int, some_string)
and
#db.execute("INSERT INTO table(a,b,c) VALUES (#{some_int},"+
+"#{some_int}, #{some_string})")
My problem is: When I use the first method for insertion, I can't query for the "c" column using the following statement:
SELECT * FROM table WHERE c='some magic value'
I can use this:
"SELECT * FROM table WHERE c=?", "some magic value"
but what I really want to use is
"SELECT * FROM table WHERE c IN ('#{options.join("','")}')"
And this doesn't work with the first type of insert.
Does anyone know what the difference is at the database level that is preventing the IN from working properly?
I figured this out quite a while ago, but forgot to come back and point it out, in case someone finds this question at another time.
The difference turns out to be blobs. Apparently, when you use the first form above (the substitution method with (?,?,?)), SQLite3 uses blobs to store the data. However, if you construct an ordinary SQL statement, the value is inserted as a regular string, and the two aren't equivalent.
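One way to confirm what actually got stored (my own suggestion, not from the answer) is SQLite's typeof() function:
SELECT c, typeof(c) FROM "table";
-- rows written via bound parameters may report 'blob' here, while string-built inserts report 'text'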
Inserting with a raw (string-built) query is not possible, but a raw query does work when fetching data. If you use SQLite in a mobile app the raw query does not work, but the same raw query does work when you run it in SQLite Browser.

Generate reports in birt using user input. When input is null, everything should be fetched otherwise corresponding data should be shown

I have to create a BIRT report with user input parameters. It is something like this: when the parameter is left blank it should fetch all values from a table; otherwise, when the user inputs the student's roll no., the corresponding data should be fetched. Can this be done through a BIRT report? If yes, then please suggest a way.
Thanks!
Yes, you can do that. If the parameter is optional you can't use the Dataset Parameter (with a ? in your query), because it will be null. Instead you have to modify your query using JavaScript.
Create a Report Parameter like usual, in this case 'stud_no'. Then add a comment in your SQL that you are reasonably sure is unique, I use something like --$stud_no$, wherever you want your clause inserted.
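As an illustration (the table and column names are made up; only the --$stud_no$ marker comes from the approach described here), the dataset query might start out like this, with the comment sitting where the clause should be injected:
select *
from students
where 1 = 1
--$stud_no$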
Then add a script like this to your Data Set, in beforeOpen:
if (params["stud_no"].value){
this.queryText = this.queryText.replace("--$stud_no$", "and stud_no = " + params["stud_no"]);
}
This replaces the comment with the clause when the parameter has a value. You can use regex in the search string, and then you can also insert it multiple places if you want.
Alternatively, create your parameter using a LIKE statement:
where students_roll_no like ?
Create your report parameter as a text box, with a default value of %.
Because the percent sign '%' is the SQL wildcard, this returns all values. If the user enters a student roll number, it returns only that record. Additionally, the user can enter 0500% and get all the records that begin with 0500.
