Excel 2003: Why does creating links to other spreadsheets take so long? - performance

My Excel application creates links to other Excel files. It takes approximately 1 second to create a link for a single cell, but I have several hundred cells to link, so it's painfully slow.
There is no difference in speed whether the source file is opened or closed. I have, however, noticed that creating links manually (e.g., by copying and pasting the formula containing the link into other cells by hand) is much faster while the source file is opened.
In my program I have set Application.Calculation to xlCalculationManual instead of xlCalculationAutomatic.
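Roughly, something like this around the linking code (restoring the setting afterwards is my addition, not from the question):
Application.Calculation = xlCalculationManual
' ... create the links here ...
Application.Calculation = xlCalculationAutomatic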
I tried to accomplish the linking in two ways (note this is not the real code, but I hope it shows what I mean):
1.
Copy the link formula to the external source from Range("A1") to every cell in the target range, using a For Each loop:
For Each Cell In Range("thisIsMyTargetRange")
    Cell.FormulaR1C1 = Range("A1").FormulaR1C1
Next Cell
2.
Copy the formula from Range("A1") to the whole range at once, without a For Each loop:
Range("thisIsMyTargetRange").FormulaR1C1 = Range("A1").FormulaR1C1
Both methods are equally slow, and as I said, I've checked both with the source file open and with it closed.
Is there any way to speed this up? Does anyone know enough about the linking mechanism in Excel 2003 to offer advice on how to improve the linking performance?

Do you actually need live formulas? If not, use .Value instead of .FormulaR1C1.
Also, if the ranges are the same size, use a single range copy instead of the loop, e.g.:
Range("A1:A500").Value = Range("A1:A500").Value
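Putting both suggestions together, a minimal sketch that pulls values across in one assignment rather than linking cell by cell (the path, sheet names, and range are placeholders, not from the question):
Sub CopyValuesInBulk()
    Dim src As Workbook
    Application.Calculation = xlCalculationManual
    ' Open the source read-only, copy the values in one assignment, close it again.
    Set src = Workbooks.Open("C:\data\source.xls", ReadOnly:=True)
    ThisWorkbook.Sheets("Sheet1").Range("A1:A500").Value = _
        src.Sheets("Sheet1").Range("A1:A500").Value
    src.Close SaveChanges:=False
    Application.Calculation = xlCalculationAutomatic
End Sub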
Added Alternative Method:
Also, if a whole sheet is what you're after, an alternative method would be to create a table using the data import function, refreshed when the file opens. Then, if you want bits of that sheet, use the data in the imported sheet/table. This keeps the updating/linking down to one large import when the file opens.
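A rough sketch of setting up such an import with a QueryTable (the file path, sheet name, and connection string are assumptions; the Jet / "Excel 8.0" syntax matches Excel 2003-era .xls files):
Sub CreateImportTable()
    Dim qt As QueryTable
    ' Pull Sheet1 of the source workbook into this sheet via Jet/OLEDB,
    ' and refresh it automatically whenever the workbook opens.
    Set qt = ActiveSheet.QueryTables.Add( _
        Connection:="OLEDB;Provider=Microsoft.Jet.OLEDB.4.0;" & _
                    "Data Source=C:\data\source.xls;" & _
                    "Extended Properties=""Excel 8.0;HDR=Yes""", _
        Destination:=ActiveSheet.Range("A1"))
    qt.CommandType = xlCmdSql
    qt.CommandText = "SELECT * FROM [Sheet1$]"
    qt.RefreshOnFileOpen = True
    qt.Refresh BackgroundQuery:=False
End Sub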

I had to do this for months of data taken around once a second.
Open the document you want to copy data from. Then copy any sheets that hold that data into the workbook you are using. Grab the data from them, then delete those sheets from your workbook. Do this in a macro so it can be done quickly and repeatedly.
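A minimal sketch of that approach (the source path, sheet names, and ranges are placeholders):
Sub PullSheetIn()
    Dim src As Workbook
    Application.DisplayAlerts = False   ' suppress the delete-confirmation prompt
    Set src = Workbooks.Open("C:\data\source.xls", ReadOnly:=True)
    ' Copy the source sheet into this workbook, after the last existing sheet.
    src.Sheets("Data").Copy After:=ThisWorkbook.Sheets(ThisWorkbook.Sheets.Count)
    src.Close SaveChanges:=False
    ' Grab what you need from the copied sheet (it keeps its name if there is no clash)...
    ThisWorkbook.Sheets("Target").Range("A1:A500").Value = _
        ThisWorkbook.Sheets("Data").Range("A1:A500").Value
    ' ...then delete it again.
    ThisWorkbook.Sheets("Data").Delete
    Application.DisplayAlerts = True
End Sub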

Related

How to profile/debug VBA script that works fine on XLS files, but hangs on XLSX files?

I have a VB script that does row processing on a small Excel file (35 columns, 2000 rows, no hidden content). It takes about 3 seconds to process the file when applied to an .xls file. If I save the .xls as a .xlsx file, the same script takes about 30 to 60 seconds, completely freezing Excel several times in the process (although it finishes and the results are correct).
I suspect some loop parts of the code are responsible, but I need to find out which. I have found scripts like https://stackoverflow.com/a/3506143/5064264 which basically amount to timing all parts of the script by hand - similar to printf debugging. That is not very elegant and also takes a lot of time for large scripts.
Is there a better alternative to this semi-manual profiling, like the performance analyzers in most IDEs, or are there other approaches to this problem? I don't want to post the code, as it is rather long and I also want to solve this for other scripts in the future.
Yes - kind of. I use the following to test how fast things are running and to experiment with different approaches to get better timings. I stole it from somewhere here on SO.
Sub TheTimingStuff()
    Dim StartTime As Double
    Dim SecondsElapsed As Double

    StartTime = Timer
    'your code goes here
    SecondsElapsed = Round(Timer - StartTime, 2)
    MsgBox "This code ran successfully in " & SecondsElapsed & " seconds", vbInformation
End Sub
Another EDIT
You could scatter Debug.Print lines through your code to see how long each chunk takes to execute.
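For example (a sketch; the chunk labels are placeholders):
Dim t As Double
t = Timer
' ... chunk 1 of your code ...
Debug.Print "chunk 1: " & Round(Timer - t, 2) & "s"
t = Timer
' ... chunk 2 ...
Debug.Print "chunk 2: " & Round(Timer - t, 2) & "s"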
The error in my case was as Comintern's comment suggested - a method was copying whole columns into a dictionary to reorder them, instead of just the cells with values in them. To preserve this for future readers, I am copying his/her excellent comment here, thanks again!
An xls file has a maximum of 65,536 rows. An xlsx file has a maximum of 1,048,576 rows. I'd start by searching for .Rows. My guess is that you have some code iterates the entire worksheet instead of only rows with data in them.
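A typical fix along those lines is to bound the loop at the last row containing data, instead of iterating the sheet's full row count; a sketch (column A as the key column is an assumption):
Dim lastRow As Long, i As Long
With ActiveSheet
    ' Find the last row with data in column A, then loop only that far.
    lastRow = .Cells(.Rows.Count, "A").End(xlUp).Row
    For i = 1 To lastRow
        ' process .Rows(i) ...
    Next i
End With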
As for generic profiling, or how to get there: I used a large number of breakpoints in the editor (set/remove by clicking in the margin of the code editor, or with F9):
I first set them about 20 lines apart in the main method, ran the program with the large dataset, and after each press of Continue (F5) noted whether the section was slow or not.
Then I removed the breakpoints in the fast regions and added additional ones in the slow regions to further narrow it down into methods and then into lines inside the methods.
In the end, I could verify the culprit by commenting out the responsible line of code, and then fix it.

QTP VB Script to update data excel placed in QC

I'm trying to automate a set of test cases that pass inputs from one to another. For instance, if I have 5 test cases, the 1st test case passes its output to the 2nd, the 2nd to the 3rd, and so on.
Another point to note is that I won't perform batch execution; there will be a certain time gap between test cases.
So what I'm trying to do is write the outputs to some Excel sheet and read them back during the succeeding execution. I have searched and tried some code, but nothing has worked out.
Please share some ideas on how to update, at run time, an Excel sheet that is stored in QC. Thanks!
What you're essentially saying is that you have test runs separated by some indeterminate amount of time, and you need to share data between runs. The answer is you need persisted storage of your data. You could use a database, flat file, Excel spreadsheet, or anything else that will let you programmatically write data in one run then read it in the next.
Excel spreadsheets are one such solution. You said you tried it and it did not work. That likely means that the method you used to write or read the data was incorrect, and not that there was a problem with the concept. If you provide some more specifics about exactly what you tried and where it failed, hopefully the community will be able to assist you.
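For illustration, a minimal sketch (in QTP-style VBScript) of writing a value in one run and reading it back in the next; the file path, sheet name, and cell are placeholders, and moving the file to/from QC is a separate step:
' End of test case 1: persist the output value
Set xl = CreateObject("Excel.Application")
Set wb = xl.Workbooks.Open("C:\shared\TestData.xls")
wb.Sheets("Outputs").Range("B2").Value = "order-12345"   ' output produced by this test
wb.Save
wb.Close
xl.Quit

' Start of test case 2 (a later run): read it back
Set xl = CreateObject("Excel.Application")
Set wb = xl.Workbooks.Open("C:\shared\TestData.xls")
orderId = wb.Sheets("Outputs").Range("B2").Value
wb.Close
xl.Quit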
I believe you have input Excel sheet(s) in QC. What you can do is download the Excel file from QC to the local machine, store the output of the 1st test case in that sheet, and upload it back to QC. You can then use it as input to the next test case.

Excel: Can using external links improve performance?

I'm wondering if there is a performance gain when using external links vs. opening a spreadsheet and copying cells (using VBA).
I would imagine that Excel has to open the file just the same, but when I test using the following code, external linking comes out faster:
Dim t As Double
Dim rng As Range
For Each rng In shtId.Range("A1:A5")
    t = Now
    ThisWorkbook.Names("rngID").RefersToRange = rng
    ThisWorkbook.UpdateLink "H:\manualData.xlsx", xlExcelLinks
    Debug.Print "link: " & Format(Now - t, "######.0000000")

    t = Now
    Workbooks.Open "H:\manualData.xlsx", ReadOnly:=True
    ActiveWorkbook.Close False
    Debug.Print "open: " & Format(Now - t, "######.0000000")
Next
Results:
link: .0000116
open: .0000231
link: .0000116
open: .0000347
link: .0000000
open: .0000347
link: .0000000
open: .0000347
link: .0000000
open: .0000347
The workbook has a range of cells with lookup formulas keyed on an ID field. These formulas have external links. To test it, I change the ID and force an update. To test the file open approach, I just open and close the source file.
I'm looking to speed up a process that's having an issue due to low bandwidth over the network. I've already explored various options and would just like to understand if this is a valid one. I searched on this topic and some people say external links could cause performance issues while some say otherwise. I'd like to get a better idea of the mechanism behind external links so that I can understand what to expect when implemented.
Any thoughts?
It's a lot more work, but reading / writing the data from / to an XML file (using MSXML) would solve the performance issue. It's a route I've been forced to adopt under certain circumstances where bandwidth has been low.
VBA can retrieve the data and run the calculations a lot more quickly than updating multiple links. Depending on the circumstances, you could do this on a Workbook Open event, or even a specific Change event (ComboBox, etc.), since you're only working with kilobytes of data.
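A bare-bones sketch of the reading side with MSXML (the file name and XML layout are assumptions, not from the answer):
Dim doc As Object, node As Object
Set doc = CreateObject("MSXML2.DOMDocument")
doc.async = False
If doc.Load("H:\manualData.xml") Then
    ' Read each <record id="...">value</record> element.
    For Each node In doc.SelectNodes("//record")
        Debug.Print node.getAttribute("id"), node.Text
    Next node
End If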

How to paginate a list a files in a directory in asp?

We're using Set folder = objFSO.GetFolder(<path>) to get the list of files within a directory.
We then For Each over the folder.Files collection to output the list of filenames.
We've got to the point where there are thousands of files in the folder and it's quite slow, so we want to add some pagination to the screen, i.e. show 500 files at a time. I have no idea if this is even possible, and my Google searches haven't helped.
Could anyone point me in the right direction?
Firstly, regarding the listing of the files, can you get the listing and persist (cache) the result somewhere to speed up repeated access? This is a method we use in .NET when getting large listings of files (we're using ASP.NET Web Pages so we just use WebCache.Set, but I imagine you could write it to a text file if you needed to).
I see some anecdotal evidence here that FSO is quite slow when listing large numbers of files:
One consideration not addressed is speed. I had a small VB app which used FileSystemObject to loop through files in a folder. It took approx. 5-7 minutes just to walk through every file in a given folder (approx. 2200 files). When I switched to using the DIR() command, I could walk the files in about 6-10 seconds. There are limitations to DIR() as well, but the speed factor was a huge consideration which went against FileSystemObject.
Obviously, once you have the listing, you would loop through it 500 at a time with an offset / page size, as in the sketch below.
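A minimal classic ASP sketch of the offset / page-size loop (the folder path and page size are assumptions):
<%
Const PAGE_SIZE = 500
Dim fso, folder, f, pageNum, i, startIdx, endIdx

pageNum = 1
If IsNumeric(Request.QueryString("page")) Then pageNum = CLng(Request.QueryString("page"))
If pageNum < 1 Then pageNum = 1

Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set folder = fso.GetFolder(Server.MapPath("/files"))   ' placeholder path

startIdx = (pageNum - 1) * PAGE_SIZE
endIdx = startIdx + PAGE_SIZE - 1
i = 0
For Each f In folder.Files
    If i > endIdx Then Exit For          ' past this page, stop early
    If i >= startIdx Then Response.Write Server.HTMLEncode(f.Name) & "<br>"
    i = i + 1
Next
%>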
EDIT: here is an example of how to use WScript.Shell to get the directory listing instead, from a couple of aspfaq.com articles I found (1 | 2):
Set objWShell = CreateObject("WScript.Shell")
Set objCmd = objWShell.Exec("%COMSPEC% /C dir c:\")
strPResult = objCmd.StdOut.Readall()
I wrote an online file browser a few years ago (available here), and found that it was reading the file sizes that slowed down file listing with FSO. As soon as I skipped that, then listing went lightning fast.
For pagination, I would recommend creating a 'disconnected' ado recordset out of the file names, and then use that for the actual navigation of names... then you've got pagination built in... as well as sorting and searching. Let me know if you're interested in a code example.
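In that spirit, a sketch of fabricating a disconnected recordset from the file names (the folder path, field size, and page size are arbitrary choices of mine):
<%
Const PAGE_SIZE = 500
Dim fso, folder, f, rs, pageNum, n

pageNum = 1
If IsNumeric(Request.QueryString("page")) Then pageNum = CLng(Request.QueryString("page"))
If pageNum < 1 Then pageNum = 1

Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set folder = fso.GetFolder(Server.MapPath("/files"))   ' placeholder path

' Fabricate a disconnected, client-side recordset holding just the names.
Set rs = Server.CreateObject("ADODB.Recordset")
rs.CursorLocation = 3               ' adUseClient
rs.Fields.Append "Name", 200, 255   ' 200 = adVarChar, max length 255
rs.Open
For Each f In folder.Files
    rs.AddNew
    rs("Name").Value = f.Name
    rs.Update
Next

' Sorting and pagination come built in.
rs.Sort = "Name"
rs.PageSize = PAGE_SIZE
rs.AbsolutePage = pageNum
For n = 1 To rs.PageSize
    If rs.EOF Then Exit For
    Response.Write Server.HTMLEncode(rs("Name").Value) & "<br>"
    rs.MoveNext
Next
%>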

JMeter - saving results + configuring "graph results" time span

I am using JMeter and have 2 questions (I have read the FAQ + Wiki etc):
I use the Graph Results listener. It seems to have a fixed span, e.g. 2 hours (just guessing - this is not indicated anywhere AFAIK), after which it wraps around and starts drawing on the same canvas from the left again. Hence after a long weekend run it only shows the results of the last 2 hours. Can I configure that span or other properties (beyond the check boxes I see on the Graph Results listener itself)?
Can I save the results of a run and later open them? I know I can save the test plan or parts of it, but I am unclear whether I can separately save just the test results data and later open it to perform comparisons etc. Furthermore, can I open the results with different listeners, even ones that weren't part of the original test? (I.e., I think of the test as accumulating data, and later on I want to view and interpret the data using different "viewers".)
Thanks,
-- Shaul
Don't know about 1. Regarding 2: listeners typically have a configuration field for "Write All Data to a File", which lets you specify the file name. You can use the Simple Data Writer to store results efficiently for later analysis.
You can load results from a previous test into a visualizer by choosing "Write All Data to a File" and browsing for the file you wish to load. Somewhat counterintuitively, selecting a file for writing also loads that file into the visualizer and displays the results. Just make sure you don't run the test again while that file is selected, otherwise you will lose your saved test data. :-)
Well, I later found a JMeter group that was discussing the issue raised in my first question, and B.Ramann gave me an excellent suggestion to use a better graph instead, found here.
-- Shaul
