CPU usage goes to 100% while reading from an Excel file - Windows

I created a VB.Net application that reads from an Excel file and puts the data into a table.
I used an Excel sheet which has 3 columns and 65000 rows.
Before it starts reading the Excel file my machine's CPU usage is around 15%, but during reading the CPU usage jumps up to 95%.
I don't know why this is happening. Can someone help me with this issue?
The following is the code I've written:
Private Sub readFromExcel(ByVal fileName As String, ByVal sheetName As String)
    Dim connString As String = "data source=XE; user=test; password=test"
    Dim con As New OracleConnection(connString)
    Dim str1 As String
    Dim str2 As String
    Dim str3 As String
    Dim xlApp As Excel.Application
    Dim xlWorkBook As Excel.Workbook
    Dim xlWorkSheet As Excel.Worksheet
    xlApp = New Excel.ApplicationClass
    xlWorkBook = xlApp.Workbooks.Open(fileName)
    xlWorkSheet = xlWorkBook.Worksheets(sheetName)
    Dim x As Integer
    Dim y As Integer
    Dim i As Integer
    x = xlWorkSheet.Rows.Count()
    y = xlWorkSheet.Columns.Count()
    Try
        For i = 1 To x - 1
            'MsgBox(xlWorkSheet.Cells(i, 0).value)
            str1 = xlWorkSheet.Cells(i, 1).value
            str2 = xlWorkSheet.Cells(i, 2).value
            str3 = xlWorkSheet.Cells(i, 3).value
            insertData()
        Next
    Catch ex As Exception
        MsgBox(ex.Message())
    Finally
        con.Close()
        xlWorkBook.Close()
    End Try
End Sub
Private Sub insertData()
    Dim str As String
    str = "insert into test_import values('" + str1 + "'," + str2 + "," + str3 + ")"
    Dim cmd As New OracleCommand()
    cmd.CommandText = str
    cmd.Connection = con
    cmd.ExecuteNonQuery()
End Sub
Thanks in advance.

This is entirely normal. A program only stops burning 100% of a core when it gets bogged down by I/O: reading from a disk or a network card blocks the program while the operating system supplies the data. Your code doesn't block like that; you are asking it to do a lot of work. Getting 195,000 cell values one by one just takes a while, and because Excel is an out-of-process COM server, every cell read requires two CPU context switches. You can optimize it a bit by using a Range instead. Or by running it on a machine with a two-core CPU, so it only shoots up to 50%.
Feature, not a bug.

It's much faster to read the 195,000 cells into an Object array in one go and then loop over the array. (There is a very high overhead for each .NET call into the Excel object model.)
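A minimal sketch of that bulk read, assuming the xlWorkSheet object and the str1/str2/str3 fields from the question's code are in scope; Value2 pulls the whole block across the COM boundary in a single call:
' Sketch only: one COM round-trip for all 65000 x 3 cells, then a loop
' over the in-memory array. Assumes str1..str3, insertData and xlWorkSheet
' are accessible as the question's code implies.
Dim data As Object(,)
data = CType(xlWorkSheet.Range("A1", "C65000").Value2, Object(,))
For i As Integer = 1 To data.GetUpperBound(0) ' the returned array is 1-based
    str1 = CStr(data(i, 1))
    str2 = CStr(data(i, 2))
    str3 = CStr(data(i, 3))
    insertData()
Next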

Use the get_Range(cell1, cell2) method to get cell values.
You can use it to read cell values by row, by column, or all cells in one go.
Watch the CPU usage as you adjust the code to read values by row, by column, or all at once.

Related

Use VBScript to VLOOKUP values, or evaluate formula?

For processing orders we're using VBScripts to import them into accounting software. There are several suppliers, each with their own file format, mostly CSV and XML. The first step is to extract all the order lines (a custom function per supplier), do some additional processing, and then write them to the database, which is the same for all suppliers.
One new supplier uses Excel files with all the order lines in one sheet, except for the corresponding VAT percentage values, which are kept in another sheet. The VAT percentage per item can be looked up using the item code from the order sheet.
The company only has LibreOffice Calc, and I understand you could do something like this in a macro. However, it is a fully automated process and every other file is already handled by VBScript, so I'd rather not make an exception or handle just this one order type manually (opening Calc and running the macro). So it has to be VBS and LibreOffice in this case.
Here is the VBScript code I have so far:
Option Explicit
' variables
Dim oSM, oDesk
Dim sFilename
Dim oDoc
Dim oSheet
Dim iLine
Dim sCode, iCount, sDesc, fCost, Perc
Set oSM = WScript.CreateObject("com.sun.star.ServiceManager")
Set oDesk = oSM.createInstance("com.sun.star.frame.Desktop")
sFilename = "file:///C:/orders/import/supplier_orderlist_08-01-2019.xls"
Set oDoc = oDesk.loadComponentFromURL(sFilename, "_blank", 0, Array())
Set oSheet = oDoc.getSheets().getByName("Orderlist")
For iLine = 11 To 12 ' testing first 2 lines
    sCode = oSheet.getCellByPosition(1, iLine).getString()
    iCount = oSheet.getCellByPosition(2, iLine).getString()
    sDesc = oSheet.getCellByPosition(5, iLine).getString()
    fCost = oSheet.getCellByPosition(8, iLine).getString()
    'lookup doesn't work
    Perc = Macro_VLOOKUP(sCode, oDoc)
    WScript.Echo sCode & " - " & iCount & "x - " & sDesc & " => " & fCost & ", " & Perc & "%"
Next 'iLine
WScript.Quit 1
Function Macro_VLOOKUP(SearchValue, oDocGlob)
    Dim oSheetLook, CellRange
    Dim Column, Mode, svc, arg, Value
    Set oSheetLook = oDocGlob.getSheets().getByName("Itemlisttotal")
    Set CellRange = oSheetLook.getCellRangeByName("A1:B10000")
    Column = 1
    Mode = 0
    svc = createUnoService("com.sun.star.sheet.FunctionAccess") '<- error: variable not defined
    arg = Array(SearchValue, CellRange, Column, Mode)
    Value = svc.callFunction("VLOOKUP", arg)
    Macro_VLOOKUP = Value
End Function
It gives an error on the line with createUnoService:
Variable not defined: 'createUnoService'
which is probably a LibreOffice Basic function that needs to be translated to its VBScript equivalent. There isn't much documentation or many examples on this, so I can only guess, but Set svc = WScript.CreateObject("com.sun.star.sheet.FunctionAccess") also doesn't work and gives a "class name not found" error.
Is it possible to do a VLOOKUP (or something similar) from VBScript in LibreOffice Calc?
Or is there a way to evaluate a cell formula from a string at runtime?
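For what it's worth, a plausible fix (not from the original thread): in VBScript a UNO service is created through the ServiceManager rather than with Basic's createUnoService, the same way the script already creates com.sun.star.frame.Desktop. A minimal sketch, assuming the oSM ServiceManager from the main script is still in scope and keeping the question's own argument values:
Function Macro_VLOOKUP(SearchValue, oDocGlob)
    Dim oSheetLook, CellRange, svc, arg
    Set oSheetLook = oDocGlob.getSheets().getByName("Itemlisttotal")
    Set CellRange = oSheetLook.getCellRangeByName("A1:B10000")
    ' createUnoService is LibreOffice Basic; from VBScript, go through the
    ' ServiceManager, exactly as the main script does for the Desktop
    Set svc = oSM.createInstance("com.sun.star.sheet.FunctionAccess")
    arg = Array(SearchValue, CellRange, 1, 0) ' column and mode as in the question
    Macro_VLOOKUP = svc.callFunction("VLOOKUP", arg)
End Function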

Compile error: searching for one specific data point, looping through whole workbook, copy/paste data

Excel VBA beginner coming back for more. I am creating a macro that does the following two things:
1) Searches through multiple worksheets in a single workbook for a specific piece of data (a name), variable A below
2) If that name appears, copies a specific range of cells from the worksheet (variable X below) to the master file (variable B below)
Sub Pull_X_Click()
    Dim A As Variant 'defines name
    Dim B As Workbook 'defines destination file
    Dim X As Workbook 'defines existing report file as source
    Dim Destination As Range 'defines destination for data pulled from report
    Dim ws As Worksheet
    Dim rng As Range
    A = Workbooks("B.xlsm").Worksheets("Summary").Range("A1").Value
    Set B = Workbooks("B.xlsm")
    Set X = Workbooks.Open("X.xlsm")
    Set Destination = Workbooks("B").Worksheets("Input").Range("B2:S2")
    'check if name is entered properly
    If A = "" Then
        MsgBox ("Your name is not visible; please start from the Reference tab.")
        Worksheets("Reference").Activate
        Exit Sub
    End If
    X.Activate
    For Each ws In X.Worksheets
        Set rng = ws.Range("A" & ws.Rows.Count).End(xlUp)
        If InStr(1, rng, A) = 0 Then
        Else
            X.ActiveSheet.Range("$A$2:$DQ$11").AutoFilter Field:=1, Criteria1:=A
            Range("A7:CD7").Select
            Selection.Copy
            Destination.Activate
            Destination.PasteSpecial
        End If
    Next ws
    Application.ScreenUpdating = False
End Sub
UPDATE: I managed to resolve the previous compile error, and it seems that the code (should?) work. However, it gets to this step:
X.Activate
...and then nothing happens. There are no run-time errors or anything, but it doesn't seem to be searching through the file (variable X) or pulling any of the data based on the presence of variable A. Any thoughts?
What I would have done is loop through the rows, evaluate the column in which the necessary data appears, and then, avoiding copy/paste entirely, make the target range equal to the source range:
Sub SearchNCopy()
    Dim A As String 'The string you are searching for
    Dim b As String 'The string you shall be searching in
    Dim wbs As Workbook, wbt As Workbook 'Declare your workbooks (each variable needs its own type)
    Dim wss As Worksheet
    Dim i As Long, lrow As Long 'Long, not Integer: row numbers can exceed 32767
    Set wbt = Workbooks("B.xlsm") 'Set your workbooks
    Set wbs = Workbooks.Open("X.xlsm")
    A = wbt.Worksheets("Summary").Range("A1").Value
    If A = "" Then
        MsgBox ("Your name is not visible; please start from the Reference tab.")
        Worksheets("Reference").Activate
        Exit Sub
    End If
    For Each wss In wbs.Worksheets 'Loop through sheets
        lrow = wss.Cells(wss.Rows.Count, "A").End(xlUp).Row 'Find last used row in each sheet - MAKE SURE YOUR SHEETS DON'T HAVE BLANKS BETWEEN ENTRIES
        For i = 1 To lrow Step 1 'Loop through the rows
            b = wss.Range("A" & i).Value 'Assign the value from column A of the row
            If Not InStr(1, b, A) = 0 Then 'If column A contains the input string, do the following
                wbt.Worksheets("Input").Range("B2:CC2").Value = wss.Range("A" & i & ":CD" & i).Value 'Copies the range from one worksheet to another without copy/paste (much faster); keep the two ranges the same width
            End If
        Next i
    Next wss
End Sub

Trying to automatically split data in excel with vba

I have absolutely no experience programming in Excel VBA, other than a function I wrote a few weeks back, mainly through trial and error, to add a date stamp to a barcode scanned on our production line.
Anyway, what I need help with right now: inventory is coming up, and every item we have has a barcode that is usually scanned into Notepad, manually pulled into Excel, and then "text to columns" is used. I found the Excel Split function and would like a little help getting it to work with my scanned barcodes.
The data comes in the format 11111*A153333*11/30/11 plus a carriage return, where * is the delimiter. All the examples I've found don't seem to do anything at all.
For example, here is one I found that splits at " ", but nothing happens if I change the separator to *.
Sub splitText()
    'splits text of the active cell using the * char as separator
    Dim splitVals As Variant
    Dim totalVals As Long
    splitVals = Split(ActiveCell.Value, "*")
    totalVals = UBound(splitVals)
    Range(Cells(ActiveCell.Row, ActiveCell.Column + 1), Cells(ActiveCell.Row, ActiveCell.Column + 1 + totalVals)).Value = splitVals
End Sub
And this is applied in the Sheet1 code section, if that helps.
It really can't be this complicated, can it?
Edit: trying to add a VLOOKUP to the VBA.
So as I said below in the comments, I'm now working on integrating the VLOOKUP into this; however, it just returns N/A.
Here is the sub I wrote, based on the link below:
Public Sub vlook(ByRef codeCell As Range)
    Dim result As String
    Dim source As Worksheet
    Dim destination As Worksheet
    Set destination = ActiveWorkbook.Sheets("Inventory")
    Set source = ActiveWorkbook.Sheets("Descriptions")
    result = [Vlookup(destination!(codeCell.Row, D), source!A2:B1397, 2, FALSE)]
End Sub
And I was trying to call it right after the For loop in the worksheet change, and just created another For loop. Does this/should this be a nested For loop?
Just adding the code to the VBA behind the worksheet won't actually cause it to get called. You need to handle the Worksheet_Change event. The following should help:
Private Sub Worksheet_Change(ByVal Target As Range)
    Application.EnableEvents = False
    Dim cell As Range
    For Each cell In Target.Cells
        If cell.Column = 1 Then SplitText cell
    Next
    Application.EnableEvents = True
End Sub
Public Sub SplitText(ByRef codeCell As Range)
    'splits text of the given cell using the * char as separator
    Dim splitVals As Variant
    Dim totalVals As Long
    splitVals = Split(codeCell.Value, "*")
    totalVals = UBound(splitVals)
    Range(Cells(codeCell.Row, codeCell.Column), Cells(codeCell.Row, codeCell.Column + totalVals)).Value = splitVals
End Sub
If you want to process the barcodes automatically on entering them, you need something like this (goes in the worksheet module).
Private Sub Worksheet_Change(ByVal Target As Range)
    Dim splitVals As Variant
    Dim c As Range, val As String
    For Each c In Target.Cells
        If c.Column = 1 Then 'optional: only process barcodes if in ColA
            val = Trim(c.Value)
            If InStr(val, "*") > 0 Then
                splitVals = Split(val, "*")
                c.Offset(0, 1).Resize( _
                    1, (UBound(splitVals) - LBound(splitVals)) + 1 _
                ).Value = splitVals
            End If
        End If 'in ColA
    Next c
End Sub
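As for the question's edit about wiring a VLOOKUP into the VBA, that part never got a direct answer here. A minimal sketch of one way to do it, assuming the Descriptions sheet name and the A2:B1397 lookup range from the question; Application.VLookup (rather than the bracketed Evaluate syntax) returns a testable error value instead of raising a run-time error:
Public Sub LookupDescription(ByRef codeCell As Range)
    'Sketch only: the sheet name and lookup range are the question's assumptions
    Dim source As Worksheet
    Dim result As Variant
    Set source = ActiveWorkbook.Sheets("Descriptions")
    result = Application.VLookup(codeCell.Value, source.Range("A2:B1397"), 2, False)
    If IsError(result) Then
        codeCell.Offset(0, 1).Value = "[not found]" 'no match in the lookup table
    Else
        codeCell.Offset(0, 1).Value = result
    End If
End Sub
It could be called from the same Worksheet_Change loop once the split has run, with the output column adjusted so it doesn't collide with the split values.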

Speed up this Find/Filter Operation - (VB6, TextFile, ADO, VFP 6.0 Database)

I'm trying to figure out how to speed up this operation. Before I import a record from the text file, I first need to see if one already exists in the database. If it does, I perform an update; if not, I create a new record.
Running the code you see below, this operation takes somewhere in the neighborhood of 3 hours.
I've tried using ADO's Find method, and it actually appears to be slower than the Filter method.
The database is a Visual FoxPro 6 database. The table has an index on the item_cd field but no primary key. This is out of my control: I didn't write the software, and I'm trying to stay away from making structural changes to the database.
There are 46652 rows in the text file and about 650,000 records/rows in the ADO recordset. I think slimming down the recordset would be the biggest step in fixing this, but I haven't come up with any way of doing that. Since there is no primary key, I need the entire table in my recordset to avoid creating duplicate records.
Because I'm running this on my local machine, the operation appears to be limited by the power of the CPU. In actual use this might run across the network, especially if I can get it to go faster.
Dim sFileToImport As String
sFileToImport = Me.lstFiles.Text
If sFileToImport = "" Then
    MsgBox "You must select a file from the listbox to import."
    Exit Sub
End If
If fConnectToDatabase = False Then Exit Sub
With gXRst
    .CursorLocation = adUseClient
    .CursorType = adOpenKeyset
    .LockType = adLockReadOnly
    .Open "SELECT item_cd FROM xmsalinv ORDER BY item_cd ASC", gXCon
End With
Call fStartProgress("Running speed test.")
Dim rstTxtFile As ADODB.Recordset
Set rstTxtFile = New ADODB.Recordset
Dim con As ADODB.Connection
Set con = New ADODB.Connection
Dim sConString As String, sSQL As String
Dim lRecCount As Long, l As Long
Dim s As String
sConString = "DRIVER={Microsoft Text Driver (*.txt; *.csv)};Dbq=" & gsImportFolderPath & ";Extensions=asc,csv,tab,txt;Persist Security Info=False;"
con.Open sConString
sSQL = "SELECT * FROM [" & sFileToImport & "]"
rstTxtFile.Open sSQL, con, adOpenKeyset, adLockPessimistic
If Not (rstTxtFile.EOF And rstTxtFile.BOF) = True Then
    rstTxtFile.MoveFirst
    lRecCount = rstTxtFile.RecordCount
    Do Until rstTxtFile.EOF = True
        'This code appears to actually be slower than the filter method I'm now using
        'gXRst.MoveFirst
        'gXRst.Find "item_cd = '" & fPQ(Trim(rstTxtFile(0))) & "'"
        gXRst.Filter = "item_cd = '" & fPQ(Trim(rstTxtFile(0))) & "'"
        If Not (gXRst.EOF And gXRst.BOF) = True Then
            s = "Item Found - " & Trim(rstTxtFile(0)) 'item found
        Else
            s = "Item Not Found - " & Trim(rstTxtFile(0)) 'item not found
        End If
        l = l + 1
        Call subProgress(l, lRecCount, s)
        rstTxtFile.MoveNext
    Loop
End If
Call fEndProgress("Finished running speed test.")
Cleanup:
rstTxtFile.Close
Set rstTxtFile = Nothing
gXRst.Close
A simple solution to speed up the Yours_Rs.Find response is to use a Yours_Rs.Move statement first, if that is possible for you. What I have done is use MyRs.Move prior to MyRs.Find to get into the vicinity of the actual record. It really worked for me, as the response of the Move statement is quite brisk.
I was using it to locate a patient record, and moving the pointer to a record near the actual one made the MyRs.Find statement work with the speed of light.
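A small sketch of the idea against the question's code (my assumption, not from the original post): if both the text file and the recordset arrive sorted by item_cd, the cursor never has to go backwards, so the MoveFirst before every Find can be dropped and the search can run forward from the current position:
'Sketch: with both inputs sorted by item_cd, skip the MoveFirst rewind
'and search forward from wherever the cursor already is.
gXRst.Find "item_cd = '" & fPQ(Trim(rstTxtFile(0))) & "'", 0, adSearchForward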
This doesn't answer your question, and this is a pretty old thread, but why don't you import your text file into a temporary table in your DB and then do a join?
Something like
SELECT tt.* FROM texttemp tt LEFT OUTER JOIN xmsalinv xal ON tt.field1 = xal.item_cd WHERE xal.item_cd IS NULL
This should return the contents of your imported text file that have no item_cd matches in the database. Dealing with a raw text file complicates the query, which is why I'm wondering why you're not importing the contents into a temporary table.
Now, assuming you know the mapping of the fields, you can probably also use this to insert: if your DB accepts INSERT ... SELECT notation, it'd be insert into xmsalinv (fields) select (matching fields) from (as above...), as sketched below.
This moves your choke point to the import process, which I'm hoping is quick.
The ADO collections seem pretty stupid: they don't benefit from any knowledge about the data and are kind of slow.
Ah, the next hit on a Google search for "vb6 filter": http://www.techrepublic.com/article/why-ados-find-method-is-the-devil/1045830
This response is based on basic SQL knowledge and is not tailored to FoxPro.
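A sketch of that insert-select, with hypothetical column names (field1, field2, descr), since the text file's layout and the full xmsalinv field list aren't shown in the thread:
-- hypothetical column mapping; only item_cd is known from the thread
INSERT INTO xmsalinv (item_cd, descr)
SELECT tt.field1, tt.field2
FROM texttemp tt
LEFT OUTER JOIN xmsalinv xal ON tt.field1 = xal.item_cd
WHERE xal.item_cd IS NULL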
Use a firehose cursor for the VFP query's results if you aren't already, and see your other post here for suggestions regarding the text file Recordset.
Perhaps better yet, try getting rid of your slow "loop and search" approach.
I would probably create a temporary Jet 4.0 MDB from scratch for each text file you want to look up. Import the text data and index your key field. Use ADOX to define a linked table over in the VFP database, then use a query to do your matching.
Close and dispose of the MDB afterward.
In response to Bob Riemersma's post, the text file is not causing the speed issues. I've changed my code to open a recordset with a query looking for a single item. This code now runs in 1 minute and 2 seconds as opposed to the three to four hours I was looking at the other way.
Dim sFileToImport As String
sFileToImport = Me.lstFiles.Text
If sFileToImport = "" Then
    MsgBox "You must select a file from the listbox to import."
    Exit Sub
End If
If fConnectToDatabase = False Then Exit Sub
Call fStartProgress("Running speed test.")
Dim rstTxtFile As ADODB.Recordset
Set rstTxtFile = New ADODB.Recordset
Dim con As ADODB.Connection
Set con = New ADODB.Connection
Dim sConString As String, sSQL As String
Dim lRecCount As Long, l As Long
Dim sngQty As Single, sItemCat As String
sConString = "DRIVER={Microsoft Text Driver (*.txt; *.csv)};Dbq=" & gsImportFolderPath & ";Extensions=asc,csv,tab,txt;Persist Security Info=False;"
con.Open sConString
sSQL = "SELECT * FROM [" & sFileToImport & "]"
rstTxtFile.Open sSQL, con, adOpenKeyset, adLockPessimistic
If Not (rstTxtFile.EOF And rstTxtFile.BOF) = True Then
    rstTxtFile.MoveFirst
    lRecCount = rstTxtFile.RecordCount
    Do Until rstTxtFile.EOF = True
        l = l + 1
        sItemCat = fItemCat(Trim(rstTxtFile(0)))
        If sItemCat <> "[item not found]" Then
            sngQty = fItemQty(Trim(rstTxtFile(0)))
        End If
        Call subProgress(l, lRecCount, sngQty & " - " & sItemCat & " - " & rstTxtFile(0))
        sngQty = 0
        rstTxtFile.MoveNext
    Loop
End If
Call fEndProgress("Finished running speed test.")
Cleanup:
rstTxtFile.Close
Set rstTxtFile = Nothing
My Functions:
Private Function fItemCat(sItem_cd As String) As String
    'Returns blank if nothing found
    If sItem_cd <> "" Then
        With gXRstFind
            .CursorLocation = adUseClient
            .CursorType = adOpenKeyset
            .LockType = adLockReadOnly
            .Open "SELECT item_cd, ccategory FROM xmsalinv WHERE item_cd = '" & fPQ(sItem_cd) & "'", gXCon
        End With
        If Not (gXRstFind.EOF And gXRstFind.BOF) = True Then
            'An item can technically have a blank category although it never should
            If gXRstFind!ccategory = "" Then
                fItemCat = "[blank]"
            Else
                fItemCat = gXRstFind!ccategory
            End If
        Else
            fItemCat = "[item not found]"
        End If
        gXRstFind.Close
    End If
End Function
Private Function fIsStockItem(sItem_cd As String, Optional bConsiderItemsInStockAsStockItems As Boolean = False) As Boolean
    If sItem_cd <> "" Then
        With gXRstFind
            .CursorLocation = adUseClient
            .CursorType = adOpenKeyset
            .LockType = adLockReadOnly
            .Open "SELECT item_cd, bal_qty, sug_qty FROM xmsalinv WHERE item_cd = '" & fPQ(sItem_cd) & "'", gXCon
        End With
        If Not (gXRstFind.EOF And gXRstFind.BOF) = True Then
            If gXRstFind!sug_qty > 0 Then
                fIsStockItem = True
            Else
                If bConsiderItemsInStockAsStockItems = True Then
                    If gXRstFind!bal_qty > 0 Then
                        fIsStockItem = True
                    End If
                End If
            End If
        End If
        gXRstFind.Close
    End If
End Function
Private Function fItemQty(sItem_cd As String) As Single
    'Returns 0 if nothing found
    If sItem_cd <> "" Then
        With gXRstFind
            .CursorLocation = adUseClient
            .CursorType = adOpenKeyset
            .LockType = adLockReadOnly
            .Open "SELECT item_cd, bal_qty FROM xmsalinv WHERE item_cd = '" & fPQ(sItem_cd) & "'", gXCon
        End With
        If Not (gXRstFind.EOF And gXRstFind.BOF) = True Then
            fItemQty = CSng(gXRstFind!bal_qty)
        End If
        gXRstFind.Close
    End If
End Function
First, you can try creating an in-memory index on item_cd with gXRst!item_cd.Properties("OPTIMIZE").Value = True, which will speed up both Find and Filter.
For ultimate search speed, initialize a helper index Collection like this:
Set cIndex = New Collection
On Error Resume Next
Do While Not gXRst.EOF
    cIndex.Add gXRst.Bookmark, "#" & gXRst!item_cd.Value
    gXRst.MoveNext
Loop
On Error GoTo ErrorHandler
And instead of Find, use a function like this:
Public Function SearchCollection(Col As Object, Index As Variant) As Boolean
    On Error Resume Next
    IsObject Col(Index)
    SearchCollection = (Err.Number = 0)
    On Error GoTo 0
End Function
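A usage sketch against the question's loop; the Collection probe replaces the Filter/Find call, with the key built the same way as when the index was loaded:
'Sketch: probe the Collection index instead of filtering the recordset
If SearchCollection(cIndex, "#" & Trim(rstTxtFile(0))) Then
    s = "Item Found - " & Trim(rstTxtFile(0))
Else
    s = "Item Not Found - " & Trim(rstTxtFile(0))
End If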
3 hours just for a few hundred thousand records!!! You are doing it the wrong way. Simply:
- append the text file to a VFP table,
- then insert the ones that do not exist in the existing table with a single INSERT SQL,
- and update the ones that do exist with another UPDATE SQL.
That is all; it should take less than a minute (and a minute would even be slow). You can do all of this using the VFPOLEDB driver, and it doesn't matter that you have a VFP6 database: VFPOLEDB has the VFP9 engine built in.

how to Speed Up the VBA Macros

I am generating new sheets using macros. For each new sheet, data is retrieved from more than 4 MS Access DBs, and each DB has a minimum of 200 fields. My macro code includes:
1. Cell locking
2. Alignment and formatting
3. Formulas in one third of the cells in the sheet
4. Cell references to other workbooks
My problem is that every sheet generation takes a minimum of one hour to complete the whole process, which seems way too long.
I have already added Application.ScreenUpdating = True to speed up the code, but it still takes the same time. How do I speed up the code? If you have any ideas, please guide me.
For Ip = 5 To 150
    resp = Range("B" & Ip).Value
    With ActiveSheet.QueryTables.Add(Connection:= _
        "ODBC;DSN=henkel2;DBQ=C:\Hl-RF\RSF-Temp.mdb;DriverId=25;FIL=MS Access;MaxBufferSize=2048;" _
        , Destination:=Range("IV4"))
        .CommandText = "select Vles from " & Shtname & " where cint(PrductID)='" & resp & "' and cint(DepotID) = '" & cnt1 & "' and Mnth = '" & mnths & "' and Type='" & typs & "'"
        .Name = "tab product"
        .FieldNames = True
        .RowNumbers = False
        .FillAdjacentFormulas = False
        .PreserveFormatting = True
        .RefreshOnFileOpen = False
        .BackgroundQuery = True
        .RefreshStyle = xlInsertDeleteCells
        .SavePassword = False
        .SaveData = True
        .AdjustColumnWidth = True
        .RefreshPeriod = 0
        .PreserveColumnInfo = True
        .SourceConnectionFile = "C:\Hl-RF\tabct.odc"
        .Refresh BackgroundQuery:=False
    End With
Next Ip
Is there any way to reduce the loop iteration time?
Thanks in advance.
Surely you mean
Application.ScreenUpdating = False
Apart from that, you could also disable recalculation of the workbook while the macro is running and see if that makes a difference. This is, of course, assuming the bottleneck is in the spreadsheet part of the process; if it's taking ages to get the data from Access, that might be the area to look at.
Get hold of a copy of Professional Excel Development, which includes an excellent profiling utility called PerfMon. It will allow you to see which parts of the report are taking all the time, so you can analyse and rewrite them.
You can try the usual VBA optimization methods of setting calculation to manual and disabling ScreenUpdating.
Dim calc As XlCalculation
calc = Application.Calculation
Application.ScreenUpdating = False
Application.Calculation = xlCalculationManual
' ... your code or function call goes here ...
Application.ScreenUpdating = True
Application.Calculation = calc
Put your code or function call between Application.Calculation = xlCalculationManual and Application.ScreenUpdating = True.
This is from my previous post.
Note: I couldn't find out whether you run the code from within Access or Excel. If you create the Excel workbook from Access, you probably have some code like this:
Dim xlApp As Excel.Application
Set xlApp = New Excel.Application
In that case you would have to change Application in the code above to xlApp. For example:
xlApp.Calculation = xlCalculationManual
I'd try to do MORE of the work on the database side: generate the reports you want on the database side, and then export the results to Excel.
Access is MUCH better at automating reports than Excel is.
There is some discussion of this topic here.
Edit:
Ok, then the next step is to identify which parts of your code are taking the longest. The simplest way to do this is to make a copy of your code and just start measuring various parts like this:
Private Declare Function GetTickCount Lib "kernel32.dll" () As Long
Private mlngStrt As Long
Private mlngEnd As Long
Private Const u As Long = 10000000
Public Sub Example()
    Dim i As Long
    mlngStrt = GetTickCount
    For i = 0 To u
    Next
    mlngEnd = GetTickCount
    Debug.Print "Section1", mlngEnd - mlngStrt
    mlngStrt = GetTickCount
    ExampleSubCall
    mlngEnd = GetTickCount
    Debug.Print "ExampleSubCall", mlngEnd - mlngStrt
    mlngStrt = GetTickCount
    For i = 0 To (u * 1.5)
    Next
    mlngEnd = GetTickCount
    Debug.Print "Section2", mlngEnd - mlngStrt
    Debug.Print "Example Complete"
End Sub
Private Sub ExampleSubCall()
    Dim i As Long
    For i = 0 To (u * 0.75)
    Next
End Sub
This approach is fairly straight-forward. The drawback here is that you need to insert all of the timing statements and then turn around and remove them. Which is why I would work on a copy.
Once you know what parts are taking the longest you know where to focus your attention and what to ask for help with.
Take a look at Chris's comments. We believe your performance bottleneck is more likely in the way you're querying the database than in the VBA code that applies the data to the sheet.
Simple questions about Access performance:
- Do your tables have indexes?
- Are you using any kind of table join?
- Are the Access databases local on your computer or accessed remotely?
Again, I'm only reinforcing what Chris already commented.
Yes: make a table in Access to hold your client IDs, then create the query there and connect to it with the external data connector. After that, refresh it manually, or use VBA to refresh the connection whenever you're ready.
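A one-line sketch of that VBA refresh, assuming the query landed on the worksheet as its first QueryTable (the sheet name "Report" is hypothetical):
'Sketch: refresh the external data connection synchronously from VBA
Worksheets("Report").QueryTables(1).Refresh BackgroundQuery:=False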
