PowerQuery running very slow

I noticed that my Power Query report can run very slowly, and not only when refreshing: when I 'Launch Power Query Editor', it can take ~30 minutes to get to the last query in the report to see what has been loaded and calculated.
The report is using 8 different .csv files as inputs (not very large, <1000 rows and <15 columns each).
On these inputs I then do joins and groupings multiple times, but apart from that there are no other 'heavy' calculations (only some sums, averages, percentages, and some nested ifs).
So I would have thought this shouldn't be too complex for Power Query to deal with, but sometimes (not always!) it takes a really long time to get 'inside'.
Yesterday I worked on it almost all day (I also had other jobs to do at work :)). In the morning it took <1 min to refresh, and after launching the Power Query Editor it was fairly quick to get to every query in the report.
In the afternoon, with the same inputs, it took ~3 mins to refresh, and when I launched the Power Query Editor it took almost 30 minutes to get to the last query in the report (my record wait time).
Do you know why this is happening?
I have a feeling it may be related to some Excel/Power Query settings, but I'm not sure where to start.
I also had a strange situation when turning off the annoying pop-up message about Privacy Levels when using native database queries (Data > Get Data > Query Options > Security section; the first box needs to be unchecked):
I had it unchecked already, but after I showed it to my colleague, it took a very long time for the message about ignoring Privacy Levels to pop up (it shouldn't pop up at all, since I had the relevant box unchecked). I then had to confirm that I wanted to ignore Privacy Levels, and after that it refreshed in normal time. That was all in the same Power Query report, just a few days earlier.
Thanks in advance for your help on this.
Ania

I have come across file corruption issues with Power Query in the past. Power Query should not take that long to launch. A similar situation happened to me, and I performed an MS Office Repair and rebuilt the workbook in a new Excel file, which solved the issue.
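As a rough sanity check on scale (using pandas as a hypothetical stand-in, since the original M queries aren't shown, and with made-up column names): eight tables of under 1,000 rows each, grouped and joined, process in well under a second in any in-memory engine, which supports the idea that the delay is an environment or file issue rather than the query logic itself.

```python
from functools import reduce

import numpy as np
import pandas as pd

# Synthetic stand-ins for the eight small CSV inputs described in the
# question (<1000 rows, <15 columns each). Column names are invented.
rng = np.random.default_rng(0)
tables = [
    pd.DataFrame({
        "key": rng.integers(0, 100, size=1000),
        "value": rng.random(1000),
    })
    for _ in range(8)
]

# Aggregate each input first (one row per key), then join the aggregates,
# roughly mirroring the joins and groupings described in the question.
summaries = [
    df.groupby("key", as_index=False)["value"].sum()
      .rename(columns={"value": f"sum_{i}"})
    for i, df in enumerate(tables)
]
merged = reduce(lambda left, right: left.merge(right, on="key"), summaries)
merged["total"] = merged[[f"sum_{i}" for i in range(8)]].sum(axis=1)
print(merged.shape)  # at most 100 rows, exactly 10 columns
```

Aggregating before joining also keeps the joins one-to-one, avoiding the row blow-up that many-to-many merges on a shared key would cause.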


How to make a Spotfire link open faster?

I've published a Spotfire file with 70 '.txt' files linked to it. The total size of the files is around 2 GB. When users open it in their web browser, it takes around 27 minutes to load the linked tables.
I need an option that improves opening performance. The issue seems to be the amount of data and the way the files are linked to Spotfire.
This runs in a server and the users open the BI in their browser.
I've tried embedding the data; it lowers the load time, but it forces me to interact with the software every time I want to update the data. The solution is supposed to run automatically.
I need to open this in less than 5 minutes.
Update:
- I need the data to be updated at least twice a day.
- The embedded approach is acceptable from a timing perspective, but the system needs to run without my intervention.
- I've never used Spotfire automation services.
Schedule the report to be cached twice a day on the Spotfire server by setting up a rule under Scheduling and Routing. The good thing about this is that while the analysis is being updated for the second time during the day, users can still quickly open the older data until the update completes. To the end user it opens in seconds, because behind the scenes you have already pre-opened the report. Once you set up the rule, this runs automatically with no intervention needed.
All functionality and scripting within the report will work the same, and it can be opened simultaneously by many different users. This is really the best way if you have to link to that many files. Otherwise, try consolidating files, aggregating data, and removing unnecessary columns and data tables so the data pulls through faster.

Form datasource only loads a part of records at a time?

Today I'm wondering why the (AX 2009) LedgerTransVoucher form only seems to load part of the query results at a time. If the results include, say, 35k rows, only 10k are loaded at once. And if the user decides to export the results to Excel, they only get 10k rows.
Why is this? 10k is such a round number that I suspect a parameter somewhere, but I have no idea where it could be hidden.
And yes, I know they should be using a report instead :)
Alas, this apparently had nothing to do with AX as such; it was some conflict with Citrix. False alarm, it seems.

access-2013 report opens very slow

I have a weird problem here with a report I use every day.
I moved from XP to Windows 7 some time ago and use Access 2013.
(The language is German, so sorry, I can only guess what the views are called in English.)
"Suddenly" (I really can't say when this started), opening the report in report view takes VERY long, around 1 minute or so. Then switching to page view and formatting the report takes only 2 or 3 seconds. Switching back to report view again takes 1 minute.
The report has a complex query as its data source (in fact, a UNION of 8 sub-queries). Opening this query displays the data after 1 second, which is fine.
All tables are "linked" from the same ODBC Datasource, which points to a mysql server on our network.
Testing further, I opened every table the queries use, one after another. I noticed that opening each of these tables takes around 9 seconds, no matter whether the table is small or big. Always those 9 seconds.
The ODBC data source is defined using the IP address of the server, not its name, so I don't think it's a name-server problem or a timeout.
What could cause this slowdown when opening tables?
I'm puzzled.
Here are a few steps I would try:
- Take a fresh copy of the Access app running on one of those "fast clients" and see if that solves the issue.
- Try comparing performance with a fast client after setting the same default printer.
- Check the version of the ODBC driver on both machines, and if you rely on a DSN, compare the DSNs.
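To narrow down where those 9 seconds go, it can also help to time the table open in isolation and repeat it: if only the first open is slow and subsequent opens are fast, that points at one-off connection setup (driver, authentication) rather than data transfer. A minimal timing harness, where `open_table` is a hypothetical stand-in for whatever opens the linked table (here it just simulates a one-off connection cost):

```python
import time

def time_calls(fn, repeats=3):
    """Time repeated calls to fn and return per-call durations in seconds."""
    durations = []
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        durations.append(time.perf_counter() - start)
    return durations

# Hypothetical stand-in: a table open that is slow only on first use,
# simulating connection setup that happens once and is then reused.
_state = {"connected": False}
def open_table():
    if not _state["connected"]:
        time.sleep(0.05)   # simulated one-off connection cost
        _state["connected"] = True
    time.sleep(0.005)      # simulated per-open cost

durations = time_calls(open_table)
print([round(d, 3) for d in durations])
```

If the real measurements instead show every open paying the same fixed cost, as described in the question, that pattern is more consistent with per-open overhead (e.g. a handshake or lookup repeated on each open) than with table size.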

API User Usage Report: Inconsistent Reporting

I'm using a JVM to perform API calls to the Google Apps Administrator API.
I've noticed that with the User Usage Reports I'm not getting complete data for a field I'm interested in (num_docs_externally_visible) or for the fields that feed into its calculation. I generally request a single day's usage report at a time, across my entire user base (~40k users).
According to the developer documentation, I should be able to see that field in a report 2-6 days afterwards; however, after running reports for the first 3 weeks of February, I've only gotten it for 60% of the days. The pattern appears to be random (I have streaks of up to 4 days in a row of the item appearing and 3 days in a row of it not appearing, but there is no consistency to it).
Has anyone else experienced this issue? And if so, were you able to resolve it? Or should I expect this behavior to continue if this is an issue with what the API is returning outside of my control?
I think it's only natural that the data you get is not yet complete; it takes a certain number of days to receive the complete data.
This SO question is not exactly the same as your question, but I think it will help you, especially the part about needing to use your account's time zone.
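Given the documented 2-6 day lag, one defensive pattern is to only request report dates old enough to be plausibly complete, and to track which requested dates came back without the field so the gaps can be retried later. A sketch (the function names and concrete dates are made up for illustration; only the 6-day maximum lag comes from the documentation quoted above):

```python
from datetime import date, timedelta

MAX_LAG_DAYS = 6  # per the docs, report data can lag up to 6 days

def latest_safe_report_date(today: date) -> date:
    """Most recent report date that should be fully populated."""
    return today - timedelta(days=MAX_LAG_DAYS)

def dates_to_retry(requested: list, complete: set) -> list:
    """Requested report dates that lacked the field and need a retry."""
    return sorted(d for d in requested if d not in complete)

# Illustrative example: of one week of reports, suppose only some days
# actually returned the num_docs_externally_visible field.
requested = [date(2016, 2, d) for d in range(1, 8)]
complete = {date(2016, 2, 1), date(2016, 2, 2), date(2016, 2, 5)}
print(latest_safe_report_date(date(2016, 2, 20)))  # 2016-02-14
print(dates_to_retry(requested, complete))
```

Re-requesting the gap dates a few days later distinguishes "data not ready yet" from "data genuinely missing", which is the question being asked.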

Google Analytics: incorrect number of sessions when grouping by Custom Dimension

For a while I have successfully queried the number of sessions for my website, including the number of sessions per 'Lang Code' and per 'Distribution Channel'; both are Custom Dimensions I created in Analytics, each with its own slot and with its Scope Type set to 'Session'.
Recently the number of sessions has decreased significantly when I group by a Custom Dimension, e.g. Lang Code.
The following query gives me a number of say 900:
https://ga-dev-tools.appspot.com/query-explorer/?start-date=2015-10-17&end-date=2015-10-17&metrics=ga%3Asessions
Whereas this query returns around a quarter of that, say ~220:
https://ga-dev-tools.appspot.com/query-explorer/?start-date=2015-10-17&end-date=2015-10-17&metrics=ga%3Asessions&dimensions=ga%3Adimension14
Now, my initial reaction was that 'Lang Code' was not set on all pages, but I checked, and this dimension is guaranteed to be included on all pages of my website.
Also, no changes have been made to the Analytics View I'm querying.
The same issue occurred a couple of weeks ago, and at the time I fixed it by changing the Scope Type of said Custom Dimensions to Session, but now I'm no longer sure whether that was the correct fix or just a temporary glitch, since:
the issue didn't occur before
the issue now reoccurs
Does anyone have any idea what may have caused this data discrepancy?
P.S. To make things stranger, for daily reporting we run this query every night (around 2 am), and then the numbers are actually correct; so apparently it makes a difference at what time the query is executed?
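One way to make this discrepancy visible over time is to query both numbers (total sessions, and sessions grouped by the custom dimension) and flag days where the grouped total falls below an expected coverage level. A small sketch of that check (the ~900 and ~220 figures are from the question; the 90% threshold is an arbitrary example, not a GA default):

```python
def dimension_coverage(total_sessions: int, sessions_with_dimension: int) -> float:
    """Fraction of sessions that carry the custom dimension."""
    if total_sessions == 0:
        return 0.0
    return sessions_with_dimension / total_sessions

def coverage_alert(total_sessions: int, sessions_with_dimension: int,
                   threshold: float = 0.9) -> bool:
    """True when too few sessions carry the dimension (possible data issue)."""
    return dimension_coverage(total_sessions, sessions_with_dimension) < threshold

# Figures from the question: ~900 sessions in total, ~220 with dimension14 set.
print(round(dimension_coverage(900, 220), 3))
print(coverage_alert(900, 220))
```

Running such a check after both the 2 am query and a daytime query would also confirm (or rule out) the suspicion that the time of execution affects the result.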
