Intel Parallel Studio warning on VS2005

I installed Intel Parallel Studio and used it. But when I ran my application, I got a message in the Output window of Visual Studio 2005 that said:
“Data collection has stopped after reaching the configured limit of 10
MB of raw data. The target will continue to run, but no further data
will be collected. The data collection stopped since the data size
limit of (10 Mb) is reached. The application is running but no data is
collected.”
Does anyone have any idea why this message appears, and does it mean that if I continue running my application the data will not be collected? I am not sure how to configure the settings, as this is the first time I am using such a tool to find performance hotspots.

It would be nice to know which version of Parallel Studio you are using and which collection you are running (since different collectors use different settings).
Assuming you are running Amplifier collection:
Click "Project Properties" button (on toolbar or under "Start" button on left side panel).
In "Target" tab expand "Advanced" section.
There is "Collection data limit" option. You could increase it as appropriate. In latest versions of Parallel Studio it is increased to 500 MB by default.
You could set data limit to zero for unlimited collection. I don't recommend this since you may run out of disc space quickly. This is also the reason why this option is here - unlimited collection produce huge amounts of raw data.
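The same limit can also be set from the command line; a minimal sketch, assuming the VTune Amplifier CLI (amplxe-cl) that ships with Parallel Studio, whose option spelling may differ in your version:

    REM Collect hotspots, capping raw data at 500 MB (0 would mean unlimited)
    amplxe-cl -collect hotspots -data-limit=500 -- MyApp.exe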

Related

Power BI using 90+% CPU while doing... what?

I have a 9 MB PBIX containing small tables and one table with 250k rows. The data is imported from various xlsx and JSON sources. The machine is Windows 10 Pro, 2.6 GHz, 64-bit, 16 GB RAM.
On the Power BI service online the performance is OK, but on the desktop it's practically unworkable. In Task Manager I can see that it is using 7 MB of memory but almost 100% CPU, half an hour after opening, while on a blank tab with no visualisations.
I don't understand what it is doing in the background and how I can improve the situation.
There is the 'Allow data preview to download in the background' setting, but I think this is only relevant to the query editor? Would clearing the cache or changing cache settings help?
I am aware of the performance analyzer and the query diagnostics tools, but neither seems relevant, since the queries are not refreshing and there are no visualisations loading.
I'm at a bit of a loss; any help greatly appreciated.
Thanks
Update: Having disabled parallel load and background refresh in the Data load settings, I noticed that the issue finally seemed to go away (though not immediately). Eventually, when reopening the PBIX, the mashup containers did not appear and CPU and memory were no longer being hammered. Then at some point Power BI got stuck and had to be closed, and the problem reappeared even though the data load settings were still disabled. Restarting the machine seemed to clear the problem once again.
It seems, then, that some zombie processes can persist through closing and reopening the application. Has anyone else noticed this? Can you confirm or refute it, suggest what is going on, or suggest steps to avoid or prevent it? It's very annoying!
Thanks
I have also noticed the same issue: opening a 5 MB PBIX file, Power BI eats 12 GB of memory with 90%+ CPU utilization. Power BI Desktop is a poorly managed product by Microsoft.
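If the culprit is lingering Power Query evaluation processes, one lighter-weight workaround than rebooting may be to kill them after closing Power BI Desktop. A sketch; the container image name is an assumption and varies by version and bitness, so verify it in Task Manager first:

    REM Kill leftover Power Query evaluation containers
    REM (image name is an assumption - check Task Manager for the exact name)
    taskkill /f /im Microsoft.Mashup.Container*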

Dump File analysis

Recently I started facing an issue on a few servers where the CPU consumes more resources than the usual trend. I am trying to find the root cause, so I took a dump of the w3wp process from Task Manager (right-click on the process and create the dump file).
The .dmp file is 14 GB, and when I try to analyze it with WinDbg the tool does not work and shows an error message.
I also took a few minidumps; some of them open fine while others do not, so it is not a 32-bit vs. 64-bit mix-up (the collected dump is 64-bit).
I am trying to find out what is causing this issue. Is it the file size, or am I not taking the dump properly?
I checked this link but it is not helpful.
WinDbg is not the right tool for this job. Dumps are only snapshots, so you have no idea what happened before. Use ETW, specifically CPU sampling, which aggregates all calls and shows you the CPU usage in detail.
Install the Windows Performance Toolkit, which is part of the Windows 10 SDK (the V1607 SDK works on Win8/8.1 (Server 2012/R2) and Win10; use the V1511 SDK if you are on Windows 7/Server 2008 R2). Run WPRUi.exe, select CPU Usage,
and press Start. Capture 1-2 minutes of the high CPU usage, then click Save. Open the generated ETL file with WPA.exe (Windows Performance Analyzer), drag and drop the CPU Usage (Sampled) graph onto the analysis pane,
and load the debug symbols. Now select your process in the graph, zoom in, and expand the stack; there you see the weight of the CPU usage of all calls.
In this sample, most of Internet Explorer's CPU usage comes from HTML work.
For .NET applications, WPA also shows .NET-related groupings such as GC and JIT.
Expand the stack of the w3wp process to see what it is doing. From the names you should get a clue about what is happening.
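For reference, the same capture can be scripted with wpr.exe instead of WPRUi.exe; a minimal sketch, assuming the built-in CPU profile name is unchanged in your version of the toolkit:

    REM Start sampling with the built-in CPU profile; -filemode logs to a
    REM file instead of the in-memory ring buffer
    wpr -start CPU -filemode
    REM ... reproduce 1-2 minutes of the high CPU usage, then stop and save
    wpr -stop HighCpu.etl
    REM Open the trace in Windows Performance Analyzer
    wpa -i HighCpu.etl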

Speeding up VS2010 when working with WF4 IDE

I am developing on our development server: a Windows Server 2008 R2 64-bit OS, 256 GB of RAM, a 16-CPU machine, with VS2010 SP1 installed.
I have a huge workflow (around 30K lines in the XAMLX file). Every time I make even a small change in an activity, like changing the display name, it takes about 2-3 minutes to register the change. VS2010 is unresponsive all this time. It is extremely frustrating.
What is causing this, and how can I make it faster? Is there any setting in VS2010 that will help?
Thanks!
The most likely cause of the delay is the associated workflow designer trying to apply the change you just made. It does so synchronously on the UI thread, hence the delay.
The only way I can think of to stop this is to not open the designer. Instead, open the raw XAML file and edit it directly. To do this:
Close the file
Right-click on the file in Solution Explorer and select "Open With"
Choose the XML editor
Also, I encourage you to file a bug on Connect about this behavior.
http://connect.microsoft.com/visualstudio

Exception in Visual Studio .NET: System.OutOfMemoryException

I am working with Visual Studio 2010, and this is probably the most common error.
In my code I am calling a script to load data from a database table which comprises over 1,765,700 rows and is 777,826 KB in size.
I keep running into a System.OutOfMemoryException error.
Is there any way I can increase the memory allocated to my program, or change the settings? I had done this when running my programs in Eclipse before. Can it be done in Visual Studio 2010 as well?
Thank you
The first step would be, if at all possible, to avoid loading all of the data into memory at once unless that is truly a requirement. If there is any way to load and process the data in stages, you will avoid coming close to the memory limits.
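For example, instead of filling a DataTable with all 1.7 million rows, stream them with a data reader and handle each row as it arrives. A minimal sketch, assuming SQL Server and a hypothetical BigTable/ProcessRow pair:

    // Stream rows one at a time instead of materializing the whole table.
    using System;
    using System.Data.SqlClient;

    class StreamingLoad
    {
        static void Main()
        {
            // Hypothetical connection string and table for illustration.
            const string connStr = "Server=.;Database=MyDb;Integrated Security=true";
            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand("SELECT Id, Payload FROM BigTable", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // Only the current row is held in memory here.
                        ProcessRow(reader.GetInt32(0), reader.GetString(1));
                    }
                }
            }
        }

        // Placeholder for whatever per-row work the script needs to do.
        static void ProcessRow(int id, string payload) { /* ... */ }
    }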
However, changing the build to target x64 and running on a 64-bit platform should give you plenty of address space to load this amount of data without issues. That is potentially the simplest option.
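For reference, the Build > Platform target setting in the VS2010 project properties corresponds to this MSBuild property in the .csproj file:

    <!-- Inside the active PropertyGroup of the project file -->
    <PlatformTarget>x64</PlatformTarget>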

Visual Studio 2005 Memory Usage

I find that quite often Visual Studio memory usage will average ~150-300 MB of RAM.
As a developer who very often needs to run with multiple instances of Visual Studio open, are there any performance tricks to optimize the amount of memory that VS uses?
I am running VS 2005 with one add-in (TFS).
From this blog post:
[...]
These changes are all available from the Options dialog (Tools –> Options):
Environment
General:
Disable “Animate environment tools”
Documents:
Disable “Detect when file is changed outside the environment”
Keyboard:
Remove the F1 key from the Help.F1Help command
Help\Online:
Set “When loading Help content” to “Try local first, then online” or “Try local only, not online”
Startup:
Change the “At startup” option to “Show empty environment”
Projects and Solutions
General:
Disable “Track Active Item in Solution Explorer”
Text Editor
General (for each language you want):
Disable “Navigation bar” (this is the toolbar that shows the objects and procedures drop-down lists, allowing you to choose a particular object in your code).
Disable “Track changes”
Windows Forms Designer
General:
Set “AutoToolboxPopulate” to false.
Set “EnableRefactoringOnRename” to false.
Upgrade to a 64-bit OS. My instances of VS were taking ~700 MB each (very large solutions), and you rapidly run out of room with that.
Everyone on my team that has switched to 64-bit (and 8GB RAM) has wondered why they didn't do it sooner.
Minimize and re-maximize the main VS window to get VS to release memory.
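This trick works because Windows trims a process's working set when its main window is minimized. A minimal sketch of the same effect via the Win32 API, applied to the current process for illustration:

    // Ask Windows to empty the current process's working set - the same
    // trimming that minimizing the main window triggers.
    using System;
    using System.Diagnostics;
    using System.Runtime.InteropServices;

    class TrimWorkingSet
    {
        [DllImport("kernel32.dll", SetLastError = true)]
        static extern bool SetProcessWorkingSetSize(IntPtr process, IntPtr minSize, IntPtr maxSize);

        static void Main()
        {
            // Passing -1 for both sizes tells the OS to trim the working set.
            SetProcessWorkingSetSize(Process.GetCurrentProcess().Handle,
                                     (IntPtr)(-1), (IntPtr)(-1));
        }
    }

Note that the memory is only paged out, not freed; private bytes stay the same, which is why the Task Manager number drops without the process actually using less memory.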
Uninstalling (and re-installing) Visual Assist solved the problem for me.
The number one thing you can do is switch to Windows 8.
It uses memory sharing/combining when the same DLL or memory page is loaded into multiple processes, and obviously there's a lot of overlap when running two instances of VS.
In Task Manager I've got 4 Visual Studio instances running, and the shared memory column (you need to enable this column for it to be visible) shows how much memory is being shared.
So on Windows 7 this would use 2454 MB, but here I'm saving 600+ MB that are shared with the other devenv processes.
Chrome also has a lot of savings (because each browser tab is a new process). So overall I've still got 2 GB free where I'd normally be maxed out.
