I'm looking for an app that does about the same thing as the Performance tab in Task Manager, but on a per-process basis and with more plotted values. At a minimum, I would like to be able to plot CPU and memory usage, but it would be nice if it could plot:
Network usage
File system IO (per drive/share sub headings would be nice)
Open file stream count
lsof-like stuff
All the other stats that the Process tab can give you
... anything else ...
Windows comes with Perfmon, which is pretty much exactly what you want. It has a huge number of counters organized into categories, and individual apps can register their own counters as well.
I worked with Norm at FS Walker Hughes; he made this and published it on CodeProject.
What you need is exactly this => processstudio
Both have full source code available.
Enjoy!
Just use NAPI.
I am a student at university and I am stuck with some code for my assignment, and I would appreciate help from the community. I am very new to bash; I would say I barely know how to write it at all, and I struggle with it. I much prefer Python or C#. Anyway, I am required to create a cluster monitor. I have to create a simple menu with 2 options that allows the user to choose between "Cluster Status" and "Process Analytics". I have created the menu; it's very basic, and the code for it is below:
#!/bin/bash
clear
echo "Choose one of the following options: "
echo "1) Cluster Status"
echo "2) Process Analytics"
echo "3) Exit"
read ans
if [ "$ans" == "1" ]
then
    echo "Loading..."
    bash Cluster_Status.sh
elif [ "$ans" == "2" ]
then
    echo "Loading..."
    bash Process_Analytics.sh
elif [ "$ans" == "3" ]
then
    echo "Loading..."
    exit 0
fi
In this menu, I also need to make it so that when a selection has completed execution, it pauses and then returns to the main menu.
I have 5 nodes for the cluster, with sample data in them, which I can provide if anyone wants to help with this.
The parts I am most stuck on are creating the cluster status and the process analytics sections.
The cluster status script provides a screen and file printout of the current stats of the entire cluster, with all of its parameters set in a nodeconfig.read.me file, which I have. The stats need to be presented either as a sum or as an average; for example, the total number of CPUs would be a sum, while the overall CPU load would be an average. I need to present this in a table, which I cannot figure out how to do, as I am limited in the number of different functions I can use.
The process analytics section asks me to process the processes currently running on each node, if there are any, and I need to present the following stats on screen and also have them saved to a file:
• Most popular process (in terms of instances running)
• Most CPU demanding process (in terms of CPU usage)
• Most MEM demanding process (in terms of Memory usage)
• Most Disk demanding process (in terms of Disk usage)
• Most Net demanding process (in terms of Network usage)
• 5 top users of CPU/MEM/DISK & Net (separate tables)
I have to do this over PuTTY, so I am limited to the built-in functions plus cat, grep, ls, pipes, echo, tr, tee, cut, touch, head, and tail. These are the only things I can use when writing this, and because of that limitation I am stuck trying to work out how to write it. I have researched how to do this for the last week, and I still do not know how to complete it. I am not asking for this assignment to be completed by anyone; I just want help on how to use these commands to create this script. Thank you in advance for your help, and I am sorry if this is posted in the wrong area. I am still fairly new to Stack Overflow.
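With only those tools plus bash's built-in arithmetic, sums and averages are still doable: pull the numeric field out of each node file with grep and cut, then accumulate with $(( )). A minimal sketch, assuming hypothetical node files node1.txt..node3.txt that each contain a line like CPUS=4 (the real file names and field names depend on the sample data):

```shell
#!/bin/bash
# Minimal sum/average sketch using only grep, cut, echo and bash arithmetic.
# The node files and the CPUS= field are assumptions for illustration.
echo 'CPUS=4' > node1.txt
echo 'CPUS=8' > node2.txt
echo 'CPUS=2' > node3.txt

total=0
count=0
for f in node1.txt node2.txt node3.txt; do
    value=$(grep '^CPUS=' "$f" | cut -d'=' -f2)   # extract the number after "="
    total=$((total + value))                      # running sum
    count=$((count + 1))                          # node counter
done

echo "Total CPUs: $total"                    # -> Total CPUs: 14
echo "Average per node: $((total / count))"  # -> Average per node: 4
```

For the required file printout, each echo can be piped through tee, e.g. `echo "Total CPUs: $total" | tee -a report.txt`, which prints to the screen and appends to a file at the same time.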
It's a little strange, I know, but I want to limit a program's (for example WinRAR's) resource usage.
The reason:
I have an old laptop with an overheating problem, so if I try a compute-intensive task (compressing a >10 GB folder), my laptop overheats and shuts down.
The question:
Is it possible to limit an application's resource/CPU usage? For example, can I somehow set WinRAR to use only 50% of my CPU?
I use Windows 8.1, but answers for other operating systems are welcome.
See Are there solutions that can limit the CPU usage of a process? for a general answer.
WinRAR itself has the command line switch -ri; see the page in WinRAR's help titled:
Switch -RI<p>[:<s>] - set priority and sleep time
For example using a command line like
WinRAR.exe a -ri1:100 Backup.rar *
results in compressing all files in the current working directory with default settings, using the lowest task priority and a 100 ms sleep time between each read or write operation.
I want to calculate the bytes sent and received by a particular process, and I want to use PowerShell for that.
Something I can do using Resource Monitor -> Network Activity.
How can I do that using Get-Counter?
There is a really good Scripting Guy article on using the Get-Counter cmdlet here:
Scripting Guy - Get-Counter
The trick will be finding the counter that you need, and I don't think you can get the granularity you're after, as these are the same counters that PerfMon uses. They are focused more on the whole network interface than on the individual processes using it. That said, if your process is the only thing using the given interface, it should do the trick nicely.
Have a look at the Network Interface options available for a start:
(Get-Counter -ListSet "Network Interface").Paths
You can't, it seems. I'm absolutely unable to find the counters the performance monitor is reading from, though other people may chime in. There may be some other way than get-counter too, but that is what you specifically requested.
Looking through the counters, the closest thing you will find is the "IO Read Bytes/sec" and "IO Write Bytes/sec" counters on the process object.
The problem with those is that they count more than just network activity. The description in perfmon says:
"This counter counts all I/O activity generated by the process to
include file, network and device I/Os."
That being said, if you know that the process you want to monitor only or mainly writes to the network connection, this may be better than not measuring anything at all.
You'd go about it like this (I'll use Chrome as an example since it is conveniently running and using data right now):
get-counter "\Process(chrome*)\IO Read Bytes/sec"
This will just give you a one-time reading. If you want to keep reading, you can add the -Continuous switch.
The PerformanceCounterSampleSet object that is returned is not exactly pretty to work with, but you can find the actual reading in $obj.countersamples.cookedvalue.
The list will be fairly long (if you browse like me). Chrome is running in many separate processes, so we'll do a bit of math to get them all added up, and presented in KB.
Final result:
get-counter "\Process(chrome*)\IO Read Bytes/sec" -Continuous | foreach {
    [math]::round((($_.countersamples.cookedvalue | measure -sum).sum / 1KB), 2)
}
Running this will continuously output a reading of how many KB/s Chrome is using.
It's pretty easy on every platform to see the amount of free memory, but I need to get the value in a batch script.
The good old mem command is limited to 64 MB.
What should I use?
You can use performance counters. There are a few ways to query performance counters from the command line. Probably the simplest way is with the typeperf command. The following example displays one sample (-sc 1) of the “Available Mbytes” counter from the “Memory” object on the “kennypc” computer.
typeperf "\\kennypc\memory\available mbytes" -sc 1
There are a variety of performance counters to choose from to get just the results you need. The Performance Monitor snap-in (perfmon.msc) can be used to browse through the available performance counters.
WMI can help.
wmic os get FreePhysicalMemory
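One gotcha: wmic prints a header line before the value, so a script has to skip that line before using the number (in a batch file this is usually done with a `for /f "skip=1"` loop over the wmic output). A small sketch of the same header-skipping step as a pipeline, using simulated wmic output since the real command only exists on Windows:

```shell
# Simulated `wmic os get FreePhysicalMemory` output: header line, then the value in KB.
# tail -n +2 drops the header; head -n 1 keeps just the number.
printf 'FreePhysicalMemory\n8123456\n' | tail -n +2 | head -n 1   # -> 8123456
```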
I am looking for a robust way to copy files over a Windows network share that is tolerant of intermittent connectivity. The application is often used on wireless, mobile workstations in large hospitals, and I'm assuming connectivity can be lost either momentarily or for several minutes at a time. The files involved are typically about 200KB - 500KB in size. The application is written in VB6 (ugh), but we frequently end up using Windows DLL calls.
Thanks!
I've used Robocopy for this with excellent results. By default, it will retry every 30 seconds until the file gets across.
I'm unclear as to what your actual problem is, so I'll throw out a few thoughts.
Do you want restartable copies (with such small file sizes, that doesn't seem like it'd be a big deal)? If so, look at CopyFileEx with COPY_FILE_RESTARTABLE.
Do you want verifiable copies? Sounds like you already have that by verifying hashes.
Do you want better performance? It's going to be tough, as it sounds like you can't run anything on the server. Otherwise, TransmitFile may help.
Do you just want a fire-and-forget operation? I suppose shelling out to Robocopy or TeraCopy or something would work, but that seems a bit hacky to me.
Do you want to know when the network comes back? IsNetworkAlive has your answer.
Based on what I know so far, I think the following pseudo-code would be my approach:
sourceFile = Compress("*.*");
destFile = "X:\files.zip";
int copyFlags = COPY_FILE_FAIL_IF_EXISTS | COPY_FILE_RESTARTABLE;
while (CopyFileEx(sourceFile, destFile, null, null, false, copyFlags) == 0) {
    do {
        // optionally, increment a failure counter to break out at some point
        Sleep(1000);
    } while (!IsNetworkAlive(NETWORK_ALIVE_LAN));
}
Compressing the files first saves you from tracking which files you've successfully copied and which you need to restart. It should also make the copy go faster (smaller total file size, and one larger file instead of many small ones), at the expense of some CPU time on both sides. A simple batch file can decompress it on the server side.
Try using BITS (Background Intelligent Transfer Service). It's the infrastructure that Windows Update uses, is accessible via the Win32 API, and is built specifically to address this.
It's usually used for application updates, but should work well in any file moving situation.
http://www.codeproject.com/KB/IP/bitsman.aspx
I agree with Robocopy as a solution... that's why the utility is called "Robust File Copy".
I've used Robocopy for this with excellent results. By default, it will retry every 30 seconds until the file gets across.
And by default, a million retries. That should be plenty for your intermittent connection.
It also does restartable transfers, and with the /IPG switch you can even throttle transfers by inserting a gap between packets, assuming you don't want to use all the bandwidth while other programs are using the same connection.
How about simply sending a hash before or after you send the file, and comparing it with a hash of the file you received? That should at least make sure you have a correct file.
If you want to go all out, you could do the same process for small parts of the file, then join the pieces on the receiving end.
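The whole-file check can be sketched as follows; sha256sum and the file names here are assumptions for illustration (on Windows you would compute the hashes some other way, e.g. certutil -hashfile):

```shell
# Sketch: verify a copy by comparing SHA-256 hashes of source and destination.
printf 'sample payload' > original.bin
cp original.bin copied.bin          # stand-in for the real network copy

src_hash=$(sha256sum < original.bin | cut -d' ' -f1)
dst_hash=$(sha256sum < copied.bin | cut -d' ' -f1)

if [ "$src_hash" = "$dst_hash" ]; then
    echo "copy verified"            # -> copy verified
else
    echo "hash mismatch - retry the copy"
fi
```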
You could use Microsoft SyncToy (free).
http://www.microsoft.com/Downloads/details.aspx?familyid=C26EFA36-98E0-4EE9-A7C5-98D0592D8C52&displaylang=en
Hm, it seems rsync does it, and does not need the server/daemon/install I thought it did: just $ rsync src dst.
SMS, if it's available, works.