When running an intensive Ruby method, I noticed it uses only 25% of the CPU while about 70% sits idle. Is there any way to configure it to use more? I'm on Windows 7, ruby 2.0.0.
You probably have 4 CPU cores, and you're running 1 Ruby process. In MRI, 1 Ruby process can use at most 1 CPU core at a time: the default implementation of Ruby currently cannot run more than 1 thread in parallel. For that, you may want to try JRuby or another implementation like Rubinius that allows truly parallel threads. You'll probably need to learn a bit about multi-threading to understand this fully; start by reading some basic tutorials and then questions like "Does ruby have real multithreading?".
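To see the limitation in practice, here is a minimal sketch (the script, thread count, and iteration count are assumptions): a CPU-bound task split across four threads. Under MRI the threads take turns on one core; under JRuby they can run on all cores in parallel, so the wall-clock time drops.

```ruby
# cpu_test.rb - CPU-bound work split across threads.
# On MRI, the GIL means these threads run one at a time (~1 core used);
# on JRuby, they can execute in parallel across cores.
require 'benchmark'

def burn
  x = 0
  10_000_000.times { x += 1 }
  x
end

time = Benchmark.realtime do
  threads = 4.times.map { Thread.new { burn } }
  threads.each(&:join)
end
puts "4 threads took #{time.round(2)}s"
```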
When the process is running, open Task Manager, right-click the program and click "Go to process", then right-click the process, open "Set priority", and select "High".
Important: never set an application to "Realtime"; it can cause serious problems.
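Note that priority only affects scheduling; it won't let a single-threaded process use more than one core. For completeness, the same setting can be applied at launch from the command line (the script name here is an assumption):

```bat
REM cmd.exe: launch a Ruby script at high priority
start /high ruby intensive_script.rb
```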
References:
http://www.tomshardware.com/forum/57576-63-maximum-capacity-application
Related
I'm working on a Ruby on Rails application that has a memory leak, so eventually it crashes when there's no more memory.
However, in the final stage it's basically only running the GC and processing very few requests, essentially DoS-ing itself. For my application this DoS phase lasted anywhere between 1 and 6 hours!
I tried to locate the memory leak, but no luck so far, so now I want to find a workaround for the production server.
Is there a way to configure the MRI Ruby GC so that when it reaches the memory limit it just crashes? I mean, crash the first time Ruby tries to allocate more memory and the operating system denies it.
As far as I know, you cannot do that.
But you have other options:
Set up something in your system that prevents Ruby from using too much memory (the OOM killer, maybe?)
Set up your web server to kill itself when it uses too much memory - like in this gem (a minimal sketch of the same idea follows below)
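As a rough illustration of the second option (this is a sketch, not the gem's actual API; the limit, signal, and supervisor behavior are all assumptions): a background thread watches the process RSS and shuts the worker down once it crosses a limit, so the process supervisor can spawn a fresh one.

```ruby
# Minimal memory watchdog sketch - not the gem mentioned above.
MAX_RSS_KB = 512 * 1024  # assumed limit: 512 MB

Thread.new do
  loop do
    rss_kb = `ps -o rss= -p #{Process.pid}`.to_i  # RSS in KB (Linux/macOS)
    if rss_kb > MAX_RSS_KB
      warn "RSS #{rss_kb} KB exceeds #{MAX_RSS_KB} KB, shutting down"
      Process.kill("QUIT", Process.pid)  # e.g. Unicorn treats QUIT as a graceful stop
    end
    sleep 30
  end
end
```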
It may be obvious, but I can't see a viable use case for Delayed Job: because of a specific behavior of Ruby's garbage collector, it doesn't release memory back to the OS, so sooner or later the delayed_job process takes up all the memory anyway, and the only remedy is to restart it.
But if I restart the delayed_job process while a task is running, that task will never be completed. There is probably some workaround to restart the task later, but this approach seems ugly to me.
I tried real jobs and a simple computation with no variables, symbols, or references, so I don't think that "my code leaks". Still, every new job increases the memory of the delayed_job process.
Maybe I'm using Delayed Job for something it wasn't designed for? Or could it be an environment problem (I tried it both on a local machine and on a VPS)?
Tested on: Ubuntu 14.04 and Debian 6 (both x86), Rails 3.2, delayed_job 4.0.2, delayed_job_active_record 4.0.1, ruby 2.1.2
I could give some code examples, but, as I mentioned, I tried both a real job and a simple computation, so I'll omit them unless they're significant; my mistakes seem to be more fundamental.
Given my conditions - tasks can run for a couple of minutes, read and write about 100K records to the database, require a lot of computation, can't be interrupted, and are limited to maybe 10-20 per day - my only guess is to use Resque instead, because it forks a process for every job, so memory should not accumulate over time.
So, am I really doing something wrong, or is it in the nature of DJ to occupy all the memory and require a restart - and if I can't restart it, should I avoid its approach altogether?
Everything I've read on the internet (not that much, by the way) says it's a quirk of Ruby's GC that it doesn't release memory back to the OS, and some advise profiling the code for unreferenced objects (that sounds the most realistic for my case, but I tried plenty of code that doesn't create any objects, and I explicitly set everything to nil and call GC.start).
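If it helps to narrow things down, here is a minimal sketch (the job and its workload are hypothetical stand-ins) for logging memory around each job, to see whether the growth is live Ruby objects or heap the GC has freed but not returned to the OS:

```ruby
class HeavyJob
  def perform
    log_memory("before")
    do_work                     # hypothetical placeholder for the real task
    GC.start                    # force a full GC, as in the experiments above
    log_memory("after GC.start")
  end

  private

  def do_work
    100_000.times.map { |i| "record-#{i}" }  # stand-in for the real 100K-record workload
  end

  def log_memory(label)
    rss_kb = `ps -o rss= -p #{Process.pid}`.to_i  # process RSS in KB (Linux/macOS)
    counts = ObjectSpace.count_objects
    live   = counts[:TOTAL] - counts[:FREE]       # live heap slots
    puts "#{label}: RSS=#{rss_kb} KB, live object slots=#{live}"
  end
end
```

If RSS keeps climbing while the live-object count stays flat after GC.start, the memory is being held by the allocator rather than by your objects, which matches the GC behavior described above.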
Running on a unibody MacBook Pro, OS X 10.6 Snow Leopard, dual-core. I notice Activity Monitor shows the ruby process running at 50% consistently...
Is anyone seeing the same results? Is this 'normal'?
EDIT:
To clarify further: my hands are not on the keyboard. The Rails server and Ruby console are running, but without any activity. I am also running Rails 3.1 RC1.
It depends on what you are doing. If you're doing nothing at all, then no, this isn't normal. If you are actively developing, then you might have created an endless loop.
Usually, one infinite loop uses one CPU core (50% usage in your case, because your Mac is dual-core).
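For illustration, even an empty loop reproduces that pattern:

```ruby
# Pure CPU spin: no sleep, no IO.
# It keeps one of the two cores fully busy, i.e. roughly 50%
# of total CPU on a dual-core machine.
loop do
  # intentionally empty
end
```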
It's not common, no. I've seen it happen a few times, but I don't know why. It's not an endless loop in my code; I suspect a race condition somewhere deep in the stack or in the interpreter. It happens most often after a resume-from-sleep event, I think.
I just kill the process.
I've restarted the server. The CPU usage of the ruby process seems to have subsided to a more reasonable percentage - at the moment, 0.2.
I'm starting to suspect that something in Rails 3.1 RC1 may have triggered the CPU spike. After all, it is not yet a stable release of Rails. I will observe how this plays out.
I'm having issues with a site loading on my server; I was running 'top' and saw this:
[screenshot of top output: http://share.shpigford.com/images/ruby_processes-20091112-103834.png]
Dozens of ruby processes...and I have no idea what that means or if that's normal. :)
I have a feeling that your PassengerMaxPoolSize is set too high for such a small amount of memory. Totaling that up, your ruby processes are eating 81% of your available memory.
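If that's the cause, the pool is capped in the web server config. A sketch assuming Apache; the values are assumptions you'd tune to your RAM and per-process size:

```apache
# Cap the number of concurrent Passenger application processes
PassengerMaxPoolSize 4
# Shut down application processes idle for more than 5 minutes
PassengerPoolIdleTime 300
```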
See this related discussion on ServerFault. This question should probably be migrated over there.
I don't know what is normal on your system.
In a server production environment ruby scales by adding processes, so I would expect to see at least one process per CPU core. (Real or virtual - my i7 920 has 8 virtual cores and needs 8 ruby processes for a 100% CPU load.)
Dozens sounds like a lot, but it's possible if your site runs lots of ruby for miscellaneous daemon processes.
I think you'll have to ask someone who knows what is supposed to run on the system.
I'm running an application right now that seems to be running at full throttle, but even though the fan seems to be spinning at its max and Activity Monitor reports that the application is using 100% of the processor, I suspect that at most it is using 100% of only one of the two cores on my machine.
How can I tell OS X to allow an application to use 100% - or as much as the OS can allow - of my computer's processing power? I have tried terminal commands like "nice" and "renice" to raise the priority of this process, but I still can't get it to run at full throttle.
I would also like to know how to do the opposite: set a limit on the processor usage of an app, for example capping app X at 20%.
Is this possible to do without modifying the code of the app?
The answer to this depends on whether your application is multi-threaded or not. If it is a single-threaded application (which it is, unless you have specifically made it multi-threaded), then the process will run on one core of your multi-core hardware. There is nothing you can do about this; it's a function of the underlying operating system.
If your program is multi-threaded, then it is possible for different threads to execute on separate cores. This will increase the overall usage of the process and allow figures greater than 100%.
You cannot, however, force the machine to use all of the processing power available, but you can influence it with nice.
To reduce the amount of processor used, you can use nice to lower the priority of the process. If you are root, you can also use nice to increase the priority of your process.
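For reference, typical invocations look like this (the PID and script name are just examples):

```sh
# Start a process at lower priority (higher niceness = lower priority)
nice -n 10 ruby heavy_script.rb

# Lower the priority of an already running process (PID 1234)
renice 10 -p 1234

# Raising priority (negative niceness) requires root
sudo renice -5 -p 1234
```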