How can I detect which modules depend on which modules in Ruby?

What tools can determine which modules have methods that are calling methods from other modules in Ruby?
Background: I'm partway through breaking an 808-line module into smaller modules, having created 12 sub-modules so far. However, some of the methods in one sub-module call methods in another sub-module. This may or may not be OK, depending on whether the module of the called method is meant to be common functionality.
module DisplayStatistics1
  def display_statistics_1_foo
    calculate_statistics_foo # call a method that's in CalculateStatistics - this is ok
    display_statistics_2_bar # call a method that's in DisplayStatistics2 - this is bad
  end
  # other methods omitted
end

# modules DisplayStatistics2 and CalculateStatistics omitted

class ExampleClass
  include DisplayStatistics1
  include DisplayStatistics2
  include CalculateStatistics
end
Ideally the analysis tool would show that DisplayStatistics1 has dependencies on DisplayStatistics2 as well as on CalculateStatistics.
Update: Maybe I shouldn't have done it this way - maybe I should have split them up into classes instead. That way, I'd have known for sure what depended on what!

While I'm not aware of a static analysis tool for Ruby that does this, the closest thing to what you want is probably ruby-prof. It can generate a call graph in several formats, including an HTML call tree and a GraphViz dot file you can render as a box-and-line diagram.
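For example, a minimal sketch using ruby-prof (the exact API varies a little between versions, and ExampleClass here is the class from the question) that profiles a run and writes both an HTML call graph and a GraphViz dot file:

require 'ruby-prof'

result = RubyProf.profile do
  # exercise the code whose call graph you want to inspect
  ExampleClass.new.display_statistics_1_foo
end

# HTML call graph
File.open("callgraph.html", "w") do |f|
  RubyProf::GraphHtmlPrinter.new(result).print(f)
end

# GraphViz output; render with: dot -Tpng callgraph.dot -o callgraph.png
File.open("callgraph.dot", "w") do |f|
  RubyProf::DotPrinter.new(result).print(f)
end

Keep in mind that a profiler only records calls that actually happen at runtime, so the dependency picture is only as complete as the code paths you exercise.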

Related

When to use a module and when to use a global scope

This is a simple question: Should I have a module that contains all my classes (and submodules):
module ProjectName
  class Something
    # code
  end

  module Abc
    # code
  end
end
Or simply everything in a global scope:
class Something
  # code
end

module Abc
  # code
end
It is considered good practice not to pollute the global scope. Namespacing your application into modules that encapsulate related behaviour makes it easier to comprehend, helps avoid naming conflicts, and lets you easily port parts of your code into other applications or contexts.
In Ruby it also gives you a natural place to store module-wide constants, and the option to add methods that don't need a containing object directly to the module.
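A minimal sketch of both points (ProjectName and everything inside it are made-up names):

module ProjectName
  VERSION = "1.0.0"            # module-wide constant: ProjectName::VERSION

  # a method that needs no containing object, called as ProjectName.root
  def self.root
    File.expand_path("..", __dir__)
  end

  class Something
    def version
      VERSION                  # constants resolve naturally inside the namespace
    end
  end
end

ProjectName::VERSION               # => "1.0.0"
ProjectName::Something.new.version # => "1.0.0"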
In some languages (notably JavaScript), scoping also has an impact on performance, as keeping objects in the global scope can prevent them from becoming eligible for garbage collection.

ruby module_function vs including module

In Ruby, I understand that module functions can be made available without mixing in the module by using module_function, as shown below. I can see how this is useful, so you can use the function without mixing in the module.
module MyModule
  def do_something
    puts "hello world"
  end

  module_function :do_something
end
My question, though, is why you might want to have the function defined both of these ways.
Why not just have
def MyModule.do_something
OR
def do_something
In what kind of cases would it be useful to have the function available to be mixed in, or to be used as a static method?
Think of Enumerable.
This is the perfect example of when you need to include a module. If your class defines #each, you get a lot of goodness just by including Enumerable (#map, #select, etc.). This is the only case in which I use modules as mixins - when the module provides functionality in terms of a few methods defined in the class you include the module in. I could argue that this should be the only case in general.
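For instance (NumberBag is just an illustrative name), defining #each is all it takes:

class NumberBag
  include Enumerable

  def initialize(*numbers)
    @numbers = numbers
  end

  # Enumerable builds #map, #select, #sort, etc. on top of #each
  def each(&block)
    @numbers.each(&block)
  end
end

bag = NumberBag.new(3, 1, 2)
bag.map { |n| n * 2 } # => [6, 2, 4]
bag.select(&:odd?)    # => [3, 1]
bag.sort              # => [1, 2, 3]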
As for defining "static" methods, a better approach would be:
module MyModule
  def self.do_something
  end
end
You don't really need to call #module_function. I think it is just weird legacy stuff.
You can even do this:
module MyModule
  extend self

  def do_something
  end
end
...but it won't work well if you also want to include the module somewhere. I suggest avoiding it until you learn the subtleties of Ruby metaprogramming.
Finally, if you just do:
def do_something
end
...it will not end up as a global function, but as a private method on Object (there are no functions in Ruby, just methods). There are a few downsides. First, you don't have namespacing - if you define another method with the same name, whichever definition is evaluated last wins. Second, if you have functionality implemented in terms of #method_missing, a private method on Object will shadow it. And finally, monkey patching Object is just evil business :)
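You can see the "private method on Object" behaviour directly (greet is a made-up name; run this as a script, since IRB can behave slightly differently at the top level):

def greet
  "hi"
end

Object.private_method_defined?(:greet) # => true
Object.new.greet                       # => NoMethodError: private method `greet' called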
EDIT:
module_function can be used in a way similar to private:
module Something
  def foo
    puts 'foo'
  end

  module_function

  def bar
    puts 'bar'
  end
end
That way, you can call Something.bar, but not Something.foo. If you define any other methods after this call to module_function, they will also be available without mixing the module in.
I don't like it for two reasons, though. First, modules that are both mixed in and have "static" methods sound a bit dodgy. There might be valid cases, but it won't be that often. As I said, I prefer either to use a module as a namespace or mix it in, but not both.
Second, in this example, bar would also be available to classes/modules that mix in Something. I'm not sure when that is desirable, since either the method uses self, in which case it has to be mixed in, or it doesn't, in which case it doesn't need to be mixed in.
I think using module_function without passing the name of a method is more common than passing one. The same goes for private and protected.
It's a good way for a Ruby library to offer functionality that does not use (much) internal state. So if you, for example, want to offer a sin function and don't want to pollute the "global" (Object) namespace, you can define it as a class method under a constant (Math).
However, an app developer writing a mathematical application might need sin every two lines. If the method is also an instance method, she can just include the Math (or My::Awesome::Nested::Library) module and then call sin directly, which is exactly how the stdlib Math module works.
It's really about making a library more comfortable for its users. They can choose for themselves whether they want the functionality of your library at the top level.
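The stdlib Math module is a concrete example of this dual interface - its methods are module functions, so both call styles work:

Math.sin(0)           # => 0.0, module-level call, nothing mixed in

class Trig
  include Math        # the same methods become private instance methods

  def quarter_turn
    sin(PI / 2)       # => 1.0, called like a local function; PI also resolves via Math
  end
end

Trig.new.quarter_turn # => 1.0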
By the way, you can achieve functionality similar to module_function by putting extend self in the first line of the module. To my mind, it looks better and makes things a bit clearer to understand.
Update: More background info in this blog article.
If you want to look at a working example, check out the chronic gem:
https://github.com/mojombo/chronic/blob/master/lib/chronic/handlers.rb
and Handlers is being included in the Parser class here:
https://github.com/mojombo/chronic/blob/master/lib/chronic/parser.rb
He's using module_function to send the methods from Handlers to specific instances of Handler using that instance's invoke method.

Applying how ActiveRecord uses modules to other Ruby projects

I decided to dig into the ActiveRecord code in Rails to try to figure out how some of it works, and was surprised to see that it is composed of many modules, which all seem to get included into ActiveRecord::Base.
I know Ruby modules provide a means of grouping related and reusable methods that can be mixed into other classes to extend their functionality.
However, most of the ActiveRecord modules seem highly specific to ActiveRecord. There seem to be references to instance variables in some modules, suggesting the modules are aware of the internals of the overall ActiveRecord class and the other modules.
This got me wondering about how ActiveRecord is designed and how this logic could or should be applied to other Ruby applications.
Is it a common 'design pattern' to split large classes into modules that are not really reusable elsewhere, simply to break up the class file? And is it seen as good or bad design when modules make use of instance variables that are perhaps defined by a different module or another part of the class?
In cases where a class can have many methods and it would become cumbersome to have them all defined in one file, would it make as much sense to simply reopen the class in other files and define more methods in there?
In a command line application I am working on, I have a few classes that do various things, but also a top-level class that provides an API for the overall application. What I've found is that this class is becoming bogged down with a lot of methods that really just hand off work to other classes; it's the glue that holds the pieces of the application together. I guess I am wondering whether it would make sense to split some of the related methods into modules, or to reopen the class in different code files. Or is there something else I am not thinking of?
I've created quite a few modules that I didn't intend to be highly reusable. It makes it easier to test a group of related methods in isolation, and classes are more readable if they're only a few hundred lines long rather than thousands. As always, there's a balance to be struck.
I've also created modules that expect the including class to define instance methods so that methods defined on the module can use them. I wouldn't say it's terribly elegant, but it's feasible if you're only sharing code between a couple of classes and you document it well. You could also raise an exception if the class doesn't define the methods you want:
module Aggregator
  def aggregate
    unless respond_to?(:collection)
      raise NotImplementedError, "Classes including Aggregator must define #collection"
    end
    # ...
  end
end
I'd be a lot more hesitant to depend on shared instance variables.
The biggest problem I see with re-opening classes is simply managing your source code. Would you end up with multiple copies of aggregator.rb in different directories? Is the load order of those files determinate, and does that affect overriding or calling methods in the class? At least with modules, the include order is explicit in the source.
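One nice property of modules here is that the method lookup order is visible right where they are included (the module names below are made up):

module Persistence; end
module Validation;  end

class Record
  include Persistence
  include Validation   # included last, so it is searched first
end

Record.ancestors
# => [Record, Validation, Persistence, Object, Kernel, BasicObject]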
Update: In a comment, Stephen asked about testing a module that's meant to be included in a class.
RSpec offers shared_examples as a convenient way to test shared behavior. You can define the module's behaviors in a shared context, and then declare that each of the including classes should also exhibit that behavior:
# spec/shared_examples/aggregator_examples.rb
shared_examples_for 'Aggregator' do
  describe 'when aggregating records' do
    it 'should accumulate values' do
      # ...
    end
  end
end
# spec/models/model_spec.rb
describe Model do
  it_should_behave_like 'Aggregator'
end
Even if you aren't using RSpec, you can still create a simple stub class that includes your module and then write the tests against instances of that class:
# test/unit/aggregator_test.rb
require 'test/unit'

class AggregatorTester
  attr_accessor :collection

  def initialize(collection)
    self.collection = collection
  end

  include Aggregator
end

class AggregatorTest < Test::Unit::TestCase
  def test_aggregation
    assert_equal 6, AggregatorTester.new([1, 2, 3]).aggregate
  end
end

Ruby modules and classes

I am just starting out with Ruby and learning the concept of modules. I understand that one use of modules is to better organize your code and avoid name clashes. Let's say I have a bunch of modules like this (I haven't included the implementation, as that's not important):
module Dropbox
  class Base
    def initialize(a_user)
    end
  end

  class Event < Base
    def newFile?
    end

    def newImage?
    end
  end

  class Action < Base
    def saveFile(params)
    end
  end
end
and another module:
module CustomURL
  class Base
    def initialize(a_user, a_url, a_method, some_args, a_regex)
    end
  end

  class Event < Base
    def initialize(a_user, a_url, a_method, some_args, a_regex)
    end

    def change?
    end
  end

  class Action < Base
    def send_request(params)
    end
  end
end
I am going to have a bunch of these modules (10+, for gmail, hotmail, etc...). What I am trying to figure out is, is this the right way to organize my code?
Basically, I am using a module to represent a "service", and all services will have a common interface of classes (Base for initialization, Action for the list of actions, and Event for monitoring).
You are defining families of related or dependent classes here. Your usage of modules as namespaces for these families is correct.
Also, with this approach it would be easy to build an abstract factory for your classes if they had compatible interfaces. But as far as I can see, this is not the case with the current class design: for example, Dropbox::Event and CustomURL::Event have completely different public methods.
You could reevaluate the design of your classes and see whether it is possible for them to have a uniform interface, so that you can use polymorphism and extract something like BaseEvent and BaseAction from which all Events and Actions derive.
Update: Since you are defining services, it might be useful to define a top-level module like Service and put all your classes inside it. It will improve the modularity of your system. If in the future you refactor out some base classes for your service modules, you can put them in that top-level namespace (see the sketch after this list). Your objects will then have readable names like these:
Service::Dropbox::Event
Service::Dropbox::Action
Service::CustomURL::Event
Service::CustomURL::Action
Service::BaseEvent
Service::BaseAction
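A rough sketch of that layout, under the assumption that all events share a change? interface (everything except Dropbox is an illustrative name):

module Service
  class BaseEvent
    def initialize(user)
      @user = user
    end

    def change?
      raise NotImplementedError, "#{self.class} must implement #change?"
    end
  end

  module Dropbox
    class Event < BaseEvent
      def change?
        false # poll Dropbox here
      end
    end
  end
end

Service::Dropbox::Event.new("some_user").change? # => false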
I have some similar code at work, only I'm modeling networking gear.
I took the approach of defining a generic class with the common attributes and methods, including a generic comparator, and then sub-class that for the various models of hardware. The sub-classes contain the unique attributes for that hardware, plus all the support code necessary to initialize or compare an instance of that equipment with another.
As soon as I see the need to write a method similar to one I've already written, I think about how I can reuse that code by promoting it to the base class. Often this involves changing how I pass parameters: instead of using formal parameters, I end up passing a hash and pulling what I need from it, which keeps the method interface under control.
Because you will have a lot of sub-classes of one base class, it's important to take your time and think through how that base class should work. As you add sub-classes, refactoring the base gets harder because you will have to change the other sub-classes too. I always find I go down some blind alleys and have to back up a bit, but as the class matures that should happen less and less.
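A rough sketch of that shape (the class names and attributes are invented for illustration):

class NetworkDevice
  attr_reader :hostname, :ports

  # a hash keeps the interface stable as sub-classes grow extra attributes
  def initialize(attrs = {})
    @hostname = attrs[:hostname]
    @ports    = attrs[:ports] || 24
  end

  # generic comparator shared by every sub-class
  def ==(other)
    hostname == other.hostname && ports == other.ports
  end
end

class CoreSwitch < NetworkDevice
  def initialize(attrs = {})
    super
    @uplinks = attrs[:uplinks] || 2
  end
end

CoreSwitch.new(hostname: "sw1", ports: 48)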
As you will notice soon, there is no 'right way' of organizing code.
There are subtle differences in readability that are mostly subjective. The way you are organizing classes is just fine for releasing your code as a gem. It usually isn't needed in code that won't be included in other people's projects, but it won't hurt either.
Just ask yourself "does this make sense for someone reading my code who has no idea what my intention is?".

Ruby modules that use some common method: where to put this method?

Given some Uploader class, I have three modules included in it.
They each have their own distinct uses, and that's why I divided them into three modules. But, as it turns out, there is one method that all three modules need.
It seems obvious to me that there must be a better way than defining this method in each of the three modules. Should I make a fourth module, put this method in it, and have the original three modules include that fourth module? Or is there a better way?
Yes, it sounds like you should have a fourth module. But it sounds like you might actually need a class hierarchy rather than independent, flat modules. Apply the 'is a' test - can you say an UploaderA instance is an Uploader? If so, you should have
class Uploader
  include FourthModule # with shared functionality for all uploaders
  # or just put the shared functionality right here in the Uploader class
end

class UploaderA < Uploader
end
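If the uploaders aren't related enough for an inheritance hierarchy, the fourth-module approach from the question looks like this (ChecksumSupport and the uploader module names are made up):

require 'digest'

module ChecksumSupport
  def checksum(data)
    Digest::MD5.hexdigest(data)
  end
end

module S3Upload
  include ChecksumSupport
  # S3-specific methods...
end

module FtpUpload
  include ChecksumSupport
  # FTP-specific methods...
end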
