Consider the following scenario:
tempfile: bar
	generate tempfile (using bar)
foo: tempfile
	generate foo (using bar)
Now, when running make foo, I want foo to be rebuilt only when bar has changed. What actually happens is that make removes the tempfile (which I don't mind in itself) and then rebuilds foo every single time, even though bar has not changed.
Any advice?
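If this is GNU make and tempfile is being deleted because make treats it as an intermediate file, one hedged fix (a minimal sketch, assuming GNU make) is to mark it .SECONDARY so it is kept on disk:

.SECONDARY: tempfile

With tempfile kept around, its timestamp is compared against bar as usual, so foo is only rebuilt when bar actually changes.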
I'm trying to write a wrapper script around another program (e.g. foo_wrapper around foo), which simply overrides some flags on every call to foo. I want users to be able to interact with foo_wrapper and foo exactly the same way, including seeing the same bash completion output.
For example, foo_wrapper.sh should simply add --my_flag on every call to foo, but it's not enough to prepend or append flags to the argument list; the wrapper has to put them in the right location because of the way foo is implemented. So if the user calls foo_wrapper bar baz, I want it to map to foo bar --my_flag baz. This is why I assume I cannot simply use a bash alias.
This would be a simple script if I didn't need bash completion to work, but without bash completion this thing is not going to be nearly as useful. Can anyone help me figure out what to do here? I discovered the compgen builtin for bash, but I'm having trouble seeing how/if I can use it for this.
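For what it's worth, here is a minimal sketch of the two pieces, assuming --my_flag goes right after the first argument and that foo's existing completion function is named _foo (check the real name with complete -p foo):

#!/usr/bin/env bash
# foo_wrapper: insert --my_flag after the first argument,
# pass everything else through unchanged
exec foo "$1" --my_flag "${@:2}"

and then, in your completion setup:

# reuse foo's completion function for the wrapper
complete -o default -F _foo foo_wrapper

The completion function will see foo_wrapper as the command word, which is usually harmless; compgen only comes into play if you end up writing a completion function of your own from scratch.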
Let's say I have a library package called foobar.
Over time it became big and heavy.
Fortunately it's separable: I managed to split its functionality into two separate packages, foo and bar; most clients will only need to use one or the other.
Since my library is already in use by many clients, for compatibility I still want to maintain a foobar package as a proxy to the current functionality found in both foo and bar.
How does one achieve this in Go?
One way that comes to mind is to create aliases in foobar for each struct/function in foo and bar. So if foo defines F() and bar defines B(), I would have in foobar:
var (
	F = foo.F
	B = bar.B
)
But I am hoping for an easier/cleaner way.
Creating an alias package is the only way.
But your attempt only partly works: it covers functions, variables, and consts, but not types. For types you have to duplicate the type definition in foobar.
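For illustration, a minimal sketch of such a proxy package (the import paths and the FooType name are made up; note that since Go 1.9, a type alias of the form type T = foo.T forwards a type as well, so duplication is only needed on older Go versions):

// Package foobar is kept for backward compatibility; it simply
// forwards to the new foo and bar packages.
package foobar

import (
	"example.com/bar"
	"example.com/foo"
)

// Functions and variables forward with plain assignment.
var (
	F = foo.F
	B = bar.B
)

// Consts forward the same way (C is hypothetical).
const C = foo.C

// Types forward with a type alias on Go 1.9+ (FooType is hypothetical).
type FooType = foo.FooType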
I wouldn't do this. Just have foobar around in version 1 and start anew with foo and bar (maybe directly in version 2).
I inherited maintenance on an app that uses eval() as a way to evaluate rules written in Ruby code in a rules engine. I know there are a lot of other ways to do it, but the code base so far is pretty big, and changing it to something else would be prohibitive time-wise at this point; so assume I'm stuck using eval() for the moment.
The rules as written typically call up some of the same objects from the database as each other, and the rules writer gave the variables in the rules the same names as each other. This is resulting in pages and pages of "already initialized constant" warnings in the console during development.
I'm wondering a couple things:
First, it feels like those warnings are slowing down the execution of the program in the dev environment, and so I'm wondering if they are a big performance hit in the production environment; specifically, having those warnings pop, not eval() itself, which I know is a hit.
Second, is there any way to "namespace" the execution of each rule so that it isn't defining its variables in the same scope as all the other evals in the request, to avoid that warning popping all over the place? I know I could rewrite all the rules to use ||= syntax or to check whether a name has already been defined, but there are quite a lot of them, so I'd rather do it from the code that runs the eval()s, if possible.
** update with example rule **
A question has a rule about when it's to be displayed to a user. For example, if the user has stated that they live in an apartment, another question might need to be shown to ask what size the apartment building is. So the second question's rule_text might look like:
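# NB: a capitalized name like UserLivesInApartment is a Ruby constant,
# which is why re-evaluating rules that assign it produces the
# "already initialized constant" warnings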
UserLivesInApartment = Question.find_by_name "UserLivesInApartment"
UserLivesInApartment.answer_for(current_user)
The code that calls the eval ensures there's a current_user variable in scope prior to evaluating.
Uh, eval is perhaps not the most golden of standards. You could probably fire up a DRb instance and run the stuff in that instead of eval; that way you would have at least some control over what is happening and not pollute your own namespace.
http://segment7.net/projects/ruby/drb/introduction.html
Edit: added another answer for running the code in the same process:
I don't know how your rule code looks, but it might be possible to wrap a module around it:
# create a module
module RuleEngineRun1;end
# run code in module
RuleEngineRun1.module_eval("class Foo;end")
# get results
#....
# cleanup
Object.send(:remove_const, :RuleEngineRun1)
You can also create an anonymous module with Module.new { #block to be module eval'd } if you need to run code in parallel.
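For instance, a minimal sketch (rule_text stands in for a rule's source string; the string form of module_eval is what keeps constant assignments contained inside the module):

# each rule gets its own throwaway module, so constants like
# UserLivesInApartment no longer collide between rules
sandbox = Module.new
result = sandbox.module_eval(rule_text)

One caveat: local variables such as current_user are not visible inside the string form of module_eval, so they would have to be provided some other way (for example as a method on the module).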
In later Rubies you can pass -W0 to run your code without printing warnings, but doing so lets potential errors go unnoticed:
$ cat foo.rb
FOO = :bar
FOO = :bar
$ ruby foo.rb
foo.rb:2: warning: already initialized constant FOO
$ ruby -W0 foo.rb
You could also run your eval inside a Kernel.silence_warnings block, but that might be just as devastating if you actually run into some real problems with the eval'd code; see
Suppress Ruby warnings when running specs
I would like to create a static library (.lib) in Windows that can be used in subsequent builds as a "backup" for undefined functions.
For instance let's say I have foobar.lib which has definitions for:
FOO
BAR
and I have some other program that defines FOO only and is to be built into a DLL that must export FOO and BAR. I want to be able to use foobar.lib to automatically export the default definition of BAR in the resulting DLL (and ignore the definition of FOO in foobar.lib).
I have tried sending foobar.lib to the linker, but I get a multiply defined symbols error (/FORCE is supposed to override that, but with strong warnings that it probably won't work as expected). I've also tried using /NODEFAULTLIB:foobar.lib, but then it completely ignores the library and says BAR is undefined.
I am almost 100% certain there is a way to do this, because I use an application (Abaqus) that does exactly this to let users write plug-ins without defining all of the required exports for the plug-in DLL. And they do not use the /FORCE option.
I figured out a solution (not sure if it is the only or best solution).
I was trying to define FOO and BAR in foobar.lib using one object file (foobar.obj). If I split it up into foo.obj and bar.obj and then use those to create foobar.lib, the linker pulls in only the object files it needs to resolve undefined symbols, so it effectively ignores the ones whose symbols are already defined.
So the short answer is: one function per object file for the static library.
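For the record, a hedged sketch of the build steps with the standard MSVC tools (file names are illustrative, and the exports are assumed to be marked with __declspec(dllexport)):

:: one defined function per source file, hence per object file
cl /c foo.c bar.c
:: bundle the default definitions into the backup library
lib /OUT:foobar.lib foo.obj bar.obj
:: plugin.obj defines FOO itself; the linker pulls only bar.obj out of
:: foobar.lib to resolve BAR, so there is no multiply-defined-symbol error
cl /c plugin.c
link /DLL /OUT:plugin.dll plugin.obj foobar.lib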
One case is when the target does not exist; that much I understand. Another is comparing the modification timestamps of the target and its prerequisites. How does that work in more detail? What is the logic of comparing target and prerequisite timestamps, and what happens when there are multiple prerequisites?
make first gets the modification time of the target, then compares that value to the modification time of each prereq, in order from left to right, stopping as soon as it finds any prereq that is newer than the target (since a single newer prereq is sufficient to require the target be rebuilt).
For example, suppose you have a rule like this:
foo: bar baz boo
Further, suppose that the modification times on these files are as follows:
foo: 4
bar: 3
baz: 6
boo: 2
In this case, make will compare the modification time of foo (4) to the modification time of bar (3); since bar is older, make will move on and compare the modification time of foo (4) to the modification time of baz (6). Since baz is newer, make will decide that foo must be rebuilt, and will stop checking the prereqs of foo (so boo will never be checked).
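You can reproduce that ordering yourself (a sketch, assuming a Makefile containing the rule above; the sleep calls just keep the modification times distinct):

$ touch boo; sleep 1; touch bar; sleep 1; touch foo; sleep 1; touch baz
$ make foo        # baz is newer than foo, so make runs foo's recipe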
If you have multiple dependency lines for the output target, as in:
foo: bar baz
foo: boo
The prereqs in the second and subsequent dependency lines are simply appended to the end of the list of prereqs for the output target -- that is, this example is exactly equivalent to the first example above.
In general, all make variants behave this way, although some variants have extensions that modify this behavior (for example, GNU make includes order-only prerequisites; Sun make has "keep state" features; etc).
Unix make has pretty complicated inference rules to determine if the target needs to be rebuilt. For GNU make you can dump them by running 'make -p' in a directory that doesn't have a Makefile.
Also, rules can be chained; there is more explanation about that here
Standard Unix make and Microsoft nmake work in a similar fashion.