I have some Perl applications and modules that I use for a number of tasks.
I would like to bundle these up and place a Cocoa wrapper around them, so I can distribute them to other people.
Assuming I can force the use of the Perl bundled with OS X and include the modules I need inside my application, are there any real problems with doing this? I really do not want to re-implement everything I have already done.
There is nothing wrong with using the system-provided tools, including Perl. Things to consider:
The version of Perl changes from release to release. So if you need to support 10.4-10.7, you wind up with very diverse versions of Perl to support.
It can be tricky to include your own versions of modules, particularly if those modules rely on other modules, and most especially if those modules include compiled code.
Occasionally users mess with their system Perl more than you would like. In particular, they might install new modules or upgrade existing modules. Ideally you can say "don't do that," but it can create problems.
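One way to mitigate the last two points is to ship your modules inside the .app bundle and put that directory at the front of the module search path, so your copies shadow whatever is installed in (or has been added to) the system Perl. A minimal sketch, with a hypothetical bundle layout:

    # Prepend the app's own module directory to @INC so the bundled
    # modules win over anything installed in the system Perl.
    use FindBin;
    use lib "$FindBin::Bin/../Resources/perl-lib";   # hypothetical path inside the .app

    use My::Util;   # hypothetical bundled module, now resolved from the path above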
My team has had nightmares dealing with Net::SSL on different versions of OS X. We finally removed Perl entirely from our code base due to the headaches of managing all the different versions of Perl and of Perl modules that might be in the system libraries.
But if you keep your dependencies simple, then there's no problem using the system Perl.
As long as you employ good coding practices, it should be fine. I've never used Perl in a Cocoa application myself. You might want to check out CamelBones: http://sourceforge.net/projects/camelbones/
It's a Perl/Cocoa bridge. I've never used it, but it may help.
In node projects, package.json lets contributors know what version of node they can use to either consume or contribute to a specific project. Python uses venv to control the version used in collaborative development environments, and many other languages have similar constructs as well.
Does Go have a standardized process that allows you to do something similar?
No, but Go has the Go 1 promise of compatibility: the Go team tries very hard not to break any extant software built on 1.x, even at the expense of leaving an ugly API or unwanted behaviour in place (though this is rare). This means you don't really have to worry about specifying which version of Go you use. Go 2 is not even on the horizon, so for the foreseeable future you don't have to worry about this. Each release brings a few new features, but most Go users upgrade (because of the stable upgrade path).
https://golang.org/doc/go1compat
As for controlling which versions of your dependencies you use, at present the only solution is to put your dependencies into a vendor folder, but I think you were asking about the language specifically. A sketch of what that layout looks like is below.
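For illustration, a vendored project might be laid out like this (names are hypothetical); vendor support landed in Go 1.5 and is on by default since 1.6, and the toolchain resolves imports from vendor/ before looking anywhere else:

    myproject/
        main.go
        vendor/
            github.com/example/dep/    # vendored copy of the dependency's source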
Preface: This is not so much about how to structure code within files. I have that down. This is more about the organization of your source tree. Hopefully someone will just say, "Here's a great link on the topic." However, firsthand opinions on the subject are welcome too.
So I've done a bit of digging on this subject and find tons of material on simple structure. I guess the assumption is that, by the time you need to deal with the problems of the size of your codebase, you'll already know the answer. However, even the IDEs seem to be waging a holy war on how these projects should be structured (which is NOT what I wanted to start in this thread).
Java enforced the package structure in the language. Kudos for that. Eclipse then lets you use projects to have (potentially) independent -- we'll call them 'buckets' in this example -- buckets of related code. IntelliJ has distinct but similar concepts with 'modules' within a singleton instance of a 'project'. If you want another project, you're essentially starting from scratch.
However, RubyMine offers no such modules in ruby apps, and by default just wants to slam everything into the root directory. It allows Directories, so one could essentially just pick some arbitrary tree structure and run with it. This implies to me their intent was that all classes have access to all other classes within your project. This might have some resolution through the use of Ruby 'modules', or might just be an honor-system pattern of "don't reference that stuff."
So, to put it succinctly, say I were building a 'foo' and a 'bar' concept, and both depend on a 'util' class. Maybe I'll deploy them as gems, maybe I won't. I could:
Slam them all into one RubyMine project and just ignore the fact that 'foo' and 'bar' have no reason to be aware of each other.
Put each in its own RubyMine project. This seems like a real pain if there is any concurrent development. First of all, 'util' would have to be packaged separately and then included as an external resource in the other projects.
Neither seems particularly appealing. Thoughts?
I'd develop all three independently, then make util a gem. In the Gemfile for each of foo and bar, you can give a path to the gem so that you can develop them all concurrently without the pain of messing with version numbers and so forth (for production, you would then point it to the real gem on Rubygems or at some git repository).
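For example, a Gemfile for foo might look like this during development (the gem name and path are hypothetical):

    # Gemfile for foo
    source 'https://rubygems.org'

    # During concurrent development, point Bundler at the local checkout:
    gem 'util', path: '../util'

    # For production, swap in the released gem instead:
    # gem 'util', '~> 1.0'

Bundler loads util straight from that directory, so edits to util show up in foo immediately, with no version bumps or gem rebuilds needed.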
For project structures, check out the Ruby Packaging Standard and Gem Patterns.
Ok, I'm not trying to start a CamelCase vs. underscore discussion here; it doesn't matter what you pick, just stick to your choice.
Rather, what I would like people's opinions on is how strict and committed you should be to your choice when importing third-party libraries.
Especially in PHP there is a HUGE variety of coding styles, to the point where it's just damn near impossible to maintain one specific style throughout your codebase when you use third party libraries.
So what do you guys do? Modify those libraries to suit your conventions, write some sort of interpreting layer so that when you use those libraries the usage of them still follows your conventions? Do you just say "to hell with it" and mix it all together? Or is there some other ingenious solution that I haven't thought of (apart from simply not using libraries that don't follow your convention)?
In essence what I'm asking is; how do you manage to maintain a clean and consistent coding style when using third party libraries? Can it be done?
I say "to hell with it" and mix it all together. It can be somewhat annoying to have the mixed styles, but I don't think it's worth it to do a bunch of work to avoid this.
My team and I are starting to build up a few reusable scripts. They're reusable within our org only, as they work with proprietary apps and our particular server environment. So they're not really suitable for RubyForge or GitHub, etc.
My question is, what is the best practice for ensuring we're all using the latest and greatest scripts across all users? We pretty much run these scripts on one server, but may need to expand to others.
Should we bundle them into gem(s) and start a private gem server?
Or something simpler like a common, shareable lib directory. Perhaps with a script to download/update from our SCM?
Other ideas?
Thanks....
This depends on some factors, like how many people need to change the code (only your team, or someone else too), and how much money you have for this.
Personally I'd create a build-plus-gem server, where you upload the scripts using some version control system (like git or svn, depending on how many people are working on the project), and then create a cron job that automatically generates the gems from the sources at regular intervals and stores them as different versions. This way you can be sure that you always have an authoritative server that stores your application's gems, and you can always get an earlier version if something breaks. Your script might create separate gem version names, like "appserv-edge" or "appserv-stable". A sketch of such a job is below.
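As a sketch, the cron job could be as simple as this (paths and gem names are hypothetical, and it assumes a static gem index served by any plain web server):

    #!/bin/sh
    # Nightly build: refresh the checkout, rebuild the gem, and
    # regenerate the index that the gem server serves from.
    cd /srv/build/appserv || exit 1
    git pull --quiet
    gem build appserv.gemspec                  # produces appserv-<version>.gem
    mv appserv-*.gem /srv/gems/gems/
    gem generate_index --directory /srv/gems   # rebuild the server's gem index

Clients then just add the server as a source with gem sources --add http://gems.example.internal/ and install as usual.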
You might also want to check out GitHub's closed-source options, if you have the money to afford that. I don't know, however, whether they have gem-building and hosting facilities for closed-source programs.
I've created a private gemserver and it's dead easy. The only tricky bit is deciding how you want your users to upload gems. Personally, I just use a PHP upload form, and have it check to make sure that it's not masking any existing gems.
At my office we're using a bit of a hybrid approach for some of our shared scripts and libs. We do bundle them all into a gem, but rather than using a gem server we keep them in source control, and then build the gem (using newgem) and install it locally as necessary.
The downside of this approach is that it takes two commands instead of one to install the gem, but this is largely mitigated in qa and production environments, since we use Capistrano for deployment.
The upsides are that it's dead simple, and in development there is a very short edit/build/deploy cycle if you're working with something that requires changes to the gem. I'm currently pulling a lot of common functionality into the shared gem, so I'm really appreciating that aspect.
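Concretely, the two-command cycle looks something like this (the gem name and version are hypothetical; newgem generates a Rakefile whose package task builds the gem into pkg/, though the exact task name may vary by version):

    rake gem                                  # build the gem from the checkout
    gem install pkg/shared_utils-0.1.0.gem    # install the freshly built gem locally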
At work I'm using Perl 5.8.0 on Windows.
When I first put Perl on, I went to CPAN, downloaded all the sources, made a few changes (in the .MAK file(?) to support threads, or things like that), and did nmake / nmake test / nmake install. Then, bit by bit, I've downloaded individual modules from CPAN and done the nmake dance.
So, I'd like to upgrade to a more recent version, but the new one must not break any existing scripts. Notably, every module that my scripts 'use', and that I've installed by hand, must also be present in the new version.
What's the most reliable (and easiest) way to update my current version, ensuring that everything I've done with the nmake dance will still be there after updating?
As others noted, start by installing the new perl in a separate place. I have several perls installed, each completely separate from all of the others.
To do that, you'll have to configure and compile the sources yourself. When you run Configure, you'll get a chance to specify the installation directory. I gave detailed instructions for this in "Compiling My Own Perl" in the Spring 2008 issue of The Perl Review. There's also an Item in Effective Perl Programming that shows you how to do it.
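On a Unix-like system, a typical build into its own prefix looks like this (the prefix is illustrative); on Windows you'd instead set INST_DRV/INST_TOP in win32/Makefile and use nmake, as you've been doing:

    ./Configure -des -Dprefix=/opt/perl-5.10.1   # -des accepts sensible defaults
    make
    make test
    make install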
Now, go back to your original distribution and run cpan -a to create an autobundle file. This is a Pod document that lists all of the extra stuff you've installed, and CPAN.pm understands how to use that to reinstall everything.
To install things in the new perl, use that perl's path to start CPAN.pm and install the autobundle file you created. CPAN.pm will get the right installation paths from that perl's configuration.
Watch the output to make sure things go well. This process won't install the same versions of the modules, but the latest versions.
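The whole round trip is only a couple of commands (the bundle name here is illustrative; cpan prints the actual name it writes):

    cpan -a
    # ... writes something like ~/.cpan/Bundle/Snapshot_2009_06_01_00.pm ...
    /opt/perl-5.10.1/bin/cpan Bundle::Snapshot_2009_06_01_00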
As for Strawberry Perl, there's a "portable" version you can install somewhere besides the default location. That way you could have the new perl on removable media. You can test it anywhere you like without disturbing the local installation. I don't think that's quite ready for general use though. The Berrybrew tool might help you manage that.
Good luck, :)
I would seriously consider looking at using Strawberry Perl.
You can install a second version of Perl in a different location. You'll have to re-install any non-core modules into the new version. In general, different versions of Perl are not binary compatible, which could be an issue if you have any program-specific libraries that utilize XS components. Pure Perl modules shouldn't be affected.
If you stay within the 5.8 track, all installed modules that contain XS (binary) extensions will continue to work, as binary compatibility is guaranteed within the same 5.8 series. If you moved to 5.10 then you would have to recompile any modules that contain XS components.
All you need to do is ensure that the new build lists the previous include directories in its @INC array (which is used to look for modules).
By the sounds of it, I think you're on Windows, in which case the current @INC paths can be viewed with
perl -le "print for @INC"
Make sure you install your new Perl version into another directory. It will happily coexist with the previous version, and this will allow you to choose which Perl installation gets used; it's just a question of getting your PATH order sorted out. As soon as a Perl interpreter is started up, it knows where to look for the rest of its modules.
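For example, on Windows, putting the new installation first on the path for the current shell session is enough to switch over (the install path is hypothetical):

    rem Put the new perl first on the path for this session only
    set PATH=C:\Perl510\bin;%PATH%
    perl -v

perl -v should now report the new version; open a fresh shell to fall back to the old one.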
Strawberry Perl is probably the nicest distribution on Windows these days for rolling your own.
I think the answer to this involves virtualisation of some kind:
Set up an exact copy of your current live machine. Upgrade Perl, using the same directory locations and structures as you're using at the moment.
Go through your scripts, testing them on the new image.
Once you're happy, flip the switch.
The thinking behind this is that there's probably all sorts of subtle dependencies and assumptions you haven't thought of. While unlikely, the latest version of a particular module (possibly even a core module, although that's even more unlikely) might have a subtle difference compared to the one you were using. Unless you've exhaustively gone through your entire codebase, there's quite possibly a particular module that's required only under certain circumstances.
You can try to spot this by building a list of all your scripts (a list that you should have anyway, since all your code is under version control; you are using version control, e.g. Subversion, yes?) and iterating through it, running perl -c on each script, with something like the sketch below. That sort of automated test is invaluable: you can set it running, go away for a coffee or whatever, and come back to check whether everything worked. The first few times you'll probably find an obscure module that you'd forgotten about, which is fine: the whole point of automating this is so that you don't have to do the drudge-work of checking every single script.
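A minimal sketch of that check in Perl (the list filename is hypothetical; perl -c compiles each script without running it):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Syntax-check every script named in scripts.txt (one path per line)
    # and report anything that fails to compile under the new perl.
    open my $list, '<', 'scripts.txt' or die "Can't open scripts.txt: $!";
    while (my $script = <$list>) {
        chomp $script;
        next unless length $script;
        system('perl', '-c', $script) == 0
            or print "FAILED: $script\n";
    }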
When I did it I installed the newer one into a separate directory. There's a bit of added confusion running two versions, but it definitely helps make sure everything's working first, and provides a quick way of switching back to the old one in a pinch. I also set up Apache to run two separate services, so I could monkey around with the newer Perl in one service without touching the production one on the old Perl.
It's probably a lot wiser, in hindsight, to install on a separate computer, and do your testing there. Record every configuration change you need to make.
I am not sure about building it yourself; I always just used prepackaged binaries for Windows.
I'm not sure I understand exactly what you're asking. Do you have a list of changes you made to the 5.8 makefile? Or is the question how to obtain such a list? Are you also asking how to find out which packages above the base install you've obtained from CPAN? Are you also asking how to test that your custom changes won't break those packages if you get them from CPAN again?
Why don't you use ActivePerl and its "ppm" tool to (re)install modules?