Which antlr4-runtime? - maven

I'm trying to add dependency antlr4-runtime in eclipse. It shows two instances to choose from.
com.tunnelvisionlabs::antlr4-runtime (324566 b)
org.antlr::antlr4-runtime (242694 b)
The two files are different sizes.
Which one should I use?

The reference runtime, the one described in The Definitive ANTLR 4 Reference book and matching the Javadocs posted at antlr.org, is org.antlr::antlr4-runtime.
The other build is a highly experimental branch which is heavily optimized for use in Tunnel Vision Labs' IDE products. This build deviates from the documented version in many ways, so you may be on your own if you run into problems.
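For reference, once the org.antlr artifact is on the classpath, the documented API is used roughly like the sketch below. This is only an illustration: the Hello grammar, its start rule r, and the generated HelloLexer/HelloParser classes are assumed, and CharStreams.fromString needs ANTLR 4.7 or newer.

```java
import org.antlr.v4.runtime.CharStreams;
import org.antlr.v4.runtime.CommonTokenStream;
import org.antlr.v4.runtime.tree.ParseTree;

// Assumes HelloLexer and HelloParser were generated from a hypothetical Hello.g4
// grammar with a start rule named r; only org.antlr::antlr4-runtime is needed at runtime.
public class HelloRunner {
    public static void main(String[] args) {
        HelloLexer lexer = new HelloLexer(CharStreams.fromString("hello world"));
        HelloParser parser = new HelloParser(new CommonTokenStream(lexer));
        ParseTree tree = parser.r();                   // invoke the start rule
        System.out.println(tree.toStringTree(parser)); // print the parse tree in LISP form
    }
}
```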


XPages OSGi plug-in development

Background
Over the past year or so I have built many tools designed to help me program for XPages. These include primarily helper Java classes, extended logging (making use of OpenLogger and my own code), and a few other things that I personally feel I cannot work without. I have discussed it with my employer, and we feel it might be a good idea to start publishing these items to OpenNTF. Since these tools are spread across about 3 .nsfs, all designed to use the same Java code, key JavaScript classes, CSS, and even a custom control or two, I would like to consolidate the key items into a plug-in that can be installed at the server and client level. I want to do this consolidation before I even think about publishing any of the work I've done so far; otherwise it would just be far too much work to maintain, not just for me, but for potential users. I have not really found any information on how to do such a thing in Google searches. I also have to make sure that I am able to make use of the ExtLib libraries, the OpenNTF Domino API, and the Notes API.
My questions
How does one best go about designing such plug-ins? Must a designer use Eclipse, or is it possible to do this directly in the Notes Designer?
How does a designer best go about keeping a server and client up to date while designing and updating the plug-in code? Is this why GitHub is often used?
Where is the best place to get material to get started in this direction? I sort of feel lost in the woods, knowing I need to head north, but not having a compass for that first step.
Thank you very much for your input.
In my experience, I found that diving into plug-in development is a huge PITA until you get used to it, but it's definitely worth it overall.
As for whether you can use Designer for plugin development: yes, but you will likely eventually want to not do so. I started out by using Designer for this sort of thing for a while, presumably with the same sentiment as you: why bother installing another instance of Eclipse when I'm already sitting in one all day? However, between Designer's age (it's roughly equivalent to, I think, Eclipse 3.4), oddities when it comes to working sets between the "Applications" and "Project Explorer" views, and, in my case, my desire to use a Mac app, I ended up switching.
There are two major starting points: the XSP Starter Kit (http://www.openntf.org/internal/home.nsf/project.xsp?name=XSP%20Starter%20Kit) and Niklas Heidloff's video on setting up Eclipse for XPages development (http://www.openntf.org/main.nsf/blog.xsp?permaLink=NHEF-8RVB5H). The latter mentions the XPages SDK (http://www.openntf.org/internal/home.nsf/project.xsp?name=XPages%20SDK%20for%20Eclipse%20RCP), which is also useful. In my setup, I found the video largely useful, but some aspects were either difficult to find (IBM's downloads are shifting sands) or optional (debugging, which will depend on whether or not you're using Eclipse on Windows).
Those resources should generally get you set up. The main thing to worry about when setting up your Eclipse environment is making sure your Plug-In Execution Environment is configured properly. If you're following the SDK setup instructions, that SHOULD get you where you need to be.
The next thing to know about is the way plugins are structured. Each plugin you want to install in Designer or Domino will also be paired with a feature project (a feature can house several plugins), and potentially an update site - the last one is optional if you just want to import the features into an Update Site NSF. That's how I often do my normal plugin development: export the paired feature to a directory, import the feature into the server's Update Site NSF, and then install it in Designer from there using Application -> Install. You can also set things up so that you deploy into the server's plugin/feature directories instead of taking the extra step of installing into an update site, if you'd prefer. GitHub doesn't really come into play for this aspect - it's more about sharing and collaborating on your code, and about having a remote storage location for your git repositories (which I highly advise).
And as for the "lost in the woods" feeling: yep, you'll have that for a good while. There are lots of moving parts and esoteric concepts to get a hold of all at once. If you mostly follow the above links and then start with some basics from the XSP Starter Kit (which is itself a plugin project that you can pair with a feature) - say, printing text in the Activator class and making an implicit global variable just to make sure it works - that should help get your feet wet.
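To give a concrete picture of that first step, a bare-bones Activator might look something like the sketch below. This is a generic OSGi BundleActivator, not the Starter Kit's actual class, and the bundle id in the messages is made up.

```java
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

// Minimal sketch: the Activator only prints to the console so you can confirm
// on the server or in Designer that the plug-in actually loaded and unloaded.
public class Activator implements BundleActivator {

    @Override
    public void start(BundleContext context) throws Exception {
        System.out.println("com.example.myxsp plug-in started"); // hypothetical bundle id
    }

    @Override
    public void stop(BundleContext context) throws Exception {
        System.out.println("com.example.myxsp plug-in stopped");
    }
}
```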
It's best done in Eclipse. You can debug your code running on the server from there, as well as run it directly. The editors are also more up to date. You want:
Eclipse for RCP and RAP developers
XPages SDK for Eclipse RCP (from OpenNTF)
XPages Debug Plugin (from OpenNTF - basically allows you to load the plugins to the Domino server dynamically, rather than exporting to an Update Site all the time)
XSP Starter Kit on OpenNTF is a good starting point for a plugin. There are various references to the library id, which has to be unique for your plugin. Basically, references to org.openntf.xsp.starter need changing to whatever you want to call your plugin. You're also best advised to remove what you don't need. I tend to work in a copy of the Starter, remove stuff, build, and if there are errors about required classes (Activator.java will obviously be required, along with some others), paste them back in from the Starter.
XPages OpenLog Logger is a good cross-reference: it was built from the XSP Starter Kit and is pretty much stripped down, so you'll be able to see what had to be changed. A lot of the elements of the XSP Starter Kit correspond to Java classes you'll probably be familiar with from your XPages Java development.
GitHub and the like tend to be used for source control, which is useful for working out what has changed over time.

Set different build targets in Eclipse like in Xcode

I know the title might be a little misleading so if you find better words for what I mean, please feel free to modify it. I take the concept of 'target' from Xcode.
I'm an iPhone developer now turning into an Android one. With Xcode I can do the following, supposing I have a set of apps in which users grow different plants:
I can set up different targets, e.g. apple, bear, etc.
for each target I can choose to load a different database and different UI images; they are all in the project with the same names but in different folders, and I can set which target uses which files.
at the build phase I simply choose the targets and click build, and then I have my whole list of apps.
The advantage of this is that I don't need to change anything in code: I just grab the resources from the designer, change a small project setting, and it's all done.
Now with Eclipse I can't find out how to do that so simply. I have to remove the old resources and copy the new ones in to build for a different target. This takes too much time when I have to make an update for 20 apps. So is there a better way to achieve what I'm after? Any plugin for Eclipse, or some easy approach that I don't know yet?
I got a solution with Android Studio's Gradle build system, i.e. defining different flavors of my app and then using build variant configurations to produce different apps from the same shared code base, resources, etc.
As per the Android Developers docs:
The build system uses product flavors to create different product versions of your app. Each product version of your app can have different features or device requirements. The build system also uses build types to apply different build and packaging settings to each product version. Each product flavor and build type combination forms a build variant. The build system generates a different APK for each build variant of your app. So one can have two or more product flavors (e.g. a paid flavor and a free/demo flavor) for a single project with the same code base.
For more information, see the Build Variants & Product Flavors documentation.
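As a rough sketch of how shared code can see which flavor it was built as: the Android Gradle plugin generates a per-variant BuildConfig class with a FLAVOR field. The flavor names below (apple, bear) come from the question, and the package and file names are made up, so this only compiles inside such an Android module.

```java
import com.example.plants.BuildConfig; // generated by the Android Gradle plugin (hypothetical package)

// Shared code that picks a bundled database per product flavor.
// In practice, per-flavor source sets (src/apple/assets, src/bear/res, ...)
// are usually enough, so the shared code never has to branch at all;
// this switch is just one way to surface the flavor at runtime.
public final class PlantDatabaseConfig {

    private PlantDatabaseConfig() {}

    /** Returns the database asset name for the current product flavor. */
    public static String databaseAssetName() {
        switch (BuildConfig.FLAVOR) {
            case "apple":
                return "apple_plants.db";
            case "bear":
                return "bear_plants.db";
            default:
                return "default_plants.db";
        }
    }
}
```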
I started a similar topic and added a bounty, so I even got responses.
Here is the topic, with an explanation of the problem and the possible solutions I found on the web; these are mainly library projects and broadcast receivers. Maybe library projects will work for you?
There is also a satisfactory answer that I got with a solution for Android/Eclipse. It proposes using SharedPreferences to determine which code/image/package is invoked and which is not. The problem I see with it is that all the code and resources must be in the app, so it gets quite large if one has a lot of different images.
Maybe there will be other helpful answers there; you can have a look after a couple of days. What I already know is that there is no such thing as targets in Eclipse... In Android Studio there are modules, which seem to be similar, but that does not really help us.

Hudson/Jenkins source code metrics?

Are there any useful plugins for source code metrics for Hudson/Jenkins?
I'm looking for total lines of code, total number of tests, classes, etc. with graphing.
Does anything like this exist?
Are you using Java? If so, SONAR should certainly be your first port of call. It does a lot on its own and also wraps up all the major Java analysis tools.
Out of the box, you'll get metrics on:
Potential Architectural & Design issues
Unit test coverage (uses Cobertura)
Lines of code, packages, classes, etc.
Potential bugs
Code duplication
Adherence to code formatting standards
(plus many more)
It allows you to traverse from the high level analysis through to the source code it relates to. It will be easier if you're using Maven for your build though...
There is a Hudson plugin, and it's free.
Try CCCC (http://sourceforge.net/projects/cccc/) for C and C++. It does code counting, module counting (classes), etc., and the plugin also graphs the results for you.
Incidentally, what language are you looking at?
There's also CLOC (Count Lines of Code), which will tell you how many lines of each language you have, although I can't seem to find a link for it.
You don't specify which language you are using, but Redsolo's awesome blog post Guide to building .NET projects using Hudson shows you how to use FxCop and NUnit on Hudson to give some of what you are looking for. The Violations plugin used also supports Simian, CPD, PMD and PyLint.

Are MVC2 areas with multiple projects supported in the final release?

I had been following this guide to get areas with multiple projects setup:
http://msdn.microsoft.com/en-us/library/ee307987(VS.100).aspx
I was stumbling on the step where you modify the .csproj files to enable the AfterBuild configuration. My googling led me to this post from Steve Mosely:
http://avingtonsolutions.com/blog/post/2010/04/03/JQuery-AspNet-MVC-2-Multi-Project-Areas-and-Other-News-Minutia.aspx
So far the only hang-up I had was that I had set up my solution to incorporate multi-project areas, which was supported in the MVC 2 preview releases of Areas. However, when the RTM came out it was no longer supported. I searched and searched for solutions to my dilemma, but the only thing I could find was a post by Jonathon, who basically had the same experience I had, and a reference to an obscure message on a message board (by what appeared to be someone from the ASP team) saying that it was not supported. To date, I haven't found any more formal post or article saying that was not the case.
Is this true? Did this feature get removed from MVC 2 in the 2010 release? I haven't been able to find a definite answer.
They were removed in Preview 2. The only supported use of areas is single-project areas.
You can reference the Build assembly in the "Futures" download for both MVC2 and MVC3. Of course, multiple Areas are supported in the RC within a single project. I completely disagree with Levi that it didn't make sense to merge multiple projects. It makes total sense when you develop large applications and desire to break up the functionality into "modules", or "mini applications". Simply research topics like "OO programming", "composition", "modular", "dependency injection", "inversion of control", "aspects" and related frameworks like "MEF", "Unity", "Prism", "Composite Application Framework", SmartClient application block, etc. (not to mention all of the incredible non-MS frameworks, but mentioning one means not mentioning another and people get all touchy about things like that...).
Notes: 1) the documented MSBuild tasks are not included in the project files in the release, so you must find and add them, and 2) the Futures assemblies are not strongly named, so you will want to change the MSBuild tasks to use "PublicKeyToken=null" in the "AssemblyName" paths.

Choosing between Impala and OSGi

I've been investigating OSGi for my company's software, but have recently been recommended to take a look at Impala. According to its web page, Impala is "a dynamic module framework for Java-based web applications, based on the Spring Framework."
At a glance, and looking at this blog post about the differences, the key differences I can see are that Impala is simpler than OSGi, does not manage versioning of third party components, and is far less widely used/known (I do not see a single question about it on Stack Overflow).
I wonder whether people who have direct experience with Impala and OSGi (i.e. those who have investigated it more deeply than reading blog posts and online docs), have any more insights into the practical differences between the two, and/or suggestions about what types of projects each one may be more or less suitable for.
Edit: It may also be interesting to include SpringSource Slices in the comparison, although it is as yet an early prototype. At a glance, it appears to only work in dm Server.
Impala's approach to modularity is very weak when it comes to controlled sharing between modules. The problem is that Impala still follows the old J2EE-style hierarchical approach to classloading.
Anybody can write a module system that restricts visibility of classes across modules. The difficult part is how you reintroduce dependencies between modules such that specific classes and interfaces from one module can be seen by another module. In OSGi we do this by exporting and importing packages, so we have a non-hierarchical dependency graph.
In Impala, if you want to see the classes in another module, then your module must be a child or descendant of that module. That is, modules can only see their own classes and those of their ancestors. Now if you want to share some classes with your sibling module (e.g. a library that you both use) then you must move that library up into the classpath of your shared ancestor. In the worst case you have to move it right up to the root module. Now the library is visible to ALL other modules whether they want it or not! Indeed, if another module wanted to use a different version of the library they would be prevented from doing so.
If you simply have a copy of the library in each place where it is used, then you make it impossible for the modules using that library to communicate with each other. They will get ClassCastExceptions when they try to pass objects between each other.
A similar problem is inherent in J2EE if two web applications need to use the same library. Typically J2EE developers just copy the library, but this creates "silo" applications that cannot communicate with each other. It is simply not the way to build modular software.
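The ClassCastException point is easy to reproduce outside of any module framework. The sketch below (plain Java, with a made-up jar path and class name) loads the same library through two isolated class loaders and shows the JVM treating it as two unrelated types.

```java
import java.net.URL;
import java.net.URLClassLoader;

// Illustration only: the same class loaded by two sibling class loaders is a
// different runtime type in each, so objects cannot be passed between the two
// "modules" without a ClassCastException.
public class ClassLoaderIsolationDemo {
    public static void main(String[] args) throws Exception {
        URL lib = new URL("file:lib/shared-lib.jar"); // hypothetical copy of the shared library

        // Two sibling loaders with no common parent that knows the library
        // (parent = null deliberately skips the application class path).
        try (URLClassLoader moduleA = new URLClassLoader(new URL[] { lib }, null);
             URLClassLoader moduleB = new URLClassLoader(new URL[] { lib }, null)) {

            Class<?> typeInA = moduleA.loadClass("com.example.SharedType"); // hypothetical class
            Class<?> typeInB = moduleB.loadClass("com.example.SharedType");

            System.out.println(typeInA == typeInB);        // false: same bytes, different classes
            Object fromA = typeInA.getDeclaredConstructor().newInstance();
            System.out.println(typeInB.isInstance(fromA)); // false: typeInB.cast(fromA) would throw
        }
    }
}
```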
Steven's points also seem pertinent. As far as I can tell, nobody is using Impala aside from its author.
In my eyes, there is no comparison. OSGi is a mature framework that has been around for 10 years and is the basis for the implementation of most of today's Java containers. OSGi has growing adoption, there are books available, and, yes, people talk about it on Stack Overflow!
Impala hasn't even hit a stable release and appears to be a one-man project, though its author is asking for additional developers now.
So, it depends on your criteria. If you are investigating technology out of interest, then I don't see any issue writing stuff with Impala. If you are looking to base your company's future products on it, then I think that would be professionally negligent.
