Makefile, main function, ClassNotFound

I have a Java program and am writing a makefile to compile it on Linux.
My project is organized like this (Run.java is the main entry point):
Program/
    src/
        (package) adb.Bing_WebResults/
            Run.java
        (package) adb.jsonModel/
            *.java
        (package) adb.models/
            *.java
    bin/
    lib/
        gson.jar
        commons.jar
    resource/
        *.txt
This is my makefile:
# My project requires 3 parameters from user input.
default: Run.class

Run.class: src/adb/Bing_WebResults/Run.java
    javac -sourcepath src/ -classpath lib/*.jar -d bin/ src/adb/Bing_WebResults/*.java src/adb/jsonModels/*.java src/adb/models/*.java

run:
    java -classpath bin/:lib/*.jar Run "$(ARG1)" "$(ARG2)" "$(ARG3)"
When I use the "make run" command in a Linux terminal, an exception shows "Could not find the main class: Run".
Is there something wrong with my makefile? A wrong path or something?

There are many things that could potentially be wrong, but the most apparent issues are the incorrect dependencies of the targets in your makefile.
First of all, the target run should have a dependency on Run.class. If you do make run then make looks at the target called run. In your makefile, this target does not have any dependencies defined, and it will execute the line java ... without checking whether the actual compiled class Run.class exists. As a consequence, if you do make run from a clean situation, your source code will not be compiled and the java command will fail because the compiled class is missing.
Your dependency of default on Run.class is incorrect as well, because Run.class will exist in the bin directory, not in the working directory. The same applies to the rule below it, which also names Run.class as its target.
There are several ways to improve your makefile. Below is an example of corrected code with some variables added to avoid repeated expressions. This approach is a matter of style and preference, though.
BINDIR    := bin
RUNCLASS  := Run
RUNBINARY := $(BINDIR)/$(RUNCLASS).class
SRCDIR    := src/adb/Bing_WebResults
RUNSRC    := $(SRCDIR)/$(RUNCLASS).java

# Note: the default target below is superfluous at this moment
default: $(RUNBINARY)

$(RUNBINARY): $(RUNSRC)
    javac -sourcepath src/ -classpath lib/*.jar -d $(BINDIR) $(SRCDIR)/*.java src/adb/jsonModels/*.java src/adb/models/*.java

run: $(RUNBINARY)
    java -classpath $(BINDIR):lib/*.jar $(RUNCLASS) "$(ARG1)" "$(ARG2)" "$(ARG3)"
This works for me in a simplified, comparable setup -- it might work for you as well. Looking at the snippet you provided, there are most likely other dependencies or changes that need to be added to complete your makefile. Potentially, you might have to add package information to your run command and dependency expressions, but that depends on your source code; your post does not contain enough information to provide a complete solution.
P.S.: Do not forget to replace spaces with tabs if you copy this code into your own makefile.
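For completeness, the three user-supplied parameters are then passed as make variables on the command line, for example (the argument values below are just placeholders):

make run ARG1=query ARG2=10 ARG3=results.txt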

Lastly, you need to specify the package when running, since you don't seem to have the main class in the default package:
java -classpath bin com.example.Run arg1 arg2 ...
(With your layout, the fully qualified name would be adb.Bing_WebResults.Run.)

It turns out that two points should be noticed:
(1) Run is in a package, so it has to be referenced as adb.Bing_WebResults.Run in the makefile (its compiled class ends up at bin/adb/Bing_WebResults/Run.class).
(2) External jar files should be concatenated with ':' on the classpath (e.g. lib/a.jar:lib/b.jar).
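Putting both points together, the run target from the answers above ends up looking roughly like this (a sketch that assumes the two jars shown in the question; adjust names and paths to your layout):

# The compiled class lives under the package directory inside bin/,
# and the jars on the classpath are joined with ':'.
run: bin/adb/Bing_WebResults/Run.class
    java -classpath bin:lib/gson.jar:lib/commons.jar adb.Bing_WebResults.Run "$(ARG1)" "$(ARG2)" "$(ARG3)"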

Related

Passing extra compilation flags to debug build in bitbake recipe

As BitBake builds -dev and -dbg packages for recipes, is it possible to define compile definitions specific to the debug build for a particular recipe? Let's say I have some source code guarded by DEBUG_INFO in some recipe, i.e.,
#ifdef DEBUG_INFO
........... do something
#endif /* DEBUG_INFO */
and uses cmake in bitbake environment.
I want this flag to be enabled only for the debug binaries generated in the .debug folder. Is this possible?
If I use EXTRA_OECMAKE = "-DDEBUG_INFO" it gets enabled for both the dev and debug builds.
No, it is not possible. All packages of a recipe are built in one go; they are the same files, just split up into packages afterwards.
The only exception is the "special flavors" of a recipe (native, nativesdk, target, multilib, toolchain-specific recipes, etc.): in that case you can have different flags, but still, all the packages resulting from the build of that flavor will be built with the same flags.
If you want to build another variant of a package where a certain CMake flag is set in the compilation, you can create a variant of the recipe. If the main recipe is named my-app_git.bb you can create another recipe file named my-app-tweak_git.bb and a common base, my-app.inc. In the bb files, include the inc file:
require my-app.inc
Move most of what's now in my-app_git.bb to my-app.inc, e.g. SRC_URI, but define different contents for EXTRA_OECMAKE in the two .bb files (see the sketch below).
Now you will have to decide which one of my-app and my-app-tweak goes into the image by specifying either my-app or my-app-tweak in an IMAGE_INSTALL definition.
This is not exactly what you asked for, but as has been stated by qschulz, you cannot change the contents of the -dev and -dbg sub-packages.
Also note that dbg and dev can be considered reserved words for variants of the package name, so if you want to use something other than tweak, as in my example, you cannot use either of them.
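A minimal sketch of that layout, assuming a CMake-based recipe (SRC_URI, SRCREV and the other shared lines are placeholders you would take from your existing recipe, and the exact flag depends on how your CMakeLists.txt maps it onto the DEBUG_INFO define):

# my-app.inc -- everything shared by both variants
SRC_URI = "git://example.com/my-app.git;protocol=https;branch=master"
SRCREV = "${AUTOREV}"
S = "${WORKDIR}/git"
inherit cmake

# my-app_git.bb -- the normal build
require my-app.inc

# my-app-tweak_git.bb -- the variant with the extra CMake flag
require my-app.inc
EXTRA_OECMAKE += "-DDEBUG_INFO=1"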

rosrun does not work after sourcing my own catkin workspace

I'm following the ROS tutorial and I am seeing the following behavior after creating my own package:
If I try to execute any installed package (e.g. any_package), I get the following error:
[rosrun] Couldn't find executable named <any_package> below /opt/ros/kinetic/share/<any_package>
[rosrun] Found the following, but they're either not files
[rosrun] or not executable:
[rosrun] /opt/ros/kinetic/share/<any_package>
Any help?
EDIT:
If I execute catkin_find --without-underlays --libexec --share <any_package>, it gives me the following output:
Multiple packages found with the same name "my_package":
- my_new_package/my_package
- my_new_package/my_package_2
I assume that you have a tainted workspace: you have probably just copied my_package to my_package_2 without editing the package.xml file in my_package_2.
This is not really mentioned in the tutorial, since it assumes that you use the proper commands, which create a manifest file with a unique package name.
Just edit the name tag in the package.xml of the corresponding folder from
<name>my_package</name>
to
<name>my_package_2</name>
You also have to make sure you edit CMakeLists.txt accordingly: set the compile options, declare the C++ executable, and specify the libraries to link against.
Below are the step-by-step modifications; run catkin_make afterwards, before running your project (a typical rebuild-and-run sequence is shown after the steps):
Step 1:
add_compile_options(-std=c++11)

Step 2:
## Declare a C++ executable
## With catkin_make all packages are built within a single CMake context
## The recommended prefix ensures that target names across packages don't collide
add_executable(${PROJECT_NAME}_node src/myproject_node.cpp)

Step 3:
## Specify libraries to link a library or executable target against
target_link_libraries(${PROJECT_NAME}_node
  ${catkin_LIBRARIES}
)
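After these edits, a typical rebuild-and-run sequence looks like this (the node name is illustrative; use whatever ${PROJECT_NAME}_node expands to for your package):

cd ~/catkin_ws
catkin_make
source devel/setup.bash
rosrun my_package_2 my_package_2_node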

Scons: how to specify file dependency for 3rd party compile result?

It seems to me that SCons targets are not being generated in declaration sequence. My problem is that I need to generate some code first: I'm using protoc to process a my.proto file into .h and .cc files. I need some pseudo code like the following (what should the working code look like?):
import os
env = Environment(ENV=os.environ, LIBPATH='/usr/local/lib')
env.ShellExecute('protoc', '--outdir=. --out-lang=cpp', 'my.proto')  # produces my.cc
myObj = Object('my.cc')  # should wait until 'my.cc' is generated by protoc
Dependency(myObj, 'my.cc')
mainObj = Object('main.cpp')
My questions are:
1. How do I specify this shell execution of protoc in SConstruct/SConscript?
2. How do I make sure that the compilation of main.cpp depends on the existence of my.cc, in other words, waits until my.cc has been generated before it executes?
Your observations and assumptions are correct, SCons will not execute the single build commands in the order that you list them in the SConstruct files. It will run them based on the dependencies of the targets and source files in your build, either defined implicitly (header includes in C++, for example) or explicitly (via the Depends() method).
So you have to define and set up your dependencies correctly, such that SCons delivers the output that you want. For the special protoc case in your example, a special Builder exists that will help you to get the dependency graph right. It is available in our ToolsIndex, where support for a variety of other languages and dialects can also be found.
These special builders will emit the correct target nodes, e.g. when given a *.proto input file, and SCons is then able to automatically detect the dependency between the protoc input file and your main program if you say something like:
env=Environment(tools=['default','protoc'])
env.Protoc([], "test.proto")
env.Program('main', ['main.cpp'] + Glob('*.cc'))
The Glob('*.cc') will detect your *.cc files, coming out of the protoc Tool, and include them as dependencies for your final target main.
You can always write your own Builders and Emitters in SCons, which is the canonical way of making new tools/toolchains known to SCons' dependency analysis. You can find more info about this in the UserGuide, sect. "18 Writing Your Own Builders", and especially in our ToolsForFools guide.
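If you would rather not install an extra Tool, the same dependency wiring can also be expressed with a plain Command() builder; below is a minimal sketch, assuming protoc's default C++ output names (my.pb.cc/my.pb.h) for the my.proto input:

import os

env = Environment(ENV=os.environ, LIBPATH='/usr/local/lib')

# Tell SCons how the generated sources are produced from the .proto file.
env.Command(['my.pb.cc', 'my.pb.h'], 'my.proto',
            'protoc --cpp_out=. $SOURCE')

# Because my.pb.cc is listed as a source here, SCons runs the protoc
# command above before compiling and linking main.
env.Program('main', ['main.cpp', 'my.pb.cc'])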

How to copy files from the package to the main codebase?

I have a plugin package that enhances our product. This package contains some additional files and some modified main code-base repository files, but we can't directly merge this package with our code-base. Our goal is to copy files from this package into the main code-base at build time, so we have to make some modifications to the makefiles.
This package follows a directory hierarchy similar to that of the main code-base directory tree. What would be the best method to do this? I'm thinking of creating some kind of script. Would that be a good option?
Without seeing any of your code, all I can suggest is creating a make target that will always get executed and putting it as part of the dependencies of your main code-base build. Something along these lines:
final_target : other_dependencies copy_plugin_files
    command_to_build_final_target

other_dependencies : source_files
    command_to_build_other_dependencies

.PHONY : copy_plugin_files  # this makes sure this will always execute
copy_plugin_files :
    [insert script or cp command here to copy your plugin files]
If you need the plugin files copied first, then put the copy_plugin_files dependency before the other_dependencies after final_target.
If you need the plugin files to run through their own make process first, then put cd path/to/plugin && $(MAKE) as part of the recipe for your copy_plugin_files target.
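For example, if the plugin lives in a directory that mirrors the code-base root, the recipe could be as simple as a sub-make followed by a recursive copy (the plugin path is illustrative):

.PHONY : copy_plugin_files
copy_plugin_files :
    cd path/to/plugin && $(MAKE)
    cp -R path/to/plugin/. .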
Hope that helps!

Properly compiling modules in subfolders (ocamlbuild)

I recently decided to organize the files in my project directory. I moved the parsers I had for a few different file types into their own directory, and also decided to use ocamlbuild (as the project was getting more complicated and the simple shell script was no longer sufficient).
I was able to successfully include external projects by modifying myocamlbuild.ml with some basic rules (calling ocaml_lib; I'll switch to ocamlfind some other time), but I am stuck on how to properly include the folder as a module in the project. I created a parser.mlpack file and filled it with the proper modules to be included (e.g. "parser/Date", et cetera), wrote a parser.mli in the root of the directory for their implementations, and modified the _tags file (see below).
During compilation the parser directory is traversed properly, and parser.cmi and parser.mli.depends are both created in the _build directory, as well as all the *.cm[xio] files in the parser subdirectory.
I feel I might be doing something redundant, but regardless, the project still cannot find the Parser module when I compile!
Thanks!
_tags
debug : true
<*.ml> : annot
"parser" : include
<parser/*.cmx>: for-pack(Parser)
<curlIO.*> : use_curl
<mySQL.*> : use_mysql
<**/*.native> or <**/*.byte> : use_str,use_unix,use_curl,use_mysql
compilation error
/usr/local/bin/ocamlopt.opt unix.cmxa str.cmxa -g -I /usr/local/lib/ocaml/site-lib/mysql mysql.cmxa -I /usr/local/lib/ocaml/curl curl.cmxa curlIO.cmx utilities.cmx date.cmx fraction.cmx logger.cmx mySQL.cmx data.cmx project.cmx -o project.native
File "_none_", line 1, characters 0-1:
Error: No implementations provided for the following modules:
       Parser referenced from project.cmx
Command exited with code 2.
You'll notice that -I parser is not included in the linking phase above; in fact, none of the parser-related files are included!
Edit: added new details from the comments and the answer below.
You need to "include" the parser directory in the search path. You can do this in _tags:
"parser": include
Then ocamlbuild can search the parser directory for interesting files.
I wonder if parser.mli is somehow interfering with the dependencies in processing the mlpack file. parser.cmi will be generated from the pack operation when parser.mlpack is processed and compiled. Try building with the parser.mli file removed. If that works, then this can be re-processed into a real answer.
Also, you don't need parser/ as a prefix to your modules in parser.mlpack if parser.mlpack is in the parser directory and you have the include tag set. But that shouldn't make a difference for this.
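For reference, parser.mlpack is just a list of the modules to pack into Parser, one per line (the module names below are illustrative). With the prefix, as in the question:

parser/Date
parser/Fraction

and, as noted above, if parser.mlpack sits in the parser directory with the include tag set, the prefix can be dropped:

Date
Fraction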
Update: this worked around the problem, but it wasn't the root cause. The root cause, per the comment below, was a file mentioned in the .mlpack that had been relocated.
