Properly compiling modules in subfolders (ocamlbuild) - compilation

I recently decided to organize the files in my project directory. I moved the parsers I had for a few different file types into their own directory and also decided to use ocamlbuild, as the project was getting more complicated and the simple shell script was no longer sufficient.
I was able to successfully include external libraries by modifying myocamlbuild.ml with some basic rules (calling ocaml_lib; I'll use ocamlfind some other time), but I am stuck on how to properly include the folder as a module in the project. I created a parser.mlpack file and filled it with the modules to be included (e.g. "parser/Date", et cetera), wrote a parser.mli in the root of the project for their implementations, and modified the _tags file (see below).
During compilation, the parser directory is traversed properly, and parser.cmi and parser.mli.depends are both created in the _build directory, as well as all *.cm[xio] files in the parser subdirectory.
I feel I might be doing something redundant, but regardless, the project still cannot find the Parser module when I compile!
Thanks!
_tags
debug : true
<*.ml> : annot
"parser" : include
<parser/*.cmx>: for-pack(Parser)
<curlIO.*> : use_curl
<mySQL.*> : use_mysql
<**/*.native> or <**/*.byte> : use_str,use_unix,use_curl,use_mysql
compilation error
/usr/local/bin/ocamlopt.opt unix.cmxa str.cmxa -g -I /usr/local/lib/ocaml/site-lib/mysql mysql.cmxa -I /usr/local/lib/ocaml/curl curl.cmxa curlIO.cmx utilities.cmx date.cmx fraction.cmx logger.cmx mySQL.cmx data.cmx project.cmx -o project.native
File "\_none\_", line 1, characters 0-1:
Error: **No implementations provided for the following modules:**
Parser referenced from project.cmx
Command exited with code 2.
You'll notice that -I parser is not included in the linking phase above; in fact, none of the parser-related files are included!
edit: Added new details from comments and answer below.

You need to "include" the parser directory in the search path. You can do this in _tags:
"parser": include
Then ocamlbuild can search the parser directory for interesting files.

I wonder if parser.mli is somehow interfering with the dependencies in processing the mlpack file. parser.cmi will be generated from the pack operation when parser.mlpack is processed and compiled. Try building with the parser.mli file removed. If that works, then this can be re-processed into a real answer.
Also, you don't need parser/ as a prefix to your modules in parser.mlpack if parser.mlpack is in the parser directory and you have the include tag set. But that shouldn't make a difference for this.
Update: this worked around the problem, but wasn't the root cause. The root cause, per the comment below, was a file mentioned in the .mlpack that had been relocated.
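Pulling those suggestions together, a minimal layout might look like the following; Date comes from the question, Csv is just a hypothetical second parser module, and parser.mli is removed so that parser.cmi comes from the pack step.
parser.mlpack (at the project root, next to the parser/ directory; with the include tag set, bare module names should resolve, and prefixed names like parser/Date also work):
Date
Csv
_tags (relevant lines):
"parser" : include
<parser/*.cmx> : for-pack(Parser)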

Related

Are pysa users expected to copy configuration files?

Facebook's Pysa tool looks useful. In the Pysa tutorial exercises, they refer to files provided in the pyre-check repository using a relative path that reaches outside the exercise directory.
https://github.com/facebook/pyre-check/blob/master/pysa_tutorial/exercise1/.pyre_configuration
{
  "source_directories": ["."],
  "taint_models_path": ["."],
  "search_path": [
    "../../stubs/"
  ],
  "exclude": [
    ".*/integration_test/.*"
  ]
}
There are stubs provided for Django in the pyre-check repository. If I know the path where pyre-check is installed, I can hard-code it in my .pyre_configuration and get something working, but another developer may install pyre-check differently.
Is there a better way to refer to these provided stubs or should I copy them to the repository I'm working on?
Many projects have a standard development environment, allowing for hard-coded paths in the .pyre_configuration file. These will usually point into the venv, or some other standard install location for dependencies.
For projects without a standard development environment, you could try incorporating pyre init into your setup scripts. pyre init will set up a fresh .pyre_configuration file with paths that correspond to the current install of pyre. For additional configuration you want to add on top of the generated .pyre_configuration (such as a pointer to local taint models), you can hand-write a .pyre_configuration.local, which acts as an overlay and overwrites/adds to the content of .pyre_configuration.
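For example, a minimal hand-written .pyre_configuration.local might look like this; the ./taint_models path is just a placeholder for wherever your local .pysa models live:
{
  "source_directories": ["."],
  "taint_models_path": ["./taint_models"]
}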
Pyre-check looks for stubs in the directory specified by the typeshed directive in the configuration file.
The easiest way is to move the Django stubs provided in the pyre-check repository into the typeshed directory that ships with pyre-check.
For example, if you have installed pyre-check to the ~/.local/lib directory, move the django directory from ~/.local/lib/pyre_check/stubs to ~/.local/lib/pyre_check/typeshed/third_party/2and3/ and make sure your .pyre_configuration file looks like this:
{
  "source_directories": ["~/myproject"],
  "taint_models_path": "~/myproject/taint",
  "typeshed": "~/.local/lib/pyre_check/typeshed"
}
In this case, your Django stubs directory will be ~/.local/lib/pyre_check/typeshed/third_party/2and3/django.
Pyre-check uses the following algorithm to traverse the typeshed directory:
If it contains a third_party subdirectory, it uses the legacy layout: it enters just the two subdirectories, stdlib and third_party, skips any of their subdirectories whose names start with 2 (except 2and3), and looks for modules in the remaining subdirectories, e.g. in third_party/2and3/.
Otherwise, it enters the subdirectories stubs and stdlib and looks for modules directly there, e.g. in stubs/, but not in stubs/2and3/.
That's why specifying multiple paths can be confusing; the easiest way is to set the typeshed directory to ~/.local/lib/pyre_check/typeshed/ and move django into third_party/2and3, so it ends up at ~/.local/lib/pyre_check/typeshed/third_party/2and3/django.
Also, don't forget to copy the .pysa files that you need into the taint_models_path directory. Don't point it at the pyre-check install directory; create your own directory and copy only those files that are relevant to you.
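Concretely, with the ~/.local/lib install location assumed in the example above, the move boils down to:
mkdir -p ~/.local/lib/pyre_check/typeshed/third_party/2and3
mv ~/.local/lib/pyre_check/stubs/django ~/.local/lib/pyre_check/typeshed/third_party/2and3/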

rosrun does not work after sourcing my own catkin workspace

I'm following the ROS-tutorial and I am facing the following behavior after creating my own package:
If I try to execute any installed package (e.g. any_package), I get the following error:
[rosrun] Couldn't find executable named <any_package> below /opt/ros/kinetic/share/<any_package>
[rosrun] Found the following, but they're either not files
[rosrun] or not executable:
[rosrun] /opt/ros/kinetic/share/<any_package>
Any help?
EDIT:
If I execute catkin_find --without-underlays --libexec --share <any_package>, it gives me the following output:
Multiple packages found with the same name "my_package":
- my_new_package/my_package
- my_new_package/my_package_2
I assume that you have a tainted workspace, and that you've just copied my_package to my_package_2 without editing the package.xml file in my_package_2.
This is not really mentioned in the tutorial, since it assumes that you use the proper commands, which create a manifest file (package.xml) with a unique package name (see the catkin_create_pkg example below).
Just edit the name-tag as follows:
<name>my_package</name>
to
<name>my_package_2</name>
in the corresponding folder.
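Alternatively, create packages with catkin_create_pkg, as the tutorial does; it writes a package.xml with a unique name for you. The package name and dependencies below are only placeholders:
cd ~/catkin_ws/src
catkin_create_pkg my_package_2 std_msgs rospy roscpp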
You also have to make sure you edit CMakeLists.txt to match your setup: the C++ compile options, the C++ executable declaration, and the libraries to link against.
Below are the step-by-step modifications; afterwards, run catkin_make before running your project (see the commands after step 3):
step 1
add_compile_options(-std=c++11)
step 2
## Declare a C++ executable
## With catkin_make all packages are built within a single CMake context
## The recommended prefix ensures that target names across packages don't collide
add_executable(${PROJECT_NAME}_node src/myproject_node.cpp)
step 3
## Specify libraries to link a library or executable target against
target_link_libraries(${PROJECT_NAME}_node
${catkin_LIBRARIES}
)
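After those edits, rebuild and re-source the workspace before running the node; the workspace path and names below are placeholders for your own:
cd ~/catkin_ws
catkin_make
source devel/setup.bash
rosrun my_package_2 my_package_2_node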

Scons: how to specify file dependency for 3rd party compile result?

It seems to me that SCons targets are not built in declaration order. My problem is that I need to generate some code first: I'm using protoc to process a my.proto file into .h and .cc files. I need something like this pseudocode (what should the working code look like?):
import os
env = Environment(ENV=os.environ, LIBPATH='/usr/local/lib')
env.ShellExecute('protoc', '--outdir=. --out-lang=cpp', 'my.proto')  # produces my.cc
myObj = Object('my.cc')  # should wait until 'my.cc' is generated by protoc
Dependency(myObj, 'my.cc')
mainObj = Object('main.cpp')
My question is:
How do I specify this shell execution of protoc in the SConstruct/SConscript?
How do I make sure that the compilation of main.cpp depends on the existence of my.cc, in other words, waits until my.cc is generated before it executes?
Your observations and assumptions are correct: SCons will not execute the individual build commands in the order that you list them in the SConstruct files. It runs them based on the dependencies of the targets and source files in your build, defined either implicitly (header includes in C++, for example) or explicitly (via the Depends() method).
So you have to define and set up your dependencies correctly, such that SCons delivers the output that you want. For the special protoc case in your example, a dedicated Builder exists that will help you get the dependency graph right. It is available in our ToolsIndex, where support for a variety of other languages and dialects can also be found.
These special builders will emit the correct target nodes, e.g. when given a *.proto input file, and SCons is then able to automatically detect the dependency between the protoc input file and your main program if you say something like:
env=Environment(tools=['default','protoc'])
env.Protoc([], "test.proto")
env.Program('main', ['main.cpp'] + Glob('*.cc'))
The Glob('*.cc') will detect your *.cc files, coming out of the protoc Tool, and include them as dependencies for your final target main.
You can always write your own Builders and Emitters in SCons, which is the canonical way of making new tools/toolchains known to SCons' dependency analysis. In the UserGuide, sect. "18 Writing Your Own Builders", and especially our ToolsForFools guide, you can find more info about this.
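If you prefer not to pull in the separate protoc Tool, a plain Command() call can express the same dependency. This is only a minimal sketch, assuming protoc is on your PATH and generates my.pb.cc/my.pb.h from my.proto:
import os

env = Environment(ENV=os.environ, LIBPATH='/usr/local/lib')

# Command() registers both the generated files (targets) and my.proto (source),
# so SCons knows protoc has to run before anything that uses the generated code.
generated = env.Command(
    ['my.pb.cc', 'my.pb.h'],
    'my.proto',
    'protoc --cpp_out=. $SOURCE')

# Listing my.pb.cc among the sources is enough; no explicit Depends() is needed.
env.Program('main', ['main.cpp', 'my.pb.cc'])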

force object files in current directory even when subdir-objects is on

I have 2 libraries that share the same source files:
# src/lib_mt/Makefile.am:
libppb_la_SOURCES = rphs_mt.c timer_mt.c
# src/sipplib/Makefile.am:
libsipp_a_SOURCES = ../lib_mt/rphs_mt.c ../lib_mt/timer_mt.c
Each source file is compiled twice: first for lib_mt with -fPIC, then for sipplib without -fPIC.
The object files for each library are created in the corresponding directory.
Eventually subdir-objects will become the default. How do I keep the current behavior for these two source files? Some explicit rule, maybe?
There is no way to disable it once it becomes the default. What you can do instead is migrate this to a non-recursive Automake build system. At that point, Automake knows that there are different targets compiling the same source files with different flags (this requires AC_PROG_CC_C_O to be called in configure.ac).
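A rough sketch of what such a non-recursive top-level Makefile.am could look like follows; the noinst_ prefixes and flags are assumptions based on the question, not a drop-in file:
AUTOMAKE_OPTIONS = subdir-objects

noinst_LTLIBRARIES = src/lib_mt/libppb.la
noinst_LIBRARIES   = src/sipplib/libsipp.a

src_lib_mt_libppb_la_SOURCES = src/lib_mt/rphs_mt.c src/lib_mt/timer_mt.c
src_lib_mt_libppb_la_CFLAGS  = $(AM_CFLAGS) -fPIC

# Same sources, different per-target flags: Automake gives each target's
# objects a unique per-target prefix, so the two builds no longer collide.
src_sipplib_libsipp_a_SOURCES = src/lib_mt/rphs_mt.c src/lib_mt/timer_mt.c
src_sipplib_libsipp_a_CFLAGS  = $(AM_CFLAGS)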
Alternatively, the hacky version is to create a src/sipplib/rphs_mt.c file that only contains
#include "../libmt/rphs_mt.c"
so that it is actually a separate build target.

How to get the target(s) for a PBXFileReference in Xcodeproj

I'm attempting to write a Ruby script that will delete certain files from the Xcode project. I can find the files based on the absolute path and remove them from the project using the remove_from_project method of PBXFileReference. However, this leaves source files (e.g. .m or .swift files) in the "Compile Sources" build phase of whatever target(s) they are members of, but without a name.
I know I need to also remove the file from the target(s) but there seems to be no easy link between a PBXFileReference and a target (PBXNativeTarget).
From what I can make out I need to iterate through each of the project's targets, then iterate through the files or files_references of that target's source_build_phase looking for the PBXFileReference I already have.
Is this correct, or am I missing some obvious link, e.g. file_ref.target_memberships?
if object.is_a?(Xcodeproj::Project::Object::PBXFileReference)
  if !object.real_path.exist?
    object.remove_from_project
  end
end
project.save(project_path)
Not sure when this was introduced, but as of xcodeproj version 1.15.0, you can get the build files associated with a file reference with:
file_ref.build_files
From the documentation:
Method: Xcodeproj::Project::Object::PBXFileReference#build_files
#build_files ⇒ Array<PBXBuildFile>
Returns the build files associated with the current file reference.
Returns:
(Array<PBXBuildFile>) — the build files associated with the current file reference.
Seems like this should do the trick:
file_ref.build_files.each { |file| file.remove_from_project }
