Using SCons to run other build scripts "underneath" - bash

I'm working on a contract to enhance a huge gnarly build system that's based on SCons but with a bunch of shell scripts and makefiles intertwined with it.
The system runs a bunch of individual SCons commands and the request from my client was to put in a "top level" SConscript to control some underlying SCons runs.
I got this to work by using the Command function:
tgt = env.Command('bogus.out', 'bogus.in', "./stc.sh")
where the shell script 'stc.sh' removes the next target's controlling fake file 'pkgbogus.out':
tgt2 = env.Command('pkgbogus.out', 'pkgbogus.in', "./stcpkg.sh")
This works fine, and I understand that this is completely outside the scope of normal SCons use ... but is there a slicker way to do this without these 'bogus' files?

Use non-built-in bash commands without modifying .bashrc

I'm working on a cluster and using custom toolkits (more specifically, the SRA Toolkit). In order to use it, I first had to download (and unpack) it to a specific folder in my directory.
Then I had to modify .bashrc to include the following segment:
# User specific aliases and functions
export PATH="$PATH:/home/MYNAME/APPS/SRATOOLS/bin"
Now I can use the SRA Tools from the bash command line, e.g.
prefetch SR111111
My question is: can I use those tools without modifying my .bashrc?
The reason I want to do this is that I wrote a .sh script that takes a long time to run, and my cluster uses the Sun Grid Engine job management system. I submitted my script to it, only to see the process fail, because an SRA Toolkit command I used was unrecognized.
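One common fix (a minimal sketch, using the path and command from the question): batch jobs under Sun Grid Engine typically run non-interactive shells that don't read .bashrc, so export the PATH inside the submitted script itself:

#!/bin/bash
# Non-interactive batch shells don't read .bashrc, so set PATH here
# (the install path is the one from the question; adjust as needed)
export PATH="$PATH:/home/MYNAME/APPS/SRATOOLS/bin"

prefetch SR111111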
EDIT (1):
I modified the location where my prefetch command is, and now it looks like:
/MYNAME/APPS/SRA_TOOLS/bin
different from how it is in .bashrc:
export PATH="$PATH:/home/MYNAME/APPS/SRATOOLS/bin"
I then ran what @Darkman suggested (an IF/THEN/ELSE/FI block, with the export under ELSE). The output shows that it didn't find the SRA Tools at first (because the path in .bashrc is different), but it found them under the ELSE branch, and the script now runs normally. Weird. It works on my job management system.
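A sketch of what that suggestion might look like (the exact test is an assumption; the paths are the ones from the question):

# Use the toolkit if it is already on PATH; otherwise export its location
# (this check is an assumption of what @Darkman suggested)
if command -v prefetch >/dev/null 2>&1
then
    echo "prefetch already on PATH"
else
    export PATH="$PATH:/MYNAME/APPS/SRA_TOOLS/bin"
fi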
Thanks everybody.

PVS-Studio: No compilation units were found

I'm using PVS-Studio in a Docker image based on ubuntu:18.04 for cross-compiling a couple of files with arm-none-eabi-gcc. After running pvs-studio-analyzer trace -- .test/compile_with_gcc.sh, the strace_out file is successfully created; it's not empty and contains calls to arm-none-eabi-gcc.
However, pvs-studio-analyzer analyze complains that "No compilation units were found". I tried using the --compiler arm-none-eabi-gcc switch, with no success.
Any ideas?
The problem was in my approach to compilation. Instead of using a proper build system, I used a wacky shell script (surely, I thought, using a build system for 3 files is overkill; a shell script won't hurt anybody). And in that script I used grep to redefine one constant in the source, kinda like this:
grep -v -i "#define[[:blank:]]\+${define_name}[[:blank:]]" ${project}/src/main/main.c | ~/opt/gcc-arm-none-eabi-8-2018-q4-major/bin/arm-none-eabi-gcc -o main.o -xc -
So the compiler didn't actually compile a proper file; it compiled the output of grep. Naturally, PVS-Studio wasn't able to analyze it.
TL;DR: Don't use shell scripts as a build system.
We have reviewed the strace_out file. It can be handled correctly by the analyzer if the source files and compilers are located by absolute path in the strace_out file. We have a suggestion that might help. You can "wrap" the build commands in calls to pvs-studio-analyzer trace -- and pvs-studio-analyzer analyze and place them inside your script (compile_with_gcc.sh). Thus, the script should start with the command:
pvs-studio-analyzer trace --
and end with the command:
pvs-studio-analyzer analyze
This way we make sure that the build and the analysis are started in the same container run. If the proposed method does not help, please describe the process of building the project and running the analyzer in more detail, command by command. Also tell us whether the container is rerun between the build (the formation of strace_out) and the analysis itself.
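For illustration, a minimal sketch of compile_with_gcc.sh wrapped this way (the gcc line is a placeholder; substitute the real build commands):

#!/bin/bash
# Trace the actual build, then analyze, all in one container run
# (the compile line below is an assumed example, not the real build)
pvs-studio-analyzer trace -- arm-none-eabi-gcc -c src/main/main.c -o main.o
pvs-studio-analyzer analyze --compiler arm-none-eabi-gcc -o project.log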
It would also help us a lot if you ran the pvs-studio-analyzer command with the optional --dump-log flag and sent us the resulting log. An example of a command that does this:
pvs-studio-analyzer analyze --dump-log ex.log
Also, since it seems the problem cannot be solved quickly, it is probably more convenient to continue the conversation via the feedback form on the product website.

Cannot `source` shc-compiled scripts

Is there any way to source (include) a compiled script?
I use shc to compile all of my scripts, and when I run them from the command line they start OK. But when a script has to include two other scripts (variables.sh.x and functions.sh.x), it crashes and returns an error that binary files cannot be included.
Is there any way to accomplish this?
The including piece of code:
source $(dirname $0)/variables.sh.x
source $(dirname $0)/functions.sh.x
shc does not actually compile scripts. It merely obfuscates them by encrypting and embedding them inside a C program, so it cannot improve performance. The actual shell still interprets and executes the code and is required for the script to run.
If you absolutely must use this tool to obfuscate your code, you will have to combine everything into a single file.
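A sketch of that workaround (the file names are assumed, and the combined script must no longer source the .x files):

# Concatenate the pieces into one script, then obfuscate the result
# (variables.sh, functions.sh, main.sh are assumed names for the sources)
cat variables.sh functions.sh main.sh > combined.sh
shc -f combined.sh    # produces the obfuscated binary combined.sh.x
./combined.sh.x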

invoke make from build

I'd like to simplify the workflow so that rather than issuing these commands
$ make program_unittest
... output of $MAKE ...
$ ./program_unittest args
I could have my program automatically attempt to compile itself (if the source has been updated) when it is run, so that I do not have to go back and run make myself.
Here's what I'm thinking: my unit-test binary should first check whether there is a makefile in its directory, and if so, fork and exec make with the target corresponding to itself. If make determines there is "nothing to be done", it will continue on its way (running the unit tests). However, if make actually performs a compilation, one of two things may happen. gcc (invoked by make) might be able to overwrite the binary (an older version of which is already running) during compilation, in which case I can perhaps exec it. If my system does not permit gcc to overwrite a program that is in use, then I have to quit the program before running make.
So this has become quite involved already. Are there perhaps more elegant solutions? Maybe I could use a bash script? How do I ascertain if make issued compilation commands or not?
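One answer to the last question, as a sketch (the target name is taken from the question): make's question mode, make -q, exits 0 when the target is up to date and 1 when it would have to rebuild, so a small wrapper script can decide without parsing make's output:

#!/bin/bash
# Rebuild the test binary only if make says it is out of date,
# then replace this wrapper process with the freshly built tests
if ! make -q program_unittest
then
    make program_unittest || exit 1   # rebuild failed: don't run stale tests
fi
exec ./program_unittest "$@"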
Why not have make run the unit tests?

Package bash script in "executable" for double-click execution (ideally platform independent)?

I wrote a number of bash scripts that greatly simplify the routine, but very tedious, file manipulation that my group does.
Unfortunately, most people in my group cannot open a terminal, let alone run scripts with complex arguments.
Is there a way to nicely package a bash script into an executable (that accepts arguments) that runs nicely on multiple computer platforms?
I run Mac OS X, but many of my colleagues run Windows (which can run bash scripts via Cygwin, etc.). I am aware of Platypus, but is there an equivalent for Windows?
I don't know if it meets all of your requirements, but I use makeself, which is really great for packaging things. It works with Cygwin, so it might fit your needs ^^
Basically, when you create a makeself archive, you supply a script that will be executed when the archive is "launched". This script gets all the parameters given to the archive (whatever you want):
makeself.sh ${dir_to_archive} ${name_of_archive} ${description} ${startup_script}
When you run the self-extracting archive, you do:
my_archive.run ${param1} ${param2} ${paramN}
It will uncompress your archive and run:
${startup_script} ${param1} ${param2} ${paramN}
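A concrete sketch with made-up names:

# Pack everything under ./tools into one self-extracting archive,
# with run_tools.sh (inside ./tools) executed on launch
makeself.sh ./tools file_tools.run "File manipulation tools" ./run_tools.sh

# Colleagues then run it, passing whatever arguments run_tools.sh expects
./file_tools.run input_dir output_dir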
my2c
