How do you create a Bash function that accepts files of a specific type as arguments? - bash

So far, I know that you have to create a function in order to pass arguments.
However, how do you denote the type of the argument?
For instance, if you want to compile a Java source file and then run the resulting class file (without having to type the file name twice to distinguish between the extensions each time), how do you let the function know that the names belong to files of different types?
Let's say this is our function:
compileAndRun()
{
    javac $1
    java $2   # basically, we want to make this take the same argument
              # (because the names of the *.class and *.java files are the same)
}
So, instead of typing:
compileAndRun test.java test.class
We wanna just type this:
compileAndRun test
Any help along with any extraneous information you wanna throw in would be much appreciated.

Just use $1 twice. It is safer to connect the two commands with &&, so java is not run if the compilation is not successful.
function compile_n_run () {
    javac "$1".java && java "$1"
}

Arguments to bash functions don't really have types; they are just strings, and it's up to you to use them appropriately. In this case, it's fairly simple to write a function which takes a Java source file, compiles it, and runs the resulting output.
compile_n_run () {
    source=$1
    main_class=${source%.java}
    javac "$source" && java "$main_class"
}
$ compile_n_run test.java
I chose to require the full Java source name because it's a little friendlier with auto-completion; you don't have to remove the .java from the command-line, rather you let the function do that for you. (And otherwise, this answer would be identical to choroba's).
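If the program itself needs command-line arguments, a small variant along the same lines (a sketch, not part of either answer above) can forward everything after the source file to the java invocation:
compile_n_run () {
    local source=$1; shift
    javac "$source" && java "${source%.java}" "$@"
}
# e.g. compile_n_run Test.java arg1 arg2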

Related

Call perl function from another perl script with different Active perl versions

We have two versions of ActivePerl, 5.6 and 5.24. We have web services which have to be executed on ActivePerl 5.24 (to adopt TLS 1.2), and these need to be invoked from ActivePerl 5.6. We are using the Windows operating system.
Steps followed :
The caller code, which is executed under 5.6, invokes the 5.24 code using a system/require command.
Problem:
How can the 5.24 Perl function (example: webservicecall(arg1){ return "xyz" }) be called from the 5.6 Perl script through a system command, require, etc.?
Also, how do we get the return value of the 5.24 Perl function?
Note:
It's a temporary workaround to have two Perl versions; we have a plan to upgrade to a higher version later.
Here Perl 5.6 is installed in "C:\Perl\bin\perl\" and Perl 5.24 is installed in "D:\Perl\bin\perl\".
"**p5_6.pl**"
print "Hello Perl5_6\n";
system('D:\Perl\bin\perl D:\sample_program\p5.24.pl');
print $OUTFILE;
$retval = Mul(25, 10);
print ("Return value is $retval\n" );
"**p5_24.pl**"
print "Hello Perl5_24\n";
our $OUTFILE = "Hello test";
sub Mul($$)
{
    my ($a, $b) = @_;
    my $c = $a * $b;
    return($c);
}
I have written this sample program to illustrate calling the Perl 5.24 version from the Perl 5.6 script. During execution I didn't get the expected output. How do I get the "return $c" value and the "our $OUTFILE" value of p5_24.pl in the p5_6.pl script?
Note: the above is a sample program; based on this I will modify the actual program to use serialized data.
Place the code for the function that needs v5.24 in a wrapper script, written just so that it runs that function (and prints its result). Actually, I'd recommend writing a module with that function and then loading that module in the wrapper script.
Then run that script under the wanted (5.24) interpreter, by invoking it via its full path. (You may need to be careful to make sure that all libraries and environment are right.)   Do this in a way that allows you to pick up its output. That can be anything from backticks (qx) to pipe-open or, better, to good modules. There is a range of modules for this, like IPC::System::Simple, Capture::Tiny, IPC::Run3, or IPC::Run. Which to use would depend on how much you need out of that call.
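For instance, with Capture::Tiny the call could be captured like this (only a sketch; the interpreter path and script name are the same placeholders used further below):
use Capture::Tiny qw(capture);

# Run the wrapper under the 5.24 interpreter and capture its output
my ($stdout, $stderr, $exit) = capture {
    system('path-to-perl-5.24', 'program_for_5.24.pl', 25, 10);
};
print "got: $stdout\n";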
You can't call a function in another running program; you have to arrange for it to somehow run under the other interpreter.
Also, variables (like $OUTFILE) defined in one program cannot be seen in another one. You can print them from the v5.24 program, along with that function result, and then parse that whole output in the v5.6 program. Then the two programs would need a little "protocol" -- to either obey an order in which things are printed, or to have prints labeled in some way.
Much better, write a module with functions and variables that need be shared. Then the v5.24 program can load the module and import the function it needs and run it, while the v5.6 program can load the same module but only to pick up that variable (and also run the v5.24 program).
Here is a sketch of all this. The package file SharedBetweenPerls.pm
package SharedBetweenPerls;
use warnings;
use strict;
use Exporter qw(import);
our @EXPORT_OK = qw(Mul export_vars);
my $OUTFILE = 'test_filename';
sub Mul { return $_[0] * $_[1] }
sub export_vars { return $OUTFILE }
1;
and then the v5.24 program (used below as program_for_5.24.pl) can do
use warnings;
use strict;
# Require this to be run by at least v5.24.0
use v5.24;
# Add path to where the module is, relative to where this script is
# In our demo it's the script's directory ($RealBin)
use FindBin qw($RealBin);
use lib $RealBin;
use SharedBetweenPerls qw(Mul);
my ($v1, $v2) = @ARGV;
print Mul($v1, $v2);
while the v5.6 program can do
use warnings;
use strict;
use feature 'say';
use FindBin qw($RealBin);
use lib $RealBin;
use SharedBetweenPerls qw(export_vars);
my $outfile = export_vars(); #--> 'test_filename'
# Replace "path-to-perl..." with an actual path to a perl
my $from_5_24 = qx(path-to-perl-5.24 program_for_5.24.pl 25 10); #--> 250
say "Got variable: $outfile, and return from function: $from_5_24";
where $outfile has the string test_filename while $from_5_24 is 250.†
This is tested to work as it stands if both programs, and the module, are in the same directory, with names as in this example. (And with path-to-perl-5.24 replaced with the actual path to your v5.24 executable.) If they are at different places you need to adjust paths, probably the package name and the use lib line. See lib pragma.
Please note that there are better ways to run an external program --- see the recommended modules above. All this is a crude demo since many details depend on what exactly you do.
Finally, the programs can also connect via a socket and exchange all they need but that is a bit more complex and may not be needed.
† The question's been edited, and we now have D:\Perl\bin\perl for path-to-perl-5.24 and D:\sample_program\p5.24.pl for program_for_5.24.
Note that with such a location of the p5.24.pl program you'd have to come up with a suitable location for the module; its name would then need to include (part of) that path, and it would have to be loaded under that name. See for example this post.
A crude demo without a module (originally posted)
As a very crude sketch, in your program that runs under v5.6 you could do
my $from_5_24 = qx(path-to-perl-5.24 program_for_5.24.pl 25 10);
where the program_for_5.24.pl then could be something like
use warnings;
use strict;
sub Mul { return $_[0] * $_[1] }
my ($v1, $v2) = @ARGV;
print Mul($v1, $v2);
and the variable $from_5_24 ends up being 250 in my test.
You cannot directly call a Perl function running with another Perl version. You would need to create a program which explicitly invokes the function. The input and output need to be explicitly serialized in order to be transported between these two programs.
Serializing could be done with Data::Dumper, Storable or similar. If lower performance is acceptable you could invoke the program which provides the function with system and share the serialized data through temporary files or pipes. Or you could create some client-server architecture and share the serialized data over sockets. The latter is faster since it skips the repeated startup and teardown of the other process and instead keeps it running.
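As a minimal sketch of that idea (the script name dump_result.pl and the perl path are placeholders, not the asker's actual files), the 5.24 side could print a Data::Dumper dump of its results, and the 5.6 side could rebuild the structure with eval:
# dump_result.pl -- run under the 5.24 interpreter
use strict;
use warnings;
use Data::Dumper;

my %result = ( product => $ARGV[0] * $ARGV[1], outfile => 'Hello test' );
$Data::Dumper::Terse = 1;    # print just the structure, without '$VAR1 ='
print Dumper(\%result);

# caller -- run under the 5.6 interpreter
my $dumped = qx(path-to-perl-5.24 dump_result.pl 25 10);
my $result = eval $dumped;   # rebuild the hashref (only for trusted output)
print "product=$result->{product}, outfile=$result->{outfile}\n";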

How to pass "args" while using Intellij IDEA [example list 3.11 in Odersky's Scala book (2nd Ed)]

I am basically trying to run example 3.11 in Odersky's book (Programming in Scala). I am using the IntelliJ IDEA IDE. While running the code, the "else" branch got executed.
The screen capture is here:
The source is here in case you need it to try:
package ch3
import scala.io.Source
object l3p11 extends App {
  def widthOfLength(s: String) = s.length.toString.length
  if (args.length > 0) {
    val lines = Source.fromFile(args(0)).getLines().toList
    val longestLine = lines.reduceLeft(
      (a, b) => if (a.length > b.length) a else b
    )
    val maxWidth = widthOfLength(longestLine)
    for (line <- lines) {
      val numSpaces = maxWidth - widthOfLength(line)
      val padding = " " * numSpaces
      println(padding + line.length + "|" + line)
    }
  }
  else
    Console.err.println("Please enter filename")
}
The reason, I think, is because I did not pass the args correctly (say, here I want to pass the source file l3p11.scala as the args). I tried several options but have not found a way to pass the args correctly so that the code in the "if" branch gets executed. There are two directions in my mind to resolve this problem:
1. Find the right way to pass args in IntelliJ IDEA
2. Run Scala from the command line; a command such as
$ scala l3p11.scala l3p11.scala
should be able to pass the args correctly. But my current setting gives "bash: scala: command not found". I currently use the Scala REPL to run Scala code, following the setup given in Odersky's Coursera course on Scala. I think I need to change the setup in order to run scala directly, instead of invoking the Scala interpreter via "sbt -> console" as I do now.
Any suggestion on either direction (or other directions that I have not thought of) to resolve the problem is welcome.
Update 1:
Direction 2 works after I reinstalled Scala. (My understanding, which may need correcting, is that the sbt installation does not provide a scala executable on the Windows PATH; therefore the scala command could not be found before.) After installing Scala directly:
$ scala l3p11.scala l3p11.scala
gives the expected results. But I still have not figured out how to get this result with IntelliJ IDEA.
Update 2:
I revisited the "Program arguments" option after Joe's confirmation. The reason I was not able to get it to work before was that I only added "l3p11.scala". Adding the complete path from the working directory, "src/main/scala/ch3/l3p11.scala", solved the problem. The result is as follows:
To pass command-line arguments when running a program in IntelliJ IDEA, use the "Edit Configurations …" menu item under "Run". Choose the entry for your main program. There's a "Program arguments" text field where you specify the arguments to pass to the program.
I'm not super familiar with how it will run on Windows, but if you are able to run it directly from the command line then I think you'll need to compile first; that's the scalac command. So:
$ scalac l3p11.scala
Then you can run it with just the class name (not sure if you would need quotes around the arg):
$ scala l3p11 l3p11.scala
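As a side note, since the setup already goes through sbt, another option (not mentioned in the answers above, and assuming the standard sbt layout from the update) is to hand the argument to sbt's runMain task:
$ sbt "runMain ch3.l3p11 src/main/scala/ch3/l3p11.scala"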

Julia: Having a function f() containing the macro @printf, how can I access the output outside f()?

In the Julia NMF package a verbose option provides information on convergence using the @printf macro.
How can I access this output without rewriting the NMF package io?
To rephrase, having a function f() containing the macro @printf, how can I access the output outside f()?
This does seem like useful functionality to have: I would suggest that you file an issue with the package.
However, as a quick hack, something like the following should work:
oldout = STDOUT
(rd,wr) = redirect_stdout()
start_reading(rd)
# call your function here
flush_cstdio()
redirect_stdout(oldout)
close(wr)
s = readall(rd)
close(rd)
s
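The snippet above uses pre-1.0 names (STDOUT, start_reading, readall). On a current Julia version, a rough equivalent, sketched here rather than taken from the original answer, might look like this:
oldout = stdout
rd, wr = redirect_stdout()   # stdout now goes into the pipe
# call your function here
redirect_stdout(oldout)      # restore the real stdout
close(wr)
s = read(rd, String)         # the captured output
close(rd)
s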

Get autocompletion list in bash variable

I'm working with a big software project with many build targets. When typing make <tab> <tab> it shows over 1000 possible make targets.
What I want is a bash script that filters those targets by certain rules. Therefore I would like to have this list of make targets in a bash variable.
make_targets=$(???)
[do something with make_targets]
make $make_targets
It would be best if I didn't have to change anything in my project.
How can I get such a List?
@yuyichao created a function to get autocomplete output:
comp() {
    COMP_LINE="$*"
    COMP_WORDS=("$@")
    COMP_CWORD=${#COMP_WORDS[@]}
    ((COMP_CWORD--))
    COMP_POINT=${#COMP_LINE}
    COMP_WORDBREAKS='"'"'><=;|&(:"
    # Don't really think any real autocompletion script will rely on
    # the following 2 vars, but in principle they could.
    COMP_TYPE=9
    COMP_KEY=9
    _command_offset 0
    echo "${COMPREPLY[@]}"
}
Just run comp make '' to get the results, and you can manipulate that. Example:
$ comp make ''
test foo clean
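From there, filling in the skeleton from the question might look like this (the grep pattern is just an example of a filtering rule):
make_targets=$(comp make '' | tr ' ' '\n' | grep '^test')
make $make_targets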
You would need to overwrite / modify the completion function for make. On Ubuntu it is located at:
/usr/share/bash-completion/completions/make
(Other distributions may store the file at /etc/bash_completion.d/make)
If you don't want to change the completion behavior for the whole system, you might write a small wrapper script like build-project, which calls make. Then write a completion function for that wrapper, derived from make's.

Python: Call a shell script which calls a bin. With arguments

The context: There is a directory somewhere on the system with bin files which I'd like to call. They are not callable directly, though, but through shell scripts which do all kinds of magic and then call the corresponding bin with: "$ENV_VAR/path/to/the/bin" "$@" (the software is non-free; that's probably why this construction is used)
The problem: Calling this from within Python. I tried to use:
from subprocess import call
call(["nameOfBin", "-input somefile"])
But this gave the error ERROR: nameOfBin - Illegal option: input somefile. This means the '-' sign in front of 'input' has disappeared along the way (putting more '-' signs in front doesn't help).
Possible solutions:
1: In some way preserving the '-' sign so the bin at the end actually takes '-input' as an option instead of 'input'.
2: Fix the magic in a dirty way (I will probably manage), and have a way to call a bin at a location defined by a $ENV_VAR (environment variable).
I searched for both methods, but apparently nobody before me had such a problem (or I didn't see it; sorry if that's the case).
Each item in the list should be a single argument. Replace "-input somefile" with "-input", "somefile":
from subprocess import call
rc = call(["nameOfBin", "-input", "somefile"])
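For the second part (calling a bin or wrapper whose location comes from an environment variable), one possibility is to build the path with os.environ; ENV_VAR and the path components below are placeholders standing in for the real ones:
import os
from subprocess import call

# Build the binary's path from the environment variable it expects
script = os.path.join(os.environ["ENV_VAR"], "path", "to", "the", "bin")
rc = call([script, "-input", "somefile"])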
