How to Wait for more Content when Reading a File with Factor?

I have something like the following code
"file.txt" utf8 <file-reader> [ [ print ] each-line ] with-input-stream* ;
This works nicely for the current contents of file.txt, and the processing (like printing in this case) ends when the end-of-file is reached. But I want the process to wait for new content appended to the file and process that as well. In other words, the current version implements Unix cat, but I want it to do tail -f.
I hoped with-input-stream* (mind the asterisk) would do the trick, as the docs say the stream is not closed at the end. But there must be something else I'm missing.

You're in luck, I wrote a utility like that a while ago. See https://github.com/bjourne/playground-factor/wiki/Tips-and-Tricks-Filesystem#tailing-a-file
USING: accessors io io.encodings.utf8 io.files io.monitors kernel
namespaces ;
IN: examples.files.tail

: emit-changes ( monitor -- )
    dup next-change drop
    input-stream get output-stream get stream-copy* flush
    emit-changes ;

: seek-input-end ( -- )
    0 seek-end input-stream get stream>> stream-seek ;

: tail-file ( fname -- )
    [
        dup f <monitor> swap utf8 [
            seek-input-end emit-changes
        ] with-file-reader
    ] with-monitors ;
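With those words loaded, tailing a file from the listener is then a single call (the path here is just an example):
IN: scratchpad "/tmp/foo" tail-file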
I think your problem is that the quotation given to with-input-stream* implicitly closes the stream (each-line does it). I don't know whether that is a bug or not. A word like this can be used to read the full stream without closing it:
: my-stream-contents* ( stream -- seq )
    [ [ stream-read1 dup ] curry [ ] ] [ stream-exemplar produce-as nip ] bi ;
Then:
IN: scratchpad "/tmp/foo" utf8 <file-reader> [ my-stream-contents* print ] keep
file contents here
...
--- Data stack:
T{ decoder f ~input-port~ utf8 f }
IN: scratchpad my-stream-contents* print
more file contents here
...

Related

bash recursion automatically ends after single level

Why is this make_request function ending just after a single traversal?
make_request(){
    path="${1//' '/'%20'}"
    echo $path
    mkdir -p $HOME/"$1"
    $(curl --output $HOME/"$1"/$FILE_NAME -v -X GET $BASE_URL"/"$API_METHOD"/"$path &> /dev/null)
    # sample response from curl
    # {
    #   "count": 2,
    #   "items": [
    #     {"path": "somepath1", "type": "folder"},
    #     {"path": "somepath2", "type": "folder"},
    #   ]
    # }
    count=$(jq ".count" $HOME/"$1"/$FILE_NAME)
    for (( c=0; c<$count; c++ ))
    do
        child=$(jq -r ".items[$c] .path" $HOME/"$1"/$FILE_NAME);
        fileType=$(jq -r ".items[$c] .type" $HOME/"$1"/$FILE_NAME);
        if [ "$fileType" == "folder" ]; then
            make_request "$child"
        fi
    done
}
make_request "/"
make_request "/"
make_request "/" should give the following output:
/folder
/folder/folder1-1
/folder/folder1-1/folder1-2
/folder/folder2-1
/folder/folder2-1/folder2-2
/folder/folder2-1/folder2-3 ...
but I am getting the following:
/folder
/folder/folder1-1
/folder/folder1-1/folder1-2
You are using global variables everywhere. Therefore, the inner call changes the loop variables c and count of the outer call, producing bogus results.
Minimal example:
f() {
    this_is_global=$1
    echo "enter $1: $this_is_global"
    ((RANDOM%2)) && f "$(($1+1))"
    echo "exit $1: $this_is_global"
}
Running f 1 prints something like
enter 1: 1
enter 2: 2
enter 3: 3
exit 3: 3
exit 2: 3
exit 1: 3
Solution: Make the variables local by writing local count=$(...) and so on. For your loop, you have to put an additional statement local c above the for.
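Applied to the minimal example above, one local declaration is enough to give each recursive call its own copy:
f() {
    local this_is_global=$1    # now shadowed in each call
    echo "enter $1: $this_is_global"
    ((RANDOM%2)) && f "$(($1+1))"
    echo "exit $1: $this_is_global"
}
With that change, every "exit N" line reports the same number as its matching "enter N" line.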
As currently written, all variables have global scope. That means all function calls overwrite and reference the same set of variables, so when a child call returns, the parent finds its variables overwritten by the child; that in turn leads to the type of behavior seen here.
In this particular case, the loop variable c leaves the last child call with a value of c=$count, so all parent loops now see c=$count and exit. It actually gets a bit more interesting because count is also changing with each function call. The earlier suggestion to add set -x (aka enable debug mode) before the first function call will show what's going on with each variable at each call.
What the OP wants to do is ensure each function invocation works with a local copy of each variable. The easiest approach is to add a local <variable_list> at the top of the function, making sure to list all variables that should be treated as local, e.g.:
local path count c child fileType
This changes the variables to have local scope instead of global:
...
local count                                 # <------ VARIABLE MADE LOCAL
count=$(jq ".count" $HOME/"$1"/$FILE_NAME)
local c                                     # <------ VARIABLE MADE LOCAL
for (( c=0; c<$count; c++ ))
do
    ....
done
...
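Putting it together, here is a sketch of the question's function with every variable declared local (and the stray command substitution around curl dropped); $BASE_URL, $API_METHOD, and $FILE_NAME are assumed to be set elsewhere in the script, as in the question:
make_request(){
    # all working variables are local, so recursive calls cannot clobber them
    local path count c child fileType
    path="${1//' '/'%20'}"
    echo "$path"
    mkdir -p "$HOME/$1"
    curl --output "$HOME/$1/$FILE_NAME" -v -X GET "$BASE_URL/$API_METHOD/$path" &> /dev/null
    count=$(jq ".count" "$HOME/$1/$FILE_NAME")
    for (( c=0; c<count; c++ )); do
        child=$(jq -r ".items[$c].path" "$HOME/$1/$FILE_NAME")
        fileType=$(jq -r ".items[$c].type" "$HOME/$1/$FILE_NAME")
        if [ "$fileType" == "folder" ]; then
            make_request "$child"
        fi
    done
}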

How to define a variable in a C-type for loop?

I want to iterate over an array of file names stored in files_arr to create a terminal-based file manager in the POSIX shell.
A reduced version of the function list_directory looks like this:
# Presents the user with the files in the directory
list_directory() {
    # Iterate over each element in array `files_arr` by index, not by filename!
    # And outputs the file name one on each line
    for file in "${!files_arr[@]}"; do
        echo "${files_arr[file]}"
    done
}
I want to implement a way to exclude the first n files from the array files_arr.
n is defined by how often the user scrolls past the current terminal window size to create an effect of scrolling through the files, highlighting the file where the cursor is currently on.
To implement this, I try to create a C-like for loop like so:
for ((file=$first_file; file<=${!files_arr[@]}; file=$((file+1))); do
or as the whole function:
# Presents the user with the files in the directory
list_directory() {
    # Iterate over each element in array `files_arr` by index, not by filename!
    #for file in "${!files_arr[@]}"; do
    for ((file=$first_file; file<=${!files_arr[@]}; file=$((file+1))); do
        # Highlighted file is echoed with background color
        if [ $file -eq $highlight_index ]; then
            echo "${BG_BLUE}${files_arr[file]}${BG_NC}"
        # Colorize output based on filetype (directory, executable,...)
        else
            if [ -d "${files_arr[file]}" ]; then
                echo "$FG_DIRECTORY${files_arr[file]}$FG_NC"
            elif [ -x "${files_arr[file]}" ]; then
                echo "$FG_EXECUTABLE${files_arr[file]}$FG_NC"
            else
                echo "${files_arr[file]}"
            fi
        fi
        # $LINES is the terminal height (e.g. 23 lines)
        if [ "$file" = "$LINES" ]; then
            break
        fi
    done
}
which returns the error:
./scroll.sh: line 137: syntax error near `;'
./scroll.sh: line 137: ` for ((file=$first_file; $file<=${!files_arr[@]}; file=$((file+1))); do'
How can I iterate over the array files_arr, defining the start-index of $file?
You can iterate over the array with:
for (( i = $first_file; i < ${#files_arr[@]}; i++ )); do
    echo "${files_arr[i]}"
done
but it seems cleaner to use:
for file in "${files_arr[@]:$first_file}"; do
    echo "$file"
done
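A quick illustration of the slice form with a made-up array, skipping the first two entries:
files_arr=(alpha beta gamma delta)
first_file=2
for file in "${files_arr[@]:$first_file}"; do
    echo "$file"    # prints "gamma", then "delta"
done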

How can I edit a .conf file easily?

So I read that the easiest way to use .conf files for bash scripts is to load them with source. Now, what if I want to edit this file?
Some code I found does a really good job:
function set_config(){
    sed -i "s/^\($1\s*=\s*\).*\$/\1$2/" $conf_file
}
But if the variable is not yet defined, it doesn't define it; nor does it check whether the parameters are passed correctly, and it isn't secure, doesn't handle default values, etc.
Do reliable tools or code already exist for editing .conf files which contain key="value" pairs? For instance, I would like to be able to do things like this:
conf_file="my_script.conf"
conf_load $conf_file # should create the file if it doesn't exist!
read=$(conf_get_value "data" "default_value") # should read the value with key "data", defaulting to "default_value"
if [[ $? = 0 ]] # we should be able to know if the read was successful
then
    echo "Successfully read value for field \"data\" : $read"
else
    echo "Default value for field \"data\" : $read"
fi
conf_set "something_new" "a great value!" # should add the key "something_new" as it doesn't exist
conf_set "data" "new_value" # should edit the value with key "data"
if [[ $? = 0 ]]
then
    echo "Edit successful !"
else # something went wrong :-/
    echo "Edit failed !"
fi
Before running this code, the conf file would contain:
data="some_value"
and afterwards it would contain:
data="new_value"
something_new="a great value!"
and the code should output:
Successfully read value for field "data" : some_value
Edit successful !
I am using bash version 4.3.30. Thanks for your help.
I'd do that with awk, since it's rather good at tokenizing:
# overwrite config's entries for KEY with VALUE or else append the definition
# Usage: set_config KEY VALUE
set_config() {
    [ -n "$1" ] && awk -F= -v key="$1" -v new="$1=\"$2\"" '
        $1 == key { $0 = new; key_found = 1 }
        { print }
        END { if (!key_found) { print new } }
    ' "$conf_file" > "$conf_file.new" \
    && cat "$conf_file.new" > "$conf_file" && rm "$conf_file.new"
}
If run without arguments, set_config() will do nothing and return false. If run with only one argument, it will create an empty value (outputting KEY="").
The awk command parses the .conf file line by line, looking for each definition of the given key and altering it to the new value. All lines are then printed (with or without modification), preserving the original order. If the key hasn't yet been found by the end of the file, this appends the new definition.
Because you can't pipe a file atop itself, this gets saved with a ".new" extension and then copied atop the original in a manner that preserves permissions. The ".new" copy is then removed. I used && to ensure that these never happen if an error occurred earlier in the function.
Also note that the type of ".conf file" you're referring to (the type you source with a POSIX shell) will never have spaces around its equals signs, so the \s* parts of your sed command aren't needed.
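The question's conf_get_value can be sketched in the same spirit. This is a minimal, hypothetical version assuming one key="value" pair per line with no spaces around the equals sign:
# Usage: conf_get_value KEY DEFAULT
# Prints the value for KEY from $conf_file, or DEFAULT if the key is absent.
# Returns 0 when the key was found, 1 when the default was used.
conf_get_value() {
    local line
    line=$(grep -m 1 "^$1=" "$conf_file") || { printf '%s\n' "$2"; return 1; }
    line=${line#*=}     # drop KEY=
    line=${line%\"}     # drop trailing quote
    line=${line#\"}     # drop leading quote
    printf '%s\n' "$line"
}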

Filemaker 13 Quicklook Script

Running FileMaker 13 on OS X Yosemite.
We have a quicklook script that has, up until Yosemite, worked without issue. Normally, it takes a .doc/.docx file in the container field and opens it up in Quicklook.
However in Yosemite, it opens qlmanage, then causes Filemaker to freeze and crash.
Set Variable [ $file ; Value: ${database}::Container Field ]
Set Variable [ $path ; Value: Get ( TemporaryPath ) & $file ]
Set Variable [ $script ; Value:
    Let (
        thepath = Middle ( $path ; Position ( $path ; "/" ; 1 ; 2 ) ; Length ( $path ) ) ;
        "set p to POSIX path of " & Quote ( thepath ) &
        "¶ do shell script \"qlmanage -p \" & quoted form of p"
    )
]
Export Field Contents [ Database::Container Field ; "$path" ]
Perform AppleScript [ $script ]
Can anyone give me some ideas on what might be going wrong here?
Thanks
I succeeded with an edited version of your script using FileMaker Pro Advanced 14.0.2 running under OS X Yosemite 10.10.5 in a demo file that looked like this:
Set Variable [ $_file ; Value: GetAsText ( Table::container ) ]
Set Variable [ $_fm_path ; Value: Get ( TemporaryPath ) & $_file ]
Set Variable [ $_as_path ; Value: Middle (
    $_fm_path ;
    Position ( $_fm_path ; "/" ; 1 ; 2 ) ;
    Length ( $_fm_path ) )
]
Set Variable [ $_script ; Value: List (
    "set p to POSIX path of " & Quote ( $_as_path ) ;
    "do shell script \"qlmanage -p \" & quoted form of p" )
]
Export Field Contents [ Table::container ; "$_fm_path" ]
Perform AppleScript [ $_script ]
Exit Script []
The primary differences between this and what you showed are:
I used a direct reference to the table name. I'm actually not sure what ${database} refers to. Perhaps Get ( FileName )?
I stored the AppleScript path in a variable for easier debugging.
If this doesn't work, I'd start with the recommendation I gave about testing the execution of the contents of $_script in Script Editor and the contents of the shell's p variable in Terminal.
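For that Terminal test, the command the script ultimately builds can be run directly against any exported file (the path below is just a placeholder):
# Preview the exported document with Quick Look from the shell
qlmanage -p "$HOME/Desktop/test.docx"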

Expect script clause evaluation

In my expect script, my goal is to send a command that shows the properties of the two processors on a motherboard. Please assume the remote login is successful. The problem is in the send clause, where the variables are not evaluated as I expect.
I have a procedure and a variable:
set showcpu "show -d properties /SYS/MB/P\r"
I created a while loop that executes a send while the cpu counter, starting at 0, is less than 2.
set cpu 0
while { $cpu < 2 } {
    expect {
        -re $prompt { send "${showcpu}${cpu}\r" }
        timeout {
            my_puts "ILOM prompt timeout error-2" [ list $fh1 $fh3 stdout ]
            exit 1
        }
    }
    set cpu [ expr {$cpu + 1} ]
}
The execution result is this:
[BL0/SP]-> show -d properties /SYS/MB/P
show: Invalid target /SYS/MB/P
[BL0/SP]-> 0
Invalid command '0' - type help for a list of commands.
I wanted the script to combine the value of $showcpu with $cpu, so the sent commands would look like this:
show -d properties /SYS/MB/P0 and show -d properties /SYS/MB/P1.
Could someone please educate me on what I need to do to accomplish that?
The variable ${showcpu} itself already contains "\r" (see its definition above).
Either define it without "\r":
set showcpu "show -d properties /SYS/MB/P"
or use string trim (http://wiki.tcl.tk/10174):
send "[string trim ${showcpu}]${cpu}\r"
I would recommend trimming the whitespace at the place where the variable is set, not at the places where it is used.
