I am trying to use a server to run a set of computations for me and then export the results as a CSV file on the server, to be transferred back to my own computer. I am having trouble exporting files when running a script remotely. I have a .nb file containing:
a = 1;Export[Directory[] <> "/a.csv", a]
Then I transfer the file to the server and run the script with wolframscript:
$ wolframscript -script /location/filename.nb
I expect a file called a.csv to appear in the directory the .nb file is saved in, yet it doesn't. I have tried -run and -file, and neither works. I have also tried a .wl file, which doesn't work either. What am I doing wrong?
As far as I can tell, file operations are more primitive when using scripts. The functions OpenWrite, OpenAppend, Write, WriteString, and Close are key, and the options CharacterEncoding, FormatType, and PageWidth can help with string data and text files. Your example works on the desktop with:
a = 1;
pipeStream = OpenWrite["a.csv", FormatType -> OutputForm];
Write[pipeStream, a];
Close[pipeStream]
Save it from Mathematica as a Wolfram Script file, "name.wls". On Linux, you need to make the file executable (see the Wolfram tutorial).
Then, at your shell prompt:
wolframscript -file name.wls
should run the script file and create a CSV file with the value "1".
This answer is a reminder that code written in Mathematica and intended for scripting requires its cells to be initialization cells.
I have the following syntax written in PSPP .sps file:
GET FILE = '... result.sav'
save translate
/outfile = '... data.csv'
/type = CSV
/REPLACE
/FIELDNAMES
/CELLS=LABELS.
where the ... parts stand for the paths of the files.
The script works as expected: when I open PSPP and run it, it opens the first file and saves it as a CSV file. However, I would like to do two more things:
Call this file from CMD (on Windows) so it will execute all commands automatically and silently, without showing the PSPP windows.
Add a line to the syntax to terminate PSPP after execution.
Right now I can only type the name of the .sps file in CMD, which opens the file but does nothing else. I have looked in the official docs but couldn't find any solution for this.
Well, I found the answer myself:
In CMD I had to type:
"C:\Program Files\PSPP\bin\pspp.exe" C:\Users\...\Dropbox\MATLAB\atid\convert_to_csv.sps
And that's all. It runs PSPP in silent mode and creates the file as needed.
What I didn't know was that I first needed to write the path of the PSPP executable ("C:\Program Files\PSPP\bin\pspp.exe") before the name of the syntax file. All the rest just worked.
I want to run a bash script on a mac remotely from a batch script on a windows machine. On Windows I have this:
@echo off
echo bash /applications/snowflake/table-updater/test2.sh; exit>tmp_file
putty -ssh User#remote_machine -pw password -m tmp_file
And here is test2.sh on the remote machine
#!/bin/bash
# test2.sh
#
#
7za x table-apps.zip -y -o/Applications/snowflake/applications
When the batch file runs, it logs in successfully but for some reason fails to run the bash script. However, the script runs fine from the Mac terminal, where it unzips the files perfectly. What could be happening here?
Please note that test2.sh is actually in /Applications/snowflake/table-updater, as specified in the batch file, and the tmp file is written correctly as well. My aim is to have a script that accesses a further 10 remote machines with the same directory structure.
Thanks in advance
The standard program in the PuTTY suite which resembles the scriptable Unix command ssh is called plink, and it is probably the recommended tool here. The putty program adds a substantial terminal-emulation layer which is unnecessary for noninteractive scripting (drawing terminal windows, managing their layout, cursor addressing, fonts, etc.), and it lacks the simple feature of specifying a command directly as an argument.
plink user#remote_machine -pw password /Applications/snowflake/table-updater/test2.sh
From your comments, it appears that the problem is actually in your script, not in how you are connecting. If you get "7za: command not found", your script is being executed successfully, but it fails because of a PATH problem.
At the prompt, any command you execute will receive a copy of your interactive environment. A self-sufficient script should take care to set up the environment for itself if it needs resources from non-standard locations. In your case, I would add the following before the 7za invocation:
PATH=$PATH:/Applications/snowflake/table-updater
which augments the standard PATH with the location where you apparently have 7za installed. (Any standard installation will take precedence, because we are adding the nonstandard directory at the end of the PATH -- add in front if you want the opposite behavior.)
In the general case, if there are other settings in your interactive .bashrc (or similar shell startup file) which need to be set up in order for the script to work, the script needs to arrange for this one way or another. For troubleshooting, a quick and dirty fix is to add . /Users/you/.bashrc at the top of the script (where /Users/you should obviously be replaced with the real path to your home directory); but for proper operation, the script itself should contain the code it needs, and mustn't depend on an individual user's personal settings file (which could change without notice anyway).
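As a sketch of that advice, with the temp directory and stub 7za standing in for the real install under /Applications/snowflake/table-updater:

```shell
# Create a stand-in "7za" in a temporary directory to play the role of
# the nonstandard install location from the question.
tooldir=$(mktemp -d)
printf '#!/bin/sh\necho "7za invoked: $@"\n' > "$tooldir/7za"
chmod +x "$tooldir/7za"

# Inside the script: append the nonstandard directory to PATH, after
# which the bare command name resolves. A system-wide 7za would still
# win, because the extra directory is searched last.
PATH=$PATH:$tooldir
7za x table-apps.zip -y
```

The same two lines (the PATH assignment and the 7za call) are what would go into test2.sh on the real machine.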
I want to download a zip file from the web, unzip it, and import the file into a SAS data set for further use.
For unzipping I use the SAS code below, but it produces no result and no error.
data _null_;
x "cd C:\Program Files\7-Zip ";
X "7z.exe e C:\Users\Ravinder\Downloads\Compressed\*.zip -o C:\sasdata\New";
run;
Please help to do the same.
ravinder kumar
Have you tried running your command from a command prompt? Did it work? I would recommend running the command like this rather than changing the folder first (note that 7-Zip's -o switch takes its output directory with no space after it):
"C:\Program Files\7-Zip\7z.exe" e C:\Users\Ravinder\Downloads\Compressed\*.zip -oC:\sasdata\New
When you run the above command from SAS use single quotes as your string delimiter for the command like so:
data _null_;
X '"C:\Program Files\7-Zip\7z.exe" e C:\Users\Ravinder\Downloads\Compressed\*.zip -oC:\sasdata\New';
run;
If it is still not working as expected from SAS, then the easiest way to debug, IMO, is to put the command in a batch file (.bat) and add a line at the end that simply says pause. The pause command tells Windows to keep the batch file window open until you close it, giving you full visibility of what happened when the command ran. Try running the batch file from SAS instead of running the command directly.
Note that you may also need to specify the following SAS options if you want SAS to wait for the unzip to finish before your SAS program continues processing:
options noxwait xsync;
noxwait tells SAS to close the command window automatically once the command has run, and xsync tells SAS to wait for the command to finish running before continuing to process. Note that sometimes when extracting files, the command can finish but the file will still not exist on disk, due to network copying and/or caching. If you want to be absolutely sure, you should probably check for the existence of the file and/or tell SAS to sleep() for a certain amount of time.
Here is my solution: the file downloads to the c:\sasdata location.
data _null_;
x "cd c:\sasdata";
x "curl -A 'Mozilla/4.0' -O http://nseindia.com/content/indices/histdata/CNX%20NIFTY&FRNAME-&FRNAME..csv";
run;
Short story: I need a method to get the write status of a file on a server (using bash) from a client (using a CMD batch file).
Long-time lurker, first-time poster. I did many searches on variations of what I'm looking for and have not yet found enough data.
I am writing a batch file in CMD (because the clients could be any Windows OS [XP and up] with unknown packages installed). The batch file uses PuTTY's plink to connect via SSH to the server. Once connected, plink executes a command to write data to a new file.
Once that file is written, I use PSCP to copy the file to the client.
So far, so good; I have successfully accomplished all of this.
The creation of that file is instantaneous, but the time it takes to write all of the data is unknown and variable. Therefore I need an automated method to determine when the file is complete before copying it. Simply using timeout/sleep for XX seconds is not feasible in my circumstances.
The approach I have taken so far (as yet unsuccessfully) is to repeatedly grab the file size using "stat -c '%s' filename" and run that in a loop until grab1 EQU grab2, indicating a complete file. I am finding this difficult because I can't get the output of stat into the CMD batch to process it.
Q1: Is this (stat result going into CMD for loop) the best approach? Maybe there's something existing in BASH?
Q2: if Q1 is true, any ideas on how to get the stat result into the CMD batch as a variable to parse/analyze the data?
Thanks in advance for suggestions and your time.
DCT
Have the command that writes the file write it under a temp filename. So if it will be called xyz.txt, have it written as tmpxyz.txt.tmp, with a rename as the final step.
That way you can just check for the presence of the named file.
It's usually a good idea to give the file a unique name, I find, probably incorporating the date and time.
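A minimal shell sketch of the write-then-rename pattern (the paths and data are illustrative; on the real server the writer would be your data-producing command):

```shell
dir=$(mktemp -d)

# Writer side: produce the data under a temp name, then rename. The
# rename is atomic on the same filesystem, so xyz.txt either doesn't
# exist or is complete -- never half-written.
printf 'line1\nline2\n' > "$dir/tmpxyz.txt.tmp"
mv "$dir/tmpxyz.txt.tmp" "$dir/xyz.txt"

# Reader side: existence of the final name is the "file is ready" signal,
# so no size-comparison loop is needed.
if [ -f "$dir/xyz.txt" ]; then
    echo "complete, safe to copy"
fi
```

On the client, the CMD batch then only has to ask plink whether xyz.txt exists yet, which sidesteps the problem of parsing stat output.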
I have a small perl script (found here) which adds command-line functionality to an application I already have installed, Coda. Basically, it opens a file with the application when I type:
coda filename.py
Where (on OS X) do I need to put this file to make it work? Do I need to do anything else to my environment to get this working?
Type echo $PATH at the terminal. You will get back a series of paths separated by colons. The script needs to be placed in one of those folders. It also needs to have the execute flag set, which is done with the chmod tool.