Rerouting SAS logs to a specific location from a shell script

I have created a shell script that runs a SAS program; the log is created in the same folder where I run the shell script, but I want to save the logs to a specific folder on Linux. I used the -log option and it is throwing an error. I'm running the following command in my shell script:
/saspath/sas /homesas/test.sas -log home/sasu1/log/test.log.$rundatetime \
I'm getting this error... -log: command not found

It is normal Unix convention to put the options (-log) before the parameters (filename) to a command.
sas -log xxx.log xxx.sas
Your real problem might be that you need to construct your log filename first.
pgm=test
log=${pgm}.${rundatetime}.log
sas -log $log $pgm.sas
Another thing to check is that some sites have built wrapper scripts to launch SAS which do not properly pass the command-line arguments through to the actual command that starts SAS. Check whether /saspath/sas is the actual command provided by SAS or something your local IT group created.
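Putting the pieces together, a wrapper script might look like the sketch below. The paths and the rundatetime format are assumptions carried over from the question; adjust them for your site.
#!/bin/sh
# Build a timestamped log name first, then pass it to SAS with -log before the program name.
# /saspath/sas, /homesas and /home/sasu1/log are taken from the question; adjust as needed.
rundatetime=$(date +%Y%m%d_%H%M%S)
pgm=test
log=/home/sasu1/log/${pgm}.log.${rundatetime}
/saspath/sas -log "$log" /homesas/${pgm}.sas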


Is there a command in shell scripting for executing an .exe file and running commands inside of it automatically, replacing the user interaction?

I have a .sh script file that I'm modifying which runs an .exe file that opens the Windows command-line prompt automatically.
This .exe asks the user for an input (the name of the file in the workspace folder that it will read).
I want to automate this step in my shell script so my user doesn't have to interact with it, and run the commands automatically.
I read a bit about the expect command, but I think that is for Linux only.
Can someone help me? I'm pretty new to shell scripting and I couldn't find any useful information elsewhere.
I'm assuming that your executable accepts command-line arguments. So, here we go.
You can use the "start" command in Windows Shell. For example:
start C:\path\to\program.exe -argument
If you want the script to wait until the .exe file finishes running before continuing, you can add the /wait switch:
start /wait C:\path\to\program.exe -argument
If that doesn't work, try:
start myprogram.exe /command1 /command2 /command3
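As a sketch (the program path, argument, and input file name are placeholders, and whether the piping variant works depends on the .exe reading its prompt from standard input):
@echo off
rem If the .exe accepts the file name as a command-line argument
rem (the first quoted string after start is the window title, hence the empty ""):
start "" /wait C:\path\to\program.exe C:\workspace\input.txt
rem If it only asks for the name at a prompt, piping the answer may work:
echo input.txt | C:\path\to\program.exe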
Hope it helps,

How to save cmd output as file name

I'm running a command-line script on multiple PCs and I'm trying to save the username as the file name so I can see whose information I'm viewing later on.
In the script I run whoami and I'd like to save the output as "user"."file type". I'm doing this from the command line because I always do it manually there, and I'm trying to automate the process so I can do it faster.
If you know a better way to do it, do share.
whoami > test.txt
redirects the output to a file; "test.txt" will be generated wherever your cmd current directory is.
You may use Windows Environment variables %USERNAME% and possibly %USERDOMAIN% if the domain is needed.
%USERNAME% does not return the domain by itself.
Full list of standard environment variables: How-to: Windows Environment Variables
Use these in the command as needed. For example:
dir > %USERNAME%.txt
If you need the domain in there:
dir > %USERDOMAIN%_%USERNAME%.txt
(using _ to separate domain and username instead of \, since a filename cannot contain \)
Remember to use >> instead of > if you don't want the file to be overwritten each time the command is run.
You may also want to redirect errors and standard output as needed: Redirecting error messages from Command Prompt: STDERR/STDOUT
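For example, a minimal batch sketch (systeminfo stands in here for whatever commands you normally run by hand; swap in your own):
@echo off
rem Write the output to a file named after the domain and user, capturing errors too.
rem Use >> instead of > if you want to append on repeated runs.
systeminfo > %USERDOMAIN%_%USERNAME%.txt 2>&1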

Run a remote bash script on a Mac using PuTTY

I want to run a bash script on a mac remotely from a batch script on a windows machine. On Windows I have this:
@echo off
echo bash /applications/snowflake/table-updater/test2.sh; exit>tmp_file
putty -ssh User@remote_machine -pw password -m tmp_file
And here is test2.sh on the remote machine
#!/bin/bash
# test2.sh
#
#
7za x table-apps.zip -y -o/Applications/snowflake/applications
When the batch file runs it logs in successfully, but for some reason it fails to run the bash file. However, the bash file runs fine from the Mac terminal, where it unzips the files perfectly. What could be happening here?
Please note test2.sh is actually in Applications/snowflake/table-updater as specified in the batch file, and the tmp file gets written fine as well. My aim is to have a script that accesses a further 10 remote machines with the same directory structure.
Thanks in advance
The program in the PuTTY suite which resembles the scriptable Unix command ssh is called plink, and it is probably the recommended tool here. The putty program adds a substantial terminal-emulation layer which is unnecessary for noninteractive scripting (drawing of terminal windows, managing their layout, cursor addressing, fonts, etc.), and it lacks the simple feature of specifying a command directly as an argument.
plink user@remote_machine -pw password /Applications/snowflake/table-updater/test2.sh
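Since you mention reaching a further 10 machines with the same directory structure, a batch sketch along these lines may help (the host names are placeholders, and keeping a plain-text password in the script is not ideal; a key loaded in Pageant would be safer):
@echo off
rem Hypothetical host names; all machines share the same directory layout.
for %%H in (mac01 mac02 mac03) do (
    plink user@%%H -pw password /Applications/snowflake/table-updater/test2.sh
)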
From your comments, it appears that the problem is actually in your script, not in how you are connecting. If you get 7za: command not found your script is being executed successfully, but fails because of a PATH problem.
At the prompt, any command you execute will receive a copy of your interactive environment. A self-sufficient script should take care to set up the environment for itself if it needs resources from non-standard locations. In your case, I would add the following before the 7za invocation:
PATH=$PATH:/Applications/snowflake/table-updater
which augments the standard PATH with the location where you apparently have 7za installed. (Any standard installation will take precedence, because we are adding the nonstandard directory at the end of the PATH -- add in front if you want the opposite behavior.)
In the general case, if there are other settings in your interactive .bashrc (or similar shell startup file) which need to be in place for the script to work, the script needs to set them up one way or another. For troubleshooting, a quick and dirty fix is to add . /Users/you/.bashrc at the top of the script (where /Users/you should obviously be replaced with the real path to your home directory); but for proper operation, the script itself should contain the code it needs, and it mustn't depend on an individual user's personal settings file (which could change without notice anyway).
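A revised test2.sh along those lines might look like this sketch (the 7za location is an assumption; point the PATH addition at wherever 7za actually lives, and the cd assumes table-apps.zip sits next to the script):
#!/bin/bash
# Self-contained version: set up PATH and the working directory explicitly
# instead of relying on the interactive login environment.
PATH=$PATH:/Applications/snowflake/table-updater
cd /Applications/snowflake/table-updater || exit 1
7za x table-apps.zip -y -o/Applications/snowflake/applications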

Run a DXL script in the background (command line) : DXL/DOORS

I am trying to start a DXL script from the command line, but I am getting lots of warnings and errors.
When I run the script from the DOORS GUI it works fine, but when I run it from the command line without the GUI, it doesn't.
(The screenshot of the warnings is not reproduced here.)
Here is the commandline script :
"%ProgramFiles%\IBM\Rational\DOORS\9.3\bin\doors.exe" -d 36677#bie -u "xxx yyy" -P don -b "d:\workset\mc\addins\Devel\exporterRTF.dxl"
Why doesn't it work from the command line? Any help or ideas are appreciated.
EDIT:
The files involved (posted as links in the original question) are: myprogram.dxl, the script I am trying to run; a first file that myprogram.dxl includes; and a second file that myprogram.dxl includes.
There are other settings you need in order to run in batch mode (the following is pulled from the DOORS help):
Runs Rational DOORS in batch mode. Rational DOORS starts without the GUI (it suppresses the login screen and the database explorer), runs the specified DXL program, and then stops.
In batch mode you normally need other switches like -user, -password and -project to log in and specify the current project.
The parameter of the -batch switch specifies the file that contains the DXL program that you want to run in batch mode.
You probably need a current project specified. Also you may need to add a command at the end of your script to exit DOORS if you don't want the session to stay open.
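Putting those switches together, the invocation might look something like this sketch (the project name is a placeholder; check the exact switch names against the help for your DOORS version):
"%ProgramFiles%\IBM\Rational\DOORS\9.3\bin\doors.exe" -d 36677@bie -user "xxx yyy" -password don -project "MyProject" -batch "d:\workset\mc\addins\Devel\exporterRTF.dxl"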
The errors that you list seem like regular DXL errors, so if you need more assistance than this, you will need to post some of the code.
EDIT:
If you put all of the files into one, does it run? Another option may be to include the Addins path on your command line. I believe the issue is that batch mode is not recognizing the included files as part of the same scope.

Running a Perl script from a Windows scheduled task

I have awstats installed on windows 2008 server.
I schedule the Updatestats.bat file to run every day. The task runs fine without error, but the script is not being executed, or it is throwing an error that I cannot see.
-- If I run the bat file directly from command line then it works fine. --
I have tried various alternatives to the Windows scheduler, such as nncron and Freebyte Task Scheduler; nncron had the same issue, and the Freebyte app worked, but sadly it does not run as a service so it is of no use.
Here are the contents of the bat file; all lines look like this:
c:\strawberry\perl\bin\perl.exe D:\AWStats\wwwroot\cgi-bin\awstats.pl config=earlsmere.co.uk -update
Anyone got any ideas?
Your unattended environment is obviously different from your command line. Check if the following are set:
Script's working directory, if it reads anything from it or uses relative paths.
PERL*_LIB environment variables if your script uses any modules.
PATH environment variable, if your script calls any external scripts/binaries.
The user that runs the scheduled task has sufficient rights for everything you want to do.
As a quick workaround you can set them directly in the script using the chdir function, the lib module, and the $ENV{PATH} entry.
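Alternatively, you can pin the environment from the batch file itself before calling perl; a sketch based on the command in the question:
@echo off
rem Give the scheduled task the same environment as an interactive prompt:
rem a fixed working directory and Strawberry Perl on the PATH.
cd /d D:\AWStats\wwwroot\cgi-bin
set PATH=C:\strawberry\perl\bin;%PATH%
c:\strawberry\perl\bin\perl.exe awstats.pl config=earlsmere.co.uk -update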
You can also try to capture standard output and error with the following redirections before you start doing anything else:
open(STDOUT, '>>', '/full/path/to/out.log') || die "Error stdout: $!";
open(STDERR, '>>', '/full/path/to/err.log') || die "Error stderr: $!";
Note that you really should use full paths there, in case you indeed have the working directory set wrong. And make sure the target directory/file is writable by the account the task runs as.
Looks like the output gets lost in space...
I suggest redirecting the output of the command to a file, like this:
c:\strawberry\perl\bin\perl.exe D:\AWStats\wwwroot\cgi-bin\awstats.pl config=earlsmere.co.uk -update > c:\my_log.txt 2>&1
(courtesy of Anders Lindahl: Redirect stdout and stderr to a single file in DOS)
