I am trying to download a file from a url using wget. If I do this in Terminal it works:
cd ~/Desktop/diffTest/tempFile
wget "http://www.testsite.com/diffTest/file1.html"
If I try this AppleScript I get an error:
do shell script "cd ~/Desktop/diffTest/tempFile"
do shell script "/usr/local/bin/wget 'http://www.testsite.com/diffTest/file1.html'"
Error message:
error "--2019-10-21 14:43:28-- http://www.testsite.com/diffTest/file1.html
Resolving www.testsite.com (www.testsite.com)... 66.96.xxx.31
Connecting to www.testsite.com (www.testsite.com)|66.96.xxx.31|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 369 [text/html]
file1.html: Permission denied
Cannot write to ‘file1.html’ (Permission denied)." number 3
I don't understand the error message, as I'm not trying to write to a file but to download one. There is no existing file to overwrite; the folder is empty and has read/write permission for everyone.
The second shell script line ignores the directory change because each do shell script call runs in its own fresh shell. You have to put both commands on one line; the semicolon is the command separator:
do shell script "cd ~/Desktop/diffTest/tempFile; /usr/local/bin/wget 'http://www.testsite.com/diffTest/file1.html'"
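A variant of the same one-liner, using && so that wget only runs if the cd actually succeeded (same path and URL as above; wrap it in do shell script the same way):

```shell
# '&&' runs wget only when the cd succeeds; ';' would run it regardless.
cd ~/Desktop/diffTest/tempFile && /usr/local/bin/wget 'http://www.testsite.com/diffTest/file1.html'
```

This guards against downloading into the wrong directory when the target folder is missing.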
I am a web dev trying to do a little bit of Linux admin and could use help. My server needs to retrieve a file daily from a remote location over sftp, name and date/time stamp it, and push it to a directory for archive.
I have adapted a shell script that I had working when doing this over ftp, but sftp is causing me some issues.
I can successfully connect to the server in Filezilla when I set the protocol to SFTP and the "Logon Type" to "Interactive", where it prompts for a password.
When I use the command line to call my script, it seems to resolve but hangs on the logging in step and provides the following error before retrying: "Error in server response, closing control connection. Retrying."
Here is the output:
https://i.imgur.com/dEXYRHk.png
This is the contents of my script where I've replaced any sensitive information with a placeholder in ALL CAPS.
#!/bin/bash
# Script Function:
# This bash script backs up the .csv every day (driven by the cron job) with a file-name time stamp.
#[Changes Directory]
cd /THEDIRECTORY
wget --no-passive-ftp --output-document=completed`date +%Y-%m-%d`.csv --user=THEUSER --password='THEPASSWORD' ftp://sftp.THEDOMAIN.com:22 completed.csv
Anyone wanna help a newb to get some of them internet points?! :-)
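For what it's worth, wget speaks FTP but not SFTP, so pointing an ftp:// URL at port 22 won't work. A hedged sketch using curl instead (curl supports sftp:// when built with SSH support; host, credentials, and paths are the placeholders from the question):

```shell
#!/bin/bash
cd /THEDIRECTORY

# Build the date-stamped archive name, e.g. completed2019-10-21.csv
stamped="completed$(date +%Y-%m-%d).csv"

# wget cannot do SFTP; curl can, if compiled with SSH (libssh2) support.
curl --user 'THEUSER:THEPASSWORD' \
     "sftp://sftp.THEDOMAIN.com/completed.csv" -o "$stamped"
```

Alternatively, the stock sftp client with key-based auth and a batch file avoids putting the password on the command line at all.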
I am a newbie at scripting and simply trying to utilize scp within a script to move a file to a remote system. I keep encountering errors within my code/or nothing happens/the file does not get copied.
I attempted multiple scripts but I feel I'm just not quite getting the language. In the code I've included, I am trying to scp the test.txt file to the remote system. I've also tried including a send "scp test.txt ${user}@XXXXXXXX.com" line as well.
#!/usr/bin/expect
set user "XXXXXX"
set password "XXXXXX"
log_file XXXX.txt
spawn /usr/bin/scp -f test.txt ${user}@XXXXXXXXXXX.com
expect "*assword"
send "${password}\n"
interact
I believe the file should be copied to the remote server but when I attempt to display it with ls -l I get nothing.
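For reference, a corrected version of that expect script might look like the sketch below. Note three fixes: scp needs an explicit destination path after the host, the separator is user@host, and -f should not be used (it is an internal flag scp passes to the remote side). The host and remote path here are placeholders, not values from the question:

```shell
#!/usr/bin/expect -f
set user "XXXXXX"
set password "XXXXXX"
log_file XXXX.txt

# scp SOURCE user@host:/remote/path  (the destination path is required)
spawn /usr/bin/scp test.txt ${user}@example.com:/home/${user}/test.txt
expect "*assword*"
send "${password}\r"
expect eof
```

expect eof (rather than interact) lets the transfer run to completion in a non-interactive script; \r is the usual line terminator for send.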
I'm trying to execute a bash file with putty/plink but gives me an error.
On Windows I've got this batch file:
E:\putty\plink.exe user@link -pw password -m E:\folder\test.sh
In the bash file I've got this:
#!/bin/bash
vtoff
vtadmin check connector /PCS/ConnectionModels/Arbor/
and the error:
C:\folder>e:\folder\test.bat
C:\folder>e:\putty\plink.exe user@link -pw password -m e:\folder\test.sh
ksh[4]: vtoff: not found
ksh[5]: vtadmin: not found
C:\folder>
The documentation for plink says that -m specifies that it should "read remote command(s) from file".
The file's lines are handed to the remote login shell, so #!/bin/bash is just a comment here, and your error output shows that the remote shell is ksh: bash never enters the picture.
As for your actual error message, it seems that the commands aren't found, probably because they're not in your PATH.
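Since plink feeds the file to a non-interactive ksh that may skip the usual profile, one hedged fix is to set PATH (or use absolute paths) inside the script itself. /opt/vt/bin below is an assumed install location, not something from the question:

```shell
# test.sh, run via plink -m; the remote shell is ksh, so the
# #!/bin/bash line is ignored. Prepend the tools' (assumed) directory.
PATH=/opt/vt/bin:$PATH
export PATH
vtoff
vtadmin check connector /PCS/ConnectionModels/Arbor/
```

Running `which vtoff` in an interactive session on the server will tell you the real directory to use.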
I use wget for simple things so don't scream if there is an obvious answer to my problem but here is an example of a wget for a simple image:
MBP:bin Mike$ wget http://www.mactricksandtips.com/wp-content/uploads/main_page_images/terminal-small.png
--2011-04-25 12:48:05-- http://www.mactricksandtips.com/wp-content/uploads/main_page_images/terminal-small.png
Resolving www.mactricksandtips.com... 209.20.76.249
Connecting to www.mactricksandtips.com|209.20.76.249|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 4432 (4.3K) [image/png]
terminal-small.png: Permission denied
Cannot write to `terminal-small.png' (Permission denied).
MBP:bin Mike$
Any suggestions on why it's not simply writing it to my computer? This happens for every single wget request I make...
If the file already exists on your machine, check its permissions; you may not have the rights to overwrite it. Also check the permissions on the containing directory to see whether you're allowed to write there at all.
You don't seem to have write permissions to that folder. I'm not sure what distro you use, but prepending sudo might do the trick:
sudo wget http://www.mactricksandtips.com/wp-content/uploads/main_page_images/terminal-small.png
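If you'd rather not elevate with sudo, a hedged alternative is to download into a directory you do own. wget's -P (--directory-prefix) flag sets the target directory; ~/Downloads below is just an example location:

```shell
# Save the file under a directory the current user can write to,
# instead of the (apparently read-only) current directory.
wget -P ~/Downloads http://www.mactricksandtips.com/wp-content/uploads/main_page_images/terminal-small.png
```

This sidesteps the permission problem without changing ownership of the bin directory the prompt shows you are sitting in.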
I have developed a shell script to copy the files from source to destination and simultaneously to delete the copied files in source. I can copy the files but the files cannot be deleted in source side.
files='ls filename'
for file in $files
do
ftp -vn $hostname <<EOFD
quote USER $username
quote PASS $password
binary
cd $start_dir
rm -rf $file
quit
EOFD
done
I got the error 'No such file or directory'.
Putting ftp outside the for loop also gave me an 'invalid command' error.
I also tried ssh, but it prompts for a username and password.
files=`ls filename`
Put backticks, not single quotes, around the command to capture its output.
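The difference matters: single quotes store the literal text, while backticks (or the modern $( ) form) store the command's output. A quick illustration:

```shell
lit='echo hello'     # the literal string "echo hello"
bt=`echo hello`      # the output of the command: "hello"
mod=$(echo hello)    # same as backticks, but easier to nest and read
echo "$lit"          # prints: echo hello
echo "$bt"           # prints: hello
```

$( ) is generally preferred over backticks in new scripts.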
As for "I also tried ssh but it prompts for a username and password": see how to set up SSH login without a password.
Scripting FTP by piping commands directly into ftp is usually a bad idea: there is no error handling, things can go completely wrong, and you have no way to control it. If at all possible, use a saner, scriptable command-line client such as lftp or curl.
Also, it's a very bad idea to iterate over files using
files=`ls files`
for file in $files
A slightly better solution is:
for file in *
but it doesn't scale either: if the expansion of * (or the ls output) is handed to an external command and exceeds the argument-length limit, it will fail. A more scalable pattern is something like:
find . | while read -r file; do
    do_something_with "$file"
done
...and yet it's probably not what you want either. In fact, if you just want to transfer files from source to destination and then delete them at the source, you can use lftp's mput command with the -E option (which deletes each file after a successful transfer), or something similar with rsync --remove-source-files.
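With either of those tools the whole loop collapses to a one-liner. A hedged sketch, reusing the variables and the literal "filename" pattern from the question (user@host in the rsync line is a placeholder):

```shell
# lftp: upload matching files; -E deletes each local file after a clean transfer
lftp -u "$username,$password" "$hostname" \
     -e "cd $start_dir; mput -E filename; quit"

# or, if the destination is reachable over ssh: copy, then drop the sources
rsync --remove-source-files -av filename user@host:/remote/dir/
```

Both report real errors and stop, which the heredoc-into-ftp approach cannot do.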
Foolproof solution:
Replace the line
rm -rf $file
with
!rm -rf $file
This is because, at that point in the script, you are on the ftp console until the EOFD marker is reached; to run a command on the local system (the source), you need to prefix it with !.
Best way to test is manually executing the commands. Here's what I have tested:
mtk4@laptop:~$ ftp XXX.XXX.XXX
Connected to static-XX-XX-XX-XX.XXXXXX.com.
220---------- Welcome to Pure-FTPd [privsep] [TLS] ----------
220-You are user number 2 of 50 allowed.
220-Local time is now 07:52. Server port: 21.
220-IPv6 connections are also welcome on this server.
220 You will be disconnected after 15 minutes of inactivity.
Name (XXXXXX.XXX:XXX): XXXXXXX
331 User XXXXXXX OK. Password required
Password:
230 OK. Current restricted directory is /
Remote system type is UNIX.
Using binary mode to transfer files.
ftp> lcd test/
Local directory now /home/mtk4/test
ftp> pwd
257 "/" is your current location
ftp> !pwd
/home/mtk4/test
ftp> !ls
sample.txt
ftp> !rm sample.txt
ftp> !ls
ftp> bye
221-Goodbye. You uploaded 0 and downloaded 0 kbytes.
221 Logout.
mtk4@laptop:~$
Alternatively, after the ftp session has completed, run the same for loop again over the same set of files and delete them locally.
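That second approach can be sketched like this, reusing the (fixed) $files listing from the question. It deletes the local copies only after the entire ftp transfer has finished:

```shell
# After the ftp session above has completed successfully,
# iterate over the same listing and remove the local copies.
files=`ls filename`
for file in $files
do
    rm -f "$file"
done
```

The quoting around "$file" matters: it keeps names containing spaces from being split into separate rm arguments.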