I am using Terminal on my Mac for SSH access and it is great. But is there any way for me to transfer files to and from the remote server that I SSH into, from the Mac?
Thanks
scp is your friend, enough said :)
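A minimal example, assuming your login is user and the server is example.com (both placeholders):
scp user@example.com:/path/to/remote_file .    # download into the current local directory
scp local_file user@example.com:/path/to/      # upload a local file to the server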
(I realize this is a late reply, but I just stumbled upon this question and thought I'd contribute a tip...)
A quick & dirty way of transferring files over Terminal is:
On the remote side:
cat $file | openssl enc -base64
This will output a bunch of uppercase/lowercase/digit characters which represent the Base64-encoded binary data. Select & copy this block of text.
Then, in a separate Terminal window on your local machine:
pbpaste | openssl enc -base64 -d > $file
This will pipe the contents of the clipboard (the Base64-encoded data) to the openssl program (which is set to decode via the -d flag), and save the results in $file.
This works best for small files, and isn't terribly fast. I use it when I'm too lazy to construct a command line for scp or sftp. For larger/multiple files, you'll definitely want to use the latter two.
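If you can SSH to the remote machine directly from your Mac, the same openssl trick can be collapsed into a single command that skips the clipboard entirely (user, host and paths below are placeholders):
ssh user@example.com 'openssl enc -base64 < /path/to/remote_file' | openssl enc -base64 -d > local_file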
I have an external drive with an encrypted disk image that I've forgotten the password to, but I know it's a combination of 4-5 passwords I was using at the time it was encrypted. I generated a list of all the combinations I think it could be (with the character variations I use, such as # for 'a'), and I'd like to automate trying each of them, as there are several hundred guesses.
I'm using the following to try to mount the disk from Terminal on the Mac, and I know it works using a test disk image I made:
echo -n 'Password' | hdiutil attach -stdinpass /path/Disk.dmg
So, is there a way I can pass the list of password guesses to Terminal instead of having to paste them in one at a time? I've considered using Automator on the Mac, but it won't take the code above in a "do shell script" command.
Thanks for your help!
while IFS= read -r line; do echo -n "$line" | hdiutil attach -stdinpass /path/DMG.dmg; done < yourfile.txt
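A slightly fuller sketch, if it helps: it assumes one guess per line in yourfile.txt and relies on hdiutil returning a non-zero exit status when the password is wrong, so it stops at the first guess that mounts the image:
#!/bin/sh
# Try each password guess until one successfully attaches the image
while IFS= read -r pass; do
    if printf '%s' "$pass" | hdiutil attach -stdinpass /path/Disk.dmg >/dev/null 2>&1; then
        echo "Password found: $pass"
        break
    fi
done < yourfile.txt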
I use Secure Shell as a Chrome extension (https://chrome.google.com/webstore/detail/secure-shell/pnhechapfaindjhompbnflcldabbghjo?hl=da)
Now I have finished some programming and I need the file on my computer.
How can I transfer the file to my computer? Can I send it by email or something?
I have tried yanking all the lines in vim, but I still don't get them copied to my Windows clipboard.
One entertaining (and moderately ridiculous) approach would be sprunge.
Run this on the remote machine:
cat myFile | curl -F 'sprunge=<-' http://sprunge.us
And then visit the URL it prints on your local machine. :D
I presume that you are using Windows and trying to download your file from a Linux-like OS.
You can use MobaXterm, which comes with file transfer features.
http://mobaxterm.mobatek.net
On the CLI you can use "scp" to download and upload.
Another option is FileZilla, using the SFTP protocol.
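If you would rather stay on the command line, the sftp client that ships with OpenSSH also works; a quick sketch (user, host and filenames are placeholders):
sftp user@example.com
Then, at the sftp> prompt:
get /remote/path/myprogram.py
put localfile.txt
bye
get downloads the file into your current local directory; put uploads one back to the server.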
I have a list of SWF files that I want to download to my PC. Is there a quick way to do this from my server, like using wget or something similar? Each file is listed on a new line:
http://super.xx-cdn.com/730_silvxxxen/v/v.swf
http://super.xx-cdn.com/730_sixxxxheen/73xxxxversheen.swf
http://super.xx-cdn.com/730_rxxxd/v/v.swf
There are thousands of lines.
If you use SSH over PuTTY to access your server, you can easily use WinSCP alongside PuTTY;
otherwise you can also use pscp.
If you do not have PuTTY installed, get it and set up an SSH connection to your server.
Another easy way to download them is to get an FTP client and download them over FTP.
You can use a simple sh script, if I understand your question correctly:
#!/bin/sh
while IFS= read -r line
do
wget "$line"
done < "urls.txt"
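For what it's worth, wget can also read the list of URLs directly, which avoids the loop entirely (assuming your list is saved as urls.txt):
wget -i urls.txt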
I want to download a huge file from an ftp server in chunks of 50-100MB each. At each point, I want to be able to set the "starting" point and the length of the chunk I want. I won't have the "previous" chunks saved locally (i.e. I can't ask the program to "resume" the download).
What is the best way of going about that? I use wget mostly, but would something else be better?
I'm really interested in a pre-built/in-built function rather than using a library for this purpose... Since wget/ftp (also, I think) allow resumption of downloads, I don't see why that would be a problem... (I can't figure it out from all the options though!)
I don't want to keep the entire huge file at my end, just process it in chunks... FYI all - I'm having a look at "continue FTP download after reconnect", which seems interesting.
Use wget with the -c option. Extracted from the man page:
-c / --continue
Continue getting a partially-downloaded file. This is useful when you want to finish up a download started by a previous instance of Wget, or by another program. For instance:
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
If there is a file named ls-lR.Z in the current directory, Wget will assume that it is the first portion of the remote file, and will ask the server to continue the retrieval from an offset equal to the length of the local file.
For those who'd like to use command-line curl, here goes:
curl -u user:passwd -C - -o <partial_downloaded_file> ftp://<ftp_path>
(leave out -u user:passwd for anonymous access)
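Since you want to pick an explicit starting offset and chunk length rather than just resume, curl's --range (-r) option may be closer to what you're after; it requests a specific byte range from the server. For example, to fetch the second 100 MiB of the file (the offsets here are just for illustration):
curl -u user:passwd -r 104857600-209715199 -o chunk2.part ftp://<ftp_path>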
I'd recommend interfacing with libcurl from the language of your choice.
I'm trying to figure out a way to use tar and pipes on Ubuntu Server LTS.
I've got a postgresql command (pg_dump) that outputs lots of sql on the standard output:
pg_dump -U myUser myDB
I know how to redirect that to a file:
pg_dump -U myUser myDB > myDB.sql
In order to save some disk space, I would rather have it compressed: I could make a tar.gz file from that myDB.sql, and then delete myDB.sql.
But I was wondering - is there a way of doing this without creating the intermediate .sql file? I believe this could be accomplished with pipes... however I'm no shell guru, and know very little about them (I'm able to do ls | more, that's all). I've tried several variations of pg_dump .. | tar ... but with no success.
How can I use a pipe to use the output of pg_dump as an input for tar? Or did I just get something wrong?
I don't see how "tar" figures into this at all; why not just compress the dump file itself?
pg_dump -U myUser myDB | gzip > myDB.sql.gz
Then, to restore (a plain-text dump like this is fed to psql, not pg_restore):
gzip -cd myDB.sql.gz | psql ...
The "tar" utility is for bundling up a bunch of files and directories into a single file (the name is a contraction of "tape archive"). In that respect, a "tar" file is kind-of like a "zip" file, except that "zip" always implies compression while "tar" does not.
Note finally that "gzip" is not "zip." The "gzip" utility just compresses; it doesn't make archives.
In your use case pg_dump creates only a single file, which needs to be compressed. As others have hinted, in *nix land an archive is a single file representing a filesystem. In keeping with the Unix ideology of one tool per task, compression is a separate task from archival. Since an archive is a file, it can be compressed, as can any other file. Therefore, since you only need to compress a single file, tar is not necessary, as others have already correctly pointed out.
However, your title and tags will bring future readers here who might be expecting the following...
Let's say you have a whole folder full of PostgreSQL backups to archive and compress. This should still be done entirely using tar, as its -z or --gzip flag invokes the gzip tool.
So let's also say you need to encrypt your database archives in preparation for moving them to a dubiously secured offsite backup solution (such as an S3-compatible object store). And let's assume you like pre-shared token (password) encryption using the AES cipher.
This would be a valid situation where you might wish to pipe data to and from tar.
Archive -> Compress -> Encrypt
tar cz folder_to_encrypt | openssl enc -aes-256-cbc -e > out.tar.gz.enc
Decrypt -> Uncompress -> Extract
openssl enc -aes-256-cbc -in ./out.tar.gz.enc -d | tar xz --null
Do refer to the GNU tar documentation for details of how the --null flag works and more useful examples for other situations where you might need to pipe files to tar.
tar does not compress; what you want is gzip or a similar compression tool.
Tar takes filenames as input. You probably just want to gzip the pg_dump output like so:
pg_dump -U myUser myDB | gzip > myDB.sql.gz