I am working on building an application that will pull data from an SFTP server.
Basically I have written a shell script that runs daily as a cron job.
Now I want to know if I can implement some logic in the shell script that will scan the files (for security threats such as viruses, worms, Trojans, adware, etc.) before downloading them. Is that possible, and how?
You basically want to "remote control" your SFTP session. There are several ways to do this, but I don't see how you can scan a file for a virus using the (S)FTP protocol without downloading it. That would require executing a program on the remote side, and as far as I know, FTP does not support this.
Maybe ssh is the tool of choice. First open an ssh session, do all the checking of the files on the remote side, then transfer them over SFTP. If you are really paranoid, you can calculate the MD5 sum of the file before the check and after the download, and verify that they are identical.
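For example, a minimal sketch of that idea, assuming ClamAV's clamscan is installed on the remote host (user, host, and paths are placeholders):

# Scan the file on the remote side; clamscan exits 0 only when nothing is found
ssh user@remotehost 'clamscan --infected /outgoing/data.csv' &&
scp user@remotehost:/outgoing/data.csv .

# Integrity check: compare the remote checksum with the local one
ssh user@remotehost 'md5sum /outgoing/data.csv'
md5sum data.csv

This works only because ssh can execute a program (the scanner) on the remote side, which plain (S)FTP cannot.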
Related
So I finished writing my first program and I'm trying to upload it to my SourceForge account, but the file size is too big for the normal HTML5 upload manager SourceForge provides. SourceForge says "For larger files, use FTP, SCP, or rsync". They also provided this page as a reference for what commands to send via the command line, like this one, which I had no idea how to use:
scp file.zip jsmith@frs.sourceforge.net:/home/frs/project/fooproject/Rel_1
Should I be able to run this through the WinSCP.com prompt like so?
WinSCP> scp file.zip jsmith@frs.sourceforge.net:/home/frs/project/fooproject/Rel_1
Seeing as how the command line kicked my butt (I'm totally open to learning how to use the command line for file transfers to SourceForge if you have any links to tutorials; that one was too difficult to understand with all the broken English :/ ), I tried creating a connection with the WinSCP GUI and the following info:
Host: myusername@frs.sourceforge.net
Username: MyUsername
Password: MyPassword
but I get the following error before the SCP connection finishes:
Connection has been unexpectedly closed. Server sent command exit
status 1.
Error skipping startup message. Your shell is probably incompatible
with the application (bash is recommended).
Any help or a nudge in the right direction would be greatly appreciated. From what I've gathered I should learn more about shells, but I have no idea where to begin. Thanks in advance and cheers!
You are using the SCP protocol with WinSCP. Make sure you use SFTP instead.
It appears that you're trying to do something weird with WinSCP. You're trying the "Open Terminal" option, which, I think, just tries to open an ssh session on the host. But we don't support ssh sessions to frs.sourceforge.net.
Normally, if you try to ssh to that host, you get this message:
Welcome!
This is a restricted Shell Account.
You can only copy files to/from here.
Connection to frs.sourceforge.net closed.
But I think WinSCP eats that, or something.
Anyway, I think what you need to do is just use WinSCP as a graphical two-pane sftp client. Navigate to the file to upload on the left, and to the destination directory on the right, etc.
In general, the best place to ask this kind of question is one of the following three:
The sourceforge channel on irc.freenode.net
Email support at sfnet_ops@geek.net
Open a ticket at https://sourceforge.net/p/forge/site-support/new/
While we do sort of monitor Stack Overflow, it's not our fastest support channel. I've asked one of our support engineers to take a look.
Here is how to connect:
Select the SCP protocol.
Click Advanced.
Under Environment --> SCP/Shell --> Shell, change the shell to one that is available on the target server.
With that setting, WinSCP can even connect to Windows 10's ssh server using Git Bash.
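If the goal is the original scripted upload rather than the GUI, WinSCP also has a command-line scripting mode. A minimal sketch, reusing the placeholder account and project path from the question (the ^ characters are cmd.exe line continuations; WinSCP will ask you to verify the server's host key on first connect):

winscp.com /command ^
  "open sftp://jsmith@frs.sourceforge.net/" ^
  "put file.zip /home/frs/project/fooproject/Rel_1/" ^
  "exit"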
I need to move entire directories from one computer on the network to another (in a platform-independent way). Basically I am working on an automation tool to help the developers do Build Verification Tests; for this, I am directed to automate the installation and un-installation of the product on multiple platforms. So, I will need to copy the files first!
And this is where I need some help, both conceptual and practical.
Firstly, let me mention that using something like FileZilla or WinSCP is out of the question, since I need things to happen automatically and not through button clicks. But please let me know if these tools have any command-line utilities!
I tried Perl's Net::FTP, and while it looked promising, I was wondering whether it was the best way to go. Also, I want to know the prerequisites before I can run FTP: would I need Perl installed on the other end as well? I constantly read that the commands from Perl's FTP module actually try to connect to an FTP host; does this mean it's not going to work if I haven't configured the remote host in some way? And if so, what is this extra piece of configuration?
Apart from this, is there any other way I could solve my problem? I am looking for APIs here that would help me do platform-independent file transfers. But once again, I cannot use tools that need button clicks, because I am doing automation and everything needs to be done programmatically.
Also, I think this is a very generic problem statement: "moving files between computers connected by a LAN". So it would be wonderful if we could collect a list of (possibly) many options (ways to solve the problem) as answers to this post.
Thanks in advance for any help that you wish to provide.
If nearly all of the files in your directory have changed, creating an archive, sending it over the network, and unarchiving makes sense. If your LAN is fast enough, though, it may be faster not to compress the archive; just use plain tar.
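For example, a minimal sketch that streams an uncompressed tar archive straight over ssh, so no intermediate file is written on either side (user, host, and paths are placeholders):

tar cf - ./mydir | ssh user@host 'tar xf - -C /destination'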
If only some of the files have changed, rsync, a command-line tool, will download only the changes. It can be used with ssh like this:
rsync -ae ssh username@hostname:/path/to/files /store/here/locally
http://www.thegeekstuff.com/2010/09/rsync-command-examples/
On Linux and OS X, cron and crontab allow you to schedule scripts to run periodically. Windows provides the Windows Task Scheduler.
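For example, a crontab entry that runs a (placeholder) transfer script every night at 2:00:

0 2 * * * /home/user/bin/transfer_build.sh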
FTP is fine if you don't care about encryption over your LAN. Otherwise, SSH would be preferable.
rsync is available on OS X and Linux, but I think you can use it on Windows through Cygwin.
I suggest making an archive (e.g. a .tar.gz file) on the source host, transferring it with scp, and unarchiving it on the target host.
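A minimal sketch of that flow, with placeholder paths and host names:

tar czf build.tar.gz ./build                                  # archive on the source host
scp build.tar.gz user@target:/tmp/                            # transfer it
ssh user@target 'tar xzf /tmp/build.tar.gz -C /opt/deploy'    # unarchive on the target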
You could also use unison or rsync.
I would suggest developing your own FTP client in .NET. This way you will have complete control over the application, and instead of button clicks you can schedule it using the Windows Task Scheduler. Here is an article about how to create your own FTP client in VB.NET:
http://dot-net-talk.blogspot.com/2008/12/how-to-create-ftp-client-in-vbnet.html
Are there any FTP programs which can automatically copy (or rather 'move') the contents of a folder to a remote server? I have of course googled this but only really found one or two ancient products which look really clunky and unmaintained. I was wondering if there's a way to do this from the command line or any better solution to the base problem.
In more detail, new files get written to a folder every few hours. These new files need to be FTP'd elsewhere and then deleted. Mirroring or synchronisation systems are probably out of the picture, as we need to delete the source files once they've been successfully transferred.
If it's easier, the 'solution' could pull the files off the server (rather than the server pushing them to the client). Both computers will be running Windows.
You could use any off-the-shelf FTP program that supports the command line, and schedule a task in the Windows Task Scheduler to run every 10 minutes: check the folder, and move any files to the FTP site.
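As a rough sketch with the stock Windows ftp.exe client (host, credentials, paths, and file mask are placeholders; -i disables per-file prompts and -s: feeds it a script; note that ftp.exe's exit code does not reliably reflect transfer success, so a production version needs extra verification before deleting anything):

rem move.bat - upload the new files, then delete the local copies
ftp -i -s:upload.txt
del C:\outbox\*.dat

where upload.txt contains:

open ftp.example.com
myuser
mypassword
binary
lcd C:\outbox
mput *.dat
bye

Scheduling it every 10 minutes:

schtasks /create /sc minute /mo 10 /tn FtpMove /tr C:\scripts\move.bat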
In the end I used a program called FTP Auto Sync: http://ftp-auto-sync.com/
My interactive 32-bit Windows app (now moving from Delphi [Ent] 2007 to 2009) uses command-line interactions to spawn child processes that do computationally-intensive tasks, which in turn write text files that the GUI parent app parses and analyzes - resulting in an interactive graphical display of the results.
I have access to a multiprocessor (multi-user) Linux cluster (via ssh), and would like to off-load the heavy lifting to that cluster. My question is how to spawn the processes in Linux from my Windows app. I can envision using secure FTP to put and get files, but not sure how to spawn the child processes in Linux.
Some leads for further reading would be fine - but code/pseudocode would be ideal. I can imagine that this may be more about Windows-Linux interaction than Delphi.
If you have access to ssh, one option is to issue commands through that.
For example:
ssh user@host ls -l ~
will show the files in the user's home directory in the ssh terminal. I'm not sure if this is what you really want, but it would likely work.
If you do this, you almost certainly want to set up SSH passwordless logins.
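A minimal sketch of that setup, assuming an OpenSSH client is available (from Windows you would use PuTTY's puttygen and Pageant instead):

ssh-keygen -t rsa                # generate a key pair, accept the defaults
ssh-copy-id user@host            # append the public key to the remote authorized_keys
ssh user@host 'echo it works'    # should now run without a password prompt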
However, a more ideal solution would likely be to set up a daemon on the Linux boxes whose sole job is to run specific long-running tasks in the background and let you fetch the results later.
You're going to have to install something on the Linux machine to run the process. You might find some kind of clustering or batch job submission API you can install and access from Windows. You might have to code a custom server. You might be able to run everything over ssh if you can drive an ssh process from Windows and if you have sshd installed on the Linux side. But my preference would be to write a webservice or simple CGI script on the Linux side designed to take your arguments and data and return the result over plain old http (or https as the case might be).
One way or another, this is going to encompass more than just coding on the Windows side.
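As one illustration of the CGI route, a minimal hypothetical shell wrapper that the Linux web server's cgi-bin could expose; the compute binary and its parameter handling are placeholders:

#!/bin/sh
# runjob.cgi - run the compute task with the query string as its argument
# and stream the text output back to the HTTP caller
echo "Content-Type: text/plain"
echo ""
/opt/jobs/compute "$QUERY_STRING"

The Windows app then just issues an http request and parses the response, instead of managing ssh sessions.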
I would download the full PuTTY package.
As well as the excellent secure shell terminal, it includes PSCP to transfer files securely and PLINK to remotely execute commands over SSH.
Hint: you will need to set up the full public/private key configuration for PLINK to work without an annoying password prompt. There is a useful guide here: http://unixwiz.net/techtips/putty-openssh.html
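For example (host, user, and file names are placeholders):

pscp results.zip user@cluster:/home/user/jobs/
plink user@cluster /home/user/jobs/run_job.sh input.dat

plink runs the given command on the remote host and streams its output back, so the Delphi app can capture it much like a local console process.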
I like the functionality of Dreamweaver where you can add a site and define an FTP server, and then when you save a file it saves a local copy and also uploads the file via FTP. I am trying to get similar functionality on Linux. What I have thought of doing is to have inotify monitor a local folder and upload any new or changed files to an FTP site, but I am having a hard time finding information on this. Any ideas on how I can accomplish this?
Also, I do not want to install any programs on the ftp server.
Thanks
Dean
You might want to take a look at cron scheduling an rsync job, which will efficiently copy changed files across a network at a chosen interval. rsync will use ssh or rsh (not ftp), so this might not work, but would seem a better way in most cases.
I'd throw together a Python script which uses inotify and scp/ftp.
These are all common and should be supported by whatever distro you're using. They're also all pretty well documented.
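As a minimal shell sketch of that idea, using inotifywait from the inotify-tools package in place of a Python script (directory, host, and remote path are placeholders; scp is used here because it needs only the sshd that most servers already run, so nothing has to be installed on the server; a pure-FTP variant could call curl -T instead):

#!/bin/sh
# Watch a local folder and upload each file as soon as it is fully written
inotifywait -m -e close_write --format '%w%f' /home/dean/site |
while read -r file; do
    scp "$file" user@server:/var/www/site/
done

Note this flat version does not preserve subdirectory structure; an rsync run per change, as suggested above, handles that better.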