I have a Ruby script that runs on my local machine and periodically FTPs a folder down from a server. Now I want to remove the folder on the server once I've pulled it down from the Ruby script running on my local machine. Is it possible to do so?
Thanks!!
It looks like we can use sftp.remove("/path/to/file").wait to remove files and sftp.rmdir("/path/to/directory").wait to remove a folder.
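A minimal sketch with Net::SFTP (the host, credentials, and /path/to/folder are placeholders); note that rmdir only succeeds on an empty directory, so the files inside are removed first:

    require 'net/sftp'

    Net::SFTP.start('ftp.example.com', 'user', password: 'secret') do |sftp|
      remote_dir = '/path/to/folder'
      # rmdir fails on a non-empty directory, so delete the files first
      sftp.dir.foreach(remote_dir) do |entry|
        next if entry.name == '.' || entry.name == '..'
        sftp.remove("#{remote_dir}/#{entry.name}").wait
      end
      sftp.rmdir(remote_dir).wait
    end

This assumes the folder contains only files; nested subdirectories would need the same treatment recursively.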
This may be asking too much from an already very powerful tool, but is there a chance that lftp mirror can execute a command during the mirroring process (from remote directory to the local machine)?
Specific example: lftp is asked to mirror a remote directory with xml files into a local folder and as soon as each file is downloaded/updated, it converts the file to JSON format using xml2json.
I can think of a solution that relies on monitoring the local copy of the mirrored folder for changes via find and then executing xml2json on the new/updated files, but perhaps there is a simpler way?
You can use xfer:verify and xfer:verify-command settings to run a local command on every transferred file.
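For example (a sketch: xml2json-wrapper.sh is a hypothetical script you would write around xml2json; lftp passes the local file name to the verify command, which must exit 0 or the transfer is treated as failed):

    lftp -u user,pass ftp.example.com -e '
      set xfer:verify on;
      set xfer:verify-command /path/to/xml2json-wrapper.sh;
      mirror /remote/xml-dir /local/xml-dir;
      quit
    '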
I am trying to update a module to a newer version. In the past I have manually uploaded each file carefully into the new directory and overwritten older files using FTP. However I wanted to use SSH to try and do this more easily and without any file permission problems.
I have:
Uploaded the .tgz file to the root folder (/http) on the server
Logged into the server via SSH
Changed the directory to the correct directory
Run the following command: tar -zxvf fishpig_splash.tgz
In the command line I was then given a list of all the files that had been extracted. However if I use FTP to go to any of these files I can see that they are still the older version and have not been overwritten.
I was expecting that the files would extract into the correct directories and overwrite any that already existed. I have tested the extraction by creating a temporary directory and extracting into that and everything worked fine.
Is there another part to this script I need to use to overwrite the files?
Thanks
Glynn
Sorry, this was just me being stupid! When extracting the tar file there was a subfolder within it for the extension, and I completely missed it. I went down a level in the archive, zipped up just the contents, then extracted that at the root and everything worked fine. Thanks for the help though!
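For reference, an alternative that avoids re-zipping is GNU tar's --strip-components option, which drops the leading directory from each extracted path (this assumes the archive has exactly one top-level folder; list the contents first to check):

    tar -ztf fishpig_splash.tgz                   # list contents to see the leading folder
    tar --strip-components=1 -zxvf fishpig_splash.tgz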
I recently installed Bonobo Git Server on my workstation. It will work as a git server for me and my colleagues. The repositories are supposed to be backed up to a separate server using a .bat file that copies the files regularly. The problem is I don't know where to find the repo files. I selected D:/git_repo as the destination for Bonobo repositories, but when I go there in Explorer, there are only a few files and not the sources (the folders are also only a few hundred KBs big).
Do you have any idea where the complete files are located?
If it were me, instead of using a scheduled script to copy the repository files to a separate server, I would set up a scheduled task on the destination/backup server that performs a 'git pull' from your server.
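A minimal sketch of that scheduled task as a .bat file (the path and branch are placeholders; this assumes the repository was cloned once to D:\backups\myrepo and that Git for Windows is installed on the backup server):

    rem backup.bat -- run via Windows Task Scheduler
    cd /d D:\backups\myrepo
    git pull origin master

A bare mirror clone (git clone --mirror once, then a scheduled git fetch) is closer to a true backup, since it copies all branches and tags rather than one checked-out branch.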
Did you change the Repository directory in the Bonobo UI settings?
If you did, try to restart the IIS service.
If none of that helps: what exactly are you copying? If your repositories already exist, whether created by a script you execute or through the Bonobo UI, they should be created directly in the D:\Git_Repos directory you specified.
I am having directory issues when using Sublime SFTP to upload files. I'm currently using the ftp type, and the issue is this:
My local and remote folders match, but when I take a file from the local side that is, say, 3 directories deep and modify/upload it, the file is put in the root folder rather than in the matching directory.
Any ideas are appreciated.
Not sure if this helps, but
Uploading "/Volumes/--/Dropbox/ItsJustFood/web/wp-content/themes/justfood/library/css/style.css" to "/library/css/style.css" ..... success
looks like it's only going two directories up when it should be going to the root?
Turns out you have to make sure the sftp-config.json file is also in the root folder of your remote copy.
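For reference, a minimal sftp-config.json (all values here are placeholders); the key point is that the file sits at the root of the project and remote_path points at the matching remote root, so relative paths line up on both sides:

    {
        "type": "ftp",
        "host": "example.com",
        "user": "username",
        "password": "password",
        "remote_path": "/httpdocs/web",
        "upload_on_save": true
    }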
I'm trying to test a WordPress installation in XAMPP for OS X (running version 0.7.3 because I prefer the included version of phpMyAdmin). When I try to upload media files, it returns an error about not having a temporary folder. Same with any upload form I test in other applications.
Is there anything I can do to get it to allow uploads to function normally, as if it were a remote machine?
You'll simply need to create a /tmp and a /wwwtmp directory in your site's root directory.
Check the permissions set on these directories.
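A quick sketch of the commands (the XAMPP document root path is an assumption; adjust it and the permissions for your setup):

    cd /Applications/XAMPP/htdocs
    mkdir tmp wwwtmp
    chmod 775 tmp wwwtmp    # must be writable by the web server user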