Update Magento extension using SSH to extract .tgz tar file - shell

I am trying to update a module to a newer version. In the past I have carefully uploaded each file into the right directory and overwritten the older files using FTP. However, I wanted to use SSH to do this more easily and without any file permission problems.
I have:
Uploaded the .tgz file to the root folder (/http) on the server
Logged into the server via SSH
Changed the directory to the correct directory
Run the following command: tar -zxvf fishpig_splash.tgz
The command line then listed all the files that had been extracted. However, if I browse to any of these files over FTP, I can see that they are still the older versions and have not been overwritten.
I was expecting the files to extract into the correct directories and overwrite any that already existed. I have tested the extraction by creating a temporary directory and extracting into that, and everything worked fine.
Is there another part to this script I need to use to overwrite the files?
Thanks
Glynn

Sorry, this was just me being stupid! The tar file contained a subfolder for the extension, which I completely missed. I went down a level in the archive, re-packed just its contents, then extracted that at the root and everything worked fine. Thanks for the help though!
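For anyone who hits the same thing: you can check whether the archive wraps everything in a top-level folder before extracting, and GNU tar can drop that folder during extraction so no re-packing is needed. A small sketch using the archive name from the question (--strip-components assumes GNU tar):

    # List the contents first to see where the files will land
    tar -tzf fishpig_splash.tgz | head

    # If everything sits under one top-level folder, strip it off so the
    # files extract straight into the current (Magento root) directory
    tar -xzvf fishpig_splash.tgz --strip-components=1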

Related

How to handle support files during Homebrew Formula update

I'm following the Scripts with Support Files answer from here: https://stackoverflow.com/a/46479538/4771016. It works great, but I'm running into a problem when updating my script.
If it doesn't find one, my script creates an .env file (for users to pass in some variables) in the same directory the .sh file lives in: /home/linuxbrew/.linuxbrew/Cellar/myscript/1.0.2/libexec/.env. The problem is that when I release a new version, the .env file won't be in the new directory, i.e. /home/linuxbrew/.linuxbrew/Cellar/myscript/1.0.3/libexec/, and will therefore be recreated, losing the modifications.
Any ideas for keeping that .env file during updates, or an acceptable design pattern for my use case? I was thinking about keeping the .env file outside that directory somewhere, but I don't know the Homebrew directory structure well enough to store it in the right place.
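One hedged option, assuming the script itself decides where to look for its config: resolve the .env from a path that is not tied to the versioned Cellar directory, such as etc/ under the Homebrew prefix, which is not swapped out when a new version is installed. The paths and the "myscript" name below are placeholders, not anything Homebrew prescribes:

    # In the script: look for the .env outside the versioned libexec/ dir
    PREFIX="$(brew --prefix 2>/dev/null || echo /home/linuxbrew/.linuxbrew)"
    ENV_FILE="$PREFIX/etc/myscript/.env"

    if [ ! -f "$ENV_FILE" ]; then
        mkdir -p "$(dirname "$ENV_FILE")"
        printf 'SOME_VAR=\n' > "$ENV_FILE"   # seed whatever defaults you need
    fi

    . "$ENV_FILE"

A per-user location such as ~/.config/myscript/.env works the same way if you would rather not write under the Homebrew prefix.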

Why do I not have a .hgrc file?

I'm trying to put the mercurial_keyring configuration with my username and password into the .hgrc file, but it doesn't exist in my user directory on Windows. I have TortoiseHg installed and even checked from the command prompt that it was installed properly, yet I still don't have the .hgrc file.
Can anyone tell me what the reason for this might be?
Thanks
Because it's %USERPROFILE%\mercurial.ini
Mercurial reads configuration data from several files, if they exist. These files do not exist by default and you will have to create the appropriate configuration files yourself:
Local configuration is put into the per-repository <repo>/.hg/hgrc file.
Global configuration, like the username setting, is typically put into %USERPROFILE%\mercurial.ini (on Windows).
The .hgrc files are not created automatically when you install Mercurial or TortoiseHg.
You will need to create the file manually at the location you need, whether that is inside the repository's .hg folder or your own C:\Users\username\ folder.
You will probably need to use the command line to create it, as it's not usually possible to create filenames that start with a dot in Windows Explorer.
https://www.selenic.com/mercurial/hgrc.5.html
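As a hedged illustration, a minimal %USERPROFILE%\mercurial.ini that sets a username and enables the keyring extension could look like this (the username value is a placeholder):

    [ui]
    username = Your Name <you@example.com>

    [extensions]
    mercurial_keyring =

From a command prompt, notepad %USERPROFILE%\mercurial.ini will offer to create the file if it does not exist yet.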

LFTP, download only files created the same day I execute LFTP

How do I make LFTP download a file from a remote server only if this file was created TODAY (the same day I run LFTP) ?
Use mirror.
It has a --newer-than=SPEC option to download only files newer than the specified time. For your specific need, use --newer-than=now-1days. "Now minus 1 day" is yesterday, so lftp will download all files newer than yesterday.
Refer here for more info: http://lftp.yar.ru/lftp-man.html
EDIT: While I was tweaking my script, I noticed there's also an --only-newer option, which downloads only newer files and can be useful for your case with slight changes: --only-newer checks the destination folder and downloads any files from the source that aren't in the destination folder, while --newer-than downloads any files newer than the time you specified without checking the destination folder.
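A hedged example of how the whole command might be wired up; the host, credentials and paths are placeholders:

    # Mirror only files modified within the last day from the remote
    # directory into ./incoming
    lftp -u user,password ftp.example.com \
         -e "mirror --verbose --newer-than=now-1days /remote/dir ./incoming; quit"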

Sublime SFTP file upload issue

I am having directory issues when using Sublime SFTP to upload files. I am currently using the ftp type, and the issue is this:
My local and remote folders match, but when I take a file from the local side that is, say, three directories deep and modify/upload it, it puts the file in the root folder rather than in the matching directory.
Any ideas are appreciated.
Not sure if this helps, but
Uploading "/Volumes/--/Dropbox/ItsJustFood/web/wp-content/themes/justfood/library/css/style.css" to "/library/css/style.css" ..... success
looks like it's only going two directories up when it should be going to the root?
Turns out you have to make sure the .json file is also in the root folder of your remote copy.
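For reference, a trimmed sftp-config.json along the lines of what the package generates; the key names are from memory, so compare against the file Sublime SFTP creates for you, and the host, user and path values are placeholders:

    {
        "type": "ftp",
        "host": "example.com",
        "user": "username",
        "remote_path": "/path/to/remote/project/root",
        "upload_on_save": true
    }

As far as the package's mapping goes, the folder that contains sftp-config.json corresponds to remote_path, so the two need to describe the same level of the tree; if the config sits deeper than the project root (or remote_path points at "/"), uploads land in the wrong remote directory.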

FileZilla FTPing unzip problems

I have a bash script that utilises inotify-tools to wait for .zip files to be dropped in a substructure under the root. From there they are unzipped into another directory.
When I copy the .zip files in with WinSCP, the script executes correctly. Copying the .zip files with FileZilla, however, leads to this error:
End-of-central-directory signature not found. Either this file is not
a zipfile, or it constitutes one disk of a multi-part archive. In the
latter case the central directory and zipfile comment will be found on
the last disk(s) of this archive.
I've googled this error and the two main causes seem to be an old version of Linux's unzip utility (I have a newer version) or trying to copy files that are larger than 2 GB (this file isn't).
Does anyone know the issue here? It seems to me that Linux is trying to unzip the file before it is fully copied to disk. Like I said, only FileZilla gives this error; I don't get it with WinSCP.
I believe your main issue is that you try to process the ZIP while it is still being transferred. Probably what happens is that, as soon as the transfer is initiated, a file is created to store the transferred data, and that event fires your script before the zip file is completely transferred.
That would explain why you get this error:
End-of-central-directory signature not found. Either this file is not a zipfile, ...
So the solution would be to have two folders, one for transfer and one for complete. They should be on the same file system. On transfer complete, just move the file from one folder to the other.
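A rough sketch of that two-folder pattern with inotify-tools; the directory names are placeholders, and it assumes the finished archive gets moved into complete/ (a move within the same file system is atomic, so the watcher never sees a half-written zip):

    #!/bin/bash
    # Watch only the "complete" directory; uploads go to a sibling
    # "transfer" directory first and are moved here when finished.
    WATCH_DIR=/srv/ftp/complete
    DEST_DIR=/srv/extracted

    inotifywait -m -e moved_to -e close_write --format '%f' "$WATCH_DIR" |
    while read -r name; do
        case "$name" in
            *.zip) unzip -o "$WATCH_DIR/$name" -d "$DEST_DIR" ;;
        esac
    done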
