Media folder on another server using symlink? (Magento)

Is it possible to have my Magento store on one server and the store's media folder on another server using a symlink?
Would I run into any problems if I did this?
I don't have any experience with symlinks and am not exactly sure how they work.

You have to mount the second server's directory on your first server, and then you can create a symlink to that mount point. sshfs is the tool I use for mounting.
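A minimal sketch of that setup, assuming the second server is reachable over SSH (hostnames and paths are placeholders):

    # Mount the remote media directory over SSH; allow_other may require
    # user_allow_other to be enabled in /etc/fuse.conf.
    mkdir -p /mnt/remote-media
    sshfs user@media-server.example.com:/var/media /mnt/remote-media -o allow_other,reconnect

    # Replace the local media folder with a symlink to the mount point.
    mv /var/www/magento/media /var/www/magento/media.bak
    ln -s /mnt/remote-media /var/www/magento/media

Keep in mind that every media read now goes over the network, which is the performance concern mentioned below.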
However, I would not recommend doing this, because performance-wise it will be slower. On the other hand, if you can set up a domain/subdomain on that second server, then you can provide your remote media URL in Magento's system configuration, just like a CDN configuration, and that will improve performance.
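If you go the subdomain route, the media URL can be set in the admin under System > Configuration > Web, or directly in the config table. A hedged sketch for Magento 1 (database name, credentials and URL are placeholders; if no row exists for that path yet, insert one instead of updating):

    mysql -u magento -p magento_db -e '
        UPDATE core_config_data
        SET value = "http://media.example.com/media/"
        WHERE path = "web/unsecure/base_media_url";'

Flush the configuration cache afterwards so the new URL takes effect.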

Related

Backup strategy (Ubuntu/Laravel)

I am searching for a backup strategy for my web application files.
I am hosting my (Laravel) application on an Ubuntu (18.04) server in the cloud and currently have around 80GB of storage that needs to be backed up (this grows fast). The biggest files are around ~30MB; the rest are small jpg/txt/pdf files.
I want to make a full backup of the storage directory at least twice a day and store it as a zip file on a local server. I have 2 reasons for this: independence from cloud providers, and archiving.
My first backup strategy was to zip all the contents of the storage folder and rsync the zip. This goes well until a couple of gigabytes; then the server gets completely stuck on CPU usage.
My second approach was plain rsync, but with this I can't track when a file is deleted or added.
I am looking for a good backup strategy that preferably generates zips before or after the backup and stores them, so we can browse and examine them back in time.
Strangely enough I could not find anything that suits me; I hope someone can help me out.
I agree with @RobertFridzema that the whole server becomes unresponsive when using the ZIP functionality from the spatie package.
I had the same situation with a customer project. My suggestion is to keep the source code files in version control, back up only the dynamic/changing files with rsync (incremental works best and fastest), and create a separate database backup strategy. For example with MySQL/MariaDB: mysqldump, encrypt the resulting file, and move it to external storage as well.
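A minimal sketch of that split strategy (hosts, paths and credentials are placeholders):

    # Incremental file sync; only changed files are transferred.
    rsync -a --delete /var/www/app/storage/ backup@backup-host:/backups/storage/

    # Dump, compress and encrypt the database separately, then ship it.
    # gpg prompts for a passphrase here; for cron, --batch with
    # --passphrase-file is an option.
    mysqldump --single-transaction -u backup -p"$DB_PASS" app_db | gzip \
        | gpg --symmetric --cipher-algo AES256 -o /tmp/app_db.sql.gz.gpg
    rsync -a /tmp/app_db.sql.gz.gpg backup@backup-host:/backups/db/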
If ZIP creation is still a problem, I would maybe use storage that is already set up with RAID functionality, or if that is not possible, I would definitely not use the ZIP functionality on the live server: rsync incrementally to another server and run the backup strategy there.
Spatie has a package for Laravel backups that can be scheduled with the Laravel job scheduler. It will create zips of the entire project, including the storage dirs:
https://github.com/spatie/laravel-backup
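Once installed, the package's artisan command can be driven from cron; a minimal sketch matching the twice-a-day requirement (path and times are assumptions):

    # Run a full backup at 02:00 and 14:00.
    0 2,14 * * * cd /var/www/app && php artisan backup:run >> /var/log/backup.log 2>&1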

SSH Command in my FTP dev folder?

Intro:
I have 2 folders on my FTP, one for my main website and one for the dev website.
Question:
If I run an SSH command in my magentoroot\dev\shell folder, should I worry about consequences for my main website?
Thanks
Not really, unless the two sites use the same database.
Please note that if you need to work on a development website, you need a separate database and file system. If the database is shared, configuration changes made while working in the admin, performing operations, or installing new extensions will affect both sites. So always use a different database for the staging and production websites.
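A quick way to verify the two sites really point at different databases, assuming a Magento 1 layout where the connection settings live in app/etc/local.xml (paths are placeholders):

    # Each command prints the <dbname> entry; the two values should differ.
    grep '<dbname>' /var/www/magentoroot/app/etc/local.xml
    grep '<dbname>' /var/www/magentoroot/dev/app/etc/local.xml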

Umbraco media sharing - Development

I've been struggling with setting up Umbraco on a development machine and a test server...
Both environments connect to the same database and I use uSync to keep all my changes in git, but media files are a real p.i.t.a.
I started off by adding media on my dev machine and copying the media folder over when publishing to test. Not very elegant, so I tried using the rootPath and rootUrl in the FileSystemProviders config. The path points to a network file share and the URL to a dedicated virtual directory hosted on a media.test.mysite.com subdomain.
Surprise... when opening the site, the old media has vanished, because Umbraco saves the absolute path in the cmsProperty tables ({'src': 'http://media.mysite.com/1041/...'}), whereas previously it saved the relative path when configured with the virtual root.
I'd like to alter the composition of the media URLs in both front- and backend: define a media_root appsetting holding the hostname, protocol and port (http://media.test.mysite.com) and prepend this to the src value that comes from the DB...
Any suggestions?
I already tried a custom URLProvider, but this only works for non-media content... it seems :-|
Thanks!
Y.
I'd recommend using the Umbraco File System Provider for Azure, which will upload your media to Azure Blob Storage. You can then use the disk cache that comes with ImageProcessor.Web (included in Umbraco core) to cache the files locally. We run our dev environments pointing to the same blob storage as the other environments, so there is no need to copy the files. And the references are relative (/media/1001/file.jpg) when using disk cache, thanks to the HTTP module in ImageProcessor.Web, which caches them to disk. (You could alternatively use the ImageProcessor Azure blob cache plugin and have the images load from Azure.) You might want to check out this documentation at Our.Umbraco.org (even if you aren't using Umbraco Cloud).

How to create a partition in remote ApacheDS, LDAP server?

I know how to create a partition in a local ApacheDS instance from this article. The current problem is that I don't know how to create a partition in a remote ApacheDS.
I am accessing the remote ApacheDS server (on CentOS) from Apache Directory Studio (on Windows).
Any help would be appreciated.
ApacheDS
Version: 2.0.0-M14
Apache Directory Studio
Version: 2.0.0.v20130517
I don't know if your problem is that you can't access the remote instance, or something else.
But if you want to create a partition, follow this "guide".
ApacheDS seems to have very bad tutorials.
Contrary to the other answers, here I explain the real problem. The sad truth is the following:
You can't manipulate the partitions of a non-local Apache Directory Server with Apache Directory Studio.
You can't even do this with a locally running one. The only ones you can manage are the Apache Directory Server instances running inside your Apache Directory Studio.
However, there is a workaround for the problem. It is particularly useful if you are using Linux, or at least have Cygwin at hand.
The Apache Directory Server has a complex directory structure, full of small files, partly binary and partly text data.
This data structure doesn't contain any filesystem references, so you can freely clone it.
Create an LDAP server inside your Apache Directory Studio. Open its properties. You get a popup form. Inside this form, you will see something like this:
Location /your/home/directory/.ApacheDirectoryStudio/.metadata/.plugins/org.apache.directory.studio.ldapservers/servers/e56640c7-70ed-4eed-921c-75c475117a11
This is what you want!
This is the directory structure where your local ApacheDS is running!
And you can now easily synchronize this data structure, ideally with a simple rsync command, to your server or back!
So:
You create the new Apache Directory Server instance inside the Apache Directory Studio.
You check its properties.
You stop it, and synchronize the server-side instance directory into this local one. For example: rsync -va --delete you@your.server.com:/srv/apacheds/instance/ /your/home/directory/.ApacheDirectoryStudio/.metadata/.plugins/org.apache.directory.studio.ldapservers/servers/e56640c7-70ed-4eed-921c-75c475117a11
You play with the partitions as you wish.
You synchronize it back.
Of course, if you are playing with the Apache Directory Server file structure at such a low, file-system level, the server needs to be stopped!
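Put together, the round trip looks like this (remote path and user are placeholders; the local path is the Location value from the server's properties above):

    # Pull the remote instance into the local Studio-managed directory.
    rsync -va --delete you@your.server.com:/srv/apacheds/instance/ /your/home/directory/.ApacheDirectoryStudio/.metadata/.plugins/org.apache.directory.studio.ldapservers/servers/e56640c7-70ed-4eed-921c-75c475117a11/

    # ... edit the partitions in Apache Directory Studio (server stopped) ...

    # Push the modified instance back to the server.
    rsync -va --delete /your/home/directory/.ApacheDirectoryStudio/.metadata/.plugins/org.apache.directory.studio.ldapservers/servers/e56640c7-70ed-4eed-921c-75c475117a11/ you@your.server.com:/srv/apacheds/instance/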

Incrementally updating a remote joomla web site?

I have a Joomla 1.5 site on my localhost. It's hosted on a public hosting server as well.
I was wondering what the best way is to do incremental updates to the site. I mean, I don't want to upload the whole site if I just changed one source file (HTML, PHP, images, etc.) or made changes to the database. I understand that to be safe I'd have to update the database every time (export from local and import on remote), but I'm sure we can avoid unnecessary uploads of unchanged files.
I've seen https://www.akeebabackup.com and it doesn't offer what I need. One option is to use an FTP client (like FileZilla) that does folder synchronization, but I'm not sure how well they work.
For the database you could use master-master replication, which is quite easy to set up, but you need GRANT privileges in MySQL, which most likely won't be possible on shared hosting. I'd also suggest connecting both machines via VPN to make it more secure.
The other easy way to sync databases is the "Synchronisation" tool, if you're using phpMyAdmin.
If not, look at a MySQL administration tool like MySQL Workbench, which also has this feature built in.
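If neither tool is available, a plain dump-and-import round trip also moves the database changes; a minimal sketch assuming SSH access (all names are placeholders):

    # Dump the local database, copy it over, and import it on the remote host.
    mysqldump -u root -p joomla_local | gzip > /tmp/joomla.sql.gz
    scp /tmp/joomla.sql.gz user@example-host:/tmp/
    ssh user@example-host 'gunzip < /tmp/joomla.sql.gz | mysql -u joomla -p joomla_live'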
You didn't tell us what privileges you have on the public hosting server.
If you're an admin, you can have SVN installed and configured to sync files with your local data.
You can also use a Git repository to do exactly the same, or LDAP set up via VPN to keep your files in sync.
If you're not an admin, just check or ask your hosting company what of the above is available; I'm sure they'll be able to help you. Nowadays hosting companies often have SVN or Git installed, which should be what you need.
I often use the SVN tools built into PHP Designer 8, but you can also have SVN, Git and many more in NetBeans.
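And if the host offers SSH access, rsync alone covers the "only upload what changed" part; a minimal sketch (paths are placeholders):

    # Preview which files would be uploaded (-n is a dry run), then sync for real.
    rsync -avzn --exclude '.git' /path/to/local/joomla/ user@example-host:/path/to/docroot/
    rsync -avz --exclude '.git' /path/to/local/joomla/ user@example-host:/path/to/docroot/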
