Can I save a Couchbase database as a backup on the local filesystem?
It should work on iOS and Android.
On iOS I have the "Files" app, where data "On My iPhone" from Firefox, a recorder app, or other apps always ends up in a per-app directory.
How can I save the backup to a directory in that location?
I found something about push replication with Couchbase, but only to remote targets.
Is it possible to do a push replication to the local filesystem to solve this?
A Couchbase Lite database is a directory. Copying the directory will back up the database.
The backup, of course, will not stay up to date. Also, be careful using database copies in replication scenarios: databases have IDs that are assumed to be unique, and multiple copies (which will all have the same ID) will interfere with the replication process.
Anytime my app goes to sleep and comes back on, I lose data in my database
I'm not storing any media, it's just form data (text). I built the app on Strapi and I've followed all their guidelines, but it keeps happening. I'd be happy if anyone can help.
Local data (files, DB) is cleared after a dyno restart because the Heroku filesystem is ephemeral. A dyno is restarted at least every 24 hours.
In your case Strapi uses SQLite, where data is saved in a local file.
Strapi suggests configuring Postgres on Heroku; alternatively, you can use an external DB storage service.
First of all:
As you create content types with Strapi, it generates the code (= new files) for the corresponding controllers/routes/services.
Heroku does not persist data after a restart.
After a restart, Strapi checks which content types exist in the code and deletes the tables of no-longer-existing types from the database.
Therefore, on Heroku you have to set up all your content types locally and connect to an external database (e.g. Heroku Postgres), never Strapi's default file-based database.
Then push the generated files and finally deploy.
Thus, on Heroku you should always run in production mode. That way the option to alter content types is completely blocked, and you will not run into the issue of data loss after a restart.
I am searching for a backup strategy for my web application files.
I am hosting my (Laravel) application on an Ubuntu (18.04) server in the cloud and currently have around 80 GB of storage that needs to be backed up (and this grows fast). The biggest files are around ~30 MB; the rest are small jpg/txt/pdf files.
I want to make a full backup of the storage directory at least twice a day and store it as a zip file on a local server. I have two reasons for this: independence from cloud providers, and archiving.
My first backup strategy was to zip all the contents of the storage folder and rsync the zip; this goes well up to a couple of gigabytes, then the server gets completely stuck on CPU usage.
My second approach is plain rsync, but with this I can't track when a file is deleted or added.
I am looking for a good backup strategy that preferably generates zips before or after the backup and stores them so we can browse and examine them back in time.
Strangely enough I could not find anything that suits me; I hope someone can help me out.
I agree with #RobertFridzema that the whole server becomes unresponsive when using the ZIP functionality from the Spatie package.
I had the same situation with a customer project. My suggestion is to keep the source code files under version control, back up only the dynamic/changing files with rsync (incremental works best and fast), and create a separate database backup strategy. For example, with MySQL/MariaDB: mysqldump, encrypt the resulting file, and move it to external storage as well.
If ZIP creation is still a problem, I would maybe use storage that is already set up with RAID functionality, or if that is not possible, I would definitely not run the ZIP functionality on the live server: rsync incrementally to another server and run the backup strategy there.
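A minimal sketch of that approach (incremental rsync for the files plus a separate, encrypted database dump). All paths, hostnames, database names and user names below are placeholder assumptions, not values from the original post:

#!/usr/bin/env python3
"""Sketch only: incremental file sync + encrypted MySQL dump."""
import datetime
import subprocess

STORAGE_DIR = "/var/www/app/storage/"     # Laravel storage dir (assumed path)
BACKUP_TARGET = "backup.example.local"    # local backup server (placeholder)
SNAPSHOT_ROOT = "/backups/app"            # snapshot root on the backup server

stamp = datetime.datetime.now().strftime("%Y-%m-%d_%H%M")

# 1) Incremental file sync: --link-dest hard-links files that did not change
#    against the previous snapshot, so each run only transfers/stores the delta.
#    (Maintaining the "latest" symlink on the backup host is not shown here.)
subprocess.run([
    "rsync", "-a", "--delete",
    f"--link-dest={SNAPSHOT_ROOT}/latest",
    STORAGE_DIR,
    f"{BACKUP_TARGET}:{SNAPSHOT_ROOT}/{stamp}/",
], check=True)

# 2) Separate database backup: dump, encrypt, then ship it off the server.
#    MySQL credentials are assumed to come from ~/.my.cnf; gpg --symmetric will
#    prompt for a passphrase unless you add --batch/--passphrase-file for cron.
dump_file = f"/tmp/db_{stamp}.sql"
with open(dump_file, "w") as fh:
    subprocess.run(
        ["mysqldump", "--single-transaction", "-u", "backup_user", "app_db"],
        stdout=fh, check=True,
    )
subprocess.run(
    ["gpg", "--symmetric", "--cipher-algo", "AES256",
     "-o", f"{dump_file}.gpg", dump_file],
    check=True,
)
subprocess.run(
    ["rsync", "-a", f"{dump_file}.gpg",
     f"{BACKUP_TARGET}:{SNAPSHOT_ROOT}/{stamp}/"],
    check=True,
)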
Spatie has a package for Laravel backups that can be scheduled with the Laravel job scheduler. It will create zips of the entire project, including the storage dirs:
https://github.com/spatie/laravel-backup
I'm fairly new to server administration. I have my Laravel app up and running and I want to make sure it has proper backups. I have researched some backup packages and I have settled on https://github.com/spatie/laravel-backup.
However, once the server fails, I need to know how to use the most recent backup (which will be on AWS S3) to restore the database on the rebuilt server. Are there any suggestions for guides on how to do this? I can't seem to find any, unless it really doesn't require much learning and is just a couple of MySQL commands.
Thanks!
I would use replication, and within Laravel I would try to switch the connection to the replica database server so things can run smoothly until the problem is resolved.
Take a look at Cross-Region Replication.
A typical production environment automatically runs backups of the most important things your deployment needs in order to recover from a failure. Those would commonly be your database, storage folder, and configuration files.
Also, when you deploy a Laravel application there aren't many things that are "worth" backing up; you can have the entire disk mirrored somewhere, or you can schedule a backup script that runs at a fixed interval and backs up the things that are most important to your application.
Personally, I wouldn't rely on a Laravel package to handle my backups; you can always use other backup utilities, replication, and so on.
Update
Take a look at the AWS documentation: User Guide » Amazon RDS DB Instance Lifecycle » Backing Up and Restoring
You can call the API function RestoreDBInstanceFromDBSnapshot as shown in the example.
But I don't think anything automated exists that would auto-restore or magically make everything work; you would need a lot of safety checks if something like that were even attempted. Final word: I believe manually entering or sending the restore request is the most solid solution.
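For illustration only, a boto3 sketch of that restore call; the instance identifiers and instance class below are placeholders, not values from the question:

# Sketch: restore an RDS instance from its most recent snapshot with boto3.
# "mydb-prod", "mydb-restored" and the instance class are placeholder assumptions.
import boto3

rds = boto3.client("rds")

# Find the newest snapshot of the failed instance.
snapshots = rds.describe_db_snapshots(DBInstanceIdentifier="mydb-prod")["DBSnapshots"]
latest = max(snapshots, key=lambda s: s["SnapshotCreateTime"])

# Restore it as a new instance. The application's connection settings still
# have to be pointed at the new endpoint once the instance becomes available.
rds.restore_db_instance_from_db_snapshot(
    DBInstanceIdentifier="mydb-restored",
    DBSnapshotIdentifier=latest["DBSnapshotIdentifier"],
    DBInstanceClass="db.t3.micro",
)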
We have a CMS on Heroku, and some files were generated by the CMS. How can I pull those changes down? Can I commit the changes remotely and pull them down? Is there an FTP option of some kind?
See: https://devcenter.heroku.com/articles/dynos#ephemeral-filesystem
It's not designed for persistent file generation and usage.
In practice, it works like this: you put some code into a repository. That code is dynamically pulled onto temporary Amazon EC2 instances and executed. The code can be moved from virtual machine to virtual machine, node to node, without disruption, across data centers. There is no real "place" to get the products of your code from the environment, because anything generated by the checked-out code can (and will) be destroyed as your deploy skips around between the temporary machines.
That being said, there are some workarounds:
If your app includes something like a file browser within your deployed code, you can grab the (entirely temporary) files using that file browser and commit them back to your persistent code trunk.
Another option is using something like S3 for your persistent storage, with your application reading from, and writing to, a data storage service, knowing that while Heroku will just rewrite and destroy your local data on a frequent basis, the external service will maintain the files.
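A small sketch of that idea, shown with Python/boto3 purely for illustration (the bucket name, object keys, and file paths are placeholders): the app writes generated files to S3 and reads them back from there instead of trusting the dyno's local disk.

# Sketch: persist generated files in S3 instead of the ephemeral dyno filesystem.
# Bucket name and object keys are placeholder assumptions.
import boto3

s3 = boto3.client("s3")  # credentials typically come from Heroku config vars

# Write: push a freshly generated file to S3 instead of leaving it on local disk.
s3.upload_file("/tmp/report.pdf", "my-app-uploads", "reports/report.pdf")

# Read: fetch it back from S3 whenever it is needed again (e.g. after a restart).
s3.download_file("my-app-uploads", "reports/report.pdf", "/tmp/report.pdf")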
Similarly, you can change your application to use Heroku's Postgres for persistent data storage, or use Amazon's RDS, etc.
Alternatively, you can edit your application so that any files generated by it will be regenerated every time the code is refreshed, redeployed, and moved around.
Say I want to use an existing SqlCe database which already HAS data in it. I am not sure of the best way to do it.
1) Here is the normal way to create a SqlCe database for Windows Phone:
using (CountryDataContext context = new CountryDataContext(ConnectionString))
{
if (!context.DatabaseExists())
{
// create database if it does not exist
context.CreateDatabase();
}
}
But this database already has data in it, so I don't want to create it. I want to deploy it or store it in isolated storage.
How should I do it? I want to use the data already in the database.
Thanks
You can include the database file as Content in your application project and it will then be put in your XAP at build time and copied to your application install directory on the phone.
Will you modify this database at all?
If not: you can access the database directly from the install directory, and future updates will pick up any new version of the database automatically (as long as you remember to add it to the XAP).
If you do modify it: then on first install you will need to copy the database to isolated storage before you use it. You will need to watch out if you change the database schema in a future update.
Usually the pattern is to create the database and configure seed data for it.
Alternatively, if the database is read-only, you can deploy it with your application. See How to: Deploy a Reference Database with a Windows Phone Application on MSDN.
If you want to deploy a database from your development machine for testing purposes, you can use the ISETool, which can take and restore snapshots of an application's isolated storage to/from a local directory:
# Copy data from IS to directory
ISETool.exe ts xd <PRODUCT-ID> "C:\TempDirectory\IsolatedStore"
# Copy data to IS from directory
ISETool.exe rs xd <PRODUCT-ID> "C:\TempDirectory\IsolatedStore"