Laravel filesystem disk update from UI

I am trying to use the https://github.com/spatie/laravel-backup package to manage backups for my app. I have successfully integrated it and can back up to the local or s3 disk.
I would now like to let an admin change the S3 credentials (key/secret) from the admin panel, but I am confused about how to get that done. I am a newbie, so please guide me. What I would like is a UI to connect an S3 bucket to the app, with the ability to update the credentials or link a new S3 account; the app will use this disk to store its backups.
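One way to approach this (a sketch, not a feature of the package itself): spatie/laravel-backup writes to whatever Laravel filesystem disks are named in config/backup.php, so you can store the credentials in the database and override the s3 disk's config at runtime in a service provider. The settings table and the s3_* row names below are assumptions for illustration.

<?php

// app/Providers/AppServiceProvider.php -- minimal sketch; the "settings"
// table (key/value columns) is a hypothetical place where the admin panel
// saves the credentials.

namespace App\Providers;

use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Schema;
use Illuminate\Support\ServiceProvider;

class AppServiceProvider extends ServiceProvider
{
    public function boot(): void
    {
        // Skip when migrations haven't run yet (e.g. on first deploy).
        if (! Schema::hasTable('settings')) {
            return;
        }

        // Rows like ('s3_key', 'AKIA...'), ('s3_secret', '...'), etc.
        $settings = DB::table('settings')->pluck('value', 'key');

        // Override the s3 disk at runtime; Storage::disk('s3') -- and with it
        // laravel-backup -- will use these values from now on.
        config([
            'filesystems.disks.s3.key'    => $settings['s3_key']    ?? config('filesystems.disks.s3.key'),
            'filesystems.disks.s3.secret' => $settings['s3_secret'] ?? config('filesystems.disks.s3.secret'),
            'filesystems.disks.s3.region' => $settings['s3_region'] ?? config('filesystems.disks.s3.region'),
            'filesystems.disks.s3.bucket' => $settings['s3_bucket'] ?? config('filesystems.disks.s3.bucket'),
        ]);
    }
}

Your admin panel then only needs a form that validates the new key/secret (for example by attempting a small test upload) and writes those rows; the next backup run picks up the new bucket without touching any config files.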

Related

Programmatically adding a new entry in Strapi

I have my Strapi instance deployed, and users are able to log in to their accounts and add new entries to the server. The server then stores the content in PostgreSQL on Google Cloud. However, I want the newly added entries to be reflected on my local server as well, since I need to manually add some image assets for each entry. Is there a way for me to retrieve the new entries from the database and add them directly to my local Strapi admin dashboard?
You can use the same database for your local dev and production versions. That way the content of the production version is always the same as the dev version; see the sketch below.
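As a sketch, if your config/database.js reads its connection details from environment variables (as Strapi's default generated config does), pointing your local install at the production Postgres instance is just a local .env change. The variable names below are placeholders; match them to whatever your database config actually reads, and keep in mind that local edits will then hit production data directly.

# .env (local) -- placeholder names; check your config/database.js
DATABASE_CLIENT=postgres
DATABASE_HOST=<Cloud SQL host or proxy address>
DATABASE_PORT=5432
DATABASE_NAME=<production database name>
DATABASE_USERNAME=<production user>
DATABASE_PASSWORD=<production password>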

Which add-on should I use for file storage in my Rails app

I have a Rails app deployed on Heroku, but because Heroku's filesystem doesn't persist Active Storage uploads, I now need cloud file storage. I know there is Amazon's S3, but I really don't want to create another account on another service. Isn't there another add-on from Heroku that gives me the storage I need without going 'outside'?
Hope you can help me. Thank you.
Isn't there another add-on from Heroku that gives me the storage I need without going 'outside'?
While that was the case at one point, it hasn't been true for some time. There are a number of add-ons available in the marketplace, under the Data Stores sub-heading, that fit the bill. They are external services, but you can provision and access them directly through your Heroku account. A couple of examples:
HDrive - S3 and Azure object storage
Cloudcube - S3 addon
Bucketeer - S3 addon
These should let you use external file storage without creating an additional account; a sketch of wiring one of them into Active Storage follows.
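For example, with an S3-compatible add-on such as Bucketeer, hooking it into Active Storage is just a service entry in config/storage.yml. The BUCKETEER_* config vars below are the ones that add-on injects, but check your add-on's documentation for the exact names.

# config/storage.yml -- sketch assuming the Bucketeer add-on
bucketeer:
  service: S3
  access_key_id: <%= ENV["BUCKETEER_AWS_ACCESS_KEY_ID"] %>
  secret_access_key: <%= ENV["BUCKETEER_AWS_SECRET_ACCESS_KEY"] %>
  region: <%= ENV["BUCKETEER_AWS_REGION"] %>
  bucket: <%= ENV["BUCKETEER_BUCKET_NAME"] %>

Then point Active Storage at it in config/environments/production.rb with config.active_storage.service = :bucketeer.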

Trying to use Amazon S3 on Ghost running on Heroku to store all the images instead of storing them locally

I've been trying to set up the Ghost storage adapter S3 on my 1.7 Ghost installation, but I must be missing something along the way. I created a bucket with policies that allow access to a previously created IAM user with AmazonS3FullAccess permissions; so far so good. I added the lines to config.production.json with the access key and secret key from IAM as the readme says, but it's not working properly. I've attached a screenshot of the Heroku logs.
Well, I couldn't find a fix on version 1.7, but after updating Ghost to 1.21.1 it works correctly.
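For anyone hitting the same wall, the storage block in config.production.json should look roughly like this; this is the shape the adapter's readme documents, so double-check it against the adapter version you install.

{
  "storage": {
    "active": "s3",
    "s3": {
      "accessKeyId": "YOUR_ACCESS_KEY_ID",
      "secretAccessKey": "YOUR_SECRET_ACCESS_KEY",
      "region": "YOUR_REGION",
      "bucket": "YOUR_BUCKET_NAME"
    }
  }
}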

Electron framework desktop app with AWS S3 Sync

I have been trying to find a solution for this, but I need to ask you all. Do you know if there is a Windows desktop application out there that would put (real-time sync) objects from a local folder into a predefined AWS S3 bucket? It only needs to work one way: upload from local to S3.
Setting it up
Install the AWS CLI (https://aws.amazon.com/cli/) for Windows.
Through the AWS website/console, create an IAM user with a strict policy that allows access only to the required S3 bucket (see the example policy after these steps).
Run aws configure in PowerShell or cmd and set the region, access key and secret key for the IAM user that you created.
Test that your setup is correct by running aws s3 ls in the command line and verifying that you see a list of your account's S3 buckets.
If not, you probably configured the IAM permissions incorrectly; note that listing all buckets requires the s3:ListAllMyBuckets permission on all of S3, so with a single-bucket policy test against the bucket itself with aws s3 ls s3://mybucket/ instead.
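As an example of the strict single-bucket policy from the second step (mybucket is a placeholder for your bucket name), something like this allows syncing to that one bucket and nothing else; s3:DeleteObject is only needed if you use the --delete flag shown below.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::mybucket"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::mybucket/*"
    }
  ]
}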
How to sync examples
aws s3 sync path/to/yourfolder s3://mybucket/ copies new and changed files from the local folder to the bucket root.
aws s3 sync path/to/yourfolder s3://mybucket/images/ does the same, but into the images/ prefix.
aws s3 sync path/to/yourfolder s3://mybucket/images/ --delete additionally deletes files on S3 that are no longer available on your local path.
Not sure what this has to do with Electron, but you could set up a trigger in your application to invoke these commands. For example, in Atom or VS Code you could bind this to saving a document with Ctrl+S.
If you are programming an application using Electron, then you should consider using the AWS JavaScript SDK instead of shelling out to the AWS CLI, but that is a whole different story.
And lastly, back up your files somewhere else before trying out potentially destructive commands such as sync --delete, until you get a feel for how they work.

Using Amazon S3 in place of an SFTP Server

I need to set up a repository where multiple people can drop off Excel and CSV files. I need a secure environment with access control, so customers logging on to drop off their own data can't see other customers' data: if person A logs on to drop off a Word document, they shouldn't be able to see person B's Excel sheet. I have an AWS account and would prefer to use S3 for this. I originally planned to set up an SFTP server on an EC2 instance; however, after doing some research I feel that S3 would be more scalable and safer. That said, I've never used S3 before, nor have I seen it in a production environment. So my question really comes down to this: does S3 provide a user interface that allows multiple people to drop off files, similar to an FTP server? And can I set up access control so people can't see other people's data?
Here are the developer resources for S3:
https://aws.amazon.com/developertools/Amazon-S3
Here are some pre-built widgets:
http://codecanyon.net/search?utf8=%E2%9C%93&term=s3+bucket
Let us know your angle; we can offer other ideas once we know more about your requirements.
Yes, it does; you can control access to your resources using IAM users and roles.
http://aws.amazon.com/iam/
You can grant privileges to parts of an S3 bucket depending on the user or role. For example:
mybucket/user1
mybucket/user2
mybucket/development
could all have different permissions, as in the sketch below.
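For the per-customer isolation described in the question, AWS documents a policy pattern that uses the ${aws:username} policy variable so each IAM user can only list and touch their own prefix. A sketch, with mybucket as a placeholder:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::mybucket",
      "Condition": {
        "StringLike": { "s3:prefix": "${aws:username}/*" }
      }
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::mybucket/${aws:username}/*"
    }
  ]
}

Attach that to each customer's IAM user (or a shared group) and person A can only see and upload under mybucket/personA/.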
Hope this helps.
