How to copy objects from one S3 bucket to another - laravel

I have found a solution for syncing data between S3 buckets at the link Copy objects between S3 buckets:
aws s3 sync s3://DOC-EXAMPLE-BUCKET-SOURCE s3://DOC-EXAMPLE-BUCKET-TARGET
But I want to copy specific files from the production S3 bucket to the staging S3 bucket using the file paths that are stored in the production database. Any suggestions on how I can achieve that using Laravel aws-sdk-php, a Lambda function, or the AWS CLI?
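One possible route is the AWS CLI: export the stored file paths from the production database to a plain list of object keys and copy them one by one. A minimal sketch, assuming the keys have been exported to a file named keys.txt and using placeholder bucket names:
# keys.txt holds one object key per line, exported from the production database
while read -r key; do
  aws s3 cp "s3://production-bucket/$key" "s3://staging-bucket/$key"
done < keys.txt
The same per-key copy could also be driven from Laravel via the aws-sdk-php S3 client's copyObject call if you prefer to run it from the application.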

Related

Transfer files from Oracle RDS to S3 bucket located in a different AWS account

I would like to transfer files from Oracle RDS to an S3 bucket. This article from AWS, https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/oracle-s3-integration.html, describes how this can be achieved, but the solution works only when the RDS instance and the S3 bucket are in the same AWS region. Could anyone help with how to copy files when the S3 bucket is in a different AWS account?
Thank you in advance!
As of now I don't think there is a direct/documented method to copy directly from RDS to an S3 bucket in another account, but you can copy the files to an S3 bucket in the same account and then do a cross-account copy from S3:
https://aws.amazon.com/premiumsupport/knowledge-center/copy-s3-objects-account/
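The knowledge-center article above boils down to granting the other account access via a bucket policy and then copying between the buckets with the CLI. A minimal sketch of the copy step, with placeholder bucket and file names:
aws s3 cp s3://same-account-bucket/datapump-export.dmp s3://other-account-bucket/datapump-export.dmp
If the copy is run from the source account rather than the destination account, adding --acl bucket-owner-full-control gives the destination account full control of the copied object.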

Is there a way to copy S3 bucket properties from one bucket to another using the AWS CLI?

The title pretty much sums up my question. I am having no problem copying files from Bucket A to Bucket B, but I would also like to copy Bucket A's properties to Bucket B from the CLI (i.e. setting static web hosting to enabled, or versioning to enabled, etc.). Here are the commands I am running right now:
aws s3 mb s3://$S3_NEW_BUCKET_NAME
aws s3 sync s3://$S3_PROD_BUCKET_NAME/ s3://$S3_NEW_BUCKET_NAME/
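Bucket properties are not carried over by aws s3 sync; each one has to be read from the source bucket and applied to the new bucket through separate s3api calls. A sketch for versioning and static website hosting, reusing the same shell variables as above (the website configuration shown is just an example value):
aws s3api get-bucket-versioning --bucket $S3_PROD_BUCKET_NAME
aws s3api put-bucket-versioning --bucket $S3_NEW_BUCKET_NAME --versioning-configuration Status=Enabled
aws s3api get-bucket-website --bucket $S3_PROD_BUCKET_NAME
aws s3api put-bucket-website --bucket $S3_NEW_BUCKET_NAME --website-configuration '{"IndexDocument":{"Suffix":"index.html"}}'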

How to download a zip (zipped folder) from an S3 bucket and store the files in it to a different S3 bucket, using Spring Boot?

I have a requirement where the input to my module is a zip file (i.e., a folder of files zipped up) placed in an S3 bucket. I have to unzip it from S3 and store the files it contains in a different S3 bucket. Can anyone help me with how to achieve this using Spring Boot?

AWS temporary files before uploading to S3?

My Laravel app allows users to upload images. Currently, when the user uploads their images, they are stored in a temporary location on the server. A cron job then modifies the uploaded images (compresses them, etc.), and uploads them to S3. Any temporary files older than 48 hours that failed to upload to S3 are deleted by another cron job.
I've set up an Elastic Beanstalk environment, but it's occurred to me that storing uploaded images in a temporary directory on an instance is risky because instances can be created and destroyed when necessary.
How and where, then, would I store these temporary files so that they're not at risk of being deleted by an instance?
As discussed in the comments, I think that uploading the file to S3 is the best option. As far as I know, it's not possible to stop Elastic Beanstalk from destroying an EC2 instance, unless you want to get rid of all of the scaling and instance failure/auto-replacement features.
One option I don't know much about may be AWS EBS. "Amazon Elastic Block Store (Amazon EBS) provides persistent block storage volumes for use with Amazon EC2 instances in the AWS Cloud." I don't have any direct experience with EBS; the overriding question, of course, would be whether EBS is truly persistent, even after an EC2 instance is destroyed. As EBS has costs associated with it, and since you are already using S3, S3 seems like the way to go.
S3 has a feature called object lifecycle management that you can use to have files deleted automatically by setting them to expire 2 days after they're uploaded (an example rule is shown after the options below).
You can either:
A) Prefix the temporary files to put them in an S3 pseudo-folder (i.e., Temp/), apply the object lifecycle expiration rule to that specific prefix (or "folder"), and use the files in there as the source of truth for the new files derived from them post-manipulation.
or
B) Create an S3 bucket specifically for temporary files. Manipulate the files from there and copy to the production bucket.
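For option A, the 2-day expiration rule could be applied to the Temp/ prefix with a lifecycle configuration. A minimal sketch using the AWS CLI, with a placeholder bucket name:
aws s3api put-bucket-lifecycle-configuration --bucket my-upload-bucket --lifecycle-configuration '{
  "Rules": [
    {
      "ID": "expire-temp-uploads",
      "Filter": { "Prefix": "Temp/" },
      "Status": "Enabled",
      "Expiration": { "Days": 2 }
    }
  ]
}'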

How to upload thousands of images to Amazon S3 at once

I want to upload thousands of images from my DigitalOcean Droplet to my S3 bucket. I already created a piece of code that uploads and crops all new images from my site to the bucket, and that is working fine; now I just want to move all of my existing images from my production Droplet to the bucket.
I have 52 GB of images stored, so I don't know how to move all of them to the bucket. What would be the best approach?
The best approach would be:
Create a zip file of the images you want to transfer.
Create an EC2 instance in the same region as the bucket.
Copy the zip file to the EC2 instance.
Unzip the zip file on the EC2 instance.
Use the AWS CLI to copy the images from the EC2 instance to the bucket.
The other approach is to use the AWS CLI directly from the Droplet, but due to the large number of files it will take a lot of time to transfer.
With the AWS CLI, you can use either aws s3 cp or aws s3 sync to copy your images.
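For example, once the images are unzipped on the EC2 instance (or if you run the CLI from the Droplet directly), a single sync can push the whole tree; the local path and bucket name below are placeholders:
aws s3 sync /var/www/images s3://my-bucket/images/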
