How to transfer an Amazon S3 bucket to another account? - amazon-ec2

I configure AWS instances for clients, and I need to transfer everything to them at the end, so that the billing for AWS and S3 usage also goes on their accounts.
I know there is a way to "transfer" an EC2 instance via AMI sharing, but is there a way to transfer ownership of, or share, S3 buckets as well? (Preferably without making a copy, i.e. transferring the original bucket itself.)

S3 buckets cannot be transferred between accounts, at least not in the simple sense of "here is my bucket, now it is your bucket". Everyone seems to use some form of copying. If you have permissions on both your original bucket and their destination bucket, then you can use the AWS CLI and simply run
aws s3 sync s3://bucket1 s3://bucket2
Have you tried adding their account as a user with full permissions on one of your buckets? http://docs.aws.amazon.com/IAM/latest/UserGuide/roles-creatingrole-policyexamples.html
Then log in as their account and see whether they can edit the policy to remove your original account. I'm not sure how the billing would turn out, though, since you created the bucket.
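If you go that route, a bucket policy along these lines could grant the other account full access. This is only a sketch: my-bucket and the account ID 222222222222 are placeholders for your bucket name and the client's account ID.
# Sketch only: grant the client account (placeholder ID 222222222222)
# full access to the bucket and its objects.
aws s3api put-bucket-policy --bucket my-bucket --policy '{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "ClientFullAccess",
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::222222222222:root"},
    "Action": "s3:*",
    "Resource": ["arn:aws:s3:::my-bucket", "arn:aws:s3:::my-bucket/*"]
  }]
}'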

If you are going to do this frequently then you should create a new account per customer and then transfer ownership of the whole account to the client.
See Consolidated Billing and AWS Organizations.

You can transfer the contents of the buckets between accounts by making the destination bucket public (there are more secure ways to do this; see the policy sketch after the error message below).
Then, using the AWS CLI authenticated as the account that owns the source bucket, run the s3 cp command (the copy happens within S3, so it does not use your local bandwidth):
aws s3 cp "s3://bucket-source/file.zip" "s3://bucket-you-dont-own/NewFolder/" --acl bucket-owner-full-control
If you do not add "--acl bucket-owner-full-control" to the command, the destination account will not have permissions on the copied objects and will get an access denied error such as:
A client error (AccessDenied) occurred when calling the ListObjects operation: Access Denied
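As a less open alternative to making the destination bucket public, the destination account could attach a bucket policy that only lets your source account write. A minimal sketch, with the account ID 111111111111 and bucket name as placeholders:
# Run as the destination account. The account ID and bucket name are placeholders.
aws s3api put-bucket-policy --bucket bucket-you-dont-own --policy '{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "AllowSourceAccountWrite",
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::111111111111:root"},
    "Action": ["s3:PutObject", "s3:PutObjectAcl"],
    "Resource": "arn:aws:s3:::bucket-you-dont-own/*"
  }]
}'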

Related

Google Cloud Storage - Handle rotating keys from outside the environment

I need help with how to handle rotating keys on Google Cloud Storage that is managed by one Google account but accessed by an app running in another Google Cloud account. I tried searching for solutions but couldn't find an answer.
With the IAM service you can grant permissions at the project level and, for some resources, at the resource level.
That is the case for your KMS keys, where you can grant permissions on the key ring
or directly at the key level.
Choose whatever works best for your use case, and grant the external project's service account the correct permission (Decrypter to read the files in GCS, Encrypter to write files).
Note: a key rotation creates a new version of the key.
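For example, granting the external project's service account decrypt access on a single key might look like this. The key, key ring, location, and service account names are all placeholders.
# Placeholder names: my-key, my-keyring, the location, and the service account.
gcloud kms keys add-iam-policy-binding my-key \
    --keyring=my-keyring \
    --location=us-central1 \
    --member="serviceAccount:app@other-project.iam.gserviceaccount.com" \
    --role="roles/cloudkms.cryptoKeyDecrypter"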

Electron framework desktop app with AWS S3 Sync

I have been trying to find a solution for this but I need to ask you all. Do you know if there is a Windows desktop application out there that would put (real-time sync) objects from a local folder into a predefined AWS S3 bucket? This could work just one way: upload from local to S3.
Setting it up
Install the AWS CLI https://aws.amazon.com/cli/ for Windows.
Through the AWS website/console, create an IAM user with a strict policy that allows access only to the required S3 bucket (see the example policy after these steps).
Run aws configure in PowerShell or cmd and set up the region, access key and secret key for the IAM user that you created.
Test whether your setup is correct by running aws s3 ls on the command line and verify that you see a list of your account's S3 buckets.
If not, then you probably configured the IAM permissions incorrectly; you may also need the s3:ListAllMyBuckets permission for aws s3 ls to list your buckets.
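As a rough sketch of the strict policy mentioned above, you could attach an inline policy like this to the IAM user. The user name (sync-user), policy name, and bucket name are placeholders.
# Sketch: inline policy for a hypothetical user "sync-user" limited to one bucket.
aws iam put-user-policy --user-name sync-user --policy-name s3-sync-only --policy-document '{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::mybucket"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::mybucket/*"
    }
  ]
}'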
How to sync examples
aws s3 sync path/to/yourfolder s3://mybucket/
aws s3 sync path/to/yourfolder s3://mybucket/images/
aws s3 sync path/to/yourfolder s3://mybucket/images/ --delete (the --delete flag removes files from S3 that no longer exist on your local path).
Not sure what this has to do with Electron, but you could set up a trigger in your application to invoke these commands. For example, in atom.io or VS Code, you could bind this to saving a document with Ctrl+S.
If you are building the application with Electron, then you should consider using the AWS JavaScript SDK instead of the AWS CLI, but that is a whole different story.
And lastly, back up your files somewhere else before trying out potentially destructive commands such as sync --delete, until you get a feel for how they work.

AWS user just for creating RDS snapshots

I have a requirement to give a user access to create snapshots of RDS instances for backup and restore. Is this possible in AWS? Also, the same user needs to enable versioning on S3 buckets. Is this possible?
I think that should be possible.
Here is the set of RDS Snapshot permissions.
http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/UsingWithRDS.IAM.ResourcePermissions.html
This is possibly the API action you are looking for for restores:
http://docs.aws.amazon.com/AmazonRDS/latest/APIReference/API_RestoreDBInstanceFromDBSnapshot.html
I would guess you might also need to grant the Describe actions. I would create a user, apply the permission sets, and then test to find the least privilege you can grant.
These are the permission sets for S3 Objects.
http://docs.aws.amazon.com/AmazonS3/latest/dev/using-with-s3-actions.html
When you say "version S3 buckets": you normally enable versioning once on a bucket, and it then versions any object changes applied to it. The permission set above will then just limit their access to the objects in that bucket.
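As a rough, untested sketch of such a policy (the user name, policy name, and bucket name are placeholders, and you may need to add more Describe/List actions while testing for least privilege):
aws iam put-user-policy --user-name backup-operator --policy-name rds-snapshot-and-s3-versioning --policy-document '{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "rds:CreateDBSnapshot",
        "rds:RestoreDBInstanceFromDBSnapshot",
        "rds:DescribeDBInstances",
        "rds:DescribeDBSnapshots"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutBucketVersioning", "s3:GetBucketVersioning"],
      "Resource": "arn:aws:s3:::mybucket"
    }
  ]
}'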

Amazon Snap shot Query

I have a snapshot in my Amazon VPC account that should not be deleted by anyone. Is there any option in Amazon for making the snapshot read-only, or otherwise preventing anyone from deleting it?
If a person has sufficient privileges on the account, they will be able to delete it; if you are really worried, put a copy on S3 and make sure no one has access to it except you.

backup aws ec2 data to a totally separate aws account

I want to backup my AWS snapshots to a completely separate AWS account for additional security (if my AWS credentials were acquired someone could delete all my snapshots and volumes). But I'm a bit stumped on how to do this.
There doesn't seem to be a way to store a volume or snapshot in S3 such that another user could access that data in s3 and store it in a separate AWS account.
Does anyone have any suggestions on how to achieve this?
Thanks
Create an IAM user and an S3 bucket in your secret (backup) account.
Add a bucket policy to the newly created bucket, allowing the new IAM user to put objects but denying it permission to delete them (see the example policy below).
Use that IAM user's credentials to upload your backups to S3.
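A minimal sketch of such a bucket policy, assuming a backup bucket named backup-bucket and an IAM user named backup-writer in the backup account (placeholder ID 333333333333):
# Run from the backup account. The names and account ID are placeholders.
aws s3api put-bucket-policy --bucket backup-bucket --policy '{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowUploads",
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::333333333333:user/backup-writer"},
      "Action": ["s3:PutObject", "s3:ListBucket"],
      "Resource": ["arn:aws:s3:::backup-bucket", "arn:aws:s3:::backup-bucket/*"]
    },
    {
      "Sid": "DenyDeletes",
      "Effect": "Deny",
      "Principal": {"AWS": "arn:aws:iam::333333333333:user/backup-writer"},
      "Action": "s3:DeleteObject",
      "Resource": "arn:aws:s3:::backup-bucket/*"
    }
  ]
}'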
You can share any EBS snapshot with another account by adding that account to the snapshot's create-volume permissions. Once the snapshot is shared, the other account can either copy the snapshot into their account or create a volume from it.
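For example, sharing a snapshot with another account and then copying it over could look like this. The snapshot ID, account ID, and region are placeholders.
# From the account that owns the snapshot: share it with account 444444444444.
aws ec2 modify-snapshot-attribute --snapshot-id snap-0123456789abcdef0 \
    --attribute createVolumePermission --operation-type add --user-ids 444444444444
# From the other account: copy the shared snapshot so that account owns the copy.
aws ec2 copy-snapshot --source-region us-east-1 --source-snapshot-id snap-0123456789abcdef0 \
    --description "Backup copy from primary account"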
