I am using Amazon's official aws-sdk gem, but I can't seem to find any functionality that works like the command line tool aws s3 sync <path> <bucket>. Does it exist, or am I forced to upload each file separately (slow)?
You don't have an API call that achieves that.
The sync is basically a call to list the objects in the bucket, a scan of your local path, and then uploads/downloads to bring the two locations in sync. That's what the AWS CLI tool does under the hood.
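A rough sketch of that pattern, shown here with boto3 in Python rather than the Ruby gem (bucket and folder names are placeholders):

```
import os
import boto3

s3 = boto3.client("s3")
bucket = "mybucket"            # placeholder
local_root = "path/to/yourfolder"  # placeholder

# 1. One (paginated) call to get the objects already in the bucket.
remote_keys = set()
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket):
    for obj in page.get("Contents", []):
        remote_keys.add(obj["Key"])

# 2. Inspect the local path and upload anything the bucket does not have yet.
#    (A real sync would also compare sizes/timestamps and handle downloads
#    and deletions; this only covers the one-way upload case.)
for dirpath, _, filenames in os.walk(local_root):
    for name in filenames:
        path = os.path.join(dirpath, name)
        key = os.path.relpath(path, local_root).replace(os.sep, "/")
        if key not in remote_keys:
            s3.upload_file(path, bucket, key)
```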
I'm trying to install azure-cli on AWS Lambda for integration purposes. The azure-cli package seems to be too large for AWS Lambda, and I am unable to upload the zip file.
I want to create a service principal (client secret) in Azure from Lambda using Python.
The only way I have found to create a service principal is through azure-cli.
Is there any other way to create a client secret? Or can the azure-cli package size be reduced so it can be uploaded to AWS Lambda?
I have gone through many blogs online, but they all require azure-cli to create the client secret.
install azure-cli on aws lambda
Do you mean pip install a Python package within AWS Lambda?
If so: one of the great things about using Python is the availability of a huge number of libraries that help you implement solutions quickly without having to code all the classes and functions from scratch. As mentioned before, AWS Lambda offers a list of Python libraries that you can import into your function. The problem starts when you have to use libraries that are not available there. One way to do it is to install the library locally inside the same folder as your lambda_function.py file, zip the files, and upload the archive through the AWS Lambda console. Installing libraries locally and uploading them every time you create a new Lambda function is laborious and inconvenient.
To make your life easier, Amazon offers the possibility to upload your libraries as AWS Lambda layers: a file structure where you store your libraries, upload it to AWS Lambda independently, and use it in your code whenever needed. Once you create a Lambda layer, it can be reused by any other Lambda function.
Those are the basics of getting started with AWS Lambda layers for Python.
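As a rough illustration of the publishing step (names are placeholders; this assumes you have already run something like pip install <package> -t layer/python and zipped the layer folder so the packages sit under python/ at the root of the zip, which is where the Python runtime looks for layer content):

```
import boto3

lambda_client = boto3.client("lambda")

# Publish the zipped dependencies as a reusable layer.
with open("layer.zip", "rb") as f:
    response = lambda_client.publish_layer_version(
        LayerName="my-dependencies",          # placeholder name
        Content={"ZipFile": f.read()},
        CompatibleRuntimes=["python3.7"],
    )

print(response["LayerVersionArn"])  # attach this ARN to your functions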
Looks like AWS-provided layers such as AWSLambda-Python37-SciPy1x have a different account ID and latest version number in the ARN in different regions. E.g.
us-east-1: arn:aws:lambda:us-east-1:668099181075:layer:AWSLambda-Python37-SciPy1x:22
us-east-2: arn:aws:lambda:us-east-2:259788987135:layer:AWSLambda-Python37-SciPy1x:20
From a script I need to add the layer that pertains to the Lambda's region, but I'm not finding an AWS CLI or boto3 command that will give me the ARN of a "published" layer (i.e. one that an AWS admin has granted all accounts access to); I can only find my own layers (e.g. with aws lambda list-layers).
The AWS Lambda console in the web browser shows the vendored layers, so I loaded the page, looked through the JS console, and saw that the following request is made:
https://console.aws.amazon.com/lambda/services/ajax?operation=listAwsVendedLayers&locale=en
So it looks like the REST API has this operation to get that, but I cannot find the equivalent anywhere in AWS CLI or boto3.
Any ideas (short of using curl with the proper request headers and auth info, which is a pain)? Perhaps there is a way to run a "raw" request in boto3 so I could give it this listAwsVendedLayers operation? I looked in the docs but could not find anything.
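For reference, this is the boto3 equivalent of the aws lambda list-layers call mentioned above; as the question notes, it only returns layers in your own account, not the AWS-vended ones the console shows (the region and runtime here are just examples):

```
import boto3

lambda_client = boto3.client("lambda", region_name="us-east-1")

# Lists only layers owned by this account, not AWS-published ones.
for layer in lambda_client.list_layers(CompatibleRuntime="python3.7")["Layers"]:
    print(layer["LayerName"], layer["LatestMatchingVersion"]["LayerVersionArn"])
```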
I have been trying to find a solution for this, but I need to ask you all. Do you know if there is a Windows desktop application out there which would put (real-time sync) objects from a local folder into a predefined AWS S3 bucket? This could work just one way: upload from local to S3.
Setting it up
Install the AWS CLI for Windows: https://aws.amazon.com/cli/
Through the AWS website/console, create an IAM user with a strict policy that allows access only to the required S3 bucket (a sketch of such a policy follows after these steps).
Run aws configure in PowerShell or cmd and set up the region, access key and secret key for the IAM user that you created.
Test that your setup is correct by running aws s3 ls in the command line and verify that you see a list of your account's S3 buckets.
If not, then you probably configured the IAM permissions incorrectly; you might need permission to list buckets (s3:ListAllMyBuckets) on all of S3 too.
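A rough sketch of the kind of strict policy the IAM-user step refers to (the bucket name is a placeholder; paste the printed JSON into the console policy editor). The s3:ListAllMyBuckets statement is only there so that aws s3 ls works for the test step:

```
import json

policy = {
    "Version": "2012-10-17",
    "Statement": [
        # Needed only so "aws s3 ls" can list bucket names.
        {"Effect": "Allow", "Action": "s3:ListAllMyBuckets", "Resource": "*"},
        # Sync needs to list the target bucket to compare contents.
        {"Effect": "Allow", "Action": "s3:ListBucket",
         "Resource": "arn:aws:s3:::mybucket"},
        # Object-level access limited to the one bucket.
        {"Effect": "Allow",
         "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
         "Resource": "arn:aws:s3:::mybucket/*"},
    ],
}

print(json.dumps(policy, indent=2))
```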
Sync examples
`aws s3 sync path/to/yourfolder s3://mybucket/`
`aws s3 sync path/to/yourfolder s3://mybucket/images/`
`aws s3 sync path/to/yourfolder s3://mybucket/images/ --delete` (the --delete flag also removes files from S3 that are no longer present on your local path)
Not sure what this has to do with Electron, but you could set up a trigger in your application to invoke these commands. For example, in atom.io or VS Code, you could bind this to saving a document with Ctrl+S.
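If you want the real-time behaviour without an editor binding, one option is a small folder watcher that shells out to the same sync command. A minimal sketch, assuming the third-party watchdog package (pip install watchdog) and placeholder paths:

```
import subprocess
import time

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

LOCAL = "path/to/yourfolder"   # placeholder
BUCKET = "s3://mybucket/"      # placeholder

class SyncHandler(FileSystemEventHandler):
    def on_any_event(self, event):
        # Re-run the one-way sync whenever anything changes locally.
        subprocess.run(["aws", "s3", "sync", LOCAL, BUCKET])

observer = Observer()
observer.schedule(SyncHandler(), LOCAL, recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    observer.stop()
observer.join()
```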
If you are building the application with Electron, then you should consider using the AWS JavaScript SDK instead of the AWS CLI, but that is a whole different story.
And lastly, back up your files somewhere else before trying potentially destructive commands such as sync (especially with --delete), until you get a feel for how they work.
Our company is using Ruby 2.1.3 with AWS SDK V1 for uploading files to S3. I need to stream files directly from a private external bucket to one of our own buckets (without actually downloading them locally). I can't find any good documentation on the subject.
The copy_from method provided by the SDK does not, I think, permit copying from a private external bucket to one of our buckets.
We have tried using open-uri to stream the download and stream the upload to S3, but the file was always downloaded fully first and then uploaded (is it supposed to be like that?).
Any help is welcome!
Thank you.
The V1 SDK doesn't allow you to transfer between buckets directly. You can do what open-uri does: download the file and then upload it to the new bucket.
If you want a solution that can still work in Ruby, I suggest using the AWS CLI. You can add a line like this to your code:
`aws s3 cp s3://frombucket/ s3://tobucket/ --recursive`
The backticks allow you to execute system commands in your Ruby script. Alternatively, you could upgrade to the V2 SDK and use the object.copy_to method to copy between buckets. Hope this helps!
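For illustration of what that server-side copy looks like (shown with boto3 in Python rather than Ruby, with placeholder bucket and key names): the copy happens entirely within S3, so nothing is downloaded to your machine, as long as your credentials have read access on the source and write access on the target.

```
import boto3

s3 = boto3.client("s3")

# Server-side copy: S3 duplicates the object bucket-to-bucket internally.
s3.copy_object(
    Bucket="tobucket",                                 # destination bucket
    Key="some/key",                                    # destination key
    CopySource={"Bucket": "frombucket", "Key": "some/key"},
)
```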
I would love to use s3distcp for copying data from S3 buckets to S3 buckets, but I need to use an external proprietary encryption mechanism to ensure the data is encrypted at rest (keeping the keys to myself so Amazon could not decrypt it).
I would love to do a git clone and create my own s3distcp (with hooks for external encryption/decryption libraries).
I googled and found a potential candidate here: https://github.com/libin/s3distcp
But it's not an official Amazon account (apparently), and it doesn't look like it's documented or maintained.
I built a tool that runs in Node.js to copy data from bucket to bucket.
https://github.com/Homefinder/bucketCloner
It uses the AWS JavaScript SDK and isn't very complicated. You could easily modify it for your purposes, assuming you still have this need of course.