Passbook save pem file in AWS - ruby

I would like to sign a passbook on my Ruby server hosted on AWS. What is the best way to store .pem or .p12 files in AWS, and how do I retrieve them to sign the passbook?
I'm using the passbook gem from https://github.com/frozon/passbook, but note that its example uses files from a local path:
Passbook.configure do |passbook|
  passbook.wwdc_cert = Rails.root.join('wwdc_cert.pem')
  passbook.p12_key = Rails.root.join('key.pem')
  passbook.p12_certificate = Rails.root.join('certificate.pem')
  passbook.p12_password = 'cert password'
end
In my case I want to read them from AWS.

Just use the URL of your files hosted on Amazon S3, like:
https://<bucket-name>.s3.amazonaws.com/<key>
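If the bucket or the objects are private, a plain URL won't be readable by the server; one option is to fetch the certificates to local disk at startup and point the gem at those paths. A minimal sketch of that fetch, written in Python with boto3 purely for illustration (bucket and key names are hypothetical; the aws-sdk-s3 Ruby gem offers an equivalent get_object/download_file call):

import boto3

# Hypothetical names -- adjust to wherever the certificates are stored.
BUCKET = "my-passbook-certs"
KEYS = ["wwdc_cert.pem", "key.pem", "certificate.pem"]

# Credentials are resolved from the environment or the EC2 instance role.
s3 = boto3.client("s3")
for key in KEYS:
    # Download each certificate to a local path that the passbook
    # gem's configuration can then point at.
    s3.download_file(BUCKET, key, key)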

Can't access the blob folder, but the files inside it can be downloaded

I have Azure Storage, where I am using containers to store blobs. I am trying to download a blob from this container, but whether I use the Python SDK or REST, I get the error "The specified blob does not exist." However, when I give the full path ending in the actual file (such as a .txt), instead of the folder, the download works.
For example:
the following URL gives the error "The specified blob does not exist.": https://mlflowsmodeltorage.blob.core.windows.net/mlflow-test/110/63e7b9f2482b45e29b8c2983fa9522ef/artifacts/models
but the URL https://mlflowsmodeltorage.blob.core.windows.net/mlflow-test/110/63e7b9f2482b45e29b8c2983fa9522ef/artifacts/models/conda.yaml downloads the file.
The same thing happens with the Python SDK, but I want to download the whole folder rather than the individual files inside it. How can I achieve that?
Below is the code I am using to access the blob with the Python SDK:
from azure.storage.blob import BlobServiceClient

STORAGEACCOUNTURL = "https://mlflowsmodeltorage.blob.core.windows.net"
STORAGEACCOUNTKEY = "xxxxxxxxxxxxxx"
CONTAINERNAME = "mlflow-test"
BLOBNAME = "110/63e7b9f2482b45e29b8c2983fa9522ef/artifacts/models/"

blob_service_client_instance = BlobServiceClient(
    account_url=STORAGEACCOUNTURL, credential=STORAGEACCOUNTKEY,
)
blob_client_instance = blob_service_client_instance.get_blob_client(
    CONTAINERNAME, BLOBNAME, snapshot=None)
blob_data = blob_client_instance.download_blob()
data = blob_data.readall()
print(data)
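Note that in Blob Storage a "folder" is only a naming prefix, not a blob of its own, which is why the .../models URL reports "The specified blob does not exist." One way to download the whole "folder" is therefore to list the blobs under the prefix and download them one by one. A minimal sketch against the same account and container as above (LOCAL_DIR is a hypothetical target directory):

import os
from azure.storage.blob import BlobServiceClient

STORAGEACCOUNTURL = "https://mlflowsmodeltorage.blob.core.windows.net"
STORAGEACCOUNTKEY = "xxxxxxxxxxxxxx"
CONTAINERNAME = "mlflow-test"
PREFIX = "110/63e7b9f2482b45e29b8c2983fa9522ef/artifacts/models/"
LOCAL_DIR = "models"  # hypothetical local target directory

service = BlobServiceClient(account_url=STORAGEACCOUNTURL, credential=STORAGEACCOUNTKEY)
container = service.get_container_client(CONTAINERNAME)

# Enumerate every blob whose name starts with the "folder" prefix...
for blob in container.list_blobs(name_starts_with=PREFIX):
    local_path = os.path.join(LOCAL_DIR, os.path.relpath(blob.name, PREFIX))
    os.makedirs(os.path.dirname(local_path) or ".", exist_ok=True)
    # ...and download each one individually.
    with open(local_path, "wb") as f:
        f.write(container.download_blob(blob.name).readall())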

Deploy .sh file to EC2 using Terraform

I am trying to deploy a *.sh file located on my localhost to EC2 using Terraform. Note that I am creating all of the infrastructure via Terraform, so to copy the file to the remote host I am using a Terraform provisioner. The question is: how can I find out the private key or password of the ubuntu user for deploying? Or maybe somebody knows a different solution. The goal is to run the .sh file on EC2. Thanks beforehand.
If you want to do it using a provisioner and you have the private key local to where Terraform is being executed, then SCSI-9's solution should work well.
However, if you can't ensure the private key is available then you could always do something like how Elastic Beanstalk deploys and use S3 as an intermediary.
Something like this:
resource "aws_s3_bucket_object" "script" {
bucket = module.s3_bucket.bucket_name
key = regex("([^/]+$)", var.script_file)[0]
source = var.script_file
etag = filemd5(var.script_file)
}
resource "aws_instance" "this" {
depends_on = [aws_s3_bucket_object.script]
user_data = templatefile("${path.module}/.scripts/userdata.sh" {
s3_bucket = module.s3_bucket.bucket_name
object_key = aws_s3_bucket_object.script.id
}
...
}
And then somewhere in your userdata script, you can fetch the object from S3:
aws s3 cp s3://${s3_bucket}/${object_key} /some/path
Of course, you will also have to ensure that the instance has permission to read from the S3 bucket, which you can do by attaching a role with an appropriate policy to the EC2 instance.
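If the AMI doesn't ship with the AWS CLI, the same fetch can be done from Python with boto3 instead. A small sketch, with the bucket and key hard-coded where the userdata template would normally interpolate them:

import boto3

# Hypothetical values; the userdata template would interpolate these
# as ${s3_bucket} and ${object_key}.
S3_BUCKET = "my-deploy-bucket"
OBJECT_KEY = "deploy.sh"

# Credentials come from the role attached to the EC2 instance.
s3 = boto3.client("s3")
s3.download_file(S3_BUCKET, OBJECT_KEY, "/some/path/deploy.sh")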

Download an entire public folder from Google-Drive using Python or wget/curl without authentication

I would like to download an entire public folder from Google Drive from a script (Python, wget, terminal, etc.).
The procedure must work without authentication, as it's a public folder accessible to anyone who has the link.
Link example: https://drive.google.com/drive/folders/1Gt-W8jMrADizXYGF5QZwJh_Gc8QpKflX
In case it's not possible to directly download the entire folder, it would be sufficient to be able to list its contents (files) without authentication, and then I could download each file separately. How can I obtain such a listing?
Note: I found many similar discussions, but they all assume either a single-file download or authentication, and I couldn't find a match for this specific ask, e.g.:
Python: How do download entire folder from Google Drive
download folder from google drive
Download entire Google Drive folder from the shared link using Google drive API
How to download specific Google Drive folder using Python?
Download Shared Google Drive Folder with Python
Python: download files from google drive using url
https://unix.stackexchange.com/questions/136371/how-to-download-a-folder-from-google-drive-using-terminal
https://github.com/vikynandha-zz/google-drive-backup/blob/master/drive.py
I ended up using the following code (tested, works well):
import urllib.request
from getfilelistpy import getfilelist
from os import path, makedirs, remove, rename

def download_googledrive_folder(remote_folder, local_dir, gdrive_api_key, debug_en):
    success = True
    if debug_en:
        print('[DEBUG] Downloading: %s --> %s' % (remote_folder, local_dir))
    else:
        try:
            resource = {
                "api_key": gdrive_api_key,
                "id": remote_folder.split('/')[-1].split('?')[0],
                "fields": "files(name,id)",
            }
            res = getfilelist.GetFileList(resource)
            print('Found #%d files' % res['totalNumberOfFiles'])
            destination = local_dir
            if not path.exists(destination):
                makedirs(destination)
            for file_dict in res['fileList'][0]['files']:
                print('Downloading %s' % file_dict['name'])
                if gdrive_api_key:
                    source = "https://www.googleapis.com/drive/v3/files/%s?alt=media&key=%s" % (file_dict['id'], gdrive_api_key)
                else:
                    source = "https://drive.google.com/uc?id=%s&export=download" % file_dict['id']  # only works for small files (<100MB)
                destination_file = path.join(destination, file_dict['name'])
                urllib.request.urlretrieve(source, destination_file)
        except Exception as err:
            print(err)
            success = False
    return success
I could not find a way to achieve my original goal, i.e. downloading a public folder from Google Drive without any credentials/keys, but the above code is a good compromise for me, as it only requires an API key and not full credentials.
Note that there are two options here, with or without providing the Google API key (gdrive_api_key) for the source URL.
From my experience, the option without the API key works well for small files (< ~100 MB), while the option with the API key appears to be more robust and works for any size.
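For reference, a hypothetical invocation (the folder link is the example one from the question; the API key and local directory are placeholders):

ok = download_googledrive_folder(
    remote_folder='https://drive.google.com/drive/folders/1Gt-W8jMrADizXYGF5QZwJh_Gc8QpKflX',
    local_dir='./gdrive_folder',      # placeholder target directory
    gdrive_api_key='YOUR_API_KEY',    # placeholder key
    debug_en=False,
)
print('Download %s' % ('succeeded' if ok else 'failed'))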

How to use StorageProvider.Download(...) to download private files?

When using "StorageProvider.Download(...)", it seems that I can only download files uploaded to the public storage.
Is there a way to download private uploaded files to my server local storage?
I´m using Azure external storage provider.
Thanks!
Example:
// In this case the file is downloaded to the local filesystem:
&LocalFile.Source = 'LocalFile.txt'
&Result = &StorageProvider.Download('AzurePublicFile.txt', &LocalFile, &Messages)
// In this case the file is not downloaded locally, it only loads a reference to the URI in the Azure blob container:
&LocalFile.Source = 'LocalFile.txt'
&Result = &StorageProvider.GetPrivate('AzurePrivateFile.txt', &LocalFile, 5, &Messages)
Try something like this:
StorageProvider.GetPrivate(...., &File)
&URL = &File.GetURI()
&HttpClient.Execute('GET', &URL)
&HttpClient.ToFile("C:\....")
Update: StorageProvider.DownloadPrivate() will be available in GeneXus 16 Upgrade 2.

wsadmin upload file from local machine to remote

I'm trying to automate the deployment process, and I want to upload some files to WAS using wsadmin (Jython). My question is whether it is possible to upload a file from my standalone wsadmin to a remote WAS server. And if so, is it possible to upload the file somewhere outside the application (e.g. /opt/IBM/WebSphere/AppServer/temp)? I don't want to upload it to a specific profile, but to the server root.
When I deploy an application, the war/ear file is copied to WAS, so is there some mechanism to upload a separate file?
Many thanks.
AntAgent allows you to upload any file, provided that the content of the file can fit in memory:
https://www.ibm.com/support/knowledgecenter/en/SSAW57_8.5.5/com.ibm.websphere.javadoc.doc/web/mbeanDocs/AntAgent.html
In wsadmin you'll need to use the invoke_jmx method of the AdminControl object:
from java.lang import String
import jarray

fileContent = 'hello!'

# Locate the AntAgent MBean on the deployment manager.
antAgent = AdminControl.makeObjectName(AdminControl.queryNames('WebSphere:*,type=AntAgent,process=dmgr'))

# putScript(name, bytes) writes the byte array to a file under the profile's temp directory.
contentBytes = String(fileContent).getBytes()
AdminControl.invoke_jmx(antAgent, 'putScript', [String('hello.txt'), contentBytes], jarray.array(['java.lang.String', '[B'], String))
Afterwards you'll find the 'hello.txt' file in the WAS profile's temp directory. You may use relative paths as well.
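To push a real file instead of an inline string, one variation (an untested sketch; /tmp/deploy.sh is a hypothetical local path) reads the file content first, keeping in mind that AntAgent needs the whole content in memory:

from java.lang import String
import jarray

# Hypothetical local file to upload; AntAgent needs its whole content in memory.
f = open('/tmp/deploy.sh', 'r')
fileContent = f.read()
f.close()

antAgent = AdminControl.makeObjectName(AdminControl.queryNames('WebSphere:*,type=AntAgent,process=dmgr'))
# Round-tripping through String assumes text content; binary files could be corrupted.
AdminControl.invoke_jmx(antAgent, 'putScript', [String('deploy.sh'), String(fileContent).getBytes()], jarray.array(['java.lang.String', '[B'], String))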
