Can't access the blob folder, but files inside it can be downloaded - azure-blob-storage

I have an Azure storage account where I am using containers to store blobs. I am trying to download a blob from this container, but whether I use the Python SDK or the REST API, I get the error "The specified blob does not exist." However, when I give the full path down to the final file (a .txt or whatever) instead of the folder, the download works.
For example:
the following URL gives the error "The specified blob does not exist.": https://mlflowsmodeltorage.blob.core.windows.net/mlflow-test/110/63e7b9f2482b45e29b8c2983fa9522ef/artifacts/models
but the URL https://mlflowsmodeltorage.blob.core.windows.net/mlflow-test/110/63e7b9f2482b45e29b8c2983fa9522ef/artifacts/models/conda.yaml downloads the file.
The same thing happens with the Python SDK. But I want to download the whole folder rather than the individual files inside it.
How can I achieve this?
Below is the code I am using to access the blob with the Python SDK:
from azure.storage.blob import BlobServiceClient

STORAGEACCOUNTURL = "https://mlflowsmodeltorage.blob.core.windows.net"
STORAGEACCOUNTKEY = "xxxxxxxxxxxxxx"
CONTAINERNAME = "mlflow-test"
BLOBNAME = "110/63e7b9f2482b45e29b8c2983fa9522ef/artifacts/models/"

blob_service_client_instance = BlobServiceClient(
    account_url=STORAGEACCOUNTURL, credential=STORAGEACCOUNTKEY,
)
blob_client_instance = blob_service_client_instance.get_blob_client(
    CONTAINERNAME, BLOBNAME, snapshot=None)
blob_data = blob_client_instance.download_blob()
data = blob_data.readall()
print(data)
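For context, Azure Blob Storage has no real folders: "artifacts/models/" is only a prefix shared by the blob names, which is why requesting that path as a blob fails. A minimal sketch, assuming the same account details as above, of one way to download everything under that prefix with azure-storage-blob by listing the blobs and downloading each one (the local directory name "models" is just an example):

import os
from azure.storage.blob import BlobServiceClient

STORAGEACCOUNTURL = "https://mlflowsmodeltorage.blob.core.windows.net"
STORAGEACCOUNTKEY = "xxxxxxxxxxxxxx"
CONTAINERNAME = "mlflow-test"
PREFIX = "110/63e7b9f2482b45e29b8c2983fa9522ef/artifacts/models/"
LOCAL_DIR = "models"  # example local target directory

service = BlobServiceClient(account_url=STORAGEACCOUNTURL, credential=STORAGEACCOUNTKEY)
container = service.get_container_client(CONTAINERNAME)

# list_blobs(name_starts_with=...) enumerates every blob whose name begins with the prefix
for blob in container.list_blobs(name_starts_with=PREFIX):
    relative_path = blob.name[len(PREFIX):]  # path of the blob below the "folder"
    if not relative_path:
        continue  # skip a zero-byte directory-marker blob, if one exists
    local_path = os.path.join(LOCAL_DIR, relative_path)
    os.makedirs(os.path.dirname(local_path) or ".", exist_ok=True)
    with open(local_path, "wb") as f:
        f.write(container.download_blob(blob.name).readall())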

Related

Download an entire public folder from Google-Drive using Python or wget/curl without authentication

I would like to download an entire public folder from Google Drive from a script (Python, wget, terminal, etc.).
The procedure should work without authentication, as it is a public folder accessible to anyone who has the link.
Link example: https://drive.google.com/drive/folders/1Gt-W8jMrADizXYGF5QZwJh_Gc8QpKflX
If it's not possible to directly download the entire folder, it would be sufficient to just be able to list its contents (files) without authentication, and then I can download each file separately. How can I obtain such a listing?
Note:
I found many similar discussions, but all of them assumed either a single-file download or authentication, and I couldn't find a match for this specific ask, e.g.:
Python: How do download entire folder from Google Drive
download folder from google drive
Download entire Google Drive folder from the shared link using Google drive API
How to download specific Google Drive folder using Python?
Download Shared Google Drive Folder with Python
Python: download files from google drive using url
https://unix.stackexchange.com/questions/136371/how-to-download-a-folder-from-google-drive-using-terminal
https://github.com/vikynandha-zz/google-drive-backup/blob/master/drive.py
I've ended up applying the following code (tested, works well):
import urllib.request
from getfilelistpy import getfilelist
from os import path, makedirs

def download_googledrive_folder(remote_folder, local_dir, gdrive_api_key, debug_en):
    success = True
    if debug_en:
        print('[DEBUG] Downloading: %s --> %s' % (remote_folder, local_dir))
    else:
        try:
            resource = {
                "api_key": gdrive_api_key,
                "id": remote_folder.split('/')[-1].split('?')[0],
                "fields": "files(name,id)",
            }
            res = getfilelist.GetFileList(resource)
            print('Found #%d files' % res['totalNumberOfFiles'])
            destination = local_dir
            if not path.exists(destination):
                makedirs(destination)
            for file_dict in res['fileList'][0]['files']:
                print('Downloading %s' % file_dict['name'])
                if gdrive_api_key:
                    source = "https://www.googleapis.com/drive/v3/files/%s?alt=media&key=%s" % (file_dict['id'], gdrive_api_key)
                else:
                    source = "https://drive.google.com/uc?id=%s&export=download" % file_dict['id']  # only works for small files (<100MB)
                destination_file = path.join(destination, file_dict['name'])
                urllib.request.urlretrieve(source, destination_file)
        except Exception as err:
            print(err)
            success = False
    return success
I could not find a way to achieve my original goal, i.e. downloading a public folder from Google Drive without any credentials/keys, but the above code is a good compromise for me, as it only requires an API key rather than full credentials.
Note that there are two options here, with or without providing the Google API key (gdrive_api_key) in the source URL.
From my experience, the option without the API key works well for small files (< ~100MB), while the option with the API key appears to be more robust and works for any size.
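A minimal usage sketch of the function above, using the example folder link from the question (the API key value and local directory are placeholders, not values from the original post):

# placeholders for illustration only
folder_url = 'https://drive.google.com/drive/folders/1Gt-W8jMrADizXYGF5QZwJh_Gc8QpKflX'
api_key = 'YOUR_GDRIVE_API_KEY'
ok = download_googledrive_folder(folder_url, './downloaded_folder', api_key, debug_en=False)
print('Download %s' % ('succeeded' if ok else 'failed'))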

How to use StorageProvider.Download(...) to download private files?

When using "StorageProvider.Download(...)", it seems that I can only download files uploaded to the public storage.
Is there a way to download private uploaded files to my server local storage?
I´m using Azure external storage provider.
Thanks!
Example:
// In this case the file is downloaded to the local filesystem:
&LocalFile.Source = 'LocalFile.txt'
&Result = &StorageProvider.Download('AzurePublicFile.txt', &LocalFile, &Messages)
// In this case the file is not downloaded locally, it only loads a reference to the URI in the Azure blob container:
&LocalFile.Source = 'LocalFile.txt'
&Result = &StorageProvider.GetPrivate('AzurePrivateFile.txt', &LocalFile, 5, &Messages)
Try something like this:
&StorageProvider.GetPrivate(...., &File)
&URL = &File.GetURI()
&HttpClient.Execute('GET', &URL)
&HttpClient.ToFile("C:\....")
Update: StorageProvider.DownloadPrivate() will be available in GeneXus 16 Upgrade 2.

Passbook save pem file in AWS

I would like to sign a Passbook pass on my Ruby server hosted on AWS. What is the best way to store .pem or .p12 files in AWS, and to retrieve them to sign the pass?
I'm using the passbook gem from https://github.com/frozon/passbook, but note that in its example it uses files from a local path:
Passbook.configure do |passbook|
  passbook.wwdc_cert = Rails.root.join('wwdc_cert.pem')
  passbook.p12_key = Rails.root.join('key.pem')
  passbook.p12_certificate = Rails.root.join('certificate.pem')
  passbook.p12_password = 'cert password'
end
In my case I want to read them from AWS.
Just use the URL of your files hosted on Amazon S3, like:
https://<bucket-name>.s3.amazonaws.com/<key>

File Write - Unauthorized Access Exception

I'm trying to save a file locally from an app running in the iOS 8 Simulator, and I keep getting access denied exceptions.
In previous apps I've used the following code to get a valid file path:
Environment.GetFolderPath(Environment.SpecialFolder.Personal)
or
Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments)
But I've read that with iOS 8 this now has to be written as:
NSFileManager.DefaultManager.GetUrls(NSSearchPathDirectory.DocumentDirectory,
NSSearchPathDomain.User)[0]
So I'm using the following code to generate a file path for a .txt file, and I receive an access denied exception when trying to save with it:
public void SaveMyFile(string content)
{
    NSUrl[] urls;
    string filePath;
    //
    urls = NSFileManager.DefaultManager.GetUrls(NSSearchPathDirectory.DocumentDirectory,
        NSSearchPathDomain.User);
    filePath = Path.Combine(urls[0].Path, "MyApp", "myFile.txt");
    File.WriteAllText(filePath, content);
}
So the file path that it gives me and also denies access to is /Users/Idox/Library/Developer/CoreSimulator/Devices/92498E38-7D50-4081-8A64-83061DC00A86/data/Containers/Data/Application/C35B3E98-C9E3-4ABA-AA7F-CD8419FA0EA5/Documents/MyApp/myFile.txt.
I'm wondering if there's some setting that needs to be toggled to give the app write access to this directory or if the directory itself is invalid.
I've also done a call to Directory.Exists(string path) to check if the directory is there, which it is.
You're missing the Path property on urls[0]; it should be:
filePath = Path.Combine(urls[0].Path, "MyApp", "myFile.txt");
This was fixed in Xamarin.iOS 8.4, so if you're using a recent version of Xamarin you can use Environment.GetFolderPath without problems (which is useful if you want to share code across platforms).

wsadmin upload file from local machine to remote

I'm trying to automate the deployment process and I want to upload some files to WAS using wsadmin (Jython). My question is whether it is possible to upload a file from my standalone wsadmin to a remote WAS server. And if so, is it possible to upload the file somewhere outside the application (e.g. /opt/IBM/WebSphere/AppServer/temp)? I don't want to upload it to a specific profile, but to the server root.
When I deploy an application, the war/ear file is copied to WAS, so is there some mechanism to upload a separate file?
many thanks
AntAgent allows you to upload any file, provided that the content of the file can fit in memory:
https://www.ibm.com/support/knowledgecenter/en/SSAW57_8.5.5/com.ibm.websphere.javadoc.doc/web/mbeanDocs/AntAgent.html
In wsadmin you'll need to use the invoke_jmx method of the AdminControl object.
from java.lang import String
import jarray

# payload to upload (must fit in memory)
fileContent = 'hello!'

# locate the AntAgent MBean on the deployment manager
antAgent = AdminControl.makeObjectName(AdminControl.queryNames('WebSphere:*,type=AntAgent,process=dmgr'))

# convert the payload to a Java byte array and upload it as 'hello.txt'
str = String(fileContent)
bytes = str.getBytes()
AdminControl.invoke_jmx(antAgent, 'putScript', [String('hello.txt'), bytes], jarray.array(['java.lang.String', '[B'], String))
Afterwards you'll find 'hello.txt' file in WAS profile's temp directory. You may use relative paths as well.
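If the payload should come from a local file rather than an inline string, here is a minimal variation of the snippet above (the file name deploy.properties is purely an example; putScript takes the content as a byte array, so the whole file must fit in memory, and this simple form suits small text files):

from java.lang import String
import jarray

# example only: read a small local text file and upload it under the same name
f = open('deploy.properties', 'rb')
fileContent = f.read()
f.close()

antAgent = AdminControl.makeObjectName(AdminControl.queryNames('WebSphere:*,type=AntAgent,process=dmgr'))
AdminControl.invoke_jmx(antAgent, 'putScript',
    [String('deploy.properties'), String(fileContent).getBytes()],
    jarray.array(['java.lang.String', '[B'], String))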