storage blob upload command failed after setting AZURE_STORAGE_CONNECTION_STRING - macOS

I'm trying to upload a file to my Azure storage. I did
$ set AZURE_STORAGE_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=**;AccountKey=**
but when I did
$ azure storage blob upload PATHFILE mycontainer data/book_270.pdf
then I got the following error:
info: Executing command storage blob upload
error: Please set the storage account parameters or one of the following two environment variables to use the storage command.
AZURE_STORAGE_CONNECTION_STRING
AZURE_STORAGE_ACCOUNT and AZURE_STORAGE_ACCESS_KEY
info: Error information has been recorded to /Users/uqtngu83/.azure/azure.err
error: storage blob upload command failed
But I already set AZURE_STORAGE_CONNECTION_STRING! Please help

As suggested in the comments: set is a Windows cmd command, and in the macOS shell it does not export an environment variable to child processes, so you are supposed to run the following in your Mac terminal instead (replace DefaultBlaBlaBla with your actual Azure Storage connection string):
export AZURE_STORAGE_CONNECTION_STRING="DefaultBlaBlaBla"
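For example, the full sequence from the question would then look like this (a sketch; the masked account name and key are whatever your real connection string contains):
# export makes the variable visible to the azure CLI process (note: no spaces around '=')
export AZURE_STORAGE_CONNECTION_STRING="DefaultEndpointsProtocol=https;AccountName=**;AccountKey=**"
# run the upload in the same shell session
azure storage blob upload PATHFILE mycontainer data/book_270.pdf
Note that export only applies to the current shell session; to make it permanent you can add the export line to your ~/.bash_profile.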

mc: <ERROR> Unable to validate source - minio client

I tried to download a file from my MinIO server using the following command, but it gave me an error.
Command:
mc cp host/host-db/2022-11-13-57a592e7-d979-40f6-b8e8-0d618964ee7e.gz .
Error:
mc: <ERROR> Unable to validate source
Notes:
the file is not empty; it is 103 KiB
mc version: RELEASE.2022-10-20T23-26-33Z
the command mc ls host works fine, which means the client can connect to the MinIO server
I'm using the MinIO client on Windows: Runtime: go1.19.2 windows/amd64
the host-db bucket exists and the internet connection is OK
What is the problem? How can I fix it?
Updates:
I cannot download anything from the MinIO server using the MinIO client (mc).
The access permission is the following:
Access permission for 'host/host-db' is 'private'
Uploading files from local to S3 works, but downloading files from S3 does not.
For example, downloading from the public play server works fine:
➜ mc cp play/tower/test-ilm.txt .
...ower/test-ilm.txt: 155 B / 155 B ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 125 B/s 1s
Please check whether the alias has been added properly: run mc alias list host and verify that the credentials and the API endpoint are correct.
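For example, a quick check of the alias configuration (host is the alias from the question; the endpoint and credentials below are placeholders):
# show the endpoint and access key configured for the alias
mc alias list host
# if anything is wrong, re-add the alias with the correct values
mc alias set host https://minio.example.com YOUR_ACCESS_KEY YOUR_SECRET_KEY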

How do I upload a file to Azurite from terminal?

I'm using Azurite and wish to create a container, upload a blob, etc. from the bash terminal!
I've tried using the Azure CLI like this:
az storage container create --account-name devstoreaccount1 --account-key Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw== --name mycontainer
But of course it doesn't work and complains of an authentication failure! By the way, the correct account key and name are used in that example.
I believe it's not possible to talk to Azurite using the Azure CLI.
All I want to do is create a container and upload a file to it from the terminal.
Does anybody know if this is possible? Or will I have to use a Java client (for example) to do the job?
Thanks
According to my test, when we use the account name and account key with the Azure CLI to create a blob container, the CLI uses the HTTPS protocol to connect to Azurite. But by default, Azurite only supports the HTTP protocol. For more details, please refer to here.
So I suggest you use a connection string to connect to Azurite with the Azure CLI; the connection string tells the Azure CLI to use the HTTP protocol.
For example
Create container
az storage container create -n test --connection-string "DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;"
Upload file
az storage blob upload -f D:\test.csv -c test -n test.csv --connection-string "DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;"
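To avoid repeating the long connection string on every command, you can also export it once; the az storage commands read the AZURE_STORAGE_CONNECTION_STRING environment variable (a sketch using the same well-known Azurite development credentials as above):
# export the connection string once for the current shell session
export AZURE_STORAGE_CONNECTION_STRING="DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;"
# subsequent commands no longer need --connection-string
az storage container create -n test
az storage blob upload -f test.csv -c test -n test.csv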

Creation of authentication connection is failing

I am following the Virtual Assistant getting started sample:
Virtual assistant
I am stuck on the step "Skill Authentication".
I tried to use the following command with all the arguments, and generated a bot secret for the --secret argument.
msbot connect generic --name "Authentication" --keys "{\"YOUR_AUTH_CONNECTION_NAME\":\"Azure Active Directory v2\"}" --bot YOURBOTFILE.bot --secret "YOUR_BOT_SECRET" --url "portal.azure.net"
I still get the following error:
Error: You are attempting to perform an operation which needs access to the secret and --secret is missing
Can someone tell me what I am missing?

Issue creating/accessing hive external table with s3 location from spark thrift service

I have configured the s3 keys (access key and secret key) in a jceks file using the hadoop credential API. The commands used for this are below:
hadoop credential create fs.s3a.access.key -provider jceks://hdfs@nn_hostname/tmp/s3creds_test.jceks
hadoop credential create fs.s3a.secret.key -provider jceks://hdfs@nn_hostname/tmp/s3creds_test.jceks
Then, I am opening a connection to Spark Thrift Server using beeline and passing the jceks file path in the connection string as below:
beeline -u "jdbc:hive2://hostname:10001/;principal=hive/_HOST@?hadoop.security.credential.provider.path=jceks://hdfs@nn_hostname/tmp/s3creds_test.jceks;"
Now, when I try to create an external table with the location in s3, it fails with the below exception:
CREATE EXTERNAL TABLE IF NOT EXISTS test_table_on_s3 (col1 String, col2 String) row format delimited fields terminated by ',' LOCATION 's3a://bucket_name/kalmesh/';
Exception: Error: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: java.nio.file.AccessDeniedException s3a://bucket_name/kalmesh: getFileStatus on s3a://bucket_name/kalmesh: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: request_id), S3 Extended Request ID: extended_request_id=) (state=,code=0)
I don't think jceks support for the fs.s3a secrets went in until Hadoop 2.8; it's hard to tell from the source. If that is the case and you are using Hadoop 2.7, then the secret isn't going to be picked up. I'm afraid you will have to put it in the config.
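If you do have to put the keys in the config, one option (a sketch; start-thriftserver.sh is Spark's bundled launcher, and the key values are placeholders) is to pass the s3a credentials as Hadoop configuration when starting the Thrift Server:
# spark.hadoop.* properties are forwarded into the Hadoop configuration
start-thriftserver.sh \
  --conf spark.hadoop.fs.s3a.access.key=YOUR_ACCESS_KEY \
  --conf spark.hadoop.fs.s3a.secret.key=YOUR_SECRET_KEY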
I had a similar situation, just with Drill instead of Hive. But like in your case:
using Hadoop 2.9 jars (the first version to support AWS KMS)
writing to s3a://
encrypting with SSE-KMS
... and got AmazonS3Exception: Access Denied.
In my case (perhaps in yours as well) the exception description was a bit ambiguous. The reported AmazonS3Exception: Access Denied did not originate from S3 but from KMS! Access was denied to the key I used for encryption: the user making the API calls was not on the key's user list. Once I added that user to the key's user list, writing started to work and I could create encrypted tables on s3a://...
For me the following s3 permissions were required:
s3:ListBucket
s3:GetObject
s3:PutObject
I was receiving the same error and was missing s3:ListBucket.
As for KMS permissions (if applicable):
kms:Decrypt
kms:Encrypt
kms:GenerateDataKey
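Put together, a minimal sketch of an inline IAM policy granting the S3 permissions above (the user name, policy name, and bucket_name are placeholders; the KMS actions, if applicable, are typically granted via the key's policy or users list separately):
# attach an inline policy with the listed S3 actions to the querying user
aws iam put-user-policy --user-name hive-s3-user --policy-name s3-table-access --policy-document '{
  "Version": "2012-10-17",
  "Statement": [
    {"Effect": "Allow", "Action": ["s3:ListBucket"], "Resource": "arn:aws:s3:::bucket_name"},
    {"Effect": "Allow", "Action": ["s3:GetObject", "s3:PutObject"], "Resource": "arn:aws:s3:::bucket_name/*"}
  ]
}'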

Azure login using a .publishsettings file

I'm trying to assign a reserved IP to a VM using the CLI tools.
After running:
pawel@LAMP-Test:~$ azure network nic set LAMP-Test FirstReservedIP
I received the following error:
info: Executing command network nic set
error: The current cmdlet requires you to log in using Azure Active Directory account, not from a .publishsettings file. Please run 'azure login' or use 'azure account set' to select a correct subscription.
info: Error information has been recorded to /home/pawel/.azure/azure.err
error: network nic set command failed
Is there a way to use the .publishsettings file only to achieve this task?
No, at least not when you are in ARM mode. Using the .publishsettings file to authenticate from the CLI tools is only supported in ASM mode.
More information available here.
You can still achieve a non-interactive login using the CLI, but it will require that you authenticate to Azure AD using a Work/School account (aka an organizational account). So, create an admin user (or service principal) in your Azure AD if you don't already have one. Then, add the azure login command to the top of your CLI script. For example...
azure login --username johndoe@contoso.onmicrosoft.com --password passw0rD!
azure network nic set LAMP-Test FirstReservedIP
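If you'd rather not put user credentials in the script, the xplat CLI also supports logging in with a service principal (a sketch; the app ID, password, and tenant values are placeholders):
# non-interactive login with an Azure AD service principal
azure login --service-principal --username <appId> --password <password> --tenant <tenantDomainOrId>
azure network nic set LAMP-Test FirstReservedIP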
