azure cli on osx is failing to authenticate - macos

I have to download some files from azure to local, using a Mac.
I have been given this Windows command line:
AzCopy /Source:https://XXX.blob.core.windows.net/YYY /SourceKey:TQSxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxpbA== /Dest:C:\myfolder /Pattern:c /S
I have downloaded and installed azcopy, but it has a radically different syntax, and although I've been trying for quite some time, I haven't been able to make it work.
What's the correct syntax, given this one?
Looking at some documentation, I've tried:
azcopy cp "https://XXX.blob.core.windows.net/YYY/TQSxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxpbA==" "azcopy_dest" --recursive
but it doesn't work:
failed to perform copy command due to error: cannot start job due to error: cannot list blobs for download. Failed with error ->
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/azblob.NewResponseError,
/go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/azblob/zz_generated_response_error.go:28
===== RESPONSE ERROR (ServiceCode=ResourceNotFound) ===== Description=The specified resource does not exist.

From your description it seems you are using AzCopy 10, which means you do not need to specify the key. You either need to generate a SAS token or log in before using azcopy.
Create a SAS token:
azcopy cp "https://XXX.blob.core.windows.net/YYY?[SAS]" "/path/to/dir" --recursive=true
Or log in first:
azcopy login --tenant-id "your-tenant-id"
azcopy cp "https://XXX.blob.core.windows.net/YYY" "/path/to/dir" --recursive=true
I used Linux (I don't have a Mac), but it should be the same across all platforms.
Hope this helps.

Related

AWS Beanstalk Laravel post deploy hooks no such file or directory

I'm trying to deploy a Laravel app to AWS Elastic Beanstalk; the OS is Amazon Linux 2 AMI.
I've set up the following files:
.ebextensions/01-deploy-script-permission.config
It contains the code below:
container_commands:
  01-storage-link:
    command: 'sudo chmod +x .platform/hooks/postdeploy/post-deploy.sh'
And
.platform/hooks/postdeploy/01-post-deploy.sh
It contains the code below:
php artisan optimize:clear
Upon deploying, it fails with the following entry in the eb-engine.log file:
[ERROR] An error occurred during execution of command [app-deploy] - [RunAppDeployPostDeployHooks]. Stop running the command. Error: Command .platform/hooks/postdeploy/post-deploy.sh failed with error fork/exec .platform/hooks/postdeploy/post-deploy.sh: no such file or directory
This answer is for users who are deploying their files to Elastic Beanstalk from Windows.
I found this information after spending six precious hours; it is probably not documented anywhere in the official documentation.
As per this link "https://forums.aws.amazon.com/thread.jspa?threadID=321653"
P.S.: most importantly, the file must be saved with LF line separators; CRLF causes the "no such file or directory" error.
So I used Visual Studio Code to convert CRLF to LF for the files in .platform/hooks/postdeploy.
At the bottom right of the screen in VS Code there is a little button that says “LF” or “CRLF”: click that button and change it to your preference.
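If you prefer to do the conversion from a shell instead of VS Code, stripping the carriage returns with sed should have the same effect — a sketch using the hook path from the question (on macOS/BSD sed, use sed -i '' instead):
sed -i 's/\r$//' .platform/hooks/postdeploy/post-deploy.sh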
I don't know for sure, but I think you are running the command before the files are even created, hence the error.
A while ago I faced the same kind of problem: I had written migration commands in .ebextensions, and they failed because my env file wasn't created yet, so no DB connection could be made. Hope this gives you a direction.
By the way, I resolved the problem by creating the env first and then pushing these commands through the pipeline.
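If you do want to keep such commands in .ebextensions, Elastic Beanstalk's test key lets a command be skipped until its prerequisite exists — a sketch of my migration case under that assumption, not the exact config from this question:
container_commands:
  # run the migration only once the .env file exists (adjust to your setup)
  01-migrate:
    test: test -f .env
    command: php artisan migrate --force
    leader_only: true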

aws s3 cli not working in window task scheduler

When I run the following AWS CLI command in a console, it works correctly.
I have the AWS access key and secret configured.
aws s3 sync "C:\uploadfolder" s3://uploadfolder
However, when I run it inside Windows Task Scheduler on Windows 10 as well as Windows Server 2012, I get the following error:
cannot find the file specified 0x80070002
It does not seem to be a corrupted profile, because it fails on both Windows versions and other commands run as expected.
Is there any step that I missed, or is any special configuration needed to run the AWS CLI from Windows Task Scheduler?
Your cli command is attempting to sync a FILE called "uploadfolder". You need to change to the directory first, then run the command. Your command should instead be:
cd C:\uploadfolder
aws s3 sync . s3://uploadfolder/
This will recursively copy all files in your local directory that are not in your s3 bucket. If you would also like the sync command to delete files that are no longer in the local directory, you also need to add the --delete flag.
aws s3 sync . s3://uploadfolder/ --delete
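If the scheduled task still fails after that, it often helps to wrap the command in a small batch file that uses the full path to aws.exe, since Task Scheduler may not run with your PATH or profile. The install path below is an assumption (check yours with where aws):
@echo off
rem Wrapper for Task Scheduler: change to the folder, then sync using an absolute path to aws.exe
cd /d C:\uploadfolder
"C:\Program Files\Amazon\AWSCLIV2\aws.exe" s3 sync . s3://uploadfolder/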

Heroku CLI authentication issue

After a fresh install of Heroku on Windows 7, I can't seem to authenticate from the command-line.
Running the command: heroku login prompts me to enter my credentials. After doing so, I received an error:
heroku: Enter your login credentials
Email: my_email
Password: ************
Error: ENOENT: no such file or directory, open 'z:/_netrc'
I am using PowerShell, and when I run the command cat z:/_netrc, I get this error:
cat : Cannot find drive. A drive with the name 'z' does not exist.
Z: is a network drive, and it is accessible from the file explorer.
I already have a .netrc file in my %HOME% path, but it does not contain the heroku login credentials.
Looking at the official documentation and CLI help, I couldn't find anything useful to fix this. How can I login to my heroku account?
> heroku version
heroku/7.16.6 win32-x64 node-v10.11.0
So, the issue arises because the _netrc file that Heroku needs to complete the login cannot be found on your local computer. I decided to create the file in the following location on my Windows 10 computer:
cmd>set HomeDrive=C:/Users/your Windows username/AppData/Local/heroku
In my case,
cmd>set HomeDrive=C:/Users/CrazyMoby/AppData/Local/heroku
Finally, I ran heroku login.
The above steps resolved the heroku login issue in my case.
Use setx HOME <netrc_default_location>
where <netrc_default_location> can be:
<%NETRC%>\_netrc
<%HOME%>\_netrc
<%HOMEDRIVE%%HOMEPATH%>\_netrc
<%USERPROFILE%>\_netrc
Some clarification can be found here and here.
It's probably your user profile, <%USERPROFILE%>.
But if you don't need it, just remove the _netrc file, reboot, and log in again.
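As a concrete example of the setx approach (assuming your user profile is where the _netrc should live), from cmd:
setx HOME "%USERPROFILE%"
rem open a new terminal so the new variable is picked up, then:
heroku login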
Run the following command in PowerShell and the problem should be solved.
$Env:HOMEDRIVE = "C:"
If you need more information, check out the docs on windows environment variables.
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_environment_variables?view=powershell-7
This worked for me. Run the following in PowerShell, changing "paulob" to your own username, since that is the folder where the _netrc file lives:
$Env:HOMEDRIVE = "C:\Users\paulob\"
Try running it from Git Bash instead of PowerShell if you can; I had the same problem and that worked in my case.

How to ensure AWS S3 cli file sync fully works without missing files

I'm working on a project that takes database backups from MongoDB on S3 and puts them onto a staging box for use for that day. I noticed this output during a manual run today. Normally it shows a good copy of each file, but today I got a connection reset error, and one of the files, *.15, was not copied over after the operation had completed.
Here is the AWS CLI command that I'm using:
aws s3 cp ${S3_PATH} ${BACKUP_PRODUCTION_PATH}/ --recursive
And here is an excerpt of the output I got back:
download: s3://myorg-mongo-backups-raw/production/daily/2018-09-10/080001/data/s-ds063192-a1/myorg-production/myorg-production.10 to ../../data/db/myorg-production/myorg-production.10
download: s3://myorg-mongo-backups-raw/production/daily/2018-09-10/080001/data/s-ds063192-a1/myorg-production/myorg-production.11 to ../../data/db/myorg-production/myorg-production.11
download: s3://myorg-mongo-backups-raw/production/daily/2018-09-10/080001/data/s-ds063192-a1/myorg-production/myorg-production.12 to ../../data/db/myorg-production/myorg-production.12
download: s3://myorg-mongo-backups-raw/production/daily/2018-09-10/080001/data/s-ds063192-a1/myorg-production/myorg-production.13 to ../../data/db/myorg-production/myorg-production.13
download: s3://myorg-mongo-backups-raw/production/daily/2018-09-10/080001/data/s-ds063192-a1/myorg-production/myorg-production.14 to ../../data/db/myorg-production/myorg-production.14
download failed: s3://myorg-mongo-backups-raw/production/daily/2018-09-10/080001/data/s-ds063192-a1/myorg-production/myorg-production.15 to ../../data/db/myorg-production/myorg-production.15 ("Connection broken: error(104, 'Connection reset by peer')", error(104, 'Connection reset by peer'))
download: s3://myorg-mongo-backups-raw/production/daily/2018-09-10/080001/data/s-ds063192-a1/myorg-production/myorg-production.16 to ../../data/db/myorg-production/myorg-production.16
How can I ensure that the data from the given S3 path was fully copied over to the target path without any connection issues, missing files, etc? Is the sync command for the AWS tool a better option? Or should I try something else?
Thanks!
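For reference, the sync form of the same transfer mentioned in the question would look like the line below; sync compares file size and timestamps against the destination, so re-running it after a failure only fetches what is missing (same placeholder variables as the cp command):
aws s3 sync ${S3_PATH} ${BACKUP_PRODUCTION_PATH}/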

Microsoft Azure Symbolic Link Storage folder Laravel

I am using Laravel 5.3 on a Microsoft Azure web server...
I am having an issue creating a symbolic link between my /public/storage/ and /storage/app/public/ folders
I do not have the required permissions to create a symbolic link, and I have also tried to use the following command:
php artisan storage:link
This command automatically creates a symbolic link; however, it gives the error:
Access is denied.
I have tried to also run the command:
mklink /d "D:\home\site\wwwroot\storage\app\public\" "D:\home\site\wwwroot\public\storage\"
And the error is something along the lines of:
You do not have sufficient privileges to perform this operation
How can I run the command prompt on Microsoft Azure in elevated administrator mode so that I can actually create a symbolic link?
Please help.
Thanks!
Creating symlinks is something that is blocked by the Azure Web App sandbox. You can read more about it here.
If you insist on creating symlinks, you can try to leverage Azure VMs instead.
