I tried to download a file from my MinIO server using the following command, but it gave me an error:
command:
mc cp host/host-db/2022-11-13-57a592e7-d979-40f6-b8e8-0d618964ee7e.gz .
error:
mc: <ERROR> Unable to validate source
notes:
The file is not empty; it is 103 KiB.
mc version: RELEASE.2022-10-20T23-26-33Z
The command mc ls host works fine, which means the client can connect to the MinIO server.
I'm using the MinIO client on Windows (Runtime: go1.19.2 windows/amd64).
The host-db bucket exists and the internet connection is OK.
What is the problem, and how can I fix it?
updates:
I cannot download anything from the MinIO server using the MinIO client (mc).
access permission is the following:
Access permission for 'host/host-db' is 'private'
Uploading files from local to S3 works, but downloading files from S3 does not.
➜ mc cp play/tower/test-ilm.txt .
...ower/test-ilm.txt: 155 B / 155 B ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 125 B/s 1s
Please check whether the alias has been added properly. Run mc alias list host and verify that the credentials and the API endpoint are correct.
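A minimal check sequence for this, assuming the alias is named host and using the object key from the question (guarded so it is a no-op on machines without mc installed):

```shell
# Verify the alias, confirm the bucket is listable, then retry the copy
# with debug output to see the failing S3 request.
if command -v mc >/dev/null 2>&1; then
  mc alias list host      # confirm the endpoint URL and access key
  mc ls host/host-db      # confirm the bucket is reachable and listable
  mc cp --debug host/host-db/2022-11-13-57a592e7-d979-40f6-b8e8-0d618964ee7e.gz .
  status="checked"
else
  status="mc not installed; skipping"
fi
echo "$status"
```

The --debug flag prints the raw request/response exchange, which usually reveals whether the "Unable to validate source" error comes from credentials, the endpoint, or the object itself.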
Related
I am trying to create a MinIO server and client so I can run some tests, and I am getting the error below when I try to run mc:
C:\mc.exe alias set myminio http://169.254.44.236:9000 admin password
mc.exe: <ERROR> Unable to initialize new alias from the provided credentials. Get
"http://169.254.44.236:9000/probe-bucket-sign-6oubmdhiezos/?location=": dial tcp
169.254.44.236:9000: connectex: A socket operation was attempted on an unreachable network.
I just copied the command given to me when I started the MinIO server. I am using a Windows 10 PC remotely.
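Worth noting: 169.254.x.x is a link-local (APIPA) address, which usually means the server machine never received a real DHCP lease, so it is typically unreachable from other hosts. A quick reachability probe against MinIO's health endpoint (address taken from the error above) can confirm this:

```shell
# /minio/health/live is MinIO's liveness endpoint; a short timeout keeps
# this probe fast even when the address is a black hole.
endpoint="http://169.254.44.236:9000/minio/health/live"
if curl -sS --max-time 3 "$endpoint" >/dev/null 2>&1; then
  reach="reachable"
else
  reach="unreachable"
fi
echo "endpoint is $reach"
```

If the probe reports unreachable, the fix is on the network side (give the server a routable IP and re-run alias set against that address), not in mc itself.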
I have a release configured in Azure DevOps, and am having problems with the FTP Upload step, which fails most times, but works once in a while. I am trying to deploy to a regular IIS box, not an Azure subscription, and I only have FTP access to it.
When the release gets to the FTP Upload step, the log shows the following (IP address obscured)...
##[section]Starting: FTP Upload
==============================================================================
Task : FTP upload
Description : Upload files using FTP
Version : 2.154.0
Author : Microsoft Corporation
Help : https://learn.microsoft.com/azure/devops/pipelines/tasks/utility/ftp-upload
==============================================================================
f71d50fb-1433-4a21-9748-e519c6a8ffcd exists true
connecting to: 1.1.1.1:21
connected: 220 FileZilla Server version 0.9.48 beta written by Tim Kosse (tim.kosse#filezilla-project.org) Please visit http
files uploaded: 0, directories processed: 1, total: 1, remaining: 133, remote directory successfully created/verified: /
File: /AutoMapper.Extensions.Microsoft.DependencyInjection.dll Type: upload Transferred: 0
File: /AutoMapper.Extensions.Microsoft.DependencyInjection.dll Type: upload Transferred: 0
##[error]Unhandled: This socket has been ended by the other party
##[warning]FTPError: 550 can't access file.
connecting to: 1.1.1.1:21
connected: 220 FileZilla Server version 0.9.48 beta written by Tim Kosse (tim.kosse#filezilla-project.org) Please visit http
File: /AutoMapper.Extensions.Microsoft.DependencyInjection.dll Type: upload Transferred: 0
File: /AutoMapper.Extensions.Microsoft.DependencyInjection.dll Type: upload Transferred: 0
##[error]Unhandled: This socket has been ended by the other party
##[warning]FTPError: 550 can't access file.
connecting to: 1.1.1.1:21
connected: 220 FileZilla Server version 0.9.48 beta written by Tim Kosse (tim.kosse#filezilla-project.org) Please visit http
File: /AutoMapper.Extensions.Microsoft.DependencyInjection.dll Type: upload Transferred: 0
File: /AutoMapper.Extensions.Microsoft.DependencyInjection.dll Type: upload Transferred: 0
##[error]Unhandled: This socket has been ended by the other party
##[warning]FTPError: 550 can't access file.
connecting to: 1.1.1.1:21
connected: 220 FileZilla Server version 0.9.48 beta written by Tim Kosse (tim.kosse#filezilla-project.org) Please visit http
##[error]FTPError: 550 can't access file.
host: 1.1.1.1
path: /
files uploaded: 0
directories processed: 1
unprocessed files & directories: 133
##[error]Ftp Upload failed
disconnecting from: 1.1.1.1
##[section]Finishing: FTP Upload
Using the same FTP credentials, I can access and upload without any problems using FileZilla.
I saw this answer, which suggests that you need to stop the Azure app first, but the suggestion given there doesn't seem to apply to releasing from DevOps, as I can't see where you'd add the lines of code.
Is anyone able to help?
If you want to use command lines, add an Azure CLI task.
Otherwise, you can also stop an App Service with the Azure App Service Manage task.
But if you can, I strongly recommend using the Azure App Service Deploy task instead of FTP.
It has a checkbox that takes the app offline before deploying files.
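If you go the Azure CLI route, the stop/deploy/start sequence might look like this sketch (the app and resource-group names are placeholders, and the guard makes it a no-op on machines without the az CLI):

```shell
# Stop the App Service so no files are locked, deploy, then restart.
# "my-app" and "my-resource-group" are hypothetical names.
app="my-app"
rg="my-resource-group"
if command -v az >/dev/null 2>&1; then
  az webapp stop  --name "$app" --resource-group "$rg"
  # ... run the file copy / FTP upload step here ...
  az webapp start --name "$app" --resource-group "$rg"
  result="ran az"
else
  result="az CLI not installed; skipping"
fi
echo "$result"
```

This only applies when the target actually is an App Service; for a plain IIS box reachable only over FTP, stopping the site (or app pool) on the server side before the upload achieves the same unlock effect.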
Trying to configure Portworx volume backups (pxctl cloudsnap) to a localhost MinIO server (emulating S3).
The first step is to create cloud credentials using pxctl credentials create,
e.g.
./pxctl credentials create --provider s3 --s3-access-key mybadaccesskey --s3-secret-key mybadsecretkey --s3-region local --s3-endpoint 10.0.0.1:9000
This results in:
Error configuring cloud provider.Make sure the credentials are correct: RequestError: send request failed caused by: Get https://10.0.0.1:9000/: EOF
Disabling SSL (which is not configured, since this is just a localhost test) gives me:
./pxctl credentials create --provider s3 --s3-access-key mybadaccesskey --s3-secret-key mybadsecretkey --s3-region local --s3-endpoint 10.0.0.1:9000 --s3-disable-ssl
Which returns:
Not authenticated with the secrets endpoint
I've tried this with both MinIO gateway (NAS) and MinIO server, with the same result.
The Portworx container is running within Rancher.
Any thoughts appreciated.
Resolved via the instructions at https://docs.portworx.com/secrets/portworx-with-kvdb.html,
i.e. set the secret type to kvdb in /etc/pwx/config.json:
"secret": {
"cluster_secret_key": "",
"secret_type": "kvdb"
},
Then log in using ./pxctl secrets kvdb login.
After this, credentials create succeeded, as did a subsequent cloudsnap backup. The test used the --s3-disable-ssl switch.
Note: kvdb stores secrets in plain text, so it is obviously not suitable for production.
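The resolution above can be sketched end-to-end, using the same placeholder keys and endpoint as in the question (guarded so it is a no-op on machines without pxctl):

```shell
# After setting "secret_type": "kvdb" in /etc/pwx/config.json:
# log in to the secrets endpoint, then retry the credential creation.
if command -v pxctl >/dev/null 2>&1; then
  pxctl secrets kvdb login
  pxctl credentials create --provider s3 \
    --s3-access-key mybadaccesskey --s3-secret-key mybadsecretkey \
    --s3-region local --s3-endpoint 10.0.0.1:9000 --s3-disable-ssl
  px_status="ran pxctl"
else
  px_status="pxctl not installed; skipping"
fi
echo "$px_status"
```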
I'm trying to upload a file to my Azure storage. I did
$ set AZURE_STORAGE_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=**;AccountKey=**
but when I did
$ azure storage blob upload PATHFILE mycontainer data/book_270.pdf
then I got the following error:
info: Executing command storage blob upload
error: Please set the storage account parameters or one of the following two environment variables to use the storage command.
AZURE_STORAGE_CONNECTION_STRING
AZURE_STORAGE_ACCOUNT and AZURE_STORAGE_ACCESS_KEY
info: Error information has been recorded to /Users/uqtngu83/.azure/azure.err
error: storage blob upload command failed
But I already set AZURE_STORAGE_CONNECTION_STRING! Please help
As suggested in the comments, you are supposed to run the following in your Mac terminal (replace DefaultBlaBlaBla with your Azure Storage connection string):
export AZURE_STORAGE_CONNECTION_STRING="DefaultBlaBlaBla"
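The underlying issue is that on macOS, set NAME=value (a Windows cmd idiom) does not create an environment variable at all; in bash/zsh it sets positional parameters, and even a plain assignment is invisible to child processes such as the azure CLI unless it is exported. A quick demonstration:

```shell
# 'set NAME=value' in bash/zsh sets positional parameters, not a variable,
# so a child process (like the azure CLI) sees nothing.
set AZURE_DEMO=hello                          # does NOT create $AZURE_DEMO
unexported=$(sh -c 'echo "$AZURE_DEMO"')      # child sees an empty value

export AZURE_DEMO="hello"                     # now it is in the environment
exported=$(sh -c 'echo "$AZURE_DEMO"')        # child sees "hello"
echo "before export: '$unexported'  after export: '$exported'"
```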
I installed Windows Server 2008 on my VMware machine. In Windows Server 2008, I installed FTP and ran it. I also turned off all firewalls. However, from my main machine, I could not send a text file and got these errors:
200 PORT command successful.
550 file.txt: Access is denied.
Please help
If you get this error message when trying to upload a file to the server, it is possible that write permissions are not enabled for the folder you are uploading to.
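On Windows/IIS FTP, write access needs both an FTP authorization rule allowing Write and NTFS permissions on the folder itself. A sketch of granting NTFS modify rights with icacls follows; the folder path and user are hypothetical, and the guard makes this a no-op on non-Windows systems:

```shell
# Grant the anonymous FTP identity (IUSR) modify rights on the upload folder.
# M = modify; (OI)(CI) = the grant is inherited by files and subfolders.
ftp_root='C:\inetpub\ftproot'    # hypothetical FTP root
if command -v icacls >/dev/null 2>&1; then
  icacls "$ftp_root" /grant 'IUSR:(OI)(CI)M'
  perm_status="applied"
else
  perm_status="icacls not available; skipping"
fi
echo "$perm_status"
```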