Unable to decrypt backup files of Mail-in-a-Box stored in S3 or locally

I set up backups on Amazon S3 and have received three backup files. I also have access to the secret_key.txt file in my /home/user-data/backups directory. On my local Windows system, I tried to open one of the files with the Gpg4win software. It asks me for a passphrase, and I pasted the key from the secret_key.txt file there. This does not work: it gives a Bad Passphrase error. Am I doing something wrong?
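For reference, Mail-in-a-Box creates these backups with duplicity, so even if GPG decrypts a single volume you get a duplicity difftar archive rather than your mailbox files; the usual route is to let duplicity perform the restore on a Linux machine. A minimal sketch, where the bucket URL and target directory are placeholders:
$ export PASSPHRASE="$(cat /home/user-data/backups/secret_key.txt)"
$ duplicity restore s3://s3.amazonaws.com/your-bucket/backup-prefix /tmp/restored
Also check that no trailing newline or whitespace gets pasted along with the key; that alone can trigger a Bad Passphrase error.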

Related

GCP: use a Cloud Storage file to pull code from git

I am trying to put a startup script for a VM in a Cloud Storage file. This Cloud Storage file will contain the git pull commands.
The first step is to get an SSH key. I generated one from Bitbucket, but when I went to add the SSH key to the VM metadata, I saw there was already an SSH key in the metadata.
How can I use this metadata SSH key to pull the repo from Bitbucket? I want to write a shell script that pulls the code, put it in the Cloud Storage file, and then give this file to the VM as its startup script.
I am stuck on how to access the SSH key. I saw somewhere
cat ~/.ssh/id_rsa.pub
I was guessing this file would show the keys, since I am able to see the SSH keys in the VM metadata, but it says file not found.
Am I looking in the wrong file?
Thanks,
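For what it's worth, the SSH keys you see in the VM metadata are public keys that GCP uses to grant login access to the instance; they do not include a private key you can use to authenticate to Bitbucket. A common pattern is a dedicated deploy key instead. A minimal sketch of a startup script under that assumption, where the bucket, key, and repository names are all placeholders:
#!/bin/bash
# Fetch a deploy key whose public half was registered with the Bitbucket repo
mkdir -p /root/.ssh
gsutil cp gs://my-bucket/deploy_key /root/.ssh/id_rsa
chmod 600 /root/.ssh/id_rsa
# Trust Bitbucket's host key so the clone is non-interactive
ssh-keyscan bitbucket.org >> /root/.ssh/known_hosts
git clone git@bitbucket.org:myteam/myrepo.git /opt/app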

aws s3 glacier restore from vault

I have a vault and need to restore one of the folders from it. I initiated the job using the AWS CLI and got the inventory as a JSON file, but I am unable to get the complete folder from the inventory. Can anyone help me restore the folder?
I am able to get the inventory in CSV format to see the archive IDs of the files, but is it possible to take the complete folder, given that it shows a separate archive ID for every file in the folder?
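A Glacier vault has no notion of folders: every file is a separate archive, which is why the inventory shows a separate archive ID per file. To retrieve a "folder" you have to initiate one archive-retrieval job per archive ID, for example (vault name and archive ID are placeholders):
$ aws glacier initiate-job --account-id - --vault-name my-vault --job-parameters '{"Type": "archive-retrieval", "ArchiveId": "EXAMPLE_ARCHIVE_ID"}'
You would loop that over every archive ID the inventory lists under the folder's path, then fetch each job's output with aws glacier get-job-output once the jobs complete.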

Trying to copy an Ansible playbook from my local machine to a remote host, but getting bad permissions

I am trying to copy a playbook from my local machine to the host machine (an EC2 instance), but it says I have bad permissions, despite adding my key pair at ~/.ssh/id-rsa/ansible-benchmark.pem.
ansible-benchmark.pem is the key.
The command I run is scp /Users/mohammedkhot/Documents/terraform-consul/cis-playbook/main.yaml ec2-18-170-61-4.eu-west-2.compute.amazonaws.com:/etc/ansible.
I am trying to copy my main.yaml file to /etc/ansible/.
I also ran chmod 400 before trying to copy it, but it didn't work.
This is the error I am getting:
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@         WARNING: UNPROTECTED PRIVATE KEY FILE!          @
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
Permissions 0755 for '/Users/mohammedkhot/.ssh/id_rsa' are too open.
It is required that your private key files are NOT accessible by others.
This private key will be ignored.
Load key "/Users/mohammedkhot/.ssh/id_rsa": bad permissions
mohammedkhot@ec2-18-170-61-4.eu-west-2.compute.amazonaws.com: Permission denied (publickey).
lost connection
The "Permissions 0755 ... are too open" warning in the output tells you what is wrong: the permissions on the private key file on your workstation are too permissive. Restrict the file so that only your user can access it using chmod, and then attempt the upload to the remote machine again.
$ chmod 600 /Users/mohammedkhot/.ssh/id_rsa
$ scp /Users/mohammedkhot/Documents/terraform-consul/cis-playbook/main.yaml ec2-18-170-61-4.eu-west-2.compute.amazonaws.com:/etc/ansible
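Also note that since ssh is complaining about /Users/mohammedkhot/.ssh/id_rsa, that is the key it is actually offering. If you meant to authenticate with the downloaded key pair instead, point scp at it explicitly with -i (the key path is the one from the question; the login user is an assumption, as Ubuntu AMIs use ubuntu and Amazon Linux uses ec2-user):
$ chmod 400 ~/.ssh/id-rsa/ansible-benchmark.pem
$ scp -i ~/.ssh/id-rsa/ansible-benchmark.pem /Users/mohammedkhot/Documents/terraform-consul/cis-playbook/main.yaml ubuntu@ec2-18-170-61-4.eu-west-2.compute.amazonaws.com:/etc/ansible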

Local DynamoDB Tables Backup

I want to back up my local DynamoDB server. I have installed the DynamoDB server on a Linux machine. Some sites suggest creating a BASH script on Linux and connecting to an S3 bucket, but on a local machine we don't have an S3 bucket.
So I am stuck with my work. Please help. Thanks.
You need to find the database file created by DynamoDb local. From the docs:
-dbPath value — The directory where DynamoDB will write its database file. If you do not specify this option, the file will be written to the current directory. Note that you cannot specify both -dbPath and -inMemory at once.
The file name will be of the form youraccesskeyid_region.db. If you used the -sharedDb option, the file name will be shared-local-instance.db.
By default, the file is created in the directory from which you ran DynamoDB Local. To restore, copy that same file back and, when running DynamoDB Local, specify the same dbPath.
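A minimal backup is then just a file copy. A sketch, assuming -sharedDb was used, with placeholder directories:
#!/bin/bash
# Copy the DynamoDB Local database file to a dated backup
DB_DIR=/opt/dynamodb-local        # directory DynamoDB Local was started from (placeholder)
BACKUP_DIR=/var/backups/dynamodb
mkdir -p "$BACKUP_DIR"
cp "$DB_DIR/shared-local-instance.db" "$BACKUP_DIR/shared-local-instance.$(date +%F).db"
Restoring is the reverse: copy the .db file back into the directory you point -dbPath at.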

FTP permission denied error

I am trying to FTP a RAR (zipped) file to another server but am having problems doing so. This is a Windows environment. I know that my FTP connection is set up correctly because I have already transferred over several other RARs. The difference, from what I can tell, is that the RAR that is failing is larger: it is 761 MB. When I try to "put" it onto the other server, I get the following:
200 PORT command successful.
150 Opening BINARY mode data connection for WCU.rar.
> WCU.rar:Permission denied
226 Transfer complete.
However, the file is never transferred over. Is there a size limitation? And FYI, WCU.rar is a zipped directory, not a file. But I was able to successfully FTP over several other zipped directories.
It could be a size limitation, not just on stored data but on transferred data as well.
Did you try to transfer a small file? A small file in the same format? I would have said permissions, but you said you have already uploaded files to this server.
Just to help you debug, you can add both of these commands to your ftp session (hash prints a # for each data block transferred so you can watch progress, and bin forces binary mode):
ftp> hash
ftp> bin
WCU.rar: Permission denied
You don't have permission to write to that directory on the server. You need write permissions on the destination folder in order to upload.
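On a Windows server this usually means granting the FTP user modify rights on the destination folder, for example with icacls (the folder path and user name here are placeholders):
icacls "C:\inetpub\ftproot\uploads" /grant "ftpuser:(OI)(CI)M"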
