Creating a keytab to automate scripts - bash

I am having trouble creating a keytab in order to automate the script I am running.
I am using a website for reference.
This is what I did so far:
$ ktutil
ktutil: addent -password -p bli1@testtesttest.corp.supernice.net -k 1 -e arcfour-hmac
Password for bli1@testtesttest.corp.supernice.net:
ktutil:
ktutil: wkt bli1.keytab
ktutil: quit
When I tried to run the script, I got this error:
$ kinit bli1@testtesttest.corp.supernice.net -k -t bli1.keytab; python3 -m pytrinity.monitors.rate_monitor test
kinit: Cannot find KDC for requested realm while getting initial credentials
I'm not sure whether I created the keytab correctly, as I'm having a hard time finding in-depth documentation on each argument used during the keytab creation process. In particular, I'm not sure what -k is used for. My understanding of the others:
addent: add an entry
-password: add the entry using a password
-p: principal
-e: encryption type

I think the problem is connectivity to your KDC, as the error message says: kinit cannot find a KDC for the requested realm. What's in your Kerberos configuration file? /etc/krb5.conf is the usual name for it. (Incidentally, the -k you asked about is the key version number, kvno, of the entry ktutil creates.)
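A "Cannot find KDC for requested realm" error usually means the realm is missing from krb5.conf (or DNS lookup of the KDC is failing). A minimal sketch of the realm section - the KDC hostname here is an assumption, and note that realm names are conventionally written in uppercase:

```ini
# /etc/krb5.conf (sketch; the KDC hostname is a placeholder)
[libdefaults]
    default_realm = TESTTESTTEST.CORP.SUPERNICE.NET

[realms]
    TESTTESTTEST.CORP.SUPERNICE.NET = {
        kdc = dc1.testtesttest.corp.supernice.net
        admin_server = dc1.testtesttest.corp.supernice.net
    }

[domain_realm]
    .testtesttest.corp.supernice.net = TESTTESTTEST.CORP.SUPERNICE.NET
```

With the realm configured, the kinit from the question should at least reach the KDC rather than failing immediately.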

Related

How to use a PGPASS file in PowerShell to avoid the password prompt?

I have to automate my Postgres database backup. As instructed by my software vendor, I am trying to use pg_dump.exe (see below) to take a backup, but it prompts me for a password.
.\pg_dump.exe -h localhost -p 4432 -U postgres -v -b -F t -f "C:\Backup\Backup.tar" Repo
So I googled and found, per "https://www.postgresql.org/docs/9.6/libpq-pgpass.html", that I can create a pgpass.conf file at "C:\Users\User1\AppData\Roaming\postgresql\pgpass.conf", which I did.
Then I pointed the PGPASSFILE environment variable at that file before executing my pg_dump command, but it is not working: I still get prompted for a password. This is the content of pgpass.conf: *:*:*:postgres:password
Below is the code I am trying in PowerShell,
$Env:PGPASSFILE="C:\Users\User1\AppData\Roaming\postgresql\pgpass.conf"
cd "C:\Program Files\Qlik\Sense\Repository\PostgreSQL\9.6\bin"
.\pg_dump.exe -h localhost -p 4432 -U postgres -v -b -F t -f "C:\Backup\Backup.tar" Repo
Why am I still being asked for a password?
When I type $Env:AppData I get "C:\Users\User1\AppData\Roaming" in response.
Everywhere there is guidance on how to do this in UNIX or the Windows command prompt, but not in PowerShell. Any help is appreciated. Also, if you could show me how to secure this password file, that would be great.
With a password prompt I cannot automate this with Windows Task Scheduler.
I suspect you have a suitable solution, however, as a quick (and not secure) workaround via the command prompt, you can use the variable PGPASSWORD to hold the password then run the backup script.
A sample might be something like:
SET PGPASSWORD=password
cd "C:\Program Files\Qlik\Sense\Repository\PostgreSQL\9.6\bin"
pg_dump.exe -h localhost -p 4432 -U postgres -b -F t -f "d:\qs_backup\QSR_backup.tar" QSR
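Since the question uses PowerShell, note that cmd-style SET does not carry over; the PowerShell equivalent of this workaround would be something like the following sketch (same caveat: an inline password is not secure):

```powershell
# PowerShell version of the PGPASSWORD workaround (password value is a placeholder)
$Env:PGPASSWORD = "password"
cd "C:\Program Files\Qlik\Sense\Repository\PostgreSQL\9.6\bin"
.\pg_dump.exe -h localhost -p 4432 -U postgres -b -F t -f "C:\Backup\Backup.tar" Repo
```

PGPASSWORD is read directly by libpq, so it bypasses the pgpass.conf lookup entirely.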
Rod
I have yet to get the damned thing to work, but I did find this:
-w
--no-password
Never issue a password prompt. If the server requires password authentication and a password is not available by other means such as a .pgpass file, the connection attempt will fail. This option can be useful in batch jobs and scripts where no user is present to enter a password.
I don't see a -w parameter in your call to pg_dump
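For reference, adding -w to the call from the question would look like the sketch below; -w does not supply a password, but it makes pg_dump fail fast instead of hanging on a prompt when no other password source (such as pgpass.conf) works:

```
.\pg_dump.exe -w -h localhost -p 4432 -U postgres -v -b -F t -f "C:\Backup\Backup.tar" Repo
```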
I used the pg_hba file to allow "trust" connections. This is a riskier method, but I had to get things done ASAP. Thank you for your time and effort.

How can I get Logstash-Keystore to find its password?

For background: I'm attempting to automate steps to provision and create a multitude of Logstash processes within Ansible, but want to ensure the steps and configuration work manually before automating the process.
I have installed Logstash as per Elastic's documentation (it's an RPM installation), and it correctly ships logs to my ES instance without issue. Elasticsearch and Logstash are both v7.12.0.
Following the keystore docs, I've created a /etc/sysconfig/logstash file and have set the permissions to the file to 0600. I've added the LOGSTASH_KEYSTORE_PASS key to the file to use as the environment variable sourced by the keystore command on creation and reading of the keystore itself.
Upon running the sudo /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash create command, the process spits back the following error:
WARNING: The keystore password is not set.
Please set the environment variable `LOGSTASH_KEYSTORE_PASS`.
Failure to do so will result in reduced security.
Continue without password protection on the keystore? [y/N]
This should not be the case, as the keystore process should be sourcing my password env var from the aforementioned file. Has anyone experienced a similar issue, and if so, how did you solve it?
This is expected: the file /etc/sysconfig/logstash is read only when you start Logstash as a service, not when you run it from the command line.
To create the keystore you will need to export the variable with the password first, as explained in the documentation.
set +o history
export LOGSTASH_KEYSTORE_PASS=mypassword
set -o history
sudo -E /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash create
After that, when you start logstash as a service it will read the variable from the /etc/sysconfig/logstash file.
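For completeness, the service-time file might look like the following sketch (the password value is a placeholder; the question already notes it should be mode 0600):

```
# /etc/sysconfig/logstash  (permissions 0600; read by the service, not interactive shells)
LOGSTASH_KEYSTORE_PASS=mypassword
```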
1 - You should write the password for the keystore itself in config/startup.options.
E.g. LOGSTASH_KEYSTORE_PASS=mypassword (without export).
2 - Then use that keystore password to create your keystore file:
set +o history
export LOGSTASH_KEYSTORE_PASS=mypassword
set -o history
..logstash/bin/logstash-keystore --path.settings ../logstash create
Note: logstash-keystore and logstash.keystore are different things; you are creating the one with the dot. It is in the config directory, where your startup.options lives.
The history commands are there to hide your password: otherwise, anyone who runs "history" to list previously used commands could see it.
3 - Then you can add your first secret to the keystore file. You must supply the keystore password beforehand:
set +o history
export LOGSTASH_KEYSTORE_PASS=mypassword
set -o history
./bin/logstash-keystore add YOUR_KEY
Then it will ask for the VALUE. If you do not supply your keystore password, you get an error: Found a file at....but it's not a valid Logstash keystore
4 - Once you have supplied your password, you can list the contents of your keystore file, or remove an entry (replace "list" with "remove"):
./bin/logstash-keystore list

How do I enter the password while joining a Linux machine to Active Directory using a shell script?

I am joining Linux machines to Windows Active Directory, and I am able to do it successfully using SSSD.
Now I am trying to automate the same process, and I have come to the step where I need to enter a password while joining the domain.
Can someone help with how to enter the password via a shell script?
My code is :
#!/bin/bash
set -x
passwd=$(grep password /domain/domain_join.txt | awk -F= '{print $2}')
/usr/bin/expect << EOF
spawn realm join domainname -U username@domainname -v
expect "Password for username@domainname:"
send "$passwd\r"
expect eof
EOF
set +x
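Incidentally, the password extraction at the top of the script can be a single awk call. A minimal sketch - the file path and contents below are hypothetical stand-ins for the real /domain/domain_join.txt:

```shell
# Create a stand-in key=value file like the real /domain/domain_join.txt
cat > /tmp/domain_join.txt <<'EOF'
user=joinacct
password=S3cret
EOF

# One awk call replaces the cat | grep | awk pipeline: match the password=
# line and print the field after the first '=' (values containing '='
# would need sub() instead of a field reference)
passwd=$(awk -F= '/^password=/{print $2}' /tmp/domain_join.txt)
echo "$passwd"   # prints S3cret
```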
This isn't actually an LDAP question - it's AD and Kerberos and sshd.
It looks like you've got a user account to join the machine - presumably it has the correct rights. The easiest thing to do is to get a keytab created for that account; then you can do a kinit and call the script in that context:
kinit principal@EXAMPLE.COM -k -t keytab; joinscript
You don't need to define your username in the realm join command if you've already done the kinit.
Sorry, I can't test this, and it's been a while, but the secret sauce is a keytab for the Windows credential.
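Putting that together, a non-interactive join might look like this sketch; the account name, keytab path, and domain are all placeholders:

```
$ kinit joinacct@EXAMPLE.COM -k -t /etc/joinacct.keytab
$ realm join example.com -v
$ kdestroy
```

realm join picks up the Kerberos ticket obtained by kinit, so no -U option and no password prompt are needed, and kdestroy discards the ticket afterwards.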

Bash File Transfer Issue Using SCP

I need to transfer files from machine 'A' to machine 'B', and I'm executing the command from machine 'C'.
Run Command :
$sshpass -p 'password_for_a' ssh -A -t a@x.x.x.x rsync -avz /home/test/* b@x.x.x.x:/home/test/
This prompts for the password for machine "B", which I do not want to type manually. I cannot install sshpass on machine "A" because I don't have authorization to install libraries/packages.
Is there any way to include the password using rsync/scp in the above command?
I tried passing the password using scp's PreferredAuthentications option too:
$sshpass -p 'password_for_a' ssh -A -t a@x.x.x.x scp -o PreferredAuthentications="password_for_b" /home/test/* b@x.x.x.x:/home/test/
I'm getting:
Permission denied (publickey,password).
lost connection
If I am wrong anywhere, please correct me.
There is nothing like PreferredAuthentications="password_for_b". PreferredAuthentications selects an authentication method (the one you mean is just called password); it does not carry the password itself. If you specify something invalid, it simply fails. Set up public key authentication instead.
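A sketch of that setup, run on machine A; the key type and paths are the usual OpenSSH defaults, and the address stands in for machine B:

```
$ ssh-keygen -t ed25519 -N '' -f ~/.ssh/id_ed25519
$ ssh-copy-id b@x.x.x.x
$ rsync -avz /home/test/ b@x.x.x.x:/home/test/
```

Once the public key is installed on B, the rsync from A to B no longer prompts, so the outer sshpass wrapper from C is the only password handling left.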
Thanks, all. I ran ssh-keygen and copied the *.pub key files to the respective machines. Now I can do passwordless authentication. Things became clear after the discussion.

Hadoop - asks for a LAN network password when starting the cluster

I can't understand what password Hadoop expects.
I configured it according to a tutorial. I do:
sudo su
#bash start-dfs.sh
And now it expects something like a LAN network password. I have no idea what I should enter.
As you can see, I run the script as root. Of course the master (from which I run the script) can ssh to the slaves as root without a password (I configured and tested that).
Disclaimer: it is possible that I am using an incorrect name (for example, for the script name), because I don't understand it exactly yet. However, I am sure it was about something like a LAN network password.
Help me, please: which password is it?
Edit: I was using http://backtobazics.com/big-data/setup-multi-node-hadoop-2-6-0-cluster-with-yarn/
It seems you may not have set up passwordless SSH. Passwordless SSH is required to run the Hadoop services (daemons), so set it up among the nodes:
$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
$ chmod 0600 ~/.ssh/authorized_keys
Then try ssh user@hostname
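You can also verify the setup before rerunning start-dfs.sh; with BatchMode, ssh fails immediately instead of prompting if key authentication is not working (the hostname below is a placeholder for one of your slave nodes):

```
$ ssh -o BatchMode=yes slave1 hostname
```

If this prints the remote hostname without a prompt, passwordless SSH to that node is working.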
