How to connect to a PostgreSQL database using a shell script

I want to write a shell script to run these commands. I usually connect from the terminal with the commands below:
# first go to the directory
cd /opt/novell/sentinel/3rdparty/postgresql/bin/
# then set the library path
export LD_LIBRARY_PATH=/opt/novell/sentinel/3rdparty/postgresql/lib/
# then run psql
./psql --host 127.0.0.1 --port 5432 --dbname=SIEM --username=dbauser
Password for user dbauser: ****

Why don't you update your PATH and export LD_LIBRARY_PATH permanently by adding these lines to your .profile:
PATH=/opt/novell/sentinel/3rdparty/postgresql/bin/:$PATH
export LD_LIBRARY_PATH=/opt/novell/sentinel/3rdparty/postgresql/lib/
Then connecting to the database from a script is as simple as:
#!/bin/sh
psql --host=127.0.0.1 --port=5432 --dbname=SIEM --username=dbauser
When you run the script, you will be prompted for the password.
If you don't want to enter the password every time, you can use the password file .pgpass (see the documentation for details). Just add the following line to your ~/.pgpass:
127.0.0.1:5432:SIEM:dbauser:your_password
To be safe, disallow any access for group and world:
chmod 0600 ~/.pgpass
After this, you can connect to your database with the script above without a password prompt.
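If the job is scheduled (cron, for example) you can go one step further and pass the query on the command line; this is only a sketch, and the pg_stat_activity query below is a placeholder for whatever you actually need to run:
#!/bin/sh
# Sketch: run a single query non-interactively, relying on the ~/.pgpass entry above.
# --no-password makes psql fail instead of prompting if the password file is not picked up.
psql --host=127.0.0.1 --port=5432 --dbname=SIEM --username=dbauser \
     --no-password -c "SELECT count(*) FROM pg_stat_activity;"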

Related

How to invoke an interactive bash shell as another user with initial commands?

I currently have a RemoteCommand (RemoteCommand sudo su - admin-user) set up in my ssh config that lets me connect to a system and immediately switch to an admin user. This is because I must only make changes to the system as that admin-user, the admin-user has no password, and I am not allowed to add my public key to the system to connect without specifying a password. This works perfectly: I log in as "login-user", and it immediately switches me to the admin-user.
However, I also want to define a temporary function that lets me run a common command in a more shorthand manner, since I am not allowed to change the admin-user's bashrc.
My thought in setting this up was to do the following in my ssh config:
HOST SYSTEM
Hostname 12.34.56.78
User login-user
RemoteCommand sudo -Hu admin-user /usr/bin/bash --init-file <(echo ". ~/bashrc; function testy() { ls ; }") -li
...
In this case, I'm just testing with a function that runs ls, using a named pipe (process substitution) to source the normal bashrc and add a function to the new shell via an init file.
This creates an interactive bash shell as the admin-user as expected, but when I try to run the function testy in this shell, I get bash: testy: command not found. Doing the same without switching users in the same step works, but not when I add the flags to run the shell as admin-user. I can't figure out how to get this working. Any help with this approach or another is greatly appreciated!
Likely a named-pipe sharing issue: the process substitution is set up before sudo switches users, so the resulting file descriptor isn't usable by the admin-user's shell. You can use another wrapper shell so the substitution happens after the switch:
HOST SYSTEM
Hostname 12.34.56.78
User login-user
RemoteCommand sudo -Hiu admin-user /usr/bin/bash -c 'exec /usr/bin/bash --init-file <(echo ". ~/bashrc; function testy() { ls ; }")'
...
Sourcing ~/.bashrc in the init file is also probably unnecessary, but that's beside the main point.
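You can also sanity-check the wrapper idea locally, without ssh; a sketch, using the same admin-user and testy placeholders as above:
# Sketch: the inner bash evaluates the process substitution itself, already running as admin-user.
sudo -Hiu admin-user /usr/bin/bash -c 'exec /usr/bin/bash --init-file <(echo "testy() { ls ; }")'
Once that drops you into a shell, running testy should list the current directory.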

Apache Airflow - SSH connection with Bash operator - possible?

I am trying to start a shell script using the SSH operator in Apache Airflow, with the SSH operator defined like this:
task1 = SSHOperator(
    ssh_conn_id="ssh_dev_conn",
    command=t1_ssh,
    task_id="task1",
    dag=dag
)
Command is defined like this:
t1_ssh = """
sudo su - db_user
echo whoami
/home/scripts/script1.sh
"""
According to the user permissions, only db_user is allowed to start this script, so I try to log in as that user and then run the script with the next command, but I get a permission denied error. echo whoami returns a different user, not db_user, so my conclusion is that the SSH operator makes a new connection for every command. How do I log in as db_user and then run the script in the next command?
First, is this possible with the BashOperator instead of the SSH operator?
But I still need to establish the SSH connection to ssh_dev_conn...
If the BashOperator is not the solution, is there any way to log in as db_user (which has permission to run the scripts) and then run the script with another command?
The following one-liner is not a solution because of administration rules:
sudo -u db_user /home/scripts/script1.sh
I need solutions for both Airflow v1 and Airflow v2.
I found the example in Airflow: How to SSH and run BashOperator from a different server, but it doesn't include a sudo command with another user; it shows a simple command that works fine, but not for my case.
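For reference, one pattern that often comes up for this kind of problem is to feed the commands to the su shell on stdin with a heredoc, so that they run inside the db_user session instead of after it exits. This is only a sketch and it may be ruled out by the same administration rules mentioned above:
# Sketch: everything between the heredoc markers runs as db_user.
sudo su - db_user <<'EOF'
whoami
/home/scripts/script1.sh
EOF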

Issue while executing commands on server B from server A

I have a shell script that I need to execute from server A, and it runs commands on server B as well. But I can execute those commands only as the root user on server B.
Manually, if I log in to server B, I have to switch to the root user and execute the delete commands. To automate this I am trying to write a script and execute it from server A, but it asks me for a password. How do I add my password in the script (though it is not recommended), or is there another way to tackle this?
Add your username to /etc/sudoers with NOPASSWD to remove the password prompt:
$ visudo
user ALL=(ALL) NOPASSWD: ALL
Alternatively, you can use the sshpass command with the -p option:
sshpass -p 'your_password' ssh root@your_host ls
Refer to the sshpass manual for more options.
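With the NOPASSWD rule (or an SSH key for a non-root user) in place on server B, the whole thing can be driven from server A in a single call; a rough sketch, where the user, host and path are placeholders:
# Sketch: run the delete on server B as root from a script on server A.
ssh your_user@serverB 'sudo rm -f /path/to/old_files/*'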

psql asks for password and does not read from pgpass.conf

I have installed my PostgreSQL database in a Windows server environment. I'd like to schedule a job with the Windows Task Scheduler to run every night, so I need to run the following command without being asked for a password:
psql -U myUserName -d myDBName -c "select MyFunctionName()"
When I run the above command in my cmd shell, it asks me for the password. When I enter the password manually, the function runs correctly.
So my solution is to read the password from the pgpass.conf file so that no password prompt is required.
Here are the things I have done to achieve this:
I created the pgpass.conf file in a directory I made under %APPDATA% (AppData\Roaming\postgresql, to be precise).
Here are the contents of this file:
localhost:5432:myDBName:myUserName:myPassword
I have also tried with the value 127.0.0.1 instead of localhost above.
I then added an environment variable (in the user variables for the administrator account) called PGPASSFILE and gave it the pgpass.conf location:
;C:\Users\administrator\AppData\Roaming\postgresql\pgpass.conf
Finally I stopped and restarted my Postgres service on Windows services and re-ran the command. But it is still asking for password.
How can I let my command know from where to read the password?
If you don't want to set the PGPASSFILE environment variable, put the password file in the standard location %APPDATA%\postgresql\pgpass.conf as described by the documentation.
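Once the file is being found, it can also help to add -w (--no-password) to the scheduled command, so the job fails immediately instead of hanging on a prompt if the password file is ever missed; the rest of the command stays the same:
# Sketch: -w forbids the password prompt, so a misconfigured pgpass.conf shows up as an error.
psql -w -U myUserName -d myDBName -c "select MyFunctionName()"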

How to enter Multiple entries in .pgpass file?

I am supposed to execute the same psql command from a bash script against 5 remote machines, using a username and password.
I have read that we have to put the credentials in a .pgpass file and use the -w option when executing the psql command.
But how can I execute the same command on the 5 machines using the same .pgpass file?
You can add multiple entries to the .pgpass file, for example:
syntax:
hostname:port:database:username:password
sample file:
test.net:5432:testdb:testuser:testpass
test1.net:5432:testdb1:testuser1:testpass1
test2.net:5432:testdb2:testuser2:testpass2
Make sure the permissions of the .pgpass file are set to 0600:
chmod 0600 .pgpass
You can also use wildcards (such as *), which is particularly handy for the database field.
This means that the pgpass syntax:
hostname:port:database:username:password
can be used with a value such as:
my-host:5432:*:my-username:my-plaintext-password
to let you connect to all databases on the server using the same credentials. If you need different credentials for specific databases, add rows for them before this one (the first matching line is used).
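With those entries in place, the bash script from the question can simply loop over the hosts; a minimal sketch, where the host names, database and user are placeholders:
#!/bin/bash
# Sketch: run the same statement on every host; -w makes psql fail rather than
# prompt if a host has no matching line in ~/.pgpass.
for host in test.net test1.net test2.net test3.net test4.net; do
    psql -w -h "$host" -p 5432 -d testdb -U testuser -c "SELECT version();"
done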
