How to configure a PostgreSQL database for DHIS2 on Windows?

I've downloaded and run DHIS2 on Windows, and I want to connect it to a PostgreSQL database. I'm using the configuration settings below in the dhis.conf file, but it isn't working.
connection.dialect = org.hibernate.dialect.PostgreSQLDialect
connection.driver_class = org.postgresql.Driver
connection.url = jdbc:postgresql:dhis2
connection.username = dhis
connection.password = dhis
connection.schema = update
encryption.password = abcd
It shows me the following error message:
HTTP ERROR: 503
Problem accessing /. Reason:
Service Unavailable

Please verify that the database name, username, and password are correct.
You can test this on the command line / terminal with the following command (-W forces a password prompt):
psql -d dhis2 -U dhis -W
Then enter your password. If you cannot connect, one or more of the database name, username, and password are incorrect.
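The same check can be scripted so that the credentials you test are exactly the ones DHIS2 will use. Below is a minimal sketch in plain Python (no third-party libraries; the key names follow the dhis.conf settings shown above) that parses the key = value pairs and builds the matching psql command:

```python
def parse_conf(text):
    """Parse simple 'key = value' lines, skipping blanks and # comments."""
    conf = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        conf[key.strip()] = value.strip()
    return conf

def psql_command(conf):
    """Build the psql invocation for the database named in connection.url.

    connection.url looks like jdbc:postgresql:dhis2 or
    jdbc:postgresql://host/dhis2, so take the last path/colon segment.
    """
    dbname = conf["connection.url"].rsplit("/", 1)[-1].rsplit(":", 1)[-1]
    return ["psql", "-d", dbname, "-U", conf["connection.username"], "-W"]

conf = parse_conf("""\
connection.url = jdbc:postgresql:dhis2
connection.username = dhis
connection.password = dhis
""")
print(psql_command(conf))  # → ['psql', '-d', 'dhis2', '-U', 'dhis', '-W']
```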

Related

Knex - error: password authentication failed for user "Joshua Rieth"

I am working on building out my PostgreSQL database with Knex and I keep getting this error when I run knex migrate:latest.
error: password authentication failed for user "Joshua Rieth"
at Parser.parseErrorMessage (C:\Users\Joshua Rieth\Documents\Projects\Lambda\Labs\homerun-be\node_modules\pg-protocol\dist\parser.js:278:15)
at Parser.handlePacket (C:\Users\Joshua Rieth\Documents\Projects\Lambda\Labs\homerun-be\node_modules\pg-protocol\dist\parser.js:126:29)
at Parser.parse (C:\Users\Joshua Rieth\Documents\Projects\Lambda\Labs\homerun-be\node_modules\pg-protocol\dist\parser.js:39:38)
at Socket.<anonymous> (C:\Users\Joshua Rieth\Documents\Projects\Lambda\Labs\homerun-be\node_modules\pg-protocol\dist\index.js:8:42)
at Socket.emit (events.js:315:20)
at addChunk (_stream_readable.js:295:12)
at readableAddChunk (_stream_readable.js:271:9)
at Socket.Readable.push (_stream_readable.js:212:10)
at TCP.onStreamRead (internal/stream_base_commons.js:186:23)
I am assuming it has to do with my username, but it should be the default postgres user that I used when setting up PostgreSQL. I should also mention that the user and password are in a .env file.
Spec
PostgreSQL 12
Windows 10
NPM 6.14.6
pg 8.3.0
knex 0.20.15
When you set up Knex you need to supply the username / password, unless you'd like to connect as the current OS user. There are multiple places where you could have set up the credentials, so it would be good if you could check them all (and let us know :) ).
One way of doing this is:
require('knex')({
  client: 'pg',
  connection: 'postgres://username:password@localhost:5432/dbname'
})
Alternatively, you could set the environment variables PGUSER and PGPASSWORD; the Knex connection would pick those up too.
Your connection could also be configured with full details rather than a connection string:
var knex = require('knex')({
  client: 'pg',
  version: '12.0',
  connection: {
    host: '127.0.0.1',
    user: 'your_database_user',
    password: 'your_database_password',
    database: 'myapp_test'
  }
});
To keep your usernames / passwords out of the actual code, I would strongly suggest using something like the dotenv package: https://www.npmjs.com/package/dotenv
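dotenv itself is a Node package, but the mechanism it implements is tiny: read KEY=value lines from a .env file into the environment before connecting. As a language-neutral illustration of that pattern (a sketch in Python, not the dotenv implementation itself):

```python
import io

def load_env(stream):
    """Minimal sketch of what dotenv-style loaders do: read KEY=value
    lines, ignore blanks and # comments, and strip optional quotes."""
    env = {}
    for raw in stream:
        line = raw.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip("'\"")
    return env

env = load_env(io.StringIO("DB_USER=postgres\nDB_PASS='s3cret'\n# comment\n"))
print(env)  # → {'DB_USER': 'postgres', 'DB_PASS': 's3cret'}
```

The point is that the credentials live in a file excluded from version control, and the code only ever reads them from the environment.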
(NB: the below is not about your specific error, since your username is not specified correctly.) Another thing I would check on the Postgres side is that you have enabled password authentication in your pg_hba.conf. The md5 method allows password-hash based authentication, which should be good for you. It should look like this:
# TYPE DATABASE USER CIDR-ADDRESS METHOD
host all all 123.123.123.123/32 md5

Databricks - Reading a blob from mounted file storage

I am using Azure Databricks and I ran the following Python code:
sas_token = "<my sas key>"
dbutils.fs.mount(
  source = "wasbs://<container>@<storageaccount>.blob.core.windows.net",
  mount_point = "/mnt/gl",
  extra_configs = {"fs.azure.sas.<container>.<storageaccount>.blob.core.windows.net": sas_token})
This seemed to run fine. So I then ran:
df = spark.read.text("/mnt/gl/glAgg_LE.csv")
Which gave me the error:
shaded.databricks.org.apache.hadoop.fs.azure.AzureException: com.microsoft.azure.storage.StorageException: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Not sure what I'm doing wrong though. I'm pretty sure my sas key is correct.
OK, if you are getting this error, double-check both the SAS key and the container name.
Turned out I had pointed it to the wrong container!
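Since both the mount source and the extra_configs key repeat the container/account pair, it is easy for the two to drift apart. A small helper (a sketch; the container and account names below are made up) builds both from the same inputs:

```python
def wasbs_mount_args(container, account, sas_token, mount_point):
    """Build the source URL and the SAS config key for dbutils.fs.mount
    from one container/account pair, so they can't get out of sync."""
    return {
        "source": f"wasbs://{container}@{account}.blob.core.windows.net",
        "mount_point": mount_point,
        "extra_configs": {
            f"fs.azure.sas.{container}.{account}.blob.core.windows.net": sas_token
        },
    }

args = wasbs_mount_args("gl-data", "mystorageacct", "<my sas key>", "/mnt/gl")
# dbutils.fs.mount(**args)  # works only on Databricks; dbutils is not available locally
```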

Vertica admintools error

When I try to connect to the database from admintools, I get the following error:
Error: Unable to connect to database
Hint: Username or password could be invalid
I found the following error in the logs:
Apr 20 08:08:29 [24291] [vsql.connect spawn] Exception: Error! pty.fork() failed: out of pty devices
Do you know what the problem is?
Your node might be down. Check the logs at:
/opt/vertica/log
or the admintools configuration at:
/opt/vertica/config/admintools.conf
and check that the restart policy section is right:
[Database:mydb]
host = 11.11.11.11
restartpolicy = ksafe

cx_Oracle connection with Python 3.5

I've made several attempts to connect to an Oracle DB but am still unable to connect. My code is below. I can, however, connect to the Oracle DB through the terminal like this:
$ sqlplus64 uid/passwd@192.168.0.5:1521/WSVC
My environment: Ubuntu 16.04 / 64-bit / Python 3.5
Any knowledge or experience with this issue would be appreciated. Thank you.
import os
os.chdir("/usr/lib/oracle/12.2/client64/lib")
import cx_Oracle
# 1st attempt
ip = '192.168.0.5'
port = 1521
SID = 'WSVC'
dsn_tns = cx_Oracle.makedsn(ip, port, SID)
# dsn_tns = cx_Oracle.makedsn(ip, port, service_name=SID)
db = cx_Oracle.connect('uid', 'passwd', dsn_tns)
cursor = db.cursor()
-------------------------------------------------
# 2nd attempt
conn = "uid/passwd@(DESCRIPTION=(SOURCE_ROUTE=OFF)(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=192.168.0.5)(PORT=1521)))(CONNECT_DATA=(SID=WSVC)(SRVR=DEDICATED)))"
db = cx_Oracle.connect(conn)
cursor = db.cursor()
------------------------------------------------------
# ERROR Description
cx_Oracle.InterfaceError: Unable to acquire Oracle environment handle
The error "unable to acquire Oracle environment handle" is due to your Oracle configuration being incorrect. A few things that should help you uncover the source of the problem:
when using Instant Client, do NOT set the environment variable ORACLE_HOME; that should only be set when using a full Oracle Client or Oracle Database installation
the value of LD_LIBRARY_PATH should contain the path which contains libclntsh.so; the value you selected looks like it is incorrect and should be /usr/lib/oracle/12.2/client64/lib instead
you can verify which Oracle Client libraries are being loaded by using the ldd command as in ldd cx_Oracle.cpython-35m-x86_64-linux-gnu.so
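For reference, makedsn only builds a TNS descriptor string like the one in your second attempt. A pure-Python sketch of that format (an approximation, not cx_Oracle's exact output) makes it clear what changes between a SID and a SERVICE_NAME connection:

```python
def make_descriptor(host, port, sid=None, service_name=None):
    """Sketch of the TNS descriptor that cx_Oracle.makedsn builds.

    Pass exactly one of sid / service_name, depending on how the
    database is registered with the listener.
    """
    if (sid is None) == (service_name is None):
        raise ValueError("pass exactly one of sid or service_name")
    connect_data = f"(SID={sid})" if sid else f"(SERVICE_NAME={service_name})"
    return (f"(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST={host})(PORT={port}))"
            f"(CONNECT_DATA={connect_data}))")

print(make_descriptor("192.168.0.5", 1521, sid="WSVC"))
```

If connecting with the SID fails with ORA-12505 while sqlplus works with host:port/WSVC, try service_name="WSVC" instead, since the slash syntax connects by service name.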

Hortonworks Hive Username password configuration

I am attempting to connect to the Hive server and keep getting:
HiveAccessControlException Permission denied: user [hue] does not have [CREATE] privilege on [default/testHiveDriverTable]
Not sure what username/password combination I should use.
I have Hortonworks running on a virtual machine. The code I am attempting to use to connect to the Hive server is:
Connection con = DriverManager.getConnection("jdbc:hive2://192.168.0.6:10000/default", "hue", "");
I have also tried connecting via the "root" / "hadoop" username and password combination.
By the look of the error message, the Connection is OK, but you lack privileges to execute the Statement just below... because that "testHiveDriverTable" label smells like you did a copy/paste on some dummy code example such as:
Connection con = DriverManager.getConnection("jdbc:hive://10.0.2.15:10000/default", "", "");
Statement stmt = con.createStatement();
String tableName = "testHiveDriverTable";
stmt.executeQuery("drop table " + tableName);
stmt.executeQuery("create table " + tableName + " (key int, value string)");
The "no [CREATE] privilege on [default/testHiveDriverTable]" message is a good fit for a create table testHiveDriverTable attempt in database default.
Bottom line: congratulations, you actually could connect to Hive. Now you can work on how to configure the "default" database so that you can actually create tables :-)
Have you tried connecting without giving a username and password? I have the following connection string on my Cloudera VirtualBox image, which works:
Connection con = DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "", "");
Here default is the DB name.
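One thing that is easy to mix up in these attempts is where each piece goes: the database name belongs in the URL, while the username and password are separate getConnection arguments. A tiny hypothetical helper (a sketch, not part of any Hive API) that assembles the URL illustrates this split:

```python
def hive_jdbc_url(host, port=10000, database="default"):
    """Assemble a hive2 JDBC URL. The database goes in the URL path;
    the username/password are passed separately to getConnection."""
    return f"jdbc:hive2://{host}:{port}/{database}"

print(hive_jdbc_url("192.168.0.6"))  # → jdbc:hive2://192.168.0.6:10000/default
```

Note that, as the accepted answer points out, a privilege error like the one above means authentication already succeeded; what remains is authorization on the default database.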