Hadoop Credentials using Password File - hadoop

I was going through the documentation of Hadoop Credentials as provided in
https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/CredentialProviderAPI.html
But while using the 3rd option, providing the password for the keystore using a password file, I get a failure every time. An excerpt of the command used is provided below. Can anyone tell me what the error is and how to rectify it?
hadoop credential -Dhadoop.security.credstore.java-keystore-provider.password-file=/home/dir/test.txt create mssql2.password -value 'SomePassword' -provider localjceks://file/home/dir/aws3.jceks
The Error is provided below:
java.io.IOException: Password file does not exist
at org.apache.hadoop.security.ProviderUtils.locatePassword(ProviderUtils.java:135)
at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.locateKeystore(AbstractJavaKeyStoreProvider.java:323)
at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.<init>(AbstractJavaKeyStoreProvider.java:86)
at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider.<init>(LocalJavaKeyStoreProvider.java:58)
at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider.<init>(LocalJavaKeyStoreProvider.java:50)
at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider$Factory.createProvider(LocalJavaKeyStoreProvider.java:177)
at org.apache.hadoop.security.alias.CredentialProviderFactory.getProviders(CredentialProviderFactory.java:58)
at org.apache.hadoop.security.alias.CredentialShell$Command.getCredentialProvider(CredentialShell.java:181)
at org.apache.hadoop.security.alias.CredentialShell$CreateCommand.validate(CredentialShell.java:345)
at org.apache.hadoop.security.alias.CredentialShell.run(CredentialShell.java:81)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.security.alias.CredentialShell.main(CredentialShell.java:460)

The problem is that this property expects the file name, not the file path; the Hadoop API will then search for a file with that name on the Hadoop classpath. Since this file contains the password in clear text, another option is to simply
export HADOOP_CREDSTORE_PASSWORD=${PASSWORD}
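For the password-file route, here is a minimal sketch of a corrected invocation, assuming the directory containing test.txt is added to the Hadoop classpath so the provider can resolve the file by name (the paths, alias, and keystore location are taken from the question):
# make the directory holding the password file visible on the classpath
export HADOOP_CLASSPATH=/home/dir:$HADOOP_CLASSPATH
# reference the password file by name only, not by its full path
hadoop credential -Dhadoop.security.credstore.java-keystore-provider.password-file=test.txt create mssql2.password -value 'SomePassword' -provider localjceks://file/home/dir/aws3.jceks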

Related

What is the default H2 console Administration password

I'm trying to figure out what the default administration password is. I've tried to add webAdminPassword=pass in .h2.server.properties but it is not working.
Try this: the answer is -user "".
Otherwise try:
name = "sa"
password = ""
In case you got stuck with the default non-blank user when running the client, the full set of parameters will get you past that:
java -cp \h2.jar org.h2.tools.Shell -url "jdbc:h2:file:" -driver "org.h2.Driver" -user "" -password ""
Please go through the link below for more info:
https://www.ge.com/digital/documentation/meridium/APMConnect/V4302_UDLP210/Content/ChangeH2ConsolePassword.htm
There is no default in H2 itself. If you have a system tray icon of H2 Console, you can open the console from its context menu and you will be able to access these features without a password in the opened window.
In all other cases you need to set up the password explicitly. You can add it to the configuration file .h2.server.properties in the home directory of your user with webAdminPassword=some_password (don't forget to restart the console), or you can pass it on the command line with java -cp h2-1.4.200.jar org.h2.tools.Server -web -webAdminPassword some_password …, and that password will be used instead of the one from the configuration file.
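For illustration, a minimal sketch of the relevant entry in ~/.h2.server.properties (assuming H2 1.4.x as above; any other entries the console has already written to the file are left untouched):
webAdminPassword=some_password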
Your question does not describe how and where the H2 Console was started. If it was started on another system by another user, you need to edit the configuration file on that system in the profile of that user. If another password was passed in the startup parameters, the configuration file has no effect and you need to figure out what was passed to it.
With Spring Boot you define the following property in the application properties: spring.h2.console.settings.web-admin-password. For a standalone H2 Console, use the solution posted by Evgenij Ryazanov.
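For example, in a Spring Boot application.properties (the enabled flag is shown only for completeness and is an assumption about your setup):
spring.h2.console.enabled=true
spring.h2.console.settings.web-admin-password=some_password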

'ORA-46632: password-based keystore does not exist' but the file ewallet.p12 exists

I'm using Oracle 12c, and when I try to create an auto-login keystore with this command:
ADMINISTER KEY MANAGEMENT
CREATE AUTO_LOGIN KEYSTORE FROM KEYSTORE
'home/BetaCrasher/app/BetaCrasher/admin/orcl/wallet'
IDENTIFIED BY hello;
I get this error:
ORA-46632: password-based keystore does not exist
I checked the path, and the file for the keystore is there.
I also tried using this path and it still doesn't work:
'home/BetaCrasher/app/BetaCrasher/admin/orcl/wallet/ewallet.p12'
I think you might be missing a leading slash in your path, since it appears to be a full path to the wallet folder.
Also, don't forget to grant access to the path to your Oracle's OS user.
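For illustration, the same statement with an absolute path, a sketch based only on the path and password from the question:
ADMINISTER KEY MANAGEMENT
CREATE AUTO_LOGIN KEYSTORE FROM KEYSTORE
'/home/BetaCrasher/app/BetaCrasher/admin/orcl/wallet'
IDENTIFIED BY hello;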

Issue creating/accessing hive external table with s3 location from spark thrift service

I have configured the s3 keys (access key and secret key) in a jceks file using hadoop-credential api. Commands used for the same are as below:
hadoop credential create fs.s3a.access.key -provider jceks://hdfs#nn_hostname/tmp/s3creds_test.jceks
hadoop credential create fs.s3a.secret.key -provider jceks://hdfs#nn_hostname/tmp/s3creds_test.jceks
Then, I am opening a connection to Spark Thrift Server using beeline and passing the jceks file path in the connection string as below:
beeline -u "jdbc:hive2://hostname:10001/;principal=hive/_HOST#?hadoop.security.credential.provider.path=jceks://hdfs#nn_hostname/tmp/s3creds_test.jceks;
Now, when I try to create an external table with the location in s3, it fails with the below exception:
CREATE EXTERNAL TABLE IF NOT EXISTS test_table_on_s3 (col1 String, col2 String) row format delimited fields terminated by ',' LOCATION 's3a://bucket_name/kalmesh/';
Exception: Error: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: java.nio.file.AccessDeniedException s3a://bucket_name/kalmesh: getFileStatus on s3a://bucket_name/kalmesh: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: request_id), S3 Extended Request ID: extended_request_id=) (state=,code=0)
I don't think jceks support for the fs.s3a secrets went in until Hadoop 2.8; I'm not certain, as it's hard to tell from the source. If that is the case, and you are using Hadoop 2.7, then the secret isn't going to be picked up. I'm afraid you will have to put it in the config.
I had a similar situation, just with Drill instead of Hive. But like in your case:
using Hadoop 2.9 jars (1st version to support AWS KMS)
writing to s3a://
encrypting with SSE-KMS
... and got AmazonS3Exception: Access Denied.
In my case (perhaps in yours as well) the exception description was a bit ambiguous. The reported AmazonS3Exception: Access Denied did not originate from S3 but from KMS! Access was denied to the key I used for encryption. The user making the API calls was not on the key's users list; once I added that user to the key's list, writing started to work and I could create encrypted tables on s3a://...
For me the following s3 permissions were required:
s3:ListBucket
s3:GetObject
s3:PutObject
I was receiving the same error and was missing s3:ListBucket.
As for KMS permissions (if applicable):
kms:Decrypt
kms:Encrypt
kms:GenerateDataKey
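Putting those together, a hedged sketch of an IAM policy covering the permissions listed above; the bucket name, account id, and key id are placeholders rather than values from the question:
{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Action": ["s3:ListBucket"], "Resource": "arn:aws:s3:::bucket_name" },
    { "Effect": "Allow", "Action": ["s3:GetObject", "s3:PutObject"], "Resource": "arn:aws:s3:::bucket_name/*" },
    { "Effect": "Allow", "Action": ["kms:Encrypt", "kms:Decrypt", "kms:GenerateDataKey"], "Resource": "arn:aws:kms:us-east-1:111122223333:key/key-id" }
  ]
}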

Hdfs to s3 Distcp - Access Keys

For copying the file from HDFS to S3 bucket I used the command
hadoop distcp -Dfs.s3a.access.key=ACCESS_KEY_HERE \
-Dfs.s3a.secret.key=SECRET_KEY_HERE /path/in/hdfs s3a://BUCKET_NAME
But the access key and secret key are visible here, which is not secure.
Is there any method to provide the credentials from a file?
I don't want to edit the config file, which is one of the methods I came across.
I also faced the same situation, and got temporary credentials from the instance metadata. (In case you're using an IAM user's credentials, please note that the temporary credentials mentioned here come from an IAM role, which is attached to the EC2 server rather than to a person; refer to http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html)
I found that only specifying the credentials in the hadoop distcp command will not work.
You also have to specify the config fs.s3a.aws.credentials.provider (refer to http://hortonworks.github.io/hdp-aws/s3-security/index.html#using-temporary-session-credentials).
The final command will look like the one below:
hadoop distcp -Dfs.s3a.aws.credentials.provider="org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider" -Dfs.s3a.access.key="{AccessKeyId}" -Dfs.s3a.secret.key="{SecretAccessKey}" -Dfs.s3a.session.token="{SessionToken}" s3a://bucket/prefix/file /path/on/hdfs
Recent (2.8+) versions let you hide your credentials in a jceks file; there's some documentation on the Hadoop S3 page. That way: no need to put any secrets on the command line at all; you just share them across the cluster and then, in the distcp command, set hadoop.security.credential.provider.path to the path, like jceks://hdfs@nn1.example.com:9001/user/backup/s3.jceks
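A hedged sketch of what that could look like as a distcp invocation (the provider path reuses the example from this answer; the source and destination paths are illustrative):
hadoop distcp -D hadoop.security.credential.provider.path=jceks://hdfs@nn1.example.com:9001/user/backup/s3.jceks /path/in/hdfs s3a://bucket/prefix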
Fan: if you are running in EC2, the IAM role credentials should be automatically picked up from the default chain of credential providers: after looking for the config options & env vars, it tries a GET of the EC2 http endpoint which serves up the session credentials. If that's not happening, make sure that com.amazonaws.auth.InstanceProfileCredentialsProvider is on the list of credential providers. It's a bit slower than the others (and can get throttled), so best to put near the end.
Amazon allows you to generate temporary credentials that you can retrieve from http://169.254.169.254/latest/meta-data/iam/security-credentials/
As you can read there:
An application on the instance retrieves the security credentials provided by the role from the instance metadata item iam/security-credentials/role-name. The application is granted the permissions for the actions and resources that you've defined for the role through the security credentials associated with the role. These security credentials are temporary and we rotate them automatically. We make new credentials available at least five minutes prior to the expiration of the old credentials.
The following command retrieves the security credentials for an IAM role named s3access.
$ curl http://169.254.169.254/latest/meta-data/iam/security-credentials/s3access
The following is example output.
{
"Code" : "Success",
"LastUpdated" : "2012-04-26T16:39:16Z",
"Type" : "AWS-HMAC",
"AccessKeyId" : "AKIAIOSFODNN7EXAMPLE",
"SecretAccessKey" : "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
"Token" : "token",
"Expiration" : "2012-04-27T22:39:16Z"
}
For applications, AWS CLI, and Tools for Windows PowerShell commands that run on the instance, you do not have to explicitly get the temporary security credentials — the AWS SDKs, AWS CLI, and Tools for Windows PowerShell automatically get the credentials from the EC2 instance metadata service and use them. To make a call outside of the instance using temporary security credentials (for example, to test IAM policies), you must provide the access key, secret key, and the session token. For more information, see Using Temporary Security Credentials to Request Access to AWS Resources in the IAM User Guide.
If you do not want to use the access and secret key (or show them in your scripts), and your EC2 instance has access to S3, then you can use the instance credentials:
hadoop distcp \
-Dfs.s3a.aws.credentials.provider="com.amazonaws.auth.InstanceProfileCredentialsProvider" \
/hdfs_folder/myfolder \
s3a://bucket/myfolder
Not sure if it is because of a version difference, but to use "secrets from credential providers" the -Dfs flag would not work for me; I had to use the -D flag as shown in the Hadoop 3.1.3 "Using_secrets_from_credential_providers" docs.
First I saved my AWS S3 credentials in a Java Cryptography Extension KeyStore (JCEKS) file.
hadoop credential create fs.s3a.access.key \
-provider jceks://hdfs/user/$USER/s3.jceks \
-value <my_AWS_ACCESS_KEY>
hadoop credential create fs.s3a.secret.key \
-provider jceks://hdfs/user/$USER/s3.jceks \
-value <my_AWS_SECRET_KEY>
Then the following distcp command format worked for me.
hadoop distcp \
-D hadoop.security.credential.provider.path=jceks://hdfs/user/$USER/s3.jceks \
/hdfs_folder/myfolder \
s3a://bucket/myfolder

TeamCity forgotten admin password - where to look?

I need to recover/reset the admin password for JetBrains' TeamCity.
I have full RDP access to the server, so no problems there. It's just been two months since we used it, so now I have forgotten my login; my usual ones don't work.
It is set up without a database at the moment, so I was hoping the usernames would just be in a file somewhere, but no luck finding them so far.
From TeamCity 8 you can log in as a super user and change the password that way. You just need to use an empty username and, as the password, the last occurrence of the "super user authentication token" found in the logs\teamcity-server.log file.
Please see the following for more information:
TeamCity 8 - http://confluence.jetbrains.com/display/TCD8/Super+User
TeamCity 9 - http://confluence.jetbrains.com/display/TCD9/Super+User
TeamCity 10 - https://confluence.jetbrains.com/display/TCD10/Super+User
In case none of those works, see http://sebastienlachance.com/post/Resetting-TeamCity-Password.aspx.
Open a command prompt and go to the [TeamCity install folder]\webapps\ROOT\WEB-INF\lib folder. Now type the following:
..\..\..\..\jre\bin\java.exe -cp server.jar;common-api.jar;commons-codec-1.3.jar;util.jar;hsqldb.jar ChangePassword username newpassword
For TeamCity 6.5.4
From a command prompt in the [TeamCity install folder]\webapps\ROOT\WEB-INF\lib:
..\..\..\..\jre\bin\java -cp server.jar;common-api.jar;commons-codec-1.3.jar;util.jar;hsqldb.jar ChangePassword admin NewPassword
My username was 'admin' in my case (I think I set it during installation but I can't be sure).
I omitted the path-to-TeamCity argument; it's smart enough to use the correct path (mine was c:\users\administrator.BuildServer)
When I provided the (wrong) path to TeamCity as an argument I received this message:
Using TeamCity configuration directory path: c:/TeamCity/.BuildServer
Exception in thread "main" java.sql.SQLException: Table not found in statement [UPDATE users SET PASSWORD = ? WHERE USERNAME = ? AND REALM IS NULL]
at org.hsqldb.jdbc.Util.throwError(Util.java:58)
at org.hsqldb.jdbc.jdbcPreparedStatement.<init>(jdbcPreparedStatement.java:1833)
at org.hsqldb.jdbc.jdbcConnection.prepareStatement(jdbcConnection.java:580)
at ChangePassword.main(ChangePassword.java:14)
In case this confuses other people too.
Super user direct URL :
http://servername:port/login.html?super=1
Open the TeamCity log folder (for example C:\TeamCity\logs), open teamcity-server.log, and find the line containing "Super user authentication":
[2019-03-04 12:14:30,770] INFO - jetbrains.buildServer.SERVER - Super user authentication token: `8347518935696887114` (use empty username with the token as the password to access the server)
You could try to reset the TeamCity installation by removing the TeamCity data directory (the .BuildServer directory by default).
Try the following:
First stop the TeamCity service (would also stop the build agent if installed).
Next open up a console, go to your java directory and run the following command from there:
java.exe -cp server.jar;hsqldb.jar ChangePassword USERNAME PASSWORD "PATH_TO_YOUR_TEAMCITY_INSTALLATION".BuildServer
I've just had to go through this pain with v5 EAP.
I managed to reset the password successfully by running:
C:\TeamCity\webapps\ROOT\WEB-INF\lib>..\..\..\..\jre\bin\java -cp server.jar;common-api.jar;commons-codec-1.3.jar;util.jar;hsqldb.jar ChangePassword admin password c:\TeamCity\.BuildServer
Although you'll need to substitute C:\TeamCity with wherever your installation is located.
In case this helps someone else, whoever installed TeamCity on my server placed the build directory under the Administrator's Profile, not in C:\TeamCity.
Specifically for TeamCity running on Tomcat on Windows it will be C:\ProgramData\JetBrains\TeamCity
The directory specified as the last parameter needs to be your Data Directory (find in /logs/teamcity-server.log)
You'll get the 'Table not found' error if you don't have this correct.
You'll get a 'The database is already in use' error if you've got TeamCity running.
You can also search your logs/teamcity-server.log to see whether you created admin, administrator, or some other admin user name.
For everyone who arrives at this article years after the original answer, like I just did: there is a built-in super user account whose password is regenerated every time TeamCity is started, and that password is written to the log. You can use this super user to log in and reset any passwords. It's super easy.
https://confluence.jetbrains.com/display/TCD9/Super+User
TeamCity always uses a database - if you haven't explicitly configured one, it uses a HSQLDB database to store data internally.
When using an external database, user information is stored within that database, so it seems pretty likely that the user information in your case will be stored within the HSQLDB system.
You might be able to gain access to the system by futzing around with the database - but I'd suggest taking a backup first.
Second suggestion - drop the support guys at JetBrains an email. Even before my workplace splashed out on a TeamCity Enterprise license, their support was superb - fast, accurate and helpful.
With TeamCity 5 using MySQL (probably other versions and RDBMs as well, but untested), it's possible to update the password directly via SQL:
mysql> update users set password = md5("mypass123") where username = "bob";
Nevertheless, I'd stick with the CLI versions already mentioned by others if there isn't a good reason not to do so.
First point: if you log out, the login screen has the username 'TCAdmin' already filled in, when it should be 'administrator'. TCAdmin is the full name of (I think) the default version 5 admin user. Changing the username to 'administrator' and then entering the password I thought it was solved my issue.
For resetting...
In case it helps someone else on Windows XP on version 5 of TeamCity, my .BuildServer config info was also under my current logged in user's documents and settings folder. Also I was tripped up by a space in the list of jar files in Sebastien's good answer above.
So I changed to this directory in a command prompt:
c:\teamcity\webapps\ROOT\WEB-INF\lib
and then this command line (to set password: Password1) worked for me:
C:\TeamCity\webapps\ROOT\WEB-INF\lib>..\..\..\..\jre\bin\java.exe -cp server.jar;commonapi.jar;commons-codec-1.3.jar;util.jar;hsqldb.jar ChangePassword administrator Password1
Which gave output:
Using TeamCity configuration directory path: C:/Documents and Settings/tamw/.BuildServer
Password changed successfuly
Stop TeamCity.
You should pass the path to your build server data directory as the last argument,
e.g. if you installed the build server to dir "c:\.BuildServer":
..\..\..\..\jre\bin\java.exe -cp server.jar;common-api.jar;commons-codec-1.3.jar;util.jar;hsqldb.jar ChangePassword username newpassword c:\.BuildServer
To change a user password:
Shut down the server
Switch to the <TeamCity installation directory>/webapps/ROOT/WEB-INF/lib directory
Invoke the following command:
Windows platform:
java -cp server.jar;common-api.jar;commons-codec-1.3.jar;util.jar;hsqldb.jar ChangePassword <username> <new password> [<path to TeamCity data directory>]
Unix platform:
java -cp server.jar:common-api.jar:commons-codec-1.3.jar:util.jar:hsqldb.jar ChangePassword <username> <new password> [<path to TeamCity data directory>]
You can skip the data directory option if you are using the default path for TeamCity data files (<user home>/.BuildServer).
[Ref: http://confluence.jetbrains.com/display/TCD7/Changing+user+password+with+default+authentication+scheme]
Here's what worked for me.
Shut down server services
> cd c:\TeamCity\webapps\ROOT\WEB-INF\lib
then
> ..\..\..\..\jre\bin\java.exe -cp server.jar;common-api.jar;commons-codec-1.3.jar;util.jar;hsqldb.jar ChangePassword admin password1 C:\ProgramData\JetBrains\TeamCity\
Without the path at the end, it would fail with:
Exception in thread "main" java.sql.SQLException: Table not found in statement [UPDATE users SET PASSWORD = ? WHERE USERNAME = ? AND REALM IS NULL]
at org.hsqldb.jdbc.Util.throwError(Util.java:58)
at org.hsqldb.jdbc.jdbcPreparedStatement.<init>(jdbcPreparedStatement.java:1833)
at org.hsqldb.jdbc.jdbcConnection.prepareStatement(jdbcConnection.java:580)
at ChangePassword.main(ChangePassword.java:14)
We are using Teamcity 7 with MS SQL Server as the RDBMS.
To reset your password you can use the following query:
UPDATE users SET password = LOWER(SUBSTRING(sys.fn_sqlvarbasetostr(HASHBYTES('md5', 'your_new_password')), 3, 32))
WHERE username = 'your_user_name';
Alternatively, you could use the TeamCity server log and retrieve the Super User token.
Using the token, go to the URL: http://(server):(port)/login.html?super=1
e.g.: http://localhost:92/login.html?super=1
Once you are logged in, you can create a new user or reset the password for the account in question.
I was in the same situation and logged in with the super user; follow the steps below:
1 - Get the token from teamcity-server.log in the path XX:\TeamCity\logs;
2 - Log in using the token at the URL /login.html?super=1;
More about it:
https://confluence.jetbrains.com/display/TCD18/Super+User
