Getting an error connecting to Kafka with Kerberos - Go

Getting this error:
SASL authentication error: SASL handshake failed (step): SASL(-1): generic failure: GSSAPI Error: An invalid name was supplied (Success) (after 25ms in state AUTH_REQ)
The kinit command works fine with the keytab file, so I need to know what the issue might be.
I looked around on the internet and tried changing the file permissions on the krb5.conf file, but nothing changed.
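As a reference, an "invalid name" GSSAPI failure usually points at the Kerberos service principal the client builds for the broker (typically kafka/<broker FQDN>) rather than at the keytab itself, so two hedged checks on the client host can narrow it down; the keytab path, broker FQDN, and realm below are placeholders. If the client is librdkafka-based (as confluent-kafka-go is), the relevant settings are sasl.kerberos.service.name, sasl.kerberos.principal, and sasl.kerberos.keytab.
# List the principals stored in the keytab and confirm the realm matches krb5.conf
klist -kt /path/to/client.keytab
# Ask the KDC for a ticket for the broker's service principal; if this fails, the
# broker host name (often an IP or short name instead of the FQDN) is the likely problem
kvno kafka/broker01.example.com@EXAMPLE.COM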

Related

Passwordless chef client bootstrapping

I am a bit familiar with Chef and its bootstrapping techniques. I am trying to bootstrap my new chef-client/node without passing a password.
I tried the below by generating an SSH key, but it is still failing:
knife bootstrap MY_NODE_IP -x SERVER_ADMIN_USERNAME -i PATH_TO_KEY_FILE --sudo --node-name THE_NODE_NAME
On triggering the above command from Chef DK, I get the error below:
WARN: [SSH] PTY requested: stderr will be merged into stdout
WARN: [SSH] connection failed, terminating (#<Net::SSH::AuthenticationFailed: Authentication failed for user user@mynode>)
ERROR: Train::Transports::SSHFailed: SSH session could not be established
I also tried doing a manual installation as per the instructions below, but again it failed: https://serverfault.com/questions/761167/how-to-manually-set-up-a-chef-node
I created a client manually, but I was unable to create a node on the Chef server manually. Please suggest.
I am getting a networking error as below:
Networking Error:
-----------------
Error connecting to https://myserver/organizations/organization/nodes/mynode - Failed to open TCP connection to www.internet:8080 (getaddrinfo: Name or service not known)
Bootstrapping from my Chef DK also throws an error.
Is there a way to bootstrap a Linux Chef client without using a password from a Windows Chef DK?
Below is my Chef environment:
1. Chef Infra Client: 15.14.0
2. Chef Workstation: 0.8.7.1
3. Chef Server: 12.18.14
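For what it's worth, here is a minimal sketch of key-based bootstrapping; the key path, user, and node names are placeholders, and it assumes an OpenSSH-style shell (for example Git Bash or WSL on the Windows workstation). The public key has to be authorized on the node once before knife can log in without a password; a Net::SSH::AuthenticationFailed error like the one above is typically what you get when the key is not yet in the node user's authorized_keys.
# Generate a key pair on the workstation (skip if one already exists)
ssh-keygen -t rsa -b 4096 -f ~/.ssh/chef_bootstrap -N ""
# Install the public key on the node once; this one-time step still needs the node
# password (or console access), after that knife authenticates with the key alone
ssh-copy-id -i ~/.ssh/chef_bootstrap.pub SERVER_ADMIN_USERNAME@MY_NODE_IP
# Bootstrap using the private key instead of a password
knife bootstrap MY_NODE_IP -x SERVER_ADMIN_USERNAME -i ~/.ssh/chef_bootstrap --sudo --node-name THE_NODE_NAME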

Uploading file to client server

I am trying to upload a file to a client server with the help of a shell script, and I am facing the error below both in the command prompt and in Cygwin.
curl: (35) schannel: next InitializeSecurityContext failed: Unknown error (0x80092012) - The revocation function was unable to check revocation for the certificate.
This happens even after adding the certificate.
I am using Windows X OS.
Can anyone suggest how to solve this issue?
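For reference, error 0x80092012 means curl's schannel backend could not reach the certificate's revocation (CRL/OCSP) servers, which is common behind a proxy or on an isolated network. One hedged workaround is the schannel-specific flag that skips the revocation check; the URL and file name below are placeholders, and skipping revocation does weaken certificate checking.
# Upload while telling the schannel backend not to require a revocation check
curl --ssl-no-revoke -T myfile.txt https://client.example.com/upload/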

Greenplum initialization failed

When I tried to initialize Greenplum, I got the following error:
20180408:23:21:02:017614 gpstop:datanode3:root-[INFO]:-Starting gpstop with args:
20180408:23:21:02:017614 gpstop:datanode3:root-[INFO]:-Gathering information and validating the environment...
20180408:23:21:02:017614 gpstop:datanode3:root-[ERROR]:-gpstop error: postmaster.pid file does not exist. is Greenplum instance already stopped?
Also, when I tried to check with the gpstate command, I got the following error:
20180408:23:21:48:017711 gpstate:datanode3:root-[INFO]:-Starting gpstate with args:
20180408:23:21:48:017711 gpstate:datanode3:root-[INFO]:-local Greenplum Version: 'postgres (Greenplum Database) 5.7.0 build f7c6eb5-oss'
20180408:23:21:48:017711 gpstate:datanode3:root-[CRITICAL]:-gpstate failed. (Reason='could not connect to server: Connection refused
I also did the configuration and added a permission in postgresql.conf, but the issue is the same.
You have pasted the output of gpstop:
gpstop error: postmaster.pid file does not exist. is Greenplum instance already stopped?
This means that the database is not running.
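A minimal sketch of getting the instance up before re-checking its state; the install path and master data directory below are assumptions, and Greenplum utilities are normally run as the gpadmin user rather than root.
# Run as gpadmin with the Greenplum environment sourced (paths are examples)
source /usr/local/greenplum-db/greenplum_path.sh
export MASTER_DATA_DIRECTORY=/data/master/gpseg-1
gpstart -a     # start the cluster; gpstop and gpstate need a running instance
gpstate -s     # then re-check the detailed cluster state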

Ispconfig 3: Error: Authentication failed. Error: Critical error Error: Could not connect to server

I am using ISPConfig, and I have created an FTP user with the respective password. But when I try to connect to that server, I am experiencing this error:
Error: Authentication failed.
Error: Critical error
Error: Could not connect to server
I tried all the solutions given here, but to no avail.
So I found the error:
I had used FileZilla to log in to the server to upload some files, and the connection was using the sftp:// protocol.
When I tried to connect with the FTP account, the above error was reported, even after I updated/confirmed the user password.
Finally I realized that the protocol was wrong; after removing the sftp:// prefix, it defaulted to ftp://, which the server readily accepted.
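A quick way to confirm the same thing from a shell is to hit the server with an explicit ftp:// URL; the host name below is a placeholder.
# Lists the FTP user's home directory if the plain-FTP credentials are accepted
curl -u FTP_USERNAME ftp://server.example.com/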

In CDH 5, after changing the authentication from Kerberos to simple, data nodes are not coming up

After changing the properties (authentication from Kerberos to simple, and authorization to false), the data nodes are not coming up.
I made these changes from Cloudera Manager (CDH 5.0).
In the logs I see the error below:
java.io.IOException: Failed on local exception: java.net.SocketException: Permission denied; Host Details : local host is: "ip-10-0-0-132.us-west-2.compute.internal"; destination host is: (unknown):0;
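One thing worth checking after switching a cluster out of secure mode, offered only as a hedged guess at the cause: a Kerberized DataNode binds to privileged ports (1004/1006) and is started as root for that purpose, while in simple mode it normally goes back to the non-privileged defaults (50010/50075 on CDH 5). If the old port settings are left behind, the DataNode can fail with a "Permission denied" socket error. The property names below are stock HDFS ones.
# Check whether the DataNode is still configured with the privileged secure-mode ports
hdfs getconf -confKey dfs.datanode.address
hdfs getconf -confKey dfs.datanode.http.address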
