How to use OpenLDAP and a slapd proxy to authenticate against Active Directory without the deprecated slapd.conf

I have a Linux-based app that uses OpenLDAP to communicate with slapd to retrieve authorization data, and I'd like to get authentication working via a connection to Active Directory. The examples I've seen all use the slapd proxy with the slapd.conf file; however, I'm not using slapd.conf, as it appears to have been deprecated. How do you configure slapd to be a proxy into Active Directory without slapd.conf?
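For orientation, the slapd.conf proxy examples translate to the dynamic cn=config database using the back_ldap proxy backend. A minimal sketch, assuming placeholder names (ad.example.com, dc=example,dc=com) and loading with ldapmodify -Y EXTERNAL -H ldapi:///:

# Load the LDAP proxy backend module (the module index may differ per install)
dn: cn=module{0},cn=config
changetype: modify
add: olcModuleLoad
olcModuleLoad: back_ldap

# Proxy database forwarding a suffix to Active Directory (placeholder names)
dn: olcDatabase=ldap,cn=config
changetype: add
objectClass: olcDatabaseConfig
objectClass: olcLDAPConfig
olcDatabase: ldap
olcSuffix: dc=example,dc=com
olcDbURI: ldap://ad.example.com

With this in place, slapd forwards operations under the suffix to AD, and binds are proxied through so that AD performs the actual authentication.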

Related

Connection over a proxy with QuickFix/n

I am trying to establish a connection with QuickFix/n, but I am behind the company firewall, so I need to connect to the acceptor over a proxy.
According to the QuickFIX/J documentation this is possible in Java; however, the QuickFix/n documentation has no information about connecting to the acceptor over a proxy from C#.
Is it possible at all to do that from C# with QuickFix/n?
In Java this can be done via the configuration file, e.g.:
ProxyType=http
ProxyVersion=1.1
ProxyHost=XXX.XXX.XXX.XXX
ProxyPort=YYYY
QuickFix/n v1.7.0 does not support proxies at all.
QuickFix/n v1.10.0 does support a proxy (picked up automatically via WebRequest.GetSystemWebProxy()), but it does not support supplying proxy credentials (user name and password).
I have created my own fork of v1.7.0 that supports a user-defined proxy (rather than WebRequest.GetSystemWebProxy()) plus credentials. It can be found here: https://github.com/mcjacek/quickfixn
ProxyHost=proxy.intranet.yourproxy.com
ProxyPort=8080
ProxyUserName=UserName
ProxyUserNamePassword=Password
I will refactor it soon and create a PR to merge it into master.
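For reference, the standard .NET mechanism for a user-defined proxy with credentials, which settings like the above would map onto, is roughly this (host, port, and credentials are placeholders):

// A user-defined proxy with explicit credentials in .NET
// (host, port, user name, and password are placeholders)
using System.Net;

var proxy = new WebProxy("proxy.intranet.yourproxy.com", 8080)
{
    Credentials = new NetworkCredential("UserName", "Password")
};

This mirrors the ProxyHost/ProxyPort/ProxyUserName/ProxyUserNamePassword settings shown above, as opposed to WebRequest.GetSystemWebProxy(), which returns the system-configured proxy.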

Snowflake ODBC driver not using valid proxy environment variables?

I have a proxy at work that I'd like to use to connect PowerBI and Tableau to Snowflake. I should be able to set my proxies as the Snowflake documentation specifies, but this doesn't work for me when I try to connect to Snowflake using PowerBI or Tableau ('Couldn't resolve proxy name' errors).
I know that I am using a working proxy because I can use it manually to install Python libraries. Here is the Snowflake documentation:
https://docs.snowflake.com/en/user-guide/odbc-parameters.html#using-environment-variables
As per the documentation, I am setting the proxy as follows via cmd line:
set http_proxy=<http_proxy>
set https_proxy=<https_proxy>
I've also tried setting these manually as environment variables. I know these proxies work because I can use them to install Python libraries:
python -m pip --proxy <http_proxy> install -r <github_requirements>
What else might be stopping me from using the proxy? I went so far as to set up an AWS EC2 instance to get around the proxy and confirm that my Snowflake credentials are valid, and they are. But I'd like to connect to Snowflake without AWS, since that would eventually add extra cost.
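One possibility worth ruling out, as a guess: set in a cmd window only applies to that session, so GUI applications such as PowerBI or Tableau launched outside that window never see the variables. Persisting them (with a placeholder proxy URL) looks like this; restart the applications afterwards so new processes pick up the environment:

:: Persist the proxy variables for future processes (placeholder URL)
setx http_proxy http://proxy.example.com:8080
setx https_proxy http://proxy.example.com:8080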

How to use anything but Google Cloud Shell or a web browser when oauth2.googleapis.com is blacklisted (not sure about this)?

I cannot connect to Google services from a client application when it tries to communicate with oauth2.googleapis.com (which is probably blocked on my corporate network; I don't know how to test that for sure).
I tried BigQuery with the JDBC driver in DBeaver, with basic settings.
User-based login does this:
It generates a link for OAuth. I open the browser and log in with the right Google account. Then I paste the generated code into DBeaver and am told that authentication has failed.
Service-based login does this:
It does not want me to visit any webpage. It just tells me:
[Simba][BigQueryJDBCDriver](100004) HttpTransport IO error : oauth2.googleapis.com.
I also tried ODBC, where a proxy can be filled in, but no luck.
When I look at 'Proxy Options', the proxy port is always overwritten by the proxy host. Weird.
That is what happens when I click the 'catalog' or 'dataset' drop-down field. I can't take any further steps.
BUT!
When I set my HTTP proxy in the gcloud CLI, communication works, and I can call BigQuery from it.
Does that mean gcloud communicates through the HTTP proxy while DBeaver and ODBC do not? Or does it mean gcloud does not need oauth2.googleapis.com but ODBC and JDBC do, and it is blacklisted? I am confused.
We need to migrate from our internal environment to GCP, and we would love to use various applications. I would ask for oauth2.googleapis.com to be whitelisted, but I am not sure it is the only problem, as the gcloud app works without any flaws.
I am not experienced with networking, so I am more than happy to update or correct this question, or add any info you need, to help me understand this issue. Thank you.
According to your description, your corporate network uses a proxy to reach the Internet; this is why gcloud can reach the BigQuery service once proxy settings are configured on your system, whether through the Cloud SDK proxy settings or the HTTP_PROXY environment variable.
You need to set up the proxy settings within the JDBC connection string, as described in the Simba JDBC driver documentation, e.g.:
jdbc:bigquery:DataSetId=MyDataSetId;ProjectId=MyProjectId;OAuthType=1;ProxyHost=MyProxyHost;ProxyPort=MyProxyPort;ProxyUID=MyProxyUsername;ProxyPWD=MyProxyPassword
This connection string passes the proxy settings to the Simba JDBC driver.
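For comparison, the Cloud SDK proxy settings mentioned above are typically configured like this (placeholder values); if gcloud works with these set, the proxy itself passes the OAuth traffic, and the remaining gap is the per-driver proxy configuration:

# Cloud SDK proxy properties (address and port are placeholders)
gcloud config set proxy/type http
gcloud config set proxy/address proxy.example.com
gcloud config set proxy/port 8080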

Accessing Cassandra without a hardcoded username and password

I have an existing DataStax Cassandra setup that is working. We just added authentication to the system, and now we can log in with our AD accounts. This is very nice and certainly works. However, applications need to use a hard-coded username/password in order to connect.
In SQL Server we were able to set up a user to run the service as, and it would then connect and authenticate through AD. In Cassandra, however, it is not the same.
If I don't want to include usernames and especially passwords in my app.config files what are my options?
You can use authentication via LDAP with DSE (DataStax Enterprise), so that the authentication stage is done against LDAP instead of the internal DSE authentication you're using at the moment. Note that my comments here apply to DSE 5.0 onwards, but you can use LDAP auth with earlier versions of DSE from 4.6 onwards.
The documentation (links below) covers this. The basic steps are as follows:
Configure your authenticator in the cassandra.yaml to use the DSE authenticator
authenticator: com.datastax.bdp.cassandra.auth.DseAuthenticator
Create an internal role in Cassandra to map to the LDAP group(s) in your LDAP server using the CREATE ROLE command
Ensure all the users you need map to the relevant LDAP group (part of your LDAP config)
Configure your dse.yaml to have the correct settings for your LDAP server (a sketch of both the role and the dse.yaml settings follows at the end of this answer)
Restart the DSE process for the settings to take effect
The following documentation gives some good examples and background information:
https://docs.datastax.com/en/latest-dse/datastax_enterprise/unifiedAuth/unifiedAuthConfig.html
https://docs.datastax.com/en/latest-dse/datastax_enterprise/sec/authLdapConfig.html
Note: when configuring dse.yaml, note the comment in the docs regarding user_search_filter:
When using Active Directory set the filter to (sAMAccountName={0})
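As a rough illustration of the CREATE ROLE and dse.yaml steps above, the pieces might look like the following; every name, DN, and password is a placeholder, and the exact ldap_options fields should be checked against your DSE version's documentation:

-- Internal role whose name matches the LDAP group (hypothetical group name)
CREATE ROLE 'dse_users' WITH LOGIN = true;

# dse.yaml: LDAP settings for an Active Directory server (placeholder values)
ldap_options:
  server_host: ad.example.com
  server_port: 389
  search_dn: cn=lookup,ou=users,dc=example,dc=com
  search_password: lookup_password
  user_search_base: ou=users,dc=example,dc=com
  user_search_filter: (sAMAccountName={0})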

Cannot connect to RabbitMQ from an ASP.NET application

I have an ASP.NET MVC application that interacts with RabbitMQ. Everything works great locally.
However, on our deployment server it cannot connect:
DEBUG|MassTransit.RabbitMqTransport.Integration.RabbitMqConnectionCache|Connecting: muyuser@localhost:5672/|
ERROR|MassTransit.RabbitMqTransport.RabbitMqReceiveTransport|RabbitMQ connection failed: Connect failed: muyuser@localhost:5672/|
What I'm able to gather is this:
To connect to RabbitMQ you need a valid .erlang.cookie in (on Windows) your user profile root.
As best I can tell, this cookie is created when you install RabbitMQ.
In development we're using localdb, which runs as the developer's user (which has this cookie).
In production the application runs under IIS, which uses the application pool and the built-in ApplicationPoolIdentity account, which doesn't have a user folder for the .erlang.cookie file to live in.
So the question becomes...what now? How is this intended to work?
Obviously we could create a dedicated user for the web application but our system administrator is understandably very reluctant to do this.
Another clue: when I RDPed in, logged in as myself, and tried to connect to Rabbit, I found that I could not. After troubleshooting I discovered that my cookie didn't match the cookie of others who could connect! I replaced it with the one from c:\windows\.erlang.cookie and could then connect from the CLI. It seems possible that there is a cookie installed somewhere for the ApplicationPoolIdentity, but that it is an incorrect cookie. Where would that cookie go?
Erlang cookies are used for internode communication, whether it is for clustering RabbitMQ or for contacting RabbitMQ via the command line using rabbitmqctl.
If you have problems with an AMQP connection, then the Erlang cookie has nothing to do with it.
Take a look at access control https://www.rabbitmq.com/access-control.html to see if your user is properly configured.
At the same time check the server logs to see why the connection is refused.
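In that spirit, a minimal sketch of where the AMQP credentials live in a MassTransit configuration (host, user name, and password are placeholders); if this connection fails, it is an access-control or credentials problem, not an Erlang cookie problem:

// Minimal MassTransit bus configuration with explicit AMQP credentials
// (host, user name, and password are placeholders)
using System;
using MassTransit;

var busControl = Bus.Factory.CreateUsingRabbitMq(cfg =>
{
    cfg.Host(new Uri("rabbitmq://localhost:5672/"), h =>
    {
        h.Username("muyuser"); // AMQP user, e.g. created with rabbitmqctl add_user
        h.Password("secret");
    });
});
busControl.Start();

On the broker side, rabbitmqctl list_users and rabbitmqctl set_permissions are the usual tools for checking that the account exists and has rights on the virtual host.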
