I'm looking for the correct way to give users anonymous access to a Kibana dashboard while preventing them from accessing other Kibana features.
I read that the role kibana_dashboard_only_user is now deprecated (I'm on 7.6.2), and that I have to give anonymous users the cluster monitor privilege, so I tried creating a role that has:
cluster monitor privilege
read and view-metadata access to the index concerned
dashboard read privilege across all spaces
but it doesn't work: I get the following error:
[security_exception] action [indices:data/read/search] is unauthorized
for user [anonymous_user]
Then I tried adding read rights on the .kibana* indices. It worked, but all the Kibana features are available, not only the dashboards.
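For reference, the role I'm creating looks roughly like this (a sketch via the Kibana role API; anonymous_dashboard_role and my-index-* are placeholder names), assigned to the anonymous user through xpack.security.authc.anonymous.roles in elasticsearch.yml:

PUT /api/security/role/anonymous_dashboard_role
{
  "elasticsearch": {
    "cluster": ["monitor"],
    "indices": [
      { "names": ["my-index-*"], "privileges": ["read", "view_index_metadata"] }
    ]
  },
  "kibana": [
    { "base": [], "feature": { "dashboard": ["read"] }, "spaces": ["*"] }
  ]
}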
How can I solve this?
Thanks.
Related
We have two different Azure resource groups, RG1 and RG2: RG1 hosts the source Databricks workspace (ADB_source), and RG2 hosts the sink workspace (ADB_sink) and the sink ADLS Gen2 account (ADLS_sink).
Use Case:
We have a few Delta tables in ADB_source (ACL enabled) to which a list of users has read access.
In the ADB_source workspace, we need to read those Delta tables and write them into ADLS_sink as Parquet for further processing at the sink.
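In code, the intent is roughly the following (a sketch; the table name and mount path are placeholders):

# Read a Delta table we have been granted read access to (this works)
df = spark.table("source_db.some_delta_table")

# Write it out as Parquet to the mounted ADLS_sink container (this is what fails, see Errors Observed below)
df.write.format("parquet").mode("overwrite").save("/mnt/adls_sink/output/")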
What's Available:
We have a high-concurrency cluster in the ADB_source workspace, which:
Allows only Python and SQL (dbutils.fs is also restricted).
Has credential passthrough disabled.
Has table ACLs enabled in the Spark config.
Has a mount point to a container in ADLS_sink.
Gives us no admin access to the cluster.
Errors Observed:
We can read the Delta tables and run action commands as expected as long as we stay within the ADB_source workspace. However, when we write that data to ADLS_sink with .save(), we get the error below.
Py4JJavaError: An error occurred while calling o410.save. : java.lang.SecurityException: User does not have permission SELECT on any file. User does not have permission MODIFY on any file.
I would appreciate it if anyone could explain this and recommend additional security checks/accesses needed to implement the use case successfully.
This is happening because table ACLs are enabled; please refer to the documentation below:
https://learn.microsoft.com/en-us/azure/databricks/kb/security/table-create-security-exception
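The error says the user lacks SELECT and MODIFY on ANY FILE, which is what a path-based write through a mount needs on a table-ACL-enabled cluster. If it fits your security model, an admin can grant those privileges, roughly like this (the principal is a placeholder; note that ANY FILE grants broad file access, so use it with care):

# Run as a workspace admin on the table-ACL-enabled cluster
spark.sql("GRANT SELECT ON ANY FILE TO `user@example.com`")
spark.sql("GRANT MODIFY ON ANY FILE TO `user@example.com`")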
I set up an Elasticsearch/Kibana stack with X-Pack.
I logged in with the built-in user account and created a kibana_user and a kibana_dashboard_only_user.
The login was successful for both users, but when I go to Dashboards or Discover, I get a pop-up error:
error in visualization: internal server error
Why is this happening, and how can I fix it?
All actions work fine with the built-in elastic user account.
Please help me.
You also need to give that user read access to the .kibana index.
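For example, a role along these lines (a sketch via the Elasticsearch security API; the role name is a placeholder) can be created and then assigned to the dashboard-only user:

PUT /_security/role/kibana_index_read
{
  "indices": [
    { "names": [ ".kibana*" ], "privileges": [ "read", "view_index_metadata" ] }
  ]
}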
I am trying to copy an object from the Phoenix region to Ashburn. The tenancy admin is still unable to perform this action. Am I missing any privileges?
I am seeing an error in the Work Request: The Service Cannot Access the Source Bucket.
Do I need to add additional policy statements?
Yes, the service needs access too.
You can refer to the documentation here, specifically:
Service Permissions
To enable object copy, you must authorize the service to manage objects on your behalf. To do so, create the following policy:
allow service objectstorage-<region_name> to manage object-family in compartment <compartment_name>
Because Object Storage is a regional service, you must authorize the Object Storage service for each region that will be carrying out copy operations on your behalf. For example, you might authorize the Object Storage service in region us-ashburn-1 to manage objects on your behalf. Once you do this, you will be able to initiate the copy of an object stored in a us-ashburn-1 bucket to a bucket in any other region, assuming that your user account has the required permissions to manage objects within the source and destination buckets.
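Following that example, since the source bucket here is in Phoenix (the region carrying out the copy), the statement would look something like this (the compartment name is a placeholder):

Allow service objectstorage-us-phoenix-1 to manage object-family in compartment <compartment_name>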
I have an Elastic cluster at https://www.elastic.co/. I enabled Kibana, but when I click Kibana, it prompts me to enter a user/password.
My login to the normal Elastic cluster doesn't work -- is there a default user/password that I can enter, or how do I actually get into my Kibana dashboard? Ideally, I would just like to disable having to log in (so just clicking the Kibana dashboard goes straight into it) -- is there a way to do that?
When you were creating your cluster, you were given a password for the user elastic. It should have warned you to save that password, as you cannot retrieve it later. If you have not saved it, you will need to generate a new password from the Security tab on the left within the cluster.
When you login into Kibana, you can create more users through Management > Elasticsearch: Security > Create User.
I am using an OpenLDAP server for centralised authentication. I want to count the number of times all clients have logged in against this OpenLDAP server. On googling, I found that logonCount exists for Active Directory, but I didn't find anything for OpenLDAP. Any help is appreciated. Thanks.
This is not supported directly in the user entry. You would have to enable the access log overlay and scan the access logs yourself.
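A minimal slapd.conf sketch of what that could look like (suffixes, directories, and the module path are placeholders; a cn=config setup would use the equivalent olc* attributes):

# load the accesslog overlay module (if overlays are built as modules)
moduleload accesslog.la

# your existing main database
database mdb
suffix dc=example,dc=com
directory /var/lib/ldap/data
# attach the accesslog overlay and record successful bind operations
overlay accesslog
logdb cn=accesslog
logops bind
logsuccess TRUE

# a dedicated database that stores the access log entries
database mdb
suffix cn=accesslog
directory /var/lib/ldap/accesslog
rootdn cn=accesslog
index reqStart eq

Each successful bind is then recorded as an auditBind entry under cn=accesslog with reqDN set to the authenticating DN, so a per-user login count is something like searching for (&(objectClass=auditBind)(reqDN=uid=jdoe,ou=people,dc=example,dc=com)) and counting the results.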