I made a JSON file as documented here: http://developers.gigya.com/display/GD/Users
But I can't find where to import the file.
Any clues?
As outlined in Gigya's documentation, "A migration to the Gigya platform is performed by exporting data from your current system to a JSON file and sending it to Gigya to import."
To schedule such a data migration, you will need to coordinate with the Account Manager assigned to your company's account at Gigya. They will be able to provide you with further instructions.
I'm wondering whether it is somehow possible to display a private report when we don't have access to the account or the company's Data Studio environment. Using the Google API would be ideal.
The reason we need an API is that the reports have to be displayed on a distributed network of displays where user interaction is not possible. The displays are all separated from each other and each runs in its own browser instance. Making the reports public in order to embed them is not an option, because sensitive data would be exposed to the internet. Embedding is fine, but we need some way of accessing the customer's data without user interaction, e.g. via access tokens. If the user authenticates to our app beforehand, and that allows us to then use their access token inside the embed iframe, that would be fine too.
As far as I can tell, I cannot display private reports through the official Google Data Studio API. I did find another Stack Overflow link that mentions community connectors. Would this still be a solution to our problem if we don't own the reports themselves? We are just displaying the reports and have access to neither their Google Cloud environment nor their data sources.
Any help is appreciated.
This is not supported by Google Data Studio.
In short, if you can't make your report public, and you can't authenticate to view it, you're out of luck.
The link you provided describes much more a hack than a feature. It suggests making your data public, with a custom authentication mechanism implemented in a Community Connector that requires a token as a parameter.
This way, your report will be available to anyone; however, since the report requires the token (a kind of password) to work, it won't show any data to unauthorized users, i.e. those who don't provide a valid token.
Although it is possible, I think it requires considerable effort to make it work, and, of course, it requires changing the report's connector. So yes, you need to own the report to connect to your data through the suggested Community Connector.
I am using Laravel to build an app for BigCommerce. I am able to get the access token, but I need to store it for future requests. What is the best way to store the app data for BigCommerce?
I got this working by creating a DB schema in Laravel with tables such as stores, users, and app_settings.
Whenever a user installs the app, I store the access token and other information, such as the store hash, in the stores table, and the user details in the users table.
Whenever the app is loaded, I can get the store and user information by verifying the signed request payload. Using this, I can configure the app settings for the user and store them in the app_settings table.
So when I create a webhook for the store, I can get the store hash from the payload's producer key and then look up the store's access token by store hash in the stores table.
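For reference, here is a rough Python sketch of that signed-payload verification step (a Laravel app would do the same thing with hash_hmac). It assumes BigCommerce's documented signed_payload format of two dot-separated base64 parts, where the second part is a hex-encoded HMAC-SHA256 of the decoded JSON; names are placeholders:

import base64
import hashlib
import hmac
import json

def verify_signed_payload(signed_payload, client_secret):
    # signed_payload arrives as "<base64 json>.<base64 hmac>"
    encoded_json, encoded_sig = signed_payload.split(".")
    payload = base64.b64decode(encoded_json)
    provided_sig = base64.b64decode(encoded_sig)
    expected_sig = hmac.new(
        client_secret.encode(), payload, hashlib.sha256
    ).hexdigest()
    # Constant-time comparison to avoid timing attacks.
    if not hmac.compare_digest(expected_sig.encode(), provided_sig):
        return None
    # The decoded payload carries the user and store info (e.g. store hash).
    return json.loads(payload)

The dict returned here is what the flow above uses to look up the store and user rows.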
If you're git-ignoring your config.json or .env files, you could store these there. However, after speaking with one of our Developer Advocates, I wanted to pass along some best-practice advice. :) You may want to consider using a secrets manager here, meaning a tool to safely store these variables, such as GitHub Secrets or Azure Key Vault.
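The point of the .env route is simply that the token never appears in the repo; a minimal sketch in Python (BC_ACCESS_TOKEN is a placeholder name set in the git-ignored .env file or by the secrets manager; a Laravel app would read the same value via env()):

import os

# Read the token from the environment at runtime rather than committing it.
access_token = os.environ["BC_ACCESS_TOKEN"]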
Also, this resource may be helpful to review for your use case. https://www.codementor.io/#ccornutt/keeping-credentials-secure-in-php-kvcbrk55z
I am aware of adding/managing users from the GUI in NiFi, i.e. an admin user can add users, groups, policies, etc.
This is maintained in the users.xml file.
I wanted to know: can we manually add records to users.xml instead of using the GUI?
If yes, how is the identifier attribute of a user derived by NiFi? For example, each user entry in users.xml carries a tag like <user identifier="..." identity="..."/>.
How is this identifier generated?
The reason for the above is that we could maintain the users.xml file in our code base; whenever new users need to be added to NiFi, the team can update their details in this file and release it, and we restart NiFi. We would not have to rely on the GUI to add new users.
Is it possible?
EDIT:
To be more clear: we currently have LDAP authentication in place using the ldap-provider, so that part is fine. I am not looking for LDAP authentication to NiFi.
Now, for the actual roles/permissions of "authorized users" (i.e. who can see the processors/components, create new processors, query data provenance, etc.), an admin goes to the NiFi UI and adds users/groups/policies. These details are then written to users.xml.
I am specifically looking to achieve this activity via automation or from the backend.
As per the response from Bryan, I think the feasible solution is to use the NiFi REST API for this.
The users.xml and authorizations.xml files really shouldn't be manually edited or maintained; they are internals of the file-based authorizer and are not meant to be a public API.
It would be better to maintain a script that loops through a list of users and uses NiFi's REST API to check whether each user exists, creating the user through the REST API if it does not.
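A rough sketch of such a sync script, assuming the /tenants/users REST resource and that the requests session is already authenticated against your NiFi (e.g. via client certificates); the host and user list are placeholders:

import requests

NIFI_API = "https://nifi.example.com:8443/nifi-api"  # placeholder host
DESIRED_USERS = ["alice", "bob"]  # the list maintained in your code base

# Fetch the users the file-based authorizer currently knows about.
# (Auth details, e.g. cert=..., omitted for brevity.)
existing = requests.get(f"{NIFI_API}/tenants/users").json()
known = {u["component"]["identity"] for u in existing["users"]}

# Create any user from the list that NiFi does not have yet.
for identity in DESIRED_USERS:
    if identity not in known:
        requests.post(
            f"{NIFI_API}/tenants/users",
            json={"revision": {"version": 0}, "component": {"identity": identity}},
        )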
Another option would be to load your users from a Directory Server. This is detailed in the admin guide [1]. This implementation is configured with an interval for retrieving new users from the Directory Server.
[1] https://nifi.apache.org/docs/nifi-docs/html/administration-guide.html#authorizers-setup
users.xml isn't intended to be updated by users directly. This can be worked around by using an external authentication provider, e.g. the LDAP authentication provider.
The main use case is using IPython as a CLI to my own Google accounts. What I am really after is minimizing the fussing around between starting the IPython shell and actually issuing useful calls against the API.
The docs for authenticating with Google APIs focus on setting up an application that other users will use to access their data.
This leads to a lengthy OAuth dance involving a browser, in order to allow other users to authenticate without compromising their credentials.
However, I do not mind sharing my private credentials with myself. I am not planning on sharing the code. If I did share the code, I would use something like dotenv to separate the credentials from the code.
Twitter provides developers a second set of credentials that allows developers to access their own accounts for testing.
Thus it is possible to access one's own account programmatically by providing just two sets of credentials: the developer credentials that allow calls to the API, and the other credentials that grant access to the developer's own data. For example:
from twitter import *

t = Twitter(
    auth=OAuth(token, token_key, con_secret, con_secret_key))

# Get your "home" timeline
t.statuses.home_timeline()

# Update your status
t.statuses.update(status="Tweeting from Python")
Here the con_secret* values are the developer credentials and the token* values are the account access credentials.
How can I do something equally simple with Google APIs?
Where can I get credentials to access my own account?
How would I use them with the Google API?
As an example, what would be the simplest procedure for retrieving the contents of one of my own YouTube playlists?
I have come to think that a Python headless browser library could give me what I need. I have asked a related question on Software Recommendations SE:
https://softwarerecs.stackexchange.com/questions/35744/python-headless-browser-library-for-oauth2-authentication-from-ipython-console
I would like to download a set of credentials
Google offers this ability through its client_secrets.json file. There are different ways to download this, depending on the type of account you want to use (web application, installed application, service account). The different techniques can be found here.
Store the credentials locally and keep using them without requiring new credentials every call
This also isn't a problem: the client secret is valid until you renew it. AFAIK there is no automatic expiry unless you specify otherwise.
Once you have downloaded your client_secrets.json, store the file in a non-public directory (normally inside your project directory/config).
Similar to downloading the file, there are different techniques (flow classes) for using the JSON file, depending on what type of account you are using. As an example, the below would be used for installed and web applications:
from oauth2client.client import OAuth2WebServerFlow
...
flow = OAuth2WebServerFlow(client_id='your_client_id',
                           client_secret='your_client_secret',
                           scope='scope URL here',
                           redirect_uri='http://example.com/auth_return')
Other flow class examples can be found here.
Hope this helps. If you need further information, the official documentation (which, be warned, can be incredibly inaccurate and confusing) can be found here: https://developers.google.com/api-client-library/python/guide/aaa_oauth
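To tie this to the YouTube playlist example from the question, here is a minimal sketch of the whole flow for an installed application, assuming oauth2client and google-api-python-client are installed; the file names and playlist ID are placeholders. Because tokens are cached in credentials.dat, the browser dance only happens on the first run:

import httplib2
from googleapiclient.discovery import build
from oauth2client.client import flow_from_clientsecrets
from oauth2client.file import Storage
from oauth2client.tools import run_flow

# Cached tokens live here, so subsequent runs skip the browser step.
storage = Storage('credentials.dat')
credentials = storage.get()
if credentials is None or credentials.invalid:
    flow = flow_from_clientsecrets(
        'client_secrets.json',  # downloaded from the Google developer console
        scope='https://www.googleapis.com/auth/youtube.readonly')
    credentials = run_flow(flow, storage)  # the one-time OAuth dance

youtube = build('youtube', 'v3', http=credentials.authorize(httplib2.Http()))
response = youtube.playlistItems().list(
    part='snippet', playlistId='YOUR_PLAYLIST_ID', maxResults=10).execute()
for item in response['items']:
    print(item['snippet']['title'])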
I have a script that retrieves user info for a set of users in our account. First we call socialize.exportUsers to get all UIDs; then I'd like to retrieve user info for a subset of those. Currently I'm calling socialize.getUserInfo on each UID I want to retrieve, which is quite slow.
I'm wondering if there's a way to batch API calls together so I can retrieve user info for multiple UIDs with one call. I didn't find anything in their API documentation or Developer's Guide.
Unfortunately, Gigya does not currently have API functionality that would allow you to retrieve user information back as a batch.
However, if you have an active contract with Gigya, you should be able to contact Gigya support directly and request a user export. Gigya has an internal data migration process through which they can extract all the users on an API key into a JSON file and send you that file.
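In the meantime, if the per-UID socialize.getUserInfo loop is the bottleneck, issuing those calls concurrently can speed things up without any batch API. Below is a rough sketch using a thread pool; the REST endpoint, data-center host, and auth parameters are assumptions based on Gigya's usual REST conventions, so verify them against your account's documentation:

import requests
from concurrent.futures import ThreadPoolExecutor

# Placeholder endpoint and credentials -- check against your Gigya docs.
ENDPOINT = "https://socialize.us1.gigya.com/socialize.getUserInfo"
PARAMS = {"apiKey": "YOUR_API_KEY", "secret": "YOUR_SECRET"}

def fetch_user(uid):
    # One getUserInfo call per UID, same as the sequential version.
    return requests.get(ENDPOINT, params={**PARAMS, "UID": uid}).json()

uids = ["uid1", "uid2", "uid3"]  # the subset taken from socialize.exportUsers
with ThreadPoolExecutor(max_workers=10) as pool:
    users = list(pool.map(fetch_user, uids))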