Access legacy GAE Datastore from standalone Python - macOS

I'm trying to find out how to access the GAE Cloud Datastore from outside GAE, ideally using a standalone Python script (on a Mac) to run GQL like
q = GqlQuery("SELECT * FROM user WHERE email = 'test#example.com'")
or maybe even a local instance of GAE using GoogleAppEngineLauncher if a standalone script is not possible.
I have done the following
Accessing an existing App Engine Datastore from another platform (https://cloud.google.com/datastore/docs/activate) - permissions with service account + private key
Installed the Python SDK - confirmed the SDK files are in
/usr/local/lib/python2.7/site-packages/google_appengine
/usr/local/google_appengine/google/appengine/*** (api,base,client,datastore,...,ext,etc.)
Running a print sys.path shows (among other paths)
/usr/local/Cellar/python/2.7.8_1/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/google_appengine
/usr/local/Cellar/python/2.7.8_1/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/gtk-2.0
/Library/Python/2.7/site-packages
/usr/local/lib/python2.7/site-packages
/usr/local/lib/python2.7/site-packages/google_appengine
/usr/local/lib/python2.7/site-packages/gtk-2.0
Did the export
export DATASTORE_SERVICE_ACCOUNT=...
export DATASTORE_PRIVATE_KEY_FILE=... (full path to .p12 file)
export DATASTORE_DATASET=...
export PYTHONPATH="$PYTHONPATH:/usr/local/google_appengine"
Ran the example adams.py file
Created a new entity of kind "Trivia" with name=hgtg, answer=42 as a record in PROD
However, running a standalone script with
from appengine.ext.db import GqlQuery (or from google.appengine.ext.db import GqlQuery)
gives me ImportError:
No module named appengine.ext.db
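For reference, this is roughly the kind of standalone script I am trying to get working (just a sketch; the SDK path, app id, credentials and query value below are placeholders, and it assumes the remote_api approach from the SDK is the right tool and that the production app exposes the remote_api handler):
import sys

# Assumption: the SDK is installed at /usr/local/google_appengine
sys.path.insert(0, '/usr/local/google_appengine')

import dev_appserver            # ships at the root of the SDK
dev_appserver.fix_sys_path()    # puts the SDK's bundled libraries on sys.path

from google.appengine.ext.remote_api import remote_api_stub
from google.appengine.ext.db import GqlQuery   # note the leading "google."

def auth_func():
    # Placeholder credentials; newer SDKs also provide an OAuth-based
    # ConfigureRemoteApiForOAuth variant for service accounts
    return ('user@example.com', 'password')

# 'my-app-id.appspot.com' is a placeholder for the production app's hostname;
# the app must have the remote_api handler enabled for this to work
remote_api_stub.ConfigureRemoteApi(None, '/_ah/remote_api', auth_func,
                                   'my-app-id.appspot.com')

q = GqlQuery("SELECT * FROM user WHERE email = :1", 'test@example.com')
for entity in q.run(limit=10):
    print entity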
I then tried to use a local GAE instance, but I can't figure out how to connect it to the PROD GAE Datastore instance.
===
The existing GAE application is using the Datastore (Java) in PRODUCTION. In the Developer Console (https://console.developers.google.com/project), this would be under Storage > Cloud Datastore > Query, where I would see the "Entities" (or kinds). Obviously, there is a fairly limited number of things you can do here, and I don't really want to touch PRODUCTION code to run a query.
Thanks,
Chris

Related

Connect to Google Cloud Datastore using dev_appserver.py & google.golang.org/appengine@v1.6.6

As the title says. We have a legacy Go 1.11 AppEngine API that requires the dev_appserver.py to run. Simply, I want appengine.Main() & appengine.NewContext(r) to allow my application to point to my Cloud Datastore using my project-id, rather than the local emulator's storage. I have set GOOGLE_APPLICATION_CREDENTIALS to no avail.
This would be so I can locally run the server, while accessing a shared, Cloud DB.
I am using google.golang.org/appengine@v1.6.6 with dev_appserver.py --enable_console --port=8081 --support_datastore_emulator=true --go_debugging=true app.yaml
Is this possible? Or am I stuck on a local emulator when using the old Go libraries?
Moving from comments to answer
Take a look at remote_api for Go 1.11 https://cloud.google.com/appengine/docs/legacy/standard/go111/tools/remoteapi
The logic for using it would be something along the lines of:
If running in a local environment, use remote_api; otherwise stick to the default behavior (i.e. since remote_api isn't enabled, it will either use the emulator in local environments or use production data directly in production).
To keep things simple, you could try using the same variable name, i.e.
if this is local environment
ctx, err := remote_api.NewRemoteContext(host, hc)
else
ctx := appengine.NewContext(r)
Then you use 'ctx' in the rest of your queries/calls to the datastore
Note: I'm not familiar with 'Go' so take the above as pseudo-code and not working code
You might also want to consider not running the above changes with the --support_datastore_emulator=true flag
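For completeness, the remote_api page linked above shows roughly the following setup; this is only a sketch adapted from it (not tested on my side), the host is a placeholder, and it assumes the remote_api handler is enabled in app.yaml and that credentials are available via GOOGLE_APPLICATION_CREDENTIALS:
package main

import (
	"context"
	"log"

	"golang.org/x/oauth2/google"
	"google.golang.org/appengine/remote_api"
)

func main() {
	const host = "your-project-id.appspot.com" // placeholder

	// Authenticated HTTP client built from Application Default Credentials
	hc, err := google.DefaultClient(context.Background(),
		"https://www.googleapis.com/auth/appengine.apis",
		"https://www.googleapis.com/auth/userinfo.email",
		"https://www.googleapis.com/auth/cloud-platform",
	)
	if err != nil {
		log.Fatalf("could not get authenticated client: %v", err)
	}

	// ctx now points at the production Datastore instead of the local emulator
	ctx, err := remote_api.NewRemoteContext(host, hc)
	if err != nil {
		log.Fatalf("could not connect via remote_api: %v", err)
	}

	_ = ctx // pass ctx to google.golang.org/appengine/datastore calls
}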

Deploy Oracle Apex app through different schema of database

I want to install an APEX app. I know two ways to install any app:
Importing the export file via the APEX designer interface from the same workspace
Or copying the app export file, say f101.sql, and executing it as a script directly in SQL Developer using the database schema (which is connected to the workspace)
I want to do something like option 2, but instead of using the same schema, I want to use a different schema to execute the script while still installing into the same workspace.
Example: f101.sql is the app export of workspace finance_ws, which is connected to finance_schema of the database.
I have another schema, deployment_schema. Could I execute the f101.sql file in deployment_schema so that it gets deployed into the finance_ws workspace?
I'd
create a new workspace (let's call it deployment_ws)
designate it to deployment_schema
create a developer in that workspace (let's call it arif)
connect to deployment_ws as arif
import f101.sql
Perhaps you can do it as you wanted, but ... I find the above approach simpler.
[EDIT]
You'd want to have only one workspace (which is FINANCE_WS), but import F101.SQL into it so that one application works on finance_schema, and another one on deployment_schema.
As you can't have two applications with the same ID, you'll have to import F101.SQL and change the application ID during import (that's on the "Install Application" page): use either "auto assign new application ID" or "change application ID" (manually).
On the same import page, you'll see the Parsing Schema property - change it to deployment_schema. If you don't see it in the list of values, connect as admin to the internal workspace and designate deployment_schema to finance_ws.
I guess that's what you actually wanted ...
deployment_schema should have APEX_ADMINISTRATOR_ROLE assigned. Once assigned, we can deploy/execute any app through deployment_schema in any workspace that exists in that database.
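For reference, granting that role is a single statement run by a privileged user (e.g. SYS); the schema name below is just the one from this example:
GRANT APEX_ADMINISTRATOR_ROLE TO deployment_schema;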

How to do a full export (including users) with embedded keycloak

I have a Spring Boot application using embedded Keycloak.
What I am looking for is a way to load the Keycloak server from it, make changes to the configuration, add users, and then export this new version of the Keycloak configuration.
This question got an answer on how to do a partial export but I can't find anything in the documentation of the Keycloak Admin REST API on how to do a full export.
With the standalone keycloak server I would be able to simply use the CLI and type
-Dkeycloak.migration.action=export -Dkeycloak.migration.provider=singleFile -Dkeycloak.migration.file=/tmp/keycloak-dump.json
But this is the embedded version.
This is most likely trivial, since I know for a fact that newly created users have to be stored somewhere.
I added a user, and restarting the application doesn't remove it, so Keycloak persists it somehow. But the JSON files I use for the Keycloak server and realm setup haven't changed.
So, with no access to a CLI without a standalone server and no REST endpoint for a full export, how do I load the server, make some changes, and generate a new JSON file via export that I can simply put into my Spring app instead?
You can make a full export with the following command (if the Spring Boot setup runs Keycloak in a Docker container):
[podman | docker] exec -it <pod_name> /opt/jboss/keycloak/bin/standalone.sh
-Djboss.socket.binding.port-offset=<integer_value> (Docker recommends an offset of at least 100)
-Dkeycloak.migration.action=[export | import]
-Dkeycloak.migration.provider=[singleFile | dir]
-Dkeycloak.migration.dir=<DIR_TO_EXPORT_TO> (only if keycloak.migration.provider=dir)
-Dkeycloak.migration.realmName=<REALM_NAME_TO_EXPORT>
-Dkeycloak.migration.usersExportStrategy=[DIFFERENT_FILES | SKIP | REALM_FILE | SAME_FILE]
-Dkeycloak.migration.usersPerFile=<integer_value> (only if keycloak.migration.usersExportStrategy=DIFFERENT_FILES)
-Dkeycloak.migration.file=<FILE_TO_EXPORT_TO>
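For example, a single-file export of one realm might look like this (the container name, realm name and output path are placeholders you would replace with your own values):
docker exec -it keycloak /opt/jboss/keycloak/bin/standalone.sh \
  -Djboss.socket.binding.port-offset=100 \
  -Dkeycloak.migration.action=export \
  -Dkeycloak.migration.provider=singleFile \
  -Dkeycloak.migration.realmName=myrealm \
  -Dkeycloak.migration.usersExportStrategy=REALM_FILE \
  -Dkeycloak.migration.file=/tmp/keycloak-dump.json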
I am creating an open source Keycloak example with documentation; you can see a full guide about import/export in my company's GitHub.

default ODI user for multiple user installation on same server

I have two ODI instances installed on the same machine with the same home directory.
I am trying to import scenarios using the ./startcmd.sh command and it is working, but it always deploys the scenario to instance1.
The question is: how can I redirect the deployment to instance2 instead of instance1?
Is there a properties file or something else providing that?
A scenario is not imported into an instance / path on the machine. It's imported into a work repository in a database.
OdiImportScen does not have a parameter to specify the repository in which you want to import.
Instead you can use OdiImportObject and specify the WORKREPNAME parameter.

Google Cloud CLOUD NATURAL LANGUAGE API

I am trying to use the Google Cloud Natural Language API.
I already have a running Google Cloud account.
I enabled the Cloud Natural Language API service, generated a service account key, and downloaded it locally.
I am using Google's default sample program:
// Imports assume the v1 client library (com.google.cloud.language.v1)
import com.google.cloud.language.v1.Document;
import com.google.cloud.language.v1.Document.Type;
import com.google.cloud.language.v1.LanguageServiceClient;
import com.google.cloud.language.v1.Sentiment;

LanguageServiceClient language = LanguageServiceClient.create();
// The text to analyze
String text = "My stay at this hotel was not so good";
Document doc = Document.newBuilder().setContent(text).setType(Type.PLAIN_TEXT).build();
// Detects the sentiment of the text
Sentiment sentiment = language.analyzeSentiment(doc).getDocumentSentiment();
System.out.printf("Text: %s%n", text);
System.out.printf("Sentiment: %s, %s%n", sentiment.getScore(), sentiment.getMagnitude());
I am using Eclipse as IDE on Mac
When I run the application I get this error:
java.io.IOException: The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.
I even added GOOGLE_APPLICATION_CREDENTIALS as an export in Terminal, and running "printenv" shows the path like this:
GOOGLE_APPLICATION_CREDENTIALS=/Users/temp/Downloads/Sentiment-0e556940c1d8.json
Still it wasn't working. With some trial and error I found out that in Eclipse we can configure the run; there I added the environment variable, and after that the program works fine.
Now my problem is that I am implementing that code inside a J2EE project, and the EAR file is deployed to WildFly.
I am again getting the same error, and I don't know where to set the environment variable in WildFly.
Finally I found a way to set GOOGLE_APPLICATION_CREDENTIALS as an environment variable inside WildFly.
If you are running the server through Eclipse:
Open the WildFly server settings by double-clicking your server inside the Servers tab
Click "Open Launch Configuration"
Move to the "Environment" tab and add a new variable as a key-value pair,
e.g.
GOOGLE_APPLICATION_CREDENTIALS /Users/temp/Downloads/Sentiment-0e556940c1d8.json
If you are running the server from the terminal:
By default WildFly looks for additional settings inside the standalone.conf file.
Just open the wildfly/bin/standalone.conf file and add the following line:
export GOOGLE_APPLICATION_CREDENTIALS=/Users/temp/Downloads/Sentiment-0e556940c1d8.json
That's it. You are good to go.
