Is there any way to auto-detect or get notified when a new table is created in BigQuery? Based on that, I want to trigger a Rundeck job.
The easiest way to do this is to use Eventarc and Cloud Functions v2. There is actually a pretty decent repository outlining this approach here:
https://github.com/GoogleCloudPlatform/eventarc-samples/tree/main/bigquery-jobs-notifier/gcf
As a note, Cloud Functions are not the only way; you could also do this via Cloud Run or other compute options.
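For illustration, here is a rough sketch of what such a notifier could look like in Python. It is not the code from the linked repo; the audit-log method name, environment variables, and Rundeck job option are assumptions you should adapt to your setup:

```python
# Cloud Functions v2 handler, triggered by an Eventarc audit-log trigger
# scoped to the BigQuery service. Requires functions-framework and requests.
import os
import requests
import functions_framework

@functions_framework.cloud_event
def on_bigquery_audit_log(cloud_event):
    payload = cloud_event.data.get("protoPayload", {})
    # Only react to table-creation entries. This method name is an
    # assumption; verify it against your actual audit logs.
    if payload.get("methodName") != "google.cloud.bigquery.v2.TableService.InsertTable":
        return
    table = payload.get("resourceName", "unknown")
    # Kick off the Rundeck job through its REST API (adjust the API version).
    resp = requests.post(
        f"{os.environ['RUNDECK_URL']}/api/41/job/{os.environ['RUNDECK_JOB_ID']}/executions",
        headers={"X-Rundeck-Auth-Token": os.environ["RUNDECK_TOKEN"]},
        json={"options": {"table": table}},  # pass the new table to the job
    )
    resp.raise_for_status()
```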
I'd like to periodically clean up old conversations/user states, e.g. older than 30 days.
Is there a script that can do that?
There is currently no direct way of achieving this through the framework, but I have been able to do it in one simple way.
Assuming you are using Cosmos DB (SQL API) for your state management, you could set a TTL on that container. This will delete documents 30 days after the last update made to them.
You could also set the TTL when you first create the container through your bot's SDK; that way you wouldn't need to change it manually in the portal.
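For example, with the Python azure-cosmos SDK it would look roughly like this (the account URL, key, and database/container names are placeholders; the same setting exists in the other Cosmos DB SDKs):

```python
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(
    url="https://<your-account>.documents.azure.com:443/",  # placeholder
    credential="<your-key>",                                # placeholder
)
database = client.get_database_client("bot-db")
container = database.create_container_if_not_exists(
    id="conversation-state",
    partition_key=PartitionKey(path="/id"),
    default_ttl=30 * 24 * 60 * 60,  # seconds; items expire 30 days after their last write
)
```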
We are using templates to package up some data transfer jobs between two nifi clusters, one acting as a sender, the other as the receiver. One of our jobs contains a remote process group and all worked fine at the point the template was created.
However, when we deploy the template through our environments (dev, test, pre, prod), it is tedious and annoying to have to manually delete and recreate a remote process group in the user interface. I'd like to automate this to simplify deploying templates and reduce the manual intervention.
Is it possible to update a remote process group and its port configuration through the REST API?
Do I just use the REST API to create a new RPG with the correct configuration?
Does anyone have any experience with this?
There is a JIRA to address this issue [1], which will be worked on in conjunction with some of the ongoing Flow Registry (SDLC for flows) efforts. Until then, the best option would be (2) above: using the REST API to create a new RPG (a rough sketch follows the reference below).
[1] https://issues.apache.org/jira/browse/NIFI-4526
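Here is a hedged sketch of option (2) from Python; the host names, parent process group ID, and timeout values are placeholders, and secured clusters will also need authentication:

```python
import requests

NIFI_API = "https://sender-nifi:8443/nifi-api"      # placeholder host
PARENT_PG = "<process-group-id-from-the-template>"  # placeholder ID

resp = requests.post(
    f"{NIFI_API}/process-groups/{PARENT_PG}/remote-process-groups",
    json={
        "revision": {"version": 0},  # 0 for a brand-new component
        "component": {
            "targetUris": "https://receiver-nifi:8443/nifi",
            "communicationsTimeout": "30 sec",
            "yieldDuration": "10 sec",
            "position": {"x": 0.0, "y": 0.0},
        },
    },
    # Pass a bearer token or client certificate here on secured clusters.
)
resp.raise_for_status()
print("Created RPG", resp.json()["id"])
```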
I am very new to Hadoop tools, so forgive me if this is a basic question. I am using Sqoop to push data from my relational DB to HDFS. Now, as a next step, I want to generate some reports using this data stored in HDFS. I have my own custom report format.
I am aware that I can get data out of HDFS using Hive, but is it possible to design my own custom reports (web UI) on top of this? Are there any other tools I can use?
Alternatively, is it possible to deploy an application (containing an HTML GUI and Java APIs) on the same machine, so I can access it via HTTP and see the data present in HDFS?
You can use Tableau for a better experience; it is paid, but it is the best on the market, and you can customize your graphs and reports. You can get a trial version of Tableau from their site. You can also use Power BI from Microsoft, which is free and works well with big data. Ambrose, created by Twitter, also has good support (I haven't tried that one).
Check out Ambrose, as this is what you are looking for. You can access it via an HTTP URL.
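If you do want to build your own web UI, one common pattern is to query Hive from application code and render the results yourself. A minimal sketch with the pyhive library, assuming a HiveServer2 instance on localhost:10000 and a sales table loaded via Sqoop (both assumptions for illustration):

```python
from pyhive import hive  # pip install pyhive

conn = hive.Connection(host="localhost", port=10000, database="default")
cursor = conn.cursor()
cursor.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
for region, total in cursor.fetchall():
    print(f"{region}: {total}")
```

You could then wrap a query like this in any web framework to serve your custom HTML reports over HTTP.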
I want to move my Magento site from AWS to Google Cloud, and I want to make sure I'm doing it the right way, as I am new to Google Cloud.
These are the steps I'm planning on doing:
create an instance and install Redis and my Magento store on it
create a Cloud SQL instance for my DB
create a snapshot of this instance
create a template from this instance
create a group of instances with the template
create a load balancer and connect it with the instance group
Is that the correct way to build a solid and fairly scalable Magento site on GCP?
Are there any services on Google Cloud I can use to make my store even faster and more scalable?
That's a fairly good way to deploy, but you can offload a few of those steps to GCP's managed services.
Use the Click-To-Deploy solution for Magento (https://cloud.google.com/launcher/solution/bitnami-launchpad/magento?q=magento)
Launch another Click-To-Deploy solution for Redis (https://cloud.google.com/launcher/solution/bitnami-launchpad/redis?q=redis)
Launch a Cloud SQL instance (https://cloud.google.com/sql/)
Update your Magento instance with the configuration for these servers
Use this as a template to launch an instance group
Put this group behind a load balancer (a scripted sketch of the instance-group step follows this list)
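If you'd rather script the instance-group step than click through the console, here is a rough sketch using the Compute Engine API client for Python; the project, zone, and all resource names are placeholders, and an instance template is assumed to already exist:

```python
from googleapiclient import discovery  # pip install google-api-python-client

compute = discovery.build("compute", "v1")  # uses Application Default Credentials
compute.instanceGroupManagers().insert(
    project="my-project",   # placeholder
    zone="us-central1-a",   # placeholder
    body={
        "name": "magento-group",
        "baseInstanceName": "magento",
        "instanceTemplate": "global/instanceTemplates/magento-template",
        "targetSize": 2,
    },
).execute()
```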
Why is this better?
You don't have to manage your SQL DB security and scaling
You get Redis and Magento with a few simple clicks, which saves a lot of time
All you need to manage are your settings, even if you want to upgrade Magento to a newer version or move to better servers
Bonus: You should also make use of a CDN for your static resources, and Cloud CDN (https://cloud.google.com/cdn/) will be helpful there.
Further reading: Go through this to get a sense of what else you can do with GCP (https://cloud.google.com/solutions/commerce/)
Is it possible to create a new user in Sonar without using the web interface?
I need to write a script that inserts the same users into several tools, including Sonar.
There are three ways you can do this:
Write directly to the database (there is a simple table called users).
Use the LDAP plugin. If you specify sonar.authenticator.createUsers: true in sonar.properties, it will create the users in the Sonar database automatically the first time they authenticate.
Write a Java application that depends on the Sonar plugin API; you can then use constructor injection to get a Sonar Hibernate session and persist the user you want. See here.
Since SonarQube version 3.6, there is support for user management in the web service API:
https://sonarqube.com/web_api/api/users
http://docs.sonarqube.org/display/DEV/Web+API
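For example, a minimal sketch using that endpoint from Python; the host and credentials are placeholders, and exact parameter names can vary between SonarQube versions:

```python
import requests

resp = requests.post(
    "http://localhost:9000/api/users/create",
    auth=("admin", "admin"),  # an account with system administration permission
    data={"login": "jdoe", "name": "John Doe", "password": "s3cret"},
)
resp.raise_for_status()
```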
The web service API does not seem to support user management. Anything's possible, but it doesn't look like this is offered directly via Sonar.
You could probably use some web automation library (webbrowser, webunit, watir, twill) to do it through the running server; it might even be possible to just use something like 'curl' by looking carefully at the page source for the users/create form.
Or, if you want to go straight to the database, you could try to pull out the user creation functionality from the code and mess with the sonar.users table directly.
There is the LDAP Plugin, which would take care of authentication, but it still requires you to create the users in Sonar, so that wouldn't solve your problem.