I am integrating with the Spotfire Web Player. When I select rows I can trap the event and send the rows to my data source; however, when I save I'd like to call a service. Is there a way to combine a local data source and a remote one?
Thank you.
What do you mean by combine local data and remote?
Variant 1: You can use dataSource.data to set the initial records, and define a separate transport that will be used for subsequent requests (after the initial one).
Variant 2: You can define dataSource.transport.read as a function and perform the binding yourself (allowing you to fetch the data from either a local source or a remote service).
Variant 3: You can use the dataSource.data method to set the records after the Grid is initialized.
All of these configuration options are listed in the API reference; a rough sketch of Variant 2 follows below.
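For illustration, here is a minimal TypeScript sketch of Variant 2. The /api/rows and /api/rows/save endpoints and the shape of the locally captured rows are assumptions, not part of the original answer:

```typescript
// Assumes the Kendo UI and jQuery scripts are loaded globally; they are
// declared loosely here only to keep the sketch self-contained.
declare const kendo: any;
declare const $: any;

// Rows captured locally, e.g. from the Spotfire selection event (hypothetical shape).
const locallySelectedRows: Array<{ id: number; name: string }> = [];

const dataSource = new kendo.data.DataSource({
  transport: {
    // Variant 2: read is a function, so the binding is performed manually and
    // the data can come from either the local array or the remote service.
    read: (options: any) => {
      if (locallySelectedRows.length > 0) {
        options.success(locallySelectedRows);   // serve the local rows, no round trip
      } else {
        $.getJSON("/api/rows")                  // placeholder remote endpoint
          .done((rows: unknown[]) => options.success(rows))
          .fail((xhr: unknown) => options.error(xhr));
      }
    },
    // On save, forward the changed rows to a remote service (placeholder endpoint).
    update: (options: any) => {
      $.post("/api/rows/save", options.data)
        .done(() => options.success(options.data))
        .fail((xhr: unknown) => options.error(xhr));
    },
  },
});
```

For Variant 3, dataSource.data(rows) can likewise be called at any point after the Grid is initialized to replace the records with locally captured ones.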
We have tried consuming an (ORDS) REST service in an Oracle APEX (v20.2) interactive report using two different methods:
Using REST Data Source, as defined in Shared Components
configured as Oracle REST Data Services
Using a Local Database source call in this format: select * from json_table( apex_web_service.make_rest_request( p_url => ... and so on (using the WITH_PLSQL hint)
Both approaches work well, but the problem is that with the cleaner method (1) the Actions menu contains fewer options than with method (2); for example, Group By is missing.
As ORDS returns its data in pages, APEX by default assumes that not all data is available when the Interactive Report renders (only the rows actually shown on the report page are fetched). Thus the report options that need access to all data are disabled.
To change that, do the following:
In Shared Components, navigate to your REST Data Source
Edit the "GET" (Fetch Rows) Operation
Enable the Allow Fetching All Rows switch.
For a normal report view, the behavior will not change; APEX will only fetch rows from ORDS as needed to display the report page. But now the Chart and Group By options will appear, and if you configure a Group By, APEX will potentially execute multiple HTTP requests to get all required rows from your REST API.
So be careful with this for REST services that potentially return a large number of rows ...
Hope this helps
I'm planning to distribute the load on my database by making a copy on several servers (each server will have the same tables but with different company data).
In order to do this, I will need to programmatically change the Datastore associated with my Data Views. For other tables I'm using the "Before Connect" property.
Is it possible to handle this in Genexus?
Thanks,
Yes, you can use the dbConnection data type.
Just create a variable based on this data type, and use its methods and properties to set it up whenever you need it to be changed...
I'm currently new to Talend and I'm learning through videos and documentation, so I'm just not sure how to approach/implement this with best practices.
Goal
Integrate Magento and QuickBooks using Talend.
My thoughts
Initially my thought was to set up a direct DB connection to Magento, take the relevant data I need, process it, and send it to QuickBooks using the REST APIs (specifically the bulk APIs, in batches).
But then I thought it would be a little hectic to query the Magento database directly (multiple joins), so another option is to use Magento's REST API.
As I'm not very familiar with the tool, I'm struggling a little to find the most suitable approach, so any help is appreciated.
What I've done so far
I've saved my auth (for QB) and DB (Magento) credentials in a file, and using tFileInputDelimited and tContextLoad I'm storing them in context variables so they are accessible globally.
I've successfully configured the database connection and the DB input component, but I've not used metadata for the connection (should I, and if so, how can I pass dynamic values there?). I've used my context variables in the DB connection settings.
I've taken the relevant fields for now, but if I want more fields a simple query is not enough, as Magento stores data across multiple tables for Customer etc. It's not a big deal, but it might increase my work.
That's what I've built so far, and my next step is to send the data to QB using REST, getting an access_token and saving it to a context variable, and then storing the QB reference back into the Magento DB.
I've also decided to use the QB bulk APIs, but I'm not sure how to process data in chunks in Talend (I checked multiple resources with no luck). For example, if Magento returns 500 rows, I want to process them in chunks of 30, since the QB batch limit is 30 per request; I'll send each chunk to QB via REST, and as mentioned I also want to store the QB reference ID back in Magento (so I can update it later).
Also, all of this will be local for now; how can I do the same in production, and how can I maintain separate development and production environments?
Resources I'm referring to
For REST and Auth best practices - https://community.talend.com/t5/How-Tos-and-Best-Practices/Using-OAuth-2-0-with-Talend-to-Access-Goo...
Nice example for batch processing here:
https://community.talend.com/t5/Design-and-Development/Batch-processing-in-talend-job/td-p/51952
Redirect your input to a tFileOutputDelimited.
Enter the output filename, tick the option "Split output in several files" under "Advanced settings", and enter 1000 in the field "Rows in each output file". This will create n files based on the filename, with 1000 rows in each (set this to 30 if you want to match the QB batch limit).
On the next subjob, use a tFileList to iterate over this file list to get records from each file.
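If it helps to see the underlying chunking logic outside of Talend, here is a rough TypeScript sketch of the same idea, using a batch size of 30 to match the QuickBooks limit mentioned in the question. The endpoint URL, the request shape, and sendBatchToQuickBooks are hypothetical placeholders, not the real QuickBooks API:

```typescript
// Split an array of rows into fixed-size batches (30 = the QB batch limit
// mentioned in the question; adjust as needed).
function chunk<T>(rows: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}

// Hypothetical sender: posts one batch to a bulk endpoint (placeholder URL and
// payload shape). Assumes a runtime that provides fetch.
async function sendBatchToQuickBooks(batch: unknown[], accessToken: string): Promise<void> {
  await fetch("https://quickbooks.example/batch", {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ items: batch }),
  });
}

// Usage: 500 Magento rows become ceil(500 / 30) = 17 requests of at most 30 rows each.
async function pushAll(rows: unknown[], accessToken: string): Promise<void> {
  for (const batch of chunk(rows, 30)) {
    await sendBatchToQuickBooks(batch, accessToken);
  }
}
```

In Talend itself, the file split described above plus the tFileList iteration achieves the same effect without writing code.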
I have a simple web app UI which stores certain dataset parameters (for simplicity, assume they are all data tables in a single Redshift database in AWS, but the schema/table name can vary). Tableau is installed on an EC2 instance in the same AWS account.
I am trying to determine an automated way of passing 'parameters' as a data source (i.e. within the connection string inside Tableau on EC2/AWS) rather than manually creating data source connections and inputting the various customer requests.
The flow would be: say 50 users select various parameters in the UI (for simplicity, suppose the parameters are stored as a JSON file in AWS) -> the parameters are sent to Tableau and data sources are created -> the connection is established within Tableau without the customer 'seeing' anything in the back end -> the customer can play with the data in Tableau and create tables and charts accordingly.
How may I do this at least through a batch job or cloud formation setup? A "hacky" solution is fine.
Bonus: if the above is doable in real-time across multiple users that would be awesome.
** I am open to using other dashboard UI tools which solve this problem e.g. QuickSight **
After installing Tableau on EC2, I'm having trouble finding an article or documentation on how to pass parameters into the connection string itself, or even how to parameterise it manually.
An example: customer1 selects "public_schema.dataset_currentdata" and "public_schema.dataset_yesterday", and another customer selects "other_schema.dataset_currentdata", all of which exist in a single database.
Three data sources should be generated (one for each of the above), but only the data sources a customer selected should be visible to that customer, i.e. customer2 should only see the connection for other_schema.dataset_currentdata.
One hack I was considering is to spin up a CloudFormation template with Tableau installed for a customer when they make a request, create the connection accordingly, and delete the stack when they are done. I am mainly unsure how I would get the connection established, i.e. how to pass in the parameters. I am also not sure spinning up 50 EC2 instances is wise. :D
An issue I have seen so far is that creating a manual extract limits the number of rows, so I think I need a live connection per customer request; that is the issue I am trying to get around.
You can do this with a combination of a basic embed and applying filters. This would load the Tableau workbook. Then you would apply a filter based on whatever values your user selects from the JSON.
The final missing part is that you would use a parameter instead of a filter and pass those values to the database via Initial SQL.
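As a rough illustration only, here is what that could look like with the (v2) Tableau JavaScript embedding API in TypeScript. The view URL, the "Schema Table" field, and the "SchemaTable" parameter are hypothetical names, and the Initial SQL that references the parameter would have to be configured on the data source itself:

```typescript
// Assumes the Tableau JavaScript API (v2) script is loaded on the page.
declare const tableau: any;

let viz: any;

const containerDiv = document.getElementById("vizContainer");

// Value the user picked in the web UI (hypothetical example).
const selection = { schemaTable: "public_schema.dataset_currentdata" };

const options = {
  hideTabs: true,
  // A filter value can be passed as a field/value pair when the viz loads.
  // "Schema Table" is a hypothetical field name in the workbook.
  "Schema Table": selection.schemaTable,
  onFirstInteractive: () => {
    // Or set a workbook parameter after load; if the data source's Initial SQL
    // references this parameter, the value reaches Redshift on connect.
    viz.getWorkbook().changeParameterValueAsync("SchemaTable", selection.schemaTable);
  },
};

// URL of the published view on the Tableau Server instance (placeholder).
viz = new tableau.Viz(containerDiv, "https://tableau.example.com/views/MyWorkbook/MyView", options);
```

The filter approach only restricts what the embedded view shows, while the parameter-plus-Initial-SQL approach is what actually changes which schema/table the live connection queries.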
I'm not trying to set up a dynamic data source; I just want to pull the Initial Catalog value from the shared data source. The values are already hardcoded into the shared DS, and I could just hardcode the field, but for other purposes I'm trying to 'pull' the value from the shared DS. I've looked around and everything points me to creating a dynamic data source, but that's not the issue here.
If you are using SQL Server, you can add DB_NAME() as a column to your query. This will return the name of the current database regardless of the server. There should be an equivalent function on other database platforms.