Google Cloud Healthcare API: How can I get a list of all the datasets with the names of the data stores in each of them? - google-api

I am using the Google Cloud Healthcare API to store DICOM images. I need a way to fetch a list of all the datasets in a project, with the names of all the data stores in each dataset.
A possible output would look like this:
{
  dataset1: ["dataStoreA", "dataStoreB"],
  dataset2: ["dataStoreC", "dataStoreD"],
  dataset3: ["dataStoreE", "dataStoreF"],
}
I am able to fetch the list of all the datasets using
healthcare.projects.locations.datasets.list({
  parent: `projects/${project}/locations/${location}`
})
but that only returns an object per dataset containing just the dataset's name. I want to get the names of all the data stores in each dataset as well.
My current approach is to fetch the list of all the datasets and then, for each dataset, fetch all the data stores in it, using
healthcare.projects.locations.datasets.dicomStores.list({
  parent: `projects/${project}/locations/${location}/datasets/${dataset}`,
})
but this is very time-consuming, because if there are 5 datasets I have to make 5 separate requests to the API. I have looked through the Google docs for the API and searched around without any luck.

The dicomStores.list method is designed to list the DICOM stores in a given dataset.
There is no method that lists DICOM stores across all datasets.
The approach you are currently using is the only option.
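Since one request per dataset is unavoidable, the requests can at least be issued concurrently rather than one after another, so total latency is roughly one round trip instead of one per dataset. A minimal sketch, where `listDatasets` and `listDicomStores` are hypothetical stand-ins for the client calls shown in the question:

```javascript
// Fan out the per-dataset dicomStores.list calls with Promise.all
// instead of awaiting them sequentially. Both arguments are assumed
// wrappers around the healthcare API client: the first returns an
// array of dataset names, the second the store names for one dataset.
async function mapDatasetsToStores(listDatasets, listDicomStores) {
  const datasets = await listDatasets();
  // One request per dataset, but all issued at once.
  const entries = await Promise.all(
    datasets.map(async (dataset) => [dataset, await listDicomStores(dataset)])
  );
  // Build the { dataset: [stores...] } shape from the question.
  return Object.fromEntries(entries);
}
```

The per-dataset request count is unchanged; only the wall-clock time improves.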

Related

Hasura GraphQL how to group query by month and year?

Is it possible in GraphQL or Hasura to group the results by month or year? I'm currently getting the result list back as a flat array, sorted by the date attribute of the model. However, I'd like to get back 12 subarrays corresponding to each month of the year.
From the docs: not natively supported.
Derived data or data transformations lead to views. Using the PostgreSQL EXTRACT function you can have a separate month field derived from the date ... but still as a flat array.
With some deeper customization you could probably achieve the desired result ... but GraphQL [tree, array] structures are more for embedding than for views ...
How many records are you processing? Hundreds? Client-side conversion (done easily from Apollo Client data at the React component/container [view] level) may be good enough [especially with an extracted month field].
PS. You can get many results grouped into arrays if you 'glue' many queries together (copies, each filtered to one month) at the top level ... but that is probably not a recommended solution.
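The client-side conversion mentioned above can be sketched in a few lines; the `date` field name is an assumption about the model's shape:

```javascript
// Group a flat, date-sorted result array into 12 per-month buckets,
// e.g. inside a React component fed by Apollo Client. Rows with the
// same month from different years land in the same bucket here; key
// by year+month instead if that matters for your report.
function groupByMonth(rows) {
  const buckets = Array.from({ length: 12 }, () => []);
  for (const row of rows) {
    buckets[new Date(row.date).getUTCMonth()].push(row);
  }
  return buckets; // buckets[0] = January, ..., buckets[11] = December
}
```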

Laravel 5.4 - Saving Excel data into dynamically created columns in MySQL

I have an internal webpage that makes data from a third-party Excel export file searchable and readable. The webpage allows uploading multiple Excel files, whose data gets read and stored in a MySQL database.
We want to update the application to keep a history of the uploaded data (it contains monthly values) so we can easily search, filter, and generate graphs from it.
I am using Laravel 5.4 with maatwebsite\excel to import and parse the Excel file.
The Excel file always consists of the following columns (Dummy File)
| Item group | item # | item name | Item Currency | <month> <year> |
After Item Currency there are always 36 columns covering the past 3 years of data from the current month, so a column would be named e.g. dec 2017.
In Laravel I have created a model for the item named Item and a model for the monthly values named ItemMonthly.
I am able to read the file and create the columns dynamically in the database, but this feels very ugly and not efficient at all:
(Gist) Code for Models and Excel Function
Biggest problem
Because I need all the monthly data, and in month order, I can't really rename the columns as far as I know. I need to be able to get all the columns to render in a Highcharts graph and in a DataTable, and some items don't cover the same monthly range (some only go up to 2015, for example).
Needed advice
I've read a couple of solutions here, some saying that instead of creating columns in MySQL I should just store the monthly data as a JSON object in a single column.
Other answers advise switching from MySQL to MongoDB entirely.
I am somewhat at a loss to find the best approach, and am sincerely wondering whether MySQL is the right way to go. The solutions I have tried so far all seem to involve really hacky ways of doing this.
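To make the JSON-column idea concrete: it would mean collapsing the 36 month columns of each parsed row into one serialized value. A language-neutral sketch (shown in JavaScript rather than my actual PHP; the column names are assumptions):

```javascript
// Collapse the "<month> <year>" columns of one parsed Excel row into
// a single JSON string, suitable for one TEXT/JSON column per item.
// `monthColumns` lists the 36 expected column names in month order.
function monthlyColumnsToJson(row, monthColumns) {
  const monthly = {};
  for (const col of monthColumns) {
    // Skip months the item has no data for (e.g. items ending in 2015).
    if (row[col] !== undefined && row[col] !== null) {
      monthly[col] = row[col]; // e.g. { "dec 2017": 42.5, ... }
    }
  }
  return JSON.stringify(monthly);
}
```

This keeps the table schema fixed, at the cost of not being able to filter or index on individual months in SQL.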
If there is more info needed please let me know. I don't want to write an immense wall of text but I also want to provide the correct amount of information.
Many thanks!

Global filters for different data sources (with common tables)

I am currently working in Tableau with 2 data sources, each a join of 2 tables (named A, B, C):
Data source 1: A-B
Data source 2: A-C
Basically, A contains the major information that I need and then I join data from B and C to get the extra information I need for each report I am doing.
I then do a dashboard that contains reports using the data source 1 and 2.
My problem is that I am filtering this dashboard using a dimension in A, and I would like the filter to apply to all worksheets (both those using data source 1 and those using data source 2).
I thought that because A is the common table in both data sources, filtering on a dimension from A would cover everything, but that does not seem to be the case.
Is there a way to fix this?
I read some forums about creating a parameter. However, my filtering works as follows: I want my users to choose 1 shop name. They can find it either by:
Typing the name in the 'Shop name' quick filter, or
Using a combination of the 'Region' and 'Country' quick filters to get a reduced drop-down of 'Shop name' values (easier when the user knows where the shop is but does not remember its exact name).
Using a parameter would no longer allow this, since all of the above relies on filtering to 'only relevant values'.
Does anyone have any recommendations?

Redis - how to store my data?

On the Redis site, under "memory optimization", it says that small hashes use far less memory than a few separate keys, so it is better to store a small hash with a few fields instead of several keys. So I thought of making, for example, a users hash and storing each user in a field as JSON-serialized data. But what if my hash is REALLY big, i.e. it has a lot of fields?
Is it better to store the users as a single hash with a lot of fields, or as several small hashes?
I am asking because the Redis site says "small" hashes are better than several keys for storing a couple of values, but I don't know whether that still applies to really big hashes.
I would say your best solution is creating a key per user, perhaps named by the user's id, and storing the JSON data there.
We tried storing each user as a hash with a field for each of the user's properties, but we found we never really used the fields individually and in most cases needed most of the data (HGETALL), so we switched to storing JSON - which also helps with preserving data types.
We would need more detail on what you're trying to store and how, to give more suggestions really.
Let's say you have a user like this:
{"ID": "344", "Name": "Blah", "Occupation": "Engineer", "Telephone": [ "550-33...", ...] }
You would serialize the JSON and store it as what Redis calls a String, i.e. you would use the GET and SET commands,
e.g.
SET "user:344" "<SERIALIZED>"
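The serialize/deserialize step around those commands is just a JSON round trip; a minimal sketch (the SET/GET calls themselves are omitted, since they depend on your Redis client):

```javascript
// Serialize a user object into the String value stored under
// "user:<id>", and parse it back after a GET. The JSON round trip is
// what preserves data types: numbers, arrays, and nesting all survive,
// unlike flat hash fields, which are always strings.
function toRedisValue(user) {
  return JSON.stringify(user); // value passed to SET "user:<id>"
}
function fromRedisValue(raw) {
  return JSON.parse(raw); // value returned by GET "user:<id>"
}
```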
Since "users" is one of your main objects, it is not a small hash.
The gist of the documentation is about hashes with a small number of elements. For example, say that in your whole system you have 10 colors and you want to associate some data with each of them. Instead of doing:
color:blue -> DATA, color:white -> DATA
it is better to use a hash:
colors -> blue -> DATA
colors -> white -> DATA

TABLEAU: Create global filter from a secondary data source to multiple data sources on dashboard

I have a Tableau dashboard with various visualizations created from 3 data sources (i.e. A,B, C).
Each data source has a relationship (join) with the same secondary data source (i.e. D), and this secondary data source provides the information to create a filter for each data source. In other words, my data sources are related as follows:
A - D
B - D
C - D
I would like to create a global filter on a dashboard I have created. I would like one filter card from "D" to show up and be applied to "A," "B," and "C" at once rather than having a separate filter card show up for each data source.
I tried to create a global filter via a parameter and calculated field, but the parameter requires layers of connections because data sources A, B, and C only have D in common.
Thoughts?
It's not completely clear from your question, but it sounds like you are using Tableau data blending on your worksheets to include data from multiple data sources, rather than a join to create a single data source based on multiple tables. If all your tables are on the same database server or in the same spreadsheet, traditional joins are usually more efficient than data blending.
The following approach often works well.
Instead of using Tableau's quick filter feature, create a worksheet based solely on D that shows the values you wish to use for filtering. It can be a simple list of names, or a bubble chart or anything you like. Use that worksheet as your filter by creating actions where it is the source and all the other worksheets on your dashboard are the target. Typically, you would want to specify the field names explicitly.
Data blending is useful but can be complex. Depending on details, you may need to make D the primary data source on your other worksheets. Experiment.
The parameter and calculated field you mentioned can be even simpler and faster than using actions, but users are restricted to selecting a single value for a parameter unlike the filter action approach. (Of course, one parameter value can represent multiple values in your target data source field depending entirely on how your calculated field interprets the parameter).
I can't tell why that didn't work for you or what you mean by "layers of connections". You might consider clarifying that part of your question.
