Central list of SMART on FHIR endpoints - hl7-fhir

I am building a SMART on FHIR app for patient-mediated EHR analysis.
Is there a centralized list of SMART on FHIR endpoints? For example, Epic provides this list of endpoints, and a very nice patient-facing website to access Epic MyChart for a particular clinic. It is searchable by location, so I can search for "Maryland" and see all clinics with MyChart in that state.
I want to replicate the Epic MyChart access page, but add SMART on FHIR endpoints from eClinicalWorks, Cerner, and other vendors. I think a central, continually updated list of SMART on FHIR endpoints would benefit EHR interoperability, especially if it were open source and maintained frequently. The information is all public; it would just be a matter of compiling it and keeping it current. Does such a list already exist?

Right now we are working on a specification to provide more consistent publication of endpoint and branding information to help patients select their provider. We are aiming to have each vendor openly publish more complete data in a consistent format. From there, compiling a centralized list should be possible.
Feel free to check out https://hackmd.io/#argonaut/patient-access-brands for details.

Probably not quite what you're after, but I wanted to point out that Lantern aggregates published FHIR API endpoints and some details about them.
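As a small illustration of what you can do once you have a candidate endpoint (from a vendor's published list or from Lantern), here is a minimal sketch that reads a server's SMART configuration; the base URL is a placeholder, and this assumes the server implements SMART App Launch discovery:

```typescript
// Minimal sketch: read a FHIR server's SMART configuration document.
// SMART App Launch servers publish their OAuth endpoints at
// <base>/.well-known/smart-configuration.
async function getSmartConfiguration(fhirBaseUrl: string) {
  const base = fhirBaseUrl.replace(/\/$/, "");
  const res = await fetch(`${base}/.well-known/smart-configuration`, {
    headers: { Accept: "application/json" },
  });
  if (!res.ok) throw new Error(`No SMART configuration: HTTP ${res.status}`);
  return res.json(); // authorization_endpoint, token_endpoint, capabilities, ...
}

// Placeholder base URL for illustration only.
getSmartConfiguration("https://ehr.example.org/fhir/r4")
  .then((cfg) => console.log(cfg.authorization_endpoint, cfg.token_endpoint))
  .catch(console.error);
```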

Related

Phantom wallet: Solana transactions debug

I need to know which list of instructions is used for a transaction in order to use it in my application.
For example:
There is an unknown NFT marketplace where I can buy an NFT through the "Buy" button in the browser.
I want to buy these NFTs programmatically, not through the browser.
To do this, I need to know all the instructions and transactions.
And in general, can I buy NFTs if the marketplace does not use Candy Machine, Magic Eden, etc.?
Each marketplace/platform has a different way of managing transactions.
If you are interested in how transactions are put together and want to start building an understanding of the ecosystem, you have two options:
Pick an open source repo (on Metaplex you have auction house / storefront / fixed price listing / candy machine, at least). Some of these have template UIs as well that you can customise to suit your needs. They come with several examples of transactions so that you can learn to build them from the ground up.
Pick a proprietary framework (like Magic Eden), and interact with them via their API. You will not be building the transactions yourself in this case. In the case of Magic Eden, they build the transaction instructions serverside and serve them over their instructions API. You will not know (nor would you necessarily need to know) how the transactions are put together, as they've handled the abstractions for you.
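To make option 1 concrete, here is a minimal sketch (not any particular marketplace's code) of building and sending a transaction from instructions yourself with @solana/web3.js; a marketplace purchase has the same shape, just with the marketplace program's instructions instead of a plain transfer:

```typescript
import {
  Connection,
  Keypair,
  PublicKey,
  SystemProgram,
  Transaction,
  sendAndConfirmTransaction,
} from "@solana/web3.js";

// Build a transaction from instructions and send it. A marketplace "buy"
// works the same way, but the instructions come from the marketplace's
// on-chain program instead of SystemProgram.
async function sendSimpleTransfer(payer: Keypair, recipient: PublicKey) {
  const connection = new Connection("https://api.devnet.solana.com", "confirmed");

  const tx = new Transaction().add(
    SystemProgram.transfer({
      fromPubkey: payer.publicKey,
      toPubkey: recipient,
      lamports: 1_000_000, // 0.001 SOL
    })
  );

  // Signs with the payer's keypair and waits for confirmation.
  return sendAndConfirmTransaction(connection, tx, [payer]);
}
```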
Some examples of different NFT frameworks in the Solana space:
Metaplex Auction Manager (open source) + Storefront UI. Useful for learning how transactions are created, but probably unwise to start building a long-term solution on top of this. The Storefront UI is in places quite complex, but as long as you focus on the transactions and how they are generated, it's a useful pedagogical tool.
Metaplex Auction House (open source) is intended as a replacement for the Auction Manager framework. Like the Auction Manager, Auction House is open source and there are multiple examples of how to use it in the Metaplex repos.
Metaplex Fixed Price (open source) is another listing framework with its own set of contracts. Again, the repos are available on the Metaplex GitHub.
Magic Eden (proprietary) exposes its transactions via an API, which gives you a serialized transaction (see the sketch after this list). You can use the transactions generated by their API in your application, but because ME is not open source, you will not be able to generate the transactions from scratch.
The above are just examples. However, the one thing they all share is that they use the Metaplex standard for expressing their mint metadata. The way ownership is expressed via TokenAccounts is also standard across all of them, which means that as long as you hold NFTs in your wallet, you will be able to list them in any of the above.
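For the proprietary route, consuming a serialized transaction generally looks like the sketch below. This is a sketch of the pattern, not Magic Eden's actual API contract; the endpoint URL and response field name are assumptions, so check the platform's documentation for the real shapes.

```typescript
import { Connection, Keypair, Transaction } from "@solana/web3.js";

// Sketch of the "platform builds the transaction for you" route: fetch an
// already-built, base64-encoded transaction from the platform's API, sign it
// locally, and submit the raw bytes. Endpoint and field names are placeholders.
async function buyViaPlatformApi(instructionsUrl: string, buyer: Keypair) {
  const connection = new Connection("https://api.mainnet-beta.solana.com", "confirmed");

  const res = await fetch(instructionsUrl);
  const { txBase64 } = await res.json(); // assumed response shape

  const tx = Transaction.from(Buffer.from(txBase64, "base64"));
  tx.partialSign(buyer);
  return connection.sendRawTransaction(tx.serialize());
}
```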
In addition to the above, the Metaplex Discord is also a useful source of information with a friendly community, but please check the growing SO knowledge base first.

How to integrate Firestore health check and dashboard metrics with our internal company systems

Context: this is my first use of Firestore. I want to use it to push notification status to our mobile application. I can see that there is a Google Firestore dashboard under the Analytics umbrella. In our company we mainly use three tools for monitoring our applications: Zabbix, Dynatrace, and a certain internal solution based on Elasticsearch. I need to integrate our internal monitoring systems with the metrics resulting from our first Firestore project.
What I am looking for, based on my own assumptions:
1) Maybe there are some GET endpoints that I can connect to and poll for information, say every minute.
2) Maybe, following the idea of the Realtime Database pushing events across a long-lived connection, I can code a Spring Boot application that imports the Firebase SDK and connects to some specific Firestore endpoint that pushes the events of interest (e.g. a delay based on custom logic, or a dead service).
3) Maybe some plugin I can connect straight to a Kafka cluster hosted in our internal datacenter.
4) Some plugin to connect from Firestore/Firebase to third-party tools (e.g. Zabbix, Dynatrace, or Elasticsearch).
5) Some dependency I could import into Google Cloud Functions, triggered from a Firestore health-check engine, in order to post data to some internal endpoint (see the sketch after this question).
Perhaps there is already some approach commonly used for the scenario where you have to connect Firestore to an internal monitoring system. I would really appreciate being pointed to it, so that I can narrow my searching, because I am not finding anything useful.
Please note that comparing monitoring approaches is not part of this question. It is an established fact that our company uses internal dashboards and some custom alert triggers; I just mentioned the names above to clarify what I mean by internal monitoring tools. The focus of this question is how to import/integrate/observe/consume Firestore monitoring data. Our internal stack is beyond the scope of this question.
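A minimal sketch of option 5 above, assuming Cloud Functions for Firebase on a Node.js runtime with global fetch available; the collection name and the internal URL are placeholders for illustration only:

```typescript
import * as functions from "firebase-functions";

// Sketch: forward Firestore write events to a company-internal HTTP endpoint
// (e.g. a gateway in front of Zabbix/Elasticsearch). Collection name and URL
// are made up for this example.
export const notifyInternalMonitoring = functions.firestore
  .document("notifications/{notificationId}")
  .onWrite(async (change, context) => {
    const payload = {
      id: context.params.notificationId,
      deleted: !change.after.exists,
      data: change.after.exists ? change.after.data() : null,
      eventTime: context.timestamp,
    };

    await fetch("https://monitoring.internal.example.com/firestore-events", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(payload),
    });
  });
```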
Here is the official documentation for Cloud Monitoring, which you can use to collect metrics, events, and metadata from Google Cloud Platform products and build dashboards, charts, and alerts.
Please let me know if you have further questions.
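As a concrete starting point against that documentation, here is a minimal sketch using the Cloud Monitoring Node.js client to poll a Firestore metric; the metric type shown is one of the documented Firestore metrics, but verify the exact names you need in the Cloud Monitoring metrics list:

```typescript
import { MetricServiceClient } from "@google-cloud/monitoring";

// Sketch: pull the last hour of a Firestore metric from Cloud Monitoring so
// it can be re-published to an internal dashboard or alerting system.
async function readFirestoreDocumentReads(projectId: string) {
  const client = new MetricServiceClient();
  const now = Math.floor(Date.now() / 1000);

  const [timeSeries] = await client.listTimeSeries({
    name: client.projectPath(projectId),
    filter: 'metric.type = "firestore.googleapis.com/document/read_count"',
    interval: {
      startTime: { seconds: now - 3600 }, // last hour
      endTime: { seconds: now },
    },
    view: "FULL",
  });

  for (const series of timeSeries) {
    for (const point of series.points ?? []) {
      console.log(point.interval?.endTime, point.value?.int64Value);
    }
  }
}
```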

YouTube API Quota Extension for research

I need an extension to my YouTube API quota allotment to conduct research for my dissertation. I have been trying to get an estimate of the available resources and costs for extensions for an NSF grant but have not been able to get in contact with a human for several weeks despite filing the quota extension form.
Currently, I am trapped in a loop with the YouTube API compliance team, where they repeatedly ask me for the following info:
In order to proceed further, please provide us the following information:
Provide API Client link and demo credentials
Screenshots and/or video recording(s) that clearly demonstrates how your API Client and its users access and use the YouTube API Services.
Documents relating to your implementation, access and use of YouTube API Services.
I have attached the required responses multiple times and still receive the same message. For the first item, I attached my Python code for accessing the API (the only usage of the service); for the second, screenshots of the terminal window and the data output; for the third, the project summary, description, and data collection plan, plus the first paper I published using the limited YouTube quota.
I've repeatedly asked to be connected with a human to go through their requirements but have had no response. The project has received a great deal of interest in the economics community and I am under a great deal of pressure to continue the work; it is very stressful for a graduate student to bear, especially when blocked by an automated response. Please help D:
The service tag is 1-0726000027117
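While waiting for a human, a rough quota estimate for the grant budget can be done by hand. The unit costs below reflect the published YouTube Data API quota documentation at the time of writing (verify them in the quota calculator before citing them), and the study numbers are placeholders:

```typescript
// Back-of-the-envelope quota arithmetic. Published costs at time of writing:
// search.list = 100 units, most other list calls = 1 unit,
// default project quota = 10,000 units/day.
const DAILY_QUOTA_UNITS = 10_000;
const COST_SEARCH_LIST = 100;
const COST_VIDEOS_LIST = 1;

// Placeholder study parameters.
const searchesPerDay = 50;
const videoLookupsPerDay = 2_000;

const unitsNeeded =
  searchesPerDay * COST_SEARCH_LIST + videoLookupsPerDay * COST_VIDEOS_LIST;

console.log(`Estimated usage: ${unitsNeeded} units/day`);
console.log(`Fits the default quota: ${unitsNeeded <= DAILY_QUOTA_UNITS}`);
```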

Provider information using HL7-FHIR standard

I am new to FHIR.
How do I create provider information (XML) using the HL7 FHIR standard? Is there an XSD reference I can use to create the provider information?
Any help is really appreciated.
Information about healthcare providers (and anyone acting in their professional capacity including receptionists, taxi drivers, etc.) is shared using the Practitioner resource. If you do a search on that page, you'll find links to the XML and JSON schemas.
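For a concrete starting point, here is a minimal sketch of creating a Practitioner on a generic FHIR R4 server. The resource content, identifier system, and base URL are made-up examples; if you specifically need the XML form described by the published schemas, send application/fhir+xml with an XML body instead.

```typescript
// A made-up, minimal Practitioner resource for illustration.
const practitioner = {
  resourceType: "Practitioner",
  identifier: [{ system: "http://example.org/practitioners", value: "12345" }],
  name: [{ family: "Example", given: ["Pat"], prefix: ["Dr"] }],
  telecom: [{ system: "phone", value: "+1-555-0100", use: "work" }],
};

// POST the resource to a FHIR server to create it.
async function createPractitioner(fhirBaseUrl: string) {
  const res = await fetch(`${fhirBaseUrl}/Practitioner`, {
    method: "POST",
    headers: {
      "Content-Type": "application/fhir+json",
      Accept: "application/fhir+json",
    },
    body: JSON.stringify(practitioner),
  });
  if (!res.ok) throw new Error(`Create failed: HTTP ${res.status}`);
  return res.json(); // server returns the created resource with its assigned id
}
```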
If you want to build a Health Provider Directory/Index, you can have a look at existing Implementation Guides:
Mobile Care Services Discovery is an IHE profile which implements an HPD with FHIR.
Validated Healthcare Directory Implementation Guide is a FHIR Implementation Guide for a health directory.
If you compare these two, you'll see a common approach to representing Practitioners, Organizations, etc.

Google Client Library for Java SDK and GDPR

I am using the Google Client Library for Java SDK in my Android app to interface with Google Drive.
Does Google act as a Data Controller or a Data Processor when I use this SDK? I need to know whether I need to store any data to show that the user has consented to my app interfacing with Google Drive, in line with GDPR.
I know I need to ask permission for personalised or non-personalised ads but the Google Drive SDK and GDPR stuff is driving me crazy.
Thanks
Disclaimer: I am not a legal person; this is my opinion based on the guidelines that we have been given. You should also seek independent legal advice relating to your status and obligations under the GDPR, as only a lawyer can provide you with legal advice specifically tailored to your situation.
For reference, I am going to quote from the following documents, which as of this writing are the only things Google has released with regard to GDPR that I am aware of:
Google Cloud & the General Data Protection Regulation
GOOGLE CLOUD & THE GDPR WHITEPAPER
Google Cloud & the General Data Protection Regulation (GDPR)
G Suite and Google Cloud Platform customers will typically act as the data controller for any personal data they provide to Google in connection with their use of Google’s services. The data controller determines the purposes and means of processing personal data, while the data processor processes data on behalf of the data controller. Google is a data processor and processes personal data on behalf of the data controller when the controller is using G Suite or Google Cloud Platform.
Data controllers are responsible for implementing appropriate technical and organisational measures to ensure and demonstrate that any data processing is performed in compliance with the GDPR. Controllers’ obligations relate to principles such as lawfulness, fairness and transparency, purpose limitation, data minimisation, and accuracy, as well as fulfilling data subjects’ rights with respect to their data.
If you are a data controller, you may find guidance related to your responsibilities under GDPR by regularly checking the website of your national or lead data protection authority under the GDPR (as applicable), as well as by reviewing publications by data privacy associations such as the International Association of Privacy Professionals (IAPP).
You should also seek independent legal advice relating to your status and obligations under the GDPR, as only a lawyer can provide you with legal advice specifically tailored to your situation. Please bear in mind that nothing on this website is intended to provide you with, or should be used as a substitute for, legal advice.
G Suite is Google's suite of tools, that being Drive, Calendar, etc.; they are the data controller for the data behind the Google tools.
Controller vs. Processor
(7) ‘controller’ means the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data; where the purposes and means of such processing are determined by Union or Member State law, the controller or the specific criteria for its nomination may be provided for by Union or Member State law;
(8) ‘processor’ means a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller;
IMO
If you are accessing a user's data on Google Drive and changing it or doing anything with it, then yes, you are going to need to tell them what you are using their data for and log their consent. If you are saving their data anywhere, then you are also going to have to give them the ability to delete that data.
There are some things you can't do; for example, if they want to delete all their files on Drive, that's not your responsibility, that's Google's. You are only responsible for the data that's on your system and what you have done with it.
Using Google's client library, IMO, doesn't have much to do with GDPR; it's what you are doing with the data it returns that matters. I did contact Google a few months ago hoping to get some official guidelines with regard to GDPR and the client libraries. I have not heard anything as of yet.
