How to send Google Analytics events to Pub/Sub

We are using Google Analytics event tracking to track problematic behavior. Some key events need to be acted on immediately. How can we send those events to Pub/Sub and then do the real-time analytics work?
All events can be synchronized to BigQuery for offline analysis, but for some specific events we want to trigger some logical operation immediately, and we have found nothing that achieves this.
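One approach (a sketch only, not an official Google Analytics integration) is to duplicate the key hits to an HTTP endpoint you control, for example from a server-side Google Tag Manager container or your own Measurement Protocol relay, and have that endpoint publish to a Pub/Sub topic. The function name, topic name, and payload fields below are assumptions for illustration:

```typescript
// Hypothetical HTTP Cloud Function that receives a forwarded GA event
// (e.g. duplicated from a server-side GTM container) and republishes it
// to a Pub/Sub topic for real-time processing. Topic and field names are
// illustrative only.
import { PubSub } from "@google-cloud/pubsub";
import type { Request, Response } from "express";

const pubsub = new PubSub();
const topic = pubsub.topic("ga-key-events"); // assumed topic name

export async function forwardGaEvent(req: Request, res: Response): Promise<void> {
  const { eventCategory, eventAction, eventLabel, clientId } = req.body ?? {};

  // Only forward the specific events that need immediate handling.
  if (eventCategory !== "problematic-behavior") {
    res.status(204).end();
    return;
  }

  const messageId = await topic.publishMessage({
    json: { eventCategory, eventAction, eventLabel, clientId, receivedAt: Date.now() },
  });

  res.status(200).json({ messageId });
}
```

A Cloud Function or Dataflow job subscribed to that topic can then run the real-time logic, while the full event stream still flows to BigQuery for offline analysis.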

Related

Do smart contracts on NEAR have events?

Do smart contracts have events now that I can set up listeners for or do I need to poll the chain manually to get data about them? (Rust)
NEAR Protocol doesn't have events, so you need to poll the chain manually or create an indexer to listen for everything in the network.
For the record, NEAR Protocol allows contract developers to use Events Format logs for a more standardized way of logging. That makes it easier to catch such logs via indexers.
This indexer tutorial utilizes NEAR Lake Framework JS to catch Events Formatted logs.
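As a minimal sketch of what catching Events Format (NEP-297) logs looks like, the snippet below only parses the `EVENT_JSON:` prefix out of raw log strings; the indexer wiring that feeds receipt logs into it (e.g. via NEAR Lake Framework JS, as in the tutorial above) is assumed rather than shown:

```typescript
// Minimal sketch: extract NEP-297 "Events Format" events from raw log strings.
// The indexer (e.g. NEAR Lake Framework JS) would pass each receipt's logs
// into handleLogs; that wiring is assumed here.
interface EventJson {
  standard: string; // e.g. "nep171"
  version: string;  // e.g. "1.0.0"
  event: string;    // e.g. "nft_mint"
  data?: unknown;
}

const EVENT_PREFIX = "EVENT_JSON:";

function parseEventLog(log: string): EventJson | null {
  if (!log.startsWith(EVENT_PREFIX)) return null;
  try {
    return JSON.parse(log.slice(EVENT_PREFIX.length)) as EventJson;
  } catch {
    return null; // not a well-formed event log; ignore it
  }
}

function handleLogs(logs: string[]): void {
  for (const log of logs) {
    const event = parseEventLog(log);
    if (event) {
      console.log(`caught ${event.standard} event: ${event.event}`, event.data);
    }
  }
}
```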

Do smart contracts on NEAR have events or do I need to poll the chain to get data?

Do smart contracts have events now that I can set up listeners for or do I need to poll the chain manually to get data about them?
There are no events on NEAR right now, but you could do the following:
https://github.com/near-examples/erc-20-token/blob/master/contract/events.ts
and in Rust
https://github.com/near/docs/issues/362
Instead of native events, we have a way to poll for changes in the contract's state. For example, the fungible-token events above are implemented using that mechanism.
Polling for changes can be done via the RPC (https://docs.near.org/docs/api/rpc-experimental#example-of-data-changes). We are also finishing the indexing infrastructure, so later you will be able to just run an indexer node that provides all these events (https://github.com/nearprotocol/nearcore/pull/2651).
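As a rough sketch of the polling approach mentioned above, the snippet below calls the experimental data-changes RPC; the contract account, key prefix, and response handling are assumptions based on the linked documentation, not a tested integration:

```typescript
// Rough sketch: poll the NEAR RPC for data changes in a contract's state.
// Account id, key prefix, and polling interval are illustrative assumptions.
async function pollDataChanges(): Promise<void> {
  const response = await fetch("https://rpc.mainnet.near.org", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: "dontcare",
      method: "EXPERIMENTAL_changes",
      params: {
        changes_type: "data_changes",
        account_ids: ["example-token.near"], // assumed contract account
        key_prefix_base64: "",               // empty prefix = all keys
        finality: "final",
      },
    }),
  });

  const { result } = await response.json();
  for (const change of result?.changes ?? []) {
    console.log("state change:", change);
  }
}

// Poll every few seconds until native events / indexer support covers this.
setInterval(() => pollDataChanges().catch(console.error), 5000);
```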

How to trigger GraphQL Subscriptions from a backend message queue application

Currently I'm using Socket.io / SignalR to emit an event from my backend message queue system whenever new data comes in. That way I can set up an event handler in my React application and update the Relay cache from within the event handler.
It does not seem like the most GraphQL-ish way to do things, so I was playing around with pre-RFC live-query implementations, where you observe data changes in reactive data stores, push them to the GraphQL server, and then on to the client using WebSockets, with some rather complex custom code. Obviously GraphQL is not ready for real live queries (as opposed to polling).
A few lines further down it says:
When building event-based subscriptions, the problem of determining what should trigger an event is easy, since the event defines that explicitly. It also proved fairly straight-forward to implement atop existing message queue systems.
Which leads me to my question: how can you (in a GraphQL way) best trigger GraphQL subscriptions when a new event arrives at your backend message queue application and you need to reflect this new data in the UI in real time, say every second? I'm not talking about triggering the event in the frontend/client or polling every x seconds, like you usually see when talking about subscriptions.
Not sure it's relevant but I'm using Relay Modern as my preferred graphql client.
Here are some ideas that might work, if I can get a little help understanding in general how to trigger/call a subscription without a mutation.
Backend worker / message queue "A" receives a new incoming event with some device data. It uses SignalR or another pub/sub mechanism (Redis/Socket.io/?) to notify the GraphQL server "B" (which subscribes to the event) that a new event has happened. The GraphQL server then triggers/executes the subscription, and the frontend React Relay application "C" updates automatically, since it has a Relay subscription defined. This would be ideal, right? But how do you trigger the subscription on the GraphQL server? (A sketch of this follows after the question.)
Simply use Socket.io/SignalR to emit events from backend worker / message queue "A" on incoming data, subscribe to and handle the event in the frontend "B", and then programmatically call the subscription from within the Socket.io/SignalR event handler (if such a thing, directly calling a subscription, is even possible). But then the only improvement over pure Socket.io/SignalR would be that I have moved the updating of the Relay cache/store from the handler into the subscription. Not a big improvement, if any. But the manual update of the cache/store is really cumbersome, although not that hard :/
How do people handle real streaming live (device) data with SignalR, and why do all the real-time articles/examples just repeat the same old simple chat application, where the UI only updates after a user click event? Is GraphQL not yet suited to dealing with a stream of frequently incoming device data in real time? After playing with implementing live queries myself I understand why they were delayed, but without them, how do you get REAL real-time data updates pushed from the server to the frontend?
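A common way to realize idea "A" (sketched here under assumptions about the schema and topic names, not as the one true answer) is to back the subscription with a pub/sub implementation such as `graphql-subscriptions` (or a Redis-backed equivalent) and have the message-queue consumer publish into it; every connected Relay client with the subscription active then receives the payload without any manual cache plumbing:

```typescript
// Sketch of idea "A": the GraphQL server "B" resolves a subscription from an
// in-process PubSub, and the message-queue consumer publishes into it whenever
// worker "A" delivers a new device event. Schema and field names are assumed.
import { PubSub } from "graphql-subscriptions";

const pubsub = new PubSub();
const DEVICE_DATA = "DEVICE_DATA";

// Subscription resolvers, to be plugged into e.g. Apollo Server / graphql-ws.
export const resolvers = {
  Subscription: {
    deviceData: {
      subscribe: () => pubsub.asyncIterator([DEVICE_DATA]),
    },
  },
};

// Called from the message-queue handler (SignalR, Redis, or whatever "A"
// uses) whenever a new event arrives on the backend.
export function onQueueMessage(payload: { deviceId: string; value: number }): void {
  void pubsub.publish(DEVICE_DATA, { deviceData: payload });
}
```

For production-volume device streams, the in-memory PubSub is usually swapped for a Redis- or MQ-backed implementation so the GraphQL server and the queue consumer can run as separate processes.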

How can I view the events sent to a topic?

I created an Azure Event Grid Topic and I can send events to it. How can I view the events submitted to that topic?
In the Azure Portal, I can view the Event Grid Topic and see the topic's metrics but I cannot find a way to view the event messages.
For now I have created a workaround where I send all messages to a storage queue. The issue with this is that Azure Storage Explorer doesn't let me scroll through thousands of messages; it limits me to viewing a single page of roughly 50 records.
Try the Azure Event Grid Tester. You can clone any event subscription to a subscriber on your local machine via a Hybrid Connection and/or an ngrok tunnel.
[Screenshot: the tester showing 2079 events on the custom topic rk20180724topic2.]
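If you would rather roll your own local viewer than use the tester, a small webhook subscriber exposed through ngrok only needs to answer Event Grid's subscription-validation handshake and can then log every event; the Express port and route below are arbitrary choices for this sketch:

```typescript
// Sketch of a local Event Grid webhook subscriber (exposed via ngrok) that
// answers the SubscriptionValidation handshake and logs incoming events.
import express from "express";

const app = express();
app.use(express.json());

app.post("/events", (req, res) => {
  const events = Array.isArray(req.body) ? req.body : [req.body];

  for (const event of events) {
    if (event.eventType === "Microsoft.EventGrid.SubscriptionValidationEvent") {
      // Echo the validation code back so Event Grid accepts the subscription.
      res.json({ validationResponse: event.data.validationCode });
      return;
    }
    console.log(`[${event.eventTime}] ${event.eventType} on ${event.subject}`, event.data);
  }

  res.status(200).end();
});

app.listen(3000, () => console.log("Listening for Event Grid events on :3000"));
```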

How do event tracing and event logging work in terms of real-time event consumption?

I am a newbie to Windows events. I was wondering how I could consume events in real time as new events are generated, so I need to know, in simple terms, how event tracing and event logging work.
A simple Google search will find examples:
http://blogs.iis.net/eokim/archive/2009/05/15/consume-iis-etw-tracing.aspx
How to consume real-time ETW events from the Microsoft-Windows-NDIS-PacketCapture provider?
https://www.google.com/search?q=event+tracing+for+windows+consume+events+example
