I've connected my blob storage account to Event Grid, via an Event Hub subscription, and can see the events from uploaded blobs.
But I was hoping to be able to pass some metadata with each received event, so I can relate the event back to a foreign key (customer identifier) without having to do extra work on each event.
Is this possible? I couldn't see anything in the API docs regarding this.
Based on the Azure Event Grid event schema for Blob storage, there are no metadata properties in the Blob storage event data.
Note that there is only one specific way to pass some metadata from the AEG subscription to its subscriber: a query string on the webhook event handler endpoint (e.g. an HttpTrigger function).
A solution for your scenario is to use an EventGridTrigger function (subscriber) with an output binding to the Event Hub.
The following example shows a lightweight implementation of the event message mediator using the EventGridTrigger function:
[FunctionName("Function1")]
[return: EventHub("%myEventHub%", Connection = "AzureEventHubConnectionString")]
public async Task<JObject> Run([EventGridTrigger]JObject ed, ILogger log)
{
    // original event message
    log.LogInformation(ed.ToString());

    // place for event data enrichment
    var metadata = new { metadata = "ABCD", abcd = 12345 };

    // enrich data object
    ed["data"]["url"]?.Parent.AddAfterSelf(new JProperty("subscription", JObject.FromObject(metadata)));

    // show after mediation
    log.LogWarning(ed.ToString());

    // forward to the Event Hub
    return await Task.FromResult(ed);
}
The log output shows the event message before and after enrichment; the enriched message is then forwarded to the Event Hub.
I am using googleapis in Node.js to create and fetch calendar events. I am using the following method to get the list of events.
const getEvents = async (dateTimeStart, dateTimeEnd, timeZone) => {
  console.log("Date Start : " + dateTimeStart + " date end :" + dateTimeEnd + " time zone " + timeZone);
  try {
    let response = await calendar.events.list({
      auth: auth,
      calendarId: CALENDER_ID,
      timeMin: (new Date(dateTimeStart)).toISOString(),
      timeMax: (new Date(dateTimeEnd)).toISOString(),
      timeZone: timeZone,
      singleEvents: true,
      maxResults: 9999,
      orderBy: 'startTime'
    });
    let items = response['data']['items'];
    console.log(items);
    return items;
  } catch (error) {
    console.log(`Error at getEvents --> ${error}`);
    return 0;
  }
};
The above method returns only events that were created programmatically via googleapis. If I create events directly on the calendar from the browser, this method does not return them.
Any idea how to fetch all events, even the ones created from the browser?
Based on what you explained about the events being created by the service account instead of the actual users, I think the problem is that the events created through the API end up under the service account's calendar, while the ones the users create through the web UI live under a different calendar ID. Since you are probably calling the API with the service account's calendar ID, you only get the events created through the API and not the ones created through the web UI.
In this case it may be necessary to make sure that every event, whether it comes from the web UI or the API, is created under the exact same calendar ID, so that all of them get listed as expected.
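One common way to achieve that is to share the users' calendar with the service account and then list events against that calendar's ID. A minimal sketch of what that could look like; the key file path and the calendar ID below are placeholders, not values from your setup:

const { google } = require('googleapis');

// Authenticate as the service account (key file path is a placeholder).
const auth = new google.auth.GoogleAuth({
  keyFile: 'service-account.json',
  scopes: ['https://www.googleapis.com/auth/calendar'],
});
const calendar = google.calendar({ version: 'v3', auth });

// Use the ID of the calendar the users actually create events in via the web UI,
// after sharing that calendar with the service account.
const SHARED_CALENDAR_ID = 'team-calendar@example.com'; // placeholder

const listAllEvents = async (dateTimeStart, dateTimeEnd, timeZone) => {
  const response = await calendar.events.list({
    calendarId: SHARED_CALENDAR_ID,
    timeMin: new Date(dateTimeStart).toISOString(),
    timeMax: new Date(dateTimeEnd).toISOString(),
    timeZone: timeZone,
    singleEvents: true,
    orderBy: 'startTime',
  });
  return response.data.items;
};

With the calendar shared and its ID used consistently, events.list should return both the API-created and the manually created events.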
Let me know if this is useful, otherwise I can edit the response to add more clarification depending on your specific situation.
I have created a bot and installed it in my Microsoft Teams, and I get a conversation update event along with the context object.
// Listen for incoming requests.
server.post('/api/messages', (req, res) => {
  adapter.processActivity(req, res, async (context) => {
    console.log(context);
    await bot.run(context);
  });
});
I want to store this context object for future reference. I tried storing it in a Postgres database in a column of type json. When I retrieve the context object from the database and perform some actions like
context.sendActivity(MessageFactory.text('All messages have been sent.'));
it throws an "activity not found" error:
[onTurnError] unhandled error: Error: Missing activity on context
I want to store the context object somewhere, or is there any way that I can get the context object back from the "activity"?
Have a look at how to send proactive notifications to users.
In short: there are helper functions to achieve your goal. First you retrieve the conversation reference.
const conversationReference = TurnContext.getConversationReference(context.activity);
Then use the following snippet to continue a conversation, based on the saved conversation reference.
await adapter.continueConversation(conversationReference, async turnContext => {
  // If you encounter permission-related errors when sending this message, see
  // https://aka.ms/BotTrustServiceUrl
  await turnContext.sendActivity('proactive hello');
});
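Since you already have Postgres in place, one option along these lines is to persist the conversation reference (it serializes cleanly to JSON) instead of the whole context object. A rough sketch, assuming node-postgres and a conversation_references table with a jsonb column; the table, column, and function names here are made up for illustration:

const { Pool } = require('pg');
const { TurnContext, MessageFactory } = require('botbuilder');

const pool = new Pool(); // connection settings come from the usual PG* environment variables

// In your bot handler: save the conversation reference rather than the TurnContext itself.
async function saveReference(context) {
  const reference = TurnContext.getConversationReference(context.activity);
  await pool.query(
    'INSERT INTO conversation_references (user_id, reference) VALUES ($1, $2)',
    [context.activity.from.id, JSON.stringify(reference)]
  );
}

// Later: load the reference back and send a proactive message via continueConversation.
async function sendProactive(adapter, userId) {
  const { rows } = await pool.query(
    'SELECT reference FROM conversation_references WHERE user_id = $1',
    [userId]
  );
  const savedReference = rows[0].reference; // jsonb columns come back as plain objects
  await adapter.continueConversation(savedReference, async turnContext => {
    await turnContext.sendActivity(MessageFactory.text('All messages have been sent.'));
  });
}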
I'm trying to set up a DLQ for a Kinesis stream.
I used SQS and set it as the Kinesis on-failure destination.
The Kinesis stream is attached to a Lambda that always throws an error, so the event goes straight to the SQS DLQ.
I can see the events in SQS, but the payload of the event (the JSON I send as part of the event) is missing. If I print the event in the Lambda before throwing the exception, I can see the base64-encoded data, but not in my DLQ.
Is there a way to send the event data to the DLQ as well? I want to be able to examine the cause of the error properly and put the event back into Kinesis once I have fixed the issue in the Lambda.
https://docs.aws.amazon.com/lambda/latest/dg//with-kinesis.html#services-kinesis-errors
The actual records aren't included, so you must process this record and retrieve them from the stream before they expire and are lost.
According to the above, the event payload won't be sent to the DLQ event, so the "missing event data" is expected here.
Therefore, in order to retrieve the actual records, you might want to try something like the following.
1) Assuming we have the following Kinesis batch info in the DLQ message:
{
  "KinesisBatchInfo": {
    "shardId": "shardId-000000000001",
    "startSequenceNumber": "49601189658422359378836298521827638475320189012309704722",
    "endSequenceNumber": "49601189658422359378836298522902373528957594348623495186",
    "approximateArrivalOfFirstRecord": "2019-11-14T00:38:04.835Z",
    "approximateArrivalOfLastRecord": "2019-11-14T00:38:05.580Z",
    "batchSize": 500,
    "streamArn": "arn:aws:kinesis:us-east-2:123456789012:stream/mystream"
  }
}
2) We can get the records back by doing something like:
import AWS from 'aws-sdk';

const kinesis = new AWS.Kinesis();

const ShardId = 'shardId-000000000001';
const ShardIteratorType = 'AT_SEQUENCE_NUMBER';
const StreamName = 'my-awesome-stream';
const StartingSequenceNumber =
  '49601189658422359378836298521827638475320189012309704722';

const { ShardIterator } = await kinesis
  .getShardIterator({
    ShardId,
    ShardIteratorType,
    StreamName,
    StartingSequenceNumber,
  })
  .promise();

const records = await kinesis
  .getRecords({
    ShardIterator,
  })
  .promise();

console.log('Records', records);
NOTE: don't forget to make sure your process has permission to call 1) kinesis:GetShardIterator and 2) kinesis:GetRecords.
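For reference, a minimal IAM policy sketch granting just those two actions (the stream ARN is the example one from the batch info above; replace it with your own):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kinesis:GetShardIterator",
        "kinesis:GetRecords"
      ],
      "Resource": "arn:aws:kinesis:us-east-2:123456789012:stream/mystream"
    }
  ]
}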
Hope that helps!
I am using ASP.NET Boilerplate with Code-First Entity Framework and MVC 5.
In order to send a notification I am using the following code:
public async Task SendNotification(Guid auditId, int auditNo, int? tenantId, long userId)
{
    var notificationData = new LocalizableMessageNotificationData(
        new LocalizableString(
            "NotificationName",
            EODAConsts.LocalizationSourceName
        )
    );
    notificationData["auditNo"] = auditNo;
    notificationData["auditId"] = auditId;

    await _notificationPublisher.PublishAsync(NotificationName, notificationData, severity: NotificationSeverity.Error, userIds: new[] { new UserIdentifier(tenantId, userId) });
}
We know that sending the notification means adding it to AbpTenantNotifications and AbpUserNotifications, but after sending it, what is the way to retrieve the inserted notification id in AbpTenantNotifications, given that the PublishAsync method doesn't return any value?
In other words, what is the unique key in the AbpTenantNotifications table that ensures selecting the specific notification inserted by the call to PublishAsync?
NotificationInfo only persists in the table for a short time.
When you call PublishAsync, a NotificationInfo is created immediately (see here).
Subsequently, it is consumed by NotificationDistributor.DistributeAsync and deleted right after the NotificationInfo is converted into TenantNotification and UserNotification (see here).
If you want to capture the TenantNotification when it is created, you can try an entity event handler (see here).
I am using Direct Line v3 for testing out a bot inside MS Teams.
This is a bot showing some messages inside MS Teams.
Is there a way to read all the messages which are already posted to the bot without knowing their respective conversation IDs? How can I read all the conversations from the bot shown in the attached screenshot?
On the bot side, if we want to save and retrieve all the conversation history, in C# we can implement the IActivityLogger interface and log the data in Task LogAsync(IActivity activity), for example:
public class ActivityLogger : IActivityLogger
{
    public Task LogAsync(IActivity activity)
    {
        IMessageActivity msg = activity.AsMessageActivity();
        // log here
        return Task.CompletedTask;
    }
}
So if you save data in Azure SQL Database, you can refer to Saving Bot Activities in Azure SQL Database, and here are some official examples.
Then in Node.js, you can intercept and log messages using middleware:
bot.use({
  botbuilder: function (session, next) {
    myMiddleware.logIncomingMessage(session, next);
  },
  send: function (event, next) {
    myMiddleware.logOutgoingMessage(event, next);
  }
});
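myMiddleware is not defined in that snippet; a minimal sketch of what the two helpers could look like, just logging to the console (swap the console calls for whatever storage you use):

const myMiddleware = {
  logIncomingMessage: function (session, next) {
    // session.message is the incoming activity in Bot Builder v3
    console.log('Incoming:', session.message.text);
    next();
  },
  logOutgoingMessage: function (event, next) {
    // event is the outgoing activity the bot is about to send
    console.log('Outgoing:', event.text);
    next();
  }
};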