I want to set up an event rule whose event pattern includes custom values, as follows:
{
"source": ["aws.ecr"],
"detail-type": ["ECR Image Action"],
"detail": {
"action-type": ["PUSH", "DELETE"],
"result": ["SUCCESS"],
"git-repository-name": ["github-api-test"], <-- custom
"file-path": ["bin/imgtag.json"] <-- custom
}
}
I want to read these values inside the Lambda handler and use them as follows:
const handler = async (event, context) => {
const { detail } = event;
const filePath = detail['file-path'];
const gitRepoName = detail['git-repository-name'];
}
However, if I put the custom values (file-path, git-repository-name) in the event pattern, the rule no longer matches and the event never fires.
How can I pass these custom values through and use them as above?
I found this answer:
First, click Edit rule, then go to Select targets > Configure input > Input transformer.
Fill it out as follows ($.detail is one of the fields that comes over when an ECR event occurs).
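Roughly, the idea is to pull anything you need from the real event via the input path and hard-code the custom values in the input template. A minimal sketch of what the two fields could look like (the syntax and mappings here are my assumption, not the original values; the exact shape may differ):

Input path:
{ "detail": "$.detail" }

Input template (wrapped in outer quotes so the target receives a JSON string, which matches the JSON.parse call in the handler below):
"{ \"git-repository-name\": \"github-api-test\", \"file-path\": \"bin/imgtag.json\" }"

The event pattern itself should only contain fields that really exist on the ECR event; the custom values are injected by the transformer rather than matched against.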
And then parse the event in the Lambda function as follows:
const handler = async (event, context) => {
const data = JSON.parse(event);
const gitRepoName = data['git-repository-name'];
const filePath = data['file-path'];
console.log(gitRepoName, filePath);
};
Console log output:
github-api-test, bin/imgtag.json
In the Parse Server 3.0 update, request.context was added to allow passing data between beforeSave and afterSave, as documented here:
https://docs.parseplatform.org/cloudcode/guide/#using-request-context
However, I'm having a bit of trouble understanding how and when Parse runs this code in the example.
const beforeSave = function beforeSave(request) {
const { object: role } = request;
// Get users that will be added to the users relation.
const usersOp = role.op('users');
if (usersOp && usersOp.relationsToAdd.length > 0) {
// add the users being added to the request context
request.context = { buyers: usersOp.relationsToAdd };
}
};
const afterSave = function afterSave(request) {
const { object: role, context } = request;
if (context && context.buyers) {
const purchasedItem = getItemFromRole(role);
const promises = context.buyers.map(emailBuyer.bind(null, purchasedItem));
purchasedItem.increment('orderCount', context.buyers.length);
promises.push(purchasedItem.save(null, { useMasterKey: true }));
Promise.all(promises).catch(request.log.error.bind(request.log));
}
};
In other examples, Cloud Code functions are registered via Parse.Cloud.beforeSave or Parse.Cloud.afterSave. In the example above, the function beforeSave is instead assigned to a const beforeSave.
Why was this done, and is this supposed to be placed at the top level of main.js or inside another function?
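For comparison, my understanding of the registration style used in other examples is roughly this (a sketch only; I'm assuming the hooks are meant for the Parse.Role class, which the docs don't state explicitly):

// presumably the constants above still need to be registered somewhere, e.g. at the top level of main.js
Parse.Cloud.beforeSave(Parse.Role, beforeSave);
Parse.Cloud.afterSave(Parse.Role, afterSave);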
We have a .NET Core 2.1 AWS Lambda that I'm trying to hook into our existing logging system.
I'm trying to log through Serilog using a UDP sink to our Logstash instance, for ingestion into our Elasticsearch logging database that is hosted in a private VPC. Running locally through a console, it logs fine, both to the console itself and through UDP into Elastic. However, when it runs as a Lambda, it only logs to the console (i.e. CloudWatch) and doesn't output anything indicating that something is wrong. Possibly because UDP is stateless?
NuGet packages and versions:
Serilog 2.7.1
Serilog.Sinks.Udp 5.0.1
Here is the logging code we're using:
public static void Configure(string udpHost, int udpPort, string environment)
{
var udpFormatter = new JsonFormatter(renderMessage: true);
var loggerConfig = new LoggerConfiguration()
.Enrich.FromLogContext()
.MinimumLevel.Information()
.Enrich.WithProperty("applicationName", Assembly.GetExecutingAssembly().GetName().Name)
.Enrich.WithProperty("applicationVersion", Assembly.GetExecutingAssembly().GetName().Version.ToString())
.Enrich.WithProperty("tags", environment);
loggerConfig
.WriteTo.Console(outputTemplate: "[{Level:u}]: {Message}{NewLine}{Exception}")
.WriteTo.Udp(udpHost, udpPort, udpFormatter);
var logger = loggerConfig.CreateLogger();
Serilog.Log.Logger = logger;
Serilog.Debugging.SelfLog.Enable(Console.Error);
}
// this is output in the console from the lambda, but doesn't appear in the Database from the lambda
// when run locally, appears in both
Serilog.Log.Logger.Information("Hello from Serilog!");
...
// at end of lambda
Serilog.Log.CloseAndFlush();
And here is our UDP input on logstash:
udp {
port => 5000
tags => [ 'systest', 'serilog-nested' ]
codec => json
}
Does anyone know how I might go about resolving this, or even just how to see what specifically is going wrong so that I can start to find a solution?
Things tried so far include:
Pinging Logstash from the Lambda - impossible, the Lambda doesn't have ICMP
Various attempts to get the UDP sink to output errors (as seen above with SelfLog). Even putting in a completely fake address yields no error, though
Adding the Lambda to a VPC where I know logging is possible from
Sleeping at the end of the Lambda so that the logs have time to go through before the Lambda exits
Checking the Logstash logs to see if anything looks odd. It doesn't really, and the fact that local runs get through fine makes me think it's not that
Using UDP directly. It doesn't seem to reach the server; I'm not sure if that's a connectivity issue or just UDP itself from a Lambda
Lots of cursing and swearing
In line with my comment above, you can create a log subscription and stream to ES like so. I'm aware that this is Node.js, so it's not quite the right answer, but you might be able to figure it out from here:
/* eslint-disable */
// Eslint disabled as this is adapted AWS code.
const zlib = require('zlib')
const { Client } = require('@elastic/elasticsearch')
const esClient = new Client({ ES_CLUSTER_DETAILS })
/**
* This is an example function to stream CloudWatch logs to ElasticSearch.
* @param event
* @param context
* @param callback
*/
export default (event, context, callback) => {
context.callbackWaitsForEmptyEventLoop = true
const payload = Buffer.from(event.awslogs.data, 'base64')
zlib.gunzip(payload, (err, result) => {
if (err) {
return callback(err)
}
const logObject = JSON.parse(result.toString('utf8'))
const elasticsearchBulkData = transform(logObject)
// control messages produce no bulk body, so there is nothing to send
if (!elasticsearchBulkData) {
return callback(null, 'skipped control message')
}
const params = { body: elasticsearchBulkData }
esClient.bulk(params, (bulkErr, resp) => {
if (bulkErr) {
return callback(bulkErr)
}
callback(null, 'success')
})
})
}
function transform(payload) {
if (payload.messageType === 'CONTROL_MESSAGE') {
return null
}
let bulkRequestBody = ''
payload.logEvents.forEach((logEvent) => {
const timestamp = new Date(1 * logEvent.timestamp)
// index name format: cwl-YYYY.MM.DD
const indexName = [
`cwl-${process.env.NODE_ENV}-${timestamp.getUTCFullYear()}`, // year
(`0${timestamp.getUTCMonth() + 1}`).slice(-2), // month
(`0${timestamp.getUTCDate()}`).slice(-2), // day
].join('.')
const source = buildSource(logEvent.message, logEvent.extractedFields)
source['@id'] = logEvent.id
source['@timestamp'] = new Date(1 * logEvent.timestamp).toISOString()
source['@message'] = logEvent.message
source['@owner'] = payload.owner
source['@log_group'] = payload.logGroup
source['@log_stream'] = payload.logStream
const action = { index: {} }
action.index._index = indexName
action.index._type = 'lambdaLogs'
action.index._id = logEvent.id
bulkRequestBody += `${[
JSON.stringify(action),
JSON.stringify(source),
].join('\n')}\n`
})
return bulkRequestBody
}
function buildSource(message, extractedFields) {
if (extractedFields) {
const source = {}
for (const key in extractedFields) {
if (extractedFields.hasOwnProperty(key) && extractedFields[key]) {
const value = extractedFields[key]
if (isNumeric(value)) {
source[key] = 1 * value
continue
}
const jsonSubString = extractJson(value)
if (jsonSubString !== null) {
source[`$${key}`] = JSON.parse(jsonSubString)
}
source[key] = value
}
}
return source
}
const jsonSubString = extractJson(message)
if (jsonSubString !== null) {
return JSON.parse(jsonSubString)
}
return {}
}
function extractJson(message) {
const jsonStart = message.indexOf('{')
if (jsonStart < 0) return null
const jsonSubString = message.substring(jsonStart)
return isValidJson(jsonSubString) ? jsonSubString : null
}
function isValidJson(message) {
try {
JSON.parse(message)
} catch (e) { return false }
return true
}
function isNumeric(n) {
return !isNaN(parseFloat(n)) && isFinite(n)
}
One of my colleagues helped me get most of the way there, and then I managed to figure out the last bit.
I updated Serilog.Sinks.Udp to 6.0.0
I updated the UDP setup code to use the AddressFamily.InterNetwork specifier, which I don't believe was available in 5.0.1.
I removed the "tags" enrichment from our log messages, since I believe its presence on the UDP endpoint somehow caused some kind of clash; I've seen it stop logging without a trace before.
And voila!
Here's the new logging setup code:
loggerConfig
.WriteTo.Udp(udpHost, udpPort, AddressFamily.InterNetwork, udpFormatter)
.WriteTo.Console(outputTemplate: "[{Level:u}]: {Message}{NewLine}{Exception}");
I have this code and am failing to understand why I never get inside the map function (where I have the comment "I AM NEVER GETTING TO THIS PART OF THE CODE"):
export const fiveCPMonitoringLoadEpic = (action$, store) =>
action$
.ofType(
FIVE_CP_MONITORING_ACTION_TYPES.LOAD_FIVE_CP_MONITORING_DATA_STARTED
)
.debounceTime(250)
.switchMap(action => {
const params = action.params;
const siteId = { params };
// getting site's EDC accounts (observable):
const siteEdcAccount$ = getSiteEDCAccountsObservable(params);
const result$ = siteEdcAccount$.map(edcResponse => {
// getting here - all good so far.
const edcAccount = edcResponse[0];
// creating another observable (from promise - nothing special)
const fiveCPMonitoringEvent$ = getFiveCPAndTransmissionEventsObservable(
{
...params,
edcAccountId: edcAccount.utilityAccountNumber
}
);
fiveCPMonitoringEvent$.subscribe(x => {
// this is working... I am getting to this part of the code
// --------------------------------------------------------
console.log(x);
console.log('I am getting this printed out as expected');
});
return fiveCPMonitoringEvent$.map(events => {
// I NEVER GET TO THIS PART!!!!!
// -----------------------------
console.log('----- forecast-----');
// according to response - request the prediction (from the event start time if ACTIVE event exists, or from current time if no active event)
const activeEvent = DrEventUtils.getActiveEvent(events);
if (activeEvent) {
// get event start time
const startTime = activeEvent.startTime;
// return getPredictionMeasurementsObservable({...params, startTime}
const predictions = getPredictionMock(startTime - 300);
return Observable.of(predictions).delay(Math.random() * 2000);
} else {
// return getPredictionMeasurementsObservable({...params}
const predictions = getPredictionMock(
DateUtils.getLocalDateInUtcSeconds(new Date().getTime())
);
return Observable.of(predictions).delay(Math.random() * 2000);
}
});
Can someone please shed some light here?
Why does it work when using subscribe, but not when using map on the observable?
Isn't map supposed to be invoked every time the observable fires?
Thanks,
Jim.
Until you subscribe to your observable, it is cold and does not emit values. Once you subscribe to it, the map will be invoked. This is a feature of RxJS meant to avoid doing work whose results no consumer uses. There are numerous blog posts on the subject; search 'cold vs hot observables' on Google.
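A minimal sketch of the difference, in the RxJS 5 style the epic above uses:

import { Observable } from 'rxjs/Observable';
import 'rxjs/add/observable/of';
import 'rxjs/add/operator/map';

const source$ = Observable.of(1, 2, 3);

// map alone only describes a transformation; nothing runs yet,
// so nothing is logged at this point
const mapped$ = source$.map(x => {
  console.log('mapping', x);
  return x * 2;
});

// only once something subscribes does the chain execute,
// and the map callback finally runs for each value
mapped$.subscribe(x => console.log('received', x));

In the epic above, the explicit subscribe call is what makes fiveCPMonitoringEvent$ emit; the observable built with .map(events => ...) is presumably never subscribed to (directly or by the epic middleware), which is why its callback never fires.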
In App.xaml.cs I have the following code:
private async void OnCommandsRequested(SettingsPane settingsPane, SettingsPaneCommandsRequestedEventArgs e)
{
var loader = ResourceLoader.GetForCurrentView();
var generalCommand = new SettingsCommand("General Settings", "General Settings", handler =>
{
var generalSettings = new GeneralSettingsFlyout();
generalSettings.Show();
});
e.Request.ApplicationCommands.Add(generalCommand);
object data;
IAuthService _authService = new AuthService();
if (Global.UserId == 0)
data = await _authService.GetSettingValueBySettingName(DatabaseType.GeneralDb, ApplicationConstants.GeneralDbSettingNames.ShowSupportInfo);
else
data = await _authService.GetSettingValueBySettingName(DatabaseType.UserDb, ApplicationConstants.UserDbSettingNames.ShowSupportInfo);
if (data != null && data.ToString().Equals("1"))
{
var supportCommand = new SettingsCommand("Support", "Support", handler =>
{
var supportPane = new SupportFlyout();
supportPane.Show();
});
e.Request.ApplicationCommands.Add(supportCommand);
}
var aboutCommand = new SettingsCommand("About", loader.GetString("Settings_OptionLabels_About"), handler =>
{
var aboutPane = new About();
aboutPane.Show();
});
e.Request.ApplicationCommands.Add(aboutCommand);
}
This code adds the "General Settings" command, but neither the "Support" nor the "About" command. Can anyone advise what's wrong with this code?
Instead of querying the commands from your service when they are requested, you'll need to query them ahead of time and then add the already-known commands.
You cannot use await in OnCommandsRequested.
A method returns when it gets to the first await, so only commands added to the request before the await will be used.
Since the SettingsPaneCommandsRequestedEventArgs doesn't provide a deferral there is no way to tell the requester to wait for internal async calls to complete.
Note also that SettingsPane is deprecated and not recommended for new app development for Windows 10.
In the Backbone.js documentation it says:
To make a handy event dispatcher that can coordinate events among different areas of your application: var dispatcher = _.clone(Backbone.Events)
Can anyone explain how to implement the dispatcher to communicate from one view to another? Where do I have to place the code in my app?
Here is a good article about using an event aggregator.
Can anyone explain how to implement the dispatcher to communicate from one view to another? Where do I have to place the code in my app?
You will probably have some kind of App Controller object, which will control the flow of the app, creating views, models, etc. This is also a good place for the event aggregator.
From my point of view, I think that article explains it pretty well.
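A minimal sketch of that idea (the view names and event name below are just illustrative):

// shared event aggregator, cloned from Backbone.Events as the docs suggest
var dispatcher = _.clone(Backbone.Events);

var SearchView = Backbone.View.extend({
  events: { 'click button': 'onSearch' },
  onSearch: function () {
    // any view can broadcast through the shared dispatcher...
    dispatcher.trigger('search:submitted', this.$('input').val());
  }
});

var ResultsView = Backbone.View.extend({
  initialize: function () {
    // ...and any other view can react without holding a reference to the sender
    this.listenTo(dispatcher, 'search:submitted', this.showQuery);
  },
  showQuery: function (query) {
    this.$el.text('Searching for: ' + query);
  }
});

// the app controller (or router) creates the views and is a natural owner for the dispatcher
var searchView = new SearchView({ el: '#search' });
var resultsView = new ResultsView({ el: '#results' });

The key point is that neither view knows about the other; they only share the dispatcher object.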
Recently I needed an EventDispatcher to handle a large number of events without losing track of their names and their behavior.
Perhaps it helps you too.
Here is a simple example View:
define(['backbone', 'underscore', 'eventDispatcher'],
function(Backbone, _, dispatcher){
new (Backbone.View.extend(_.extend({
el: $('#anyViewOrWhatever'),
initialize: function () {
window.addEventListener('resize', function () {
// trigger event
dispatcher.global.windowResize.trigger();
});
// create listener
dispatcher.server.connect(this, this.doSomething);
// listen only once
dispatcher.server.connect.once(this, this.doSomething);
// remove listener:
dispatcher.server.connect.off(this, this.doSomething);
// remove all listener dispatcher.server.connect from this:
dispatcher.server.connect.off(null, this);
// remove all listener dispatcher.server.connect with this method:
dispatcher.server.connect.off(this.doSomething);
// remove all listener dispatcher.server.connect no matter what and where:
dispatcher.server.connect.off();
// do the same with a whole category
dispatcher.server.off(/* ... */);
// listen to all server events
dispatcher.server.all(this, this.eventWatcher);
},
doSomething: function(){
},
eventWatcher: function(eventName){
}
})
))();
});
Here is the EventDispatcher with some example events. The events themselves are predefined in the template object; your IDE should recognize them and guide you through the list.
As you can see, the dispatcher runs on its own. Only your View (or whatever else listens) needs the underlying Event methods from Backbone.
// module eventDispatcher
define(['backbone', 'underscore'], function (Backbone, _) {
var instance;
function getInstance () {
if ( !instance ) {
instance = createInstance();
}
return instance;
}
return getInstance();
function createInstance () {
// dummy function for your ide, will be overwritten
function e (eventContext, callback) {}
var eventHandler = {},
// feel free to put the template in another module
// or even more split them in one for each area
template = {
server: {
connect: e,
disconnect: e,
login: e,
logout: e
},
global: {
windowResize: e,
gameStart: e
},
someOtherArea: {
hideAll: e
}
};
// Create Events
_.each(template, function (events, category) {
var handler = eventHandler[category] = _.extend({}, Backbone.Events);
var categoryEvents = {
// turn off listener from <category>.<**all events**> with given _this or callback or both:
// off() complete purge of category and all its events.
// off(callback) turn off all with given callback, no matter what this is
// off(null, this) turn off all with given this, no matter what callback is
// off(callback, this) turn off all with given callback and this
off: function (callback, _this) {
if(!callback && _this){
// only a context was given: remove every listener in this category bound to it
handler.off(null, null, _this);
}else{
_.each(template[category], function(v, k){
k != 'off' && template[category][k].off(callback, _this);
});
}
}
};
events.all = e;
_.each(events, function (value, event) {
// create new Listener <event> in <category>
// e.g.: dispatcher.global.windowResize(this, fn);
categoryEvents[event] = function (_this, callback) {
_this.listenTo(handler, event, callback);
};
// create new Listener <event> in <category> for only one trigger
// e.g.: dispatcher.global.windowResize.once(this, fn);
categoryEvents[event].once = function (_this, callback) {
_this.listenToOnce(handler, event, callback);
};
// trigger listener
// e.g.: dispatcher.global.windowResize.trigger();
categoryEvents[event].trigger = function (debugData) {
console.log('**Event** ' + category + '.' + event, debugData ? debugData : '');
handler.trigger(event);
};
// turn off listener from <category>.<event> with given _this or callback or both:
// off() complete purge of category.event
// off(callback) turn off all with given callback, no matter what this is
// off(null, this) turn off all with given this, no matter what callback is
// off(callback, this) turn off all with given callback and this
// e.g.: template.global.onSomething.off(fn, this);
categoryEvents[event].off = function (callback, _this) {
handler.off(event, callback, _this);
}
});
template[category] = categoryEvents;
});
return template;
}
});
The behavior of Backbone's event system is not affected in any way and it can be used as normal.