How to create this event filter in AWS SQS

This is the actual structure of my current JSON data; note that the event objects are wrapped in an array ([]):
{"events":[{"type":"message","message":{"type":"text"}}]}
So basically I just want text-message-type data from SQS, but I don't know what the filter should be.

If you are using YAML with the Serverless Framework, I'd try this:
filterPatterns:
  - body: {events: {type: ["message"], message: {type: ["text"]}}}
In case it helps, I had a similar scenario: I wanted a function to be triggered only when, inside the body of the SQS message, the field "type" has the value "create" or "delete". The following code worked for me:
filterPatterns:
  - body: {type: [create, delete]}
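If you are configuring the Lambda trigger directly rather than through the Serverless Framework, the same pattern goes into the event source mapping's filter criteria. A sketch using the AWS CLI; the mapping UUID is a placeholder for your own:

aws lambda update-event-source-mapping \
  --uuid <your-mapping-uuid> \
  --filter-criteria '{"Filters": [{"Pattern": "{\"body\": {\"events\": {\"type\": [\"message\"], \"message\": {\"type\": [\"text\"]}}}}"}]}'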

GraphQL: how to have it return a flexible, dynamic array, depending on what the marketeer filled in?

We are in a situation where the response to our GraphQL query has to return some dynamic properties of an object. In our case we are not able to predefine all possible properties, so the response has to be dynamic.
As we see it, there are two options to solve this.
const MyType = new GraphQLObjectType({
  name: 'SomeType',
  fields: {
    name: {
      type: GraphQLString,
    },
    elements: {
      /*
      THIS is our special field which needs to return a dynamic object
      */
    },
    // ...
  },
});
As you can see in the example code, elements is the property which has to return a dynamic object. A response, when resolved, could be:
{
  name: 'some name',
  elements: {
    an_unknown_key: {
      some_nested_field: {
        some_other: true,
      },
    },
    another_unknown_prop: 'foo',
  },
}
1) Return an "any" object
We could just return an arbitrary object, so GraphQL would not need to know which fields that object has. But when we declare a field as a GraphQLObjectType, we have to define its fields. Because of this it does not seem possible to tell GraphQL that something is just a plain object.
For this we changed it like this:
elements: {
  type: new GraphQLObjectType({ name: 'elements' }),
},
2) Define the field properties dynamically, since fields can be a function
When we define fields as a function, we can build the field map dynamically. But the fields function would need some information (in our case, the information passed to elements as arguments), and we would need to access it there to build the field object.
Example:
const MyType = new GraphQLObjectType({
  name: 'SomeType',
  fields: {
    name: {
      type: GraphQLString,
    },
    elements: {
      type: new GraphQLObjectType({
        name: 'elements',
        fields: (argsFromElements) => {
          // here we can now access keys from "args"
          const fields = {};
          argsFromElements.keys.forEach((key) => {
            // some logic here ..
            fields[someGeneratedProperty] = someGeneratedGraphQLType;
          });
          return fields;
        },
      }),
      args: {
        keys: {
          type: new GraphQLList(GraphQLString),
        },
      },
    },
    // ...
  },
});
This could work, but the question is whether there is a way to pass the args and/or the resolved object into the fields function.
Question
So our question is: which approach is recommended in GraphQL, and is solution 1 or 2 even possible? Or is there another solution?
Edit
Solution 1 would work when using a GraphQLScalarType. Example:
type: new GraphQLScalarType({
  name: 'elements',
  serialize(value) {
    return value;
  },
}),
I am not sure if this is a recommended way to solve our situation.
Neither option is really viable:
GraphQL is strongly typed. GraphQL.js doesn't support some kind of "any" field, and all object types defined in your schema must have fields defined. If you look in the docs, fields is required -- if you try to leave it out, you'll hit an error.
Args are used to resolve queries on a per-request basis. There's no way to pass them back to your schema. Your schema is supposed to be static.
As you suggest, it's possible to accomplish what you're trying to do by rolling your own custom scalar. I think a simpler solution would be to just use JSON -- you can import a ready-made custom scalar for it, such as the one from the graphql-type-json package. Then just have your elements field resolve to a JSON object or array containing the dynamic fields. You could also manipulate the JSON object inside the resolver based on arguments if necessary (if you wanted to limit the fields returned to a subset defined in the args, for example).
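For example, a minimal sketch assuming the graphql-type-json package (its default export is the JSON scalar):

const { GraphQLObjectType, GraphQLString } = require('graphql');
const GraphQLJSON = require('graphql-type-json');

const MyType = new GraphQLObjectType({
  name: 'SomeType',
  fields: {
    name: { type: GraphQLString },
    elements: {
      // resolves to any JSON value; the nested fields are not type-checked
      type: GraphQLJSON,
      resolve: (obj) => obj.elements,
    },
  },
});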
Word of warning: the issue with utilizing JSON, or any custom scalar that includes nested data, is that you limit the client's flexibility in requesting what it actually needs. It also results in less helpful errors on the client side -- I'd much rather be told that the field I requested doesn't exist or returned null when I make the request than find out later down the line that the JSON blob I got didn't include a field I expected it to.
One more possible solution is to declare any such dynamic object as a string: pass a stringified version of the object as the field's value from your resolver function, and then parse that string back into a JSON object on the client side.
I'm not sure if this is a recommended way or not, but I tried this approach and it worked smoothly, so I'm sharing it here.
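A rough sketch of what I mean, reusing the elements field from the earlier examples:

elements: {
  type: GraphQLString,
  // serialize the dynamic object to a string on the server
  resolve: (obj) => JSON.stringify(obj.elements),
},

// and on the client side, turn it back into an object:
const elements = JSON.parse(response.data.someType.elements);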

UI Router: get params of "to" and "from" state with $transitions service

I'm trying to use the $transitions service instead of $stateParams for listening to state changes, but I can't get the state params. I'm using a property of StateObject, but instead of getting, for example, {id: 123}, I get {id: e}, where e is an object in which I can't find the value. Can anybody help with this?
$transitions.onStart({ }, function(trans) {
  console.log(trans.$from().params);
});
I noticed that trans.params() returns the "to" state's params.
trans.$from().params will get you the "from" state's parameter declarations.
trans.params('from') will get you their actual values.
Probably what you need is:
$transitions.onStart({ }, function(trans) {
  console.log(trans.params('from'));
});
Please refer to the documentation:
State params
https://ui-router.github.io/ng1/docs/latest/interfaces/state.statedeclaration.html#params
Transition params
https://ui-router.github.io/ng1/docs/latest/classes/transition.transition-1.html#params
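To see the difference side by side, a small sketch:

$transitions.onStart({ }, function(trans) {
  console.log(trans.$from().params);  // parameter declarations of the "from" state
  console.log(trans.params('from'));  // their actual values, e.g. {id: 123}
  console.log(trans.params());        // defaults to the "to" state's values
});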

Retaining File names when using Fine Uploader

I'm trying to implement Fine Uploader for the first time and have a question. I've got the uploader pushing files to my S3 bucket. The issue is that each file is being renamed to some sort of generated string.
Here's an example: 4f65aefe-c55b-42b0-afd4-b749c755e7e8.zip
I'd like to keep the original file name. Is that possible?
Here is the script on the page with my current set of params:
var uploader = new qq.s3.FineUploader({
  element: document.getElementById("fineUploader"),
  request: {
    endpoint: "mybucket.amazonaws.com",
    accessKey: "ABCDEFGHIJKLMNOP"
  },
  signature: {
    endpoint: "/wp-content/themes/zone/vendor/fineuploader/php-s3-server/endpoint.php"
  },
  iframeSupport: {
    localBlankPagePath: "/wp-content/themes/zone/success.html"
  },
  cors: {
    expected: true
  },
  chunking: {
    enabled: true
  },
  resume: {
    enabled: true
  }
});
Am I missing something? Thanks in advance.
Yes, this is expected. By default, Fine Uploader S3 uses a UUID to name your object when sending it to an S3 bucket. In almost all cases, this is the safest behavior: if you change this value, you run the risk of overwriting existing files in the event of a name collision. The original file name is attached to the object as an "x-amz-meta-qqfilename" header.
If you must save the object in S3 using a different name, you can modify the objectProperties.key option appropriately. A value of "filename" will save the object using the original filename. You can also set the value to a function where you can determine the name on-demand, even using values from some other location, provided your key function returns a Promise. Read more about this option at http://docs.fineuploader.com/api/options-s3.html#objectProperties.key.
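For example, building on the configuration above, a minimal sketch:

var uploader = new qq.s3.FineUploader({
  // ... existing request/signature/cors options as before ...
  objectProperties: {
    key: "filename"  // store the object under its original file name instead of a UUID
  }
});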

Where to munge websocket data in Ember Data

I'm writing a web socket service for my Ember app. The service will subscribe to a URL and receive data, which will be pushed into Ember Data's store as models.
The URL scheme does not represent standard RESTful routes; it's not /posts and /users, for example, it's something like /inbound. Once I subscribe it will just be a firehose of various events.
For each of these routes I subscribe to, I will need to implement data munging specific to that route, to get the data into a format Ember Data expects. My question is: where is the best place to do this?
An example event object I'll receive:
event: {
  0: "device:add",
  1: {
    device: {
      devPath: "/some/path",
      label: "abc",
      mountPath: "/some/path",
      serial: "abc",
      uuid: "5406-12F6",
      uniqueIdentifier: "f5e30ccd7a3d4678681b580e03d50cc5",
      mounted: false,
      files: [ ],
      ingest: {
        uniqueIdentifier: 123,
        someProp: 123,
        anotherProp: 'abc'
      }
    }
  }
}
I'd like to munge the data into a standardized form, like this:
device: {
  id: "f5e30ccd7a3d4678681b580e03d50cc5",
  devPath: "/some/path",
  label: "abc",
  mountPath: "/some/path",
  serial: "abc",
  uuid: "5406-12F6",
  mounted: false,
  files: [ ],
  ingestId: 123
},
ingest: {
  id: 123,
  someProp: 123,
  anotherProp: 'abc'
}
and then hand that off to something that knows how to add both the device model and the ingest model to the store. I'm just getting confused by all the abstractions in Ember Data.
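To make it concrete, the munging I have in mind looks roughly like this (a sketch; the function name is made up, the shapes follow the example payloads above):

// hypothetical helper: turns the raw "device:add" event into the standardized shape
function normalizeDeviceAdd(event) {
  var raw = event[1].device;
  var ingest = raw.ingest;
  return {
    device: {
      id: raw.uniqueIdentifier,
      devPath: raw.devPath,
      label: raw.label,
      mountPath: raw.mountPath,
      serial: raw.serial,
      uuid: raw.uuid,
      mounted: raw.mounted,
      files: raw.files,
      ingestId: ingest.uniqueIdentifier
    },
    ingest: {
      id: ingest.uniqueIdentifier,
      someProp: ingest.someProp,
      anotherProp: ingest.anotherProp
    }
  };
}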
Questions:
Which method should I pass that final, standardized JSON to in order to add the records to the store? store.push?
Where is the appropriate place for the initial data munging, i.e. getting the event data out of the array? The application serializer's extractSingle? pushPayload? Most of the munging will be non-standard across the different routes.
Should per-type serializers be used for each key in the data after I've done the initial munging? I.e., should I hand the initial "blob" to the application serializer, which then delegates each key to the per-model serializers?
References:
Docs on the store
RESTSerializer

Twitter typeahead.js remote and search on client

As I understand it, typeahead.js has three ways of fetching data:
Local: hardcoded data
Prefetch: load a local JSON file, or fetch one by URL
Remote: send a query to the backend, which responds with matching results
I want to fetch all the data from the backend and then process it on the client.
The data my server responds with has the following structure:
[{"id": 2, "courseCode": "IDA530", "courseName": "Software Testing", "university": "Lund University"},
 {"id": 1, "courseCode": "IDA321", "courseName": "Computer Security", "university": "Uppsala University"}, ...]
I want it to search on all fields in each entry (id, courseCode, courseName, university).
I want to do more on the client and still fetch only once per user (instead of on every keystroke). I have probably misunderstood something here, so please correct me.
You should re-read the docs. Basically there are two things you need:
Use the prefetch option to bring all the data from the backend to the client only once (that's what you are looking for, if I understand correctly).
Use a filter function to transform those results into datums. Each returned datum can have a tokens field, which is what typeahead searches by, and it can be built from all your data.
Something along the lines of:
$('input.twitter-search').typeahead([{
  name: 'courses',
  prefetch: {
    url: '/url-path-to-server-ajax-that-returns-data',
    filter: function(data) {
      var retval = [];
      for (var i = 0; i < data.length; i++) {
        retval.push({
          value: data[i].courseCode,
          tokens: [data[i].courseCode, data[i].courseName, data[i].university],
          courseCode: data[i].courseCode,
          courseName: data[i].courseName,
          template: '<p>{{courseCode}} - {{courseName}}</p>',
        });
      }
      return retval;
    }
  }
}]);
