Parsing a list in Ramda - GraphQL

I am working on a project in which I am getting data from the backend via an API, and I need to parse it using Ramda. I am not sure how to do it. Can anyone help me parse both of the data samples given below?
{
  "data": [
    {
      "id": 60009001,
      "userFullName": "",
      "gender": "male",
      "depositDone": 1,
      "familyId": 60009001
    }
  ]
}
[
  {
    "id": 60009001,
    "gender": "male"
  }
]
I'm getting this data from the backend, and I only need the field "familyId" in the first case and "gender" in the second case.
How can I parse it using Ramda?

Something along these lines should satisfy the requirement:
const { project, propOr } = R

const sampleData = {
  data: [
    {
      id: 60009001,
      userFullName: "",
      gender: "male",
      depositDone: 1,
      familyId: 60009001
    }
  ]
}

// propOr pulls the array out of `data` (defaulting to []),
// project keeps only the requested field from each element
const result = project(['familyId'], propOr([], 'data', sampleData))
console.log(result) // [{ familyId: 60009001 }]
<script src="https://cdnjs.cloudflare.com/ajax/libs/ramda/0.27.1/ramda.min.js"></script>
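The second payload in the question is already a bare array, so the same projection can be applied to it directly; a small sketch reusing the project function from above (secondSample is just an illustrative name for that payload):
const secondSample = [
  {
    id: 60009001,
    gender: "male"
  }
]

// keep only the gender field from each element
const genders = project(['gender'], secondSample)
console.log(genders) // [{ gender: "male" }]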

Related

How to filter data based on whether a field is present or not in an object in GraphQL

Below is my sample data, which is stored in Dgraph:
{
  "data": {
    "comp": [
      {
        "topologytemplate": {
          "namespace": "a",
          "nodetemplates": [
            {
              "name": "a"
            },
            {
              "name": "b"
            }
          ]
        }
      },
      {
        "topologytemplate": {
          "namespace": "c",
          "nodetemplates": [
            {
              "name": "a"
            },
            {
              "name": "b",
              "directives": [
                "a"
              ]
            }
          ]
        }
      }
    ]
  }
}
I want to filter the data so that the result contains only the entries that do not have a "directives" field. How can I filter the data using a GraphQL query?
Currently, I am trying to filter the data as follows:
query {
  comp(func: eq(dgraph.type, "QQQ")) {
    name
    topologytemplate {
      nodetemplates @filter(eq(nodetypename, "a")) {
        name
        directives
      }
    }
  }
}
How can I write a query that checks whether the directives field is present in a nodetemplate?
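Since the func: eq(...) syntax is Dgraph's DQL, one way to express "node templates that do not have a directives field" is the has function inside an @filter; a hedged sketch (the NOT has(directives) filter is the assumption here, the predicate names are taken from the question):
query {
  comp(func: eq(dgraph.type, "QQQ")) {
    name
    topologytemplate {
      # keep only node templates without a directives predicate
      nodetemplates @filter(NOT has(directives)) {
        name
      }
    }
  }
}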

Incorrectly selected data in the query

Only articles that contain the EmailMarketing tag are needed.
I'm probably doing the wrong search on the tag, since it's an array of values rather than a single object, but I don't know how to do it right; I'm just learning GraphQL. Any help would be appreciated.
query:
query {
  enArticles {
    title
    previewText
    tags(where: { name: "EmailMarketing" }) {
      name
    }
  }
}
result:
{
  "data": {
    "enArticles": [
      {
        "title": "title1",
        "previewText": "previewText1",
        "tags": [
          {
            "name": "EmailMarketing"
          },
          {
            "name": "Personalization"
          },
          {
            "name": "Advertising_campaign"
          }
        ]
      },
      {
        "title": "title2",
        "previewText": "previewText2",
        "tags": [
          {
            "name": "Marketing_strategy"
          },
          {
            "name": "Marketing"
          },
          {
            "name": "Marketing_campaign"
          }
        ]
      },
      {
        "title": "article 12",
        "previewText": "article12",
        "tags": []
      }
    ]
  }
}
I believe you first need to have coded an equality operator within your GraphQL schema. There's a good explanation of that here.
Once you add an equality operator - say, for example, _eq - you can use it like this:
query {
  enArticles {
    title
    previewText
    tags(where: { name: { _eq: "EmailMarketing" } }) {
      name
    }
  }
}
Specifically, you would need to create a filter and resolver.
The example here may help.
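As a rough illustration of the "filter and resolver" part, a field-level resolver could apply the _eq argument when resolving tags. A minimal sketch in plain JavaScript; the EnArticle type name, the resolver map layout, and the argument shape are assumptions, not taken from the schema above:
const resolvers = {
  EnArticle: {
    // apply the where argument, if present, to the article's tags array
    tags: (article, { where }) => {
      const wanted = where && where.name && where.name._eq;
      return wanted ? article.tags.filter(tag => tag.name === wanted) : article.tags;
    }
  }
};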

denormalise reverse processStrategy

I have an API that gives out data like this with the attributes in a fields property.
{
  records: [
    {
      id: "123",
      fields: {
        author: {
          id: "1",
          name: "Paul"
        },
        title: "My awesome blog post",
        comments: [
          {
            id: "324",
            commenter: {
              id: "2",
              name: "Nicole"
            }
          }
        ]
      }
    }
  ]
};
When normalizing, I currently handle this with a simple processStrategy: (input, parent, key) => input.fields, but I would like to denormalise this again so that the denormalised entities contain this fields structure, because the API expects it this way.
So far, denormalising my normalised data with const denormalizedData = denormalize([123], [article], normalizedData.entities) omits the fields property:
[
  {
    "author": {
      "id": "1",
      "name": "Paul"
    },
    "title": "My awesome blog post",
    "comments": [
      {
        "id": "324",
        "commenter": {
          "id": "2",
          "name": "Nicole"
        }
      }
    ]
  }
]
I cannot find anything in the API docs on how to add extra processing during denormalisation. Any ideas?
Because processStrategy is intended for pre-processing of data during the normalization process, it is not going to be executed during the denormalization. For your use case, I would not use this feature and simply structure your schemas as follows:
const { schema, denormalize, normalize } = normalizr;

const user = new schema.Entity("users");
const comment = new schema.Entity("comments", { commenter: user });
const commentList = [comment];
const post = new schema.Entity("posts", {
  fields: { author: user, comments: commentList }
});
const postList = [post];

const mockApiResponse = {
  records: [
    {
      id: "123",
      fields: {
        author: {
          id: "1",
          name: "Paul"
        },
        title: "My awesome blog post",
        comments: [
          {
            id: "324",
            commenter: {
              id: "2",
              name: "Nicole"
            }
          }
        ]
      }
    }
  ]
};

const normalizedResponse = normalize(mockApiResponse.records, postList);
const denormalizedResponse = denormalize(
  normalizedResponse.result,
  postList,
  normalizedResponse.entities
);

console.log("normalizedResponse", normalizedResponse);
console.log("denormalizedResponse", denormalizedResponse);
This will give you the result you are looking for. If, for some reason, you need to stick to your current implementation, I would recommend implementing a transform on your request prior to sending it back to the server. As an example, axios solves this with its transformRequest feature.
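If you do need to keep the flattened shape and convert it back on the way out, a hedged sketch of such a re-wrapping helper (the toApiShape name and the assumption that id stays at the top level are illustrative, not from the normalizr docs):
// re-wrap a flattened entity into the API's { id, fields: { ... } } shape
const toApiShape = ({ id, ...fields }) => ({ id, fields });

// e.g. applied before sending the request, with axios:
// axios.post("/records", { records: denormalizedData.map(toApiShape) });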

DynamoDB DocumentClient returns Set of strings (SS) attribute as an object

I'm new to DynamoDB.
When I read data from the table with the AWS.DynamoDB.DocumentClient class, the query works, but I get the result in the wrong format.
Query:
{
  TableName: "users",
  ExpressionAttributeValues: {
    ":param": event.pathParameters.cityId,
    ":date": moment().tz("Europe/London").format()
  },
  FilterExpression: ":date <= endDate",
  KeyConditionExpression: "cityId = :param"
}
Expected:
{
  "user": "boris",
  "phones": ["+23xxxxx999", "+23xxxxx777"]
}
Actual:
{
  "user": "boris",
  "phones": {
    "type": "String",
    "values": ["+23xxxxx999", "+23xxxxx777"],
    "wrapperName": "Set"
  }
}
Thanks!
The unmarshall function from AWS.DynamoDB.Converter is one solution if your data comes back looking like this, for example:
{
  "Attributes": {
    "last_names": {
      "S": "UPDATED last name"
    },
    "names": {
      "S": "I am the name"
    },
    "vehicles": {
      "NS": [
        "877",
        "9801",
        "104"
      ]
    },
    "updatedAt": {
      "S": "2018-10-19T01:55:15.240Z"
    },
    "createdAt": {
      "S": "2018-10-17T11:49:34.822Z"
    }
  }
}
Notice the object/map {} wrapper per attribute, holding the attribute type. That shape means you are using the low-level AWS.DynamoDB class and not the DynamoDB.DocumentClient.
unmarshall will convert a DynamoDB record into a JavaScript object, as stated and backed by AWS. Ref. https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/Converter.html#unmarshall-property
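A minimal usage sketch, assuming a raw response shaped like the example above (the variable names are illustrative):
const AWS = require("aws-sdk");

// response stands for the raw result shown above; unmarshall turns the
// AttributeValue map into a plain object keyed by attribute name
const plain = AWS.DynamoDB.Converter.unmarshall(response.Attributes);
console.log(plain.names); // "I am the name"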
Nonetheless, I faced the exact same use case as yours: having only one attribute of type SET (NS in my case), I had to extract the values manually. Here is a snippet:
// Please notice the <setName>, which represents your set attribute name
ddbTransHandler.update(params).promise().then((value) => {
  value.Attributes[<setName>] = value.Attributes[<setName>].values;
  return value; // or value.Attributes
});
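Applied to the shape from the question, the same manual fix would look something like this (the attribute name is taken from the question; item stands for a row returned by the DocumentClient query):
// item.phones comes back as a Set wrapper; keep only its values array
item.phones = item.phones.values; // ["+23xxxxx999", "+23xxxxx777"]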
Cheers,
Hamlet

Immutable js updating a Map in a List

Pushing nested data into a Map inside a List
Can anyone tell me:
How do I push a task to either of these users (List items), ideally by a specific user id?
Thanks in advance.
My code:
const initialState = Immutable.List([
  Immutable.Map({
    "id": 1,
    "name": "Abe Bell",
    "tasks": [
      {
        "id": 1,
        "title": "Get haircut",
        "status": false
      }
    ]
  }),
  Immutable.Map({
    "id": 2,
    "name": "Chad Dim",
    "tasks": [
      {
        "id": 2,
        "title": "Get real job",
        "status": false
      }
    ]
  })
])
First, the way you're building this structure, the tasks arrays will not be Immutable instances, which I assume is not what you want. You can use Immutable.fromJS to transform all the nested arrays and objects into Immutable instances.
The way your data is structured, you'll have to navigate through the list of users and perform the update when the id matches.
One way of doing that is using map:
const initialState = Immutable.fromJS([
  {
    "id": 1,
    "name": "Abe Bell",
    "tasks": [
      {
        "id": 1,
        "title": "Get haircut",
        "status": false
      }
    ]
  },
  {
    "id": 2,
    "name": "Chad Dim",
    "tasks": [
      {
        "id": 2,
        "title": "Get real job",
        "status": false
      }
    ]
  }
]);

let userId = 2;

let newState = initialState.map(user => {
  if (user.get('id') !== userId) {
    return user;
  }
  return user.update('tasks', tasks => {
    return tasks.push(Immutable.fromJS({
      id: 3,
      title: "new task",
      status: false
    }))
  });
});
Although this will do what you want, I think you should change your data to a map instead of a list if this kind of operation is recurrent in your application. This will make things easier and faster.
const initialState = Immutable.fromJS({
  "1": {
    "id": 1,
    "name": "Abe Bell",
    "tasks": [
      {
        "id": 1,
        "title": "Get haircut",
        "status": false
      }
    ]
  },
  "2": {
    "id": 2,
    "name": "Chad Dim",
    "tasks": [
      {
        "id": 2,
        "title": "Get real job",
        "status": false
      }
    ]
  }
});

let userId = "2";

let newState = initialState.updateIn([userId, 'tasks'], tasks => {
  return tasks.push(Immutable.fromJS({
    id: 3,
    title: "new task",
    status: false
  }));
});
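For completeness, reading the updated tasks back out of the map-keyed state is a one-liner (a small usage sketch based on the newState above):
// getIn walks the nested path; toJS converts back to plain JS for logging
console.log(newState.getIn(["2", "tasks"]).toJS());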
