CouchDB: Trouble querying a view with a key using rewrites - url-rewriting

In CouchDB I have created a view called "zip"; the map looks like this:
function (doc) {
  if (doc.type == 'zip') {
    emit(doc.zip_code, doc)
  }
}
I then added a bunch of docs related to zip codes; a sample doc looks like this:
{
  "_id": "zip/48114",
  "_rev": "1-990b2c4f682ed0b6a27e2fa0c066c93d",
  "zip_code": 48114,
  "state": null,
  "county": null,
  "rep_code1": "INTL2",
  "rep_code2": "MI1",
  "type": "zip"
}
Now when I query the view directly like so,
http://localhost:5984/partslocator/_design/partslocator/_view/zip?key=48114
I get the row back that I am expecting:
{
  "total_rows": 41683,
  "offset": 20391,
  "rows": [
    {
      "id": "zip/48114",
      "key": 48114,
      "value": {
        "_id": "zip/48114",
        "_rev": "1-990b2c4f682ed0b6a27e2fa0c066c93d",
        "zip_code": 48114,
        "state": null,
        "county": null,
        "rep_code1": "INTL2",
        "rep_code2": "MI1",
        "type": "zip"
      }
    }
  ]
}
I have then set up a vhost and am using rewrites, and my rewrite for 'zip' looks like this.
{from: "/zip/:zip", to: "_view/zip", query: {"key": ":zip"}}
To me this seems like it should be correct; however, when I try to query the view through the rewrite URL, it always returns zero rows.
rewrite url:
http://partslocatordev.com:5984/zip/48114
response:
{
  "total_rows": 41683,
  "offset": 41683,
  "rows": []
}
Am I missing anything here?
Note: I am using rewrites in the same fashion with other views and they work, but I cannot figure out why this one in particular isn't working.

It's likely that the rewriter is querying zip?key="48114" (a JSON string) rather than zip?key=48114 (a number), so the string key never matches the numeric keys the view emits. You can use a formats field in your rewrite rule to declare how the different arguments should be typed. In this case, try this:
{
  from: "/zip/:zip",
  to: "_view/zip",
  query: {"key": ":zip"},
  formats: {
    "zip": "int"
  }
}
Alternatively, in your map function, emit a string as the key rather than a number, like this:
function (doc) {
  if (doc.type == 'zip') {
    emit(String(doc.zip_code), doc)
  }
}
That will also handle cases where the zip code isn't an integer, as in the UK.
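If you switch to string keys, note that a direct query then needs the key as a quoted JSON string, and, assuming the rewriter substitutes :zip as a plain string by default (which is what the behaviour above suggests), the original rewrite rule should then match without the formats field:
http://localhost:5984/partslocator/_design/partslocator/_view/zip?key="48114"
http://partslocatordev.com:5984/zip/48114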

Related

GraphQL filter array containing ALL

I am quite new to GraphQL, and after searching the whole afternoon, I didn't find an answer to a relatively simple problem.
I have two objects in my Strapi backend:
"travels": [
{
"id": "1",
"title": "Bolivia: La Paz y Salar de Uyuni",
"travel_types": [
{
"name": "Culturales"
},
{
"name": "Aventura"
},
{
"name": "Ecoturismo"
}
]
},
{
"id": "2",
"title": "Europa clásica 2020",
"travel_types": [
{
"name": "Clasicas"
},
{
"name": "Culturales"
}
]
}
]
I am trying to get a filter where I search for travels containing ALL the user-selected travel_types.
I then wrote a query like this:
query($where: JSON) {
  travels(where: $where) {
    id # Or _id if you are using MongoDB
    title
    travel_types { name }
  }
}
And the parameters I tried to input for testing:
{
  "where": {
    "travel_types.name_contains": ["Aventura"],
    "travel_types.name_contains": ["Clasicas"]
  }
}
This should return an empty array, because none of the travels have both Aventura and Clasicas travel-types.
But instead it returns the travel with id=2. It seems that only the second filter is taken.
I searched for a query which would behave like Array.every() in JavaScript, but I wasn't able to find one.
Does someone have an idea how to achieve this type of filtering?
Thank you very much.
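For comparison, here is what the Array.every() behaviour mentioned above looks like when applied client-side to already-fetched data (the travels array from the example). This is only an illustrative sketch of the desired semantics, not a Strapi/GraphQL-side solution:

const selected = ["Aventura", "Clasicas"];

// Keep only travels whose travel_types contain every selected name.
const matching = travels.filter(travel =>
  selected.every(name =>
    travel.travel_types.some(type => type.name === name)
  )
);

console.log(matching); // [] for the sample data above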

GraphQL - How to get field types from the retrieved schema?

Knowing the schema (fetched via getIntrospectionQuery), how could I get the type of a particular field?
For example, say I run this query:
query {
  User {
    name
    lastUpdated
    friends {
      name
    }
  }
}
and get this result:
{
  "data": {
    "User": [
      {
        "name": "alice",
        "lastUpdated": "2018-02-03T17:22:49+00:00",
        "friends": []
      },
      {
        "name": "bob",
        "lastUpdated": "2017-09-01T17:08:49+00:00",
        "friends": [
          {
            "name": "eve"
          }
        ]
      }
    ]
  }
}
I'd like to know the types of the fields and construct something like this:
{
  "name": "String",
  "lastUpdated": "timestamptz",
  "friends": "[Friend]"
}
How could I do that without extra requests to the server?
After retrieving the schema, you can build it into a JSON object (if your GraphQL framework does not already do this for you).
Using a JSON parser, you can then retrieve the types of each field.
I won't go into detail, as it depends on the technology you are using.
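As one concrete possibility (this assumes the JavaScript graphql package, i.e. graphql-js, which the answer above does not name), the introspection result can be rebuilt into a schema object and the field types read off it. A minimal sketch:

const { buildClientSchema } = require('graphql');

// introspectionResult is whatever your client got back from running
// getIntrospectionQuery() against the server; buildClientSchema wants
// the `data` portion of that response.
function fieldTypesOf(introspectionResult, typeName) {
  const schema = buildClientSchema(introspectionResult.data);
  const type = schema.getType(typeName);       // e.g. the User object type
  const result = {};
  for (const [name, field] of Object.entries(type.getFields())) {
    result[name] = String(field.type);         // "String", "[Friend]", ...
  }
  return result;
}

// fieldTypesOf(introspectionResult, 'User')
// => { name: "String", lastUpdated: "...", friends: "[Friend]" }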

ElasticSearch URI Search null field

I need to create a query via URI search that filters all data between two dates, and also handles the case where this date field is null.
For example:
I have the field "creation_date" in some objects, but I also want the objects that do not have this field to appear in the result.
I tried something like the query below:
http://localhost//elasticsearch/channels/channel/_search?q=channel.schedule.creation_date:[2018-06-19 TO 2018-12-22] OR channel.schedule.creation_date: NULL
The date comparison part works fine; the problem is getting the NULL values.
Edited
Source sample:
"_source": {
"channel": {
"activated": false,
"approved": false,
"content": "Jvjv",
"creation_date": "2018-06-21T13:06:10.000Z",
"facebookLink": "J jv",
"id": "Kvjvjv",
"instagramId": "Jvjv",
"name": "Kbkbkvk",
"ownerId": "sZtxdhiNbNY9sr2DtiCzlgJfsqb2",
"plan": 0,
"purpose": "Jvjv",
"recurrence": 1,
"segment": "Jvjvjv",
"twitterId": "Jvjv",
"youtubeId": "Jvj"
}
}
}
You can do this using the NOT(_exists_:field_name) constraint.
Can you try this?
http://localhost//elasticsearch/channels/channel/_search?q=channel.schedule.creation_date:[2018-06-19 TO 2018-12-22] OR NOT(_exists_:channel.schedule.creation_date)
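If the URI form gets awkward (the spaces and parentheses need URL encoding), an equivalent request-body search is a bool query with two should clauses, one for the date range and one for documents where the field is missing. A sketch, assuming the same index/type and field as above:

POST /channels/channel/_search
{
  "query": {
    "bool": {
      "should": [
        { "range": { "channel.schedule.creation_date": { "gte": "2018-06-19", "lte": "2018-12-22" } } },
        { "bool": { "must_not": { "exists": { "field": "channel.schedule.creation_date" } } } }
      ],
      "minimum_should_match": 1
    }
  }
}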

mgo with aggregation and grouping

I am trying to perform a query using golang mgo to effectively get distinct values from a join. I understand that this might not be the best paradigm to work with in Mongo.
Something like this:
pipe := []bson.M{
    {
        "$group": bson.M{
            "_id": bson.M{"user": "$user"},
        },
    },
    {
        "$match": bson.M{
            "_id":  bson.M{"$exists": 1},
            "user": bson.M{"$exists": 1},
            "date_updated": bson.M{
                "$gt": durationDays,
            },
        },
    },
    {
        "$lookup": bson.M{
            "from":         "users",
            "localField":   "user",
            "foreignField": "_id",
            "as":           "user_details",
        },
    },
    {
        "$lookup": bson.M{
            "from":         "organizations",
            "localField":   "organization",
            "foreignField": "_id",
            "as":           "organization_details",
        },
    },
}
err := d.Pipe(pipe).All(&result)
If I comment out the $group section, the query returns the join as expected.
If I run it as is, I get NULL.
If I move the $group to the bottom of the pipe, I get an array response with null values.
Is it possible to do an aggregation with a $group (with the goal of simulating DISTINCT)?
The reason you're getting NULL is that your $match stage filters out all of the documents after the $group stage.
After your first $group stage, the documents look like this:
{"_id": { "user": "foo"}},
{"_id": { "user": "bar"}},
{"_id": { "user": "baz"}}
They no longer contain the other fields, i.e. user, date_updated and organization. If you would like to keep their values, you can utilise a Group Accumulator Operator. Depending on your use case you may also benefit from using Aggregation Expression Variables.
As an example using the mongo shell, let's use the $first operator, which simply picks the first occurrence. This may make sense for organization but not for date_updated, so please choose a more appropriate accumulator operator for that field.
{"$group": {
"_id":"$user",
"date_updated": {"$first":"$date_updated"},
"organization": {"$first":"$organization"}
}
}
Note that the above also replaces {"_id":{"user":"$user"}} with simpler {"_id":"$user"}.
Next we'll add a $project stage to rename the _id field from the group operation back to user, carrying the other fields along unmodified.
{"$project": {
"user": "$_id",
"date_updated": 1,
"organization": 1
}
}
Your $match stage can be simplified to just the date_updated filter. We can remove _id as it's no longer relevant at this point in the pipeline, and if you want to make sure that you only process documents that have a user value, you should place that $match before the $group. See Aggregation Pipeline Optimization for more.
So, all of those combined will look something like this:
[
  {"$group": {
      "_id": "$user",
      "date_updated": {"$first": "$date_updated"},
      "organization": {"$first": "$organization"}
  }},
  {"$project": {
      "user": "$_id",
      "date_updated": 1,
      "organization": 1
  }},
  {"$match": {
      "date_updated": {"$gt": durationDays}
  }},
  {"$lookup": {
      "from": "users",
      "localField": "user",
      "foreignField": "_id",
      "as": "user_details"
  }},
  {"$lookup": {
      "from": "organizations",
      "localField": "organization",
      "foreignField": "_id",
      "as": "organization_details"
  }}
]
(I know you're already aware of this.) Lastly, given the schema above with users and organizations collections, depending on your application use case you may want to re-consider embedding some values. You may find 6 Rules of Thumb for MongoDB Schema Design useful.

Filter by languages in 2nd and 4th level keys of a couchdb document

Given the following document in CouchDB....
{
  "_id": "002bafd55b353692a7ab2968074310cc2cbff258",
  "_rev": "1-bc853056ac61d817ae3c4ecb4f81322b",
  "names": [
    { "locale": "en", "value": "Example" },
    { "locale": "de", "value": "Beispiel" },
    { "locale": "fr", "value": "Exemple" }
  ],
  "details": [
    { "locale": "en", "value": "An Example is here" },
    { "locale": "de", "value": "Ein Beispiel ist hier" },
    { "locale": "fr", "value": "Un exemple est ici" }
  ]
}
...how can I write a view that will allow me to return a partial document with
the undesired languages filtered out?
curl ..snip.. '_design/locale_filter/?locale=en,de,fr,it'
curl ..snip.. '_design/locale_filter/?locale=en,fr'
curl ..snip.. '_design/locale_filter/?locale=en'
Should return something looking like this:
{
  "_id": "002bafd55b353692a7ab2968074310cc2cbff258",
  "_rev": "1-bc853056ac61d817ae3c4ecb4f81322b",
  "names": [
    { "locale": "en", "value": "Example" }
  ],
  "details": [
    { "locale": "en", "value": "An Example is here" }
  ]
}
There's also a sub-case where the documents have a further, deeper structure which repeats the names and details structure; these would also be filtered in an ideal world:
{
  "_id": "002bafd55b353692a7ab2968074310cc2cbff258",
  "_rev": "1-bc853056ac61d817ae3c4ecb4f81322b",
  "names": [ ... snip ... ],
  "details": [ ... snip ... ],
  "deeper": {
    "names": [
      { "locale": "en", "value": "Sub-Example" }
    ],
    "details": [
      { "locale": "en", "value": "The Sub-Example is here" }
    ]
  }
}
I also note that this might not be a view but rather a show; the CouchDB documentation says that a show is for transforming documents into any format.
The final question, from a beginner, is whether there's some way to make it easier to work on CouchDB views and design docs. Right now I'm experimenting with erica, which feels like overkill as I'm pretty sure I don't want a couch app; I just want to maintain my views in files on disk and sync them with the couch database whenever I've made significant enough changes.
I was able to implement this using show functions. I implemented two, one of them just for convenience:
(doc, req) ->
  all_locales = []
  for name in doc.names
    all_locales.push name.locale
  toJSON(all_locales)
(In my real code I also run it over details, and remove duplicate locales.)
This allows me to do the following:
GET /_design/dbname/_show/list_locales/c0db9ad..snip..
and returns, for example, ["en", "de", "fr"], whatever locales the document happens to have.
I can then follow up with the function to retrieve the filtered document:
(doc, req) ->
  locales = req.query.locales.split(",")
  doc.names = doc.names.filter (name) ->
    locales.indexOf(name.locale) > -1
  doc.overviews = doc.details.filter (overview) ->
    locales.indexOf(overview.locale) > -1
  return toJSON(doc) + "\n"
The usage pattern for this is:
GET /_design/dbname/_show/restrict_locales/c0db9ad..snip..?locales=en,fr
GET /_design/dbname/_show/restrict_locales/c0db9ad..snip..?locales=fr
GET /_design/dbname/_show/restrict_locales/c0db9ad..snip..?locales=en,fr,de,it,hu,zh
It works quite remarkably well, and was much faster than I expected. I believe the show function results are aggressively cached by CouchDB.
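The show function above doesn't touch the nested deeper structure from the question. A minimal sketch of how the same filtering could be extended to it, written as a plain JavaScript show function (CouchDB's native design-document language); the deeper field comes from the question, and the keep helper is just an illustrative name:

function (doc, req) {
  var locales = req.query.locales.split(",");

  // Keep only entries whose locale is in the requested list.
  var keep = function (entries) {
    return (entries || []).filter(function (entry) {
      return locales.indexOf(entry.locale) > -1;
    });
  };

  doc.names = keep(doc.names);
  doc.details = keep(doc.details);

  if (doc.deeper) {
    doc.deeper.names = keep(doc.deeper.names);
    doc.deeper.details = keep(doc.deeper.details);
  }

  return toJSON(doc) + "\n";
}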
