MongoDB ruby driver update single field - ruby

https://docs.mongodb.com/ruby-driver/v2.14/tutorials/ruby-driver-crud-operations/#updating
I've been doing this:
videos.find({"id": "c024f2bd"}).update_one({"title": "this is testing"})
When I looked at the database, it had replaced the entire document with just this field; my other fields were all gone. How can I update just a single field? I've read the documentation, and it doesn't seem to have an option parameter where I can say "update this field only, don't replace the document".

You should use $set. Try this:
videos.update_one({"id": "c024f2bd"}, {"$set": {"title": "this is testing"}})
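For context, a minimal sketch of the whole flow with the Ruby driver (the connection details, database name, and collection name are assumptions for illustration):
require 'mongo'
# Assumed connection details for illustration
client = Mongo::Client.new(['127.0.0.1:27017'], database: 'mydb')
videos = client[:videos]
# $set changes only the listed field; passing a plain document would replace the whole document
videos.update_one({ "id": "c024f2bd" }, { "$set": { "title": "this is testing" } })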

Related

Control which multi fields are queried by default

I have a preexisting index that contains field mappings and is currently being queried by many applications. I would like to add additional ways for the data to be queried; specifically, to support full-text search via analysis. Multi-fields seemed like the obvious way to do this, but I found that adding new multi-fields actually changes the existing query behavior.
For example, I have an "id" field that is a keyword. Applications are already using this field to query on. After I add a new multi-field, like "txt" (using the standard analyzer), new documents can be found by querying with just a partial value match. Values for "id" look like "123-abc", so now a query with just "abc" will match when querying against the "id" field. This is not how it worked previously (the keyword-only field would require the entire value "123-abc").
Ideally, the top-level "id" field would be keyword only, and if a "full text" search were required, the query would need to specify "id.txt". So my question is: is there a way to disable multi-fields and require that the query explicitly specify a sub-field when needed?
My only other thought on how to solve this was to use copy_to so that these fields are completely distinct, but that is a bit more work, and there are many, many fields to deal with that would require this.
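For reference, the mapping change described above might be applied as in the following sketch, assuming the elasticsearch-ruby client (the client setup and index name are assumptions):
require 'elasticsearch'
client = Elasticsearch::Client.new # assumed local cluster
# Add the analyzed "txt" multi-field to the existing keyword "id" field
client.indices.put_mapping(
  index: 'my_index', # hypothetical index name
  body: {
    properties: {
      id: {
        type: 'keyword',
        fields: {
          txt: { type: 'text', analyzer: 'standard' }
        }
      }
    }
  }
)
The intent, as described above, is that "id" stays keyword-only and the analyzed variant is only reachable by explicitly querying "id.txt".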

How can I use standard SQL on Elasticsearch text fields without using the special Elasticsearch SQL operators?

I would like to write an SQL query on a text field (not keyword), for example the "name" field, and send that query to the Elasticsearch server.
My problem is that I need to use standard SQL (not the MATCH and QUERY operators, which are specific to Elasticsearch SQL) on text fields.
When I tried to use the JDBC driver, or the high-level Java client, with the LIKE operator, I got the following error:
"No keyword/multi-field defined exact matches for [name]; define one or use MATCH/QUERY instead"
I also tried the Elasticsearch translate API, but even there I couldn't use the LIKE operator on text fields, only on keyword fields.
Does anyone have a solution for me? I want to use the LIKE operator on text fields instead of the full-text operators that are unique to Elasticsearch SQL.
Please check this documentation. It clearly states that this is not possible:
One significant difference between LIKE/RLIKE and the full-text search
predicates is that the former act on exact fields while the latter
also work on analyzed fields. If the field used with LIKE/RLIKE
doesn’t have an exact not-normalized sub-field (of keyword type)
Elasticsearch SQL will not be able to run the query. If the field is
either exact or has an exact sub-field, it will use it as is, or it
will automatically use the exact sub-field even if it wasn’t
explicitly specified in the statement.
If you still want to use a text field, then you need to enable a multi-field as mentioned here. Or you can try enabling fielddata on the text field, but I am not sure whether that will work with SQL or not.
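A sketch of the multi-field option mentioned above, assuming a recent elasticsearch-ruby client and a hypothetical index name; per the quoted documentation, once "name" has an exact keyword sub-field, Elasticsearch SQL will use it for LIKE automatically:
require 'elasticsearch'
client = Elasticsearch::Client.new # assumed local cluster
# Give the existing "name" text field an exact keyword sub-field
client.indices.put_mapping(
  index: 'myindex', # hypothetical index name
  body: {
    properties: {
      name: {
        type: 'text',
        fields: {
          keyword: { type: 'keyword' }
        }
      }
    }
  }
)
# Documents indexed before the mapping change must be reindexed for the sub-field to be populated
response = client.sql.query(body: { query: "SELECT name FROM myindex WHERE name LIKE 'foo%'" })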

Apache NiFi: add a new CSV field based on an existing field, with modification

I have a .csv file with several fields. One of them (for example, the 3rd) contains an email address. How can I add an additional field that will contain only the server name from the email field?
Example:
input:
01;city;name#servername.com;age;
result:
01;city;name#servername.com;age;servername;
I guess it's possible with the ReplaceText processor, but I can't work out the correct values for the "Search Value" and "Replacement Value" properties.
You can convert your flowfile to a record with the help of ConvertRecord.
It allows you to convert to JSON (or another format), whichever you prefer.
Then you can add a new field, like the following, with an UpdateRecord processor:
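A sketch of the UpdateRecord properties (the record paths /email and /servername are assumptions based on the sample row, and the '#' separator is taken from it):
Replacement Value Strategy: Record Path Value
/servername: substringBeforeLast(substringAfterLast(/email, '#'), '.')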
I recommend the following reading:
Update Record tutorial

ElasticSearch: Replace a field value

Is there any way to "replace a field value" with another one using a more or less straightforward process?
Example:
PUT twitter/tweet/{1...n}
{
"user" : "kimchy"
}
I want to replace the user field of documents where the user is "kimchy", setting a new value.
I figure the most straightforward way to do that is:
getting all the documents,
changing them,
saving them again.
I'd like to know whether there is a slightly more elegant way.
Example: update field = "new value" on documents where field is "value". Is there any way to perform this as a single command in ES?
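For illustration, a minimal sketch of that fetch-and-update flow with the elasticsearch-ruby client (the index name and values come from the example; the client setup is an assumption):
require 'elasticsearch'
client = Elasticsearch::Client.new # assumed local cluster
# 1. get the documents where user is "kimchy" (only the first 10 hits by default; a real run would paginate)
hits = client.search(
  index: 'twitter',
  body: { query: { term: { user: 'kimchy' } } }
)['hits']['hits']
# 2. change them and 3. save them again, as a partial update per document
hits.each do |hit|
  client.update(index: 'twitter', id: hit['_id'], body: { doc: { user: 'new value' } })
end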

Multiple Field Search using Lucene Parser with Solr Using Sunspot

I'm using Solr with Sunspot (Ruby), and due to other constraints I have to use the Lucene parser instead of the DisMax parser. I need to be able to search on the username as well as the first_name field at the same time.
If I were using DisMax I could specify qf="username+first_name", but with the Lucene parser I am only able to set df (default field), and it does not allow me to specify more than one field.
How can I search multiple fields using the lucene parser?
Update: the answer is to just use the q parameter:
adjust_solr_params do |params|
params[:defType] = "lucene"
params[:q] = "username:\"#{params[:q]}\" OR first_name:\"#{params[:q]}\""
end
You can use copyField instructions in your schema to create a "catch-all" field from all the fields you want to search on. You then set df to that field.
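As a sketch, the schema additions could look something like this (the field name text_all and the field type text_general are assumptions):
<field name="text_all" type="text_general" indexed="true" stored="false" multiValued="true"/>
<copyField source="username" dest="text_all"/>
<copyField source="first_name" dest="text_all"/>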
To expand on Karussell's comment, the default field is just that, the default. You can explicitly specify however many fields you want, it's only if you don't specify one that the default comes into play.
So a query like username:foo first_name:bar will find documents with a username of "foo" and a first_name of "bar."

Resources