Logstash: Pass Field Content to Filter

I'm writing a filter and I need to pass a field to it, but I don't know how. I can only specify the name of the field rather than its content:
filter {
  if [path] =~ "access" {
    # ...
    enhance {
      ref => uri   # or "uri", [uri]
    }
  }
}
All of these set ref to the literal string "uri". The config line in the filter is
config :ref, :validate => :string, :required => true
Of course I can do event[ref] in the filter, but passing the content directly seems more elegant. What do I need to do?

If I understand your question correctly, you want to know how to use ref in the Ruby code you've written. Since you've already declared
config :ref, :validate => :string, :required => true
the value is available inside the filter as the instance variable @ref.
For example: event['somefield'] = @ref
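To make the distinction concrete, here is a plain-Ruby sketch (FakeEvent is a hypothetical stand-in for Logstash's event object, not the real class): the config option holds the *name* of a field, and the content is looked up on the event at filter time.

```ruby
# Hypothetical stand-in for a Logstash event: fields behind hash-style access.
class FakeEvent
  def initialize(fields)
    @fields = fields
  end

  def [](name)
    @fields[name]
  end

  def []=(name, value)
    @fields[name] = value
  end
end

# @ref (set from `config :ref`) holds the field *name*, here "uri";
# the field's content is fetched from the event inside the filter:
ref   = 'uri'
event = FakeEvent.new('uri' => '/index.html')

event['enhanced'] = event[ref]   # the content, not the string "uri"
```

So there is no need to pass the content through the config; resolving the name against the event inside the filter achieves the same thing.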

Related

Mongodb ruby driver: edit Collection::View instance filter

When I create Collection::View instance with:
client = Mongo::Client.new('mongodb://127.0.0.1:27017/test')
view = client[:users].find( { name: "Sally" } )
=> #<Mongo::Collection::View:0x69824029475340 namespace='test.users' @filter={"name"=>"Sally"} @options={}>
How I can change filter hash of this instance later? This does not work:
view.filter.merge!("age" => 30)
FrozenError: can't modify frozen BSON::Document
I don't think you can. .filter is a method that takes arguments; it is not a plain hash.
However you might be able to do something like:
view = lambda { |hash| client[:users].find(hash) }
search_params = { name: "Sally" }
view.(search_params)
view.(search_params.merge(foo: 'bar'))
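Since the view's filter document is frozen, another option is to merge non-destructively and build a fresh view from the result. A plain-Ruby sketch of the hash side of this (the client call is shown only as a comment):

```ruby
# Stand-in for the frozen filter document the driver hands back:
filter = { 'name' => 'Sally' }.freeze

# merge! raises because the document is frozen:
begin
  filter.merge!('age' => 30)
rescue FrozenError
  # expected
end

# The non-destructive merge returns a new, unfrozen hash:
new_filter = filter.merge('age' => 30)

# A new view can then be built from it, e.g.:
#   view = client[:users].find(new_filter)
```

This keeps the original view untouched, which matches the driver's immutable-view design.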

Upsert hash in mongodb. Remove unused keys

From the Mongo documentation:
If you specify multiple field-value pairs, $set will update or create
each field.
I have a mongoid document like this:
class MyCounter
  include Mongoid::Document
  field :date, type: Date
  field :properties, type: Hash
end
and when I try to change the properties like this:
hash = { 'properties.0' => 50, 'properties.1' => 100 }
MyCounter.where(something).find_and_modify(
  { '$set' => hash }, { 'upsert' => 'true', new: true }
)
it keeps the old keys in the properties hash.
What's the correct way to completely replace a hash in a document (and create a new document if it doesn't exist)?
EDIT
I'm currently doing this, which is dumb:
MyCounter.where(
  date: date
).find_and_modify(
  { '$unset' => { properties: nil } }, { 'upsert' => 'true', new: true }
)
MyCounter.where(
  date: date
).find_and_modify(
  { '$set' => hash }, { 'upsert' => 'true', new: true }
)
Just don't use $set. If you pass only field: value pairs, all fields are replaced (except the _id field). Note that a replacement document can't use dot-notation keys, so the hash must be shaped like { 'properties' => { '0' => 50, '1' => 100 } }.
This should work fine:
MyCounter.where(
  date: date
).find_and_modify(
  hash, { 'upsert' => 'true', new: true }
)
http://docs.mongodb.org/manual/reference/method/db.collection.update/#example-update-replace-fields
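To see why the two update styles differ, here is a plain-Ruby sketch of the semantics (the real work happens server-side in MongoDB; this just mimics the two behaviors on hashes):

```ruby
doc      = { '_id' => 1, 'properties' => { '0' => 10, '1' => 20, '2' => 30 } }
set_hash = { 'properties.0' => 50, 'properties.1' => 100 }

# $set with dot-notation keys only touches the named sub-fields:
set_result = Marshal.load(Marshal.dump(doc))   # deep copy
set_hash.each do |dotted, value|
  field, key = dotted.split('.', 2)
  set_result[field][key] = value
end
# 'properties.2' survives: {"0"=>50, "1"=>100, "2"=>30}

# A replacement document swaps everything except _id:
replacement    = { 'properties' => { '0' => 50, '1' => 100 } }
replace_result = { '_id' => doc['_id'] }.merge(replacement)
# 'properties.2' is gone
```

This is why the $unset/$set two-step felt necessary with $set, and why a plain replacement document does the job in one round trip.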

How to stop DataMapper from double query when limiting columns/fields?

I'm not sure if I'm at fault here or if my approach is wrong with this.
I want to fetch a user (limiting columns/fields only to name, email, id):
@user = User.first(:api_key => request.env["HTTP_API_KEY"], :fields => [:id, :name, :email])
The output in the command line is correct as follows:
SELECT "id", "name", "email" FROM "users" WHERE "api_key" = '90e20c4838ba3e1772ace705c2f51d4146656cc5' ORDER BY "id" LIMIT 1
Directly after the above query, I have this code:
render_json({
  :success => true,
  :code => 200,
  :user => @user
})
render_json() looks like this, nothing special:
def render_json(p)
  status p[:code] if p.has_key?(:code)
  p.to_json
end
The problem at this point is that the @user variable contains the full user object (all other fields included), and DataMapper has made an additional query to the database to fetch the fields not included in the :fields constraint. From the logs:
SELECT "id", "password", "api_key", "premium", "timezone", "verified", "notify_me", "company", "updated_at" FROM "users" WHERE "id" = 1 ORDER BY "id"
My question is this: how do I stop DM from performing the additional query? I know it has to do with its lazy-loading architecture, and that returning the @user variable as JSON assumes I want the whole user object. In particular, I don't want the password field to be visible in any output representation of the user object.
The same behaviour can be seen when using DM's own serialisation module.
I think you should use an intermediate object for JSON rendering.
First, query the user from the database:
db_user = User.first(:api_key => request.env["HTTP_API_KEY"], :fields => [:id, :name, :email])
Then create a "JSON object" to represent this user:
@user = { id: db_user.id, name: db_user.name, email: db_user.email }
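The pattern can be tested without DataMapper at all. A runnable sketch (DbUser and public_user_hash are illustrative names, not DataMapper API) showing that the whitelist hash never touches the sensitive fields, so no lazy load is triggered and the password cannot leak into the JSON:

```ruby
require 'json'

# Hypothetical stand-in for the DataMapper record:
DbUser = Struct.new(:id, :name, :email, :password)

# Whitelist only the fields that may leave the API:
def public_user_hash(db_user)
  { id: db_user.id, name: db_user.name, email: db_user.email }
end

db_user = DbUser.new(1, 'Alice', 'alice@example.com', 's3cret')
json = public_user_hash(db_user).to_json
# the password never reaches the serialized output
```

Because only the three whitelisted accessors are called, DataMapper has no reason to fetch the remaining columns.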

tire terms filter not working

I'm trying to achieve a "scope-like" function with Tire/Elasticsearch. Why is this not working, even when I have entries with status "Test1" or "Test2"? The results are always empty.
collection = @model.search(:page => page, :per_page => per_page) do |s|
  s.query { all }
  s.filter :terms, :status => ["Test1", "Test2"]
  s.sort { by :"#{column}", "#{direction}" }
end
The method works fine without the filter. Is something wrong with the filter method? I've checked the Tire docs; it should work.
Thanks! :)
Your issue is most probably caused by using the default mapping for the status field, which tokenizes it: downcases it, splits it into words, etc.
Compare these two:
http://localhost:9200/myindex/_analyze?text=Test1&analyzer=standard
http://localhost:9200/myindex/_analyze?text=Test1&analyzer=keyword
The solution in your case is to use the keyword analyzer (or set the field to not_analyzed) in your mapping. If the field were not an "enum" type of data, you could use the multi-field feature.
A working Ruby version would look like this:
require 'tire'

Tire.index('myindex') do
  delete
  create mappings: {
    document: {
      properties: {
        status: { type: 'string', analyzer: 'keyword' }
      }
    }
  }
  store status: 'Test1'
  store status: 'Test2'
  refresh
end

search = Tire.search 'myindex' do
  query do
    filtered do
      query { all }
      filter :terms, status: ['Test1']
    end
  end
end

puts search.results.to_a.inspect
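For intuition, a rough plain-Ruby sketch of what the two analyzers do to the stored value (the real analysis happens inside Elasticsearch, not in Ruby):

```ruby
value = 'Test1'

standard_tokens = value.downcase.split(/\W+/)  # standard: lowercased word tokens
keyword_tokens  = [value]                      # keyword: one exact, untouched token

# A terms filter matches stored tokens exactly, so with the standard
# analyzer the query term "Test1" never matches the stored token "test1":
standard_tokens.include?('Test1')  # no match -- why the results were empty
keyword_tokens.include?('Test1')   # match
```

That is the whole bug: the query term is compared verbatim against tokens that were lowercased at index time.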
Note: It's rarely possible -- this case being an exception -- to offer reasonable advice when no index mappings, example data, etc. are provided.

Not Equal (ne) and OR not returning correct results in Mongomapper scopes

I can't chain these two scopes together in MongoMapper using an OR:
scope :comment_is_nil, where(:comment => nil)
scope :post_not_blank, where(:post.ne => "")
It should return model objects where the comment is nil OR the post is not blank.
This doesn't work:
Model.where("$or" => [:comment_is_nil, :post_not_blank])
Any ideas?
Chaining scopes is an AND operation, so M.comment_is_nil.post_not_blank won't work, as you know. MongoDB's $or syntax looks like this:
Model.where(
  :$or => [
    { :comment => nil },
    { :post.ne => '' }
  ]
)
So you need to give it an array of individual conditions by manually expanding the scopes.
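What "manually expanding the scopes" amounts to is just building the $or document from the raw condition hashes. A plain-Ruby sketch (the hashes mirror the two scopes; names are illustrative):

```ruby
# Raw condition hashes behind the two scopes:
comment_is_nil = { 'comment' => nil }
post_not_blank = { 'post' => { '$ne' => '' } }

# The query document MongoDB expects for an OR of the two:
query = { '$or' => [comment_is_nil, post_not_blank] }
```

Keeping the condition hashes in constants or class methods lets you reuse them both as individual scopes and inside $or queries.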