We have an ES/Kibana 7.17 stack running and everything works fine so far, but we want to auto-translate a field based on a static table. How is this possible? I remember it was possible via custom field formats in older Kibana versions, but I cannot find how to do it in this one.
e.g.
1 => HR Department
2 => IT Department
3 => Production
etc.
Data is:
Max Muster 3
Data should be
Max Muster 3 Production
P.S. I tried adding a runtime field to the template, but it always complains that the syntax is wrong.
filter {
  translate {
    source => "[dep]"
    target => "[department]"
    dictionary => {
      "1" => "HR Dep"
      "2" => "IT Dep"
      "3" => "Production"
    }
  }
}
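In 7.17, the usual replacement for the old custom field formats is a runtime field; the "Static lookup" field format should also still be available under the data view's field settings if a display-only translation is enough. For the runtime-field route, here is a minimal Painless sketch, assuming the source field is a keyword named dep (my-index is a placeholder; in an index template the runtime block goes under mappings):

PUT my-index/_mapping
{
  "runtime": {
    "department": {
      "type": "keyword",
      "script": {
        "source": "Map depts = ['1': 'HR Department', '2': 'IT Department', '3': 'Production']; String key = doc['dep'].value; if (depts.containsKey(key)) { emit(depts[key]); }"
      }
    }
  }
}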
I've seen similar posts here; however, they're for older Laravel versions, and my query string is also created dynamically.
I get an array of arrays that represent the values I need to use when I construct the sql query string. For example:
[
    ["a" => "a1", "b" => "b1", "c" => "c1"],
    ["a" => "a2", "b" => "b2", "c" => "c2", "d" => "d2"]
]
Then I need to create a complex query that's impossible to write with Laravel's query builder and that uses the dynamic data from above:
SELECT
...
WHERE (a = 'a1' AND b = 'b1' AND c = 'c1')
OR (a = 'a2' AND b = 'b2' AND c = 'c2' AND d = 'd2')
...
From older posts I've seen here, it was mentioned that I can use
$result = DB::select($query_string);
or even DB::statement.
But I'm not sure that's still a good approach in Laravel 8 and above, because it's not in the docs.
And even if it is, it means I'd put the string in as-is, without binding the values to prevent SQL injection.
So how can I do it in this case?
I would go with a closure and a foreach loop, because you can use an array with string keys to define where/orWhere conditions:
$conditions = [
    ["a" => "a1", "b" => "b1", "c" => "c1"],
    ["a" => "a2", "b" => "b2", "c" => "c2", "d" => "d2"]
];

DB::table('your_table')->where(function ($query) use ($conditions) {
    foreach ($conditions as $condition) {
        // Each associative array becomes one "(a = ? AND b = ? ...)" group, OR'ed together
        $query->orWhere($condition);
    }
})->selectRaw($query_string)->get();
If you really need to go with raw SQL, just use the raw methods like selectRaw(), whereRaw(), orderByRaw(), groupByRaw(), etc.: https://laravel.com/api/9.x/Illuminate/Database/Query/Builder.html
And regarding SQL injection: you need to validate your data. https://laravel.com/docs/9.x/validation
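If you do end up hand-building the SQL for DB::select(), the values can still be bound; only the column names need whitelisting, since placeholders can't stand in for identifiers. A rough sketch (your_table is a placeholder and $conditions is the array from the question):

$clauses = [];
$bindings = [];

foreach ($conditions as $condition) {
    // Column names are not bindable; check them against a whitelist first.
    $columns = array_keys($condition);
    $clauses[] = '(' . implode(' AND ', array_map(fn ($c) => "$c = ?", $columns)) . ')';
    $bindings = array_merge($bindings, array_values($condition));
}

$sql = 'SELECT * FROM your_table WHERE ' . implode(' OR ', $clauses);
$result = DB::select($sql, $bindings);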
I have some coordinates that I pass to Elasticsearch from Logstash, but Elastic keeps only 3 decimals, so coordinate-wise I completely lose the location.
When I send the data from Logstash, I can see it got the right value:
{
    "nasistencias" => 1,
    "tiempo_demora" => "15",
    "path" => "/home/elk/data/visits.csv",
    "menor" => "2",
    "message" => "5,15,Parets del Vallès,76,0,8150,41.565505,2.234999575,LARINGITIS AGUDA,11/3/17 4:20,1,38,1,2,POINT(2.2349995750000695 41.565505000000044)",
    "id_poblacion" => 76,
    "@timestamp" => 2017-03-11T04:20:00.000Z,
    "poblacion" => "Parets del Vallès",
    "edad_valor" => 0,
    "patologia" => "LARINGITIS AGUDA",
    "host" => "elk",
    "@version" => "1",
    "Geopoint_corregido" => "POINT(2.2349995750000695 41.565505000000044)",
    "id_tipo" => 1,
    "estado" => "5",
    "cp" => 8150,
    "location" => {
        "lon" => 2.234999575,   <- HERE
        "lat" => 41.565505      <- AND HERE
    },
    "id_personal" => 38,
    "Fecha" => "11/3/17 4:20"
}
But then I get it in Kibana truncated to only 3 decimals (screenshot omitted).
I do the conversion as follows:
mutate {
  convert => { "longitud_corregida" => "float" }
  convert => { "latitude_corregida" => "float" }
}
mutate {
  rename => {
    "longitud_corregida" => "[location][lon]"
    "latitude_corregida" => "[location][lat]"
  }
}
How can I keep all the decimals? With geolocation, one decimal can return the wrong city.
Another question (related)
I add the data to the csv document as follows:
# echo '5,15,Parets del Vallès,76,0,8150,"41.565505","2.234999575",LARINGITIS AGUDA,11/3/17 4:20,1,38,1,2,POINT(2.2349995750000695 41.565505000000044)' >> data/visits.csv
But in the original document, instead of dots there are commas for the coordinates, like this:
# echo '5,15,Parets del Vallès,76,0,8150,"41,565505","2,234999575",LARINGITIS AGUDA,11/3/17 4:20,1,38,1,2,POINT(2.2349995750000695 41.565505000000044)' >> data/visits.csv
But the problem was that the comma was taken as the field separator, and all the data was sent to Elasticsearch wrong (screenshot omitted). Here, the latitude was 41,565505, but that comma made it read 41 as the latitude and 565505 as the longitude. I changed the comma to a dot, but I'm not sure whether the float conversion understands both commas and dots, or only dots. My question is: did I do wrong by changing the comma to a dot? Is there a better way to correct this?
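Changing the comma to a dot was the right call: as far as I know, Logstash's float conversion only understands a dot as the decimal separator, so a decimal comma has to be normalized first. Instead of editing the CSV by hand, this can be done in the pipeline before the convert step; a minimal sketch using your field names:

filter {
  mutate {
    # Normalize decimal commas to dots before converting to float
    gsub => [
      "latitude_corregida", ",", ".",
      "longitud_corregida", ",", "."
    ]
  }
}

Also, if the values are quoted in the CSV as in your echo line, the csv filter's default quote_char (") should already keep the embedded comma from being treated as a field separator.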
Create a geo_point mapping for the lat/lon fields. This will lead to more precise, internally optimized storage in ES and allow you more sophisticated geo queries.
Please keep in mind that you'll need to reindex the data, as mapping changes are not possible afterwards (if there are already docs present that have the fields to change).
Zero-downtime approach (see the sketch after this list):
Create a new index with an optimized mapping (derive it from the current one and make your changes manually)
Reindex the data (at least some docs, for verification)
Empty the new index again
Change the Logstash destination to the new index (consider using aliases)
Reindex the old data into the new index
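A minimal sketch of the two calls involved (index names are placeholders; the mapping must exist before any docs are indexed, because an existing field cannot be changed to geo_point):

PUT visits-v2
{
  "mappings": {
    "properties": {
      "location": { "type": "geo_point" }
    }
  }
}

POST _reindex
{
  "source": { "index": "visits" },
  "dest": { "index": "visits-v2" }
}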
I'm facing a problem with my Logstash configuration; you can find it below.
A Ruby filter removes every dot (".") from my field names. Everything seems to work fine and the result of the filtering looks correct, but Elasticsearch responds with: "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Field name [/ConsumerAdminWebService/getConsumerTransactions.call] cannot contain '.'"}, where getConsumerTransactions.call is one of my field keys.
input {
  http_poller {
    urls => {
      uatBackend1 => {
        method => get
        url => "http://some-url/"
        headers => {
          Accept => "application/json"
        }
      }
    }
    request_timeout => 60
    # Run every minute
    schedule => { cron => "* * * * * UTC" }
    codec => "json"
    metadata_target => "http_poller_metadata"
  }
}
filter {
  ruby {
    init => "
      # Recursively replace '.' with '_' in all hash keys
      def remove_dots(hash)
        return hash if hash.nil?
        new = Hash.new
        hash.each { |k, v|
          v = remove_dots(v) if v.is_a?(Hash)
          v = v.map { |elem| elem.is_a?(Hash) ? remove_dots(elem) : elem } if v.is_a?(Array)
          new[k.gsub('.', '_')] = v
        }
        new
      end
    "
    code => "
      event.instance_variable_set(:@data, remove_dots(event.to_hash))
    "
  }
}
output {
  elasticsearch {
    hosts => ["localhost"]
  }
}
I'm afraid this line of code is not correct: event.instance_variable_set(:@data, remove_dots(event.to_hash)). The resulting data is somehow pinned to the event, but the original data persists unchanged and is what gets delivered to the Elasticsearch API.
I suppose some clarifications are required here:
I use an ES version > 2.0, so dots are not allowed.
The Ruby filter should replace dots with "_", and it works great: the resulting data is fully correct, yet ES replies with the mentioned error. I suspect the filter does not replace the event data but simply adds a new field to the Event object; ES then still reads the original data, not the updated one.
To be honest, Ruby is magic to me :)
If you're using ES version 2.x, it could be a version issue: ES 2.0 doesn't accept field names that contain . dots.
According to this response in the thread:
Field names cannot contain the . character in Elasticsearch 2.0.
As a workaround you might have to mutate (rename) your field names to use _ or - instead of the . dot. This ticket pretty much explains the issue; . dots can again be used in ES versions after 2.x. Hope it helps!
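Alternatively, instead of renaming fields one by one, there is a dedicated de_dot filter plugin for exactly this case. A sketch (the plugin ships separately, so it may need bin/logstash-plugin install logstash-filter-de_dot first, and nested traversal is comparatively expensive):

filter {
  de_dot {
    separator => "_"   # what to replace each dot with
    nested => true     # also rewrite keys of nested fields
  }
}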
Basically, I want to do this:
$locals['companies'] = Company::orderBy('name')->get(['id','name'])->map(function($c) { return [$c->id, $c->name]; })->toArray();
But without such a verbose map function. Isn't there a get-like method that will return flat numeric arrays instead of objects?
To be clear, the output should look like this:
array:4 [
0 => array:2 [
0 => 4
1 => "My Company"
]
1 => array:2 [
0 => 14
1 => "Example Company"
]
2 => array:2 [
0 => 13
1 => "Best Company"
]
3 => array:2 [
0 => 12
1 => "Super Co"
]
]
This is what I mean by 2-tuples: two-element numeric arrays. I know they don't exist in PHP, but the concept is the same; each entry has a fixed length.
There is no out-of-the-box function for this, but Laravel's Collection is Macroable, so you can add your own method to it.
For example, somewhere in your code (like the boot() method of your AppServiceProvider), you can add a new method to the Collection:
// add toIndexedArray method to collections
\Illuminate\Support\Collection::macro('toIndexedArray', function () {
    return array_map('array_values', $this->toArray());
});
Now you can use this new method like any other normal Collection method, so your final code would be:
$locals['companies'] = Company::orderBy('name')->get(['id','name'])->toIndexedArray();
If this is something you need a lot, you can change the PDO fetch mode in config/database.php to PDO::FETCH_NUM. I'm assuming it's possible to change it on-the-fly as well, but the code probably won't look that great. I don't think there's a Laravel command to change it for a single query, I'm afraid.
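For what it's worth, the fetch config option was removed in Laravel 5.4; on newer versions the closest global switch I know of is listening for the StatementPrepared event. A sketch (note this changes every query on the connection to numeric arrays, which will break code expecting objects):

use Illuminate\Database\Events\StatementPrepared;
use Illuminate\Support\Facades\Event;

// e.g. in AppServiceProvider::boot()
Event::listen(StatementPrepared::class, function (StatementPrepared $event) {
    // Rows now come back as numeric arrays instead of stdClass objects
    $event->statement->setFetchMode(\PDO::FETCH_NUM);
});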
Otherwise, since the array is multidimensional, I'm afraid you do need to map over them somehow, and Laravel collections don't work nicely with e.g. ->map('array_values'), which would have been a lot cleaner.
You could wrap it in array_map('array_values', $array), but that seems silly.
At least you could make it a little shorter by changing ->map() to ->transform(); then you don't need to tack ->toArray() onto the end.
Use pluck():
$locals['companies'] = Company::orderBy('name')->pluck('id', 'name')->toArray();
If you need a list for Form::select, this will work:
$locals['companies'] = Company::orderBy('name')->pluck('name', 'id');
You can omit the map function and just do:
$locals['companies'] = Company::orderBy('name')->get(['id','name'])->toArray();
For some reason I just can't get the latest version of Hominid working with groupings in MailChimp.
Here's a snippet of what I'm doing:
info[:GROUPINGS] = { 'name' => 'Locations', 'groups' => 'SomeLocation' }
mailchimp = Hominid::API.new(MAILCHIMP_API_KEY)
list_id = mailchimp.find_list_id_by_name MAILCHIMP_LIST_NAME
mailchimp.list_update_member(list_id, email_value, info)
I've tried seemingly every combination of arrays and hashes to get the groupings working, but I keep getting variations of this error:
<270> "V" is not a valid Interest Grouping id for the list: Test List
Any help would be appreciated!
It seems that it needs an array of hashes:
info[:GROUPINGS] = [ { 'name' => 'Locations', 'groups' => 'SomeLocation' } ]
I hope this helps someone!