How to replace a specific item in a collection of items in Laravel

I have a bit of a problem in Laravel. I'm using a collection to which I add new users who have registered for a particular activity. The entries are formatted in the following way:
[
    [
        "ID" => "1",
        "Email" => "test@example.com",
        "Registrations" => [
            "Sports" => [
                ["id" => "457", "title" => "Football"],
                ["id" => "459", "title" => "Rugby"]
            ]
        ]
    ],
    [
        "ID" => "2",
        "Email" => "test2@example.com",
        "Registrations" => [
            "Sports" => [
                ["id" => "457", "title" => "Football"],
                ["id" => "458", "title" => "Badminton"]
            ]
        ]
    ]
]
The issue is that I'm not sure how to go about adding a new sport ID. My original idea was to check whether a particular email had already been registered and then replace the data within the collection. However, each time there was an existing entry, the whole data structure got replaced. Any helpful advice? Below is what I have so far:
return $user_courses->where('ContactID', $user['Registration']['Link']['Contact']['ContactID'])
    ->map(function ($key) use ($user, $sportService) {
        $course_id = $sportService->validateSport($user['Registration']['Link']['Activity']['ContentUri']);
        if ($course_id) {
            $key['Registrations']['Sports'][] = [
                'id'    => $course_id,
                'title' => $user['Registration']['Link']['Activity']['Title']
            ];
        }
        return $key;
    });

First, you can filter the item that you want to replace out of the collection, and then push the new object into it.
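A minimal sketch of that idea with plain PHP arrays shaped like the example above (the `addSport` helper is a hypothetical name, not a Laravel API):

```php
<?php
// Hypothetical helper: append a sport to the matching user's
// "Sports" list instead of replacing the whole structure.
function addSport(array $users, string $email, array $sport): array
{
    foreach ($users as $i => $user) {
        if ($user['Email'] === $email) {
            // Mutate only the nested list; everything else is untouched.
            $users[$i]['Registrations']['Sports'][] = $sport;
        }
    }
    return $users;
}

$users = [
    [
        'ID' => '1',
        'Email' => 'test@example.com',
        'Registrations' => ['Sports' => [['id' => '457', 'title' => 'Football']]],
    ],
];

$users = addSport($users, 'test@example.com', ['id' => '459', 'title' => 'Rugby']);
// The first user now has two entries under Registrations.Sports.
```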

Related

How to apply a rule in Laravel to check that a filename is unique?

In Laravel 8, how can I use validation to check that the file names provided in the array are unique?
"main": [
    {
        "content": "ABC",
        "filename": "recording_1",
        "code": "264",
        "parameters": ""
    },
    {
        "content": "XYZ",
        "filename": "recording_2",
        "code": "264",
        "parameters": ""
    }
    ...more to come
]
Above is the request structure. From that request, I have to check that all the filenames are unique.
How can I achieve this?
You can use the distinct rule:
$validator = Validator::make(
    [
        'main' => [
            [
                "content" => "ABC",
                "filename" => "recording_1",
                "code" => "264",
                "parameters" => ""
            ],
            [
                "content" => "XYZ",
                "filename" => "recording_1",
                "code" => "264",
                "parameters" => ""
            ]
        ]
    ],
    ['main.*.filename' => 'distinct']
);
Then you can check for failures:
if ($validator->fails()) {
    echo "<pre>";
    print_r($validator->errors());
    exit();
}
Output will be
Illuminate\Support\MessageBag Object
(
    [messages:protected] => Array
        (
            [main.0.filename] => Array
                (
                    [0] => The main.0.filename field has a duplicate value.
                )
            [main.1.filename] => Array
                (
                    [0] => The main.1.filename field has a duplicate value.
                )
        )
    [format:protected] => :message
)
Ref: https://laravel.com/docs/8.x/validation#rule-distinct
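The same rule can also be applied directly in a controller via the request's validate helper; this is a sketch, and the action name and surrounding class are assumed:

```php
<?php
// In a controller action: validate that every filename in "main"
// is unique within the request payload before proceeding.
public function store(Request $request)
{
    $validated = $request->validate([
        'main'            => 'required|array',
        'main.*.filename' => 'required|distinct',
    ]);

    // ... proceed with $validated['main']
}
```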

In Elasticsearch, how can I get a field's length using a Painless script?

I'm trying to update a field if it's longer than the existing one in an index, but none of these expressions seems to work for getting the length of the field:
ctx._source.description.length() < params.description.length()
ctx._source.description.size() < params.description.size()
ctx._source.description.length < params.description.length
I get the same error that the methods don't exist. Any idea how to achieve this?
Edit:
This is the error I'm getting:
array:1 [
  "update" => array:5 [
    "_index" => "products_a"
    "_type" => "_doc"
    "_id" => "XjouMXoBeY37PI1TSOQl"
    "status" => 400
    "error" => array:3 [
      "type" => "illegal_argument_exception"
      "reason" => "failed to execute script"
      "caused_by" => array:6 [
        "type" => "script_exception"
        "reason" => "runtime error"
        "script_stack" => array:2 [
          0 => """
            if (ctx._source.description.length()<params.description.length()) {\n
            \t\t\t\t\t\t\t\t
            """
          1 => " ^---- HERE"
        ]
        "script" => "add_new_seller_to_existing_doc"
        "lang" => "painless"
        "caused_by" => array:2 [
          "type" => "illegal_argument_exception"
          "reason" => "dynamic method [java.util.ArrayList, length/0] not found"
        ]
      ]
    ]
  ]
]
Edit 2:
Stored script
$params = [
    'id' => 'add_new_seller_to_existing_doc',
    'body' => [
        'script' => [
            'lang' => 'painless',
            'source' =>
                'if (params.containsKey(\'description\')) {
                    if (!ctx._source.containsKey(\'description\')) {
                        ctx._source.description = params.description;
                    } else if (ctx._source.description.length() < params.description.length()) {
                        ctx._source.description = params.description;
                    }
                }'
        ]
    ]
];
Bulk update command:
$bulk_obj['body'][] = ['update' => ['_id' => $id, '_index' => 'products']];
$bulk_obj['body'][] = ['id' => 'add_new_seller_to_existing_doc', 'params' => ['description'=>'some_description']];
The problem seems to be that the description field is an ArrayList, while you expect it to be a String.
Either use ArrayList.size(), or convert the ArrayList to a String (if it contains only one string) and then use String.length().
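Applying that to the stored script from the question could look like the sketch below. This assumes both the stored field and the incoming param are lists; if the param is a plain string, you would need the conversion route instead:

```php
<?php
// Variant of the stored script that treats description as a list
// and compares element counts with size() instead of length().
$params = [
    'id' => 'add_new_seller_to_existing_doc',
    'body' => [
        'script' => [
            'lang' => 'painless',
            'source' =>
                'if (params.containsKey("description")) {
                    if (!ctx._source.containsKey("description")) {
                        ctx._source.description = params.description;
                    } else if (ctx._source.description.size() < params.description.size()) {
                        ctx._source.description = params.description;
                    }
                }',
        ],
    ],
];
```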

logstash: create fingerprint from timestamp part

I have a problem creating a fingerprint based on the client IP and a timestamp containing date + hour.
I'm using Logstash 7.3.1. Here is the relevant part of my configuration file:
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
  ...
  ruby {
    code => "
      keydate = Date.parse(event.get('timestamp'))
      event.set('keydate', keydate.strftime('%Y%m%d-%H'))
    "
  }
  fingerprint {
    key => "my_custom_secret"
    method => "SHA256"
    concatenate_sources => "true"
    source => [
      "clientip",
      "keydate"
    ]
  }
}
The problem is in the 'ruby' block. I tried multiple methods to compute keydate, but none works without giving me errors.
The last one (using the config file above) gives:
[ERROR][logstash.filters.ruby ] Ruby exception occurred: Missing Converter handling for full class name=org.jruby.ext.date.RubyDateTime, simple name=RubyDateTime
input document
{
      "timestamp" => "19/Sep/2019:00:07:56 +0200",
       "referrer" => "-",
       "@version" => "1",
     "@timestamp" => 2019-09-18T22:07:56.000Z,
    ...
        "request" => "index.php",
           "type" => "apache_access",
       "clientip" => "54.157.XXX.XXX",
           "verb" => "GET",
    ...
           "tags" => [
        [0] "_rubyexception" # generated by the ruby exception above
    ],
       "response" => "200"
}
expected output
{
      "timestamp" => "19/Sep/2019:00:07:56 +0200",
       "referrer" => "-",
       "@version" => "1",
     "@timestamp" => 2019-09-18T22:07:56.000Z,
    ...
        "request" => "index.php",
           "type" => "apache_access",
       "clientip" => "54.157.XXX.XXX",
           "verb" => "GET",
    ...
        "keydate" => "20190919-00", # format: YYYYMMDD-HH
    "fingerprint" => "ab347766ef....1190af",
       "response" => "200"
}
As always, many thanks for all your help !
I advise removing the ruby snippet and using the built-in date filter: https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
What you are doing in the ruby snippet is exactly what the date filter does: extract a timestamp from a field and reconstruct it into your desired format.
Another option (a bit less recommended, but it will also work) is to use grok to extract the relevant parts of the timestamp and combine them in a different manner.
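One ruby-free sketch of the keydate step, relying on Logstash's sprintf syntax (an assumption on my part, not from the answer): since the date filter above already populates @timestamp, a %{+...} Joda-style pattern can format it into a field. Note that sprintf formats @timestamp in UTC, so the hour may differ from the original local time:

```
filter {
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
  # %{+yyyyMMdd-HH} formats @timestamp (in UTC), e.g. "20190918-22"
  mutate {
    add_field => { "keydate" => "%{+yyyyMMdd-HH}" }
  }
}
```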

inner_hits with nested syntax on PHP client for elasticsearch does not seem to work

I am using the PHP client for Elasticsearch (5.2.0) and fail to get the inner_hits results. This is my PHP query (which does not return the inner_hits):
$params = [
    'index' => 'caption_index',
    'type' => 'caption',
    'body' => [
        'query' => [
            'nested' => [
                'path' => 'lines',
                'query' => [
                    'bool' => [
                        'must' => [
                            ['match' => ['lines.content' => 'Totally different text']]
                        ]
                    ]
                ],
                'inner_hits' => []
            ]
        ]
    ],
    'client' => ['ignore' => 404]
];
$results = $client->search($params);
Meanwhile, I am running the same request in Kibana and I do get the answer correctly:
GET /caption_index/caption/_search
{
  "query": {
    "nested": {
      "path": "lines",
      "query": {
        "bool": {
          "must": [
            { "match": { "lines.content": "Totally different text" } }
          ]
        }
      },
      "inner_hits": {}
    }
  }
}
Any idea what the difference is and why the PHP version won't show the results?
I could attach the current results, but it seems like overkill in this case; trust me, the inner hits are not there.
I was having the same issue with the ES PHP API and got it working by including a parameter in the inner_hits array.
For example:
'inner_hits' => ['name' => 'any-name']
You can find which parameters are allowed in the Elasticsearch inner_hits documentation.
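Applied to the query from the question, that suggestion would look like the sketch below; the name 'matched_lines' is arbitrary, and this assumes a running cluster with the same index:

```php
<?php
// Same nested query as above, but inner_hits now carries a name,
// which the answerer reports makes the PHP client return them.
$params = [
    'index' => 'caption_index',
    'type'  => 'caption',
    'body'  => [
        'query' => [
            'nested' => [
                'path'  => 'lines',
                'query' => [
                    'bool' => [
                        'must' => [
                            ['match' => ['lines.content' => 'Totally different text']],
                        ],
                    ],
                ],
                'inner_hits' => ['name' => 'matched_lines'],
            ],
        ],
    ],
];
$results = $client->search($params);
// Inner hits should then appear under hits.hits[n].inner_hits.matched_lines
```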

elasticsearch multiple types and indexes in query

I have a query like this; it's in PHP, using the Elastica library:
$queryDefinition = [
    'query' => [
        'bool' => [
            'must' => [
                [
                    'query_string' => [
                        'default_field' => '_all',
                        'query' => $term
                    ]
                ]
            ],
            'must_not' => [],
            'should' => []
        ]
    ],
    //'from' => 0,
    //'size' => 50,
    'sort' => [],
    'facets' => [
        'types' => [
            'terms' => ['field' => '_type']
        ]
    ]
];
I know you can run a query against multiple indices and types using the REST API.
I would like to run the above query against a variable number of types and indices. Is that possible with the JSON query style?
If not, how do I translate that query to REST style?
Thanks
Here is an example for curl, where sales is my index and order is my type (if you need multiple types, use comma-separated values, e.g. ../sales/order,order1/_search). I used 3 fields from my data, i.e. id, name and status, and a query string on name like 'shoh':
curl -XGET 'localhost:9200/sales/order/_search?pretty=1' -d '
{
  "query": {
    "bool": {
      "must": [ { "query_string": { "query": "shoh*" } } ]
    }
  },
  "fields": ["id", "name"],
  "facets": { "Facet Result": { "terms": { "field": "status" } } }
}
'
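With Elastica itself (the question's library), a multi-index, multi-type search can be sketched as below. The index and type names are made up, and this reflects older Elastica versions that still supported mapping types:

```php
<?php
// Sketch: run the $queryDefinition above against several indices
// and types at once via Elastica's Search object.
$client = new \Elastica\Client();

$search = new \Elastica\Search($client);
$search->addIndices(['sales', 'archive']);   // variable list of indices
$search->addTypes(['order', 'order1']);      // variable list of types

$resultSet = $search->search($queryDefinition);
```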
