When I retrieve an object using get(), I get a normal result.
Code:
Contact::select(\DB::raw("CONCAT(COALESCE(`name`,''),' ',COALESCE(`last_name`,'')) AS display_name"),'id','name','last_name')->where('id',2382)->get()
Result:
[
"display_name" => "OFNA • CASA "
"id" => 2382
"name" => "OFNA • CASA"
"last_name" => null
]
But if I call ->pluck() or ->toArray(), I get this result:
[
"display_name" => b"Ofna €¢ Casa "
"id" => 2382
"name" => "OFNA • CASA"
"last_name" => null
]
For some reason the display_name is encoded incorrectly when converting to an array. Is there a way to fix this, or is it a Laravel issue?
Thanks
My Laravel version is 6.8
I made a workaround, but I'm sure there should be a proper fix for this issue.
This is my workaround: map over the get() results, then pluck from the mapped collection:
Contact::where('id', 2382)
    ->get()
    ->map(function ($object) {
        return [
            'name' => $object->name.' '.$object->last_name,
            'id'   => $object->id,
        ];
    })
    ->pluck('name', 'id');
It does work, but I'm sure there's a better way, or maybe this should be reported to Laravel.
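Another option might be to force the character set conversion in SQL itself, so PDO never sees a binary string for the raw expression. This is just a sketch of the idea (I haven't verified it against this exact schema, and it assumes a utf8mb4 connection):
Contact::select(
        \DB::raw("CONVERT(CONCAT(COALESCE(`name`,''),' ',COALESCE(`last_name`,'')) USING utf8mb4) AS display_name"),
        'id', 'name', 'last_name'
    )
    ->where('id', 2382)
    ->get()
    ->pluck('display_name', 'id');
// The CONVERT(... USING utf8mb4) wrapper labels the concatenated value with an
// explicit character set, which should stop it coming back as raw bytes.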
Hopefully someone knows more about this.
Thanks.
Versions:
Firebird 3.0 with PDO
Laravel 7
I'm using Eloquent together with the https://packagist.org/packages/harrygulliford/laravel-firebird package for the database connection.
Note: on Windows Server it works very well, but on Linux it doesn't (tested on CentOS 7 and Ubuntu Server 20.04 LTS).
I'm using Laravel with Firebird and I have problems with fields of type NUMERIC returning wrong values.
Example:
A query that should return 190,65 (190.65) returns 0.0001.
This is the Eloquent query:
Item::selectRaw("CODIGO,DESCRICAO,PRECOVAREJO,PRECOATACADO,PRECOESPECIAL")->get();
This is return in json:
{ "data": [ { "CODIGO": "123456", "DESCRICAO": "DESCRIPTION EXAMPLE", "PRECOVAREJO": "0.0001", "PRECOATACADO": "0.0001", "PRECOESPECIAL": "0.0001" } ] }
Create Table:
CREATE TABLE ITENS ( PRECOVAREJO NUMERIC(15,3), PRECOATACADO NUMERIC(15,3), PRECOESPECIAL NUMERIC(15,3), CODIGO VARCHAR(8) NOT NULL, DESCRICAO VARCHAR(80) );
@Mark Rotteveel:
This is the query built by Laravel's Eloquent:
array:2 [
  0 => array:3 [
    "query" => "select count(*) as "aggregate" from "ITENS" where "ITENS"."DATACANCELAMENTO" is null"
    "bindings" => []
    "time" => 33.17
  ]
  1 => array:3 [
    "query" => "select CODIGO, DESCRICAO, PRECOVAREJO, PRECOATACADO, PRECOESPECIAL from "ITENS" where "ITENS"."DATACANCELAMENTO" is null order by "CODIGO" asc fetch first 10 rows only"
    "bindings" => []
    "time" => 8.39
  ]
]
This is the expected return (shown as an image in the original post).
I solved the problem by updating PHP to version 7.4.16.
The underlying bug was fixed in PHP 7.4.0.
I also created a file in /etc/php.d/30-pdo_firebird.ini with the content: extension = pdo_firebird.
The PHP 7.4.0 changelog (https://www.php.net/ChangeLog-7.php#7.4.0) lists the fix for this bug: https://bugs.php.net/bug.php?id=65690
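For anyone hitting the same thing, a quick sanity check after the upgrade (a minimal sketch, run with the same PHP binary your app uses) is to confirm the version and that the Firebird PDO driver was actually loaded from the new ini file:
<?php
// check_firebird.php - verify the PHP version and the pdo_firebird driver
var_dump(PHP_VERSION);                        // should be 7.4.0 or later
var_dump(extension_loaded('pdo_firebird'));   // true if 30-pdo_firebird.ini was picked up
var_dump(PDO::getAvailableDrivers());         // should include "firebird"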
Thanks to everyone who helped
I'm using the laravel-mysql-spatial package to store geo-coordinates in the database. When I use it in other places, everything works fine, but when I use it in an observer, I get this error:
Call to undefined method Grimzy\LaravelMysqlSpatial\Eloquent\SpatialExpression::getLat()
The following code does not give a Point object:
public function created(Beneficiary $beneficiary)
{
dd($beneficiary);
}
But when I retrieve the data using the created ID as below, it works fine:
public function created(Beneficiary $beneficiary)
{
$beneficiary = Beneficiary::find($beneficiary->id);
dd($beneficiary);
}
But the above is not considered good practice: I already have the object and I'm making another query for it.
Expected result (this is what I get after making the extra call to retrieve the same data):
#attributes: array:19 [▼
"id" => 95
"name" => "Test beneficiary"
"phone" => "80572*****"
"coordinates" => Point {#547 ▼
#lat: 30.3165
#lng: 78.0322
}
This is the result I'm actually getting:
#attributes: array:16 [▼
"name" => "Test beneficiary"
"phone" => "80572*****"
"coordinates" => SpatialExpression {#521 ▼
#value: Point {#510 ▼
#lat: 30.3165
#lng: 78.0322
}
}
Do the following:
$p = Point::fromWKT($elm->coordinates->getSpatialValue());
Now you can use:
$p->getLng()
$p->getLat()
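Putting that into the observer from the question, it could look something like this (a sketch; it assumes the coordinates attribute is still wrapped in a SpatialExpression right after the insert, as shown in the dump above):
use Grimzy\LaravelMysqlSpatial\Types\Point;

public function created(Beneficiary $beneficiary)
{
    // Right after creation the attribute still holds the SpatialExpression,
    // so rebuild a Point from its WKT instead of re-querying the model.
    $point = Point::fromWKT($beneficiary->coordinates->getSpatialValue());

    $lat = $point->getLat();
    $lng = $point->getLng();
}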
Hello, I'm trying to get query results using FosElasticaBundle with the query below. I can't find a working example for filtering out common words like "and" and "or"; it would also be really good if these words were not highlighted. My struggle so far:
$searchForm = $this->createForm(SearchFormType::class, null);
$searchForm->handleRequest($request);

// $queryString holds the text submitted through the search form
$matchQuery = new \Elastica\Query\Match();
$matchQuery->setField('_all', $queryString);

$searchQuery = new \Elastica\Query();
$searchQuery->setQuery($matchQuery);
$searchQuery->setHighlight([
    'fields' => [
        'title'   => new \stdClass(),
        'content' => new \stdClass(),
    ],
    'pre_tags'  => ['<strong>'],
    'post_tags' => ['</strong>'],
    'number_of_fragments' => 0, // 0 = highlight the whole field instead of fragments
]);
Thanks in advance ;)
Do you want (and, or) to be ignored entirely, or just not to carry any weight in your search?
If that's the case, you may want to use stop words on your Elasticsearch index.
Here's a reference.
https://www.elastic.co/guide/en/elasticsearch/guide/current/using-stopwords.html
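For illustration, here is a rough sketch of what the analysis settings could look like if you create the index directly with Elastica (with FosElasticaBundle you would normally declare the equivalent analyzer in the bundle's index settings config instead; the index name 'app', the $client variable, and the built-in _english_ stopword list are assumptions on my part):
$index = $client->getIndex('app');
$index->create([
    'settings' => [
        'analysis' => [
            'analyzer' => [
                // Override the default analyzer so common words like "and" and "or"
                // are dropped at index and search time, and so never get highlighted.
                'default' => [
                    'type'      => 'standard',
                    'stopwords' => '_english_',
                ],
            ],
        ],
    ],
], true); // true = delete and recreate the index if it already exists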
This is kind of a follow up from another one of my questions:
JSON parser in logstash ignoring data?
But this time I feel like the problem is clearer than last time and might be easier for someone to answer.
I'm using the JSON parser like this:
json {
  # Parse all the JSON
  source    => "MFD_JSON"
  target    => "PARSED"
  add_field => { "%{FAMILY_ID}" => "%{[PARSED][platform][family_id][1]}_%{[PARSED][platform][family_id][0]}" }
}
The part of the output for one of the logs in logstash.stdout looks like this:
"FACILITY_NUM" => "1",
"LEVEL_NUM" => "7",
"PROGRAM" => "mfd_status",
"TIMESTAMP" => "2016-01-12T11:00:44.570Z",
MORE FIELDS
There are a whole bunch of fields like the ones above that work when I remove the JSON code. When I add the JSON filter, the whole log just disappears from Elasticsearch/Kibana for some reason. The bit added by the JSON filter is below:
"PARSED" => {
"platform" => {
"boot_mode" => [
[0] 2,
[1] "NAND"
],
"boot_ver" => [
[0] 6,
[1] 1,
[2] 32576,
[3] 0
],
WHOLE LOT OF OTHER VARIABLES
"family_id" => [
[0] 14,
[1] "Hatchetfish"
],
A WHOLE LOT MORE VARIABLES
},
"flash" => [
[0] 131072,
[1] 7634944
],
"can_id" => 1700,
"version" => {
"kernel" => "3.0.35 #2 SMP PREEMPT Thu Aug 20 10:40:42 UTC 2015",
"platform" => "17.0.32576-r1",
"product" => "next",
"app" => "53.1.9",
"boot" => "2013.04 (Aug 20 2015 - 10:33:51)"
}
},
"%{FAMILY_ID}" => "Hatchetfish 14"
Let's pretend the JSON won't work; I'm okay with that for now, but it shouldn't mess with everything else to do with the log in Elasticsearch/Kibana. Also, at the end I've got FAMILY_ID as a field that I added separately using add_field. At the very least that should show up, right?
If someone's seen something like this before, it would be a great help.
Also sorry for spamming almost the same question twice.
SAMPLE LOG LINE:
1452470936.88 1448975468.00 1 7 mfd_status 000E91DCB5A2 load {"up":[38,1.66,0.40,0.13],"mem":[967364,584900,3596,116772],"cpu":[1299,812,1791,3157,480,144],"cpu_dvfs":[996,1589,792,871,396,1320],"cpu_op":[996,50]}
The sample line will be parsed (everything after "load" is JSON), and in stdout I can see that it is parsed successfully, but I don't see it in Elasticsearch.
This is my output code:
elasticsearch {
  hosts       => ["localhost:9200"]
  document_id => "%{fingerprint}"
}
stdout { codec => rubydebug }
A lot of my Logstash filter is in the other question, but I think all the relevant parts are in this question now.
If you want to check it out here's the link: JSON parser in logstash ignoring data?
Answering my own question here. It's not the ideal answer, but if anyone has a problem similar to mine, you can try this out.
json {
  # Parse all the JSON
  source    => "MFD_JSON"
  target    => "PARSED"
  add_field => { "%{FAMILY_ID}" => "%{[PARSED][platform][family_id][1]}_%{[PARSED][platform][family_id][0]}" }
}
That's how I parsed all the JSON before; I kept at the trial and error hoping I'd get it working eventually. I was about to just use a grok filter to get the bits I wanted, which is an option if this doesn't work for you. I came back to this later and thought "What if I removed everything afterwards?" for some reason that I've since forgotten. In the end I did this:
json {
  source       => "MFD_JSON"
  target       => "PARSED_JSON"
  add_field    => { "FAMILY_ID" => "%{[PARSED_JSON][platform][family_id][1]}_%{[PARSED_JSON][platform][family_id][0]}" }
  remove_field => [ "PARSED_JSON" ]
}
So, extract the field or fields you're interested in, and then remove the field made by the parser at the end. That's what worked for me. I don't know exactly why, but my best guess is that the mixed-type arrays inside the parsed target (for example boot_mode holding both 2 and "NAND") were causing Elasticsearch mapping conflicts and the whole document was being rejected, so dropping the parsed field avoids that. It might work for other people too.
I have the following, which I am generating with an XPath query:
<xforms:instance id="Instance">
{//xforms:instance[#id="Instance"][ .= "Code" ]}
<Code>blah</Code>
</xforms:instance>
As it is, it will return the elements where the tag is 'Code' in the instance block. However, I want it to return everything else, not 'Code'.
I have tried variations including:
[ .not(="Code") ]}
[ .!= "Code" ]}
But these don't seem to work. I'd be grateful for your help.
Thanks.
Do you just want the text node?
/xforms:instance/text()
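If the goal is instead to keep everything except the Code element, note that XPath has no .not(=...) syntax; negation is done with the not() function, so the predicate would be written [not(. = "Code")], or you could select the other child elements with *[not(self::Code)]. This is just standard XPath 1.0 syntax, though; I haven't tested it against your XForms setup.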