New record not appearing in elastic search result - elasticsearch

Just after creating a new record, the new record does not appear in the listing API, although it is getting indexed. The record does appear when hitting the index API again. I am using Elasticsearch, implemented through Searchkick. Has anyone faced this issue?
Started POST "/api/v1/pm/projects/4/meetings" for 127.0.0.1 at 2018-10-12 13:15:45 +0530
Processing by Api::V1::Pm::MeetingsController#create as JSON
Parameters: {"meeting"=>{"name"=>"prj 4 meeting 78", "date"=>"12/10/2018", "start_time"=>"01:30 PM", "end_time"=>"02:00 PM", "url"=>"https://asdf.com", "agenda"=>"prj 4 meeting 78 agenda", "notes"=>"", "members"=>["abc#gmail.com"]}, "project_id"=>"4"}
AuthenticationToken Load (0.2ms) SELECT "authentication_tokens".* FROM "authentication_tokens" WHERE "authentication_tokens"."body" = $1 LIMIT $2 [["body", "Eu5fwDmEkLDootjzE3kcUrGi"], ["LIMIT", 1]]
User Load (0.5ms) SELECT "users".* FROM "users" WHERE "users"."id" = $1 AND "users"."is_approved" = $2 AND "users"."is_archived" = $3 LIMIT $4 [["id", 1], ["is_approved", "t"], ["is_archived", "f"], ["LIMIT", 1]]
Partner Load (0.2ms) SELECT "partners".* FROM "partners" WHERE "partners"."code" = $1 LIMIT $2 [["code", "e-ai"], ["LIMIT", 1]]
Project Load (0.6ms) SELECT "projects".* FROM "projects" WHERE "projects"."id" = $1 LIMIT $2 [["id", 4], ["LIMIT", 1]]
(0.2ms) BEGIN
SQL (1.7ms) INSERT INTO "meetings" ("name", "date", "start_time", "end_time", "url", "agenda", "notes", "members", "project_id", "user_id", "created_at", "updated_at") VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12) RETURNING "id" [["name", "prj 4 meeting 78"], ["date", "2018-10-12"], ["start_time", "13:30:00"], ["end_time", "14:00:00"], ["url", "https://asdf.com"], ["agenda", "prj 4 meeting 78 agenda"], ["notes", ""], ["members", "{abc#gmail.com}"], ["project_id", 4], ["user_id", 1], ["created_at", "2018-10-12 07:45:45.133414"], ["updated_at", "2018-10-12 07:45:45.133414"]]
(10.7ms) COMMIT
***Meeting Store (77.8ms) {"id":78}***
[active_model_serializers] Rendered ActiveModel::Serializer::Null with Hash (1.81ms)
Completed 200 OK in 116ms (Views: 3.5ms | Searchkick: 77.8ms | ActiveRecord: 13.9ms)
Started GET "/api/v1/pm/projects/4/meetings" for 127.0.0.1 at 2018-10-12 13:15:45 +0530
Processing by Api::V1::Pm::MeetingsController#index as JSON
Parameters: {"project_id"=>"4", "meeting"=>{}}
AuthenticationToken Load (0.3ms) SELECT "authentication_tokens".* FROM "authentication_tokens" WHERE "authentication_tokens"."body" = $1 LIMIT $2 [["body", "Eu5fwDmEkLDootjzE3kcUrGi"], ["LIMIT", 1]]
User Load (1.4ms) SELECT "users".* FROM "users" WHERE "users"."id" = $1 AND "users"."is_approved" = $2 AND "users"."is_archived" = $3 LIMIT $4 [["id", 1], ["is_approved", "t"], ["is_archived", "f"], ["LIMIT", 1]]
Partner Load (1.0ms) SELECT "partners".* FROM "partners" WHERE "partners"."code" = $1 LIMIT $2 [["code", "e-ai"], ["LIMIT", 1]]
Project Load (0.6ms) SELECT "projects".* FROM "projects" WHERE "projects"."id" = $1 LIMIT $2 [["id", 4], ["LIMIT", 1]]
Meeting Search (16.8ms) curl http://localhost:9200/meetings_development/_search?pretty -H 'Content-Type: application/json' -d '{"query":{"bool":{"must":{"match_all":{}},"filter":[{"term":{"project_id":4}}]}},"sort":{"cancelled_at":{"order":"asc","unmapped_type":"boolean"},"updated_at":{"order":"desc","unmapped_type":"long"}},"timeout":"11s","_source":false,"size":25,"from":0}'
Meeting Load (1.8ms) SELECT "meetings".* FROM "meetings" WHERE "meetings"."id" IN (77, 75, 76, 74, 73, 72, 71, 70)
User Load (0.4ms) SELECT "users".* FROM "users" WHERE "users"."id" = 1
[active_model_serializers] Rendered ActiveModel::Serializer::Null with Hash (13.02ms)
Completed 200 OK in 52ms (Views: 14.4ms | Searchkick: 16.8ms | ActiveRecord: 5.5ms)
As you can see in the logs, the meeting with id 78 is getting indexed, but it does not appear in the index API response.

Elasticsearch is a near-real-time (NRT) search platform: there is a slight latency (normally one second) between the time you index a document and the time it becomes searchable.
Use the bulk API if you can, as it indexes faster than one record at a time. For example, the update settings API can be used to dynamically tune an index for bulk indexing and then move it back to a more real-time state. Before the bulk indexing starts, use:
PUT /twitter/_settings
{
  "index" : {
    "refresh_interval" : "-1"
  }
}
Check the docs on refresh_interval, and see this discussion.
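Once the bulk load finishes, switch the index back to near-real-time behaviour by restoring the refresh interval (1s is Elasticsearch's default):

```
PUT /twitter/_settings
{
  "index" : {
    "refresh_interval" : "1s"
  }
}
```

Since the question uses Searchkick, note that Searchkick also exposes Model.search_index.refresh, which forces an immediate refresh. That is handy in tests, but calling it after every write trades away the batching that makes NRT indexing fast.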

Related

Dynamic Date and Time Filter: After 5:00 PM In Previous Day

I am trying to create a filter that will pull every account that has been set up after 5:00 PM on the previous day. The date and time exist in the same row. I have created a filter that works for a single day, but on the next day it pulls two days' worth. For example, here is what it looks like right now:
= Table.SelectRows(#"Sorted Rows", each [Driver ID] > #datetime(2021, 12, 29, 17, 0, 0))
I have tried changing it to the following so it would dynamically change as the days pass:
= Table.SelectRows(#"Sorted Rows", each DateTime.From([Driver ID]) > Date.AddDays(DateTime.From(Driver ID), -1))
But when I do this I get the following error:
Expression.Error: We cannot convert the value #datetime(2021, 12, 30, 0, 5, 0) to type Function.
Details:
Value=12/30/2021 12:05:00 AM
Type=[Type]
I have made sure the column type is Date/Time, but that doesn't seem to help.
Has anybody run into this issue and found a good solution?
Try:
= Table.SelectRows(#"Sorted Rows", each Date.IsInPreviousNDays([Driver ID], 1) and Time.From([Driver ID]) > #time(17, 0, 0))
Date.IsInPreviousNDays keeps only yesterday's rows, and the Time.From comparison then keeps those after 5:00 PM.

Groupby and sort Pandas

I have a dataframe:
df = pd.DataFrame({
    'Metric': ['Total Assets', 'Total Promo', 'Total Assets', 'Total Promo'],
    'Risk': ['High', 'High', 'Low', 'Low'],
    '2021': [200, 100, 400, 50]})
I want to group by the Metric column and sort by the '2021' column.
I tried:
df = df.sort_values(['2021'],ascending=False).groupby(['Metric', 'Risk'])
But I get the following output:
<pandas.core.groupby.generic.DataFrameGroupBy object at 0x00000213CA672B48>
The output should look like:
df = pd.DataFrame({
    'Metric': ['Total Assets', 'Total Assets', 'Total Promo', 'Total Promo'],
    'Risk': ['Low', 'High', 'High', 'Low'],
    '2021': [400, 200, 100, 50]})
If I understand you right, you want to sort by column "Metric" (ascending) and then "2021" (descending):
df = df.sort_values(["Metric", "2021"], ascending=[True, False])
print(df)
Prints:
Metric Risk 2021
0 Total Assets Low 400
1 Total Assets High 200
2 Total Promo High 100
3 Total Promo Low 50
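A note on the original attempt: chaining .groupby() after sort_values returns a lazy DataFrameGroupBy object (the repr the asker saw) until an aggregation is applied. If the goal is the expected output as shown — groups ordered by their largest '2021' value, rows within each group descending — one sketch uses a temporary helper column (the '_max' name is my own, not a pandas convention):

```python
import pandas as pd

df = pd.DataFrame({
    'Metric': ['Total Assets', 'Total Promo', 'Total Assets', 'Total Promo'],
    'Risk': ['High', 'High', 'Low', 'Low'],
    '2021': [200, 100, 400, 50]})

# Tag each row with its group's largest '2021' value, then sort by that
# group maximum first and by '2021' within the group, both descending.
df['_max'] = df.groupby('Metric')['2021'].transform('max')
out = (df.sort_values(['_max', '2021'], ascending=False)
         .drop(columns='_max')
         .reset_index(drop=True))
print(out)
```

This keeps the rows of each Metric together without ever materializing a GroupBy object, and generalizes even when the group order does not coincide with alphabetical order of Metric.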

Laravel: Is it possible to use eagerloading after pagination?

Eager loading with pagination is simple:
Model::with(['relation1', 'relation2'])->paginate();
There are six models M1, ..., M6, and model M1 has foreign keys to models M2, ..., M6. Each model has at least 2,000,000 records, and model M1 has more than 10,000,000. The following statement
M1::paginate();
is fast enough, but when the relations are included it takes more than 45 seconds to return the results. To improve performance, I need to run M1::paginate(); first and then include the other relations.
My solution is to loop through the collection, gather the ids, and add the relations. I would like to know whether such a thing has already been implemented in Laravel.
Whenever you are unsure about the queries being made, open the console (php artisan tinker) and run the following:
DB::listen(fn($q) => dump([$q->sql, $q->bindings, $q->time]))
For each query you make (in the current console session), you'll get an array containing the SQL, the bindings and the time it actually takes for the database to return the data (this does not take into account how long it takes PHP to turn these results into an Eloquent Collection).
For example, for a Model (A) that has one hasMany relation with another Model (B), look at the output below:
>>> DB::listen(fn($q) => dump([$q->sql, $q->bindings, $q->time]))
=> null
>>> App\Models\A::with('b')->get()->first()->id
array:3 [
0 => "select * from "a""
1 => []
2 => 0.0
]
array:3 [
0 => "select * from "b" where "b"."a_id" in (1, 2, 3, 4, 5,
6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26,
27)"
1 => []
2 => 0.0
]
=> 1
>>> App\Models\A::with('b')->paginate(5)->first()->id
array:3 [
0 => "select count(*) as aggregate from "a""
1 => []
2 => 0.0
]
array:3 [
0 => "select * from "a" limit 5 offset 0"
1 => []
2 => 0.0
]
array:3 [
0 => "select * from "b" where "b"."a_id" in (1, 2, 3, 4, 5)"
1 => []
2 => 0.0
]
As you can see, pagination limits the relationship queries as well: the relation is fetched only for the ids on the current page.
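To the original question directly: what the asker describes (paginate first, then gather the ids and attach relations) already exists in Laravel as lazy eager loading. Eloquent collections have a load() method, and paginators forward collection methods to their underlying collection, so a sketch using the question's placeholder names looks like:

```php
// Paginate first (fast: one page of rows plus a count query),
// then eager-load each relation with a single IN (...) query.
$page = M1::paginate();
$page->load(['relation1', 'relation2']);
```

If the relations are still slow with only one page of ids, the bottleneck is more likely missing indexes on the foreign-key columns than the loading strategy.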

Replacing translations using ActiveAdmin

I'm trying to use Mobility in my Rails application with ActiveAdmin as administration panel.
I use Container backend with JSONB column.
I also have the activeadmin_json_editor gem installed, so it's not possible to produce bad JSON. Inside my admin resource I permit the :translations attribute using StrongParams.
When editing translations using ActiveAdmin and submitting the form I get the following parameters:
2.5.3 (#<Admin::QuestionsController:0x00007fd466a9a690>):0 > permitted_params
=> <ActionController::Parameters {"utf8"=>"✓", "_method"=>"patch", "authenticity_token"=>"DwSuN9M9cD27dR7WmitBSMKKgVjhW1om3xwxOJUhK41no8RWH1Xh6L9QNIhOc1NhPYtm5QnKJWh7KEIUvuehUQ==", "commit"=>"Update Question", "id"=>"37", "question"=><ActionController::Parameters {"translations"=>"{\"en\":{\"body\":\"dupa\"}}", "dimension_id"=>"6"} permitted: true>} permitted: true>
However once the update query gets processed my model has no translations at all:
2.5.3 (#<Admin::QuestionsController:0x00007fd466a9a690>):0 > resource.update(permitted_params["question"])
(0.4ms) BEGIN
↳ (pry):18
Dimension Load (0.4ms) SELECT "dimensions".* FROM "dimensions" WHERE "dimensions"."id" = $1 LIMIT $2 [["id", 6], ["LIMIT", 1]]
↳ (pry):18
(0.3ms) COMMIT
↳ (pry):18
=> true
2.5.3 (#<Admin::QuestionsController:0x00007fd466a9a690>):0 > resource
=> #<Question:0x00007fd45c284d98
id: 37,
body: nil,
translations: {},
created_at: Wed, 16 Jan 2019 12:17:38 UTC +00:00,
updated_at: Fri, 08 Feb 2019 12:07:00 UTC +00:00,
dimension_id: 6>
What am I doing wrong? Should I parse the JSON from the params and use resource.<attribute_name>_backend.write for each locale?
Since I didn't get any answers, I dug around and came up with the following solution. In your resource admin model add:
controller do
  def update
    translations = JSON.parse(permitted_params.dig(resource.class.name.downcase, "translations"))
    translations.each do |locale, attributes|
      supported_attributes = attributes.select { |attribute_name, _| resource.class.mobility_attributes.include?(attribute_name) }
      supported_attributes.each do |attribute_name, translation|
        resource.send("#{attribute_name}_backend").send(:write, locale.to_sym, translation.to_s)
      end
    end
    resource.save
    redirect_to admin_questions_path
  end
end
This is probably not the proper way to mass-update the translations, but I couldn't figure out a better one. Please keep in mind that this doesn't check whether the locale key is valid.
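On the locale-validation caveat: the parsed hash can be filtered against an allowlist before the write loop. A minimal sketch in plain Ruby (ALLOWED_LOCALES is a hypothetical stand-in; in a Rails app, I18n.available_locales.map(&:to_s) would be the natural source):

```ruby
require 'json'

ALLOWED_LOCALES = %w[en de].freeze  # hypothetical allowlist of locale keys

raw = '{"en":{"body":"hello"},"xx":{"body":"bogus"}}'
# Hash#slice (Ruby 2.5+) keeps only the allowlisted locales.
translations = JSON.parse(raw).slice(*ALLOWED_LOCALES)
# Only the "en" entry survives; "xx" is dropped before any write happens.
```

Dropping unknown locales silently is a design choice; raising on them instead would surface typos in the submitted JSON earlier.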

How to compare two Time objects only down to the hour

I want to compare two Time objects only down to the hour, while ignoring the difference of minutes and seconds.
I'm currently using t0.strftime("%Y%m%d%H") == t1.strftime("%Y%m%d%H"), but I don't think this is a good solution.
Is there better way to do this?
You can use this trick in pure Ruby:
t0.to_a[2..9] == t1.to_a[2..9]
where Time#to_a returns:
$> Time.now.to_a
# => [7, 44, 2, 8, 3, 2014, 6, 67, false, "GMT"]
# [ sec, min, hour, day, month, year, wday, yday, isdst, zone ]
So you can check whether the times are equal up to the level you want, without missing important components of the object like the zone.
If you have ActiveSupport (either through Rails, or just installed as a gem), it includes an extension to the Time class that adds a change method which will truncate times:
$> require "active_support/core_ext/time"
# => true
$> t = Time.now
# => 2014-03-07 21:30:01 -0500
$> t.change(hour: 0)
# => 2014-03-07 00:00:00 -0500
This won't modify the original time value either. So you can do this:
t0.change(min: 0) == t1.change(min: 0)
It'll zero out everything at a lower granularity (seconds, etc.).
require 'time'
t1 = Time.new ; sleep 30 ; t2 = Time.new
t1.hour == t2.hour
This should give you a boolean answer. Note that it compares only the hour field, so 9 AM on two different days would also compare as equal; include the date if that matters.
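The comparison can also be spelled out explicitly in pure Ruby, which makes the intent obvious (same_hour? is a hypothetical helper name, not a standard API):

```ruby
# True when both Times fall in the same calendar hour,
# ignoring minutes and seconds.
def same_hour?(a, b)
  [a.year, a.month, a.day, a.hour] == [b.year, b.month, b.day, b.hour]
end

t0 = Time.new(2014, 3, 7, 21, 30, 1)
t1 = Time.new(2014, 3, 7, 21, 59, 59)
same_hour?(t0, t1)  # => true
```

Unlike the to_a trick above, this ignores the zone, so convert both times to a common zone first if they may come from different ones.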
