Inserting the id of a child into the parent model in Elixir/Phoenix

So here is my question. I have two schemas, Table and Order:

schema "tables" do
  field :table_number, :string
  field :current_order, :integer
  has_many :orders, Pos1.Order
end

schema "orders" do
  field :number_of_customers, :integer
  field :completed, :boolean, default: false
  belongs_to :table, Pos1.Table
end
When I create a new order, the controller just inserts the changeset and redirects to show:
def create(conn, %{"order" => order_params}) do
  changeset =
    conn.assigns[:table]
    |> build_assoc(:orders)
    |> Order.changeset(order_params)

  case Repo.insert(changeset) do
    {:ok, order} ->
      conn
      |> put_flash(:info, "Order created successfully.")
      |> redirect(to: table_order_path(conn, :show, conn.assigns[:table], order))

    {:error, changeset} ->
      render(conn, "new.html", changeset: changeset)
  end
end
However, is it possible to also insert the id of the created order into the parent table's current_order field?
Additionally, when order.completed changes to true, how can I clear current_order on the table?
Basically, I am working on functionality where, if the table has a current_order, clicking the table shows that order; if not, it redirects to a page for creating an order. And once the order is created, current_order on the table should be assigned. Roughly, the flow I'm after looks like the sketch below.
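(A hypothetical sketch of that show-or-create branch, just to illustrate what I mean; the action name is made up:)

def show_or_new(conn, _params) do
  table = conn.assigns[:table]

  if table.current_order do
    # The table already has an open order, so show it.
    redirect(conn, to: table_order_path(conn, :show, table, table.current_order))
  else
    # No current order yet, so send the user to the "create order" page.
    redirect(conn, to: table_order_path(conn, :new, table))
  end
end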

I didn't fully understand your problem, but is this what you want?
def create(conn, %{"order" => order_params}) do
  changeset =
    conn.assigns[:table]
    |> build_assoc(:orders)
    |> Order.changeset(order_params)

  case Repo.insert(changeset) do
    {:ok, order} ->
      changeset = Table.changeset(conn.assigns[:table], %{current_order: order.id})
      table = Repo.update!(changeset)

      conn
      |> put_flash(:info, "Order created successfully.")
      |> redirect(to: table_order_path(conn, :show, table, order))

    {:error, changeset} ->
      render(conn, "new.html", changeset: changeset)
  end
end
Better mechanisms should be used to ensure that the insertion was successful and to decide what to do if it wasn't (using transactions and rollbacks, maybe).
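One way to do that (a sketch, not code from the original answer) is to wrap both writes in a single Repo.transaction, so the order insert and the table update succeed or fail together; module and path-helper names are taken from the question:

def create(conn, %{"order" => order_params}) do
  table = conn.assigns[:table]

  changeset =
    table
    |> build_assoc(:orders)
    |> Order.changeset(order_params)

  result =
    Repo.transaction(fn ->
      with {:ok, order} <- Repo.insert(changeset),
           {:ok, _table} <- Repo.update(Table.changeset(table, %{current_order: order.id})) do
        order
      else
        # Abort the whole transaction and surface the failing changeset.
        {:error, changeset} -> Repo.rollback(changeset)
      end
    end)

  case result do
    {:ok, order} ->
      conn
      |> put_flash(:info, "Order created successfully.")
      |> redirect(to: table_order_path(conn, :show, table, order))

    {:error, changeset} ->
      render(conn, "new.html", changeset: changeset)
  end
end

The same pattern works for clearing current_order when an order is marked completed: update the order and reset the table's current_order to nil inside one transaction.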

Related

Absinthe - How to put_session in resolver function?

I'm using Absinthe and have a sign in mutation. When users send over valid credentials, I'd like to set a session cookie in the response via put_session.
The problem I'm facing is that I'm not able to access the conn from within a resolver function. That tells me that I'm not supposed to update the connection's properties from within a resolver.
Is it possible to do this with Absinthe? What are some alternative solutions?
It looks like one solution is:
1. In the resolver, resolve either an {:ok, _} or an {:error, _} as normal.
2. Add middleware after the resolver to pattern match the resolution.value returned from step 1 and update the GraphQL context.
3. Use the before_send feature of Absinthe (which has access to both the GraphQL context and the connection) to put_session before sending a response.
Code Example
Mutation:
mutation do
  @desc "Authenticate a user."
  field :login, :user do
    arg(:email, non_null(:string))
    arg(:password, non_null(:string))
    resolve(&Resolvers.Accounts.signin/3)

    middleware(fn resolution, _ ->
      case resolution.value do
        %{user: user, auth_token: auth_token} ->
          Map.update!(
            resolution,
            :context,
            &Map.merge(&1, %{auth_token: auth_token, user: user})
          )

        _ ->
          resolution
      end
    end)
  end
end
Resolver:
defmodule AppWeb.Resolvers.Accounts do
  alias App.Accounts

  def signin(_, %{email: email, password: password}, _) do
    if user = Accounts.get_user_by_email_and_password(email, password) do
      auth_token = Accounts.generate_user_session_token(user)
      {:ok, %{user: user, auth_token: auth_token}}
    else
      {:error, "Invalid credentials."}
    end
  end
end
Router:
defmodule AppWeb.Router do
  use AppWeb, :router

  pipeline :api do
    plug(:accepts, ["json"])
    plug(:fetch_session)
  end

  scope "/" do
    pipe_through(:api)

    forward("/api", Absinthe.Plug,
      schema: AppWeb.Schema,
      before_send: {__MODULE__, :absinthe_before_send}
    )

    forward("/graphiql", Absinthe.Plug.GraphiQL,
      schema: AppWeb.Schema,
      before_send: {__MODULE__, :absinthe_before_send}
    )
  end

  def absinthe_before_send(conn, %Absinthe.Blueprint{} = blueprint) do
    if auth_token = blueprint.execution.context[:auth_token] do
      put_session(conn, :auth_token, auth_token)
    else
      conn
    end
  end

  def absinthe_before_send(conn, _) do
    conn
  end
end
Not sure why you want to use a session; can't this be solved using a bearer token?
Please disregard the interfaces. :-)
Mutation:
object :user_token_payload do
  field(:user, :user)
  field(:token, :string)
end

object :login_user_mutation_response, is_type_of: :login_user do
  interface(:straw_hat_mutation_response)
  field(:errors, list_of(:straw_hat_error))
  field(:successful, non_null(:boolean))
  field(:payload, :user_token_payload)
end
Resolver:
def authenticate_user(args, _) do
  case Accounts.authenticate_user(args) do
    {:ok, user, token} -> MutationResponse.succeeded(%{user: user, token: token})
    {:error, message} -> MutationResponse.failed(StrawHat.Error.new(message))
  end
end
Now the client can pass along that token with the Authorization header, and pick it up with a plug.
defmodule MyAppWeb.Plugs.Context do
  import Plug.Conn
  alias MyApp.Admission

  def init(opts), do: opts

  def call(conn, _) do
    case build_context(conn) do
      {:ok, context} -> put_private(conn, :absinthe, %{context: context})
      _ -> put_private(conn, :absinthe, %{context: %{}})
    end
  end

  @doc """
  Return the current user context based on the authorization header.
  """
  def build_context(conn) do
    auth_header =
      get_req_header(conn, "authorization")
      |> List.first()

    if auth_header do
      "Bearer " <> token = auth_header

      case Admission.get_token_by_hash(token) do
        nil -> :error
        token -> {:ok, %{current_user: token.user}}
      end
    else
      :error
    end
  end
end
Then add the plug to your pipeline:
plug(MyAppWeb.Plugs.Context)
Then you can pick up the current user in your resolvers like so:
def create_note(%{input: input}, %{context: %{current_user: user}}) do
  # ... create the note on behalf of `user` ...
end

How do I use Absinthe Dataloader for many to many relationships

In my Phoenix application, I have a many-to-many relationship between an Artist and a Cause schema, implemented using a join table artists_causes. In my Artist schema I have many_to_many :causes, Cause, join_through: "artists_causes", and in the Cause schema I have many_to_many :artists, Artist, join_through: "artists_causes".
I am using Absinthe for GraphQL, and in my CauseTypes module I have implemented the cause object as below:
defmodule MyAppWeb.Schema.CauseTypes do
  @moduledoc """
  All types for causes
  """
  use Absinthe.Schema.Notation
  import Absinthe.Resolution.Helpers, only: [dataloader: 1, dataloader: 3]

  object :cause do
    field :id, :id
    field :artists, list_of(:artist), resolve: dataloader(Artists)
  end

  def dataloader do
    alias MyApp.{Artists, Causes}

    loader =
      Dataloader.new
      |> Dataloader.add_source(Causes, Causes.datasource())
      |> Dataloader.add_source(Artists, Artists.datasource())
  end

  def context(ctx) do
    Map.put(ctx, :loader, dataloader())
  end

  def plugins do
    [Absinthe.Middleware.Dataloader] ++ Absinthe.Plugin.defaults()
  end
end
From my understanding, with Absinthe Dataloader, dataloader/1 is what I need to have the list of artists loaded. However, I am not able to query for the artists from within a cause; I get the error artists: #Ecto.Association.NotLoaded<association :artists is not loaded> when I run the query below in GraphiQL:
query {
  causes {
    id
    artists {
      id
    }
  }
}
Is there any little piece that I am missing on working with many to many relationships?
Update
I updated my list_causes function from
def list_causes do
  Repo.all(MyApp.Causes.Cause)
end
to
def list_causes do
  Repo.all(
    from c in Cause,
      left_join: ac in "artists_causes", on: c.id == ac.cause_id,
      left_join: a in Artist, on: a.id == ac.artist_id,
      preload: [:artists]
  )
end
and I am now getting the error FunctionClauseError at POST /graphiql:
** (FunctionClauseError) no function clause matching in anonymous fn/3 in Absinthe.Resolution.Helpers.dataloader/1
This may be pointing at the Absinthe.Resolution.Helpers.dataloader/1 helper. I have the helpers imported. Is there something else I could be missing?
I think you must preload the artists association manually with Ecto before passing the result to Absinthe.
For example, fetch causes like:
from(c in Cause,
  preload: [:artists],
  select: c
)
|> Repo.all()
ADDITIONAL
This is my way of resolving an Absinthe query.
In the query object I pass a reference to the resolver module function:
resolve(&App.Resolver.get_all_causes/2)
And in the resolver function I return the dataset:
def get_all_causes(_params, _info) do
  {:ok,
   from(c in Cause,
     preload: [:artists],
     select: c
   )
   |> Repo.all()}
end
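As an aside (not part of the answer above), the Causes.datasource()/Artists.datasource() functions called by the schema module are typically built with Dataloader.Ecto; a minimal sketch, assuming the app's Repo module is MyApp.Repo:

defmodule MyApp.Artists do
  alias MyApp.Repo

  # A Dataloader source backed by Ecto; Dataloader batches association
  # loads (such as :artists on a cause) through this Repo.
  def datasource do
    Dataloader.Ecto.new(Repo)
  end
end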

Argument error when paginating data with preloaded associations

I'm trying to paginate data that has preloaded associations.
In my controller this works:
def index(conn, _params) do
  products =
    Product
    |> Repo.all
    |> Repo.preload(:category)

  render(conn, "index.html", products: products)
end
And this also works:
def index(conn, params) do
  {products, kerosene} =
    Product
    |> Repo.paginate(params)

  render(conn, "index.html", products: products, kerosene: kerosene)
end
But combining them produces an argument error on the line |> Repo.paginate(params):
def index(conn, params) do
  {products, kerosene} =
    Product
    |> Repo.all
    |> Repo.preload(:category)
    |> Repo.paginate(params)

  render(conn, "index.html", products: products, kerosene: kerosene)
end
If I drop the line |> Repo.all, it instead produces no function clause matching in Ecto.Repo.Preloader.preload/4 on the line |> Repo.preload(:category).
Assuming you're using Kerosene, Repo.paginate expects a query as its argument, while Repo.all |> Repo.preload returns a list of structs. You can use either Ecto.Query.preload or a from query with a preload option to build a query that automatically preloads when it's run:
{products, kerosene} =
  Product
  |> Ecto.Query.preload(:category)
  |> Repo.paginate(params)

# or

{products, kerosene} =
  from(p in Product, preload: :category)
  |> Repo.paginate(params)
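Putting it together in the controller (a sketch assuming the same Kerosene setup as in the question; note that the from/2 and preload macros need an import Ecto.Query, or at least require Ecto.Query for the qualified call):

import Ecto.Query

def index(conn, params) do
  # Build a query (not a list), so Repo.paginate can add LIMIT/OFFSET,
  # and let Ecto preload :category when the query runs.
  {products, kerosene} =
    from(p in Product, preload: :category)
    |> Repo.paginate(params)

  render(conn, "index.html", products: products, kerosene: kerosene)
end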

How to check if struct is persisted or not?

Is there a way to figure out whether a struct is persisted or not? I started digging into the source for Ecto's insert_or_update, but with no luck, as it hits some private functions. I want to accomplish something like this:
def changeset(struct, params \\ %{}) do
  struct
  |> cast(params, [:whatever])
  |> do_a_thing_on_unsaved_struct
end

defp do_a_thing_on_unsaved_struct(struct) do
  case ARE_YOU_PERSISTED?(struct) do
    :yes -> struct
    :no -> do_things(struct)
  end
end
Is it possible, or am I doing something dumb?
You can check the .__meta__.state of the struct. If it's a new one (not persisted), it'll be :built, and if it was loaded from the database (persisted), it'll be :loaded:
iex(1)> Ecto.get_meta(%Post{}, :state)
:built
iex(2)> Ecto.get_meta(Repo.get!(Post, 1), :state)
:loaded
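Applied to the changeset pipeline from the question (a sketch; note that after cast/3 the value is a changeset, so the underlying struct lives under .data):

defp do_a_thing_on_unsaved_struct(changeset) do
  # :built means the underlying struct was never loaded from the database.
  if Ecto.get_meta(changeset.data, :state) == :built do
    do_things(changeset)
  else
    changeset
  end
end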
Alternatively, you can check struct.data.id if the schema's primary key is id (after cast the value is a changeset, so .data is the underlying struct):
defp do_a_thing_on_unsaved_struct(struct) do
  if struct.data.id, do: struct, else: do_things(struct)
end

How to wrap Ecto hstore field to get translations automatically?

There are a lot of gems in Ruby that do what I would like to do in Elixir: globalize, multilang-hstore, hstore_translate.
How can I automatically retrieve the translation for the current locale (e.g. using Gettext.get_locale) from an hstore field? For example, when I access tags on a related Post via post.tags, I would like tags to be a list of strings for the :en locale, e.g. tags: ["climbing", "ski"], instead of "tags":[{"name":{"pl":"narty","en":"ski"}},{"name":{"pl":"wspinaczka","en":"climbing"}}]. And how do I handle fallbacks with Gettext?
defmodule Myapp.Tag do
  use Myapp.Web, :model

  schema "tags" do
    field :name, :map
    belongs_to :post, Myapp.Post
    timestamps
  end

  def match(query, q) do
    from tag in query,
      where: fragment("?->>? ILIKE ?", tag.name, "en", ^(String.downcase(q) <> "%"))
  end
end
defmodule Myapp.TagController do
  use Myapp.Web, :controller
  alias Myapp.Tag

  def search(conn, %{"q" => q}) do
    tags = Tag |> Tag.match(q) |> Repo.all
    render(conn, "options.json", tags: tags)
  end
end
defmodule Myapp.TagView do
  use Myapp.Web, :view

  def render("options.json", %{tags: tags}) do
    %{options: render_many(tags, Myapp.TagView, "option.json")}
  end

  def render("option.json", %{tag: tag}) do
    %{id: tag.id,
      value: tag.name["en"],
      label: tag.name["en"]}
  end
end
defmodule Myapp.PostView do
  use Myapp.Web, :view

  def render("posts.json", %{posts: posts}) do
    %{data: render_many(posts, Myapp.PostView, "post.json")}
  end

  def render("post.json", %{post: post}) do
    %{id: post.id,
      title: post.title,
      tags: post.tags} # <= how to get translated keys
  end
end
Gettext.get_locale Myapp.Gettext should return the current locale. When getting a value from a map, you can set a default value. For example:
Map.get(map, key, default)
So you can define a function like:
defp translated_tag_name(tag, default_lang) do
  current_lang = Gettext.get_locale(Myapp.Gettext)
  Map.get(tag.name, current_lang, Map.get(tag.name, default_lang))
end
You do the fallback manually, but it is easy to make it work with any map.
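Applied to the option.json render from the question (a sketch, assuming "en" as the fallback locale and translated_tag_name defined in the view module):

def render("option.json", %{tag: tag}) do
  # Resolve the tag name for the current Gettext locale, falling back to "en".
  name = translated_tag_name(tag, "en")
  %{id: tag.id, value: name, label: name}
end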

Resources