How to wrap Ecto hstore field to get translations automatically? - internationalization

There are a lot of gems in Ruby that do what I would like to do in Elixir: globalize, multilang-hstore, hstore_translate.
How can I automate retrieving the translation for the current locale from an hstore field, e.g. using Gettext.get_locale? For example, when I read the tags of a related Post via post.tags and the locale is :en, tags should contain a list of strings, tags: ["climbing", "ski"], instead of "tags":[{"name":{"pl":"narty","en":"ski"}},{"name":{"pl":"wspinaczka","en":"climbing"}}]. And how do I handle fallbacks with Gettext?
defmodule Myapp.Tag do
  use Myapp.Web, :model

  schema "tags" do
    field :name, :map
    belongs_to :post, Myapp.Post
    timestamps
  end

  def match(query, q) do
    from tag in query,
      where: fragment("?->>? ILIKE ?", tag.name, "en", ^(String.downcase(q) <> "%"))
  end
end
defmodule Myapp.TagController do
  use Myapp.Web, :controller
  alias Myapp.Tag

  def search(conn, %{"q" => q}) do
    tags = Tag |> Tag.match(q) |> Repo.all
    render(conn, "options.json", tags: tags)
  end
end
defmodule Myapp.TagView do
  use Myapp.Web, :view

  def render("options.json", %{tags: tags}) do
    %{options: render_many(tags, Myapp.TagView, "option.json")}
  end

  def render("option.json", %{tag: tag}) do
    %{id: tag.id,
      value: tag.name["en"],
      label: tag.name["en"]}
  end
end
defmodule Myapp.PostView do
  use Myapp.Web, :view

  def render("posts.json", %{posts: posts}) do
    %{data: render_many(posts, Myapp.PostView, "post.json")}
  end

  def render("post.json", %{post: post}) do
    %{id: post.id,
      title: post.title,
      tags: post.tags} # <= how to get translated keys
  end
end

Gettext.get_locale(Myapp.Gettext) should return the current locale. When getting a value from a map, you can supply a default value, for example:
Map.get(map, key, default)
So you can define a function like:
defp translated_tag_name(tag, default_lang) do
  current_lang = Gettext.get_locale(Myapp.Gettext)
  Map.get(tag.name, current_lang, tag.name[default_lang])
end
You do the fallback manually, but it is easy to make it work with any map.
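For example, a minimal sketch of wiring that helper into the PostView from the question (assuming post.tags is preloaded and "en" is the fallback language):

defmodule Myapp.PostView do
  use Myapp.Web, :view

  def render("post.json", %{post: post}) do
    %{id: post.id,
      title: post.title,
      # translate each tag name for the current Gettext locale
      tags: Enum.map(post.tags, &translated_tag_name(&1, "en"))}
  end

  defp translated_tag_name(tag, default_lang) do
    current_lang = Gettext.get_locale(Myapp.Gettext)
    Map.get(tag.name, current_lang, tag.name[default_lang])
  end
end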

Related

How do I save cookie hash key values in separate db fields in Rails 7?

I'm using Rails 7 and Ruby 3.1.2.
The cookie I've created returns these keys and values:
>> cookies[:utm_params]
=> "{:source=>\"facebook\", :medium=>\"facebook_feed\", :campaign=>\"123123123123\", :max_age=>2592000}"
And I've created these fields on my Subscription model:
# utm_source :string
# utm_medium :string
# utm_campaign :string
My Subscription.create code looks like this at the moment, but I can't figure out how to save them.
@subscription = Subscription.create!({
  utm_source: cookies[:utm_params][:source],
  # utm_medium: cookies[:utm_params],
  # utm_campaign: cookies[:utm_params],
})
EDIT: My own solution and refactor
application_controller.rb
UTM_COOKIE_NAME = "_ta_utm"

before_action :capture_utm_params

private

def capture_utm_params
  if params.has_key?(:utm_medium)
    cookies[UTM_COOKIE_NAME] = {
      expires: 30.days.from_now,
      value: {
        source: params[:utm_source],
        medium: params[:utm_medium],
        campaign: params[:utm_campaign]
      }.to_json
    }
  end
end
checkout_controller.rb
utm_tags = JSON.parse(cookies[UTM_COOKIE_NAME]) rescue nil

if utm_tags.present?
  source = utm_tags["source"]
  medium = utm_tags["medium"]
  campaign = utm_tags["campaign"]
end

@subscription = Subscription.create!({
  utm_source: source,
  utm_medium: medium,
  utm_campaign: campaign
})
You need to convert that string into a hash. I tried JSON.parse(string) on your string, but it is not parsable as JSON, so I found this answer for you that converts the string into a hash so that you can save the data you want. Hope it helps.
You can use the code in the answer I linked like this:
utm_hash = JSON.parse(cookies[:utm_params].gsub(/:(\w+)/, '"\\1"').gsub('=>', ': ')).symbolize_keys
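With that hash in hand, saving into the separate columns is straightforward (a sketch, assuming the cookie and field names from the question):

utm_hash = JSON.parse(
  cookies[:utm_params].gsub(/:(\w+)/, '"\\1"').gsub('=>', ': ')
).symbolize_keys

# Copy each UTM value into its own column on the new record
@subscription = Subscription.create!(
  utm_source:   utm_hash[:source],
  utm_medium:   utm_hash[:medium],
  utm_campaign: utm_hash[:campaign]
)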

Parse JSON-like syntax to a Ruby object

A simple parser which turned out to be much harder than I thought. I need a string parser to convert nested fields into a Ruby object. In my case the API response will only return the desired fields.
Given
Parser.parse "album{name, photo{name, picture, tags}}, post{id}"
Desired output or similar
{album: [:name, photo: [:name, :picture, :tags]], post: [:id]}
Any thoughts?
Wrote my own solution
module Parser
  extend self

  def parse(str)
    parse_list(str).map do |i|
      extract_item_fields i
    end
  end

  def extract_item_fields(item)
    field_name, fields_str = item.scan(/(.+?){(.+)}/).flatten
    if field_name.nil?
      item
    else
      fields = parse_list fields_str
      result = fields.map { |field| extract_item_fields(field) }
      { field_name => result }
    end
  end

  def parse_list(list)
    return list if list.nil?
    list.concat(',').scan(/([^,{}]+({.+?})?),/).map(&:first).map(&:strip)
  end
end
str = 'album{name, photo{name, picture, tags}}, post{id}'
puts Parser.parse(str).inspect
# => [{"album"=>["name", {"photo"=>["name", "picture", "tags"]}]}, {"post"=>["id"]}]

Ruby - reading from .csv and creating objects out of it

I have a .csv file in which every row represents one call with a certain duration, number, etc. I need to create an array of Call objects - every Call.new expects a Hash of parameters, so it should be easy - it just takes rows from the CSV. But for some reason it doesn't work - when I invoke Call.new(raw_call) it's nil.
It's also impossible for me to see any output - I placed puts in various places in the code (inside blocks etc.) and it simply doesn't show anything. I obviously have another class, Call, which holds the initialize for Call etc.
require 'csv'

class CSVCallParser
  attr_accessor :io

  def initialize(io)
    self.io = io
  end

  NAMES = {
    a: :date,
    b: :service,
    c: :phone_number,
    d: :duration,
    e: :unit,
    f: :cost
  }

  def run
    parse do |raw_call|
      parse_call(raw_call)
    end
  end

  private

  def parse_call(raw_call)
    NAMES.each_with_object({}) do |name, title, memo|
      memo[name] = raw_call[title.to_s]
    end
  end

  def parse(&block)
    CSV.parse(io, headers: true, header_converters: :symbol, &block)
  end
end

CSVCallParser.new(ARGV[0]).run
Small sample of my .csv file: headers and one row:
"a","b","c","d","e","f"
"01.09.2016 08:49","International","48627843111","0:29","","0,00"
I noticed a few things that aren't going as expected. In the parse_call method,
def parse_call(raw_call)
  NAMES.each_with_object({}) do |name, title, memo|
    memo[name] = raw_call[title.to_s]
  end
end
I tried to print name, title, and memo. I expected to get :a, :date, and {}, but what I actually got was [:a, :date], {}, and nil.
Also, raw_call's headers are :a, :b, :c, ..., not :date, :service, ..., so you should be using raw_call[name]; converting it to a string will not help, since the keys in raw_call are symbols.
So I modified the function to
def parse_call(raw_call)
  NAMES.each_with_object({}) do |name_title, memo|
    memo[name_title[1]] = raw_call[name_title[0]]
  end
end
name_title[1] returns the title (:date, :service, etc.), and name_title[0] returns the name (:a, :b, etc.).
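You could also destructure the key/value pair directly in the block arguments, which reads a little more clearly (an equivalent sketch):

def parse_call(raw_call)
  NAMES.each_with_object({}) do |(name, title), memo|
    # name is :a, :b, ...; title is :date, :service, ...
    memo[title] = raw_call[name]
  end
end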
Also, in this method
def run
  parse do |raw_call|
    parse_call(raw_call)
  end
end
You are not returning any of the results you get, so you are getting nil.
So I changed it to:
def run
  res = []
  parse do |raw_call|
    res << parse_call(raw_call)
  end
  res
end
Now, if I output the line
p CSVCallParser.new(File.read("file1.csv")).run
I get (I added two more lines to the csv sample)
[{:date=>"01.09.2016 08:49", :service=>"International", :phone_number=>"48627843111", :duration=>"0:29", :unit=>"", :cost=>"0,00"},
{:date=>"02.09.2016 08:49", :service=>"International", :phone_number=>"48622454111", :duration=>"1:29", :unit=>"", :cost=>"0,00"},
{:date=>"03.09.2016 08:49", :service=>"Domestic", :phone_number=>"48627843111", :duration=>"0:29", :unit=>"", :cost=>"0,00"}]
If you want to run this program from the terminal like so:
ruby csv_call_parser.rb calls.csv
(in this case, calls.csv is passed to the script via ARGV), you can do so by modifying the last line of the Ruby file:
p CSVCallParser.new(File.read(ARGV[0])).run
This will also return the array of hashes like before.
Alternatively, you can let CSV build the row hashes for you:
csv = CSV.parse(csv_text, :headers => true)
puts csv.map(&:to_h)
This outputs one hash per row, keyed by the header strings, e.g.:
[{"a"=>"1", "b"=>"1"}, {"a"=>"2", "b"=>"2"}]

Datamapper into String

I want to be able to see the string, like the TwitchTV name I have in my database. Here is my current code:
get '/watch/:id' do |id|
  erb :mystream
  @result = Twitchtvst.all(
    :fields => [:Twitchtv],
    :conditions => { :user_id => "#{id}" }
  )
  puts @result
end
Result in the terminal:
#<Twitchtvst:0x007fb48b4d5a98>
How do I get that into a string (the TwitchTV value in the database)?
Opppppsss!
Here is the real code sample. Sorry!
get '/livestream' do
  erb :livestream
  @users_streams = Twitchtvst.all
  puts @users_streams
end
If I add .to_s to users_streams it does not work.
By adding .to_csv - not exactly a string, but it should show the content:
get '/livestream' do
  erb :livestream
  @users_streams = Twitchtvst.all
  @users_streams.each do |us|
    p us.to_csv
  end
end
You're getting a Collection of Twitchtvst objects, so you need to convert each to a String:
puts Twitchtvst.all.map(&:to_s).join
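If what you really want is just the TwitchTV names, another option is to map straight over the property reader instead of relying on to_s (a sketch, assuming the property is declared as Twitchtv on the model, as in the query above):

get '/watch/:id' do |id|
  # Fetch only the Twitchtv column and pull the plain string out of each record
  @names = Twitchtvst.all(:user_id => id, :fields => [:Twitchtv]).map(&:Twitchtv)
  puts @names  # an array of plain strings
  erb :mystream
end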

How to set DataMapper String length to unlimited for PostgreSQL

Granted, one could use property :foo, Text, lazy: false all over the place to replace property :foo, String but that, of course, defeats several purposes in one go. Or I could use manual migrations, which I have been doing—I'm looking around now to see if they can finally be abandoned, insofar as VARCHAR v. TEXT is concerned.
In other words, I'd like automigrate to create TEXT fields for PostgreSQL for models with String properties rather than arbitrarily, pointlessly, constrained VARCHAR atop a TEXT.
The Postgres adapter seems to be able to handle it:
# size is still required, as length in postgres behaves slightly differently
def size
  case self.type
  # strings in postgres can be unlimited length
  when :string then return (@options.has_key?(:length) || @options.has_key?(:size) ? @size : nil)
  else nil
  end
end
Untested suggestion, but considering the source of initialize, and the length function immediately beneath it, try passing nil as the length, or no length at all.
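In model terms, that suggestion would look roughly like this (an untested sketch; the model and property names are made up, and whether the adapter checks :length or :size may depend on the DataMapper version):

class Article
  include DataMapper::Resource

  property :id,   Serial
  # Pass nil as the length so the Postgres adapter can fall through to TEXT
  property :body, String, :length => nil
end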
Here's a lousy but working solution I came up with; if you have a better one, please add it and I'll accept that instead.
In config/initializers/postgresql_strings.rb:
module DataMapper
  module Migrations
    module PostgresAdapter
      def self.included(base)
        base.extend ClassMethods
      end

      module ClassMethods
        def type_map
          precision = Property::Numeric.precision
          scale     = Property::Decimal.scale

          super.merge(
            Property::Binary => { :primitive => 'BYTEA' },
            BigDecimal       => { :primitive => 'NUMERIC', :precision => precision, :scale => scale },
            Float            => { :primitive => 'DOUBLE PRECISION' },
            String           => { :primitive => 'TEXT' } # All that for this
          ).freeze
        end
      end
    end
  end
end

# If you're including dm-validations, it will surprisingly attempt
# to validate strings to <= 50 characters; this prevents that.
DataMapper::Property::String.auto_validation(false)
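With that initializer in place, plain String properties should auto-migrate to TEXT columns; for instance (a hypothetical model, just to illustrate):

class Note
  include DataMapper::Resource

  property :id,   Serial
  property :body, String  # now maps to TEXT instead of VARCHAR(50)
end

DataMapper.auto_migrate!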
