Phoenix/Ecto - integer values in a Postgres array column not being retrieved

I was trying to build schemas for an existing set of tables using Ecto 2.1, in a Phoenix 1.3.0 app.
Example:
defmodule Book do
  use Ecto.Schema

  schema "books" do
    field :title, :string
    field :owner_ids, {:array, :integer}
    field :borrower_ids, {:array, :integer}
    field :published, :boolean
  end
end
On the console, when I do Book |> first |> Repo.one, I see owner_ids printed properly as [29], but borrower_ids shows '$'. I verified using psql that borrower_ids for that row does contain a list of values, exactly like the owner_ids column.
All other columns in the table print just fine. Anything I am missing here?
Update: Rails/ActiveRecord 5.1.4 was able to retrieve this table and row just fine.

'$' is a list containing the number 36:
iex> [36]
'$'
In a nutshell, whenever Elixir prints a list of integers that all represent printable ASCII characters, it shows them between single quotes, because that is how Erlang strings (also called charlists) are represented.
The i helper in IEx is very useful in these situations. When you see a value you don't understand, you can use it to ask for more information:
iex(2)> i '$'
Term
  '$'
Data type
  List
Description
  This is a list of integers that is printed as a sequence of characters
  delimited by single quotes because all the integers in it represent valid
  ASCII characters. Conventionally, such lists of integers are referred to
  as "charlists" (more precisely, a charlist is a list of Unicode codepoints,
  and ASCII is a subset of Unicode).
Raw representation
  [36]
Reference modules
  List
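If you'd rather see the raw integers, you can override the charlist rendering with the :charlists inspect option (in older Elixir releases it was spelled :char_lists); a quick sketch:
iex> inspect([36], charlists: :as_lists)
"[36]"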

Related

How to query GUID stored in a number field

I have a primary key in my table of type NUMBER, and it was populated using the following:
to_number(sys_guid(),'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX')
Now how can I query the table?
Using
SELECT * FROM TABLE1 WHERE ID = 2.68819609716248E38
does not return any results
OK, so you had a raw value which you converted to number. Now you need to reverse the process.
First convert the saved number back to a string, then apply hextoraw. When you convert from number to string, you must use a format model to request a hex representation. Use 0 as the first digit so the string keeps leading zeros, and add the fm format model modifier so you don't get a space (the placeholder for the sign) as the first character.
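Putting that together, a sketch of the lookup (using the question's TABLE1/ID, with :guid_raw as a hypothetical RAW bind variable holding the GUID):
-- Convert the stored NUMBER back to a 32-digit hex string, then to RAW,
-- and compare against the RAW GUID value.
SELECT *
FROM   table1
WHERE  hextoraw(to_char(id, 'fm0XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX')) = :guid_raw;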
Alternatively, you could compare directly to a fixed number, but the number must make sense. You show something like 2.68819609716248E38 as your input. Where is that coming from? Obviously not from converting a raw value to number (like you did with the sys_guid values). The value should be a 38-digit integer with 38 significant digits; yours has only 15 significant digits, so the rest are filled with zeros, and that will never match any saved value.

Extract and replace parameters in a SQL query using M-language

This question is related to this question. However, in that question I made some wrong assumptions...
I have a string that contains a SQL query, with or without one or more parameters, where each parameter has an "&" (ampersand) as a prefix.
Now I want to extract all parameters and load them into a table in Excel, where the user can enter the values for each parameter.
Then I need to use these values as replacements for the variables in the SQL query so I can run it.
The problem I am facing is that extracting (and therefore also replacing) the parameter names is not that straightforward, because the parameters are not always surrounded by spaces (as I assumed in my previous question).
See the following examples:
Select * from TableA where ID=&id;
Select * from TableA where (ID<&ID1 and ID>=&ID2);
Select * from TableA where ID = &id ;
So, my question has two parts:
1. How can I extract all parameters?
2. How can I replace all parameters using another table where the replacements are defined (see also my previous question)?
A full solution would require getting into the details of how your data is structured and would potentially cover a lot of topics. Since you already covered one way to do a mass find/replace (and there are a variety of ways to accomplish that in Power Query), I'll just show you my ugly solution for extracting the parameters.
List.Transform(
    List.Select(
        Text.Split([YOUR TEXT HERE], " "),
        each Text.Contains(_, "&")
    ),
    each List.Accumulate(
        {";", ")"},  // list of characters to clean
        "&" & Text.AfterDelimiter(_, "&"),
        (String, Remove) => Text.Replace(String, Remove, "")
    )
)
This is sort of convoluted, but here is my best explanation of what is going on.
The first key part is combining List.Select with Text.Split to extract all of the parameters from the string into a list. It's using a " " to separate the words in the list, and then filtering to words containing a "&", which in your second example means the list will contain "(ID<&ID1" and "ID>=&ID2);" at this point.
The second part is using Text.AfterDelimiter to extract the text that occurs after the "&" in our list of parameters, and List.Accumulate to "clean" any unwanted characters that would potentially be hanging on to the parameter. The list of characters you would want to clean has to be manually defined (I just put in ";" and ")" based on the sample data). We also manually re-append a "&" to the parameter, because Text.AfterDelimiter would have removed it.
The result is a List object of parameters extracted from any of the sample strings you provided. You can set up a query that takes a table of your SQL strings, applies this code in a custom column (where [YOUR TEXT HERE] is the field containing your strings), then expands the resulting lists and removes duplicates to get a unique list of all the parameters in your SQL strings.
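As a sketch of that setup (the table name Queries and the column name SQL are placeholders of mine, not from the question):
let
    // Hypothetical source: an Excel table named "Queries" with a SQL column.
    Source = Excel.CurrentWorkbook(){[Name = "Queries"]}[Content],
    AddParams = Table.AddColumn(
        Source,
        "Parameters",
        each List.Transform(
            List.Select(Text.Split([SQL], " "), each Text.Contains(_, "&")),
            each List.Accumulate(
                {";", ")"},
                "&" & Text.AfterDelimiter(_, "&"),
                (String, Remove) => Text.Replace(String, Remove, "")
            )
        )
    ),
    // One row per parameter, then de-duplicate.
    Expanded = Table.ExpandListColumn(AddParams, "Parameters"),
    Distinct = Table.Distinct(Expanded, {"Parameters"})
in
    Distinct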

How to detect data type in column of table in ORACLE database (probably blob or clob)?

I have a table with a column of type VARCHAR2(2000 CHAR). This column contained rows of comma-separated numbers (e.g. "3;3;780;1230;1;450.."). Now the situation has changed: some rows still contain data in the old format, but others contain data like "BAAAABAAAAAgAAAAHAAAAAAAAAAAAAAAAQOUw6..". Maybe it's a BLOB or CLOB? How can I check exactly, and how can I read it now? Sorry for my noob question :)
The bad news is you really can't. Your column is a VARCHAR2, so it's all character data. It seems like what you're really asking is: "How do I tell if this value is a comma-separated string or a binary value encoded as a string?" The best you can do is make an educated guess. There's not enough information here to give a very good answer, but you can try things like:
- If the value consists of numeric characters with separators (you say commas, but your example has semicolons), treat it as such. But what if the column value is "123"? Is that a single number or a short binary value?
- If there are any letters in the value, you know it's not a separated list of numbers, so treat it as binary. But not all encoded binary values will have letters.
- Try decoding it as binary; if that fails, maybe it's actually the separated list. This probably isn't a good option.
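If you want to apply the first heuristic in SQL, a sketch using REGEXP_LIKE might look like this (my_table and col are hypothetical names):
-- Guess the format: values matching "numbers separated by ';' or ','"
-- are treated as lists; everything else as possibly encoded binary.
SELECT t.col,
       CASE
         WHEN REGEXP_LIKE(t.col, '^[0-9]+([;,][0-9]+)*$')
         THEN 'separated numbers'
         ELSE 'possibly encoded binary'
       END AS guessed_format
FROM   my_table t;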

CSV only returns strings. I need to keep value types

I'm trying to parse through a CSV file and grab every row and upload it to Postgres. The problem is that CSV.foreach returns every value as a string and Postgres won't accept string values in double columns.
Is there an easy way to keep the value types? Or am I going to have to go column by column and convert the strings into doubles and date formats?
require 'csv'

CSV.foreach("C:\\test\\file.csv") do |row|
  print row
end
All I need is for the values to keep their type and not be returned as strings. I don't know if this is possible with CSV. I have it working just fine when using the spreadsheet gem to parse .xls files.
CSVs do not natively have types; a CSV contains simple comma-separated text. When you view a CSV, you are seeing everything there is to the file. In an Excel file, there is a lot of hidden metadata that tracks the type of each cell.
When you #foreach through a CSV, each row is given as an array of string values. A row might look something like
[ "2.33", "4", "Hello" ]
with each value given as a string. You may think of "2.33" as a float/double, but CSV parsers only know to think of it as a string.
You can convert strings to other types using Ruby's type conversion functions, assuming each column contains only one type (which, since you're using an SQL database, is a pretty safe assumption).
You could write something like this to convert the values in each row to specific types. This example converts the first column to a float (which should map to Postgres' double precision), the second column to an integer, and the third column to a string.
require 'csv'

CSV.foreach("C:\\test\\file.csv") do |row|
  # p shows the array with its types visible, unlike puts
  p [ row[0].to_f, row[1].to_i, row[2].to_s ]
end
Given the sample row from above, this function would print an array like
>> [ 2.33, 4, "Hello" ]
You should be able to use these converted values in whatever else you're doing with Postgres.
require 'csv'

CSV.foreach("test.txt", converters: :all) do |row|
  print row
end
This should convert numerics and datetimes. For integers and floats it works perfectly, but I was not able to get an actual conversion to DateTime going.
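If the built-in converters don't match your date format, you can pass a custom converter alongside the built-in :numeric one; here's a sketch (the "%Y-%m-%d" format is an assumption about your data):
require 'csv'
require 'date'

# Hypothetical converter: try to parse ISO dates, fall back to the raw string.
to_date = lambda do |field|
  Date.strptime(field, "%Y-%m-%d") rescue field
end

CSV.foreach("C:\\test\\file.csv", converters: [:numeric, to_date]) do |row|
  p row
end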

CSV with Weka: how to add a comma as a value, not a separator

I have a dataset on which I must run machine learning algorithms. Some elements of my dataset contain a comma, and when I convert the CSV to ARFF these comma-containing values are not recognized.
Example:
a,b,c
asdasd'%sdas,1,5,4234
My elements are
asdasd'%sdas   1,5   4234
But I could not handle the value that has a comma inside it.
I tried these:
a,b,c
asdasd'%sdas,1\,5,4234
and
a,b,c
asdasd'%sdas,"1,5",4234
How can I pass a comma-containing value while using Weka? I also wonder how to pass an element as a string with special characters like "sdas&%',+". Is that possible, or is there something similar?
The following should work:
"asdasd'%sdas","1,5",4234
Quoting the field protects the comma, and you can pass strings that contain special characters in just the same way.
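You can then verify the conversion by running Weka's CSVLoader from the command line (data.csv stands in for your file, and weka.jar is assumed to be in the current directory):
java -cp weka.jar weka.core.converters.CSVLoader data.csv > data.arff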
