Calling apoc procedure from Golang driver - go

I want to call an APOC procedure from the Golang driver. I can run basic Cypher queries from the driver, but calling an APOC procedure throws a syntax error.
panic: An error occurred getting result of exec command: messages.FailureMessage{Metadata:map[string]interface {}{"code":"Neo.ClientError.Statement.SyntaxError", "message":"Invalid input '3': expected whitespace, '.', node labels, '[', \"=~\", IN, STARTS, ENDS, CONTAINS, IS, '^', '*', '/', '%', '+', '-', '=', '~', \"<>\", \"!=\", '<', '>', \"<=\", \">=\", AND, XOR, OR, ',' or ')' (line 1, column 74 (offset: 73))\n\"call apoc.export.json.query(\"MATCH t = (p)-[:has*0..] -> (i:node{name:\"39\"}) return p;\",\"2.json\")\"\n
call apoc.export.json.query("MATCH t = (p)-[:has*0..] -> (i:node{name:"39"}) return p;","1.json")
I want to run the above query from Golang. Golang wants me to pass the query as a string, and here the query itself contains a nested string; I think the error is due to that. Below is the syntax I am using to query from Golang.
conn.PrepareNeo("call apoc.export.json.query(\"MATCH t = (p)-[:has*0..] -> (i:node{name:\"39\"}) return p;\",\"1.json\"")

You're right about the trouble, I think. I'd start at the other end of this and work backwards. The query you want to issue (and what you'd type in the Neo4j Browser) is:
call apoc.export.json.query("MATCH t = (p)-[:has*0..]->(i:node {name: \"39\"}) return p;", "1.json")
vs what Neo4j claims you actually sent:
call apoc.export.json.query("MATCH t = (p)-[:has*0..] -> (i:node{name:"39"}) return p;","1.json")
Notice that there are no backslashes escaping the quotes around "39" in the Go-generated query.
To fix, we need to emit the backslashes explicitly - I'm not a Go developer so there might be a tidier way of doing this, but in other languages it would be something like:
conn.PrepareNeo("call apoc.export.json.query(\"MATCH t = (p)-[:has*0..]->(i:node {name: \\\"39\\\"}) return p;\", \"1.json\")")
Or use a raw string literal (again, not a Go developer so not certain on this one):
conn.PrepareNeo(`call apoc.export.json.query("MATCH t = (p)-[:has*0..]->(i:node {name: \"39\"}) return p;", "1.json")`)
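If you want to sanity-check the two forms, here is a minimal standalone Go sketch (mine, not part of the answer above) showing they produce identical query text, because raw string literals don't process escape sequences while interpreted literals do:
package main

import "fmt"

func main() {
    // Interpreted string literal: \" becomes " and \\\" becomes \" in the resulting value.
    interpreted := "call apoc.export.json.query(\"MATCH t = (p)-[:has*0..]->(i:node {name: \\\"39\\\"}) return p;\", \"1.json\")"

    // Raw string literal: every character between the backticks is kept exactly as written.
    raw := `call apoc.export.json.query("MATCH t = (p)-[:has*0..]->(i:node {name: \"39\"}) return p;", "1.json")`

    fmt.Println(interpreted)
    fmt.Println(raw)
    fmt.Println(interpreted == raw) // true: both hold the exact query text Neo4j expects
}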

Related

Powerautomate Parsing JSON Array

I've seen the JSON array questions here and I'm still a little lost, so I could use some extra help.
Here's the setup:
My Flow calls a sproc on my DB and that sproc returns this JSON:
{
  "ResultSets": {
    "Table1": [
      {
        "OrderID": 9518338,
        "BasketID": 9518338,
        "RefID": 65178176,
        "SiteConfigID": 237
      }
    ]
  },
  "OutputParameters": {}
}
Then I use a PARSE JSON action to get what looks like the same result, but now I'm told it's parsed and I can call variables.
Issue is when I try to call just, say, SiteConfigID, I get "The output you selected is inside a collection and needs to be looped over to be accessed. This action cannot be inside a foreach."
After some research, I know what's going on here. Table1 is an array, and I need to tell PowerAutomate to just grab the first record of that array so it knows it's working with a single record instead of a full array. Fair enough. So I spin up a "Return Values to Virtual Power Agents" action just to see my output. I know I'm supposed to use a 'first' expression or a 'get [0] from array' expression here, but I can't seem to make them work. Below are what I've tried and the errors I get:
Tried:
first(body('Parse-Sproc')?['Table1/SiteConfigID'])
Got: InvalidTemplate. Unable to process template language expressions in action 'Return_value(s)_to_Power_Virtual_Agents' inputs at line '0' and column '0': 'The template language function 'first' expects its parameter be an array or a string. The provided value is of type 'Null'. Please see https://aka.ms/logicexpressions#first for usage details.'.
Also Tried:
body('Parse-Sproc')?['Table1/SiteconfigID']
which just returns a null valued variable
Finally I tried
outputs('Parse-Sproc')?['Table1']?['value'][0]?['SiteConfigID']
Which STILL gives me a null-valued variable. It's the worst.
In that last expression, I also switched the variable type in the return to pva action to a string instead of a number, no dice.
Also, changed 'outputs' in that expression for 'body' .. also no dice
Here is a screenie of the setup:
To be clear: the end result I'm looking for is for the system to just return "SiteConfigID" as a string or an int so that I can pipe that into a virtual agent.
I believe this is what you need as an expression ...
body('Parse-Sproc')?['ResultSets']['Table1'][0]?['SiteConfigID']
You can see I'm just traversing down to the object and through the array to get the value.
Naturally, I don't have your exact flow, but if I use your JSON and load it into a Parse JSON step to get the schema, I am able to get the result. I do get a different schema to you, though, so it will be interesting to see whether it translates directly.
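Purely for illustration (this Go sketch is mine, not part of the flow), the same traversal reads naturally once the sproc JSON is unmarshalled into structs: descend into ResultSets, index into the Table1 array, then read SiteConfigID:
package main

import (
    "encoding/json"
    "fmt"
)

// Structs mirroring the shape of the sproc response.
type sprocResult struct {
    ResultSets struct {
        Table1 []struct {
            OrderID      int `json:"OrderID"`
            BasketID     int `json:"BasketID"`
            RefID        int `json:"RefID"`
            SiteConfigID int `json:"SiteConfigID"`
        } `json:"Table1"`
    } `json:"ResultSets"`
}

func main() {
    payload := `{"ResultSets":{"Table1":[{"OrderID":9518338,"BasketID":9518338,"RefID":65178176,"SiteConfigID":237}]},"OutputParameters":{}}`

    var result sprocResult
    if err := json.Unmarshal([]byte(payload), &result); err != nil {
        panic(err)
    }

    // Table1 is an array, so take the first element before reading the field,
    // just like the [0] in the expression above.
    fmt.Println(result.ResultSets.Table1[0].SiteConfigID) // 237
}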

Need XPath and XQuery query

I'm working on XPath/XQuery to return the values of multiple child nodes based on a sibling node's value, in a single query. My XML looks like this:
<FilterResults>
  <FilterResult>
    <ID>535</ID>
    <Analysis>
      <Name>ZZZZ</Name>
      <Identifier>asdfg</Identifier>
      <Result>High</Result>
      <Score>0</Score>
    </Analysis>
    <Analysis>
      <Name>XXXX</Name>
      <Identifier>qwerty</Identifier>
      <Result>Medium</Result>
      <Score>0</Score>
    </Analysis>
  </FilterResult>
  <FilterResult>
    <ID>745</ID>
    <Analysis>
      <Name>XXXX</Name>
      <Identifier>xyz</Identifier>
      <Result>Critical</Result>
      <Score>0</Score>
    </Analysis>
    <Analysis>
      <Name>YYYY</Name>
      <Identifier>qwerty</Identifier>
      <Result>Medium</Result>
      <Score>0</Score>
    </Analysis>
  </FilterResult>
</FilterResults>
I need to get the values of Score and Identifier based on the Name value. I'm currently trying the query below, but it is not working as desired:
fn:string-join((
for $Identifier in fn:distinct-values(FilterResults/FilterResult/Analysis[Name="XXXX"])
return fn:string-join((//Identifier,//Score),'-')),',')
The output I'm looking for is this:
qwerty-0,xyz-0
Your question suggests some fundamental misunderstandings about XQuery, generally. It's hard to explain everything in a single answer, but 1) that is not how distinct-values works (it returns string values, not nodes), and 2) the double slash selections in your return statement are returning everything because they are not constrained by anything. The XPath you use inside the distinct-values call is very close, however.
Instead of calling distinct-values, you can assign the Analysis results of that XPath to a variable, iterate over them, and generate concatenated strings. Then use string-join to comma separate the full sequence. Note that in the return statement, the variable $a is used to concat only one pair of values at a time.
string-join(
  let $analyses := FilterResults/FilterResult/Analysis[Name="XXXX"]
  for $a in $analyses
  return $a/concat(Identifier, '-', Score),
  ',')
=> qwerty-0,xyz-0
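As a cross-check only (a Go sketch of my own, not part of the XQuery answer), the same predicate and join can be written like this, with the non-XXXX Analysis elements trimmed for brevity:
package main

import (
    "encoding/xml"
    "fmt"
    "strings"
)

type analysis struct {
    Name       string `xml:"Name"`
    Identifier string `xml:"Identifier"`
    Score      string `xml:"Score"`
}

type filterResults struct {
    FilterResult []struct {
        ID       string     `xml:"ID"`
        Analysis []analysis `xml:"Analysis"`
    } `xml:"FilterResult"`
}

func main() {
    // Same data as above, with the non-XXXX Analysis elements omitted for brevity.
    doc := `<FilterResults>
  <FilterResult>
    <ID>535</ID>
    <Analysis><Name>XXXX</Name><Identifier>qwerty</Identifier><Result>Medium</Result><Score>0</Score></Analysis>
  </FilterResult>
  <FilterResult>
    <ID>745</ID>
    <Analysis><Name>XXXX</Name><Identifier>xyz</Identifier><Result>Critical</Result><Score>0</Score></Analysis>
  </FilterResult>
</FilterResults>`

    var parsed filterResults
    if err := xml.Unmarshal([]byte(doc), &parsed); err != nil {
        panic(err)
    }

    var pairs []string
    for _, fr := range parsed.FilterResult {
        for _, a := range fr.Analysis {
            if a.Name == "XXXX" { // same predicate as Analysis[Name="XXXX"]
                pairs = append(pairs, a.Identifier+"-"+a.Score)
            }
        }
    }
    fmt.Println(strings.Join(pairs, ",")) // qwerty-0,xyz-0
}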

backslash at end of string causes error when inserting into InfluxDB

I have a string:
string = "\\"
puts string
# => \
I am interpolating this into a new string and sending it to a database. However, the database (InfluxDB) uses backslashes as escape characters, so pushing this string can cause an error.
For example, if I pass the following to Influx it will cause an "unterminated string" error:
insert_cmd = <<-TXT
INSERT INTO my_db.default.my_measurement,my_tag=1 my_val="#{string}"
TXT
My question is how can I replace \ in a string with \\ (two actual backslashes).
I have it working with gsub("\\", "\\\\\\") but I don't understand why this works and the following doesn't:
string.gsub("\\", "\\\\")
# SyntaxError: (irb):10: syntax error, unexpected $undefined, expecting end-of-input
Why doesn't this work? Why does gsub("\\", "\\\\\\") work? Is there a better way?
Solved
As I mentioned in a comment, I am not actually manually interpolating into an INSERT INTO string. I am using influxdb-ruby:
INFLUXDB_CLIENT.write_point("things", time: Time.now.to_i, values: { foo: "\\" })
It turns out this is a bug in that gem: https://github.com/influxdata/influxdb-ruby/issues/200
It is fixed in v0.4.2 and I was using v0.4.1.
You should just use parameterized query strings:
INSERT INTO my_db.default.my_measurement,my_tag=1 my_val=%{1}
Where when you call it you do this:
influxdb.query("...query...", params: [ string ])
What you did was create a classic injection bug by sending unescaped data into a query. The same principle applies in any database with a plain-text string representation, or even other data formats like HTML and JavaScript.
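To illustrate that last point outside Ruby, here is a hedged Go sketch using database/sql; the driver, connection string, and table are invented for the example, but the idea is that a bound parameter is never spliced into the query text, so no manual backslash escaping is needed:
package main

import (
    "database/sql"
    "log"

    _ "github.com/lib/pq" // driver chosen only for illustration
)

func main() {
    // Connection string and table are invented for the example.
    db, err := sql.Open("postgres", "postgres://localhost/mydb?sslmode=disable")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    value := `\` // a value ending in a backslash

    // The driver binds the value separately; it is never spliced into the SQL text,
    // so backslashes in the value cannot break the query.
    if _, err := db.Exec("INSERT INTO my_measurement (my_tag, my_val) VALUES ($1, $2)", 1, value); err != nil {
        log.Fatal(err)
    }
}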

pg: exec_params not replacing parameters?

First time using the pg gem to access a Postgres database. I've connected successfully and can run queries using #exec, but now building a simple query with #exec_params does not seem to be replacing parameters. I.e.:
get '/databases/:db/tables/:table' do |db_name, table_name|
  conn = connect(db_name)
  query_result = conn.exec_params("SELECT * FROM $1;", [table_name])
end
results in #<PG::SyntaxError: ERROR: syntax error at or near "$1" LINE 1: SELECT * FROM $1; ^ >
This seems like such a simple example to get working - am I fundamentally misunderstanding how to use this method?
You can use placeholders for values, not for identifiers (such as table and column names). This is the one place where you're stuck using string interpolation to build your SQL. Of course, if you're using string wrangling for your SQL, you must be sure to properly quote/escape things; for identifiers, that means using quote_ident:
+ (Object) quote_ident(str)
Returns a string that is safe for inclusion in a SQL query as an identifier. Note: this is not a quote function for values, but for identifiers.
So you'd say something like:
table_name = conn.quote_ident(table_name)
query_result = conn.exec("SELECT * FROM #{table_name}")
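The same split applies in other stacks; for example, a rough Go sketch with github.com/lib/pq (connection string and table name invented for illustration) quotes the identifier explicitly while values still go through placeholders:
package main

import (
    "database/sql"
    "fmt"
    "log"

    "github.com/lib/pq"
)

func main() {
    // Connection string and table name are invented for the example.
    db, err := sql.Open("postgres", "postgres://localhost/mydb?sslmode=disable")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    tableName := "listings" // e.g. taken from the request path

    // Identifiers cannot be bound as parameters, so quote them explicitly...
    query := fmt.Sprintf("SELECT * FROM %s WHERE id = $1", pq.QuoteIdentifier(tableName))

    // ...while values still go through placeholders.
    rows, err := db.Query(query, 42)
    if err != nil {
        log.Fatal(err)
    }
    defer rows.Close()
}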

Ruby send(attribute.to_sym) for Rails Method

I'm using Ruby 1.9.2 and need to go through all of the values for a table to make sure everything is in UTF-8 encoding. There are a lot of columns so I was hoping to be able to use the column_names method to loop through them all and encode the values to UTF-8. I thought this might work:
def self.make_utf
  for listing in Listing.all
    for column in Listing.column_names
      column_value_utf = listing.send(column.to_sym).encode('UTF-8')
      listing.send(column.to_sym) = column_value_utf
    end
    listing.save
  end
  return "Updated columns to UTF-8"
end
But it returns an error:
syntax error, unexpected '=', expecting keyword_end
listing.send(column.to_sym) = column_value_utf
I can't figure out how to make this work correctly.
You're using send wrong and you're sending the wrong symbol for what you want to do:
listing.send(column + '=', column_value_utf)
You're trying to call the x= method (for some x) with column_value_utf as an argument; that's what o.x = column_value_utf would normally do. So you need to build the right method name (just a string will do) and then pass the arguments for that method in as arguments to send.
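Go has no direct equivalent of Ruby's send, but if a cross-language comparison helps, here is a small, purely illustrative reflection sketch (the Listing struct and setStringField helper are made up) that picks a field by name at runtime and assigns to it:
package main

import (
    "fmt"
    "reflect"
)

// Listing is a stand-in struct; in the Rails code the columns come from the model.
type Listing struct {
    Title       string
    Description string
}

// setStringField is roughly analogous to listing.send(column + '=', value):
// it looks up a field by name at runtime and assigns to it.
func setStringField(target interface{}, name, value string) {
    field := reflect.ValueOf(target).Elem().FieldByName(name)
    if field.IsValid() && field.CanSet() && field.Kind() == reflect.String {
        field.SetString(value)
    }
}

func main() {
    listing := &Listing{}
    for _, column := range []string{"Title", "Description"} {
        setStringField(listing, column, "re-encoded value")
    }
    fmt.Printf("%+v\n", listing)
}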
