Bash: How to insert text with special characters into sqlite3?

Let's imagine I want to save the text of various bash scripts in my database with sqlite3.
If my script does this
VARIABLE=$(cat "bashScript.sh")
sqlite3 mydb.db "CREATE TABLE data (myBash BLOB)"
sqlite3 mydb.db "INSERT INTO data VALUES (${VARIABLE})"
Because the bash script will contain all kinds of special characters, this will not work. How can I do this?

You need to quote the value in the INSERT statement, and you'll want to escape any single quotes in the value itself. Untested:
sql_value=$(sed 's/'\''/&&/g' <<< "$VARIABLE")
sqlite3 mydb.db "insert into data values ('$sql_value')"

Related

I want to print a one-liner but I have a problem with printing command variables; is there a parameter that ignores them? [duplicate]

Can anyone tell me how I can type a backtick in my shell variable?
I am building a SQL query in a variable.
The column name for that query is also a variable and I need to put it between backticks (it has to be a backtick, not a ' or a ").
example:
SQLQUERY="select `${columnname}` from table"
Thanks!
Use single quotes around the parts containing the back-ticks, or escape the back-ticks with a backslash:
SQLQUERY='select `'"${columnname}"'` from table'
SQLQUERY="select \`${columnname}\` from table"

How to use a variable in sqlite3 csv import via awk

I want to import a txt file (semicolon-separated) into a sqlite3 database. Unfortunately there are a lot of spaces in the file, which I have to remove first. So I used something like this:
sqlite3 -csv sqliteout.db ".mode list" ".separator ;" ".import '|./normalize.awk infile.csv' importtable"
The normalize.awk script removes all the spaces and passes everything through again. The pipe is used so that sqlite3 reads from the command's output and not from a file. That works fine so far. Since I want to use this in a shell script, I would like to replace "infile.csv" with a variable, but I can't find a way to do this because the variable is not evaluated inside the single quotes. So, this is not working:
infile="infile.csv"
sqlite3 -csv sqliteout.db ".mode list" ".separator ;" ".import '|./normalize.awk ${infile}' importtable"
Somehow I can't see the solution to the problem. Can anybody help?
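One way to keep the literal single quotes for sqlite3 while still letting the shell expand the variable is to build the dot-command in a shell variable first. A minimal sketch (untested; file names as in the question):
infile="infile.csv"
# Double quotes allow ${infile} to expand; the inner single quotes are ordinary
# characters inside a double-quoted string, so sqlite3 still receives them.
import_cmd=".import '|./normalize.awk ${infile}' importtable"
sqlite3 -csv sqliteout.db ".mode list" ".separator ;" "$import_cmd"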

How can I update column values with the content of a file without interpreting it?

I need to update values in a column when the row matches a certain WHERE clause, using the content of a text file.
The content of the file is JavaScript code and as such it may contain single quotes, double quotes, slashes and backslashes; off the top of my head, it could contain other special characters as well.
The content of the file cannot be modified.
This has to be done via psql, since the update is automated using bash scripts.
Using the following command - where scriptName is a previously declared bash variable -
psql -U postgres db<<EOF
\set script $(cat $scriptName.js)
UPDATE table SET scriptColumn=:script WHERE nameColumn='$scriptName';
EOF
returns the following error
ERROR: syntax error at or near "{"
LINE 1: ...{//...
^
I would like to treat the content of the file $scriptName.js as plain text, and avoid any interpretation of it.
You should quote the variable:
UPDATE table SET scriptColumn=:'script' WHERE ...
That causes the contents of the variable to be properly escaped as a string literal.
I found a solution to my problem, even though I don't know why it works.
I leave it here in the hope it might be useful to someone else, or that someone more knowledgeable than me will be able to explain why it works now.
In short, setting the variable as a psql parameter did the trick:
psql -U postgres db -v script="$(cat $scriptName.js)"<<EOF
UPDATE table SET scriptColumn=:'script' WHERE nameColumn='$scriptName'
EOF
Not sure how this differs from
psql -U postgres db <<EOF
\set script "$(cat $scriptName.js)"
UPDATE table SET scriptColumn=:'script' WHERE nameColumn='$scriptName'
EOF
which I tried previously and returns the following error:
unterminated quoted string
ERROR: syntax error at or near "//"
LINE 1: // dummy text blahblah
Thanks to everybody who helped!

Hadoop Hive: Generate Table Name and Attribute Name using Bash script

In our environment we do not have access to the Hive metastore to query it directly.
I have a requirement to dynamically generate tablename, columnname pairs for a set of tables.
I was trying to achieve this by running "describe extended $tablename" into a file for all tables and picking up the tablename and column name pairs from the file.
Is there any easier way this can be done?
The desired output is like
table1|col1
table1|col2
table1|col3
table2|col1
table2|col2
table3|col1
This script will print the columns in the desired format for a single table. awk parses the lines coming from the describe command, takes only the column name, concatenates it with "|" and the table_name variable, and prints each string with \n as the delimiter between them.
#!/bin/bash
#Set table name here
TABLE_NAME=your_schema.your_table
TABLE_COLUMNS=$(hive -S -e "set hive.cli.print.header=false; describe ${TABLE_NAME};" | awk -v table_name="${TABLE_NAME}" -F " " 'f&&!NF{exit}{f=1}f{printf c table_name "|" toupper($1)}{c="\n"}')
echo "$TABLE_COLUMNS"
You can easily modify it to generate output for all tables, using the show tables command for example.
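A rough sketch of that extension (untested; assumes hive -S -e "show tables" prints one table name per line, and your_schema is a placeholder):
#!/bin/bash
SCHEMA=your_schema
# Reuse the same awk extraction for every table reported by "show tables"
for TABLE_NAME in $(hive -S -e "use ${SCHEMA}; show tables;"); do
    hive -S -e "set hive.cli.print.header=false; describe ${SCHEMA}.${TABLE_NAME};" \
        | awk -v table_name="${TABLE_NAME}" -F " " 'f&&!NF{exit}{f=1}f{printf c table_name "|" toupper($1)}{c="\n"}'
    echo
done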
The easier way is to access the metastore database directly.

How can I escape sqlite3 query parameters in bash?

I have a script that boils down to this right now:
#!/bin/bash
SEARCH_PARAM="$1"
SQLITE3_DB="$2"
# Don't inject me please :(
sqlite3 "$SQLITE3_DB" "SELECT foo FROM Bar WHERE bundleId='$SEARCH_PARAM';"
A glaring problem is that the $SEARCH_PARAM value is very vulnerable to SQL injection. Can I fix that from the bash script or do I need to drop in another scripting language, like Python, to get access to query parameters?
"How can I escape characters in SQLite via bash shell?" is similar, but it has fixed string arguments.
In SQL strings, the only character that needs escaping is the single quote, which must be doubled.
This can be done by using pattern substitution in the parameter expansion:
sqlite3 "..." "... bundleId = '${SEARCH_PARAM//\'/\'\'}';"
(Non-standard SQL implementations like MySQL might have additional characters that need escaping.)
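Applied to the asker's script, only the WHERE clause changes (untested):
#!/bin/bash
SEARCH_PARAM="$1"
SQLITE3_DB="$2"
# Doubling every single quote keeps the value confined to the string literal
sqlite3 "$SQLITE3_DB" "SELECT foo FROM Bar WHERE bundleId='${SEARCH_PARAM//\'/\'\'}';"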
