I need to drop a MySQL database directly from a script.
I created the file ~/.my.cnf and chmod'ed it to 600:
[client]
user = "**************"
password = "*********"
safe-updates
My script includes this:
CMD4="echo 'DROP DATABASE db_test;' | mysql"
curl -fs -- "$URL" | grep -q -- "$WORD1" && $CMD4
I can't execute the command; I just get this output:
'DROP DATABASE db_test;' | mysql
The database is not dropped.
What's wrong with it?
Thanks
I think you should enclose the statement below in backticks (``) instead of double quotes ("") where you initialize the value of CMD4:
echo 'DROP DATABASE db_test;' | mysql
You can eliminate the pipe using mysql -Bse 'expression'. So in your case, with the correct ~/.my.cnf setting you can simply do:
mysql -Bse "drop database db_test"
You also have the full range of mysql options available, e.g. mysql -h somehost -u someuser -ppass -Bse "expression"
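Combining that with the check from the question, a minimal sketch (reusing the question's $URL and $WORD1 variables and the ~/.my.cnf credentials) could be:
# Run the DROP only if the page fetched from $URL contains $WORD1;
# credentials come from ~/.my.cnf as in the question.
curl -fs -- "$URL" | grep -q -- "$WORD1" && mysql -Bse 'DROP DATABASE db_test;'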
Please don't use a variable to store code. Use a function instead:
function dropdb {
echo 'DROP DATABASE db_test;' | mysql # aren't you missing some arguments here?
# like `-u username -p db`
}
curl -fs -- "$URL" | grep -q -- "$WORD1" && dropdb
I want to export a random entry of my database into a file with the command
SELECT * FROM my_table ORDER BY RANDOM() LIMIT 1 \g /path/file;
This query works if I enter it in my db terminal, but when I use it in a bash script I get the error: syntax error at or near "\g"
My bash script looks like this:
PGPASSWORD=*** psql -U user -d db_name -h localhost -p port -t -c "SELECT * FROM my_table ORDER BY RANDOM() LIMIT 1 \g /path/file"
Bash is interpreting the string and trying to interpolate it. Probably, escaping the backslash will solve your problem.
PGPASSWORD=*** psql -U user -d db_name -h localhost -p port -t -c "SELECT * FROM my_table ORDER BY RANDOM() LIMIT 1 \\g /path/file"
A SQL statement terminated by \g is not supported by the -c command switch. Per the documentation of -c:
-c command
...
command must be either a command string that is completely parsable by the server (i.e., it contains no psql-specific features), or a single backslash command. Thus you cannot mix SQL and psql meta-commands with this option.
To redirect the results to a file, there are several options:
shell redirection: psql [other options] -Atc 'SELECT...' >/path/to/data.txt
-A is to switch to unaligned mode (no space fillers to align columns).
put the SQL part in a heredoc instead of on the command line:
psql [options] <<EOF
SELECT ... \g /path/to/file
EOF
This form has the advantage that multiline statements or multiple statements are supported directly.
\copy of the query. Be aware that COPY to a FILE is different: it creates the file on the server, with the permissions of the postgres user, and requires being a database superuser. COPY TO STDOUT works too, but is no better than a plain SELECT as far as redirection is concerned.
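For that third option, a sketch (untested, reusing the connection options from the question) could look like:
# \copy runs client-side, so the file is written with your own permissions
PGPASSWORD=*** psql -U user -d db_name -h localhost -p port -c "\copy (SELECT * FROM my_table ORDER BY RANDOM() LIMIT 1) TO '/path/file'"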
I found a solution for my script, and now it works.
#!/bin/bash
RANDOM_NUMBER=0
while true
do
for i in `seq 1`
do
RANDOM_NUMBER=$(($RANDOM % 100000))
echo $RANDOM_NUMBER
PGPASSWORD=*** psql -U user_name -d db_name -h localhost -p PORT -c "INSERT INTO numbers (number) VALUES ('$RANDOM_NUMBER');"
done
sleep 10
for i in `seq 1`
do
PGPASSWORD=*** psql -U user_name -d db_name -h localhost -p PORT -c "DELETE FROM numbers WHERE id = (SELECT id FROM numbers ORDER BY RANDOM() LIMIT 1);"
done
done
What should I do to make it work?
#!/bin/bash
TABLENAMES="user_stats"
ssh -t railsapps@xxx.xxx.xxx.xx -p xxx bash -c "'
for TABLENAME in $TABLENAMES
do
psql -d mydb -P format=unaligned -P tuples_only -P fieldsep=\, -c "SELECT * FROM $TABLENAME" > /tmp/$TABLENAME
done
'"
General problem: how to periodically dump the database tables to a local machine from a psql database in a single bash script run on Mac OS X?
Firstly, you should test your SQL and bash scripts remotely (do SSH interactively).
I think your problem is caused by a bad mix of quotes and double quotes. I think the star (*) and $TABLENAME are expanded before the SSH call, which is too early. Try putting a backslash before the $ signs.
You should use the verbose or debug option to help understand what is really executed:
ssh -t railsapps@xxx.xxx.xxx.xx -p xxx bash -vxc "'
for TABLENAME in \$TABLENAMES; do
psql -d mydb -P format=unaligned -P tuples_only -P fieldsep=\, -c "SELECT \* FROM \$TABLENAME" > /tmp/\$TABLENAME
done
'"
Hello to professionals!
There was a good and simple script idea to make a mysqldump of every database, taken from
dump all mysql tables into separate files automagically?
author - https://stackoverflow.com/users/1274838/elias-torres-arroyo
with the script as follows:
#!/bin/bash
# Optional variables for a backup script
MYSQL_USER="root"
MYSQL_PASS="PASSWORD"
BACKUP_DIR="/backup/01sql/";
# Get the database list, exclude information_schema
for db in $(mysql -B -s -u $MYSQL_USER --password=$MYSQL_PASS -e 'show databases' | grep -v information_schema)
do
# dump each database in a separate file
mysqldump -u $MYSQL_USER --password=$MYSQL_PASS "$db" | gzip > "$BACKUP_DIR/$db.sql.gz"
done
but the problem is that this script does not "understand" arguments like
--add-drop-database
to perform
mysqldump -u $MYSQL_USER --password=$MYSQL_PASS "$db" --add-drop-database | gzip > "$BACKUP_DIR/$db.sql.gz"
Is there any way to force this script to understand the additional arguments listed under
mysqldump --help
because all my tests show it doesn't.
Thank you in advance for any hint to try!
--add-drop-database works only with --all-databases or --databases.
Please see the reference in the docs.
So in your case the mysqldump utility ignores the parameter because you are dumping a single database.
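So a sketch of the fix for the loop body would be to add --databases before the database name, which makes --add-drop-database take effect even for a single schema:
# --databases makes mysqldump emit CREATE DATABASE/USE statements,
# so --add-drop-database is honoured even for a single database
mysqldump -u $MYSQL_USER --password=$MYSQL_PASS --add-drop-database --databases "$db" | gzip > "$BACKUP_DIR/$db.sql.gz"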
I'm getting a 7: Syntax error: "(" unexpected error while running the code below on Ubuntu, but it runs on CentOS without any issues.
#!/bin/sh
#
TODATE=`date '+%Y-%b-%d'`
#
# Backup Creation for Databases
#
databases=(`echo 'show databases;' | mysql -u root -ppaSSword | grep -v ^Database$`)
for DB in "${databases[#]}"; do
mysqldump --force --opt --user=root --password=paSSword $DB | gzip > /mnt/Backup/DB/${DB}_${TODATE}.sql.gz
done
#
Please help me to solve this.
I can't figure out the problem, but I'm using the code below for backup. It works fine with Ubuntu:
#!/bin/bash
#
TODATE=`date '+%Y-%b-%d'`
databases="$(mysql -u root -ppaSSword -Bse 'show databases')"
for DB in $databases
do
mysqldump -u root -psqlMYadmin $DB | gzip > /mnt/Backup/DB/${DB}_${TODATE}.sql.gz
done
You can redirect the 'show databases' output to a dump.txt file; once that's done, try:
#!/bin/bash
da=$(date +"%d-%m-%y")
for db in `cat dump.txt` ; do mysqldump --force --opt --user=root --password=paSSword $db | gzip > /path/to/backup/${db}_"$da".sql.gz ; done
You need to escape the last '$' on the line databases= ...
There's only one ( in the script and you have the shebang line #!/bin/sh. My best guess is that the program /bin/sh does not recognize array assignment, whereas /bin/bash would.
Change your shebang to #!/bin/bash.
You'd probably do better to use $(...) in place of the backticks. Also, as Sami Laine points out in his answer, it would be better if you quoted the regex to the grep command (though it is not the cause of your problem):
databases=( $(echo 'show databases;' | mysql -u root -ppaSSword | grep -v '^Database$') )
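Putting both fixes together (the bash shebang plus the quoted regex), the whole script might look like this untested sketch:
#!/bin/bash
#
TODATE=$(date '+%Y-%b-%d')
#
# Backup Creation for Databases
#
databases=( $(echo 'show databases;' | mysql -u root -ppaSSword | grep -v '^Database$') )
for DB in "${databases[@]}"; do
    mysqldump --force --opt --user=root --password=paSSword "$DB" | gzip > "/mnt/Backup/DB/${DB}_${TODATE}.sql.gz"
done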
I'm currently using mysqldump to back up my dev machine and servers.
There is one project I just started, however, that has a HUUUUUGE database that I don't really need backed up, and it'll be a big problem to add it to the rest of the backup cycle.
I'm currently doing this:
"c:\Program Files\mysql\MySQL Server 5.1\bin\mysqldump" -u root -pxxxxxx --all-databases > g:\backups\MySQL\mysqlbackup.sql
Is it possible to somehow specify "except this database(s)"?
I wouldn't like to have to specify the list of DBs manually, since that would mean that I'd have to remember updating my backup batch file every time I create a new DB, and I know that's not gonna happen.
EDIT: As you probably guessed from my command line above, I'm doing this on Windows, so I can't do any kind of fancy bash stuff, only wimpy .bat things.
Alternatively, if you have other ideas to solve this same issue, they are more than welcome, of course!
mysql ... -N -e "show databases like '%';" | grep -v -F databaseidontwant | xargs mysqldump ... --databases > out.sql
echo 'show databases;' | mysql -uroot -proot | grep -v ^Database$ | grep -v ^information_schema$ | grep -v ^mysql$ | grep -v -F db1 | xargs mysqldump -uroot -proot --databases > all.sql
dumps all databases except information_schema, mysql, and db1.
Or if you'd like to review the list before dumping:
echo 'show databases;' | mysql -uroot -proot > databases.txt
edit databases.txt and remove any you don't want to dump
cat databases.txt | xargs mysqldump -uroot -proot --databases > all.sql
What about
--ignore-table=db_name.tbl_name
Do not dump the given table, which must be specified using both the database and table names. To ignore multiple tables, use this option multiple times.
Maybe you'll need to specify a few to completely ignore the big database.
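For example, a sketch of that on the question's command line (big_db.big_table1 and big_db.big_table2 are placeholder names for the tables of the huge database):
REM big_db.big_table1 / big_db.big_table2 are placeholders for the real table names
"c:\Program Files\mysql\MySQL Server 5.1\bin\mysqldump" -u root -pxxxxxx --all-databases --ignore-table=big_db.big_table1 --ignore-table=big_db.big_table2 > g:\backups\MySQL\mysqlbackup.sql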
I created the following one-line solution avoiding multiple grep commands.
mysql -e "show databases;" | grep -Ev "Database|DatabaseToExclude1|DatabaseToExclude2" | xargs mysqldump --databases >mysql_dump_filename.sql
The -E in grep enables extended regex support, which allows providing different matches separated by the pipe symbol "|". More options can be added to the mysqldump command, but only before the "--databases" parameter.
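For instance, a sketch with a couple of extra options placed before --databases (--single-transaction and --routines are just example options, not required):
mysql -e "show databases;" | grep -Ev "Database|DatabaseToExclude1|DatabaseToExclude2" | xargs mysqldump --single-transaction --routines --databases >mysql_dump_filename.sql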
A little side note: I like to define the filename for the dump like this ...
... >mysql_dump_$(hostname)_$(date +%Y-%m-%d_%H-%M).sql
This will automatically add the host name, date and time to the filename. :)
Seeing as you're using Windows, you should have PowerShell available.
Here is a short PowerShell script to get a list of all databases, remove unwanted ones from the list, and then use mysqldump to back up the others.
$MySQLPath = "."
$Hostname = "localhost"
$Username = "root"
$Password = ""
# Get list of Databases
$Databases = [System.Collections.Generic.List[String]] (
& $MySQLPath\mysql.exe -h"$Hostname" -u"$Username" -p"$Password" -B -N -e"show databases;"
)
# Remove databases from list we don't want
[void]$Databases.Remove("information_schema")
[void]$Databases.Remove("mysql")
# Dump database to .SQL file
& $MySQLPath\mysqldump.exe -h"$HostName" -u"$Username" -p"$Password" -B $($Databases) | Out-File "DBBackup.sql"
Create a backup user and only grant that user access to the databases that you want to backup.
You still need to remember to explicitly grant the privileges but that can be done in the database and doesn't require a file to be edited.
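A minimal sketch of that setup (the user name, password, and wanted_db are placeholders; SELECT, LOCK TABLES, and SHOW VIEW are assumed to be enough for plain table dumps):
REM 'backup' / 'backup_password' / wanted_db are placeholders
mysql -u root -pxxxxxx -e "CREATE USER 'backup'@'localhost' IDENTIFIED BY 'backup_password'; GRANT SELECT, LOCK TABLES, SHOW VIEW ON wanted_db.* TO 'backup'@'localhost';"
"c:\Program Files\mysql\MySQL Server 5.1\bin\mysqldump" -u backup -pbackup_password --all-databases > g:\backups\MySQL\mysqlbackup.sql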
It took me a lot of finagling to come up with this but I've used it for a few years now and it works well...
mysql -hServerName -uUserName -pPassword -e "SELECT CONCAT('\nmysqldump -hServerName -uUserName -pPassword --set-gtid-purged=OFF --max_allowed_packet=2048M --single-transaction --add-drop-database --opt --routines --databases ',DBList,' | mysql -hServerName2 -uUserName2 -pPAssword2 ' ) AS Cmd FROM (SELECT GROUP_CONCAT(schema_name SEPARATOR ' ') AS DBList FROM information_schema.SCHEMATA WHERE LEFT(schema_name, 8) <> 'cclegacy' AND schema_name NOT IN ('mysql','information_schema','performance_schema','test','external','othertoskip')) a \G" | cmd
Instead of the pipe over to mysql, where I'm moving from ServerName to ServerName2, you could redirect to a file, but this allows me to tailor what I move. Sometimes I even OR the list so I can say LIKE 'Prefix%' etc.
You can use this one for production. It excludes performance_schema, information_schema, mysql, and sys; modify it for your needs.
MYSQL_USER=
MYSQL_PASS=
MYSQL_HOST=
MYSQL_CONN="-u${MYSQL_USER} -p${MYSQL_PASS} -h${MYSQL_HOST}"
MYSQLDUMP_OPTIONS="--routines --triggers --single-transaction"
DBLIST=`mysql -s --host=$MYSQL_HOST --user=$MYSQL_USER --password=$MYSQL_PASS \
--execute="SHOW DATABASES;" | grep -v \
'performance_schema\|information_schema\|mysql\|sys' | awk '{printf("\"%s\" ",$0)}'`
mysqldump ${MYSQL_CONN} ${MYSQLDUMP_OPTIONS} --databases ${DBLIST} | gzip >all-dbs.sql.gz