Unable to do mysqldump to AWS RDS - amazon-ec2

I'm trying to export an existing MySQL DB from an AWS EC2 instance to RDS using mysqldump. Here's my syntax:
mysqldump wordpress-db | mysql --host= --port=3306 --user= --password wordpress-db
...where wordpress-db is an existing MySQL DB on my EC2 instance and wordpress-db is also the name of the RDS DB (the endpoint begins with "wordpress-db").
The error message is:
ERROR 1049 (42000): Unknown database 'wordpress-db'
Both the EC2 instance and RDS DB are in the same region...
I'm a bit new to RDS so there's probably something obvious I'm doing wrong. Any ideas?
Thanks,

Try this: mysqldump -h<Hostname> --port=3306 -u<username> -p wordpress-db > wordpress-db.sql

The form of invocation you're using requires that the database already exist on the target server.
A better approach is to add --databases immediately before the database name on the mysqldump side of the pipe. Then remove the name of the database from the mysql side of the pipe.
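Spelled out as a command template, that suggestion looks like the sketch below. Everything here is a placeholder (local user, RDS endpoint, RDS user), and the script only prints the assembled pipeline so you can review it before running it for real:

```shell
# Placeholders throughout: adjust the user names and RDS endpoint.
SRC_DB="wordpress-db"
RDS_HOST="your-rds-endpoint.rds.amazonaws.com"
# --databases makes mysqldump emit CREATE DATABASE and USE statements,
# so the target server no longer needs the database to exist beforehand.
PIPELINE="mysqldump -u local_user -p --databases $SRC_DB | mysql --host=$RDS_HOST --port=3306 --user=rds_user -p"
echo "$PIPELINE"
```

Note that each -p will prompt interactively for the respective password.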

Excellent answers! This saved some of the few hairs I had left on my head. Here's the final syntax I used:
mysqldump -u username -p --databases wordpress-db | mysql --host=RDS-endpoint --port=3306 --user=username --password=password
Thanks!

Related

Mamp: import large database

I'm importing a large Drupal database to my Mac using MAMP and I keep getting errors; phpMyAdmin can't import the database. Can anyone help me?
Importing a large database through phpMyAdmin is not recommended (it will typically hang forever). It's much more efficient to use the command line through the Terminal.
First, make sure you can connect to your database from the command line with one of the following commands:
1/ If your root password isn't set:
mysql -u root
2/ or if you have a root password:
mysql -u root -p
3/ or if you have a specific username and password:
mysql -u username -p
If one of those commands executes correctly, you're good to go to the next step.
Note that you can exit the mysql interactive session at any time by entering:
exit
List your databases:
SHOW databases;
If you don't have your database listed here, you will need to create it:
CREATE DATABASE database_name CHARACTER SET utf8 COLLATE utf8_general_ci;
Then select your database:
USE database_name;
Finally, import the data from your sql file:
SOURCE path/to/your/file.sql;
Second method (this assumes your database is already created):
mysql -u username -p database_name < path/to/your/file.sql
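For repeat imports, the interactive steps above can be collapsed into a small script. This is a sketch with hypothetical names (drupal_db, the dump path); it prints the two commands rather than executing them so you can check them first. MAMP users may need the full path to MAMP's bundled mysql binary:

```shell
# Hypothetical database name and dump path; adjust to your setup.
DB="drupal_db"
DUMP="path/to/your/file.sql"
# Equivalents of the CREATE DATABASE and SOURCE steps above:
CREATE_CMD="mysql -u root -p -e \"CREATE DATABASE IF NOT EXISTS $DB CHARACTER SET utf8 COLLATE utf8_general_ci;\""
IMPORT_CMD="mysql -u root -p $DB < $DUMP"
echo "$CREATE_CMD"
echo "$IMPORT_CMD"
```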

Mongoexport with Query using shell script

I am calling mongoexport from a shell script but it keeps failing. My script is:
mongo localhost:27038/admin -u test -p mongo123 < mongoexport.js
and my mongoexport.js file is:
use db1
mongoexport --type=csv --fields _id -q '{"_id": "323549991" , "cat" : "ABC"}' --out report.csv
But every time I run it, it fails with the error below:
switched to db db1
2018-01-10T17:36:15.495+0000 E QUERY [thread1] SyntaxError: missing ; before statement #(shell):1:14
bye
Now I am not sure where exactly I am messing up the syntax.
Regards.
It looks like you are connecting to the mongo shell. You don't need to do that in order to execute mongoexport.
mongoexport is a standalone tool: you run it from your system shell against the host, not from inside mongo. Take a look at the official documentation:
This data resides on the MongoDB instance located on the host
mongodb1.example.net running on port 37017, which requires the
username user and the password pass.
mongoexport --host mongodb1.example.net --port 37017 --username user
--password "pass" --collection contacts --db marketing --out mdb1-examplenet.json
In your case it should look like this (untested):
mongoexport --host localhost --port 27038 --username test --password "mongo123" --db admin --collection db1 --type=csv --fields _id -q '{"_id": "323549991" , "cat" : "ABC"}' --out report.csv
I assumed your database is called admin and your collection db1, if not replace them accordingly.
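Wrapped back into a shell script (the original goal), with the connection details pulled into variables. This is an untested sketch using the same assumed database/collection names; it prints the assembled command so you can review it before running (swap the echo for the command itself once it looks right):

```shell
#!/bin/sh
# Assumed values from the question: adjust if your database or
# collection names differ.
HOST=localhost
PORT=27038
USER=test
PASS=mongo123
DB=admin
COLLECTION=db1
QUERY='{"_id": "323549991" , "cat" : "ABC"}'

# Assemble the mongoexport invocation, then print it for review.
CMD="mongoexport --host $HOST --port $PORT --username $USER --password $PASS"
CMD="$CMD --db $DB --collection $COLLECTION --type=csv --fields _id -q '$QUERY' --out report.csv"
echo "$CMD"
```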

How to easily DB dump to heroku's DB

I have a local DB full of data that I want to push to Heroku's DB in order to populate it.
What are the best ways/tools to accomplish this?
Thank you!
You could write a script that ports your current DB into a seed file, then run the seed file using heroku run rake db:seed
First dump your local database to a dump file:
PGPASSWORD=mypassword pg_dump -Fc --no-acl --no-owner -h localhost -U myuser mydb > mydb.dump
Replace myuser, mypassword, and mydb with your username, password and database name. If there is no password set, leave out the PGPASSWORD=mypassword part.
Next, you must place the mydb.dump file in a publicly accessible location, so upload it to an FTP server or Amazon S3 bucket (for example).
Then on your local machine run:
heroku pg:backups restore 'https://s3.amazonaws.com/me/mydb.dump' HEROKU_POSTGRESQL_COLOR_URL -a appname
Replace HEROKU_POSTGRESQL_COLOR_URL with the URL for your app's database. If you don't know the URL, you can find it with heroku config | grep HEROKU_POSTGRES. Replace https://s3.amazonaws.com/me/mydb.dump with the URL where you uploaded the dump file. Replace appname with name of your app as defined in Heroku.
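If you use S3, the dump URL has to be reachable by Heroku; a pre-signed URL lets you avoid making the bucket public. Here is a sketch of the three steps with placeholder names (bucket, app name), printed rather than executed so you can review them:

```shell
# Placeholders: bucket name, app name, and the pre-signed URL that
# the presign step will print out for you.
DUMP_FILE="mydb.dump"
BUCKET="my-bucket"
APP="appname"
UPLOAD_CMD="aws s3 cp $DUMP_FILE s3://$BUCKET/$DUMP_FILE"
PRESIGN_CMD="aws s3 presign s3://$BUCKET/$DUMP_FILE --expires-in 3600"
RESTORE_CMD="heroku pg:backups restore '<presigned-url>' DATABASE_URL -a $APP"
echo "$UPLOAD_CMD"
echo "$PRESIGN_CMD"
echo "$RESTORE_CMD"
```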

Forking/Copying Heroku ClearDB to development environment

I'm setting up a development environment on Heroku for my app and I'm having an issue copying over the DB. My current DB is ClearDB and I usually connect to it via Workbench. However, if I try to export the DB and import it into my staging environment, I get a credential issue.
I found this post on SO with regards to this issue:
Moving/copying one remote database to another remote database
And the solution is here:
mysqldump --single-transaction -u (old_database_username) -p -h (old_database_host) (database_name) | mysql -h (new_host) -u (new_user) -p -D (new_database)
But even if I run this, I'm still running into an issue with credentials: the command wants both passwords at the same time, for the old DB and the new DB, so it keeps failing.
I tried to inline the -p but it still asks for a password. What am I missing?
Okay, that was a silly mistake. The reason I was having issues is that after options such as -u or -h there is a space, while for the password option there is no space, i.e.:
mysqldump --single-transaction -u old_database_username -pPasswordOld -h old_database_host database_name | mysql -h new_host -u new_user -pPasswordNew -D new_database
Once corrected, everything was done.
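A variant worth knowing: inlining passwords with -p makes them visible in the process list. Each client can instead read its credentials from its own option file via --defaults-extra-file (which must be the first option on each side of the pipe). Here is a sketch with the same placeholder credentials; it writes the option files and prints the pipeline instead of executing it:

```shell
# Placeholder credentials throughout; same names as the command above.
cat > old_db.cnf <<'EOF'
[client]
host=old_database_host
user=old_database_username
password=PasswordOld
EOF
cat > new_db.cnf <<'EOF'
[client]
host=new_host
user=new_user
password=PasswordNew
EOF
chmod 600 old_db.cnf new_db.cnf  # option files contain secrets
# Assemble the pipeline and print it for review before running.
PIPELINE="mysqldump --defaults-extra-file=old_db.cnf --single-transaction database_name | mysql --defaults-extra-file=new_db.cnf -D new_database"
echo "$PIPELINE"
```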

mysqldump cannot connect using socket

This issue has been racking my brain for a few hours. I have been trying to use mysqldump to dump a database, using:
mysqldump --protocol=socket -S /var/run/mysqld/mysqld.sock database
However, I keep getting:
1045: Access denied for user 'root'@'localhost' (using password: NO) when trying to connect
I am on localhost and running under root (sudo su).
root@localhost is allowed in the mysql user table.
I can run the mysql client and view all of the databases, but mysqldump will not work.
I do not know the root password (system generated).
I have tried adding the socket to the my.conf like so and restarting the mysql server:
[mysqldump]
socket = /var/run/mysqld/mysqld.sock
Any help would be appreciated!
Even though you are connecting via the socket, you must still specify the user, root.
If root@localhost has no password then do this:
mysqldump -uroot --protocol=socket -S /var/run/mysqld/mysqld.sock database
If root@localhost has a password then do this:
mysqldump -uroot -p --protocol=socket -S /var/run/mysqld/mysqld.sock database
If running
mysql
lets you log in without specifying -uroot, try not specifying the socket either:
mysqldump database
I just noticed that the socket you specified for mysqldump is
[mysqldump]
socket = /var/run/mysqld/mysqld.sock
You need to make sure the socket is defined under the [mysqld] section of my.cnf as well
If this does not exist
[mysqld]
socket = /var/run/mysqld/mysqld.sock
then run this query
SHOW VARIABLES LIKE 'socket';
and make sure of the socket file's name and path.
You could have your system DBA add a custom user for you:
GRANT ALL PRIVILEGES ON *.* TO tyler@localhost;
Then, you can run
mysqldump -utyler --protocol=socket -S /var/run/mysqld/mysqld.sock database
This is not secure; tyler should have a password. So, run this:
SET SQL_LOG_BIN=0;
GRANT ALL PRIVILEGES ON *.* TO tyler@localhost IDENTIFIED BY 'tylerspassword';
then you can do
mysqldump -utyler -p --protocol=socket -S /var/run/mysqld/mysqld.sock database
Give it a try!
I found the solution! The socket does not hold the credentials itself. They are stored in the /root/.my.cnf configuration file instead. Mine only had the username and password for the mysql command. I needed to add [mysqldump] to it as well. Here is what my /root/.my.cnf file looks like now:
[mysql]
user=root
password=myawesomepass
[mysqldump]
user=root
password=myawesomepass
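One gotcha with option files: the documented key is password, not pass (some client versions accept unambiguous prefixes, but spelling the key out avoids surprises). A quick sanity check you can run over a copy of your file; the sample below mirrors the file above with a placeholder password:

```shell
# Write a sample option file, then verify no credential line uses the
# abbreviated key 'pass=' instead of the documented 'password='.
cat > /tmp/my.cnf.sample <<'EOF'
[mysql]
user=root
password=myawesomepass
[mysqldump]
user=root
password=myawesomepass
EOF
if grep -qE '^pass=' /tmp/my.cnf.sample; then
  echo "abbreviated key found: prefer password= over pass="
else
  echo "option file keys look OK"
fi
```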
