Using heroku pg:backups:restore to Import to Heroku Postgres

I am trying to copy a local PostgreSQL database to Heroku per this article.
Here is what I have done:
1. Make a dump file
pg_dump -Fc --no-acl --no-owner -h localhost -U postgres mydb > mydb.dump
2. Upload the dump file to the my-bucket-name/db-backup folder on AWS S3:
aws s3 cp mydb.dump s3://my-bucket-name/db-backup/mydb.dump
3. Generate a signed URL:
aws s3 presign s3://my-bucket-name/db-backup/mydb.dump --region us-east-2
4. Verify that the signed URL is accessible.
Navigate to the presigned URL in an incognito tab of a browser. It works.
5. Restore to Heroku using the generated signed URL
I am using double quotes around GENERATED_URL because I'm on Windows:
heroku pg:backups:restore --app my-app-name --confirm my-app-name "GENERATED_URL"
For example:
heroku pg:backups:restore --app my-app-name --confirm my-app-name "https://s3.us-east-2.amazonaws.com/s3.console.aws.amazon.com/s3/buckets/my-bucket-name/db-backup/mydb.dump?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIABCDVKE2GXCY3YXL7V%2F20200934%2Fus-east-2%2Fs3%2Faws4_request&X-Amz-Date=20200924T164718Z&X-Amz-Expires=3600&X-Amz-SignedHeaders=host&X-Amz-Signature=fb2f51c0d7fbe1234e3740cf23c37f003575d968a1e4961684a47ac627fbae2e"
THE RESULT
I get the following errors:
Restoring... !
! An error occurred and the backup did not finish.
!
! Could not initialize transfer
!
! Run heroku pg:backups:info r021 for more details.
'X-Amz-Credential' is not recognized as an internal or external command,
operable program or batch file.
'X-Amz-Date' is not recognized as an internal or external command,
operable program or batch file.
'X-Amz-Expires' is not recognized as an internal or external command,
operable program or batch file.
'X-Amz-SignedHeaders' is not recognized as an internal or external command,
operable program or batch file.
'X-Amz-Signature' is not recognized as an internal or external command,
operable program or batch file.
I've found others with similar problems, but no solutions. Thanks in advance to anyone who can help.

This is resolved. There were two issues.
1. PowerShell wasn't properly escaping characters (the & characters in the URL), so I switched to CMD.
2. The dump file was invalid. This line of code produced an invalid dump file:
pg_dump -Fc --no-acl --no-owner -h localhost -U postgres mydb > mydb.dump
Instead, I needed to use the following syntax:
pg_dump -Fc --no-acl --no-owner -h localhost -U postgres -d mydb -f mydb.dump
After making that change, all worked smoothly.
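For anyone following along, the full working sequence in CMD looked roughly like this (bucket, region, app name, and GENERATED_URL are the placeholders from the question, not real values):
:: write the dump with -f so pg_dump creates the file itself
:: (shell redirection with > can re-encode the binary custom-format dump, especially in PowerShell)
pg_dump -Fc --no-acl --no-owner -h localhost -U postgres -d mydb -f mydb.dump
:: upload and sign as before
aws s3 cp mydb.dump s3://my-bucket-name/db-backup/mydb.dump
aws s3 presign s3://my-bucket-name/db-backup/mydb.dump --region us-east-2
:: restore, keeping the signed URL in double quotes so CMD doesn't split on &
heroku pg:backups:restore --app my-app-name --confirm my-app-name "GENERATED_URL"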

For what it's worth, I had the same issue, and my solution was to copy the S3 object URL, which is formatted as https://s3.amazonaws.com/<bucket_name>/<dump_file>.dump. For some reason the pre-signed URL approach did not work, but the public URL did.

Related

How to use PGPASS file in Powershell to avoid password prompt?

I had to automate my Postgres database backup. As instructed by my software vendor, I am trying to use pg_dump.exe (see below) to take a backup, but it prompts me for a password.
.\pg_dump.exe -h localhost -p 4432 -U postgres -v -b -F t -f "C:\Backup\Backup.tar" Repo
So I googled and found that, per https://www.postgresql.org/docs/9.6/libpq-pgpass.html, I can create a pgpass.conf file at C:\Users\User1\AppData\Roaming\postgresql\pgpass.conf, which I did.
Then I tried to point an environment variable at the pgpass.conf file before executing my pg_dump command, but it is not working: I still get prompted for a password. This is the content of pgpass.conf: *:*:*:postgres:password
Below is the code I am trying in PowerShell,
$Env:PGPASSFILE="C:\Users\User1\AppData\Roaming\postgresql\pgpass.conf"
cd "C:\Program Files\Qlik\Sense\Repository\PostgreSQL\9.6\bin"
.\pg_dump.exe -h localhost -p 4432 -U postgres -v -b -F t -f "C:\Backup\Backup.tar" Repo
Why am I still being asked for a password?
When I type $Env:AppData I get the response "C:\Users\User1\AppData\Roaming".
There is guidance everywhere on how to use this in UNIX or the command prompt, but not in PowerShell. Any help is appreciated. Also, if you could point me to how to secure this password file, that would be great.
With a password prompt I cannot automate this with Windows Task Scheduler.
I suspect you have found a suitable solution by now; however, as a quick (and not secure) workaround via the command prompt, you can use the PGPASSWORD variable to hold the password and then run the backup script.
A sample might be something like:
SET PGPASSWORD=password
cd "C:\Program Files\Qlik\Sense\Repository\PostgreSQL\9.6\bin" pg_dump.exe -h localhost -p 4432 -U postgres -b -F t -f "d:\qs_backup\QSR_backup.tar" QSR
Rod
I have yet to get the damned thing to work, but I did find this:
-w
--no-password
Never issue a password prompt. If the server requires password authentication and a password is not available by other means such as a .pgpass file, the connection attempt will fail. This option can be useful in batch jobs and scripts where no user is present to enter a password.
I don't see a -w parameter in your call to pg_dump
I used the pg_hba file to allow "trust" connections. This is a riskier method, but I had to get things done ASAP. Thank you for your time and effort.
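For reference, a "trust" entry in pg_hba.conf looks roughly like this (a sketch for local connections only; trust skips password checks entirely, so scope it as narrowly as you can):
# TYPE  DATABASE  USER      ADDRESS       METHOD
host    all       postgres  127.0.0.1/32  trust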

"'resource' is not recognized as an internal or external command" while POSTing using Web Service API in SonarQube 5.1.2

I am using SonarQube 5.1.2 on Windows 7 Professional with the Web Service API over cURL 7.32.0 (x86_64-pc-win32). I want to upload sonar.exclusions and a few more such properties for a specific project using POST.
I use curl -u admin:admin -X POST http://localhost:9512/api/properties/?id=sonar.exclusions -v -T "D:\sonar-exclusions.xml" and I am able to POST it as global sonar.exclusions.
Whereas if I use resource to post it to a specific project with the command curl -u admin:admin -X POST http://localhost:9512/api/properties/?id=sonar.exclusions&resource=org.myProject:myProject -v -T "D:\sonar-exclusions.xml", I get the error: {"err_code":200,"err_msg":"property created"}'resource' is not recognized as an internal or external command, operable program or batch file
What's going wrong with the resource parameter here?
The problem is with the & in the URL; it's interpreted by your command prompt as: run this command:
curl -u admin:admin -X POST http://localhost:9512/api/properties/?id=sonar.exclusions
and then run this command:
resource=org.myProject:myProject -v -T "D:\sonar-exclusions.xml"
The first one returns {"err_code":200,"err_msg":"property created"} while the second one is bound to fail with:
'resource' is not recognized as an internal or external command, operable program or batch file
You should either escape the & or simply put the URL between "quotes".
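For example, either of these forms should work in the Windows command prompt (same URL and file as in the question):
:: quote the whole URL
curl -u admin:admin -X POST "http://localhost:9512/api/properties/?id=sonar.exclusions&resource=org.myProject:myProject" -v -T "D:\sonar-exclusions.xml"
:: or escape the & with ^
curl -u admin:admin -X POST http://localhost:9512/api/properties/?id=sonar.exclusions^&resource=org.myProject:myProject -v -T "D:\sonar-exclusions.xml"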

How to easily DB dump to heroku's DB

I have a local DB full of data that I want to push to Heroku's DB in order to populate it.
What is the best way (or tools) to accomplish this?
Thank you!
You could write a script that ports your current DB into a seed file, then run the seed file using heroku run rake db:seed
First dump your local database to a dump file:
PGPASSWORD=mypassword pg_dump -Fc --no-acl --no-owner -h localhost -U myuser mydb > mydb.dump
Replace myuser, mypassword, and mydb with your username, password and database name. If there is no password set, leave out the PGPASSWORD=mypassword part.
Next, you must place the mydb.dump file in a publicly accessible location, so upload it to an FTP server or Amazon S3 bucket (for example).
Then on your local machine run:
heroku pg:backups restore 'https://s3.amazonaws.com/me/mydb.dump' HEROKU_POSTGRESQL_COLOR_URL -a appname
Replace HEROKU_POSTGRESQL_COLOR_URL with the URL for your app's database. If you don't know the URL, you can find it with heroku config | grep HEROKU_POSTGRES. Replace https://s3.amazonaws.com/me/mydb.dump with the URL where you uploaded the dump file. Replace appname with the name of your app as defined in Heroku.
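A hypothetical session might look like this (the NAVY color name, dump URL, and app name are placeholders):
$ heroku config --app appname | grep HEROKU_POSTGRES
HEROKU_POSTGRESQL_NAVY_URL: postgres://...
$ heroku pg:backups restore 'https://s3.amazonaws.com/me/mydb.dump' HEROKU_POSTGRESQL_NAVY_URL -a appname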

heroku pg: pull not fetching tables from heroku database

I'm trying to pull a Heroku database to my local Windows computer using the Heroku bash command
heroku pg:pull HEROKU_POSTGRESQL_COLOR mydatabase --app appname
When I run the above command I get the following error:
'env' is not recognized as an internal or external command, operable program or batch file.!
The local database 'mydatabase' is created, but without any tables.
My Heroku app's database has a table in it, but it is not getting pulled to my local database.
Help me solve it.
A couple of things:
1. When there is an error such as "'env' is not recognized as an internal or external command, operable program or batch file", it means the system is trying to execute a command named env. This has nothing to do with setting up your environment variables.
env is not a command in Windows, but it is in Unix. I understand that you have a Windows machine, though. What you can do is run "git bash" (you could get it by itself, but it comes with Heroku's CLI).
This gives you a Unix-like environment where the env command is supported, and then you can run the actual heroku pg:pull command.
2. If that still doesn't work, there is a workaround that works without installing anything extra. This is based on a ticket I submitted to Heroku, so I'm just going to quote their response:
"The pg:push command is just a wrapper around pg_dump and pg_restore commands. Due to the bug you encountered, it sounds like we should go ahead and do things manually. Run these using cmd.exe (The Command Prompt application you first reported the bug). First grab the connection string from your heroku application config vars.
heroku config:get DATABASE_URL
Then you want to pick out the username / hostname / databasename parts from the connection string, i.e.: postgres://username:password@hostname:port/databasename. Use those variables in the following command and paste in the password when prompted for one. This will dump the contents of your heroku database to a local file.
pg_dump --verbose -F c -Z 0 -U username -h hostname -p port databasename > heroku.dump
Next you will load this file into your local database. One thing that the CLI does before running this command is to check and make sure the target database is empty, because running this against a database with real data is something you want to avoid so be careful with pg_restore. When running this manually you run the risk of mangling your data without the CLI check, so you may want to manually verify that the target database is empty first.
pg_restore --verbose --no-acl --no-owner -h localhost -p 5432 -d mydb2 < heroku.dump
I am sorry this is not a better experience, I hope this will help you make progress. We are in the process of rewriting our pg commands so that they work better on all platforms including windows, but there is no solid timeline for when this will be completed."
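To make the quoted steps concrete, here is a hypothetical run (every value below is a placeholder, not a real credential):
$ heroku config:get DATABASE_URL
postgres://u4abc:s3cretpass@ec2-12-34-56-78.compute-1.amazonaws.com:5432/d9xyz
# so username=u4abc, hostname=ec2-12-34-56-78.compute-1.amazonaws.com, port=5432, databasename=d9xyz
pg_dump --verbose -F c -Z 0 -U u4abc -h ec2-12-34-56-78.compute-1.amazonaws.com -p 5432 d9xyz > heroku.dump
pg_restore --verbose --no-acl --no-owner -h localhost -p 5432 -d mydb2 < heroku.dump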
To take a backup (a dump file) on Heroku, you first need the backups addon; install it with:
$ heroku addons:add pgbackups
Then running the commands below will give you a dump file named latest:
$ heroku pgbackups:capture
$ curl -o latest.dump `heroku pgbackups:url`
or
wget "`heroku pgbackups:url --app app-name`" -O backup.dump
Edited (after chatting with the user):
Problem: 'env' is not recognized as an internal or external command, operable program or batch file.
I suspected that the PATH entry pointing at a particular program was messed up. You can check the program itself in the WINDOWS\system32 folder.
OK, so how to edit it:
My Computer > Advanced > Environment Variables
Then choose PATH and click the Edit button

How can I download a file from Heroku bash?

I ran a Ruby script from Heroku bash that generates a CSV file on the server that I want to download. I tried moving it to the public folder to download, but that didn't work. I figured out that after every session in the Heroku bash console, the files are deleted. Is there a command to download directly from the Heroku bash console?
If you manage to create the file from heroku run bash, you could use transfer.sh.
You can even encrypt the file before you transfer it.
cat <file_name> | gpg -ac -o- | curl -X PUT -T "-" https://transfer.sh/<file_name>.gpg
And then download and decrypt it on the target machine
curl https://transfer.sh/<hash>/<file_name>.gpg | gpg -o- > <file_name>
There is heroku ps:copy:
#$ heroku help ps:copy
Copy a file from a dyno to the local filesystem
USAGE
$ heroku ps:copy FILE
OPTIONS
-a, --app=app (required) app to run command against
-d, --dyno=dyno specify the dyno to connect to
-o, --output=output the name of the output file
-r, --remote=remote git remote of app to use
DESCRIPTION
Example:
$ heroku ps:copy FILENAME --app murmuring-headland-14719
Example run:
#$ heroku ps:copy app.json --app=app-example-prod --output=app.json.from-heroku
Copying app.json to app.json.from-heroku
Establishing credentials... done
Connecting to web.1 on ⬢ app-example-prod...
Downloading... ████████████████████████▏ 100% 00:00
Caveat
This seems not to run with dynos that are run via heroku run.
Example
#$ heroku ps:copy tmp/some.log --app app-example-prod --dyno run.6039 --output=tmp/some.heroku.log
Copying tmp/some.log to tmp/some.heroku.log
Establishing credentials... error
▸ Could not connect to dyno!
▸ Check if the dyno is running with `heroku ps'
It is! Proof:
#$ heroku ps --app app-example-prod
=== run: one-off processes (1)
run.6039 (Standard-1X): up 2019/08/29 12:09:13 +0200 (~ 16m ago): bash
=== web (Standard-2X): elixir --sname dyno -S mix phx.server --no-compile (2)
web.1: up 2019/08/29 10:41:35 +0200 (~ 1h ago)
web.2: up 2019/08/29 10:41:39 +0200 (~ 1h ago)
I could connect to web.1 though:
#$ heroku ps:copy tmp/some.log --app app-example-prod --dyno web.1 --output=tmp/some.heroku.log
Copying tmp/some.log to tmp/some.heroku.log
Establishing credentials... done
Connecting to web.1 on ⬢ app-example-prod...
▸ ERROR: Could not transfer the file!
▸ Make sure the filename is correct.
So I fell back to using SCP (scp -P PORT tmp/some.log user@host:/path/some.heroku.log) from the run.6039 dyno command line.
Now that https://transfer.sh is defunct, https://file.io is an alternative. To upload myfile.csv:
$ curl -F "file=#myfile.csv" https://file.io
The response will include a link you can access the file at:
{"success":true,"key":"2ojE41","link":"https://file.io/2ojE41","expiry":"14 days"}
I can't vouch for the security of file.io, so using encryption as described in other answers could be a good idea.
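If you do want encryption, the gpg pipeline from the transfer.sh answer above adapts directly (a sketch; myfile.csv and the 2ojE41 key are placeholders):
# encrypt on the dyno, then upload only the encrypted copy
gpg -ac -o myfile.csv.gpg myfile.csv
curl -F "file=@myfile.csv.gpg" https://file.io
# on the target machine, download via the returned link and decrypt
curl -o myfile.csv.gpg https://file.io/2ojE41
gpg -o myfile.csv -d myfile.csv.gpg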
Heroku dyno filesystems are ephemeral, non-persistent, and not shared between dynos. So when you do heroku run bash, you actually get a new dyno with a fresh deployment of your app, without any of the changes made to the ephemeral filesystems in other dynos.
If you want to do something like this, you should probably either do it all in a heroku run bash session or all in a request to a web app running on Heroku that responds with the CSV file you want.
I did the following:
First I entered Heroku bash with this command:
heroku run 'sh'
Then I made a directory and moved the file there,
made a git repository and committed the file,
and finally pushed this repository to GitHub.
Before committing, git will ask you for your name and email. Give it something fake!
If you have files bigger than 100 MB, push to GitLab.
If there is an easier way, please let me know!
Sorry for my bad English.
Another way of doing this (that doesn't involve any third server) is to use Patrick's method, but first encode the file into a format that only uses visible ASCII characters. That should make it work for any file, regardless of any whitespace characters or unusual encodings. I'd recommend base64 for this.
Here's how I've done it:
Log onto your Heroku instance using heroku run bash
Use base64 to print the contents of your file: base64 <your-file>
Select the base64 text in your terminal and copy it
On your local machine, decode this text using base64 straight into a new file (on a Mac I'd do pbpaste | base64 --decode -o <your-file>)
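In full, the round trip looks something like this (myfile.csv is a placeholder; the pbpaste step is macOS-specific):
# on the dyno (heroku run bash): print the file as base64, then copy the output
base64 myfile.csv
# on your local machine: paste the clipboard and decode it back into a file
pbpaste | base64 --decode > myfile.csv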
I agree that most probably your need implies a change in your application architecture, something like a worker dyno.
But by executing the following steps you can transfer the file, since a heroku one-off dyno can run scp (see the sketch after this list):
1. create a VM at a cloud provider, e.g. DigitalOcean;
2. run a heroku one-off dyno and create your file;
3. scp the file from the heroku one-off dyno to that VM server;
4. scp the file from the VM server to your local machine;
5. delete the cloud VM and stop the heroku one-off dyno.
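A sketch of steps 3 and 4 (vm.example.com, the user, and the paths are placeholders):
# step 3, from the one-off dyno
scp myfile.csv root@vm.example.com:/tmp/myfile.csv
# step 4, from your local machine
scp root@vm.example.com:/tmp/myfile.csv .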
I see that these answers are much older, so I'm assuming this is a new feature. For all those like me who are looking for an easier solution than the excellent answers already here, Heroku now has the capability to copy files quite easily with the following command: heroku ps:copy <filename>
Note that this works with relative paths, as you'd expect. (Tested on a heroku-18 stack, downloading files at "path/to/file.ext".)
For reference: Heroku docs
Heroku dynos come with sftp pre-installed. I tried git, but it was too many steps (I had to generate a new ssh cert and add it to GitHub every time), so now I am using sftp and it works great.
You'll need to have another host (like dreamhost, hostgator, godaddy, etc) - but if you do, you can:
sftp username@ftp.yourhostname.com
Accept the server fingerprint/hash, then enter your password.
Once on the server, navigate to the folder you want to upload to (using cd and ls commands).
Then use the command put filename.csv and it will upload it to your web host.
To retrieve your file: Use an ftp client like filezilla or hit the url if you uploaded to a folder in the www or website folder path.
This is great because it also works with multiple files and binaries as well as text files.
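A typical session would look something like this (hostname and file names are placeholders):
# from the dyno: connect, accept the host key, and enter your password
sftp username@ftp.yourhostname.com
# then, at the sftp> prompt:
cd uploads
put filename.csv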
For small/quick transfers that fit comfortably in the clipboard:
Open a terminal on your local device
Run heroku run bash
(Inside your remote connection, on the dyno) Run cat filename
Select the lines in your local terminal and copy them to your clipboard.
Check to ensure proper newlines when pasting them.
Now I created a shell script to upload some files via git to a backup repo (for example, my app.db SQLite file is gitignored and every deploy kills it):
## upload dyno files to git via SSH session
## https://devcenter.heroku.com/changelog-items/1112
# heroku ps:exec
git config --global user.email 'dmitry.cheva@gmail.com'
git config --global user.name 'Dmitry Cheva'
rm -rf ./.gitignore
git init
## add each file separately (-f to add git ignored files)
git add app.db -f
git commit -m "backup on `date +'%Y-%m-%d %H:%M:%S'`"
git remote add origin https://bitbucket.org/cheva/appbackup.git
git push -u origin master -f
The environment resets after each deploy and the git configuration is not kept, so you need to perform the first three commands again each time.
Then you need to add the files (-f for ignored ones) and push to the repo (-f, because git would otherwise require a pull).
