CORS ANYWHERE WHITELIST multiple domains using command prompt - heroku

I am trying to whitelist multiple domains for CORS Anywhere on Heroku.
How do I write the command in the command prompt? Do I separate the domains with a comma, or do I repeat the CORSANYWHERE_WHITELIST command for each domain?
How do I whitelist localhost correctly?
heroku config:set -a MY_APP_NAME CORSANYWHERE_WHITELIST=https://my-domain.net, https://localhost
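There is no answer recorded here, but based on how cors-anywhere appears to parse CORSANYWHERE_WHITELIST (a comma-separated list of origins; that is my assumption about the library, not something stated above), a sketch would be to quote the whole value, separate the origins with commas and no spaces, and include the scheme and port for localhost:
heroku config:set -a MY_APP_NAME CORSANYWHERE_WHITELIST="https://my-domain.net,http://localhost:3000"
Here 3000 is just a placeholder for whatever port your local app runs on.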

Related

Docker login to gcp using json credentials

I want to log into docker on google cloud from the command line in Windows using credentials in json format.
First, I generated the service account keys in Google Cloud IAM & Admin. Afterwards, I tried to log in as advised using the following commands:
set /p PASS=<keyfile.json
docker login -u _json_key -p "%PASS%" https://[HOSTNAME]
The JSON that Google generates, though, has newline characters, and the above set command couldn't read the whole file.
Then I edited the file to be a single line, but the set command is still not reading the whole file. Any advice on how to read a JSON file using the set command and pass it to the docker login command below?
The solution to this is to run the following command:
docker login -u _json_key --password-stdin https://gcr.io < keyfile.json

Log in to Heroku via bash script

I am trying to write a script that calls some commands from the Heroku Toolbelt. The script works fine as long as I am already logged in to the Toolbelt. When I tried to add Heroku's login command to the script, I ran into problems - there is no Heroku Toolbelt command such as (a command with parameters):
heroku login -u email@mail.com -p 1234qwer
That's why I have no idea how to execute the heroku login command in a bash script. Does anyone have any advice?
For these kinds of things I use expect.
You need to install expect first. If you're on Ubuntu run sudo apt-get install expect
Then in a script, let's call it heroku_login.exp, enter this with the relevant information:
#!/usr/bin/expect
spawn heroku login
expect "Email:"
send "YOUREMAIL\r"
expect "Password (typing will be hidden):"
send "YOURPASSWORD\r"
interact
Then run expect heroku_login.exp and you should be good to go.
An easier solution to this problem may be to just set the HEROKU_API_KEY environment variable. The Heroku toolbelt will then automatically log you in using that key.
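For example, a minimal sketch of that approach (the key value is a placeholder; you can print a token with heroku auth:token on a machine where you are already logged in, or generate one from your account settings):
export HEROKU_API_KEY=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
heroku apps   # should now run without prompting for login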

heroku pg:pull not fetching tables from heroku database

I'm trying to pull a Heroku database to my local Windows computer by using the Heroku bash command
heroku pg:pull HEROKU_POSTGRESQL_COLOR mydatabase --app appname
When I run the above command I get the following error:
'env' is not recognized as an internal or external command, operable program or batch file.!
The local database 'mydatabase' is created, but without any tables.
My heroku app's database has a table in it, but it is not getting pulled to my local database.
Please help me solve it.
A couple of things:
1. When there is an error such as "'env' is not recognized as an internal or external command, operable program or batch file", it means that the system is trying to execute a command named env. This has nothing at all to do with setting up your environment variables.
env is not a command on Windows, but on Unix. I understand that you have a Windows machine though. What you can do is run "Git Bash". (You could get it by itself, but it comes with Heroku's CLI.)
This gives you a Unix-like environment where the "env" command is supported, and then you can run the actual heroku pg:pull command.
2. If that still doesn't work, there is a workaround that works without installing anything extra. It is based on a ticket I submitted to Heroku, so I'm just going to quote their response:
"The pg:pull command is just a wrapper around the pg_dump and pg_restore commands. Due to the bug you encountered, it sounds like we should go ahead and do things manually. Run these using cmd.exe (the Command Prompt application from which you first reported the bug). First grab the connection string from your heroku application config vars.
heroku config:get DATABASE_URL
Then you want to pick out the username / hostname / databasename parts from the connection string, i.e.: postgres://username:password@hostname:port/databasename. Use those values in the following command and paste in the password when prompted for one. This will dump the contents of your Heroku database to a local file.
pg_dump --verbose -F c -Z 0 -U username -h hostname -p port databasename > heroku.dump
Next you will load this file into your local database. One thing that the CLI does before running this command is to check and make sure the target database is empty, because running this against a database with real data is something you want to avoid so be careful with pg_restore. When running this manually you run the risk of mangling your data without the CLI check, so you may want to manually verify that the target database is empty first.
pg_restore --verbose --no-acl --no-owner -h localhost -p 5432 -d mydb2 < heroku.dump
I am sorry this is not a better experience, I hope this will help you make progress. We are in the process of rewriting our pg commands so that they work better on all platforms including windows, but there is no solid timeline for when this will be completed."
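For illustration (all of the values below are made up), if heroku config:get DATABASE_URL printed postgres://u123:p456@ec2-1-2-3-4.compute-1.amazonaws.com:5432/d789, the dump command from the quote would become:
pg_dump --verbose -F c -Z 0 -U u123 -h ec2-1-2-3-4.compute-1.amazonaws.com -p 5432 d789 > heroku.dump
and pg_dump would prompt for the password p456.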
To take a backup (dump file) on Heroku, you first need the backups addon. Install it with:
$ heroku addons:add pgbackups
Then running the commands below will give you a dump file named latest.dump:
$ heroku pgbackups:capture
$ curl -o latest.dump `heroku pgbackups:url`
or
wget "`heroku pgbackups:url --app app-name`" -O backup.dump
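Note that the pgbackups addon has since been retired; on a recent Heroku CLI a roughly equivalent sequence (my assumption, using the built-in pg:backups commands) would be:
$ heroku pg:backups:capture --app app-name
$ curl -o latest.dump `heroku pg:backups:url --app app-name`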
Edited (after chatting with the user):
Problem: 'env' is not recognized as an internal or external command, operable program or batch file.!
I suspect that the PATH variable entry for a particular program is messed up. You can double-click and check that in the WINDOWS\system32 folder.
OK, so how to edit it:
My Computer > Advanced > Environment Variables
Then choose PATH and click edit button

Proxy Settings for Windows 7 Command Prompt

I am trying to use cURL in the Command Prompt, but I don't understand where my problems are. I have been told that I need to configure a proxy for the Command Prompt so that it can access the sites I am calling.
This is what I want to run: curl -glob "api.fda.gov/drug/event.json?&search=receivedate:[20040101+TO+20150101]&limit=1"
I have cURL installed, but always face errors because it is not connecting. Is there a simple way to set up a proxy for/through the Command Prompt in Windows 7?
I also do not have admin rights, so I cannot change the system settings.
You can set your proxy using a set command in Windows:
set http_proxy=http://<yourproxyaddress>:<port>
Then you can connect your curl requests to external sites.
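If the site you are calling is served over HTTPS, curl also reads the https_proxy variable, so you may need to set that one as well before running your request (the proxy address below is a placeholder):
set https_proxy=http://<yourproxyaddress>:<port>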
Some proxies require specific authentication headers to be set, so be aware of those as well. In my case, it's --proxy-ntlm in the example below:
curl -x webproxy.net:8080 -U username:password http://google.com --proxy-ntlm
But there are other options:
--proxy-digest and --proxy-negotiate
Lastly, cURL has a super friendly doc page, so be sure to check it out.

How can I download a file from Heroku bash?

I ran a ruby script from Heroku bash that generates a CSV file on the server that I want to download. I tried moving it to the public folder to download, but that didn't work. I figured out that after every session in the Heroku bash console, the files are deleted. Is there a command to download directly from the Heroku bash console?
If you manage to create the file from heroku run bash, you could use transfer.sh.
You can even encrypt the file before you transfer it.
cat <file_name> | gpg -ac -o- | curl -X PUT -T "-" https://transfer.sh/<file_name>.gpg
And then download and decrypt it on the target machine
curl https://transfer.sh/<hash>/<file_name>.gpg | gpg -o- > <file_name>
There is heroku ps:copy:
#$ heroku help ps:copy
Copy a file from a dyno to the local filesystem
USAGE
$ heroku ps:copy FILE
OPTIONS
-a, --app=app (required) app to run command against
-d, --dyno=dyno specify the dyno to connect to
-o, --output=output the name of the output file
-r, --remote=remote git remote of app to use
DESCRIPTION
Example:
$ heroku ps:copy FILENAME --app murmuring-headland-14719
Example run:
#$ heroku ps:copy app.json --app=app-example-prod --output=app.json.from-heroku
Copying app.json to app.json.from-heroku
Establishing credentials... done
Connecting to web.1 on ⬢ app-example-prod...
Downloading... ████████████████████████▏ 100% 00:00
Caveat
This seems not to run with dynos that are run via heroku run.
Example
#$ heroku ps:copy tmp/some.log --app app-example-prod --dyno run.6039 --output=tmp/some.heroku.log
Copying tmp/some.log to tmp/some.heroku.log
Establishing credentials... error
▸ Could not connect to dyno!
▸ Check if the dyno is running with `heroku ps'
It is! Proof:
#$ heroku ps --app app-example-prod
=== run: one-off processes (1)
run.6039 (Standard-1X): up 2019/08/29 12:09:13 +0200 (~ 16m ago): bash
=== web (Standard-2X): elixir --sname dyno -S mix phx.server --no-compile (2)
web.1: up 2019/08/29 10:41:35 +0200 (~ 1h ago)
web.2: up 2019/08/29 10:41:39 +0200 (~ 1h ago)
I could connect to web.1 though:
#$ heroku ps:copy tmp/some.log --app app-example-prod --dyno web.1 --output=tmp/some.heroku.log
Copying tmp/some.log to tmp/some.heroku.log
Establishing credentials... done
Connecting to web.1 on ⬢ app-example-prod...
▸ ERROR: Could not transfer the file!
▸ Make sure the filename is correct.
So I fell back to using SCP: scp -P PORT tmp/some.log user@host:/path/some.heroku.log from the run.6039 dyno command line.
Now that https://transfer.sh is defunct, https://file.io is an alternative. To upload myfile.csv:
$ curl -F "file=@myfile.csv" https://file.io
The response will include a link you can access the file at:
{"success":true,"key":"2ojE41","link":"https://file.io/2ojE41","expiry":"14 days"}
I can't vouch for the security of file.io, so using encryption as described in other answers could be a good idea.
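To fetch the file later on your local machine, something like this should work (the key 2ojE41 is taken from the example response above):
$ curl -o myfile.csv https://file.io/2ojE41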
Heroku dyno filesystems are ephemeral, non-persistent and not shared between dynos. So when you do heroku run bash, you actually get a new dyno with a fresh deployment of your app, without any of the changes made to ephemeral filesystems in other dynos.
If you want to do something like this, you should probably either do it all in a heroku run bash session or all in a request to a web app running on Heroku that responds with the CSV file you want.
I did the following:
First I entered heroku bash with this command:
heroku run 'sh'
Then I made a directory and moved the file there
Made a git repository and committed the file
Finally I pushed this repository to GitHub
Before committing, git will ask you for your name and email. Give it something fake!
If you have files bigger than 100 MB, push to GitLab.
If there is an easier way please let me know!
Sorry for my bad English.
Another way of doing this (that doesn't involve any third-party server) is to use Patrick's method but first encode the file into a format that only uses visible ASCII characters. That should make it work for any file, regardless of any whitespace characters or unusual encodings. I'd recommend base64 to do this.
Here's how I've done it:
Log onto your heroku instance using heroku run bash
Use base64 to print the contents of your file: base64 <your-file>
Select the base64 text in your terminal and copy it
On your local machine, decode this text using base64 straight into a new file (on a Mac I'd do pbpaste | base64 --decode -o <your-file>); a Linux equivalent is sketched below
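On Linux, assuming xclip is installed (an assumption; any clipboard tool would do), the decode step would look something like:
xclip -selection clipboard -o | base64 --decode > <your-file>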
I agree that your need most probably means a change in your application architecture, something like a worker dyno.
But by executing the following steps you can transfer the file, since a Heroku one-off dyno can run scp (a rough sketch follows the list):
create a VM in a cloud provider, e.g. DigitalOcean;
run heroku one-off dyno and create your file;
scp file from heroku one-off dyno to that vm server;
scp file from vm server to your local machine;
delete cloud vm and stop heroku one-off dyno.
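A rough sketch of steps 3 and 4 (user, VM address and paths are placeholders):
# from the one-off dyno (heroku run bash):
scp myfile.csv root@your-vm-ip:/tmp/myfile.csv
# then on your local machine:
scp root@your-vm-ip:/tmp/myfile.csv .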
I see that these answers are much older, so I'm assuming this is a new feature. For all those like me who are looking for an easier solution than the excellent answers already here, Heroku now has the capability to copy files quite easily with the following command: heroku ps:copy <filename>
Note that this works with relative paths, as you'd expect. (Tested on a heroku-18 stack, downloading files at "path/to/file.ext".)
For reference: Heroku docs
Heroku dynos come with sftp pre-installed. I tried git but it was too many steps (I had to generate a new SSH cert and add it to GitHub every time), so now I am using sftp and it works great.
You'll need to have another host (like dreamhost, hostgator, godaddy, etc) - but if you do, you can:
sftp username@ftp.yourhostname.com
Accept the server fingerprint/hash, then enter your password.
Once on the server, navigate to the folder you want to upload to (using cd and ls commands).
Then use the command put filename.csv and it will upload it to your web host.
To retrieve your file: use an FTP client like FileZilla, or hit the URL if you uploaded to a folder under the www or website folder path.
This is great because it also works with multiple files and binaries as well as text files.
For small/quick transfers that fit comfortably in the clipboard:
Open a terminal on your local device
Run heroku run bash
(Inside your remote connection, on the dyno) Run cat filename
Select the lines in your local terminal and copy them to your clipboard.
Check to ensure proper newlines when pasting them.
Now I created a shell script to upload some files to a git backup repo (for example, my app.db SQLite file is gitignored and every deploy kills it):
## upload dyno files to git via SSH session
## https://devcenter.heroku.com/changelog-items/1112
# heroku ps:exec
git config --global user.email 'dmitry.cheva@gmail.com'
git config --global user.name 'Dmitry Cheva'
rm -rf ./.gitignore
git init
## add each file separately (-f to add git ignored files)
git add app.db -f
git commit -m "backup on `date +'%Y-%m-%d %H:%M:%S'`"
git remote add origin https://bitbucket.org/cheva/appbackup.git
git push -u origin master -f
Git will be reset after each deploy and does not keep its configuration, so you need to perform the first three commands each time.
Then you need to add the files (-f for ignored ones) and push to the repo (-f, because git would otherwise require a pull).
