I'm trying to run a rocket-rs application while overriding the port configuration using environment variables as described in the documentation.
I've set the variable ROCKET_PORT:
setx ROCKET_PORT 4444
I've checked it was set with an echo. When I run the application (with either cargo run or ./application.exe) it still uses port 8000:
🔧 Configured for development.
=> address: localhost
=> port: 8000
=> log: normal
=> workers: 16
=> secret key: generated
=> limits: forms = 32KiB
=> tls: disabled
🛰 Mounting '/':
=> GET /mine
=> POST /transactions/new application/json
=> GET /chain
=> GET /nodes/resolve
=> POST /nodes/register application/json
🚀 Rocket has launched from http://localhost:8000
I know the port can be configured in Rocket.toml, but the idea is to be able to run in a different port for each console session by setting the environment variable.
Why would this not be working?
Setting the variable like this did the trick:
$Env:ROCKET_PORT=4444
setx only writes the value to the registry for future console sessions; it never changes the environment of the session you are already in, so processes launched from it (like cargo run) never saw the variable. $Env: sets it for the current PowerShell session.
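The mechanism behind the fix is ordinary process-environment inheritance: the variable has to exist in the environment of the shell that launches the app, because child processes inherit it from there. A minimal sketch of that in Ruby (the child `ruby` process standing in for `cargo run`):

```ruby
# A variable set in the current process's environment is inherited by any
# child process it spawns. `setx` only writes to the registry for future
# sessions, so the already-running shell (and its children) never see it.
ENV['ROCKET_PORT'] = '4444'
child_sees = `ruby -e "puts ENV['ROCKET_PORT']"`.strip
puts child_sees   # => 4444
```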
I am new to networking, Docker and Laravel.
I want to read files placed on a remote host from a Laravel app running in a Docker container.
I want to achieve this using port forwarding.
What is the best way to support this?
The general approach would be to use an SSH agent, but since my team has asked me to use port forwarding, I am not sure how to go about it.
I have tried the following
console
>>> Storage::disk('sftp')->readStream('/home/remote_your_username/test.csv');
League\Flysystem\PhpseclibV3\UnableToConnectToSftpHost with message 'Unable to connect to host: 172.21.166.54'
another console
ssh -N -L 4000:127.0.0.1:22 remote_your_username@192.0.0.1
config/filesystems.php
'sftp' => [
    'driver'     => 'sftp',
    'host'       => env('SFTP_HOST'),
    'username'   => env('SFTP_USERNAME'),
    'privateKey' => env('SFTP_PRIVATE_KEY'),
    'passphrase' => env('SFTP_PASSPHRASE'),
    'port'       => env('SFTP_PORT', 22),
],
.env
SFTP_HOST=172.0.0.1
SFTP_USERNAME=your_username
SFTP_PRIVATE_KEY=/home/your_username/.ssh/id_rsa
SFTP_PASSPHRASE=your_passphrase
SFTP_PORT=4000
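One thing worth noting about a local forward like `-L 4000:127.0.0.1:22`: the client has to point at the tunnel entrance, i.e. 127.0.0.1 on the forwarded port, not at the remote IP directly. Before debugging the Laravel side it helps to confirm the entrance is actually listening. A stdlib Ruby sketch of such a check (the local listener below stands in for the ssh entrance so the example is self-contained; in real use you would pass your tunnel's port):

```ruby
require 'socket'

# Returns true if something is accepting TCP connections on the local port,
# e.g. the entrance of an `ssh -L` tunnel.
def tunnel_up?(port)
  Socket.tcp('127.0.0.1', port, connect_timeout: 1) { true }
rescue SystemCallError
  false
end

listener = TCPServer.new('127.0.0.1', 0)   # stand-in for ssh -L 4000:...
port = listener.addr[1]
puts tunnel_up?(port)    # => true
listener.close
puts tunnel_up?(port)    # => false
```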
I need to set up logging with Logstash in a Padrino project. I set up Logstash on a remote server and tried to integrate it with the Padrino project, and I found only one solution:
logger = LogStashLogger.new(type: :udp, host: host, port: 5044) if RACK_ENV == 'staging'
but it only works when I log explicitly, e.g. logger.debug message: 'test', foo: 'bar'.
Can I make all logs get sent to the remote server automatically?
Try this:
Padrino::Logger.logger = LogStashLogger.new(type: :udp, host: host, port: 5044)
I use this:
Padrino::Logger.logger = LogStashLogger.new(type: :udp, host: '172.16.x.x', port: 9999).extend(Padrino::Logger::Extensions)
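Under the hood, what the UDP-type LogStashLogger does is essentially serialize each event to JSON and fire it at the collector as a datagram. A self-contained stdlib sketch of that wire format (a local UDP socket stands in for the remote Logstash input; swap in your real host and port):

```ruby
require 'socket'
require 'json'
require 'time'

# Local UDP listener standing in for the remote logstash input,
# so the example runs on its own.
receiver = UDPSocket.new
receiver.bind('127.0.0.1', 0)   # 0 = pick any free port
port = receiver.addr[1]

# The "logger" side: serialize the event as JSON and send one datagram.
sender = UDPSocket.new
event = { message: 'test', foo: 'bar', '@timestamp' => Time.now.utc.iso8601 }
sender.send(JSON.generate(event), 0, '127.0.0.1', port)

payload, _addr = receiver.recvfrom(4096)
puts JSON.parse(payload)['message']   # => test
```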
Alright so, I decided to make sure I can get this SSL stuff working BEFORE building the API, and I feel 95% of the way there.
So, I have a cert and key from Namecheap. All should be good there.
Here is my app.rb
require 'sinatra/base'
require 'webrick'
require 'webrick/https'
require 'openssl'

class MyServer < Sinatra::Base
  set :bind, '0.0.0.0'

  get '/' do
    "Hello, world!\n"
  end
end

CERT_PATH = './ssl'

webrick_options = {
  :Port            => 443,
  :Logger          => WEBrick::Log.new($stderr, WEBrick::Log::DEBUG),
  :DocumentRoot    => "/ruby/htdocs",
  :SSLEnable       => true,
  :SSLVerifyClient => OpenSSL::SSL::VERIFY_NONE,
  :SSLCertificate  => OpenSSL::X509::Certificate.new(File.read(File.join(CERT_PATH, "server.crt"))),
  :SSLPrivateKey   => OpenSSL::PKey::RSA.new(File.read(File.join(CERT_PATH, "server.key"))),
  :SSLCertName     => [["CN", WEBrick::Utils.getservername]],
  :app             => MyServer
}

Rack::Server.start webrick_options
I run the program with
sudo ruby app.rb
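For local testing before wiring in the real Namecheap files, the `:SSLCertificate` / `:SSLPrivateKey` pair can be a throwaway self-signed certificate generated entirely with the stdlib. A sketch (for local testing only, never production):

```ruby
require 'openssl'

# Generate a throwaway self-signed certificate/key pair, usable in place
# of server.crt / server.key when testing locally.
key  = OpenSSL::PKey::RSA.new(2048)
name = OpenSSL::X509::Name.parse('/CN=localhost')

cert = OpenSSL::X509::Certificate.new
cert.version     = 2                     # X.509 v3
cert.serial      = 1
cert.subject     = name
cert.issuer      = name                  # self-signed: issuer == subject
cert.public_key  = key.public_key
cert.not_before  = Time.now
cert.not_after   = Time.now + 365 * 24 * 3600
cert.sign(key, OpenSSL::Digest.new('SHA256'))

puts cert.subject                  # => /CN=localhost
puts cert.check_private_key(key)   # => true
```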
And what's interesting is, on localhost (testing from my MacBook Pro, running El Capitan) I can access https://localhost and it just says the cert isn't trusted, and asks if I want to go in anyway. Great.
My EC2 instance, however, I can now access via a domain name, one that matches the cert of course. But the site just returns ERR_CONNECTION_REFUSED (this is what displays in Chrome).
And of course, that error shows whether or not I run the Sinatra server.
Ok, so it sounds easy. Security group, right?
Well, according to EC2, I'm using a security group that has TCP port 443 enabled on inbound (HTTPS).
So, what gives? What am I not doing right? Why does it do what I expect on localhost but not on the ec2 instance?
Any help would be super appreciated.
Other information:
The server does appear to be running. sudo ruby app.rb gives me valid info about my cert, followed by
[2016-01-22 03:36:52] DEBUG WEBrick::HTTPServlet::FileHandler is mounted on /.
[2016-01-22 03:36:52] DEBUG Rack::Handler::WEBrick is mounted on /.
[2016-01-22 03:36:52] INFO WEBrick::HTTPServer#start: pid=2499 port=443
If I remove webrick and change the port to 80, everything works fine. I can access this app from my site's domain, on http (not https) of course.
From the local machine, I am getting a response.
$ wget https://localhost
--2016-01-22 04:11:48-- https://localhost/
Resolving localhost (localhost)... 127.0.0.1
Connecting to localhost (localhost)|127.0.0.1|:443... connected.
ERROR: cannot verify localhost's certificate, issued by ‘/C=GB/ST=Greater Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Domain Validation Secure Server CA’:
Unable to locally verify the issuer's authority.
ERROR: no certificate subject alternative name matches requested host name ‘localhost’.
To connect to localhost insecurely, use `--no-check-certificate'.
This seems correct! So, it does seem to be something with the server setup. I can't connect to it. =/ Again. Security group allows 443 and 80.
Things added since I asked the question originally, but still hasn't fixed the issue:
set :bind, '0.0.0.0'
Generally you don't want any Ruby web server actually handling SSL. You make it serve plain HTTP (accessible only via localhost), then install a reverse proxy in front that handles all of the SSL communication.
For example:
1. Install nginx (a reverse proxy) and configure it to listen on port 443.
2. Set your Ruby app server to listen on 127.0.0.1:8080 (accepting local connections only).
3. All requests hit nginx, which strips the SSL and sends the plain HTTP request to your Ruby web server.
A very simple nginx config to get you started:
server {
    listen 443 ssl;
    server_name you.example.com;

    ssl_certificate     /path/to/cert.pem;
    ssl_certificate_key /path/to/your.key;
    add_header Strict-Transport-Security "max-age=31536000; includeSubdomains;";

    location / {
        proxy_pass http://localhost:8080; # your ruby appserver
    }
}
I am following the basic Ghost server installation on an EC2 instance; so far I can run the Ghost server via npm start and I can see that it is up and running:
Ghost is running...
Listening on 127.0.0.1:2368
Url configured as: http://54.187.25.187/
Ctrl+C to shut down
Here is the ghost config config.js:
// ### Development **(default)**
development: {
    // The url to use when providing links to the site, e.g. in RSS and email.
    url: 'http://54.187.25.187/',

    database: {
        client: 'sqlite3',
        connection: {
            filename: path.join(__dirname, '/content/data/ghost-dev.db')
        },
        debug: false
    },

    server: {
        // Host to be passed to node's `net.Server#listen()`
        host: '127.0.0.1',
        // Port to be passed to node's `net.Server#listen()`, for iisnode set this to `process.env.PORT`
        port: '2368'
    }
},
In the end, I cannot access anything when I type http://54.187.25.187:2368 in the browser. I would really appreciate guidelines on how to set up Ghost properly.
EDIT: The problem is solved; it was an EC2 security group issue where the ports remained closed even after I had set them to open.
For Amazon EC2 we have found you need to change the host to 0.0.0.0:
http://www.howtoinstallghost.com/how-to-setup-an-amazon-ec2-instance-to-host-ghost-for-free-self-install/
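The reason the host setting matters: a listener bound to 127.0.0.1 only accepts loopback connections, while 0.0.0.0 binds every interface, which is what outside clients need before the security group rules even come into play. A quick stdlib illustration of the two binds (Ruby here purely to demonstrate the socket behavior; Ghost's Node listener works the same way):

```ruby
require 'socket'

# A socket bound to 127.0.0.1 is reachable only via loopback; binding
# 0.0.0.0 listens on every interface, which external clients need.
loopback  = TCPServer.new('127.0.0.1', 0)
all_faces = TCPServer.new('0.0.0.0', 0)

puts loopback.addr[3]    # => 127.0.0.1
puts all_faces.addr[3]   # => 0.0.0.0

loopback.close
all_faces.close
```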
I have a situation somewhat similar to "How to create a ssh tunnel in ruby and then connect to mysql server on the remote host."
From my local machine, I want to run a query against a production MySQL database host.
My local machine cannot directly connect to the database host. However, if I SSH to a gateway machine, I can run the MySQL client and connect to the database host.
I'm trying to figure out how to programmatically run a query from my machine that is forwarded to the gateway machine, which forwards it to the database server, and have the results returned. Basically I want LOCAL => GATEWAY => DB_HOST forwarding.
I have a hacky solution that runs ssh.exec and MySQL on the command line, shown below, but I'm trying to find a way that does port forwarding twice.
I tried, but haven't been successful so far.
require 'rubygems'
require 'mysql2'
require 'net/ssh/gateway'

gateway = Net::SSH::Gateway.new(
  $gateway_host, $gateway_user, :password => $gateway_pass)

# This works
ssh = gateway.ssh($db_host, $db_login_user)
data = ssh.exec!("mysql -u#{$db_user} -p#{$db_pass} #{$db_name} -e '#{$sql_query}'")
puts "#{data.class} is the class type"
puts data

# Want to do something like this with port forwarding
# client = Mysql2::Client.new(:host => $db_host,
#                             :username => $db_user,
#                             :password => $db_pass,
#                             :database => $db_name,
#                             :port => port)
# results = client.query($sql_query)

puts "end stuff"
ssh.close
Any suggestions?
Your diagram lays it out fairly well: you would need a tunnel from the gateway to the db_host, and a second tunnel from your local machine to the gateway. The two tunnels together effectively connect your local machine to the db_host by way of the gateway.
Here's a reference specific to tunneling MySQL over SSH
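Conceptually, each `ssh -L` hop is just a TCP relay: accept a connection on a local port and pipe the bytes to an upstream address. A self-contained stdlib sketch of one hop (a local echo server stands in for the MySQL host and a thread stands in for the ssh process, so this runs on its own; with two such relays chained you get the LOCAL => GATEWAY => DB_HOST path):

```ruby
require 'socket'

# Stand-in "database": answers one line, upcased.
db = TCPServer.new('127.0.0.1', 0)
db_port = db.addr[1]
Thread.new { c = db.accept; c.write(c.gets.upcase); c.close }

# Stand-in "tunnel": forwards one connection from its port to the db port,
# the way `ssh -L <relay_port>:db_host:<db_port>` would.
relay = TCPServer.new('127.0.0.1', 0)
relay_port = relay.addr[1]
Thread.new do
  client   = relay.accept
  upstream = TCPSocket.new('127.0.0.1', db_port)
  upstream.write(client.gets)   # local -> tunnel -> db
  client.write(upstream.gets)   # db -> tunnel -> local
  [client, upstream].each(&:close)
end

# The "local machine" only ever talks to the relay, never to the db directly.
sock = TCPSocket.new('127.0.0.1', relay_port)
sock.puts('select 1')
reply = sock.gets.strip
sock.close
puts reply   # => SELECT 1
```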