Can't access Ruby server deployed with Passenger - ruby

I developed a Ruby website and, to deploy it, I followed this tutorial. Everything went well, but I skipped all the user-switching steps because I really didn't see the point of them, and to be honest maybe that is the problem.
My problem is that the server is running, as we can see, and I can do a successful curl on it from the local machine, but I can't access it from outside.
To be honest, it's the first time I've ever deployed a website, so I'm sure I'm just missing something obvious (DNS, maybe), but I don't really know what.
The problem might also come from the fact that I'm using the Passenger and Nginx binaries shipped with the passenger gem. I didn't install Passenger or Nginx system-wide, so it's using the binaries from the gem.
EDIT:
Thanks all for the answers so far. I think the problem, as stated in the first comment under this question, is that I'm not using the default server port configured by Nginx but another one, so I'm going to try adding my port to the Nginx config file.
And to clarify a bit: because I don't have a server name, I'm running my tests using:
curl ipaddress:port
EDIT 2:
I just tried looking at the config file, and it appears that Passenger generates its own Nginx config file (because it uses its own standalone Nginx binary to run) that looks like that, so the port must not be the problem.
Maybe I really have no choice but to use port 80, but now I'm not even sure I can reach the standalone Nginx from outside my VM; I'm a total beginner with Nginx.
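For reference, this is roughly how I'm starting it, if I understand the Passenger Standalone options correctly (the environment name is just what I use):

# start Passenger Standalone listening on all interfaces on port 8080
passenger start --address 0.0.0.0 --port 8080 --environment production

I suppose binding directly to port 80 would need sudo (or a real Nginx in front).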
EDIT 3:
A netstat gives me this:
So the Nginx server really is running, but how can I access it? curl ipaddress:8080 (I changed the port from the 90 I used on the first try to 8080) is not working, but on the local machine a curl on 0.0.0.0:8080 still works.
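What I plan to check next, assuming the VM is an Ubuntu box with ufw (the port is just the one I'm currently using):

# confirm which address/port Nginx is actually bound to
sudo netstat -tlnp | grep 8080
# see whether a firewall is filtering the port, and open it if needed
sudo ufw status
sudo ufw allow 8080/tcp

And if the VM sits behind a cloud provider's firewall/security group, I guess port 8080 has to be opened there too.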

Related

How do I go about setting up my Sinatra REST API on a server?

I'm an iOS developer primarily. In building my current app, I needed a server that would have a REST API with a couple of GET requests. I spent a little time learning Ruby, and landed on using Sinatra, a simple web framework. I can run my server script, and access it from a browser at localhost:4567, with a request then being localhost:4567/hello, as an example.
Here's where I feel out of my depth. I set up an Ubuntu droplet at DigitalOcean and felt my way around setting up all the necessary tools via the command line, until I could again run my server, now on this droplet.
The problem is that I couldn't then access my server via droplet.ip.address:4567, and a bit of research led me to discover I need Passenger and an Apache HTTP Server to be set up, and not with simple instructions.
I'm way in over my head here, and I don't feel comfortable. There must be a better way to take my small group of Ruby files and run them on a server than this, but I have no idea what I'm doing.
Any help or advice would be greatly appreciated.
a bit of research led me to discover I need Passenger and an Apache HTTP Server to be set up, and not with simple instructions.
Ignore that for now. Take baby steps first. You should be able to run your Sinatra app from the command line on the DigitalOcean droplet and then access it via droplet.ip.address:4567. If that doesn't work, something very fundamental is wrong.
When you start your app, you will see what address and port the app is listening on. Make sure it's 0.0.0.0 and 4567. If it's 127.0.0.1 or localhost, that means it will only serve requests originating from the same machine.
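As a minimal sketch with a classic-style Sinatra app (hello.rb is just a placeholder name), the bind address can be forced in the code or on the command line:

# hello.rb - tiny Sinatra app that listens on all interfaces
require 'sinatra'

set :bind, '0.0.0.0'   # accept connections from other machines, not just localhost
set :port, 4567

get '/hello' do
  'hello from the droplet'
end

Start it with ruby hello.rb, or override from the command line with ruby hello.rb -o 0.0.0.0 -p 4567, then try droplet.ip.address:4567/hello from your Mac.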
After you get this working, the next step is to make your Sinatra app into a service. Essentially this means the app runs in the background and auto-starts when the system reboots. Look into Supervisor, which needs only a very simple configuration to get this running.
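A minimal Supervisor program entry might look like this, assuming the app lives in /home/deploy/myapp and is started with plain ruby (all paths and names here are placeholders):

[program:sinatra_app]
; run the app bound to all interfaces and restart it if it dies or the box reboots
command=/usr/bin/ruby hello.rb -o 0.0.0.0 -p 4567
directory=/home/deploy/myapp
autostart=true
autorestart=true
stdout_logfile=/var/log/sinatra_app.log
redirect_stderr=true

Drop a file like that into /etc/supervisor/conf.d/ and run supervisorctl reread followed by supervisorctl update.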
Later you can install Apache or Nginx to put in front of your Sinatra app. These are proxies that simply forward requests from port 80 (the default HTTP port) to your Sinatra app, but they can also add things such as SSL support, load balancing, custom error pages, etc. - none of which you need right now.
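Just as an illustration of what that proxying looks like (not something you need yet), an Nginx server block forwarding port 80 to the Sinatra app could be roughly this (server_name is a placeholder):

server {
    listen 80;
    server_name example.com;

    location / {
        # hand every request to the Sinatra app listening on 4567
        proxy_pass http://127.0.0.1:4567;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}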

Not able to access page data, using anemone with socksify gem and Tor

I've written a Ruby script using the anemone gem to crawl a website. The script runs fine when used directly.
But I would like to use the socksify gem so that all TCP calls from the script are routed through SOCKS5. I did the following:
Installed and started Tor, and it is running on my machine
Installed the socksify gem
Ran the following command, as given here: socksify_ruby localhost 9050 myscript.rb
However, anemone does not detect any pages in this case. Please let me know what mistake I am making.
There are a number of problems that could be causing this. First, if ntp is not running on your machine and the time is off by even a little bit, you will not be able to use the SOCKS server to do anything complicated. This happened to me. You need to install ntp and make sure it has synced before doing anything.
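On a Debian/Ubuntu box that boils down to something like the following (assuming the classic ntp package; other distros and NTP implementations differ):

# install ntp, then check the peer list; a '*' in the first column marks the synced peer
sudo apt-get install ntp
ntpq -p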
Second, you may find that a lot of these commands, like socksify, are obsolete. The best way I have found to make sure that everything happens through the SOCKS port without DNS leakage is to use curl, which has bindings for many languages. You can carefully watch the traffic with tcpdump to make sure it isn't leaking, and in my experience it is watertight.
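For example, you can test the Tor SOCKS port straight from the shell, resolving DNS through the proxy as well (9050 is Tor's default SOCKS port):

# --socks5-hostname sends the hostname to the proxy, so no local DNS lookup leaks out
curl --socks5-hostname 127.0.0.1:9050 https://check.torproject.org/

The same idea carries over to curl bindings in Ruby or any other language.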
I'd also suggest that you look at torsocks, which has recently been updated by dgoulet on GitHub. It replaces tsocks, which the outdated socksify_ruby is based on.
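Usage is just a wrapper around your normal command, roughly (assuming torsocks is installed and Tor is listening on its default port):

# route the script's TCP connections through Tor
torsocks ruby myscript.rb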
Finally, hidden services have been under great strain lately, because a bot has decided to start up a few million Tor clients. Make sure you can connect with the Tor Browser Bundle, assuming the project you are working on is trying to crawl a hidden service.
You didn't actually say that this project involves Tor or hidden services, but you did tag it with Tor.

Can gitlab be installed with Cherokee web server?

I've looked all over and can't figure out whether you can use Cherokee instead of Apache or Nginx for gitlab. I'd rather not run multiple web servers (and I imagine they could conflict anyway). I'm giving this a shot on Ubuntu Server 12.10.
For the record, I've already installed gitlab with this guide up to the Nginx section (with all default settings other than passwords, email addresses, and hostname). I'd like to install gitlab at git.mydomain.com, and I would prefer the local server files to be located at /var/www/git.mydomain.com, as I keep all of my domains under /var/www/.
Since you already have all of the Ruby config done, you just need to hook Cherokee up for hosting RoR by following this guide: http://cherokee-project.com/doc/cookbook_ror.html
My only problem turned out to be an issue with Ruby. Once that was resolved, I set up gitlab to use a port (though sockets should work too). Everything seems to work pretty well, except for an issue with pushing over HTTPS, but that might have something to do with my local Eclipse/eGit install.
So yes, gitlab will work with Cherokee.

Weird 127.0.0.1 & localhost resolution when running Apache on Mac OS X with POW/Powify installed

I could not figure out why http://localhost would resolve but http://127.0.0.1 would not while I was running Apache; it made no sense. When running POW, both would resolve.
I've set up the proper mapping in the hosts file and created a VirtualHost entry in the httpd-vhosts.conf file with a ServerName that I have already mapped to 127.0.0.1. The VirtualHost entry is for a reverse proxy setup.
Every time I used localhost, the VirtualHost entry would work, but any time I tried to access 127.0.0.1 or the mapped domain I was out of luck. In my case, due to how my project(s) are set up, http://localhost is not sufficient to run the dev environment, and I need the mapped entry in the hosts file to work.
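For reference, my setup is essentially the following, with placeholder names and ports (the real ones differ):

In /etc/hosts:
127.0.0.1   myapp.dev

In httpd-vhosts.conf (with mod_proxy and mod_proxy_http enabled):
<VirtualHost *:80>
    ServerName myapp.dev
    # reverse proxy every request to the local app server
    ProxyPass / http://127.0.0.1:3000/
    ProxyPassReverse / http://127.0.0.1:3000/
</VirtualHost>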
I also use POW for my Rails and Sinatra apps. I had previously run into POW issues, and to start and shut down the POW server I installed the Powify gem. Very convenient, I thought, and I assumed that when I ran "powify server stop" it would take care of things, which it did, for localhost resolution at least.
So how to deal with this? I ended up uninstalling POW. The simple solution to this issue is to completely uninstall POW; apparently I ran into this problem because of the configuration POW sets up.
Due to my lack of knowledge of what actually happens behind the scenes, I would appreciate it if someone could point out the inner workings. I've read some of the articles on how to set up POW alongside Apache, but I would very much appreciate it if someone could explain exactly why this behavior happens.
It ended up being that Powify does not remove the firewall exception; "powify server stop" just stops the POW instance to free up some memory and processing power.
You still need to uninstall POW for the firewall rule to be removed, so that you can use Apache properly.
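If I remember the Pow docs correctly, the clean way to do that is its own uninstall script, which should also drop the port-forwarding firewall rule it installed:

# documented Pow uninstall one-liner
curl get.pow.cx/uninstall.sh | sh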

Using Apache on Mac - NOT the built-in one

I've tried downloading Apache for my development on Mac OS X (Leopard) from this site:
http://www.techiecorner.com/174/how-to-install-apache-php-mysql-with-macport-in-mac-os-x/
I haven't downloaded PHP, so I skipped the PHP checks. Right after the download finished and I started the server, I opened 'localhost' in my browser and it loaded a page that says 'It works!', so I guess I'm on the right track to using an Apache web server.
Now, my questions are:
1. How do I find where this 'localhost' folder is, so I can put HTML files and so on there?
2. Is it already set up so that other users can reach my website (once I have one under the localhost folder), or do I need to do extra stuff to make that happen? If so, what more do I need to do?
Thanks.
From the site you mentioned, I see that you installed via MacPorts, so Apache goes into /opt/local/apache2/.
I think there is an etc directory in there where the configuration is found, in a file called httpd.conf; there you can find the DocumentRoot directive, which points to your 'localhost' folder.
1) It should be /opt/local/apache2/htdocs
2) I guess the default settings of Apache in MacPorts have it listening for incoming requests on all interfaces of your Mac, so yes, other users should be able to reach your website.
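To check both of these quickly from the terminal (the paths below assume the usual MacPorts layout; use find if httpd.conf lives elsewhere in your install):

# locate the config file, then look at the document root and the listening addresses/ports
find /opt/local -name httpd.conf 2>/dev/null
grep -i '^DocumentRoot' /opt/local/apache2/conf/httpd.conf
grep -i '^Listen' /opt/local/apache2/conf/httpd.conf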
PS: if you're looking for a quick and easy way to run a local web server with PHP and MySQL, I would recommend you take a look at XAMPP.
