dynamic root host switching - symlink

I have several projects on the G-WAN server, so I need to switch vhosts when I test a project from another machine on my local network.
I made a symbolic link to a vhost to serve as the root host.
But G-WAN reported "no 'root' #host for listener 0.0.0.0_80".
Whether the link used a relative or an absolute path, the same error occurred.
Is there any way to achieve dynamic root-host switching?
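For reference, the kind of link being attempted would look something like this (a sketch; the install path and host names are hypothetical, following G-WAN's convention of a # prefix for the root host and $ for virtual hosts):

cd /opt/gwan/0.0.0.0_80              # the listener directory from the error message
ln -sfn '$project_a' '#my_root'      # make the root #host a symlink to an existing $vhost

If G-WAN resolves the link, swapping the root host is then just a matter of re-pointing the symlink at a different $vhost directory.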

Can you try v4.25 posted this morning? v4.24 had compiler-optimization issues that prevented some code from executing (resulting in empty files on some platforms).

Related

Having problems running shellinabox on my Ubuntu 16.04

I have been trying to use shellinabox. I cloned the GitHub repo below and followed the steps mentioned there.
https://github.com/shellinabox/shellinabox
My /etc/default/shellinabox file looks like this:
# Should shellinaboxd start automatically
SHELLINABOX_DAEMON_START=1
# TCP port that shellinboxd's webserver listens on
SHELLINABOX_PORT=4200
# Parameters that are managed by the system and usually should not need
# changing:
# SHELLINABOX_DATADIR=/var/lib/shellinabox
# SHELLINABOX_USER=shellinabox
# SHELLINABOX_GROUP=shellinabox
# Any optional arguments (e.g. extra service definitions). Make sure
# that that argument is quoted.
SHELLINABOX_ARGS="--o-beep"
All the steps were the same for me except the .deb package installation: I downloaded shellinabox_2.20_armel.deb from another source and kept it in the root folder.
I also tried all the steps in the link below:
https://askubuntu.com/questions/414930/access-webpage-through-ssh
and watched this YouTube video:
https://www.youtube.com/watch?v=2viORNwxNTY
But I am not able to reach the server from a browser using my IP address: https://myipaddress:4200/
When I try to access that URL in a web browser, I see:
This site can’t be reached.
myipaddress refused to connect.
Try:
Checking the connection
Checking the proxy and the firewall
ERR_CONNECTION_REFUSED
What am I missing or doing wrong?
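ERR_CONNECTION_REFUSED usually means nothing is listening on that address and port, so a first step is to confirm the daemon is running and bound to 4200. A minimal check, assuming a systemd-based Ubuntu 16.04 and the port from the config above:

sudo systemctl status shellinabox    # is the daemon running at all?
sudo ss -ltnp | grep 4200            # is anything listening on the configured port?
sudo ufw status                      # is a local firewall blocking the port?

If the daemon is not running, starting it in the foreground (sudo shellinaboxd --port=4200) will print any startup error to the terminal.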

xdebug remote server and SFTP - cannot connect

I am trying to debug PHP files which sit on a remote server (on the same network) without success.
Here is my php.ini config for xdebug on the remote server where PHP and xdebug are installed:
xdebug.remote_enable=1
xdebug.remote_host=192.168.128.56
xdebug.remote_port=9000
xdebug.remote_handler=dbgp
xdebug.remote_autostart=0
192.168.128.56 is the IP address of my PC on which my editor is installed.
I have tried to get this working with both Atom and Sublime Text 3 without success. I think that my path bindings may be incorrect.
I log into the remote Linux machine using SFTP. I can then double-click PHP files in my application and they will open in my editor, where I can work on them and save them. How can I set up the path bindings to debug these remote PHP files? I'm not sure what the second (local) part of the path binding should actually be. Do I need to add the location where the FTP software stores a temporary copy of the file I am working on as the local part of the binding?
I have tried the following:
URL - the address of where the app runs on the remote server:
e.g. http://www.mywebsite.com/testapp/
Path Binding - remote path to the application root on linux : path to the local copy of the files on my machine where the FTP software stores them:
e.g. /web/testApp/ : C:\Users\me\AppData\Local\Temp\scp18929\
I'm a little confused about how the path binding works, and what the values should be. Am I doing this correctly? Can this even be done?
If anyone can help that would be great.
Probably, the first thing to check is whether Xdebug actually tries to connect to your IDE. You can do that by adding:
xdebug.remote_log=/tmp/xdebug.log
to your php.ini file. When you then initiate debugging, there should be information in the /tmp/xdebug.log file telling you where it tried to connect to and whether the connection succeeded or failed.
If you get something like:
I: Remote address found, connecting to 192.168.128.56:9000.
E: Could not connect to client. :-(
That means that either your IDE wasn't listening for a connection, there is a firewall preventing the incoming connection, or the IP address is incorrect.
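It is also worth confirming, on the PC running the editor, that the debugger plugin is actually listening on the configured port (9000, per the php.ini above). On Windows, for example:

netstat -an | findstr :9000

should show a line in the LISTENING state while the editor's debug session is armed; if nothing shows up, the plugin is not listening and Xdebug's connection will always be refused.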

windows hosts file seems modified localhosts xampp

Upon running Comodo Cleaning Essentials, I got a Modified Hosts threat warning. I checked the file in WINDOWS\system32\drivers\etc and, to my surprise, three undefined characters had been added after localhost:
http://i60.tinypic.com/2d1s6wz.jpg
(see image). Are these just unnecessary spaces, or what could have generated this change in the hosts file?
I should add that I've been running XAMPP on localhost with no connection problems. Is it safe to remove these characters, and will XAMPP continue to perform as expected? Or should this error be reported to Comodo?
Thanks in advance!
First of all, try using an IP address like http://127.0.0.1/ or your LAN IP address.
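For comparison, a stock Windows hosts file usually contains nothing more than this for localhost (on recent Windows versions the line often ships commented out, since the resolver handles localhost internally):

127.0.0.1       localhost

Anything after the host name on that line should be whitespace or a # comment; other trailing characters should be safe to delete.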

how to access vagrant box "guest machine" from host machine?

I am using the puphpet.com tool to set up Vagrant boxes.
Now I am able to SSH into it and open the IP in the browser, but I cannot access the vhost I set up earlier through PuPHPet.
I have edited my hosts file (/etc/hosts) on OS X to map the IP 2.168.56.101 to lab.dev. That works fine, but I still cannot reach the virtual host on the guest machine.
I am using the PHP Laravel framework and I need to access the server name, which points to /var/www/lab.dev/public/. I would appreciate a very detailed answer, as I am really new to all of this.
Detailed Instructions
1. Visit PuPHPet.com to build your Vagrantfile.
2. Configure your Shared Folder Pairs. This is under "Deploy Target > Locally". Let's assume this directory structure on your OS X machine:
/Users/unrivaled/Documents/laravel-project (project files go here)
/Users/unrivaled/Documents/laravel-project/public (the web root files)
3. Folder Source represents the location on your main computer (the "host" operating system) where your source files reside; for example: /Users/unrivaled/Documents/laravel-project. The Folder Source must be on your OS X machine, exactly where your Laravel files are.
4. Folder Target represents the location on your virtual computer (the "guest" operating system) where you want Vagrant to make them visible to the web server; for example: /var/www/lab. The Folder Target can be anywhere that Apache or Nginx, in the virtual machine, can reach it. In short:
Folder Source (local machine) == Folder Target (virtual machine)
/Users/unrivaled/Documents/laravel-project (local machine) == /var/www/lab (virtual machine)
5. Configure your web server. Your Server Name can be anything you want; in this example, let's use "lab.dev". Configure your Server Alias; in this case, use "www.lab.dev". Your Server Name (or an alias) must match your entry in your /etc/hosts file; see below. Configure your Document Root: this is the folder on your virtual machine where your website files go and from which they are served by Nginx or Apache. This value must be at, or below, the Folder Target defined in Step 4; for example, /var/www/lab/public.
Notice how in this example we are giving the web server access to /var/www/lab/public? Thanks to the Shared Folder Pairs configured in Step 2, this actually refers to /Users/unrivaled/Documents/laravel-project/public on your local OS X system.
6. Generally, configure everything else in PuPHPet as you see fit.
7. Run vagrant up to get your virtual machine up and running. If it doesn't work at this stage, you need to resolve any problems before going on.
8. Determine your virtual machine's IP address. Use vagrant ssh to log into the virtual machine; ifconfig should work for this. Do not rely on the IP address defined in PuPHPet: your virtual machine provider may override it, and you need to know the actual IP address in effect.
9. On your main host computer (not the virtual machine), edit your /etc/hosts file (sudo nano /etc/hosts), adding the server's IP address followed by the Server Name (or a Server Alias) defined in Step 5, as shown below.
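For example, if the virtual machine reported 192.168.56.101 in Step 8 (a hypothetical value; use whatever ifconfig actually shows), the /etc/hosts entry would be:

192.168.56.101    lab.dev    www.lab.dev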
How It Works
Once you have a working web server using the settings in this example, you can view your website by going to lab.dev. Your browser in OS X resolves lab.dev to the proper IP address of your server by way of the /etc/hosts file. It then requests your web page from that IP address, where the server matches the requested host name, "lab.dev", to the appropriate Server Name or Server Alias. The files in the Document Root for that server name (/var/www/lab/public) are then processed and served by the web server.
In summary, your server's IP address in the local /etc/hosts file matches your server's IP address in the virtual machine; your server's name in the local /etc/hosts file matches your Server Name (or Server Alias) in the web server on the virtual machine; the path name to your project source files on your local computer (Folder Source) maps to the Target Directory on the virtual machine; and finally, a subdirectory of that target directory (public) corresponds to the Document Root for the web server.

Setting up RabbitMQ cluster on Windows servers

I am trying to set up a RabbitMQ cluster on Windows servers, and this requires using a shared Erlang cookie file. According to the documentation, all I need to do is ensure that the root directories on the different machines contain the same .erlang.cookie file. So I found these files on both machines and overwrote them with the same shared version.
After that, all rabbitmqctl commands failed on the machine with the new file version with an "unable to connect to node..." error message. I tried restarting the RabbitMQ Windows service, but rabbitmqctl still complained. I even reinstalled RabbitMQ on that machine, but then .erlang.cookie was reset back to the old version. Whenever I tried to use the new version of the cookie file, rabbitmqctl failed; when I restored the old version, it worked fine.
Basically I am stuck and cannot proceed with the cluster setup until I resolve this issue. Any help is appreciated.
UPDATE: Received an answer from RabbitMQ:
"rabbitmqctl will pick up the cookie from the user home directory while the service will pick it up from C:\windows. So you will need to synchronise those with each other, as well as with the other machine."
This basically means that the cookie file needs to be replaced in two places: C:\Windows and the current user's home directory.
You have the above correct. The service uses the cookie at C:\Windows, while rabbitmqctl.bat, when you query the status, uses the cookie in your user directory (%USERPROFILE%).
When the cookies don't match, the error looks like:
C:\Program Files (x86)\RabbitMQ Server\rabbitmq_server-2.8.2\sbin>rabbitmqctl.bat status
Status of node 'rabbit@PC-FOOBAR' ...
Error: unable to connect to node 'rabbit@PC-FOOBAR': nodedown
DIAGNOSTICS
===========
nodes in question: ['rabbit@PC-FOOBAR']
hosts, their running nodes and ports:
- PC-FOOBAR: [{rabbit,49186},{rabbitmqctl30566,63150}]
current node details:
- node name: 'rabbitmqctl30566@pc-foobar'
- home dir: U:\
- cookie hash: Vp52cEvPP1PukagWi5S/fQ==
There is one more gotcha for RabbitMQ cookies on Windows: if the %HOMEDRIVE% and %HOMEPATH% environment variables are set (as in our current test environment, which is what sets the home dir above to U:\), then RabbitMQ will get the cookie from there, and if there isn't one it makes one up and writes it there. This left me banging my head on my desk for quite a while when trying to get this working. Once I found this gotcha, it was obvious the cookie files were the problem (as documented); they were just in an odd location (not documented, AFAIK).
Hope this solves someone's pain setting up RabbitMQ clustering on Windows.
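In practice, syncing the cookies on one machine might look like this (a sketch for an elevated cmd.exe; the default paths are assumptions, and the %HOMEDRIVE%%HOMEPATH% copy only matters if those variables point somewhere other than your profile, as with the U:\ home dir above):

rem make the per-user cookies match the one the service uses
copy /Y C:\Windows\.erlang.cookie "%USERPROFILE%\.erlang.cookie"
copy /Y C:\Windows\.erlang.cookie "%HOMEDRIVE%%HOMEPATH%\.erlang.cookie"
rem restart the service so it rereads its cookie
net stop RabbitMQ
net start RabbitMQ

Then copy the same C:\Windows\.erlang.cookie to the corresponding locations on the other machine so both nodes share a single cookie value.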
