Install icinga2 web as non-root user / source installation

I am installing icinga2web, and I need to copy files to /usr/share for it to run, which requires root privileges.
Is there any other option to install it in some other location?
I need to install Icinga on a remote host where I don't have root access.

Yes. I downloaded all the sources:
httpd
php
postgresql
icinga2 core
icingaweb
I compiled them in my local setup, and they work fine. Now I can copy my binaries across my VMs and everything works as any user. The only restriction is that the paths are hard-coded at build time (see the sketch below).
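As an illustration, here is a minimal sketch of building each component with a user-writable prefix so that no root access is needed. The prefix paths are examples, and it assumes the stock build systems (autoconf-style configure for httpd, php, and postgresql; CMake for icinga2):

# httpd, php and postgresql: the install prefix is baked in at configure time
./configure --prefix=$HOME/opt/httpd
make && make install

# icinga2 builds with CMake; set the prefix the same way
cmake .. -DCMAKE_INSTALL_PREFIX=$HOME/opt/icinga2
make && make install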

Related

Yarn (binary) offline installation on CentOS 7

How do I install Yarn (binary) offline on CentOS 7? The machine doesn't have Internet access. Apologies if the question has been asked before.
I couldn't find it anywhere. All questions point to using Yarn in offline mode, but not how to install it offline in the first place.
In the end, I managed to do it via a tarball (note: this is a Linux installation).
You can install Yarn by downloading a tarball and extracting it anywhere.
cd /opt
wget https://yarnpkg.com/latest.tar.gz
tar zvxf latest.tar.gz
Yarn is now in /opt/yarn-[version]/
The following steps add Yarn to your PATH variable so that you can run it from anywhere.
Note: your profile may be in your .profile, .bash_profile, .bashrc, .zshrc, etc.
Add this to your profile:
export PATH="$PATH:/opt/yarn-[version]/bin"
(the path may vary depending on where you extracted Yarn to)
Log out and log back in for the changes to take effect.
To have access to Yarn's executables globally, you will also need to add Yarn's global bin directory to the PATH environment variable. To do this, add
export PATH="$PATH:`yarn global bin`"
to your profile.
Here is the link where I found it.
Although Yarn can work in offline mode, the packages must be downloaded and stored in an offline mirror. Refer to this article.
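For reference, a minimal sketch of the offline-mirror setup that article describes; the cache directory name is just an example:

yarn config set yarn-offline-mirror ./npm-packages-offline-cache
# populate the mirror by running yarn install on a machine with Internet
# access, copy the cache directory over, then on the offline machine run:
yarn install --offline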
Your CentOS machine will need to be connected to another machine that has access to the Internet. The most common solution is to set up an HTTP/HTTPS proxy, then configure Yarn to use the proxy:
yarn config set proxy http://proxy.server.com:8080
yarn config set https-proxy http://proxy.server.com:8080

Running Composer on guest VM with Vagrant and rsync only keeps

I searched about this topic and could not find anything, so here is my question: I have Linux running in Vagrant as the guest and Windows as the host. I share folders with rsync to speed up development with Rails (using NFS or SMB is extremely slow). Some of my PHP dependencies are installed with Composer within my project, so when I run Composer via SSH from the guest it downloads and installs them. However, when I restart my VM I lose the downloaded dependencies and need to start over.
So is there anything I can do to run Composer remotely or locally and not lose my changes? So far I've tried changing the sync type to SMB, running Composer, and then going back to rsync; however, I have to keep switching back and forth and I'd like something more automated.
Thank you for your help!
Carlos.
From the docs:
The rsync synced folder does a one-time, one-way sync from the machine running Vagrant to the machine being started by Vagrant.
So this is intentional: any changes (not just Composer's) that you don't get out of the Vagrant machine and onto your copy on Windows will always be lost inside the VM.
Use SMB or shared folders and bear the performance penalty, or try to get an NFS server for Windows and install the files inside the VM via NFS.
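For illustration, a sketch of the relevant Vagrantfile setting; the paths are placeholders, and a vagrant reload is needed after changing the type:

# Vagrantfile (sketch)
Vagrant.configure("2") do |config|
  # rsync is one-way (host -> guest); smb and the default virtualbox
  # shared folders are two-way, at the cost of slower file access
  config.vm.synced_folder ".", "/vagrant", type: "smb"
end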

Joomla 3.1 setup on Ubuntu 12.04 running nginx

My objective is to have Joomla 3.1 running on an Ubuntu server using nginx.
I am testing the setup locally but keep encountering problems, which I think are related to the permissions on my Joomla source files.
I tried to install a package (T3) manually using the "Install From Directory" option. However, I receive the following message:
Warning JFTP::store: Bad response
JInstaller::Install: Failed to copy file
/usr/share/nginx/immigrationinformation.com/components/com_installer/t3-1.4.1/source/plg_system_t3/t3.php
to
/usr/share/nginx/immigrationinformation.com/plugins/system/t3/t3.php
Package Install: There was an error installing an extension:
plg_system_t3
I know that this is the wrong way to set up the server, but currently I have all the source file permissions set to 777. When the T3 package tries to install, it creates a folder called t3 in plugins/system/ with only drwxr-xr-x permissions, which is the reason for the errors above.
My question is: what is the correct way to set up my Joomla 3.1 package so that the site runs smoothly and securely?
Thanks in advance!
The permissions T3 sets when it installs are correct; it sounds like the problem is ownership rather than permissions.
On Ubuntu the web server user is www-data, so you would need to run the following over SSH:
chown -hR www-data:www-data /path/to/joomla/root
Then upload the plugin through the Joomla installer (and change folder permissions back to 755 and files to 644, as sketched below).
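A minimal sketch of resetting those permissions, assuming it is run from the Joomla root:

# directories back to 755, files back to 644
find . -type d -exec chmod 755 {} \;
find . -type f -exec chmod 644 {} \;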

Drupal install on localhost asks for FTP info

I'm running Drupal 7.4 on localhost, and I've downloaded some themes/modules but I'm unable to install them. I go to administration/modules, for example, select 'Upload a module or theme archive to install', and choose the tar.gz from my file system, but before the install I'm asked for an FTP user and password and cannot advance.
I'm working locally, so I'm thinking maybe I made some mistake during the install. How can I correct this? I have to do a lot of testing locally before moving the site to a server.
I found the solution here. All I have to do is place the modules/themes inside drupal_folder/sites/default/modules (or themes) and that's it.
Thanks @nmc
This can happen when the sites/default folder is not owned by the user that executes the install script. Make sure the folder sites/default is owned by the web server user (run this from your Drupal root):
Ubuntu:
chown www-data sites/default
Fedora:
chown apache sites/default
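To verify the result, something like this should list the web server user as the owner:

ls -ld sites/default
# the third and fourth columns of the output show the owner and group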
If you're not able to install the module because of the lack of an FTP connection, it's possible to use the old-fashioned way.
The other answer has described it, but it's not 100% correct.
If you want to do it the Drupal way, you need to install the modules/themes to (see the example after these paths):
drupalfolder/sites/all/modules
or
drupalfolder/sites/all/themes
If you have a multi-site installation of Drupal, then:
drupalfolder/sites/domain_name/modules
or
drupalfolder/sites/domain_name/themes
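For example, a manual install might look like this; the module archive name is hypothetical:

cd drupalfolder/sites/all/modules
tar xzf /tmp/some_module-7.x-1.0.tar.gz
# the module then shows up on the admin/modules page, ready to enable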

Developing on Windows -> Deploying on a Virtual Machine?

Is there an easy way to integrate with VirtualBox such that I could develop under the host, Windows, and deploy and run scripts via a mounted folder in a guest Linux system?
I'm looking to develop for Linux under Windows, kind of.
You can use VirtualBox's Shared Folders feature to let your Ubuntu virtual machine mount a directory of your Windows host. However, you're likely to have to deal with some impedance mismatches, like different line endings. I hope that is the least of your worries.
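For illustration, a rough sketch of that setup; the VM name, share name, and paths are placeholders:

# on the Windows host (in a command prompt): expose a directory to the VM
VBoxManage sharedfolder add "ubuntu-dev" --name projects --hostpath "C:\Users\me\projects"
# inside the Linux guest (requires the Guest Additions):
sudo mount -t vboxsf projects /home/me/projects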
You might want to check out Vagrant: http://vagrantup.com/
It provides a nice and easy way to create a VM from a template in VirtualBox, and it will automatically mount the project folder in the guest VM. The config can also easily be included in your project so others can use it.
I develop in PHP, with Debian as the guest OS and Windows 7 as the host OS.
You can mount the shared folder automatically like this:
Create a new file in /etc/init.d/ named mnt_win_sf and edit it (a sketch is shown below).
It must have the same LSB info header as /etc/init.d/apache2, and it needs just one line of command:
mount -t vboxsf share_folder_name mount_point
The script also needs to run before apache2, so edit /etc/init.d/apache2 and add mnt_win_sf to its Required-Start line.
Register both scripts with:
sudo update-rc.d mnt_win_sf defaults
sudo update-rc.d apache2 defaults
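A minimal sketch of /etc/init.d/mnt_win_sf under those assumptions; the share name and mount point are the placeholders from above:

#!/bin/sh
### BEGIN INIT INFO
# Provides:          mnt_win_sf
# Required-Start:    $local_fs
# Required-Stop:
# Default-Start:     2 3 4 5
# Default-Stop:      0 1 6
# Short-Description: Mount the VirtualBox shared folder before apache2
### END INIT INFO
mount -t vboxsf share_folder_name mount_point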
