Move Vtiger 7 Storage and Test directories to a new location - symlink

I have recently deployed my instance of Vtiger 7 in a load-balanced, auto-scaled configuration.
I have also created an NFS server and mounted it on my Vtiger server. This NFS share will also be auto-mounted on any additional servers in the auto-scaled scenario.
For this to all work properly I need to move the /storage and /test directories onto the NFS share, using symbolic links.
I have set this up perfectly and also established the proper symbolic links.
The problem I'm running into is that Vtiger reads data from the symlinked folders without issue, but it is unable to write to them due to permissions issues. I've set the permissions on the NFS folders to 775. I've also tried 777 just to rule it out, but I still get the same errors and Vtiger will not write to the directories. Any idea how I can solve this?

After burning my eyes out for many hours, I have solved my own question.
The issue was folder ownership. I essentially needed to change the owner of the symlinks and of the NFS directories to match the owner of the CRM web root.
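For anyone in the same spot, this is roughly what that looks like from the shell. The paths and the apache user here are assumptions, so adjust them to your own install:
# Give the NFS directories the same owner as the CRM web root
# (assumed here to be apache:apache and /var/www/html/vtigercrm).
sudo chown -R apache:apache /mnt/nfs/vtiger/storage /mnt/nfs/vtiger/test
# -h changes the owner of the symlinks themselves, not their targets
sudo chown -h apache:apache /var/www/html/vtigercrm/storage /var/www/html/vtigercrm/test
ls -l /var/www/html/vtigercrm | grep -E 'storage|test'   # verify the link and target owners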

Related

Setting up proper permissions for a Laravel app on Ubuntu

I have a DigitalOcean VPS with Ubuntu. I'm fairly new to deploying web apps and I'd like some advice on the best way to set permissions on my Laravel app folder so I don't have to worry about getting permissions errors.
I know about the storage and cache folders and how I should give www-data read and write permissions, but is there an easy way to set all this up nicely so I don't have to change permissions each time a new folder or file is added?
What does your initial setup look like after, say, cloning a git repository into the /var/www folder of your LEMP stack?
I know there are lots of tutorials on this but I'd like to know the more future-proof, solid way to handle this.
Thanks in advance and I'm open to all advice.
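Not an authoritative recipe, but one common pattern that avoids re-fixing permissions for every new file is to give the writable folders to the www-data group, set the setgid bit so new files inherit that group, and optionally add default ACLs. A minimal sketch, assuming the app lives in /var/www/myapp and the web server runs as www-data:
cd /var/www/myapp                                # hypothetical app path
sudo chown -R "$USER":www-data .                 # you own the code, web server group can read it
sudo chmod -R ug+rwX storage bootstrap/cache     # only these need to be writable
sudo find storage bootstrap/cache -type d -exec chmod g+s {} +    # new files inherit the group
sudo setfacl -dR -m g:www-data:rwX storage bootstrap/cache        # optional: default ACLs for future files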

pyRevit Profile keeps changing

First, I'm not the user using this; I'm implementing it for a couple of users.
We use VDI machines with all user profiles on the server. I have managed to clone the Git repo and keep a copy on the server, which I then copy out to the users with Robocopy.
This has worked great, but we are facing an issue when they want to change some settings: we get an error. The settings work fine if the config file points to the UNC path (\\domain.local\share\users\username), but if it points to the drive letter of the share (T:\users\username) or the C: drive (C:\users\username) we get an error.
I'll look for the errors and upload them.
Cheers
Isaac

Laravel 5.4 Error 500 on all but Front Page

I have a functioning Laravel app that I developed locally. I moved it onto a server via FTP (just to show someone for feedback).
I changed the APP_URL in .env to the subdomain pointing to the /public folder. I also changed the database information. Everything else was left exactly as is.
I can access the front page without any problem. Anything else (e.g. /login or an AJAX call to any other controller) results in a Server Error 500 that leaves no trace in the server error logs.
When I assign different routes to /, those are also displayed. I can show pages that pull data from the database, so that is not the issue.
Both local development and the server run Apache on Linux.
Any pointers?
Update: Thank you for the suggestions so far. I currently cannot access the server via ssh (not my server). I'm working on getting that set up and will try your solutions as soon as I can.
Thanks everyone.
With a little help from the hosting company we found the problem. All we had to do was add
RewriteBase /
to the .htaccess automatically created by Laravel.
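In case it helps someone else, this is roughly how that edit can be made from the shell. The docroot path is an assumption, and it's worth keeping a backup of the stock file:
cd /var/www/example/public                       # hypothetical Laravel public folder
cp .htaccess .htaccess.bak                       # keep the original around
# Insert "RewriteBase /" right after "RewriteEngine On" (GNU sed)
sed -i 's|RewriteEngine On|RewriteEngine On\n    RewriteBase /|' .htaccess
grep -n 'RewriteBase' .htaccess                  # confirm the directive is there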
Make sure that your web server has read and write permissions to the following folders:
public
bootstrap/cache
storage
If the web server does not have these permissions, it cannot compile views, store session data, write to log files, or store uploaded files.
Set Webserver as owner:
assuming www-data is your webserver user.
sudo chown -R www-data:www-data /path/to/directory
chown alone will not always work; in some cases I had to chgrp to www-data on my Ubuntu VPS as well.
Here is how I check it:
The domain has to resolve, and if there's an SSL certificate, the padlock has to show in the browser. This doesn't matter on localhost, but Laragon is the quickest way to set that up.
Then I know things are working if I can write something into an index.html file in the docroot and see it in the browser. If I can't see it, the permissions or ownership are set wrongly.
There is plenty of information online about setting up chown and chgrp for Laravel, so whenever I clone or unzip a repo, the first thing I do now is set those two up. Once they are right, I can run npm install and composer install (unless it's shared hosting where I can't, as opposed to a VPS or localhost). See the sketch below.
At that point you should be able to see Laravel's pages; only public and storage might need different permissions from the rest.
The .env file should be created with the right permissions as well, otherwise key:generate will fail later, for instance.
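That workflow, as a rough sketch; the user, group, and path are assumptions, and the composer/npm steps only apply on a VPS or localhost:
cd /var/www/example                      # hypothetical project path
sudo chown -R "$USER":www-data .         # chown/chgrp first, right after cloning or unzipping
sudo chmod -R ug+rwX storage bootstrap/cache
composer install
npm install
cp .env.example .env                     # make sure this ends up readable and writable by you
php artisan key:generate                 # fails if .env was created with the wrong permissions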

Symfony 3 permissions var/cache

I am setting up Symfony 3 for a web project. I have installed it as per the Symfony Book. When I test by accessing config.php in the web folder from the browser I get the following messages:
Major problems have been detected and must be fixed before continuing:
- Change the permissions of either "app/cache/" or "var/cache/" directory so that the web server can write into it.
- Change the permissions of either "app/logs/" or "var/logs/" directory so that the web server can write into it.
I have already set up the permissions as described in the Symfony Book using setfacl, and I have checked them: www-data is the group owner of the var folder and its cache, logs and sessions subfolders, and the folder and file permissions are set to 775.
Please note this is Symfony 3 file structure not Symfony 2.
Has anyone had a similar experience and managed to find a solution?
Solved the problem. I had set the permissions in the project directory, but I had set the project up using PhpStorm to copy the files to a local server, and I had not changed the permissions on the server copy. Apologies, it's the first time I have used this feature in PhpStorm, as I usually work in the served directory.
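For reference, these are roughly the setfacl commands from the Symfony book; the point is to run them against the copy the web server actually serves, not just the local project directory (the path is an assumption):
cd /var/www/symfony-project              # the directory the web server serves, assumed path
HTTPDUSER=$(ps axo user,comm | grep -E '[a]pache|[h]ttpd|[_]www|[w]ww-data|[n]ginx' | grep -v root | head -1 | cut -d' ' -f1)
sudo setfacl  -R -m u:"$HTTPDUSER":rwX -m u:"$(whoami)":rwX var
sudo setfacl -dR -m u:"$HTTPDUSER":rwX -m u:"$(whoami)":rwX var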

unknown file 663.php in ftp root

For some reason I have a 663.php file in every folder and subfolder of my httpdocs root on my web server's FTP. I don't know where this file came from, and my host does not know either.
I would very much appreciate any help.
Depending on its content, it sounds like it could be a PHP shell backdoor.
An attacker would upload this file to gain access to your files, database, etc.
They usually exploit a flaw in your application to upload files.
Be sure to update all the software you are running.
Someone might have gained access to your site, so also change all your passwords.
It looks like this has happened to other users before:
Your site has been hacked. The 663.php file is sending out anonymous spam. If you host with GoDaddy, this is a common theme as thousands of accounts share one IP and one person with shell access can get in and place an htaccess file above the root folder on the server and autoload the files into every folder in your website and onto every site within that IP address. — Year1Media
Quote from AolAnswers.
Thank you all for your help. After a little searching I found out that it was a Plesk security vulnerability. The problem was solved by running a patch in Parallels Plesk. Apart from inserting unknown files, it also changed .htaccess to redirect to weird websites.
It is a Plesk problem; there are correction patches here:
http://kb.parallels.com/en/113321
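A rough clean-up sketch once the patch is applied; the vhosts path is an assumption, and it's worth reviewing the list before deleting anything:
find /var/www/vhosts -type f -name '663.php' -print      # list the dropped files first
find /var/www/vhosts -type f -name '663.php' -delete     # then remove them
find /var/www/vhosts -type f -name '.htaccess' -exec grep -l 'RewriteRule' {} +   # review any rewrites you did not add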
