I am setting up Symfony 3 for a web project. I installed it as described in the Symfony book. When I test by accessing config.php in the web folder from the browser, I get the following messages:
Major problems have been detected and must be fixed before continuing:
- Change the permissions of either "app/cache/" or "var/cache/" directory so that the web server can write into it.
- Change the permissions of either "app/logs/" or "var/logs/" directory so that the web server can write into it.
I have already set up the permissions as described in the Symfony book using setfacl and checked them: www-data is the group owner of the var folder and of the cache, logs and sessions subfolders, and the folder and file permissions are set to 775.
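For reference, the commands I ran follow the book's ACL approach, roughly this (with www-data as the web server user, run from the project root):

# grant the web server and the current user rwX on var/
sudo setfacl -R -m u:www-data:rwX -m u:$(whoami):rwX var
# -d sets the default ACL so newly created files inherit the same rights
sudo setfacl -dR -m u:www-data:rwX -m u:$(whoami):rwX var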
Please note this is Symfony 3 file structure not Symfony 2.
Has anyone had a similar experience and managed to find a solution?
Solved the problem. I had set the permissions in the project directory, but I had set it up using PHPStorm to copy the files to a local server and had not changed the permissions on the server. Apologies, this is the first time I have used this feature in PHPStorm, as I usually work in the served directory.
I have recently deployed my instance of Vtiger 7 in a load balanced auto scaled configuration.
I have also created a NFS server and mounted this to my Vtiger server. This NFS will also be auto mounted to any additional servers in the auto scaled scenario.
In order for this to all work properly I need to move the /storage and the /test directories to the NFS utilizing a symbolic link.
I have set this up perfectly and also established the proper symbolic links.
The problem I'm running into is that Vtiger will read data from the symlinked folders without issue, but it is unable to write to them due to permissions issues. I've set the permissions on the NFS folders to 775. I've also tried 777 just to rule permissions out, but I still get the same errors and Vtiger will not write to the directories. Any idea how I can solve this?
After burning my eyes out for many hours, I have solved my own question.
The issue was folder ownership. I essentially needed to change the symlink owner and the NFS directory owner to match the owner of the CRM web root.
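A sketch of what that looked like, with hypothetical paths and www-data standing in for the web root's owner:

# make the NFS directories owned by the same user as the CRM web root
sudo chown -R www-data:www-data /mnt/nfs/storage /mnt/nfs/test
# -h changes the ownership of the symlink itself rather than its target
sudo chown -h www-data:www-data /var/www/html/storage /var/www/html/test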
I am currently maintaining a Laravel 5.5 project.
I have a copy of production that runs on my own computer. Both environments use the file session driver.
Recently, I found that production had stopped saving session files in the storage/framework/sessions folder.
However, no matter how I change the permissions of all the folders inside storage to 777, session files just don't appear in storage/framework/sessions, while the copy running on my own computer writes files as usual.
I can't figure out what the problem could be; even after searching through every piece of information I could find, it remains unsolved.
Also, I'm not sure what information would help others investigate. The only detail that might be relevant is that production is hosted on HostGator.
Oh, I found the problem was in the .env file: the one in production had been changed, by someone or by accident, to use cookie as its session driver.
I had overlooked this because config/session.php was identical in both environments.
After I set the session driver back to file and ran php artisan config:cache, everything started working correctly.
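In short, the fix boiled down to this (standard Laravel 5.5 commands; SESSION_DRIVER is what config/session.php reads):

# in the production .env
SESSION_DRIVER=file

# rebuild the cached configuration so the change takes effect
php artisan config:cache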
I have a functioning laravel app that I developed locally. I moved it onto a server via ftp (just to show someone for feedback).
I changed the APP_URL in .env to the subdomain pointing to the /public folder. Also changed the database information. Everything else was left exactly as is.
I can access the front page without any problem. Anything else (e.g. /login or an AJAX call to any other controller) results in a 500 Server Error that leaves no trace in the server error logs.
When I assign different routes to /, those are also displayed. I can show pages that pull data from the database, so that is not the issue.
Both local development and server run apache on linux.
Any pointers?
Update: Thank you for the suggestions so far. I currently cannot access the server via ssh (not my server). I'm working on getting that set up and will try your solutions as soon as I can.
Thanks everyone.
With a little help from the hosting company we found the problem. All we had to do was add
RewriteBase /
to the .htaccess file automatically created by Laravel.
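For context, here is roughly where the line goes in Laravel's default public/.htaccess (a trimmed sketch; your generated file may differ slightly):

<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteBase /

    # Handle Front Controller...
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^ index.php [L]
</IfModule>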
Make sure that your web server has read and write permissions to the following folders:
public
bootstrap/cache
storage
If the web server does not have these permissions, it cannot compile views, store session data, write to log files, or store uploaded files.
Set the web server as the owner (assuming www-data is your web server user):
sudo chown -R www-data:www-data /path/to/directory
chown alone won't always work; in some cases I also had to chgrp to www-data on my Ubuntu VPS.
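A sketch of that group-based variant, assuming the project lives in /var/www/html (hypothetical path) and the web server group is www-data:

sudo chgrp -R www-data /var/www/html/storage /var/www/html/bootstrap/cache
# g+rwX adds group read/write, and execute only on directories (and already-executable files)
sudo chmod -R g+rwX /var/www/html/storage /var/www/html/bootstrap/cache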
Here is how I check it:
The domain has to point to the site, and if there's an SSL certificate, the padlock has to appear in the browser. This doesn't matter on localhost, but Laragon is the quickest way to set one up.
Now I know the setup is right if I can write something into an index.html file inside and see it. If I can't see it, permissions or ownership are set wrongly.
There is plenty of information online about setting up chown and chgrp for Laravel, so if I clone a repo or unzip one, the first thing I do now is set those two up. Once they are right, I can run npm install and composer install (unless it's shared hosting where I can't, as opposed to a VPS or localhost).
Now you should be able to see Laravel's pages; only public and storage might need different permissions from the rest.
The .env file should be created with the right permissions as well; if not, php artisan key:generate will fail later, for instance.
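For instance, on a fresh clone the usual .env bootstrap is (standard Laravel commands):

# copy the example env shipped with the repo, then generate the application key
cp .env.example .env
php artisan key:generate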
I am currently building a C# WebApi 2 application that I will be uploading to an Amazon Elastic Beanstalk instance to deploy. I am having success so far, and on my local machine, I just finished testing the file upload capability in order for clients to upload images.
The way it works is that I accept the multipart/form-data in the Web API and save a temp file (with a random name like BodyPart_24e246c7-a92a-4a3d-84ef-c1651416e667) to the App_Data folder. The temporary file is then put into an S3 bucket and I create a reference to it in my SQL Server database.
Testing works fine with single or multiple file uploads locally but when I deploy the application to Elastic Beanstalk and try to upload I get errors like "Could not find a part of the path 'C:\inetpub\wwwroot\sbeAPI_deploy\App_Data\BodyPart_8f552d48-ed9b-4ec2-9986-88cbffd673ee'" or a similar one saying access is denied altogether.
I have been trying to find the solution online for a few hours now, but the AWS documentation is all over the place and tutorials/other questions seem to be outdated. I believe it has something to do with not having permission to write the temporary files on the EC2 server, but I can't figure out how to fix it.
Thanks very much in advance.
This has been possible since April 2013. Basically, the steps you need to perform are the following:
Create a folder called .ebextensions in the top level of your project through the Solution Explorer
Add your configuration file to this folder, e.g. myapp.config (replace myapp with your Elastic Beanstalk app's name)
Add the code displayed underneath to the configuration file you just created. Replace MyApp with your project name (not the solution name) displayed in Visual Studio
All set! Be sure there's a file within App_Data, otherwise Visual Studio won't publish the folder.
{
    "container_commands": {
        "01-changeperm": {
            "command": "icacls \"C:/inetpub/wwwroot/MyApp_deploy/App_Data\" /grant DefaultAppPool:(OI)(CI)F"
        }
    }
}
To give write permission to your DefaultAppPool you can:
create an .ebextensions folder
create a config file and place it in your .ebextensions folder
The following will change the permissions on your wwwroot folder:
container_commands:
01-changeperm :
command : 'icacls "C:\\inetpub\\wwwroot" /grant "IIS APPPOOL\DefaultAppPool:(OI)(CI)F"'
I had the same problem (unable to write to a file in the App_Data folder of my web application on Elastic Beanstalk).
In my case it was sufficient to create a dummy file in the App_Data folder in my Visual Studio project. When I did this, the App_Data folder was created during deployment with permissions that allow the web application to write to it.
No need for .ebextensions to change folder permissions.
The App_Data folder does not have write permissions by default, and you would have to set appropriate permissions during deployment of your apps.
Check out this post for a detailed explanation of how to do it: http://thedeveloperspace.com/granting-write-access-to-asp-net-apps-hosted-on-aws-beanstalk/
This question is pretty old, but for anyone else who ends up having the same issue: I had the same problem with AWS. Connect to your instance and change the properties of the folder you want to upload files to. Select the folder you want to grant read/write access to, click Properties, and set the permissions that way.
My issue was with uploading images to the server. I couldn't put them in the App_Data folder since that is a special folder reserved for the app only, and I needed the images to be accessible through a URL. So I created another folder, "Uploads". I published my API, then connected to the instance through Remote Desktop, located the Content folder, and set its properties to read/write for DefaultAppPool. That solved my problem; hope this helps someone out there.
I want a live Magento site to run on localhost without affecting any functionality of the live site. I have tried many ways of doing this but have not gotten anywhere.
The steps I tried are:
1. Took the database from the live Magento site by logging into cPanel (via FTP access) > phpMyAdmin, exported everything to my local machine, and imported all the data into my local phpMyAdmin.
2. Took all the necessary files from cPanel > File Manager (for example public_html, .htpasswd, .trash, access logs, etc.), put them in a folder, and placed that folder in C:\xampp\htdocs\.
3. Replaced the live site's URL with localhost:1234 in all the SQL files from step one, where applicable.
But it still does not work.
Any help will be appreciated.
Copy your LIVE Magento store to your Local computer:
Download the magento files using any ftp client.
Export the database from live server.
Put the downloaded Magento files in your localhost root folder.
Create a blank database (let's call it 'local-database') on your local computer and import the database backup that you exported from the live one.
Delete/Rename the file app/etc/local.xml
Re-install the Magento using the local-database.
After installation, go to Admin section and then
(i)Flush all cache. (ii)Re-Index all data. (iii)Flush all cache.
That's it. You are done.
N.B. If you have domain-specific modules installed, those modules will not work here.
Seems very simple, right? Believe me, it is that simple.
If you face any problem installing Magento on your local computer, here is a post that may come in handy: http://www.insync.co.in/how-to-install-magento-on-wamp-server-localhost-localcomputer/
Steps:
Download the files to the local project folder.
Create a new local DB and import the live database backup/dump.
Update the app/etc/local.xml file with the local DB parameters (host name, DB name, DB username, DB password).
Magento has the project URL saved in two places (secure URL and unsecure URL) in the core_config_data table. We need to update both in the imported local DB to the local URL (the 9th and 10th records in the table).
Delete the cache: delete the contents of the var folder, as shown in the sketch after this list (that folder contains reports and logs too; I assume you won't need them as this is a separate installation).
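A minimal sketch of that cleanup, run from the Magento root (assuming a default Magento 1 layout):

# clear cached data and stale sessions; remove the rest of var/ too if you don't need the reports and logs
rm -rf var/cache/* var/session/*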
The local copy will most probably work by now, but there is a chance it will not. Things to do in that case:
If you are getting redirected to the live site, check the .htaccess file for redirects (for various reasons there may be a redirect defined in the file).
If you are getting a forbidden error, this will come in handy (it usually occurs on Linux systems).
There may still be some problems, probably theme- or module-specific. In that case you will need to debug the project and find out what the problem is. Xdebug will come in handy here for speeding up the debugging process :)
Subrata's solution will conflict with some of the installed Magento modules and will not allow you to re-install Magento locally. I followed these steps and everything works fine.
Just give the folder 0777 permissions on your local PC after taking a backup of it.
First of all, you will have to change the secure and unsecure base_url values in your database.
These can be found in the 'core_config_data' table.
Paths:
web/unsecure/base_url
web/secure/base_url
If you want to access your local version of Magento via localhost, you'll have to set localhost as your base_url.
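A sketch of that update as a single query, assuming a hypothetical local DB named 'local-database' (Magento expects a trailing slash on the URL):

# update both base_url rows in one go
mysql -u root -p local-database -e "UPDATE core_config_data SET value = 'http://localhost/' WHERE path IN ('web/unsecure/base_url', 'web/secure/base_url');"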
After that you need to clear your cache folder.
EDIT:
To install and run Magento on your local PC using XAMPP, please follow these steps:
http://www.magentocommerce.com/wiki/1_-_installation_and_configuration/installing_on_windows_with_xampp_and_wamp