I’ve got two applications running in my CI deployment. I’m using a separate folder to hold shared resources between the two apps.
Then in autoload.php I have:
$autoload['packages'] = array(SHARED_RESOURCES_PATH);
I'm finding that models and helpers load fine from the shared directory; however, I've added a config folder containing database.php, and that file doesn't get loaded.
Anyone know why this would be?
Most likely the new config directory you created does not have the same permissions or ownership as your other shared directories. Compare permissions and ownership, and ensure config matches the others; this is usually the culprit when everything else is working.
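A quick way to check this is to compare the new directory against a sibling directory that does work. A minimal sketch (the paths here are throwaway examples; on a real server you would point stat at the shared package's models and config folders):

```shell
# Simulate a shared package where config/ has stricter permissions than models/.
mkdir -p /tmp/shared_demo/models /tmp/shared_demo/config
chmod 755 /tmp/shared_demo/models
chmod 700 /tmp/shared_demo/config   # the simulated mismatch

# Print mode, owner, group and name side by side; differing first columns
# mean the web server user may be unable to read one of the directories.
stat -c '%a %U %G %n' /tmp/shared_demo/models /tmp/shared_demo/config

# Copy the working directory's mode onto the broken one (GNU coreutils).
chmod --reference=/tmp/shared_demo/models /tmp/shared_demo/config
stat -c '%a %n' /tmp/shared_demo/config
```

If the modes already match, compare ownership (`%U %G`) the same way, since a readable mode is useless if the web server user is in neither the owner nor the group.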
I have a Laravel project on my local machine that is currently in development. For various reasons, the application is showcased to people using an Nginx server. The problem is this: on the local machine I host the project at the root (localhost:8000/), but on the Nginx server I host it in a subfolder (e.g. 10.x.x.x/webapp/). This breaks a lot of things, and I constantly need to change the references to my assets and scripts back and forth. For example:
Font Awesome fails to load because it looks for the js and css directories instead of webapp/js and webapp/css.
In a Vue component holding a picture, the picture won't load because it looks for /img/picture.jpg instead of /webapp/img/picture.jpg.
The only way I've managed to solve this is, in the case of Font Awesome, to add the mix.setResourceRoot('/webapp') call to the webpack.mix.js file, and in the case of the assets, to add /webapp at the beginning. But doing that breaks everything on localhost, since now everything points to a folder that doesn't exist on my machine.
What is a possible solution to have both running without constant reference changes? And what other possible problems could emerge?
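One way to avoid hard-coding the prefix (a sketch, assuming a Laravel version whose default config/app.php reads an ASSET_URL environment variable, and that Blade templates reference files through the asset() helper rather than literal /css and /img paths) is to keep the prefix in each environment's .env file:

```ini
# .env on the local machine (example values)
APP_URL=http://localhost:8000
# ASSET_URL left unset: asset('css/app.css') resolves to /css/app.css

# .env on the Nginx showcase server
APP_URL=http://10.x.x.x/webapp
ASSET_URL=/webapp
# asset('css/app.css') now resolves to /webapp/css/app.css
```

For assets referenced from compiled JavaScript (the Vue component case), the mix.setResourceRoot() argument in webpack.mix.js can likewise be fed from an environment variable instead of a hard-coded '/webapp', so each machine builds with its own prefix.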
I am currently maintaining a Laravel 5.5 project.
I have a copy of the production app that runs on my own computer. Both environments use the file session driver.
Recently, I found that production had become unable to save any session file in the storage/framework/sessions folder.
However, even after changing the permissions of every folder inside storage to 777, session files still don't appear in storage/framework/sessions, while the copy running on my own computer writes them as usual.
I can't figure out what the problem is; even after searching through all the information I could find, it remains unsolved.
I'm also not sure what information would help others investigate. The only detail that might be relevant is that the production host is HostGator.
Oh, I found the problem: it was in the .env file. The one in production had been modified, whether by someone or by accident, to use cookie as its session driver.
I had neglected this part because config/session.php had no differences between the two environments.
After I set the session driver back to file and ran php artisan config:cache, everything started working correctly.
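For reference, the fix boils down to one line in the production .env; the env() call in the default config/session.php makes this value win over whatever the config file shows:

```ini
# production .env
SESSION_DRIVER=file
# afterwards, rebuild the cached configuration so the change takes effect:
#   php artisan config:cache
```

This is also why comparing config/session.php between the two environments showed nothing: the files were identical, and the divergence lived entirely in .env.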
I just uploaded my Laravel project to my shared hosting and was wondering what configuration changes I should make to get the project working.
My typical checklist:
Modify the document root to the /public folder
Make the /storage and bootstrap/cache folders writable.
Set up the database
Modify the .env file to suit the live environment
Run php artisan migrate
This should get you up and running, or at least bring you to a point where the errors are detailed enough to work things out.
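The writable-folders step of the checklist can be sketched as below (a sandboxed example using a throwaway directory, so it is safe to try anywhere; on a real host you would run the chmod from the project root, and 775 plus the web server's group is usually safer than 777):

```shell
# Work in a throwaway directory that mimics a Laravel project root.
cd "$(mktemp -d)"
mkdir -p storage/framework/sessions storage/logs bootstrap/cache

# Make the folders Laravel writes to group-writable.
chmod -R 775 storage bootstrap/cache

# On a real shared host you would also hand the group to the web server user,
# e.g.:  chown -R youruser:www-data storage bootstrap/cache

# Sanity check: the session directory must be writable by the current user.
[ -w storage/framework/sessions ] && echo "storage writable"
```

If sessions or logs silently fail after deployment, this writability check is usually the first thing worth re-running on the server itself.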
I am setting up Symfony 3 for a web project. I have installed it as per the SymfonyBook. When I test by accessing config.php in the web folder from the browser I get the following messages:
Major problems have been detected and must be fixed before continuing:
- Change the permissions of either "app/cache/" or "var/cache/" directory so that the web server can write into it.
- Change the permissions of either "app/logs/" or "var/logs/" directory so that the web server can write into it.
I have already set up the permissions as described in the Symfony Book using setfacl and checked them: www-data is the group owner of the var folder and of its cache, logs and sessions subfolders, and the folder and file permissions are set to 775.
Please note this is Symfony 3 file structure not Symfony 2.
Has anyone had a similar experience and managed to find a solution?
Solved the problem. I had set the permissions in the project directory, but I had set the project up using PhpStorm to copy the files to a local server, and I had not changed the permissions on the server. Apologies; this is the first time I have used this feature in PhpStorm, as I usually work in the served directory.
I have a few resources (log files, database files, separate configuration files, etc.) that I would like to be able to access from my OSGi bundles. Up until now, I've been using a relative file path to access them. However, now my same bundles are running in different environments (plain old Felix and Glassfish).
Of course, the working directories are different and I would like to be able to use a method where the directory is known and deterministic. From what I can tell, the working directory for Glassfish shouldn't be assumed and isn't spec'ed (glassfish3/glassfish/domains/domain1/config currently).
I could try to embed these files in the bundles themselves, but then they would not be easily accessible. For instance, I want it to be easy to find the log files without having to explode a cached bundle to access them. Also, I don't know that I can give my H2 JDBC driver a URL to something inside a bundle.
A good method is to store persistent files in a subdirectory of the current working directory (System.getProperty("user.dir")) or of the user's home directory (System.getProperty("user.home")).
Temporary and bundle-specific files should be stored in the bundle's private data area (BundleContext.getDataFile()). Uninstalling the bundle will then clean them up automatically. If different bundles need access to the same files, use a service to pass this information around.
The last option: really long-lived, critically important files, like major databases, should be stored under /var (or the Windows equivalent). In those cases I would point out the location with Config Admin.
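Pointing out the location with Config Admin can be as small as a single property. A hypothetical example in the .cfg format used by Felix FileInstall (the PID my.app.storage and the key data.dir are made-up names; your bundle would read the property from the Configuration it receives):

```ini
# etc/my.app.storage.cfg  (hypothetical PID and property name)
data.dir=/var/lib/myapp
```

Because the path lives in configuration rather than in code, the same bundle can run unchanged under plain Felix and under Glassfish, each deployment supplying its own directory.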
In general it is a good idea to deliver the files in a bundle and expand them to their proper place. This makes managing the system easier.
You have some options here. The first is to use the Configuration Admin service to specify a configuration directory, so you can access files if you have to.
For log files I recommend Ops4J Pax Logging. It allows you to simply use a logging API like slf4j and Pax Logging does the log management. It can be configured using a log4j config.
I think you should install the DB as a bundle too. For example, I use Derby a lot in smaller projects. Derby can simply be started as a bundle and then manages the database files itself. I'm not sure about H2, but I guess it could work similarly.