I've inherited a Laravel 5.3 application that does not appear to be logging web processes or anything else server-side in my development environment. Here are the things I've tried and confirmed:
Set APP_DEBUG = true
storage/logs exists and all users have read/write/execute permissions
I’ve created an empty laravel.log file, thinking it needs to exist before it can be written to. I’ve also run the app without that file.
FWIW, this app is running in a Vagrant instance and has the debug bar package installed.
Any thoughts on what is going on here or something I can try to get logging started?
Thanks.
I found it hiding in the vagrant instance here: /var/log/nginx
With that solved, I'd still be grateful for any insight or resource explaining how or why it's configured that way. Even knowing this, searching the project and combing through the Vagrantfile still doesn't shed light on why the logs end up there rather than in storage/logs.
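For anyone else hitting this: in Laravel 5.3 the log destination is controlled by the log option in config/app.php (typically driven by an APP_LOG env value). If that option is set to errorlog or syslog, entries are handed to PHP's error handler or the system logger, which php-fpm/nginx will usually surface under the web server's log directory (e.g. /var/log/nginx) instead of storage/logs/laravel.log. A quick place to check, assuming a stock 5.3 config file; the value shown is only illustrative:

// config/app.php (stock Laravel 5.3 layout)
return [
    // ...
    // 'single' and 'daily' write to storage/logs/laravel.log;
    // 'syslog' and 'errorlog' send entries to the system/web-server logs instead.
    'log' => env('APP_LOG', 'single'),
    // ...
];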
I am unable to create a new Common Data Service Database in my Power Apps default environment. Please see the error text below.
It looks like you don't have permission to use the Common Data Service in this environment. Switch to a different environment, or create your own.
As I understand it, I should be able to create one after the Microsoft Business Applications October 2018 update, as described in the article at the following link.
https://community.dynamics.com/365/b/dynamicscitizendeveloper/archive/2018/10/17/demystifying-dynamics-365-and-powerapps-environments-part-1
Also, when I try to create a Common Data Service app in my default environment, I encounter the following error.
The data did not load correctly. Please try again.
The environment 'Default-57e1485d-1197-4afd-b792-5c423ab508d9' is not linked to a new CDS 2.0 instance. The operation 'ListInstanceMetadata' is forbidden for unlinked environments.
Moreover, I am unable to see the default environment at https://admin.powerapps.com/environments; I can only see the Sandbox environment there.
Any ideas what I am missing here?
Thank you.
Someone else faced a similar issue, and I read in one of the threads that clearing the browser cache and trying again, or trying a different browser, resolved it. Could you try these first-level steps and check whether you still have these issues?
Ref: https://powerusers.microsoft.com/t5/Common-Data-Service-for-Apps/Default-Environment-Error-on-CDS/m-p/233582#M1281
Also, for your permission error ref: https://powerusers.microsoft.com/t5/Common-Data-Service-for-Apps/Common-Data-Service-Business-Flows/td-p/142053
I have not validated these findings, but since these answers come from the Microsoft and PowerApps teams, I hope they help!
I've spent 3 days beating my head against this before coming here in desperation.
Long story short, I thought I'd fire up a simple PHP site to let moderators of a gaming group I'm in start GCP servers on demand. I'm no developer, so I'm looking at this from a systems perspective to find the simplest solution that does the job.
I fired up an Ubuntu 18.04 machine on GCP, set it up with the Google Cloud SDK, authorised it for access to the project, and was able to run gcloud commands, which worked fine. I had some issues with the PHP file calling the shell script that runs the same commands, but with some testing I can see it's now calling the shell script without a problem: it broadcasts wall "test" to the console every time I click the button on the PHP page.
However, what does not happen is the execution of the gcloud command. If I run the shell script manually it starts the instance without a problem and broadcasts wall; if I click the button, it broadcasts but that's it. I've given the files execute rights, and I've even given the user nginx runs as sudo rights; putting sudo sh in front of the command in the PHP file also made no difference. Please find the bash script below:
#!/bin/bash
# Call gcloud by its full path, since the web server user's PATH may not include the SDK.
/usr/lib/google-cloud-sdk/bin/gcloud compute instances start arma3s1-prod --zone=australia-southeast1-b
# Broadcast a message so it's easy to confirm the script actually ran.
wall "test"
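For debugging this kind of setup, a minimal sketch of how the PHP side might call the script while also capturing stderr (the wrapper filename, script path, and output handling are assumptions, since the original PHP isn't shown):

<?php
// start-server.php (sketch) — the script location below is an assumption
$script = '/var/www/scripts/start_arma3.sh';

// Redirect stderr to stdout so any gcloud errors show up in the page output.
$output = shell_exec(escapeshellcmd($script) . ' 2>&1');

echo '<pre>' . htmlspecialchars($output) . '</pre>';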
Any help would be greatly appreciated; this, coupled with an automated shutdown, would allow our gaming group to save money by only running the servers people want to play on.
If you want any more detail about the underlying system, please let me know.
So I asked a PHP dev at work about this, and in two seconds flat she pointed out the issue; now I feel stupid. In /etc/passwd the www-data user had /usr/sbin/nologin as its shell. After I fixed that and ran the script, gcloud wanted permission to write a log file to /var/www. Fixed that and it works fine. I'm not terribly worried about the page or even the server being hacked and destroyed; I can recreate them pretty easily.
Thanks for the help though! Sometimes I think I just need to take a step back and get a fresh set of eyes on the problem.
When you launch a command while logged in, you have your own account's access rights to the Google Cloud API, but the PHP account doesn't have those rights.
Even if you give the www-data user root privileges, that won't fix the problem; it may create some security issues, but nothing more.
If you really want to do this, you should create a service account that only has rights over the Compute Engine instances in your project, and point the GOOGLE_APPLICATION_CREDENTIALS environment variable at its JSON key file. That way your PHP code should have enough rights to do what you are asking of it.
Note that the drawback of this method is that if you are hacked, there is a chance the instance hosting your PHP site could be deleted too.
You could also try calling a prepared Cloud Function that starts the instance; that way, even if your instance is deleted, the Cloud Function would still be there.
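As an illustration of the service-account route, here is a minimal sketch using the google/apiclient PHP library with application default credentials; the project ID and key-file path are placeholders, while the zone and instance name are taken from the question:

<?php
require 'vendor/autoload.php';

// Assumes GOOGLE_APPLICATION_CREDENTIALS points at the service account's JSON key,
// e.g. putenv('GOOGLE_APPLICATION_CREDENTIALS=/etc/gcloud/start-server-sa.json');
$client = new Google_Client();
$client->useApplicationDefaultCredentials();
$client->addScope('https://www.googleapis.com/auth/compute');

$compute = new Google_Service_Compute($client);

// Start the instance named in the question (project ID is a placeholder).
$compute->instances->start('my-project-id', 'australia-southeast1-b', 'arma3s1-prod');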
For the past three to four months our application has been live and running, and we haven't deployed any new fixes or changes to Live. However, unfortunately, we noticed that the application has stopped working.
The following is the error we observed in our logs:
"Can't create/write to file '/var/tmp/#sql_2f6_0.MYI'"
It would be really appreciated if any of you could help.
Check the services and the user for which MySQL is giving you this error. It is quite possible that one of the services is down, or that the user you are connecting to the database with is not being authenticated.
You, or the user that runs your MySQL service, don't have write permission to /var/tmp/. You can fix this with chmod or security permissions, depending on which platform you're on.
EDIT - HUGE ERROR ON MY PART
I found another site that had the issue, which I knew was not on the same server. Then I realized that the original site with the issue was also on a different server and had not been moved over completely yet. The server in question was actually a Parallels Plesk server, and the issue was caused by a patch applied over the weekend as part of a security update. This server did have the file path, and I just had to chmod it to 777 instead of 77x for it to work. I apologize for the confusion and thank everyone for trying to help. +'s for all. :)
Original Post
I have a website on a shared hosting server (also mine) that since yesterday started giving me this error:
Warning: session_start() [function.session-start]: open(/var/lib/php/session/sess_678cf69f0f17b87c52136ee0280d23cc, O_RDWR) failed: Permission denied (13) in /var/www/vhosts/domain.net/httpdocs/index.php on line 1
I've checked /usr/lib/php.ini and /usr/local/lib/php.ini to see where the session save path is set, and both say it is set to the /tmp directory, which is where it should be and always has been. The /var/lib/php/session directory never even existed; I created it and gave it 777 permissions, but that did not help. The bigger issue, though, is why it changed to begin with. There is no .htaccess file for this site, and I cannot find this being set anywhere on the site itself either.
This is the ONLY site on this server with this issue, which tells me it's something local to the website. I just cannot figure out what. So my question is this: what should I look for to check the session save path settings for an individual site in a shared hosting environment, to find out why it suddenly changed for this one client?
FYI, I am running a WHM server.
Thanks
session_save_path(realpath(dirname($_SERVER['DOCUMENT_ROOT']) . '/../tmp'));
You need to add the above code before starting the session.
You don't appear to have write permission to the /var directory on your server. This is a bit weird, but you can work around it: before the call to session_start(), put in a call to session_save_path() and give it the name of a directory writable by the server.
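For completeness, a minimal sketch of that workaround, assuming a tmp/ directory alongside the document root that the web server can write to:

<?php
// Use a writable directory outside the web root for session files (path is an assumption).
$sessionDir = dirname($_SERVER['DOCUMENT_ROOT']) . '/tmp';

if (!is_dir($sessionDir)) {
    mkdir($sessionDir, 0770, true); // create it on first run
}

session_save_path($sessionDir);
session_start();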
I was wondering if someone could help me.
I have started using version control (Git) for my website, which is built with CodeIgniter.
Every time I transfer files from my localhost to my live server, I have to go through all my files and change the config details.
I came across a post saying I could handle all of this automatically with the ENVIRONMENT setting in index.php, based on the server name.
Has anybody done this before? If so, could you let me know how it's done properly?
Cheers,
Try this for a start (index.php):
if ($_SERVER["HTTP_HOST"] == 'devserver1' || $_SERVER["HTTP_HOST"] == 'devserver2') {
    define('ENVIRONMENT', 'development');
} else {
    define('ENVIRONMENT', 'production');
}
Then, whenever you need it, you check for the ENVIRONMENT constant (for example, different database settings, etc.). For localhost, simply check if the server is 'localhost' ($_SERVER["HTTP_HOST"] == 'localhost'), or whichever virtual host name you might be using.
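As an example of putting the constant to work, here is a sketch of switching database settings in application/config/database.php; the host names and credentials are placeholders:

<?php
// application/config/database.php (fragment; credentials are placeholders)
if (ENVIRONMENT === 'development') {
    $db['default']['hostname'] = 'localhost';
    $db['default']['username'] = 'dev_user';
    $db['default']['password'] = 'dev_pass';
    $db['default']['database'] = 'myapp_dev';
} else {
    $db['default']['hostname'] = 'db.example.com';
    $db['default']['username'] = 'live_user';
    $db['default']['password'] = 'live_pass';
    $db['default']['database'] = 'myapp_live';
}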
You could always use environment variables
http://httpd.apache.org/docs/2.2/env.html
This will allow you to read the environment instead of hard-coding the information in your code.
This may also help you out
http://docstore.mik.ua/orelly/linux/apache/ch04_06.htm
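A minimal sketch of that approach, assuming the Apache virtual host sets an APP_ENV variable with SetEnv (the variable name is an assumption):

<?php
// index.php (fragment) — assumes the vhost contains: SetEnv APP_ENV development
$env = getenv('APP_ENV');
define('ENVIRONMENT', $env !== false ? $env : 'production'); // default to production if unset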
Not sure if you're still needing help with this, but I had this issue a while ago and released a CodeIgniter module which is designed to automatically handle multiple environments.
It's a shameless plug, but it's saved me lots of editing, and it might be of use to anyone else who reads this post in the future.
Here's the link to the Git repo: https://github.com/jedkirby/ci-multi-environments and this is a brief explanation of why and how I made the module: http://jedkirby.com/blog/2012/11/codeigniter-multiple-development-environments