Handling 'Error in exception handler' error in Laravel - laravel

I get the "Error in exception handler" error very often, mainly because of file permission issue, and sometimes because of error in code.
I want to redirect user to a custom error page every time the system encounters the 'error in exception handler' error.
How do I handle this error?

It's because Laravel can't write to the log file. If you don't want logs, you can disable logging in app/start/global.php around line 55:
App::error(function(Exception $exception, $code)
{
    Log::error($exception); // comment out this line to disable logging
});
But honestly, that would be treating the symptom instead of the problem. You should chown app/storage recursively to the user running the web server. The fastest way:
In public/index.php, at the very top, temporarily put die(`whoami`); just after the opening <?php tag.
Load any page and copy whatever it prints on the site. Let's say it's www-data.
Fire up a terminal, go to your project root and run chown -R www-data app/storage, swapping www-data for whatever you found in the previous step.
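As for the custom error page part of the question: once the permission problem above is fixed, a minimal sketch in Laravel 4 syntax (the errors.general view name is only an assumption; create app/views/errors/general.blade.php or similar yourself):
App::error(function(Exception $exception, $code)
{
    Log::error($exception);

    // Returning a response here replaces Laravel's default error output
    // with your own page. 'errors.general' is a hypothetical view name.
    return Response::view('errors.general', array(), 500);
});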

Related

MapReduceIndexerTool output dir error "Cannot write parent of file"

I want to use Cloudera's MapReduceIndexerTool to understand how morphlines work. I created a basic morphline that just reads lines from the input file, and I tried to run the tool with this command:
hadoop jar /opt/cloudera/parcels/CDH/lib/solr/contrib/mr/search-mr-*-job.jar org.apache.solr.hadoop.MapReduceIndexerTool \
--morphline-file morphline.conf \
--output-dir hdfs:///hostname/dir/ \
--dry-run true
Hadoop is installed on the same machine where I run this command.
The error I'm getting is the following:
net.sourceforge.argparse4j.inf.ArgumentParserException: Cannot write parent of file: hdfs:/hostname/dir
at org.apache.solr.hadoop.PathArgumentType.verifyCanWriteParent(PathArgumentType.java:200)
The /dir directory has 777 permissions on it, so writing into it should definitely be allowed. I don't know what I should do to allow the tool to write into that output directory.
I'm new to HDFS and I don't know how I should approach this problem. Logs don't offer me any info about that.
What I have tried so far (with no result):
created a hierarchy of 2 directories (/dir/dir2) and put 777 permissions on both of them
changed the output-dir schema from hdfs:///... to hdfs://... because all the examples in the --help menu are built that way, but this leads to an invalid schema error
Thank you.
It states 'Cannot write parent of file', and the parent in your case is /. Take a look at the source:
private void verifyCanWriteParent(ArgumentParser parser, Path file) throws ArgumentParserException, IOException {
Path parent = file.getParent();
if (parent == null || !fs.exists(parent) || !fs.getFileStatus(parent).getPermission().getUserAction().implies(FsAction.WRITE)) {
throw new ArgumentParserException("Cannot write parent of file: " + file, parser);
}
}
What is printed in the message is file, in your case hdfs:/hostname/dir, so file.getParent() will be /.
Additionally, you can test the permissions with the hadoop fs command; for example, you can try to create a zero-length file in the path:
hadoop fs -touchz /test-file
I solved that problem after days of working on it.
The problem is with the argument --output-dir hdfs:///hostname/dir/.
First of all, there should not be 3 slashes at the beginning, as I had put in my repeated attempts to make this work; there should be only 2, as in any valid HDFS URI. I had put 3 slashes because otherwise the tool throws an invalid schema exception. You can easily see in the code that the schema check is done before the verifyCanWriteParent check.
Second, I had tried to get the hostname by simply running the hostname command on the CentOS machine I was running the tool on. This was the main issue. When I analyzed the /etc/hosts file, I saw that there were 2 hostnames for the same local IP. I took the second one and it worked. (I also attached the port to the hostname, so the final format is: --output-dir hdfs://correct_hostname:8020/path/to/file/from/hdfs)
This error is very confusing because everywhere you look for the namenode hostname, you will see the same thing the hostname command returns. Moreover, the errors are not structured in a way that lets you diagnose the problem and follow a logical path to solve it.
Additional information regarding this tool and debugging it
If you want to see the actual code that runs behind it, check the Cloudera version that you are running and select the same branch in the official repository; master is not up to date.
If you just want to run this tool to play with the morphline (by using the --dry-run option) without connecting to Solr, you can't. You have to specify a ZooKeeper endpoint and a Solr collection or a Solr config directory, which requires additional research. This is something that could be improved in this tool.
You don't need to run the tool with -u hdfs; it works with a regular user.

Joomla 500 Error

I am trying to log into my Joomla administrator on my Localhost but I keep getting a 500 error.
I've tried loads of things:
Changed file permissions:
chmod 777 error.php,
chmod 775 cache, logs, administrator, etc.
I've tried uncommenting # RewriteBase / in my .htaccess
I've changed the path in my configuration.php to:
public $log_path = './logs';
public $tmp_path = './tmp';
I have also checked my Apache error logs and they come back with the following:
[22-May-2014 14:17:49 Europe/Berlin] PHP Warning: fopen(./logs/error.php): failed to open stream: No such file or directory in /Applications/XAMPP/xamppfiles/htdocs/mydir/libraries/joomla/log/loggers/formattedtext.php on line 248
[22-May-2014 14:17:49 Europe/Berlin] PHP Warning: fputs() expects parameter 1 to be resource, boolean given in /Applications/XAMPP/xamppfiles/htdocs/mydir/libraries/joomla/log/loggers/formattedtext.php on line 254
Any help much appreciated
Create the directory
/Applications/XAMPP/xamppfiles/htdocs/mydir/libraries/joomla/log/loggers/logs
Then try again
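If creating that directory alone does not help, a hedged alternative (assuming the site root really is the XAMPP path shown in the warnings, and that writable logs and tmp folders exist there) is to use absolute paths in configuration.php instead of the relative ./logs and ./tmp:
// configuration.php - hypothetical absolute paths, adjust them to your install
public $log_path = '/Applications/XAMPP/xamppfiles/htdocs/mydir/logs';
public $tmp_path = '/Applications/XAMPP/xamppfiles/htdocs/mydir/tmp';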

Laravel 4.1 can't render template when uploaded to server

It's weird: when developing on localhost, everything works fine and the default page shows.
After uploading to the server, it just shows a blank page!
It's driving me crazy!
echo 'outside route';
Route::get('/', function()
{
echo 'inside route';
return View::make('hello');
});
Both echoes work, but View::make('hello') just doesn't; views/hello.php is the default file.
You might have to fix your permissions on the remote server, as it might be a cache issue.
1) Run a recursive chmod on your storage path (*assuming you already have proper file ownership)
cd /path/to/laravel
chmod -R 755 app/storage
2) Clear cache with Artisan
php artisan cache:clear
3) Refresh page, should work now.
*If you are running the HTTP server as a different user (for example, you're on Ubuntu and Apache runs as user www-data), you might want to set file ownership for the Laravel app files as well:
chown -R www-data .
EDIT:
Just a remark about your code example - remember that if you want to use the Blade templating engine you have to name your files accordingly. If you want a Blade template called 'something', place your code in app/views/something.blade.php and then refer to it with, for example, View::make('something'), as sketched below.
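A minimal sketch (the 'something' view and the /blade-test route are purely illustrative names):
{{-- app/views/something.blade.php --}}
<h1>Hello, {{ $name }}</h1>

// app/routes.php
Route::get('/blade-test', function()
{
    // Laravel resolves 'something' to app/views/something.blade.php
    return View::make('something', array('name' => 'world'));
});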

got error 22 from storage engine mysql

mysqldump: Error: 'got error 22 from storage engine' when trying to dump tablespaces
mysqldump: Got error: 23: Out of resources when opening file '.\database\table.MYD' (Errcode: 24) when using LOCK TABLES
I get this error when trying to make a dump of any database I select. It looks like the database is corrupted. Is it possible to repair it?
You seem to have reached the maximum number of open files. This limit is either MySQL's or the system's.
increase the value for the open_files_limit in your MySQL configuration file (this directive does not exist in a default installation, so you might need to create it in the [mysqld] section)
increase the limit at system level (but I am not sure this applies to Windows)
This error can also occur when importing an SQL file with the mysql client's source command. Type "source path-to-SQL-file", BUT you must follow these rules:
Use the full source command, not the \. shortcut.
Have no spaces in your path. I copied mine to the root of a drive. Note that spaces in the file name are OK, just not in the path.
Do not quote the file name, even if it has spaces. This gave error 22.
Use forward slashes in the path, e.g., C:/path/to/filename.sql. Otherwise you’ll get error 2.
Do not end with a semicolon.
Please check your read/write access to the drive where your MySQL database is stored.
Error 22 usually occurs when you have no write access to that drive.

PHP session handling errors

I have this at the very top of my send.php file:
ob_start();
#session_start();
//some display stuff
$_SESSION['id'] = $id; //$id has a value
header('location: test.php');
And the following at the very top of my test.php file:
ob_start();
#session_start();
error_reporting(E_ALL);
ini_set('display_errors', '1');
print_r($_SESSION);
When the data sends to test.php, the following is displayed:
Array ( )
Warning: Unknown: open(/var/lib/php/session/sess_isu2r2bqudeosqvpoo8a67oj02, O_RDWR) failed: Permission denied (13) in Unknown on line 0
Warning: Unknown: Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/var/lib/php/session) in Unknown on line 0
I've tried only using session_start(); but the results are the same.
Look at your message.
The first thing relates to permissions:
open(/var/lib/php/session/sess_isu2r2bqudeosqvpoo8a67oj02, O_RDWR) failed: Permission denied (13) in Unknown on line 0
You have to check the file permissions and change the mode of /var/lib/php/session/.
The second thing relates to session.save_path:
Warning: Unknown: Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/var/lib/php/session) in Unknown on line 0
in php.ini
[Session]
; Handler used to store/retrieve data.
session.save_handler = files
; Argument passed to save_handler. In the case of files, this is the path
; where data files are stored. Note: Windows users have to change this
; variable in order to use PHP's session functions.
;
; As of PHP 4.0.1, you can define the path as:
;
; session.save_path = "N;/path"
;
; where N is an integer. Instead of storing all the session files in
; /path, what this will do is use subdirectories N-levels deep, and
; store the session data in those directories. This is useful if you
; or your OS have problems with lots of files in one directory, and is
; a more efficient layout for servers that handle lots of sessions.
;
; NOTE 1: PHP will not create this directory structure automatically.
; You can use the script in the ext/session dir for that purpose.
; NOTE 2: See the section on garbage collection below if you choose to
; use subdirectories for session storage
;
session.save_path = /tmp/ <= HERE YOU HAVE TO MAKE SURE
; Whether to use cookies.
session.use_cookies = 1
You have to change your session.save_path setting to an accessible directory, /tmp/ for example.
How to change: http://php.net/session_save_path
If you are on a shared host, it is advised to set your session save path to a directory inside your home directory but not under the document root.
Also note that using ob_start is unnecessary here, and I am sure you put the # in front of session_start() by accident and are going to remove it for good, aren't you?
This was a known bug in some versions of PHP. Depending on your server environment, you can try setting the sessions folder to 777:
/var/lib/php/session (your location may vary)
I ended up using this workaround:
session_save_path('/path/not/accessable_to_world/sessions');
ini_set('session.gc_probability', 1);
You will have to create this folder and make it writable. I haven't messed around with the permissions much, but 777 worked for me (obviously).
Make sure the place where you are storing your sessions isn't accessible to the world.
This solution may not work for everyone, but I hope it helps some people!
You can fix the issue with the following steps:
Verify the folder exists by changing into it with cd /var/lib/php/session. If it does not exist, then sudo mkdir /var/lib/php/session, or double-check the logs to make sure you have the correct path.
Give the folder full read, write and execute permissions with sudo chmod 777 /var/lib/php/session (a directory needs the execute bit to be traversable, so 666 is not enough).
Rerun your script and it should be working fine. However, it's not recommended to leave the folder with full permissions. For security, files and folders should only have the minimum permissions required. The following steps will fix that:
You should already be in the session folder, so just run sudo ls -l to find out the owner of the session file.
Set the correct owner of the session folder with sudo chown user /var/lib/php/session.
Give just the owner read, write and execute permissions with sudo chmod 700 /var/lib/php/session.
NB
You might not need to use the sudo command.
Go to your php.ini file, or find the PHP.ini EZConfig option in your cPanel, and set your session.save_path to the full path of the tmp directory, e.g. /home/cpanelusername/tmp
Please make sure session.save_path is set correctly in php.ini. PHP needs read/write access to the directory this variable points to.
More information: http://www.php.net/manual/en/session.configuration.php#ini.session.save-path
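To confirm what PHP is actually using, here is a quick diagnostic sketch (drop it into a test page served by the same web server):
<?php
// Print the configured session save path and check whether this process can write to it.
// An empty value means PHP falls back to the system temp directory.
$path = session_save_path();
if ($path === '') {
    $path = sys_get_temp_dir();
}
echo "session.save_path: " . $path . PHP_EOL;
echo "writable by this process: " . (is_writable($path) ? 'yes' : 'no') . PHP_EOL;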
I had the same error even though everything, like the folder permissions, was set correctly.
It looks like a PHP quirk in my case, because when I deleted my PHPSESSID cookie it worked again. Apparently something got messed up and the session file was removed while the cookie was still active. PHP could report the cause better by first checking whether the session file still exists and giving a different error instead of the permission error.
When using the latest WHM (v66.0.23) you may go to the MultiPHP INI Editor, choose the PHP version, and set session.save_path to the default, i.e. /var/cpanel/php/sessions/ea-php70, instead of the previous plain tmp - this helped me get rid of such errors.
When using the header function, PHP does not trigger a close of the current session. You must use session_write_close to close the session and release the file lock on the session file.
ob_start();
session_start(); // make sure the # that commented this out in the question is removed
//some display stuff
$_SESSION['id'] = $id; //$id has a value
session_write_close();
header('location: test.php');
Check your cPanel disk space and remove unused files or the error.log file, then try to log in to your application again. (This worked for me.)
I got these two error messages, along with two others, and fiddled around for a while before discovering that all I needed to do was restart XAMPP! I hope this helps save someone else from the same wasted time!
Warning: session_start(): open(/var/folders/zw/hdfw48qd25xcch5sz9dd3w600000gn/T/sess_f8bgs41qn3fk6d95s0pfps60n4, O_RDWR) failed: Permission denied (13) in /Applications/XAMPP/xamppfiles/htdocs/foo/bar.php on line 3
Warning: session_start(): Cannot send session cache limiter - headers already sent (output started at /Applications/XAMPP/xamppfiles/htdocs/foo/bar.php:3) in /Applications/XAMPP/xamppfiles/htdocs/foo/bar.php on line 3
Warning: Unknown: open(/var/lib/php/session/sess_isu2r2bqudeosqvpoo8a67oj02, O_RDWR) failed: Permission denied (13) in Unknown on line 0
Warning: Unknown: Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/var/lib/php/session) in Unknown on line 0
I'm using PHP 5.4.45 and I had the same problem.
If you are a php-fpm user, try editing php-fpm.conf and changing listen.owner and listen.group to the right user. My nginx user is apache, so I changed these two params to apache, and then it worked well for me.
For Apache users, I guess you should edit your FastCGI params with reference to the two params mentioned above.
If you use a configured vhost and find the same error, then you can override the default setting of php_value session.save_path under your <VirtualHost *:80>:
#
# Apache specific PHP configuration options
# those can be override in each configured vhost
#
php_value session.save_handler "files"
php_value session.save_path "/var/lib/php/5.6/session"
php_value soap.wsdl_cache_dir "/var/lib/php/5.6/wsdlcache"
Change the path to your own directory (for example /tmp) with chmod 777 permissions.

Resources