I just can't figure out why this is happening. When I dispatch a job which uploads a file to my Amazon S3, I want to delete it and update a relation (channel). Everything works except that I can't delete the file. I can't delete the file even manually; Windows says some process is still using it. To delete it manually I have to stop the queue worker first. The weird part is that the file is uploaded to my S3, the relation is updated in the database, and the queue worker in the terminal says the job is processed. No failed or running jobs in the jobs table. Does anyone know what's going on?
My job's handle method is below.
public function handle()
{
    $path = storage_path() . "/uploads/" . $this->fileId;
    $fileName = $this->fileId . ".png";

    if (Storage::disk("s3images")->put("profile/" . $fileName, fopen($path, "r+"))) {
        File::delete($path);
    }

    $this->channel->image_filename = $fileName;
    $this->channel->save();
}
Try
unlink($path)
It worked for me.
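The underlying cause is likely that the stream opened with fopen($path, "r+") is never closed: Storage::put() reads from it, but the queue worker keeps the handle open, and Windows will not delete a file with an open handle. A minimal plain-PHP sketch (using a hypothetical temp file in place of the real upload) of closing the handle before deleting:

```php
<?php
// Simulate the upload-then-delete flow with a local temp file.
$path = tempnam(sys_get_temp_dir(), 'upload_');
file_put_contents($path, 'image bytes');

// Open a read handle, as the job does before handing it to Storage::put().
$stream = fopen($path, 'r');
$contents = stream_get_contents($stream); // stand-in for the S3 upload

// Close the handle BEFORE deleting; on Windows an open handle locks the file.
fclose($stream);

unlink($path);
var_dump(file_exists($path)); // bool(false)
```

In the job itself, the equivalent change would be keeping the return value of fopen() in a variable and calling fclose() on it after the put() call, before File::delete($path).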
I'm currently confused about why my exec command for downloading files from S3 to a local directory is not working in Laravel, even though the command I created works well in the AWS command line interface. Please see the reference I attached below.
Here is the sample code I created in my public function.
Function:
public function send_zip_files_to_store() {
    $source = 's3://compexp/11-10-2019/01790exp.zip';
    $destination = 'C:\xampp\htdocs';
    exec('aws cp sync ' . $source . ' ' . $destination);
}
Route:
Route::get('/send_zip_files_to_store','DashController#send_zip_files_to_store')->name('send_zip_files_to_store');
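One thing that stands out: `aws cp sync` mixes two different subcommands. The AWS CLI uses `aws s3 cp` for a single object and `aws s3 sync` for a directory. A hedged sketch of building the corrected command (paths taken from the question; escapeshellarg() added defensively, which is my addition, not part of the original code):

```php
<?php
// Paths from the question.
$source      = 's3://compexp/11-10-2019/01790exp.zip';
$destination = 'C:\xampp\htdocs';

// `aws cp sync` is not a valid subcommand; use `aws s3 cp` for one object
// (or `aws s3 sync` to mirror a whole prefix).
$command = 'aws s3 cp ' . escapeshellarg($source) . ' ' . escapeshellarg($destination);
echo $command, PHP_EOL;
// exec($command, $output, $exitCode); // run it once the command is correct
```

Note also that exec() runs with the web server's environment, so `aws` must be on that user's PATH (or be called by its full path) and its credentials must be available to that user.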
I have a job which I want cron to run at a given interval; it removes some old entries from my db and then removes the files that were uploaded with the db entry.
If I run the task from the terminal it works fine (removing both the db entries and the uploaded files). But when I leave the task to cron, it does remove the entries in the db, but it doesn't remove the uploaded folders in my public directory.
The code that removes the files looks like this:
$machtiging = File::files('icec/'.$icecconsult->accesstoken.'/machtiging');
if (count($machtiging) > 0) {
    File::deleteDirectory(public_path() . '/icec/'.$icecconsult->accesstoken);
}
So it checks whether the files are there and, if so, deletes them, but this just doesn't work. I've tried running the cron job as root, www-data, and my own user, all with the same result. To be sure, I have set file and folder permissions to 777, but this doesn't seem to be the problem.
I've also tried adding SHELL=/bin/bash, but that didn't do the trick either.
Any help on solving this issue would be much appreciated.
Update
The cron line looks like this:
* * * * * /bin/bash /home/ice/verlopenicec.sh >> /tmp/output 2>&1
I also tried:
* * * * * /home/ice/verlopenicec.sh >> /tmp/output 2>&1
and
* * * * * /usr/bin/php /var/www/wemedic/artisan verwijder-verlopen-icec-consult 1>> /dev/null 2>&1
All seem to run; it's just that it doesn't delete or move the directory it needs to.
I'm trying to get some debug data, but nothing is showing up.
The verlopenicec.sh script itself is just a reference to the original command that Laravel should run; I thought it might be handy to make a script to test why Laravel isn't deleting the directory.
The script looks like this:
#!/bin/bash
SHELL=/bin/bash
USER=ice
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
/usr/bin/php /var/www/wemedic/artisan verwijder-verlopen-icec-consult
which runs a Laravel command that looks like this:
$icecconsult = Icecconsult::where('id', '=', $consult_id)->firstOrFail();
$icecconsult->expire = Carbon::now();
$icecconsult->status = 'Gesloten';
$icecconsult->save();

$icec = Icec::where('id', '=', $icecconsult->icec_id)->firstOrFail();
$icec->delete();

$machtiging = File::files('icec/'.$icecconsult->accesstoken.'/machtiging');
if (count($machtiging) > 0) {
    // File::deleteDirectory(public_path() . '/icec/'.$icecconsult->accesstoken);
    $move = 'mv /var/www/wemedic/public/icec/'.$icecconsult->accesstoken.' /tmp';
    shell_exec('SHELL=/bin/bash');
    shell_exec('PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games');
    shell_exec($move);
    // File::deleteDirectory('/var/www/wemedic/public/icec/'.$icecconsult->accesstoken);
}
return;
(I've commented out the delete function and tried to move the directory to the temp directory instead, but I get the same result whether moving or deleting: both work if I run the command directly, but not when cron runs it. Cron does run the task, because I can see it being fired in /var/log/syslog and the database entry does get changed.)
I've tried deleting, then moving it to the temp folder; both work if I run them directly, but neither works when I leave it to cron / the Laravel cron scheduler.
I've also tried to get a response (true/false) from the delete function, but when I try to save that to the db to see it, the function seems not to execute.
dd($machtiging) returns an array like the one below, showing the files in the folder. After confirming there are files in the folder, it should go ahead and delete the complete folder along with any files/subdirectories located in it:
array:1 [▼
0 => "icec/a89ce4c9010e0a745308b29782b5eeae/machtiging/machtiging.pdf"
]
Thanks for your help.
Try with this crontab:
*/1 * * * * ice /bin/bash /home/ice/verlopenicec.sh >> /tmp/output.log
Change your bash script to:
#!/bin/bash
moment="`/bin/date +%y_%m_%d`"
echo "--- The script has been executed on $moment ---"
/usr/bin/php /var/www/wemedic/artisan verwijder-verlopen-icec-consult
It should work better, but if not, could you paste here the content of your generated /tmp/output.log ?
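Another frequent cause of "works in the terminal, fails under cron" is the working directory: File::files('icec/...') uses a relative path, which resolves against wherever cron started the process rather than the project root, so the files() check finds nothing and the delete branch never runs. A standalone plain-PHP sketch (hypothetical temp directory) showing the difference:

```php
<?php
// Build a throwaway directory structure mimicking public/icec/<token>/machtiging.
$base = sys_get_temp_dir() . '/cron_demo_' . getmypid();
mkdir($base . '/machtiging', 0777, true);
touch($base . '/machtiging/machtiging.pdf');

chdir('/'); // cron typically starts the process outside the project root

// Relative lookup: resolved against the current working directory, so it fails.
$relative = glob('cron_demo_' . getmypid() . '/machtiging/*') ?: [];

// Absolute lookup (what public_path() provides in Laravel): always resolves.
$absolute = glob($base . '/machtiging/*') ?: [];

var_dump(count($relative), count($absolute)); // int(0), int(1)
```

In the artisan command, using an absolute base such as public_path('icec/' . $icecconsult->accesstoken . '/machtiging') for the files() check, matching the absolute path already used in deleteDirectory(), would remove the dependence on cron's working directory.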
When trying to save an image on my server, in my Laravel 5 project with the Intervention Image class, with the following code:
$pathFull = public_path('images/original/brand/' . $filename);
$img = Image::make($image->getRealPath());
$img->encode('jpg')->save($pathFull);
I get the error:
NotWritableException in Image.php line 138:
Can't write image data to path
(/var/www/mydomain.com/public/images/original/brand/nanan.jpg)
So I've changed permissions on the folder (which already exists) with:
sudo chmod -R 775 /var/www/mydomain.com/public/images
I've checked the permissions and they are 775, so that command works. I tried it locally (XAMPP) and it worked fine, and the directory paths are fine. I keep getting this error unless I use 777, which I don't want because that's dangerous.
What else can I try to keep the server safe and not use 777?
Maybe the paths are not the same. Check it out: notice the "original" directory in the error versus the path in your code:
$pathFull = public_path('images/brand/thumb/' . $filename);
Can't write image data to path
(/var/www/mydomain.com/public/images/original/brand/nanan.jpg)
sudo chmod -R 775 /var/www/mydomain.com/public/images/original/brand
I recently made a method like yours. It works fine for me. Here it is:
$image = Input::file('image');
$filename = Input::file('image')->getClientOriginalName();
$path = public_path('images/' . Auth::user()->email . '/' . $filename);
$img = Image::make($image->getRealPath());
$img->encode('jpg')->save($path);
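Worth noting: Intervention's save() does not create missing parent directories, and a missing directory produces the same NotWritableException as a permission problem. A plain-PHP sketch (a hypothetical temp path standing in for public_path(), and file_put_contents() standing in for $img->save()) of ensuring the directory exists first:

```php
<?php
// Hypothetical target path; in Laravel this would come from public_path().
$pathFull = sys_get_temp_dir() . '/demo_app/images/original/brand/nanan.jpg';

// save() will not create missing parent directories, so create them first.
$dir = dirname($pathFull);
if (!is_dir($dir)) {
    mkdir($dir, 0775, true); // recursive, same 775 mode the question uses
}

// Stand-in for $img->encode('jpg')->save($pathFull).
file_put_contents($pathFull, 'jpeg bytes');
var_dump(is_file($pathFull)); // bool(true)
```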
My website hosting server is hostmonster.com.
My application uses codeigniter framework.
I have a code which sends emails to my users and I want to make it automatic.
I have used the cpanel of the hosting service and I tried to give the command as
php -q www.mysite.com/admin admin sendDailyEmail
my controller is admin and the method is sendDailyEmail and the controller is present inside the application/controllers/admin folder.
I have also set a reminder email to me whenever the cronjob is run.
The email subject reads
Cron php -q /home1/username/public_html/admin admin sendDailyEmail
and the body says
No input file specified
Where am I going wrong?
I have never run cron jobs and this is my first time.
I am also not good at giving command line instructions.
My admin sendDailyEmail code is as follows:
function sendDailyEmail() {
    $data = $this->admin_model->getDailyData();
    foreach ($data as $u) {
        if ($u->Daily) {
            //if(!$u->Amount){
            if ($u->Email == 'myemail@gmail.com') {
                $user['user_data']['FirstName'] = $u->FirstName;
                $user['user_data']['LastName'] = $u->LastName;
                $user['user_data']['Id'] = $u->Id;

                $this->email->clear();
                $this->email->to($u->Email);
                $this->email->from('alerts@mysite.com', 'MySite');
                $this->email->subject("My Subject");
                $msg = $this->load->view('emails/daily_view', $user, true);
                $this->email->message($msg);

                if ($this->email->send())
                    $data['message'] = "Daily Emails has been sent successfully";
                else
                    $data['message'] = "Daily Emails Sending Failed";
            }
        }
    }
    $data['main_content']['next_view'] = 'admin_home_view';
    $this->load->view('includes/admin_template', $data);
}
You can use wget and set the time for whatever you like:
wget http://www.mysite.com/admin/sendDailyEmail
You can also use curl:
curl --silent http://www.mysite.com/admin/sendDailyEmail
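In cPanel's cron section, that could look like the following entry (the schedule and URL are placeholders to adapt):

```shell
# Hit the CodeIgniter endpoint every day at 08:00; discard all output.
0 8 * * * curl --silent http://www.mysite.com/admin/sendDailyEmail > /dev/null 2>&1
```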
For CodeIgniter 2.2.0
You can try this:
php-cli /home/username/public_html/index.php controller method
or at your case
php-cli /home/username/public_html/index.php admin sendDailyEmail
It works fine for me.
Cheers!
CodeIgniter sets up the command line differently for running crons, etc.
Read:
http://www.codeigniter.com/user_guide/general/cli.html
So you should run:
php index.php admin admin sendDailyEmail
(that may need adjusting, based on your code above)
Have a look at an article I just wrote that goes a little deeper into it all:
http://codebyjeff.com/blog/2013/10/setting-environment-vars-for-codeigniter-commandline
I was facing the same issue, but the following worked for me:
wget http://www.yoursite.com/controller/function
I am trying to learn how to do my first cron job using CodeIgniter. In the past, it seemed the only way to do this with CI was to use the wget command instead of php.
The CodeIgniter User Guide, however, says that now you can do this from the command line, for example by running:
$ cd /path/to/project;
$ php index.php controller method
This works great using Terminal on my local setup. But when I use a similar command in the cron section of cPanel on my shared hosting, the task just returns the contents of index.php.
I'm not entirely sure what cPanel does with this command, so unsure as to whether it's using the command line at all.
Could someone explain how I might be able to set up a cron job on shared hosting using CodeIgniter please?
Here is the example code from the CodeIgniter user guide:
tools.php
<?php
class Tools extends CI_Controller {

    public function message($to = 'World')
    {
        echo "Hello {$to}!".PHP_EOL;
    }
}
?>
It's going to depend on your host. Cron jobs could really screw stuff up if you're not careful, so a lot of shared hosts don't allow it. You probably need to be on some virtual container (like a VPS, virtuozo, etc.) to do this. This isn't a CodeIgniter issue, but a hosting provider issue. Call them first.
We worked around this exact issue as follows:
Set up a normal PHP file that is scheduled by cron; nothing to do with CodeIgniter yet.
Inside it, make an fsocket or curl request to perform your regular CodeIgniter call as you would from the web.
Here's an example (say, cron.php):
#!/usr/local/bin/php.cli
<?php
DEFINE('CRON_CALL_URL', 'https://my_server/');
DEFINE('CRON_HTTPS_PORT', 443);      // port to use during fsocket connection
DEFINE('CRON_SSL_PREFIX', 'ssl://'); // prefix to be used on the url when using ssl

$current_time = time();
$md5_hash = md5('somevalue'.$current_time);

$url = CRON_CALL_URL.'MYCTRL/MYMETHOD';
$parts = parse_url($url);
$parts['query'] = 'md5_hash='.$md5_hash.'&time='.$current_time;

$fp = fsockopen(CRON_SSL_PREFIX.$parts['host'],
                isset($parts['port']) ? $parts['port'] : CRON_HTTPS_PORT,
                $errno, $errstr, 30);
if (!$fp) {
    // connection failed; $errno / $errstr hold the reason
} else {
    if (!array_key_exists('query', $parts)) $parts['query'] = null;
    $out  = "POST ".$parts['path']." HTTP/1.1\r\n";
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($parts['query'])."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    if (isset($parts['query'])) $out .= $parts['query'];
    fwrite($fp, $out);
    fclose($fp);
}
?>
NOTE: Make sure that in your MYCTRL/MYMETHOD function you have
ignore_user_abort(true);
that way when you fsocket connection is closed, your script will still run to the end.
We actually have a bunch of these fsockets for various reasons. If you need to make sure that the call to that controller/method came from the cron script, you need to pass some additional hash values so that only cron and the script know it. Once the script is called it has access to any codeigniter functions. Works like a charm.
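On the receiving side, the hash check the answer mentions could look like this sketch (plain PHP; 'somevalue' is the placeholder secret from cron.php above, and in the real controller the two values would come from $_POST rather than being computed locally):

```php
<?php
// Shared secret known only to the cron script and the controller.
$secret = 'somevalue';

// What the cron script sends as md5_hash and time (see cron.php above).
$current_time = time();
$sent_hash    = md5($secret . $current_time);

// What MYCTRL/MYMETHOD would do with $_POST['md5_hash'] and $_POST['time']:
// recompute the hash and compare in constant time before doing any work.
$expected = md5($secret . $current_time);
$is_cron  = hash_equals($expected, $sent_hash);

var_dump($is_cron); // bool(true) when the request really came from cron
```

A real implementation would also reject requests whose timestamp is too old, so a captured URL cannot be replayed indefinitely.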
I've set up hundreds of CI cron jobs on shared hosting like this: create a short PHP script which calls the CI controller as if it were a web browser.
So, script.php contains this:
#!/usr/local/bin/php -f
<?php
file_get_contents('http://example.com/cronjob/');
?>
Then set your cron job in cPanel to call script.php (e.g. /home/example/public_html/script.php).
When it runs, script.php will call the CodeIgniter cronjob controller. There you have the entire CI framework at your disposal.
If you are going to call it like a web browser, why not replace the cron job command with:
wget http://example.com/cronjob/
instead of creating something new, or simply:
curl --silent http://example.com/cronjob/