Magento admin throws an exception:
Could not determine temp directory, please specify a cache_dir manually
It is a fresh installation on a new hosting package.
This usually happens on shared web hosting, but it can also occur on a dedicated server if the permissions of the tmp folder are set incorrectly.
Many people suggest modifying /lib/Zend/Cache/Backend/File.php to fix this problem. However, that can bite you when you upgrade Magento, because this file is part of the core. I recommend using Magento's override feature instead.
First, copy /lib/Zend/Cache/Backend/File.php to /app/code/local/Zend/Cache/Backend/File.php.
Then, on or around line 91, you will find:
'cache_dir' => null,
Change to:
'cache_dir' => "var/tmp/",
You can set the cache folder to any location you like.
Now create a directory named tmp (or whatever name you used above) under the var folder and change its permissions to 777 if necessary.
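For reference, the relevant part of the copied override file then looks roughly like this (a sketch; only the cache_dir default changes, everything else stays identical to the original file):
// app/code/local/Zend/Cache/Backend/File.php (local override copy)
protected $_options = array(
    'cache_dir' => 'var/tmp/',   // was: null
    // ... all remaining options left exactly as in lib/Zend/Cache/Backend/File.php ...
);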
This is just a permissions issue. Set 777 permissions on the cache directory and you are done.
For more details you can follow the link.
Whenever you set the permissions, make sure they are applied recursively (run from the Magento root):
chmod -R 777 var/cache
This is the function that determines the temp directory:
public function getTmpDir()
{
    $tmpdir = array();
    foreach (array($_ENV, $_SERVER) as $tab) {
        foreach (array('TMPDIR', 'TEMP', 'TMP', 'windir', 'SystemRoot') as $key) {
            if (isset($tab[$key])) {
                if (($key == 'windir') or ($key == 'SystemRoot')) {
                    $dir = realpath($tab[$key] . '\\temp');
                } else {
                    $dir = realpath($tab[$key]);
                }
                if ($this->_isGoodTmpDir($dir)) {
                    return $dir;
                }
            }
        }
    }
    $upload = ini_get('upload_tmp_dir');
    if ($upload) {
        $dir = realpath($upload);
        if ($this->_isGoodTmpDir($dir)) {
            return $dir;
        }
    }
    if (function_exists('sys_get_temp_dir')) {
        $dir = sys_get_temp_dir();
        if ($this->_isGoodTmpDir($dir)) {
            return $dir;
        }
    }
    // Attempt to detect by creating a temporary file
    $tempFile = tempnam(md5(uniqid(rand(), TRUE)), '');
    if ($tempFile) {
        $dir = realpath(dirname($tempFile));
        unlink($tempFile);
        if ($this->_isGoodTmpDir($dir)) {
            return $dir;
        }
    }
    if ($this->_isGoodTmpDir('/tmp')) {
        return '/tmp';
    }
    if ($this->_isGoodTmpDir('\\temp')) {
        return '\\temp';
    }
    Zend_Cache::throwException('Could not determine temp directory, please specify a cache_dir manually');
}
It is defined in lib/Zend/Cache/Backend.php.
http://www.webtechnologycodes.com/magento-error-could-not-determine-temp-directory-please-specify-a-cache_dir-manually/
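On shared hosting, a common reason every candidate in getTmpDir() fails is an open_basedir restriction that excludes the system temp directory. A quick hypothetical snippet (tmpcheck.php, placed in the web root) dumps the same candidates the function walks through:
<?php
// tmpcheck.php (hypothetical) - dumps the same candidates getTmpDir() checks.
var_dump(
    getenv('TMPDIR'),
    getenv('TEMP'),
    ini_get('upload_tmp_dir'),
    sys_get_temp_dir(),
    ini_get('open_basedir')   // if set, /tmp is often outside the allowed paths
);
Whatever directory you end up pointing cache_dir at must lie inside the allowed paths and be writable by the webserver.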
Create a tmp folder in the root of your Magento installation with 777 permissions.
Open lib/Zend/Cache/Backend/File.php.
Find the $_options property and change the line 'cache_dir' => null, to 'cache_dir' => 'tmp',
Refresh the page.
Create an info.php and check that the path shown under upload_tmp_dir is writable by the webserver.
<?php phpinfo();
If it is not, set the path in your hosting environment.
Note that this setting cannot be placed in .htaccess files, but some hosts allow an individual php.ini file to be placed in your docroot:
upload_tmp_dir = /path/to/docroot/var/tmp/
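To make the check explicit rather than eyeballing the phpinfo() output, a small hypothetical extension of that info.php could be:
<?php
// info.php - show the configuration and explicitly test the temp path.
phpinfo();

$dir = ini_get('upload_tmp_dir') ?: sys_get_temp_dir();   // fall back if upload_tmp_dir is unset
var_dump($dir, is_dir($dir), is_writable($dir));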
I am uploading files to storage like this:
/storage/uploads/contract/19/12199/document.pdf
Now I need to allow only authenticated users to see those documents, so I use this route:
Route::get('/storage/{pathToFile}', function ($pathToFile) {
    if (auth()->user()) {
        return response()->file($pathToFile);
    } else {
        return 'Nope, sorry bro, access denied!';
    }
});
This didn't work; all files can still be accessed even when the user is not logged in.
Any idea?
Thanks.
Did you symlink the storage folder into the public folder? If so, the files are still directly accessible, because the default public entry point is the "public" folder, so "[host]/storage" is served straight from that symlink and your route is never hit.
What I did in the past was use the S3 driver, set the file visibility to private, and then use:
public function get($path, $image)
{
    $file = Storage::disk('s3')->get("private/images/" . $path . "/" . $image);

    return response($file, 200)->header('Content-Type', 'image/png');
}
In your case this would be changed to:
if (auth()->user()) {
    // get() returns the raw file contents, so send them back in the response body
    // rather than passing them to response()->file(), which expects a path on disk.
    $file = Storage::disk('s3')->get($pathToFile);

    return response($file, 200)->header('Content-Type', Storage::disk('s3')->mimeType($pathToFile));
} else {
    return 'Nope, sorry bro, access denied!';
}
Note: the underlying Flysystem S3 adapter supports multiple S3-compatible storage solutions: https://flysystem.thephpleague.com/v1/docs/adapter/aws-s3-v2/
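If the files stay on the local (non-public) disk instead of S3, the same idea applies: keep them out of the public/storage symlink and serve them through a route. A minimal sketch, assuming Laravel 5.4+ (where the disk adapter exposes path()) and files stored under storage/app; the /files prefix and the 'local' disk are assumptions, adjust to your setup:
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

// The prefix is deliberately not /storage, so it cannot collide with the
// public/storage symlink created by `php artisan storage:link`.
Route::get('/files/{pathToFile}', function ($pathToFile) {
    if (!auth()->check()) {
        return 'Nope, sorry bro, access denied!';
    }

    abort_unless(Storage::disk('local')->exists($pathToFile), 404);

    return response()->file(Storage::disk('local')->path($pathToFile));
})->where('pathToFile', '.*');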
Is there a way to zip and download files and folders that are in an Amazon S3 bucket together in Laravel? I want to zip the three folders and one file shown in the picture together and download them.
Here's a half-baked solution in a route file. Hope it helps.
https://flysystem.thephpleague.com/docs/adapter/zip-archive/
composer require league/flysystem-ziparchive
I put this in routes/web.php just to play with.
<?php

use Illuminate\Support\Facades\Storage;
use League\Flysystem\Filesystem;
use League\Flysystem\ZipArchive\ZipArchiveAdapter;

Route::get('zip', function () {
    // see laravel's config/filesystems.php for the source disk
    $source_disk = 's3';
    $source_path = '';

    $file_names = Storage::disk($source_disk)->files($source_path);

    $zip = new Filesystem(new ZipArchiveAdapter(public_path('archive.zip')));

    foreach ($file_names as $file_name) {
        $file_content = Storage::disk($source_disk)->get($file_name);
        $zip->put($file_name, $file_content);
    }

    $zip->getAdapter()->getArchive()->close();

    return redirect('archive.zip');
});
You'll definitely want to do something different than just plopping it in the public dir. Maybe stream it straight out as a download, or save it somewhere better. Feel free to post comments/questions and we can discuss.
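For instance, one small improvement over leaving archive.zip in the public dir is to build it under storage/ and send it as a download that cleans up after itself; a sketch of just the ending of the route above, with the zip written to storage_path('app/archive.zip') instead of public_path():
$zip_path = storage_path('app/archive.zip');   // build the archive here instead of public_path()

// ... same foreach/put/close logic as above, but writing to $zip_path ...

return response()->download($zip_path, 'archive.zip')->deleteFileAfterSend(true);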
After looking at some solutions, I did it the following way, streaming the zip directly to the client using https://github.com/maennchen/ZipStream-PHP:
// Assumes ZipStream-PHP 1.x/2.x: use ZipStream\ZipStream; use ZipStream\Option\Archive as ArchiveOptions;
// plus the Storage facade (Illuminate\Support\Facades\Storage).
if ($uploads) {
    return response()->streamDownload(function () use ($uploads) {
        $opt = new ArchiveOptions();
        $opt->setContentType('application/octet-stream');

        $zip = new ZipStream("uploads.zip", $opt);

        foreach ($uploads as $upload) {
            try {
                // Stream each file out of storage straight into the zip.
                $file = Storage::readStream($upload->path);
                $zip->addFileFromStream($upload->filename, $file);
            } catch (Exception $e) {
                \Log::error("unable to read the file at storage path: $upload->path and output to zip stream. Exception is " . $e->getMessage());
            }
        }

        $zip->finish();
    }, 'uploads.zip');
}
I've got a problem when I install vqmod in OpenCart; the error says:
index.php not writeable
Administrator index.php not writeable
I have no index.php file in my OpenCart root.
Admin/index.php
<?php
// Version
define('VERSION', '2.3.0.2');

// Configuration
if (is_file('config.php')) {
    require_once('config.php');
}

// Install
if (!defined('DIR_APPLICATION')) {
    header('Location: ../install/index.php');
    exit;
}

// Startup
require_once(DIR_SYSTEM . 'startup.php');

start('admin');
vqmod/install/index.php
// CHANGE THIS IF YOU EDIT YOUR ADMIN FOLDER NAME
$admin = 'admin';
// Counters
$changes = 0;
$writes = 0;
// Load class required for installation
require('ugrsr.class.php');
// Get directory two above installation directory
$opencart_path = realpath(dirname(__FILE__) . '/../../') . '/';
// Verify path is correct
if(!$opencart_path) die('COULD NOT DETERMINE CORRECT FILE PATH');
$write_errors = array();
if(!is_writeable($opencart_path . 'index.php')) {
    $write_errors[] = 'index.php not writeable';
}

if(!is_writeable($opencart_path . $admin . '/index.php')) {
    $write_errors[] = 'Administrator index.php not writeable';
}

if(!empty($write_errors)) {
    die(implode('<br />', $write_errors));
}
// Create new UGRSR class
$u = new UGRSR($opencart_path);
// remove the # before this to enable debugging info
#$u->debug = true;
// Set file searching to off
$u->file_search = false;
// Attempt upgrade if necessary. Otherwise just continue with normal install
$u->addFile('index.php');
$u->addFile($admin . '/index.php');
$u->addPattern('~\$vqmod->~', 'VQMod::');
$u->addPattern('~\$vqmod = new VQMod\(\);~', 'VQMod::bootup();');
$result = $u->run();
if($result['writes'] > 0) {
    if(file_exists('../mods.cache')) {
        unlink('../mods.cache');
    }
    die('UPGRADE COMPLETE');
}
$u->clearPatterns();
$u->resetFileList();
// Add catalog index files to files to include
$u->addFile('index.php');
// Pattern to add vqmod include
$u->addPattern('~// Startup~', '// VirtualQMOD
require_once(\'./vqmod/vqmod.php\');
VQMod::bootup();
// VQMODDED Startup');
$result = $u->run();
$writes += $result['writes'];
$changes += $result['changes'];
$u->clearPatterns();
$u->resetFileList();
// Add Admin index file
$u->addFile($admin . '/index.php');
// Pattern to add vqmod include
$u->addPattern('~// Startup~', '//VirtualQMOD
require_once(\'../vqmod/vqmod.php\');
VQMod::bootup();
// VQMODDED Startup');
$result = $u->run();
$writes += $result['writes'];
$changes += $result['changes'];
$u->addFile('index.php');
// Pattern to run required files through vqmod
$u->addPattern('/require_once\(DIR_SYSTEM \. \'([^\']+)\'\);/', 'require_once(VQMod::modCheck(DIR_SYSTEM . \'$1\'));');
// Get number of changes during run
$result = $u->run();
$writes += $result['writes'];
$changes += $result['changes'];
// output result to user
if(!$changes) die('VQMOD ALREADY INSTALLED!');
if($writes != 4) die('ONE OR MORE FILES COULD NOT BE WRITTEN');
die('VQMOD HAS BEEN INSTALLED ON YOUR SYSTEM!');
You need to do two things.
1) Open vqmod/install/index.php and find the following code:
$admin = 'admin';
Change its value to the admin folder name you renamed it to:
$admin = 'youradminfoldername';
2) Open vqmod/pathReplaces.php and find the following code (if it is not there, add it):
$replaces[] = array('~^admin\b~', 'xxxxx');
Change xxxxx to your current admin folder name (a minimal example is sketched below).
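A minimal sketch of the edited file, assuming the stock pathReplaces.php simply collects $replaces entries ('backoffice' is a placeholder for your real folder name):
<?php
// vqmod/pathReplaces.php (sketch - 'backoffice' stands in for your renamed admin folder)
$replaces[] = array('~^admin\b~', 'backoffice');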
After you have done these two things, you should be able to install extensions even if the message still appears.
I have tested this method; it works for me.
Another possible solution is to delete the vqmod folder and re-upload vqmod-opencart-2.6.1, and then only modify vqmod/pathReplaces.php; that should work as well.
It probably means that you've deleted the file. You might have to reinstall OpenCart or upload the file from the default OpenCart source files.
Open cPanel File Manager and set the permissions to 777 for the file.
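To see exactly which file the installer is rejecting before changing permissions, a small hypothetical script in the OpenCart root mirrors the installer's own check:
<?php
// writecheck.php (hypothetical) - runs the same is_writeable() test as vqmod's installer.
// If you renamed the admin folder, change 'admin/index.php' accordingly.
foreach (array('index.php', 'admin/index.php') as $file) {
    printf("%s writable: %s<br />", $file, is_writeable(__DIR__ . '/' . $file) ? 'yes' : 'no');
}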
I'm getting a little confused with storage vs. public paths in Laravel 5+. My understanding is that one should use Storage:: instead of File:: for writing files to the storage folder, anticipating the use of cloud services like Amazon etc. So I am trying to put a jpg into the storage/app folder using Intervention and this code:
public function apply(Request $request)
{
    $user = Auth::user();
    $path = '/users/' . $user->id;

    if (!Storage::exists($path)) {
        Storage::makeDirectory($path, $mode = 0755, $recursive = false, $force = false);
    }

    Image::make($request->file('image'))
        ->orientate()
        ->resize(600, null, function ($constraint) {
            $constraint->aspectRatio();
        })
        ->resizeCanvas(600, 600, 'center', false, 'ffffff')
        ->save($path . '/foo.jpg');
}
First of all, I am not sure the !Storage::exists($path) check will do anything, as the Storage API documentation says it won't check for directories. So how should I check whether a directory exists?
Second, dd(is_writable($path)); returns false, and indeed running that code results in a
NotWritableException in Image.php line 143:
Can't write image data to path
error.
So how should this be done?
The "trick" that I used was manipulate the image directly in the temp path, and then save it in the storage folder using the Laravel storage method.
$tempFile = $request->file('image')->getRealPath();
Image::make($tempFile)
->resize(100, 100)
->save($tempFile); // Note that we are saving back to temporary path
// Now we can proceed and send the manipulated file to it's final destination
// in '/storage/app/public/uploads'
$path = $request->file('image')->storePublicly('public/uploads');
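For completeness, the image can also be written through the Storage facade itself, which keeps the code working for the local disk and cloud disks alike. A minimal sketch, assuming Intervention Image 2.x with its Laravel facade and reusing $request and $user from the question's apply() method:
use Illuminate\Support\Facades\Storage;
use Intervention\Image\Facades\Image;

// Build the image in memory, then hand the encoded bytes to Storage.
$image = Image::make($request->file('image'))
    ->orientate()
    ->resize(600, null, function ($constraint) {
        $constraint->aspectRatio();
    })
    ->encode('jpg', 90);   // returns the image with its data encoded as jpg

// makeDirectory() is safe to call even if the directory already exists.
Storage::makeDirectory('users/' . $user->id);
Storage::put('users/' . $user->id . '/foo.jpg', (string) $image);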
I just moved my Magento site to Amazon EC2, but I keep getting a "Connection to Redis failed after 2 failures" error. I've tried removing the Redis configuration from app/etc/local.xml but still get the error.
I also tried disabling all the cache options directly in the core_cache_option table. I have no idea how to clean the files that have already been cached. There are no cache files under the var/cache folder, as expected, and I've tried FLUSHALL from the redis-cli command prompt, but I still keep getting this error.
Any idea what else should I try?
<cache>
    <backend_options>
        <server><![CDATA[/var/tmp/_cache.sock]]></server>
        <port><![CDATA[0]]></port>
        <persistent><![CDATA[]]></persistent>
        <database><![CDATA[0]]></database>
        <password><![CDATA[]]></password>
        <connect_retries><![CDATA[1]]></connect_retries>
        <read_timeout><![CDATA[10]]></read_timeout>
        <automatic_cleaning_factor><![CDATA[0]]></automatic_cleaning_factor>
        <compress_data><![CDATA[1]]></compress_data>
        <compress_tags><![CDATA[1]]></compress_tags>
        <compress_threshold><![CDATA[20480]]></compress_threshold>
        <compression_lib><![CDATA[gzip]]></compression_lib>
        <use_lua><![CDATA[0]]></use_lua>
    </backend_options>
    <backend><![CDATA[Cm_Cache_Backend_Redis]]></backend>
</cache>
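For reference, the <server> value above is a unix socket path rather than a hostname, which is why <port> is 0. A quick hypothetical check that PHP can actually reach that socket (assuming the phpredis extension is installed on the instance):
<?php
// redis_check.php (hypothetical) - run with `php redis_check.php` on the EC2 instance.
$redis = new Redis();
var_dump($redis->connect('/var/tmp/_cache.sock'));   // unix socket: pass the path, no port
var_dump($redis->ping());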
Given EC2 instances are ephemeral, you should be able to regenerate the instance, right? If that's not an option —
First, check app/etc/ for other XML files. Magento will parse any XML file it finds in this folder. I've seen something like the following trip people up:
$ ls app/etc/*.xml
local.xml
local.backup.xml
Magento parses both local.xml and local.backup.xml, and the backup values override the values in local.xml. Also, make sure you're working with the local.xml you think you are: Magento loads the local configuration from the following location. Add some temporary debugging to make sure it's doing what you think it's doing.
#File: app/code/core/Mage/Core/Model/Config.php
public function loadBase()
{
    $etcDir = $this->getOptions()->getEtcDir();
    $files = glob($etcDir.DS.'*.xml');
    $this->loadFile(current($files));
    while ($file = next($files)) {
        var_dump($file);
        $merge = clone $this->_prototype;
        $merge->loadFile($file);
        $this->extend($merge);
    }
    if (in_array($etcDir.DS.'local.xml', $files)) {
        $this->_isLocalConfigLoaded = true;
    }
    return $this;
}
Second, after you clear your cache, make sure Magento's reloading the configuration. Add some temporary debugging to
#File: app/code/core/Mage/Core/Model/Config.php
public function init($options=array())
{
    $this->setCacheChecksum(null);
    $this->_cacheLoadedSections = array();
    $this->setOptions($options);
    $this->loadBase();

    $cacheLoad = $this->loadModulesCache();
    if ($cacheLoad) {
        var_dump("Loaded Config from Cache");
        return $this;
    } else {
        var_dump("Reloading configuration");
    }

    $this->loadModules();
    $this->loadDb();
    $this->saveCache();
    return $this;
}
Finally, if you suspect the problem is a file based cache not clearing, drop some debugging code in
#File: app/code/core/Mage/Core/Model/Config/Options.php
public function getCacheDir()
{
    //$dir = $this->getDataSetDefault('cache_dir', $this->getVarDir().DS.'cache');
    $dir = $this->_data['cache_dir'];
    $this->createDirIfNotExists($dir);
    var_dump($dir);
    return $dir;
}
This will let you know the cache directory Magento's reading from — if Magento can't read the local var, it'll pop up to the root level /var/ folder.
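To see which of those locations is actually usable from PHP's point of view, a quick hypothetical script run from the Magento root might help:
<?php
// check_cache_dirs.php (hypothetical) - run with `php check_cache_dirs.php` from the Magento root.
foreach (array(__DIR__ . '/var/cache', '/var/cache', sys_get_temp_dir()) as $dir) {
    printf("%-30s exists: %d  readable: %d  writable: %d\n",
        $dir, is_dir($dir), is_readable($dir), is_writable($dir));
}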