KCFinder - cannot write to upload folder. /upload - ckeditor

I am trying to use KCFinder with CKEditor for uploads and file browsing, but when I try to upload something it gives me the following message:
KCFinder - cannot write to upload folder. /upload
The path for uploaded files is domain/upload/ and KCFinder lives at domain/admin/kcfinder/.
Here is the relevant part of my config.php:
'disabled' => false,
'uploadURL' => "/upload/",
'uploadDir' => "/upload/",
'theme' => "default"

Related

laravel urls missing "en" from links generated from named routes

My test site is an HTTP site and my live site is an HTTPS site.
URLs on my test site work fine with
$url = route('admin.taskmanager.taskmanager.edit', [$task->id]);
which generates
http://example.com/app/taskmanager/taskmanagers/89/edit
but the same code on the live site generates
https://example.com/app/taskmanager/taskmanagers/89/edit
PROBLEM:
The live-site URL above returns a 502 gateway error, but when I add /en to the URL, like
https://example.com/en/app/taskmanager/taskmanagers/89/edit
it works.
Both sites run on nginx with the same conf files.
The test site (HTTP) is automatically adding /en, but the live site (HTTPS) is not adding /en to the URLs generated from named routes, and those URLs throw a 502 error.
Any help?
@matiaslauriti my router is as below:
$router->group(['prefix' => '/taskmanager'], function (Router $router) {
    $router->bind('taskmanager', function ($id) {
        return app('Modules\Taskmanager\Repositories\TaskmanagerRepository')->find($id);
    });
    $router->get('taskmanagers/{taskmanager}/edit', [
        'as' => 'admin.taskmanager.taskmanager.edit',
        'uses' => 'TaskmanagerController@edit',
        'middleware' => 'can:taskmanager.taskmanagers.access',
    ]);
});
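For context, the /en segment in a URL built by route() comes from whatever prefix the route was registered with (typically a locale prefix added by a localization package or a wrapping route group), not from route() itself. A minimal, hypothetical sketch of that mechanism, not the poster's actual setup:

// Hypothetical: routes registered inside a locale-prefixed group
Route::group(['prefix' => app()->getLocale()], function () {
    Route::get('app/taskmanager/taskmanagers/{id}/edit', 'TaskmanagerController@edit')
        ->name('admin.taskmanager.taskmanager.edit');
});

// route('admin.taskmanager.taskmanager.edit', [89]) then generates
//   https://example.com/en/app/taskmanager/taskmanagers/89/edit
// If the prefix is empty when the routes boot (e.g. the locale is not detected
// on the HTTPS site), the generated URL is missing the /en segment.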

What is the fastest way to show "secure" images with Laravel and Amazon S3?

In my Laravel application I have a gallery for logged-in users, backed by Amazon S3.
Right now I download EVERY thumbnail and image this way:
public function download($file_url) { // safe, but slow...
    $mime = \Storage::disk('s3')->getDriver()->getMimetype($file_url);
    $size = \Storage::disk('s3')->getDriver()->getSize($file_url);
    $file_name = basename($file_url); // download name derived from the S3 path

    $response = [
        'Content-Type' => $mime,
        'Content-Length' => $size,
        'Content-Description' => 'File Transfer',
        'Content-Disposition' => "attachment; filename={$file_name}",
        'Content-Transfer-Encoding' => 'binary',
    ];

    // Reads the whole object into server memory before sending it to the client.
    return \Response::make(\Storage::disk('s3')->get($file_url), 200, $response);
}
This is safe (because the route is behind middleware('auth')), but it is very server-intensive and slow.
Is it possible to download a file directly from Amazon, either:
only for users who are logged in to my Laravel app (maybe with a temporary download link)?
OR only via a secure unique link?
You can use temporary URLs:
$url = Storage::disk('s3')->temporaryUrl(
'file.jpg', now()->addMinutes(5)
);
First param is the path on S3, second param is how long you want the URL to work for. Set this to a low value if you only want the URL to work for a single page load.
https://laravel.com/docs/5.6/filesystem (under Temporary URLs)
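A sketch of how this might be wired into an auth-protected route so the browser fetches the image directly from S3 (the route, controller name and 5-minute lifetime are assumptions, not part of the original answer):

// routes/web.php (hypothetical)
Route::get('gallery/image/{path}', 'GalleryController@show')
    ->where('path', '.*')
    ->middleware('auth')
    ->name('gallery.image');

// GalleryController.php (hypothetical)
use Illuminate\Support\Facades\Storage;

public function show($path)
{
    // Auth already ran via the route middleware; hand the browser a
    // short-lived signed S3 URL and let it download straight from Amazon.
    $url = Storage::disk('s3')->temporaryUrl($path, now()->addMinutes(5));

    return redirect()->away($url);
}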

Downloading zip file results in .cpgz file after extraction

I am using the following code to download a zip file. I am sure the file exists and is valid on the server, but the result after extraction is a .cpgz file.
return response()->download('/Applications/XAMPP/xamppfiles/htdocs/stoproject/source/storage/app/Copy of StoTherm Classic®.zip');
The code was working and then, without any change, it stopped.
I also tried adding headers:
$headers = array(
'Content-Type' => 'application/zip',
);
return response()->download('/Applications/XAMPP/xamppfiles/htdocs/stoproject/source/storage/app/Copy of StoTherm Classic®.zip', basename('/Applications/XAMPP/xamppfiles/htdocs/stoproject/source/storage/app/Copy of StoTherm Classic®.zip'), $headers);
Also tried with:
'Content-Type' => 'application/octet-stream'
Calling ob_end_clean() fixed the issue
$response = response()->download($pathToFile)->deleteFileAfterSend(true);
ob_end_clean();
return $response;
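A .cpgz after extraction usually means the downloaded archive was corrupted; a common cause is stray output (a BOM, whitespace, or a debug echo) sitting in PHP's output buffer and getting prepended to the zip bytes, which is why clearing the buffer helps. A slightly more defensive variant of the same fix (just a sketch, not required if a single ob_end_clean() already works):

$response = response()->download($pathToFile)->deleteFileAfterSend(true);

// Discard every active output buffer, in case more than one level is open
// (e.g. an ob_start() in a bootstrap file plus another in a view or helper).
while (ob_get_level() > 0) {
    ob_end_clean();
}

return $response;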
Laravel 4 Response download issue

upload image with soundcloud api

I keep getting an error every time I try to change the SoundCloud avatar by uploading; I'm guessing something is wrong with my API wrapper usage? I get
"`handle_response': HTTP status: 500 Internal Server Error (SoundCloud::ResponseError)"
and
"HTTP status: 403 Forbidden (SoundCloud::ResponseError)
from /Users/.../.rvm/gems/ruby-2.2.0/gems/soundcloud-0.3.2/lib/soundcloud/client.rb:32:in `post'"
My code:
require 'soundcloud'
user = Soundcloud.new(:client_id => '...',
                      :client_secret => '...',
                      :username => '...',
                      :password => '...')
avatar_data = File.path("/path/to/file/-1.jpg")
user.post('/me').avatar_data
I also tried:
testuser.post('/me/#{avatar_data}')

Resumable YouTube Data API v3 uploads using Ruby

I am currently using the google-api-ruby-client to upload videos to Youtube API V3, but I can't find a way of getting the Youtube ID that is created by a resumable upload. The code I am trying to use is along the lines of:
media = Google::APIClient::UploadIO.new(file_path, 'application/octet-stream')
yt_response = @client.execute!({
  :api_method => @youtube.videos.insert,
  :parameters => {
    :part => 'snippet,status',
    'uploadType' => 'resumable'
  },
  :body_object => file_details,
  :media => media
})
return JSON.parse(yt_response.response.body)
But unfortunately for resumable uploads, yt_response.response.body is blank. If I change 'uploadType' to 'multipart', then the body is a JSON blob that contains the YouTube ID. The response for a resumable upload, though, is only the resumable session URI for the upload, with an empty body. How do I go from that URI to the YouTube ID I just created?
Synthesizing the info from How to engage a Resumable upload to Google Drive using google-api-ruby client? and the existing multipart upload sample leads to
videos_insert_response = client.execute!(
  :api_method => youtube.videos.insert,
  :body_object => body,
  :media => Google::APIClient::UploadIO.new(opts[:file], 'video/*'),
  :parameters => {
    'uploadType' => 'resumable',
    :part => body.keys.join(',')
  }
)
videos_insert_response.resumable_upload.send_all(client)
puts "'#{videos_insert_response.data.snippet.title}' (video id: #{videos_insert_response.data.id}) was successfully uploaded."
That worked for me.
I am doing resumable uploads in chunks using version 0.7.1 of the API, and I had to do this to get the ID...
result = videos_insert_response.resumable_upload.send_all(client)
video = JSON.parse(result.response.body)
puts "Video id '#{video['id']}' was successfully uploaded."
