Trouble downloading CSV file in Laravel (content in response but doesn't download)

I'm trying to generate and then download a CSV file in Laravel 9.
The generation works fine at the correct location (public/files/OUT/), but I'm getting the content in the response and no download occurs.
Here is my code:
$path = storage_path('app/public/files/OUT/'); // forward slashes are portable across Windows and Linux
$filename = 'dataTemplate';
$f = fopen($path . $filename . '.csv', 'wb');
// => PUT DATA IN CSV => no problem
fclose($f);
// file is generated successfully at path location
$headers = [
    'Content-Type' => 'text/csv',
];
return response()->download($path . 'dataTemplate.csv', 'dataTemplate.csv', $headers);
Thanks in advance
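A common cause, for what it's worth: response()->download() returns a BinaryFileResponse with a Content-Disposition: attachment header, which only triggers a save dialog when the browser navigates to the route directly (for example via a plain link). If the route is called through AJAX/fetch, the CSV bytes simply land in the response body. A minimal sketch of a controller action that forces the download (the method name and route are assumptions, not the asker's code):
// Hypothetical controller action; link to its route directly, e.g. <a href="/download-template">.
public function downloadTemplate()
{
    // Forward slashes keep storage_path() portable across operating systems.
    $path = storage_path('app/public/files/OUT/dataTemplate.csv');

    // BinaryFileResponse: the browser downloads only when it navigates
    // here directly, not when the route is requested via AJAX/fetch.
    return response()->download($path, 'dataTemplate.csv', [
        'Content-Type' => 'text/csv',
    ]);
}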

Related

Unable to open file for reading [ file link ] in Laravel

I am trying to send a mail to multiple recipients with a file attachment given by URL. When I hit the API in Postman it throws the error "unable to open file for reading [ file link ]", but when I copy the file link and open it in a browser, it opens perfectly.
I have also checked the file permissions and referred to some answers on Stack Overflow, but nothing helped. Please help me as soon as possible.
$file_name = 'TimeActivityReport' . "_" . time() . '.pdf';
$storage_path = 'public/TimeActivityReport';
// $storage_path = public_path();
$filePath = $storage_path . '/' . $file_name;
// return $filePath;
$exl = Excel::store(new TimeActivityReportExport($all_total_values, $data, $date_totals), $filePath);
if ($exl)
{
    // asset() builds an HTTP URL, not a filesystem path
    $fileurl = asset('storage/TimeActivityReport') . '/' . $file_name;
    // return $fileurl;
}
// return $fileurl;
return Mail::send([], $emails, function ($message) use ($fileurl, $emails) {
    $message->to($emails, 'hello')
        ->subject('test')
        ->attach($fileurl, [ // attach() expects a local path; an HTTP URL fails here
            'as' => 'checkname.pdf',
            'mime' => 'application/pdf'
        ])
        ->setBody('check');
});
Try this; I tested it on my end and it returned the file:
Storage::get('public/TimeActivityReport/' . $file_name);
You can also test whether the file exists using:
Storage::disk('local')->exists('public/TimeActivityReport/' . $file_name);
To attach the file, resolve a local filesystem path instead of a URL:
$fileurl = Storage::path('public/TimeActivityReport/' . $file_name);
See the Laravel filesystem documentation for reference.
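Putting those pieces together, a sketch of the corrected mail call (recipient handling and the SwiftMailer-era setBody() are kept as in the question; untested):
// Resolve an absolute local filesystem path; attach() cannot read an HTTP URL.
$filePath = Storage::path('public/TimeActivityReport/' . $file_name);

return Mail::send([], [], function ($message) use ($filePath, $emails) {
    $message->to($emails)
        ->subject('test')
        ->attach($filePath, [ // local path, so the mailer can open it for reading
            'as' => 'checkname.pdf',
            'mime' => 'application/pdf',
        ])
        ->setBody('check');
});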

Laravel - "file does not exist or is not readable", but the file is moved successfully

I get the following error:
The "C:\xampp\tmp\php49D8.tmp" file does not exist or is not readable.
But the file is moved successfully.
My controller code is:
$fileResult = $file->move(self::UPLOAD_DIR, $name_file);
if (!$fileResult) {
    $result = array(
        "status" => "500",
        "error" => array("error" => "Error in the file move")
    );
    return response(json_encode($result), $result["status"])
        ->header("Content-Type", "application/json");
}
What could be the problem?
Calling $validator->fails() can delete the file being uploaded. Use:
$file = $request->file('file'); // get your file
$fileResult = $file->move(self::UPLOAD_DIR, $file->getClientOriginalName());
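Building on that answer, a minimal sketch that validates before touching the upload (the validation rules and JSON error shape here are assumptions). Once move() succeeds, the temp file (C:\xampp\tmp\php49D8.tmp in the question) is gone, so anything that inspects the UploadedFile afterwards reports that it does not exist or is not readable:
use Illuminate\Support\Facades\Validator;

// Validate first, while the tmp file still exists.
$validator = Validator::make($request->all(), ['file' => 'required|file']);
if ($validator->fails()) {
    return response()->json(['error' => $validator->errors()], 422);
}

// Then move the upload exactly once; the tmp file is consumed here.
$file = $request->file('file');
$file->move(self::UPLOAD_DIR, $file->getClientOriginalName());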

Laravel write file stream

I am using Guzzle to download a file from a URL and save it into my storage.
So I have code that looks like this:
$response = $this->client->request('GET', $model->url, [
    'stream' => true
]);
$body = $response->getBody();
while (!$body->eof()) {
    Storage::append($this->filePath, $body->read(1024));
}
but when I open the folder where the file is located, I see that the file size keeps changing and sometimes it is zero, so in the end I get an invalid file.
How can I solve this problem?
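For what it's worth, Laravel's Storage::append() reads the whole existing file and rewrites it on every call, and it inserts a PHP_EOL separator between chunks, which corrupts binary downloads. A sketch that streams the response body straight to the disk instead (same $this->client and $this->filePath as above; untested assumption):
$response = $this->client->request('GET', $model->url, [
    'stream' => true,
]);

// detach() hands over the underlying PHP stream resource from the PSR-7
// body; writeStream() pipes it to the disk in a single pass, so no
// separator bytes are inserted and the file is written exactly once.
Storage::writeStream($this->filePath, $response->getBody()->detach());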

Upload file in S3 using Laravel 5.3

Installation Process
I followed this tutorial to install the AWS package in Laravel 5.3.
My code is below:
$s3 = \App::make('aws')->createClient('s3');
$s3->putObject(array(
    'Bucket' => 'Bucket_Name',
    'Key' => 'AWS_ACCESS_KEY_ID',
    'SourceFile' => 'http://domainname/sample.txt',
));
I am trying a txt file with around 50 bytes of content and got the error below.
A sha256 checksum could not be calculated for the provided upload
body, because it was not seekable. To prevent this error you can
either 1) include the ContentMD5 or ContentSHA256 parameters with your
request, 2) use a seekable stream for the body, or 3) wrap the
non-seekable stream in a GuzzleHttp\Psr7\CachingStream object. You
should be careful though and remember that the CachingStream utilizes
PHP temp streams. This means that the stream will be temporarily
stored on the local disk.
Am I missing something?
SourceFile must be a local file path. The Body parameter accepts a stream, so you should be able to make the request with Guzzle and pass its body through:
$client = new GuzzleHttp\Client();
$response = $client->get('http://domainname/sample.txt');

$s3->putObject([
    'Bucket' => 'Bucket_Name',
    'Key' => 'AWS_ACCESS_KEY_ID', // note: 'Key' is the object's path inside the bucket, not a credential
    'Body' => $response->getBody(),
]);
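One caveat: if the Guzzle request is made with 'stream' => true, the body is not seekable and the same checksum error comes back; that is where the error message's third suggestion applies. A sketch (bucket name kept from the question; the object key is a made-up file name for illustration):
use GuzzleHttp\Psr7\CachingStream;

$response = $client->get('http://domainname/sample.txt', ['stream' => true]);

$s3->putObject([
    'Bucket' => 'Bucket_Name',
    'Key' => 'sample.txt', // object path in the bucket
    // CachingStream buffers reads into a PHP temp stream, making the
    // body seekable so the SDK can compute its sha256 checksum.
    'Body' => new CachingStream($response->getBody()),
]);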

Post Multipart file upload with json (text) and files (binary) as parts in Ruby

I'm almost done with a multipart file upload, but not quite. The API I am using requires two parts: a meta description (JSON) and a file (File).
Here is some of the code:
File.open(file_path) do |image|
  request = Net::HTTP::Post::Multipart.new(
    url.path,
    'metadata' => metadata_as_json_string,
    'attachment' => UploadIO.new(image, "image/jpeg", "image.jpg")
  )
end
The trouble I am having is with the 'metadata' part (metadata_as_json_string). Without it everything works fine, but the API requires the meta information as JSON. It works if I save the JSON content to a file and use that as the metadata part, but my content does not come from a file.
Any ideas on how to provide the metadata without saving it to a file first?
Thank you
I found a solution myself by using StringIO:
metadata_file = StringIO.new(metadata_as_json_string)
File.open(file_path) do |image|
  request = Net::HTTP::Post::Multipart.new(
    url.path,
    'metadata' => UploadIO.new(metadata_file, "application/json"),
    'attachment' => UploadIO.new(image, "image/jpeg", "image.jpg")
  )
end
Anyway, thank you for your time.
