How to avoid special characters changing on file copy - Laravel

A customer uploads a CSV file to an FTP server once a day.
With a Laravel application, I copy the file to my server.
$files = Storage::disk('xxxxMasterData')->allFiles();
foreach ($files as $file) {
    $fileToCopy = Storage::disk('xxxxMasterData')->get($file);
    Storage::disk('localXxxxlMasterData')->put($file, $fileToCopy);
    if (env('APP_ENV') === 'production') {
        Storage::disk('xxxxMasterData')->delete($file);
    }
}
The content of the file on the FTP server is OK. The special characters are intact:
438186|DM1210DWS|D1=4 L1=14 D2=4 L2=50 Z=2|DWS|Schaftfräser|Fraise en bout|
When I open the copied file, the word Schaftfräser has changed to Schaftfr�ser!
It looks like this:
438186|DM1210DWS|D1=4 L1=14 D2=4 L2=50 Z=2|DWS|Schaftfr�ser|Fraise en bout|
I tried reading the content directly from the FTP server, but it also returns Schaftfr�ser!
Is there a setting in Laravel or on the FTP server that avoids this problem?

I had to change the encoding:
$fileToCopyEncoded = mb_convert_encoding($fileToCopy, 'UTF-8', 'ISO-8859-1');
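If the source encoding isn't known in advance, it can be detected before converting instead of hard-coding ISO-8859-1. A minimal sketch in plain PHP (the Latin-1 sample string here stands in for the file contents):

```php
// Simulate a file that arrives as ISO-8859-1 (Latin-1), as in the CSV above.
$latin1 = mb_convert_encoding('Schaftfräser', 'ISO-8859-1', 'UTF-8');

// Guess the source encoding; strict mode (3rd argument) rejects invalid matches.
$encoding = mb_detect_encoding($latin1, ['UTF-8', 'ISO-8859-1'], true);

// Convert to UTF-8 only when the content isn't UTF-8 already.
$utf8 = ($encoding === 'UTF-8')
    ? $latin1
    : mb_convert_encoding($latin1, 'UTF-8', $encoding);

echo $utf8; // Schaftfräser
```

Because the Latin-1 byte for ä (0xE4) is not valid UTF-8 in that position, the detection falls through to ISO-8859-1 and the conversion runs.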

Related

Trouble downloading csv file laravel (content in response but doesn't download)

I'm trying to generate and then download a CSV file in Laravel 9.
The file is generated at the correct location (public/files/OUT/), but I'm getting the content in the response and no download occurs.
Here is my code:
$path = storage_path('app\public\files\OUT\\');
$filename = 'dataTemplate';
$f = fopen($path.$filename.'.csv', 'wb');
// => PUT DATA IN CSV => no problem
fclose($f);
// file is generated successfully at path location
$headers = [
    'Content-Type' => 'text/csv',
];
return response()->download($path.'dataTemplate.csv', 'dataTemplate.csv', $headers);
Thanks in advance

Laravel / Twilio: Twilio\Exceptions\ConfigurationException Credentials are required to create a Client

So I am trying to use the SMS (text message) function in Laravel / Twilio. I tested it on a local machine, and the credentials and everything work fine. I use the same code on my remote machine (which worked fine yesterday), and today I am getting an error: "Error: Credentials are required to create a Client".
I have triple-checked that the credentials are correct. I have even hard-coded them into the code and moved them from the .env file to a config file, and it is still not working. I have retested my local machine and it still works. I have copied the code from the local machine (working) to the remote machine and it is still not working. The only difference between the two is that I changed the SMTP settings in the .env file (even if I revert this, the problem still exists). I have cleared caches and restarted services.
My .env file:
TWILIO_SID=xxxxxxxxxx
TWILIO_TOKEN=xxxxxxxxxxxxxxx
TWILIO_FROM=+1xxxxxxxxxxxxx
My TwilioController:
public function smsSend()
{
    $receiverNumber = "+111111111";
    $message = "Sup Dude";

    try {
        $account_sid = getenv("TWILIO_SID");
        $auth_token = getenv("TWILIO_TOKEN");
        $twilio_number = getenv("TWILIO_FROM");

        $client = new Client($account_sid, $auth_token);
        $client->messages->create($receiverNumber, [
            "from" => $twilio_number,
            "body" => $message,
            "statusCallback" => "https://webhook.site/xxxxxxxxxxxxxx"
        ]);

        dd('SMS Sent Successfully.');
    } catch (Exception $e) {
        dd("Error: " . $e->getMessage());
    }
}
My web.php
Route::get('/smssend', [TwilioController::class, 'smsSend'])->name('smsSend');
Any help would be greatly appreciated
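One common cause of exactly this symptom on a deployed Laravel app: once the configuration is cached (php artisan config:cache), env()/getenv() return nothing at runtime, so the Twilio Client receives empty credentials. The usual fix is to read the values through config() instead; a sketch, where the services.twilio key names are an assumed layout, not from the original post:

```php
// config/services.php — env() is read once, when the config is built/cached.
// (Key names under 'twilio' are illustrative.)
'twilio' => [
    'sid'   => env('TWILIO_SID'),
    'token' => env('TWILIO_TOKEN'),
    'from'  => env('TWILIO_FROM'),
],

// In the controller, read from config(), which works even with a cached config:
$client = new Client(
    config('services.twilio.sid'),
    config('services.twilio.token')
);
```

After adding the config keys, run php artisan config:clear (or re-run config:cache) so the new values are picked up.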

Issues in JSON to XML and Upload to FTP in Ballerina Integrator

I am trying the samples given in the Ballerina Integrator tutorials. While running the "JSON to XML and upload to FTP" sample, I am facing this issue:
error org.wso2.ei.b7a.ftp.core.util.BallerinaFTPException.
I know the reason for this issue but don't know where I have to put that setting. Please help me sort it out.
The reason for the issue: the FTP credentials are given in a conf file. I put the conf file under the root directory, but it is not picked up. I need to set
b7a.config.file=src/upload_to_ftp/resources/ballerina.conf
but I don't know where to pass this.
Thanks in Advance.
You can pass --b7a.config.file when running the generated jar file.
Official documentation :
https://ei.docs.wso2.com/en/latest/ballerina-integrator/develop/running-on-jvm/
However, keeping the ballerina.conf file in the root directory should work. Ballerina looks for the conf file automatically when running. Make sure the conf file is outside the src directory.
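Assuming the project builds to a jar named upload_to_ftp.jar (the name and target path here are illustrative), the invocation might look like:

```shell
# Point Ballerina at the conf file explicitly when running the generated jar.
java -jar target/bin/upload_to_ftp.jar --b7a.config.file=src/upload_to_ftp/resources/ballerina.conf
```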
For the error that you have mentioned, could you add logs to check whether the JSON has been converted to XML properly? Since the code checks whether the conversion has occurred, it should print XML:
if (employee is xml) {
    var ftpResult = ftp->put(remoteLocation, employee);
    if (ftpResult is error) {
        log:printError("Error", ftpResult);
        response.setJsonPayload({Message: "Error occurred uploading file to FTP.", Reason: ftpResult.reason()});
    } else {
        response.setJsonPayload({Message: "Employee records uploaded successfully."});
    }
} else {
    response.setJsonPayload({Message: "Error occurred transforming json to xml.", Reason: employee.reason()});
}
The if (employee is xml) part checks whether the conversion was successful.
The same applies after the file is sent to the server: if the file hasn't been sent, ftpResult will be an error. Basically, if you got the message { Message: "Employee records uploaded successfully." } then all the checks passed.
I passed the credentials directly to ftpConfig and it works fine. The conversion happened and the converted file was uploaded to the FTP location successfully:
ftp:ClientEndpointConfig ftpConfig = {
    protocol: ftp:SFTP,
    host: "corpsftp.dfaDFDA.com",
    port: 22,
    secureSocket: {
        basicAuth: {
            username: "DDFDS",
            password: "FADFHYFGJ"
        }
    }
};
Output:
{
    "Message": "Employee records uploaded successfully."
}

File Upload - Error executing ListObjects | AWS S3 & Laravel

I am getting the error below when I try to upload a file greater than 25 MB to Amazon S3 using the Laravel AWS SDK; files below 25 MB upload successfully. I have everything set up correctly in my .env file. I have no idea why this is happening.
Any help would be appreciated.
Error:
Error executing "ListObjects" on
"bucketdomain/?prefix=b6d767d2f8ed5d21a44b0e5886680cb9%filename%2F&max-keys=1&encoding-type=url";
AWS HTTP error: cURL error 7: Failed to connect to bucketdomain
port 443: Network unreachable (see
http://curl.haxx.se/libcurl/c/libcurl-errors.html)
Save function in Laravel:
$v = Storage::disk('s3')->put($path, file_get_contents($file), 'public');
unlink($file->getPathname());
return response()->json(["message" => "File uploaded successfully!"]);
Upload function in Laravel:
if ($receiver->isUploaded() === false) {
    throw new UploadMissingFileException();
}

$save = $receiver->receive();

if ($save->isFinished()) {
    // database entries...
    return $this->saveChunkFile($file, $userFolderName, $path, $fileName);
}

$handler = $save->handler();

return response()->json([
    "Percentage" => $handler->getPercentageDone()
]);
I am using resumable.js on the client side to upload files in chunks; the code above handles the chunks, merges them when done, and passes the result to the saveChunkFile function.
Picture:
The file should be stored in the 2nd folder from the top, but there is no file there; that is why I think the error is thrown in the size function, and these files (chunks) keep being generated without stopping.

Laravel write file stream

I am using Guzzle to download a file from a URL and save it into my storage.
So I have code that looks like this:
$response = $this->client->request('GET', $model->url, [
    'stream' => true
]);
$body = $response->getBody();
while (!$body->eof()) {
    Storage::append($this->filePath, $body->read(1024));
}
But when I open the folder where my file is located, I see that the file size keeps changing and is sometimes zero, so in the end I get an invalid file.
How can I solve this problem?
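Storage::append() is a poor fit here: each call rewrites the file and inserts a line separator between chunks, which corrupts binary downloads and explains the fluctuating size. A minimal sketch of a plain stream copy instead, using local streams to stand in for the Guzzle body and the storage disk:

```php
// Source stream standing in for $response->getBody() from Guzzle.
$src = fopen('php://temp', 'r+');
fwrite($src, random_bytes(10000));
rewind($src);

// Destination opened once in binary write mode, so chunks land in order
// with nothing inserted between them.
$dstPath = tempnam(sys_get_temp_dir(), 'dl');
$dst = fopen($dstPath, 'wb');

while (!feof($src)) {
    fwrite($dst, fread($src, 1024));
}

fclose($src);
fclose($dst);
```

With a Laravel disk, the equivalent one-shot call is Storage::writeStream($path, $resource), which accepts a PHP stream resource and writes it without buffering the whole file in memory.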
