Laravel: XMLHttpRequest cannot load https://website.com/images/1554690945.png. No 'Access-Control-Allow-Origin' header is present

I'm having a CORS problem with my Laravel file system. I'm trying to cache an image from a URL (which is also on my website) in my Ionic application, but it fails with the error above. I tried the image from https://reqres.in/api/users/1 and there is no problem caching it in my Ionic application, so I'm guessing the problem is in my Laravel website.

In one of my current projects I have to save 200+ images in my Ionic app from a request to my server.
The way I handled this was to convert the image to Base64 using Intervention Image, return it in the response to the app, and then save the Base64 string in Ionic Storage, like so.
Laravel Controller
use Illuminate\Http\Request;
use Intervention\Image\Facades\Image;

public function grabImages(Request $request)
{
    // Encode the image as a Base64 data URL (data:image/png;base64,...)
    $image = (string) Image::make('public/bar.png')->encode('data-url');

    return [
        'base64'    => $image,
        'file_name' => 'test',
    ];
}
Ionic
After receiving the data you can store it in Ionic Storage and access it wherever you like, even offline.
To display it, all you have to do is set the image source to the Base64 string.
This method also solves a few other problems: the images don't show up in the user's photo gallery, you can keep them available offline for as long as you like, and you can remove them whenever you want.
As ImJT said, I am using barryvdh's laravel-cors package as well.
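If you'd rather not pull in the package, a bare-bones sketch of a middleware that adds the header yourself could look like this (the class name and allowed origin are placeholders, and it only covers responses that actually go through Laravel, not files served directly by Nginx/Apache):

namespace App\Http\Middleware;

use Closure;

class AddCorsHeaders
{
    // Attach CORS headers to every response passing through this middleware
    public function handle($request, Closure $next)
    {
        $response = $next($request);

        $response->headers->set('Access-Control-Allow-Origin', '*');
        $response->headers->set('Access-Control-Allow-Methods', 'GET, OPTIONS');
        $response->headers->set('Access-Control-Allow-Headers', 'Content-Type, Authorization');

        return $response;
    }
}

You would then register it in app/Http/Kernel.php so it runs for the image routes.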
Hope this answered your question, good luck!

Related

Does Laravel cache its .env in any way? I am having a weird problem

Sorry for the long post, I don't have any other way to describe it briefly.
I have two Laravel applications hosted on two subdomains of the same domain: one is
form.example.com, the other is dashboard.example.com.
The Dashboard app sends an HTTP request to the Form app to get some JSON data. The code it uses to send the request is like this:
$url = "https://form.example.com/api/v2/get/orders/" . urlencode($log->lastpull);
$client = new \GuzzleHttp\Client();
$request = $client->request('GET', $url);
$json = $request->getBody();
$objects = (json_decode($json));
Now the problem is that the Dashboard app sometimes gets blank JSON or some error message back from the Form app when this request is made.
However, the same code works fine on localhost, and when I open the URL (https://form.example.com/api/v2/get/orders/) directly I get a valid JSON object, which indicates that the Form app is fine.
The error message I mentioned, which I get from the Form app as a response to the HTTP request, is this:
SQLSTATE[42S02]: Base table or view not found: 1146 Table 'dbl5qxvxfrl9tl.products' doesn't exist (SQL: select * from `products` order by `index` asc)
The problem with this error message is that the database it mentions, 'dbl5qxvxfrl9tl', belongs to the Dashboard app, the app which is making the request!
I have no idea why the Form app is looking for its table in the Dashboard app's database. It only occurs when I host the Dashboard app on my shared hosting server; on localhost it works fine.
I tried to export the stack trace from the error page, but for some reason it wouldn't let me export, so I saved the HTML page instead:
drive.google.com/file/d/1elbJhv4BpDNlxC2Ji_463dDLndjzjbm6/view?usp=sharing
(I have replaced the domain name in this HTML file for privacy reasons)
As user apokryfos commented on the main post, the solution to this problem is:
If these are two separate Laravel apps in different paths then you can also try running php artisan config:cache on both of them to generate a cache for the config so it doesn't read environment variables anymore (in case there's some cross-over)

How to retrieve files from S3 in Laravel Vapor

I'm having a problem loading images dynamically in my HTML after storing them successfully with Laravel Vapor.
I have followed the documentation provided by Laravel Vapor to store files, and it works like a charm. I copy my uploaded files from the tmp directory into the root of my S3 bucket and then store the path of the file in my database's images table, so that later I can return the file path to my front end and display the image in the browser.
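Roughly, that store step looks like this (a simplified sketch; the Image model and the path request field are just illustrative names for how I'm wiring it up):

use App\Image;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;

public function store(Request $request)
{
    // vapor.store() uploads to a tmp/ key on the bucket; move the file out of tmp/
    $permanentPath = str_replace('tmp/', '', $request->path);
    Storage::copy($request->path, $permanentPath);

    // Keep only the S3 path in the images table and return it to the front end
    $image = Image::create(['path' => $permanentPath]);

    return response()->json(['path' => $image->path]);
}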
Unfortunately, requesting that file afterwards always returns a 403 status code from AWS S3.
I could fix this by making my generated S3 bucket public, but that would raise a security issue. I believe this should work out of the box; I'm not sure where I could have gone wrong... any ideas?
I am returning the uploaded image URL using the Storage facade.
use Illuminate\Support\Facades\Storage;
return Storage::url($image->path);
Where $image->path is the file path in my S3 bucket.
I'm sure that the Storage facade is working correctly because it is returning the correct URL with the file's path.
I found the solution to this problem. I contacted Laravel Vapor support and was told to set the visibility property of the file to public when copying it to its permanent location, as stated in Laravel's official documentation here.
So after you upload your file using the JS vapor.store method, you should copy it to a permanent directory and then set its visibility to public.
$permanentPath = str_replace('tmp/', '', $request->path);

Storage::copy($request->path, $permanentPath);
Storage::setVisibility($permanentPath, 'public');
I also noticed that you can set the visibility of the file directly in the vapor.store method by passing a visibility option with the respective value.
vapor.store(file, { visibility: 'public-read' });
As a side note: passing just 'public' will return a 400 Bad Request; it must be set to 'public-read'.

How to control access to files on another server in Laravel

I have one host for my Laravel website and another (non-Laravel) host for stored files. Direct access to my files is blocked completely by default, and I want to control access to them by creating temporary links in my Laravel site. I know how to code, I just want to know the idea of how to do it (not the details).
From the Laravel docs
Temporary URLs
For files stored using the s3 or rackspace driver, you may create a temporary URL to a given file using the temporaryUrl method. This method accepts a path and a DateTime instance specifying when the URL should expire:
$url = Storage::temporaryUrl(
    'file.jpg', now()->addMinutes(5)
);
You could also build your own solution by routing all image requests through your own server and making sure the file visibility is set to private.
Here is an example of how a controller could return an image from your storage:
public function get($path)
{
    // Fetch the (private) file from S3 through your own server
    $file = Storage::disk('s3')->get($path);

    // Do your temp link solution here

    return response($file, 200)->header('Content-Type', 'image/png');
}
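One way to fill in the "temp link solution" placeholder when the files are not on S3 is Laravel's signed URLs (a sketch; the route and controller names are illustrative, and temporarySignedRoute requires Laravel 5.6+):

use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\URL;

// routes/web.php: reject requests whose signature is missing, altered, or expired
Route::get('/files/{path}', 'FileController@get')
    ->name('files.show')
    ->where('path', '.*')
    ->middleware('signed');

// Wherever you hand out links: generate a URL that stops working after 5 minutes
$url = URL::temporarySignedRoute(
    'files.show', now()->addMinutes(5), ['path' => 'images/bar.png']
);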
What I am using right now is the Flysystem integration provided in Laravel. Laravel's Flysystem integration provides simple drivers for working with local filesystems, Amazon S3, and some other storage providers, so it doesn't matter whether the file server is a Laravel server or not.
Even better, it's very simple to switch between servers by just changing the disk configuration.
As far as I know, we can also create temporary URLs for s3 and rackspace with this by calling the temporaryUrl method. Caching is built in as well.
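For example, switching between a local disk and S3 is just a matter of which disk you configure and call in config/filesystems.php (the entries below are the stock Laravel ones, read from .env):

// config/filesystems.php (excerpt)
'disks' => [
    'local' => [
        'driver' => 'local',
        'root'   => storage_path('app'),
    ],

    's3' => [
        'driver' => 's3',
        'key'    => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'region' => env('AWS_DEFAULT_REGION'),
        'bucket' => env('AWS_BUCKET'),
    ],
],

Code that calls Storage::disk('s3') only needs the disk name (or the default disk) changed to point somewhere else.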
Here's the thing: if your files are uploaded to an AWS S3 bucket, then:
use Carbon\Carbon;
use Illuminate\Support\Facades\Storage;

$file_path = "4/1563454594.mp4";

if (Storage::disk('s3')->exists($file_path)) {
    // Link expiration time
    $urlExpires = Carbon::now()->addMinutes(1);

    try {
        $tempUrl = Storage::disk('s3')->temporaryUrl($file_path, $urlExpires);
    } catch (\Exception $e) {
        // temporaryUrl throws if the disk's driver doesn't support it
        return response($e->getMessage());
    }
}
Your temporary URL will be generated, and after the given expiration time (1 minute) it will expire.

Laravel - Can't download file from storage on Ubuntu server with Nginx

So I'm testing that a Laravel app I just deployed to an Ubuntu server with Nginx works correctly, and I reached a point where I need to download some files that were uploaded from the front end using Angular.
I can upload files with no problem, and I made sure that they are actually on the server; they are saved as expected.
However, when I try to download them I get the error: "Failed to create the file".
It worked on my local machine, so I'm guessing it's some kind of configuration problem, but I'm not sure what to change yet.
The file is requested through a GET request with the option { responseType: ResponseContentType.Blob }, which is part of Angular's Http client.
And in Laravel this is how I'm returning the file:
public function download($activityId)
{
    $activity = Activity::find($activityId, ['student_id', 'file_storage']);

    // Files live under public/storage, which is a symlink to storage/app/public
    $file = public_path() . '/storage/activityFiles/' . $activity['student_id'] . '/' . $activity['file_storage'];

    return response()->download($file);
}
What can I be missing?
I decided to take a look at the Laravel logs to dig a little further into the problem.
The problem was that I had forgotten to create the symbolic link to the storage directory (php artisan storage:link).
That fixed the issue.
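An alternative that avoids depending on the public/storage symlink altogether (a sketch, assuming the files live on the public disk and Laravel 5.5+ for the download() method) is to stream the file straight from the disk:

use App\Activity; // adjust to App\Models\Activity on newer Laravel versions
use Illuminate\Support\Facades\Storage;

public function download($activityId)
{
    $activity = Activity::findOrFail($activityId, ['student_id', 'file_storage']);

    // Streams the file from storage/app/public without going through public_path()
    return Storage::disk('public')->download(
        'activityFiles/' . $activity->student_id . '/' . $activity->file_storage
    );
}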

Laravel is saving the wrong image link in the DB

I am trying to save the image link in the DB like this:
'photo' => asset('uploads/'.$fileName2)));
and in the DB it is being saved as:
http://localhost:8000/uploads/8730.jpeg
irrespective of the URL in the config file:
APP_URL=http://example.com
The asset() helper function generates a URL for an asset using the current scheme of the request.
https://laravel.com/docs/5.3/helpers#method-asset
So when you access your Laravel app via http://localhost:8000/, asset('uploads/'.$fileName2) will return http://localhost:8000/uploads/....
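If you want the stored link to follow APP_URL no matter which host the request came in on, one option (a sketch, not the only fix) is to build the link from the app.url config value instead of asset():

// config('app.url') reads APP_URL from your .env
'photo' => rtrim(config('app.url'), '/') . '/uploads/' . $fileName2,

Another common approach is to store only the relative path (uploads/8730.jpeg) in the database and build the full URL at display time.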
