I am a junior intern working on a Laravel side project. I need to generate a URL and send it to a specific email address, where the user can view a PDF for a limited time and only once.
For now I am just working on the unique URL generation, and I would like some advice on generating these URLs securely, following standards, and with a limited lifetime.
Would love just some resources or advice.
I think what you are looking for is Signed Routes in Laravel.
This should probably help:
https://laravel.com/docs/9.x/urls#signed-urls
Example:
First of all, before using signed URLs you need to check that the following line is uncommented in your App/Http/Kernel:
'signed' => \Illuminate\Routing\Middleware\ValidateSignature::class,
After that, you have to add the signed middleware to your web routes:
Route::get('pdf/{id}/{user}/{response}', PdfController::class)->name('pdf.generate')->middleware('signed');
Finally, you can generate your signed URL as follows:
use \Illuminate\Support\Facades\URL;

URL::temporarySignedRoute('pdf.generate', now()->addHour(), [
    'id' => 25,
    'user' => 100,
    'response' => 'yes',
]);
which creates a signed URL like:
https://example.com/pdf/25/100/yes?expires=1521543365
&signature=d32f53ced4a781f287b612d21a3b7d3c38ebc5ae53951115bb9af4bc3f88a87a
The signature hash covers all parameters in the route and prevents manipulation, and best of all, in one hour the URL expires for good, throwing an Illuminate\Routing\Exceptions\InvalidSignatureException when visited after expiry.
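Under the hood, a temporary signed URL is essentially an HMAC computed over the full URL including the expires timestamp; verification recomputes the HMAC and checks the expiry. A minimal Python sketch of the idea (this is an illustration of the concept, not Laravel's actual implementation; the key and helper names are made up):

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

APP_KEY = b"some-secret-key"  # stand-in for Laravel's APP_KEY

def temporary_signed_url(base_url: str, params: dict, ttl_seconds: int) -> str:
    """Append an expiry timestamp and an HMAC signature to the URL."""
    params = dict(params, expires=int(time.time()) + ttl_seconds)
    unsigned = f"{base_url}?{urlencode(params)}"
    signature = hmac.new(APP_KEY, unsigned.encode(), hashlib.sha256).hexdigest()
    return f"{unsigned}&signature={signature}"

def has_valid_signature(url: str) -> bool:
    """Recompute the HMAC over everything before '&signature=' and check expiry."""
    unsigned, _, signature = url.rpartition("&signature=")
    expected = hmac.new(APP_KEY, unsigned.encode(), hashlib.sha256).hexdigest()
    query = dict(pair.split("=") for pair in unsigned.split("?")[1].split("&"))
    return hmac.compare_digest(expected, signature) and time.time() < int(query["expires"])

url = temporary_signed_url("https://example.com/pdf/25/100/yes",
                           {"id": 25}, ttl_seconds=3600)
```

Because the expiry is part of the signed input, tampering with either the parameters or the expires value invalidates the signature.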
Hello, masters of JavaScript and Laravel.
I'm wondering if it's possible to make two separate Inertia.js applications in one Laravel application.
I'm making a SaaS style application, where there will be a front facing part and a backoffice part.
This will be split in two domains (or more), using the same Laravel/Octane application. This works.
However, the Ziggy routes from the backoffice also show up in the front facing application.
There's of course a security risk in including these in the Ziggy data. Also, transferring the routes from the admin part creates unnecessary traffic, since with Inertia the routes are transferred with every request. The backoffice will have far more routes than the relatively simple frontend.
Any idea how this could be structured so the Ziggy data is split in two, while still being one application?
I don't think this is possible, or if it is, it would be too complex a setup, since Inertia relies on Laravel routing, and when passing data from Laravel back to Inertia you'd have to determine which Inertia app to push the data to.
I think the better approach is to just create domain route groups if you need multiple-domain support, then create a persistent layout on the front end that you can use for both the public area and the admin area. You could just use Laravel Jetstream with Vue + Inertia.
I don't really see any reason why you would need a separate front-end application, but if you really do, I think you'd be better off setting up Laravel API routes so you can build a separate application for the front end.
I ended up doing this in the HandleInertiaRequests middleware:
/**
 * Define the props that are shared by default.
 *
 * @return array<string, mixed>
 */
public function share(Request $request): array
{
    $ziggyObj = (new Ziggy)->toArray();

    if ($request->host() != config('app.admin_domain')) {
        $ziggyObj['routes'] = array_filter(
            $ziggyObj['routes'],
            fn ($key) => ! str_starts_with($key, 'admin'),
            ARRAY_FILTER_USE_KEY
        );
    }

    return array_merge(parent::share($request), [
        'auth' => [
            'user' => $request->user(),
        ],
        'ziggy' => function () use ($ziggyObj, $request) {
            return array_merge($ziggyObj, [
                'location' => $request->url(),
            ]);
        },
    ]);
}
So any routes that start with "admin" are not sent to the frontend, unless the request comes from the admin domain.
The front-facing application routes are also sent to the backoffice, but I might filter those out later.
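Stripped of the Ziggy specifics, the middleware above is just filtering a name-to-URI map by key prefix. A quick Python sketch of that idea (the route names and hostnames are illustrative):

```python
def filter_routes(routes: dict, host: str, admin_host: str) -> dict:
    """Drop admin-prefixed route names unless the request came from the admin host."""
    if host == admin_host:
        return routes
    return {name: uri for name, uri in routes.items()
            if not name.startswith("admin")}

routes = {
    "home": "/",
    "admin.users": "/admin/users",
    "admin.stats": "/admin/stats",
}
```

Note that this relies on a consistent naming convention: every backoffice route name must actually start with the agreed prefix, or it will leak through.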
I'm working with Laravel 5.6 as my backend for a personal project, and I've been doing something that seems (to me) like a bad practice; either way, I would like to know if it is actually that bad.
First of all, I'm using a Vue.js (CLI 3) project as a client and making requests to my Laravel backend. To handle notifications/toasts, I'm using the following format:
return response()->json([
    'alert' => [
        'title' => 'Server error.',
        'text' => 'Error explanation text.',
        'type' => 'error'
    ]
], 200);
It doesn't matter whether everything went right or wrong; I am always responding with this same format and a 200 status. Is that wrong? Should I use other statuses in my responses?
I am doing this because I can't figure out how to get the custom 'alert' array on the client side when using a 404 status (for example), and the only way I could find to deal with it was to use a 200 status every single time.
HTTP status codes are a mechanism to easily identify a response; they help clients understand whether a request was okay just by checking the code. For example, a search engine robot can distinguish an error page thanks to the status code alone.
In Axios, for example, an HTTP client for JS, you can read the response data even when it was an error: https://stackoverflow.com/a/39153411.
Also have a look at this resource, which will help you choose which status code to use: https://www.restapitutorial.com/httpstatuscodes.html
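The key point is that an error status does not stop the client from reading the body; the same JSON 'alert' payload can ride along with a 404. A self-contained Python sketch (it spins up a throwaway local server so the behavior can be shown without any real backend; the payload shape mirrors the question's 'alert' format):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.error import HTTPError
from urllib.request import urlopen

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond with a proper error status AND a JSON body.
        body = json.dumps({"alert": {"title": "Not found.",
                                     "text": "No such resource.",
                                     "type": "error"}}).encode()
        self.send_response(404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

try:
    urlopen(f"http://127.0.0.1:{server.server_port}/missing")
except HTTPError as err:
    # The 404 body is still readable from the error object.
    alert = json.loads(err.read())["alert"]

server.shutdown()
```

The client-side pattern is the same in Axios: the error object carries the response, so the toast data is available in the catch branch.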
Having some slight issues with Laravel validation rules. I.e. I have set up a form with a field named 'url'. This field needs to be a URL but is not required.
So I have:
'url' => 'url',
In the validation rules, but on submit it still comes back saying the URL has an invalid format. But I didn't fill it out, and it isn't required.
Slightly confused here, anything I should look out for here?
https://laravel.com/docs/5.6/validation#a-note-on-optional-fields
By default, Laravel includes the TrimStrings and ConvertEmptyStringsToNull middleware in your application's global middleware stack. These middleware are listed in the stack by the App\Http\Kernel class. Because of this, you will often need to mark your "optional" request fields as nullable if you do not want the validator to consider null values as invalid.
So, this validation rule will do the trick:
'url' => ['nullable', 'url']
For that, I usually put nullable at the very beginning:
'url' => 'nullable|url',
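The nullable rule essentially short-circuits the remaining checks once ConvertEmptyStringsToNull has turned the empty input into null. A rough Python sketch of that decision flow (the URL regex is a simplified stand-in for Laravel's real url rule, not its actual implementation):

```python
import re

def validate_optional_url(value, nullable=True):
    """Return True if the field passes a ['nullable', 'url'] style rule set."""
    if value == "":
        value = None  # what ConvertEmptyStringsToNull does to empty inputs
    if value is None:
        return nullable  # nullable short-circuits; without it, null fails 'url'
    return re.match(r"^https?://\S+$", value) is not None
```

Without nullable, an untouched form field arrives as null and still hits the url check, which is exactly the failure the question describes.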
I have these two routes in a middleware so they stay on top of all other routes:
Route::get('{slug?}', array(
    'as' => 'homeIndex',
    'uses' => 'Modules\\Pages\\Controllers\\Pages@index'
))->where('slug', '(.*)?');

Route::get('{company?}', array(
    'as' => 'companyProfile',
    'uses' => 'Modules\\Company\\Controllers\\Profile@index'
))->where('company', '(.*)?');
What I'm trying to achieve is to route all pages through homeIndex and all company profiles through companyProfile, all on the first segment.
It works fine for pages, but for company profiles I get a 404.
It's the same as Facebook: if you go to facebook.com/about the result is the about page, and if you replace about with your unique name you get your profile.
Any ideas how to make it work?
Facebook works because about is a route. This works because your unique name can never be about. So they look for the about route first, and if the segment isn't about, they know it's probably a unique name.
Yours is different because your app doesn't know whether the first segment is a slug or a company, so you need some way to tell it the difference. If you routed everything to a single function, then in that function ran whatever queries you need to figure out whether the first segment is a slug or a company, and then dispatched it appropriately, it would work.
Laravel reads routes from the top to the bottom. When you hit the route /some-random-company, Laravel has no idea this is a company. All it knows is it happens to match the first route so it hits the slug route with your company. So another solution would be to update the wheres on your routes so Laravel has some idea if the incoming route parameter is a slug or a company and will know where to route that request to.
I'm terrible at regular expressions, and I don't know how you are generating slugs or whether there are any rules on what a company name can or cannot consist of. What you would have to do is figure out whether there is any way to accurately determine if the route parameter is a slug or a company. For example, if it contains one or more -, it might be a slug.
Then you'd have to write a regular expression pattern that looks for multiple - and put that pattern into the where for the slug route. Then, if the route parameter is a company and does not match that pattern, Laravel will know to match the request with the company route.
If there is nothing you can use to determine whether a string is a slug or a company, you will have to update your routes to look like company/{company?} and slug/{slug?}, and then output your links appropriately. This way Laravel will know for sure where to route the traffic.
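The hyphen heuristic above could be sketched like this in Python (the pattern is illustrative, and it only works if company names genuinely never contain hyphens):

```python
import re

# A slug here means lowercase words joined by at least one hyphen.
SLUG_PATTERN = re.compile(r"[a-z0-9]+(?:-[a-z0-9]+)+")

def classify_segment(segment: str) -> str:
    """Guess whether a first URL segment is a page slug or a company name."""
    return "slug" if SLUG_PATTERN.fullmatch(segment) else "company"
```

In Laravel terms, the pattern would go into the ->where() constraint on the slug route, so non-matching segments fall through to the company route.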
This looks like an issue to me. The normal way Backbone works is using the same URL with GET, POST, PUT and DELETE. But obviously:
1) All methods except POST need an ID, either in the URL or in the request body
2) A DELETE request cannot contain a body, or some servers ignore the body
So how can you make, let's say, a Ruby on Rails server app work successfully with Backbone without having to hack Backbone, given that model.destroy() needs to have the ID in the URL? And, as our RoR developer tells me, isn't the normal way to do routes for PUT to also have an ID in the URL?
There are 5 routes you need to implement to make use of backbone's default sync behavior. For a resource user, they are:
GET /user/ // Get a list of users
GET /user/:id // Get a single users by id
POST /user/ // Create a new user
PUT /user/:id // Update an existing user by id
DELETE /user/:id // Delete an existing user by id
I'm not very familiar with Ruby on Rails, but glancing at their documentation, you could fulfill this specification with something like:
match "/user/" => "users#all", :via => :get
match "/user/:user_id" => "users#one", :via => :get
match "/user/" => "users#create", :via => :post
match "/user/:user_id" => "users#update", :via => :put
match "/user/:user_id" => "users#delete", :via => :delete
You should not have to hack Backbone for it to work with RoR. Backbone is smart enough to know (to a certain extent) what URL and what method it should use.
For example, for the initial fetch of a model, if you set url to '/tasks', it will do a GET request to '/tasks/id'. When you change that model and call model.save, it will do a PUT request to '/tasks/id'. When you call model.destroy, it will send a DELETE request (with an empty body) to '/tasks/id'.
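Backbone's URL logic boils down to: use the collection/root URL for create and list, and append the model's id for read, update and delete. A sketch of that dispatch in Python (the method names mirror Backbone's sync verbs; this is a paraphrase of the behavior, not Backbone's source):

```python
def sync_request(method: str, url_root: str, model_id=None):
    """Map a Backbone-style sync method onto an HTTP verb and URL."""
    verbs = {"create": "POST", "read": "GET", "update": "PUT", "delete": "DELETE"}
    # New models have no id, so they hit the root URL; saved models get /id appended.
    url = url_root if model_id is None else f"{url_root.rstrip('/')}/{model_id}"
    return verbs[method], url
```

This is why the five routes listed above are all Backbone needs: every sync call resolves to one of those verb/URL combinations.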
The one thing you do have to consider is the CSRF token. I suggest you include backbone-rails in your Gemfile. It includes some JavaScripts to help with Rails/Backbone integration.