I am starting a new project: Nuxt.js for the frontend and Laravel for the backend.
How can I connect the two?
I have installed a new Nuxt project using create-nuxt-app, and a new Laravel project.
From what I have searched so far, I figured I need some kind of environment variables.
In my Nuxt project, I have added the dotenv package and placed a new .env file in the root of the Nuxt project.
I have also added CORS support to my Laravel project, as I had been getting an error.
The variables inside are indeed accessible from the project, and I'm using them like this:
APP_NAME=TestProjectName
API_URL=http://127.0.0.1:8000
And accessing it like this:
process.env.APP_NAME, etc.
To make HTTP calls, I am using the official Axios module for Nuxt.js, and to test it I used it in one of the components that came by default.
The backend:
Route::get('/', function () {
    return "Hello from Laravel API";
});
and from inside the component:
console.log(process.env.API_URL) // Gives 127.0.0.1:8000

// But this gives undefined
this.$axios.$get(process.env.API_URL).then((response) => {
  console.log(response);
});
What am I doing wrong here?
I have tried to describe my setup and problem as best as I can. If I overlooked something, please tell me and I will update my question. Thanks.
Assuming that visiting http://127.0.0.1:8000/ in your browser gives you the expected response, let's see what might be wrong on the frontend:
First, you should make sure that the axios module is initialized correctly. Your nuxt.config.js file should include the following:
// inclusion of the module
modules: [
  '@nuxtjs/axios',
  // ...other modules
],

// configuration of the module
axios: {
  baseURL: process.env.API_URL,
},
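Also worth checking: in Nuxt 2, process.env values are only available in client-side code if they are exposed through the env key of nuxt.config.js (or injected by the @nuxtjs/dotenv module). A minimal sketch, assuming Nuxt 2 with a .env file in the project root:

// nuxt.config.js (a sketch assuming Nuxt 2; values are inlined at build time)
require('dotenv').config() // load .env while this config file is evaluated

export default {
  env: {
    // without an entry here, process.env.API_URL stays undefined in client bundles
    API_URL: process.env.API_URL
  },
  // ...modules and axios config as above
}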
Keep in mind that, depending on the component's lifecycle, your axios request may run on the client side (after server-side rendering), where the address 127.0.0.1 might be invalid. I would suggest avoiding 127.0.0.1 or localhost when defining API URLs, and preferring your local network IP for local testing.
After configuring the axios module as above, you can make requests in your components using just relative API paths:
this.$axios.$get('/').then(response => {
  console.log(response)
}).catch(err => {
  console.error(err)
})
While testing whether this works, it is very helpful to open your browser's dev tools > Network tab and check the state of the request. If you still don't get the response, odds are you'll have more info either from the catch section or from the request status in the dev tools.
Keep us updated!
Nuxt has a file-based routing structure to make it easy to set up server-side rendering, and also to help with maintainability. This can cause Laravel and Nuxt to fight over the routing; you will need to configure this to get it working correctly.
I'd suggest you use Laravel-Nuxt, as a lot of these small problems are solved for you:
https://github.com/cretueusebiu/laravel-nuxt
I'd like to understand how to enable caching in Strapi for a specific (or any) API endpoint. At the moment, when the browser hits my endpoint, I don't see any caching-related headers in the response. Is there a way to use ETags and a long cache time so that the JSON response can be cached?
There is one mention of ETags in the docs, but I'm not sure how to implement it. If anyone can provide more detailed information, it would be appreciated.
At least for static files, this can be done within Strapi itself. The middlewares documentation suggests that there is already a middleware called public, which sets a Cache-Control header with its maxAge when it serves the files from the public/ directory.
But if you want to get your uploaded files cached (i.e. files within the public/uploads/ directory), that's not enough, because the middleware strapi-provider-upload-local (not yet documented) runs first.
A recently published package solves this issue:
npm i strapi-middleware-upload-plugin-cache
Simply activate it by adding or modifying the config/middleware.js file with the following content:
module.exports = ({ env }) => ({
  settings: {
    'upload-plugin-cache': {
      enabled: true,
      maxAge: 86400000 // cache uploads for 24 hours (value in milliseconds)
    }
  }
});
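If you want to confirm the header is actually being returned, you can request an uploaded file and inspect the response. A quick sketch; the port and file name are assumptions for a default local Strapi setup:

// run in a browser console or any environment with a global fetch
fetch('http://localhost:1337/uploads/example.png')
  .then(res => console.log(res.headers.get('cache-control'))) // should be non-null once the middleware is active
  .catch(err => console.error(err));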
I suggest you manage this kind of thing outside of Strapi, with another service. For example, if you host your app on AWS, you can use CloudFront.
Hi, I am using Vue.js on the frontend and Laravel on the backend. The role of Laravel is handling the API only. The frontend and backend are separated, i.e. I am not using Vue.js inside Laravel's resources/js folder.
Now I am sending an Axios POST request from Vue.js to Laravel. All the form input values are pre-validated using the HTML5 required attribute, and when I console.log the request data, it shows all the fields filled.
In the Vue file:
const data = {
  name: this.name,
  gender: this.gender,
  mobile_no: this.mobile_no,
  image: this.userImage
};
console.log("Request data . . . .", data);
const response = await this.axios
  .post(`${this.AppURL}/admin/user/create`, data, {
    headers: {
      "Content-Type": "multipart/form-data"
    }
  })
  .then(() => {
    console.log("Success. . . . ")
    alert("Successfully Driver Added");
  })
  .catch(error => console.log(error));
And in Laravel, the request is passed through some validation. It's a simple validation to check if all the fields are filled.
I am also using the JWTAuth package for authentication, and the token is generated by it.
It's too much code to write it all down here, but I am sure you can understand what I mean.
What I am getting as a response is this:
POST http://localhost:8000/api/admin/user/create 422 (Unprocessable Entity)
The actual result I expect to get is either success or errors according to the if conditions in the validation or token check.
I tried to figure out where this error might come from. What I think at the moment is that it could be due to the absence of csrf_token in the POST request. As I'm sending the request from outside Laravel, the csrf_token is missing from the form. I am not 100% sure about this, though.
So my question is:
How can I include csrf_token in an Axios POST request when I send it from outside Laravel?
If this 422 error is not related to csrf_token, what could be causing it? Any previous experiences like mine? And any solutions for this?
Thanks in advance for your help.
Please modify the catch block as @Jack suggested:
.catch(error => {
  console.log("ERRRR:: ", error.response.data);
});
Now you can see and handle the errors in the catch block.
.catch(function (error) {
  console.log(error.response.data.errors);
});
Please use this code; I hope it works.
I was also facing the same issue; I think it is due to some headers missing in your API request from Vue.js. Here are some tips which may help you solve this issue.
Make sure that you are protecting your API routes or not (by Sanctum or something else). If you are protecting them, then make sure that you are sending the authentication token in the headers.
Second, make sure that your request (axios or jwt) contains valid data; if you are sending images or files etc., make sure you send them correctly (see the sketch after these tips).
First, get the request and check it in Laravel with dd($request->all());. If you are getting data, then validate; it is possible that the Laravel request doesn't contain the data you are sending.
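On the point about files: when a request is sent as multipart/form-data, axios should be given a FormData object rather than a plain JavaScript object, otherwise Laravel may see empty fields and respond with a 422 validation error. A minimal sketch, reusing the field names and URL from the question:

// build the payload as FormData so the image is transmitted correctly
const data = new FormData();
data.append("name", this.name);
data.append("gender", this.gender);
data.append("mobile_no", this.mobile_no);
data.append("image", this.userImage); // expects a File object from an <input type="file">

await this.axios.post(`${this.AppURL}/admin/user/create`, data, {
  headers: { "Content-Type": "multipart/form-data" }
});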
These errors may be caused by the following reasons; ensure the steps below are followed.
The goal is to connect the localhost with the local virtual machine (host).
Here, I'm connecting http://localhost:3001/ to http://abc.test.
Steps to be followed:
We have to allow CORS; placing Access-Control-Allow-Origin: * in the header of the request may not work, since it is a response header. Install a Google Chrome extension which enables CORS requests during development.
Make sure the credentials you provide in the request are valid.
Make sure the Vagrant box has been provisioned. Try vagrant up --provision; this makes the localhost connect to the database of the Homestead machine.
Just click on the Preview tab within the Network section of the dev tools; you are going to see the actual error message.
I am trying to upgrade my old Polymer application to Polymer v3. Nearly everything works fine if I use polymer serve.
But I also have to use some PHP files to connect to the backend, and there is the problem.
When I try to run the application using polymer serve, the PHP files are not found and return 404 whenever I try to make a POST request to them.
Non-working example, having the following file structure:
|_ phpFile.php
|
|_ jsFile.js
Inside jsFile.js
fetch("phpFile.php", {
method: 'POST',
headers: new Headers({
Accept: 'application/json',
'Content-Type': 'application/json'
},
body: JSON.stringify({
testData: true
})
}).then(response => {
console.warn(response);
});
When I try to run the application with XAMPP (virtual host), making a POST request returns exactly what I need, which is great. But imports like this:
import {PolymerElement, html} from '@polymer/polymer/polymer-element.js';
stop working, because there is no bundler that would resolve the @ scope to the actual path. And no, I can't simply rewrite it to '/node_modules/@polymer/polymer/polymer-element.js'. All of the Polymer elements use this notation, so I would have to rewrite all the source code, which is nonsense.
I need to either make a successful POST to the PHP files while serving with polymer serve, or resolve the @ in imports while serving with localhost (XAMPP or any other service).
Is there anyone who successfully implemented a connection to a PHP file in Polymer 3?
Or anyone who knows some workaround or solution for this?
If you are using PHP for APIs, then create a separate application and run it on XAMPP.
In short, create two different applications: one for the backend (the PHP app) and another for the frontend (the Polymer app).
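If the two applications run on different origins (for example polymer serve on one port and XAMPP on another), the Polymer app can call the PHP app with an absolute URL, provided the PHP side responds with CORS headers that permit the Polymer app's origin. A sketch of the frontend call; the host, port, and path are assumptions:

// jsFile.js: call the separately hosted PHP app; the PHP side must send
// an Access-Control-Allow-Origin header that permits this origin
fetch("http://localhost:80/api/phpFile.php", {
  method: 'POST',
  headers: new Headers({
    Accept: 'application/json',
    'Content-Type': 'application/json'
  }),
  body: JSON.stringify({ testData: true })
}).then(response => {
  console.warn(response);
});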
I'm creating a SPA using vanilla JavaScript and currently setting up sw-precache to handle the caching of resources. The service worker is generated as part of a gulp build and installed successfully. When I navigate to the root url (http://127.0.0.1:8080/) whilst offline the app shell displays, illustrating that resources are indeed cached.
I'm now attempting to get the SW to handle internal routing without failing. When navigating to http://127.0.0.1:8080/dashboard_index whilst offline I get the message 'Site can't be reached'.
The app handles this routing on the client side via a series of event listeners on the users actions or, in the case of using the back button, the url. When accessing one of these urls, no calls to the server should be made. As such, the service worker should allow these links to 'fall through' to the client side code.
I've tried a few things and expected this Q/A to solve the problem. I've included the current state of the generate-service-worker gulp task, and with this setup I'd expect to be able to access /dashboard_index offline. Once this is working I can adapt the solution to cover other routes.
Any help much appreciated.
var gulp = require('gulp');
var path = require('path');
var swPrecache = require('sw-precache');

gulp.task('generate-service-worker', function(callback) {
  var rootDir = './public';
  swPrecache.write(path.join(rootDir, 'sw.js'), {
    staticFileGlobs: [rootDir + '/*/*.{js,html,png,jpg,gif,svg}',
                      rootDir + '/*.{js,html,png,jpg,gif,json}'],
    stripPrefix: rootDir,
    navigateFallback: '/',
    navigateFallbackWhitelist: [/\/dashboard_index/],
    runtimeCaching: [{
      urlPattern: /^http:\/\/127\.0\.0\.1:8080\/getAllData/, // Req returns all data the app needs
      handler: 'networkFirst'
    }],
    verbose: true
  }, callback);
});
update
The code to the application can be found here.
Removing the option navigateFallbackWhitelist does not change the result.
Navigating to /dashboard_index whilst offline prints the following to the console.
GET http://127.0.0.1:8080/dashboard_index net::ERR_CONNECTION_REFUSED
sw.js:1 An unknown error occurred when fetching the script.
http://127.0.0.1:8080/sw.js Failed to load resource: net::ERR_CONNECTION_REFUSED
The same "An unknown error occurred when fetching the script." message is also duplicated in the Application > Service Workers tab of the Chrome dev tools.
It's also noted that the runtimeCaching option is not caching the JSON response returned from that route.
For the record, in case anyone else runs into this, I believe this answer from the comments should address the issue:
Can you switch from navigateFallback: '/' to navigateFallback: '/index.html'? You don't have an entry for '/' in your list of precached resources, but you do have an entry for '/index.html'. There's some logic in place to automatically treat '/' and '/index.html' as being equivalent, but that doesn't apply to what navigateFallback is doing...
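Applied to the gulp task above, the suggested fix is a one-option change; a sketch of the relevant part (everything else stays the same):

swPrecache.write(path.join(rootDir, 'sw.js'), {
  // ...same globs, stripPrefix, and runtimeCaching as in the task above...
  navigateFallback: '/index.html', // was '/', which is not in the precache manifest
  navigateFallbackWhitelist: [/\/dashboard_index/]
}, callback);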
Cross-site AJAX request with Vue.js 1.0 and Vue Resource. I get the following error: XMLHttpRequest cannot load http://dev.markitondemand.com/MODApis/Api/v2/Lookup/jsonp?input=NFLX&callback=handleResponse. No 'Access-Control-Allow-Origin' header is present on the requested resource.
I have a basic understanding of the problem, but I'm not sure how to add a callback function to the request, or whether that is the best solution for this example. I put in the full request URL here just to make it easier to follow.
new Vue({
  el: '#stockList',
  data: function() {
    return {
      query: '',
      stocks: []
    };
  },
  ready: function() {
    this.getStocks();
  },
  methods: {
    getStocks: function() {
      this.$http.get('http://dev.markitondemand.com/MODApis/Api/v2/Lookup/jsonp?input=NFLX&callback=handleResponse',
        function(data) {
          this.stocks = data;
        }
      );
    }
  }
})
I have almost zero understanding of networking, but I was able to get several remote APIs to work using:
this.$http.jsonp
instead of
this.$http.get
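Applied to the code in the question, the swap would look something like this; vue-resource's jsonp method manages the callback query parameter itself, so the hard-coded &callback=handleResponse is dropped (a sketch, not tested against this API):

getStocks: function() {
  // jsonp adds its own callback parameter to the URL
  this.$http.jsonp('http://dev.markitondemand.com/MODApis/Api/v2/Lookup/jsonp?input=NFLX',
    function(data) {
      this.stocks = data;
    }
  );
}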
"No Access-Control-Allow-Origin" header usually is a problem with the server. It means that the server is configured to only allow a person access to the API if the request comes from the same domain as the server. You either need to run the script from the website that you are requesting data from, or you need to change the server config to allow access to all domains.
If you don't have access to the server, and you don't want to run the script in the browser, then I think what you could do is use a headless browser like PhantomJS to navigate to the page, insert a script element into the DOM that contains your script, execute the function, and then return the data from the API. I could write the code out for you, but to be honest, it's a bit complex. You would have to know how to use Node.js and PhantomJS. I've personally only used PhantomJS for the Node 'html-pdf' package, but I'm sure with a little bit of reading you could figure out how to do it.
Set your local environment to http instead of https if you have no control over dev.markitondemand.com.