Nuxt Axios Dynamic url - proxy

I am learning Nuxt by following this tutorial:
https://scotch.io/tutorials/implementing-authentication-in-nuxtjs-app
In the tutorial, the axios config is set like this:
axios: {
  baseURL: 'http://127.0.0.1:3000/api'
},
It points to localhost, which is not a problem for development, but when it comes to deployment, how do I change the URL based on the browser URL?
If the system is used on the LAN, it should be 192.168.8.1:3000/api.
If the system is used from outside, it should be example.com:3000/api.
On the other hand, I am currently using adonuxt (Adonis + Nuxt), and both listen on the same port (3000).
In the future, I might separate them into a server (3333) and a client (3000).
The API links would then be:
localhost:3333/api
192.168.8.1:3333/api
example.com:3333/api
How do I achieve a dynamic API URL based on the browser location, and how do I switch the port?

You don't need baseURL in nuxt.config.js.
Create a plugins/axios.js file first (look here) and write it like this:
export default function ({ $axios }) {
  if (process.client) {
    const protocol = window.location.protocol
    const hostname = window.location.hostname
    const port = 8000
    const url = `${protocol}//${hostname}:${port}`
    $axios.defaults.baseURL = url
  }
}

A late contribution, but this question and its answers were helpful in getting to this more concise approach. I've tested it for localhost and for deployment to a branch URL at Netlify, though only with Chrome on Windows.
In client mode, window.location.origin contains what we need for the baseURL.
# /plugins/axios-host.js
export default function ({ $axios }) {
  if (process.client) {
    $axios.defaults.baseURL = window.location.origin
  }
}
Add the plugin to nuxt.config.js.
# /nuxt.config.js
...
plugins: [
  ...,
  "~/plugins/axios-host.js",
],
...

This question is a year and a half old now, but I wanted to answer the second part for anyone who finds it helpful, which is doing it on the server side.
I stored a reference to the server URL that I wanted to call in a cookie so that the server can determine which URL to use as well. I use cookie-universal-nuxt and just do something simple like $cookies.set('api-server', 'some-server'), then pull the cookie value with $cookies.get('api-server'), map that cookie value to a URL, and then do something like this using an Axios interceptor:
// plugins/axios.js
const axiosPlugin = ({ store, app: { $axios, $cookies } }) => {
  $axios.onRequest((config) => {
    const server = $cookies.get('api-server')
    if (server && server === 'some-server') {
      config.baseURL = 'https://some-server.com'
    }
    return config
  })
}

export default axiosPlugin
Of course you could also store the URL in the cookie itself, but it's probably best to have a whitelist of allowed URLs.
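For example, the whitelist could be a simple map from allowed cookie values to URLs, used inside the same interceptor. This is only a sketch; the entries below are placeholders, not values from the original answer.
// Sketch only: map allowed cookie values to URLs instead of trusting a URL
// stored directly in the cookie. The entries here are placeholders.
const API_SERVERS = {
  'some-server': 'https://some-server.com',
  'other-server': 'https://other-server.example.com',
}

$axios.onRequest((config) => {
  const server = $cookies.get('api-server')
  if (server && API_SERVERS[server]) {
    config.baseURL = API_SERVERS[server]
  }
  return config
})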
Don't forget to enable the plugin as well.
// nuxt.config.js
plugins: [
  '~/plugins/axios',
],
This covers both the client side and the server side, since the cookie is "universal".
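Where the cookie gets set is up to you. As one hedged sketch (the plugin name and 'some-server' value are placeholders, not part of the original answer), it could be written in a Nuxt plugin before the first API call, since cookie-universal-nuxt exposes $cookies on both client and server:
// Sketch only: record which backend this session should target, e.g. in a plugin.
export default function ({ app }) {
  app.$cookies.set('api-server', 'some-server', { path: '/' })
}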

Related

Dynamically proxying of several pages in NuxtJS

In my NuxtJS application I have a folder with HTML pages that can be added or deleted at any time from outside (/static/pages/page1.html, /static/pages/page2.html, ...), and I have a mapping to real URIs for these pages:
{ '/foo': 'page1.html', '/bar': 'page2.html', ... }
I know I can use #nuxtjs/proxy, but it requires rebuilding the app every time the mapping changes. I also know I can use nginx rewrites for this, but changing its config every time is painful too.
I also tried using a 'pages/_.vue' file, reading the .html in the component and placing its content into the page with v-html, but the files contain full HTML pages (with scripts), and Nuxt throws an error in this case, because v-html doesn't allow running JS (or maybe for another reason I can't figure out).
How can I make a dynamic proxy for this in NuxtJS?
For anyone looking for an answer to the same question:
I solved this by creating a simple server middleware.
In /pages_proxy/index.js:
const path = require('path');
const express = require('express');
const { Router } = require('express');

const app = express();
const router = Router();

router.get('*', async (req, res, next) => {
  const pages = { '/foo/': 'page1.html', '/bar/': 'page2.html', ... };
  const page = pages[req.path];
  if (page) {
    res.sendFile(path.join(__dirname, '../static/pages', page));
  } else {
    next();
  }
});

app.use(router);

module.exports = app;
In nuxt.config.js:
serverMiddleware: {
  '/': '~/pages_proxy'
},

Svelte Proxy with rollup?

I'm trying to proxy requests from a Svelte app to a different port, where my backend API runs. I want to use a Rollup proxy in the dev environment.
I read about the alternative of using a webpack proxy here, but I want to give a Rollup proxy a try.
This is not an issue in production.
As suggested, I tried configuring rollup-plugin-dev. However, whenever I make a request to /weatherforecast I still get a CORS error. Below is my configuration and the call:
import dev from 'rollup-plugin-dev'
// other code

export default {
  input: 'src/main.js',
  output: {
    sourcemap: true,
    format: 'iife',
    name: 'app',
    file: 'public/build/bundle.js'
  },
  plugins: [
    dev({
      proxy: [{ from: '/weatherforecast', to: 'https://localhost:7262' }]
    }),
    // other code
  ]
};
and App.svelte looks like this:
<script>
  import { onMount } from "svelte";

  const endpoint = "/weatherforecast";

  onMount(async function () {
    try {
      const response = await fetch(endpoint);
      const data = await response.json();
      console.log(data);
    } catch (error) {
      console.log(error);
    }
  });
</script>
Any help in solving this issue is appreciated.
What's happening is that the proxy is passing the CORS headers through untouched, so with respect to CORS you're basically interacting with the API as though the proxy wasn't even there.
I'll note that you can get around this in dev, but keep in mind this problem will come up in production too, so you may just need to rethink how you're getting this data.
For development though, you can use something like cors-anywhere. This is a middleware that you can run through your dev server that can rewrite the headers for you.
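A minimal local setup could look something like the sketch below; the port (8080) and the empty origin whitelist are assumptions, not something from the original answer.
// Sketch: run cors-anywhere locally for development only.
const corsAnywhere = require('cors-anywhere')

corsAnywhere.createServer({
  originWhitelist: [],        // allow requests from any origin while developing
  removeHeaders: ['cookie'],  // don't leak cookies through the dev proxy
}).listen(8080, 'localhost', () => {
  console.log('CORS Anywhere listening on http://localhost:8080')
})
The app would then fetch http://localhost:8080/https://localhost:7262/weatherforecast instead of calling the API directly.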
To configure the Rollup proxy in the dev environment, remove the call to the serve method, call the dev method instead, and move the proxy entries inside the dev call:
import dev from 'rollup-plugin-dev'
// other code

export default {
  input: 'src/main.js',
  output: {
    // other code
  },
  plugins: [
    // other code
    commonjs(),
    // enable the rollup-plugin-dev proxy
    // call `npm run start` to start the server
    // still supports hot reloading
    !production && dev({
      dirs: ['public'],
      spa: 'public/index.html',
      port: 5000,
      proxy: [
        { from: '/weatherforecast', to: 'https://localhost:7262/weatherforecast' },
      ],
    }),
    // line below is no longer required
    // !production && serve(),
    // other code
  ]
};

NextJS - how to configure proxy to log api requests and responses?

I am having an issue with the Cloudinary Node SDK where the Admin Resources endpoint is occasionally throwing a 302 error. Their support suggested that I proxy the request so that I can log the response between my api and their SDK and ultimately get a better idea of what might be causing the problem (in the end they're hoping to see the Location headers that are in the Response).
One of their suggestions was to use Charles Proxy, however I'm very new to how this works and am unable to figure it out. I've read through the Charles docs and spent a full day googling, but I can't find any info related to proxying between the NextJS API and Cloudinary SDK specifically.
How do I go about setting up Charles Proxy so that I can see the request and response in this way? Is there another way that I don't know of that would work instead? Since I'm using the newest version of NextJS v12, could I use the new _middleware option instead? In a later suggestion, their Support made this comment too:
if you can't proxy requests to localhost, you may be able to use a local DNS server or a local override so you can access your local IP using a different hostname (e.g. using /etc/hosts on a unix environment, or C:\Windows\System32\Drivers\etc\hosts on a windows PC) and have that proxied - that said, there's probably an easier way using a Node project or adjusting the settings of the Node server.
but I have no idea where to begin with this either.
Here is the api code I have:
pages/api/auth/images.ts
import type { NextApiRequest, NextApiResponse } from 'next';
import cloudinary from 'cloudinary';

require('dotenv').config();

export default async function handler(_: NextApiRequest, res: NextApiResponse) {
  cloudinary.v2.config({
    cloud_name: process.env.CLOUDINARY_CLOUD_NAME,
    api_key: process.env.CLOUDINARY_API_KEY,
    api_secret: process.env.CLOUDINARY_API_SECRET,
    secure: true,
  });
  try {
    // fetch cloudinary auth images
    const response = await cloudinary.v2.api.resources({
      type: 'upload',
      prefix: 'my_app/auth_pages',
      max_results: 20,
    });
    // fetch random image
    const randImage =
      response.resources[~~(response?.resources?.length * Math.random())];
    // return image
    return res.status(200).json({ image: randImage });
  } catch (error) {
    console.dir({ error }, { colors: true, depth: null });
    return res.status(500).json({ error });
  }
}
Note: I'm on a Mac.
Try the following:
export default async function handler(_: NextApiRequest, res: NextApiResponse) {
  cloudinary.v2.config({
    cloud_name: process.env.CLOUDINARY_CLOUD_NAME,
    api_key: process.env.CLOUDINARY_API_KEY,
    api_secret: process.env.CLOUDINARY_API_SECRET,
    api_proxy: 'http://<local_ip>:<charles_port>', // change accordingly
    secure: true,
  });
  // ...rest of the handler unchanged
}
To get the port number, in Charles Proxy go to Proxy > Proxy Settings.
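If you also need the local IP to put in place of <local_ip>, a small Node one-off like the sketch below (illustrative, not part of the original answer) prints the machine's non-internal IPv4 addresses:
// Sketch: list this machine's non-internal IPv4 addresses.
const os = require('os')

for (const [name, addrs] of Object.entries(os.networkInterfaces())) {
  for (const addr of addrs) {
    if (addr.family === 'IPv4' && !addr.internal) {
      console.log(`${name}: ${addr.address}`)
    }
  }
}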

Using Express Router to proxy non-CORS content doubles request path

There must be something obvious I am missing. I have an external record like https://udspace.udel.edu/rest/handle/19716/28599 that is not set up for CORS, but I would like to access it in my application. In the past, I have set up a server to simply GET the resource and redeliver it for my application. However, I read about the Node http-proxy-middleware package and thought I'd try a more direct proxy.
First, I could not use a fixed target in the createProxyMiddleware() constructor, because I wanted to send in the hostname and path of the desired resource, for example http://example.com/https/udspace.udel.edu/rest/handle/19716/28599 to get the resource above.
Relevant Code (index.js)
const express = require('express')
const { createProxyMiddleware } = require('http-proxy-middleware')

const app = express()

app.get('/info', (req, res, next) => {
  res.send('Proxy is up.');
})

// Proxy endpoint
app.use('/https', createProxyMiddleware({
  target: "incoming",
  router: req => {
    const protocol = 'https'
    const hostname = req.path.split("/https/")[1].split("/")[0]
    const path = req.path.split(hostname)[1]
    console.log(`returning: ${protocol}, ${hostname}, ${path}`)
    return `${protocol}://${hostname}/${path}`
  },
  changeOrigin: true
}))

app.listen(PORT, HOST, () => {
  console.log(`Starting Proxy at ${HOST}:${PORT}`);
})
I was getting a 404 from the DSpace server with no other information, so I knew the request was going through but being transformed incorrectly. I tried again with an item in an S3 bucket, since AWS gives better errors, and saw that my path was being duplicated:
404 Not Found
Code: NoSuchKey
Message: The specified key does not exist.
Key: api/cookbook/recipe/0001-mvm-image/manifest.json/https/iiif.io/api/cookbook/recipe/0001-mvm-image/manifest.json
What dumb thing am I doing wrong? Is this not what this proxy is for, and do I need to do something else?
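One possible explanation, sketched below as an assumption rather than a confirmed answer: http-proxy-middleware still appends the incoming request path to whatever target the router returns, so returning protocol, host, and path from router makes the path appear twice, which matches the duplicated key in the error above. Returning only the origin from router and stripping the /https/<hostname> prefix with pathRewrite would avoid that:
// Hedged sketch, not a verified fix: return only the origin from `router`
// and strip the `/https/<hostname>` prefix with `pathRewrite`, so the
// proxied path is not appended twice.
app.use('/https', createProxyMiddleware({
  target: 'https://example.org', // placeholder; overridden per request by `router`
  router: (req) => {
    const hostname = req.originalUrl.split('/https/')[1].split('/')[0]
    return `https://${hostname}` // origin only, no path
  },
  pathRewrite: (path, req) => {
    // '/https/udspace.udel.edu/rest/...' -> '/rest/...'
    const hostname = path.split('/https/')[1].split('/')[0]
    return path.slice(path.indexOf(hostname) + hostname.length)
  },
  changeOrigin: true
}))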

Nuxt Apollo with dynamic headers for a session based authentication

Apollo is not storing the header from the query dynamically.
pages/index.vue
methods: {
  fetchCars() {
    const token = Cookies.get('XSRF-TOKEN')
    console.log(token) // 🟢 Token is shown in console
    this.$apollo.query({
      query: gql`
        query {
          cars {
            uuid
            name
          }
        }
      `,
      headers: {
        'X-XSRF-TOKEN': token, // ⭕ Fetch without header
      },
    })
  },
},
Is there a way to set the header value fresh for every Apollo request?
I have a separate frontend and backend. For the frontend I am using Nuxt.js with Apollo. I want session-based communication with my server, so I need to send the CSRF token with every request.
Now the problem: on the first load of the page there is no cookie set in the browser. I do a GET request on every initialization of my Nuxt application:
plugins/csrf.js
fetch('http://127.0.0.1:8000/api/csrf-cookie', {
  credentials: 'include',
})
Now I have a valid cookie set on my side and want to communicate with the GraphQL server, but my header is not set dynamically in the query. Does anyone know how I can solve this?
My Laravel backend now throws a 419 Token Mismatch exception because I did not send a CSRF token with my request.
Link to the repository: https://github.com/SuddenlyRust/session-based-auth
[SOLVED] Working solution: https://github.com/SuddenlyRust/session-based-auth/commit/de8fb9c18b00e58655f154f8d0c95a677d9b685b Thanks to the help of kofh in the Nuxt Apollo discord channel 🎉
In order to accomplish this, we need to access the code that gets run every time a fetch happens. This code lives inside your Apollo client's HttpLink. While the #nuxtjs/apollo module gives us many options, we can't quite configure this at such a high level.
Step 1: Creating a client plugin
As noted in the setup section of the Apollo module's docs, we can supply a path to a plugin that will define a clientConfig:
// nuxt.config.js
{
  apollo: {
    clientConfigs: {
      default: '~/plugins/apollo-client.js'
    }
  }
}
This plugin should export a function which receives the nuxt context. It should return the configuration to be passed to the vue-cli-plugin-apollo's createApolloClient utility. You don't need to worry about that file, but it is how #nuxtjs/apollo creates the client internally.
Step 2: Creating the custom httpLink
In createApolloClient's options, we see we can disable defaultHttpLink and instead supply our own link. link needs to be the output of Apollo's official createHttpLink utility, docs for which can be found here. The option we're most interested in is the fetch option which, as the docs state, is
a fetch compatible API for making a request
This boils down to meaning a function that takes uri and options parameters and returns a Promise that represents the network interaction.
Step 3: Creating the custom fetch method
As stated above, we need a function that takes uri and options and returns a promise. This function will be a simple passthrough to the standard fetch method (you may need to add isomorphic-fetch to your dependencies and import it here depending on your setup).
We'll extract your cookie the same as you did in your question, and then set it as a header. The fetch function should look like this:
(uri, options) => {
  const token = Cookies.get('XSRF-TOKEN')
  options.headers['X-XSRF-TOKEN'] = token
  return fetch(uri, options)
}
Putting it all together
Ultimately, your ~/plugins/apollo-client.js file should look something like this:
import { createHttpLink } from 'apollo-link-http'
import fetch from 'isomorphic-fetch'
import Cookies from 'js-cookie' // assumed: the same cookie helper used in the question

export default function (context) {
  return {
    defaultHttpLink: false,
    link: createHttpLink({
      uri: '/graphql',
      credentials: 'include',
      fetch: (uri, options) => {
        const token = Cookies.get('XSRF-TOKEN')
        options.headers['X-XSRF-TOKEN'] = token
        return fetch(uri, options)
      }
    })
  }
}
