How to configure a server (hosting) for a Laravel backend and Vue frontend on the same host - laravel

I have a Laravel backend which only provides an API, and a Vue frontend which connects to that Laravel backend API. It works fine locally.
But how do I configure them on a single live server (both on the same server, and with a single domain)?
The frontend is an SPA. I created it using the "vue init webpack-simple my-project" command.

You could use a different port for your API.
By doing this you can run both applications on the same server and access them by specifying the port in the URL.
If you don't want to use ports in the URL itself, you can also use nginx (or Apache, I suppose) as a reverse proxy to give a 'path' to that port (which would also be cleaner).
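For example, a minimal nginx server block along those lines might look like this (the domain, the path to the built Vue files, and the Laravel backend port 8000 are placeholders, assuming the API routes live under /api):

server {
    listen 80;
    server_name example.com;

    # serve the built Vue SPA
    root /var/www/my-project/dist;
    index index.html;

    # history-mode routes fall back to index.html
    location / {
        try_files $uri $uri/ /index.html;
    }

    # forward API calls to the Laravel backend listening on another port
    location /api/ {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}

With this, both apps share one domain and the SPA can call relative URLs like /api/users instead of hard-coding a port.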

Related

How to connect a Laravel Sail instance with an SSH tunnel?

I have a Laravel app which needs to connect to a secure external API with very strict access requirements. There is a handler hosted on AWS which has a bunch of signed certificates etc. The only way to connect to that API is via that specific server due to those requirements.
Now, to test things on my local machine, I do the following:
1. SSH to the server using the -D flag to set up a SOCKS proxy.
2. Use this socks-to-http package to convert the proxy.
3. Set up Postman's proxy settings to use that HTTP proxy.
That all works fine and I can complete the requests as expected.
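Roughly, the commands look like this (host names, ports, and the bridge tool are placeholders for my actual values):

# open a SOCKS proxy on local port 1080 through the AWS handler box
ssh -D 1080 -N user@handler.example.com

# expose it as an HTTP proxy (e.g. on 127.0.0.1:8118) with the socks-to-http bridge,
# point Postman at that proxy, or test it directly:
curl --proxy http://127.0.0.1:8118 https://secure-api.example.com/health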
However, I'd like to be able to use the proxy in my local Laravel environment too, for which I use Sail.
The problem is that I'm unsure how to get the container to interact with the proxy. Using the method above on my local machine, I can cURL the required endpoint just fine, but if I try to do it from inside the container, it refuses to connect.
Any help would be appreciated!

Fetch data from getServerSideProps of a Next.js app when another API server is also running on localhost

According to the Next.js documentation:
You should not use fetch() to call an API route in getServerSideProps. Instead, directly import the logic used inside your API route. You may need to slightly refactor your code for this approach.
Fetching from an external API is fine!
So we cannot use the Next.js built-in API routes in getStaticProps or getServerSideProps. But when I use another API service, based on the Laravel framework, as the backend server and fetch from it with Axios inside getServerSideProps, I get an Error: connect ECONNREFUSED 127.0.0.1:8080 error.
It should also be noted that everything is fine if the API server is hosted outside our development machine. In other words, the error only appears in the development environment, when both the Laravel backend server and the Next.js frontend server are on localhost.
Could you help me find a solution to this problem?
When using localhost or 127.0.0.1 inside a docker container, that points to that docker container only, not the host computer.
There are two pretty easy solutions.
1. Create a docker network, add both containers to it, and use the container name instead of the IP (https://www.tutorialworks.com/container-networking/)
2. Use host networking for this container: https://docs.docker.com/network/host/
Edit: Added a link for a tutorial on how to create and use docker networks
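A rough sketch of the first option with the docker CLI (the network, container, and image names, and the backend's internal port 8080, are placeholders):

# create a user-defined bridge network and attach both containers to it
docker network create devnet
docker run -d --name laravel-api --network devnet my-laravel-image
docker run -d --name nextjs-app --network devnet -p 3000:3000 my-nextjs-image

# inside the Next.js container, reach the backend by container name instead of 127.0.0.1:
#   http://laravel-api:8080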
So, as @tperamaki's answer already mentions: "When using localhost or 127.0.0.1 inside a docker container, that points to that docker container only, not the host computer."
You can use the IP of your machine on your local network, for example 192.168.0.10:8080 instead of 127.0.0.1:8080.
But you can also connect to the special DNS name host.docker.internal, which resolves to the internal IP address used by the host.
In your case, just add the port where the other container is listening:
http://host.docker.internal:8080
In this section of the documentation, Networking features in Docker Desktop for Mac, they explain how to connect from a container to a service on the host. Note that it mentions a Mac, but I tried it on a Linux distro and it also works (and this other answer mentions that it works for Windows too).
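If host.docker.internal does not resolve in your setup (with plain Docker Engine on Linux it is not defined by default), you can map it to the host gateway yourself. A docker-compose sketch, where the service name and ports are placeholders:

services:
  nextjs:
    build: .
    ports:
      - "3000:3000"
    extra_hosts:
      - "host.docker.internal:host-gateway"   # host-gateway requires Docker 20.10+

Then getServerSideProps can fetch from http://host.docker.internal:8080 instead of 127.0.0.1:8080.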

Spring App on GCP - Cloud Run - HTTPS only - This combination of host and port requires TLS

My Spring app uses Let's Encrypt and is HTTPS only. I did not add an HTTP-to-HTTPS redirect, as it worked for me in Postman with the https:// format.
When I deployed to Cloud Run, specified the custom port (the port configured in Spring),
and tested using the URL from the dashboard
https://..blah..run.app
I am getting this error/message:
Bad Request
This combination of host and port requires TLS.
What configuration is required on Cloud Run to resolve this?
The URL as I see it on the service details page starts with https://...
EDIT:
If Cloud Run does not need me to take care of SSL, I can remove the application properties entries:
server.ssl.key-store-type=PKCS12
server.ssl.key-store=classpath:key/keystore.p12
server.ssl.key-store-password=${lets.secret}
server.ssl.key-alias=someCertAlias
server.ssl.enabled=true
So can I get an answer on whether to remove SSL from Spring?
If Cloud Run always uses HTTP, all my calls go through the redirectConnector, which seems pointless.
The Cloud Run service listens on HTTP and HTTPS. Your application running in the container must listen on a port configured for HTTP only.
FYI: For a public-facing web server, you should almost always enable HTTP. Otherwise, when a user enters www.example.com in the browser, the user will receive a connection error. This is not always the case (for example, with .dev gTLDs), but it is good practice. When a user connects to Cloud Run with the HTTP protocol, Cloud Run will redirect the user to HTTPS and connect to your application using the HTTP protocol.
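In other words, you can drop the app's own TLS and serve plain HTTP; a minimal application.properties sketch (assuming the server.ssl.key-store* entries are removed entirely):

# Cloud Run terminates TLS and injects the PORT environment variable (defaults to 8080)
server.port=${PORT:8080}
server.ssl.enabled=false

Cloud Run still serves the public https://...run.app URL itself and forwards requests to the container over plain HTTP.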

Port forward requests from 80 to respective ports

I have many Spring Boot jars running on different ports, say 9087-9090. I have a domain, say
mydomain.com.
I can access mydomain.com:9087/ and use that application, and mydomain.com:9088/ to use another application, but how can I use them with just mydomain.com and still map them to the desired ports? What is the technical term for this?
I use DigitalOcean hosting and have an Ubuntu 14.04 x64 box. I'm running Java 7 on it.
You need a reverse proxy (a.k.a. front-end load balancer) with URL rewriting. I'm not sure what your hosting solution offers or permits, but you could try nginx or Apache httpd if you want something running locally. There are also service providers you might be able to use outside your host.
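For instance, a minimal nginx sketch of that idea (the path prefixes are placeholders; you would pick one per application):

server {
    listen 80;
    server_name mydomain.com;

    location /app1/ {
        proxy_pass http://127.0.0.1:9087/;   # trailing slash strips the /app1 prefix
        proxy_set_header Host $host;
    }

    location /app2/ {
        proxy_pass http://127.0.0.1:9088/;
        proxy_set_header Host $host;
    }
}

mydomain.com/app1/ then reaches the jar on port 9087 without the port appearing in the URL.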

polygraph for https via proxy server

Can anyone help me set up Web Polygraph for testing an HTTPS server via a proxy server in the middle?
linux machine: 192.168.21.7
proxy server: 192.168.21.9
https server: 192.168.21.11
This link contains the needed information:
http://www.web-polygraph.org/docs/userman/simple.html
Basically, Polygraph is bundled with a couple of files which are used for testing.
The manual I linked gives an example that uses polysrv, but on different distributions the tools will probably have different names (on Ubuntu they are polygraph-server and polygraph-client).
You need to set the listening service IP and port and the outgoing "robot" IP, and then start it from the command line.
For the HTTPS setup, we configure our PG file on both the server and the client with the SslWrap module.
Details can be found at http://www.web-polygraph.org/docs/reference/models/ssl.html
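As a rough sketch, once the workload (.pg) file carries those settings, both sides are started against the same config (the file name is a placeholder; binary names as above on Ubuntu; exact options may vary by version, so check the tools' help output):

# on the server machine (192.168.21.11)
polygraph-server --config workload.pg

# on the client machine (192.168.21.7); the proxy at 192.168.21.9 is configured
# in the robot settings inside the same .pg file
polygraph-client --config workload.pg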
