Right way to run a Vue app with a Phoenix application as proxy - Heroku

Right now, I am running a Vue app inside a Phoenix app. I first created a Phoenix project and then created a Vue app named assets. To run it in the development environment, I have added the following watcher to my dev config:
watchers: [npm: ["run", "build", cd: Path.expand("../assets", __DIR__)]]
which creates a new build on every change; app.html.eex then references the output from priv/static.
For deployment, I am using the Phoenix static buildpack, which creates a production build ahead of the deploy and then runs the Phoenix app. Everything is working fine, but it feels like the wrong way, because
the overall benefits of a Vue application are not being realized, e.g. code splitting and loading a code chunk only when a page requests it. Many other webpack features we could use within a Vue app go unused, since we are just creating the build and putting it in production.
My issue is this: I have seen many tutorials that run a Vue app with the API behind a proxy, so that the main app is Vue and the Phoenix API works behind it.
Right now I have this setup deploying and working in development mode. My question is how I can achieve the opposite:
starting the Vue application so that it automatically starts the Phoenix app as well, and deploying to Heroku so that the API simply runs while the Vue app is more than just static JS and CSS files.
Update: Is it possible to make an umbrella application in which one app is Vue and one is Phoenix?

This is more of an extended comment, but it was getting really big, so I moved it to an answer.
To start two or more apps you will need something like foreman (on Heroku, I think there's a buildpack for it) or systemd, or even to start a Node.js server from your Elixir app (not sure this is doable on Heroku).
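For local development, a minimal sketch of such a foreman-style Procfile might look like this (assuming the Vue app lives in assets/ and uses Vue CLI's default serve script; on Heroku itself only the web process receives HTTP traffic, so this is for foreman / heroku local):
phoenix: mix phx.server
vue: npm --prefix assets run serve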
You can also split your components manually through the webpack config used by Phoenix; this can be a bit more involved than just serving a single js file. The reason to split into two separate services needs to be thought through, but the splitting itself is usually achieved (if I'm not mistaken, it's been a while since I used it this way) by having different entry points. For webpack to do its best at splitting assets (css & others), you'll also need to write your Vue components in a way that lets webpack understand the dependencies (at least this was the case, and there can be some complexity between webpack chunking and the Phoenix digest when using dynamic components, etc.).
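For example, a route-level dynamic import is enough for webpack to emit a separate chunk that is fetched only when the route is first visited (component name and path are hypothetical):
// router.js (Vue 2 / vue-router 3)
import Vue from 'vue'
import VueRouter from 'vue-router'

Vue.use(VueRouter)

// Lazy-loaded: webpack splits this component into its own chunk,
// downloaded only when the route is first visited
const UserProfile = () => import('./components/UserProfile.vue')

export default new VueRouter({
  routes: [{ path: '/profile', component: UserProfile }]
})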
Another option is to use an independent Nuxt app, which bundles a Vue.js app with everything you need: webpack, a server, Vuex, a sensible config and "structure", etc. You then have two distinct applications, each running its own HTTP server; you use asyncData & fetch to populate/hydrate your front-end with data from the Phoenix app, and you can use async components and all that. Then you deploy the front-end (the Nuxt app) to one Heroku instance and the Phoenix server to another instance or somewhere else.
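A minimal sketch of a Nuxt 2 page doing that (the API URL and the response shape are assumptions, and server-side fetch assumes Node 18+ or a polyfill):
<script>
// pages/posts.vue: asyncData runs before the page renders
// (server-side on the first request, client-side on navigation)
export default {
  async asyncData() {
    // hypothetical Phoenix endpoint and JSON shape
    const res = await fetch('https://my-phoenix-api.example.com/api/posts')
    const { data } = await res.json()
    return { posts: data } // merged into the component's data
  }
}
</script>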
At that point your Phoenix app is basically an API for the front-end. So the Vue app has to be built with that in mind, and you now have two applications to deploy and take care of. It helps on a lot of fronts, but it also adds complexity (authorization, cookies, etc.), hence why you need to weigh whether it's worth doing. The major benefit is that, since they're now two apps, styling and other front-end concerns can be deployed without re-deploying the back-end.
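As one concrete example of that added complexity: once the front-end lives on a different origin, cookie-based auth only works if the browser is told to send credentials cross-origin, and the Phoenix side must answer with matching CORS headers (the API URL below is hypothetical):
// Without credentials: 'include', the browser silently drops the
// session cookie on cross-origin requests
async function fetchCurrentUser() {
  const res = await fetch('https://my-phoenix-api.example.com/api/me', {
    credentials: 'include'
  })
  return res.json()
}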
Depending on the type of front-end, you can also deploy the Nuxt app as a static website to something like S3/CloudFront or any other cloud storage engine. For instance, if you have X public pages that are mostly static content, and everything involving dynamic data sits behind a login wall or similar, then that is a solution that works fine too.
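A sketch of that flow with Nuxt's static generation (assumes the default generate script and an already-configured AWS CLI; the bucket name is hypothetical):
# Build a fully static site into dist/
npm run generate
# Upload it, deleting remote files that no longer exist locally
aws s3 sync dist/ s3://my-frontend-bucket --delete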
All three approaches are valid, depending on what you need and your reason for doing it.

Related

Deploy a Laravel project on Firebase

I have a Laravel project which uses SQLite. I want to publish this application to Firebase, but I don't want to use Firebase's database, because my tables are already populated and have relationships between them. I know how to publish a static content website there, but I couldn't find any source about this.
I mean, of course there are a lot of tutorials and articles about how to deploy Laravel on Firebase using Kreait. But as I mentioned, I don't want to use the Firebase Realtime Database.
Actually, I have no idea whether it's even possible with SQLite. Is it possible to publish a Laravel application on Firebase Hosting with SQLite, or is the Firebase database a must?
I'm just new to Firebase, so I may be missing something in my understanding of its concepts.
Firebase Hosting is a host for static websites
Firebase Hosting does not support any server-side scripts such as Ruby, PHP, Python, or anything else that processes your files before output. That would require an application engine such as Google App Engine, Heroku, or similar. The hosting service is a static website hosting service.

How to test Go App Engine apps locally on Windows 10 and use app.yaml

In Google's latest docs, they say that to test Go 1.12+ apps locally, one should just use go build.
However, this doesn't take into account all the routing, etc., that App Engine would do using the app.yaml config file.
I see that dev_appserver.py is still included in the SDK, but it doesn't seem to work on Windows 10.
How does one test a Go App Engine app locally with the app.yaml, i.e. as an actual emulated App Engine app?
Thank you!
On one hand, if your application consists of just the default service, I would recommend following #cerise-limón's comment suggestion. In general, it is recommended that the routing logic of the application be handled within the code. Although I'm not a Go programmer, for single-service applications that use static_files and static_dir there shouldn't be any problems when testing the application locally. You might also deploy the new version without promoting traffic to it in order to test it, as explained here.
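A sketch of such an unpromoted deploy (the version name is arbitrary):
# Deploy a new version without routing production traffic to it
gcloud app deploy app.yaml --version test-1 --no-promote
# It is then typically reachable at https://test-1-dot-<project-id>.appspot.com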
On the other hand, if your application is distributed across multiple services and the routing is managed through the dispatch.yaml configuration file, you might follow two approaches:
Test each service locally, one by one. This could be the way to go if each service has a single responsibility/functionality that can be tested in isolation from the other services. In fact, with this kind of architecture the testing procedure would be more or less the same as for single-service applications.
Run all services locally at once and build your own routing layer, mimicking the dispatch rules (see the sketch below). This option would allow you to test applications where services need to reach one another in order to fulfill the requests made to them.
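For reference, this is the shape of the production routing that such a local routing layer would need to mimic (service names are hypothetical):
# dispatch.yaml (hypothetical service names)
dispatch:
  - url: "*/api/*"
    service: api
  - url: "*/*"
    service: default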
Another approach that is widely used is to have a separate project for development purposes, where you can just deploy the application and observe its behavior in the App Engine environment. For applications with highly coupled services this would be the easiest option, but it largely depends on your budget.

OWIN Authentication Failing - Web API

I have some Web API applications that use OWIN for authentication. Currently they are hooked up to Google and Facebook. I have them installed in multiple environments (local, dev, test, etc.). Recently, ALL of my applications in my development environment started failing. When trying to authenticate, I would get an "access_denied" response back. The URL would look like this:
https://{mydevserver}/{mywebapiapp}/#error=access_denied
The same code base works locally as well as in my test environment.
I tried using the same project (just adding redirect URIs and origins) as well as creating a new project.
I also updated my test environment to use the dev project (ID and secret).
Nothing seems to have changed on the server recently, but the problem appears to be environment-specific (because multiple applications are affected, as well as multiple providers).
Are there any logging techniques I can use to drill down to a more detailed error message? Any tips or hints on what to try next?
The fix was a bit of an odd one. I had to log into my server, open up a browser, and connect to a web page (any page). After doing so, it started working again.

Share a folder between Heroku apps

For my project, I would like to be able to set up multiple websites (apps) and, according to the requirements from a client, activate and configure modules. Those modules could be a news module, an image viewer module, and so on.
My engine consists of scripts and libraries that manage a few things such as the web app routing, the modules, and user rights for the CMS. I would like this code to be shared among all my apps to avoid useless duplication.
I would like to know the best way to do this on Heroku, since I am totally new to it and not totally sure it is feasible.
Also, am I wrong to believe that each website is an app, even if basically the only difference between them is the template and a single setup file?
Thank you
If you have no experience with git, get familiar with it.
I would suggest having one application with different branches, where each branch keeps the configuration specific to one instance.
You create an app instance for each website.
You push the specific branch to the specific app, e.g.
git push git@heroku.com:<app-name>.git <branch>:master
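For example, with one git remote per Heroku app (app and branch names are hypothetical):
# each website is its own Heroku app; push its config branch as that app's master
git remote add site-a git@heroku.com:site-a.git
git push site-a site-a-config:master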

Heroku architecture for running different applications but on the same domain

I have a unique set-up that I am trying to determine whether Heroku can accommodate. There is so much marketing around polyglot applications, but only one example I can actually find!
My application consists of:
A website written in Django
A separate Java application, which takes files uploaded by users, parses them, and stores the data in a database
A shared database accessible by both applications
Because these user-uploaded files can be enormous, I want the uploaded file to go directly to the Java application. My preferred architecture is:
The Django-generated webpage displays the upload form.
The form does an AJAX submit to the Java application
The browser starts polling the database to see if the Java application has inserted the data
Meanwhile, the Java application does its thing with the user-uploaded file and updates the database when it's done
The Django webpage AJAX-refreshes a div with the results of the user upload once the polling mechanism sees that the upload is complete (a sketch of the polling step follows this list)
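A browser-side sketch of that polling step (the status endpoint, its JSON shape, and the 2-second interval are all assumptions):
// Poll until the Java app has inserted the parsed data, then update the page
function pollUploadStatus(uploadId, onDone) {
  const timer = setInterval(async () => {
    const res = await fetch('/api/upload-status?id=' + encodeURIComponent(uploadId))
    const { done } = await res.json()
    if (done) {
      clearInterval(timer)
      onDone() // e.g. AJAX-refresh the results div
    }
  }, 2000)
}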
The big issue I can't figure out is whether I can get both the Django and Java apps running either on the same set of dynos, or on different dynos under the same domain, to avoid AJAX cross-domain issues. Does Heroku support URL-level routing? For example:
Django application available at http://www.myawesomewebsite.com
Java application available at http://www.myawesomewebsite.com/javaurl/
If this is not possible, does anyone have any ideas for work-arounds? I know I could have the user upload the file to Django and have Django send the request to Java from the server-side instead of the client side, but that's an awful lot of passing around of enormous files.
Thanks so much!
Heroku does not support routing based on the URL. Polyglot components should exist as their own subdomains and operate in a cross-domain fashion.
As a side note: have you considered uploading directly to S3, instead of uploading to your app on Heroku, which would then (presumably) upload to S3? If you're dealing with cross-domain file uploads, this is worth considering for its high level of scalability.
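A browser-side sketch of that direct-to-S3 upload, assuming the back-end exposes a (hypothetical) endpoint that returns a presigned PUT URL generated with the AWS SDK:
// PUT the file straight to S3 so it never passes through the Heroku dyno
async function uploadDirectToS3(file) {
  const res = await fetch('/api/presigned-upload-url?filename=' + encodeURIComponent(file.name))
  const { uploadUrl } = await res.json()
  await fetch(uploadUrl, { method: 'PUT', body: file })
}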
