Angular 5: Create an Application Source Bundle for AWS EC2

In AWS Elastic Beanstalk, there is a wizard flow for deploying node.js apps. When I get to the step for "upload your own" application source, it describes their 3 requirements only in generic terms: a zip file, less than 500 MB, no parent folder.
But they stop there. No specifics.
I dropped out to bash and ran...
ng build --prod
...and now have a dist folder. So... what do I include in my zip file, and at what folder level? I have tried just /dist, and also /myapp/dist (which included all the other loose files in /myapp but no other subfolders such as /src). I have looked all over the web, but I don't see what should be a fairly simple tutorial on zipping up an Application Source Bundle for AWS EC2.
What should be included in the zip file for upload?

The cardinal sin in my question above was attempting to run my Angular 5 app in AWS using their node.js choice as my server platform. Here is what I learned (with some help from folks like Albert Haff): Angular 5 uses Node (ng serve) to simulate a web server while you code. However, even though there is a supported --prod flag, it's not meant to be used in production! It's really easy (and tempting) to select node.js as the environment when deploying your Angular 5 app via Beanstalk -- but don't do it!
1. From within your Angular 5 project folder, run ng build --prod (and consider adding --aot).
2. If you can, optionally run compression on the build output, e.g. from the project root: for i in dist/*; do brotli "$i"; done
3. From within the /dist folder, zip up ALL the contents, including subfolders (see the consolidated shell sketch after these steps).
4. Go to Beanstalk, and as you create a web server environment, select Tomcat (or any other plain old web server, but DON'T pick node.js even though it's on the list!).
5. For Application Code, select Upload Your Own and browse to the .zip file you created in step 3 above.
6. Click Create Environment, and in a few minutes your Angular 5 app will be serving on the internet.
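Steps 1-3 condense to roughly the following shell sketch ("myapp" is a placeholder project name, and the compression step assumes the brotli CLI is installed):

# build the production bundle into ./dist
cd myapp
ng build --prod --aot

# optional: pre-compress the output (only useful if the web server
# is configured to serve the .br files)
for i in dist/*; do brotli "$i"; done

# zip the CONTENTS of dist, not the dist folder itself, so the
# archive has no parent folder inside it
cd dist
zip -r ../myapp.zip .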
Now, from here you will likely need to connect up your domain name. Use Route 53 for that.

Related

How can I deploy just one file to Google App Engine?

I have already deployed my application to Google App Engine, but I have modified just one file and I want to deploy only that file.
It is not possible to upload a single file to App Engine; every deploy takes the root folder where the app.yaml file is located and uploads its directories and files.
But if you make some changes or add a single file, files that are identical are not re-uploaded.
Only new or modified files are uploaded, but the upload still creates a new version of the service.
As stated in the documentation:
"You can update your service at any time by running the gcloud app deploy command. Each time you deploy, a new version is created and traffic is automatically routed to the latest version"

Handling Laravel project on a local machine and an Nginx server

I have a Laravel project on my local machine that is currently in development. For various reasons, the application is showcased to people from an Nginx server. The problem is this: on the local machine I host the project at the web root (localhost:8000/), but on Nginx I host it in a subfolder (e.g. 10.x.x.x/webapp/). This breaks a lot of stuff, and I need to constantly change the references to my assets and scripts back and forth. For example:
Font Awesome fails to load because it looks for the js & css directories rather than the webapp/js & webapp/css directories.
In a Vue component holding a picture, the picture won't load because it looks for /img/picture.jpg instead of /webapp/img/picture.jpg.
The only way I have managed to solve this, in the case of Font Awesome, is to add mix.setResourceRoot('/webapp') in the webpack.mix.js file, and in the case of the assets to prepend /webapp at the beginning; but doing that breaks everything on localhost, since now everything points to a folder that doesn't exist on my machine.
What is a possible solution to have both running without constant reference changes? And what other possible problems could emerge?
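One hedged sketch (not from the original thread): drive the prefix from an environment variable instead of hard-coding it, so each machine builds with its own base path. MIX_ASSET_ROOT below is a made-up variable name, and webpack.mix.js would need to read it, e.g. mix.setResourceRoot(process.env.MIX_ASSET_ROOT || '/').

# assumes webpack.mix.js reads the (hypothetical) MIX_ASSET_ROOT
# variable and passes it to mix.setResourceRoot()

# local build: app is served from the web root
MIX_ASSET_ROOT=/ npm run dev

# build for the Nginx box: app is served from /webapp
MIX_ASSET_ROOT=/webapp npm run prod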

Best practice/way to deploy a Laravel + Vue SPA application to AWS

I have 2 repositories residing in Bitbucket - Backend (a Laravel app as the API and entry point) and Frontend (the main application front end - a VueJS app). My goal is to set up continuous deployment so that whenever something is pushed to the master branch (or another branch I select) in either repo, it triggers something that builds the whole app and gets it onto the AWS EC2 server.
I have considered/tried the following:
AWS CodePipeline and/or CodeDeploy. This looked like a great option since the servers are in AWS as well. However, there is no support for Bitbucket out of the box, so it would have to go Bitbucket Pipelines -> AWS Lambda -> AWS S3 -> AWS CodePipeline/CodeDeploy -> AWS EC2. This seems like a very lengthy journey, and I am not sure that's good practice at all.
Using Laravel Forge to deploy the Laravel app, with additional steps to build the VueJS app. This seemed like a very basic solution; however, the build process fails there, as it just takes a long time and crashes with no errors (whereas I can run the exact same process on my local machine or on a different server hosted elsewhere). I am not sure whether this is an issue with the way the server is provisioned, the way Forge runs the deployment script, or the server simply being too weak to handle it.
My main question is: what are the best practices for deploying an app with such components? I have read many tutorials/articles about deploying a NodeJS app or a Laravel app, but I haven't found good information about a scenario like this.
Would it be better to build the front-end app locally and version control the built JS files? Or should I create a Pipeline in Bitbucket that builds the app and then deploys it? Or is it best to version control and deploy only the source files, leaving the whole build as the last step of the deployment process, done by the server hosting the app itself? There are also some articles suggesting hosting the whole front-end app in an S3 bucket - would that be bad practice as well?
Appreciate any help and resources that would help!
From the sounds of things, you have two types of deployments you might want to run.
Laravel API: If you're using Laravel Forge already, then this is a great way to go about deploying your Laravel app; it takes care of most of the process and makes server management easy.
Vue.js App: There are a few things you can do here. I personally prefer using a provider like Vercel or Netlify, which let you deploy static sites/front ends for free or at low cost. You can write custom build steps, but they have great presets that should work out of the box.
If you really want to keep everything on AWS, then look into how to host static sites on AWS; a sketch follows below.
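A minimal sketch of the all-AWS route, assuming the AWS CLI is configured and the bucket name is a placeholder:

# build the Vue app, then sync the output to an S3 bucket that has
# static website hosting enabled (bucket name is hypothetical)
npm run build
aws s3 sync dist/ s3://my-frontend-bucket --delete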

How to get started in automated deployment

I have an app (built with Laravel) which I deployed, and it is working very well, but I have a question. When I deploy it to the server, I go through this process:
1. Minify the CSS and JS files and combine them into a single file.
2. Change some configuration (database, hostname, mail server, etc.).
3. Finally, upload my files to the server.
How can I return to my local config and unminify my JS and CSS files without doing it manually?
Is there a better way to automate this? I know the first step can be done by Gulp or any JavaScript task runner with a single command, and the second one is not a big deal, but I just want to know if there is an automated way.
Why don't you just keep a .env config file out of your version control and compress your CSS/JS using Laravel Mix as part of your deploy process?
To make it clear:
Keep your .env file in .gitignore. That way you have to set up your environment settings (database, hostname, etc.) only once per environment.
Use npm run prod to minify your CSS/JS: https://laravel.com/docs/5.4/mix
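A minimal sketch of that workflow (the .env values live on each machine, so nothing needs to be flipped back after a deploy):

# one-time, per environment: .env is git-ignored, so each machine keeps its own config
cp .env.example .env      # then fill in database, hostname, mail settings
php artisan key:generate

# local development: readable, unminified assets
npm run dev

# production deploy: minified, combined assets via Laravel Mix
npm run prod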

Phoenix file copying on Heroku

I'm trying to upload images to my Phoenix app on Heroku. I have a simple app that follows the instructions for file uploading from Phoenix's docs.
I have a simple form and the controller uses File.cp_r() to copy the file from the temp directory.
def create(conn, %{"user" => user_params}) do
  if upload = user_params["photo"] do
    # non-dynamic name, doesn't matter
    File.cp_r(upload.path, "priv/static/images/foo.jpg")
  end
  ...
end
Works just fine on my local machine. However, when I upload this to Heroku, test the form, and run heroku run find on the directory, I don't see anything.
I have noticed that this directory on Heroku has seemingly forbidding permissions:
drwx------ 2 u25619 dyno 4096 Apr 23 05:14 images
I tried slipping in a nice little File.chmod("priv/static/images", 0o777), but to no avail; that directory seems locked away from me, so I think this is a Heroku issue.
Any idea how to handle this?
EDIT: Resolved by using the ex_aws package to upload to an Amazon S3 bucket.
ex_aws dependency
partial explanation (note: you need to add poison and hackney to make this work; they are not mentioned there)
The file system on Heroku is ephemeral, and you won't have access to any files that you save on it across deploys or when you start new instances.
Also, when you run heroku run, you're not connecting to the same instance that's currently running your app; instead it launches a new one-off instance, so those uploaded files would not exist there.
A better approach is to save the uploaded files to S3 or similar where you can still access it across deploys.
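A quick way to see the ephemerality for yourself (the app name is a placeholder):

# each `heroku run` launches a fresh one-off dyno with a pristine copy of
# the slug, so files written by the running web dyno won't appear here
heroku run --app my-phoenix-app ls priv/static/images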
