how to deploy python3 django2 website on aws ec2 with mysql - amazon-ec2

I have never hosted a website before; maybe that's why this task became so tough for me. I searched various guides for deployment but wasn't able to host my website.
I used Python 3.6.4 and Django 2.0.2 with a MySQL database for my website. It would be a great help if I could get deployment steps from scratch for my requirements.
Thanks in advance!

Below are the basic steps to host your Django website on any Linux-based server.
1) Create a requirements.txt file that includes all your pip packages.
On your local environment, just run pip freeze (or pip freeze > requirements.txt). It will show you something like the list below. Include those packages in your file.
Django==1.11.15
pkg-resources==0.0.0
pytz==2018.5
2) Create a virtual env on your Amazon EC2 instance. You can follow the steps given on the website below.
https://docs.python-guide.org/dev/virtualenvs/
3) Install your local packages into this virtual env (pip install -r requirements.txt).
4) If you have MySQL as the backend, you can install it with the command below (libmysqlclient-dev is needed to build the Python MySQL driver):
sudo apt-get install mysql-server libmysqlclient-dev
Or you can use RDS (Amazon Relational Database Service)
5) Check whether Django is able to connect to MySQL using the command below:
python manage.py check
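For step 5 to succeed, Django has to be pointed at MySQL first. A minimal sketch of the DATABASES setting, assuming the mysqlclient driver; every name and credential below is a placeholder, not a real value:

```python
# settings.py (sketch) -- all names below are placeholders, not real credentials
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',  # needs the mysqlclient pip package
        'NAME': 'mydb',        # placeholder database name
        'USER': 'myuser',      # placeholder user
        'PASSWORD': 'secret',  # placeholder password
        'HOST': 'localhost',   # or your RDS endpoint
        'PORT': '3306',
    }
}
```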
6) If the above command works without errors, you need to install two things:
1) an application server
2) a web server
7) You can use any application server, such as uWSGI or Gunicorn:
https://uwsgi-docs.readthedocs.io/en/latest/
https://gunicorn.org/
8) The web server will be nginx:
https://nginx.org/en/
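Steps 7 and 8 fit together like this: the application server runs your Django WSGI app on a local port, and nginx proxies public traffic to it. As a sketch, assuming Gunicorn, a project called myproject, and port 8000 (all placeholder names), you would start Gunicorn inside your virtualenv with `gunicorn myproject.wsgi:application --bind 127.0.0.1:8000`, then point nginx at it:

```nginx
# /etc/nginx/sites-available/myproject (sketch; server_name and port are placeholders)
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://127.0.0.1:8000;  # where Gunicorn is listening
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Enable the site with a symlink into sites-enabled and reload nginx.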
9) For your static files you will need a bucket (e.g. Amazon S3). You need to create the bucket and host your static files there.
You can find help online to achieve the steps above.
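For step 9, a common approach is the django-storages package with an S3 bucket. A sketch of the relevant settings, assuming django-storages and boto3 are installed, 'storages' is added to INSTALLED_APPS, and the bucket name and region are placeholders:

```python
# settings.py (sketch) -- assumes: pip install django-storages boto3,
# plus 'storages' in INSTALLED_APPS; bucket name and region are placeholders
AWS_STORAGE_BUCKET_NAME = 'my-bucket'
AWS_S3_REGION_NAME = 'us-east-1'
STATICFILES_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
STATIC_URL = 'https://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
```

After this, python manage.py collectstatic uploads your static files to the bucket.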

Related

Deploy with Laravel?

First Laravel project.
I want to deploy my Laravel project to a hosting provider. I have FTP access to it, but I don't want to tinker with FileZilla every time I change something. Plus, my development and 'finished' projects have some differences (for example Debugbar and different database credentials). What's the best method to deploy a Laravel project?
It can be done in many ways, so I am going to give you a head start.
Warning: this might not be everything you need to know. I recommend you learn more about deployment, SSH, version control, Apache vhosts, etc.; this is just a head start.
Here is how I do it on my Ubuntu server with Apache, PHP and MySQL.
I use Git for version control, and Bitbucket and GitHub for managing repositories.
1 - make my project a Git repository.
2 - push the repo to Bitbucket.
3 - connect to the remote server through SSH and set up the Apache vhosts, databases, etc.:
create a vhost file /etc/apache2/sites-available/somesite.com.conf
add the entry to the /etc/hosts file
4 - pull the repo from Bitbucket to the remote server, then create the production .env file and bring in the required changes
5 - do a composer install
6 - run php artisan key:generate and php artisan migrate
7 - turn on the site
run sudo a2ensite somesite.com.conf
run sudo service apache2 reload
now the site is up and ready to go.
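The vhost file from step 3 might look roughly like this for a Laravel app (a sketch: somesite.com and the paths are placeholders, and Laravel's public/ directory must be the document root):

```apache
# /etc/apache2/sites-available/somesite.com.conf (sketch)
<VirtualHost *:80>
    ServerName somesite.com
    DocumentRoot /var/www/somesite/public

    <Directory /var/www/somesite/public>
        AllowOverride All      # let Laravel's .htaccess handle routing
        Require all granted
    </Directory>
</VirtualHost>
```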

Deployment in Amazon VPC with custom gems hosted inside company's network

I have a very interesting problem. The following is my current workflow for deployment to Amazon EC2 in Classic mode:
The deploy host is inside my company's network.
The deploy target is an EC2 machine in AWS.
Custom Ruby gems live in the company's Git account (hence I cannot install gems from outside my company's network).
To overcome the problem mentioned in point #3, I have used reverse tunnelling between the deploy host and the deploy target.
I am using Capistrano for deployment.
Now the problem arises because we decided to move from Amazon Classic to Amazon VPC, with the deploy target having only a private IP address. Here is the workflow I thought of for deploying code to VPC instances:
Create a deploy host in the Amazon VPC and attach a public DNS name to it so that I can access it from my main deploy host (which is inside my company's network).
Deploy the code by running the deployment scripts from the AWS deploy host.
The problem is that I am not able to find a way to install gems which are hosted inside my company's Git account. Can you guys help me with this problem?
Prior to deployment, you can set up Git mirrors of your production repositories by pushing to bare Git repositories on your AWS deploy host.
Then that AWS deploy host also has access to your VPC so you can do the deployment from there.
Hope it helps.
Download the gems first and then pass them to the EC2 instance in the VPC using scp:
scp -r -i key gems-folder ubuntu@ip-address:/ruby-app
Then run gem install gem-name from that folder; it will install the gem from within the folder, matching by name.
Alternatively, run bundle package: this will download all the gems into the vendor/cache folder. Then move these files to the EC2 instance.
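End to end, the bundle package route from the answer above might look like the sketch below (the key file, IP address and directory names are all placeholders):

```shell
# on the deploy host, inside the app directory (Gemfile present)
bundle package                                   # caches every gem into vendor/cache

# copy the app, including vendor/cache, to the VPC instance
scp -r -i deploy-key.pem . ubuntu@10.0.0.5:/home/ubuntu/ruby-app

# on the EC2 instance: install strictly from the local cache, no network needed
cd /home/ubuntu/ruby-app && bundle install --local
```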

Copy ec2 files to local

I would like to create a local copy of a live Magento website, so that I can test and develop on my local version.
I did the following so far:
installed XAMPP for Mac OS X 1.7.3;
created a blank database;
installed MySQL Workbench 6.0 for Mac;
tried to connect to AWS EC2 and RDS instances via SSH following this scheme http://thoughtsandideas.wordpress.com/2012/05/17/monitoring-and-managing-amazon-rds-databases-using-mysql-workbench/;
but I can't connect (it says authentication failed, but the credentials are correct).
Maybe there's a simpler way to create a copy of my files on EC2 and RDS and run them locally? Or am I just missing something?
Thank you
These are the steps that you have to follow to create a development site on your local PC:
Zip all the magento files
zip -r magento.zip /var/www/
Make a dump of the RDS database
mysqldump -u username -p [database_name] -h rds-endpoint > dumpfilename.sql
Download the files to your local PC
Use SFTP to download all the files, and check the security groups to make sure the SSH port is open
Import the dump into the database that you created locally
Before restoring the DB, please check this: http://www.magentocommerce.com/wiki/1_-_installation_and_configuration/restoring_a_backup_of_a_magento_database
mysql -u username -p [database_name] < dumpfilename.sql
Unzip the files on your PC and move them to your local web server
Change the site URL: http://www.magentocommerce.com/wiki/1_-_installation_and_configuration/update_site_url_in_core_config_data or http://www.magentocommerce.com/wiki/recover/restore_base_url_settings
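Changing the site URL in the step above comes down to two rows in the core_config_data table. A sketch of the SQL, assuming http://localhost/ is where your local XAMPP serves the site:

```sql
-- point the store at your local web server, then clear the cache afterwards
UPDATE core_config_data
   SET value = 'http://localhost/'
 WHERE path IN ('web/unsecure/base_url', 'web/secure/base_url');
```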
Update the Magento local.xml with your local database credentials
Clean the Magento cache
BUT, my recommendation is to create a development site on another EC2 instance in Amazon AWS.

How to setup Pydevd remote debugging with Heroku

According to this answer I am required to copy the pycharm-debug.egg file to my server. How do I accomplish this with a Heroku app so that I can remotely debug it using PyCharm?
Heroku doesn't expose the file system it uses for running web dynos to users. This means you can't copy the file to the server via SSH.
So you can do this in one of two ways:
The best possible way is to add this egg file to your requirements, so that during deployment it gets installed into the environment and hence automatically added to the Python path. But this would require the package to be pip-indexed.
Or, commit this file into your code base, so that when you deploy, the file reaches the server.
Also, in the settings file of your project (if using Django), add this file to the Python path:
import sys
sys.path.append('relative/path/to/file')

Deploying Django to Heroku using a Windows machine (Production server NOT development server)

I use a Windows machine and have a Django project that I have successfully deployed to Heroku, albeit using the development server. To use a production server, Heroku seems to require Gunicorn, which does not run on Windows.
This is not good for testing locally before deploying. Does anyone know of any way to get around this? Perhaps some way to use a different server on Heroku?
I found a solution that may help when deploying to Heroku from a Windows machine. Here is what I do:
Use the development server locally with:
python manage.py runserver
Install Gunicorn and add it to INSTALLED_APPS in settings.py.
Add a process file in the root directory that tells Heroku to use the Gunicorn server. This is a file called 'Procfile' with the following line:
web: python kalail/manage.py run_gunicorn --bind=0.0.0.0:$PORT
This way you test using the development server, while Heroku uses the Gunicorn server. Make sure you set up serving of static files (CSS/JS/images) after this, because only the development server serves static files automatically; Gunicorn will need to be configured to do so.
You can run the development server locally quite easily:
> python manage.py runserver
All you need to do is specify the path to the wsgi module from the root directory in your Procfile:
web: gunicorn hellodjango.wsgi
