Copy ec2 files to local - magento

I would like to create a local copy of a live Magento website, so that I can test and develop on my local version.
I did the following so far:
installed XAMPP for Mac OS X 1.7.3;
created a blank database;
installed MySQL Workbench 6.0 for Mac;
tried to connect to the AWS EC2 and RDS instances via SSH, following this guide: http://thoughtsandideas.wordpress.com/2012/05/17/monitoring-and-managing-amazon-rds-databases-using-mysql-workbench/;
but I can't connect (it says authentication failed but credentials are correct).
Maybe there's a simpler way to create a copy of my files on EC2 and RDS and run them locally? Or am I just missing something?
Thank you

These are the steps you have to follow to create a development site on your local PC:
Zip all the Magento files
zip -r magento.zip /var/www/
Make a dump of the RDS database
mysqldump -u username -p [database_name] -h [rds-endpoint] > dumpfilename.sql
Download the files to your local PC
Use sftp to download all the files, and check the security groups to make sure the SSH port is open
Import the dump into the database you created locally
Before restoring the DB, please check this: http://www.magentocommerce.com/wiki/1_-_installation_and_configuration/restoring_a_backup_of_a_magento_database
mysql -u username -p [database_name] < dumpfilename.sql
Unzip the files on your PC and move them to your local web server
Change the site url http://www.magentocommerce.com/wiki/1_-_installation_and_configuration/update_site_url_in_core_config_data or http://www.magentocommerce.com/wiki/recover/restore_base_url_settings
Update the Magento local.xml with your local database access credentials
Clean the Magento cache
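The steps above can be collected into one sketch. Every hostname, username, and database name here is a placeholder, and the `run` wrapper only prints each command rather than executing it, so treat this as a checklist, not a finished script:

```shell
#!/bin/sh
# Sketch of the copy-down workflow. All hosts, users, and names are
# placeholders; "run" only prints each command so nothing executes here.
run() { echo "+ $*"; }

EC2_HOST=user@ec2-xx-xx-xx-xx.compute.amazonaws.com  # placeholder
RDS_HOST=mydb.xxxxxxxx.us-east-1.rds.amazonaws.com   # placeholder
DB=magento_db                                        # placeholder

# 1+2. On the EC2 instance: zip the docroot and dump the RDS database
run ssh "$EC2_HOST" "zip -r /tmp/magento.zip /var/www/"
run ssh "$EC2_HOST" "mysqldump -u username -p $DB -h $RDS_HOST > /tmp/dump.sql"

# 3. Pull both files down over SFTP/SCP (security group must allow port 22)
run scp "$EC2_HOST:/tmp/magento.zip" .
run scp "$EC2_HOST:/tmp/dump.sql" .

# 4. Restore into the blank local database
run mysql -u root -p "$DB" "<" dump.sql
```

Once the commands print the way you expect, drop the `run` wrapper and execute them for real.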
BUT, my recommendation is to create the development site on another EC2 instance in Amazon AWS.

Related

Cannot open phpmyadmin after changing the user's password

I'm doing an inventory management project in Java using XAMPP and JDBC. After downloading XAMPP and changing the user's password on localhost, I cannot log back in to phpMyAdmin. I tried configuring the config.inc.php file but still cannot log back in to phpMyAdmin. I need help.
I'm using Mac and XAMPP 8.2.0

how to deploy python3 django2 website on aws ec2 with mysql

I have never hosted a website before; maybe that's why this task became so tough for me. I searched various guides for deployment but wasn't able to host my website.
I used Python 3.6.4 and Django 2.0.2 with a MySQL database for my website. It would be a great help if I could get the steps for deployment from scratch with my requirements.
Thanks in advance!
Below are the basic steps to host your Django website on any Linux-based server.
1) Create a requirements.txt file which will include all your pip packages.
In your local environment just run pip freeze. It will show you something like the output below. Include those packages in your file.
Django==1.11.15
pkg-resources==0.0.0
pytz==2018.5
2) Create a virtualenv on your Amazon EC2 instance. You can follow the steps on the website below.
https://docs.python-guide.org/dev/virtualenvs/
3) Install your local packages into this virtualenv.
4) If you have MySQL as the backend, you can install it with the command below
sudo apt-get install mysql*
Or you can use RDS (Amazon Relational Database Service)
5) Check whether your Django app can connect to MySQL using the command below
python manage.py check
6) If the above command works without errors, you need to install two things:
1) Application server
2) Web server
7) You can use any application server, such as uwsgi or gunicorn
https://uwsgi-docs.readthedocs.io/en/latest/
https://gunicorn.org/
8) The web server will be nginx
https://nginx.org/en/
9) For your static files you will need an S3 bucket. You need to create the bucket and host your static files there.
You can find help online to achieve the steps above.
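As a rough sketch of steps 7 and 8 above (the project name, port, paths, and domain are placeholders, not values from the question), gunicorn serves the WSGI app on a local port and nginx proxies requests to it:

```shell
#!/bin/sh
# Placeholders throughout: project name, port, paths, and domain are assumptions.
set -e

# Step 7: start the application server (run inside the virtualenv)
# gunicorn --bind 127.0.0.1:8000 myproject.wsgi:application

# Step 8: a minimal nginx server block that proxies to gunicorn
cat > /tmp/myproject-nginx.conf <<'EOF'
server {
    listen 80;
    server_name example.com;              # placeholder domain

    location /static/ {
        alias /home/ubuntu/myproject/static/;   # or serve from S3 (step 9)
    }

    location / {
        proxy_pass http://127.0.0.1:8000;  # gunicorn from step 7
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
EOF
echo "wrote /tmp/myproject-nginx.conf"
```

On a real server the config would go in /etc/nginx/sites-available/ (or conf.d/) instead of /tmp, followed by an nginx reload.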

How to write a binary file from AWS RDS Oracle database directory to local file system

How to write a binary file from an AWS RDS Oracle database directory to the local file system on EC2? I tried using a Perl script with UTL_FILE, but it can't read the file; I'm getting a permissions error.
In AWS RDS Oracle, you do not have access to the file system.
If you need access to the file system, then you need to use an EC2 instance and install the Oracle RDBMS yourself.
AWS has an option to integrate with S3: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/oracle-s3-integration.html
You could upload your files there and then download to your local machine. Here are steps to use it with Datapump: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Oracle.Procedural.Importing.html#Oracle.Procedural.Importing.DataPumpS3.Step1
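As a sketch of that route (the bucket name, prefix, and file name below are placeholders, and the exact procedure signature should be checked against the linked AWS docs), you run the upload task from inside the database, then fetch the object with the AWS CLI:

```shell
#!/bin/sh
# Bucket, prefix, and file names are placeholders. Verify the
# rdsadmin_s3_tasks signature against the AWS docs before running.
cat > /tmp/upload_to_s3.sql <<'EOF'
-- Run inside the RDS Oracle instance: copy DATA_PUMP_DIR to the bucket
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
         p_bucket_name    => 'my-transfer-bucket',
         p_directory_name => 'DATA_PUMP_DIR',
         p_s3_prefix      => 'exports/') AS task_id
FROM dual;
EOF

# Then, from EC2 or your local machine (needs the AWS CLI and IAM access
# to the bucket):
# aws s3 cp s3://my-transfer-bucket/exports/myfile.bin .
echo "wrote /tmp/upload_to_s3.sql"
```

The RDS instance also needs an option group with the S3_INTEGRATION option and an IAM role that can write to the bucket, as described in the first link.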

I Can't download phpmyadmin.conf from amazon ec2

I have to change some settings inside "etc/httpd/conf.d/phpMyAdmin.conf".
I can't download this file using FileZilla. I also tried the sudo nano command in PuTTY, and it returns an empty file. I don't know how to change the permissions for this file.
I spent more than an hour on this. Please guide me if you know how to resolve it.
EC2 is a computer rental service, not a web hosting service, so you won't be able to connect with FTP (FileZilla) unless you run an FTP server on your EC2 instance.
As for editing the file while you're connected through SSH (putty), you need to make sure that you're properly referencing the file you want. Try running "sudo nano /etc/httpd/conf.d/phpMyAdmin.conf". Note the leading "/" on the file path; it's important.
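Alternatively, you can pull the file down over SCP, which runs over SSH (port 22), so it works anywhere PuTTY can connect. The key, user, and hostname below are placeholders, and the commands are only printed here, not executed:

```shell
#!/bin/sh
# Placeholders: substitute your own key, user, and hostname.
KEY=~/mykey.pem
HOST=ec2-user@ec2-xx-xx-xx-xx.compute.amazonaws.com
FILE=/etc/httpd/conf.d/phpMyAdmin.conf

# Pull the file down and edit it locally...
echo "scp -i $KEY $HOST:$FILE ."
# ...then push it back via /tmp, since scp can't write to root-owned
# paths directly; sudo moves it into place.
echo "scp -i $KEY phpMyAdmin.conf $HOST:/tmp/"
echo "ssh -i $KEY $HOST 'sudo mv /tmp/phpMyAdmin.conf $FILE'"
```

The /tmp round trip is the usual workaround for files your SSH user can't write to directly.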

Keeping PHPStorm files in sync with the ones generated on the server via php artisan

I am using Laravel with PHPStorm and a custom server where I connect via SFTP. The problem is that being SFTP, it's not in sync. So every time I generate files via a php artisan command, I have to download the file(s) with PHPStorm. I know that I can get around that by using Homestead and shared folders, but this project requires a custom VPS.
I know that no SFTP "drive" currently works reliably on Windows. Also, the server is remote, not on the same network, so Samba can't do the job.
Thank you!
This is a workflow I use, you may simply need to do the following, assuming you have already setup a default deployment server.
Editing remote files
If you are editing the remote files instead of a local copy, don't; instead:
create a local copy/git clone/etc. of your project files.
create a new PhpStorm project with the local copy.
Setting up a sync
If you already are working off a local copy but just need sync setup:
ctrl+shift+a
type deployment
select options
change the option: Upload changed files automatically [..] to always
enable upload external changes
As an added bonus, this also automatically syncs assets from say gulp watch too.
If you haven't setup a deployment server
ctrl+shift+a
type deployment
select configuration
create a new server with your method of connecting to it.
enable as default server (last icon on the top left column)
Important: if you don't select the server as the default, it will not be able to auto upload changes.
Also don't forget to set up the excludes in the configuration menu; I usually exclude bower_components and node_modules from deploying to my servers, and only send the built assets. (But it's up to you.)
EDIT: Don't run commands remotely, run them locally and let them sync back to the server.
I execute the artisan commands on both sides... I do it this way on my Linux machine:
<?php
// Drop the script name and forward the remaining arguments to artisan.
unset($argv[0]);
$params = implode(' ', $argv);
// Run the command on the remote server first (sshpass supplies the SSH password).
$remoteOutput = shell_exec("sshpass -p password ssh -o StrictHostKeyChecking=no user@1.1.1.1 'php /path/to/artisan $params'");
if (!empty($remoteOutput)) {
    // Mirror the same artisan command locally so both sides stay in sync.
    shell_exec("php artisan $params");
}
Save it and add it as a command-line tool in PhpStorm. On Windows, I think you can use a PHP SSH library or something else.
