Hi. After some issues with Vagrant I'm trying out a switch to Docker and Laravel Sail, but I'm struggling with quite a few concepts that I can't seem to find the answers to (or get my head around).
I have a working project that I've been building on a local machine via Homestead, and it lives in a Bitbucket repository. I can make changes to that project on a Windows machine and access it in the browser using the app's URL.
I've now successfully installed Docker and Ubuntu on Windows, and I can "sail up" from within the project by navigating in Ubuntu to the local files. From there I can browse to the app at http://localhost. I can access the database via TablePlus and the app works - but very, very slowly.
Search results suggest it's slow because I'm accessing the app via the Windows file system (the files currently sit on drive E), which leads to some questions:
Should I place my project files in the home directory within Ubuntu and run Sail from there? If so, is there an easy way to copy the directory from the E drive to the home directory, or do I simply pull the repository from Bitbucket from within Ubuntu?
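For example, I'm assuming one of these two approaches from inside the Ubuntu (WSL) shell would do it - the paths and repository name here are just placeholders:

# Option 1: copy the project from the Windows drive (WSL mounts it at /mnt/e)
cp -r /mnt/e/projects/my-app ~/my-app

# Option 2: clone a fresh copy from Bitbucket straight into the home directory
cd ~
git clone git@bitbucket.org:my-team/my-app.git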
Does this mean that each time I shut down the machine I will lose my app files, meaning I should make sure I've pushed any changes to the Bitbucket repository?
As I understand it, Ubuntu on Windows won't have PHP or Composer installed by default - so do I need to install these first before even thinking about moving my project over?
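If so, I'm guessing something along these lines using Ubuntu's packaged versions (the package names are my assumption and may need adjusting):

sudo apt update
sudo apt install php-cli php-curl php-xml php-mbstring unzip composer

cd ~/my-app
composer install          # pulls in Sail and the rest of the dependencies
./vendor/bin/sail up -d   # then start the containers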
Are there any recommended guides/videos with a focus on Laravel and Docker? (I'm currently working my way through the ones on this site.)
It's a strange shift - I got the hang of Homestead and was comfortable with its syncing concept, but I'm completely lost on Docker. Despite getting an app working, the concepts escape me at the moment, and the Laravel setup guide assumes I already have a grasp of them.
Thank you :)
Is it ok to use composer on localhost to upgrade core and modules, and then FTP the files to the server?
I'm on shared hosting, and although it's possible to use SSH with Git, it's a pain to set up...
Some information about my use case:
No multiple users, no team, no multiple developers - I'm a one-man show
Small business sites
I'm the only person adding content
No need for version control, no module development, no coding
I'm a site builder, and the only code I touch is the CSS of the theme. Will this workflow be OK?
Install Drupal 8 with composer
Import site into Acquia Dev Desktop
Use composer to update modules and core (see the sketch after this list)
FTP sites folder to server
Use Backup and Migrate when I alternate between working on live and localhost
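For the Composer steps I'm planning on commands roughly like these (the package names are what I'd expect; the exact constraints depend on how the project was set up):

# create the initial Drupal 8 project (one-off)
composer create-project drupal-composer/drupal-project:8.x-dev my_site --no-interaction

# later, update core and contributed modules
composer update drupal/core --with-dependencies
composer update drupal/pathauto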
The question is very general and, I think, contains some conflicting ideas.
Your question:
Is it ok to use composer on localhost to upgrade core and modules, and then FTP the files to the server
Technically yes. But this method has lots of disadvantages:
DR (Disaster Recovery) - What if you uploaded something that doesn't work? How quickly can you recover? With Git it's a matter of git checkout
Composer is environment-dependent - When you run composer install, Composer checks some dependencies on your machine and then decides what to install. What if some required packages are missing on your remote machine? To be safe, you should run composer install on the remote machine itself (via SSH)
FTP might take too long to finish - As opposed to Git (or rsync), FTP will upload all the files to the server, while the other tools upload just the diff between the previous version and the current one. So I will always choose rsync over FTP (see the sketch after this list)
Security - use SFTP
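A rough sketch of that combination (host, user and paths are placeholders - adapt them to your own setup):

# upload only what changed, keeping the server in sync with the local tree
rsync -avz --delete --exclude='.git' --exclude='sites/default/files' ./ user@example.com:/var/www/mysite/

# then resolve dependencies on the server itself, so they match its environment
ssh user@example.com 'cd /var/www/mysite && composer install --no-dev'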
Your question:
I'm a site builder, and the only code I touch are the CSS files of the theme. Will this workflow be ok
Sounds correct - but remember the Composer issue mentioned above.
Good evening,
This question has to do with the PHP Composer package manager. I installed it on our test environment (XAMPP) without any issues and downloaded the necessary package without any problems (package name: mpdf). After I issued the command to get the package, it showed up in the vendor folder as it should have and my project worked great.
Fast forward: we are now ready to deploy this whole application to a Linux (Ubuntu 16.04) box using a versioning system (SVN), and all the files that were on the test system have been deployed to production. The only problem is that the specific parts of our site on production that need the mpdf package do not work.
My question is this: even though the vendor folder was also copied to production using SVN, is there anything else we need to do to make this work on the production box?
I am mainly asking about any necessary steps that might need to be performed on the Ubuntu box.
Thanks for all the help in advance,
George Eivaz
I figured out what the issue was. It had to do with the permissions not being correct on Ubuntu.
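In case it helps anyone else, the fix was roughly along these lines (this assumes Apache running as www-data and the project deployed under /var/www/html/myproject - the path and web user are just examples):

# give ownership of the deployed tree (including vendor/) to the web server user
sudo chown -R www-data:www-data /var/www/html/myproject

# standard directory/file permissions
sudo find /var/www/html/myproject -type d -exec chmod 755 {} \;
sudo find /var/www/html/myproject -type f -exec chmod 644 {} \;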
I haven't managed to find a solution for getting access to the Magento app code, which is in a Docker container, on my host machine so that I can develop in my favorite IDE.
In detail, I use this image (https://github.com/alexcheng1982/docker-magento) to get Magento 1.9.
I built the containers with the command "docker-compose up -d" and everything is fine. I can see my site working at http://local.magento.
But as a developer I want to open the app in the PhpStorm editor on my host machine. How can I do that?
Thanks in advance.
You have to add a volume:
volumes:
  - ./src/:/var/www/html
The data will then show up in the local src folder on your host.
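For context, the entry sits inside the service definition in docker-compose.yml - the service and image names below are my assumption based on that repository, so check them against your actual file:

web:
  image: alexcheng/magento        # image name assumed from the linked repository
  volumes:
    - ./src/:/var/www/html        # exposes the container's web root as ./src on the host

After saving the change, run docker-compose up -d again so the container is recreated with the volume, then open the src folder in PhpStorm.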
The big issue I have is how to run Odoo on my localhost (WAMP). I have the files from my company server, and I need to test and migrate them, so I am trying to run Odoo on my localhost, but I have had no luck even accessing a single web page. I have copied and pasted the folder into what I believe is the correct location, but when I try to access it I just get a 404 error or a directory listing page (the folder structure).
So I'm just wondering if anyone knows how I can run the Odoo files from my server on my localhost?
Just copying the folder structure is not enough for a proper migration. You also need access to the Odoo database: Odoo uses PostgreSQL, so you should either have access to your company's database server or duplicate the database. That is even more important than the files in the Odoo directory.
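If you can reach the company database server, duplicating the database locally could look roughly like this (host, user and database names are placeholders):

# dump the company database with pg_dump
pg_dump -h db.company.example -U odoo -Fc company_db > company_db.dump

# create an empty local database and restore the dump into it
createdb -U odoo local_odoo
pg_restore -U odoo -d local_odoo company_db.dump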
Moreover, to run Odoo on your local machine you don't need WAMP at all - Python with some modules plus PostgreSQL is enough.
If you want a "clean" install and then to use some migration tools, you can install Odoo on your machine (I provided a link for the 9.0 version; for 8.0 you can use this link, or choose the corresponding version on the Odoo installation page) - this method will set up all the necessary environment automatically, but you will have to manually load your data into the fresh install.
Installation from source is somewhat more tricky, but still not very hard. You can find instructions on the official website.
After this you can check the directory structure and copy your custom addons, templates, design or content from the existing directory structure to the fresh one. Don't forget about the database - without it you won't have a huge part of the data that is available on your company server.
To run Odoo you should launch odoo.py with your Python from your Odoo directory (you need Python 2.7.x installed on your machine).
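A typical start command might look like this (the paths, database name and addons locations are placeholders; check the options against your Odoo version):

cd /path/to/odoo
python odoo.py -d local_odoo --addons-path=addons,/path/to/custom_addons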
By default Odoo uses port 8069, i.e. to access it you should type the following in your browser:
localhost:8069
For more detailed instructions please refer to the Odoo documentation.
I keep getting a TokenMismatchException when working on a cloned project using Vagrant on Windows.
I have several other Laravel projects using similar setups and they work fine.
I tried cloning this project on a VM using VirtualBox and it works; I also cloned it on a temporary remote staging server and it works there too, but it does not work using Vagrant on Windows.
I commented out the CSRF filter:
Route::filter('csrf', function()
{
    if (Session::getToken() != Input::get('csrf_token') && Session::getToken() != Input::get('_token'))
    {
        throw new Illuminate\Session\TokenMismatchException;
    }
});
but unfortunately I cannot log in afterwards, as it just returns me to the index page.
I also changed the app/storage permissions to 777 recursively, as I believe the token issue comes from a session file which is not being overwritten. I also deleted all the contents of app/storage/sessions, and I can see new files being generated when I refresh the application.
I would really like to be able to work on this project using Vagrant, as it requires regular and speedy changes and because this is the setup I use on all my projects. As I said, all my other Laravel projects work fine, even the ones which I started off on VirtualBox and now run on Vagrant.
All help is appreciated. Please let me know if my question isn't clear enough.
Thanks
As my question suggests, the issue was due to directory permissions. Because the original project was developed on a Windows system without Vagrant or VirtualBox, the permissions could not be changed through Vagrant.
I solved this by starting a new Laravel application and copying over my controllers, models, config files and assets.
Now I can use Vagrant to manage this project.
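For reference, the copy looked roughly like this (project names are examples, and the version constraint should match whatever the original project used):

composer create-project laravel/laravel fresh-project "4.2.*"

# copy the application code across (directory names from a Laravel 4 layout);
# cp merges into the existing folders, overwriting files with the same name
cp -r old-project/app/controllers fresh-project/app/
cp -r old-project/app/models      fresh-project/app/
cp -r old-project/app/config      fresh-project/app/
cp -r old-project/public/assets   fresh-project/public/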