Every time I make a change in Laravel, I need to run php artisan swoole:http stop and then php artisan swoole:http start. It doesn't see my changes until I do that. Is there another way to reload the server? Also, when I run docker-compose up -d it starts running, but in the app I get "Connection refused". I am new to Docker and Swoole. Can someone explain how to use them? Thank you
I assume you're using Octane, but this applies in general: it loads the application into memory and then serves requests from memory. If you're doing local development, there is a watcher you can enable that will auto-reload the server when it detects file changes.
https://laravel.com/docs/9.x/octane#watching-for-file-changes
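If you are on Octane, the watcher described above is enabled with a flag; per the Octane docs it relies on Node's chokidar package being available in the project:

```shell
# Install the watcher dependency used by Octane's --watch mode
npm install --save-dev chokidar

# Start the Octane server in watch mode; it restarts workers when files change
php artisan octane:start --watch
```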
As a more general statement and background: this is how compiled languages work; the code is compiled into an executable, and any time you need to make changes, you have to recompile and re-run the program.
PHP is an interpreted language, so compilation normally happens as you execute the script, which means it can respond to filesystem changes without you having to manually restart or recompile. This is convenient, but it can also perform very poorly.
Various optimizations change how/when the code is compiled to improve performance (OPcache, Swoole, etc.). When you're running with these optimizations, you have to follow their guidelines on what needs to be done when you make changes (recompile, restart, or clear caches).
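As a concrete illustration of that trade-off, here is a sketch of the relevant OPcache settings (directive names from the PHP manual; the exact values shown are just an example of one common setup):

```ini
; php.ini -- illustrative OPcache configuration
opcache.enable=1

; Production: skip file-mtime checks entirely for speed; deployed changes
; are only picked up after opcache_reset() or an FPM reload.
opcache.validate_timestamps=0

; Development alternative: re-check file mtimes on every request so
; edits show up immediately.
; opcache.validate_timestamps=1
; opcache.revalidate_freq=0
```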
My Drupal 7 site takes more and more time to add (node/add) and edit (node/edit) nodes. Can anyone help me reduce the time it takes to add and edit nodes?
Normally I'd do this:
Try disabling modules you installed recently, or revert/switch off new functionality. I assume that when you say it takes more time, you're comparing to some point in the past when it was better. Try to recall what changes were made to the website's configuration or module set since then, and roll them back one by one.
On your local copy of the website, enable the Devel module and look at the number of queries executed while adding or updating a node; look at the slowest ones. It may lead you to a narrower question.
As a last resort, profile the code with Xdebug, but I'm not sure how deep you are into PHP development and debugging. This one requires skills in configuring your web server and using a PHP IDE, or at least an Xdebug profile reader (e.g. webgrind).
Try installing a fresh Drupal 7 on the same server (i.e. create a staging sub-domain) and see whether adding/updating also takes a long time there. If so, the problem is obviously with the server.
If it's fast with a fresh Drupal, then the problem is on your site - try turning off modules one by one (again, preferably on the staging domain) until it starts working at normal speed. Some module must be causing the problem.
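For the profiling step above, a minimal Xdebug 2 profiler setup (directive names from the Xdebug docs; Xdebug 2 is the generation contemporary with Drupal 7 - in Xdebug 3 this became xdebug.mode=profile) might look like:

```ini
; php.ini -- profile only on demand, not on every request
xdebug.profiler_enable=0
; Profile only requests that carry an XDEBUG_PROFILE GET/POST
; parameter or cookie:
xdebug.profiler_enable_trigger=1
; Cachegrind output files land here, readable by webgrind/KCachegrind:
xdebug.profiler_output_dir=/tmp/xdebug
```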
I am developing my first typo3 extension with Extbase and Fluid.
Whenever I change something in any PHP file (a controller for example) I have to deactivate and reactivate my extension to see the changes.
That costs a lot of time and is really annoying.
Flushing the caches didn't solve the problem for me.
How can I speed up my typo3 extension development workflow?
You have to clear the caches from the Install Tool; that also flushes the caches that contain annotation data.
To ease the process of doing so, you could check out the extension typo3_console, it should contain a command to flush the caches from the command line. Although I have never used it myself, I've heard many good things about it.
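Assuming typo3_console is installed, the cache flush mentioned above would look something like this (the binary name/location can differ per install; it is often available as typo3cms or vendor/bin/typo3cms):

```shell
# Flush all TYPO3 caches from the command line, no backend login needed
typo3cms cache:flush
```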
I have a Laravel 4.2 application running in a production environment.
Sometimes I have bug fixes or updates for it in my local copy, and I want to know how to move these changes from local to production.
Replacing just the file/class that changed doesn't work; I tried replacing just the controller file, but the changes don't show up.
Does Laravel compile the code somewhere that I need to upload to the production server? What do I need to change/upload in production for the changes to take effect?
By default, PHP is not a compiled language, so changed and uploaded files will work without any special process. Laravel is just PHP, so it follows the same rules.
However, Laravel uses an autoloader that keeps track of all of your classes. When you add a new class, you need to tell the autoloader that it exists by running:
composer dump-autoload
This will scan the available classes and update the autoloader list.
If the problem persists after you run composer dump-autoload, or if you did not add any new classes, there are three potential problems to consider:
Did you upload the files correctly?
Log onto the production server and look at the timestamp of the uploaded files. Do they match your expectation? Consider opening the files in production to see if they contain your latest changes.
Do you have a caching or compiling system in place?
While PHP is not compiled by default, there are tools available that allow you to compile it, and other tools that allow you to cache the output of the scripts. Ask your server administrator if any of these tools are being used.
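In Laravel 4.x specifically, one such compile step is built in: php artisan optimize can generate a combined class file (bootstrap/compiled.php) that keeps shadowing your uploaded changes until it is regenerated. A typical sequence after uploading changed files would be:

```shell
# Remove the compiled class file so the newly uploaded code is loaded
php artisan clear-compiled

# Rebuild the autoloader class map, then optionally recompile
composer dump-autoload
php artisan optimize
```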
Do your changes perform as expected?
Finally, check to see if your changes are in production, but not operating in the way that you expect.
Each time I make changes to code I have to restart the server or it won't change the output.
I have tried using thin and webrick.
My development.rb file says "config.cache_classes = false".
I am using RubyMine.
Sometimes my view updates, but the models never update.
Anything else you need to know to help me troubleshoot this problem?
EDIT:
I am away from my coding machine right now, but I started thinking: I have a file called makesandwich.rb in the app/models directory, and app/models/Lesson.rb calls a function in that file. I have been making changes to makesandwich.rb and it hasn't been reloading. Do I need to register that file somewhere, or should it be picked up automatically on reload?
I recently had this problem as well. As others have said, it can be caused by setting config.cache_classes or config.threadsafe! to true, which will probably be the cause in most cases.
However, my problem was caused by running rails on a VM. My code was stored on a folder on my host machine, which was then mounted to the VM. Unfortunately, the clock time of my host machine was out of sync with the VM. Here's what would happen when I modified my code:
I'd change some code in a text editor on my host machine
Saving the code would set the modified time of the file to the current time for my host machine
This time was in the future, from the perspective of the VM
When the file update checker looked for modified files, it ignored my file since the modified date was in the future
Again, this is probably a pretty rare case, but it's worth checking this if you're running rails on a VM
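You can reproduce the symptom without a VM by giving a file a modification time in the future and asking which files look "newer than now" (GNU touch/find assumed; BSD/macOS touch uses -t with a timestamp instead of -d):

```shell
# Simulate the clock skew in a scratch directory
mkdir -p /tmp/skew_demo && cd /tmp/skew_demo
touch normal.rb                # mtime = now
touch -d "+1 hour" future.rb   # mtime = one hour in the future

# List files modified "after now" -- prints ./future.rb. A change
# tracker comparing mtimes against its own clock can mis-handle
# exactly these files.
find . -newermt now -name '*.rb'
```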
If you're working on a Rails 3 project, you might find Zeus helpful. It keeps track of the files in your project and reloads only the changed code in memory, which makes the edit-reload cycle a lot quicker in Rails 3 development.
https://github.com/burke/zeus
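Basic Zeus usage, for reference (commands from the Zeus README):

```shell
# Zeus is a gem; it preloads the Rails app once and forks per command
gem install zeus

zeus start    # run in one terminal, from the project root
zeus server   # in a second terminal: boots the server from the preloaded app
```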
The problem was that I had put a function in a separate file and was editing it there. That works fine in production, but for development purposes I moved the function back into the Lesson.rb file, and reloading started working properly.
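A likely explanation for why the separate file was never reloaded: Rails' development reloader only tracks files it can map to constants via its naming convention (a MakeSandwich class is expected in make_sandwich.rb); files that don't follow it get required once and never watched. A rough, self-contained sketch of the CamelCase-to-file-name mapping (ActiveSupport does this with String#underscore):

```ruby
# Simplified version of the name mapping Rails uses to decide which
# file is expected to define which constant.
def underscore(camel_cased)
  camel_cased
    .gsub(/([A-Z]+)([A-Z][a-z])/, '\1_\2')  # "HTTPServer"   -> "HTTP_Server"
    .gsub(/([a-z\d])([A-Z])/, '\1_\2')      # "MakeSandwich" -> "Make_Sandwich"
    .downcase
end

puts underscore("MakeSandwich")  # -> make_sandwich
puts underscore("Lesson")        # -> lesson
```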
I'm creating a module that requires a few things to be done (once only) when the module is installed. There may be a number of things that need to be done, but the most basic thing that I need to do is make an API call to a server to let the external server know that the module was installed, and to get a few updated configuration items.
I read this question on Stack Overflow, however in my situation I truly am interested in executing code that has nothing to do with the database, fixtures, updating tables, etc. Also, just to be clear, this module has no effect on the front end. FYI, I've also read this spectacular article by Alan Storm, but it really only drives home the point in my mind that the install/upgrade scripts are not meant for executing arbitrary PHP.
In my mind, I have several possible ways to accomplish this:
I do what I fear is not a best practice and add some PHP to my setup/install script to execute this php
I create some sort of cron job that will execute the task once only (I'm not sure how this would work, but it seems like a "creative" solution). Of course, if cron is not set up properly then this will fail, which is not good.
I create a core_config_data flag ('mynamespace/mymodule/initialized') that I set once my script has run, and check it in every area of the adminhtml that my module touches (CMS/Pages and my own custom adminhtml controller). This seems like a good solution, except for the extra overhead of checking this core_config_data setting every time CMS/Pages or my controller is hit. The good thing about this solution is that if the API call fails, I can set the flag back to false and it will run again, display the appropriate message, and keep retrying until it succeeds (or have additional logic that stops the initialization code after XX attempts).
Are any of these options the "best" way, and is there any sort of precedent for this somewhere, such as a respected extension or from Magento themselves?
Thanks in advance for your time!
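For reference, option 3 could be sketched roughly like this (Magento 1 API; the config path and the one-time work are the hypothetical ones from the question):

```php
<?php
// In an adminhtml controller or observer: run the one-time
// initialization only if the flag has not been set yet.
if (!Mage::getStoreConfigFlag('mynamespace/mymodule/initialized')) {
    try {
        // ... call the external API, fetch updated config items ...

        Mage::getModel('core/config')
            ->saveConfig('mynamespace/mymodule/initialized', 1);
        Mage::getConfig()->cleanCache(); // so the new flag is seen immediately
    } catch (Exception $e) {
        Mage::logException($e); // flag stays unset, so it retries next time
    }
}
```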
You raise an interesting question.
At the moment, I am not aware of a way to execute arbitrary PHP on module installation; the obvious method (rightly or wrongly) would be to use the installer setup/upgrade script, as in option 1 of your question.
Options 2 and 3 seem like a more resource-intensive approach, i.e. needlessly checking on every page load (cached or not).
There is also the approach of using ./pear to install your module (assuming you packaged it with Magento). I had a very quick look through ./downloader/pearlib/php/pearmage.php but didn't see any lines that execute code (vs. copying files). I would have imagined this is the best place to execute something on install (other than option 1 mentioned above).
But, I've never looked into this, so I'm fairly interested in other possible answers.