preUpdate and postUpdate events not triggered in Doctrine 2

I have followed the instructions from this tutorial: http://symfony.com/doc/current/cookbook/doctrine/event_listeners_subscribers.html, and have created a simple listener that listens for events dispatched by Doctrine when an entity is inserted or updated. The preInsert and postInsert events work fine and are dispatched on the creation of a new entity. However, preUpdate and postUpdate are never called when the entity is updated, no matter what. The same goes for onFlush. As a side note, I have a console-generated controller that supports the basic CRUD operations, and I have left it untouched.
Below are some code snippets to demonstrate the way I am doing this.
config.yml
annotation.listener:
    class: City\AnnotatorBundle\Listener\AnnotationListener
    tags:
        - { name: doctrine.event_listener, event: postUpdate }
Listener implementation (I have omitted the other functions and left only postUpdate for simplicity)
class AnnotationListener
{
    public function postUpdate(LifecycleEventArgs $args)
    {
        $entity = $args->getEntity();

        echo $entity->getId();
        die;
    }
}
The entity id is never displayed, and the script runs to completion despite the die at the end of the function.

Did you forget to add the @HasLifecycleCallbacks annotation? You could use the @PreUpdate annotation and skip the service definition altogether.
/**
 * @ORM\Entity
 * @ORM\HasLifecycleCallbacks
 */
class YourEntity
{
    /**
     * @ORM\PrePersist()
     * @ORM\PreUpdate()
     */
    public function preUpdate()
    {
        // .... your pre-update logic here
    }

    ....
}
In my opinion this way of attaching events is much easier, as you don't have to define new services and listeners explicitly. You also have direct access to the data being updated, since the method is located within your entity.
The drawback is that you mix logic with your model, and that's something that should be avoided if possible...
You can read more about Lifecycle callbacks here:
http://symfony.com/doc/master/cookbook/doctrine/file_uploads.html#using-lifecycle-callbacks
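If you prefer to keep the listener-service approach from the question, tag the service for the preUpdate event the same way as for postUpdate; the listener method then receives a PreUpdateEventArgs object that exposes the change set. A minimal sketch (the 'title' field is only a placeholder for one of your entity's fields):

use Doctrine\ORM\Event\PreUpdateEventArgs;

class AnnotationListener
{
    // Called by Doctrine when the service is tagged with
    // { name: doctrine.event_listener, event: preUpdate }.
    public function preUpdate(PreUpdateEventArgs $args)
    {
        $entity = $args->getEntity();

        // React only to relevant field changes ('title' is a placeholder).
        if ($args->hasChangedField('title')) {
            $old = $args->getOldValue('title');
            $new = $args->getNewValue('title');
            // ... your pre-update logic here
        }
    }
}

Keep in mind that Doctrine only computes a change set and fires preUpdate/postUpdate when the entity actually has modified fields, so flushing an unchanged entity will not trigger them.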

Related

Attaching many-to-many relations while still binding to the created event

So I've run into this issue a few times, and now I've decided that I want to find a better solution.
For example's sake, I have two models, Order & Product. There is a many-to-many relation, so an order can have multiple products and a product can of course have multiple orders. The table structure looks like the below -
orders
    id
    more fields...
products
    id
    more fields...
product_orders
    order_id
    product_id
So when an order is created I run the following -
$order = Order::create($request->validated());
$order->products()->attach([1,2,3,4...]);
So this creates an order and attaches the relevant products to it.
However, I want to use an observer to determine when the order is created and perform related tasks off the back of it (send an order confirmation email, etc.). The problem is that at the time the order's created observer is triggered, the products aren't yet attached.
Is there any way to do the above, establishing all the many-to-many relationships and creating the order at the same time, so I can access the linked products within the Order created observer?
Use case 1
An AJAX call hits PUT /api/order, which in turn calls the Order::place() method. Once an order is created, an email is sent to the customer who placed the order. Now I could just put an event dispatch within this method that triggers the email send, but this just feels a bit hacky.
public static function place(SubmitOrderRequest $request)
{
    $order = Order::create($request->validated());
    $order->products()->attach($request->input('products'));

    return $order;
}
Use case 2
I'm feature testing to make sure that an email is sent when an order is created. Now, this test passes (and email sends work), but it's unable to output the linked products at this point in execution.
/**
 * @test
 **/
public function an_email_is_sent_on_order_creation()
{
    Mail::fake();

    factory(Order::class)->create();

    Mail::assertSent(OrderCreatedMailable::class);
}
I think the solution to your problem could be transaction events as provided by this package from fntneves.
Personally, I stumbled upon the idea of transactional events for another reason. I had the issue that my business logic required the execution of some queued jobs after a specific entity had been created. Because my entities were created in batches within a transaction, it was possible that an event was fired (and the corresponding event listener was queued) even though the transaction was rolled back because of an error shortly after. The result was queued listeners that always failed.
Your scenario seems comparable to me, as you don't want to execute your event listeners immediately due to missing data, which is only attached after the model has actually been created. For this reason, I suggest wrapping your order creation and all other tasks that manipulate the order within a transaction. Combined with said package, you can then fire the model created event, because the actual event listener will only be called after the transaction has been committed. The code for all this basically comes down to what you already described:
DB::transaction(function () use ($request) {
    $order = Order::create($request->validated());
    $order->products()->attach($request->input('products'));
});
In your model, you'd simply define an OrderCreated event or use an observer as suggested in the other answer:
class Order
{
    protected $dispatchesEvents = [
        'created' => OrderCreated::class,
    ];
}
class OrderCreated implements TransactionalEvent
{
    public $order;

    /**
     * Create a new event instance.
     *
     * @param \App\Order $order
     * @return void
     */
    public function __construct(Order $order)
    {
        $this->order = $order;
    }
}
You can override the boot method in your model if the product IDs are static:
class Order extends Eloquent {
    protected static function boot() {
        parent::boot();

        static::saving(function ($order) {
            $order->products()->attach([1,2,3,4...]);
        });
    }
}
Or use an observer:
class OrderObserver
{
    public function created($model)
    {
        //
    }
}
And register it:
class EventServiceProvider extends ServiceProvider
{
    public function boot(DispatcherContract $events)
    {
        parent::boot($events);

        Order::observe(new OrderObserver());
    }
}

Drupal 8 - Add custom cache context

I have the following situation: I want to hide or show some local Tasks (Tabs) based on a field on the current user. Therefore I have implemented a hook_menu_local_tasks_alter() in my_module/my_module.module:
function my_module_menu_local_tasks_alter(&$data, $route_name, \Drupal\Core\Cache\RefinableCacheableDependencyInterface &$cacheability) {
  ... some logic ...
  if ($user->get('field_my_field')->getValue() === 'some value') {
    unset($data['tabs'][0]['unwanted_tab_0']);
    unset($data['tabs'][0]['unwanted_tab_1']);
  }
  ... some logic ...
}
This works fine but I need to clear the caches if the value of field_my_field changes.
So I found that I need to implement a Cache Context like this in my my_module_menu_local_tasks_alter:
$cacheability
  ->addCacheTags([
    'user.available_regions',
  ]);
I have defined my Cache Context like this:
my_module/my_module.services.yml:
services:
  cache_context.user.available_regions:
    class: Drupal\my_module\CacheContext\AvailableRegions
    arguments: ['@current_user']
    tags:
      - { name: cache.context }
my_module/src/CacheContext/AvailableRegions.php:
<?php

namespace Drupal\my_module\CacheContext;

use Drupal\Core\Cache\CacheableMetadata;
use Drupal\Core\Cache\Context\CacheContextInterface;
use Drupal\Core\Session\AccountProxyInterface;
use Drupal\user\Entity\User;

/**
 * Class AvailableRegions.
 */
class AvailableRegions implements CacheContextInterface {

  protected $currentUser;

  /**
   * Constructs a new AvailableRegions cache context object.
   */
  public function __construct(AccountProxyInterface $current_user) {
    $this->currentUser = $current_user;
  }

  /**
   * {@inheritdoc}
   */
  public static function getLabel() {
    return t('Available sub pages.');
  }

  /**
   * {@inheritdoc}
   */
  public function getContext() {
    // Actual logic of context variation lies here. The field lives on the
    // full user entity, so load it via the current user's id.
    $user = User::load($this->currentUser->id());
    $field_published_sites = $user->get('field_published_sites')->getValue();
    $sites = [];
    foreach ($field_published_sites as $site) {
      $sites[] = $site['target_id'];
    }
    return implode('|', $sites);
  }

  /**
   * {@inheritdoc}
   */
  public function getCacheableMetadata() {
    return new CacheableMetadata();
  }

}
But every time I change the value of my field field_my_field, I still need to clear the caches, so the context is not working. Could anybody point me in the right direction on how to solve this, or how to debug this kind of thing?
Instead of providing a custom cache context, you should be able to use the default cacheability provided by core. I believe the issue is not so much the creation of the cacheable metadata; it seems that your hook_menu_local_tasks_alter() is altering content that doesn't know it now relies on the user. So I believe you need two things:
A general cache context that says 'this menu content now relies on the user', e.g. the user cache context.
An additional cache tag for the specific user that says: 'once this is cached, regenerate the local tasks for this user when this user entity changes'.
Note that hook_menu_local_tasks_alter() provides a helper for cacheability, the third parameter $cacheability. Drupal core also provides a mechanism here that allows us to say 'this piece of cache data relies on this other piece of cache data'.
Thus you should be able to do something like:
function my_module_menu_local_tasks_alter(&$data, $route_name, RefinableCacheableDependencyInterface &$cacheability) {
  ... some logic ...
  // We are going to alter content by user.
  $cacheability->addCacheableDependency($user);
  // Note: if you still really need your custom context, you could add it.
  // Also note that any user.* contexts should already be covered above.
  $cacheability->addCacheContexts(['some_custom_contexts']);
  if ($user->get('field_my_field')->getValue() === 'some value') {
    unset($data['tabs'][0]['unwanted_tab_0']);
    unset($data['tabs'][0]['unwanted_tab_1']);
  }
  ... some logic ...
}

Yii2 session event before close/destroy

I want to run some code every time before the user session is destroyed, for any reason. I haven't found any events bound to the session in the official documentation. Has anyone found a workaround for this?
There are no events out of the box for the Session component.
You can solve this problem by overriding the core yii\web\Session component.
1) Override the yii\web\Session component:
<?php

namespace app\components;

use yii\web\Session as BaseSession;

class Session extends BaseSession
{
    /**
     * Event name for the close event
     */
    const EVENT_CLOSE = 'close';

    /**
     * @inheritdoc
     */
    public function close()
    {
        $this->trigger(self::EVENT_CLOSE); // Triggering our custom event first
        parent::close(); // Calling parent implementation
    }
}
2) Apply your custom component in the application config:
'session' => [
    'class' => 'app\components\Session', // Passing our custom component instead of the core one
],
3) Attach a handler with one of the available methods:
use app\components\Session;
use yii\base\Event;

Event::on(Session::className(), Session::EVENT_CLOSE, function ($event) {
    // Insert your event processing code here
});
Alternatively, you can specify the handler as a method of some class; check the official docs (see the sketch below).
As an alternative to this whole approach, take a look at this extension; I personally didn't test it. The Yii way to do it, I think, is overriding the component and adding and triggering custom events as I described above.
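A minimal sketch of that class-method variant (the SessionCleanup class and its method name are made up for illustration):

namespace app\components;

use yii\base\Event;

class SessionCleanup
{
    // Static handler with the signature Yii2 expects for event handlers.
    public static function onSessionClose(Event $event)
    {
        // Insert your cleanup code here, e.g. persist data that must
        // survive the session being closed.
    }
}

// Attach the handler at class level, e.g. during bootstrap:
Event::on(Session::className(), Session::EVENT_CLOSE, ['app\components\SessionCleanup', 'onSessionClose']);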

Best way of passing response information from Model to Controller using Laravel

The Model View Controller architecture tells me that all my business logic should be inside the Model, while the data flow should be handled by the Controller.
Knowing this, while I'm dealing with my logic inside the Model, I need to let the Controller know whether it's supposed to redirect to another URL or redirect back, what kind of message or variable to pass during the redirection, etc.
What is the best way of doing this?
I can think of some ways, like throwing exceptions in the Model and catching them in the Controller, or returning an array from the Model and handling it in the Controller, but none of them seem very nice. The easiest way would be calling Redirect->to() (or back()) inside the Model and just returning the Model's return value from the Controller, but that seems to break the architecture's separation of concerns.
Is there a "right" way of doing this? What would be the pros and cons of each way?
EDIT:
The answer below is old. Laravel now includes a bunch of different ways of handling common problems.
For example, use Laravel's FormRequests as a way of validating data easily in controller methods, and Jobs to handle the business logic for creating / updating models.
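As a rough illustration of the FormRequest part (StorePostRequest and its rules are hypothetical, not taken from the question):

namespace App\Http\Requests;

use Illuminate\Foundation\Http\FormRequest;

class StorePostRequest extends FormRequest
{
    public function authorize()
    {
        return true;
    }

    public function rules()
    {
        // Hypothetical validation rules for the incoming post data.
        return [
            'title' => 'required|string|max:255',
            'body'  => 'required|string',
        ];
    }
}

A controller method can then type-hint StorePostRequest and work only with $request->validated(), keeping validation out of both the controller body and the model.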
OLD POST:
This is a common question, and while the 'MVC' pattern is nice as a basic starting point for a web app, I feel like the majority of developers always need another intermediate service for validation, data handling, and the other problems that come up during development.
To answer your question without bias: There is no right way.
To answer your question with my own personal bias, I feel the majority of developers will use the Repositories or Services pattern to handle intermediate data handling between the controller and the model, and also have separate classes for validation as well.
In my opinion, Repositories are better for a framework- and data-agnostic design (due to their interface-driven implementation), and Services are better for handling business logic / rules. Controllers are best used for handling responses and for passing the input data to the repository or the service.
The paths for each of these patterns are the same though:
Request -> Controller (Validation) -> Service -> Model -> Database
Request -> Controller (Validation) -> RepositoryInterface -> Model -> Database
Validation is in brackets because the input isn't passed from the validator to the service / repository; the input is sent to the validator, which gives the 'OK' and lets the controller know it's OK to send the data on to the Service / Repository to be processed.
I only use Services when I'm absolutely positive I won't be changing frameworks or data sources; otherwise I'll use Repositories. Repositories are just a little more work to set up, since you'll need to make Laravel resolve the interface to your repository class through its IoC container.
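A minimal sketch of that binding (PostRepositoryInterface and EloquentPostRepository are hypothetical names used for illustration):

namespace App\Repositories;

use App\Models\Post;

interface PostRepositoryInterface
{
    public function create(array $input);
}

class EloquentPostRepository implements PostRepositoryInterface
{
    protected $model;

    public function __construct(Post $post)
    {
        $this->model = $post;
    }

    public function create(array $input)
    {
        // Apply business rules to the data, then persist via the model.
        return $this->model->create($input);
    }
}

In a service provider you would then bind the interface to the implementation, e.g. $this->app->bind(PostRepositoryInterface::class, EloquentPostRepository::class), and type-hint the interface in the controller constructor instead of the concrete class.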
Services Example:
The Service:
namespace App\Services;

use App\Models\Post;

class PostService
{
    /**
     * @var Post
     */
    protected $model;

    /**
     * Constructor.
     *
     * @param Post $post
     */
    public function __construct(Post $post)
    {
        $this->model = $post;
    }

    /**
     * Creates a new post.
     *
     * @param array $input
     */
    public function create(array $input)
    {
        // Perform business rules on data
        $post = $this->model->create($input);

        if ($post) return $post;

        return false;
    }
}
The Controller:
namespace App\Http\Controllers;

use App\Services\PostService;
use App\Validators\PostValidator;

class PostController extends Controller
{
    /**
     * @var PostService
     */
    protected $postService;

    /**
     * @var PostValidator
     */
    protected $postValidator;

    /**
     * Constructor.
     *
     * @param PostService $postService
     * @param PostValidator $postValidator
     */
    public function __construct(PostService $postService, PostValidator $postValidator)
    {
        $this->postService = $postService;
        $this->postValidator = $postValidator;
    }

    /**
     * Processes creating a new post.
     */
    public function store()
    {
        $input = Input::all();

        if ($this->postValidator->passes($input)) {
            // Validation passed, let's send the data off to the service
            $post = $this->postService->create($input);

            if ($post) {
                return 'A post was successfully created!';
            } else {
                return 'Uh oh, looks like there was an issue creating a post.';
            }
        } else {
            // Validation failed, return the errors
            return $this->postValidator->errors();
        }
    }
}
Now with this pattern, you have a nice separation of all your processes and a clear indication of what each of them does.
For a repository example, Google 'Laravel Repository Pattern'. There are tons of articles about this.
Actually, in Laravel 5 that is not the best way to do it. Business logic should not be in models. The only thing models should do is retrieve and store data from your database.
You are better off using the CommandBus or ServiceProviders to handle application logic and business rules. There are many articles on the web about these, but personally I prefer laracasts.com as the best learning resource.
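A minimal sketch of what such a command/job might look like (CreatePost and its attributes are hypothetical, and the exact dispatch helper depends on your Laravel version):

namespace App\Jobs;

use App\Models\Post;

class CreatePost
{
    protected $attributes;

    public function __construct(array $attributes)
    {
        $this->attributes = $attributes;
    }

    // Business rules live here; the model only persists data.
    public function handle()
    {
        // ... apply business rules to $this->attributes ...
        return Post::create($this->attributes);
    }
}

The controller then just dispatches the job (e.g. via the DispatchesJobs trait's dispatchNow(new CreatePost($validatedInput)), depending on your Laravel version) and decides how to respond based on the result.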

Doctrine 2 result cache invalidation

I'm using Doctrine 2's result cache on a query retrieving the number of new messages of a user (messaging app):
$query->useResultCache(true, 500, 'messaging.nb_new_messages.'.$userId);
I tried to invalidate this cache like this (in my entity repository):
public function clearNbNewMessagesOfUserCache($userId) {
    $cacheDriver = $this->getEntityManager()->getConfiguration()->getResultCacheImpl();
    $result = $cacheDriver->delete('skepin_messaging.nbNewMessages.'.$userId);

    if (!$result) {
        return false;
    }

    return $cacheDriver->flushAll();
}
This way I don't have to run a useless query on every page of my website.
My questions: is this a recommended practice? Will I eventually run into problems?
I had the idea to build an onFlush hook.
There you have all entities queued for inserts, updates, and deletes, so you can invalidate the caches depending on entity name, identifier, and so on.
Unfortunately, I have not yet built any event listeners, but I definitely plan to build such a thing for my project.
Here is a link to the Doctrine documentation for the onFlush event.
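A rough sketch of such an onFlush listener, assuming a hypothetical Message entity (with a made-up getRecipientId() accessor) and the cache-key scheme from the question:

use Doctrine\ORM\Event\OnFlushEventArgs;

class CacheInvalidationListener
{
    public function onFlush(OnFlushEventArgs $args)
    {
        $em = $args->getEntityManager();
        $uow = $em->getUnitOfWork();
        $cacheDriver = $em->getConfiguration()->getResultCacheImpl();

        // Everything scheduled for this flush: inserts, updates, and deletes.
        $entities = array_merge(
            $uow->getScheduledEntityInsertions(),
            $uow->getScheduledEntityUpdates(),
            $uow->getScheduledEntityDeletions()
        );

        foreach ($entities as $entity) {
            // Message and getRecipientId() are placeholders for your own entity/API.
            if ($entity instanceof Message) {
                $cacheDriver->delete('messaging.nb_new_messages.'.$entity->getRecipientId());
            }
        }
    }
}

The listener still needs to be registered for the onFlush event, e.g. via $em->getEventManager()->addEventListener(\Doctrine\ORM\Events::onFlush, new CacheInvalidationListener()) or, in Symfony, a doctrine.event_listener service tag.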
Edit:
There is even an easier way to implement events.
In an entity class you can add @HasLifecycleCallbacks to the annotations, and then you can define a function with a @PreUpdate or @PrePersist annotation.
Then every time this model is updated or persisted, this function will be called.
/**
 * @Entity
 * @Table(name="SomeEntity")
 * @HasLifecycleCallbacks
 */
class SomeEntity
{
    ...

    /**
     * @PreUpdate
     * @PrePersist
     */
    public function preUpdate()
    {
        // This function is called every time this model gets updated
        // or a new instance of this model gets persisted.

        // Something like this maybe...
        // I have not yet completely thought through all this.
        $cache->save(get_class($this) . '#' . $this->getId(), $this);
    }
}
So maybe this can be used to invalidate every single instance of an entity?
This is an old question I stumbled upon. It's really simple using Doctrine 2.8 nowadays:
/** @var \Psr\Cache\CacheItemPoolInterface|null $cache */
$cache = $em->getConfiguration()->getResultCache();

$cache->deleteItem('skepin_messaging.nbNewMessages.'.$userId);
$cache->clear(); // clear all items
Please be aware that Doctrine internally generates a "real cache key" which won't look like yours. I don't know how to generate that cache key without re-creating the query that was used.
