We are happy to announce the release of our PHP library patchlevel/event-sourcing
in version 3.5.0.
This release contains several exciting new features and improvements.
In this blog post, we will provide you with an overview of the changes.
Event Aliases
You can now define aliases for your events.
In other words, an event class can have multiple names: when reading from the event store, each of these names is mapped to the same event class.
When writing, everything stays the same and the aliases are not taken into account.
```php
use Patchlevel\EventSourcing\Attribute\Event;

#[Event(name: 'profile.registered', aliases: ['profile_created', 'profile.created'])]
final class ProfileRegistered
{
}
```
This is very helpful, for example, if you want to rename an event: the new name goes into the name attribute and the old names become aliases.
You could achieve this before with upcasting, but now it is much easier and clearer.
For larger changes, such as changing the payload, you still have to use upcasting.
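Conceptually, alias resolution at read time boils down to a name-to-class map. Here is a hypothetical sketch (EventNameMap is invented for illustration and is not the library's internal code):

```php
<?php

// Hypothetical sketch of alias resolution: the current event name and
// every alias all resolve to the same event class when deserializing.
final class EventNameMap
{
    /** @var array<string, string> map of event name => class name */
    private array $map = [];

    public function register(string $name, array $aliases, string $class): void
    {
        $this->map[$name] = $class;

        foreach ($aliases as $alias) {
            $this->map[$alias] = $class;
        }
    }

    public function classForName(string $name): string
    {
        return $this->map[$name];
    }
}

$map = new EventNameMap();
$map->register('profile.registered', ['profile_created', 'profile.created'], 'ProfileRegistered');

// Events stored under the old name 'profile_created' still map to ProfileRegistered.
```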
Beyond this simple use case, aliases also open up further possibilities for optimizing the subscription engine.
Stay tuned for future versions.
Batching in Subscriber
Another huge feature is the ability to batch within subscribers.
Previously this was not possible: each projector had to process every event directly and immediately write it to the respective database.
These database operations were far from optimal, so rebuilding projections took quite a long time; it was clear that they were the bottleneck.
Now we can solve this problem by implementing the BatchableSubscriber interface in our subscribers.
It provides some new methods that allow us to react when a batch starts, is committed, or should be rolled back.
We can also tell the subscription engine that we would like to commit because we have reached a threshold.
```php
use Doctrine\DBAL\Connection;
use Patchlevel\EventSourcing\Attribute\Projector;
use Patchlevel\EventSourcing\Attribute\Subscribe;
use Patchlevel\EventSourcing\Subscription\Subscriber\BatchableSubscriber;

#[Projector('profile_1')]
final class ProfileProjector implements BatchableSubscriber
{
    /** @var array<string, string> buffered name changes, keyed by profile id */
    private array $nameChanged = [];

    public function __construct(
        private readonly Connection $connection,
    ) {
    }

    #[Subscribe(NameChanged::class)]
    public function handleNameChanged(NameChanged $event): void
    {
        $this->nameChanged[$event->profileId] = $event->name;
    }

    public function beginBatch(): void
    {
        $this->nameChanged = [];
        $this->connection->beginTransaction();
    }

    public function commitBatch(): void
    {
        foreach ($this->nameChanged as $profileId => $name) {
            $this->connection->executeStatement(
                'UPDATE profile SET name = :name WHERE id = :id',
                ['name' => $name, 'id' => $profileId],
            );
        }

        $this->connection->commit();
        $this->nameChanged = [];
    }

    public function rollbackBatch(): void
    {
        $this->connection->rollBack();
    }

    public function forceCommit(): bool
    {
        return count($this->nameChanged) > 1000;
    }
}
```
Batching allows us to achieve huge performance optimizations.
In our tests, our projectors became up to 1000% faster.
Of course, the extent of the performance gain depends on the context: how much data is overwritten, whether the data is kept in memory, whether the database can perform several operations at the same time, and so on.
Another nice detail is that you can now build your read models very effectively with Doctrine ORM, since you no longer have to flush after every event.
Here is the example above, transferred to Doctrine ORM.
```php
use Doctrine\ORM\EntityManagerInterface;
use Doctrine\Persistence\ManagerRegistry;
use Patchlevel\EventSourcing\Attribute\Projector;
use Patchlevel\EventSourcing\Attribute\Subscribe;
use Patchlevel\EventSourcing\Subscription\Subscriber\BatchableSubscriber;

#[Projector('profile_1')]
final class ProfileProjector implements BatchableSubscriber
{
    private EntityManagerInterface $entityManager;

    public function __construct(
        private readonly ManagerRegistry $managerRegistry,
    ) {
    }

    #[Subscribe(NameChanged::class)]
    public function handleNameChanged(NameChanged $event): void
    {
        $profile = $this->entityManager->find(Profile::class, $event->profileId);
        $profile->setName($event->name);
    }

    public function beginBatch(): void
    {
        $this->entityManager = $this->managerRegistry->getManagerForClass(Profile::class);
    }

    public function commitBatch(): void
    {
        $this->entityManager->flush();
        $this->entityManager->clear();
    }

    public function rollbackBatch(): void
    {
        $this->entityManager->clear();
    }

    public function forceCommit(): bool
    {
        return $this->entityManager->getUnitOfWork()->size() > 1000;
    }
}
```
This works because Doctrine ORM tracks the objects for you.
All we have to do is flush everything once the batch is done.
In forceCommit we also check how many objects Doctrine is currently managing for us; if that number exceeds the threshold, we force a commit.
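To recap the batch lifecycle, here is a hypothetical driver loop. The processBatch function is invented for illustration and the interface signatures are simply those implemented in the examples above; the real subscription engine is more involved, but the call order is the point:

```php
<?php

// Illustrative sketch only: the method set mirrors the examples above.
interface BatchableSubscriber
{
    public function beginBatch(): void;

    public function commitBatch(): void;

    public function rollbackBatch(): void;

    public function forceCommit(): bool;
}

// Hypothetical driver loop showing the order in which the engine
// calls the subscriber's batch methods.
function processBatch(BatchableSubscriber $subscriber, iterable $events): void
{
    $subscriber->beginBatch();

    try {
        foreach ($events as $event) {
            // ... dispatch $event to the matching #[Subscribe] method ...

            if ($subscriber->forceCommit()) {
                // Threshold reached: commit early and start a fresh batch.
                $subscriber->commitBatch();
                $subscriber->beginBatch();
            }
        }

        $subscriber->commitBatch();
    } catch (Throwable $e) {
        $subscriber->rollbackBatch();

        throw $e;
    }
}
```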