

Notes on Lazy Loading

Hibernate offers a configuration called LazyCollectionOption.EXTRA. Extra lazy collections let us apply trivial operations to them without retrieving them from the database and without having to write special queries to avoid doing so. The trivial operations are size(), isEmpty() and contains(). If, for example, we have an entity User with an extra lazy collection of Address entities, calling user.getAddresses().size() will generate a COUNT query on the ADDRESS table. So there is no need to write a special query or load the collection into memory. Lazy loading is the default for @OneToMany relations, but you need to set it explicitly on @ManyToOne relations, which are eagerly loaded by default. A @OneToOne can be lazy loaded only if the relation is not optional. If the relation is optional, the proxy would have to query the database to find out whether it should be null or initialised. A Spring Boot property that needs to be overridden in all our applications because…
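A minimal sketch of an extra lazy collection mapping, assuming Hibernate 5.x with javax.persistence packages (the User/Address entity and field names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import javax.persistence.*;
import org.hibernate.annotations.LazyCollection;
import org.hibernate.annotations.LazyCollectionOption;

@Entity
public class User {

    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    private Long id;

    // EXTRA lazy: user.getAddresses().size() issues a SELECT COUNT(...)
    // instead of pulling the whole collection into the persistence context.
    @OneToMany(mappedBy = "user")
    @LazyCollection(LazyCollectionOption.EXTRA)
    private List<Address> addresses = new ArrayList<>();

    public List<Address> getAddresses() {
        return addresses;
    }
}
```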
Recent posts

Pessimistic vs Optimistic locking

Optimistic locking is not really locking. It is easy to introduce optimistic locking into our code: all we need to do is add the @Version annotation to our entities. Optimism means that we assume there will be no conflict and everything will complete normally. But if there is a conflict, an OptimisticLockException is thrown and the transaction is marked for rollback. An alternative way to activate optimistic locking is to annotate an entity with @OptimisticLocking(type = OptimisticLockType.ALL) and activate dynamic update. This tells Hibernate to check the state of all entity fields before and after persisting and detect any change. Instead we could use OptimisticLockType.DIRTY, where an OptimisticLockException is thrown only if the same field has been changed by another unit of work before persisting. Which one to choose depends on the business requirements. Another option is to use OPTIMISTIC_FORCE_INCREMENT, which means that after a read we force an increment of the entity version…
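A minimal sketch of version-based optimistic locking, assuming javax.persistence packages (the Invoice entity is illustrative):

```java
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Version;

@Entity
public class Invoice {

    @Id
    private Long id;

    private String status;

    // Hibernate appends "where version = ?" to every UPDATE; if another
    // transaction bumped the version in the meantime, the update matches
    // no rows and an OptimisticLockException is raised.
    @Version
    private long version;
}
```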

Inheritance Types

1. Table per concrete class with implicit polymorphism. We create an abstract superclass and map it as @MappedSuperclass. The concrete classes extend this abstract class and this way we achieve implicit polymorphism. In the database, the two classes are two different tables with no actual relation between them.
2. Table per concrete class with unions. The superclass is mapped as a proper entity and we use @Inheritance(strategy = InheritanceType.TABLE_PER_CLASS). The concrete implementation classes extend this superclass, but again they are two separate tables with no relation between them at the database level. In Java code, though, they can share code.
3. Table per class hierarchy. The inheritance strategy is SINGLE_TABLE and both concrete classes are mapped to a single table in the database. The problem with this implementation is that this table may have many nullable columns. The hierarchy also needs a @DiscriminatorColumn to distinguish between the different classes mapped…
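A minimal sketch of the single-table strategy (option 3 above), assuming javax.persistence packages; the Payment hierarchy is illustrative:

```java
import javax.persistence.*;

// One table for the whole hierarchy; rows are told apart by the
// discriminator column, and subclass-specific columns are nullable.
@Entity
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
@DiscriminatorColumn(name = "payment_type")
public abstract class Payment {

    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    private Long id;
}

@Entity
@DiscriminatorValue("CARD")
class CardPayment extends Payment {
    private String cardNumber;
}

@Entity
@DiscriminatorValue("BANK_TRANSFER")
class BankTransferPayment extends Payment {
    private String iban;
}
```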

Dirty checking

Every time we read a value from the database, a copy is stored in memory and later used for dirty checking the entity. If we read too many values from the database without needing all of them, we may run into an OutOfMemoryError. This is why it is good practice to mark our session as read-only if we plan to read entities without modifying them. This tells Hibernate that once we are done there is no need for dirty checking, so no copy of the entity has to be kept in the persistence context cache. We need to be very careful with collections and dirty checking. Always use property (getter) access for collection mappings. Otherwise, dirty checking may consider the collection as modified, with the result that all associated entities are deleted and re-inserted.
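A minimal sketch of a read-only session, assuming Hibernate 5.2+ (the ReportDao class and User query are illustrative):

```java
import java.util.List;
import org.hibernate.Session;
import org.hibernate.SessionFactory;

public class ReportDao {

    private final SessionFactory sessionFactory;

    public ReportDao(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    public List<User> loadUsersForReport() {
        try (Session session = sessionFactory.openSession()) {
            // Entities loaded by this session are treated as read-only:
            // no snapshot copies are kept and dirty checking is skipped at flush.
            session.setDefaultReadOnly(true);
            return session.createQuery("from User", User.class).getResultList();
        }
    }
}
```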

ID generation strategies

When we choose an ID for our entities we have to decide between natural keys and surrogate keys. In the case of surrogate keys we have to define how the key is generated, and there are quite a few choices. We can use a UUID, or a @GeneratedValue. The @GeneratedValue should define either a strategy (IDENTITY, SEQUENCE, TABLE) or a generator. A generator is a custom implementation that we provide in our code and that assigns the ID to our entity before it is persisted in the database. The three strategies are not all supported by all RDBMSs, and they are not all equally performant. TABLE should in general be avoided unless we need special portability features. IDENTITY should also be avoided because of its severe performance penalties (it disables JDBC batch inserts), but we use it if our database does not support sequences; the one we favour is SEQUENCE. More details are given below: IDENTITY - auto generation. The ID is generated outside of the transaction, at database level. This means that in case of a rollback dis…
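A minimal sketch of the SEQUENCE strategy, assuming javax.persistence packages (the Shipment entity and sequence name are illustrative):

```java
import javax.persistence.*;

@Entity
public class Shipment {

    // SEQUENCE: Hibernate fetches blocks of IDs from a database sequence
    // (allocationSize), so inserts can still be JDBC-batched.
    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "shipment_seq")
    @SequenceGenerator(name = "shipment_seq", sequenceName = "shipment_seq", allocationSize = 50)
    private Long id;

    private String trackingCode;
}
```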

Dynamic Update and Dynamic Insert

Hibernate, by default, generates and caches the CRUD statements for all mapped entities on application start-up. These statements are not used when we run queries we have written ourselves, but when we make changes to an entity in the persistence context. The problem is that these auto-generated statements update or insert all the properties of our entities, even the unmodified ones. This is not very performant and in some cases it may not even be correct. Also, when a new row is persisted in the database, the fields which have not been initialised are set to NULL by Hibernate. To disable this behaviour we set these two configurations, dynamic update and dynamic insert, to true. With them enabled, every time we update or insert an entity, Hibernate generates a statement at runtime containing only the fields we are actually updating (or, for inserts, only the non-null fields). Usually these settings go hand in hand with another Hibernate configuration, @SelectBeforeUpdate. It affects performance…
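A minimal sketch of both settings on an entity (the Customer entity and its fields are illustrative):

```java
import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.annotations.DynamicInsert;
import org.hibernate.annotations.DynamicUpdate;

@Entity
@DynamicUpdate // UPDATE lists only the columns whose values actually changed
@DynamicInsert // INSERT lists only the columns with non-null values
public class Customer {

    @Id
    private Long id;

    private String name;
    private String email;
    private String notes;
}
```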

Performance of Collection mappings

One of the things we need to take into account when mapping a @OneToMany association is performance. By default this is a lazy loaded collection, and only if we need the values will there be a second hit to the database to retrieve them. But we can avoid some hits to the database if we choose the collection type wisely. For example, if we map the collection as a Set, then every time we add a new element to it, Hibernate needs to retrieve the existing elements associated with our persistent object to validate that the newly added element is unique. If we had mapped it as a List, we could add the new element and Hibernate would not retrieve the previously added elements, because a List allows duplicates. Sets are still the preferred collection mapping if the above scenario is not that common in our code and duplicates should not be allowed. Especially for certain Hibernate versions we need to be extra careful in using…
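A minimal sketch of the List variant, assuming javax.persistence packages; the Author/Book names and the addBook helper are illustrative. For an inverse (mappedBy) bag, Hibernate can queue the addition without initialising the collection:

```java
import java.util.ArrayList;
import java.util.List;
import javax.persistence.*;

@Entity
public class Author {

    @Id
    private Long id;

    // A List without @OrderColumn is a "bag": duplicates are allowed, so
    // adding a Book does not force Hibernate to load the existing elements
    // for a uniqueness check, unlike a Set mapping.
    @OneToMany(mappedBy = "author", cascade = CascadeType.ALL)
    private List<Book> books = new ArrayList<>();

    public void addBook(Book book) {
        books.add(book);
        book.setAuthor(this);
    }
}
```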