I essentially have some objects in this configuration (the real data model is a bit more complex):
A has a many-to-many relationship with B. (B has inverse="true")
B has a many-to-one relationship with C. (I have cascade set to "save-update")
C is a kind of type/category table.
Also, I should probably mention that the primary keys are generated by the database on save.
With my data, I sometimes run into problems where A has a set of different B objects, and these B objects refer to the same C object.
When I call session.saveOrUpdate(myAObject), I get a Hibernate error saying: "a different object with the same identifier value was already associated with the session: C". I know that Hibernate can't insert/update/delete the same object twice in the same session, but is there some way around this? This doesn't seem like it would be that uncommon of a situation.
During my research of this problem, I have seen people suggest the use of session.merge(), but when I do that, any "conflicting" objects get inserted into the database as blank objects with all values set to null. Clearly that isn't what we want.
[Edit] Another thing I forgot to mention is that (for architectural reasons beyond my control), each read or write needs to be done in a separate session.
Most probably it's because the B objects are not referring to the same Java instance of C. They refer to the same row in the database (i.e. the same primary key), but they are different copies of it.
The Hibernate session, which manages the entities, keeps track of which Java object corresponds to the row with each primary key.
One option is to make sure that the B objects which refer to the same row actually refer to the same instance of C. Alternatively, turn off cascading for that member variable; that way, when B is persisted, C is not. You will then have to save C manually and separately, but if C is a type/category table, it probably makes sense to do it that way.
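The session's first-level cache behaves roughly like an identity map keyed by (entity name, primary key): one managed instance per key. The following is a minimal plain-Java sketch of that mechanism (the SessionLike class is hypothetical, not Hibernate API) showing why two distinct copies of the same C row trigger the error:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for the session's first-level cache.
// Hibernate keeps one managed instance per (entity, id) pair; trying to
// associate a *different* instance under the same key fails.
class SessionLike {
    private final Map<String, Object> identityMap = new HashMap<>();

    void saveOrUpdate(String entityName, long id, Object instance) {
        String key = entityName + "#" + id;
        Object managed = identityMap.get(key);
        if (managed != null && managed != instance) {
            throw new IllegalStateException(
                "a different object with the same identifier value "
                + "was already associated with the session: " + entityName);
        }
        identityMap.put(key, instance);
    }
}
```

If two B objects each carry their own copy of C with id 1, the first saveOrUpdate registers that copy and the cascade from the second B fails; reusing a single C instance for both B objects avoids the conflict.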
Just set cascade to MERGE; that should do the trick.
You only need to do one thing: run session_object.clear() and then save the new object. This will clear the session (as aptly named) and remove the offending duplicate object from your session.
What is the session_object that has the clear() method?
session.clear() also removes all other (non-offending) objects from the session, so this is not a good option.
I agree with @Hemant Kumar, thank you very much. Following his solution, I solved my problem.
For example:
@Test
public void testSavePerson() {
    try (Session session = sessionFactory.openSession()) {
        Transaction tx = session.beginTransaction();
        Person person1 = new Person();
        Person person2 = new Person();
        person1.setName("222");
        person2.setName("111");
        session.save(person1);
        session.save(person2);
        tx.commit();
    }
}
Person.java
public class Person {
    private int id;
    private String name;

    @Id
    @Column(name = "id")
    public int getId() {
        return id;
    }

    public void setId(int id) {
        this.id = id;
    }

    @Basic
    @Column(name = "name")
    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
This code always raised the error in my application: A different object with the same identifier value was already associated with the session. Later I found out that I had forgotten to make my primary key auto-increment!
My solution was to add this annotation to the primary key:
@GeneratedValue(strategy = GenerationType.AUTO)
What about @EmbeddedId?
This means you are trying to save multiple rows in your table with a reference to the same object.
Check your entity class's id property. Change:
@Id
private Integer id;
to
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(unique = true, nullable = false)
private Integer id;
Transfer the task of assigning the object ID from Hibernate to the database by using:
<generator class="native"/>
This solved the problem for me.
One way to solve the above problem is to override hashCode().
Also flush the hibernate session before and after save.
getHibernateTemplate().flush();
Explicitly setting the detached object to null also helps.
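If you do override hashCode() (and equals() with it), base them on a stable business key rather than the database-generated id, which is still 0/null before the first save. A minimal sketch, assuming a hypothetical entity with a natural-key field named code:

```java
import java.util.Objects;

// Sketch of an entity whose equality is based on a natural key ("code")
// instead of the database-generated id, which is 0 before the first save.
class Category {
    private int id;        // assigned by the database on insert
    private final String code; // stable business key

    Category(String code) {
        this.code = code;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Category)) return false;
        return Objects.equals(code, ((Category) o).code);
    }

    @Override
    public int hashCode() {
        return Objects.hash(code);
    }
}
```

With this, two unsaved instances representing the same logical row compare equal even though neither has a database id yet.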
Add the annotation @GeneratedValue to the bean you are inserting.
Just came across this message, but in C# code. Not sure if it's relevant (it is exactly the same error message, though).
I was debugging the code with breakpoints and expanded some collections through private members while debugger was at a breakpoint. Having re-run the code without digging through structures made the error message go away. It seems like the act of looking into private lazy-loaded collections has made NHibernate load things that were not supposed to be loaded at that time (because they were in private members).
The code itself is wrapped in a fairly complicated transaction that can update large number of records and many dependencies as part of that transaction (import process).
Hopefully a clue to anyone else who comes across the issue.
Find the "cascade" attribute in your Hibernate mapping and remove it. When cascade is enabled, it triggers the other operations (save, update, and delete) on related entities, so the same-identifier error can occur. Removing it worked for me.
I had this error a few days ago and spent too much time fixing it.
public boolean save(OrderHeader header) {
    Session session = sessionFactory.openSession();
    Transaction transaction = session.beginTransaction();
    try {
        session.save(header);
        for (OrderDetail detail : header.getDetails()) {
            session.save(detail);
        }
        transaction.commit();
        return true;
    } catch (HibernateException exception) {
        exception.printStackTrace();
        transaction.rollback();
        return false;
    } finally {
        session.close();
    }
}
Before I got this error, I had not specified an ID generation type on the OrderDetail object. Without one, the id stays 0 for every OrderDetail object. This is what #jbx explained; yes, it is the best answer, and this is one example of how it happens.
Try placing one query before the other. That fixed my problem. E.g. change this:
query1
query2 - gets the error
update
to this:
query2
query1
update
In my case, flush() alone did not work. I had to call clear() after flush().
public Object merge(final Object detachedInstance)
{
    this.getHibernateTemplate().flush();
    this.getHibernateTemplate().clear();
    try
    {
        this.getHibernateTemplate().evict(detachedInstance);
    }
    catch (DataAccessException e)
    {
        // the instance was not associated with this session; nothing to evict
    }
    return this.getHibernateTemplate().merge(detachedInstance);
}
Another case in which the same error message can be generated: a custom allocationSize:
@Id
@Column(name = "idpar")
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "paramsSequence")
@SequenceGenerator(name = "paramsSequence", sequenceName = "par_idpar_seq", allocationSize = 20)
private Long id;
without a matching
alter sequence par_idpar_seq increment 20;
can cause a constraint violation during insert (that one is easy to understand) or this "a different object with the same identifier value was already associated with the session" error; this case was less obvious.
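A rough model of why the mismatch produces duplicate ids: a pooled-style optimizer treats each value pulled from the database sequence as the top of a block of allocationSize ids it may hand out locally. If the sequence only increments by 1, successive blocks overlap. This is a simplified simulation of that idea, not Hibernate's actual optimizer code:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified model of a pooled-style id optimizer: each value pulled from
// the database sequence is treated as the top of a block of allocationSize
// ids that the application may assign without going back to the database.
class PooledIdModel {
    // Returns the block of ids implied by one sequence value.
    static List<Long> block(long sequenceValue, int allocationSize) {
        List<Long> ids = new ArrayList<>();
        for (long id = sequenceValue - allocationSize + 1; id <= sequenceValue; id++) {
            ids.add(id);
        }
        return ids;
    }
}
```

With allocationSize = 20, sequence values 20 and 21 (increment 1) imply the overlapping blocks 1..20 and 2..21, so two inserts can end up with the same id; values 20 and 40 (increment 20) imply the disjoint blocks 1..20 and 21..40.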
You might not be setting the identifier of the object before calling the update query.
I hit the problem because the primary key generation was wrong when I inserted rows like this:
public void addTerminal(String typeOfDevice, Map<Byte, Integer> map) {
    try {
        Set<Byte> keySet = map.keySet();
        for (Byte byte1 : keySet) {
            Device device = new Device();
            device.setNumDevice(DeviceCount.map.get(byte1));
            device.setTimestamp(System.currentTimeMillis());
            device.setTypeDevice(byte1);
            this.getHibernateTemplate().save(device);
        }
        System.out.println("hah");
    } catch (Exception e) {
        logger.warn("wrong");
        logger.warn(e.getStackTrace() + e.getMessage());
    }
}
I changed the id generator class to identity:
<id name="id" type="int">
    <column name="id" />
    <generator class="identity" />
</id>
If you use an EntityRepository, use saveAndFlush() instead of save().
I left an Expressions tab open in my IDE which was making a Hibernate get call on the object, causing this exception while I was trying to delete that same object. I also had a breakpoint on the delete call, which seems to be necessary for the error to happen. Simply bringing a different Expressions tab to the front, or changing the settings so that the IDE does not stop on breakpoints, solved the problem.
Make sure your entity has the same generation type as all mapped entities.
Ex : UserRole
public class UserRole extends AbstractDomain {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;
    private String longName;
    private String shortName;
    @Enumerated(EnumType.STRING)
    private CommonStatus status;
    private String roleCode;
    private Long level;
    @Column(columnDefinition = "integer default 0")
    private Integer subRoleCount;
    private String modification;
    @ManyToOne(fetch = FetchType.LAZY)
    private TypeOfUsers licenseType;
}
Modules:
public class Modules implements Serializable {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;
    private String longName;
    private String shortName;
}
Main entity with the mapping:
public class RoleModules implements Serializable {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;
    @ManyToOne(fetch = FetchType.LAZY, cascade = CascadeType.MERGE)
    private UserRole role;
    @ManyToOne(fetch = FetchType.LAZY, cascade = CascadeType.MERGE)
    private Modules modules;
    @Type(type = "yes_no")
    private boolean isPrimaryModule;

    public boolean getIsPrimaryModule() {
        return isPrimaryModule;
    }
}
In addition to all the previous answers, a possible fix for this problem in a large-scale project: if you're using a Value Object for your classes, don't set the id attribute in the VO transformer class.
The reason for this issue is that you have different copies of objects referring to the same row in your child table, so Spring tries to treat your object as a new object, but while saving it finds a row with the same primary key, which produces the error above.
The best solution is to load the whole object (the parent entity with its child entities) from the DB (you already know the primary key of the parent object), copy the values from your new object (the one you were trying to save) into the loaded object, and then save the loaded object, which now carries the new values.
This updates the values in the DB without the error above.
PS: There is no need to update the ids, as they already exist in the object loaded from the DB; update only the values that need to change.
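The load-then-copy step can be sketched as follows. The Person entity and the applyChanges helper are hypothetical; in real code, "loaded" would come from something like session.get(Person.class, id), and you would save the loaded instance afterwards:

```java
// Sketch of the "load, copy values, save the loaded object" pattern.
// Person stands in for any entity; only mutable fields are copied,
// and the id of the loaded (managed) instance is deliberately untouched.
class Person {
    int id;      // primary key, already set on the loaded instance
    String name;

    Person(int id, String name) {
        this.id = id;
        this.name = name;
    }
}

class UpdateHelper {
    // Copy every mutable field from the detached object onto the loaded one,
    // then return the loaded object so it can be saved.
    static Person applyChanges(Person loaded, Person incoming) {
        loaded.name = incoming.name;
        return loaded;
    }
}
```

Saving the returned object is safe because it is the very instance the session already tracks, so no second copy with the same identifier ever enters the session.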
Another way to solve this if you are using Spring Data:
replace calls to entityManager.persist() with calls to repository.save(), and
replace calls to entityManager.createQuery(...).getResultList() etc. with calls to repository.findBy...
This way, Spring Data keeps track of the objects, enabling multiple get and persist calls.
Instead of just @Id, try @Id @GeneratedValue(strategy = GenerationType.AUTO). It worked for me.
In my case, I had a OneToOne relationship, and after saving one row with a foreign key, an attempt to save another row with the same foreign key threw this exception. That meant the requirement was not actually a OneToOne relationship but a ManyToOne, so I changed it to ManyToOne and it started working.
This error commonly occurs because you are violating a unique or primary-key constraint by trying to insert duplicate data. Make sure the primary key is different for every entity; primary keys must be unique.
Just commit the current transaction:
currentSession.getTransaction().commit();
Now you can begin another transaction and do anything with the entity.