I have a unique index in my postgres DB within a table participations like this
"participation_uniqueness" UNIQUE, btree (event_id, firstname, lastname, email)
That works great, except that if one of the fields is NULL, it is basically ignored:
event_id | firstname | lastname | email
1982     | John      | Doe      | [email protected]
1982     | Hans      | Doe      | [email protected]
1982     | null      | null     | [email protected]
1982     | null      | null     | [email protected]
I want the uniqueness constraint to apply even when a value is NULL, so the last record in the example table above should NOT be possible, because it violates the unique constraint.
In Rails I could simply add this as a validation method to my model
validates_uniqueness_of :email, scope: [:event_id, :firstname, :lastname]
I am wondering why this validation considers NULL for uniqueness checks while SQL does not.
Researching this topic, I understand that this is SQL-standard behaviour: NULL is interpreted as "missing information", and you cannot compare one piece of missing information to another.
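You can see this behaviour directly in SQL: comparing NULL to NULL yields NULL ("unknown"), not TRUE, which is why a standard unique index treats two all-NULL rows as distinct. A quick illustration against the participations table from above (assuming a plain psql session):

```sql
-- NULL compared to NULL is not TRUE; it is NULL ("unknown")
SELECT NULL = NULL;    -- returns NULL, not TRUE
SELECT NULL IS NULL;   -- returns TRUE (IS NULL is the special-case test)

-- Hence both of these inserts pass the unique index check, because the
-- row (1982, NULL, NULL, '[email protected]') never compares "equal" to itself
INSERT INTO participations (event_id, firstname, lastname, email)
VALUES (1982, NULL, NULL, '[email protected]');
INSERT INTO participations (event_id, firstname, lastname, email)
VALUES (1982, NULL, NULL, '[email protected]');  -- also succeeds
```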
However, from a logical point of view this makes no sense to me. I want DB inserts to fail in this case. How can I achieve that without implementing dirty workarounds like the ones posted here, for example: https://www.pgrs.net/2008/01/11/postgresql-allows-duplicate-nulls-in-unique-columns/
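For what it's worth, there are now non-workaround options on the database side. PostgreSQL 15+ supports this directly via NULLS NOT DISTINCT; on older versions, a unique expression index over COALESCE'd columns achieves the same effect. A sketch (the two statements are alternatives, not meant to coexist; the '' sentinel is my assumption and must be a value that cannot occur in real data):

```sql
-- PostgreSQL 15+: treat NULLs as equal for uniqueness purposes
CREATE UNIQUE INDEX participation_uniqueness
  ON participations (event_id, firstname, lastname, email)
  NULLS NOT DISTINCT;

-- Older PostgreSQL: unique expression index mapping NULL to a sentinel.
-- Assumes '' never appears as a legitimate firstname/lastname value.
CREATE UNIQUE INDEX participation_uniqueness
  ON participations (event_id, COALESCE(firstname, ''), COALESCE(lastname, ''), email);
```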
Why should only one person be allowed to have no name? Should a person be allowed to exist without a name? If the answer is "yes", it should be "yes" for every person (ergo: your data definition/model does not match reality).