
I have a column defined like:

data_url character varying(32768) NOT NULL

and UNIQUE constraint on that column:

CONSTRAINT unique_data_url UNIQUE (data_url)

When a long value is inserted into the table, the following error appears:

ERROR:  index row requires 32584 bytes, maximum size is 8191

How can PostgreSQL be configured to index values larger than 8191 bytes? Space and speed are not a concern; the table is rarely changed and holds at most a few hundred rows.

Environment: PostgreSQL 9.3.6, Fedora 20 x64
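
For reference, a minimal sketch that should reproduce the error, assuming a bare table named mytable and filling the column with incompressible text so the index row cannot be compressed below the limit:

CREATE TABLE mytable (
    data_url character varying(32768) NOT NULL,
    CONSTRAINT unique_data_url UNIQUE (data_url)
);

-- roughly 32 kB of incompressible text exceeds the 8191-byte index row limit
INSERT INTO mytable (data_url)
SELECT string_agg(md5(random()::text), '')
FROM generate_series(1, 1000);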


2 Answers


As suggested by @Josh Kupershmidt and @JoeNahmias, the solution is to enforce uniqueness on the md5 hash of the long value. However, PostgreSQL 9.3 doesn't support expressions in UNIQUE constraints, so a unique index, which does support expressions, has to be used instead:

create unique index unique_data_url_index on mytable (md5(data_url));
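
For example (using the mytable table and data_url column from the question), inserting the same URL twice is now rejected:

insert into mytable (data_url) values ('https://example.com/a');
-- a second insert of the same value fails:
insert into mytable (data_url) values ('https://example.com/a');
-- ERROR:  duplicate key value violates unique constraint "unique_data_url_index"

Note that the original unique_data_url constraint still has to be dropped, otherwise its own index keeps hitting the size limit for long values.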

PostgreSQL builds an index to enforce the UNIQUE constraint, and an index row cannot be that large. As @josh-kupershmidt suggested, create the constraint on a hash of the field and you should be all right, barring hash collisions.
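
As a quick illustration of why this works (not part of the original answer): md5() collapses any input to a 32-character hex string, so the index row stays far below the 8191-byte limit:

select length(md5(repeat('x', 40000)));  -- returns 32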
