On 25.05.2005 at 13:58:07 -0300, lucas@presserv.org wrote:
> Hi.
> Thanks for the article...
> But, I have read it and the query works very slow...
> My table has approx. 180,000 records (correct), but the entire table has
> approx. 360,000 records (duplicated)...
How often is this necessary?
> Is there a way to delete those duplicated records faster??? Remember the
> table has approx. 360,000 records...
I don't know, but I think you should prevent duplicate records in the
future, and do the cleanup (delete the duplicates) now.
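For the cleanup itself, one common PostgreSQL approach is to use the system column ctid to keep exactly one copy of each duplicated row. This is only a sketch: the table name "blub" and key column "id" are illustrative, since the original schema was not posted.

```sql
-- Sketch only: assumes a table "blub" whose rows are duplicated on "id".
-- ctid is PostgreSQL's physical row address; among rows with the same id
-- we keep the one with the lowest ctid and delete the rest.
DELETE FROM blub b1
WHERE EXISTS (
    SELECT 1
    FROM blub b2
    WHERE b2.id = b1.id
      AND b2.ctid < b1.ctid
);
```

Note that without an index on the key column this self-join scans the table repeatedly, which on ~360,000 rows can be slow; creating a plain index on "id" first usually helps.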
Btw.: you wrote there is a primary key on the first column. Really?
,----[ psql session; server messages translated from German ]
| test_db=# create table blub (id int primary key, name varchar);
| NOTICE: CREATE TABLE / PRIMARY KEY will create implicit index "blub_pkey" for table "blub"
| CREATE TABLE
| test_db=# insert into blub values (1, 'x');
| INSERT 970706 1
| test_db=# insert into blub values (1, 'y');
| ERROR: duplicate key violates unique constraint "blub_pkey"
`----
In other words: if there is a primary key on the first column, you cannot
insert duplicates.
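Once the duplicates are removed, the same constraint can be added to the existing table so the problem cannot recur. Again, the table and column names are only illustrative, matching the example above:

```sql
-- After deduplication: enforce uniqueness from now on.
-- Fails if any duplicate values of "id" remain in "blub".
ALTER TABLE blub ADD PRIMARY KEY (id);
```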
Regards, Andreas
--
Andreas Kretschmer (Kontakt: siehe Header)
Heynitz: 035242/47212, D1: 0160/7141639
GnuPG-ID 0x3FFF606C http://wwwkeys.de.pgp.net