Thursday, December 25, 2008

How To Delete Duplicate Records From A Table

Query To Find The Duplicate Records

SELECT ColumnName1, ColumnName2, ColumnNameN, COUNT(*) AS UserDefinedColName
FROM TableName
GROUP BY ColumnName1, ColumnName2, ColumnNameN
HAVING COUNT(*) > 1

Query To Delete Duplicate Records

-- Copy one row per distinct combination into a temp table,
-- empty the original table, then reload it without duplicates.
SELECT DISTINCT * INTO #tmpTableName FROM TableName
TRUNCATE TABLE TableName
INSERT INTO TableName SELECT * FROM #tmpTableName
DROP TABLE #tmpTableName
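
On SQL Server 2005 and later, a sketch using ROW_NUMBER() can delete the duplicates in place without truncating the table; the column names below are the same placeholders used above, and the PARTITION BY list is assumed to define what counts as a duplicate:

```sql
-- Number the rows within each duplicate group, then delete
-- every row except the first one in each group.
;WITH Ranked AS
(
    SELECT ROW_NUMBER() OVER
           (PARTITION BY ColumnName1, ColumnName2, ColumnNameN
            ORDER BY ColumnName1) AS RowNum
    FROM TableName
)
DELETE FROM Ranked
WHERE RowNum > 1
```

Deleting through the CTE removes the rows from the underlying table, so the original data and any constraints on TableName stay untouched.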



5 comments:

joonas said...

Wouldn't it make _much_ more sense to use a schema that does not allow you to corrupt your database?

I can't see any reason for deleting duplicates unless you need them for some reason (in which case you have a schema that allows them).

windows_mss said...

hi joonas,

thanks for your comments. Think of this scenario: during a bulk insert that imports data from a text file, suppose the job has to run twice in a day, say once in the daytime and again in the evening, so the same file gets imported twice on the same day. That leaves a chance of duplicate records in the table. It was with that scenario in mind that I posted this article. If you have a better way to overcome this situation, please share it with us.
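
One way to cover that scenario is to load the file into a staging table first and insert only the rows that are not already present. The sketch below assumes a #Staging temp table with the same columns as TableName has already been created; the file path and delimiters are examples only:

```sql
-- Load the text file into the staging table (path and
-- terminators are placeholders for illustration).
BULK INSERT #Staging
FROM 'C:\data\import.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

-- Insert only rows that do not already exist in TableName.
INSERT INTO TableName (ColumnName1, ColumnName2, ColumnNameN)
SELECT DISTINCT s.ColumnName1, s.ColumnName2, s.ColumnNameN
FROM #Staging s
WHERE NOT EXISTS (SELECT 1
                  FROM TableName t
                  WHERE t.ColumnName1 = s.ColumnName1
                    AND t.ColumnName2 = s.ColumnName2
                    AND t.ColumnNameN = s.ColumnNameN)
```

With this pattern, running the import job twice on the same file simply inserts nothing the second time.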

Michael said...

Perhaps I'm repeating what Joonas said, but again there shouldn't be such duplicates allowed to begin with and should be handled when you do the bulk insert.

Anonymous said...

hi michael,

if you have any idea how to avoid duplicate records while working with bulk insert, please share it with us

Zehra Nasif said...

This will delete all records from the table MyTable that have the same value for the field dupField, keeping the one record that has the lowest value in uniqueField.
DELETE T1
FROM MyTable T1
JOIN MyTable T2
  ON T1.dupField = T2.dupField
 AND T1.uniqueField > T2.uniqueField


Disclaimers: We have tried hard to provide accurate information. As a user, you agree that you bear sole responsibility for your own decisions to use any programs, documents, source code, tips, articles or any other information provided on this Blog.