Comments on XploreDotNet: How To Delete A Duplicate Records From Table (5 comments)

Zehra Nasif (2009-01-02 21:57):
This will delete all records from the table MyTable that share the same value for the field dupField, keeping only the record with the lowest value in uniqueField:

DELETE T1
FROM MyTable T1, MyTable T2
WHERE T1.dupField = T2.dupField
  AND T1.uniqueField > T2.uniqueField

Anonymous (2009-01-01 12:56):
Hi Michael, if you have any idea how to avoid duplicate records while doing a bulk insert, please share it with us.

Anonymous (2009-01-01 02:37):
Perhaps I'm repeating what Joonas said, but again: such duplicates shouldn't be allowed to exist in the first place, and they should be handled when you do the bulk insert.

windows_mss (2008-12-31 19:42):
Hi Joonas, thanks for your comments. Consider this scenario: a bulk-insert job imports data from a text file, and the job has to run twice in a day, say once in the morning and once in the evening. The same file is then imported twice in a single day, so there is a chance of duplicate records ending up in the table. I wrote this article with that scenario in mind. If you have a better way to overcome this situation, please share it with us.

Anonymous (2008-12-31 04:19):
Wouldn't it make _much_ more sense to use a schema that does not allow you to corrupt your database? I can't see any reason for deleting duplicates; if you actually need them for some reason, then you have a schema that allows them on purpose.
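The self-join delete discussed in the thread can be demonstrated end to end. A minimal sketch using Python's sqlite3 module (table and column names MyTable, dupField, uniqueField follow the first comment; SQLite's DELETE does not accept a joined FROM clause like T-SQL, so a correlated EXISTS subquery expresses the same rule: drop every row for which another row with the same dupField and a smaller uniqueField exists):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE MyTable (uniqueField INTEGER PRIMARY KEY, dupField TEXT)")
conn.executemany(
    "INSERT INTO MyTable VALUES (?, ?)",
    [(1, "a"), (2, "a"), (3, "b"), (4, "b"), (5, "c")],  # rows 2 and 4 are duplicates
)

# SQLite equivalent of the T-SQL self-join delete: remove every row that has
# a sibling with the same dupField but a smaller uniqueField.
conn.execute("""
    DELETE FROM MyTable
    WHERE EXISTS (
        SELECT 1 FROM MyTable AS t2
        WHERE t2.dupField = MyTable.dupField
          AND t2.uniqueField < MyTable.uniqueField
    )
""")

rows = conn.execute(
    "SELECT uniqueField, dupField FROM MyTable ORDER BY uniqueField"
).fetchall()
print(rows)  # [(1, 'a'), (3, 'b'), (5, 'c')]
```

Only the lowest uniqueField per dupField survives, matching the behaviour the commenter describes.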
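Several commenters argue for preventing the duplicates at insert time instead of cleaning them up afterwards. A minimal sketch of that idea, again with SQLite (the table Import and its columns are hypothetical): a UNIQUE constraint makes the schema itself reject repeats, and INSERT OR IGNORE lets a re-run of the same import file pass harmlessly. In SQL Server the analogous pattern would be a unique index, optionally with IGNORE_DUP_KEY, or an INSERT ... WHERE NOT EXISTS guard:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The UNIQUE constraint is the schema-level guard the commenters ask for.
conn.execute("CREATE TABLE Import (dupField TEXT UNIQUE, payload TEXT)")

batch = [("a", "row 1"), ("b", "row 2")]

# Simulate the blog author's scenario: the same file imported twice in one day.
for _ in range(2):
    # OR IGNORE silently skips rows that would violate the UNIQUE constraint.
    conn.executemany("INSERT OR IGNORE INTO Import VALUES (?, ?)", batch)

count = conn.execute("SELECT COUNT(*) FROM Import").fetchone()[0]
print(count)  # 2 -- the second import inserted nothing
```

With this in place there are never any duplicates to delete, which addresses the twice-a-day import scenario directly.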