in reply to VVP asks Unique field data?

There are a couple of ways to do this. If you want to disallow duplicates of the first field in general, then you should create a unique index on that column. This will prevent the following situation:

script invocation 1: insert dog|brown|fuzzy and cat|white|evil
script invocation 2: insert dog|grey|lucky and cat|grey|still evil

If you try that with the unique index in place, you will get an error similar to the following: "attempt to insert duplicate key row".
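The exact syntax varies by database, but the index described above might look like this (the table and column names here are just made up for illustration):

```sql
-- hypothetical table matching the dog/cat example
CREATE TABLE pets (name VARCHAR(20), color VARCHAR(20), temperament VARCHAR(20));
CREATE UNIQUE INDEX pets_name_uq ON pets (name);

INSERT INTO pets VALUES ('dog', 'brown', 'fuzzy');  -- succeeds
INSERT INTO pets VALUES ('dog', 'grey', 'lucky');   -- rejected: duplicate key
```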

Replies are listed 'Best First'.
Re: Re: VVP asks Unique field data?
by vivekvp (Acolyte) on Feb 08, 2002 at 19:36 UTC
    That is what I am trying to do - load a file with duplicates in the first field... I only want to load the first of 6 records that are duplicated, then move on to the next group. The data looks like this:

        CH CO N 303
        CH CO Y 303
        CH LA N 303
        CH LA Y 303
        CH OT N 303
        CH OT Y 303
        CHAA CO N 303
        CHAA CO Y 303
        CHAA LA N 303
        CHAA LA Y 303
        CHAA OT N 303
        CHAA OT Y 303
        CHAB CO N 303
        CHAB CO Y 303
        CHAB LA N 303
        CHAB LA Y 303
        CHAB OT N 303
        CHAB OT Y 303

    And it goes on for 6000 records... any help? Thanks, He who laughs last, doesn't get the joke.
      Hmmm... that is a bit trickier, as there is not an easy way to do that with SQL alone (you could write a trigger, but that is left as an exercise for the reader). What I might do in this situation is build a hash and keep track of the number of times the first field has been seen. Some code (assume |-delimited fields):
      #!/usr/bin/perl
      use strict;
      use warnings;

      open(FILE, "myfile") or die "Couldn't open myfile: $!";
      my %first_record_hash;
      while (<FILE>) {
          chomp;
          # split on a literal pipe: a bare '|' in the pattern is regex
          # alternation and would split between every character
          my @array = split /\|/;
          if (($first_record_hash{$array[0]} || 0) < 6) {
              # insert record into database here
              $first_record_hash{$array[0]}++;
          }
      }
      close FILE;
      I think that'll do the trick for you.
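      Note that the `< 6` test loads up to six records per key. If the goal is strictly the first record of each group, it is enough to test whether the key has been seen at all. A small self-contained sketch of that variation (the sample data is lifted from the question; the space-separated split and the `@loaded` array standing in for the database insert are my assumptions):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# a small sample of the space-separated data from the question
my @records = (
    'CH CO N 303',   'CH CO Y 303',   'CH LA N 303',
    'CHAA CO N 303', 'CHAA CO Y 303', 'CHAA LA N 303',
);

my %seen;
my @loaded;
for my $line (@records) {
    my ($key) = split ' ', $line;   # first whitespace-separated field
    next if $seen{$key}++;          # skip every record after the first per key
    push @loaded, $line;            # stand-in for the database insert
}

print "$_\n" for @loaded;           # prints CH CO N 303 and CHAA CO N 303
```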
        Thank you very much Thor!