I personally would handle the duplicates when you save the records from the datawindow to the database. This will ensure that if they load the same file a second time, the records won't be inserted into the database twice. Are you using SQL Server for your back end? If so, use a stored procedure to insert the records, checking for dupes.
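A minimal sketch of such a stored procedure, assuming a hypothetical Customer table with a cust_id key (table and column names are placeholders; adjust to your schema):

```sql
-- Hypothetical table/column names; adjust to your schema.
CREATE PROCEDURE sp_insert_customer
    @cust_id   int,
    @cust_name varchar(50)
AS
BEGIN
    -- Only insert when the key is not already present,
    -- so reloading the same file is harmless.
    IF NOT EXISTS (SELECT 1 FROM Customer WHERE cust_id = @cust_id)
        INSERT INTO Customer (cust_id, cust_name)
        VALUES (@cust_id, @cust_name)
END
```

You can then point the datawindow's update properties at the procedure (or call it yourself row by row) instead of letting duplicate key errors fail the whole save.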
You're quite a ways off here... your code is going to look something like the following.
In the clicked event of a button
//This snippet of code will provide you with a window to
//select the text file that you wish to load.
//It will save the file name in the variable docname.
string docname, named, ls_one_line
integer value, i_ret, i_ret_one
value = GetFileOpenName("Select File", &
	docname, named, "DOC", &
	"Text Files (*.TXT),*.TXT," + &
	"Doc Files (*.DOC),*.DOC")

if value <=0 then return
//Two options now... pull the entire file into the
//datawindow and handle dupes in your insert into the
//database...using the following
i_ret=dw_customer.ImportFile(docname)
If i_ret <= 0 Then
	MessageBox("ERROR", "Error importing file " + docname, Exclamation!)
	return
ELSE
	MessageBox("SUCCESS", "File import complete!", Exclamation!)
END IF
//The second option is to open the file and read
//individual lines... more difficult in my opinion, but it
//has its uses in other, less standard imports.
i_ret = FileOpen(docname, LineMode!, Read!, LockRead!)
if i_ret = -1 then return
//See the help file for FileRead...
//FileRead returns -100 when the end of the file is reached
DO WHILE FileRead(i_ret, ls_one_line) <> -100
	i_ret_one = dw_customer.ImportString(ls_one_line)
	//Error check return codes...
	// -1  startrow value supplied is greater than the
	//     number of rows in the string
	// -3  Invalid argument
	// -4  Invalid input
	// -9  PowerBuilder or the user canceled import
	//     because data failed validation
LOOP
FileClose(i_ret)