Late-breaking news: I found the problem. TBlobField defaults to Transliterate = True. I set it to False and it allowed the nulls to be written with no problem. Of course, you would think a blob field would default to that!!
Thanks to everyone who helped with this issue,
cvgalante...
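For anyone who finds this thread later, the fix boils down to one line. This is a minimal sketch, assuming the dbf1 table and TREEDATA memo field from my examples below (FormCreate is just a convenient place to set it):

```pascal
// Turn off character translation on the memo field before writing
// binary data. TBlobField.Transliterate defaults to True, which
// mangles embedded #0 (null) bytes in dBase memo fields.
procedure TForm1.FormCreate(Sender: TObject);
begin
  (dbf1.FieldByName('TREEDATA') as TBlobField).Transliterate := False;
end;
```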
If anyone is still watching this thread, I think the problem is nulls in the binary memo field (which have to be there - the original application is using the memo field as a very large array of structured records). I have used a TBlobField to SaveToFile the binary data and it is formatted...
Well, that's encouraging - sort of. :-) I really appreciate all your help. I still can't help but wonder if something is broken on the dBase side (Paradox stuff always seems to work better in Delphi), but if I learn anything new I'll send it along.
cvgalante
Well, DUH! I was trying to typecast it by suffixing it with .ASBLOB which, of course, did not work.
It does compile but the process still trashes the memo contents. Here is the procedure at this point:
procedure TForm1.ChangeStatusBufClick(Sender: TObject);
var
  buf: array[1..100000] of...
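For the record, the cast that finally compiled for me was an `as` cast on the TField, not a property suffix. A minimal sketch (field and table names from my examples; the file name is just for inspection):

```pascal
var
  bf: TBlobField;
begin
  // There is no Field.AsBlob property; cast the TField to TBlobField:
  bf := dbf1.FieldByName('TREEDATA') as TBlobField;
  bf.SaveToFile('treedata.bin');  // dump the raw memo bytes to a file
end;
```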
Thanks, but I have been through this and a dozen other variations. This doesn't compile and gives the message:
Incompatible types: 'TBlobField' and 'TField'
Some code I can get to compile but it doesn't write the data back to the memo field properly - at least the original application no...
Good question. I have changed it so many times and I cannot get a clean compile anyway. Here is one example. Assume a TTable named dbf1 pointing to a dBase file with a memo field named TREEDATA.
tsread,
tswrite: TBlobStream;
begin
  with dbf1 do begin
    Open...
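Here is roughly the shape I was aiming for, cleaned up. This is a sketch, not verified against my exact Delphi/BDE version; the key points are putting the dataset in Edit mode before the bmWrite stream and casting FieldByName to TBlobField:

```pascal
var
  ts: TBlobStream;
  buf: array of Byte;
begin
  with dbf1 do
  begin
    Open;
    Edit;  // dataset must be editing before a bmWrite blob stream
    // Read the whole memo into a buffer:
    ts := TBlobStream.Create(FieldByName('TREEDATA') as TBlobField, bmRead);
    try
      SetLength(buf, ts.Size);
      if ts.Size > 0 then
        ts.ReadBuffer(buf[0], ts.Size);
    finally
      ts.Free;
    end;
    // ... modify buf here ...
    // Write it back:
    ts := TBlobStream.Create(FieldByName('TREEDATA') as TBlobField, bmWrite);
    try
      if Length(buf) > 0 then
        ts.WriteBuffer(buf[0], Length(buf));
    finally
      ts.Free;
    end;
    Post;
  end;
end;
```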
I've been trying to modify a binary memo field in a dBase file using a variety of methods and can't get any to work. I was trying the BlobStream methods as in the Help examples and they don't run as written. I get errors when I try to create a TBlobStream variable using the TTable...
Well, I didn't because it's really ugly and I wouldn't recommend it. Plus, I did a lot of things and not all of them were relevant, but here are the things I think actually led to the solution:
When I started working on the problem, someone had managed to remove the database entry from the...
Thanks - actually, the files were not detached properly, so the single file attach still would not work. I ended up doing some fairly bizarre things to get this back up, but was finally able to.
CVG
Is there any way to recover an MSSQL 2000 database from just the .MDF file when the .LDF file has been irretrievably lost? Backups are not available. Don't even ask the questions I asked when confronted with this - you won't like the answers any more than I did! %-)
If not, there is a 3...
Hi Folks,
Am I missing something, or is there no straightforward way to import a delimited text file into a table (something any database product does with ease)? The documentation says the TTable type ASCII is delimited, but I have only found ways to set it up for fixed-width formats. I have used...
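In the meantime I have been hand-rolling it with TStringList. A sketch of the workaround, assuming a comma-delimited file (the file name, table, and field indexes are placeholders for your own):

```pascal
var
  lines, fields: TStringList;
  i: Integer;
begin
  lines := TStringList.Create;
  fields := TStringList.Create;
  try
    lines.LoadFromFile('import.txt');   // hypothetical input file
    for i := 0 to lines.Count - 1 do
    begin
      fields.CommaText := lines[i];     // split one line on commas/quotes
      dbf1.Append;
      dbf1.Fields[0].AsString := fields[0];
      dbf1.Fields[1].AsString := fields[1];
      dbf1.Post;
    end;
  finally
    fields.Free;
    lines.Free;
  end;
end;
```

Note that CommaText also treats quoted fields the way CSV does, which is handy if your data contains embedded commas.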
Check to make sure you don't have the compiler directive {$H-}, which turns
s : string;
into a 255-character ShortString declaration. The default is {$H+}, which makes it an AnsiString, as McMerfy shows above.
If you use stringlists, the process is automatic and the array is built for you:
ts := TStringList.Create;
ts.CommaText := '0;400;100,100;D;400,400;D;400';
// or ts.CommaText := YourStringVariable
Now you have:
ts[0] is '0;400;100'
ts[1] is '100;D;400'
...and so on.
Opp,
Thanks for the link. I will try this out. Actually, I don't need permanent drive mappings, just temporary while my process runs. The ability to supply a network user id and password to log into the remote server to begin with is what is crucial to me.
Again, thanks for the quick...
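Since I only need the connection while the process runs, the Win32 route I am looking at is WNetAddConnection2 with explicit credentials, then cancelling the connection afterwards. A sketch, assuming Windows and SysUtils are in the uses clause; the UNC path, user, and password are obviously placeholders:

```pascal
procedure DoRemoteWork;
var
  nr: TNetResource;
begin
  FillChar(nr, SizeOf(nr), 0);
  nr.dwType := RESOURCETYPE_DISK;
  nr.lpRemoteName := '\\server\share';  // placeholder: your UNC path
  nr.lpLocalName := nil;                // nil = no drive letter, UNC access only
  // Parameters are (resource, password, username, flags):
  if WNetAddConnection2(nr, 'password', 'DOMAIN\user', 0) <> NO_ERROR then
    raise Exception.Create('Could not connect to remote share');
  try
    // ... run the process against \\server\share ...
  finally
    WNetCancelConnection2('\\server\share', 0, True);  // True = force close
  end;
end;
```

Passing 0 for the flags (instead of CONNECT_UPDATE_PROFILE) keeps the mapping temporary, which is exactly what I want.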