Using cursor adaptor to update blob data in MySQL

DDPL
Sep 6, 2022
I have a MySQL database that I upload data to from a VFP database via a CursorAdapter. Everything has been working fine, but I now need a BLOB and a MEMO field in one of the tables. I didn't previously have any of those, so I was just doing something simple like this:

Code:
SELECT &oSource             && the VFP source table
SCAN
    SELECT &oSource
    SCATTER MEMVAR          && copy the current record's fields into memory variables
    SELECT &oTarget         && the CursorAdapter cursor
    APPEND BLANK
    GATHER MEMVAR           && write the variables into the new record
ENDSCAN

The code still works but obviously doesn't upload the BLOB or MEMO. Changing SCATTER MEMVAR and GATHER MEMVAR to include the MEMO clause throws an error when it updates a record in the table with the BLOB and MEMO fields, but it is still fine on the other tables that don't contain these field types.

In the MySQL table the VFP BLOB field is matched to a LONGBLOB and the MEMO field to a TEXT. I have "fetch memo fields" ticked in the CursorAdapter Builder.

The error tells me that there is an error in the SQL syntax. Any ideas?
 
Hi DDPL,

Check out the FILETOSTR() function, and also STRTOFILE(). You need them for BLOBs.
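For example, a minimal sketch of getting a file into and out of a Blob field (the table, field and file names here are only placeholders):

Code:
* Load a file's bytes into a Blob field of a local VFP table
USE mydocs IN 0
SELECT mydocs
APPEND BLANK
REPLACE dcontent WITH FILETOSTR("c:\temp\sample.pdf")    && file -> Blob field

* And the reverse: write the Blob content back out to disk
STRTOFILE(mydocs.dcontent, "c:\temp\sample_copy.pdf")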
 
Hi Dylim,

I used FILETOSTR() to get the data into the BLOB field in the VFP table. The issue I am getting is with copying the table record through the CursorAdapter into the MySQL table.
 
Have you added the fields to the UpdatableFieldList of the CA? The CursorSchema? Adding a field requires several changes in an existing CA.

Sometimes it can be the simplest to redo the CA (with the CA builder, for example).
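As a rough checklist, these are the CA properties that typically have to mention any new field; this is only a sketch, and all table and field names in it are placeholders:

Code:
* Wherever the CA is configured (the Init method or the builder-generated code):
THIS.SelectCmd          = "SELECT id, notes, dcontent FROM mytable"
THIS.CursorSchema       = "ID I, NOTES M, DCONTENT W"     && M = Memo, W = Blob
THIS.KeyFieldList       = "id"
THIS.UpdatableFieldList = "id, notes, dcontent"
THIS.UpdateNameList     = "id mytable.id, notes mytable.notes, dcontent mytable.dcontent"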

Edit: There's nothing so special about any field or data type that it needs special treatment. The "fetch memo fields" feature is not about whether memo fields (and blobs, generals, all alike) are fetched at all or not: when you untick it, it enables delayed fetching, it doesn't suppress fetching, and it doesn't explain why you'd get SQL syntax errors. With or without delayed memo/blob/general fetching, when you add a record and set the blob, or replace the blob value of an existing record originally fetched from MySQL (which holds either its old value or an empty one when delayed fetching is used), it'll count as a new or updated value and be stored back when you do the TABLEUPDATE that sends back all buffered changes.

Well, maybe you're not using the CA as intended at all; you write very little about your general approach, besides having used the builder to create an updatable CA, I think.

One major thing to know: even if you don't want to update existing data and only use the CA cursor to feed new data into MySQL, the first step of using a CA and letting it establish the mapping between MySQL and the cursor is to make a CursorFill call. In the default case this happens automatically with the Init of the CA; in other cases it's an explicit call to CursorFill. To avoid filling it with any existing MySQL data you can set or modify the SelectCmd of the CA to include a WHERE 0=1, for example, so that no record is fetched at all. Then you can fill the cursor in any way you want in buffered mode and finally send that data to MySQL with TABLEUPDATE.
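A minimal sketch of that pattern, assuming loCA is an already configured, updatable CursorAdapter (the table name is a placeholder):

Code:
loCA.SelectCmd = "SELECT * FROM mytable WHERE 0=1"   && fetch the structure, no rows
loCA.CursorFill()

SET MULTILOCKS ON                                    && required for table buffering
SELECT (loCA.Alias)
CURSORSETPROP("Buffering", 5)

* ...populate the cursor here (APPEND, INSERT, GATHER, ...)...

TABLEUPDATE(.T., .T.)                                && push all buffered rows to MySQL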
 
the first step of using a CA and letting it establish the mapping between MySQL and the cursor is to make a CursorFill call
To be more precise, it's the simplest way to a) create the cursor as the CursorAdapter needs it and b) have it associated with the CA and with updating the MySQL backend. You can also create an empty cursor and use CursorAttach to a CA that then makes the connection and communication with the backend, but then you can easily fail to have the cursor exactly as necessary, and that can lead to errors in the SQL generated by TABLEUPDATE. And even the automatic mechanisms you might use will forward data to MySQL with SQL generated for it: neither is the VFP process hooked into MySQL, nor is MySQL hooked into the VFP process; the only connection you have is via the ODBC layer, and that communication goes by SQL, nothing else. VFP generates the SQL INSERTs, UPDATEs and DELETEs necessary to forward what you did in the cursor, also when not buffering or only row buffering.
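For illustration, a minimal CursorAttach sketch; all names are placeholders, lnHandle stands for an existing ODBC connection handle, and the cursor structure has to match what the CA expects:

Code:
* Create a cursor matching the backend table, then attach it to the CA
CREATE CURSOR crsDocs (id I, notes M, dcontent W)

loCA = CREATEOBJECT("CursorAdapter")
loCA.DataSourceType     = "ODBC"
loCA.DataSource         = lnHandle               && existing ODBC connection handle (assumed)
loCA.Tables             = "mytable"
loCA.KeyFieldList       = "id"
loCA.UpdatableFieldList = "id, notes, dcontent"
loCA.UpdateNameList     = "id mytable.id, notes mytable.notes, dcontent mytable.dcontent"

IF loCA.CursorAttach("crsDocs")
    * crsDocs is now linked to the CA; TABLEUPDATE on it will update MySQL
ENDIF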
 
Hi Chriss,

I am happy with the concept of the CA and have been using it for a while; you kindly offered advice a year or so ago in helping me set this up. I have taken what I believe to be the correct steps in adding these two new fields to both tables, the VFP one and the MySQL one.

I use the CA in different ways. I can clear the target database and re-populate it from the source database (this works both ways round, VFP to MySQL and MySQL to VFP), and I can also synch the databases, where it will push and pull data between the two depending on the most recent data; each field is DateTime stamped to allow this.

The CA has been working well, it has been very robust, and I have learnt a certain amount about its use over the last year or so.

When I added the new fields to both sets of tables I rebuilt the CA to ensure that the new fields were included in the schema etc. The process fails when I use GATHER MEMVAR MEMO instead of GATHER MEMVAR.

I did some reading last night and there is some conflicting advice, but there does seem to be a consensus that the CA doesn't like GATHER MEMVAR MEMO and MEMO type fields in general, so I will have a play today and see if I can update the data at a field level rather than a record level.

Darren
 
Sorry, I don't see why it would matter how you populate the cursor with data. Working with the cursor, also when it's a cursor attached to a CA, is merely a matter of VFP workings, not of MySQL. It doesn't change the final data forwarding to MySQL whether a memo or blob field is populated with INSERT, APPEND BLANK and REPLACE, or GATHER, or any other mechanism; that solely affects the VFP cursor. You don't gather into a MySQL blob field when you gather into the CA cursor, so the MySQL backend won't notice anything from that; it's solely fed from the CA by means of the SQL generated when you TABLEUPDATE.

Again, from what you say I still don't know what mechanisms you're using; there are still several ways to use a CA and the updatable-view characteristic it has. I will need to know the technicalities, not your story. One major technical component to use is buffering. If you don't buffer the CA cursor, you're in not-well-defined territory. That has about the same implications as an unbuffered DBF in the analogy of native DBF access without anything like a CA or view, merely opening a DBF in a work area unbuffered. For example, data will not be written instantly into a DBF just when you're doing GATHER; you always need to leave the current record to trigger the DBF file update. Am I confusing this with row buffering? No, I think not; even without row buffering you're not directly dealing with a DBF record, and even less so with a MySQL backend table.

Using buffering is a really basic concept that applies to every aspect of working with data, native or not, with free tables or any other backend, and especially with remote backends like database servers.
 
Hi Chriss,

I get all of that; I was trying to give you the technicalities, not a story.

I understand that the cursor has no relation to the MySQL database until you write any changes back to it; the mechanism I am using was demonstrated in the first post.

I establish a connection to the MySQL database, select the table I want to deal with, fetch a remote cursor, delete all the records in it and then upload all the records from the VFP table. I do this using the EXACT code below.

oSource = The Alias of the VFP table
oTarget = The Alias of the MySQL remote cursor

Code:
SELECT &oSource
SCAN
    SELECT &oSource
    SCATTER MEMVAR
    SELECT &oTarget
    APPEND BLANK
    GATHER MEMVAR
ENDSCAN

As I have auto-update turned on, each time it goes round the loop it moves on one record and automatically executes a TABLEUPDATE, which generates the SQL script to push the data into the MySQL table via the CA.

I understand that using GATHER MEMVAR or GATHER MEMVAR MEMO on the cursor is not doing anything to the MySQL database, but when I use the latter and it gets to the bottom of the loop and the TABLEUPDATE is fired, I get an error saying that there is an issue with the SQL. There are no other changes to any of the code, just the inclusion of the MEMO clause before the update is fired.

I have just found a solution, but it doesn't make any sense to me. If I just use GATHER MEMVAR, but then put a REPLACE fieldname WITH oSource.fieldname in the CA's BeforeInsert method, it works. The data in the cursor should be the same, so the SQL statements being generated should be the same, but for some reason it works one way and not the other.
 
What are your fieldnames?

It's better to use SCATTER MEMO NAME loRecord and GATHER MEMO NAME loRecord instead, or even INSERT INTO (oTarget) FROM NAME loRecord
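For illustration, the loop from the first post rewritten with the NAME clause; this is only a sketch using the same oSource/oTarget alias variables:

Code:
SELECT &oSource
SCAN
    SCATTER MEMO NAME loRecord               && includes Memo, Blob and General fields
    INSERT INTO (oTarget) FROM NAME loRecord
ENDSCAN                                      && ENDSCAN reselects the source work area each pass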
 
fetch a remote cursor, delete all the records in it..
When you fetch records and then delete them, that also causes a lot of DELETEs in MySQL. I'm sure you want that too, but the straightforward way, if you actually want to repopulate all data, is to TRUNCATE the table and start empty on the MySQL side already.

The way you do this, the queries could cause all kinds of referencing errors when you insert data again. You shouldn't work this way, or if you do, plan closely in which order you do this so as not to interfere with the integrity rules of relationships.
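For example, a minimal sketch, assuming lnHandle is the ODBC connection handle the CA already uses and mytable is a placeholder name:

Code:
* Empty the MySQL table directly instead of deleting row by row through the cursor
IF SQLEXEC(lnHandle, "TRUNCATE TABLE mytable") < 0
    AERROR(laErr)
    MESSAGEBOX(laErr[2])
ENDIF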
 
And with regard to the technicalities:

As you have code in BeforeInsert and maybe more, that is important to know, don't you think? It would help if you finally posted whether and what buffering you use, what code in general, and your CA itself including all its code.

One thing that can easily catch you out: with single-row or no buffering you're triggering the sending of the record you just left, not the current record. That's an off-by-one situation that may lead to errors you don't expect, as you assume you're handling the current record.

The best control you can get is with table buffering and an explicit TABLEUPDATE after the scan loop. Also, it would really be simplest to insert all records into the cursor with an INSERT INTO target SELECT * FROM source; that will include memo and blob fields. Or use APPEND FROM source.DBF.
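A sketch of that approach, reusing the oSource/oTarget alias variables from the earlier posts; it assumes the source and target cursors have matching structures:

Code:
SET MULTILOCKS ON                             && required for table buffering
SELECT &oTarget
CURSORSETPROP("Buffering", 5)                 && table buffering on the CA cursor

INSERT INTO (oTarget) SELECT * FROM (oSource) && copies Memo and Blob fields too

IF !TABLEUPDATE(.T., .T., oTarget)            && send all buffered inserts in one go
    AERROR(laErr)
    MESSAGEBOX(laErr[2])
ENDIF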
 
I ran into the same issue when trying to update BLOB or MEMO fields in MySQL using a remote cursor with a CursorAdapter. The GATHER MEMVAR MEMO command was causing SQL errors during TABLEUPDATE() — likely due to how VFP handles memo fields vs. how MySQL expects them.


I switched to using just GATHER MEMVAR (without the MEMO clause), then manually assigned the memo/blob fields in the CursorAdapter’s BeforeInsert method using REPLACE. This avoids VFP’s automatic memo assignment quirks and gives you control over how BLOB data is pushed into MySQL.

In the BeforeInsert method of the CursorAdapter:

Code:
REPLACE blobfield WITH oSource.blobfield

This method made the TABLEUPDATE() generate valid SQL and worked reliably. Also, if you’re repopulating entire tables, consider running a TRUNCATE TABLE directly on the MySQL side instead of deleting all rows via VFP — it's faster and avoids unnecessary DELETE operations.


Hope this helps someone else avoid hours of debugging!
 
As you have code in BeforeInsert and maybe more, that is important to know, don't you think?
I have never had any code in the CA; I entered that line of code in the BeforeInsert method today when trying to find a resolution.

What are your fieldnames?
The field names that I have added are DContent and Notes, the first being the BLOB and the second being the MEMO field.

It's better to use SCATTER MEMO NAME loRecord and GATHER MEMO NAME loRecord instead, or even INSERT INTO (oTarget) FROM NAME loRecord
I will have a play with this; thank you for the suggestion.

When you fetch records and then delete them, that also causes a lot of DELETEs in MySQL. I'm sure you want that too, but the straightforward way, if you actually want to repopulate all data, is to TRUNCATE the table and start empty on the MySQL side already.
The database content is created in VFP; when the creation is complete it is uploaded to MySQL (so this would be to an empty MySQL database at that point), and then there is an ongoing synching process. There is a rare scenario where you want to re-create the data in the VFP database; at this point you need to remove all the data from MySQL and re-copy over the VFP data.

I don't use any buffering; I am dealing with the data on a record-by-record basis. It is a single-user process, so no locking is needed; the VFP database is on a PC and the MySQL one on an RPi.

@markwoodwebdev, thanks for your reply. Sounds like exactly the same problem, and the individual REPLACE for the BLOB and MEMO fields seems to be the only way of resolving it that I can find.

I take on board both comments with regard to deleting and TRUNCATE TABLE, and will also look at this, so thanks both.
 
I guess you can solve this problem using a cursor schema; the mapping not only of field names but also of data types can then be fitted best.
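For example, a minimal sketch along these lines; only the DContent and Notes field names come from this thread, the rest is assumed:

Code:
* Spell out the VFP-side field types explicitly, then fill using the schema
loCA.CursorSchema = "ID I, NOTES M, DCONTENT W"   && M = Memo, W = Blob
loCA.CursorFill(.T.)                              && .T. = use the CursorSchema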

What you should also try is these settings for all work areas (that's done here by passing 0 as the last parameter):

Code:
CURSORSETPROP('MapVarchar', .T., 0)   && map remote character columns to Varchar
CURSORSETPROP('MapBinary', .T., 0)    && map remote binary columns to Varbinary/Blob

I assume if a REPLACE in a BeforeInsert works, then you have a NULL value in the field that you don't expect to have there, and you turn that into a value that doesn't cause an error. Anyway, check that your data is as you want it and isn't missing the first or last record or having shifted values.

I can't for the life of me see how any mechanism you use to populate the cursor field would affect the transport of the blob from the cursor into MySQL, so with 100% certainty it can't matter how it got there.
 
I switched to using just GATHER MEMVAR (without the MEMO clause), then manually assigned the memo/blob fields in the CursorAdapter’s BeforeInsert method using REPLACE. This avoids VFP’s automatic memo assignment quirks and gives you control over how BLOB data is pushed into MySQL.
I can't support that idea. If you don't gather with the MEMO option you're skipping all FPT-related fields: Memo, Blob, or General fields. Then it is no wonder you need a REPLACE.

Also, if you don't have MEMO in the SCATTER, you're not even scattering the MEMO (FPT file) related fields into variables, so in that case GATHER with MEMO doesn't save the day either.
 
