I have a situation where I need to update an archive table several times throughout the day. To do this, I generate some temp tables and use them to populate aggregate data in the archive.
Basically, the user enters a bunch of new records, or alternatively, closes a bunch of existing records as a batch. At the end of the transaction, the aggregate data is updated to the archive. This obviously causes a noticeable lag from the time the user clicks DONE to the time the tables update, and the form closes.
Currently, I am doing this through stored procedures (Make Table and Update queries). Temp tables are a must, as there are too many "Operation must use updateable query" errors with Union and Summary queries to do this any other way.
So, the question is this: will replacing the Make Table and Update queries with CREATE TABLE and UPDATE statements improve performance? Can the time lag be made less noticeable?
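For what it's worth, here is a minimal sketch of the two-step pattern in question (build a temp table from a summary query, then update the archive from it). This uses SQLite via Python purely for illustration, since Access/Jet SQL can't be shown runnably here; the `records`/`archive` tables and columns are hypothetical stand-ins, not the actual schema.

```python
import sqlite3

# Hypothetical schema: a detail table of batch records and an archive
# table holding one aggregate row per category.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE records (category TEXT, amount REAL, closed INTEGER);
    CREATE TABLE archive (category TEXT PRIMARY KEY, total REAL);
    INSERT INTO records VALUES ('A', 10, 1), ('A', 5, 1), ('B', 7, 1);
    INSERT INTO archive VALUES ('A', 0), ('B', 0);
""")

# Step 1: build the temp table from the summary query -- the
# CREATE TABLE ... AS SELECT equivalent of a Make Table query.
conn.execute("""
    CREATE TEMP TABLE agg AS
    SELECT category, SUM(amount) AS total
    FROM records
    WHERE closed = 1
    GROUP BY category
""")

# Step 2: update the archive from the temp table in one statement.
# The temp table is plainly updateable, so this avoids the
# "Operation must use an updateable query" problem that a direct
# join against the summary query would hit.
conn.execute("""
    UPDATE archive
    SET total = (SELECT total FROM agg
                 WHERE agg.category = archive.category)
    WHERE category IN (SELECT category FROM agg)
""")
conn.commit()

print(dict(conn.execute("SELECT category, total FROM archive")))
# {'A': 15.0, 'B': 7.0}
```

In Jet SQL the same two steps would be a `SELECT ... INTO tmpTable` followed by an `UPDATE archive INNER JOIN tmpTable ...`, so the comparison is really saved-query overhead versus executing the equivalent SQL directly.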
Discussion?
Thanks,
Melanie