Hi all,
I'm trying to see how to speed up a rather large and lengthy Select Into statement.
The SQL creates a large table out of dozens, possibly hundreds, of smaller tables. They are all related on a single key field. Each smaller table has one to perhaps a dozen fields. The final table can easily bump up against the 1024-column limit.
Every table is indexed on the key field.
The front end is Access. In VBA the long SQL statement is assembled, then run using the ADO connection's Execute method.
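In outline, the VBA looks something like this (the connection string, table names, and field names here are placeholders for illustration, not the real schema):

```vba
' Sketch of the current approach: assemble one big SELECT INTO
' joining many small tables on the shared key field, then run it
' server-side through ADO.
Dim sql As String
Dim cn As ADODB.Connection
Set cn = New ADODB.Connection
cn.Open "Provider=SQLOLEDB;Data Source=MyServer;..."  ' real connection string elsewhere

sql = "SELECT t1.KeyID, t1.FieldA, t2.FieldB, t3.FieldC " & _
      "INTO dbo.BigResultTable " & _
      "FROM dbo.Small1 AS t1 " & _
      "INNER JOIN dbo.Small2 AS t2 ON t2.KeyID = t1.KeyID " & _
      "INNER JOIN dbo.Small3 AS t3 ON t3.KeyID = t1.KeyID"
' ...in practice the SELECT list and JOIN list are built in a loop
' over dozens or hundreds of tables.

cn.Execute sql, , adExecuteNoRecords  ' no recordset needed; runs on the server
```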
Each small table could have thousands of rows.
I don't know that there is much to do to speed it up. It can take hours, depending on how much source data there is and how many source tables are involved. This always changes, since even the small tables are created on the fly by a different part of the processing.
I tinkered with turning this into a stored procedure, but it is not possible -- the procedure would need to behave like an SQL Insert Into template, which would then take hundreds of parameters. No good.
Perhaps I could take the long SQL statement and create it as a stored procedure, but would that save any time? Isn't all the processing currently on the server anyway, via the ADO connection? Nothing runs in Access; Access just assembles the SQL string.
A related issue: I would like to provide feedback to the user that something is happening. I don't believe there is any way to hook into the long-running statement to give progress feedback on the Access form or the Access status bar. The best I have done is display a message that the processing started at Now() and that it could take a long time.
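For what it's worth, here is roughly what that "feedback" amounts to today (a sketch; `sql` and `cn` are the statement and connection from earlier):

```vba
' Current feedback: a one-time status-bar note before the long Execute.
' SysCmd acSysCmdSetStatus writes text to the Access status bar.
SysCmd acSysCmdSetStatus, "Build started at " & Now() & _
    " - this can take a long time..."

cn.Execute sql, , adExecuteNoRecords  ' blocks here until the server finishes

SysCmd acSysCmdSetStatus, "Build finished at " & Now()
```

Since Execute blocks, nothing on the form can update until the server is done, which is exactly the problem.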
Any ideas on any of this?
Thanks!