No, this is not normal, but you should know exactly which revision of Pervasive.SQL 2000 you are using.
It sounds like a memory leak, but you can limit the amount of cache Pervasive.SQL 2000 uses in the configuration, and memory usage should not grow beyond that. If it grows outside that boundary, you either have a memory leak or are building indexes, which use memory outside of the cache. I would recommend testing the inserts on a file with no indexes present and contrasting that with the behavior you are seeing now (a sketch of such a test follows). I have begun recommending at least 512MB of memory on any NT or Win2K server running Pervasive.SQL, because on every system I have with 256MB, no matter how I limit the cache and other parameters in P2Ki, the server uses around 175-225MB of RAM when idle, leaving little for database cache and increasing the likelihood of swapping when doing any significant work. How much memory does your server have, and how big are the data files/records you are working with?
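To make that no-index test concrete, here is a minimal sketch against the Btrieve-level C API (the layer the Clarion Btrieve driver ultimately talks to). The file name TEST.MKD, the 100-byte fixed record, and the 150,000-record count are illustrative assumptions of mine; check the BTRCALL signature and the operation constants against the btrapi.h/btrconst.h headers shipped with your Pervasive SDK revision before trusting it.

/* Sketch: time raw B_INSERT calls so you can run it once against a
 * file created with no indexes and again against your real file
 * definition, then compare server memory growth in each case.      */
#include <stdio.h>
#include <string.h>
#include <time.h>
#include <btrapi.h>
#include <btrconst.h>

#define REC_LEN  100       /* assumed fixed record length  */
#define NUM_RECS 150000L   /* matches the poster's volume  */

int main(void)
{
    char      posBlock[128];            /* Btrieve position block     */
    char      rec[REC_LEN];
    char      keyBuf[255] = "TEST.MKD"; /* hypothetical test file     */
    BTI_ULONG dataLen = 0;
    BTI_SINT  status;
    long      i;
    clock_t   start;

    memset(posBlock, 0, sizeof(posBlock));

    /* B_OPEN (0): open the file in normal mode (keyNum = 0). */
    status = BTRCALL(B_OPEN, posBlock, rec, &dataLen,
                     keyBuf, (BTI_BYTE)strlen(keyBuf), 0);
    if (status != 0) {
        printf("Open failed, status %d\n", status);
        return 1;
    }

    memset(rec, 'X', REC_LEN);
    start = clock();
    for (i = 0; i < NUM_RECS; i++) {
        /* B_INSERT (2): keyNum -1 = no currency change, which also
         * works on a file that has no indexes defined at all.       */
        dataLen = REC_LEN;
        status = BTRCALL(B_INSERT, posBlock, rec, &dataLen,
                         keyBuf, (BTI_BYTE)sizeof(keyBuf), -1);
        if (status != 0) {
            printf("Insert %ld failed, status %d\n", i, status);
            break;
        }
    }
    printf("%ld inserts in %.1f seconds\n", i,
           (double)(clock() - start) / CLOCKS_PER_SEC);

    /* B_CLOSE (1): release the file. */
    dataLen = 0;
    BTRCALL(B_CLOSE, posBlock, rec, &dataLen, keyBuf, 0, 0);
    return 0;
}

If memory only climbs in the indexed case, index building rather than the cache setting is the likely culprit.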
You should also apply the latest Service Pack to both the server and the client, which should take care of any known memory leaks. Beyond that, contact Pervasive support in Brussels and they will help you resolve this. We have customers working with files in the 10 to 20GB range and above, and databases approaching 1/4 to 1/2 a TB, so a simple 150,000 records is no big deal, though it may require some techniques to be most efficient with Clarion. You can also work with SoftVelocity to make sure your inserts are using the Extended Operations for maximum efficiency over a network.
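For reference, here is a hedged sketch of what an Extended Operation insert looks like at the Btrieve API level: operation 40 (Insert Extended) packs a batch of records into a single call, so one network round trip carries many records instead of one. The buffer layout shown (16-bit record count, then a 16-bit length before each record image) and the B_INSERT_EXT constant are from my recollection of the Btrieve API reference; verify both against your SDK. In Clarion you would not call this yourself; the point of asking SoftVelocity is to confirm their driver issues it for your inserts.

/* Sketch only: batch COUNT records into one Insert Extended call.
 * Assumes the same 100-byte records and an already-open posBlock
 * from the previous example.                                      */
#include <string.h>
#include <btrapi.h>
#include <btrconst.h>

#define REC_LEN 100
#define BATCH   50            /* records per Insert Extended call */

static BTI_SINT insert_batch(char *posBlock,
                             const char recs[][REC_LEN], int count)
{
    char      buf[2 + BATCH * (2 + REC_LEN)];
    char      keyBuf[255];    /* unused here, but must be valid   */
    BTI_ULONG dataLen;
    BTI_WORD  w;
    char     *p = buf;
    int       i;

    w = (BTI_WORD)count;              /* 16-bit record count       */
    memcpy(p, &w, 2);  p += 2;
    for (i = 0; i < count; i++) {
        w = (BTI_WORD)REC_LEN;        /* 16-bit length per record  */
        memcpy(p, &w, 2);  p += 2;
        memcpy(p, recs[i], REC_LEN);  /* record image              */
        p += REC_LEN;
    }
    dataLen = (BTI_ULONG)(p - buf);

    /* B_INSERT_EXT (40); keyNum -1 = no currency change. */
    return BTRCALL(B_INSERT_EXT, posBlock, buf, &dataLen,
                   keyBuf, (BTI_BYTE)sizeof(keyBuf), -1);
}

The win is in the round trips: 150,000 single inserts mean 150,000 trips across the wire, while batches of 50 cut that to 3,000.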
Pervasivite