The 'limitation' that often gets talked about concerning the PC clock is kind of a legacy of DOS.
On a PC a crystal provides a base reference frequency of 1.19318 MHz to the Programmable Interval Timer (PIT). The PIT divides this base frequency down by a programmable integer (2 to 65536), which generates an output signal at the divided rate (i.e. anywhere from about 18.2 Hz up to about 596 kHz). This output is fed to IRQ0 and provides the tick for the system clock.
Under DOS the divisor was fixed at 65536, resulting in a tick rate of about 18.2 Hz (i.e. a tick roughly every 55 milliseconds).
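For the record, the arithmetic looks like this (a quick C sketch, using the 1.19318 MHz base figure and the two extreme divisor values mentioned above):

    #include <stdio.h>

    /* PIT output rate = base clock / programmed divisor.
       A programmed value of 0 counts as 65536.           */
    int main(void)
    {
        const double base_hz = 1193182.0;          /* ~1.19318 MHz        */
        const unsigned divisors[] = { 2, 65536 };  /* fastest and slowest */
        int i;

        for (i = 0; i < 2; i++) {
            double out_hz = base_hz / divisors[i];
            printf("divisor %5u -> %10.2f Hz (tick every %.3f ms)\n",
                   divisors[i], out_hz, 1000.0 / out_hz);
        }
        return 0;
    }

Run it and you get roughly 596 kHz at one end and 18.2 Hz (a tick every 54.9 ms) at the other, which is where the familiar 18.2 ticks-per-second figure comes from.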
Under DOS you can't (in principle) change the divisor because it would muck up the whole OS's timing (although there were several games that DID hack the divisor).
Under Windows you can muck about with the divisor and get higher resolution timing (down to a limit of about 1 millisecond - which is, surprise, surprise, the resolution of the multimedia timers), but several of the time functions are nevertheless limited to the 'old' tick rate (e.g. Timers in VB). Under NT there are kernel-level calls that can modify what the PC thinks the core tick rate is, and MS chose to set that rate to 10 milliseconds by default. Assuming that you can make kernel-level calls and have the privileges to do so, you can push the perceived clock rate down to 1 millisecond, and this will beneficially affect all the time functions that work off that core rate (i.e. you could get Timers that really do try to fire every millisecond).
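If all you want is the 1 millisecond period without going anywhere near kernel calls, the documented route is the multimedia timer API (timeBeginPeriod / timeEndPeriod). A minimal sketch, assuming you link with winmm.lib; the period you actually get depends on what timeGetDevCaps reports for your machine:

    #include <windows.h>
    #include <mmsystem.h>   /* multimedia timer API; link with winmm.lib */
    #include <stdio.h>

    int main(void)
    {
        TIMECAPS tc;
        UINT period;

        /* Ask what timer periods this machine supports. */
        if (timeGetDevCaps(&tc, sizeof(tc)) != TIMERR_NOERROR)
            return 1;

        period = tc.wPeriodMin;       /* typically 1 ms              */
        timeBeginPeriod(period);      /* request the finer period    */

        /* ... timing-sensitive code goes here ... */
        printf("requested a %u ms timer period\n", period);

        timeEndPeriod(period);        /* always match Begin with End */
        return 0;
    }

Note that the request is system-wide while it is in force, so always pair timeBeginPeriod with a matching timeEndPeriod.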
But, in general, this is overkill. If you want millisecond timing, use the multimedia timers. If you want centiseconds, well, most of the basic time functions are accurate to within about 5.5 centiseconds (one tick) under W95/98, and to within 1 centisecond under NT. Want better than that? Move to the high resolution timers, which are not driven off the system clock.
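The high resolution timers are read with QueryPerformanceCounter / QueryPerformanceFrequency. A rough sketch of timing an interval with them (the counter frequency varies from machine to machine, so always divide by whatever QueryPerformanceFrequency reports):

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        LARGE_INTEGER freq, start, stop;

        if (!QueryPerformanceFrequency(&freq))   /* no counter available */
            return 1;

        QueryPerformanceCounter(&start);
        Sleep(100);                              /* something to measure */
        QueryPerformanceCounter(&stop);

        printf("counter frequency: %.0f Hz\n", (double)freq.QuadPart);
        printf("elapsed: %.3f ms\n",
               (stop.QuadPart - start.QuadPart) * 1000.0 / freq.QuadPart);
        return 0;
    }

The Sleep(100) will typically come back a little late, for exactly the tick-rate reasons above, and the measurement will show it.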