
time in centiseconds

Status
Not open for further replies.

YYYYUU (Programmer)
Dec 13, 2002
I want to retrieve the current time and display this as


hhmmsscc

where cc is centiseconds.

Any ideas?

Thanks
 
Third-party hardware. If it has a sufficiently precise clock, it should also include software to retrieve and display the value(s) to the greater resolution.


MichaelRed
m.red@att.net

Searching for employment in all the wrong places
 
I actually want the time formatted to populate an Oracle table, but the format of the time must be

DDMMYYHHMMSSCC

where HH is hours
MM is minutes
SS is seconds
CC is hundredths of a second.
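As a cross-platform sketch of that formatting logic (Python here rather than VB, purely for illustration; the same approach carries over to VB's Format$ calls, and the function name is hypothetical), note that centiseconds are just microseconds divided by 10,000:

```python
from datetime import datetime

def oracle_timestamp(dt: datetime) -> str:
    """Format a datetime as DDMMYYHHMMSSCC, where CC is centiseconds."""
    cc = dt.microsecond // 10_000          # 0..99 centiseconds
    return dt.strftime("%d%m%y%H%M%S") + f"{cc:02d}"

# Fixed timestamp so the output is reproducible:
print(oracle_timestamp(datetime(2002, 12, 13, 9, 5, 7, 340_000)))  # 13120209050734
```

Whether the centisecond digits are *meaningful* is a separate question, covered by the replies below about clock granularity.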
 
See previous. If it is ONLY there to populate a requirement of a table, either use "00" or (to be truly ridiculous) a random value between "00" and "99". Windows clocks do not support the resolution.

Read the docs (help).

MichaelRed
 
Here is an example of getting the local time. You will have to add the date parts, and combine and format the results as desired.
Drop this into a new module:

' The SYSTEMTIME type is defined before the Declare that references it.
Private Type SYSTEMTIME
    wYear As Integer
    wMonth As Integer
    wDayOfWeek As Integer
    wDay As Integer
    wHour As Integer
    wMinute As Integer
    wSecond As Integer
    wMilliseconds As Integer
End Type

Private Declare Sub GetLocalTime Lib "kernel32" (lpSystem As SYSTEMTIME)

Private sysLocalTime As SYSTEMTIME

Public Function getTIME() As String
    GetLocalTime sysLocalTime
    With sysLocalTime
        ' wMilliseconds is 0-999; divide by 10 if you want centiseconds.
        getTIME = Format$(.wHour, "00") & ":" & _
                  Format$(.wMinute, "00") & ":" & _
                  Format$(.wSecond, "00") & ":" & _
                  Format$(.wMilliseconds, "000")
    End With
End Function
 
Yeah, they will support resolution to 1/100th of a second, but not to the millisecond (1/1000th).

The limitation is the PC hardware, not Windows.

Chip H.
 

It's not just a hardware limitation.

Are we talking timers here, tickers, or getting the system time?
 
AFAIK, the tick is 18/sec? All the routinely available 'clocks' use this. Obviously (to achieve the execution cycles), there are other clocks. I am not aware of their general availability through 'system' time.

Of course, I am more often wrong than right - it is only the continuing attempt(s) to provide corrective feedback that provides any possibility of a correct response.

MichaelRed
 
No, the tick differs on different versions of Windows. 18 per sec is Windows 95/98 (and, I think, Me), giving a granularity of approx 55ms. This is a DOS legacy (you couldn't change the interrupt frequency because it would upset all sorts of things, although some badly-behaved games programs did).

This is not a problem under NT, and on NT4 the default tick gives a granularity of 10ms. But, if you are brave, you can change the tick frequency - theoretically down to a granularity of 1ms (which is effectively what the multimedia timers do).
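The granularities quoted above follow directly from the tick rates; as a quick arithmetic check (Python used only as a calculator here, and the labels are just restating the figures in the post):

```python
# Tick rate -> timer granularity for the Windows generations discussed above.
tick_rates_hz = {
    "Win95/98 (DOS legacy)": 18.2,   # the old ~18 ticks/sec
    "NT4 default": 100.0,            # 10 ms tick
    "NT, retuned": 1000.0,           # ~1 ms, what the multimedia timers achieve
}

for name, hz in tick_rates_hz.items():
    print(f"{name}: {1000 / hz:.1f} ms granularity")
# Win95/98 (DOS legacy): 54.9 ms granularity
# NT4 default: 10.0 ms granularity
# NT, retuned: 1.0 ms granularity
```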
 
So, while you 'can' get some 'info' re centiseconds, unless you are a bit off the beaten path, the lower part(s) are more (or less?) randomized?


MichaelRed
 
The 'limitation' that often gets talked about concerning the PC clock is kind of a legacy of DOS.

On a PC a crystal provides a base reference frequency of 1.19318MHz to the Programmable Interval Timer (PIT). The PIT divides this base input frequency down by a programmable integer (2 to 65536), which generates an output signal at the divided value (i.e. potentially between about 18Hz and 596kHz). This is fed to IRQ0 and provides the tick for the system clock.

Under DOS the divisor was fixed at 65536, resulting in a tick of about 18.2Hz (i.e. a tick about every 55 milliseconds).

Under DOS you can't (in principle) change the divisor because it would muck up the whole OS's timing (although there were several games that DID hack the divisor).

Under Windows you can muck about with the divisor and deliver higher resolution timing (to a limit of about 1 millisecond - which is, surprise, surprise, the resolution of the multimedia timers), but several of the time functions are nevertheless limited to the 'old' tick rate (e.g. Timers in VB). Under NT there are kernel-level calls that can modify what the PC thinks is the core tick rate, and MS chose to set that rate to 10 milliseconds by default. Assuming that you can make kernel-level calls and have the privileges to do so, you can modify the perceived clock rate down to 1 millisecond, and this will beneficially affect all the time functions that work off that core rate (i.e. you could get Timers that really try and fire every millisecond).

But, in general, this is overkill. If you want millisecond timing, use the multimedia timers. If you want centiseconds, well, most of the basic time functions are accurate to within about 5.5 centiseconds under W95/98, and to within 1 centisecond under NT. Want better than that? Move to the high-resolution timers, which are not driven off the system clock.
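The PIT arithmetic above can be verified directly. Assuming the standard 1.19318 MHz base clock, the divisor range maps to output frequencies like this (a sketch in Python, just doing the division the post describes):

```python
PIT_BASE_HZ = 1_193_180  # standard PC crystal frequency fed to the PIT

def pit_output_hz(divisor: int) -> float:
    """Output frequency of the PIT for a given programmable divisor."""
    if not 2 <= divisor <= 65536:
        raise ValueError("PIT divisor must be between 2 and 65536")
    return PIT_BASE_HZ / divisor

print(round(pit_output_hz(65536), 2))       # 18.21 -> the classic DOS tick, in Hz
print(round(pit_output_hz(2) / 1000, 1))    # 596.6 -> the fastest setting, in kHz
```

Note the top of the range works out to roughly 596 kHz (1.19318 MHz / 2), not MHz.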


 

GetTickCount has a resolution (interrupt dependent) of 10 ms on NT machines.

QueryPerformanceCounter typically has a resolution of about 0.8 microseconds (one tick of the 1.19318MHz counter).
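For interval timing, the modern cross-language analogue of QueryPerformanceCounter is a monotonic high-resolution counter; as a hedged sketch, Python's time.perf_counter() wraps exactly that OS facility (QueryPerformanceCounter on Windows) and is independent of the system clock tick:

```python
import time

# perf_counter() reads the OS high-resolution counter, so the measured
# interval is not quantized to the 10ms/55ms system tick discussed above.
start = time.perf_counter()
time.sleep(0.05)                      # stand-in for some work
elapsed = time.perf_counter() - start

print(f"elapsed: {elapsed * 100:.1f} centiseconds")  # roughly 5.0 cs
```

This only gives reliable *intervals*; for a wall-clock timestamp with centisecond digits you are still at the mercy of the system clock's granularity, as the posts above explain.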
 