I am going to be using an offscreen SCREEN 13 buffer (one-dimensional), so I will have to compute (y * 320 + x) 64,000 times every frame. I am assembly-illiterate; I was wondering if someone out there could help me with a line of assembly that does this faster than QBasic would.
Is there a reason hexadecimal is used more often than decimal? Is it faster?