
Stella input response vs. real - measured


dualcam


From time to time, the question of emulation response comes up - usually more specifically about the analog (paddle) response time in Stella vs a real console. I thought I would try to come up with a way to measure this. Warning: this is a bit long - in case of TLDR, skip down to 6) :)

 

Here is the test rig I came up with -

http://home.comcast....se test rig.jpg

A photo-cell is placed up against the screen to detect the presence of the paddle/player. I have a 2600-daptor with a resistor wired in place of the paddle controller, and a switch is placed across this resistor. Opening/closing this switch simulates the paddle controller being instantaneously moved between two different positions. An oscilloscope monitors the switch and the photo-cell, and can measure the time between the switch open/close and the photo-cell change. Here is a sample of such a measurement (yellow is the switch and purple the photo-cell) -

http://home.comcast....screen shot.JPG

 

I used two different systems during testing -

desktop = single P4 2.4GHz, 512MB, basic AGP video card, no G/L support

laptop = Core 2 Duo 2GHz, 1GB, Intel 965 Express video, basic G/L

Same LCD monitor was used for all tests (except those that specify TV).

 

Tests follow (the range of numbers is the measured time in milliseconds for repeated tries) -

 

1) 40-60ms - Stella, Breakout, desktop

The 2600 runs at 60 frames a second, or 16.7ms per frame, so this is a delay of 2.4-3.6 frames. Note that my pressing of the button is not synchronized to Stella's frame generation, so a one frame variation is expected, as I could have pressed anywhere from right at frame start to right at frame end. CPU usage ~20%.
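For reference, the frame arithmetic as a quick script (the 40-60ms figures are the measurements above, and the ~60 frames/second is the nominal NTSC rate):

```python
# Quick conversion of the measured delays into 2600 frames,
# assuming the nominal ~60 frames/second (16.7 ms per frame) used above.
FRAME_MS = 1000.0 / 60.0  # ~16.7 ms per frame

for delay_ms in (40, 60):
    print(f"{delay_ms} ms = {delay_ms / FRAME_MS:.1f} frames")
# prints roughly 2.4 and 3.6 frames, matching the range quoted above
```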

 

2) 28-38ms - Stella set to 300 frames/second, rest same as 1)

Turning up the frame rate could give some indication of the delay within the emulation versus outside of it (USB, O/S, video, ...). I would not expect the outside delays to be affected by the increased frame rate. Note I said emulation rather than emulator, as there could also be delays within the game logic itself (more on this later).

 

3) tried laptop, no change over 1)

3a) enabled G/L, no change over 1)

Hardware speed does not seem to affect the delay.

 

4) 70-80ms - MESS, Breakout, desktop

Curious how a different emulator would compare. Note the difference could come from the use of different OS routines as well as the emulator code itself.

 

5) 35-60ms - Stella, KABOOM!, desktop

Thought I should try KABOOM!, as it's a more popular game and I wanted to see how an Activision game would compare. Perhaps slightly more responsive, but it could be just some measurement variation. More on KABOOM! in the last test.

 

6) Here is the test we are really interested in - real console vs emulation. Display is same LCD TV for both.

100-120ms - real 7800, SuperBreakout

75-100ms - Stella, SuperBreakout, laptop

So... emulation beats real, apparently. I say apparently, as it is hard to get a direct comparison. Even though I used the same TV for both, the laptop used the TV's VGA input and the real console the coax input - so they used different video signal processing within the TV. A side note: this TV is slow compared to the monitor.

 

6a) again with a different game

120-170ms - real 7800, Warlords

100-150ms - Stella, Warlords, laptop

Warlords is slower than SuperBreakout. Not a surprise to me, as Warlords has a definite perceivable delay. My surprise was that it was not higher, but more on that to follow. I would have liked to run a real-hardware test with KABOOM!, but I don't have the cart.

 

7) This is a measurement of "return" time. All prior measurements started with the paddle/player under the photo-cell, and timed how long until it disappeared. This starts with the paddle/player at the other side of the screen, and times how long until it reappears under the photo-cell.

150-200ms - Stella, Warlords, laptop - TV

Compared to 6a), it's about 50ms longer.

 

7a) More average "return" times, 2nd number is the difference compared to the prior tests (note: back to the desktop and monitor)

60ms +10ms Stella, SuperBreakout, desktop

90ms +40ms Stella, Breakout, desktop

150ms +100ms! Stella, KABOOM!, desktop

SuperBreakout is only about 10ms slower on the return (1 frame). I have always found it to be a bit jittery compared to other games, and this seems to confirm that. I interpret slower return times to mean the game logic is performing smoothing on the paddle input and damping the responsiveness. It has been confirmed that this exists in the Warlords game logic. KABOOM! looks to have a comparatively even larger amount of this - on the order of 6 frames.
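To illustrate what that kind of smoothing does to response (purely a hypothetical averaging scheme, not the actual code in any of these games), a game that averages the last N per-frame paddle readings will trail a sudden move by several frames:

```python
# Hypothetical illustration only: smoothing paddle input by averaging the
# last N per-frame readings makes the on-screen paddle trail a sudden move
# by several frames, which shows up as a longer "return" time.
def smoothed_positions(readings, window):
    history = []
    out = []
    for r in readings:
        history.append(r)
        history = history[-window:]          # keep the last `window` readings
        out.append(sum(history) / len(history))
    return out

# Paddle jumps instantly from 0 to 100 at frame 5.
raw = [0] * 5 + [100] * 10
for window in (1, 4, 6):
    pos = smoothed_positions(raw, window)
    settled = next(i for i, p in enumerate(pos) if p >= 99)
    print(f"window={window}: settles {settled - 5} frame(s) after the move")
```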

 

Tom

Link to comment
Share on other sites

6a) again with a different game

120-170ms - real 7800, Warlords

100-150ms - Stella, Warlords, laptop

Warlords is slower than SuperBreakout. Not a surprise to me, as Warlords has a definite perceivable delay. My surprise was that it was not higher, but more on that to follow.

 

Yep - Warlords only reads 1 paddle per frame, taking 4 frames to read all of them. It does this even if there are fewer than 4 players playing. When I wrote Medieval Mayhem I was able to squeeze in 2 paddles per frame, so the controls feel a little bit nicer.
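A rough sketch of the round-robin idea (not the actual Warlords or Medieval Mayhem source): with 1 paddle read per frame a given paddle's value can be up to 4 frames stale, while 2 per frame halves that.

```python
# Rough sketch of round-robin paddle reading (not actual game source).
# With 1 paddle read per frame, a given paddle is only re-read every
# 4 frames; with 2 per frame, every 2 frames - so its on-screen position
# can lag the real knob by up to that many frames.
NUM_PADDLES = 4

def paddles_read_this_frame(frame, per_frame):
    start = (frame * per_frame) % NUM_PADDLES
    return [(start + i) % NUM_PADDLES for i in range(per_frame)]

for per_frame in (1, 2):
    schedule = [paddles_read_this_frame(f, per_frame) for f in range(4)]
    print(f"{per_frame}/frame: {schedule}  -> worst-case staleness "
          f"{NUM_PADDLES // per_frame} frames")
```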

Link to comment
Share on other sites

I would have expected emulation to be slower - I assume you used a CRT on the real machine?

 

On real hardware you're reading the controllers in real time, except for paddles, where the best case is usually multiple scanlines.

With real hardware on an old-style CRT that doesn't do any fancy buffering or frame-doubling, what you see is when it happens.

 

Emulation is generally frame-based for controller stuff; anything faster isn't usually necessary. Plus, with USB you have lag waiting for the data to come in, which could vary and even be fairly long if a badly written driver is used.

 

Emulation builds up a frame of graphics and will generally display it once built; the delay there might be up to 1 frame at the monitor's refresh rate.
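In code terms, something like this generic frame loop (just a sketch of the typical structure, not Stella's or any other emulator's actual code - the emulator/display/controller objects are placeholders):

```python
# Generic sketch of a frame-based emulator main loop (not Stella's code).
# Input is sampled once per emulated frame, the whole frame is rendered
# into a buffer, and only then is it presented - so an input change can
# wait up to a frame before it is even seen, plus display/present latency.
import time

FRAME_SECONDS = 1.0 / 60.0

def run(emulator, display, controller, frames):
    for _ in range(frames):
        start = time.monotonic()
        state = controller.poll()               # one input sample per frame
        emulator.run_one_frame(state)           # build the frame off-screen
        display.present(emulator.framebuffer)   # hand it to the OS/GPU/monitor
        # sleep off the remainder of the frame period
        remaining = FRAME_SECONDS - (time.monotonic() - start)
        if remaining > 0:
            time.sleep(remaining)
```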

 

 

One discrepancy though - what if the real machine isn't coping with the rapid change in Pot voltage?

For digital sampling at least, I found that there's a delay in rise time when going from 0 to logic 1; maybe a similar situation exists if Pot inputs go through sudden increases.

Link to comment
Share on other sites

I would have expected emulation to be slower - I assume you used a CRT on the real machine?

 

No, I used an LCD TV. I haven't had a CRT TV for some time, or I would have tried that.

 

On real hardware you're reading the controllers in real time, except for paddles, where the best case is usually multiple scanlines.

With real hardware on an old-style CRT that doesn't do any fancy buffering or frame-doubling, what you see is when it happens.

 

Agreed, the 2600 is directly outputting to the CRT TV gun as it scans, in real time. On the LCD TV there is now probably some added buffering, which is why I also compared against emulation using the same LCD TV as the display (though the video signals were different formats).

 

Emulation is generally frame-based for controller stuff; anything faster isn't usually necessary. Plus, with USB you have lag waiting for the data to come in, which could vary and even be fairly long if a badly written driver is used.

 

Emulation builds up a frame of graphics and will generally display it once built; the delay there might be up to 1 frame at the monitor's refresh rate.

 

If the frame rate is the same, I don't see a way emulation could be faster. In my tests 6) & 6a) above, I think the emulation tested faster due to the TV processing different video input signals.

 

We can calculate the USB transmission time. The 2600-daptor, for example, only sends 5 data bytes per packet (which it does every millisecond) and transmits at USB full speed, 12 Mbps -

(5 bytes * 8 bits-per-byte) / (12,000,000 bits-per-second) ≈ 0.0000033 seconds (about 3.3 microseconds).

Even for USB low speed (1.5 Mbps), it would only be about 27 microseconds.
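For anyone who wants to check the arithmetic (payload size and bit rates as above; USB framing overhead adds a little, but nowhere near milliseconds):

```python
# Raw serial transmission time for a 5-byte payload, ignoring USB
# protocol/framing overhead (which adds a bit, but not milliseconds).
PAYLOAD_BITS = 5 * 8

for name, bps in (("full speed", 12_000_000), ("low speed", 1_500_000)):
    seconds = PAYLOAD_BITS / bps
    print(f"{name}: {seconds * 1e6:.1f} microseconds")
# full speed: ~3.3 us, low speed: ~26.7 us - tiny next to the
# millisecond-scale delays measured above
```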

The computer will have a USB chipset to take care of the initial receiving/processing of the USB packet. After that it is up to the OS/driver, but ignoring poorly written code, there is very little data for it to handle, and it seems there is very little that needs to be done with that data.

 

On the display side however, there is much much more data to pass through the OS, video driver, video card, and display.

 

One discrepancy though - what if the real machine isn't coping with the rapid change in Pot voltage?

For digital sampling at least, I found that there's a delay in rise time when going from 0 to logic 1; maybe a similar situation exists if Pot inputs go through sudden increases.

 

As I recall, TTL gate delays are on the order of nanoseconds, so I really don't see how delays there would be significant (at least not compared to the millisecond delays I was measuring).

 

Tom

Link to comment
Share on other sites

By using a photo cell, you are adding another variation you have to consider, because the photo cell will react at a certain brightness level.

 

So if e.g. the emulator has a slightly brighter palette (or just the one color you tested is brighter), the photo cell will react a bit faster.

 

I remember my own experiments with Cosmic Ark almost 30 years ago. I couldn't keep up with the left and right meteors beyond a certain level, but I wanted to see all the creatures. So I put two photocells left and right which triggered fire automatically. This worked quite nicely until a certain level. But then it either fired too late or even missed the meteor completely. I could compensate for some extra levels by using a higher voltage for the photocell and by making the TV picture very bright.

Link to comment
Share on other sites

By using a photo cell, you are adding another variation you have to consider, because the photo cell will react at a certain brightness level.

 

So if e.g. the emulator has a slightly brighter palette (or just the one color you tested is brighter), the photo cell will react a bit faster.

 

I remember my own experiments with Cosmic Ark almost 30 years ago. I couldn't keep up with the left and right meteors beyond a certain level, but I wanted to see all the creatures. So I put two photocells left and right which triggered fire automatically. This worked quite nicely until a certain level. But then it either fired too late or even missed the meteor completely. I could compensate for some extra levels by using a higher voltage for the photocell and by making the TV picture very bright.

 

The photo-cell I'm using outputs an analog value that is continuous in relation to the brightness. I'm not sure if you are thinking of its output as an ON/OFF switch that changes at a particular fixed level. I look at the scope trace and can see where the photo-cell output has changed, so the absolute level is not relevant. On KABOOM!, the buckets and background are almost the same brightness, and the change is small, but it is visible on the scope trace.

 

Tom

Link to comment
Share on other sites

Outstanding work. I'm seeing similar numbers using a Pentium-M Dothan eco-clocked to 500MHz.

 

I find the biggest variable to be the LCD monitor, as it buffers the frames. And double or triple buffering on the graphics card makes a difference too.

 

Cool, so you are trying this same type of test? It has been my guess that any remotely modern processor would not be a bottleneck.

 

I have also been wondering if the main source of the 2-3 frame delay is video buffering. I still have a CRT monitor that I will have to try - though I wonder if it, too, will buffer the frame.

 

Tom

Link to comment
Share on other sites

But the change is never immediate, is it? It may take some milliseconds. And I suppose the bigger the difference is, the steeper the curve becomes, right?

 

So where do you put the benchmark?

 

It does take some time to change, but I can't say if that is the photo-cell or the monitor. If you look at the purple trace, which is the photo-cell -

http://home.comcast.net/~tjhafner/analog%20response%20screen%20shot.JPG

I benchmark off the point where it starts to drop off.

 

This is certainly not a perfect test and it has its unknowns, but I think we can still get some interesting information out of it.

 

Tom

Link to comment
Share on other sites

VERDICT: I need those frames! No STELLA for me!

 

I take it that by STELLA you mean emulation in general, as Stella tested faster than MESS.

 

What display will you run a real system on? My LCD TV tested slower than my LCD monitor ;). Obviously they will vary, but nevertheless... On a CRT, you will have phosphor delays.

 

I wish I still had an (older) CRT TV to test with - that would be the real comparison.

 

Tom

Link to comment
Share on other sites

I dug up an old analog CRT monitor and did a couple more tests, with interesting results. This monitor did create a problem, though, as the photo-cell is picking up the scanning of the gun, and the trace now appears as a series of spikes (purple trace) -

http://home.comcast....creen shot2.JPG

There is a matter of subjectivity in interpolating along the spike peaks as to when the paddle move started.

 

8) 20-40ms - Stella, Breakout, desktop, analog CRT monitor

Compared back to 1), this is about 20ms faster. Now down to a delay of about 1.2-2.4 frames.

 

8a) 5-15ms - Stella set to 300 frames per second, rest same as 8)

As in 2), turning up the frame rate should give an indication of how much of the delay is in the emulation (inside the frame loop, which includes delays created by the game logic itself) vs. outside, from things like the adaptor, USB, OS, video card, display, etc.

 

This puts a cap on how long these outside delays can be: the total response can never be shorter than the outside delays alone, so they can't be more than the 5ms of the fastest measurement. I had some tests where it appeared to be down to 2ms, but there is the question of subjectivity in reading the spiked trace.
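Spelled out (frame periods from the rates above, 5ms being the fastest total response seen in 8a):

```python
# Why 8a) bounds the "outside" delay: the total response can't be shorter
# than the outside delay alone, so the fastest total measured is an upper
# bound on it. Figures are the measurements quoted above.
frame_ms_60 = 1000 / 60     # ~16.7 ms per frame at the normal rate
frame_ms_300 = 1000 / 300   # ~3.3 ms per frame at the cranked-up rate
fastest_total_ms = 5        # quickest response seen in test 8a)

print(f"emulated frame period at 300 fps: {frame_ms_300:.1f} ms")
print(f"outside delay (USB/OS/video/display) <= {fastest_total_ms} ms")
```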

 

Tom

Link to comment
Share on other sites

No surprise to me. CRTs have a maximum delay of 1 frame (PAL). That's where modern LCDs start; often you will find 2 or 3 frames, sometimes even more.

 

I find it hard to believe that you can measure anything below 1/60 of a second (~16ms). That should be impossible for a 60Hz display. Or is your LCD running at a higher frame rate?

Link to comment
Share on other sites

No surprise to me. CRTs have a maximum delay of 1 frame (PAL). That's where modern LCDs start; often you will find 2 or 3 frames, sometimes even more.

 

I find it hard to believe that you can measure anything below 1/60 of a second (~16ms). That should be impossible for a 60Hz display. Or is your LCD running at a higher frame rate?

 

My LCD monitor is set for 85 Hz refresh. I didn't get measurements below 16ms until I tried the analog CRT, and you can see the scanning in the trace. For an analog CRT (at least NTSC, I don't know about PAL), the input signal drives the gun, so a change in the video signal will start appearing immediately on the current frame/scan-line - no minimum 16ms delay. It's random where the gun is at the time we want to change the screen (i.e., when the button is pressed), so the wait until the gun reaches that point on the screen is also random. Note the low times were with Stella set to 300 frames/second.
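A rough back-of-the-envelope for that scan-out wait, assuming a ~60Hz refresh - where you land in the 0-16.7ms range depends on where the beam happens to be:

```python
# Back-of-envelope for the CRT scan-out wait: once the frame content has
# changed, the beam still has to sweep down to the photo-cell's position.
# Assuming ~60 Hz refresh, that wait is somewhere in 0..16.7 ms depending
# on where the beam happened to be, averaging about half a frame.
REFRESH_HZ = 60
frame_ms = 1000 / REFRESH_HZ

print(f"worst-case scan-out wait: {frame_ms:.1f} ms")
print(f"average scan-out wait:    {frame_ms / 2:.1f} ms")
```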

 

I suspect there are CRTs that are digital at heart - the analog input signal is converted to digital and stored in a memory buffer by a processor, then converted back to analog to drive the CRT display - and these would have a 1 frame delay.

 

I was not sure what results the analog CRT would give, as I didn't know how the phosphors would react.

 

Tom

Link to comment
Share on other sites

For an analog CRT (at least NTSC, I don't know about PAL), the input signal drives the gun, so a change in the video signal will start appearing immediately on the current frame/scan-line - no minimum 16ms delay. It's random where the gun is at the time we want to change the screen (i.e., when the button is pressed), so the wait until the gun reaches that point on the screen is also random. Note the low times were with Stella set to 300 frames/second.

True, but since the spike happens every 60th of a second, how do you know when e.g. the paddle starts moving and when it stops, when all you see are spikes? The movement could have started immediately after the last unchanged spike or up to almost a full refresh period (~17ms) later.

 

I suspect there are CRTs that are digital at heart - the analog input signal is converted to digital and stored in a memory buffer by a processor, then converted back to analog to drive the CRT display - and these would have a 1 frame delay.

PAL TVs have that delay, since they combine the color signals of two frames (to avoid NTSC's color errors). This is done by an analog delay device.

Link to comment
Share on other sites

As has been noted, I really think CRTs are the gold standard here. I'd be especially curious about seeing Stella on a CRT monitor compared with a real Atari on a CRT TV. I have a strong suspicion that Stella on a CRT monitor would completely outperform a real Atari on an LCD TV, given the horrible lag I've encountered with the Atari on flatscreen TVs.

 

More generally, wouldn't the best lag test of all be an audio test? You can easily get resolution down to the millisecond by just inspecting the waveform, as recorded from the TV speakers. All you'd need would be a test ROM that listens (on every scanline?) for a button press, and then emits a square wave as soon as that happens. Put a contact microphone on the joystick button, and one in front of the speakers, and by recording them to the L and R channels of a stereo file, you could see the exact delay in any setup.

 

I'm sure there must be a way to use the same setup to time paddle lag, by triggering sounds with the motion of the pot. Maybe you could add a mechanical device on the paddle that would audibly "click" whenever it passes a certain point?
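For what it's worth, the analysis side of such an audio test would be simple; here's a rough sketch (the file name and threshold are placeholders) that reads a stereo WAV with the button mic on the left channel and the speaker mic on the right, and reports the gap between the first loud sample on each:

```python
# Rough sketch of the analysis for the proposed audio lag test.
# Assumes a 16-bit stereo WAV: left = contact mic on the button,
# right = mic at the TV speaker. File name and threshold are placeholders.
import wave, struct

THRESHOLD = 5000  # amplitude that counts as "sound started" (tune per setup)

with wave.open("lag_test.wav", "rb") as w:
    rate = w.getframerate()
    frames = w.readframes(w.getnframes())

samples = struct.unpack("<%dh" % (len(frames) // 2), frames)
left, right = samples[0::2], samples[1::2]

def first_loud(channel):
    return next(i for i, s in enumerate(channel) if abs(s) > THRESHOLD)

lag_ms = (first_loud(right) - first_loud(left)) * 1000.0 / rate
print(f"button-to-speaker lag: {lag_ms:.1f} ms")
```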

Link to comment
Share on other sites

I suspect there are CRTs that are digital at heart - the analog input signal is converted to digital and stored in a memory buffer by a processor, then converted back to analog to drive the CRT display - and these would have a 1 frame delay.

 

There indeed are such CRTs. Around 2000, 100Hz CRT TVs started to appear in Europe. Such a TV, when in 100Hz mode, buffers the last two fields (in interlaced video, two fields create one frame) and basically displays them twice, effectively displaying the screen at 100Hz.

 

PAL TVs have that delay, since they combine the color signals of two frames (to avoid NTSC's color errors).

 

Not true. They combine color signals of two consecutive scanlines, not frames. No full-frame delay here.

Link to comment
Share on other sites

I take it that by STELLA you mean emulation in general, as Stella tested faster than MESS.

 

What display will you run a real system on? My LCD TV tested slower than my LCD monitor ;). Obviously they will vary, but nevertheless... On a CRT, you will have phosphor delays.

 

I wish I still had an (older) CRT TV to test with - that would be the real comparison.

 

Tom

 

Oh don't mind me, I'm just being an idiot :)

I'm just never satisfied with most emulation unless it's running on the real hardware (which I guess isn't fully emulation, lol)...I find lag and delay just about everywhere I look, which is why I keep about three old consoles in minty shape so I can always get the 'real deal', so to speak! I commend your work, however, and it's cool to see these stats.

Link to comment
Share on other sites

USB times wouldn't be quite as straightforward as just the serial transmission time at whatever speed is active.

 

There are all manner of middleman delays on the software/driver/OS side, then delay from the device itself before it replies, then delay again on the PC side before the received packet becomes available and is passed back to the end-user application.

 

That said, though, an emulated Atari console/computer game is likely to be polling at only 50 or 60 Hz, and the PC could likely handle something polling at 4-5 times that frequency at least.

Link to comment
Share on other sites

True, but since the spike happens every 60th of a second, how do you know when e.g. the paddle starts moving and when it stops, when all you see are spikes? The movement could have started immediately after the last unchanged spike or up to almost a full refresh period (~17ms) later.

 

The change in the paddle value occurs with the switch press (the yellow line) and is not synchronized with the screen refresh, so it could happen anywhere between the spikes. For example -

http://home.comcast....creen shot3.JPG

The green grid lines are 20ms apart, and the next spike is just under the 3rd tick mark (the tick marks are 4ms apart), so about 12ms after the switch was pressed. Its peak is significantly reduced, telling me the paddle has already moved by that time. Stella is set for 300 frames/second, so it is generating 5 frames between each peak. As the peak is quite low, I think Stella saw the analog change several frames (at the 300fps rate) before the one that actually got displayed here. It is getting subjective to say how much earlier Stella actually saw the change (I would say at about 8ms), but we can be certain the response did not exceed 12ms.
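A quick sanity check on those numbers, assuming the CRT refreshes at about 60Hz (which is what "5 frames between each peak" at 300fps implies):

```python
# Sanity check on the spike spacing, assuming the CRT refreshes at ~60 Hz
# (which is what "5 frames between each peak" at 300 fps implies).
crt_refresh_hz = 60
stella_fps = 300

spike_spacing_ms = 1000 / crt_refresh_hz   # ~16.7 ms between spikes
stella_frame_ms = 1000 / stella_fps        # ~3.3 ms per emulated frame

print(f"spike spacing:         {spike_spacing_ms:.1f} ms")
print(f"emulated frames/spike: {spike_spacing_ms / stella_frame_ms:.0f}")
# The scope grid shows the next (already-reduced) spike ~12 ms after the
# switch press, so the total response here is bounded by ~12 ms.
```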

 

Tom

Link to comment
Share on other sites

More generally, wouldn't the best lag test of all be an audio test? You can easily get resolution down to the millisecond by just inspecting the waveform, as recorded from the TV speakers. All you'd need would be a test ROM that listens (on every scanline?) for a button press, and then emits a square wave as soon as that happens. Put a contact microphone on the joystick button, and one in front of the speakers, and by recording them to the L and R channels of a stereo file, you could see the exact delay in any setup.

 

I'm sure there must be a way to use the same setup to time paddle lag, by triggering sounds with the motion of the pot. Maybe you could add a mechanical device on the paddle that would audibly "click" whenever it passes a certain point?

 

Agreed, I had thought of audio, but I don't have a way to make a test ROM. That would eliminate all the variables of the video output.

 

It was my initial intent to focus more on input delays, but I think delays in the video will also be perceived as laggy, poor controller response. It is the total hand-to-eyeball time. It looks to me like the video delays (the display particularly) far exceed the input delays from USB and the software/driver/OS.

 

Tom

 

Edit: I like that these tests have been done with the actual games. I think it has also helped point out that there are delays within some of the games created by the game code itself (things like not reading the paddles every frame, or smoothing/damping the input).

Edited by dualcam
Link to comment
Share on other sites
