
Why do many scan doublers/TV grabber cards/LCD TVs suck on the A8?


Beetle


Hi

 

Some basics first; I have simplified things a bit. If I am wrong on one detail or another, let me know.

 

Standard TV systems (NTSC or PAL, even SECAM) use interlaced video at 60 Hz (NTSC) or 50 Hz (PAL/SECAM). That means your TV set (old CRT style) displays 60 (50) fields per second. The even field carries the even line numbers of the full image (lines 2, 4, 6, 8, etc.), the odd field carries line numbers 1, 3, 5, 7, 9, and so on. The human eye/brain is too slow to really see two different pictures and merges the fields into one picture. Besides that, the flicker of the interlaced lines is visible, but that's another story.

 

VGA monitors and LCD/TFT displays will only display progressive video. That means the full picture with all lines (1, 2, 3, 4, 5, 6, 7, 8, 9, ...) is displayed 60 (50) times per second. To do this, the input video source (in our case our beloved A8 machines) has to be deinterlaced: both fields, even and odd, have to be grabbed and woven together into one progressive picture. Here lies the problem: the A8 has only one of the fields filled with content; the other field contains just blank black lines.
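(For illustration, here is a rough Python sketch of the "weave" deinterlacing step described above, using made-up line data rather than anything captured from an A8:)

    # Weave deinterlacing: interleave the odd and even fields into one
    # progressive frame. Hypothetical data, purely to illustrate the idea.
    odd_field  = ["odd line 1", "odd line 3", "odd line 5"]     # lines 1, 3, 5, ...
    even_field = ["even line 2", "even line 4", "even line 6"]  # lines 2, 4, 6, ...

    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)    # scan line 1, 3, 5, ...
        frame.append(even_line)   # scan line 2, 4, 6, ...

    print(frame)
    # If one of the two fields is blank (as described above for the A8),
    # every second line of the woven frame comes out black, which is what
    # seems to confuse some deinterlacers.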

 

This was done for one reason: to reduce flicker. If a single horizontal line were two TV lines thick, it would have one line in the odd and one line in the even field. Since these two fields alternate 25 times per second, the line would never stand still but would flicker like hell. On "normal" TV sets used for computers back in the day (12" to 15"), the "missing" lines in the image wouldn't be visible (and still aren't). Good monochrome monitors will show the scanlines far better.

 

Quite a few de-interlacers built into set-top boxes, LCD TVs and PC video input cards get confused by these blank lines (I think they are not completely blank; the sync signal will still be there, otherwise no TV would display the picture).

 

 

On my home theater projector's screen (an old Sony CRT monster) I can show the lines very well. It is about 90" diagonal, so you can see them in detail.

 

 

EDIT: (I had trouble uploading the other images; see below for the pictures.)


According to the Atari system hardware manual, the 8-bits do not do interlacing; the odd and even frames are the same. My experience with the VGA converters so far is that the problem is upscaling the image to be viewed at the LCD's native resolution. Plus, a lot of converters will only upscale the 8-bit image to a non-native resolution for the LCD.


puppetmark is right. The A8, like most vintage consoles/computers, doesn't output an interlaced signal.

 

Each field is synced at the exact same position, using a progressive mode. It is not a "full" progressive mode, because only half of the TV lines are displayed, resulting in half the normal vertical resolution.

 

Sometimes this is described as single-field video. But it is single field only in the sense that the scan lines displayed are those of just one of the two fields.

 

Many scan doublers have trouble with this "progressive" mode.

 

If, in addition, you are using an LCD display, there is the issue of scaling to the monitor's native resolution and adapting to the monitor's native frequency. Most LCD monitors can't do 50 Hz.

 

Usually the best scan doublers are those integrated into high-quality LCD TVs. But it is simply not possible to do a perfect scaling/conversion. What would you do if a game is animated at 50 fps? Unless the monitor can sync to the original frequency or to an exact multiple of it (most won't be able to), you will always get delays and/or distortion.
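(A small Python sketch of that last point, with illustrative numbers only: it shows which 50 Hz source frame a 60 Hz display ends up showing on each refresh if the converter simply repeats the latest available frame.)

    # 50 fps source on a 60 Hz display, naive "show the latest frame" policy.
    for refresh in range(12):                 # 12 refreshes = 0.2 s at 60 Hz
        t = refresh / 60.0                    # time of this display refresh
        src_frame = int(t * 50.0)             # latest 50 Hz frame available
        print(f"60 Hz refresh {refresh:2d} -> 50 Hz source frame {src_frame}")
    # Some source frames are shown twice and others only once, so motion that
    # is perfectly smooth at 50 fps stutters (judders) at 60 Hz.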


The difference between the Atari's video output (or an NES, or any other non-interlaced 15 kHz video) and a standard (interlaced) TV signal is that the Atari always scans the same lines. An interlaced signal has slightly different timing between the even and odd fields, since one field starts scanning a little earlier so that its lines fall between the lines of the previous field. The Atari is really 320x192 at 60 Hz (or whatever) progressive. The reason for scanlines is that the dots on the TV are small enough to accommodate the higher resolution of an interlaced picture, so while the Atari may only show 192 lines, they still aren't any "thicker" than they would be with an interlaced picture. Likewise, on my PC monitor I can see scanlines if I look closely at 800x600, because the dot pitch is really small enough for 1280x1024.

 

As for TV cards, the ones I've seen just assume that any input is interlaced, so something like the Atari will simply be scan-doubled and no scanlines will show up at all. The downside is that when things move around the screen at 60 Hz or blink on and off rapidly, you'll see a comb effect.
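(Here is a rough Python sketch of why that comb shows up, with made-up picture data: the card treats two successive progressive frames as the two fields of one interlaced frame and weaves them.)

    # A vertical edge moving one pixel per frame; the capture card weaves two
    # consecutive frames as if they were an even and an odd field.
    def frame_at(t):
        x = 10 + t                                  # edge position at frame t
        return [" " * x + "#" for _ in range(6)]    # six identical scan lines

    field_a = frame_at(0)   # treated as the "even" field
    field_b = frame_at(1)   # next frame, treated as the "odd" field

    woven = []
    for a, b in zip(field_a, field_b):
        woven.append(a)
        woven.append(b)

    print("\n".join(woven))
    # Alternate lines show the edge at x=10 and x=11: the comb/feathering
    # artifact on anything that moves or flickers per frame.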


What we really need is an LCD controller that can digitize the 8-bit video, process it, then display it in a way that will "emulate" what a CRT does. In other words, it would display the video in digital scan lines instead of a matrix of pixels. It's a tall order because it would take a lot of processing power, but not impossible.


What we really need is an LCD controller that can digitize the 8-bit video, process it, then display it in a way that will "emulate" what a CRT does. In other words, it would display the video in digital scan lines instead of a matrix of pixels. It's a tall order because it would take a lot of processing power, but not impossible.

 

I think what one would want would be a scan doubler that would simply output black on alternate scan lines--the same set every field. Don't know if anyone makes such a beast.
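(A minimal Python sketch of what such a scan doubler would do, using a fake frame: each incoming 15 kHz line is output once at VGA rate followed by a black line, and the same set of lines is used every field so the CRT-style scanline look is preserved.)

    def double_with_scanlines(picture_lines, width=320):
        black = [0] * width            # an all-black output line
        doubled = []
        for line in picture_lines:
            doubled.append(line)       # the real picture line
            doubled.append(black)      # the dark gap a CRT would leave
        return doubled

    fake_frame = [[c % 16] * 320 for c in range(192)]   # hypothetical 192-line frame
    vga_frame = double_with_scanlines(fake_frame)
    print(len(fake_frame), "->", len(vga_frame), "lines")   # 192 -> 384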


Here's an older thread on this.

This should work on an LCD TV as well:

 

 

http://www.atariage.com/forums/index.php?s...p;#entry1241047

 

 

 

 

 


Here is the output of my S-Video 2.1a upgraded 800XL.

Notice the clearly visible lines on my screen; the lit lines have the same thickness as the black ones. Also notice that the shot is sharp, you can see my finger.

(I currently have trouble uploading the other images, trying again tomorrow.)


The main problem is not "emulating a CRT". High-quality LCD TVs do that. The main problem is not the blank lines or the progressive mode either. Some have trouble with that, but many do not.

The two main problems are: using a non-native resolution, and altering the original vertical frequency. There is no perfect solution for this, especially for the latter.


You don't believe me. Okay, I can live with that. But I believe what I see on my screen. For some reason, uploading images to AtariAge didn't work for me last night.

 

Here are the pics that were supposed to be in my initial posting.

 

First, a shot of my projection screen displaying the DVD player's menu with progressive scan.

progressive-m.jpg

No lines are visible; all of them are displayed every frame.

 

 

This is what the same image looks like in interlaced video. I chose an exposure time long enough to show both fields.

interlaced-m.jpg

Because each line is displayed less often, the picture is less bright and the scanlines become visible.

 

 

Now the Atari picture. Notice how far separated the lines are here; you could literally count them.

atari-m.jpg

 

 

I can see how these lines interlace when I look at my camera's TFT screen. In the interlaced picture above, the characters become hard to read on that screen because sometimes it captures the even field and other times the odd field. With the Atari picture on the cam's TFT there is sometimes a visible picture, sometimes not. This is because my digicam's internal screen works at 30 fps while my PAL equipment runs at 50 Hz.


In addition to ijor's comments:

The sharpness and accurate pixel representation of LCDs count against them when it comes to displaying output from the Atari, as far as showing the blank scanlines is concerned.

On a CRT TV you get colour bleed, which makes the blank lines all but disappear. On an LCD that's not the case, since pixels tend not to illuminate or alter the colour of adjacent ones.

 

Too bad LCD TVs don't have an option to blend non-standard displays as generated by the Atari and most legacy computers and consoles.


+1

 

Non-native resolutions on LCDs look crap, no matter what the source.

This has been my quandary with buying a TFT to replace the old CRT. LCDs look the part and take up so much less desk space, but the downside is that they suck at anything but the native res due to scaling, and most scalers are slow and give an uneven resize... Well, I've taken the plunge and just bought an LCD TV this morning with, as the manufacturer says, "state of the art fast image scaling". I will see how this performs later tonight. I needed to get an LCD because it also has VGA input for my towered Amiga; I just don't have the room for both a CRT monitor and a 15" CRT TV... Keeping my fingers crossed.

+1

 

Non-native resolutions on LCDs look crap, no matter what the source.

This has been my quandary with buying a TFT to replace the old CRT. LCDs look the part and take up so much less desk space, but the downside is that they suck at anything but the native res due to scaling, and most scalers are slow and give an uneven resize... Well, I've taken the plunge and just bought an LCD TV this morning with, as the manufacturer says, "state of the art fast image scaling". I will see how this performs later tonight. I needed to get an LCD because it also has VGA input for my towered Amiga; I just don't have the room for both a CRT monitor and a 15" CRT TV... Keeping my fingers crossed.

 

That certainly used to be true, but my recent (US NTSC) experience is quite different. I have 17" Samsung and LG LCD monitors that look great at several different resolutions. My wife likes hers at 1024 x 768 (bigger icons on a 1280 x 1024 native LCD) while I prefer mine at the full 1280 x 1024 to get more on the screen. Even 800 x 600 looks good. But there are probably some resolutions that these LCDs won't display, and it likely has something to do with the built-in scaling factors.

 

Conversely, I bought a nice 17" CRT several years ago, and although it displayed fine in several resolutions, my Atari absolutely sucked on it.

 

As far as the Atari goes, I have had the best luck with ATI's All-In-Wonder cards: a beautiful display on my LCDs, and scalable from very small to full screen (overkill on a 17" monitor). I have also had very good luck with the Hauppauge Win-TV cards, and similar luck with stand-alone XGA upscalers, although I've found that *in general* the cheaper ones don't do well and the more expensive ones do fine. (I've sent several back to the sellers.)

 

My advice: check several kinds of monitors and ask about display scalability.

 

-Larry


In addition to ijor's comments:

The sharpness and accurate pixel representation of LCDs count against them when it comes to displaying output from the Atari, as far as showing the blank scanlines is concerned.

On a CRT TV you get colour bleed, which makes the blank lines all but disappear. On an LCD that's not the case, since pixels tend not to illuminate or alter the colour of adjacent ones.

 

Too bad LCD TVs don't have an option to blend non-standard displays as generated by the Atari and most legacy computers and consoles.

 

I agree. The bottom line is that a custom controller/converter is needed to display the 8-bit video properly, one that can take these issues into consideration. But what I also now understand is that the controller needs to be connected to an LCD with the correct native resolution and frequency.


My ATI card has the option to upscale. In that case, the work is done in the video card and it outputs at native resolution, although you still get the same washed out look.

 

Old computers really shouldn't look that bad aside from the blank lines issue. Scaling 340 or so pixels up to 1280 would look somewhat better than upscaling 1024 to 1280.
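(A quick Python check of that scaling point, with illustrative numbers only: with simple nearest-neighbour scaling, each source pixel ends up either floor(f) or ceil(f) LCD pixels wide, where f is the scale factor, and the larger f is, the less visible that one-pixel variation becomes.)

    for src in (340, 1024):
        f = 1280 / src
        print(f"{src:4d} -> 1280: factor {f:.2f}, "
              f"pixel widths vary between {int(f)} and {int(f) + 1}")
    # 340 -> 1280: factor ~3.76, pixels come out 3 or 4 LCD pixels wide
    #              (a small relative variation).
    # 1024 -> 1280: factor 1.25, pixels come out 1 or 2 LCD pixels wide
    #              (a very visible variation).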


Here lies the problem: the A8 has only one of the fields filled with content; the other field contains just blank black lines.

 

This bit can't be right. There are 50/60 fields per second, and if every other field consisted of blank black lines, the Atari screen would flicker like mad because its effective refresh rate would only be 25/30 frames/sec.

 

We'd all be blind or crazy after all these years of staring at that...

 

Also: there are definitely 50/60 VBLANKs per second (RTCLOK counts 50/60 jiffies per second, for one thing). If you write a VBI routine that changes the background color 50 or 60 times per second (red on even frames, blue on odd frames), what you see is a flickering blend of the two colors (purple). If every other field were truly all black, you'd only see one color (only red, or only blue).


If you use "mplayer" with your TV capture card, you can get rid of the interlace issue: tell it to capture 60 frames/sec, then tell it to discard every other frame. The command line looks something like:

 

mplayer tv://1 -tv driver=v4l2:input=1:norm=NTSC:fps=59.94 -vf field=0

 

Of course, you only get 30 frames/sec doing this, so any games that use flicker won't work right, and games that don't use flicker will still look less smooth (if your character is moving one pixel per frame 60 times/sec, he'll appear to move 2 pixels at a time, 30 times/sec).

 

BTW: mplayer is usually used on Linux or other UNIX-like OSes, but it does work just fine on Windows... see http://www.mplayerhq.hu/


There are 50/60 fields per second.

 

But, they're the same field over again, not alternating fields as per the interlace standard.

 

There are black lines because of the absence of the other field.

 

More precisely: the video output doesn't conform 100% to the official PAL/NTSC interlaced standards, since ANTIC only outputs 312 (or 262) scanlines per field instead of 312.5 (or 262.5). This is also documented in the ANTIC data sheet. This missing half scanline per field is the reason the TV doesn't shift every other field by half a scanline.

 

Strictly speaking, there are no fields here, and the Atari outputs 50 (or 60) progressive frames per second, each with half the vertical resolution of a PAL/NTSC frame.
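(A tiny Python sketch of that half-scanline argument, using the standard broadcast line counts rather than anything from the ANTIC data sheet: only the fractional part of the field length shifts the vertical position of the next field.)

    def vertical_phase_offset(lines_per_field):
        # How far (in scan lines) each field starts below the previous one;
        # whole lines land in the same vertical position, so only the
        # fractional part matters.
        return lines_per_field % 1

    print(vertical_phase_offset(312.5))  # broadcast PAL field: 0.5-line shift,
                                         # so successive fields interleave
    print(vertical_phase_offset(312))    # ANTIC field: 0.0-line shift, so every
                                         # "field" overwrites the same lines
    # The NTSC case is the same with 262.5 vs 262.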

 

so long,

 

Hias


Beetle, I won't pretend to understand what goes on with A8 video, but perhaps showing you what I see will help someone else explain it.

 

My video card (an original PCI ATI All-in-Wonder) can do still captures of single fields of an NTSC frame. Video captures are only done on whole frames (both fields).

 

I believe both types of captures (of Flickerterm 80) demonstrate what the A8s are outputting.

 

[update]

The video was too big and I don't have the tools to re-encode it at the moment.

 

With some trickery I did manage to get a screen capture of just what I see on the All-in-Wonder. I can't explain the color variation.

 

Single Field Capture

A8_Still_Capture.bmp

 

Capture of Interlaced Video Snapshot

post-9154-1177447124_thumb.png

 

- Steve Sheppard

 


Here lies the problem: the A8 has only one of the fields filled with content; the other field contains just blank black lines.

 

This bit can't be right. There are 50/60 fields per second, and if every other field consisted of blank black lines, the Atari screen would flicker like mad because its effective refresh rate would only be 25/30 frames/sec.

 

We'd all be blind or crazy after all these years of staring at that...

 

Also: there are definitely 50/60 VBLANKs per second (RTCLOK counts 50/60 jiffies per second, for one thing). If you write a VBI routine that changes the background color 50 or 60 times per second (red on even frames, blue on odd frames), what you see is a flickering blend of the two colors (purple). If every other field were truly all black, you'd only see one color (only red, or only blue).

You are completely right. Nothing I can add to that.

 

 

There are 50/60 fields per second.

But, they're the same field over again, not alternating fields as per the interlace standard.

There are black lines because of the absence of the other field.

Your comment matches what I photographed, too.

 

 

There are 50/60 fields per second.

But, they're the same field over again, not alternating fields as per the interlace standard.

There are black lines because of the absence of the other field.

 

More precisely: the video output doesn't conform 100% to the official PAL/NTSC interlaced standards, since ANTIC only outputs 312 (or 262) scanlines per field instead of 312.5 (or 262.5). This is also documented in the ANTIC data sheet. This missing half scanline per field is the reason the TV doesn't shift every other field by half a scanline.

 

Strictly speaking, there are no fields here, and the Atari outputs 50 (or 60) progressive frames per second, each with half the vertical resolution of a PAL/NTSC frame.

Hias, that is exactly the technical explanation I was looking/asking for. It matches what my screenshots show and explains why "progressive scan with 312 scanlines" works on standard TV sets.

 

Thank you all for proving my assumptions wrong. That's what I call constructive work, guys!

 

 

Greetings,

Beetle


 

 

With some trickery I did manage to get a screen capture of just what I see on the All-in-Wonder. I can't explain the color variation.

 

 

 

That is showing 2 frames, blended together.

 

Every adjacent character is offset vertically as the card is assuming it's a normal interlaced display.

 

 

That's using VIVO on a normal AGP card I assume? Capture cards, especially ones integrated into AGP cards, are very prone to interference.

 

I find the best results usually come from using S-Video on a dedicated capture card - try to have it in a PCI slot well away from any other devices.

 

You might want to see if there's a "Spread Spectrum" option in your PC's BIOS - it can help: it slightly skews the system clocks and supposedly smooths the waveforms of the clock output, which can cut down the effects of RFI generated inside the computer.


Every adjacent character is offset vertically as the card is assuming it's a normal interlaced display.

 

Looking at that display, I would think that legibility of that particular program on a real screen would be improved if the A8's video output could be jinxed to provide an interlaced signal.

