
Opinion of Atari 5200


ATARI7800fan


Do you still say no such thing? :P

[screenshot attached]

 

Yep. It's possible for simpler graphics to be "better". Sometimes I love the fuzzy glow around 2600 graphics on an old CRT. Blocky shapes can stimulate the imagination, too. Did you read my quote above, about the kid who said that he liked radio more than TV "because the pictures were better"? I've never enjoyed a tank game more than 2600 Combat, because the graphics weren't distracting.


The 7800, with its primary use of the same low-res sprites (albeit colorful and non-flickering), 2600 sound quality, and choppy gameplay in many games, was just a 2600 with a bandaid.

 

7800 a bandaid?

 

Sorry dude - but if you think the 2600 or 5200 can pull off anything like Sirius, or Alien Brigade or Tower Toppler or Scrapyard Dog or Midnight Mutants or Commando, I'd like to see what you're smoking.

 

I get you having a preference, but referring to it as "just a 2600 with a bandaid", is ridiculous.

 

The entire graphics architecture is completely different. You might as well call the NES a "2600 with a bandaid", with that OVER-SIMPLIFICATION.

 

2600: 6507, TIA, 1 button controller, 128 bytes

NES: 6502, PPU, 2 button controller, 2K+2K

7800: 6502, MARIA, 2 button controller, 4K

 

I DO NOT understand for the life of me why some 5200 owners always feel the need to bring up and bash the 7800. Is it jealousy because the system got killed? I swear to god, someone could write a post on how they once played the 5200 on a beach in Hawaii, and some other 5200 owner would find a way to bring the 7800 into the topic and start bashing! And some 5200 basher will start commenting on the controller ...

 

Yes, A Bandaid!

I own both systems and almost every game made for them, so I have no reason to be jealous. However:

 

  1. You are in the 5200 thread, so why are you complaining about people boasting about the 5200?
  2. In your anger, you didn't see the big picture in your "NINTENDO is a 2600 with a bandaid" diatribe. My comment was deeper than having the same processor. It's HOW the different components work together that makes the difference.
  3. The 7800 is a 2600 with a MARIA chip and some more memory (for the most part). However, the main processor has to wait until MARIA is done drawing before it can calculate where to place the next sprite (hence the choppy movement in games). That is why the 8-bits AND the Nintendo have smoother gameplay in most games (especially logic-heavy games).
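To put rough numbers on the halting argument, here's a minimal C sketch. The halt-cycles-per-line figure is a made-up assumption for illustration only (real MARIA overhead depends entirely on the display lists); the clock and frame-rate constants are the real NTSC 7800 values.

```c
/* Back-of-the-envelope sketch of DMA cycle stealing. The halt figure
 * passed in is an illustrative assumption, not a measured MARIA number. */

/* Fraction of each frame's CPU cycles lost to DMA halts. */
double dma_share(double halt_cycles_per_line, double visible_lines) {
    const double cpu_hz = 1790000.0;               /* ~1.79 MHz 6502C (SALLY) */
    const double cycles_per_frame = cpu_hz / 60.0; /* NTSC: ~29,833 cycles */
    return (halt_cycles_per_line * visible_lines) / cycles_per_frame;
}
```

With a hypothetical average of 60 stolen cycles over 192 visible lines, nearly 40% of the frame's CPU time is gone, which makes the "choppy movement" complaint at least plausible arithmetic-wise.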

 

I know that on the 5200/8-bits, ANTIC halts the CPU when drawing to the screen, but it was never considered a big deal on the 8-bit. However, I have read a few articles criticizing MARIA for taking up a more-than-desirable amount of time. Maybe that explains why so many 7800 games have such CHOPPY frame rates and sloppy control. The 8-bits had ANTIC, GTIA, and POKEY working together. I have seen some impressive things on the 7800, but they're usually just displaying more colors, and the gameplay is sloppy or choppy.

 

That was all I was saying. The NES does have the same processor, but it doesn't halt when the PPU draws to the screen, nor do the 8-bits to the same extreme.

 

This is not the ONLY article that makes this claim, but I found it at short notice.

However, managing and displaying a large number of sprites required much more CPU time (both directly and indirectly since the MARIA would halt the CPU when drawing sprites) than consoles with hardware sprites and backgrounds.

 

If you had really read into my points: I think that MARIA is wasted on the weak hardware. Additionally, the resolution is the same blocky pixels with more colors. Obviously, each system is going to excel in an area where the other doesn't. My point is that the 7800 should have blown the 5200 away. Instead, it has more colors, choppy gameplay, and "nails on a chalkboard" sound, because they were more worried about cost-cutting and backwards compatibility. This is NOT what I was expecting when I read the hype about my favorite company back in the day!

 

The 8-bits can "pull off" Tower Toppler and Commando. I admit they are not as colorful, but the resolution is the same, and the gameplay is smoother.

[A8 screenshots attached]

 

I didn't want to take one step forward and two steps back. That is what the 7800 felt like to me. It's more of a shift in power than an improvement.

 

Additionally, I wouldn't WANT Scrapyard Dog. I have it, and it's horrible. The main character glides across the screen like 2600 Pac-Man, and the fat-pixel graphics are embarrassing compared to the NES.


Ok, well, thanks for digging all those out. You must have spent quite a bit of time finding all those!

 

Of course there was criticism of the 5200's controllers, especially since it was an unusual setup considering the 5200's action/arcade game bias. Maybe it had some minimal impact on sales. I still doubt it played a large part in the 5200's lack of success, though. If that were the case, you could just as easily make the case that the ColecoVision's or Intellivision's sales were affected by their controllers (and maybe they were). Though it's a really good point that at least with the CV, you could just plug in an Atari-compatible stick for use with many games. That was a nice design feature, whether intentional or not. Either way, it's amazing to me that so many people complained (probably without having gotten used to them yet), and I also still wonder how many people in the "real world" complained about them. I sure never heard of it, and I still like them to this day, very much. Far, far more comfortable and usable than CV or Intellivision controllers for most games!

 

Anyway, thanks for all the effort, and so the magazines did print some criticism back then, ok.

 

I agree... Thanks for the time spent, CV Gus. It really adds to the memories.

 

I DID remember the magazines criticizing the joysticks. However, I agree with Mirage that people weren't used to them. I also think that the general public were like sheep: the magazines psyched people out of getting used to something new. My friends and I NEVER had any trouble getting used to them. I never had trouble finding my way around a Pac-Man maze, etc. I think it was just a bunch of unskilled gamers writing articles and leading the public. The BIG problem was that the controllers were unreliable, and that reinforced the negativity of the media.


The 7800, with its primary use of the same low-res sprites (albeit colorful and non-flickering), 2600 sound quality, and choppy gameplay in many games, was just a 2600 with a bandaid.

 

7800 a bandaid?

 

Sorry dude - but if you think the 2600 or 5200 can pull off anything like Sirius, or Alien Brigade or Tower Toppler or Scrapyard Dog or Midnight Mutants or Commando, I'd like to see what you're smoking.

I'd contend that many of those games could have been done well on the 5200, but not the 2600 (including Scrapyard Dog, with careful use of DLIs, hardware sprites with flicker and multiplexing, and softsprites -granted, still not as good-looking as the 7800 overall, and those 7800 games weren't maxing out the system either).

 

 

But yes, MARIA was more advanced than anything else on the US market, still with trade-offs of course, but even more advanced than the NES in some areas. Put it on a full dual-bus system (like the NES or CV/SMS/etc.), add competitive sound hardware, and you've got something that could really kick ass -albeit still with resolution trade-offs, unless they added more flexibility to the graphics modes, like a higher-dot-clock version of the 160-wide mode (5.37 MHz would have been a nice compromise). Drop compatibility, and you might keep it in a similar price range on top of the added features: a dual bus with more RAM dedicated to video, or NES-type wide cart slots with an external video bus. If GCC had made it on their own, the AY8910 would have provided some nice sound capabilities and 16 I/O ports for the controllers and any select switches.

 

MARIA was programmed rather differently from most contemporaries, but that wouldn't have been an issue if (1) they got it out early enough for developers to get used to it (especially developers coming off the 2600, and thus used to *odd* architectures rather than the *normal* character/sprite/framebuffer-based graphics), and (2) they built up enough popularity (with the right advertising, licenses, 1st-party software, etc.) to get 3rd parties to prefer the 7800 over contemporaries. (The PS2 was the least friendly architecture of its generation by a large margin -vs. the PS1, which was the friendliest to develop for- and at a time when low-level programming was very unpopular, but that didn't stop it from being the most popular to develop for at the time, with quite a few cases that pushed the hardware close to its limits.)
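For the curious, MARIA's "different programming" boils down to the CPU building lists of small headers in RAM instead of poking sprite registers. A hedged C sketch of the commonly documented 4-byte header follows; treat the exact bit layout as from-memory rather than gospel, and the names as mine:

```c
#include <stdint.h>

/* Sketch of a 4-byte MARIA display-list header as commonly documented:
 *   byte 0: graphics data address, low byte
 *   byte 1: palette (bits 7-5) | width (bits 4-0, stored negated)
 *   byte 2: graphics data address, high byte
 *   byte 3: horizontal position
 */
typedef struct {
    uint8_t addr_lo;
    uint8_t pal_width;
    uint8_t addr_hi;
    uint8_t hpos;
} dl_header;

/* Width is stored as a two's-complement negative in the low 5 bits,
 * so a zero byte can terminate the list. */
uint8_t pack_pal_width(uint8_t palette, uint8_t width_bytes) {
    return (uint8_t)(((palette & 7) << 5) | ((0 - width_bytes) & 0x1f));
}
```

The CPU rebuilds or patches these lists every frame (while being halted whenever MARIA fetches them), which is exactly the workflow shift that tripped up developers used to plain hardware sprite registers.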

 

Some of the biggest limits of MARIA are due to how it was implemented in the 7800 (some of that could be expanded -like RAM- but not the bus-sharing problem), though others are tied to the short development cycle, or probably to GCC being very new to LSI chip design (which only makes their achievement more impressive).

 

I get you having a preference, but referring to it as "just a 2600 with a bandaid", is ridiculous.

A preference based on what WAS done, not the actual potential, even within the limited implementation in the 7800. ;) (The sound criticism is right on for the most part, though: it IS 2600 audio, just with more CPU time to work with.)

That's the one area GCC should have made more compromises over. (Collaborating with Atari Inc. engineers on a low-cost, low-pin-count POKEY with the I/O removed, write-only registers, a shrunk die, a single clock input, maybe no IRQ -since the 7800 doesn't favor interrupt-driven effects- maybe even down to a 16- or 18-pin DIP. Otherwise they'd need other compromises to fit a full POKEY onboard -though maybe make use of its I/O for an expansion port for the computer add-on- and the trade-offs would be dropping one of the two 2K SRAM chips and/or using an external RF modulator, like various computers of the time, or the added production cost of a larger riser board.)

Short of any of that, they could have pushed for the smaller/cheaper off-the-shelf SN76489 (not great, but a lot better than nothing -and with TIA to back it up), though even then there's some added board space needed. (Using an external RF modulator might have been enough to free up the needed space without increasing the size of the riser board used on the first production run.)

 

 

The entire graphics architecture is completely different. You might as well call the NES a "2600 with a bandaid", with that OVER-SIMPLIFICATION.

Actually, more like the Master System being an SG-1000 (or ColecoVision) with a bandaid, since, like the 7800, the sound was non-upgraded and rather weak for the time: the same boring SN76489 (better than TIA for music, but weaker for SFX, and weaker than pretty much any other sound chip used in a console/computer save the VIC and C16/Plus4 -much more bare-bones than the AY8910, let alone POKEY- granted, the AY8910 compares more favorably against POKEY if you can't use any interrupts).

 

Except the SMS and SG-1000/CV have the exact same CPU at the same clock speed (the SMS just has more work RAM), vs. the 7800/NES/2600/5200/etc., which use the same CPU architecture (with some features missing from the NES's custom CPU), while the VCS's runs at a lower clock speed with a more limited address space and no IRQ/NMI input. (Granted, there are also performance differences between the NES/7800/5200 due to DMA conflicts on the latter two cutting down CPU time -more so on the 7800 from what I understand, though it depends on the complexity of the display lists, IIRC.)

 

That, and I think the SMS VDP actually built onto some of the SG-1000's TMS9918 logic (rather than the 7800 tacking on MARIA, or the Genesis wasting die space with the Master System VDP logic on the main VDP ASIC -which probably forced them to limit the CRAM of the Genesis to 64 entries rather than 128, possibly 9-bit RGB over 12-bit as well. Granted, they probably wasted a fair amount of space on the highlight/shadow logic too, especially for a feature that was almost never used and less needed than more palette entries or color depth ... GTIA-like pixel accumulation for half-res 8bpp graphics layers would have been nice though ;))

 

2600: 6507, TIA, 1 button controller, 128 bytes

NES: 6502, PPU, 2 button controller, 2K+2K

7800: 6502, MARIA, 2 button controller, 4K

NES has 4 buttons technically, and before you say they're select/option key type buttons: some games DO use them as action buttons. (the TG-16 did the same thing, except so extreme that they developed 4 button controllers that remapped those to main buttons as well as the middle buttons -sort of like the Jaguar Pro controller did with the keys)

 

But yes, they're totally different systems, and if it weren't for the tacked-on VCS compatibility, the 7800 would have almost nothing in common with the VCS (other than a CPU architecture shared by dozens of consoles, arcade machines, and computers).


The whole point about the above posts was this: the 5200 controllers were not what you'd call overly popular back then, or even in later years.

 

In a way, one has to wonder what Atari was thinking. Didn't they actually test the controllers with the kind of games the 5200 would have?

 

If most arcade games back then used analog controllers like that, then it might have worked. But the fact is, almost none of them did, and those that did use analog usually used paddle or steering wheel (often the same thing) controls.

 

All else used digital controls.

Yes, digital (or pseudo-digital, via pull-up resistors) controllers would have been better and more foolproof (and probably more reliable, among other things). Analog would be a nice accessory though. (The keypads probably could have been left to accessory controllers too.)

That, and the fire buttons could have been a little better (and everything should have used PCBs rather than flex circuits, and probably metal-dome switches).

Having the stick not spin and rubberized (especially like a mini CX-40 derivative) would have been great too.

 

The ergonomics are great though: better than the 7800 Proline sticks, better than the CX-40, better than the ColecoVision controllers too. (The CV ones would have been great with non-recessed buttons and a stick more like the Gemini's -but of better quality, more like a mini CX-40. As it is, the CV controller is OK for games where you don't need the fire buttons and can thus use the knob as a thumbstick, but it's not a gamepad, and the button placement doesn't favor simultaneous use of the stick with your thumb.)

All of the above (even the stock 5200 controller) are better than the Intellivision controllers though. ;)

 

The Vectrex also showed that a good quality, sensitive, short-throw analog stick can be far more competitive for 8-way games. (which also implies the revised 5200 stick with compact spring-loaded pot module and shorter throw movement would be close to that, though it would hardly be as cheap as a simple digital stick or pull-up resistors)

I have no idea why they didn't offer a pseudo-digital controller using pull-up resistors (as standard or an accessory) to address the analog problems after the fact. (the button issues are separate though)

Not only that, but not even a 3rd party controller using such a simple hack to address the issue. (cheaper to build than an analog pot based mechanism on top of that)

 

The 5200 controllers are better in concept and form factor than the 7800's, CV's, and Intellivision's in every respect, but the implementation is not (an analog-only joystick with no centering and a long throw, unreliable flex circuitry/carbon-dome switches, chiclet buttons, etc.). The size, shape, and feel of the 5200 controller are great, but the internals and some details of the design are . . . off. (I also don't like the spinning hard-plastic joystick, but that's even worse on the 7800 Proline stick, while the CV's knob is only good with the thumb and the Intellivision's disc is worse than any of those -and its general form factor is weaker, though I'll argue its tactile membrane keypad is better than the 5200 pad and arguably the CV pad as well.)

Makes sense. Initially, the pack-in killed it. When that was corrected by changing it from Super Breakout to Pac-Man, and better arcade games were coming out, it stands to reason sales picked up. Both systems had the same strengths and "weaknesses": both had a 2600 adapter, both had controllers which gave inept gamers frustration, both had what people wanted back then... arcade games at home. The difference being the 5200 boasted better graphics and more popular arcade games. The only advantage the CV had was Donkey Kong. Eventually that one advantage lost out.

The pack-in is always important to some degree, but not THAT big a factor. The total launch lineup is an even bigger factor, and that's a compounded problem: Pac-Man wasn't ready at launch (the main reason it wasn't the pack-in, IIRC), so people couldn't even buy it after the fact.

Likewise, the VCS had the decent (but simple and aging) Combat as the standard pack-in until Pac-Man replaced it after '82. Space Invaders was a killer app, but it was never the standard pack-in other than with the Sears Video Arcade II. (Though it probably should have been -a shame that wasn't the game they overproduced rather than Pac-Man ;))

Space Invaders would be a bad pack-in for the 5200, too, since it was one of the few A8/5200 games that was significantly weaker than the 2600 version. (Super Breakout at least had multiplayer support -4-player simultaneous at that, if I'm not mistaken.)

 

What was the total launch lineup for the 5200?

 

 

OTOH, the 7800 pack-in was also an issue. It really should have been Ms. Pac-Man (at least early on): it was a couple years old by that point, but was still very hot in the arcades, is a fundamentally fun and addictive game, looks and sounds pretty good on the 7800, plays well on the 7800, and endured as a popular game for years after (to this day), including being the single best-selling 3rd-party-published game on the Sega Genesis. ;) (It was released on almost every game console after the 7800/NES/SMS -all 3 of those, the Genesis, SNES, etc.)

Had it added a 2 player mode (like the Genesis version with added features), that would have pretty much made it perfect. ;)

I agree... Thanks for the time spent, CV Gus. It really adds to the memories.

 

I DID remember the magazines criticizing the joysticks. However, I agree with Mirage that people weren't used to them. I also think that the general public were like sheep: the magazines psyched people out of getting used to something new. My friends and I NEVER had any trouble getting used to them. I never had trouble finding my way around a Pac-Man maze, etc. I think it was just a bunch of unskilled gamers writing articles and leading the public. The BIG problem was that the controllers were unreliable, and that reinforced the negativity of the media.

As I said above, all the new controllers had problems, and the analog (especially long throw and to some extent centering) aspect of the 5200 controller was and is a problem.

Pac-Man and Ms. Pac-Man are NOT the worst cases by far, though; they're OK (not ideal, but not unplayable). Some games were much more problematic: Vanguard is a far worse case, judging from the comments and reviews I've seen, and one of the cases where it's barely playable even with a new/refurbished controller (unlike Pac-Man or Ms. Pac-Man, where even an old, worn, twitchy controller is at least moderately playable).

 

 

The other thing is that analog control was unnecessary for most joystick-based games of the time. (Driving/paddle controllers or trackballs would address most cases where analog was somewhat useful -and in some cases analog was used the wrong way, like Missile Command: it should have been "speed sensitive" to simulate a trackball, vs. position tracking where you move the stick.) An analog stick would have been a nice accessory (especially for Star Wars or Star Raiders), but digital (or pull-up-resistor pseudo-digital) would have made a better standard option on top of paddle, trackball, and analog-stick options. (One cool thing about the pull-up-resistor option is that you could use the same controls for both it and the full analog stick with no difference in programming -you'd just only get the "fast" speed of movement- though the pull-up-resistor option wouldn't work for Star Wars or paddle games where it's position tracking.)
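The "no difference in programming" point can be sketched in a few lines of C: from software's point of view, a pot read just collapses into three states, and a pull-up-resistor controller only ever produces the extremes. POKEY pot counts run roughly 0-228; the thresholds and function name here are hypothetical:

```c
/* Sketch: reading a pot-based stick as if it were digital. With a
 * pull-up-resistor controller only the extreme counts ever appear,
 * so the same code handles both controller types. Thresholds are
 * illustrative assumptions, not from any official source. */
int pot_to_direction(int pot_count) {
    if (pot_count < 64)  return -1;   /* stick pushed one way   */
    if (pot_count > 164) return  1;   /* stick pushed the other */
    return 0;                         /* centered / dead zone   */
}
```

A game using this mapping works identically with a true analog stick (which can also produce the middle state) and a pseudo-digital one, which is why the swap would have been transparent to most existing titles.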

 

 

It's also interesting to note that the original VCS/A8 controller ports have all the I/O needed to use a joystick with 100% of the features of the 5200 sticks. (2 POT lines and 5 digital lines -you'd need to multiplex the keypad more heavily than a simple 3x4 matrix, and poll it in software rather than using POKEY key scanning, but functionality would be identical.)
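Polling the keypad in software, as described, is just a plain matrix scan: strobe each row, read the column lines. A hedged C sketch follows, with the port access simulated by a pressed-key table (the 4x3 geometry matches the 5200 pad's 12 keys; everything else is illustrative):

```c
/* Sketch of scanning a 4x3 keypad matrix in software. Hardware access
 * is simulated with a switch-state table; a real driver would drive
 * and read the controller-port pins instead. */
int pressed[4][3];              /* simulated switch states, 0 = open */

int read_columns(int row) {     /* stand-in for strobing a row and
                                   reading the 3 column lines */
    int bits = 0;
    for (int c = 0; c < 3; c++)
        if (pressed[row][c]) bits |= 1 << c;
    return bits;
}

int scan_keypad(void) {         /* returns key index 0-11, or -1 */
    for (int r = 0; r < 4; r++) {
        int cols = read_columns(r);
        for (int c = 0; c < 3; c++)
            if (cols & (1 << c)) return r * 3 + c;
    }
    return -1;
}
```

Running this once per frame is cheap, which is why software polling was a workable substitute for POKEY's hardware key scanning.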

That, and if you limited the 5200 to 2 ports from the start, you'd have enough total I/O to support the VCS/A8 pinout with direct compatibility. (combination of GTIA I/O lines and POKEY analog ports -possibly key inputs- hacked as plain digital I/O lines with internal pull-up resistors for the POKEY pot lines and perhaps POKEY key inputs wired to act as 6 normal I/O lines, plus the final 4 POT lines would be wired as normal analog POT inputs -except you wouldn't even need the key inputs for 2 ports as such, so you could use those on an expansion port, dedicated keyboard port, or for a built-in keyboard)

Edited by kool kitty89

I agree with many of the points you've raised, though your sensitivity to resolution is like no one else's I've ever seen, and I don't see the "smoothness" you talk about after 20 years of owning both.

 

I just don't agree with the 'it's a 2600 with a graphics chip' description for the same reasons that Kitty raised later.

 

 

Additionally, the resolution is the same blocky pixels with more colors.

 

 

That's an over-simplification, in that:

 

- Many A8/5200 games use resolution LOWER than 160 pixels; and some 7800 games use the 320 modes. But yes, 7800 games typically had lower resolution than NES and Colecovision, though not always. Compared to Colecovision, the greater colors, greater number of sprites, hardware scrolling and display tricks kind of out-gunned resolution on its own to me.

 

This is NOT what I was expecting when I read the hype about my favorite company back in the day!

 

You sure the anger over them killing off the 5200 in favour of the 7800 isn't also affecting your personal bias? :P

 

The 8-bits can "pull off" Tower Toppler and Commando. I admit they are not as colorful, but the resolution is the same, and the game play is smoother.

 

I said 5200, not A8 :P ... though I suppose with a bigger cartridge and additional RAM in the cartridge, the 5200 could manage it. I don't agree about smoother at all.

 

I wouldn't WANT Scrapyard Dog. I have it, and my personal subjective opinion was that it was horrible.

 

Fixed that for you.

Edited by DracIsBack

Right, the adapter - like Coleco's adapter that played 2600 games.

No, the Mk. III/SMS was natively 100% backwards compatible out of the box with the older (ColecoVision-like) SG-1000 Mk. I/II.

 

The consumer had to buy an adapter, just like with Coleco and 2600. The SMS chips are in every Genesis, but that's a technical detail, of interest only to nerds like you and me. None of my friends with Coleco could play 2600 games, and none of my friends with Genesis could play SMS games, either. As far as they were concerned, it was the same deal.

 

The 400 WAS intended as a game console as such, but it was WAY too expensive back then, and not until around '82 did that change. (Or it could have changed in '81, if they'd gotten their act together and pushed a new consolidated design under the FCC Class B regulations.)

 

Yeah, I remember that too. Of course the 400 was priced way out of the mass market for a few years, and the same was true of most consoles since PS1. Whether Atari needed a new console in '79 or not, they sold the 400 as a games machine anyway. All I'm suggesting is that (if they had perfect foresight, which nobody does) Atari could have launched the 400 in '79 with a form factor that would support a gradual decline into the mass market in '82 with backwards compatibility back to '79, rather than launching the 5200 in '82 with '79 hardware and zero backwards compatibility.

 

Many games SHOULD have had keyboard support and wouldn't have been as good without it; if you were going to make a lowest common denominator for cartridge games, at least have it include a numeric keypad and a few function buttons. (Plus the space bar was usually used for pausing, a very important feature.)

 

Even if a keyboard were free or printed money, its presence in every 400 caused games to require it, which made it impossible to later release a backwards-compatible game console that didn't also require a keyboard. IIRC there were "computer" games that legitimately required a keyboard, and "arcade" games that gratuitously required the keyboard for one or two keys like the space bar. It would have been nice to force a distinction between the two, for the purpose of my argument, which you don't seem very interested in. :(

 

Again, I don't think making the shift after the fact and retaining a full (but cheap) keyboard would have been much of an issue, and it could have been a selling point in general. (You could have 5200-like controllers using the standard A8 pinout but with no keypads -relying on the system's keyboard and function keys- and thus only the joystick and fire buttons standard.)

 

You mean a keyboard would be a selling point for a game console in 1982? Well yes, and so would an internal hard disk. The problem then is to get the consumer to pay for these things, when the competition is offering a system that provides them the basics for a lower price.

 

I put the phrase "Jay Miner architecture" in quotes for this very reason. Of course Jay didn't design all those video systems. But the idea of a simplified GPU that required CPU support was Jay's. This idea made megabucks for 2600, and so it persisted in 5200, 7800 and Jaguar even if it wasn't practical and Jay wasn't personally involved anymore.

No, that's very wrong, the VCS is the ONLY Atari system that ever did that as such.

 

There must be some misunderstanding here. The 5200 couldn't do sprites that moved vertically without the CPU to move the pixels around, and the 7800 and Jaguar* couldn't do sprites OR tiled backgrounds without the CPU to manage scanlines or "zones." None of these platforms let you display a sprite by poking bytes for X, Y, COLOR, and SHAPE. All of them required you to write some kind of "system" that burned CPU cycles and consumed unpredictable amounts of RAM.

 

*I may be a little wrong about Jaguar here, someone correct me if I am.

 

By the time 5200 came out, arcade machines had sprite hardware - and so did ColecoVision and NES. Doing sprites on the 5200, 7800 or Jaguar is a nest of prickly tradeoffs and fun hacks that is still keeping programmers busy in 2011. Doing sprites on ColecoVision and NES means writing bytes for X, Y, and SHAPE. In retrospect Atari's more flexible GPU design was a productivity trap.
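The "writing bytes for X, Y, and SHAPE" model can be made concrete with the NES's 4-byte OAM entry (Y, tile, attributes, X). The shadow-buffer helper below is a sketch with names of my own choosing, not an actual NES API; games typically filled a 256-byte buffer like this and DMA'd it to the PPU each vblank:

```c
#include <stdint.h>

/* Sketch of the hardware-sprite model described above: placing a
 * sprite is just four byte writes into a 256-byte shadow OAM buffer
 * (which the NES would copy to the PPU via the $4014 DMA each vblank). */
uint8_t oam_shadow[256];

void put_sprite(int slot, uint8_t x, uint8_t y, uint8_t tile, uint8_t attr) {
    uint8_t *e = &oam_shadow[slot * 4];
    e[0] = (uint8_t)(y - 1);  /* hardware draws sprites one line late */
    e[1] = tile;
    e[2] = attr;
    e[3] = x;
}
```

Compare that with building and maintaining per-zone display lists on the 7800, or moving player/missile data by hand on the A8, and the "productivity trap" argument is easy to see: one model is four pokes, the other is a per-frame software system.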

The limit of 4 monochrome 8-pixel-wide sprites per scanline was a much greater limitation than the hassle of managing those sprites on-screen (hence software sprites in character or bitmap graphics modes become an attractive option). The lack of flexible color indexing for character graphics was also a disadvantage against the TMS9918 and VIC-II (at least you had the 5-color mode, but that only gives you 1 optional added color, and none for the 1bpp modes). DLIs go a long way towards helping that, but they're still limited on a horizontal-line basis (and use some CPU time). The NES could use raster interrupts for color reloading as well, to take advantage of a palette considerably larger than even the GTIA/7800 palette, but few if any games did: the NES has ~56 colors/shades in its default palette, plus 3 emphasis bits that shift it with 1-1-1 RGB values for a total of some ~448 unique colors/shades, but only one bank of ~56 can be indexed on any scanline -and then there are the limits of CRAM, allowing 13 colors for the BG and 12 more for sprites, all as 3-color palettes plus one common BG color.

 

See, I understood everything you just said, because I'm a meganerd like you. But 90% of game programmers I know aren't meganerds. The design says to put the sprite over there, so they go read the hardware manual to see where to poke the X, Y, SHAPE, and COLOR. For better or worse, these people were disqualified from Atari development.

Edited by bmcnett

It probably would have helped, or hurt less at least you could say. Releasing the CV with comfortable controllers would have helped, or hurt less too.

 

Makes you wonder why. Has anyone ever spoken with the people who made the decision to give the IntelliVision or ColecoVision joysticks and keypads that resembled no arcade game in history? My Dad was in charge of design at Coleco at the time, but he took whatever info he had to his grave in 2009. Can't remember him ever displaying an opinion about games or computers - he seemed to hate them both. I remember once telling Dad that my Amiga had 4,096 colors. He made a frown and said that if sixteen colors was good enough for his pencil set, should be good enough for me! LOL


I agree with many of the points you've raised, though your sensitivity to resolution is like no one else's I've ever seen, and I don't see the "smoothness" you talk about after 20 years of owning both.

This came up in the MARIA add-on thread too, and it had to be pointed out (especially for NES vs. 7800 -or CV, for that matter) that lower resolution isn't THAT huge of a handicap with good art design. (There are limits, of course, but still general trade-offs -and the fact that the 7800 rarely had its capabilities really pushed; when it did, it often showed advantages over the NES. Commando looks better on the 7800 IMO, though more so with a 7800 modded for corrected composite video output, rather than the weaker signal forced by the shared TIA+MARIA line out.)

 

I just don't agree with the 'it's a 2600 with a graphics chip' description for the same reasons that Kitty raised later.

Yep, it's more of an all-new system with the VCS hardware tacked on (and dual clock speeds for the CPU to access the old hardware -especially TIA, though RIOT technically could have been bumped to 1.79 MHz too), but also using that tacked-on hardware to a fair extent (I/O and sound) in lieu of adding more custom/off-the-shelf logic to the board. (Again, take out TIA and RIOT and drop in an AY-3-8910, and you'd have a system that looks and plays about the same -maybe very slightly better, since you'd be accessing the I/O and sound registers at 1.79 MHz- but with much better sound capabilities -though lacking TIA's periodic noise- and also a system that was simpler to design and cheaper to manufacture than the 7800, or the 5200 for that matter.)

 

 

That's an over-simplification, in that:

 

- Many A8/5200 games use resolution LOWER than 160 pixels; and some 7800 games use the 320 modes. But yes, 7800 games typically had lower resolution than NES and Colecovision, though not always. Compared to Colecovision, the greater colors, greater number of sprites, hardware scrolling and display tricks kind of out-gunned resolution on its own to me.

I don't think many A8/5200 games use the 80 pixel GTIA modes, especially in-game -very few do AFAIK- and almost none use the lower horizontal res 4 color graphics modes. OTOH, I think a fair bit more use 160x96 graphics. (still not super common, but a few do) And plenty of the character mode stuff, of course.

There are a few games (at least on the A8) that use the 320x192 modes as well, and more that use DLIs for a higher res status screen or such.

 

You sure the anger over them killing off the 5200 in favour of the 7800 isn't also affecting your personal bias? :P

Whether they SHOULD have canceled the 5200 in favor of the 7800 like that is another matter too: the 5200 had a lot of areas of missed opportunities in the design from the start, including some you couldn't really go back on (like integrated VCS compatibility, or a careful design to allow a cheaper/simpler add-on module for compatibility -and have it out from day 1), but there's other flaws that could be corrected and should indeed have taken advantage of the hardware design in general. (the 5200 should have been significantly cheaper than the 400 -or 600XL for that matter- and it had a ton of potential for consolidation: single-chip DRAM interface/refresh IC, CGIA, removing the expansion port, using a much smaller motherboard based on all of that -switching to 2 16kx4-bit DRAMs later on when those became available, etc, etc)

The controllers could have been fixed much earlier (the pseudo-digital pull-up resistor option should have been cheaper and more reliable on top of controlling better for most games -the button/key issue would be a separate fix though, except using simple switches with pull-up resistors could have meant a single PCB for the key matrix and joystick switches -side buttons would still be separate though)

 

Technically, they could even have made a deluxe 5200 model with 2 cart slots and direct VCS compatibility. (and have it set up to use SALLY as the 6507 to save some cost, but that would also mean having 2 5200 joyports and 2 2600 ports, so it might just end up being a mess -especially as time went on, and compatibility was less and less of an issue -let alone how the JAN chip could have made the VCS adapter a fair bit cheaper)

 

But the main thing about the 5200 vs 7800 is that the 5200 was already on the market for better or worse and had sold a few million units already (at least 2 million going by articles from the time), and while the 7800 would have been great in place of the 5200, there were many more trade-offs in releasing it after the fact rather than pressing on with the 5200 and fixing its hardware issues (aside from native compatibility).

It also facilitated rather direct ports from the 8-bit, more so once they had more digital-like controllers that didn't conflict with some of the games. (rarely such an extreme case, but there were some -and the pseudo digital option should have been there from the start, and even useful after they released the nice revised controllers with the high precision spring loaded pot modules)

 

The 3200 could have been better (potentially more cost effective than the 7800 even -especially if they merged STIA and ANTIC like CGIA), a directly compatible low-end model of the A8 (like the 1982 600 with a cheap keyboard -sort of like how the 400 was originally aimed, but finally cheap enough to be at game console prices) could have been better too (especially in light of the computer wars cutting deep into the consoles and Atari Inc's own need to push more of the computers -which was rather overdue, especially on the marketing side), and waiting for the 7800 (or a 3200 delayed until '83) probably would have been better than what happened with the 5200. But none of that mattered (except pushing the computers) after the fact, and there were lots of trade-offs in the decision to drop the 5200 rather than press on and fix the hardware problems that could be fixed -making it more reliable and more cost effective. (hell, with a DRAM ASIC, CGIA, and no expansion port, they might have been able to cram it down close to the 7800's motherboard size -actually it might even have been smaller than the 7800 PCB, since the chipset would take up less real estate than RIOT+TIA+MARIA+SALLY, though the 8 DRAM chips and DRAM IC would take up a bit more space than the SRAM chips in the 7800 -let alone if they continued with discrete logic)

 

Continuing with the 5200 also would have avoided some of the issues with the split and transition to Atari Corp (no delay with the 7800, just pressing on with the 5200), and you'd have common production of all the custom chips in the 5200 and A8. (just without all the chips used in the 5200 -no PIA and no MMU/FREDDIE later on)

CGIA would have been on both platforms once stockpiles of ANTIC+GTIA ran out, similar DRAM interface logic could be used on both, additional consolidation of POKEY and SALLY could be applied to both, etc.

 

I said 5200, not A8 :P ... though I suppose with a bigger cartridge and additional RAM in the cartridge the 5200 could do it. Don't agree about smoother at all.

No, it's not smoother; the A8 version is reasonably competitive with the C64 with some trade-offs (and it has the right music), but the 7800 version is better looking and better sounding. (the latter thanks to POKEY in conjunction with TIA -I think the A8 POKEY composition was weaker in general, not just due to the lack of TIA: the music could have been the same on the A8 -save for cutting out some music voices for SFX- or potentially better with some software modulation of POKEY, but that wasn't the case)

 

I wouldn't WANT Scrapyard Dog. I have it, and my personal subjective opinion was that it was horrible.

 

Fixed that for you.

It's not an amazing game, though it's a decent sidescroller (in the grand scheme of things with all the competition, it was average to mediocre -especially if you take the sound into account regardless of technical limitations), it wasn't a stellar game on the Lynx either for that matter. (Ninja Golf would probably be a better example of a sidescroller on the 7800 -pretty decent art design for the color and resolution too, not an exceptional game, but decent and with a neat and very original concept -makes you wonder how they could have pushed that concept with a higher budget)

 

The only reason Scrapyard Dog is notable on the 7800 is because the library of games is limited on the 7800 as such. (and most games that are there are relatively low budget -vs still having a small library but having most games exceptionally high quality and fairly large ROM sizes for the time) That's nothing against the machine itself, but the reality of the situation Atari Corp was in at the time. (especially with a general lack of 3rd party support and relatively tight in-house budgets -especially compared to the competition)

 

 

 

 

Edit: and all of the above issues are much less than the fundamental management problems and flawed distribution system that largely caused the destabilization of the market (due to oversaturation, bloating, and the "glut") that led to the crash (with some help from the computer wars), and THOSE were the real problems Atari/Warner needed to deal with. (which Morgan was handling by early 1984, albeit over a year later than needed to avert disaster entirely)

Edited by kool kitty89

Right, the adapter - like Coleco's adapter that played 2600 games.

No, the Mk.III/SMS was natively 100% backwards compatible out of the box with the older (ColecoVision-like) SG-1000 Mk.I/II.

 

The consumer had to buy an adapter, just like with Coleco and 2600. The SMS chips are in every Genesis, but that's a technical detail, of interest only to nerds like you and me. None of my friends with Coleco could play 2600 games, and none of my friends with Genesis could play SMS games, either. As far as they were concerned, it was the same deal.

No, you're thinking of the Mega Drive/Genesis, the Mk.III/Master System in Japan was 100% compatible with the preceding SG-1000 and Mk.II.

 

The Genesis is still a different context though since it has fully embedded SMS compatibility, like the Game Gear (except the GG isn't dragged down by it like the Genesis hardware was -ie they had to sacrifice board space and VDP die space for compatibility on the MD). Requiring the adapter must have been a marketing decision and it shouldn't have been a big issue to make the systems directly cart compatible. (and probably just require an adapter for the card games)

 

But from a consumer PoV, it's somewhat closer to the CV: except the native controllers and controller ports can be used, the adapter was very inexpensive, and again it's more comparable to the adapters required to play Famicom games on the NES, or Japanese SMS/Mk.III/SG-1000 games on the western Master System. (or the PC Engine to TG-16 for that matter)

 

Yeah, I remember that too. Of course the 400 was priced way out of the mass market for a few years, and the same was true of most consoles since PS1. Whether Atari needed a new console in '79 or not, they sold the 400 as a games machine anyway. All I'm suggesting is that (if they had perfect foresight, which nobody does) Atari could have launched the 400 in '79 with a form factor that would support a gradual decline into the mass market in '82 with backwards compatibility back to '79, rather than launching the 5200 in '82 with '79 hardware and zero backwards compatibility.

Yes, but there's no reason they COULDN'T have done what you suggest after the fact, but retain a cheap built-in keyboard and STILL have a cheaper system than the 5200 (mainly due to the 5200 being unevenly cost cut compared to the 600).

 

And no, the 400's price point is VERY different from the PSX and such; the 3DO's launch price is probably the only thing to come close by comparison on the game console front (the PS3's 500-600 launch price is a fraction of the 400's with inflation taken into account). And while it may have been marketed as a game machine (to some extent), it was also marketed as a lower-end computer, and most bought it as such. (marketing in general wasn't as strong as it should have been for the A8 line though, among other issues)

 

Even if a keyboard were free or printed money, its presence in every 400 caused games to require it, which made it impossible to later release a backwards-compatible game console that didn't also require a keyboard. IIRC there were "computer" games that legitimately required a keyboard, and "arcade" games that gratuitously required the keyboard for one or two keys like the space bar. It would have been nice to force a distinction between the two, for the purpose of my argument, which you don't seem very interested in. :(

And? They could have released the game console with the keyboard in any case, and it would have been a plus on the market in the early/mid 80s anyway. (even a cheap keyboard like the 400's would have placed it into the low-end home computer category ;))

 

There are a LOT of other areas you'd want to cut features for a dedicated console before the keyboard: you'd want to remove the keyboard I/O and SIO logic from POKEY, remove PIA, drop to 2 controller ports from the start, drop 4 of POKEY's POT lines in favor of 4 plain I/O lines, etc. That's what you'd do if you wanted a truly cost-cut, console-only derivative of the system with no regard for expandability to a full computer.

If you wanted it truly compatible with the computer, taking the 600 design from 1982 and swapping in a cheap keyboard wouldn't have been substantially different in cost from making the keyboard an accessory. (and the cleaner, more consolidated board design compared to the 5200 should have saved cost -in spite of PIA and the keyboard- as would the smaller/lighter casing in general and the smaller amount of shelf space it would consume)

 

You mean a keyboard would be a selling point for a game console in 1982? Well yes, and so would an internal hard disk. The problem then is to get the consumer to pay for these things, when the competition is offering a system that provides them the basics for a lower price.

Yes, except that falls apart with the system already being price competitive WITH the keyboard included -and a low-cost keyboard at that- and the 5200 being less cost effective in spite of the cut-down design. (which, again, seems to be an odd mix of corner cutting countered by various other cost inefficiencies)

 

It not only would be a selling point, but would better combat the home computer war and continue selling when game consoles came under attack (including in the heat of the crash).

Then again, you could argue the plain Atari 600 without the lower cost keyboard would have managed close to the same thing. (that DEFINITELY should have been released either way though)

The Atari 400, as it was, was price competitive with the 5200 if not the Colecovision. (not that a cut-down consolized design couldn't be MORE cost effective still though, but the 5200 of 1982 was not a good example of that)

 

 

There must be some misunderstanding here. The 5200 couldn't do sprites that moved vertically without the CPU to move the pixels around, and the 7800 and Jaguar* couldn't do sprites OR tiled backgrounds without the CPU to manage scanlines or "zones." None of these platforms let you display a sprite by poking bytes for X, Y, COLOR, and SHAPE. All of them required you to write some kind of "system" that burned CPU cycles and consumed unpredictable amounts of RAM.

 

*I may be a little wrong about Jaguar here, someone correct me if I am.

1. you're only talking about sprites and

2. you're making a lot of generalizations

 

The A8 and VCS do indeed have very similar sprite architectures, but the bitmap/character mode playfield capabilities of the A8 are what set it apart and make it much more like later contemporaries.

You could have A8 games (and there are some) that do everything with the character/bitmap display and no use of sprites at all (just software "sprites", or blitter objects in the more modern context)

Dozens of other platforms lacked sprite position registers or any sprite hardware at all, and either software rendered to a framebuffer or character display (VIC-20 only had characters and no framebuffer), or you had a blitter to accelerate such. (and blitter logic is what took over with the 3DO/PSX/Saturn/etc -the "sprite" engines in all of those cases were blitter driven using 2D textures)
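A software "sprite" in the sense used above is really just a CPU copy loop into a framebuffer. Here's a minimal, generic sketch of the technique (plain Python, invented shape and sizes, not tied to any particular platform's routine):

```python
# Minimal sketch of a CPU-driven "software sprite": copy a small shape
# into a framebuffer, skipping a transparent color. This is the generic
# technique described above, not any specific platform's code.

WIDTH, HEIGHT, TRANSPARENT = 16, 8, 0

def blit(fb, sprite, sw, x, y):
    """Copy sprite (a flat list, sw pixels wide) into fb at (x, y),
    treating TRANSPARENT pixels as see-through."""
    sh = len(sprite) // sw
    for row in range(sh):
        for col in range(sw):
            pix = sprite[row * sw + col]
            if pix != TRANSPARENT:
                fb[(y + row) * WIDTH + (x + col)] = pix

framebuffer = [0] * (WIDTH * HEIGHT)
arrow = [0, 1, 0,
         1, 1, 1,
         0, 1, 0]   # a 3x3 "plus" shape in color 1, corners transparent

blit(framebuffer, arrow, 3, 5, 2)
```

On real 8-bit hardware the same loop would be unrolled 6502/Z80 code, with the added headaches of planar layouts or attribute clash on some machines, but the principle is the same.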

The Jaguar has both a blitter and the object processor, which is a list processor somewhat in the vein of MARIA, but several generations on and with the advantage that it works with a full framebuffer rather than having to carefully arrange things on a scanline basis. (it doesn't use hardware sprites and it's not a blitter, but it does have a framebuffer to build up and read -the Panther didn't have enough RAM to use a framebuffer and thus was stuck with the 7800's method more so)

 

And from what I understand, the 7800 (let alone Jaguar) doesn't burn through CPU cycles to build the display as such; the biggest hit to CPU resources is the shared system bus forcing the CPU to be halted for MARIA's burst DMA. The early concept design of MARIA with no DLLs DID eat up CPU time more like the VCS (the CPU had to manually build up the display for every scanline based on the display list), but the use of DLLs allowed MARIA to offload much of that overhead.

If MARIA was used in a dual bus design, or with fast enough memory to allow interleaving, you'd have loads more CPU time to work with.
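To put toy numbers on the DLL point above (a sketch, not real 7800 code -the 192-line frame and 16-line zones are typical values I'm assuming here, not fixed hardware constants):

```python
# Toy sketch of why Display List Lists (DLLs) offload the CPU on the 7800.
# Assumed values: a 192-scanline visible frame split into 16-scanline
# zones -typical numbers, used as parameters, not hardware constants.

SCANLINES = 192
ZONE_HEIGHT = 16

def cpu_touches_without_dlls(scanlines):
    """Early MARIA concept: the CPU has to point MARIA at a fresh
    display list every scanline, so it intervenes once per line."""
    return scanlines

def cpu_touches_with_dlls(scanlines, zone_height):
    """With DLLs, the CPU sets up one DLL entry per zone and MARIA
    walks the per-zone display lists on its own."""
    return scanlines // zone_height

per_line = cpu_touches_without_dlls(SCANLINES)            # 192 per frame
per_zone = cpu_touches_with_dlls(SCANLINES, ZONE_HEIGHT)  # 12 per frame
```

Under those assumptions the CPU goes from ~192 interventions per frame to ~12, which is the sense in which DLLs "offload much of the overhead" -the remaining cost being the halted-CPU time during MARIA's burst DMA.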

 

In the Jaguar, it's sort of the opposite: the CPU is one of the least efficient (if not the least efficient) bus masters in the system, it hogs the bus when used, and it's thus necessary to use as little CPU time as possible (and keep it off the bus as much as possible) to allow anywhere near peak bandwidth on the main bus. TOM has line buffers for many of its operations (every object processor operation, some blitter operations -not texture mapping- and I think some RISC GPU operations as well), so it can manage near 100% fast page bandwidth (~106 MB/s) on its own, but the 68k drags that down hugely, as does JERRY (since it was sort of hacked in with a somewhat buggy and slow DMA interface -not that big of an issue if purely used for sound synthesis though -with low bus usage). The Jaguar would have hugely benefited from either a separate slow bus for the 68k (and JERRY), or a CPU with a cache (even a modest one like a 68020), but that was one of the design trade-offs to push low cost on top of other limitations in development time and funding.
(in hindsight, one good trade-off would have been to drop the huge DSP -same RISC core as the GPU- in favor of a small, simple ASIC that just has I/O and DMA sound channels -maybe hardware ADPCM decoding logic or other features like per channel low pass filtering, supporting PCM of different bit depths, etc- and add a 2nd bus dedicated to the CPU and audio, OR shell out for a more powerful CPU with a cache and keep the single bus design -lots of options, and many other platforms made the mistake of an overbuilt sound system that would never really be worth it -the SNES was probably the first to do that -OTOH, Atari Corp had many, many other problems that were bigger than any hardware issues with the Jaguar by orders of magnitude: mainly the downward spiral the company fell into around 1989, which Sam Tramiel's weaker management compared to his father was at least in part responsible for -if not largely responsible for)

 

I'm not a programmer for any of these systems, just more of a tech/history geek (at least I'm not programming for any YET ;)), and I don't have a deeply intimate understanding of the systems, but a reasonable high-level understanding of the unique architectural aspects and limitations, so I can't offer a whole lot more than I've summarized above. (and there's a point where I hit a wall in understanding the dev manuals as well -though maybe you could glean more from them, short of poking some of the Jag programmers for more details ;) -the documents should be reasonably easy to find online, I think Atarimuseum has them and I know they've been uploaded to AA before -there are also documents for the Jaguar II, but it was still a work in progress when those were printed)

 

The limit of 4 monochrome 8-pixel-wide sprites per scanline was a much greater limitation than the hassle of managing those sprites on-screen. (hence software sprites in character or bitmap graphics modes become an attractive option) The lack of flexible color indexing for character graphics was also a disadvantage against the TMS9918 and VIC-II (at least you had the 5 color mode, but that only gives you 1 optional added color and none for the 1bpp modes). DLIs go a long way towards helping that, but that's still limited on a horizontal line basis (and uses some CPU time -the NES could use raster interrupts for color reloading as well to take advantage of a palette that was considerably larger than even the GTIA/7800 palette, but few if any games used that -the NES has ~56 colors/shades in its default palette but 3 added register bits that shift it with 1-1-1 RGB values for a total of some ~448 unique colors/shades, but only one bank of ~56 to be indexed on any scanline -and then the limits of CRAM allowing 13 colors for the BG and 12 more for sprites -all as 3 color palettes plus one common BG color)
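For anyone following along, the NES color figures quoted above work out roughly like this (approximate counts -the exact usable palette varies a bit by source and PPU revision):

```python
# Rough arithmetic behind the NES color counts discussed above.
# These are approximate figures, not exact hardware specs.

base_colors = 56          # ~56 distinct colors/shades in the default palette
emphasis_combos = 2 ** 3  # three 1-bit R/G/B emphasis flags shift the whole bank
total_shades = base_colors * emphasis_combos  # ~448 unique shades overall

# Palette RAM (CRAM) limits per screen:
bg_colors = 1 + 4 * 3     # one shared backdrop color + four 3-color BG palettes
sprite_colors = 4 * 3     # four 3-color sprite palettes
```

So ~448 shades total, but only one ~56-color bank indexable on a given scanline, and at most 13 background plus 12 sprite colors on screen at once -which is why raster tricks were needed to exploit the larger palette.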

 

See, I understood everything you just said, because I'm a meganerd like you. But 90% of game programmers I know aren't meganerds. The design says to put the sprite over there, so they go read the hardware manual to see where to poke the X, Y, SHAPE, and COLOR. For better or worse, these people were disqualified from Atari development.

Yes, but only a small part of my above statement was on the hardware sprite issue. Tons of developers were working with systems (namely computers) with absolutely no sprite hardware at all (just framebuffer or character graphics manipulated by the CPU -maybe with hardware scrolling if you were lucky), not even the tricky A8/VCS type sprites. Plus you had arcade boards doing the same with CPU-driven framebuffer graphics, or a blitter in some cases.

 

OTOH: the pure X/Y register stuff was relatively new as well (though the TI99 had it in '79), and was NOT the de facto standard for console/arcade games until the mid/late 80s, and even then you had tons of computer platforms that required software and/or hardware blitting (and tricky stuff like dealing with planar bitmap graphics or attribute clash on some older systems -like MSX or Spectrum). The dominant Japanese home computer was the PC8801, with 640x200 3bpp planar graphics as the standard/lowest resolution, and it was used for games by many (early models used 3-bit RGB, later ones allowed indexing from 9-bit -many games opted for 2 bitplanes and lots of dithering), with only a 4 MHz Z80 to drive graphics on top of that.

 

There was a MASSIVE range of different limitations on different systems, you had the likes of the Apple II and Spectrum getting new games into the early 90s (more so the Spectrum), and the A8 was better off in pretty much every way than the Apple II for ease of programming. (if you didn't like the sprites, you could focus on CPU driven character/bitmap graphics and learn the best advantages to push more color with DLIs and such -and make use of the hardware scrolling- but making some use of the sprites would be something any good A8 developer would eventually push -even if just for decals of added color/detail to bitmap/character graphics -the trade-offs of the A8 are unique, but that's the same for many platforms and any developer working on it would learn to push it accordingly)

 

The 7800 is a bit more distinct from that though, since MARIA is a bit further from the conventional character/bitmap modes from what I understand (aside from sprite manipulation), but maybe its character modes do work rather like contemporaries' (my understanding of MARIA is very general).

However, when it was being developed and planned for the original release, there was still a lack of the defacto sprite+tilemap standards in the arcade or home console industry:

on top of that, the VCS was still dominating the market by a large margin, and thus the programmers experienced with that would be used to dealing with "odd" or "unconventional" architectures as such. (even though the VCS isn't really like the 7800 in how it's programmed -and the 7800 obviously doesn't involve the tight CPU intervention and cycle-timed code necessary for the VCS to display even a static screen)

The A8's sprites likewise would be favorable for those used to dealing with the VCS, but with MUCH greater flexibility of CPU time, use of interrupts, and all the modern graphics features offered for the playfield. (be it in bitmap or character modes)

 

If the 7800 had been released as planned in '84, and Morgan's other plans (Natco, etc) had followed through, the market would have been much more favorable for the 7800 and developer support would have pushed beyond the unique architectural limitations by the time Nintendo (or Sega) were even remote issues on the market. (plus, the lack of delay could have meant negotiations for licensing Japanese arcade games before Nintendo had secured exclusive rights in mid 1985 -which is when Michael Katz was attempting to license said games and was forced to resort to licensing computer games instead for a large part of the 7800's library)

 

Granted, you could argue for the 5200 to have pressed on in spite of the problems (since it was already there, had a notable userbase, and many issues that could be corrected with new hardware revisions), but I already commented on that in my previous post. (including other things like common production of ICs for the A8 and 5200, ease of cross-platform development, etc -plus an architecture a good chunk of 3rd parties were already developing for -for A8 and 5200- and thus already past the stages of "odd" architecture being a major boundary in such cases -plus hype, market share, and userbase would solidify such developer interest regardless of difficulty of development: a la PS2 ;))

Edited by kool kitty89

No, you're thinking of the Mega Drive/Genesis, the Mk.III/Master System in Japan was 100% compatible with the preceding SG-1000 and Mk.II.

 

Ah, yeah you're right about that one! Stupid mistake on my part.

 

There are a LOT of other areas you'd want to cut features for a dedicated console before the keyboard: you'd want to remove the keyboard I/O and SIO logic from POKEY, remove PIA, drop to 2 controller ports from the start, drop 4 of POKEY's POT lines in favor of 4 plain I/O lines, etc. That's what you'd do if you wanted a truly cost-cut, console-only derivative of the system with no regard for expandability to a full computer.

 

It's not clear to me how the removal of features from just the 400's chips would have saved money in 1979. If anything, two sets of chips would have cost more money? You'd want to eventually merge all those chips, and a side effect of that would be the removal of useless features, but it's the merging of the chips itself that saves you money eventually if volume is high enough.

 

But removing a big physical hunk of electronics - like a keyboard? That reduces costs up and down the supply chain. Your box gets smaller, so you can fit more on a truck! I don't understand your vision of cost management.

 

Yes, except that falls apart with the system already being price competitive WITH the keyboard included...

 

We're just going to disagree about whether a keyboard belongs in a game console. History seems to have shown that demand for keyboards and keypads in game consoles is low, but everyone's entitled to an opinion.

 

bitmap/character mode playfield capabilities of the A8 are what sets it apart and makes it much more like later contemporaries.

 

I'm not talking about home computer video hardware. This may not be obvious because I'm talking about Atari 400. It's on the table only because we're talking about 5200 and Atari's design process.

 

With the exception of 2600, 5200, 7800 and Jaguar, I don't know of a successful game console from 2600<X<PS1 that didn't have hardware sprites and hardware tiles. Pretty much all the arcade hits from Pac Man onward had them, too. Atari's game console designs didn't have them and were more flexible, but in retrospect this was a liability after 2600.

 

A8 had great hardware tiles for 1979, I'll give you that. For 1982, not so good. But enough to get a lot of great games onto 5200.

 

And from what I understand, the 7800 (let alone Jaguar) doesn't burn through CPU cycles to build the display as such; the biggest hit to CPU resources is the shared system bus forcing the CPU to be halted for MARIA's burst DMA.

 

Yes, you've found the other problem with Atari's game console designs: a shared bus for GPU/CPU. 5200 and 7800 had it, I don't know about Jaguar. That was a good idea for a home computer, and a bad idea for a game console.

 

Yes, but only a small part of my above statement was on the hardware sprite issue. Tons of developers were working with systems (namely computers) with absolutely no sprite hardware at all (just framebuffer or character graphics manipulated by the CPU -maybe with hardware scrolling if you were lucky), not even the tricky A8/VCS type sprites. Plus you had arcade boards doing the same with CPU-driven framebuffer graphics, or a blitter in some cases.

 

You're right when you're talking about home computers, but I'm simply not talking about them. Can you list arcade boards that did frame buffer graphics? I guess the very late sprite games with lots of really huge sprites did that. But by that time, the world had largely switched to 3D which always has a frame buffer.

Edited by bmcnett

I agree with many of the points you've raised, though your sensitivity to resolution is unlike anyone else's I've seen, and I don't see the "smoothness" you talk about after 20 years of owning both.

 

I just don't agree with the 'it's a 2600 with a graphics chip' description for the same reasons that Kitty raised later.

 

Thanks for taking the time to see WHERE I am coming from. I am not sure if you mean the smoothness in game play or my response here. I know that the 2600 games always seemed very loose in control, and objects felt like they just glided across the screen. I know Pac-Man isn't the best example, because it was a poor game, but he just glides along. Galaxians seem jerky when they descend. I personally get the same feeling from many 7800 games. I am willing to accept the fact that some issues could be programming. I know that there are 8-bit games that suffer too. Even though 8-bit Galaxians was criticized in many ways, the ships descended in a smooth motion. (I am aware of some slow-down and weird pauses programmed in, but I am talking about a feeling that everything is working together.) The 7800 controls in Mario Bros and Donkey Kong, for example, feel as if I want to break the joystick to try to get smooth motion. Meanwhile the enemy objects (barrels, shellcreepers, etc.) feel as if they are dragging in mud and not sync'd to the animation.

 

I have played around with jump type games, and I've noticed that a jump like DK or Mario Bros is exponential. The character starts with a great leap of space, and the jump slows down toward the top. On the 7800, I can "feel" every step of the exponential jump, where the 8-bits almost give a natural sine-wave kind of curve. The other thing that disturbs me (with the previously mentioned games as examples again) is that it appears the characters move vertically and THEN move horizontally -- as if it's not diagonal. That is what bothers me about Galaga. The ships appear to make a staircase kind of motion at times (as with the jumps)... Playing 7800 games often reminds me of playing an 8-bit or NES on an emulator with the frame skip set to a ridiculous number.
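The two jump "feels" being described can be sketched in a few lines of toy Python (all values invented for illustration -not taken from any actual 7800 or A8 game): a constant-gravity jump traces a smooth parabola, while a coarse lookup-table jump (a common trick on 8-bit hardware) moves in uneven chunks.

```python
# Toy comparison of jump curves: smooth physics vs a stepped lookup table.
# Numbers are made up for illustration, not from any real game.

def parabolic_jump(v0=8.0, g=1.0, steps=16):
    """Classic physics jump: velocity decays linearly each frame,
    so height traces a smooth parabola."""
    y, v, path = 0.0, v0, []
    for _ in range(steps):
        y += v
        v -= g
        path.append(round(y, 1))
    return path

def table_jump():
    """Table-driven jump: big fixed offsets up front, then coarse
    steps down -motion arrives in uneven chunks."""
    offsets = [6, 6, 4, 4, 2, 1, 1, 0, 0, -1, -1, -2, -4, -4, -6, -6]
    y, path = 0, []
    for dy in offsets:
        y += dy
        path.append(y)
    return path
```

Printing both paths shows the parabolic one changing by a slightly different amount every frame, while the table version repeats the same coarse deltas -which may be part of the "I can feel every step" impression described above.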

 

 

Additionally, the resolution is the same blocky pixels with more colors.

 

 

That's an over-simplification, in that:

 

- Many A8/5200 games use resolution LOWER than 160 pixels; and some 7800 games use the 320 modes. But yes, 7800 games typically had lower resolution than NES and Colecovision, though not always. Compared to Colecovision, the greater colors, greater number of sprites, hardware scrolling and display tricks kind of out-gunned resolution on its own to me.

 

Actually, most 8-bit games did use the 160 mode. However, ALL of the "sprites" (Players/Missiles) were 160 resolution horizontally. The difference was in the vertical resolution: lower resolution Players/Missiles would be 128 pixels high, instead of 256 (full-screen height, including above and below the visible screen area).
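To put back-of-envelope numbers on that vertical resolution difference (a sketch using the 256-scanline full player height described above, where each byte of player data covers either one or two scanlines):

```python
# Back-of-envelope sketch of A8 player/missile vertical resolution.
# Assumes the 256-scanline full player height mentioned in the post.

frame_scanlines = 256  # full player height, incl. above/below visible area

# Double-line resolution: each byte of player data covers 2 scanlines.
double_line_rows = frame_scanlines // 2   # 128 rows of player data

# Single-line resolution: one byte per scanline.
single_line_rows = frame_scanlines        # 256 rows of player data
```

So the choice is 128 vs 256 rows of data per player, i.e. the vertical detail doubles (at the cost of twice the P/M memory) -horizontal sprite resolution stays at 160 either way.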

 

This is NOT what I was expecting when I read the hype about my favorite company back in the day!

 

You sure the anger over them killing off the 5200 in favour of the 7800 isn't also affecting your personal bias? :P

 

Honestly. It is PURE disappointment. I was SO EXCITED. FINALLY, it wouldn't be necessary to overlap an already limited number of Players just to get multi-colored characters. Finally, the 4 missiles wouldn't have to be used for a 5th player (or stretched out behind a monster to become the whites of their eyes)! I was really excited. I thought Atari was going to bring the arcade HOME, and I thought they were going to finally prove they were number one again. If you recall, magazines didn't make very good screen shots back in the day, so the 7800 mock-ups left a little to be desired, but I KNEW they'd be better. I would re-read the magazines, like the little nerd I was (am). I would dream about having the arcade at home. My biggest let-down was the cancellation of the 7800. I thought it would never see the light of day.

 

When they finally released it, I went to buy it as an early graduation gift. The store was sold out, so I got the NES. I couldn't believe how accurate SMB was to the arcade. When I got it home, I was excited by the sound quality. I still wanted a 7800, though, and would check out the boxes at the store. On the back of the Ms. Pac-Man box, I noticed that the monsters' eyes were STILL SQUARE. This really bummed me out; that was my first let-down. When I finally got a 7800, I was terribly disappointed by the sound. I kind of excused it, because I knew the 7800 was designed about 3-4 years earlier. That was BEFORE I learned that the NES was just as old. Two things I really appreciate are sharp graphics and quality sound. I was disappointed by both, and then the choppy gameplay was the final nail for me.

 

I figure Pac-Man is a game from 1980. If a system can't faithfully reproduce graphics from a 4-year-old game (monsters with round eyes and the right bottom fringes), then there is a problem. The 2600 can get the same 7800-SHAPED monsters (without as many colors) and is older than the game itself. Of course they could make wider monsters, but that would create other distortions and multiple scrolling fields. Look how much detail is lost in Dig Dug as he moves vertically.

The 8-bits can "pull off" Tower Toppler and Commando. I admit they are not as colorful, but the resolution is the same, and the gameplay is smoother.

 

I said 5200, not A8 icon_razz.gif ... though I suppose with a bigger cartridge and additional RAM in the cartridge the 5200 could manage. I don't agree about smoother at all.

 

Not really important; I can agree to disagree. I think anything on the 8-bits could be done to some degree. I don't really care for Tower Toppler myself, but I feel like the 7800 version is the only version (out of the 8-bit, C64, and GP2X versions) that makes the character feel as if he's being pulled backward, and the controls feel less responsive than the other versions to me. However, it's not a game I enjoy, so I can be fair and say that I haven't played ANY version to any great extent. Some of that could be me.

 

I wouldn't WANT Scrapyard Dog. I have it, and my personal subjective opinion was that it was horrible.

 

Fixed that for you.

And I totally agree. I didn't realize that I put it out there like that. To me, it was the Kasumi Ninja of the Super Mario genre.. lol. Speaking of which, just thinking about how politically incorrect that announcer voice is made me laugh.. (Kasumi Ninja)... Take care.. :cool:


There are a LOT of other areas you'd want to cut features for a dedicated console besides the keyboard: you'd want to remove the fundamental keyboard I/O from POKEY, and the SIO logic from POKEY, remove PIA, drop to 2 controller ports from the start, drop 4 of POKEY's POT lines in favor of 4 plain I/O lines, etc. That's what you'd do if you wanted a truly cost-cut, console-only derivative of the system with no regard to expandability to a full computer.

It's not clear to me how the removal of features from just the 400's chips would have saved money in 1979. If anything, two sets of chips would have cost more money? You'd want to eventually merge all those chips, and a side effect of that would be the removal of useless features, but it's the merging of the chips itself that saves you money eventually if volume is high enough.

Less silicon = cheaper, lower pin count = cheaper (i.e., a 28-pin POKEY), less board space = cheaper (due to the lower pin count and no PIA). Unless you want a computer add-on for the system, there's no reason to have such features at all.

 

Actually, that's also what makes using SRAM attractive, especially once 2k densities got affordable (compared to 512 byte chips): consoles (that avoid use of framebuffers and have direct access to ROM) don't need the RAM, and unless you invest in embedded DRAM interface logic, SRAM makes a much more attractive option for such embedded systems. (even then, there's the issue of more board space due to the number of DRAM chips used -at least until 4-bit wide DRAM chips became available -you could certainly argue that by the time of the Genesis, they should have been pushing for DRAM, but the earlier you go, the more trade-offs there are -if you want the A8 framebuffer capabilities, you'd definitely want DRAM though -so not like the 3200's planned 2k SRAM as of 1981, but you'd STILL have more work RAM than the Colecovision -even if almost 1/2 was eaten up by the character index/map/table -or whatever term Atari used-) There's a reason the Colecovision, SG-1000, Master System, NES, and even PC-Engine (but not CD) used all SRAM (aside from video -PCE was also SRAM for that due to speed iirc). The limited work RAM also limited framebuffer hacks to few systems (or requiring on-cart RAM) up to the 4th gen. (even the PCE has too little work RAM to really allow hacked framebuffer graphics -unlike the SNES and Genesis)

 

The CV has DRAM and embedded interface logic in the VDP (thanks to TI), and while Atari could have pushed that for the 5200 to allow the bitmap modes (and potentially decompression of ROM data into RAM), it could have been a more competitive move to take advantage of the shared system bus that didn't need the VRAM and make do with less work RAM in general. (restricting mainly to character graphics as such with 2-4k SRAM to work in -if they'd gone with embedded DRAM logic, that might have paid off more in the long run -especially when 16kx4-bit DRAMs became available so you'd only need 2 16-pin DIP DRAMs for 16k)

 

As for common production: yes, that can be significant, but there's also a wall between cheaper/simpler components and using more expensive components across the board.

Otherwise, the 6507 (which also uses a modified die compared to the 6502) wouldn't have existed, or at least would have been replaced by the 6502/6502C once Atari ramped up A8 production. ;)

 

Removing all those features certainly would have made it tighter, requiring/forcing developers to make games that stuck to those limitations for forward compatibility with a nonexistent console. (unless they launched the console derivative back in '79, but that probably still would have been in the $400 range, maybe closer to $300 like the Intellivision if they chucked the DRAM+logic in favor of just 2k SRAM, etc)

But that would really be unattractive and rather pointless to limit the computers like that. (albeit removing PIA, dropping 4 of POKEY's ADC inputs for plain IO lines, and going with 2 ports from the start would have benefited the A8 line's cost effectiveness by a good margin as well)

 

And that's why it makes more sense to A. stick to only full computers (with low to high end), where the closest thing to a console would be a low-end gaming computer still including a keyboard.

B. design a fully standalone console that avoids the restrictions of compatibility, or focuses on VCS compatibility alone (regardless of using some of the A8 hardware or not).

The 5200 opted for incompatibility with both and had potential for being more cost effective than either option, but fell far short of that due to other areas of missed cost optimization. (such an incompatible system could still have been optimized with provisions for making an adapter module as low-cost as possible -one obvious thing would be using the 6502 as the 6507 as well as compatible controller pinouts with the lines wired to an expansion port -or the cart slot- for passthrough to RIOT+TIA I/O)

The 5200's POKEY could have been cut to 28 pins from the start, since SIO was unused and the key inputs would be unnecessary if they were aiming at a 2-port design. (just use POKEY's POT lines for POT and I/O input plus the GTIA I/O lines)

And lockout would be important to add as well.

 

The low-cost computer idea is a good foolproof option for Atari's position at the time: one less unique product line to support (from the consumer PoV), just the VCS and the A8 line with a bottom end computer positioned in direct competition with the upper end of the console market. (but making a far better game system than the VIC-20 ;))

Hell, in the extreme, for compatibility alone (and not really intending the system to be used as a proper computer), they could have pushed for a super cheap ZX80/81/Timex 1000 type keyboard. (maybe with a port for an optional "real" keyboard as with the XEGS's keyboard port) Or go beyond that and only include the keys most used by games. (space bar, start, probably the number keys, etc) The inclusion of the keyboard expansion would still make it attractive as a computer too. (and much cheaper than any contemporary console computer upgrades since it would literally be just the keyboard -or maybe they could include the SIO lines on the same general expansion connector and have the SIO port on the keyboard unit as well)

 

But removing a big physical hunk of electronics - like a keyboard? That reduces costs up and down the supply chain. Your box gets smaller, so you can fit more on a truck! I don't understand your vision of cost management.

Umm, the box wouldn't get any smaller at all if you used a super compact membrane keyboard (or negligibly so). In the extreme case, using a really compact/minimalistic membrane keyboard would be almost no different from using a solid slab of plastic and only a couple buttons/switches for power and reset. (maybe option/select/pause if you didn't do that in software with controller inputs)

Let alone the massive 5200 where even a full Atari 600 would have been smaller and cheaper to distribute (and possibly cheaper to manufacture), even the 1200XL has a smaller footprint (not sure if it's lighter, but the motherboard is even slightly smaller than the 5200's).

 

If we were talking about the 1200XL, removing the keyboard would have been obvious (though switching to a cheap membrane keyboard could have amounted to nearly the same thing), or even the 600, but that would have the same possibilities for a cheap keyboard. (and a slimmer form factor that didn't have to accommodate the full-throw keyboard -again, possibly removing SIO and adding it and the key lines to an expansion port for a keyboard upgrade that included SIO)

 

Yes, except that falls apart with the system already being price competitive WITH the keyboard included...

We're just going to disagree about whether a keyboard belongs in a game console. History seems to have shown that demand for keyboards and keypads in game consoles is low, but everyone's entitled to an opinion.

I agree, more buttons and option keys (and accessory keyboards) are more useful in general. (with careful use of menus and button combos once you've gone beyond the normal limits -the Jag's keypad could really have paid off if a good amount of PC ports had been made, but in either case it should have had the Pro Controller layout from the start -and honestly, the pro controller has enough to push most complex PC games with a few tweaks and avoid the need for the keypad as such -aside from instant access to certain functions like FPS weapons, etc)

 

As above, I think a dedicated console is preferable in general, and as such, including computer compatibility is generally impractical and would tend to be sloppy. (aiming at VCS compatibility out of the box -or optimized for a cheap adapter- would be more significant, though making the hardware similar to the A8 to facilitate ports -and sharing some production components- would also be useful)

 

However, after the fact, or for a low-end gaming-oriented computer in general, a cheap keyboard is still a reasonable option. (especially given the various conflicts over the 5200/3200/etc)

 

 

bitmap/character mode playfield capabilities of the A8 are what sets it apart and makes it much more like later contemporaries.

I'm not talking about home computer video hardware. This may not be obvious because I'm talking about Atari 400. It's on the table only because we're talking about 5200 and Atari's design process.

 

With the exception of 2600, 5200, 7800 and Jaguar, I don't know of a successful game console from 2600<X<PS1 that didn't have hardware sprites and hardware tiles. Pretty much all the arcade hits from Pac Man onward had them, too. Atari's game console designs didn't have them and were more flexible, but in retrospect this was a liability after 2600.

That's sheer coincidence: there were simply no blitter (etc) based systems that were pushed on the mass market by major players. (the Lynx was all blitter based and quite powerful, but not managed well under Atari -and at a fundamental disadvantage in cost, size, and battery life, like the Game Gear -it did outsell the GG in parts of Europe, but not the GB anywhere AFAIK -hardly surprising though)

 

A lot of arcade games pushed blitter driven graphics as well, though less often software driven.

 

And, like the A8 and Colecovision (MSX, etc), several later consoles pushed for software driven graphics as well where the sprites were unsuitable or undesirable. (be it pseudo bitmap or character-by-character movement) The SMS did that in several games including Space Harrier (very few hardware sprites used), and After Burner (similar), which is also why you see a lot of attribute clash in those games (especially Space Harrier), but not much flicker. (on the Genesis, such options were mainly used for 3D games, though some used software scaling/blitting in 2D games as well -sometimes blitting onto sprite objects as well for realtime scaling animation)

 

The Sega CD was the first home console to definitively include a blitter (a few games that use no sprites at all -or render animation to them), of course that went beyond a simple 2D blitter and added affine texture rendering for scaling and rotation. (and full 3D texture mapping if you draw on a line by line basis with the CPU recalculating perspective -the same thing the SNES needs to do for mode 7 warped for 3D perspective)

 

But really, the main reason you didn't see any really successful consoles pushing such hardware is simply because none of them tried to do so. (had the likes of the Amiga or Flare 1 chipset been pushed by a well-known and capably managed company, they could have been exceptionally popular and competitive platforms -obviously you'd strip out the unnecessary I/O hardware and cut RAM down in the Amiga -maybe even push for use of fastRAM to allow more total bandwidth depending on the cost you were willing to push -cutting out the fastRAM bus would be a notable cost savings, especially if you're talking a late-80s release)

 

And in any case, you wouldn't have to worry about "odd" architectures as long as you stuck with framebuffer+blitter/CPU or character+Sprite (or some combination) based graphics as you'd have tons of developers experienced with any of those. (the 7800 was problematic as it didn't do any of that in a conventional manner as such and didn't get out into the market when developers were more open to "odd" hardware as such -especially if Atari had the funding to push it enough to make developers really interested, as that tends to overcome even the toughest architectural issues) That's also why the Panther was a bad idea: very much in the 7800's spirit in terms of list processing with no framebuffer or "normal" sprite or character support.

 

The Neo Geo used ALL sprites with no character support, so it was also an oddball to port to or from, but it got a good amount of support. (both developing for and porting from)

Since most games of the time were totally remade for the home conversions, similarities allowing source ports was a moot issue. (as long as the platform was something they reasonably understood in general)

 

 

A8 had great hardware tiles for 1979, I'll give you that. For 1982, not so good. But enough to get a lot of great games onto 5200.

It's a shame they couldn't/didn't push more comprehensive indexing (with CLUTs in dedicated CRAM or main DRAM -like the TMS9918 did- and using RIOT instead of PIA would give you 128 bytes for potential CRAM entries ;)).

The C64 had some nice on-screen color flexibility, but the master palette really limited it. (Atari is the opposite ;) -though good use of DLIs helps a lot)

 

And from what I understand, the 7800 (let alone the Jaguar) doesn't burn through CPU cycles to build the display as such; the biggest hit to CPU resources is the shared system bus forcing the CPU to be halted for MARIA's burst DMA.

Yes, you've found the other problem with Atari's game console designs: a shared bus for GPU/CPU. 5200 and 7800 had it, I don't know about Jaguar. That was a good idea for a home computer, and a bad idea for a game console.

Not really, it has the very same cost benefits for a computer and a console: like how lower-cost computers have shared video memory. There are other trade-offs, but an efficient and well-designed system on a unified bus (or at least as few buses as possible) will tend to be far more cost effective (better performance-to-cost ratio) than a multi-bus system. (a multi-bus design is also an easier method that requires less R&D to implement, but will also be more expensive in the long run as such -unless you go really extreme with heavy buffering to simulate multiple buses -the final-model Genesis 3 used a single SDRAM bank with heavy buffering for both main and video RAM)

 

A unified bus design also means memory can be allocated more efficiently. (be it ROM or RAM)

That's why the VCS did it, as did the A8, the C64 (with interleaved DMA, though still costing less CPU time than the A8), the 7800, Amiga, Lynx, Jaguar, N64, Xbox, 360, etc.

There's obviously many other areas of cost trade-offs, and with similar buffering AND multiple buses, you end up with something like the PSX. (which has a chipset that could still perform capably if using a unified bus -or could even perform better than it did if they pushed the system bus to 66 MHz with SDRAM or buffered for 64-bit DMA)

As such, using multiple buses on the jaguar would have been unattractive: more investment in caching and buffering was far more attractive. (and investing in a CPU with a cache -or added logic to allow external caching for the 68000 even)

Many, many trade-offs though. (like consolidation and die space: another area where the Jaguar shines, with an extremely ambitious .5 micron process targeted for a design laid out in 1990! and in production before even high-end CPUs were using .5 micron -they came in '94 with the likes of the PPC603, granted the N64 was pushing .35 micron in 1996)

That and some other factors (like total RAM or use of TONS of high-speed ROM) make the difference between cost effective game console (or lower-cost home computer) and an arcade machine or high-end workstation.

 

There's a reason no (non-arcade) cart-based home consoles opted for the NES's dual external bus design again (the PCE had the video bus on the expansion port, but not on the card slot). It required multiple ROMs (minimum) and a larger connector and PCB with more traces. (and 2 separate bankswitching schemes)

The onboard RAM option was less flexible, but much more cost effective. (the single bus option was a compromise that was as flexible -or more- than the NES, and cheaper than either of the other options, but also with more performance trade-offs -and the faster memory needed to counter those trade-offs would again counter the cost effectiveness, though in the 7800's case I wonder if the SRAM was fast enough to make full-speed interleaving feasible -maybe with optional interleaving on the cart slot for later games with faster ROM; I'm not sure what speed MARIA accesses the bus at, but if it's just 1.79 MHz, that wouldn't have been tough at all to interleave at the time, even with DRAM like the Amiga -if it's 3.58 MHz, that's a lot tougher and you'd need at least 140 ns memory -though the PC Engine was pushing ROM at that speed from the very start in 1987, probably facilitated by in-house production, whereas the Lynx pushed for cheap ROM in the 400 ns range and the Jaguar at only 375 ns standard)
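The memory-speed figures above follow from simple timing math (back-of-envelope arithmetic, assuming two accesses per bus cycle for full interleaving):

```python
# Back-of-envelope check of the interleaving figures above (my own
# arithmetic, not datasheet values). Interleaving CPU and video on one
# bus means two accesses per bus cycle, so memory must respond within
# half a cycle.
MHZ = 1e6
NS = 1e-9

def max_access_time_ns(bus_mhz, accesses_per_cycle=2):
    cycle_ns = 1 / (bus_mhz * MHZ) / NS
    return cycle_ns / accesses_per_cycle

# 1.79 MHz bus: ~279 ns slots -- easy for early-80s DRAM or ROM
print(round(max_access_time_ns(1.79)))   # 279
# 3.58 MHz bus: ~140 ns slots -- the "at least 140 ns memory" above
print(round(max_access_time_ns(3.58)))   # 140
```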

 

Plus there's the development time constraints to design for multiple buses and/or interleaving on top of cost issues. (with a tight design and no backwards compatibility dragging it down, the 7800 probably could have included a TMS9918-like DRAM arrangement for video, updated via I/O ports, and just a single 2k SRAM chip for the CPU -actually it might have been interesting if they could have pushed a 3.58 MHz 6502 given the use of SRAM -and knock it down to 1.79 MHz for ROM, or allow either)

Again, lots of trade-offs, not just in cost/performance, but also in terms of R&D time and budget available.

 

The Saturn is a prime example of both the issues of multiple buses, a general lack of buffering (for wider bus width OR higher bus speeds), and lack of consolidation with cutting-edge chip fab processes (the 3DO is an even more extreme example using all 1 micron parts in 1993 -which also physically prevented use of heavy buffering without extreme expenses -though it also meant a massive potential for consolidation had it ever gotten that far ;) -it was also licensed and sold for profit and in a higher end form factor, so even worse), plus a lack of tempered design for attractive cost (a lot of RAM, dual CPUs, etc -and using SDRAM when EDO would have had the same performance -but taken more work to interface).

Hell, if the Saturn had pushed the likes of the N64 or Jaguar (or PSX for that matter), they could have merged both VDPs and added line buffering for 64-bit DMA with almost no page breaks (maybe even caching -aside from the CPU) and/or taken advantage of the 66 MHz rated SDRAM and clocked the system bus at 2x that of the internal speed of the custom chips (ie ~57.4 MHz rather than ~28.7). Or various other compromises with multiple buses and slower RAM and/or narrower bus widths like knocking it down to 2 buses and both using 32-bit EDO DRAM with one shared sound+CPU bus and one video bus. (the N64 used 9-bit RDRAM at 500 MHz with an MMU to connect that to the 32-bit system bus -albeit only practical due to being partnered with SGI, and there's trade-offs for fast/expensive RAM on narrow/cheaper buses vs slow/cheap RAM on wide buses -or in between)

As it was, the Saturn could still have been OK if they had trimmed things more modestly in the inefficient/wasteful areas (an SH1 with 512k of SRAM just to manage CD-ROM data transfers, a highly programmable synth chip with DSP that was used almost 100% of the time just for the DMA sound channels, dual CPUs but without good support for the flexible DSP coprocessor also included -that would have made the slave CPU far less important, and the overuse of RAM -probably should have knocked it down to 2.5-3 MB and put more emphasis on expansion if needed -merging the sound and CD-ROM subsystems would have really helped, with the 68k managing CD-ROM data transfers with a small ~32k cache/buffer like contemporaries and the sound RAM to work in and store samples; they could have even re-used the Sega CD interface and saved cost and R&D time -just a clock-doubled CD-ROM chipset for 2x speed mode and an otherwise nearly identical 68000 interface -the 12.5 MHz 68k in the SCD was used for CD-ROM data management as well as a general purpose CPU)

 

The Jaguar 2 stuck to a single bus design (actually with a slow sound/IO bus iirc), but with even heavier buffering, an actual GPU and texture cache, and a CPU with a cache as well (derived from the in-house GPU RISC core), all with 64-bit DMA.

 

New technology shifts options for bus sharing: there was a time when interleaving was attractive, but that fell apart once fast page mode was in regular use, but then you had potential for multi-bank interleaving to avoid page breaks and enough chip space to allow for buffering (and eventually on-chip caching), and it's a combination of interleaving and caching/buffering (and ever faster memory speeds) that makes bus sharing realistic for modern consoles like the 360. (albeit you could argue that using cheaper DDR2 with dual buses could have been more cost effective in a well optimized design for that -and would have allowed more RAM without excessive cost, though single bus with DDR2 would have been cheaper than either, but be slower though also allow more flexibility due to more RAM capacity and still at lower/competitive cost)

Short of that, the jag does support 2-bank interleaving to reduce page breaks for unbuffered operations (texture mapping, JERRY accesses, and 68k accesses), so adding another 512k bank could have helped a lot (both doubling texture mapping speed and cutting out a lot of 68k overhead -technically it should allow a 50/50 split from 100% FPM bandwidth and the 68k running full bore)

That, and a 26.6 MHz 68000 would have been great if they were available as such without custom grading; technically you should have been able to have a 40 MHz 68000 with zero wait states with 75 ns FPM accesses. (a 26.6 MHz 386SX would probably have been the next step up, or a 13.3 MHz 386SX short of that -probably cheaper than a 68EC020 due to the higher volume production too)

The CoJag didn't use dual buses, but did add a bank of dual-port VRAM in the 2nd bank and used a 25 MHz 68EC020 or R3000.

 

The limitations of the Jaguar meant that they either needed a more expensive CPU with caching, a GPU cache (which may have been impractical with the resources and time Flare was working with), or dropping the pure single-bus design for an added (slow) CPU/IO/sound bus (probably 16 bits as well). The lack of texture-mapping buffering is hindsight, since Flare could have focused on that rather than some other areas. (high-speed shading, Z-buffering, the object processor, etc -hell, they could have dropped the object processor and focused purely on making the blitter as fast and efficient as possible: I wonder if Atari management influenced them to build off the Panther's OP rather than pushing an all-new design -albeit still a spiritual successor to the Flare 1 and Slipstream)

Investing in JERRY (vs a simple sound and I/O ASIC) was a bad move in hindsight as well.

 

There's probably some things they could have had the foresight to change at the time, but the management issues at Atari Corp (and general financial problems) obviously hurt things a lot. (the Jag was lucky to get as far as it did, especially with how ambitious the hardware was ;))

 

 

But on the A8/5200 specifically, the CPU still has a lot of time in active display as video doesn't eat it up like MARIA can/does and also leaves consistent enough intervals for DMA to allow fairly even interrupts. (hence POKEY modulations are possible via interrupts whereas it's not so simple on the 7800 AND you tend to have a lot less CPU time -or at least much less consistency)

IIRC, you've got close to 1.2 MHz performance in the A8 (vs 1.6 MHz with V-DMA disabled -due to DRAM refresh), so still better than the C64 or Apple II with interleaving, but weaker than the BBC Micro's 2 MHz AND interleaving. ;)
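Rough cycle math behind those numbers (114 machine cycles per scanline and 9 DRAM-refresh cycles per line are the usual A8 figures; the display-DMA cost varies a lot by ANTIC mode, so the value used here is only illustrative):

```python
# Rough math behind the ~1.6 MHz figure. 114 machine cycles per
# scanline and 9 DRAM-refresh cycles per line are the commonly cited
# A8 numbers; display-list DMA varies by ANTIC mode, so the 30-cycle
# value below is only an illustrative guess, not a measured figure.
CLOCK_MHZ = 1.79
CYCLES_PER_LINE = 114
REFRESH_CYCLES = 9

def effective_mhz(dma_cycles_per_line=0):
    free = CYCLES_PER_LINE - REFRESH_CYCLES - dma_cycles_per_line
    return CLOCK_MHZ * free / CYCLES_PER_LINE

print(round(effective_mhz(), 2))    # video DMA off: ~1.65, the "1.6 MHz"
print(round(effective_mhz(30), 2))  # heavy display DMA: down near 1.2
```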

 

It's not a problem that went away with dual bus designs either, but the bottlenecks change. (except the NES where you have dual cart buses)

With the CV/etc and SMS, you have to use CPU time to update VRAM (the SMS also eats through that 16k very fast since it uses 4bpp graphics) and you can only update in vblank.

The PC-Engine allows interleaved access to video RAM (fast SRAM), but still requires CPU driven updates.

The SNES and Genesis have DMA to main ROM in vblank that halts the CPU (burst DMA) for fast VDP updates, but still relatively limited bandwidth and NO option for slower interleaved DMA. (it would have been really useful to have a slower interleaved mode since you'd have 1/2 bandwidth with full CPU resource -and clipping the screen for more DMA wouldn't mean losing more CPU time)

The SNES is even tighter than the Genesis in terms of DMA, and the CPU cycles eaten in vblank for animation-heavy games roughly equal the resources used on the PCE to copy graphics as needed on the fly. (even better would be interleaved DMA to ROM with VDP-driven copying in active display ;) -or a dual-bank arrangement like many framebuffers use for hardware page flipping -the Sega CD uses that for the word RAM, and a nearly identical set-up for the framebuffers in the 32X -same 80 ns DRAM chips even, though clocked at 7.6 MHz vs 12.5 in the CD)

 

 

And sorry about the multi-topic rant, I tend to draw a lot of parallels with these discussions.

 

 

Yes, but only a small part of my above statement was on the hardware sprite issue. TONS of developers were working with systems (namely computers) with absolutely no sprite hardware at all (namely framebuffer or character graphics manipulated by the CPU -maybe with hardware scrolling if you were lucky), not even the tricky A8/VCS type sprites. Plus you had arcade boards doing the same with CPU-driven framebuffer graphics, or a blitter in some cases.

You're right when you're talking about home computers, but I'm simply not talking about them. Can you list arcade boards that did frame buffer graphics? I guess the very late sprite games with lots of really huge sprites did that. But by that time, the world had largely switched to 3D which always has a frame buffer.

I already addressed most of this above, there was no necessity to stick with pure character and sprite graphics as such, and computers are directly comparable to consoles (and in semi-direct competition in many cases as well -especially in Europe in the 80s and early 90s, and then worldwide once 3D hit and hardware acceleration on PCs -which had been getting ever more competitively priced and user friendly- hit in the mid/late 90s). Had the Amiga been bigger in the US, it may have ended up giving more direct competition with game consoles as well.

 

Since none of the big home console companies (prior to the 5th generation) pushed a purely blitter-based system, it's really a self-fulfilling prophecy. ;)

 

And again, what arcade hardware used didn't matter at all since you wouldn't be doing direct ports anyway (and often extremely heavily optimized/modified ports for weaker consoles). So all that mattered was that you had an architecture that a lot of programmers would reasonably understand: and framebuffer based blitter (or even CPU) driven graphics fall in that category as do character based graphics and "normal" hardware sprites. (the jag's object processor wasn't really "weird" either so long as you had a framebuffer to work with -otherwise you'd have to deal with building up the entire display with objects sort of like the 7800)

 

Even with "odd" or "difficult" architectures, you can/will get strong support if the system is marketed/hyped well enough. (the PS2 is one of the best examples of that)

Edited by kool kitty89

And sorry about the multi-topic rant, I tend to draw a lot of parallels with these discussions.

 

Not at all, there's a lot of useful factoids in your rants. They just aren't conducive to conversation. I hope you understand that I can't reply to all your points.

 

I'm going to disagree with your claim that the framebuffer architecture never succeeded in 2D game consoles by coincidence alone. A frame buffer requires not just a lot of RAM to make all the pixels addressable, but requires all the backgrounds and sprites to end up in a unified palette space. NES graphics would therefore typically require a 5-bit frame buffer, or an extra 37.5k RAM, but wait that's just for the front buffer. You also need a back buffer to render to, so now an extra 75k RAM are required just for frame buffers on NES. Also, the background must be written freshly to the backbuffer each frame, which commits a lot of time and bandwidth to pixels that aren't even moving.

 

SNES would need an 8-bit frame buffer or an extra 120k RAM just for frame buffers, and Genesis would need a 6-bit frame buffer or an extra 90k RAM just for frame buffers. It's just ludicrous.

 

The RAM isn't just expensive to buy and put in the console, but that's also a lot of bus bandwidth to read the frame buffer, read the graphic, combine them, then write it back out.

 

The frame buffer was a liability for game consoles until 3D became obligatory. 3D doesn't absolutely require a frame buffer (the DS' algorithm for 3D would work with a scanline buffer) but as a practical matter, after a DS-like level of geometric complexity is exceeded, a frame buffer is cheaper than alternatives.

 

p.s. oh yeah Intellivision would have required 15k extra RAM for its frame buffers! That's on top of the 1456 bytes of RAM it already had.
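The RAM figures in the post above can be reproduced with some quick arithmetic. A sketch in Python -the resolutions are my assumptions, picked to match the quoted numbers, since the post doesn't state them explicitly:

```python
# Back-of-envelope framebuffer RAM math from the post above.
# Resolutions are assumed (chosen to reproduce the quoted figures).
def framebuffer_bytes(width, height, bpp, buffers=2):
    """RAM needed for `buffers` full frame buffers at the given depth."""
    return width * height * bpp // 8 * buffers

nes  = framebuffer_bytes(256, 240, 5)  # 5-bit unified palette
snes = framebuffer_bytes(256, 240, 8)
gen  = framebuffer_bytes(320, 192, 6)
intv = framebuffer_bytes(160, 96, 4)

print(nes / 1024)   # 75.0  -> "an extra 75k RAM" for NES
print(snes / 1024)  # 120.0 -> "120k" for SNES
print(gen / 1024)   # 90.0  -> "90k" for Genesis
print(intv / 1024)  # 15.0  -> "15k" for Intellivision
```

The Genesis figure only matches at a 320x192 active display; at its full 320x224 the cost would be higher still, which only strengthens the point.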

Edited by bmcnett
Link to comment
Share on other sites

I'm going to disagree with your claim that the framebuffer architecture never succeeded in 2D game consoles by coincidence alone. A frame buffer requires not just a lot of RAM to make all the pixels addressable, but requires all the backgrounds and sprites to end up in a unified palette space. NES graphics would therefore typically require a 5-bit frame buffer, or an extra 37.5k RAM, but wait that's just for the front buffer. You also need a back buffer to render to, so now an extra 75k RAM are required just for frame buffers on NES. Also, the background must be written freshly to the backbuffer each frame, which commits a lot of time and bandwidth to pixels that aren't even moving.

Yes, except that's not a direct comparison either. A 5bpp framebuffer is SUPERIOR in most ways to what the Genesis has ;) (and even has some advantages over systems with a massive number of subpalettes). A framebuffer means you can have ANY indexed color in any area of the screen, so even with a lower per-scanline (or per-screen) color count, you still have many other advantages. (Hence why ST graphics usually look MUCH better than NES graphics.)

OTOH, with the SMS, you have two 15-color palettes to use (though only one can be applied to sprites), yet ST games STILL tend to look significantly more detailed color-wise. (But in that case it's due to 9-bit vs 6-bit RGB for the most part -and in some cases, more restricted ROM space than ST RAM/disk space.)

I'd say that even a 3bpp framebuffer would have substantial advantages over the NES (or C64) at the same resolution, though less so if it had to be limited to the C64's color set. (With the NES's color set -let alone the A8's, or the NES's with RGB control used- it would be much more flexible, plus you could do additional raster effects for palette reloading.)

 

And character and framebuffer graphics modes aren't mutually exclusive either. (You could have multiple planes with the ability to enable character or framebuffer graphics on each, and the chip space used for sprite logic is generally MUCH more than that used for character generator logic -let alone a simple framebuffer manager.) That, and having flexible resolutions would also be important, especially if you stuck to packed-pixel graphics (where framebuffer size would be less variable via color depth).

 

RAM for the framebuffer is an issue, but it can be cheap (and relatively slow) DRAM vs the fast SRAM/VRAM used in contemporary systems, let alone the investment in bus sharing. Interleaved random-access DMA is slow and relatively unattractive -OK in the mid 80s, but the Jaguar and Lynx did much better, like other modern designs- and packed-pixel graphics are more or less necessary for fast/efficient use of fast-page-mode accesses, let alone added line buffers. Aside from line buffers, keeping source and destination separate helps stay in fast page mode, and dual-bank interleaving is a more comprehensive implementation of that principle. (The Lynx doesn't have line buffers or dual banks/buses but does rely on fast page mode, so careful management of the game to avoid page breaks is critical to peak performance.) In the late 80s and very early 90s, the Amiga's memory-sharing scheme was still OK, but not really great compared to the alternatives. (The overall performance of the Amiga was still fairly impressive though, and the slower memory interface would also mean less -or even no- performance drop for using ROM as the source and RAM as the destination, among other options. It's a case where using the FASTRAM bus would also be attractive, so the blitter could saturate chip RAM -short of a much more comprehensive redesign.)
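A toy model of why staying in fast page mode matters so much for a blit. The cycle counts here are illustrative assumptions, not real Lynx or Amiga timings -the point is just the ratio between paying a full RAS+CAS cycle on every access vs only on page breaks:

```python
from math import ceil

# Toy model of a blit's DRAM cost. A full RAS+CAS cycle is paid on
# every page break; accesses within an open page only pay a CAS cycle.
# The cycle counts are illustrative, NOT measured hardware timings.
def blit_cycles(n_accesses, page_size, full_cycle=4, page_cycle=2):
    page_breaks = ceil(n_accesses / page_size)
    return page_breaks * full_cycle + (n_accesses - page_breaks) * page_cycle

sequential = blit_cycles(256, page_size=128)  # source/dest kept in-page
scattered  = blit_cycles(256, page_size=1)    # every access breaks the page
print(sequential, scattered)  # 516 1024
```

Even with these mild assumed timings the scattered blit costs roughly twice as much, which is why page-break management (or separate banks for source and destination) pays off.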

 

By the end of the 80s, DRAM was getting cheap enough to quite feasibly have over 256 kB in a game console, and by 1991, it was easily feasible to push over 512k of DRAM (especially on a unified bus) at console prices.

http://phe.rockefeller.edu/LogletLab/DRAM/dram.htm

Actually, it was more feasible back in '86/'87, but the DRAM shortage/crisis of '88 pushed prices back up. (Not until 1990 did prices drop below the lowest 1987 levels.)

 

Prior to that, low-resolution framebuffer-based systems were practical, but limited. The Astrocade was pure framebuffer, though single-buffered and only 160x88-102 at 2bpp (up to 320x204 with a RAM expansion). OTOH, it would have made a better low-end computer than a console. (It also had a 256-color palette -and technically it had split-screen indexing for two 4-color palettes, but not proper cell-based attributes like many character systems -or the ZX Spectrum with 32x24 attributes over a 1bpp framebuffer.)

The Lynx did so with only 64k of shared memory (ROM was basically used as a disk to load data from -and it was too slow to do much else with anyway). The double-buffered framebuffer took up some 16k of that for 160x102 at 4bpp.

 

And in general, a 320x204 4-bit display (from 12-bit, with raster interrupts for added color effects, taking 64k double buffered) would be relatively competitive with 4th gen consoles in general (assuming the blitter is relatively fast, let alone has resources for scaling or affine rendering -or 3D acceleration). 128k would allow that at 8bpp, plus added RAM to work in or decompress into. (Even in '89, 256k of DRAM wouldn't be unreasonable; 512k might be pushing it, but it would depend on what other trade-offs were made.)

And 8bpp would mean FAR greater color flexibility than any console out until the Jaguar/3DO. (The SNES technically could do 8bpp tiles, but the feature was rarely used -and conventional 4-bit -sometimes 2-bit- tiles usually ended up with under 100 colors on-screen, plus other trade-offs due to the cell-based nature -hence why the PCE STILL had some advantages in color in spite of 9-bit vs 15-bit RGB -albeit with significant trade-offs.)

 

The Flare 1/Multisystem design with a decent chunk of RAM would probably have made a highly competitive 4th gen console. (it could do 4bpp at 512x200 or 256x200, but it was faster with 8bpp 256x200, and as long as you had a decent amount of work RAM, that would be preferable by far -you'd need 100kB for double buffering, and with a 256kB DRAM system, that would leave a full 156 kB for added work RAM -again, very feasible for 1989, let alone later as RAM prices dropped steeply up into the early 90s before stagnating for several years at around $3 per Mbit)
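The Flare memory budget quoted above checks out; a quick recomputation using the figures stated in the post (256x200 at 8bpp, double buffered, 256 kB total):

```python
# Flare/Multisystem memory budget from the post, recomputed.
# 256x200 at 8bpp, double buffered, out of a 256 kB DRAM system.
fb = 256 * 200 * 8 // 8 * 2      # two 8bpp frame buffers
left = 256 * 1024 - fb           # remaining work RAM
print(fb // 1024, left // 1024)  # 100 156
```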

 

 

Actually, a bigger issue than the framebuffer RAM (the destination) would be the source: uncompressed 8-bit graphics cells/chunks would take up a good amount of space (or even 4bpp, depending on the timeframe), so you'd want to avoid using ROM as the source as much as possible unless you had hardware decompression or hardware texture indexing (so you could have 4-bit -or even 2-bit- sources and an 8-bit destination -or more flexibility if you did planar graphics). Otherwise you'd want to unpack graphics to raw 8bpp when loading into RAM -if you could do it well ahead of time, probably also use lossless compression to conserve more ROM.
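The "texture indexing" idea above -a 4-bit source expanded to an 8-bit destination through a palette offset- is easy to sketch in software. The function name and packed format (two pixels per byte, high nibble first) are my own illustrative choices:

```python
def unpack_4bpp_to_8bpp(src, palette_base=0):
    """Expand packed 4-bit pixels (two per byte, high nibble first)
    into one byte per pixel, offset into an 8-bit palette.
    Illustrative format: the real hardware scheme would vary."""
    out = bytearray()
    for byte in src:
        out.append(palette_base + (byte >> 4))
        out.append(palette_base + (byte & 0x0F))
    return bytes(out)

# Two packed bytes -> four 8bpp pixels in palette rows 16..31
print(list(unpack_4bpp_to_8bpp(bytes([0x12, 0xAF]), palette_base=16)))
# [17, 18, 26, 31]
```

Done in hardware on the blitter's read path, this halves the ROM/RAM cost of the source data while keeping the flexible 8bpp destination.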

 

A system with 2 (or maybe more) planes, each with a choice of a few different resolutions and maybe different color depths, as well as character AND framebuffer modes (or maybe framebuffer only on one layer), would have made a very flexible system even with no hardware sprites. (You'd use the framebuffer layer for that, and/or use character objects.)

Still, tons of advantages (and some trade-offs) for pure framebuffer as well.

 

 

SNES would need an 8-bit frame buffer or an extra 120k RAM just for frame buffers, and Genesis would need a 6-bit frame buffer or an extra 90k RAM just for frame buffers. It's just ludicrous.

See above; you're heavily oversimplifying things. The Genesis equivalent to a 64-color framebuffer? :lol: (You'd be lucky to be competitive with a 5bpp buffer overall -let alone with raster effects used- and you'd better hope that framebuffer isn't using a higher output color depth like the Amiga's, or you'll have more trade-offs.)

You end up with a LOT of redundant colors if you want reasonably smooth graphics, or more colors if you can sacrifice for more clash or high contrast tiles.

Hence why digitized photos or FMV (with the good codecs) still only end up with around 30 colors (or fewer) if clash is to be avoided. (And it's not that dramatically different from an optimized 16-color version of the same image -and in some ways worse than an optimized 16-color version using a higher-depth palette to index from.)

 

The RAM isn't just expensive to buy and put in the console, but that's also a lot of bus bandwidth to read the frame buffer, read the graphic, combine them, then write it back out.

Which is why various bus-sharing schemes are important (if you don't push a multi-bus design).

And no, the RAM isn't really expensive compared to the options most consoles took. (128 kB SRAM + 128k DRAM in the SNES; 64k + 8k SRAM -and pretty high speed too- in the PCE back in '87, etc. Compared to commodity DRAM, that's a lot of overhead for the RAM components, and at a time when embedding the DRAM interface logic shouldn't have been an issue at all.)

Let alone the advantages added RAM gives for decompressing data into (having compressed it into ROM) and using slow ROM without performance loss (so cheaper games, and still a cost-competitive console design as well).

Especially compared to the likes of the PC Engine using 140 ns ROMs. (Something NEC's vertical integration allowed -apparently more attractive than investing in more RAM onboard the console itself... rather skimpy on the RAM in the CD units too, for that matter.)

 

The Sega CD, less the CD drive and interface, and with a bit of simplification, would probably have been a highly competitive piece of console hardware in 1991, even cost-wise with the 768 kB (and even if you kept the separate framebuffer bus, using the word RAM to render into). All it needed was logic for a framebuffer and a video DAC, and it would have been pretty awesome. (Swapping in generic DMA sound in place of the Ricoh chip + 64k SRAM/PSRAM would have been a significant cost savings too.)

Link to comment
Share on other sites

don't forget that to exercise your "superiority" of a 5 bit frame buffer on NES you need 5 bit sprites and tiles, which requires 2.5x the ROM. what a disaster! :) a conventional Genesis title needed only 2x the ROM of NES for twice as many colors overall as 5 bit!

Edited by bmcnett
Link to comment
Share on other sites

don't forget that to exercise your "superiority" of a 5 bit frame buffer on NES you need 5 bit sprites and tiles, which requires 2.5x the ROM. what a disaster! :) a conventional Genesis title needed only 2x the ROM of NES for twice as many colors overall as 5 bit!

Nope, I already addressed all of that (unpacking indexed tiles into RAM -let alone proper lossless compression, or using fewer bitplanes for some things -if not using packed-pixel graphics), and 5 bits is WAY superior to the NES (better than the Genesis in many, many cases, even without added raster tricks). 4bpp would be ahead of the NES in most cases, and even 3bpp would have some advantages. (The master palette depth is also important -ie the ST's 16 colors from 9-bit will often have advantages over the SMS's 16+15 color palettes from 6-bit RGB -and you only get the 15 colors for sprites.)

 

The SMS has to deal with everything being 4bpp as well, and only 16k to load things into AND hold all the tilemap information, etc. So there's a rather limited amount you can decompress/unpack into VRAM -and maybe a tiny amount you could add to main RAM. The rest would need to be updated on the fly, either uncompressed or with packing/compression simple enough that the CPU can handle it on the fly -the same limitations the PCE/MD/SNES have to deal with as well, except the PCE supports 2-bit data unpacking in hardware, the SNES has 2bpp tile modes (rarely used for more than an overlay or a sparse far BG layer), and the Genesis and especially the SNES have more CPU work RAM to buffer into. The PCE's pure planar graphics also make the fewer-bitplanes option more realistic on the fly, while the SNES's composite planar and especially the MD's packed-pixel formats make it tougher. (The packed pixels make higher-performance lossless compression and CPU-based pseudo-framebuffer rendering a lot easier though -the SNES's Mode 7 is also useful for software rendering for the same reason, more so being 8bpp with a separate 256-color palette, but limited to 16k pixels max.)
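The planar vs packed-pixel distinction discussed above comes down to how much bit-shuffling the CPU must do. A sketch of converting one 8-pixel row of four bitplanes into packed (chunky) pixel values -the helper and plane layout are illustrative, not any specific console's exact format:

```python
def planar_row_to_chunky(planes):
    """planes: one byte per bitplane for an 8-pixel row; plane i
    supplies bit i of each pixel (leftmost pixel = bit 7).
    Returns the 8 packed pixel values. Illustrative layout only."""
    pixels = []
    for x in range(8):
        mask = 0x80 >> x
        pix = 0
        for i, plane in enumerate(planes):
            if plane & mask:
                pix |= 1 << i
        pixels.append(pix)
    return pixels

# plane 0 set everywhere, plane 2 set on the right half:
print(planar_row_to_chunky([0xFF, 0x00, 0x0F, 0x00]))
# [1, 1, 1, 1, 5, 5, 5, 5]
```

That inner bit-by-bit loop is exactly the overhead packed-pixel formats avoid, which is why they suit CPU rendering and fast compression better.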

 

 

Limitations on how indexed palettes (subpalettes) are used on a tilemap of any given depth can make for massive differences from bitmaps of only slightly higher depth, or even equal depth, depending on the master palette. (In the latter case, see the ST vs SMS, or more so the STe vs SMS. The Game Gear had 12-bit RGB though, so it would have more flexible use of color than the Lynx.)

 

 

 

 

Granted, several of my points about DRAM costs and shared-bus efficiency would also apply to making more powerful/cost-effective sprite/tile-based systems. (The shared-bus approach would be less practical without caching, though having as few buses as possible would be significant too -ie eliminating dedicated sound/coprocessor buses and having shared DMA on the CPU bus for sound and such.)

Edited by kool kitty89
Link to comment
Share on other sites

Forgot to mention: a blitter (or alternate graphics-driving logic) with variable source and destination depth and indexed color look-up (let alone hardware decompression logic -or a simple/low-cost DSP coprocessor aimed at such -or the CPU, if you were doing something simple like RLE).
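The "something simple like RLE" mentioned above really is simple enough for an 8-bit-era CPU to run on the fly. A minimal decoder -the (count, value) byte-pair encoding is an assumed example format, not any particular game's scheme:

```python
def rle_decode(data):
    """Decode (count, value) byte pairs into raw bytes -- a scheme
    cheap enough for a CPU to unpack graphics from ROM on the fly.
    The pair encoding itself is an assumed example format."""
    out = bytearray()
    for i in range(0, len(data), 2):
        count, value = data[i], data[i + 1]
        out.extend([value] * count)
    return bytes(out)

print(list(rle_decode(bytes([3, 7, 2, 0]))))  # [7, 7, 7, 0, 0]
```

RLE only wins on runs of identical bytes (flat fills, sky, borders), which is why the heavier lossless schemes the posts mention would still want a DSP or dedicated hardware.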

Edited by kool kitty89
Link to comment
Share on other sites

Having owned a 7800 since 1988, and a 5200 since 1998, I just can't shake the feeling, playing the games, that the 7800 just isn't really a big step away from the 5200 or CV.

 

It seems at best that the one thing the 7800 could do better than most systems, at least from its time, is move large numbers of objects around in Asteroids-style games, although Xevious implies otherwise.

 

But overall, the colors seemed duller than those for the 5200.

 

The problem with comparing the two directly, stat for stat, is that that doesn't mean too much. Who would've thought the CV capable of Newcoleco's Ghostblaster? Or the 2600 of Robot Tank? That was one problem both the CV and 5200 had: unlike the NES or 2600, neither system really had enough time for programmers to become skilled enough to push their limitations.

 

I guess the only way would be for two or more pros who KNOW both systems to explain what each could do based on genre- and how much better, if at all.

 

 

Can anyone here, in plain English (my skill was with Commodore computers), give a 7800 vs. 5200 comparison? Leave out sound; we KNOW the 5200 wins there. It should be based on scrolling games, Asteroid/Space Duel games, one-on-one fighters, etc.

Link to comment
Share on other sites

Ok, well, thanks for digging all those out. You must have spent quite a bit of time finding all those!

 

Of course there was criticism about the 5200's controllers. Especially since it was an unusual setup, considering the 5200's action/arcade game bias. Maybe it had some minimal impact on sales. I would still doubt it played a large part in the 5200's lack of success though. If that were the case, you could just as easily make the case that ColecoVision's or Intellivision's sales were affected by the controllers (and maybe they were). Though, it's a really good point that at least with the CV, you could just plug in an Atari-compatible stick for use with many games. That was a nice design-feature whether intentional or not. Either way, it's amazing to me that so many people complained (probably without having gotten used to them yet), and I also still wonder how many people in the "real world" complained about them. I sure never heard of it, and I still like them to this day, very much. Far far far more comfortable and usable than CV or Intv controllers for most games!

 

Anyway, thanks for all the effort, and so the magazines did print some criticism back then, ok.

 

I agree... Thanks for the time spent, CV Gus. It really adds to the memories.

 

I DID remember the magazines criticizing the joysticks. However, I agree with Mirage that people weren't used to them. I also think that the general public were like sheep. I think the magazines psyched people out from getting used to something new. My friends and I NEVER had any trouble getting used to them. I never had trouble finding my way around a Pac-Man maze, etc. I think it was just a bunch of unskilled gamers writing articles and leading the public. The BIG problem was that they were unreliable. So, that reinforced the negativity of the media.

 

 

Oh, no, even back then, most people I knew couldn't stand them. Trust me- I've had a 5200 for over 12 years now, and I still can't make sharp moves with regular 5200 controllers.

Link to comment
Share on other sites

And most people I knew liked them. So, where does that leave us? Oh yeah, to the obvious original point that its all about personal preference ultimately. I have absolutely no issues whatsoever controlling nearly anything just fine with the 5200 controllers. I love them. To each their own is what it comes down to.

Link to comment
Share on other sites

It seems at best that the one thing the 7800 could do better than most systems, at least from its time, is move large numbers of objects around in Asteroids-style games, although Xevious implies otherwise.

 

There are lots of games on the 7800 that show this.

 

That was one problem both the CV and 5200 had: unlike the NES or 2600, neither system really had enough time for programmers to become skilled enough to push their limitations.

 

The problem was the same for the 7800.

 

give a 7800 vs. 5200 comparison?

 

How many times are you going to retread this same "give me a comparison thread"???? Surely it must be at least 3 dozen times by now ...?

Edited by DracIsBack
  • Like 1
Link to comment
Share on other sites

I thought the 7800 had a better realistic colors-per-scanline setup than the 5200. Maybe that's because of the higher number of objects doable per scanline? I remember the A8 being pretty limited on a per-scanline basis, and isn't the 5200 pretty much the same thing video-wise (what, some ports at different locations or such)?

Link to comment
Share on other sites
