3D Misconceived

Tuesday, August 2nd, 1994

(This column first appeared in the August 2, 1994 issue of PC Graphics Report,
and was subsequently reprinted and quoted in a number of magazines)

Many of you are aware that I’m a big proponent of getting 3D onto the PC desktop, in terms of both hardware and software. I want to see 3D on every system that ships, which is one of the reasons I helped start a VESA Committee (specifically the VESA Advanced Graphics Interface, or VAGI, Committee) to standardize a free 3D API which would foster acceptance of 3D technology.

My thinking was that if a wide range of applications were 3D-enabled, that would help create the demand for broad, affordable 3D hardware support. Widely available, low-cost 3D hardware in turn justifies the development of 3D-enabled software. There’s an obvious problem with this approach, however: it’s a chicken-and-egg situation. Which one happens first? It’s been an issue for any 3D API on the PC, as well as for the myriad high-end 3D graphics boards and applications various vendors have tried (and frequently failed) to sell into the PC market.

Ironically, it looks as if the thing to break the chicken-and-egg cycle will be pure software. Enter 3D rendering APIs, specifically those which provide decent performance without the use of hardware acceleration. These include Intel’s 3DR, Criterion’s RenderWare, RenderMorphics, Microsoft’s Win/G, and Argonaut’s libraries, as well as the proprietary mechanism used by the gruesome but insanely popular 3D game DOOM, by id Software.

All of these mechanisms build their performance around the system CPU, not around graphics hardware.

Recently, while at SIGGRAPH in Orlando, I had a chance to speak with Mike King of Criterion, who explained to me that RenderWare’s main task is to render to a local, memory-resident buffer on the PC, which then gets copied to the display on a frame-by-frame basis. If the graphics hardware has extra goodies, RenderWare apparently doesn’t take any real advantage of them. The exception is when a low-level 3D library, such as that offered by Matrox for its MGA boards, is available, in which case RenderWare can use that functionality, though potentially sacrificing some of the performance it gains by cutting corners on mathematical accuracy in favor of visual presentation and speed.
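
In other words, the basic frame loop behind this approach is conceptually quite simple. Here’s a minimal sketch of the pattern in C, as I understand it; render_scene and copy_to_display are hypothetical stand-ins for the real library internals:

    #include <stdlib.h>

    #define WIDTH  640
    #define HEIGHT 480

    /* Hypothetical stand-ins: a real library rasterizes its triangles into
       the buffer here, then blits the result to the display hardware. */
    static void render_scene(unsigned char *buf, int w, int h)    { /* ... */ }
    static void copy_to_display(unsigned char *buf, int w, int h) { /* ... */ }

    int main(void)
    {
        /* One byte per pixel in a 256-color mode, allocated in plain
           system RAM -- no graphics hardware smarts assumed at all. */
        unsigned char *frame = malloc((size_t)WIDTH * HEIGHT);
        if (frame == NULL)
            return 1;

        for (;;) {                                  /* once per frame:      */
            render_scene(frame, WIDTH, HEIGHT);     /* CPU draws to memory, */
            copy_to_display(frame, WIDTH, HEIGHT);  /* then one big copy    */
        }
    }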

DOOM works in fundamentally the same way (though with no hardware acceleration support at all), in particular the rendering to a PC-based memory buffer. Following this line of reasoning, Microsoft has designed a new interface, Win/G, to provide exactly these fundamental memory-to-screen blitting capabilities to Windows applications, in an attempt to move game developers from DOS to the Windows environment. At Spring COMDEX, DOOM was seen running under Win/G, and the Argonaut and RenderWare libraries support Win/G as well.
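
For what it’s worth, the Win/G portion of that puzzle amounts to just a handful of calls: create a Win/G device context, attach a device-independent bitmap the renderer can scribble into directly, and blit once per frame. A sketch of the setup as I understand it (window management and error checking omitted; treat the details as approximate):

    #include <windows.h>
    #include <string.h>
    #include <wing.h>                /* the Win/G SDK header */

    static HDC     wing_dc;          /* Win/G drawing context        */
    static HBITMAP wing_bmp;         /* the DIB selected into it     */
    static void   *bits;             /* renderer draws directly here */

    void setup_wing(int width, int height)
    {
        struct { BITMAPINFOHEADER h; RGBQUAD pal[256]; } bmi;

        memset(&bmi, 0, sizeof(bmi));
        bmi.h.biSize = sizeof(BITMAPINFOHEADER);
        WinGRecommendDIBFormat((BITMAPINFO *)&bmi); /* fastest 8-bit layout */
        bmi.h.biWidth   = width;
        bmi.h.biHeight *= height;    /* preserve the recommended orientation */

        wing_dc  = WinGCreateDC();
        wing_bmp = WinGCreateBitmap(wing_dc, (BITMAPINFO *)&bmi, &bits);
        SelectObject(wing_dc, wing_bmp);
    }

    /* Once per frame: after the game renders into 'bits',
       copy the finished frame to the application's window. */
    void present_frame(HDC window_dc, int width, int height)
    {
        WinGBitBlt(window_dc, 0, 0, width, height, wing_dc, 0, 0);
    }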

With the visual performance these software-only renderers seem to provide, it almost leaves one wondering why 3D hardware is even necessary. Then you see Jurassic Park, T2, or some other such 3D graphics animation feature, or play some of the newest VR games, and realize that while software may be fine for some things, powerful graphics hardware is nonetheless required for real improvements in display performance. Now if only the various APIs would take advantage of it… It will happen, and I would argue must happen, as part of the normal evolutionary process in PC technology. However, the low-cost, software-rendered 3D products need to happen first, in order to provide the base for future extensions which deal with hardware acceleration.

3D Performance
All this talk about visual 3D performance brings something else to mind, namely: how does one measure such performance, and once measured, what does it actually mean?

Over the last year, I’ve had over a half dozen aspiring vendors of 3D graphics hardware quote me some great numbers, such as “this chip will do 250,000 Gouraud-shaded triangles per second” (frequently the number is repeated twice, apparently to help me understand that this is a really good thing). I’m sure that the sophisticated graphics workstation user, who probably also deals with multi-tasking, multi-threaded operating systems, and has a $50,000 per-seat purchasing budget for computer equipment, has a reasonable sense of what this all really means.

But up until recently, such numbers had no real meaning for me, and I’m ashamed to say that even now my pulse still doesn’t quicken when I’m quoted big triangle numbers, although I have an easier time visualizing what they represent. And this is only because I’ve been developing some 3D driver extensions for AutoCAD Release 12 which offer near-real-time manipulation of complex rendered scenes. Considering that I’ve been involved in personal computer graphics (including pre-IBM PC systems) for about 15 years now and still can’t comfortably correlate a triangles/second rating to reality, how do you think the average PC user is going to feel when he or she gets these large, unrelatable numbers whipped at him or her? Especially next year, when the 3D PC hardware wars really start heating up.
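
For those of you in the same boat, here’s the back-of-envelope exercise that finally gave the numbers some meaning for me. Take a quoted rate of 250,000 shaded triangles per second, and divide it by a moderately complex rendered model of, say, 10,000 triangles (a figure I’m picking purely for illustration):

    250,000 triangles/sec ÷ 10,000 triangles/scene = 25 redraws/sec

In other words, barely smooth motion on a single medium-sized model, and that’s before the system-side bottlenecks I’ll get to in a moment take their cut.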

You may want to consider this a strong suggestion to create a 3D performance measurement that relates to the experience of the average PC user who is being targeted as the purchaser of new-generation 3D PC hardware. Might I suggest the DOOMMark, which could measure how many frames per second of DOOM play you can get at a given resolution, on a given system?
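
Part of the appeal is that such a mark would be nearly trivial to implement; the whole benchmark is a timed loop. A hypothetical sketch in C, where run_one_frame() stands in for a hook into the game’s own render-and-input cycle:

    #include <stdio.h>
    #include <time.h>

    /* Hypothetical hook into the game: renders exactly one frame
       of a canned demo at the resolution under test, then returns. */
    extern void run_one_frame(void);

    int main(void)
    {
        long    frames = 0;
        clock_t start  = clock();
        clock_t elapsed;

        do {                                       /* 30-second timed demo */
            run_one_frame();
            frames++;
            elapsed = clock() - start;
        } while (elapsed < 30 * CLOCKS_PER_SEC);

        printf("DOOMMark: %.1f frames/sec\n",
               (double)frames * CLOCKS_PER_SEC / (double)elapsed);
        return 0;
    }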

Another aspect of the whole triangles/second measurement currently being bandied about is that while the graphics hardware can support such amazing drawing rates, the systems these graphics devices sit in can frequently issue drawing requests for only a small number of triangles per second. I’ve heard (and can’t substantiate myself, although many of you can) that a 60 MHz Pentium system can only generate between 30K and 50K 2D triangles per second, based on the assumption that the CPU has to perform the transformations on the original 3D triangles to generate the 2D triangles actually viewed. The exact numbers aside, bandwidth issues are a very real limiting factor, even with today’s Pentium systems and local bus adapters.
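
The reason the CPU gets in the way becomes obvious once you write out what has to happen to every triangle before the graphics board ever sees it. A bare-bones illustration of my own (real libraries add clipping, lighting, and shading setup on top of this):

    typedef struct { float x, y, z; } VEC3;
    typedef struct { int   x, y;    } POINT2;

    /* Transform one vertex from 3D model space to 2D screen space:
       a 3x3 rotation plus translation, then a perspective divide.
       That's roughly a dozen multiplies and a couple of divides per
       vertex, times three vertices per triangle, every frame, all on
       the host CPU.  (Assumes z > 0, i.e. the point is in front of
       the viewer; a real pipeline clips first.) */
    static POINT2 project(const float m[3][4], VEC3 v,
                          float focal, int cx, int cy)
    {
        float x = m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3];
        float y = m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3];
        float z = m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3];
        POINT2 p;

        p.x = cx + (int)(focal * x / z);    /* perspective divide */
        p.y = cy - (int)(focal * y / z);
        return p;
    }

Do that 30-50 thousand times a second and the CPU is spending most of its time just feeding the board, no matter how fast the board itself can draw.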

One other 3D performance issue I think needs to be addressed is that of expectations. Those of us in the graphics industry know (or at least should know) the limitations inherent in PCs and graphics hardware, and can therefore state with confidence (at least for the time being) that a sub-$5000 system combination will not be able to produce the same real-time graphics performance a high-end SGI system can. However, the typical PC user has no such experience to draw from, and instead might use clips from CNN, scenes from Jurassic Park or Lawnmower Man, and the capabilities of his or her Super Nintendo as reference points, all of which make PC 3D look rather disappointing. Heck, DOOM, a software-only 3D program, looks better and responds faster than much of what I’ve seen so far on expensive PC 3D graphics boards. PC users need to be educated as to what “3D” really means in terms of PCs, and be shown the benefits of PC 3D, not as a comparison to unrealistic portrayals in the media and elsewhere, but as solutions and environments that stand on their own. This is going to require a lot of work, particularly in educating the PC press so they can explain it all to the end users. Of course, your company’s advertising and marketing efforts will need to explain it all as well, so you end up with customers who get more than they expected, not less than they believe is possible.

The Workstation Mentality
Much of what I’ve covered so far in this week’s column I could attribute to what I call “The Workstation Mentality”. This term stems from the fact that many of the new entries in the 3D hardware market are from companies that have no experience in the PC market, but lots in the more expensive (and hence lower-volume) workstation market. These companies try to apply the experience they’ve gained in marketing workstations to the PC market, often with disastrous results, rarely realizing their mistake until too late, even when outsiders try to point the problem out. Typical symptoms of The Workstation Mentality include:

  • Thinking that because their hardware is 3D, PC people will automatically buy it, and therefore forecasting sales volumes that even a fledgling mass-market graphics chip/board company would be proud to see in its first year.
  • Pricing their product(s) such that board-level implementations often land in the $1500 to $4000 range because that’s “competitive” in the workstation market. That won’t work in the PC market, where a high-end graphics board is one in the $600-1000 price range.
  • Assuming that the PC market is full of 3D applications that can immediately take advantage of their hardware. For the record, there are very few packages shipping today that can do that, and even then only with lots of effort. The list includes MicroStation PC v5 and AutoCAD Release 12 for DOS (and not much else). Its name aside, 3D Studio does NOT support any external 3D acceleration of the core rendering process.
  • Figuring that Windows NT will be the platform of choice for anyone wanting to use 3D technology (since Daytona will ship with OpenGL, I guess). In reality, the largest volumes will probably be under a combination of Windows 3.1, DOS, and Chicago, assuming 3D APIs which really do take advantage of accelerated 3D hardware start shipping on those platforms. Mind you, Windows NT (Daytona, actually) is probably the best 3D platform on the PC right now, but only for the next few months. And I still haven’t seen more than one OpenGL application announced for NT (it’s MicroStation).
  • Assuming that all monitors in the PC market are created equal, since that’s effectively the case in the workstation market, where the monitor is sold with the workstation. This is another fallacy, as most any PC graphics board company will tell you (and proves, by virtue of all the special monitor-tweaking utilities provided with their boards).
  • Believing that on-board (if not on-chip) VGA compatibility is not required. This is a major fallacy. All PC graphics boards should have VGA compatibility, but it should be possible to disable it on devices a user might want to run in a dual-screen configuration. The reason it should be on-board or on-chip is that many power users (the most likely early adopters of 3D technology) are likely to have fully loaded systems, and will not want to devote two slots to graphics boards unless they have a specific application which requires dual-display usage (and even that can sometimes be handled by two video connectors on a single graphics board).
  • Assuming that PC users understand what 3D means in a computer context (see my diatribe earlier in this column to see why that’s not the case).

As an aside, if any of you reading this think I’m specifically singling you out with the comments about The Workstation Mentality, trust me, I’m not. The seven symptoms I described above are endemic at at least a half dozen different companies I’ve spoken with over the last year. It’s quite uncanny how similar the manifestations of The Workstation Mentality actually are.

Human Interfacing with 3D
In the last couple of months, I’ve gotten intimately familiar with the 3D side of AutoCAD, and I must say that AutoCAD does a downright lousy job of letting people manipulate 3D objects. Now, that’s not entirely Autodesk’s fault, either. Granted, other applications deal with 3D manipulation much better, including Autodesk’s own 3D Studio, but every application seems to deal with interfacing to 3D in a different way. Several different things contribute to this confusion:

  • Displays. They are all 2D, which makes it tough to see the back of an object without doing some contortionistic manipulations on your input device.
  • Input Devices. Most PC users have only mice, or just keyboards. Some might have 3D/6D input devices, but with the exception of the Logitech CyberMan ($80 or so), all the rest of them are bloody expensive. Of course, workstations seem to come equipped with such devices, but PCs don’t, further contributing to a lack of 3D understanding among PC users.
  • Our childhoods. While many of us got to play with blocks, tinker toys, and other “3D” toys in our early years, society has forced us to think and deal more in 2D, even to the point of taking the 3D world around us and applying it in 2D – we even applaud such efforts and apply labels like “Realism” to them. Sigh.

All in all, we need to make people 3D-aware by getting them to deal with the world as a 3D entity. People need to be able to think in 3D, and manipulate in 3D, in order for 3D to become completely natural on the desktop. Obviously, that’s not going to happen tomorrow, especially for people who are beyond their childhood years and thus have a more difficult time (re)learning basics. What we can all do, however, is try to come up with a portable, standard 3D interface paradigm that will make using 3D applications, and therefore 3D hardware acceleration, much more intuitive.

Conclusion
In order for 3D hardware to truly become the premier PC product category of 1995 (or 1996, if you’re a pessimist), there must be 3D graphics software shipping which can be dynamically adapted to support 3D hardware acceleration just by replacing DLLs, runtime libraries, and the like. The new breed of smart, software-only renderers must evolve to support such hardware. Alternately, something like VAGI needs to get to market (and VAGI needs resources, in the form of human bodies, to get that accomplished), and/or Intel and Microsoft need to be encouraged to further expand their 3DR and 3D DDI interfaces across multiple OSes, as well as to include greater support for more advanced graphics functionality. Finally, some sort of standardization needs to come about in the area of input devices for 3D applications and software, in order to help further promote the growth of 3D software acceptance.
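
That “replace a DLL and inherit acceleration” model is less magical than it sounds, by the way; underneath, it amounts to little more than a table of function pointers the application calls through, filled in by whichever rendering back end happens to be installed. A hypothetical sketch in C (the names are mine, not any shipping API’s):

    /* The rendering entry points an application calls through. */
    typedef struct {
        void (*begin_frame)(void);
        void (*draw_triangle)(const float xyz[3][3], const float rgb[3][3]);
        void (*end_frame)(void);
    } RENDER_DRIVER;

    /* Two interchangeable back ends, each living in a replaceable
       DLL or runtime library.  An application written today against
       the software renderer picks up hardware acceleration later
       without so much as a recompile. */
    extern RENDER_DRIVER software_driver;  /* CPU rasterizer, a la today's APIs */
    extern RENDER_DRIVER hardware_driver;  /* hands triangles to a 3D board     */

    RENDER_DRIVER *select_driver(int accel_present)
    {
        return accel_present ? &hardware_driver : &software_driver;
    }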

If you are an active 3D hardware vendor now, or plan on being one soon, the only way you’ll be successful is by grooming the PC graphics market into one that readily accepts and requires 3D technology. That takes time, money, ingenuity, and cooperation. Maybe what this requires is a marketing coalition of 3D hardware (and software?) vendors. Anyone want to start such a consortium?