Archive for July, 1996

Conspiracy Theory, a la Microsoft

Tuesday, July 30th, 1996

(This column first appeared in the July 30, 1996 issue of PC Graphics Report)

The last few years have shown us that paranoia is alive and well in our culture. The greater public presence of numerous militia groups such as the Freemen, movies like JFK, The Roswell Incident, and The Rock, the popularity of TV series like The X-Files, the recent bombing in Atlanta (and possibly of TWA flight 800), and the confidential FBI file flap at the White House all contribute to our mass paranoia. If we look at the cause of that paranoia, much of it boils down to the belief that the government (ours or someone else’s) is conspiring to enslave us through the strategic control of information – in particular, by withholding or altering facts to protect and enhance the position of government (and government officials) to the detriment of free people everywhere.

While there will always be paranoia among some part of the population, the number of people concerned about the potential loss of their personal liberty and freedom is rising at an alarming rate. Perhaps even more perturbing are people who are oblivious to the whole situation.

So, what does this all have to do with the PC industry? Well, we have our own government-like entity, trying to convince us it knows what’s best for us in the long run. This entity tells us if we support it and its products, our future will be a glorious and harmonious one. In case you hadn’t guessed, this entity is none other than Microsoft.

Conspiracy Symptoms

A company wanting to monopolize a market or industry is nothing new. In the rare cases where one has succeeded, sooner or later something has destroyed, or at least disabled, the monopoly. The only other company that has really come close to the monopolistic success Microsoft is now enjoying – and nowhere near as rapidly – was AT&T, and the government had to intercede there.

What differentiates Microsoft and AT&T is a combination of time, technology, and ethics.

Time

Microsoft has come from nowhere in less than 20 years, while AT&T (a.k.a. Ma Bell) took its time to build a monopoly – over 75 years.

Technology

In terms of technology, Microsoft’s is far more complex and invasive than AT&T’s ever was. While you might think control of your phone lines was the ultimate in invasive technology, it’s nothing compared to the access Microsoft has to all your most personal digital data, should it choose to use it. See more on this below.

Ethics

Finally, in terms of ethics, Microsoft frequently appears quite lacking, much more so than AT&T apparently ever was. Actions speak louder than words, and all of Microsoft’s protestations to the contrary, the company won’t let any obstacle stand in its way when driven by the megalomaniac force of Bill Gates.

Microsoft’s weapons of war include an excellent PR engine, a proven ability and desire to undermine industry standards if Microsoft doesn’t control them, more money than many small nations, the ability to run roughshod over anyone or anything perceived as a threat, and last but not least, a vision.

A vision can be a great thing if you happen to share it. To put visions in perspective, however: visions of how life should be have caused most (if not all) of the wars in our species’ history on this planet.

Nashville

Why the gloom and doom about Microsoft’s growing virtual monopoly? In a word, “Nashville.” While Nashville is a familiar place to fans of country music, in the future of the PC industry it’s the code name for Microsoft’s vision of what will drive the PC. Attendees of WinHEC ‘96 got a glimpse of what Nashville will become, namely a common interface to both your PC’s contents and content on the Internet.

Nashville is supposed to blend Internet Explorer 4.0 (scheduled for an end of year beta) with Windows 95 and a new Explorer, and will most probably result in something called “Windows 97.” Throw ActiveX into the mix, and you end up with the biggest computer virus to have ever been released. And Microsoft will be laughing all the way to the bank because people will stand in line to pay real money for it.

Why is Nashville such a scary concept? Let me itemize the key concerns:

  • According to Microsoft’s WinHEC presentation, users will see their computers’ contents presented to them in the same way they see the Internet and the World Wide Web. The average PC user won’t be able to tell whether they are accessing live Internet information, archived Internet information, or local files.
  • Greater networking capability and support, all as transparent as possible, will be a key feature of the new Windows technology. Combined with the item above, you have security issues of unprecedented magnitude. With the Windows interface simplified and unified this way, normal users could very easily end up making their entire system’s contents public for all the world to see.
  • Even worse, if there’s no real differentiation between local and remote data, it becomes incredibly easy to accidentally save or store data in an unwanted (and possibly quite public) place.
  • Microsoft’s push for widespread use of ActiveX controls and applets is already bearing fruit, rotten fruit. ActiveX controls which require other controls are now appearing. Some of these controls (and some Web pages) appear to indiscriminately load other controls from somewhere else on the Internet. In addition to overloading your system with infrequently used code, what better way to introduce a virus?
  • With Nashville networked at all levels, and ActiveX controls being prevalent, not much prevents Microsoft (and other ruthless companies) from installing unauthorized software on any user’s system to scan its hard drives for interesting files; record user behaviors, Internet use, and even passwords; and transmit them anywhere. Yet another form of potential security breach.
  • The “theoretical” behavior-recording applet I just mentioned could easily be used to create demographic information about how often a machine is used, what applications and/or games are being run, and what Web sites and news groups you are accessing. Perhaps even a list of everyone the user sends e-mail to… This demographic information could then be resold to anyone willing to pay for it, including promoters of products who arrange to have customized advertising posted on your system the next time you boot. All of it would come courtesy of yet another specialized ActiveX control downloaded transparently by your new Windows operating system, and all without your being able to do anything about it if you want to continue accessing the Internet.

Of course, the above list isn’t necessarily complete. However, ever ready to keep application competitors on their toes, Microsoft has already announced that the user interface for IE 4.0 should be the design target for all developers, since IE 4.0’s GUI will become the universal look and feel for all Microsoft applications.

Ralph Grabowski, editor of the CAD++VRML newsletter (http://users.uniserve.com/~ralphg/), predicts this will irritate application developers to no end, since Microsoft has changed its UI requirements many times in the last several years – starting with urging support for OS/2’s UI, then Windows 3.x, then adherence to the Microsoft Office UI and Windows 95’s new look, and now something completely new. Each time, Microsoft has expressed confidence that its requirements were pretty much stable and final. Yeah, right.

Parting Paranoid Ponderings

Someone recently bet I couldn’t come up with a theory combining government conspiracy with Microsoft’s capitalistic excesses. Here’s a hypothetical scenario and I’ll leave it to you to decide whether or not I’m being unnecessarily paranoid:

Scenario Building Blocks:

  • The U.S. government is extremely concerned about how quickly the Internet has gained popularity because the Internet offers the means to disseminate any information nearly instantly, including matters pertaining to illegal activities as well as potentially damaging and top secret documents.
  • While the U.S. government has an action plan in place to be able to shut down the Internet in case of a breach of “national security” resulting from the transmission of restricted data, they currently have no way of deleting the critical information from the systems of users around the world. (Note: The Internet could be quickly rendered inoperative by severing a small number of national “data pipes” which form the Internet’s backbone, as well as shutting down proprietary on-line services).
  • Microsoft has Internet OS technology which may rapidly become the de facto installed standard on many, if not all, computers.

Scenario Evolution:

  • The government, while publicly chastising Microsoft for monopolistic behavior, privately encourages Microsoft to continue such behavior, with the requirement that Microsoft give the government access to a “backdoor” in its future operating systems, much along the lines of the Clipper encryption chip proposal that failed in Congress.
  • Microsoft, taking advantage of any opportunity to increase market share, agrees to the government deal, and proceeds to eliminate all competition.

Scenario Outcome:

  • When some information the government does not want widely distributed or discussed is released anyway, the government shuts down the Internet and uses the Microsoft OS backdoor to wipe all record of that information, as well as of anything else determined to be capable of undermining the stability of the government (Bill Clinton jokes, for example).
  • As an added measure, some disaster is fabricated, martial law is declared, and our lives are irreversibly changed. Maybe Bill Gates even joins the cabinet as Secretary of Information.

Shark Diving in San Diego

Monday, July 1st, 1996

(This article first appeared in the July/August 1996 issue of Dive Log)

Man feeding shark
When you tell people that you’re going on a shark dive, you usually get one of three reactions:

  1. Disbelief from people who think you’re making the whole thing up and ignore you, or worse, start watching you warily, thinking you’re going to do something foolish and upset their routine.
  2. Incredulousness, somewhat similar to the disbelief of the above item, except these people think you’re serious – and crazy. They tend to be big fans of the Jaws series of movies and books. Then they get nonchalant when you tell them there’s a shark cage involved.
  3. Excitement, usually only if the other people you’re sharing your plans with are divers, or don’t believe in the hysteria Jaws caused.

In my case, I was excited when I discovered that a recent trip to San Diego, and more particularly, a free Saturday at the end of my conference there, would coincide with the first shark dive of the season with San Diego Shark Diving Expeditions, and its proprietor, Paul Anes.

Several friends of mine had recommended checking out his operation, and at a recent dive conference I helped out at, Marty Snyderman (a well known underwater photographer and videographer) echoed that sentiment.

I called Paul, reserved a spot on the dive (at a cost of $250), and anxiously waited six weeks until I actually made it to San Diego.

The conference I attended in San Diego was interesting, but nothing truly exciting. I was hoping the shark dive would turn out better.

April 27th
The night before the dive, I checked out my gear, lubed the O-rings on my camera housing (I wasn’t about to do the dive without proof I was there), and packed my now-heavy gear case. I had brought my only cold-water dive protection with me, namely my Viking dry suit, and had already faced derision from folks who said that was overkill for California waters.

On the morning of April 27th, I put my gear case on a pair of luggage wheels and made my way to the hotel lobby. As it turned out, my hotel was less than a half mile from the harbor where the boat we’d be on was docked, and since there was no taxi in sight, I decided to hoof it. Upon making it out onto the path to the harbor, I was faced with an ill omen – thousands of people walking right towards me. There I was, a sole person, facing an on-rushing tide of humanity. I struggled my way through the human cattle herds only to discover the harbor I had found was the wrong one, and that I needed to go another half mile with wheeled cart and case dragging behind me. When I finally made it to the boat, the HydroDiver, I discovered that the mindless lemmings I had fought my way through were sacrificing themselves for a Walk for Life charity event.


Picture of the HydroDiver, with shark cage mounted on the rear of the boat

The HydroDiver
Upon arriving at the scene of our imminent departure, I found a large cage (with no discernible shark tooth marks) attached to the back end of the HydroDiver, and everyone waiting for me, as usual.

After being introduced to my “bait-mates”, and loading my gear on board, we departed under the care of Capt. Jim Stickler.

The place where we would start chumming the waters with fish blood and guts was about 12 miles out from shore. It didn’t take long to get there, and our first order of business was to get suited up for a trial run at getting in and out of the cage, and to make sure we were properly weighted, before any sharks actually showed up.

The Cage
The shark cage is designed to float about 10 to 15 feet below the surface, about 20 to 30 feet from the boat. Unlike many east coast shark cages, where you have to hop in the top, the opening on this cage is in the “back”. The cage has a maximum capacity of three divers.

Now, you may be wondering why anyone in their right mind would swim 25+ feet in shark-infested waters to get to a cage underwater, when sharp teeth lurk everywhere. Well, San Diego Shark Divers has added a new twist to the whole experience – they offer trained handlers who wear an extra piece of insurance, namely a chain mail suit, impervious to the penetration of shark teeth, though not necessarily a guarantee against bruises and small scrapes. The handler escorts each diver to and from the cage, keeping sharks away until the diver is safely secured.


Paul Anes preparing fish for bait and hand feeding of sharks

Getting Chummy
Anyhow, after we made sure we all knew the proper cage commuting procedure, and were properly weighted (a little heavy to avoid spending much time on the surface), it was time to start chumming the waters. While Paul prepared some whole fish for hand feeding the sharks, all the chum used to create blood slicks to attract the sharks was actually preprocessed and frozen. I was surprised to learn that there are three companies in the San Diego area that specialize in making buckets of chum just for shark diving and fishing. Having been told that the first sharks usually appear an hour or two after chumming commences, we entered a pool, selecting 15-minute chunks of time in which we thought the first shark would come up and play. We were all wrong. The first shark showed up barely 15 minutes after the first ladleful of blood and gore was showered upon the surface of the water. Mind you, it wasn’t a big one, just a two-and-a-half-foot blue shark. A three-footer joined him minutes later.


A small Blue Shark takes the bait.

Paul putting on his chain mail suit

Soon thereafter it was time to get suited up. Watching Paul don his chain mail was quite fascinating. His company owns two of the $7,000 chain mail suits, and while they’re custom made, they still don’t fit all that well. Paul and his fellow handler, Dennis Alba, had quite a time getting him into the 18-pound suit. Ultimately, the tried and true remedy of duct tape was used to force a snug fit. By the time he and the rest of us were dive-ready, a half dozen more blue sharks had appeared.

Yours truly (in center) in the cage, with sharks all around.

Sharks Ahoy!
After Paul got in the water, I was quick to follow, escorted by him, of course. The water was a balmy 66 degrees, and visibility was well over 60 feet. I felt overdressed in my dry suit, but was later thankful for not being wet or cold. A couple more divers followed, and we were in the midst of shark soup.

First Paul, and later Dennis, started hand feeding sharks right in front of us. We had been warned that if a shark tried to get into the cage with us, we should bonk it on the nose, and it would back right out. They were right.

I was amazed by the voracity of the sharks. One moment, they’d slowly be circling around the cage and the handler; the next, they’d lunge in at the fish in his hand, attempting to swallow the fish whole, including the handler’s hand. My first time in the cage passed quickly, taking a mere 50 minutes and one roll of film. An hour later I popped down for another 40 minutes, and another roll of film. Time seemed to stop while I watched the sharks lazily swim by, circling the cage in the apparent hope that the yummy treats inside might come out and feed them. They tested the cage with their noses, since the metal of the cage disrupts the sensory equipment located there.


Dennis Alba hand feeding a hungry shark

Toward the end of the second dive, I counted 16 sharks circling the cage, with the smallest at just over two feet, and the largest a sizable 7-footer. When I left the cage under Dennis’s capable guidance, I had to bonk several sharks on the nose to keep them away from me. What fun!

My biggest regret during the shark dives was that I couldn’t go outside the cage to get better pictures – the opening in front of me wasn’t large enough to comfortably fit my housing and strobe through and still be able to look in the viewfinder.

The Fish Story
Perhaps the greatest irony of the day was that as soon as we had gotten out of the water, a pod of dolphins showed up to play with the sharks, and not long after, a bunch of seals joined the fray. If we had only stayed in the water a few minutes more, and I had not run out of film… In any event, the appearance of these mammals certainly gave us a great topic for conversation over Mexican food and margaritas that night.

I’ve been told that later in the summer, the more aggressive Mako sharks turn up, and occasionally they’ll also see Mola-Molas (also known as ocean sunfish).

Would I do this again? Absolutely – it was worth every penny.

I think the dives I did were a spectacular way to interact with and observe these natural predators, and San Diego Shark Diving Expeditions really runs a safe and eminently enjoyable operation. I give San Diego Shark Diving Expeditions, and the whole shark diving experience, an earth-shattering 9.5 out of 10 on my Richter Scale! If you find yourself in the San Diego area, give Paul Anes a call at 619-299-8560, FAX: 619-299-1088. (His address is: San Diego Shark Diving Expeditions, P.O. Box 881037, San Diego, CA 92168-1037.)

The Revolution in 3-D Graphics Hardware

Monday, July 1st, 1996

(This article first appeared in the July/August 1996 issue of Desktop Engineering Magazine, and is reprinted here with their permission)

Recent and upcoming developments in 3-D graphics should ensure that most engineers will soon be looking into personal workstations – and liking what they see.

The personal workstation phenomenon we’ve been hearing about recently is fueled by a revolution occurring in graphics display acceleration. Those of us who have followed trends in PC graphics have watched the transition over the last decade from simple, unaccelerated graphics devices (e.g., EGA or VGA) to blazingly fast 2-D accelerators. New superchips are showing up from such companies as ATI, Matrox, Number Nine, and S3, among others.

Windows NT, combined with high-speed Pentium and Pentium Pro CPUs, has brought serious graphics to the PC. And while they still fall short of the power of full-blown UNIX number-crunchers, PC workstations are quickly growing in strength. And, they are, in a word, cheaper.

The latest trend in graphics chips is to add 3-D functionality. According to some analysts, more than 30 different new 3-D chips will hit the market in the next 12 months, many from start-up companies, the rest from established 2-D chip makers. Intel predicts that by the end of 1997 every new computer sold will include some form of 3-D graphics hardware acceleration.

Software for 3-D
While the technology for PC 3-D graphics hardware has been around for many years (Matrox had a serious PC 3-D graphics board in the late 1980s), such products didn’t take off because they were expensive – in the $6,000+ range – and had little software support. The software support issue was the more serious factor in hindering the acceptance of 3-D PC hardware. Early 3-D boards came with specialized utilities for viewing 3-D vector files, as well as 3-D display-enabled drivers or add-ons for DOS versions of AutoCAD, but little else. Since AutoCAD was not really 3-D display-enabled, the result was frequently a kludgey, difficult-to-use interface.

Then, about two years ago (interestingly, more or less about the time Doom was released) several companies started heavily promoting 3-D graphics APIs (application programming interfaces) and software rendering libraries. Doom, it seems, was a significant catalyst in a new craze in 3-D PC graphics, even though it wasn’t even a true 3-D program.

At the same time, the PC 2-D graphics hardware market had more or less topped out in user-perceivable performance gains and resolution, and graphics hardware vendors were looking for new growth areas. They found them in multimedia, particularly multimedia’s digital video aspect, and in 3-D graphics.

OpenGL Meets Windows NT
About a year ago, Microsoft started shipping OpenGL with Windows NT, and it became the final catalyst needed to bring 3-D graphics acceleration to the engineering PC desktop. OpenGL is a 3-D graphics API developed by Silicon Graphics and a consortium of other players in the graphics market, and it is intended to be an open, portable version of the IrisGL API that SGI developed for its Iris workstations years ago.

In the three or four years it has been out, OpenGL has become the de facto 3-D graphics API standard on all popular high-end computing platforms. (This is in spite of the fact that, according to some, it falls short in portability because many of its features depend on items unique to SGI hardware.) What this means in terms of software is that virtually all workstation graphics applications are based on OpenGL these days, regardless of the platform or operating system.
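
Part of OpenGL’s appeal is how little code a working program requires. The fragment below is a minimal sketch of my own (not from any shipping product) that draws a single smooth-shaded triangle; it assumes Mark Kilgard’s freely distributed GLUT toolkit for the window plumbing, and the same handful of calls compiles unchanged on an SGI workstation or a Windows NT PC:

```c
/* Minimal OpenGL sketch: one smooth-shaded triangle.
 * Assumes the freely distributed GLUT toolkit for windowing. */
#include <GL/glut.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);

    /* A different color at each vertex; OpenGL Gouraud-shades
     * (interpolates the colors) across the triangle's face. */
    glBegin(GL_TRIANGLES);
        glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.8f, -0.8f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.8f, -0.8f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.8f);
    glEnd();

    glFlush();   /* make sure the commands reach the hardware */
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutCreateWindow("OpenGL triangle");
    glutDisplayFunc(display);
    glutMainLoop();  /* GLUT owns the event loop from here on */
    return 0;
}
```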

Microsoft’s release of Windows NT with OpenGL, combined with high-speed single- and multiprocessor Pentium systems, has given workstation application developers a way to port their applications to lower-cost machines. This is attractive to software developers because lower workstation costs should expand the market for their applications and, in turn, their sales volumes. Similarly, developers of high-end PC graphics applications, such as CAD and modeling products, can now discard custom 3-D driver interfaces and run their products on top of OpenGL under Windows NT.

Of course, the number of machines running Windows NT is a drop in the bucket compared to those running Windows 95. But Microsoft now offers OpenGL for that popular platform as well. This means that supporters of PC-based OpenGL applications have a potential market opening up that dwarfs the entire workstation market. No surprise then that a number of developers of both workstation applications and high-end PC applications have ported their software to Windows NT and 95 on top of OpenGL. And the numbers are growing weekly.

OpenGL and 3-D Hardware
The Windows NT implementation of OpenGL provided the software support that PC 3-D graphics hardware needed. Microsoft provided a low-level driver mechanism, 3D-DDI (3D Device Driver Interface), in Windows NT 3.51 for those 3-D hardware companies wanting to provide end users with a noticeable boost in 3-D rendering performance. For companies wanting to show even faster 3-D graphics, there is a type of driver called an OpenGL Client, which abstracts much of the OpenGL functionality in such a way that hardware with similar characteristics to SGI’s workstation graphics hardware can significantly accelerate OpenGL rendering.

Only a handful of hardware companies have implemented OpenGL Clients for their graphics devices, most notably AccelGraphics. The rest have implemented 3D-DDI support. The main reason is that 3D-DDI drivers are easier to develop than OpenGL Clients. Doing an OpenGL Client also requires the developer to pay Silicon Graphics a significant licensing fee, while 3D-DDI has no such requirement. Some companies also report not seeing a significant performance improvement with an OpenGL Client compared to 3D-DDI.

The companies that have developed one or both of these drivers include 3Dlabs, AccelGraphics, Artist Graphics, Evans & Sutherland, Intergraph, Matrox, and Oki. Of these, only 3Dlabs sells its chips to others; the board-level OEMs of 3Dlabs’ GLINT 300SX include Diamond/SPEA, ELSA, Fujitsu, and Omnicomp.

3-D Hardware: The Next Generation
Perhaps because many engineering application packages cost thousands of dollars, many companies targeting Windows NT and OpenGL with their graphics hardware appear to be charging quite a bit for their products; average prices run around $2,000 for a loaded OpenGL accelerator board.

Still, that’s just one end of the PC 3-D hardware spectrum, the other end being the consumer and commodity 3-D hardware markets, which today barely exist.

Again, the reason for this is a lack of software support. However, by the time you read this, more than a dozen different new sub-$300 PC 3-D graphics boards will ship, all just for making 3-D games look better and run faster. The games that take advantage of these boards do not use OpenGL; they don’t need the visual accuracy OpenGL provides, nor the additional performance overhead it incurs as a result. Instead, they use a blend of hardware-specific libraries and game-oriented 3-D graphics APIs such as Argonaut’s BRender, Criterion’s RenderWare, Apple’s QuickDraw 3D RAVE, or the up-and-coming Microsoft Direct3D API. (Yes, Microsoft is not limiting itself to OpenGL; it has evolved 3D-DDI and combined it with RealityLab, the 3-D graphics API and library it owns as a result of purchasing RenderMorphics last year.)

Of all 3-D APIs, the one that stands to truly open the PC market to 3-D acceleration at all levels is Direct3D. While initial comments from game developers about Direct3D’s performance have not been flattering, every 3-D graphics hardware company is rushing to complete Direct3D drivers, because Direct3D is something that Microsoft has decreed to be the future of PC 3-D. Where this helps engineering applications is that Microsoft has publicly stated that a new version of Windows-based OpenGL is forthcoming. The new version will sit on top of Direct3D instead of 3D-DDI, thereby enabling any 3-D graphics board that has a Direct3D driver to become an OpenGL accelerator as well. The OpenGL Client acceleration model will still be available to companies wanting to further improve OpenGL performance.

In short, if you need OpenGL acceleration on Windows platforms today, it’ll probably cost you a lot more than if you can wait 6 to 12 months.

Support and Performance
Now that you’ve learned that you might be able to run MicroStation, 3D Studio Max, Pro/Engineer (and the latest killer 3-D games) all on the same piece of graphics hardware, you should know what to look for in such devices.

With software support of 3-D acceleration ultimately acting both as an enabling mechanism and a stabilizing factor, what’s left to differentiate various PC 3-D hardware devices are price, support, and performance. But because this market is still in its infancy, and because there are more than 30 companies vying for the 3-D hardware dollar, prices will fluctuate in a fit of competitive frenzy.

Support is perhaps the most critical issue, since 3-D hardware is a relatively new technology, at least on the scale on which such devices are going to be sold. Support considerations include:

  • Will the company still exist a year from now?
  • What drivers does the company currently support? (Drivers can be extreme vaporware; promises about drivers should be taken with a grain of salt)
  • Is there another source for drivers if the company stops supporting the product?
  • What type of phone, fax, or E-mail support does the company provide, and does it cost?

Performance is a central issue also, although, unfortunately, it’s not something easily measured, both because of the diversity of 3-D graphics applications (e.g., engineering vs. entertainment vs. modeling) and because the PC 3-D market is still so young. Rumor has it that Ziff-Davis Benchmark Operations is working on a 3-D benchmark for release at the end of the year, joining its well-known WinBench, WinStones, and PCBench benchmark series.

Therefore, when a board vendor quotes some esoteric 3-D performance number, ignore it until a unified benchmark comes along. Until then, rely on magazine reviews and peer experience.

As a rule of thumb, however, the performance of a given 3-D graphics board is affected by (1) the bus type of the board, (2) the quality of the driver support, and (3) the feature set of the 3-D graphics chip being used. If you buy a 3-D graphics board in 1996, it should be a PCI bus-based board. In 1997, Intel’s Accelerated Graphics Port (AGP) – a new bus technology that overcomes the graphics bottleneck the PCI bus creates – will start becoming popular.

The quality of the 3-D drivers that come with a given board is difficult to discern. In the past, the best places to get information on driver quality have been public forums like CompuServe, AOL, and the Internet, as well as magazine reviews.

The final performance issue is: What does the hardware accelerate? The simplest form of acceleration is a “span engine.” A span engine draws horizontal strips, leaving the calculation of where to draw the strips up to system software. This really sucks up system bandwidth.

Next up the scale is a triangle drawing engine. Nearly all 3-D triangle engines support smooth shading algorithms, an important requirement, and also generally off-load quite a bit of the calculation process from the system. Triangles are a key 3-D drawing component, since all other complex shapes can be reduced to triangles and triangles are relatively easy to draw.
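
To make that division of labor concrete, here’s a rough sketch in C – purely illustrative, using a stand-in software framebuffer rather than any real board’s interface. The inner loop of draw_span() is everything a span engine does in hardware; a triangle engine takes over the surrounding edge-walking as well:

```c
/* Illustrative rasterization split; the framebuffer is a stand-in. */
#define WIDTH  640
#define HEIGHT 480

static unsigned char framebuffer[HEIGHT][WIDTH];

/* What a span engine accelerates: filling one horizontal strip. */
static void draw_span(int y, int x0, int x1, unsigned char color)
{
    for (int x = x0; x <= x1; x++)
        framebuffer[y][x] = color;
}

/* What a triangle engine moves into hardware as well: walking the
 * edges of a (here flat-bottomed) triangle and generating the spans.
 * Apex at (xt, yt); bottom edge from (xl, yb) to (xr, yb), yb > yt.
 * Clipping and color interpolation are omitted for brevity. */
static void fill_flat_bottom_triangle(float xt, float yt,
                                      float xl, float xr, float yb,
                                      unsigned char color)
{
    float step_l = (xl - xt) / (yb - yt);  /* left-edge x change per scanline  */
    float step_r = (xr - xt) / (yb - yt);  /* right-edge x change per scanline */
    float l = xt, r = xt;

    for (int y = (int)yt; y <= (int)yb; y++) {
        draw_span(y, (int)l, (int)r, color);
        l += step_l;
        r += step_r;
    }
}
```

Smooth (Gouraud) shading adds a color increment to those same per-scanline and per-pixel loops, which is why a triangle engine that also interpolates colors offloads far more work from the CPU than span fills alone.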

Usually used in conjunction with 3-D triangles and lines is something called a Z-buffer: a special memory area used for the depth sorting of pixels so that objects that are supposed to be in front of other objects actually visually appear that way on your display. Further up the 3-D feature chain are devices that do hardware texture mapping; that is, they apply 2-D graphical data, such as digital photos, onto 3-D surfaces, even wrapping them around 3-D objects.
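
The Z-buffer test itself is just as easy to sketch (again hypothetical, reusing the stand-in framebuffer and dimensions from the previous fragment): every pixel write carries a depth value, and the write only lands if the new pixel is nearer than whatever is already stored there.

```c
/* Per-pixel Z-buffer test; framebuffer and dimensions as in the
 * previous sketch. One depth value is kept per pixel. */
static float zbuffer[HEIGHT][WIDTH];

/* Before each frame, reset every depth to "infinitely far". */
static void clear_zbuffer(void)
{
    for (int y = 0; y < HEIGHT; y++)
        for (int x = 0; x < WIDTH; x++)
            zbuffer[y][x] = 1e30f;
}

static void write_pixel(int x, int y, float z, unsigned char color)
{
    if (z < zbuffer[y][x]) {         /* nearer than what's there? */
        zbuffer[y][x] = z;           /* remember the new nearest depth */
        framebuffer[y][x] = color;   /* and let the pixel through */
    }
}
```

With that test in place, drawing order stops mattering: whichever surface is nearest at a given pixel wins, no matter when it was drawn.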

Virtually all the devices coming out by year’s end will support a Z-buffer, 3-D line and triangle drawing, texture mapping with bilinear or trilinear interpolation and perspective correction, and lighting. If possible, for future compatibility, you should make sure any device you purchase has most if not all of these features.

A 3-D Future
There’s a lot more to the exploding PC 3-D market than can be covered in a single article. The market is growing exponentially, in terms of both hardware and software. If you want to keep track of all the latest PC 3-D graphics hardware developments, an excellent place I can recommend is the Dimension 3D Page on the Web.

And, speaking of the Web, 3-D software is a major driving force in cyberspace, where revolutionary technologies, such as Virtual Reality Modeling Language (VRML) and 3-D browsers, are opening up new 3-D realms for exploration. These changes, combined with the rapid deployment of high-end 3-D applications for Windows 95 and NT, and the incredible boom in 3-D game technology, should make for a truly amazing time ahead.


Side Bar: Heidi Ho

Not all high-end applications are using OpenGL as their native 3-D graphics rendering API. Autodesk has developed its own 3-D rendering API called Heidi. Heidi is the low-level layer of HOOPS, a high-level 3-D API Autodesk owns as a result of acquiring Ithaca Software several years ago. HOOPS was once used by most of the CAD packages that didn’t use OpenGL. Many of these packages have since been ported to OpenGL, as CAD companies have wanted to avoid having a competitor control their display capabilities.

Autodesk has debuted Heidi as part of its 3D Studio Max for Windows NT product, and has also indicated that Heidi will be the 3-D display and rendering engine used in a future release of its flagship product, AutoCAD. While some 3-D hardware companies are providing Heidi drivers, Autodesk has promised that Heidi will be able to sit on top of OpenGL.


List of companies mentioned in this article and their respective contact information:

AccelGraphics Inc.
San Jose, CA
408-441-1556 – Fax 408-441-1599

Artist Graphics
St. Paul, MN
612-631-7855 – Fax 612-631-7802

ATI Technologies
Thornhill, Ontario
905-882-2600 – Fax 905-882-9339

Autodesk/Kinetix Division
San Rafael, CA
415-507-5000 – Fax 415-507-5100

Diamond Multimedia/SPEA
San Jose, CA
408-325-7000 – Fax 408-325-7070

ELSA Inc.
San Jose, CA
408-935-0350 – Fax 408-935-0370

Evans & Sutherland
Salt Lake City, UT
801-588-1000 – Fax 801-588-4540

Fujitsu Microelectronics
San Jose, CA
800-558-2494 – Fax 408-922-9179

Intergraph Corp.
Huntsville, AL
205-730-5441 – Fax 205-730-6188

Matrox Graphics
Dorval, Quebec, Canada
514-969-6320 – Fax 514-969-6363

Microsoft
Redmond, WA
206-880-8080 – Fax 206-936-7329

Number Nine Visual Technologies
Lexington, MA
617-674-0009 – Fax 617-674-2919

Oki America/Advanced Products Division
Marlborough, MA
508-624-7000 – Fax 508-480-9635

Omnicomp Graphics Corp.
Houston, TX
713-464-2990 – Fax 713-827-7540

S3 Inc.
Santa Clara, CA
408-980-5400 – Fax 408-980-5444

3Dlabs Inc.
San Jose, CA
408-436-3455 – Fax 408-436-3458