Archive for the ‘PC Graphics Report’ Category

Tropical Computing

Monday, July 7th, 1997

(This column first appeared in the July 7, 1997 issue of The Peddie Report. The Peddie Report used to be The PC Graphics Report.)

I was driving around my new neighborhood a week or so ago, looking at the sand and cactus around me, and for some reason I started thinking about panoramic viewing software. That led me to wonder what had ever happened to Eric Gullichsen, who a couple of years ago had demo’d the first such software I had seen on a PC, and whom I’d known when he was a VR guy, first at Autodesk and then when he co-founded Sense8.

That night when I read my latest issue of The Peddie Report (TPR), much to my surprise I found an item about dear old Eric. It appears he is now living on the island of Tonga in the South Pacific, doling out domain names to folks far and near. I wonder if he still lives on a houseboat like he used to in Sausalito.

Anyhow, what really caught my eye in that article was the following editorial comment from my colleagues at TPR:

Obviously, the reality of living in a tropical paradise outweighs the notion of trying to access one via a desktop computer. We truly hope that this is one technology trend that catches on. Our bags are packed, and we’re just waiting for Microsoft to buy us out. We have an Internet strategy for goodness sake. Cut us a check, and we’re out of here. JPA Tahiti. Hmmm, has a nice ring to it.

Well, my friends, it is a trend and I will be bold enough to try and take at least some credit for starting it. Why? Because as I write this, I am looking out my glass doors at the palm trees in front of my house, swaying in the ocean breeze. You see, two weeks ago, my family and I moved from the hinterlands of
tax-free New Hampshire to the Caribbean island of Bonaire, and we plan on living here indefinitely. And best of all, it didn’t require being bought out by Microsoft to make this move.

Technology made me do it
I (with the permission of my wife) had decided to move somewhere … anywhere at all (other than New England), as long as it ended the stagnation I had started to feel. The option of where to move was limited to places with which we were reasonably familiar and comfortable, the local scuba diving had to be good, and Microsoft couldn’t have an office there (spoils the neighborhood, especially if you’re a Microsoft critic like myself). Most important, whatever location we chose, it had to have viable Internet service readily available, so I could continue my writing career and communicate with editors and others.

As luck would have it, Bonaire gained an ISP in February; my wife and I liked the people and the island; and Bonaire had world-class diving. Poof. Here we are.

The vision of virtuality
Okay, enough background information. One of the oft-admired qualities of the Net is that it enables telecommuting and remote communication. What the visionaries who promote these Net capabilities fail to take into account is the reality of connections and culture other than those of the U.S.

While a Caribbean island might be a somewhat extreme case compared with the business Meccas of Europe, the U.S. and the Pacific Rim, it can be used as an example to underscore the fact that not all places are as compatible with the future as one might otherwise think. This is especially true if your benchmark is the warped perspective that Net computing in the U.S. offers.

Let me provide some basic Bonaire statistics to help put my subsequent comments in perspective:

Government: Democratic, part of the Netherlands Antilles, which are part of the Netherlands
Size: 112 square miles (one-third of which is a national park and uninhabited)
Location: About 50 miles north of Venezuela, 30 miles east of Curacao, and about 80 miles east of Aruba
Population: Approximately 15,000
Industry: Recreational tourism (diving, watersports), rice processing, salt (from the ocean), oil transshipment port
Electricity: 127 volts (on a good day), 50 Hz
Mail: Slow – 2-3 weeks from the U.S., longer to the U.S.
Business hours: Stores open 9-10 am to noon and 2 pm to 4-5 pm weekdays; most are closed weekends
Wages: $250-700/month, depending on type of unskilled or semi-skilled job
Cost of living: A gallon of (almost) fresh milk is $6. Housing is expensive (compared to New Hampshire), as are utilities (compared to everywhere but New Hampshire)
Languages: Dutch, Papiamentu (a Creole-like language), Spanish, and English

Bonaire statistics
With current first-hand experience of a not-so-unusual Caribbean island in mind, let’s take a look at some Net and other computing perceptions:

Perception: Internet access is basically free.
Reality: You won’t find any $19.95 unlimited access here. Costs run around $5/hour for a 28.8 Kbps modem connection, and faster connections simply are not available.

Perception: Everyone has at least 28.8 Kbps access to the Internet.
Reality: Well … being able to connect at 28.8 Kbps with a modem at the ISP’s, and being able to get 28.8 Kbps data transfers via the Net, are two very different things, as I’ve discovered. My effective throughput to/from the Internet is closer to 9,600 to 14,400 bps. That makes it pretty painful (and expensive) to download those “small” 5 Mbyte sample and software files from websites.

Perception: Internet backbone bandwidth will double every (pick a unit of time).
Reality: So? The laws on Bonaire forbid anyone but the local telephone monopoly to have satellite transmission capabilities, and the size of the data pipe to Bonaire is limited for technical and cost reasons.

Perception: Sure, we can do real-time audio/video/whatever over a standard Internet connection.
Reality: Not at an effective rate of 9,600 or 14,400 bps.

Perception: Efficiency is critical.
Reality: Let me introduce you to the concept of “island time.” Shops are closed between noon and 2 pm. ASAP means whenever someone gets around to it (anywhere from next day to next month, or so). A common joke here is to say you want something immediately — it shows you have a sense of humor because of the absurdity of the request. So, having a slower Internet connection is not a problem culturally (although the cost of prolonged access is).

Perception: Anyone can start an Internet business.
Reality: Down here, starting a business requires a lot of patience. We applied for all the permits we needed to do Internet consulting, since that’s a wide-open market on Bonaire. We need four sets of permits: a business permit (to form the corporation), managing director’s permits (to run the corporation), work permits (to be able to work at our own business), and residence permits (so we can live here while running our business). We applied for the permits in April, and can still expect the government to take another 2-3 months to approve everything, assuming the process goes smoothly (it rarely does). We’re currently on long-term tourist visas, waiting for the wheels of local bureaucracy to turn (not that we mind the wait too much, being where we are).

Perception: Computers are cheap and widely available.
Reality: Widely available, perhaps, if you’re willing to wait until a properly equipped model is ordered and delivered from the U.S. or the Netherlands. Cheap? No. As a rule of thumb, with import duties, mark-ups, taxes and everything else, buying computers and other office fixtures usually runs about 2-3 times the U.S. retail price. Occasionally you can get a better deal.

Perception vs. reality
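Those effective-rate numbers translate directly into time and money. As a rough sketch (the $5/hour charge and the 5 Mbyte file size are the figures mentioned above; the arithmetic and the helper function are mine):

```python
# Rough time and cost of downloading a 5 Mbyte file at various
# effective rates, assuming a $5/hour metered connection.
RATE_PER_HOUR = 5.00  # US$, per the column

def download_cost(file_bytes, bits_per_second):
    """Return (hours, dollars) to move file_bytes at bits_per_second."""
    hours = (file_bytes * 8) / bits_per_second / 3600
    return hours, hours * RATE_PER_HOUR

for bps in (28_800, 14_400, 9_600):
    hours, dollars = download_cost(5 * 1_000_000, bps)
    print(f"{bps:>6} bps: {hours:.1f} hours, ${dollars:.2f}")
```

At the nominal 28.8 Kbps the file costs about two dollars; at a real-world 9,600 bps it takes over an hour and nearly six dollars.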

Addressing the global community
In a nutshell, what my experience on Bonaire has shown me so far is that Americans (myself included) tend to be rather ignorant of what a “global community” really is. Up until a couple of weeks ago, I could be heard uttering most, if not all, of the same invalid generalizations listed above. But I’ve learned that the rest of the world (Bonaire being only one locale among many) doesn’t always follow America’s pace or direction, and expecting the rest of the world to change to meet U.S. standards is folly. American companies and Internet potentates need to spend more time understanding and addressing global issues in terms of the Internet.

While slow Internet connections and a culture wildly different from “I want it now, and fast” are the norm in Bonaire (and elsewhere in the tropics), these items are a reasonable price to pay for having an office in a diving paradise — the sun shines every day, there are no stoplights on the entire island and, because it’s a desert climate, I don’t have a lawn to mow. I can’t think of many better places to put the Internet’s promise of remote connectivity to the test.

I should end with a caution to not bother trying to register domain names here. That process also works on island time. Eric Gullichsen’s Tonga-based automated domain name server may be a better bet.

COMDEX ‘96 and Parting Advice

Tuesday, November 26th, 1996

(This column first appeared in the November 26, 1996 issue of PC Graphics Report)

Well, this appears to be my last PCGR column, at least for the next couple months while I go on sabbatical, so I figure I had better make it count…

This was my 10th COMDEX/Fall. I usually leave the show with some bad cold or other easily contracted illness that comes from shaking the hands of hundreds, if not thousands, of people who have done the same with countless others, any of whom could be ill. This time was different. I was counting my healthy blessings after returning home Thursday… until I got a paper cut, on my left cornea. That’s right, a paper cut on my eyeball. My wife Linda’s first reaction was to laugh hysterically and then blame it on the COMDEX curse. She’s probably right. Anyhow, it only took two doctor’s visits, a patch and a day to heal. I guess I’ll avoid COMDEX for a couple of years to see if my health improves.

However, stories of stupid injuries aside, I went to COMDEX to bring you back some insights, so here, in no particular order, are the things I learned at COMDEX this year.

Jake’s Best of COMDEX
Best New Hardware Product: Trinity Video Production System by Play Inc.
Best New Hardware Runner-Up: Web TV from Sony and Philips/Magnavox
Best New Software: Tomb Raider by Core, published by Eidos
Best Product Roll-Out Entertainment: Mystere by Cirque du Soleil, courtesy of Microsoft Windows/CE
Best Bowling Score at Graphics Bowling Night: 168 (mine)
Best Adult Spin on a PC Game Title: Duke Screw’em 38D (vs. Duke Nukem 3D)
Most Bizarre Booth Show Concept: Samsung’s Fashion and Technology fashion show for high tech women
Most Annoying Pre-COMDEX Occurrence: Press releases sent as attachments to e-mail
Best Way to Get Your Car Quickly From a Valet: Tip valet $5 when you drop the car off

The DVD/DOS Dilemma
While talking to the folks at ProCD (they make those frequently updated CDs with everyone’s address and phone number) at the ShowStoppers media-only event at COMDEX, I discovered ProCD has already mastered a DVD-ROM packing the contents of their 5 CD set onto a single DVD disc. However, in the process of doing this, they discovered an interesting flaw, at least under older operating systems. It turns out that unless you have an operating system with 32-bit FAT support, like the latest releases of Windows 95 and Windows NT, you can only access the first 2 gigabytes of a DVD-ROM. That’s because DOS, Windows 3.1, and the initial versions of Windows 95 have a 2GB drive address limitation. That’ll put a real damper on some DOS game titles that currently require more than 4 CD-ROM discs for a complete game. It also means that Microsoft’s goal of forcing the world onto its latest operating systems will be aided, because DVD-ROM will be next to useless on anything else. I’m sure Bill Gates will be laughing all the way to the bank.
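For the curious, the 2GB figure falls straight out of FAT16 arithmetic: cluster numbers are 16 bits wide, and the largest standard cluster is 32 Kbytes. A quick sketch (these are the standard FAT16 parameters, not figures from ProCD):

```python
# Why DOS/FAT16 tops out at 2 GB: 16-bit cluster addresses
# multiplied by the maximum 32 KB cluster size.
MAX_CLUSTERS = 2 ** 16          # 16-bit cluster numbers (a few are reserved in practice)
MAX_CLUSTER_SIZE = 32 * 1024    # 32 KB, the largest standard FAT16 cluster

max_volume = MAX_CLUSTERS * MAX_CLUSTER_SIZE
print(max_volume // 2 ** 30, "GB")  # → 2 GB
```

A 32-bit FAT lifts the cluster-count ceiling by orders of magnitude, which is why only the newest Windows releases can see past the first 2GB.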

The Direct3D/DOS Dilemma
Direct3D (D3D) was the promised panacea to all of our 3D ills and woes on the PC, especially if we were game developers.

DirectDraw was the intended 2D panacea. Well, it appears game developers still aren’t biting. I’ve had the “pleasure” of being a judge for the Software Publishers Association’s annual CODIE awards in the category of Adventure and Role-Playing Games. Of the 27 new titles I’m judging, guess how many are Windows 95 only? Two. The rest are predominantly DOS based, and several are pretty heavy duty on the 3D side. The only use of Windows 95 that’s evident is that some of the DOS games use DirectDraw to get the frame buffer address for the graphics board, and then they continue running exactly the same code they use for the DOS-only games. Not an accomplishment Microsoft should be proud of.

So far, other than the two games Microsoft has published (HellBender and Monster Truck Madness), I’ve only seen one other Christmas ’96 title (HyperBlade) shipping with D3D support.

Sure, there are many companies which have promised to ship D3D games, but where are they? Running on DOS, with hardware specific 3D support. Long live DOS.

However, it is heartening to learn, for Microsoft’s sake, that VRML browser and tool developers seem to be hopping on the D3D bandwagon with total abandon. What this tells me is that D3D is good enough for applications which run at a few frames per second (fps), but not ones that require 15+ fps. I’m sure this isn’t Microsoft’s intention, but it is what their attempt to dominate PC 3D graphics APIs has wrought.

Windows/CE – Hands-On

As a member of the press, I had access to a loaner Casio Cassiopeia Windows/CE handheld computer during COMDEX. It was a great way to get hands-on experience with Windows/CE.

Let me provide a quick review in terms of pros and cons:

Pros:
  • Windows 95 GUI. While I’m still not a big fan of Windows 95, I must admit that its user interface has grown on me. I really like the task bar (when none of the applications have hung or crashed). Windows/CE offers the same look and feel, and made it very easy to start using the Cassiopeia.
  • Touch screen. All Windows/CE devices have a touch screen and a stylus you should use to select stuff via the touch screen. I found the touch screen easy to use, even with my large, meaty fingers.
  • Convenience. You can turn off a Windows/CE machine at any point, and turn it back on again and resume where you left off, pretty much instantly. While some notebook computers do this, you can do it one-handed with a Windows/CE machine, which just happens to fit in a jacket, or, if you’re daring, in your back pocket.
  • Backlighting. Don’t even bother looking at any personal organizer, PDA, PC companion, or handheld computer if it doesn’t have backlighting, which the Casio Cassiopeia does.
  • Keyboard. For such a small device, the keyboard is surprisingly usable. That’s especially important for those of us whose handwriting is beyond the recognition of devices like the Pilot or Newton.
  • Applications. It’s nice to be able to use a reasonable word processor, spreadsheet, task manager, and calendar program on a handheld PC, and even better when you can easily move data back and forth between your desktop PC and the handheld. Windows/CE provides all this (although not without flaws; see Cons).
  • Size. While still a little larger and heavier than I would ideally like, the Casio Cassiopeia and other Windows/CE devices I checked out seemed to pack a lot into a little package. Under a pound, and it fits into a pocket.

Cons:
  • Performance. I was disappointed to see an hourglass appear whenever I pulled up or switched to a new application on the Casio. It reminded me of the way Windows 95 brings my powerful desktop system to a crawl. Everything is in RAM and ROM, with no disk access required. So why is it so slow? Because everything’s been written in horribly inefficient, but portable, C++ code. A faster processor probably wouldn’t hurt either.
  • RAM/ROM. Bill Gates made a big point about how technology has only recently come to the point where Windows/CE was affordable. I beg to differ. Windows/CE and its related applications could easily run on half the RAM and ROM a base system ships with if only Microsoft’s engineers had been willing to write space efficient code in a lower level language than C++. But it’s not Microsoft’s place in life to compromise…
  • Battery Life. I would have been willing to overlook all the other flaws of a Windows/CE handheld device (and buy one at the special press price I was offered) if it had decent battery life. But after an “on” time of only 5 hours (on AA batteries), and upon inserting a standard PCMCIA modem card, I got a warning that I should switch to AC power (an additional costly option on the Casio Cassiopeia) immediately if I wanted to use the modem card. For a device that promotes communications freedom, tying yourself to an AC outlet (along with the adapter you need) is plain ridiculous. My goal had been to plug the modem into my cell phone so I could check e-mail from wherever I happened to be – in a car, airport, hotel lobby, trade show floor, etc. – but it wasn’t going to happen with this device. By the way, average “on” time for current Windows/CE devices using AA batteries appears to be less than 10 hours.
  • Price. I think $499 for a basic model is still a bit steep. But, compared to a Sharp Zaurus, it isn’t all that bad, I guess. However, compared to the immensely popular Pilot from U.S. Robotics, it’s a lot of money. Still, it’s more than I’m willing to pay.
  • Synchronization. When it works, it works great. When it doesn’t work, watch out. I tried to transfer my schedule from my PC notebook to the Cassiopeia right after I got it, and got nailed by a “feature” in Microsoft Schedule+ that readjusts all the times when you change your machine’s time zone. This resulted in all my COMDEX appointments being listed three hours off on the handheld. Windows/CE provides no way to brute-force an update from the PC – everything always has to be synchronized and play nice. The only way to start over is to remove the batteries from the Windows/CE device (and lose everything in memory), and then reload it. That’s about a 20-30 minute, annoying effort. Argh.
  • IrDA. For those of you who don’t know IrDA, it’s a fancy name for the infrared port most notebooks ship with these days. Well, Windows/CE devices have IrDA as well, but it appears you can’t use it to transfer data between a notebook and a handheld. Its use seems to be limited to communicating solely between Windows/CE handhelds. That’s very limiting.
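
The Schedule+ time-zone snafu above is easy to reproduce in miniature: an appointment stored as an absolute instant gets redrawn on local wall-clock time the moment the machine’s zone changes. A hedged sketch, assuming an East Coast home machine synced while in Las Vegas (the three-hour gap and the specific zone names are my illustration, not anything from Schedule+ itself):

```python
# How a time-zone change skews appointments stored as absolute instants:
# a 10 am Eastern meeting re-renders as 7 am once the machine is Pacific.
from datetime import datetime
from zoneinfo import ZoneInfo

appointment = datetime(1996, 11, 18, 10, 0, tzinfo=ZoneInfo("America/New_York"))

# The very same instant, displayed after switching the machine's zone:
shifted = appointment.astimezone(ZoneInfo("America/Los_Angeles"))
print(shifted.hour)  # → 7: the 10 am meeting now shows three hours "off"
```

The instant itself never changes; only its wall-clock rendering does, which is exactly why every appointment ends up uniformly three hours off.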

Net result? I’d buy a Windows/CE device if I could get it with a much better battery life, better performance, and a lower price. But I wouldn’t buy any of the Windows/CE devices now shipping.

Microsoft Logos
In closing this column, I’d like to remind you all about the threat that Microsoft’s hubris is to our industry, and most others as well, with a current example of the Redmond behemoth’s excesses.

This warning all started with a phone call I got from a concerned graphics hardware company a couple of months ago. They called to complain about what they considered to be unfair practices on Microsoft’s part. In particular, they were complaining about WHQL – the Windows Hardware Qualification Lab. That’s the place where hardware vendors get their Microsoft conformance logos.

These conformance logos are of great interest to large systems OEMs like Gateway 2000, Micron, Compaq, IBM, and countless others, because if all the hardware components of a given PC have the logo, then the PC gets the logo as well, and the systems manufacturer gets a nice fat discount from Microsoft on a Windows 95 or NT bundle. Microsoft justifies this discount by stating that a conforming PC lowers Microsoft’s support costs for that PC, savings which then get passed on to the PC manufacturer. However, it is the OEMs, not Microsoft, that are responsible for supporting the PC’s users. In fact, Microsoft will send people back to the OEM for free support of OEM versions of Windows.

That doesn’t change the fact that PC manufacturers are requiring graphics hardware companies that want to do business with them to pass WHQL’s certification tests, putting the fate of much graphics hardware in the hands of Microsoft’s WHQL.

Perhaps this wouldn’t be too bad if WHQL used published compliance standards for its testing, and stuck to them. Unfortunately, that is not the case. Accounts from a number of graphics hardware companies I’ve spoken to suggest that WHQL tends to be quite arbitrary in its certification process. For example, if whoever is testing a given board/driver combination at WHQL doesn’t like some aspect of the driver’s user interface, the board and driver are automatically rejected, with a vague explanation. Yet nowhere are these types of user interface requirements clearly documented, leaving hardware companies guessing as to what might pass and what might not.

Even more serious are the sudden changes in direction WHQL has recently made, most notably its public refusal (announced at a vendor forum) to certify any driver which uses GDI bypass, whether or not the user can disable that feature. For years, several leading hardware companies have used this approach to squeeze better performance out of Windows by avoiding sluggish GDI core code, but now, all of a sudden, WHQL considers it illegal. And because a graphics hardware company needs WHQL approval to sell to its biggest customers (the OEMs), such companies have no choice but to follow where Microsoft leads.

What Microsoft is doing here is nothing short of forcing conformance to ever-tightening (and frequently undocumented) standards, leaving graphics hardware manufacturers, especially those using similar silicon, with little room for product differentiation. A homogeneous sea of sameness in the graphics hardware market will not benefit consumers.

It’ll get even worse when hardware companies are required to adhere to the PC 97 specification to get their conformance logo. Microsoft bringing some order to the universe of driver-of-the-week, benchmark-tweaked, non-regression-tested code is probably a good thing, but in an intensely competitive market, where software differentiation is the name of the game, the WHQL process is too heavy-handed.

What are your options? Band together and refuse to get any of your products WHQL certified. Will this happen? No, because someone will always be trying to cozy up to Microsoft to gain the upper hand over competitors, regardless of the long-term outcome of such behavior.

In Closing
Now that I’ve gotten my final anti-Microsoft tirade out of my system (and hopefully into yours), let me say adieu.

It’s been fun. Keep in touch.

The Computerization of Television

Tuesday, November 26th, 1996

(This column first appeared in the November 12, 1996 issue of PC Graphics Report)

As I write this column, I’m sitting in front of my 35-inch television, watching typical Sunday afternoon mindless, but nevertheless entertaining, TV drivel.

That brings to mind something that’s been bugging me for a while. Earlier this year, I had a chance to play with Gateway 2000’s Destination system – the first PC to integrate a Wintel machine with a television. It was interesting. Interesting enough to others, apparently, that several major PC manufacturers are coming out with similar TV/PC combo systems — several of which will be on display at COMDEX next week.

These new products are all part of an on-going trend of convergence of computer and television technology. Back in 1993, in my first Richter Scale column for PCGR (see PCGR, 16 November 1993, p.750), I reported on the state of interactive television (IT). IT was the original concept that the term “convergence” was applied to. However, thanks mostly to the Internet, IT in its original form has pretty much gone by the wayside, to be replaced with PCs and NCs connected to the Internet and TVs in one way or another. Which brings us back to the Destination and its newer brethren.

Convergence is the new holy grail, as companies ranging from Intel, Toshiba, and Compaq, to Sony, Zenith, and Sega will all attest to — if not by words, then at least by actions. While convergence happy companies are churning out products, technologies, and standards at a feverish pace, there appears to be great disparity in their offerings and implementations. Everyone knows that convergence offers enormous market potential, but no one knows exactly how or where. The only common factors they all seem to agree on are that convergence involves the Internet, a CPU, and a display.

Just as implementations vary, so do the end results. Some companies feel that the TV is going to evolve as an extension of a PC. Others see convergence as a means to guarantee on-going use of specific computer components. Still others see computers evolving into a new type of A/V component, in the same vein as a VCR. Many of these companies have done market research on what consumers are looking for in a combined computer and TV, and it appears from the results that they are getting a lot of conflicting answers.

Let’s take a look at a few of the different implementations of this so called convergence:

The PC as TV Tuner
The Gateway 2000 Destination is an excellent example of the “PC as TV Tuner” concept. The Destination features a 31-inch monitor, which acts as the system’s PC display. The Destination is a PC first and foremost, and a TV second. This results in a powerful PC limited by a measly 640×480 display (I’d expect at least 1280×1024 on a 31-inch display), and a TV with artifacts resulting from a perpetually digitized image.

As best as I can figure, the Destination is targeted at the same people who buy A/V components based on which one has the largest remote control. In this case, the remote is an entire PC, along with the heat and power consumption of a PC, and all the pitfalls of a PC. Imagine not being able to watch TV because it won’t boot. (I wrote a humor piece on this very topic a couple of years ago – you can find it on my new Richter Scale web site.)

Another limitation of a system like the Destination is that not everyone wants a 31-inch TV. Some folks want 20-inch TVs, while others want 35-inch or larger sets. Gateway would have been better off selling the PC component of the Destination as a standalone item, letting consumers buy their own TV. However, that points out another flaw of the current state of convergence – today’s TVs can’t even clearly display the paltry 640×480 resolution the Destination’s PC requires.

Perhaps the biggest drawback I see in Destination-like systems is that the focus is on the PC and not the TV. I know that when I sit down on my living room couch to watch TV, I’m there to vegetate. I do not want to have to boot my TV. I do not want to have to enable TV mode. I want to just turn it on, surf some channels, and relax. However, if I want to pull up a Web page, switch over to a video game, or do something my TV can’t currently do by itself, I’d like the ability to enable the required device, change my TV’s input source (or, better yet, use my Picture-in-Picture (PIP) feature), and get the results I’m looking for. This still means that my TV is the center of my living room universe, and not a PC.

The PC as TV Companion
A more plausible approach towards convergence is a PC acting as companion to a TV, which is the pitch Intel is making with its Intercast technology. Intel is trying to get users to plug their computers into their TV transmission stream (cable, DSS [Digital Satellite System], antenna, etc.) so that at the same time they are watching TV, their computers can be pulling up all sorts of viewing program specific information from the Internet. Examples Intel provides include real time sport statistics during a football game, up to the minute news coverage, and most importantly, automatic downloads of information from advertisers.

As with everything Intel does, Intercast stresses the requirement for a PC, preferably one with an Intel Pentium or better processor, and lots of hard disk space to handle all the downloads of excess information from various networks and advertisers. The need for a hard disk to cache such information, as well as the implied requirement of a Pentium-grade CPU, is what Intel claims will prevent Network Computers from being able to support Intercast properly. In my mind, that’s a great argument in favor of an NC.

The Intercast approach is compatible with both the Destination model of convergence, as well as that of the PC as an A/V component (see below). The PC can also be used completely independently from the TV. This approach allows for greater flexibility in terms of PC usage and integration with the TV.

However, I’m still not sold on Intercast itself — I have absolutely no interest in having my hard disk filled with advertisements. It’s bad enough that I get a bunch of unsolicited spam via e-mail each week.

A/V Component
The direction I’m most comfortable with convergence taking is in making the computing device just another A/V component of the entire entertainment experience. The TV acts as the main output device, allowing you to switch between the regular TV transmission stream, and specialized A/V devices like VCRs, video disc players, video game consoles, and now, Web enabled devices.

The latter category includes things like the Web TV-branded units from Sony and Philips, rumored DVD/Web machines, the Sega Saturn with NetLink, and a number of other products in the development pipe. In terms of volumes, International Data Corp. (IDC) predicts that these types of devices are going to outsell all other types of Internet-enabled non-PC devices. Sales in 1997 should hit 1.2 million devices, and 4.6 million devices in 1998. Note that IDC puts set-tops in a separate category, so those units are not included in these figures.

As I pointed out above, PCs can be A/V components, as long as you’re not having the PC act as the device controlling what you see on your TV. There’s nothing quite like playing a first person game like Duke Nukem 3D or Hexen on a large screen TV, with full surround sound blaring around you.

Set-top Box
I’m personally not sure where my designation of A/V component ends, and the designation of a set-top box starts, in terms of convergence, but let’s assume for the sake of discussion that the difference is that the set-top box combines the management of the TV signal feed with Internet access. Simply said, you take your existing cable or DSS box, and add Internet access capabilities to it. Outside of some cable trials, I haven’t come across any such devices.

These devices are the progeny of yesteryear’s concept of Interactive Television. Since set-tops are generally provided by cable companies, consumer access to these devices is dictated less by consumer demand than by individual cable operators. Nevertheless, IDC expects 160,000 Internet-enabled set-tops to be sold by the end of this year.

I think the set-top box convergence solution is also quite reasonable, as it leaves consumers with an existing paradigm (the cable box) imbued with new features (Internet surfing). Its success depends heavily on how soon cable companies get these devices into the market – and there’s some question as to how willing they are to do so, since they cost a significant chunk of change relative to normal cable boxes, and cable companies normally subsidize such boxes with monthly fees for programming.

TV Integrated
The ultimate type of convergence, one that several TV companies have been working on, is the integration of TV with an Internet compatible computing device — all in a single box. While the aesthetics of an integrated Internet TV are appealing, I would have some concerns about being able to upgrade the Internet access component of the TV to keep up with the on-going revolutionary changes occurring on the Internet. The benefit of external devices is the ability to upgrade them, either piece by piece or by complete replacement.

It is interesting to note that in this area the U.S. is woefully behind the times. Our TVs have only recently started offering closed captioning (one form of additional data feed on a TV signal) as a standard feature, and then only because of an act of Congress.

In Europe, meanwhile, most TVs have offered Teletext for many years. Teletext is a perpetual data feed carried in the Vertical Blanking Interval (VBI) between image frames of a TV signal. The feed sends a new page of text information with each frame, and because every page repeats every few seconds, the TV needs no serious data storage. For example, when visiting my brother in Germany, I was able to check out the ski conditions at Val D’Isere in France by enabling the Teletext menu on his TV for a popular German TV channel, entering the page number for the menu of ski conditions, waiting a couple of seconds for the right page to come around, and then selecting the page number for Val D’Isere. The wait was no longer than waiting for today’s Web pages. Teletext even offers simple graphics.
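That page-carousel mechanism can be sketched in a few lines of Python. This is strictly a toy simulation – the page numbers and contents below are invented, and a real broadcast carousel carries hundreds of pages – but it shows why the receiver needs no storage: it simply waits for the wanted page to cycle past.

```python
import itertools

# Toy Teletext carousel (page numbers and contents invented for illustration).
# The broadcaster repeats every page in a fixed cycle, one page per video
# frame, so the receiver needs no storage -- it just waits for its page.
CAROUSEL = {
    100: "Index of services",
    300: "Sport",
    301: "Ski conditions: menu",
    302: "Ski conditions: Val D'Isere",
}

def await_page(wanted, frames_per_second=25):
    """Return (seconds waited, content) once page `wanted` cycles past."""
    for frame, (number, content) in enumerate(
            itertools.cycle(sorted(CAROUSEL.items()))):
        if number == wanted:
            return frame / frames_per_second, content

delay, text = await_page(302)
print(f"waited {delay:.2f}s -> {text}")
```

With only four pages the wait is a fraction of a second; with a realistic carousel of hundreds of pages, the same logic yields the couple-of-seconds wait described above.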

But Teletext offers no bidirectional data traffic – it’s purely a polling system. However, each channel of TV programming offers its own Teletext feed, which greatly expands the content options. Even better, on channels with fixed programming (such as a specific movie or series), you could use the Teletext interface (also built into German VCRs) to automatically record a program selected off a Teletext screen – the VCR would even self-correct the recording time if the program was moved. This was three years ago. Heck, we still can’t do that here.

So why has Teletext worked so well in Germany and elsewhere? I’m not sure whether it’s government mandated, but because the technology has been built into TVs and VCRs en masse, at basically no extra charge, it gets used and widely adopted. Can the same thing work here with the Internet? Maybe, if the price is right or TV sets are required to adhere to some basic Internet standard. Sure would be a lot better than that foolish V-chip requirement for content censoring.

But I digress…

The Meaning of TV
The reason I wrote this column is that I still sense that what many computer companies are missing in terms of convergence is that TV is part of a culture built around cheap, usually mindless entertainment. Expecting anything more of consumers (myself included) is a matter of self-delusion on the part of these companies. We want to be entertained in front of a TV – we don’t want to have to work or think much, other than during Jeopardy or a good mystery, perhaps. And some of the proposed convergence solutions aren’t taking that into account. So, if you’re in a company trying to target the TV viewing public, follow the lead of a consumer marketing and entertainment expert like Sony, and not the lead of a computer, chip, or software company.

So Long, For Now…
Next week’s COMDEX may be my last one for some time. Following COMDEX, I’m going to be taking a couple of months’ hiatus from the PC graphics industry while I try to figure out what I’m going to do for the next decade or two of my life.

I’ve been directly involved with the personal computer industry since 1979 – almost 18 years – and to be honest, I’m feeling a little burnt out. And thanks to the examples Microsoft and others have set for the rest of the computer industry in terms of never-ending product announcements, “standards”, and strategic redirection, I have recently started feeling rather overwhelmed by the sheer volume of knowledge I need to digest to stay ahead of the pack.

At this point in my life, I think I want to concentrate on raising my family (a 17-month-old daughter, and a new baby due in April), becoming a more rounded human being (mentally and spiritually – I’m already too round physically), and giving something back to our environment and society (stay tuned).

I’m not sure where my upcoming introspection will lead, and I’m not sure if I’ll be back in these pages as a regular contributor after the two month sabbatical I’m taking. However, many of you have made a profound difference in my life, and I hope many of you will continue to stay in touch, because I certainly intend to do so.

Regardless of what path(s) I decide to take, you’ll be able to track my on-going rants, musings, and activities through my two Web sites: Stroke of Color and The Richter Scale.

Your friend,


Streaming Data and Consciousness

Tuesday, October 29th, 1996

(This column first appeared in the October 29, 1996 issue of PC Graphics Report)

Over the last several weeks, I’ve been inundated with a variety of press releases dealing with YASDT – Yet Another Streaming Data Type. I’ll state right off the bat that new Internet data types don’t thrill me. In fact, I despise them. I believe that I already have to deal with more data types than I really need, and don’t like the idea of having to add more to that list. That said, some of the streaming data demonstrations I’ve seen have been impressive.

Streaming Primer

For those of you not familiar with what streaming data is all about, let me give you a small primer.

As you are probably aware, things like graphics, video, and sound take up a lot of data if you want them to convey a reasonable amount of information. When the Web became a popular thing to support and use, people discovered rather quickly that it takes a while to download multi-hundred-kilobyte media files, just to hear someone talk or see a video clip. What made these download experiences extra painful was that they had to complete in order for the media to be accessible.

“Streaming” changed all this, by changing the way data was sent from a Web server and interpreted on the client side of things. The server, with the aid of specialized handshaking, keeps a steady stream of media data flowing to the client, while a special “player” on the client side sucks up the data stream and plays it back on the fly.
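The play-as-it-arrives idea can be sketched as follows. This is a minimal illustration, not any vendor’s protocol – `stream_from` and `play_chunk` are hypothetical names, and real streaming players add handshaking, timing recovery, and loss handling on top of this.

```python
import io

def stream_from(source, play_chunk, chunk_size=4096):
    """Read a file-like source incrementally, handing each chunk to the
    player as it arrives instead of waiting for the whole file."""
    total = 0
    while True:
        chunk = source.read(chunk_size)
        if not chunk:          # server closed the stream
            break
        play_chunk(chunk)      # playback starts with the very first chunk
        total += len(chunk)
    return total

# Toy usage: an in-memory buffer stands in for the network connection.
played = []
total = stream_from(io.BytesIO(b"x" * 10_000), played.append)
print(total, len(played))  # 10000 bytes delivered in 3 chunks
```

The point of the sketch is the loop structure: the player sees data after the first chunk, rather than after the last one.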

Progressive Networks, of Seattle, WA, with its RealAudio technology, was among the first to offer streaming media, initially for sound. Currently, on a 28.8 Kbps link, RealAudio can play full stereo, near-CD quality sound in real-time.

Streaming Business Model

The way Progressive makes money is by selling add-on server software to handle the streaming data. The average cost per RealAudio server stream is $85. The basic client-side player is free, although the company does sell an upgraded client-side player with a few extra bells and whistles. I should also mention that Progressive sells tools for encoding sound and multimedia into the necessary format for streaming.

The server stream pricing model works, but only for the time being. The disadvantage Progressive has (and already knows of) is that their revenue stream depends on an add-on to existing server software. Companies making server software have realized that in order to differentiate themselves from each other, they need to come to market with streaming server technology themselves. And once it’s in the server software, it’s basically free. No more add-ons required.

To Progressive’s credit, they have managed to sign up an impressive list of customers, including ABC, NPR, and dozens of other news and music sites. They also realize that their model, as it currently stands, cannot support them for long, so they have launched a number of initiatives to try to keep a step ahead of the pack.

Streaming Evolution

Apparently, Progressive realized early on that the same technology that allowed audio to be streamed would also allow other data types to be streamed. Thus was born the recently announced RealMedia Architecture (RMA). RMA builds on the streaming technology in RealAudio, but extends the server add-on so that multiple data tracks can be sent in parallel in a stream by the server, while an updated client takes the data tracks and passes them to the appropriate player on a user’s system. The players are part of a new client-side plug-in architecture which allows third parties to develop RMA-compatible players. So far, RMA sports support from companies offering video, MIDI, graphics, and more. According to Pat Boyle, Progressive’s product manager for RealMedia Tools, RMA will ship in December.


On the other front, Progressive, with the help of 40 other companies, including Apple, Autodesk/Kinetix, HP, IBM, SGI, Sun, Macromedia, and Netscape, has announced a standards initiative called the Real Time Streaming Protocol (RTSP) for delivery of real-time media over the Internet. The first draft of the protocol specification, RTSP 1.0, was submitted to the Internet Engineering Task Force (IETF) a few weeks ago. RTSP is basically the handshaking mechanism for streaming.

Progressive will provide a reference implementation of RTSP. Why would Progressive do something that might kill its business?

First, Progressive is attempting to ensure that when a standard is passed (which would ultimately happen with or without their involvement), Progressive’s technology is not outdated and does not have to be revised. In other words, if you propose the standard, you control it, to a certain point. This is part of the typical Microsoft standardization procedure, not surprising when you consider that Progressive’s founders came primarily from Microsoft. Where Progressive differs from Microsoft is that they aren’t dictating the standard to others – instead they are collaborating with others to bring it to fruition.

The second reason Progressive can afford to bring RTSP to completion is because it’s only part of the equation. Anyone wanting to implement RTSP still needs to develop or buy tools to encode data for streaming, and also would need to develop multi-stream players. With the RMA client plug-in player model, and the partnerships announced, Progressive has a serious head start there.

However, the aforementioned server software companies, most notably Microsoft and Netscape, are already on the move.

Microsoft’s NetShow

Microsoft has a new technology in development called NetShow, Beta 2 of which should be out by the time you read this. Microsoft, for the time being, is looking at RTSP, but not supporting it. According to Jim Durkin, Microsoft’s Manager of the Network Multimedia Product Unit, his company will support RTSP if it becomes widely used and adopted by the IETF, but until then is holding the specification under review and will continue working on its own technology, NetShow.

NetShow is a multiple data type streaming and CODEC mechanism, but it’s not being pushed as being as general purpose (in terms of breadth of data types) as RMA. NetShow’s focus is audio, video, URLs, GIFs, and JPEGs, and the server component of the software, once released, will become a standard item in Windows NT Server Edition (it won’t run on the Workstation version of NT) and Microsoft’s Internet Information Server (IIS). The NetShow client will be standard with Microsoft’s Internet Explorer 4.0. Jim Durkin also told me that in current tests, on a 200MHz Pentium Pro system, NetShow has been able to support 1,000 simultaneous 28.8 Kbps streams. Pretty impressive, especially if it’s basically free…
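As a back-of-envelope check (my arithmetic, not a Microsoft figure), those 1,000 simultaneous 28.8 Kbps streams work out to roughly 28.8 Mbps of aggregate outbound bandwidth from that single Pentium Pro server:

```python
# Back-of-envelope aggregate bandwidth for the quoted NetShow test.
streams = 1_000
kbps_per_stream = 28.8

aggregate_mbps = streams * kbps_per_stream / 1_000
print(f"{aggregate_mbps:.1f} Mbps aggregate")  # 28.8 Mbps
```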

Netscape’s MediaServer

Netscape, a public supporter of RTSP, has announced Media Server 1.0, which is supposed to be part of the new version of the company’s SuiteSpot server software when it ships at the end of the year. Media Server, which is supposed to adhere to RTSP, will initially only deal with audio information, but that still provides something that’s almost as good as RealAudio for free (RealAudio offers better quality, based on Netscape’s specifications). Netscape’s big pitch is the cost (or the lack thereof) as well as the ability to add synchronized sound to Java and JavaScript applications resident on a Web site. I would bet that streaming video isn’t far behind, once the company is happy with the performance and stability of the initial release of Media Server.

Macromedia’s Shockwave and Audioactive

I would be remiss if I didn’t include Macromedia’s streaming technology in this overview. Macromedia’s Shockwave technology was perhaps the first major multi-content streaming technology on the market, and with a price of zero for delivery, certainly attractive. Note that Shockwave doesn’t require special servers to stream data – the Shockwave player happens to be able to take data in small chunks and do something with it, which is, in effect, similar to streaming, but without the handshaking and VCR-like controls for server data that Progressive and NetShow offer. I should also point out that Macromedia is not doing Shockwave solely for altruistic motives – Macromedia makes its money from the tools it sells to create Shockwave content.

Now, in just the last week, while at the same time supporting RTSP, Macromedia has teamed up with Telos to produce Audioactive, a direct competitor to RealAudio. Macromedia officials claim that Audioactive is a better deal than RealAudio because Audioactive’s server pricing is per server and not per stream. Macromedia claims that its customers will only pay around $7,000 for between 100 and 200 simultaneous streams vs. $8,500 to $17,000 for a RealAudio server solution (using Pat Boyle’s $85/stream cost figure).
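Using Pat Boyle’s $85-per-stream figure, the comparison works out as follows (a simple recomputation of the numbers quoted above, not independent pricing data):

```python
# Recomputing the RealAudio vs. Audioactive cost comparison from the column.
REALAUDIO_PER_STREAM = 85        # dollars per stream (Pat Boyle's figure)
AUDIOACTIVE_PER_SERVER = 7_000   # dollars per server (Macromedia's claim)

for streams in (100, 200):
    realaudio_cost = streams * REALAUDIO_PER_STREAM
    print(f"{streams} streams: RealAudio ${realaudio_cost:,} "
          f"vs. Audioactive ${AUDIOACTIVE_PER_SERVER:,}")
```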

I personally think it’s a bit late for Macromedia to be getting into the audio stream server market at a time when Progressive is moving past that type of technology, and Netscape and Microsoft are offering the same technology pretty much for free.

Down Stream

I think decent, widespread, and affordable streaming technology and tools are second in importance only to micro-payments on the Internet. Without good streaming technology, the Internet can’t be used as the multimedia backbone it is destined to become.

While it may not be obvious, the success of the Internet (and therefore streaming technology) is important to those of us in the PC graphics industry – just think of all the bundling, support, and acceleration opportunities available for various streaming data types like audio, video, MIDI, and 3D.