(This column first appeared in the March 28, 1995 issue of PC Graphics Report)
Two years ago, I left WinHEC ’93 (Windows Hardware Engineering Conference) vowing not to return as long as we attendees were the ones paying to hear the presenters’ advermercials. I skipped WinHEC ’94, and felt remorse after numerous people asked where I had been that week, since they hadn’t seen me. So this year I decided to give the conference another shot, and I must admit I actually learned something at WinHEC ’95, although it certainly wasn’t what I expected.
The first thing I learned is that the presenters had nothing new to present, and while the commercialism had certainly been toned down from a couple of years ago, it was still present – heavily in some sessions. As soon as any presenter started talking about anything remotely interesting, one of three things got inserted into the presentation, followed by a subject change:
1) We were referred to the Computer Game Developer’s Conference in Santa Clara as being the place where all would be revealed. Oh joy! Another conference we can spend more money at (and probably be referred to SIGGRAPH for the unveiling of the real information). Never mind that CGDC overlaps completely with Spring COMDEX… That, combined with the fact that VESA is apparently not having its Spring COMDEX meeting, has caused me to cancel my COMDEX plans and come out to Santa Clara for CGDC instead, and I’d recommend anyone looking to sell their graphics hardware to users of the next killer app (it’ll be a game) bag COMDEX and go to CGDC as well. CGDC can be reached at 415-948-2432, FAX: 415-948-2744. The conference fee is $650, and make sure to stay through Wednesday, since that’s the day Microsoft is presenting its Game SDK. Call now, as space is really limited.
2) We were referred to a number to call to register to receive a copy of the specification, or to obtain more information on the topic. I guess what we paid for the conference wasn’t enough to entitle us to the information while we were actually at WinHEC.
3) We were referred to an e-mail address for the same purpose as listed in #2 above.
Additionally, while Microsoft spoke frequently about the new Windows 95 M8-Beta release that had gone out the prior week, that was not a CD-ROM we found in our albeit very chic and sturdy WinHEC bags. One good thing those bags did contain was the complete set of slides from all the presentations, in both hardcopy and on CD-ROM. That works out to about $55/pound (rough estimate) of material (not including the food we were fed) for the WinHEC conference fee. One real bonus, I’ve been informed, is that the WinHEC CD contains a DOC file with the 3D-DDI specification – a real boon for all of us who have been attempting to print spurious 3D-DDI related Help pages out of the new DDK.HLP file located on the WinSpeed NT v3.51 Beta CD.
Do It Their Way
Another thing I observed during WinHEC was how hard certain companies were trying to convince attendees that their world view was a do-or-die proposition. It seems that what used to be called “evangelism” has now become the issuance of commandments, with the fervor of technological religion behind it. Since this was a Microsoft conference, I guess it’s understandable that Microsoft was one of the companies suffering from hubris, dictating to the hardware industry what form, shape, and capabilities their new products must take in order to be worthy (of a logo). The other company pushing a new religion appeared to be Intel, with its “new” NSP technology.
NSP, or Native Signal Processing (or Numerous Sales of Pentiums), is an amazing marketing ploy, exceeding even the peddling of the DX4 label for a clock-TRIPLED CPU. For those of you who’ve not been reading your weekly industry journals, NSP is designed to replace all your custom function silicon with software running on a Pentium driving a generic device. For example, the NSP audio solution is to generate sound by spending the Pentium’s processing cycles driving a simple audio DAC, instead of having dedicated sound silicon or DSPs. The same applies to codecs, with Intel stating that a hardware decompression engine is no longer necessary, and I even overheard someone seriously suggest that graphics hardware doesn’t need acceleration hardware anymore – a dumb frame buffer along with Pentium processing power will give users all the performance they need.
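The NSP audio pitch boils down to this: the host CPU computes the waveform samples that a dedicated sound chip would otherwise produce, and the hardware is reduced to a dumb DAC. A minimal sketch of that idea (Python used purely for illustration; the sample rate and function names here are my own, not Intel’s):

```python
import math

SAMPLE_RATE = 22050  # a typical mid-90s playback rate, chosen for illustration

def synthesize_tone(freq_hz, duration_s, amplitude=0.5):
    """Compute PCM samples entirely on the host CPU -- the NSP notion
    that the processor does the work a sound chip used to do."""
    n = int(SAMPLE_RATE * duration_s)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)
            for i in range(n)]

# 10 ms of A440; in an NSP world these samples would be streamed to a DAC.
samples = synthesize_tone(440.0, 0.01)
print(len(samples))  # -> 220
```

The catch, of course, is that every one of those samples costs CPU cycles that your applications no longer get.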
Now, with a little introspection, it seems obvious to me that what Intel is doing here is trying to justify to users everywhere that they absolutely need a Pentium processor, and basically nothing else in the form of advanced peripherals. If NSP is successful, it will certainly increase Pentium sales, as well as decrease sales of new audio and video hardware. But what happens when all those other applications and latest generation operating systems use the extra bandwidth a Pentium provides? The answer is simple – it’s called a P6. Now, Intel is claiming that the P6 is not meant for desktop use. Yeah, right. How many times have we heard this before? I recall that the IBM PC/AT (based on an 8 MHz 286) was supposed to be solely a multi-user system, and similar rhetoric preceded the 386, 486, and Pentium. Overall, NSP is a real clever strategy. It increases Pentium sales, decreases the value of competitors’ 486 technologies, diminishes sales of intelligent peripherals, and will lead to increased desire for P6-based desktops, because much beyond next year the Pentium will not be able to supply the bandwidth needed to run bulkier, slower 32-bit applications and provide smooth NSP support in parallel. The thing that saddens me most is that NSP will probably be successful, even though people will be paying more to get less. As several fellow WinHEC attendees put it: “Gosh, I can spend $1000 more for a Pentium system, or just a few hundred dollars for good peripherals for my 486 PC. I guess the larger expense must be better, right?”
Thinking back to the early days of PCs, one of the main reasons a PC was slower than a minicomputer was because it used the then-equivalent of NSP, but not by choice. Minicomputers offered much better overall performance because they used intelligent peripherals to perform distributed processing operations. PCs have finally gotten to the point where they have this distributed processing ability, in the form of caching controllers, great display processors, new caching DSP audio technology, and more, and Intel says it’s not important. Go figure.
Where Intel should be focusing NSP technology is on areas where hardware is currently insufficient, as they appear to be doing with 3DR, their 3D interface. 3DR, contrary to rumor, is not dead, but doing surprisingly well, by the way.
While at WinHEC, I had the opportunity to tag along on a meeting to discuss the development of yet another benchmark, this one dealing with multimedia performance. While a small firm from Washington was marketing such a benchmark, the initiator of this meeting, Ron Wilson, Senior Technology Editor of Electronic Engineering Times, appeared to be looking for something more industry-wide, and not sponsored by a single, potentially biased, entity. The official goal of the new group formed by this initial meeting is to establish a mechanism for comparing multimedia systems, and Windows 95 appears to be the first target platform. Participation is open to all industry members, and is strongly encouraged. In the words of one attendee, “if we don’t do something and do it soon, we’ll be doomed to have to live with whatever Ziff-Davis whips up…”. To get more information and provide input on what you’d like to see in a Multimedia Performance Analysis Tool, contact Ron Wilson at EE Times – email@example.com, Voice: 415-525-4498, FAX: 415-525-4406.
Perhaps the most exciting things at WinHEC could be found in the exhibit area, both in exhibitors’ booths and in conversations with other attendees.
Chances are that you’ve already read about Martin Marietta’s Real 3D unveiling at WinHEC in these pages, as well as a few other goodies, but here are a few things I bet you did not read about:
SafePlay by Vired, Inc.
Vired, a new company based in Waco, Texas, was offering a unique new anti-virus tool for Windows 95, called SafePlay. What makes this tool extra special is that it’s the first one being offered to counter AutoPlay-based viruses. For those of you not up on the latest Windows 95 offerings, AutoPlay is a feature that allows CD-ROMs to be automatically run upon being inserted into the system’s CD-ROM drive, without any user action (other than inserting the disc). The Windows 95 AutoPlay feature, combined with the ever-plummeting prices of writable CD-ROM technology, opens the door to a whole new range of CD-ROM based viruses, and prior to SafePlay, there’s been no way to protect against them. SafePlay will ship around the same time that Windows 95 finally ships, with a suggested retail price of $69.
RemoteOSBoot by NTTS, Inc.
Apparently, an increasing number of OS/2 and Windows 95 Beta users have been griping about the length of the boot process for these advanced operating systems (commonly a minimum of several minutes on normal systems, i.e. 486/DX2 machines). NTTS (which apparently stands for “No Time To Spare”) previewed its new RemoteOSBoot hardware product to eliminate the boot wait. RemoteOSBoot is a device that plugs into a phone jack and power outlet, and has a regular three-prong electrical outlet on top. This is where you plug in your system’s power strip. Once it’s installed, you can just call the RemoteOSBoot device, from your car phone perhaps, while on your way to the office, and when you get there, your computer will be booted and waiting to serve you. An advanced model of the device also incorporates the same sort of remote sensor found in the stationary part of a garage door opener, so you can use a remote control with a range of 500′ (through concrete, no less) to remotely boot your system. End user pricing was unavailable, but NTTS indicated that OEM quantities would place the phone-only device at under $100, and implied that it believed RemoteOSBoot would be really well accepted by large PC companies.
The Bill Action Figure
Perhaps the most bizarre item being promoted (albeit without much fanfare) could be found at the Microsoft Store (up on the upper level where they were selling Microsoft t-shirts and other stuff). It was the Bill Gates Action Figure, allegedly anatomically correct (but no one would verify that, and I certainly wasn’t going to check). The Bill figure has posable arms and legs, several different sweaters and Dockers, and also comes with a pair of Bill glasses to finish off that Bill Gates look. The woman selling the doll told me that WinHEC was the first time these action figures have been offered to non-Microsoft staff, but that all senior Microsoft executives have them in their offices to remind them of who their higher authority really is. I couldn’t bring myself to spend the $49 for the Bill doll, but if you call Microsoft, they should be able to sell you one. Someone I was with when we saw the Bill doll suggested that the figure should come with a set of voodoo pins. I found the whole thing very strange, to say the least.
Unplugger, by Vired
Vired also had another interesting Windows 95 utility on hand at WinHEC, namely Unplugger. Apparently, current Plug and Play (PnP) support has some interesting side effects on certain peripherals. With some PnP modem boards, Windows 95 likes to reconfigure them for different COM port addresses, even when the user tries to override the settings to a permanent location. The net result is that non-PnP aware software (a label that applies to virtually all software in the world) doesn’t know that the modem it’s been configured to use can’t be found where it expects it to be. Similar to how MacOS ties individual files to certain applications, Unplugger ties applications to certain devices. Then when the application is run, it locates the specific devices the application has been associated with and fakes out the application, making it see the device where it thinks it should be, instead of where the PnP code has placed it. A bonus feature of Unplugger also aids in the support of devices that have their BIOSes in non-volatile RAM. Under Windows 95 some devices are reconfigured to not run initialization code from ROM, since Windows 95 provides a superset of that initialization. This works great for as long as you continue using only Windows 95. However, if you boot into DOS, the device no longer initializes, since its ROM code has been disabled. Big time disappointment. Unplugger has a DOS utility which fixes this problem too. (I asked someone at Microsoft about this problem, and his reply was “Why would you want to boot anything other than Windows 95?”. Sigh.)
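The application-to-device binding Unplugger is described as performing amounts to a lookup table consulted whenever a tracked application opens the port it was configured for. A toy sketch of that idea (the application names and port assignments below are hypothetical, purely for illustration):

```python
# Hypothetical bindings: application -> (port the app expects,
# port where Plug and Play actually put the modem).
bindings = {
    "FAXAPP.EXE":   ("COM2", "COM4"),
    "TERMAPP.EXE":  ("COM1", "COM1"),
}

def resolve_port(app, requested_port):
    """When a tracked app opens the port it expects, answer with the
    port the device actually landed on; otherwise pass through."""
    expected, actual = bindings.get(app, (requested_port, requested_port))
    if requested_port == expected:
        return actual           # redirect to where the modem really is
    return requested_port       # not a tracked device; leave it alone

print(resolve_port("FAXAPP.EXE", "COM2"))  # -> COM4
```

The real product would presumably do this interception at the driver level; the table lookup is the part being illustrated.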
NU-RAM, by Speicher GmbH
Up until I ran across Speicher (pronounced Shpy-kher), I was not aware that there were any silicon manufacturers in Germany. Started by a pair of former East German scientists with West German VC funding, Speicher staff hinted at a revolutionary breakthrough in RAM technology. Now, I’m still regularly confused by all the various new RAM acronyms, so I’ll try to relate as much information as I was able to glean from Gerhardt Kraftig, a technical marketing manager at Speicher.
A few key features before I describe how this patent pending technology works. First, NU-RAM, based loosely around a VRAM core, promises to cost only one-third the price of traditional VRAM. This makes NU-RAM just about the least expensive graphics RAM technology on the market. Bandwidth is comparable with that of normal VRAM. The chip interface is close enough to that of VRAM that using NU-RAM should not require silicon changes for graphics chip companies. The only real downside is that NU-RAM is targeted only at video and game applications, and can’t be easily integrated into more general purpose applications, such as system RAM.
NU-RAM stands for Neural (as in Network) RAM. The really neat thing about NU-RAM is that it uses a new form of fuzzy logic, blended with something akin to the branch prediction technology of the Pentium and P6 processors, to predict graphical information as it’s being shifted out to the display, in order to come up with a visually accurate display. The prediction technology works by using internal pattern storage to build probability tables of how surrounding pixels relate to an interior pixel value, based on the complete data passed in by the controller. Such data is perpetually being sampled in order to refine the probability tables. Where the cost savings comes in is that NU-RAM really only has one-quarter the amount of RAM a traditional RAM architecture needs, with the missing RAM emulated via the fuzzy logic generated probability tables.
The RAM is laid out in a matrix format, with the fuzzy logic looking at an array of 8×8 bytes at a time. This byte granularity means that 8 bit per pixel modes must operate in a 3:3:2 RGB or YUV format instead of being palettized. Direct color and true color modes of operation are also viable for use with NU-RAM. I went to a private suite and saw a digital video playback demo based on NU-RAM technology, and it was comparable with the best I had seen on the WinHEC show floor. The prediction logic is ideally suited for video and game graphics, since these output forms require decent image representation, but without individual pixel accuracy. On the other hand, NU-RAM is not suited to DTP, CAD, and “non-regional” graphics applications. It’ll be interesting to see what the market does with NU-RAM.
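The prediction scheme as Kraftig described it can be reduced to a toy model: record how often each pixel value co-occurs with a given neighborhood, then, for an unstored pixel, emit the most probable value seen for that neighborhood (the class structure and fallback behavior here are my own illustration, not Speicher’s design):

```python
from collections import defaultdict

class NeighborPredictor:
    """Toy model of NU-RAM's idea: learn a probability table relating a
    pixel's neighborhood (here reduced to a neighbor average) to the
    pixel value itself, then predict unstored pixels from the table."""

    def __init__(self):
        # neighbor average -> {observed pixel value: occurrence count}
        self.table = defaultdict(lambda: defaultdict(int))

    def train(self, neighbor_avg, actual_pixel):
        self.table[neighbor_avg][actual_pixel] += 1

    def predict(self, neighbor_avg):
        seen = self.table.get(neighbor_avg)
        if not seen:
            return neighbor_avg          # nothing learned: use the average
        return max(seen, key=seen.get)   # most probable value observed

p = NeighborPredictor()
p.train(128, 130); p.train(128, 130); p.train(128, 120)
print(p.predict(128))  # -> 130 (seen twice, beats 120 seen once)
print(p.predict(64))   # -> 64  (unseen neighborhood falls back)
```

Visually plausible rather than pixel-exact, which is exactly why the approach suits video and games but not CAD or DTP.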
At WinHEC, as before, politics and private agendas ran rampant, and the real information to be had was not in the sessions. And even though next year will be more of the same, I’ll probably still feel like I need to attend, just like everyone else, in the hopes that we might actually learn something. As a final note, I should point out that the five new products/technologies I mentioned above will all, by some odd coincidence, be officially announced to the market on April 1, 1995. Hmmmm.