Saturday, September 29, 2012

The End of Broadcast TV

Broadcasters as an Impediment
Color TV was introduced to the public 61 years ago and was immediately withdrawn from the market. Though nominally the withdrawal was to conserve resources during the Korean War, in truth it was a technology dispute between the broadcasters. The original CBS color scheme involved a monochrome CRT and a color wheel generating field-sequential color. RCA, the owner of NBC, was developing a different color technology, the familiar shadow mask with spatially integrated color. The dispute delayed the implementation of color TV for over a decade, and it was not until the mid-1960s that the networks were all broadcasting in color in prime time.

Twenty-five years later, analog HDTV had been implemented in Japan and the US was planning to convert to HDTV as well. The conversion meant that all of the broadcast stations would have to upgrade their equipment for a questionable return on investment: the upgraded signal, and the consequent investment, did not allow them to charge more for their air time. At the same time, some of the remaining US consumer electronics companies, not wanting to cede the rest of the market to Japan, proposed a digital HDTV format. The digital format was chosen in large part because it delayed the implementation of HDTV. To sweeten the deal for broadcasters, the US government decided on an implementation scheme that gave them additional, valuable spectrum.

Long before the implementation of HDTV, with the growth of cable, over-the-air broadcast ceased being important to consumers as a content source. However, the TV market being what it was, the “rabbit ears” connection on the back of the TV set remained until the transition to HDTV. In truth, the vast majority of consumers needed neither the rabbit-ears connection nor the tuner; it was kept on the platform to preserve incremental market share.

Value of Spectrum
In the early growth period for cellular phones, it was common for cell phone companies to buy local taxi-cab operations, not to get around town but for the spectrum the cab companies held for communicating with their cabs. Once the cellular networks were in place, the cab drivers could use cell phones as easily as their old radios; the purchase of cab companies just to acquire their allocated spectrum proved a windfall for the previous owners.

As wireless communications have grown, radio spectrum has become more and more valuable. Now, it seems, the government has decided that the TV spectrum is too valuable to leave in service of little-used broadcast TV reception and has begun the process of buying out the broadcast stations. The stations do not actually own the spectrum, but as with all government transfer payments, the status quo is something of a contract with those that benefit from the current policy, and the government will not change the policy without buying out the incumbents.

Implications
As with the transition to HDTV, when NTSC-to-HDTV converters were provided at a subsidized rate or free, no doubt consumers that are dependent on broadcast will get some help from the government. Possibly, in addition to subsidizing cable or satellite access, it might also subsidize internet access, thereby preserving access to free news and other content. The major networks and broadcasters might suffer some decline in viewership as their captive audience goes away. However, their profit from the sale of their spectrum should more than cover the loss.

For the CE market, a major encumbrance on TV set innovation is removed. The current HDTV format was designed in large part to accommodate broadcast TV. The removal of broadcasters from the picture will enable some experimentation in formats; Vizio is already doing this with its Cinema-Wide product. As with 3D, even further control of TV formatting might fall to the movie industry once the broadcasters are gone. Better sound might also move from option to standard. And… as there is no longer an access advantage for the broadcasters, more diverse sources of local content will develop. As cable made possible new networks without the investment in broadcast and local infrastructure, the departure of broadcast TV might enable a blossoming of the blogosphere into video content. Finally, the additional spectrum that is made available might further enable new services and ubiquitous digital signage.

Broadcast TV, with its large distributed infrastructure, has been a traditional impediment to TV innovation. As broadcast goes away, a round of innovation both in hardware and content is certainly in the offing. Advancements in mobile electronics and the additional freed-up spectrum will advance this as well. As with any change of this type, it can’t come soon enough but will likely take much longer than it should, as it has to handle the objections of those dependent on the current infrastructure. It would be nice if the industry simply declared the change inevitable and got on with it. However, as with HDTV, the change will likely take 10 years.

Monday, September 24, 2012

CE 2012

Thousands, even tens of thousands, of people line up a day in advance to pay up to $400 for a device most plan to dispose of within 3 years. The device is sold with pretty healthy margins and is rapidly sold out. Meanwhile, sales of TVs, devices most buyers will use for the next 14 years and that are sold with virtually no margin, are at best flat, as is TV pricing. In spite of the weak economy, there is plenty of excitement in consumer electronics, but it seems to be concentrated on a couple of products and one company. Indeed, there is some speculation that Apple might become a monopoly. I see little danger of that. This posting contains some of my views on what is happening and what is likely to happen in Consumer Electronics (CE). I first give an assessment of where the industry stands, then likely (or in some cases already implemented) reactions.

To be sure, Apple has a great deal of consumer mindshare. This exists because of great marketing and a wide product line where success on one consumer platform reinforces success on others. However, wide participation across the industry is not necessary for success. An example is Google's recent decision to sell the Set Top Box (STB) business that came with its purchase of Motorola Mobility; the group was not necessary for Google's plans. Another part of Apple's success has been masterful management of the CE ecosystem, including the parts where they are a buyer rather than a seller.

For this analysis, I break the CE ecosystem into 5 parts: Components, Composite Devices, Operating Systems (OS), Content & Services, and Consumer outlet. Although Apple, as a device maker seems to have the upper hand currently, others in the ecosystem are multi-billion dollar companies as well and have ways to respond.

Components
As I have noted in other postings, information operations can be classed into 5 functions: Display (which also includes other human interfaces such as printers or speakers), Memory, Communications, Processors, and Sensors. Many of these functions had been common stand-alone devices such as cameras (sensor), simple cell phones (communications), and tape recorders (storage). Although these devices are still available, people mostly buy the function as part of a composite device such as a smartphone. I will comment on 4 of the 5.

Display: In the diagram, TV is not listed as a separate platform. Although there are current efforts to change this, it has largely become a peripheral, a display, for the Set Top Box or other content outlet. There has been wide speculation about Apple launching an Apple-branded TV set. I have previously posted that I think this would actually be good for the industry, as Apple would certainly spark a round of innovation and would certainly attempt to lead prices up. But absent content agreements, an Apple TV is unlikely unless it can be sold as a requisite attachment to some other device.

Storage & Communications: Providing free cloud storage with their devices puts device makers in the role of selling storage. Unlike most smartphones, the iPhone does not have removable flash memory. By selling iPhones with differing on-board flash capacities, Apple effectively garners for itself the flash attachment sale. Removable storage, such as flash, can be thought of as a form of communications as well; the flash chip can be moved from one device to another, transporting the information on it. The absence of removable storage limits the user's communications options. The change to a proprietary connector further puts Apple in the business of managing and selling communications for its devices.

Sensor: Cameras still exist as a separate CE category, but there has been steady improvement in cell phone cameras both in terms of resolution and functionality. Cell phone cameras will never be able to take massive lens attachments, but short of that, the ubiquity of cell phone cameras is putting an end to the "point and shoot" segment of the camera market.

Composite Devices
The incorporation of various component functions into composite devices such as the smartphone means that the smartphone is effectively gobbling up the markets of those components as independent devices... except for the TV/Display, where the all-important content selection function was taken over years ago. TVs still have tuners, but few actually use them. Another basic trend in composite devices is compactness. The smaller something is, the more difficult the engineering and the fewer companies that can compete. CE leaders therefore promote thinner and thinner devices.

Operating Systems
Operating Systems (OS) are a way to manage the ecosystem. Among Google's first moves in CE was the development of its own OS. HP developed its own OS, but it seems to have gone by the wayside with HP's absence from the tablet and smartphone markets. Given the time and difficulty of enticing a developer community, of all the parts of the CE ecosystem, a vibrant OS is the most difficult to replicate.

Content & Services
As I noted above, removable storage is as much a form of communications as it is storage. That being the case, with the proliferation of high-speed internet connections and the decline in hard drive and flash storage costs, the optical disc market is going away, as is the content industry's distribution chain for physical media. However, having seen what happened to music industry pricing and revenue with its internet distribution agreements, the video industry is in no rush to sign up for the same thing.

Consumer Outlet
A man walks into a Walmart and asks to buy a Kindle. The sales associate behind the counter says, "But we don't sell Kindles." The man asks, "You are one of the nation's largest CE retailers, and the Kindle is an extremely popular CE product?" The sales associate answers, "Yes...." There is no punch-line. The fact is Walmart is discontinuing its sales of Kindles because the device isn’t just a tablet; it is a competing retail outlet that can solicit and take orders. Although big companies rarely refuse to do business with each other even in the most tumultuous relationships (Apple still buys from Samsung), the differing interests among the major players are causing some rifts in what is normally a short-term, revenue-maximizing industry. The conflict is not just between Walmart and Amazon, but with Google as well. Apple has its well-known tiff with Samsung but also looks to avoid doing business with Google, hence Apple's effort to replicate Google Maps. Finally, all of the hardware makers have some issue with those that are not primarily hardware makers, as the non-hardware makers would like the hardware to be free in order to more easily sell content.

Much of what has been going on with composite devices has been facilitated by what is going on with display technology: packing ever more pixels into the same size screen. That is coming to an end, not because the technology has reached its limits but because the human eye cannot resolve ever more concentrated pixel densities. As with the other components, displays will continue to improve, but not to the point where they are driving the platforms, not without a fundamental technology change. Flexible displays may be that change, and I discuss more on that below.
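The limit the eye imposes can be estimated with simple trigonometry. A minimal sketch, assuming a viewer who resolves about one arcminute (roughly 20/20 vision); the function name and viewing distances are illustrative:

```python
import math

def retina_ppi(viewing_distance_in, arcmin=1.0):
    """Pixel density (ppi) beyond which a viewer with the given
    angular resolution (default 1 arcminute) can no longer
    distinguish individual pixels at this viewing distance."""
    pixel_pitch_in = viewing_distance_in * math.tan(math.radians(arcmin / 60))
    return 1 / pixel_pitch_in

# A phone held ~12 inches away vs. a TV viewed from ~7 feet.
print(round(retina_ppi(12)))   # ~286 ppi
print(round(retina_ppi(84)))   # ~41 ppi
```

At a phone's typical viewing distance this works out to roughly 286 ppi, which is why pixel counts much beyond that buy mostly bragging rights.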

Fundamentally, the component makers have two principal interests. The first is to reassert themselves either as independent products or as composite devices. The TV industry formed the Smart TV Alliance in order to facilitate an app developer community writing apps for smart TVs. Nikon has developed a camera with on-board wireless communications, turning the tables on the existing composite devices by adopting some of their functions. Coupled with its Droid operating system, a Skype app, and a Bluetooth headset, and contrary to what I said above, you start to approach a cell phone with interchangeable lenses. Add advanced voice or gesture command features and it could be a computing device as well.

The component makers have a secondary interest in balancing out the power of the OS at least as far as that OS exists as a sales channel. That means supporting the Brick and Mortar retailers.

Composite Devices
As discussed above, the TV makers would like to make the TV a composite device. The number of participants in other composite device markets will expand as well. Given the extent to which tablet sales are eating into sales of the incumbent notebook, existing notebook and notebook processor companies want to add these devices to their lineups. HP desires to re-enter both the smartphone and tablet markets. Barnes & Noble has the Nook and Amazon has the Kindle to increase their consumer reach and facilitate content sales, not to make money on tablets. A tablet from Walmart would not be a surprise. Indeed, making money on tablets may be quite difficult with more proprietary tablets coming from those wanting only to expand consumer access.

In addition to new participants in the existing platforms, there might be new platforms, again driven by advances in display technology: a new form factor for the tablet enabled by a roll-up display, or a smart watch. Again, advanced voice command or gesture recognition can facilitate these new platforms, as can wireless recharging. For human-factors reasons, I do not expect a near-to-eye device (glasses with built-in video screens), as such devices have been launched in the past and gotten a ho-hum reception from the market.

Although information can be shared across platforms, the composite devices really are not that tightly linked. Nintendo recently introduced TVii, which changes that. In some ways it is the game console reasserting its central position, with the TV as a peripheral. However, it also brings new functions to the TV and fundamentally expands the usage model. The name TVii bears some resemblance to the much-rumored iTV, and it may be Nintendo beating Apple to the punch.

Operating Systems
Just one comment about operating systems: Microsoft has been sitting in the anti-trust penalty box and has not been much of a factor as the smartphone and tablet markets have developed. Their time in the box is up, and given the current valuation of Apple, they won't be seen as the behemoth; so I expect them to be much more aggressive.

Content & Services
Since the advent of the VCR and cable, the trend has mainly been toward user-purchased content rather than ad-supported. But in addition to Amazon's rumored ad-supported tablet, ad-supported content is in resurgence. Possibly this is due to the weak economy, but it would tend to favor Google and the Droid world. This is especially true given the video content owners' reticence about following the path of the music content owners.

Consumer Outlet
As stated above, I expect the Brick and Mortar retailers to get added support from the hardware makers that actually intend to make money on hardware. I also expect both the B&Ms and the online retailers to expand their offerings of proprietary tablets. In addition, both the B&Ms and online retailers might expand their services, buying used product in order to facilitate the sale of newer versions. Increasing supply constraints on the rare earths used in electronics may drive this as well.

Digital Signage
Finally, a mention of digital signage and digital out-of-home.... As digital signage becomes more ubiquitous, how mobile devices interact with the signage will become a factor in their utility. Apple chose not to include Near Field Communications (NFC) on the iPhone 5. This was, perhaps, compelled by the Apple business model. However, for the Droid world, NFC may be a critical part of the future, allowing consumers to opt in for advertising regardless of device maker.

The above, though an incomplete assessment of the market, highlights that there are numerous opportunities for innovation in the CE world. Although Apple rides high, their position is not as secure as it seems. If their competition or other adversaries quit focusing on copying Apple and instead focus on the consumer and giving them something that Apple doesn't... that is stylish and easy to use, then there is a world of possibilities. Indeed, with the advancement of flexible displays, there is an inevitability of change.

Wednesday, September 19, 2012

What determines TV Sizes

LCDs are made several at a time on larger sheets of glass (called a mother-glass) and “cookie cut” into smaller displays. Determining how many LCDs to cut from a mother-glass is a balance of pricing and geometry. In general, larger displays sell for more per square inch than smaller displays, so there is an incentive to make as few cuts as possible. However, with current quality requirements, the bigger the display, the lower the yield, as it is more likely to contain a visual defect. Further, for whatever size is chosen, the manufacturer must have orders for that size. In general, sizes are chosen so that at least 6 individual displays (cuts) come from each mother-glass but usually fewer than 25. An LCD maker with a yield issue may choose to do more.

There are a number of geometric considerations as well. Of course, the size of the manufacturer’s equipment is a starting point: the length and width of the mother-glass it was designed to take. LCD fabrication plants (fabs) come in generations, or gens. Early on, the gen of a fab did describe when it was made. However, as the sizes of LCDs diversified with the types of products using flat panels, gen has taken on a meaning more related to the size of the equipment than to when it was produced. An LCD maker may install a Gen 6 fab to make monitors, and that fab may be newer than its Gen 8 fab making TVs. Within a gen, sizes may vary somewhat; a slightly larger Gen 6 fab may be termed a Gen 6.5. As with all manufacturing equipment, products change, screen sizes change, and a fab will eventually be making a different product in a different size from what it was originally designed for.

Aspect ratio is another consideration. TVs are generally 16:9. Notebooks may be 16:9 or 16:10. Cell phones and monitors can vary substantially, from square to 21:9. In addition to the display itself, a small border area is needed to cut the mother-glass into individual displays. The size of this border depends on the manufacturer. Finally, LCDs are usually made with all the displays on one mother-glass being the same type. The arrangement of these displays on the mother-glass is termed the lay-up. The lay-up does not necessarily use every square inch of glass; in fact it usually does not. However, makers like to maximize glass utilization, both to get the most out of their equipment and to maximize the number of cuts.

Sometimes very small differences in equipment or display size can mean the difference of an extra cut or two. For instance, when the Standard Panels Working Group (SPWG) settled on a business notebook size of 16:10 at a 14.1” diagonal vs. the competing 15:9 at 14.0”, two sizes that cannot be differentiated except by actually measuring them, there were substantial manufacturing implications. While all of the fabs that could make a 14.1” could make the 14.0” as well, two of the fabs that could make a 14.0” could not make the 14.1” without getting fewer cuts. This meant that those fabs were uneconomic at that size. The 14.1” size was chosen to be consistent with previous aspect ratios and with the long-term planning for Windows regarding screen content.
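Just how close those two panel specs are can be checked with basic geometry. A quick sketch (the helper function is mine, not SPWG's):

```python
import math

def panel_dimensions(diagonal_in, aspect_w, aspect_h):
    """Width and height of a panel from its diagonal and aspect ratio."""
    diag_units = math.hypot(aspect_w, aspect_h)
    scale = diagonal_in / diag_units
    return aspect_w * scale, aspect_h * scale

w1, h1 = panel_dimensions(14.1, 16, 10)  # SPWG 16:10 notebook panel
w2, h2 = panel_dimensions(14.0, 15, 9)   # competing 15:9 panel
print(f"14.1 in 16:10: {w1:.2f} x {h1:.2f} in")
print(f"14.0 in 15:9:  {w2:.2f} x {h2:.2f} in")
# widths ~11.96" vs ~12.00"; heights ~7.47" vs ~7.20"
```

The widths differ by less than a twentieth of an inch, yet that sliver of glass is what determined which fabs could make the panel economically.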

So, very small differences in equipment size may cause an LCD maker to choose one particular screen size over another. This is why TVs are often termed 42” (or some other size) class rather than every maker making a screen that is precisely 42”. Sometimes when it seems that the industry has settled on a standard size such as 40”, an individual LCD maker will determine that it can get the same number of cuts, and consequently virtually identical costs, on a 42” or larger screen and start offering that size. So even when the industry seems to standardize on sizes (32”, 40”, 55”), other sizes proliferate. For a fab originally designed to take a certain number of cuts from a mother-glass, only certain larger screen sizes will fit well on that equipment; trying to make a slightly larger screen may mean halving the number of cuts.
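The cut-count economics can be sketched with a toy lay-up calculation. Everything here is illustrative: the mother-glass dimensions are roughly those reported for a Gen 8.5 fab, the border is a round number, and real lay-ups are more sophisticated than a simple grid:

```python
import math

def panels_per_sheet(glass_w_mm, glass_h_mm, diagonal_in,
                     aspect=(16, 9), border_mm=10):
    """How many panels of a given diagonal 'cookie cut' from one
    mother-glass, assuming a simple grid lay-up with a cutting
    border around each panel."""
    diag_mm = diagonal_in * 25.4
    unit = math.hypot(*aspect)
    w = diag_mm * aspect[0] / unit + border_mm
    h = diag_mm * aspect[1] / unit + border_mm
    # Try both grid orientations and keep the better one.
    a = (glass_w_mm // w) * (glass_h_mm // h)
    b = (glass_w_mm // h) * (glass_h_mm // w)
    return int(max(a, b))

# Illustrative Gen 8.5 mother-glass, roughly 2200 x 2500 mm.
for size in (32, 40, 46, 55, 60):
    print(size, panels_per_sheet(2200, 2500, size))
```

Running it shows the cliff: 55" panels fit 6 to a sheet, but pushing to 60" drops the lay-up to 3, which is why "odd" sizes appear only when they happen to tile a particular gen of glass well.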

Recently there was a story about some TV brands selling smaller-than-advertised TVs. The article tended to blame the retailers for the problem. However, large retailers sell multiple brands and multiple models; they cannot inspect every TV model to ensure that it is in compliance. Most retailers actually do keep tape measures behind their TV counters, but this is mostly to ensure that the TV physically fits where the consumer intends to place it. Short of actually measuring the screen diagonal yourself, the best guarantee that you are getting the screen size advertised is to buy a trusted name brand.

Monday, September 17, 2012

The New IBM

When DisplayWriter, by IBM, was still the dominant word processing software, my company was faced with a decision on upgrading. I don't recall what decision was made, but I do recall that it wasn't DisplayWriter. DisplayWriter was ruled out because it could not write to an ASCII file, which was the universal word processing file type at the time. It could read ASCII, it just couldn't output to ASCII, meaning that once you modified a report in DisplayWriter, future modifications had to be done with DisplayWriter. Rather than cementing the IBM software as the only choice, the ASCII situation cemented DisplayWriter as a definite no. Personal tech decisions can be less pragmatic. However, repeated instances of not putting the consumer first can wear on a company's brand and, eventually, market share.

In Europe, cell phones are required to have a micro-USB connector. In addition to convenience for the customer, this minimizes the waste of continually buying multiple new chargers and discarding the old. Preserving connections, such as compatibility with other types of word processing software, minimizes the headaches and expenses of upgrading system components without having to upgrade everything. In some sense, this is more important to the consumer, as no one wants to toss all of their old peripherals in order to upgrade, particularly if their peripheral is an automobile with a dock made for the old device with the old connector. I imagine the automakers are not pleased with the idea of a new connector either, as they had been considering closer integration of car functions with the owner's mobile device.

In Rugged Displays, I mention how, although consumers cared about every bit of weight, they placed little value on increasingly thinner notebooks once they got under an inch. We also now know that 80-90% of cell phone users jacket their phones, sometimes tripling the thickness. As with many factors in consumer design, thinness can be pushed to the point of diminishing returns or even no value beyond bragging rights. To be sure, bragging rights are important for branding, but they are easily trumped by features that carry real benefit.

Saturday, September 15, 2012

The Film Rolls No More

When the IBM PC was introduced in the early 1980's, the component cost of the 5 1/4" floppy drive was about $550. After some initial rapid cost reductions and a format change (with a capacity increase) around 1987, the device settled into a very regular 18% per year cost reduction. Although there were some attempts to expand the device capacity at the end of its life, the floppy basically relied on predictable annual cost reductions to remain on the PC platform. The image to the side is from an old technical paper of mine on integrated optics. With the floppy being partly a mechanical device and flash being made by photolithography, it was a matter of time before flash replaced the floppy; but floppies did not live that long, and flash thumb drives are now used like floppies were, although I am still waiting for AOL to start sending me dozens of free thumb drives in the mail. (Before net connections were standard, AOL used to send out its net connection software on floppy disk to anyone they remotely thought might have a computer.)
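A steady percentage decline compounds quickly. A quick sketch of where an 18%-per-year curve takes a $550 drive (the figures illustrate the trend, not exact historical prices):

```python
def cost_after(initial_cost, annual_decline, years):
    """Component cost after a steady annual percentage decline,
    compounded yearly."""
    return initial_cost * (1 - annual_decline) ** years

# The ~$550 floppy drive on an 18%/year cost-reduction curve:
for years in (5, 10, 15):
    print(years, round(cost_after(550, 0.18, years), 2))
```

After a decade on that curve the drive costs under $80, and after fifteen years under $30, which is roughly how the floppy stayed on the platform long after its capacity stopped mattering.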

Floppies were cheap enough to be considered free but lacked the capacity of a hard drive or tape. However, like tape, the medium was removable from the device and made for easy transfer of data from one machine to another. But therein lies the rub. Removable media are more a form of communications than they are of storage. Though the floppy was, in some ways, competing with tape or a hard drive, the real competition was the internet. As files grew bigger and the internet grew more capable, there was no longer any point to having a floppy disk. About the same time floppies were going away, optical discs (with a c) were on the rise. Optical discs were cheaper and much higher capacity. However, the advancement of the internet and of wireless capacity has been relentless. Even the cheapest DVD and Blu-ray players today come with a WiFi connection standard and increasingly rarely get used to actually play discs.

Tape has disappeared as well. Hard drives were on a steeper cost curve, and tape's inability to do random access did it in. As on the PC platform, tape's optical cousin, film, survived for much longer, having inherently much higher data capacity. However, Fuji is now discontinuing its motion picture film business, shortly after Kodak announced its exit from the business as well. Although the decision may be credited to a general trend to have everything digital, end to end, it is really the low cost of digital transport rather than any cost advantage of digital storage. It was the combination of cheap hard drive storage and vanishingly small data delivery costs (as opposed to carting around 100-pound rolls of film) that spelled the end for film at the movie theater.

Film, of course, is not disappearing completely. Very high-end theatrical projection such as IMAX will continue, as film there is still top end as a theatrical display technology rather than as a communications or storage medium. I earlier published a list of my favorite display-related movies. There were only 9 in my top ten. Cinema Paradiso is my #10. Here's to film.


Brick and Mortar gets a Boost

Three recent stories give some solace to the brick and mortar (B&M) retailers: Showrooming Debunked By Sales Data: CEA, Sun Sets on Endless E-Commerce Summer as Sales Tax Comes to Amazon, LG Switching To A “One Price” On-Line Policy for HDTV, Blu-ray Players and More. The CEA data is a bit suspect, as the internal figures given in the story don't reconcile. However, assuming that the CEA has it reasonably accurate, showrooming is not as big a problem as was suspected, and the playing field between B&Ms and on-line has shifted in the B&Ms' favor.

TVii vs iTV

Simple for the Consumer
One of the big complaints retailers have with consumer electronics is that the products can be complicated. Much of the technology development in TV sets has been in the form of attachments rather than developments in the set itself. As a result, the consumer is left with numerous boxes connected to their TV set, boxes that can be connected to each other in a variety of ways. Though as long as audio is connected to audio, video to video, and signal out is plugged into signal in, the system will work, it may not work well. Additionally, as there is no central control, switching from one content source to another can be confusing. Smart remotes that automate source changing have been offered by companies such as Logitech and Acoustic Research; however, these tend to require even more A/V skills to program initially. Google TV also offered some advanced capability in switching media, but has not been much of a success to this point.

Now comes Nintendo with a device that promises to make source switching very simple. It is agnostic with respect to content provider and provides additional services that form the basis of new usage models for watching TV. The new Nintendo game controller has a relatively large touch screen that can be used as a second POP screen (Picture Outside of Picture) or provide additional information such as what happened in the show before you tuned in. Sounds great.

The Nintendo controller interfaces directly with the boxes around your TV through a traditional IR interface. As bits of the technology get emulated on other platforms, such as a smartphone or tablet app, the recently buried IR interface might find its way back onto smartphones and perhaps into digital signage as well. The availability of a touch-panel system control might also further spur the consolidation of home theater attachments back into the TV, into the controller, or into existing only as a service, as is already happening to the DVR and optical drive.

There have been numerous rumors of a new Apple-branded TV set ushering in new paradigms for TV watching. Barring a substantially different direction by Apple, they may be losing their "first mover" status in TV set usage innovation. This new product by Nintendo might spark a round of innovation in the industry ahead of the launch of any Apple-branded TV set. Certainly the names GamePad and TVii seem to be making a statement. The 6.2" screen of the GamePad puts it between the smallest tablets and the largest cell phones. Emulating this kind of functionality on a cell phone sized screen might not work well. The device also has Near Field Communications (NFC), which was anticipated but not found on the iPhone 5. NFC may facilitate on-the-spot purchase of content or user identification for restricted content. The device shows a great deal of both creativity and business acumen.

Wednesday, September 12, 2012

My Take on the iPhone 5

Fundamentally there are only 5 things that you can do with information (process, transmit, store, sense, display). As an information device, here is a quick run-down on the changes embodied in the iPhone 5.

Process: The new iPhone has a 2X faster processor.

Transmit: LTE capability was added, but no Near Field Communications (NFC) as was rumored.

Store: The camera is faster, which may reflect improved flash memory or an improved imager. In any case, storage is still built-in memory only, with that built-in memory markup.

Sense: The iPhone has a better camera (in addition to being faster, it has more spatial resolution and better image stabilization).

Display: The speakers (a form of display) have been improved and noise cancelling added to the earpiece. The display itself has been improved in numerous ways, some of which may or may not be apparent to the consumer. The display is bigger, has 326 dpi spatial resolution, and reportedly has “better color fidelity,” which I assume means higher chromatic resolution. Not mentioned in today’s reports but reported earlier, the screen should have better motion response as well.

Packaging: Beyond its information handling capability, the new iPhone has better packaging. The screen now comes in a 16:9 format, a wider aspect ratio than the previous iPhone's. It seems that the designers wanted a bigger screen, but the phone still had to fit well in the hand, so the screen was narrowed at the same time it was made bigger. Traditionally, Apple has preferred 16:10 for its computing devices, allowing them to show a 16:9 image with a control bar at the bottom. The control bar is probably unnecessary for a phone. The screen is thinner, contributing to a thinner overall device. The camera also has a sapphire lens, probably more for improved scratch resistance than optics.

Content: A new iOS is coming, along with some changes to iTunes.

From a display perspective, as I have noted in other articles, it really takes two dimensions to describe a screen size: either an aspect ratio and a diagonal, or a height and a width. Vizio ran into some criticism over its CinemaWide sets for describing them with exactly that information, a screen diagonal and an aspect ratio. In considering palm size, Apple has decided that screen width is relevant to the consumer but continues the traditional pattern of reporting a diagonal. It wouldn't be a bad thing if the industry just started describing screen size with height and width. As to the other aspects of the screen, there are numerous improvements, but probably with diminishing returns for the consumer in terms of visual quality. Packing more pixels into the display diminishes battery life, and if the display is already at the resolution limit of the viewer, there is not much to be gained but bragging rights. At some point, to show more visually complex content, a bigger display is needed, but the phone must still remain a handheld device. As a branding focus, the emphasis is shifting away from the display until some other aspect of it becomes a marketing focus, a common event in the TV world, where one season it is brightness, the next contrast, and so on.
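The diagonal-plus-aspect-ratio description can always be converted into the height and width argued for above. A minimal sketch, using the published 4" 16:9 and 3.5" 3:2 iPhone screen specs as inputs:

```python
import math

def screen_dimensions(diagonal, ratio_long, ratio_short):
    """Convert a diagonal-plus-aspect-ratio spec into the two side
    lengths.  For an aspect ratio a:b, a unit cell has diagonal
    sqrt(a^2 + b^2); scaling that up to the quoted diagonal gives
    the physical long and short sides."""
    scale = diagonal / math.hypot(ratio_long, ratio_short)
    return ratio_long * scale, ratio_short * scale

# A 4" 16:9 screen (the iPhone 5 spec) next to a 3.5" 3:2 screen
# (the earlier iPhone spec):
long5, short5 = screen_dimensions(4.0, 16, 9)
long4, short4 = screen_dimensions(3.5, 3, 2)
print(f'4"   16:9 -> {long5:.2f} x {short5:.2f} in')
print(f'3.5" 3:2  -> {long4:.2f} x {short4:.2f} in')
```

Worked through, the numbers show the short side staying nearly constant (about 1.94" vs. 1.96") while the long side and diagonal grow, which is consistent with keeping the phone comfortable in the hand.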

Tuesday, September 11, 2012

The Microsoft Holodeck

Microsoft has filed a patent on an immersive display system that projects images on the walls around you in addition to the main image on the screen. Though the intent of the technology seems to be gaming, if it is successful, it could find more general entertainment uses. The concept has some history, both successful and not. Before there was unlimited computing power, flight simulators used to monitor the pilot's direction of gaze and generate high resolution imagery in front of him or her and much lower resolution imagery in the periphery. The pilots could never tell that everything was not in high resolution. For the purposes of Microsoft's gaming technology, it would not be possible to shift the high resolution image to the wall beside you, but the system could very well give much more of the feel of being in a place rather than watching it on a screen.

Philips had previously launched a much less ambitious version of this called Ambilight, where the TV merely projected colors from the image onto the wall around it. It did not have much impact on the market; I believe it was not substantial enough. However, coupled with beam-steered speakers, the Microsoft innovation could offer a compelling experience. The technology needs a dark room with acceptable geometry and acceptable colors; it is also somewhat like 3D in that it may take some creative art to use it effectively. I would imagine that effects that start away from the main screen and draw your eye forward would be more effective than effects that would actually cause you to turn your head. I also imagine that focusing the effects on the ceiling (always there, usually white) would be more effective than counting on bare walls around the set.

Monday, September 10, 2012

Black and White E-readers are Here to Stay

Like Rock and Roll, Black and White is here to stay, at least according to a recent NY Times article. The article was paraphrasing Jeff Bezos. However, in an interview after the event on which the NY Times was reporting, Mr. Bezos seemed to say something more reasonable: "black and white displays will be around for a while to come".

To be sure, a lot of trade-offs must be made in going to a color display. Color displays are necessarily more complex, but more importantly for mobile devices, they consume multiples more power, and the current generation of LCDs has no sunlight viewability at all. The first portable computers were monochrome. The Data General One (lunchbox configuration) had a monochrome LCD that was barely visible at all; the loss of brightness required to do color made color out of the question then. The first notebook-configuration PC, the GRiD, had a red-on-black plasma screen. Plasma had no color capability then, and the power consumption of that display made the GRiD much more of a transportable than a mobile device. The original Compaq transportable had a monochrome CRT (green on black). Color CRTs were available, and power consumption did not matter as it was a plug-in device, but there was no color content in the PC world. This also meant that most monitors were monochrome as well, being either green on black or amber on black.

The selection of a colored font (green or amber) rather than white on black was made for human factors reasons. Black on white, the paper paradigm, was also not used. In the eye, the cones (the color receptors) are concentrated in the center of the retina while the rods (the black and white receptors) are concentrated in the periphery. This arrangement exists for very good primal reasons: in the dark you still want good peripheral vision to spot any sort of threat, while in brighter environments you rely on your color vision. In an office environment, a colored font is easier on the eyes. Color also provides chromatic contrast in addition to the brightness contrast of a white-on-black font. Pen on paper was inherently monochrome, as adding color added significantly to the complexity, and hence expense, of reproducing handwritten documents. When the printing press was invented, the use of color blossomed, but monochrome still tended to rule for cost-constrained documents such as newspapers and paperback books. The image above is from a Gutenberg Bible.

Similar to the invention of the printing press, electronic word processing pioneered the development of color content. Information such as emphasis and misspellings was highlighted with color. This initial development of color content did away with monochrome computer monitors and subsequently with monochrome notebooks. In the early development of LCDs, yields on LCD arrays were low and yields on LCD color filters were even lower. A combination of increased yields and lower raw materials costs led to color and monochrome LCDs equilibrating in price in the early 1990's. Monochrome still had a substantial power consumption advantage: the color filter in an LCD disposes of about two-thirds of the light, and powering the display is generally about half of the power consumption of a mobile device, so going to a color display meant about a one-third reduction in battery life. However, once color was cost competitive with monochrome, color notebooks went from about 20% of the market to about 95% in about 9 months. The consumer was clearly stating a preference.
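The back-of-envelope battery arithmetic above can be sketched as a toy model. Note that the split of display power into backlight and drive electronics is my assumption, introduced to make the figures work out; the article gives only the filter loss and the display's overall share of device power:

```python
def battery_life_ratio(backlight_frac, transmittance):
    """Battery-life ratio after a display change that scales backlight
    power, with all other power draws held constant.

    backlight_frac: backlight's share of total device power (0..1)
    transmittance:  fraction of the backlight's light that still
                    reaches the viewer after the change
    """
    new_power = backlight_frac / transmittance + (1 - backlight_frac)
    return 1 / new_power

# Assumption (not from the article): the backlight is roughly half of
# the display's power, and the display roughly half of the device's,
# so the backlight is ~25% of total power.  A color filter passing
# ~1/3 of the light then triples backlight power for equal brightness:
ratio = battery_life_ratio(0.25, 1 / 3)
print(f"battery life ratio: {ratio:.2f}")  # ~0.67, about a 1/3 hit
```

Under these assumed splits the model reproduces the article's figure; a larger backlight share would predict a bigger hit.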

Notably, Apple was the last to get rid of its monochrome notebook line. Steve Jobs had been an investor in a start-up that was developing a new kind of display technology. A friend of mine who saw an example said that it was monochrome and actually quite advanced for its time. However, perhaps it did not have a path to color. Early on, some were writing plasma off until it developed a means to do color. In any case, perhaps Mr. Jobs's involvement with developing a monochrome display led to the company holding on to monochrome for too long. Once color-based software and color content were widely available, the transition to color in mobile computing was swift and absolute.

As noted above, color LCDs still have their issues with non-existent sunlight viewability and high power consumption as compared with digital paper such as the E-Ink displays in some Kindles. Although I'm sure the E-Ink folk are gratified by the NY Times headline, I'm just as sure they are not going to stop working on color. However, Mr. Bezos may be right in that it might be a while. When the transition of notebooks from monochrome to color happened, color ranges were limited and color resolution was only 8 bits (that is, 8 bits spread between red, green, and blue, not 8 bits per color), and there was no need to do video refresh rates. Today, consumers have gotten used to color fidelity that is as good as their eyes can discern. A technology that only generates pastels might not be worth the power hit. Beyond that, there is the challenge of doing video refresh rates in a bi-stable display.
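To illustrate what "8 bits spread between red, green, and blue" means in practice: a common split was 3 bits for red, 3 for green, and 2 for blue, giving only 256 total colors. (The exact split varied by hardware; RGB332 here is an illustrative assumption, not something the article specifies.)

```python
def pack_rgb332(r, g, b):
    """Pack 8-bit-per-channel RGB into one byte:
    3 bits red, 3 bits green, 2 bits blue."""
    return ((r >> 5) << 5) | ((g >> 5) << 2) | (b >> 6)

def unpack_rgb332(byte):
    """Expand an RGB332 byte back to approximate 8-bit channels."""
    r = (byte >> 5) & 0b111
    g = (byte >> 2) & 0b111
    b = byte & 0b11
    # Scale each channel back up to the 0-255 range.
    return (r * 255 // 7, g * 255 // 7, b * 255 // 3)

packed = pack_rgb332(200, 100, 50)
print(unpack_rgb332(packed))  # heavily quantized: 256 colors total
```

With only 4 levels of blue and 8 each of red and green, smooth gradients band badly, which is why the article describes the color of that era as limited.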

The statement that "Black and White is here to Stay" may be a bit overblown. Consumers' preference for color is natural and overwhelming. They were almost unanimously willing to give up 30% of battery life to get color in notebooks, and perhaps they were willing to give up even more. However, the power advantage of the paper-white display is much more than that.

Tuesday, September 4, 2012

Rugged Displays

There are about three dozen parameters by which you can measure a display. Occasionally there is either a technical breakthrough, or someone opts to push performance on one of these significantly beyond the norm and then promote that feature heavily. Other device makers then follow suit, capitalizing on the promotional spending of the first mover with their own performance improvements and promotional spending. Soon performance and the claims of performance move beyond what the average person can actually experience, beyond what anyone can see, and sometimes beyond what can even be measured on conventional equipment.

Lately these have tended to focus on electronic improvements, due to the funding sources for R&D in the industry. A prominent computing company frequently surveys users of computing devices and regularly finds that ruggedness is always in the top three areas where consumers desire improved performance. However, increased ruggedness involves trade-offs against areas the company was pushing, specifically thinness, so this is not an area that gets much attention from the company or its competitors, even though it has very high consumer utility. In the notebook area, I am only aware of Panasonic, with its Toughbook brand, actively promoting ruggedness. In the cell phone area, in spite of designers' desire to make phones increasingly thin, we find out from the fallout of the Apple v. Samsung case that the vast majority of cell phone users encase their phones, sometimes doubling or tripling the thickness, in order to increase ruggedness.

Of course, there are also companies that specifically make devices for children and necessarily design to more stringent specs; but toughness has not widely caught on within the industry… with a few exceptions that are mostly prescriptive. The ThinkPad has a special rubberized paint that gives it better impact performance. The iPhone has a Gorilla Glass cover instead of a plastic one to give it better resistance against surface scratches (keys) in your pocket. Sony has also offered a TV set with a Gorilla Glass cover to prevent screen damage in case the kids get too involved in their video games. However, there are other challenges besides impact and surface scratches.

Early smart phones were very susceptible to moisture. Rather than fix the problem, the first reaction was to mark them with a dye that changed color when it got wet. This relieved the maker from having to pay for replacement of wet phones but did not solve the consumer's problem. Lately, cell phone case makers have been offering water resistant models, and some cell phone makers have introduced products that can actually be submerged without damage. In larger LCDs, digital signage and outdoor LCD TV sets have developed encasements that allow these devices to operate in the rain.

The sun can impact performance of a cell phone or mobile device in two ways. First, trying to use your mobile device outdoors is frequently problematic, as the sun washes out the screen. Although this is not strictly a ruggedness issue, in that it is not a permanent failure of the device, it does render the device useless. The washout can be so thorough that it is sometimes not apparent that the screen is even on. Pixel Qi makes a screen that is viewable in direct sunlight; I had expected that type of screen to start appearing on mobile devices before now. There is also a rumor that one of the next Kindles will have both an LCD and an e-paper display.

A second impact of the sun is radiant heating. The unsourced diagram shows the heating of the dashboard of a BMW on a 100 degree July day. The upper (green) line shows the dashboard reaching 185 degrees while the ambient (lower blue) line gradually climbs to 100. Please note that although the ambient air in the interior of the car may reach 145 degrees, the dashboard temperature is more a direct result of the amount of solar radiation it is receiving, and the temperature curve reflects the rapid increase in radiation rather than the gradual increase in ambient temperature either inside or outside the car. Further, the dashboard is protected from some of the solar radiation by the glass of the car. An object in direct sun, say left on a picnic table near the summer solstice, can reach 210 degrees or more, and that equilibration with the current level of solar radiation can happen in only about 10 minutes. Other than Apple's preference for white encasements, little has been done to isolate mobile LCDs from possible impairment by the sun.

This may change. Although optical and thermal isolation are not focus issues for mobile devices, they are recognized problems in digital signage, where both ambient light washout and solar thermal clearing are substantial problems. Currently, the digital signage world basically lives with image washout and uses active air cooling to combat the thermal issue. One LCD maker also seems to have a product that is significantly more resistant to thermal clearing; however, I do not see them promoting this. More sophisticated approaches are available, and the signage industry is actively investigating them. As they are applied to digital signage, they might also find their way into mobile devices as well.