Wednesday, January 9, 2013

Voice v. Touch


In today's CNET coverage of CES, Intel VP Mooly Eden forecasts that voice recognition will make touch obsolete. Of course, for the conversion to voice recognition to be absolute, it must do something that even most human brains can't do: understand everyone's accent. It must also be able to pick out a particular voice amid the ambient din. And voice recognition is computationally intensive where touch isn't.

Just as touch has not completely replaced keyboards, voice recognition will find its place (or places), but mostly to augment existing human interfaces or to provide a natural interface where touch is not practical.

Monday, January 7, 2013

Digital TV Professionals Debate on Gun Safety


Over the holidays there has been an ongoing discussion on the "Digital TV Professionals" LinkedIn site regarding the school shootings in Connecticut. Below is my response to one comment arguing that we should not be having that debate there.

In response to Graham: if the forum were continually about gun safety, I would agree with you. However, any venue can serve as a forum for such exceptional events. Further, the violence is not without a TV connection. Long ago, the US Congress decided that TV was part of the problem and required that US TV sets contain a "V-chip" (V for violence) to enable parents to block violent content from children. In any forthcoming debate over new gun legislation, I expect that the role of TV, and of video games in particular, will come up again. Like it or not, we as technical professionals cannot divorce ourselves from our environment and discuss only technical issues outside of any social context. Whether "guns kill people" or "people kill people," the people-kill-people advocates will likely point to TV.

Thursday, January 3, 2013

Glass is Hard for Digital Signage II


Pyrex is a familiar name, but what is it? To the technical folk at Corning, if you were discussing Pyrex, you were discussing alumino-borosilicate glass. It had a relatively high melting temperature and a relatively low coefficient of thermal expansion (CTE). If you were discussing Pyrex with the marketing people, Pyrex® was a trade name that could be applied to anything. Since Corning sold its consumer products group, Pyrex has taken on more of the latter definition, with tempered soda-lime glass being used in bakeware instead of borosilicate. As the article states, the Pyrex you can buy today is “Not your grandmother’s Pyrex.”
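The CTE difference is what matters for bakeware. Thermal stress in a constrained glass body scales roughly as sigma = E × alpha × delta-T. A quick back-of-the-envelope comparison, using textbook ballpark material constants (my approximations, not Corning data), shows why the substitution changes thermal-shock behavior:

```python
# Rough thermal-stress comparison: borosilicate vs. soda-lime glass.
# sigma ~ E * alpha * dT (ignoring the Poisson-ratio factor for simplicity);
# material constants are textbook ballpark values, not manufacturer data.

E_GPA = 70.0                    # Young's modulus, roughly similar for both glasses

GLASSES = {
    "borosilicate": 3.3e-6,     # approximate CTE, 1/K
    "soda-lime":    9.0e-6,     # approximate CTE, 1/K
}

def thermal_stress_mpa(alpha: float, delta_t: float, e_gpa: float = E_GPA) -> float:
    """Approximate peak thermal stress in MPa for a temperature swing delta_t (K)."""
    return e_gpa * 1e3 * alpha * delta_t

DELTA_T = 150.0                 # e.g. hot oven dish onto a cool, wet countertop
for name, alpha in GLASSES.items():
    print(f"{name:12s}: ~{thermal_stress_mpa(alpha, DELTA_T):5.1f} MPa at dT={DELTA_T:.0f} K")
```

With these numbers the soda-lime piece sees nearly three times the stress of the borosilicate one for the same temperature swing, which is why the new bakeware must be tempered to compensate.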

Until the mid-1990s, all LCDs were made with Corning 7059 glass. “7059” was one of the compositions out of the Corning catalogue, the “Blue Pages.” Like the technical reference to Pyrex, 7059 meant a very specific composition. The Blue Pages (only about a dozen copies ever existed) usually specified the type of furnace the glass was melted in and occasionally how the furnace was fired. Glasses need to be matched to specific furnace refractories, and even in the same furnace, changing the way you fire it may give a different end composition and will give a different oxidation state of the glass… a different glass.

LCD substrates were made exclusively from 7059 not because of its composition or any particular performance property but because 7059 formed well on Corning’s fusion draw, which made glass with virtually perfect flatness. Although it did not contain any group 1 elements (semiconductor poisons), 7059 had a number of drawbacks. Like most glasses, it formed a bit frothy and had to be compacted before use lest the dimensions change during high-temperature LCD processing. Another drawback was the high amount of arsenic in the glass. Arsenic is one of the elements added to glass to increase its formability, and 7059 had a lot of it, as the fusion process was extremely demanding. 7059 was replaced by 1737 and then by 1734; there was no particular order or logic behind the numbering of glass compositions. Of course, all of the newer glasses have names rather than numbers, and many of the drawbacks have been addressed.

Having a name rather than a number does not mean that a technical glassmaker can or would ship just anything, or allow glass compositions or processes to vary outside of established norms. Indeed, it takes years to develop a new glass composition, and depending on the element, parts per billion can radically impact its performance and fitness for use. On occasion for CRTs, days' worth of glass production were thrown out over parts per billion of fluorine (a phosphor poison).

Although users of digital signage do not need to be concerned with chemical interactions, some knowledge of glass chemistry is useful, particularly in selecting a cover glass, if one is required. In deciding between float and chemically strengthened, bonded and not, different selections give different results in terms of cost, optical performance, fracture resistance, and the hazard presented by the broken pieces should a fracture occur. The best advice is to know what you want and what you are getting, and to have a known supply chain that you trust.

Thursday, December 13, 2012

Optical Interconnects


Some years ago, Gene Amdahl was mounting a second startup. His first, Amdahl Corporation, was reasonably successful and subsequently sold to Fujitsu. The second startup, called Trilogy, was an effort to build macro-scale processors that occupied an entire silicon wafer rather than just a chip. Though Trilogy came and went, it was a big deal at the time. The reason the company is now just a historical footnote is that it ran into an insurmountable technology issue: there was no way to package the wafer. Specifically, there was no way to get signals from one side of the wafer to the other. At the time, optical communications technology was still in its infancy. Trilogy approached Corning about developing an optical package, but of course it was well beyond the capability of the optical technology as well.

Over a decade later, the issue came up again. As processor chips became denser, not only clock skew but also interconnect density became a problem: there was just not enough linear space at the edge to keep up with larger and more complex chips. Again, the obvious solution was to move to optical I/O. After another decade-plus of development, IBM has announced that the technology is ready to go. This means a few things. First, Moore’s Law gets another one of those needed breakthroughs to keep it on track. Second, with chips communicating optically, you will likely start seeing optical cabling inside computing devices rather than just a T3 fiber line leading to the building where they are housed. Third, rather than fiber reaching into the home (FTTH) or premises, you might see optical pathways leading directly from the device out to the network.
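The edge-space squeeze is easy to quantify: pad sites along a die's perimeter grow linearly with die size, while the logic needing connections grows with area. A rough sketch of the scaling (the pad pitch and gate density figures below are illustrative assumptions, not figures from IBM or the article):

```python
# Back-of-the-envelope: perimeter I/O vs. on-chip logic as a die scales up.
# The pad pitch and gate density constants are illustrative assumptions only.

PAD_PITCH_UM = 100.0          # assumed spacing between edge I/O pads (micrometers)
GATES_PER_MM2 = 50_000        # assumed logic density (gates per square millimeter)

def edge_pads(die_mm: float) -> int:
    """I/O pads that fit along the four edges of a square die."""
    return int(4 * die_mm * 1000 / PAD_PITCH_UM)

def gates(die_mm: float) -> int:
    """Logic gates on the die, proportional to area."""
    return int(GATES_PER_MM2 * die_mm ** 2)

for side in (5, 10, 20):      # die edge length in mm
    g, p = gates(side), edge_pads(side)
    print(f"{side:2d} mm die: {g:>10,} gates, {p:4d} edge pads, "
          f"{g // p:>6,} gates per pad")
```

Whatever constants you pick, the gates-per-pad ratio doubles every time the die edge doubles; that linear-versus-quadratic gap is the squeeze optical I/O (and, historically, area-array bonding) is meant to relieve.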

In the optical networking world, there is a concept called “transparency”: keeping the signal optical for as long as possible. Before the development of optical amplifiers, optical signals had to be periodically converted to electronic form and regenerated as optical signals. Given the processing and rise times of the electronics, it was much like flying a plane across the country but having to land every few miles and change planes. Transparency meant that the plane no longer had to land until it was near its destination. Optical I/O, meaning optical output directly from computing devices, may mean that the signal eventually no longer has to drive to the local airport to catch its plane but can take off from right inside your PC. The result will be data rates that make today’s connections look like dial-up. This scenario is probably another decade or so away, but an important threshold has been crossed.

This may also have some tendency to roll back the current "mobility boom" in favor of wired connections. Although wired connections tend to have inherently more bandwith than wireless, the wireless industry has been quite clever regarding signal processing and compressing more and more data into the spectrum available. Transparency right up to the processor may be more than can be matched by increased data compression presuming there is some application that could make use of it.