Tuesday, February 3, 2015

Engineering vs. Physics


In technical development, I was taught to think of problems in terms of one basic dichotomy: engineering problems vs. physics problems. Engineering problems are things such as cost, having the right tools, or product design issues. These are problems that can be solved with time or by simply working harder; you basically know how to solve them, even if the best solution is not yet apparent. Physics problems are much harder. These are problems where, for the product to operate as envisioned, it must violate some law of physics or some fundamental property of the materials or system being employed. This is not to say the problem is unsolvable, but that some invention is required, and the product will end up working in a different way from what was initially envisioned.

Amazingly, inventors and whole organizations can labor away at physics problems for years and years as if they were tackling some routine engineering issue. One of the more famous examples was American Optical's effort to invent optical fiber. American Optical worked on the problem for years before Corning invented a fiber, and a manufacturing process, that actually worked. The Corning inventions being direct and very simple (as many great inventions are), they were easily copied by American Optical, which went into manufacture using the Corning inventions while claiming prior art. Of course lawsuits ensued. American Optical lost because Corning was able to demonstrate that 1) American Optical never understood the physics of the problem it was trying to solve, and 2) as to the claimed prior art, had American Optical worked in that direction for a thousand years, it would never have produced a working fiber because, again, it didn't understand the nature of the problem it was trying to solve.

This long intro is a preface to a discussion of Apple's recent decision to spend $2 billion to convert the GTAT facility into a data center. Clearly Apple is giving up on sapphire; otherwise GTAT, or some restructured version of it, might be useful going forward. This tends to show that Apple ran into a physics problem in converting from glass to sapphire, not cost or yield issues, design issues, tooling, or the like. Sapphire had two physics issues fundamental to the nature of the material.

As described in "Big Surprise," although sapphire is harder and more scratch resistant than glass, with that hardness comes brittleness. Secondly, and more importantly, sapphire has a higher index of refraction, meaning that screens made with a sapphire overlay would have much higher surface reflections than glass. With LCDs having marginal performance outdoors anyway, the surface reflection issue made the material a non-starter. Apple did file some patents that applied the Gorilla Glass process to sapphire in an effort to make it tougher, and patents concerning overlays to address the surface reflection issue. However, there is a Second Law of Thermodynamics problem here: secondary systems added to correct a system's inefficiencies can never correct all of them. This is the reason you cannot build an auto engine that is 100% efficient... or anywhere near 100%. In the case of optics, secondary corrections (additional layers) inevitably create their own surface reflections, decrease optical throughput, add expense, cut yield, and otherwise introduce new inefficiencies that were not present in the original system. Frequently it's like drilling a hole in the bottom of a sinking boat to let the water run out.
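To put a rough number on the reflection penalty, the Fresnel equation for reflectance at normal incidence can be applied to a single air/material interface. This is only an illustrative sketch: the index values below are typical textbook figures for cover glass and sapphire (ordinary ray, visible light), not numbers from Apple or GTAT, and a real display stack has multiple interfaces and coatings.

```python
# Illustrative sketch: Fresnel reflectance at normal incidence,
# R = ((n2 - n1) / (n2 + n1))**2, for a single air/material interface.
# Index values are typical published figures, not from the post.

def fresnel_reflectance(n1: float, n2: float) -> float:
    """Fraction of light reflected at normal incidence at an n1/n2 interface."""
    return ((n2 - n1) / (n2 + n1)) ** 2

n_air = 1.0
n_glass = 1.5      # typical cover glass
n_sapphire = 1.77  # sapphire, ordinary ray, visible light

r_glass = fresnel_reflectance(n_air, n_glass)        # about 4.0%
r_sapphire = fresnel_reflectance(n_air, n_sapphire)  # about 7.7%
print(f"glass: {r_glass:.1%}, sapphire: {r_sapphire:.1%}")
```

Even this first-surface estimate shows sapphire reflecting nearly twice as much light as glass, before counting the back surface of the overlay, which is the kind of deficit an anti-reflection layer must then be added to fight.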

So, after a billion dollars of investment to produce sapphire for iPhone screens, the facility is being converted into a data center. I'm reminded of the ending of "Raiders of the Lost Ark": after such prodigious efforts to find the Ark, it gets nailed up in a box, placed in government storage, and lost again. Some of the sapphire-making furnaces will no doubt find new homes. But with so many hitting the market at once, many may wind up nailed up in boxes themselves, mute testament to the years-long hubbub over sapphire-covered iPhones.

Tuesday, January 6, 2015

More on Display-Centric Movies


I had done a couple of posts on movies in which display technology was central to the plot (here and here), naming a top ten. Beyond those ten, there were many other movies I could have named (e.g. Star Wars, Star Trek, The Matrix, Total Recall, The Avengers), but for better or worse I chose the ten that I did. I recently saw a YouTube posting suggesting that I may have omitted a big one... 2001: A Space Odyssey.

The premise of the YouTube video is that the monolith (the 1 x 4 x 9 object that appears at the dawn of human intelligence) is actually a widescreen TV set on end. Although videos are never projected onto the monolith, it serves as a teaching tool for proto-humans early in the movie, a means of conveying ideas. Later, when a duplicate appears orbiting Jupiter, it acts as a doorway to other worlds and eventually transforms into the ultimate immersive display. In function, it exhibits the best of what TV is supposed to do. In physical form, its 9:4 aspect ratio (2.25 width to height) falls beyond the old 1.85 US widescreen standard and just short of the current 2.33 standard. The thickness is greater than what you would get in a current TV, but perhaps it has really good built-in sound.

On a tangential subject, over the holidays I saw "The Hobbit: The Battle of the Five Armies" in 3D. Thankfully, filmmakers have gotten out of the habit of using their new-found 3D toy to throw things out of the screen at you, and the use of 3D was quite acceptable. However, the thing 3D was really supposed to bring, increased realism, was very much lacking in the presentation. The digital projection was sharp and crisp, as it is supposed to be, but the overall effect was one of artificiality. Rather than feeling like looking through a window, the digital projection makes it painfully obvious that the image is a projection. In a recent edition of "The Economist," director Quentin Tarantino describes digital cinema as "the death of cinema as we know it." The Economist article goes on to state, "Digital projectors cannot match the sheer detail of a pristine 35mm print, nor its rich contrast between the deepest shadows and brightest highlights." For me, however, it was not the lack of detail but the digitized edges (not quite jaggies) that gave the image a fake feel. Seemingly, this is something that could be corrected with software, but the function of most image-enhancement software is edge sharpening rather than edge blurring. Perhaps the role of image post-processing in cinema should be re-thought. In any case, as with my 10th selection for display-centric movies, "Here's to Film".