Monday, September 2, 2024

My Intel Experience

I was hired by Intel in mid-2004 as part of their “Standard Building Blocks” program. The idea was that although there were then dozens of companies making desktop PCs, only a few made notebooks, because engineering a notebook was much more difficult. The mission of the Standard Building Blocks group was to standardize notebook components so that anyone could build a laptop. Another person and I were tasked with specifying the LCD/screen and lid design. When I joined Intel, as far as I could tell, there were fewer than a dozen people who even knew how an LCD worked. To the company, the display (only LCDs at the time) was just another device the processor wrote to, not a core part of the platform.

While I was at Intel, I picked up a few other responsibilities. I became an advisor to Intel Capital on their display technology investments. I served on the chip design teams for two generations of Intel mobile chipsets, advising them on the capabilities and requirements of future generations of LCDs. As part of this role, I initiated the switch from 4:3 to 16:9 as the standard screen format. Also related to this role, I served on a Problem EXecution Team (PXT) that dealt with a defect discovered in the design of a chip that was already shipping.

The defect was that the graphics part of the chip was communicating with the LCD at a much lower voltage than was specified. My role on the PXT was to both advise the team about LCD mechanics and communicate with the LCD makers about how Intel was handling the problem. Unlike most teams at Intel, where meetings started on time, ended on time, and had to be scheduled at least 24 hours in advance with a published agenda, PXT meetings happened whenever the team leader decided to call one and lasted until he was satisfied with progress. Further, the meetings were so secretive that you were not even able to tell your boss what you were working on. I communicated with the LCD makers through the Intel display technology experts in Korea, Taiwan, and Japan.

The PXT lasted for the better part of a year. In that time, as mentioned, the aspect ratio of laptop PCs was changed from 4:3 to 16:9, and the company started to gain an appreciation of just how display-centric computing was becoming. My group was eventually disbanded at the demand of HP, which made a lot of money selling custom battery packs and replacement parts for their PCs, much as they currently make huge amounts selling ink by the drop in cartridges specific to each and every printer they design. Standardization of notebook parts, much like the current move to refilling printers with ink rather than cartridges, would put an end to a wildly profitable product line.

I was laid off in 2007. By then, Intel was beginning to grasp how the display was reshaping the computer platform. The company decided to put a favored executive in charge of all display-related issues, a person who knew nothing about LCDs. He brought in his own team, and not only was I gone, but all of the display experts I had worked with on the PXT were laid off as well.

Even back in those years, AMD processors were generally preferred to Intel's for gaming PCs. Consequently, AMD had a greater appreciation for screens and graphics. This bled into the way AMD chips were designed, particularly their graphics chips. Nvidia, of course, was a graphics company, and its architecture was built on multiple cores running in parallel. Intel was later forced into multiple cores, as you cannot increase a chip's clock speed indefinitely without the chip burning up. However, while Intel migrated to a handful of cores, AMD and Nvidia graphics chips had dozens.

In recent times, with the rise of Bitcoin mining and now AI, both of which embrace these multicore designs, AMD, and especially Nvidia, have seen their stars rise while Intel struggles to catch up. Before I left Intel, I sold my Intel stock at $26 per share. Fourteen years later, a share of Intel is worth $22.

Tuesday, February 3, 2015

Engineering vs. Physics


In technical development, I was taught to think of problems in terms of one basic dichotomy: "Engineering Problems vs. Physics Problems." Engineering problems are things such as cost, having the right tools, or product design issues. These are problems that can be solved with time or by just working harder; things you basically know how to solve, although the best solution may not be apparent. Physics problems are much harder. These are problems where, for the product to operate in the way envisioned, it would have to violate some law of physics or some fundamental property of the materials or system being employed. This is not to say that the problem is unsolvable, but that some invention is required and the product will end up working in a different way from what was initially envisioned.

Amazingly, inventors and whole organizations can labor away at physics problems for years and years as if it were some routine engineering issue they were tackling. One of the more famous examples was American Optical's effort to invent optical fiber. American Optical worked on the problem for years before Corning invented a fiber and a manufacturing process that actually worked. The Corning inventions being direct and very simple (as many great inventions are), they were easily copied by American Optical, which went into manufacture using the Corning inventions while claiming prior art. Of course, lawsuits ensued. American Optical lost because Corning was able to demonstrate that 1) American Optical never understood the physics of the problem they were trying to solve, and 2) as to their prior art, if they had worked in that direction for a thousand years, they would never have produced a working fiber because, again, they didn't understand the nature of the problem they were trying to solve.

This long intro is a preface to a discussion of Apple's recent decision to spend $2 billion to convert the GTAT facility into a data center. Clearly Apple is giving up on sapphire; otherwise GTAT, or some restructured version of it, might be useful going forward. This would tend to show that Apple ran into a Physics Problem in converting from glass to sapphire, not cost or yield issues, design issues, tooling, or the like. Sapphire actually had two physics issues fundamental to the nature of the material.

As described in "Big Surprise," although sapphire is harder and more scratch resistant than glass, with that hardness comes brittleness. Secondly, and more importantly, sapphire has a higher index of refraction, meaning that screens made with a sapphire overlay would have much higher surface reflections than glass. With LCDs having marginal performance outdoors anyway, the surface reflection issue made the material a "non-starter." Apple did file some patents that applied the Gorilla Glass process to sapphire in an effort to make it tougher, and patents concerning overlays to address the surface reflection issue. However, there is a Second Law of Thermodynamics problem here: secondary systems added to correct a system's inefficiencies inevitably cannot correct all of them. This is the reason you cannot build an auto engine that is 100% efficient... or anywhere near 100%. In the case of optics, secondary corrections (additional layers) inevitably create their own surface reflections, decrease optical throughput, add expense, cut yield, and otherwise introduce new inefficiencies that were not present in the original system. Frequently it's like drilling a hole in the bottom of a sinking boat to let the water run out.
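To put a rough number on the reflection issue, here is a minimal sketch using the normal-incidence Fresnel equation, assuming typical published refractive indices of roughly 1.5 for cover glass and roughly 1.77 for sapphire (exact values vary by formulation):

```python
# Normal-incidence Fresnel reflectance for a single air/material interface:
#   R = ((n - 1) / (n + 1))**2
# The indices below are typical textbook values, not measurements of any
# specific product.

def fresnel_reflectance(n_material: float, n_air: float = 1.0) -> float:
    """Fraction of light reflected at one air/material surface."""
    return ((n_material - n_air) / (n_material + n_air)) ** 2

for name, n in [("cover glass", 1.52), ("sapphire", 1.77)]:
    print(f"{name:11s}  n = {n:.2f}  R per surface = {fresnel_reflectance(n):.1%}")
```

That works out to roughly 4% per surface for glass versus nearly 8% for sapphire, which is why an uncoated sapphire overlay starts at such a disadvantage outdoors.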

So, after a billion dollars of investment to produce sapphire for iPhone screens, the facility is being converted into a data center. I'm reminded of the ending of "Raiders of the Lost Ark": after such prodigious efforts to find the Ark, it gets nailed up in a box, placed in government storage, and lost again. Some of the sapphire-making furnaces will no doubt find new homes. But with so many hitting the market at once, many might wind up nailed up in boxes, mute testament to the years-long hubbub over sapphire-covered iPhones.

Tuesday, January 6, 2015

More on Display-Centric Movies


I had done a couple of posts on movies where display technology was central to the plot (here and here), naming a top ten. Beyond those ten, there were many other movies I could have named (e.g., Star Wars, Star Trek, The Matrix, Total Recall, The Avengers), but for better or worse I chose the ten that I did. I recently saw a YouTube posting suggesting that I may have omitted a big one... 2001: A Space Odyssey.

The premise of the YouTube video is that the monolith (the 1 x 4 x 9 object that appears at the dawn of human intelligence) is actually a widescreen TV set on end. Although videos are not projected on the monolith, it serves as a teaching tool for the proto-humans early in the movie, a means of conveying ideas. Later, when a duplicate appears orbiting Jupiter, it acts as a doorway to other worlds and eventually transforms into the ultimate immersive display. In function, it exhibits the best of what TV is supposed to do. In physical form, the 9:4 aspect ratio of its face (2.25:1, width to height) falls beyond the old 1.85 US widescreen standard and just short of the current 2.33 standard. The thickness is greater than what you would currently get in a TV, but perhaps it has really good built-in sound.
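As a quick check on the arithmetic (taking the 1 x 4 x 9 dimensions as given in the film), the monolith's face compares to the common widescreen ratios like this:

```python
# Aspect ratio of the monolith's 4 x 9 face versus common widescreen ratios.
monolith = 9 / 4                                    # 2.25:1, width to height
references = {"US widescreen": 1.85, "21:9 'cinema' format": 21 / 9}

print(f"monolith face: {monolith:.2f}:1")
for name, ratio in references.items():
    print(f"{name}: {ratio:.2f}:1")
```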

On a tangential subject, over the holidays I saw "The Hobbit: The Battle of the Five Armies" in 3D. Thankfully, filmmakers have gotten out of the habit of throwing things out of the screen at you with their new-found toy, and the use of 3D was quite acceptable. However, the thing 3D was really supposed to bring, increased realism, was very much lacking in the presentation. The digital projection was sharp and crisp, as it is supposed to be, but the overall effect was one of artificiality. Rather than feeling like looking through a window, the digital projection makes it painfully obvious that the image is a projection. In a recent edition of "The Economist," director Quentin Tarantino describes digital cinema as "The death of Cinema as we know it." The Economist article goes on to state, "Digital projectors cannot match the sheer detail of a pristine 35mm print, nor its rich contrast between the deepest shadows and brightest highlights." However, for me, it was not the lack of detail but the digitized edges (not quite jaggies) that gave a fake feel to the image. Seemingly, this is something that could be corrected with software; however, the function of most image enhancement software is edge sharpening rather than edge blurring. Perhaps the role of image post-processing in cinema should be re-thought. In any case, as with my 10th selection for display-centric movies, "Here's to Film."
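For what it's worth, the kind of correction I have in mind is the opposite of what most tools do. A minimal sketch with the Pillow imaging library, assuming a frame exported as frame.png (a hypothetical file name), might look like this:

```python
# Most enhancement pipelines reach for sharpening (e.g., unsharp mask).
# Softening hard digital edges means going the other way: a very slight
# Gaussian blur. This is a toy illustration, not a grading workflow.
from PIL import Image, ImageFilter

frame = Image.open("frame.png")        # hypothetical exported frame

# The usual move: accentuate edges.
sharpened = frame.filter(ImageFilter.UnsharpMask(radius=2, percent=150))

# The opposite: take the digital "bite" off the edges.
softened = frame.filter(ImageFilter.GaussianBlur(radius=0.8))

softened.save("frame_softened.png")
```

Whether a sub-pixel blur like this would read as film-like on a cinema screen is an open question, but it shows how small the processing change would be.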

Wednesday, November 26, 2014

OLEDs and Why Your Old CRT TV Still Works


This note is about OLEDs, but it starts with two stories about TV, not directly about OLEDs, that I will tie together below. Although you cannot buy an LCD TV that was made in the US, the active matrix LCD was actually invented in the US, in Pittsburgh, by Westinghouse. Westinghouse was a prominent TV brand that was developing new technology. Westinghouse did not survive in the TV business long enough to capitalize on its invention because it had a problem at its factory. It seems a worker replaced a steel mesh filter in the water reclaim system with a copper-coated steel mesh filter. That put just enough copper into the wash water, and left just enough wash-water residue on the TV screens, to act as a phosphor poison. The TV tubes the Westinghouse factory was making would go dark after about six months of use. The resultant recall of half a year's production put Westinghouse out of the TV business. Parts per billion of copper put them out of business.

Another major brand of TV in those times was General Electric (GE). Although GE made glass for things such as light bulbs, it bought its glass for CRT tubes from others. In addition to making glass, GE had an array of materials technologies, including silicones and polycarbonate. At one point, in order to put downward pressure on glass prices, GE threatened to develop a polycarbonate CRT bulb. The glassmakers looked at the threat and determined that although such a product was possible, and did have some advantages relative to glass, it would only last about a year before there was enough inward migration of atmospheric gases to render the polycarbonate CRT non-functional.

The point of both stories is that emissive technologies tend to be sensitive to the most minute amounts of contaminants. That is why some of the companies developing OLED technology concentrate on an ultra-clean manufacturing environment. It is also why the other key to OLED's future is hermeticity. In a CRT, glass provided an absolutely hermetic environment. The CRT was made in a clean environment, and the inside of the tube, where the phosphors were, was maintained under high vacuum. Further, a sacrificial barium "getter" was deposited on the inside of the tube to bind any stray oxygen left over from manufacture.

So, the phosphors did their thing in an absolutely pristine environment that was maintained as long as the tube continued to hold its vacuum, which is tantamount to forever for a consumer product. In terms of the chemistry, that environment virtually eliminated any alternative decay pathways between the phosphor's excited state and its return to the ground state by emitting a photon. The industry employed other tricks as well, such as moving to higher and higher voltage phosphors. This brought the product to the point where phosphor aging was no longer the primary lifetime limitation; instead it was metallization of the glass from decades of electron bombardment.

The high-voltage architecture may have some relevance to OLED design as well. But certainly, cleanliness and hermeticity are the keys to making OLED technology work.