When Dick Messmer joined GE, it was a different era. Man had just walked on the moon, computers took up entire rooms, and pocket-sized calculators had yet to be invented. The halls of GE Research, known then as the GE Research and Development Center, were full of excitement and opportunity. The head of NASA was Tom Paine, a former GE scientist and engineer, and the astronauts who walked on the moon did so in gear that contained GE materials (helmet visors were made of GE Lexan, and boot soles were made of GE silicones).
It was 1969, and Dick was a young man with drive, determination, and a strong academic background in math and physics. He had earned his Bachelor of Science from Carnegie Mellon University in 1963 and his Ph.D. from the University of Alberta in 1967, then served as a postdoctoral research associate at the Massachusetts Institute of Technology in 1968. This background, combined with his interest in computing and scientific theory, set Dick apart early in his career.
In the 1950s, GE scientists were the first to use a reproducible, verifiable, and well-documented process to grow a synthetic diamond. Riding the coattails of this long-sought accomplishment, GE set out to fabricate an n-type diamond with the intent of making a radiation-hardened transistor. The team was working hard when Dick joined GE Research, but the effort was failing, and no one could figure out why. Applying his unique theoretical approach, Dick eventually proved, and published his first paper on, why the n-type diamond would never be possible. Sparing the technical details, Dick showed computationally that the failure was due to the Jahn-Teller effect.
While this wasn’t the news GE wanted to hear, Dick did find himself working with other experimenters on projects where a theoretical approach might offer new insight. He wrote programs on punched cards for GE’s mainframe computer. Dick’s programs performed quantum mechanical calculations, which were so demanding that he could complete only one run a day. Make a mistake, Dick explained, and the entire day would be lost! These were also the days before the handheld calculator, so Dick’s tools included slide rules, mechanical calculators, and pencil and paper.
By the 1980s, Dick had made a name for himself in quantum and statistical physics and materials science. He was publishing papers and lecturing on topics such as theoretical condensed matter physics and the physical behavior of semiconductors and chemical systems. Dick was the 1982 recipient of the Coolidge Award, GE’s highest individual honor recognizing technical leadership and research contributions. Shortly after, Dick was elected a Fellow of the American Physical Society; he also served as a visiting professor at two universities (Caltech, 1985, and the University of Erlangen-Nurnberg, 1986) and as an adjunct professor at the University of Pennsylvania (1980-1998).
It was a solid career for Dick, but as it turns out, he was just getting started.
Dick’s focus began shifting to quantitative finance, risk analysis, and computer science. In 1992, he was asked to head up a GE Corporate task force on parallel computing. At the time, parallel computing was considered the next big thing, but the field was crowded; start-up companies were proposing competing hardware architectures and programming approaches, and GE was unsure whether to purchase a machine.
Dick assembled a team that included representatives from GE Corporate and each GE business. After evaluating the market and its players, they determined that GE should not purchase any of the machines then available but should closely monitor progress in the field. This turned out to be the right decision; most of the early start-ups went bankrupt, standards evolved, and more advanced machines became available. Today, GE Research is a heavy user of parallel computing.
When asked which project has had the most profound impact on his career, Dick talks about his digital underwriting work for GE Capital. In the mid-1990s, Dick took a bridge assignment with GE Capital while part of the Research and Development Center’s Applied Statistics Lab. Making a name for himself with GE Capital’s private-label credit card team, Dick was asked to help the insurance team increase the underwriting efficiency of its extended auto warranty program. He assembled a team at R&D that developed and applied an analytical approach that changed the course of the program and saved the business tens of millions of dollars.
Dick’s approach was so impressive that GE Capital funded a multi-year effort for Dick to create and lead a Digital Underwriting business. In 2004, the business was spun off in an IPO. Stock analysts cited the intellectual property created through the GE Global Research/GE Capital collaboration as one of the key advantages the IPO had over its competitors. The Fall 2006 issue of AI Magazine details a small part of this work.
In addition to the aforementioned successes, Dick has led risk analysis and analytics projects for GE Aviation and GE Energy. He has also served as a consultant for the Institute for Defense Analyses and the U.S. Department of Energy. Dick’s current work focuses on AI and the digital integration of data, processes, and analytics.
For 50 years, Dick’s career has aligned with the incredible growth of computers, computational science, and artificial intelligence (AI). His work helped usher in the current digital landscape, and more than 180 technical papers and 22 issued patents cement his status as a pioneer in the field.
Dick has a passion for learning new things and applying them to science and technology. This, combined with 50+ years of unwavering support from his wife Beverley, has always fueled his motivation to tackle GE’s most challenging problems.
So why GE Research for 50 years? “It is a great place to work, full of extremely talented colleagues who have technical expertise in almost every aspect of science and technology,” he said.
Congratulations to Dick on his golden anniversary with GE Research, and thank you for your continued contributions to analytics automation, computing, and more.