TAMEST Blog Series: 2014 O’Donnell Awards Recipients (2nd Post by Dr. James Walker)

The following post is part of a special blog series highlighting the importance of our O’Donnell Awards program and its impact on the program’s past recipients in medicine, engineering, science, and technology innovation, as well as the importance of scientific research to Texas. The 2014 O’Donnell Awards recipients have each agreed to contribute to the blog series.

The second post in this series was written by Dr. James Walker, recipient of the 2014 O’Donnell Award in Technology Innovation. Dr. Walker was recognized for his pioneering work, development, and modeling in impact theory, penetration mechanics, material characterization and response under dynamic loading, and their application to resolving problems of international importance in personal protection and safety for defense and the space program.

View Dr. Walker’s presentation at the TAMEST 2014 Annual Conference.
View Dr. Walker’s portion of the 2014 Edith and Peter O’Donnell Awards tribute video.

The 2015 O’Donnell Awards recipients were recently announced through a press release and a video trailer on the TAMEST website.


Dr. James Walker, Recipient of the 2014 O’Donnell Award in Technology Innovation

Decreasing the Analysis Time to Speed Up Development of Ground Combat Vehicles

By James Walker, Ph.D.

I was a principal investigator in the DARPA Adaptive Vehicle Make (AVM) program, which is wrapping up this year (2014). AVM was a large research program with the ambitious goal of reducing the time from concept to production of a ground combat vehicle by a factor of five. There are many topics that come into play in the development and production of a new vehicle. Given our specific expertise in impact and blast, the Engineering Dynamics Department at Southwest Research Institute (SwRI), located in San Antonio, Texas, was in charge of delivering the survivability analysis tools. Our effort included three divisions at SwRI and four subcontractors.

The aim was that the vehicle be “correct by construction,” which requires accurate modeling of vehicle systems’ behaviors. The SwRI team’s role in this program was to provide survivability models for ballistic, blast, and corrosion protection, along with human factors models, and the tools we delivered greatly sped up the design and analysis process.

Our work produced significant survivability tools, highlighted by five major innovations:

• Innovation #1. Multi-fidelity analysis, with varying levels of refinement in the physics models, so that faster, lower-fidelity computations could be performed during initial design space exploration and more detailed analysis during design refinement (see the sketch after this list);
• Innovation #2. Automated meshing and connecting of parts for complex vehicle structures, with particular success in our automatic welding and bolting tools;
• Innovation #3. Uncertainty quantification and development of 95% bounding models, indicating the robustness of the design at minimal additional computational cost;
• Innovation #4. A sophisticated large-deformation/material-failure material model library and more accurate blast loads, since the results of the computations can be no more accurate than the material characterizations and the applied loads; and
• Innovation #5. Automation of the whole survivability pipeline for blast and ballistics—essentially the designer can launch the entire analysis from CAD, making the survivability analysis tools easy to use.
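
To make Innovation #1 concrete, here is a minimal staging sketch in Python. The model functions, coefficients, and design variables are hypothetical stand-ins for illustration only, not the SwRI tools' API: a cheap low-fidelity model screens the design space, and only the surviving candidates receive the expensive high-fidelity analysis.

```python
# Minimal multi-fidelity screening sketch (illustrative only). low_fidelity()
# and high_fidelity() stand in for a fast surrogate and a detailed physics
# model; neither is the actual SwRI survivability code.

def low_fidelity(design):
    # Hypothetical empirical margin: positive means the design survives.
    return 0.08 * design["plate_thickness_mm"] - 0.25 * design["charge_kg"]

def high_fidelity(design):
    # Stand-in for a full finite-element blast computation (more accurate
    # and far more expensive in reality).
    return 0.075 * design["plate_thickness_mm"] - 0.24 * design["charge_kg"]

designs = [{"plate_thickness_mm": t, "charge_kg": 10.0}
           for t in range(10, 65, 5)]

# Stage 1: screen the whole design space with the cheap model.
candidates = [d for d in designs if low_fidelity(d) > 0.0]

# Stage 2: spend the expensive analysis only on the survivors.
scored = [(d, high_fidelity(d)) for d in candidates]
best_design, best_margin = max(scored, key=lambda pair: pair[1])
print("best design:", best_design, "margin:", round(best_margin, 3))
```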

In the DARPA AVM program, these tools went through an extensive beta test and a gamma test exercise by both commercial firms and engineering R&D laboratories. In those exercises, the SwRI team’s survivability tools received extensive praise, including:

1. “[Survivability tools] are much, much, much faster than the way we typically do things.”
2. “Weeks of work done in an hour” [referring specifically to the automesher, autowelder, and shader]
3. “Very impressed with the automation in blast and ballistics.”
4. “There is nothing else like it [ballistic Shotline Viewer].”


Figure 1. Images from SwRI team computations during DARPA AVM showing hull deformation due to blast and an automatically meshed vehicle hull with internal structural members.

As an example of automating an important behavior, consider the ability to handle welds and heat affected zones (HAZs). In the SwRI team software, this was completely automated: the software finds all finite elements that are in contact with a weld and places HAZ material properties into those elements. Figure 2 shows the bottom of a double-V hull, in one case without the heat affected zone included and in the other with it. There is a clear difference in the amount of damage and hull deflection. Accurately modeling the hull deformation requires these capabilities, which traditionally have been very labor intensive to include in a vehicle model prepared for analysis.
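
As a rough illustration of that element-tagging step (our sketch, not the SwRI implementation; the mesh and weld data structures are assumed), the logic might look like this in Python:

```python
# Illustrative sketch of automatic HAZ assignment (hypothetical data model).
import math

def distance_point_to_segment(p, a, b):
    """Euclidean distance from point p to segment ab (3-tuples)."""
    ab = [b[i] - a[i] for i in range(3)]
    ap = [p[i] - a[i] for i in range(3)]
    denom = sum(c * c for c in ab) or 1e-30  # guard zero-length welds
    t = max(0.0, min(1.0, sum(ap[i] * ab[i] for i in range(3)) / denom))
    closest = [a[i] + t * ab[i] for i in range(3)]
    return math.dist(p, closest)

def assign_haz(elements, welds, haz_width, haz_material):
    """Give HAZ properties to every element within haz_width of a weld.

    elements: list of dicts with 'centroid' (x, y, z) and 'material'
    welds:    list of (start_point, end_point) line segments
    """
    for elem in elements:
        near_weld = any(
            distance_point_to_segment(elem["centroid"], a, b) <= haz_width
            for a, b in welds
        )
        if near_weld:
            elem["material"] = haz_material  # softened HAZ properties
```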


Figure 2. Images of a blast computation on a conceptual hull without a heat affected zone (HAZ) (top) and with an HAZ (bottom), showing the importance of including the HAZ. The HAZs and the welds in these examples were automatically produced by the SwRI team survivability tools.

An additional feature of the SwRI team survivability tools was the development of uncertainty-based bounds on the blast response. Given the variability in blast events, these bounds are extremely helpful in identifying robust solutions. The bounds are obtained by assuming probability density functions (PDFs) for the main variables with variation or uncertainty in the blast problem: the charge density, energy, and geometric shape; the soil density and moisture content; and the depth of burial of the charge and the standoff to the bottom of the vehicle. With assumed distributions on these variables, the resulting probability density functions for the upward velocity, jump height, and a computed Dynamic Response Index (DRIz) spinal injury metric (with and without a blast seat with active mechanisms) are all computed. These PDFs allow the determination of a 95% bounding solution. A technique was then developed for rapidly determining the 95% bounding solution for similar blast cases without recomputing the PDFs each time, providing excellent nominal response values and bounds on the blast response (see Figure 3).
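
The bounding procedure can be pictured with a short Monte Carlo sketch. The distributions, parameter values, and the closed-form response function below are all invented for illustration; the actual tools use characterized PDFs and full blast computations:

```python
# Monte Carlo sketch of a 95% upper bound on a blast response (illustrative).
import numpy as np

rng = np.random.default_rng(seed=0)
n = 20_000

# Assumed (hypothetical) input distributions for the main blast variables.
charge_mass = rng.normal(10.0, 1.0, n)       # kg
soil_density = rng.normal(1700.0, 100.0, n)  # kg/m^3
burial_depth = rng.uniform(0.05, 0.20, n)    # m
standoff = rng.uniform(0.40, 0.50, n)        # m

def surrogate_jump_height(m, rho, d, s):
    # Stand-in response model, not a physics code: jump height grows with
    # charge mass and soil density, decays with burial depth and standoff.
    return 0.02 * m * (rho / 1700.0) * np.exp(-2.0 * d) / s

jump = surrogate_jump_height(charge_mass, soil_density, burial_depth, standoff)

print(f"nominal (median) jump height: {np.median(jump):.2f} m")
print(f"95% upper bound:              {np.percentile(jump, 95):.2f} m")
```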


Figure 3. Nominal and 95% upper bound for each plate response (jump height, maximum vertical velocity, DRIz, and DRIz_seat) for increasing charge mass for a test case.

These examples are specific details that add up to analysis tools that address the larger goal of quicker turnaround for ground vehicles that can provide crew protection against a variety of threats. We are proud to support our troops and to work to provide them the best protection possible. Historically, Texas has provided ground vehicles to the U.S. military, and we hope such manufacturing will occur in Texas in the future. Nonprofit research establishments such as ours (SwRI), whose mission is “benefiting government, industry and the public through innovative science and technology,” will continue to promote efforts to provide protection to individuals in threatening environments of any kind, both natural and manmade. I’m pleased that The Academy of Medicine, Engineering & Science of Texas recognized the importance of our efforts to understand impact and blast events and to provide protection in such events. The Edith and Peter O’Donnell Award in Technology Innovation in 2014 was great recognition of our work in protection systems over the years, from bulletproof vests to shielding for the International Space Station. The recognition invigorated our entire research team and is much appreciated.

The Edith and Peter O’Donnell Awards are unique awards that encourage, promote, and recognize Texas researchers, who are selected by the Texas-resident members of the National Academies and by the heads of research universities and organizations. These awards are highly regarded by the leadership of the various institutions and demonstrate that resources invested in various programs have been good investments. I know that Southwest Research Institute leadership was very excited by our O’Donnell Award in Technology Innovation, the first O’Donnell Award given to a San Antonio researcher. Further, O’Donnell Awards recognition brings the work of the recipients to a wider audience. Recognition of research demonstrates to professional organizations and funding agencies that the work is valued and has been reviewed by prestigious committees, and thus helps us quickly convey its importance and relevance.

Texas is a large state with a great deal of ongoing research, both basic and applied. Recognition of good research programs helps us advertise our work and attract funding and collaborators, both within and outside the state. Scientific and engineering research is an important component of the growing Texas economy. Recognizing innovation and cutting-edge technology advancements that occur in Texas laboratories, such as our work at Southwest Research Institute, helps build connections and increase industrial outreach, which strengthens the economy and promotes more growth. Texas and the nation benefit from the growth of high-technology positions and industry, and the Edith and Peter O’Donnell Awards help highlight science and technology successes and promote more innovation and investment.


Dr. James Walker is an institute scientist at Southwest Research Institute (SwRI), a nonprofit engineering research and development center based in San Antonio.

TAMEST Blog Series: 2014 O’Donnell Awards Recipients (1st Post by Dr. Zhifeng Ren)

In anticipation of the upcoming announcement of the 2015 Edith and Peter O’Donnell Awards recipients, we are highlighting the importance of our O’Donnell Awards program and its impact on the program’s past recipients in medicine, engineering, science, and technology innovation, as well as the importance of scientific research to Texas. We have invited 2014 O’Donnell Awards recipients to contribute a post to this special blog series.

The first post in this series was written by Dr. Zhifeng Ren, recipient of the 2014 O’Donnell Award in Science. Dr. Ren has made seminal contributions to five scientific fields: carbon nanotubes, thermoelectrics, zinc oxide nanowires, high-temperature superconductivity, and molecule delivery/sensing. He was the first to grow aligned carbon nanotube arrays at large scale, make nanostructured bulk thermoelectric materials with much-improved properties, and synthesize hierarchical zinc oxide nanowires.

View Dr. Ren’s presentation at the TAMEST 2014 Annual Conference.
View Dr. Ren’s portion of the 2014 Edith and Peter O’Donnell Awards tribute video.

The 2015 O’Donnell Awards recipients will be announced on Tuesday, December 9, 2014, through a video trailer on the TAMEST website.


Dr. Zhifeng Ren, Recipient of the 2014 O’Donnell Award in Science

By Zhifeng Ren, Ph.D.

Receiving the 2014 O’Donnell Award in Science was a great honor and an important reminder for me and everyone in my research group that good work will eventually be recognized. It has made us work even harder and driven us to aim for much more in the years to come.


Fig. 1. High transmittance and large stretchability of flexible transparent electrodes. (Top) High transmittance is shown by the clear letters below the electrode, and (bottom) the electrode is stretched at least 100%.

In just the 10 months since the awards were announced, we have published about 30 papers in peer-reviewed journals and filed 10 patent applications, all while continuing our work on high-performance thermoelectric materials and other devices for efficient thermal energy conversion. We have also started several other exciting programs: extremely stretchable, transparent conducting electrodes for potential applications in wearable optoelectronic devices; novel nanomaterials; and devices for drug delivery into and out of cells, which can be used to interrogate the activities inside cells and may ultimately provide a new method for killing cancer cells.

The O’Donnell Awards are an important acknowledgment of scientific and technological achievement in Texas. But the state still has a long way to go to reach its potential as a center for science and technology, and the economic benefits that would come with that.


Fig. 2a. Microstructure and thermoelectric properties of the newly developed thermoelectric material MgAgSb. This shows the nanoscale size of the grains.

Everyone knows that the United States has had the largest economy in the world for decades. The question is, why? The answer is that the United States has the most advanced science and technology, thanks to continuous government support for both basic research and practical technology programs, in addition to a good academic system and a stable political system. These programs have uncovered numerous basic scientific phenomena, produced many inventions, and educated many people over the last century. Talented people come from all over the world, drawn here to pursue their American dream.


Fig. 2b. Microstructure and thermoelectric properties of a newly developed thermoelectric material MgAgSb. This shows the thermoelectric figure-of-merit (left) and its energy conversion efficiency (right) in comparison with the state-of-the-art bismuth telluride.

In my own lab at the University of Houston, I have found financial support – both assistance for graduate students and funding for facilities – and, importantly, collaborations with colleagues to be crucial. It has been especially important to work with researchers from the UH Cullen College of Engineering; about one-third of my Ph.D. students come from the college, in the fields of mechanical engineering, materials science and engineering, electrical engineering, and chemical and biological engineering. These students and their advisors view the projects my group is carrying out from different angles, allowing us to solve challenging issues by bringing different approaches to the problems.


Fig. 3. Molecular extraction by spearing cells. (A) An external magnetic field drives multiple wall carbon nanotubes (MCNTs) toward a cell cultured on a polycarbonate filter. To indicate the molecular extraction, the cell is transfected for GFP overexpression beforehand. (B) MCNTs spear into the cell under magnetic force. (C) MCNTs spear through and out of the cell and extract GFP. GFP-carrying spears are collected in the pores of a polycarbonate filter. (D) GFP representing the intracellular signal molecules can be used for analysis of individual pores.

But even though the United States has been at the center of science and technology internationally for many years, Texas clearly has not been at the nation’s center of science and technology. That honor has gone to Massachusetts and California, which have the largest numbers of top research universities and probably the most technology-driven startups. Boston alone has seven of the nation’s top 50 research universities, and California has nine.

Texas, the second most populous state in the country, should put more funding into universities to boost existing programs and attract many more top scientists. When Texas catches up with Massachusetts and California, it will draw more talented people to Texas. They will make new discoveries and create new technologies, which will generate new jobs and, ultimately, spur a better future for Texas.

In summary, science and technology are key for Texas to become the economic center of the United States, but we are not there yet.


Dr. Zhifeng Ren is M.D. Anderson Chair Professor in the Department of Physics and a principal investigator at the Texas Center for Superconductivity at the University of Houston.

UT Dallas Faculty and Students Lead the Way in Exploration of the Brain

By David E. Daniel, Ph.D.

In the early 1990s, the federal government launched a 15-year program to map the human genome, and in the process revolutionized the way researchers conducted science. The Human Genome Project required the collaborative work of biologists, engineers, computer scientists, clinicians and more. It involved a hefty investment of research funding that, by some estimates, returned $141 for every dollar spent.

Now, Washington’s research establishment has issued a new challenge to the scientific community — the BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies). This bold idea — that we can develop ways to provide a real-time view of the working brain — is of great interest here at UT Dallas, which has long been dedicated to discovering the brain’s inner workings.

Last year alone, the National Institutes of Health awarded 14 UT Dallas faculty members a total of 23 new grants to research the brain. These projects are spread across the School of Behavioral and Brain Sciences, the School of Natural Sciences and Mathematics, and the Erik Jonsson School of Engineering and Computer Science.

These federal grants support research that will help our scientists and engineers better understand anxiety disorders, post-traumatic stress, aging of the brain and autism. They support efforts to develop new methods for delivering molecules across the blood-brain barrier.

Researchers here realized long ago that advancement was likely to come faster if experts across an array of academic specialties worked together, as reflected in the varied missions of the Center for BrainHealth, the Center for Vital Longevity, the Texas Biomedical Device Center and the Department of Bioengineering, as well as in partnerships among researchers at UT Southwestern Medical Center, UT Arlington and UT Dallas.

These researchers not only focus on their own quest for knowledge but also pay keen attention to training future generations of neuroscientists. Our undergraduate neuroscience program is still relatively young, first enrolling students in 1996. Enrollment has more than tripled in the past eight years. Our master’s program in applied cognition and neuroscience and the doctoral program in cognition and neuroscience have both steadily increased in size.

Grants from the NIH and other sources support faculty inquiry, and also bring students into the lab to gain hands-on research experiences. For example, Drs. Christa McIntyre-Rodriguez and Sven Kroener were awarded a grant with a provision that undergraduate students be trained as researchers to investigate the mechanisms behind anxiety disorders. Upwards of 30 undergraduate volunteers spend time in our larger neuroscience research labs. More than 80 are involved in work in the Texas Biomedical Device Center, according to a report given recently by the center’s director, Dr. Rob Rennaker. The valuable experience these budding researchers gain can provide a major advantage when applying to top graduate schools.

It remains to be seen what will be discovered through the nascent BRAIN Initiative and where those discoveries will lead. But we expect that within this generation of scientists and researchers working on the project there will be a significant number of important connections and discoveries here at UT Dallas, where we focus on creating the future.


David E. Daniel, Ph.D., is president of UT Dallas. He is a member of the National Academy of Engineering and past president of TAMEST.

Advancing Medicine through Nanotechnology: a Look at Houston Methodist Hospital

by Mauro Ferrari, Ph.D.

Houston Methodist Hospital is one of the biggest hospitals in Texas. Our Research Institute turns 10 this year and has made great strides in advancing medicine that focuses on getting effective treatments to our patients.

We have grown to 280 members and 1,400 credentialed researchers in our first 10 years. While this may seem small in comparison to the larger teaching hospitals, we are small by design. There are many excellent universities and institutions that excel at basic research, of course—it is the foundation of all science and technology. Our goal is to take the next step in helping our patients—building bridges from labs to the clinic. All our research is geared toward rapid application and begins with identifying our clinical needs. We perform some basic research in the spaces between scientific and clinical areas. Most of our work focuses on platforms like nanomedicine, information systems, and outcomes research that benefit multiple disciplines of medicine. And we partner these with what some have called a nirvana of applied research: expertise and strong support systems for clinical trials, small-scale clinical-grade manufacturing, and regulatory guidance for FDA approval.

Houston Methodist made the early choice to focus on a handful of emerging, exciting areas of applied medicine that, we believe, hold the most promise to transform the lives of our patients, and patients around the world.

One such area is nanomedicine, the development of safe and potent nanotechnologies for use in diagnosis and medical therapies. I began my own career in nanomedicine at Ohio State University, then moved my laboratory first to the UT Health Science Center at Houston and then to Houston Methodist in 2010. I served as special expert on nanotechnology at the National Cancer Institute (NCI) from 2003 to 2005, providing leadership in the formulation, refinement, and approval of the NCI’s Alliance for Nanotechnology in Cancer, currently the world’s largest program in medical nanotechnology.

I’ve been fortunate to work with principal investigators doing transformational work in nanomedicine at Houston Methodist, including Ennio Tasciotti, Ph.D., Tony Hu, Ph.D., Paolo Decuzzi, Ph.D., and Haifa Shen, Ph.D., and other excellent scientists. Their work is being applied to areas of medicine as diverse as rapid-diagnostic devices, drug delivery, regenerative medicine, and imaging. This work has attracted millions of dollars to Texas in public research funding from the National Institutes of Health and the U.S. Department of Defense, and the progress our researchers make is published every month in major, high-impact journals such as Nature, Nature Nanotechnology, American Chemical Society Nano, and the Proceedings of the National Academy of Sciences.

Why such interest in nanomedicine? Because it has already transformed other areas of our lives, including electronics, computing, and manufacturing, and because we have figured out how to make nanotechnology safe for people. The silicon-based nanoparticles being developed in our laboratories have a low toxicity profile in the body and are usually removed from the bloodstream in 24 to 48 hours. The nanoparticles find their targets and act precisely, allowing them to efficiently accomplish their intended functions, such as delivering life-saving drugs, killing cancer cells, or improving the resolution of diagnostic imaging.

The next step—now underway—is to show how nanomedicine-based therapies can improve upon traditional ones, and for this, collaboration is key. In Houston we have the Alliance for NanoHealth, established with the support of U.S. Rep. John Culberson, Gov. Rick Perry, and TAMEST co-founder and retired U.S. Sen. Kay Bailey Hutchison. The Alliance unites Houston’s top academic institutions working in the field of nanomedicine. I have had the privilege of leading the Alliance since 2005, succeeding Bob Bast, Jr., M.D., of The University of Texas MD Anderson Cancer Center, and the late Samuel Ward “Trip” Casscells III, M.D., of UT Health, a great man of exceptional vision, to whom all of Texas owes gratitude for his inspired work and leadership. Dozens of collaborative projects in nanomedicine have been spurred forward by the Alliance, and for that and other reasons, we believe it has been a huge success.

Nanomedicine’s secrets harbor great opportunities for Texas. Having participated in its creation, Texans are world leaders in the field. Our state stands to benefit greatly from its application to health care, science, and education, and from the economic opportunities it presents to entrepreneurs. Not everything must be big in Texas. Indeed, some of the things we’re famous for should be very, very small.


Mauro Ferrari, Ph.D., president and CEO of the Houston Methodist Research Institute and director of the Institute for Academic Medicine at Houston Methodist Hospital, is a regular speaker at TAMEST events, and is generally considered to be one of the founders of nanomedicine.

How High-Tech Computing Makes Everyday Life a Little Better

By Thomas J. Lange

We take them for granted, those products that help us start nearly every day. We shampoo and condition our hair, wash our skin, dry off with a fresh-smelling towel, shave, brush our teeth, fix our hair. Maybe we’ll also change the baby, feed the dog, start the dishwasher.

For more than seven generations, P&G has been inventing the products and building the brands aimed at making the morning’s start, and the day, just a little better. From the candle that lit the morning gloom in 1837, to the floating bar of Ivory soap—‘99 44/100% Pure’—to today, with brands like Pantene, Gillette, Crest, CoverGirl, Hugo Boss, Pampers, Charmin, Cascade, Tide….

What most people don’t know is that behind each of those daily experiences lies an amazing amount of Science, Engineering, and High Performance Computing.

P&G doesn’t usually talk about that because consumers care more that Charmin is soft and strong than how it got that way. So, instead of an engineer in a white coat standing in front of a specialized machine making Charmin, we create ads with Mr. Whipple, the friendly, quirky grocer, and today, cuddly cartoon bears.

From an Engineering perspective, this can leave the impression that everyday consumable goods are ‘low tech’—when the challenges our Scientists and Engineers face every day are very much Rocket-Science hard. You see, our job is to break engineering ‘contradictions,’ and that is quite a challenge. For rocket science, it’s controlling an explosion—something that is inherently uncontrollable.

For us, we need to make Charmin that dissolves when wet, but is strong AND soft when dry. Bounty must be absorbent, but VERY strong when wet. Pampers need to be absorbent—but fit and comfort babies like cloth. Laundry treatments need to remove stains, but protect fabrics—including cloth dyes—and be concentrated yet still easy to use. Containers should never leak, but open easily. Containers, when dropped, should not break—but use a bare minimum of plastic that also recycles. Most importantly, all these products must be a good value for improving daily life, not just affordable for use once in a while.

Tide PODS® is truly a “one-wash wonder,” enabled by sophisticated computer simulation technology. Bringing together three different liquids in one pod, separated by a film that must dissolve in cold water yet not dissolve from exposure to the contents, is quite complicated. We had to run sophisticated computer simulations of how the pod could be mass-produced without leaking—one splash droplet in the wrong place and we have a mess.

Diapers create another technological challenge. They need to fit like pants, but keep the baby and its surroundings dry and fit almost any size and shape. While there are thousands of baby shapes, no one can provide hundreds of sizes. Instead, we offer four to six options for the first two years of life. To get this right, we have teams working with computer models and simulations to identify what stretches where; how the waist band surrounds the tummy; and how leg holes will fit for both small and larger legs alike.

Finally, think about a shaving system that removes hair close to the skin but protects your skin. The physics of hair removal, what pulls, what cuts, how sharp or slick the blade needs to be, at what angle the blade needs to sit, is all precisely evaluated and determined by computer simulation.

Thomas Edison found thousands of things that did not work in his search for the materials that made the light bulb possible. We even have a name for that approach: ‘Edisonian investigation.’ For our products, we too are always ‘looking for a better way.’ High Performance Computing, and the Engineering and Science Modeling & Simulation it enables, makes possible hundreds of thousands of iterations on the computer in less time and at lower cost. That allows us to keep our brands’ promise that our great, great, great grandchildren will start their day a little better than we did today.

The Procter & Gamble Company supports a number of programs and projects aimed at putting high-tech Modeling & Simulation tools in the hands of small businesses to help accelerate innovation and U.S. manufacturing quality.


Thomas J. Lange, Director, R&D, Modeling & Simulation at The Procter & Gamble Company, was a keynote speaker at The Academy of Medicine, Engineering & Science of Texas’ (TAMEST’s) Annual Conference, January 16-17, 2014. The conference addressed the computational revolution in medicine, engineering, and science. Click to view a video of Lange’s keynote address.

Computational Science: The “Third Pillar” of Science

By Dr. Tinsley Oden and Dr. Omar Ghattas

A simple definition of science is this: the activity concerned with the systematic acquisition of knowledge. The English word is derived from scientia, which is Latin for “knowledge.” According to the Cambridge Dictionary, science is “the enterprise that builds and organizes knowledge in the form of testable explanations and predictions about the universe.” It is designed to reduce or eliminate ignorance by acquiring and understanding information and involves the mental comprehension of perceived truth or fact through cognition.

The question of how knowledge is acquired has been a subject of debate among philosophers of science for almost 3,000 years and, as far as is known, began in the writings of Plato and Socrates. After millennia of debate by the greatest minds of human history, two avenues to scientific knowledge emerged: 1) observation: experimental measurements and information gained by the human senses, guided by instruments; and 2) theory: inductive hypotheses, often framed in mathematical language. Observation and theory are thus the two classical pillars of science.


ICES researchers have simulated the behavior of the HIV RT protein to help design therapeutic drugs. Protein motions are displayed as multiple light blue ribbons. The green and dark blue spheres represent the DNA which the protein HIV RT synthesizes.

Is there a third pillar? Is there a new avenue to gain scientific knowledge and guide engineering design? The answer, in our minds, and in the minds of most contemporary scientists and engineers, is very clearly “Yes.” It is the new discipline of computational science: “the use of computational algorithms to translate mathematical models that represent how the physical universe behaves into computer models that predict the future and reconstruct the past, and that are used to simulate a broad spectrum of engineered products, processes, and systems.”

Computational science represents the single most important scientific advance in human history. It has transformed forever the way scientific discoveries are made and how engineering design and manufacturing are carried out. It lies at the intersection of mathematics, computer science, and the core disciplines of science and engineering.

What can computational science and engineering (CS&E) do that classical science cannot? It can look into the past with so-called inverse analysis to determine which past events caused observed phenomena. It can explore the effects of thousands of scenarios for or in lieu of actual experiments. It can be used to study events beyond the reach of contemporary experimental science. It can optimize procedures for the design of products and systems. It can even explore the consequences of a breakdown in models and theories.
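
As a toy example of the inverse-analysis idea, the following Python sketch recovers an unknown decay-rate parameter from noisy observations of a simple forward model; the model, data, and parameter values are invented for illustration:

```python
# Toy inverse problem: infer a decay rate k from noisy observations of the
# forward model y(t) = exp(-k t). Purely illustrative.
import numpy as np
from scipy.optimize import minimize_scalar

k_true = 1.3
t = np.linspace(0.0, 3.0, 40)
rng = np.random.default_rng(1)
observations = np.exp(-k_true * t) + rng.normal(0.0, 0.02, t.size)

def misfit(k):
    # Sum-of-squares mismatch between model prediction and data.
    return np.sum((np.exp(-k * t) - observations) ** 2)

result = minimize_scalar(misfit, bounds=(0.1, 5.0), method="bounded")
print(f"recovered k = {result.x:.3f} (true k = {k_true})")
```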


Researchers in the Center for Computational Visualization, directed by Chandrajit Bajaj, have been automating construction of nanoscopic-resolution models of the human brain and its activity. This picture shows an active chemical synapse between a (green) neuron axonal segment and a (yellow) dendritic spine head, surrounded by spherical neurotransmitters (blue, red, white) at different stages of ion-channel activation.

Indeed, it is difficult to conceive of a contemporary engineered product, process, or system that has not been designed by the modern tools of computational science. From power systems, chemical processes, civil infrastructure, automotive and aerospace vehicles, and advanced materials, to electronic devices, communication systems, medical devices and procedures, pharmaceutical drugs, manufacturing systems, and operational logistics, and many more—sophisticated models running on high performance computers are used as surrogates of reality to facilitate virtual design, control, planning, manufacture, and testing, resulting in faster, cheaper, and better products and processes.

Moreover, the prediction of the behavior of natural systems using computer models has led to vastly improved understanding of these systems, which range from severe weather, climate change, energy resources, and earthquakes, to protein folding, genomics, chemical processes, and virus spread, to supernovae and evolution of galaxies, to name but a few. Indeed, the traditional core disciplines of science and engineering must now be reviewed and reconstituted because what had once been out of reach by traditional science is now well within reach due to the advent of powerful new tools and approaches afforded by computational science.

This past year marked the 10th anniversary of the founding of the Institute for Computational Engineering and Sciences (ICES), the leading research institute in the world in CS&E with over 250 faculty, research scientists, and graduate students, located here in Austin, Texas. Moreover, the Texas Advanced Computing Center (TACC) in 2013 deployed Stampede, one of the most powerful supercomputers in the world. These two resources, and others, have placed The University of Texas at Austin at the forefront of research and education in computational science and engineering. The impacts on the region and the state are just beginning to be felt, and will accelerate rapidly in the coming years.


Dr. Tinsley Oden (director of the Institute for Computational Engineering and Sciences (ICES), associate vice president of research, and professor at UT Austin) and Dr. Omar Ghattas (director of the Center for Computational Geosciences at ICES and professor at UT Austin) will both be speakers at The Academy of Medicine, Engineering & Science of Texas’ (TAMEST’s) Annual Conference, January 16-17, 2014. The conference will address the computational revolution in medicine, engineering, and science.

Mapping the Human Brain with Supercomputers

by Henry Markram, Ph.D.


This image shows the reconstruction of a handful of brain cells. About halfway up are the spherical somata, containing the cell nuclei. The network of branches allows extensive interconnection between even a few cells, which gives the human brain highly efficient, massively parallel processing power. Indeed, a simulation of a few thousand cells appears like a very dense jungle, in which individual cells are virtually indistinguishable. In this image, the short branches you can see clustered around the somata are dendrites, and the long ones running up to the top of the image are axons. The vertical nature of the network of branches allows connections between brain cells located in different layers of the cerebral cortex.

The Human Brain Project (HBP) is working to unify our understanding of the human brain. We’re harnessing the power of supercomputers for problems we cannot solve with experiments alone—mapping the human brain and its diseases and using our map to develop even more powerful computers.

The potential of this work is highlighted by the fact that the HBP is funded by one of the largest scientific grants ever awarded by the European Commission. We bring together leading researchers in neuroscience, medicine and computing from 80 partner universities in the US, Canada, Europe and Asia.

Our main challenge is that the human brain is so extraordinarily complex that it’s very difficult to understand exactly how it’s put together and how it works. Each of our roughly 87 billion neurons is intricately connected to thousands of other neurons. Yet it is the precise arrangement of these connections, coupled with the sheer number of them, that gives us our unmatched mental abilities.

At the same time, it has never been more urgent for us to address the many health challenges related to problems of the brain. We are living longer lives than ever before, and that makes us more vulnerable to age-related brain diseases such as Alzheimer’s, dementia, and Parkinson’s.

Modern neuroscience is gathering more and more experimental data, but it still covers only a small fraction of the brain’s overall structure and functionality. The task is further complicated by the need to understand brains from males and females, different species, and healthy as well as sick individuals. In short, knowledge derived from experimental data still contains massive gaps, and we can’t accumulate new data quickly enough to transform this situation anytime soon, without some extra help.

This is where supercomputers come in. They allow us to construct and refine mathematical rules, derived from the limited experimental evidence we have, to predict with increasing accuracy the structure and functioning of sections of the brain.
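
As a deliberately tiny illustration of that idea (far simpler than anything the HBP runs), the following sketch integrates one classic mathematical rule for a single neuron, the leaky integrate-and-fire model, forward in time; all parameter values are textbook-style assumptions:

```python
# Leaky integrate-and-fire neuron: predicting behavior from a mathematical
# rule. Parameters are illustrative textbook values.
dt, t_end = 1e-4, 0.5                                        # s
tau_m, v_rest, v_th, v_reset = 0.02, -0.065, -0.050, -0.065  # s, V, V, V
r_m, i_input = 1e7, 2.0e-9                                   # ohm, A

v = v_rest
spike_times = []
for step in range(int(t_end / dt)):
    # dV/dt = (-(V - V_rest) + R_m * I) / tau_m
    v += dt * (-(v - v_rest) + r_m * i_input) / tau_m
    if v >= v_th:                    # threshold crossing -> spike
        spike_times.append(step * dt)
        v = v_reset                  # reset after firing

print(f"{len(spike_times)} spikes in {t_end} s")
```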

As the power of supercomputers increases, we can predict and simulate larger parts of the brain, more accurately. By 2020, we should have supercomputers powerful enough to attempt an initial reconstruction of the structural and functional organization of the whole human brain. Ultimately, we hope to apply disease-specific rules to build models of brain diseases, allowing us to understand them better and to speed up the development of new medicines. At the same time, our vastly expanded insight into brain function will help transform information technology, paving the way for more efficient and flexible computers.

By using supercomputing power to leverage neuroscience data, we can turn mapping the human brain into a tractable problem, laying the foundations for a unified theory of brain function, as well as revolutionary applications in healthcare and computer technology.


Henry Markram (Director of the Blue Brain Project, Coordinator of the Human Brain Project and Professor of Neuroscience at the École Polytechnique Fédérale de Lausanne) will be a keynote speaker at The Academy of Medicine, Engineering & Science of Texas’ (TAMEST’s) Annual Conference, January 16-17, 2014. The conference will address the computational revolution in medicine, engineering, and science.

Opening Doors for Young Scientists

By David E. Daniel, Ph.D.

UT Dallas faculty members are passionate about research, discovery and innovation. Their work in labs and in the field is not only vital to the pursuit of new knowledge, it is equally critical to the learning experience provided to students. This commitment to taking the time to help students get their hands dirty results in graduates who are capable of recognizing and seizing opportunity—to launch a new company, to make a scientific breakthrough, to change the world for the better.


Dr. Ray Baughman and Ph.D. student Carter Haines work together on nanotechnology research.

Consider Carter Haines BS’11. Carter came to UT Dallas the summer before his junior year at Plano East High School to participate in a program called NanoExplorers. Through NanoExplorers, qualified high school students gain early experience in conducting hands-on research related to nanotechnology, which examines how things work at the scale of atoms and molecules. The program was founded by Dr. Ray Baughman, director of the Alan G. MacDiarmid NanoTech Institute and holder of the Robert A. Welch Distinguished Chair in Chemistry. Dr. Baughman’s work in the world of the very small has a potentially huge impact in widespread applications from energy harvesting and storage to artificial muscles and super-strong fibers.

Dr. Baughman is a member of the National Academy of Engineering and one of our most distinguished and accomplished faculty members, but he hasn’t forgotten what it’s like to be young and unsure of how to get started in science. As a teenager, he rode his bike to the nearest university and without any introduction, knocked on the door of a professor’s lab. His initiative was rewarded with the opportunity to conduct laboratory research under the guidance of a university faculty member—an experience that inspired him to pursue his dreams. NanoExplorers is his way of opening a door to other potential young scientists.

Carter spent three high school summers as a NanoExplorer in Dr. Baughman’s lab. Then, as an undergraduate physics and neuroscience major at UT Dallas, he continued to work there, focusing on artificial muscles made from carbon nanotubes. When Carter began considering graduate schools, the choice was clear.

“What UT Dallas offers is unique—a lot of creativity and freedom,” says Carter, a current PhD student in materials science and engineering who has published six papers in high-impact journals and has three U.S. patent filings. He’s also managed to work in some important service, spending this past summer mentoring a new crop of high school students in NanoExplorers.

When we describe the impact UT Dallas is making, what we are really talking about is the work of people like Carter Haines and Ray Baughman. They set out not only to discover the unknown and turn it to humankind’s advantage, but also to encourage others to join them in that quest. Carter, Ray, and the people they teach and launch into the process of discovery are the reason research universities like UT Dallas matter. Though making the next big discovery is a major motivation, opening doors for our students is always our greatest mission and greatest point of pride.


David E. Daniel, Ph.D., is president of UT Dallas. He is a member of the National Academy of Engineering and past president of TAMEST.

Q&A with Dr. Joseph Beaman: 3-D Printing Pioneer

Dr. Joseph J. Beaman is the Earnest F. Gloyna Regents Chair in Engineering in the Department of Mechanical Engineering at The University of Texas at Austin and was elected to the National Academy of Engineering (NAE) in February 2013 for innovation, development, and commercialization of solid freeform fabrication and selective laser sintering, an early form of additive-layer manufacturing, also known as 3-D printing—currently one of the most popular topics in the tech space.


From left to right: Carl Deckard, Joe Beaman, and Paul Forderhase photographed on November 19, 2012. The image in the background is of selective laser sintered miniature UT Austin towers before removal from the powder bed.

Dr. Beaman coined the term Solid Freeform Fabrication in 1987, referring to a manufacturing technology that produces freeform solid objects directly from a computer model of the object, without part-specific tooling or knowledge. He was the first academic researcher in the field, beginning in 1985, and one of the most successful Solid Freeform Fabrication approaches, Selective Laser Sintering (SLS), was developed in his laboratory at UT Austin. Carl Deckard, a student working in Dr. Beaman’s lab, came up with the idea as an undergraduate and pursued it further while working on his master’s degree. He and Dr. Beaman, who was the Principal Investigator (PI), received a $30,000 grant from the National Science Foundation (NSF) to advance the technology and build a proof-of-concept machine.


This is an example of a complex selective laser sintered object. All of the interior objects were manufactured at one time from a tough nylon polymer on a SinterStation 2000 machine.

Dr. Beaman worked with graduate students, faculty, and industrial suppliers on the fundamental technology, including materials, laser scanning techniques, thermal control, mold-making techniques, direct metal fabrication, and biomedical applications. He was one of the founders of DTM Corporation (later acquired by 3-D Systems), which commercialized SLS technology. Dr. Beaman was in charge of advanced development for DTM during 1990–1992, when the company developed and marketed its first commercial systems.


The SinterStation 2000

Today, Dr. Beaman is considered a pioneer in what is popularly known as 3-D printing. His work with SLS-based technology is used by manufacturers globally to dramatically compress the manufacturing cycle for complex parts. Benefits include greatly reduced cost and time, and the capability to achieve, in a single operation, geometries that would otherwise require multiple operations or prove impossible to manufacture with standard techniques. The technology is broadly applicable to many fields, including architecture, industrial design, automotive and aerospace engineering, military applications, medicine/healthcare, civil engineering, fashion, and food.


SLS was originally intended by Deckard as a way to make wax models for a casting process known as investment casting or lost-wax casting. The part made of green wax (left) can be used as a casting pattern to make an identical part out of aluminum (right) or other metal. The off-white part (center) is made of polycarbonate and is not involved in the investment casting process.

According to Wohlers Associates, a leading 3-D printing consultancy, the worldwide market for 3-D printing and additive manufacturing products and services grew 28.6% (CAGR) to $2.204 billion in 2012. By 2017, Wohlers Associates believes the sale of 3-D printing products and services will approach $6 billion worldwide. Adoption continues to expand among consumers and professionals, with a wide range of price points and capabilities aligned with the needs of each group. UPS is testing the market for 3-D printing services in stores in San Diego and Washington, D.C. Approximately 80 percent of 3-D printing customers in the San Diego store were medical students interested in prototypes. NASA recently tested a rocket with a 3-D printed fuel injector; 3-D printing technology allowed the injector to be built from just two parts instead of 115. NASA is also working with the company Made in Space and plans to launch a new 3-D printer in June 2014 for use on the International Space Station.
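
As a back-of-the-envelope check (our arithmetic, not Wohlers Associates’), the growth rate implied by going from $2.204 billion in 2012 to roughly $6 billion in 2017 can be computed directly:

```python
# Implied compound annual growth rate (CAGR) for the Wohlers forecast.
start, end, years = 2.204, 6.0, 5  # $B in 2012, ~$B in 2017, elapsed years
cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")  # roughly 22% per year
```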

More information about Dr. Beaman and SLS is available here.

To provide more insights on the origins of SLS and the future of 3-D printing, we caught up with Dr. Beaman. He was kind enough to participate in the following Q&A.


When you and Carl Deckard began work in the mid-1980s on Selective Laser Sintering (SLS) what were you envisioning as the major industries and primary applications for commercial use of this technology?

Dr. Beaman: When Carl and I first discussed the concept, we were most concerned about how long it took to make the first one of almost anything. Carl’s interest was in parts that might come out of a standard machine shop, and I had worked in a machine shop while in high school, so I am sure this background colored our thoughts. Of course machine shops support many different types of industries that make mechanical parts.

What were the major challenges you and your team faced in the early days of DTM Corporation in developing SLS machines for use in commercial engineering/manufacturing environments?

Dr. Beaman: By its nature, SLS can make a wide variety of different shapes and therefore can be used in numerous applications. This is a blessing and a curse. There are many possible markets, but a small company cannot address them all, especially when the market has to be created. Deciding what to focus on was the biggest challenge. We decided on casting and prototyping as a start. By the way, we also looked at an inexpensive SLS machine, which Paul Forderhase briefly studied. Paul was a master’s student who built the second SLS machine at UT and later joined DTM.

The core SLS patents will expire in February 2014. There’s speculation this could allow Chinese manufacturers to enter the market and effectively lower the prices for quality 3-D printers, leading to the development of desktop SLS devices. What are your thoughts about the potential impact of these patents expiring?

Dr. Beaman: This is possible, but there are other specific patents that have longer life in this area and some of these may preclude a wholesale entrance by Chinese manufacturers in the general market.

3-D printing seems to be exploding in popularity with both consumers and industry with new applications being developed at a rapid pace. Many of these applications were not even imagined by the inventors of the technology. Where do you see the industry headed now almost 30 years since its inception?

Dr. Beaman: We have always split the market along two axes, accuracy and strength. I usually refer to low-accuracy, low-strength applications as “3-D Printing.” This is the consumer market that includes a growing number of consumer-focused products from companies such as Makerbot. This is the market that has exploded. Sometimes the market confuses the capabilities of 3-D printers with higher-performance additive manufacturing machines.

High strength, low accuracy yields machining forms, in which low-accuracy “machining forms” are produced for final machining. Aerojet was a company that was formed to address this market for titanium (Ti) machining forms but went out of business because of competition with numerical control (NC) machining.

High accuracy, low strength yields casting patterns, which is a viable market. This includes patterns for lost-wax casting processes or for foundry casting patterns or molds. Applications here include jewelry, medical instruments and devices, and mechanical parts or forming operations.

Moderate accuracy, moderate strength is the realm of rapid prototyping. Rapid prototyping is a strength for our SLS technology and has applications in a wide range of manufacturing industries.

The Holy Grail is high strength, high accuracy, which is true manufacturing. I see true additive manufacturing as a strong direction now, especially as an educated workforce develops that understands the design freedom allowed by additive manufacturing. This educated workforce will come about because of the 3-D printing market.

Are there new technologies under development that will further expand 3-D printing capabilities for both consumers and industrial design and manufacturing applications?

Dr. Beaman: A third axis is multiple materials. This would allow fabrication of system components such as parts with physically embedded electronics or structures with graded or discrete material interfaces. This will require improvements in CAD solid modeling software.

Will there continue to be market segmentation between complex industrial applications versus consumer applications for 3-D printing, or will this division begin to diminish as the technology evolves?

Dr. Beaman: For now, I see continued market segmentation unless someone comes up with a very inexpensive additive manufacturing machine that has great mechanical properties and accuracy.

Situational Awareness Key to Drought Management

By Danny D. Reible, Ph.D.

In 2011, Texas experienced the worst single-year drought in its history. Unfortunately, essentially all of Texas remains in a state of severe to exceptional drought. Many of the reservoirs remain at historically low levels, and the majority of Texas rivers exhibit flows well below normal (below the 25th percentile). Drought conditions will ultimately ease, but no one can say when or for how long.


Lake Travis on August 24, 2013.

So how do we manage this situation? We respond effectively by developing situational awareness of the drought and its potential consequences. We must build a resilient management system that can take advantage of water when it’s available but is also capable of maintaining critical needs, such as water for drinking, the economy, and the environment, in the face of a drought that may continue for one, two, or even five or more years. We have to project potential drought and water-use scenarios forward in time and use water prudently and cautiously, recognizing that there is a finite possibility that water availability will remain limited. I would suggest that we were slow to realize the potential consequences of drought as it strengthened its stranglehold on Texas in 2011. We made decisions that presumed rain would soon return, but despite some promising signs in 2012, we remain in drought.
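
A minimal sketch of that kind of forward projection, with entirely invented inflow statistics, storage, and release rules, might look like the following; real reservoir planning models are far more detailed:

```python
# Toy reservoir mass balance under drought scenarios (illustrative numbers).
import numpy as np

rng = np.random.default_rng(2)
n_scenarios, n_years = 10_000, 5
storage0 = 1_200_000   # starting storage, acre-feet (hypothetical)
evaporation = 300_000  # annual evaporation loss, acre-feet
release = 400_000      # planned annual release, acre-feet

# Assumed lognormal annual inflows; a drought is a run of low draws.
inflows = rng.lognormal(mean=np.log(500_000), sigma=0.6,
                        size=(n_scenarios, n_years))

storage = np.full(n_scenarios, storage0, dtype=float)
ran_dry = np.zeros(n_scenarios, dtype=bool)
for year in range(n_years):
    storage += inflows[:, year] - evaporation - release
    ran_dry |= storage <= 0.0
    storage = np.clip(storage, 0.0, None)

print(f"P(reservoir empties within {n_years} years) = {ran_dry.mean():.1%}")
```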

Among the decisions that were made was the release of 433,000 acre-feet of water from the Highland Lakes of Central Texas for irrigation of rice in South Texas. This is approximately 50 percent more than the average of the previous three years, and it occurred in what turned out to be the year with the lowest Highland Lakes inflows on record: inflows that were insufficient even to offset normal evaporation from the lakes. Thus, every drop of water released from the lakes in 2011 came directly from lake reserves. In hindsight, a more modest release could have maintained significant reserves in the lakes and could have allowed greater recreational, residential, and agricultural use of the water over the last two years. Instead, many businesses around the lakes have suffered, recreational use has dropped, residential water use has been curtailed, and agricultural use has been largely shut down.


Highland Lake inflows and agricultural releases (acre-feet) compared to Austin rainfall (inches).

While easier to see in hindsight, there were warning signs that, if acted upon, might have allowed greater reserves in the lakes and led to more water for all users in the last two years. Very little rainfall was observed in the watershed from February through April of 2011; Austin rainfall was less than 15 percent of normal levels during that period. Stream inflows were similarly affected. While any three-month period does not provide much of an indication of climate trends, greater situational awareness and prognostic models would have suggested that dramatic reductions in lake levels and reserves were possible if the low rainfall, low inflows, and high releases continued. It would have been important to implement a more cautious release plan that could have been updated as new information became available. The management plan in operation at the time had no such capability, but the resulting lake conditions illustrate how important it is to be able to adaptively manage the system to mitigate the potential effects of drought. Would releasing only enough water for a single rice crop in 2011 have made it possible to have rice crops in 2012 and 2013 while retaining enough water in the lakes for other uses? It is difficult to say, but such an outcome would likely have been preferred by both upstream and downstream water users.

Dr. Danny D. Reible is the Donovan Maddox Distinguished Engineering Chair at Texas Tech University. He served as program chair of the 2012 Texas Water Summit.