Physicists from around the country come to Georgia Tech for a recent machine learning conference. (Photo Benjamin Zhao)

School of Physics Professor Tamara Bogdanovic prepares to ask a question at the recent machine learning conference at Georgia Tech. (Photo Benjamin Zhao)

Matthew Golden, graduate student researcher in the School of Physics, presents at a recent machine learning conference at Georgia Tech. (Photo Benjamin Zhao)

The School of Physics’ new initiative to catalyze research using artificial intelligence (AI) and machine learning (ML) began October 16 with a conference at the Global Learning Center titled Revolutionizing Physics — Exploring Connections Between Physics and Machine Learning.

AI and ML have the spotlight right now in science, and the conference promises to be the first of many, says Feryal Özel, Professor and Chair of the School of Physics. 

“We were delighted to host the AI/ML in Physics conference and see the exciting rapid developments in this field,” Özel says. “The conference was a prominent launching point for the new AI/ML initiative we are starting in the School of Physics.”

That initiative includes hiring two tenure-track faculty members, who will benefit from substantial expertise and resources in artificial intelligence and machine learning that already exist in the Colleges of Sciences, Engineering, and Computing.

The conference attendees heard from colleagues about how the technologies were helping with research involving exoplanet searches, plasma physics experiments, and sifting through terabytes of data. They also learned that a rough search of keyword titles by Andreas Berlind, director of the National Science Foundation’s Division of Astronomical Sciences, showed that about a fifth of all current NSF grant proposals include components involving artificial intelligence and machine learning.

“That’s a lot,” Berlind told the audience. “It’s doubled in the last four years. It’s rapidly increasing.”

Berlind was one of three program officers from the NSF and NASA invited to the conference to give presentations on the funding landscape for AI/ML research in the physical sciences. 

“It’s tool development, the oldest story in human history,” said Germano Iannacchione, director of the NSF’s Division of Materials Research, who added that AI/ML tools “help us navigate very complex spaces — to augment and enhance our reasoning capabilities, and our pattern recognition capabilities.”

That sentiment was echoed by Dimitrios Psaltis, School of Physics professor and a co-organizer of the conference. 

“They usually say if you have a hammer, you see everything as a nail,” Psaltis said. “Just because we have a tool doesn't mean we're going to solve all the problems. So we're in the exploratory phase because we don't know yet which problems in physics machine learning will help us solve. Clearly it will help us solve some problems, because it's a brand new tool, and there are other instances when it will make zero contribution. And until we find out what those problems are, we're going to just explore everything.”

That means trying to find out if there is a place for the technologies in classical and modern physics, quantum mechanics, thermodynamics, optics, geophysics, cosmology, particle physics, and astrophysics, to name just a few branches of study.

Sanaz Vahidinia of NASA’s Astronomy and Astrophysics Research Grants told the attendees that her division was an early and enthusiastic adopter of AI and machine learning. She listed examples of the technologies assisting with gamma-ray astronomy and analyzing data from the Hubble and Kepler space telescopes. “AI and deep learning were very good at identifying patterns in Kepler data,” Vahidinia said. 

Several of the physicists’ presentations at the conference demonstrated pattern recognition and other AI and ML capabilities.

Alves’s presentation inspired another physicist attending the conference, Psaltis said. “One of our local colleagues, who's doing magnetic materials research, said, ‘Hey, I can apply the exact same thing in my field,’ which he had never thought about before. So we not only have cross-fertilization (of ideas) at the conference, but we’re also learning what works and what doesn't.”

More information on funding and grants is available from the National Science Foundation. Information on NASA grants is available from NASA.

News Contact

Writer: Renay San Miguel
Communications Officer II/Science Writer
College of Sciences
404-894-5209

Editor: Jess Hunt-Ralston

 

Meet CSE Profile Rafael Orozco

The start of the fall semester can be busy for most Georgia Tech students, but this is especially true for Rafael Orozco. The Ph.D. student in Computational Science and Engineering (CSE) is part of a research group that presented at a major conference in August and is now preparing to host a research meeting in November.

We used the lull between events, research, and classes to meet with Orozco and learn more about his background and interests in this Meet CSE profile.

Student: Rafael Orozco  

Research Interests: Medical Imaging; Seismic Imaging; Generative Models; Inverse Problems; Bayesian Inference; Uncertainty Quantification 

Hometown: Sonora, Mexico 

Tell us briefly about your educational background and how you came to Georgia Tech. 
I studied in Mexico through high school. Then, I did my first two years of undergrad at the University of Arizona and transferred to Bucknell University. I was attracted to Georgia Tech’s CSE program because it is a unique combination of domain science and computer science. It feels like I am both a programmer and a scientist.  

How did you first become interested in computer science and machine learning? 

In high school, I saw a video demonstration of a genetic algorithm on the internet and became interested in the technology. My high school in Mexico did not have a computer science class, but a teacher mentored me and helped me compete at the Mexican Informatics Olympiad. When I started at Arizona, I researched the behavior of clouds from a Bayesian perspective. Since then, my research interests have always involved using Bayesian techniques to infer unknowns.  

You mentioned your background a few times. Since it is National Hispanic Heritage Month, what does this observance mean to you? 

I am quite proud to be a part of this group. In Mexico and the U.S., fellow Hispanics have supported me and my pursuits, so I know firsthand of their kindness and resourcefulness. I think that Hispanic people welcome others, celebrating the joy our culture brings, and they appreciate that our country uses the opportunity to reflect on Hispanic history. 

You study in Professor Felix Herrmann’s Seismic Laboratory for Imaging and Modeling (SLIM) group. In your own words, what does this research group do? 

We develop techniques and software for imaging Earth’s subsurface structures. These range from highly performant partial differential equation solvers to randomized numerical linear algebra to generative artificial intelligence (AI) models.  

One of the driving goals of each software package we develop is that it needs to be scalable to real-world applications. This entails imaging seismic volumes that span cubic kilometers, typically represented by more than 100,000,000 simulation grid cells. In my medical applications, it means producing high-resolution images of human brains resolved to less than half a millimeter.  

The International Meeting for Applied Geoscience and Energy (IMAGE) is a recent conference where SLIM gave nine presentations. What research did you present here? 
The challenge of applying machine learning to seismic imaging is that there are no ground-truth examples of what the Earth’s interior looks like. While making high-quality reference images of human tissues for supervised machine learning is possible, no one can “cut open” the Earth to see exactly what it looks like.  

To address this challenge, I presented an algorithm that combines generative AI with an unsupervised training objective. We essentially trick the generative model into outputting full earth models by making it blind to which part of the Earth we are asking for. This is like when you take an exam where only a few questions will be graded, but you don’t know which ones, so you answer all the questions just in case.  
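The exam analogy maps onto a randomly masked training objective. The sketch below is a hypothetical, greatly simplified illustration (a made-up 1-D "earth" profile and a crude update rule standing in for a generative model), not SLIM's actual algorithm: because each step grades only a random subset of grid cells, the only way to do well on average is to get every cell right.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the subsurface: a 1-D profile of 100 grid cells.
true_earth = np.sin(np.linspace(0, 3 * np.pi, 100))

# The "model output" starts as a blank guess; a real generative model
# would produce this from seismic data.
prediction = np.zeros(100)

losses = []
for step in range(300):
    # Each step, only a random 20% of cells are "graded" -- the model
    # cannot know which, so minimizing the loss in expectation forces
    # it to answer every question on the exam.
    idx = rng.choice(100, size=20, replace=False)
    losses.append(np.mean((prediction[idx] - true_earth[idx]) ** 2))
    # Crude update toward the graded answers only.
    prediction[idx] += 0.5 * (true_earth[idx] - prediction[idx])
```

Even though no single step ever sees the whole profile, the full model is eventually recovered because the mask changes every iteration.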

While seismic imaging is the basis of SLIM research, there are other applications for the group’s work. Can you discuss more about this? 

The imaging techniques that the energy industry has used for decades to image Earth’s subsurface can be applied almost seamlessly to create medical images of structures beneath the surface of human tissue.  

Lately, we have been tackling the particularly difficult modality of using high frequency ultrasound to image through the human skull. In our recent paper, we are exploring a powerful combination between machine learning and physics-based methods that allows us to speed up imaging while adding uncertainty quantification.  
 
We presented the work at this year’s MIDL conference (Medical Imaging with Deep Learning) in July. The medical community was excited with our preliminary results and gave me valuable feedback on how we can help bring this technique closer to clinical viability. 

News Contact

Bryant Wine, Communications Officer
bryant.wine@cc.gatech.edu

Default Image: Research at Georgia Tech

A scientific machine learning (ML) expert at Georgia Tech is lending a hand in developing an app to identify and help Florida communities most at risk of flooding.

School of Computational Science and Engineering (CSE) Assistant Professor Peng Chen is co-principal investigator of a $1.5 million National Science Foundation grant to develop the CRIS-HAZARD system.

CRIS-HAZARD's strength derives from integrating geographic information and data mined from community input, like traffic camera videos and social media posts.  

This ability helps policymakers identify areas most vulnerable to flooding and address community needs. The app also predicts and assesses flooding in real time to connect victims with first responders and emergency managers.

“Successfully deploying CRIS-HAZARD will harness community knowledge through direct and indirect engagement efforts to inform decision-making,” Chen said. “It will connect individuals to policymakers and serve as a roadmap for helping the most vulnerable communities.”

Chen’s role in CRIS-HAZARD will be to develop new ML models for the app’s prediction capability. These assimilation models integrate the mined data with predictions from current hydrodynamic models.

Along with making an immediate impact in flood-prone coastal communities, Chen said these models could have broader applications in the future. These include models for improved hurricane prediction and management of water resources.

The models Chen will build for CRIS-HAZARD derive from past applications aimed at helping communities.

Chen has crafted similar models for monitoring and mitigating disease spread, including Covid-19. He has also worked on materials science projects to accelerate the design of metamaterials and self-assembly materials.

“Scientific machine learning is a very broad concept and can be applied to many different fields,” Chen said. “Our group looks at how to accelerate optimization, account for risk, and quantify uncertainty in these applications.”

Uncertainty in CRIS-HAZARD is what brings Chen to the project, headed by University of South Florida researchers. While the app’s novelty lies in its use of heterogeneous data, inferring predictions can be challenging since the data comes from different sources in varying formats. 

To overcome this, Chen intends to build new data assimilation models from scratch powered by deep neural networks (DNNs).

Along with their ability to find connections between heterogeneous data, DNNs are scalable and inexpensive. This beats the alternative of using supercomputers to make the same calculations.

DNNs are also fast: according to Chen, they can achieve speedups of hundreds of thousands of times over classical models.

This low cost and speed make it possible to run DNN-based simulations many times, which improves the reliability of real-time predictions once the DNNs are properly trained.

“The data may not be consistent or compatible since there are different models we’re trying to integrate, making prediction uncertain,” Chen said. “We can run these ML models many times to quantify the uncertainty and give a probability distribution or a range of predictions.”
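The "run many times" idea can be illustrated with a toy surrogate ensemble. Everything here is hypothetical (the linear "network," the weights, the rainfall input are all made up for illustration, not taken from CRIS-HAZARD); it only shows how many cheap model runs yield a prediction interval instead of a single number:

```python
import numpy as np

rng = np.random.default_rng(42)

def flood_surrogate(rainfall_mm, weights):
    """Stand-in for a trained DNN surrogate mapping rainfall to flood depth (m)."""
    slope, bias = weights
    return slope * rainfall_mm + bias

# An ensemble of surrogates, as if trained from different initializations
# or data subsets; here the weights are simply sampled around a nominal fit.
ensemble = [(0.02 + rng.normal(0, 0.002), 0.1 + rng.normal(0, 0.05))
            for _ in range(1000)]

rainfall = 120.0  # a made-up storm-scenario input, in mm
predictions = np.array([flood_surrogate(rainfall, w) for w in ensemble])

mean_depth = predictions.mean()
low, high = np.percentile(predictions, [5, 95])  # 90% prediction interval
```

Because each surrogate evaluation is orders of magnitude cheaper than a full hydrodynamic solve, thousands of runs are affordable, and the spread between `low` and `high` is the kind of probability range Chen describes.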

CRIS-HAZARD also exemplifies the power of collaboration across disciplines and universities. In this case, machine learning techniques reach across state boundaries to help people that are vulnerable to flooding or other natural disasters.

USF Professor Barnali Dixon leads the project with Associate Professor Yi Qiang, both geocomputation researchers in the School of Geosciences whose work incorporates data science and artificial intelligence.

Subhro Guhathakurta collaborates with Chen from Georgia Tech. Along with being a professor in the School of City & Regional Planning, Guhathakurta is director of Tech’s Master of Science in Urban Analytics program and the Center for Spatial Planning Analytics and Visualization.

News Contact

Bryant Wine, Communications Officer
bryant.wine@cc.gatech.edu

Wenjing Liao

This feature supports Georgia Tech's presence at the International Conference on Machine Learning, July 23-29 in Honolulu. 

Honolulu Highlights | ICML 2023
Students and faculty have been focused and energized in their efforts this week engaging with the international machine learning community at ICML. See some of those efforts, hear from students themselves in our video series, and read about their latest contributions in #AI.

Georgia Tech’s experts and larger research community are invested in a future where artificial intelligence (AI) solutions can benefit individuals and communities across our planet. Meet the machine learning maestros among Georgia Tech’s faculty at the International Conference on Machine Learning — July 23-29, 2023, in Honolulu — and learn about their work. The faculty in the main program are working with partners across many domains and industries to help invent powerful new ways for technology to benefit all our futures.

One of the experts in Honolulu is Wenjing Liao, an assistant professor in the School of Mathematics. In addition to machine learning, Liao's research interests include imaging, signal processing, and high dimensional data analysis.

More information about the Georgia Tech contingent at ICML, and about machine learning research at Georgia Tech, is available online.

News Contact

Renay San Miguel
Communications Officer II/Science Writer
College of Sciences
404-894-5209

 

Graduate students advised by SCL-affiliated faculty member Jianjun Shi.
Jianjun Shi, Carolyn J. Stewart Chair and Professor

Graduate students working under the guidance of SCL-affiliated faculty member Jianjun Shi have recently received well-deserved recognition for their accomplishments. The students' research focuses on applying machine learning and data analytics to advanced manufacturing.

Michael Biehler (advisor: Professor Jianjun Shi)

  • Mary G. and Joseph Natrella Scholarship, American Statistical Association (ASA) (2023)
  • Best Student Paper Award (Winner) Quality Control and Reliability Engineering (QCRE) Division, IISE (2023)
    • For the paper: M. Biehler, D. Lin, J. Shi (2023): “DETONATE: Nonlinear Dynamic Evolution Modeling of Time-dependent 3-dimensional Point Cloud Profiles,” IISE Transactions
  • Best Student Paper Award (Finalist) Data Analytics and Information Systems (DAIS) Division, IISE (2023)
    • For the paper: M. Biehler, A. Kulkarni, J. Li, J. Shi (2023+): “MULTI-MODAL: MULTI-fidelity, multi-modality 3D shape modeler,” submitted to IEEE Transactions on Automation Science and Engineering
  • Phillip J. and Delores A. Scott Graduate Student Health and Wellness Award, H. Milton Stewart School of Industrial and Systems Engineering, Georgia Tech (2023)
  • IHE-LeaD Fellow, Burroughs Wellcome Fund, Interdisciplinary and Health and Environment Leadership Development (2022-2023)

Alina Gorbunova (advisors: Professor Jianjun Shi and Professor Kamran Paynabar)

  • NSF Graduate Research Fellowship (2023)

 Shancong Mou (advisor: Professor Jianjun Shi)

  • Best Track Paper Award (Winner), Quality Control and Reliability Engineering (QCRE) Division, IISE (2023) 
    • For the paper: Mou, S., Gu, X., Cao, M., Bai, H., Huang, P., Shan, J., Shi, J.*, 2023 “RGI: Robust GAN-Inversion for Generic Pixel-wise Anomaly Detection and Mask-free Image Inpainting”, The International Conference on Learning Representations (ICLR 2023).
  • John S.W. Fargher Jr. Scholarship, IISE (2023)
  • Angela P. and Reed J. Baker Research Excellence Award, School of ISyE, Georgia Institute of Technology (2023)

 Zihan Zhang (advisors: Professor Jianjun Shi and Professor Kamran Paynabar)

  • Aerospace and Test Measurement Division Scholarship, ISA (2023)
  • ISA Scholarship, ISA (2023)
  • Gilbreth Memorial Scholarship, IISE (2023)
  • NCORE Student Scholar, National Conference on Race and Ethnicity in Higher Education (2023)

A stylized glacier (Selena Langner)
Alex Robel (Credit: Allison Carter)

Alex Robel is improving how computer models of melting ice sheets incorporate data from field expeditions and satellites by creating a new open-access software package — complete with state-of-the-art tools and paired with ice sheet models that anyone can use, even on a laptop or home computer.

Improving these models is critical: while melting ice sheets and glaciers are top contributors to sea level rise, there are still large uncertainties in sea level projections for 2100 and beyond.

“Part of the problem is that the way that many models have been coded in the past has not been conducive to using these kinds of tools,” Robel, an assistant professor in the School of Earth and Atmospheric Sciences, explains. “It's just very labor-intensive to set up these data assimilation tools — it usually involves someone refactoring the code over several years.”

“Our goal is to provide a tool that anyone in the field can use very easily without a lot of labor at the front end,” Robel says. “This project is really focused around developing the computational tools to make it easier for people who use ice sheet models to incorporate or inform them with the widest possible range of measurements from the ground, aircraft and satellites.”

Now, a $780,000 NSF CAREER grant will help him to do so. 

The National Science Foundation Faculty Early Career Development Award is a five-year funding mechanism designed to help promising researchers establish a personal foundation for a lifetime of leadership in their field. Known as CAREER awards, the grants are NSF’s most prestigious funding for untenured assistant professors.

“Ultimately,” Robel says, “this project will empower more people in the community to use these models and to use these models together with the observations that they're taking.”
 

Ice sheets remember

“Largely, what models do right now is they look at one point in time, and they try their best — at that one point in time — to get the model to match some types of observations as closely as possible,” Robel explains. “From there, they let the computer model simulate what it thinks that ice sheet will do in the future.”

In doing so, the models often assume that the ice sheet starts in a state of balance, neither gaining nor losing ice at the start of the simulation. The problem with this approach is that ice sheets change dynamically, responding to past events, even ones that happened centuries ago. “We know from models and from decades of theory that the natural response time scale of thick ice sheets is hundreds to thousands of years,” Robel adds.

By informing models with historical records, observations, and measurements, Robel hopes to improve their accuracy. “We have observations being made by satellites, aircraft, and field expeditions,” says Robel. “We also have historical accounts, and can go even further back in time by looking at geological observations or ice cores. These can tell us about the long history of ice sheets and how they've changed over hundreds or thousands of years.”

Robel’s team plans to use a set of techniques called data assimilation to adjust, or ‘nudge’, models. “These data assimilation techniques have been around for a really long time,” Robel explains. “For example, they’re critical to weather forecasting: every weather forecast that you see on your phone was ultimately the product of a weather model that used data assimilation to take many observations and apply them to a model simulation.”
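A minimal sketch of the "nudging" idea, using made-up toy dynamics rather than any actual ice sheet model: a deliberately biased model is pulled partway toward each noisy observation, so the assimilated run tracks reality far better than the free-running model.

```python
import numpy as np

rng = np.random.default_rng(1)

def step_model(state, dt=0.1):
    """Toy dynamics standing in for an ice-sheet (or weather) model.

    The model is deliberately biased: left alone, it relaxes toward 2.0.
    """
    return state + dt * (-0.5 * state + 1.0)

def nudge(state, observation, gain=0.3):
    """Pull the model state a fraction of the way toward the observation."""
    return state + gain * (observation - state)

truth = 3.0                      # the real system the observations sample
free_run = assimilated = 0.0
for _ in range(50):
    free_run = step_model(free_run)                 # model alone, no data
    obs = truth + rng.normal(0, 0.05)               # noisy measurement
    assimilated = nudge(step_model(assimilated), obs)
```

Real assimilation schemes (ensemble Kalman filters, variational methods) are more sophisticated, but the principle of blending model dynamics with observations at each step is the same.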

“The next part of the project is going to be incorporating this data assimilation capability into a cloud-based computational ice sheet model,” Robel says. “We are planning to build an open source software package in Python that can use this sort of data assimilation method with any kind of ice sheet model.”

Robel hopes it will expand accessibility. “Currently, it's very labor-intensive to set up these data assimilation tools, and while groups have done it, it usually involves someone re-coding and refactoring the code over several years.”

Building software for accessibility

Robel’s team will then apply their software package to a widely used model, which now has an online, browser-based version. “The reason why that is particularly useful is because the place where this model is running is also one of the largest community repositories for data in our field,” Robel says.

Called Ghub, this relatively new repository is designed to be a community-wide place for sharing data on glaciers and ice sheets. “Since this is also a place where the model is living, by adding this capability to this cloud-based model, we'll be able to directly use the data that's already living in the same place that the model is,” Robel explains. 

Users won’t need to download data, or have a high-speed computer to access and use the data or model. Researchers collecting data will be able to upload their data to the repository, and immediately see the impact of their observations on future ice sheet melt simulations. Field researchers could use the model to optimize their long-term research plans by seeing where collecting new data might be most critical for refining predictions.

“We really think that it is critical for everyone who's doing modeling of ice sheets to be doing this transient data simulation to make sure that our simulations across the field are all doing the best possible job to reproduce and match observations,” Robel says. While in the past, the time and labor involved in setting up the tools has been a barrier, “developing this particular tool will allow us to bring transient data assimilation to essentially the whole field.”

Bringing Real Data to Georgia’s K-12 Classrooms

The project's applications and user base extend beyond the scientific community, and Robel is already developing a K-12 curriculum on sea level rise in partnership with Georgia Tech CEISMC Researcher Jayma Koval. “The students analyze data from real tide gauges and use them to learn about statistics, while also learning about sea level rise using real data,” he explains.

Because the curriculum matches with state standards, teachers can download the curriculum, which is available for free online in partnership with the Southeast Coastal Ocean Observing Regional Association (SECOORA), and incorporate it into their preexisting lesson plans. “We worked with SECOORA to pilot a middle school curriculum in Atlanta and Savannah, and one of the things that we saw was that there are a lot of teachers outside of middle school who are requesting and downloading the curriculum because they want to teach their students about sea level rise, in particular in coastal areas,” Robel adds.

In Georgia, many high schools offer a data science class that is part of the state's computer science standards. “Now, we are partnering with a high school teacher to develop a second standards-aligned curriculum that is meant to be taught ideally in a data science class, computer class, or statistics class,” Robel says. “It can be taught as a module within that class, and it will be the more advanced version of the middle school sea level curriculum.”

The curriculum will guide students through using data analysis tools and coding to analyze real sea level datasets, while learning the science behind what causes variations in sea level, what causes sea level rise, and how to predict sea level changes. 
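The core statistical exercise, estimating a sea level trend from gauge readings, might look like this in a data science class. The numbers below are synthetic (a made-up 3.2 mm/yr trend plus noise), not real station data, which a classroom would instead pull from a source such as SECOORA:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic annual-mean tide gauge readings (mm relative to a datum).
years = np.arange(1990, 2021)
sea_level = 3.2 * (years - 1990) + rng.normal(0, 8, size=years.size)

# Least-squares trend line: the statistic students estimate from real data.
slope_mm_per_yr, intercept_mm = np.polyfit(years - 1990, sea_level, 1)

# Extrapolating the linear trend to 2050 -- a simplification that real
# sea level projections refine with physical models.
projection_2050_mm = slope_mm_per_yr * (2050 - 1990) + intercept_mm
```

Despite the noise, the fitted slope recovers the underlying trend, which is exactly the point of teaching statistics through real tide gauge data.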

“That gets students to think about computational modeling and how computational modeling is an important part of their lives, whether it's to get a weather forecast or play a computer game,” Robel adds. “Our goal is to get students to imagine how all these things are combined, while thinking about the way that we project future sea level rise.”

 

News Contact

Written by Selena Langner

Contact: Jess Hunt-Ralston

Xiuwei Zhang Group AWSOM 2023

The discovery of nucleic acids is a relatively recent event in the history of science, and there is still much to learn from the enigma that is the genetic code.

Advances in computing techniques, though, have ushered in a new age of understanding of the macromolecules that form life as we know it. Now, one Georgia Tech research group is receiving well-deserved accolades for its applications of data science and machine learning to single-cell omics research. 

Students studying under Xiuwei Zhang, an assistant professor in the School of Computational Science and Engineering (CSE), received awards in April at the Atlanta Workshop on Single-cell Omics (AWSOM 2023).

School of CSE Ph.D. student Ziqi Zhang received the best oral presentation award, while Mihir Bafna, an undergraduate student majoring in computer science, took the best poster prize.

Along with providing computational tools for biological researchers, the group’s papers presented at AWSOM 2023 could benefit broader populations, as the research could lead to improved disease detection and prevention. They can also provide a better understanding of the causes and treatments of cancer and a new ability to accurately simulate cellular processes.

“I am extremely proud of the entire research group and very thankful of their work and our teamwork within our lab,” said Xiuwei Zhang. “These awards are encouraging because it means we are on the right track of developing something that will contribute both to the biology community and the computational community.”

Ziqi Zhang presented the group’s findings of their deep learning framework called scDisInFact, which can carry out multiple key single cell RNA-sequencing (scRNA-seq) tasks all at once and outperform current models that focus on the same tasks individually.

The group successfully tested scDisInFact on simulated and real Covid-19 datasets, demonstrating applicability in future studies of other diseases.

Bafna’s poster introduced CLARIFY, a tool that connects biochemical signals occurring within a cell and intercellular communication molecules. Previously, the inter- and intra-cell signaling were often studied separately due to the complexity of each problem.

Oncology is one field that stands to benefit from CLARIFY. The tool helps researchers understand the interactions between tumor cells and immune cells in cancer microenvironments, which is crucial for the success of cancer immunotherapy.

At AWSOM 2023, the group presented a third paper on scMultiSim. This simulator generates data found in multi-modal single-cell experiments by modeling the various biological factors underlying the generated data. It enables quantitative evaluations of a wide range of computational methods in single-cell genomics, which has been a challenging problem due to the lack of ground-truth information in biology, Xiuwei Zhang said. 

“We want to answer certain basic questions in biology, like how did we get these different cell types like skin cells, bone cells, and blood cells,” she said. “If we understand how things work in normal and healthy cells, and compare that to the data of diseased cells, then we can find the key differences of those two and locate the genes, proteins, and other molecules that cause problems.”

Xiuwei Zhang’s group specializes in applying machine learning and optimization to the analysis of single-cell omics data and scRNA-seq methods. Their main interest is studying the mechanisms of cellular differentiation, the process by which young, immature cells mature and take on functional characteristics.

A growing and effective approach to research in molecular biology, scRNA-seq gives insight into the existence and behavior of different types of cells. This helps researchers better understand genetic disorders, detect mechanisms that cause tumors and cancer, and develop new treatments, cures, and drugs. 

If microenvironments filled with various macromolecules and genetic material are considered datasets, the need for researchers like Xiuwei Zhang and her group is obvious. These massive, complex datasets present both challenges and opportunities for the group experienced in computational and biological research.

Collaborating authors include School of CSE Ph.D. students Hechen Li and Michael Squires, School of Electrical and Computer Engineering Ph.D. student Xinye Zhao, Wallace H. Coulter Department of Biomedical Engineering Associate Professor Peng Qiu, and Xi Chen, an assistant professor at Southern University of Science and Technology in Shenzhen, China.

The group’s presentations at AWSOM 2023 exhibited how their work makes progress in biomedical research, as well as advancing scientific computing methods in data science, machine learning, and simulation.

scDisInFact is an optimization tool that can perform batch effect removal, condition-associated key gene detection, and perturbation prediction, made possible by considering the major variation factors in the data. Without considering all these factors, current models can only do these tasks individually, but scDisInFact can do each task better, and all at the same time.

CLARIFY delves into how cells employ genetic material to communicate internally, using gene regulatory networks (GRNs) and externally, called cell-cell interactions (CCIs). Many computational methods can infer GRNs and inference methods have been proposed for CCIs, but until CLARIFY, no method existed to infer GRNs and CCIs in the same model.

scMultiSim simulations perform closer to real-world conditions than current simulators that model only one or two biological factors. This helps researchers more realistically test their computational methods, which can guide directions for future method development.

For computer scientists, biologists, and non-academics alike, the advantage of interdisciplinary, collaborative research like that of Xiuwei Zhang’s group is its wide-reaching impact, advancing technology to improve the human condition.

“We’re exploring the possibilities that can be realized by advanced computational methods combined with cutting edge biotechnology,” said Xiuwei Zhang. “Since biotechnology keeps evolving very fast and we want to help push this even further by developing computational methods, together we will propel science forward.”

News Contact

Bryant Wine, Communications Officer
bryant.wine@cc.gatech.edu


Anton Bernshteyn is forging connections and creating a language to help computer scientists and mathematicians collaborate on new problems — in particular, bridging the gap between solvable, finite problems and more challenging, infinite problems. Now, an NSF CAREER grant will help him achieve that goal.

The National Science Foundation Faculty Early Career Development Award is a five-year grant designed to help promising researchers establish a foundation for a lifetime of leadership in their field. Known as CAREER awards, the grants are NSF’s most prestigious funding for untenured assistant professors.

Bernshteyn, an assistant professor in the School of Mathematics, will focus on “Developing a unified theory of descriptive combinatorics and local algorithms” — connecting concepts and work being done in two previously separate mathematical and computer science fields. “Surprisingly,” Bernshteyn says, “it turns out that these two areas are closely related, and that ideas and results from one can often be applied in the other.” 

“This relationship is going to benefit both areas tremendously,” Bernshteyn says. “It significantly increases the number of tools we can use.”

By pioneering this connection, Bernshteyn hopes to link the techniques mathematicians use to study infinite structures (like the dynamic, continuously evolving structures found in nature) with the algorithms computer scientists use to model large — but still finite — interconnected networks and systems (like a network of computers or cell phones).

“The final goal, for certain types of problems,” he continues, “is to take all these questions about complicated infinite objects and translate them into questions about finite structures, which are much easier to work with and have applications in practical large-scale computing.”

Creating a unified theory

It all started with a paper Bernshteyn wrote in 2020, which showed that mathematics and computer science could be used in tandem to develop powerful problem-solving techniques. Since the fields used different terminology, however, it soon became clear that a “dictionary” or a unified theory would need to be created to help specialists communicate and collaborate. Now that dictionary is being built, bringing together two previously distinct fields: distributed computing (a field of computer science) and descriptive set theory (a field of mathematics).

Computer scientists use distributed computing to study so-called “distributed systems,” which model extremely large networks — like the Internet — that involve millions of interconnected machines that are operating independently (for example, blockchain, social networks, streaming services, and cloud computing systems).

“Crucially, these systems are decentralized,” Bernshteyn says. “Although parts of the network can communicate with each other, each of them has limited information about the network’s overall structure and must make decisions based only on this limited information.” Distributed systems allow researchers to develop strategies — called distributed algorithms — that “enable solving difficult problems with as little knowledge of the structure of the entire network as possible,” he adds.
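The flavor of a distributed algorithm can be seen in a classic example: computing a maximal independent set, where every node decides its fate knowing only its own random draw and those of its immediate neighbors. The sketch below is illustrative only — it simulates the communication rounds of a Luby-style algorithm on a single machine, and the graph and function names are this example’s own, not anything from Bernshteyn’s work.

```python
import random

def luby_mis(adj, seed=0):
    """Simulate rounds of a Luby-style distributed maximal-independent-set
    algorithm. Each node acts only on local information: its own random
    priority and the priorities of its still-active neighbors."""
    rng = random.Random(seed)
    active = set(adj)   # nodes that have not yet decided
    mis = set()
    while active:
        # Each round, every active node draws a random priority.
        priority = {v: rng.random() for v in active}
        # A node joins the set if it beats all active neighbors (a purely local test).
        winners = {v for v in active
                   if all(priority[v] > priority[u]
                          for u in adj[v] if u in active)}
        mis |= winners
        # Winners and their neighbors drop out of the computation.
        active -= winners
        for v in winners:
            active -= set(adj[v])
    return mis

# A 5-cycle: each node sees only its two neighbors, yet the rounds
# converge to a set that is independent and maximal.
adj = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
result = luby_mis(adj, seed=1)
```

No node ever inspects the whole graph, which is exactly the constraint that makes such algorithms meaningful on networks too large — or, in the descriptive set theory setting, too infinite — to see all at once.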

At first, distributed algorithms appear entirely unrelated to the other area Bernshteyn’s work brings together: descriptive set theory, an area of pure mathematics concerned with infinite sets defined by “simple” mathematical formulas. 

“Sets that do not have such simple definitions typically have properties that make them unsuitable for applications in other areas of mathematics. For example, they are often non-measurable — meaning that it is impossible, even in principle, to determine their length, area, or volume,” Bernshteyn says.

Because undefinable sets are difficult to work with, descriptive set theory aims to understand which problems have “definable” — and therefore more widely applicable — solutions. Recently, a new subfield called descriptive combinatorics has emerged. “Descriptive combinatorics focuses specifically on problems inspired by the ways collections of discrete, individual objects can be organized,” Bernshteyn explains. “Although the field is quite young, it has already found a number of exciting applications in other areas of math.”

The key connection? Since the algorithms used by computer scientists in distributed computing are designed to perform well on extremely large networks, they can also be used by mathematicians interested in infinite problems.

Solving infinite problems

Infinite problems often occur in nature, and the field of descriptive combinatorics has been particularly successful in helping to understand dynamical systems: structures that evolve with time according to specified laws (such as the flow of water in a river or the movement of planets in the Solar System). “Most mathematicians work with continuous, infinite objects, and hence they may benefit from the insight contributed by descriptive set theory,” Bernshteyn adds.

However, while infinite problems are common, they are also notoriously difficult to solve. “In infinite problems, there is no software that can tell you if the problem is solvable or not. There are infinitely many things to try, so it is impossible to test all of them. But if we can make our problems finite, we can sometimes determine which ones can and cannot be solved efficiently,” Bernshteyn says. “We may be able to determine which combinatorial problems can be solved in the infinite setting and get an explicit solution.”

“It turns out that, with some work, it is possible to implement the algorithms used in distributed computing on infinite networks, providing definable solutions to various combinatorial problems,” Bernshteyn says. “Conversely, in certain limited settings it is possible to translate definable solutions to problems on infinite structures into efficient distributed algorithms — although this part of the story is yet to be fully understood.”

A new frontier

As a recently emerged field, descriptive combinatorics is rapidly evolving, putting Bernshteyn and his research on the cutting edge of discovery. “There’s this new communication between separate fields of math and computer science—this huge synergy right now—it’s incredibly exciting,” Bernshteyn says.

Introducing new researchers to descriptive combinatorics, especially graduate students, is another priority for Bernshteyn. His CAREER grant funds will be especially dedicated to training graduate students who might not have had prior exposure to descriptive set theory. Bernshteyn also aims to design a suite of materials, including textbooks, lecture notes, instructional videos, workshops, and courses, to support students and scholars as they enter this new field.

“There’s so much knowledge that’s been acquired,” Bernshteyn says. “There’s work being done by people within computer science, set theory, and so on. But researchers in these fields speak different languages, so to say, and a lot of effort needs to go into creating a way for them to understand each other. Unifying these fields will ultimately allow us to understand them all much better than we did before. Right now we’re only starting to glimpse what’s possible.”

News Contact

Written by Selena Langner

Default Image: Research at Georgia Tech

The American Society of Mechanical Engineers (ASME) has honored Georgia Tech aerospace engineering professor George Kardomateas with the Spirit of St. Louis Medal for exemplary work in the progress of aeronautics and astronautics. He is in great company as Daniel Guggenheim, Neil A. Armstrong, John E. Northrup, John W. Young (AE 1952), George W. Lewis, Charles S. Draper, Robert G. Lowey, Michael Collins, and the late Dewey Hodges have also received this premier medal. ASME will present Kardomateas with the medal at the International Mechanical Engineering Congress & Exposition in Columbus, Ohio, October 30-November 3, 2022.

Kardomateas has spent over thirty years improving aircraft from a structural standpoint. More specifically, he investigates ways to ensure that aerospace structures retain their structural integrity. He focuses on the branch of mechanics called fracture mechanics, which studies the conditions for the initiation and propagation of cracks and debonds. “Fracture mechanics and damage tolerance have been very successful in that, nowadays, airplanes don’t usually come down because of structural failure,” explained Kardomateas.
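A standard workhorse of the fracture mechanics and damage tolerance that Kardomateas describes is Paris’ law, which relates crack growth per load cycle to the stress-intensity range. The sketch below is a generic textbook illustration, not drawn from Kardomateas’s research: it numerically integrates Paris’ law, da/dN = C(ΔK)^m, assuming the simple geometry ΔK = Δσ√(πa) for a through-crack in a wide plate, with made-up illustrative constants.

```python
import math

def fatigue_life(a0, ac, C, m, delta_sigma, da=1e-5):
    """Estimate the number of load cycles N for a crack to grow from
    length a0 to critical length ac by integrating Paris' law
    da/dN = C * (ΔK)^m, with ΔK = Δσ * sqrt(pi * a)."""
    N, a = 0.0, a0
    while a < ac:
        delta_K = delta_sigma * math.sqrt(math.pi * a)  # stress-intensity range
        N += da / (C * delta_K ** m)  # cycles spent advancing the crack by da
        a += da
    return N

# Illustrative (not material-specific) numbers: stress in MPa, lengths in meters.
cycles = fatigue_life(a0=0.001, ac=0.01, C=1e-11, m=3.0, delta_sigma=100.0)
```

Calculations of this kind underpin damage-tolerant design: rather than assuming a structure is flaw-free, engineers assume a crack exists and verify it cannot grow to critical size within the inspection interval.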

He credits his lifelong scientific triumphs to his education in the United States and Greece, his collaboration with past and present colleagues at Georgia Tech, and the academic system in America. “The environment at Georgia Tech fosters collaboration and innovation. The higher education system provides opportunities through the collegial network in scientific forums where ideas can be exchanged with those inside and outside of your institution.” Former AE School professors, including the late Bob Carlson and George Simitses, inspired him as colleagues and also acted as mentors to him.

Kardomateas earned a diploma from the National Technical University of Athens in Greece and both his master’s and doctoral degrees from the Massachusetts Institute of Technology. In 1989, he joined the faculty of the School of Aerospace Engineering at Georgia Tech. He has authored three books: An Introduction to Fatigue in Metals and Composites, Structural and Failure Mechanics of Sandwich Composites, and Mechanics of Failure Mechanisms in Structures. He is also the editor of six volumes on the topic of failure mechanics of composite and sandwich structures, an associate editor of the Handbook of Damage Mechanics: Nano to Macro Scale for Materials and Structures, as well as the author of about 200 papers published in refereed journals or as parts of books.

In addition to his work at Georgia Tech, he has served the discipline in several capacities. The ASME Fellow has served as an Associate Editor of the Journal of Applied Mechanics and the AIAA Journal, as a Contributing Editor of the International Journal of Non-Linear Mechanics, and as a guest editor of the International Journal of Solids and Structures and the Journal of Mechanics of Materials and Structures. In addition, he has served as the technical chair of the 2014 ASME Congress, general chair of the 2015 ASME Congress, and the steering committee chair of the 2017 ASME Congress. He was the elected chairman of the Applied Mechanics Division Composites Committee and the program representative of the Aerospace Division Structures and Materials Committee. Kardomateas has also served on many other panels and committees, including as the Chair of the Daniel Guggenheim Medal Award Board and on the Organizing Committee of the sixth, seventh, tenth, and eleventh Institute for Advanced Composites Manufacturing’s International Conferences on Sandwich Structures; he has also served on external evaluation committees for many academic programs.

Currently, the medal winner is working on his next book that focuses on the fracture and fatigue of metallic and composite aerospace structures, which will include his latest research advances in the field.

News Contact

Monique Waddell


The world’s dependence on semiconductors came into sharp focus in 2021, when automotive manufacturing ground to a halt because of massive computer chip shortages, as Asian suppliers couldn’t keep up with demand for microelectronics: the miniaturized electronic circuits and components that drive everything from smartphones to new vehicle components to hypersonic weapons systems.

The culprit was global supply chain disruptions caused by the Covid-19 pandemic. The crisis has highlighted the pressing need for the U.S. to bolster its domestic semiconductor supply chains and industrial capacity after three decades of decline as a semiconductor producer. The U.S. share of global semiconductor fabrication has dropped to 12% today, compared to 37% in 1990, according to the Semiconductor Industry Association (SIA). In addition, the semiconductor industry today accounts for only 250,000 direct U.S. jobs.

As the country rebuilds its semiconductor infrastructure at home, Georgia Tech serves as a vital partner — training the microelectronics workforce, driving future microelectronics advances, and providing unique fabrication and packaging facilities for industry, academic, and government partners to develop and test new solutions.

“We’re one of the only universities that can support the whole microelectronics stack – from new materials and devices to packaging and systems,” said Madhavan Swaminathan, the John Pippin Chair in Microsystems Packaging in the School of Electrical and Computer Engineering and director of the 3D Systems Packaging Research Center.

Continue reading this research feature: Microelectronics Momentum Drives the Nation's Semiconductor Research


News Contact

Anne Wainscott-Sargent

Georgia Tech Research News

404-435-5784

asargent7@gatech.edu