headshot of Maryam Alavi

Generative AI tools have taken the world by storm. ChatGPT reached 100 million monthly users faster than any internet application in history. The potential benefits of efficiency and productivity gains for knowledge-intensive firms are clear, and companies in industries such as professional services, health care, and finance are investing billions in adopting the technologies.

But the benefits for individual knowledge workers can be less clear. When technology can do many tasks that only humans could do in the past, what does it mean for knowledge workers? Generative AI can and will automate some of the tasks of knowledge workers, but that doesn’t necessarily mean it will replace all of them. Generative AI can also help knowledge workers find more time to do meaningful work, and improve performance and productivity. The difference is in how you use the tools.

In this article, we aim to explain how to do that well. First, we help employees and managers understand ways that generative AI can support knowledge work. Second, we identify steps that managers can take to help employees realize the potential benefits.

What Is Knowledge Work?  

Knowledge work primarily involves cognitive processing of information to generate value-added outputs. It differs from manual labor in the materials used and the types of conversion processes involved. Knowledge work is typically dependent on advanced training and specialization in specific domains, gained over time through learning and experience. It includes both structured and unstructured tasks. Structured tasks are those with well-defined and well-understood inputs and outputs, as well as prespecified steps for converting inputs to outputs. Examples include payroll processing or scheduling meetings. Unstructured tasks are those where inputs, conversion procedures, or outputs are mostly ill-defined, underspecified, or unknown a priori. Examples include resolving interpersonal conflict, designing a product, or negotiating a salary.

Very few jobs are purely one or the other. Jobs consist of many tasks, some structured and others unstructured. Some tasks are necessary but repetitive. Some are more creative or interesting. Some can be done alone, while others require working with other people. Some are common to everything the worker does, while others arise only for exceptions. As a knowledge worker, your job, then, is to manage this complex set of tasks to achieve your goals.

Computers have traditionally been good at performing structured tasks, but there are many tasks that only humans can do. Generative AI is changing the game, moving the boundaries of what computers can do and shrinking the sphere of tasks that remain as purely human activity. While it can be worrisome to think about generative AI encroaching on knowledge work, we believe that the benefits can far outweigh the costs for most knowledge workers. But realizing the benefits requires taking action now to learn how to leverage generative AI in support of knowledge work.

Continue reading: How Generative AI Will Transform Knowledge Work

Reprinted from the Harvard Business Review, November 7, 2023.

  • Maryam Alavi is the Elizabeth D. & Thomas M. Holder Chair & Professor of IT Management, Scheller College of Business, Georgia Institute of Technology.
  • George Westerman is a Senior Lecturer at MIT Sloan School of Management and founder of the Global Opportunity Forum in MIT’s Office of Open Learning.

News Contact

Lorrie Burroughs

Physicists from around the country come to Georgia Tech for a recent machine learning conference. (Photo: Benjamin Zhao)

School of Physics Professor Tamara Bogdanovic prepares to ask a question at the recent machine learning conference at Georgia Tech. (Photo: Benjamin Zhao)

Matthew Golden, graduate student researcher in the School of Physics, presents at a recent machine learning conference at Georgia Tech. (Photo: Benjamin Zhao)

The School of Physics’ new initiative to catalyze research using artificial intelligence (AI) and machine learning (ML) began October 16 with a conference at the Global Learning Center titled Revolutionizing Physics — Exploring Connections Between Physics and Machine Learning.

AI and ML have the spotlight right now in science, and the conference promises to be the first of many, says Feryal Özel, Professor and Chair of the School of Physics. 

"We were delighted to host the AI/ML in Physics conference and see the exciting rapid developments in this field,” Özel says. “The conference was a prominent launching point for the new AI/ML initiative we are starting in the School of Physics."​ 

That initiative includes hiring two tenure-track faculty members, who will benefit from substantial expertise and resources in artificial intelligence and machine learning that already exist in the Colleges of Sciences, Engineering, and Computing.

The conference attendees heard from colleagues about how the technologies were helping with research involving exoplanet searches, plasma physics experiments, and culling through terabytes of data. They also learned that a rough search of keyword titles by Andreas Berlind, director of the National Science Foundation’s Division of Astronomical Sciences, showed that about a fifth of all current NSF grant proposals include components around artificial intelligence and machine learning.

“That’s a lot,” Berlind told the audience. “It’s doubled in the last four years. It’s rapidly increasing.”

Berlind was one of three program officers from the NSF and NASA invited to the conference to give presentations on the funding landscape for AI/ML research in the physical sciences. 

“It’s tool development, the oldest story in human history,” said Germano Iannacchione, director of the NSF’s Division of Materials Research, who added that AI/ML tools “help us navigate very complex spaces — to augment and enhance our reasoning capabilities, and our pattern recognition capabilities.”

That sentiment was echoed by Dimitrios Psaltis, School of Physics professor and a co-organizer of the conference. 

“They usually say if you have a hammer, you see everything as a nail,” Psaltis said. “Just because we have a tool doesn't mean we're going to solve all the problems. So we're in the exploratory phase because we don't know yet which problems in physics machine learning will help us solve. Clearly it will help us solve some problems, because it's a brand new tool, and there are other instances when it will make zero contribution. And until we find out what those problems are, we're going to just explore everything.”

That means trying to find out if there is a place for the technologies in classical and modern physics, quantum mechanics, thermodynamics, optics, geophysics, cosmology, particle physics, and astrophysics, to name just a few branches of study.

Sanaz Vahidinia of NASA’s Astronomy and Astrophysics Research Grants told the attendees that her division was an early and enthusiastic adopter of AI and machine learning. She listed examples of the technologies assisting with gamma-ray astronomy and analyzing data from the Hubble and Kepler space telescopes. “AI and deep learning were very good at identifying patterns in Kepler data,” Vahidinia said. 

Some of the physicist presentations at the conference showed the pattern recognition capabilities and other features of AI and ML:

Alves’s presentation inspired another physicist attending the conference, Psaltis said. “One of our local colleagues, who's doing magnetic materials research, said, ‘Hey, I can apply the exact same thing in my field,’ which he had never thought about before. So we not only have cross-fertilization (of ideas) at the conference, but we’re also learning what works and what doesn't.”

More information on funding and grants is available from the National Science Foundation and NASA.

News Contact

Writer: Renay San Miguel
Communications Officer II/Science Writer
College of Sciences
404-894-5209

Editor: Jess Hunt-Ralston

 

A sample of a DUCKY polymer membrane researchers created to perform the initial separation of crude oils using significantly less energy. (Photo: Candler Hobbs)

A new kind of polymer membrane created by researchers at Georgia Tech could reshape how refineries process crude oil, dramatically reducing the energy and water required while extracting even more useful materials.

The so-called DUCKY polymers — more on the unusual name in a minute — are reported Oct. 16 in Nature Materials. And they’re just the beginning for the team of Georgia Tech chemists, chemical engineers, and materials scientists. They also have created artificial intelligence tools to predict the performance of these kinds of polymer membranes, which could accelerate development of new ones.

The implications are stark: the initial separation of crude oil components is responsible for roughly 1% of energy used across the globe. What’s more, the membrane separation technology the researchers are developing could have several uses, from biofuels and biodegradable plastics to pulp and paper products.

“We're establishing concepts here that we can then use with different molecules or polymers, but we apply them to crude oil because that's the most challenging target right now,” said M.G. Finn, professor and James A. Carlos Family Chair in the School of Chemistry and Biochemistry.

Read the full story on the College of Engineering website.

News Contact

Joshua Stewart
College of Engineering

Meet CSE Profile Rafael Orozco

The start of the fall semester can be busy for most Georgia Tech students, but this is especially true for Rafael Orozco. The Ph.D. student in Computational Science and Engineering (CSE) is part of a research group that presented at a major conference in August and is now preparing to host a research meeting in November.

We used the lull between events, research, and classes to meet with Orozco and learn more about his background and interests in this Meet CSE profile.

Student: Rafael Orozco  

Research Interests: Medical Imaging; Seismic Imaging; Generative Models; Inverse Problems; Bayesian Inference; Uncertainty Quantification 

Hometown: Sonora, Mexico 

Tell us briefly about your educational background and how you came to Georgia Tech. 
I studied in Mexico through high school. Then, I did my first two years of undergrad at the University of Arizona and transferred to Bucknell University. I was attracted to Georgia Tech’s CSE program because it is a unique combination of domain science and computer science. It feels like I am both a programmer and a scientist.  

How did you first become interested in computer science and machine learning? 

In high school, I saw a video demonstration of a genetic algorithm on the internet and became interested in the technology. My high school in Mexico did not have a computer science class, but a teacher mentored me and helped me compete at the Mexican Informatics Olympiad. When I started at Arizona, I researched the behavior of clouds from a Bayesian perspective. Since then, my research interests have always involved using Bayesian techniques to infer unknowns.  

You mentioned your background a few times. Since it is National Hispanic Heritage Month, what does this observance mean to you? 

I am quite proud to be a part of this group. In Mexico and the U.S., fellow Hispanics have supported me and my pursuits, so I know firsthand of their kindness and resourcefulness. I think that Hispanic people welcome others, celebrating the joy our culture brings, and they appreciate that our country uses the opportunity to reflect on Hispanic history. 

You study in Professor Felix Herrmann’s Seismic Laboratory for Imaging and Modeling (SLIM) group. In your own words, what does this research group do? 

We develop techniques and software for imaging Earth’s subsurface structures. These range from highly performant partial differential equation solvers to randomized numerical algebra to generative artificial intelligence (AI) models.  

One of the driving goals of each software package we develop is that it needs to be scalable to real-world applications. This entails imaging seismic areas that can be cubic kilometers in volume, typically represented by more than 100,000,000 simulation grid cells. In my medical applications, we work with high-resolution images of human brains that can be resolved to less than half a millimeter.

The International Meeting for Applied Geoscience and Energy (IMAGE) is a recent conference where SLIM gave nine presentations. What research did you present here? 
The challenge of applying machine learning to seismic imaging is that there are no examples of what the earth looks like. While making high quality reference images of human tissues for supervised machine learning is possible, no one can “cut open” the earth to understand exactly what it looks like.  

To address this challenge, I presented an algorithm that combines generative AI with an unsupervised training objective. We essentially trick the generative model into outputting full earth models by making it blind to which part of the Earth we are asking for. This is like when you take an exam where only a few questions will be graded, but you don’t know which ones, so you answer all the questions just in case.  
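As a rough illustration of that exam analogy (a hypothetical sketch, not the SLIM group's actual code or data), the snippet below scores a model's output only on a randomly chosen subset of points that the model cannot see in advance, so the only safe strategy is to produce a sensible answer everywhere.

```python
# Hypothetical sketch of an "exam-style" masked objective: the loss is evaluated
# only on a hidden, random subset of locations, unknown to the model.
import numpy as np

rng = np.random.default_rng(0)

def masked_loss(prediction, reference, graded_fraction=0.2):
    """Mean squared error computed only at randomly 'graded' points."""
    mask = rng.random(prediction.shape) < graded_fraction  # hidden from the model
    return np.mean((prediction[mask] - reference[mask]) ** 2)

# Toy 1-D stand-in for an "earth model": model output vs. sparse reference values.
prediction = rng.normal(size=1000)
reference = rng.normal(size=1000)
print(masked_loss(prediction, reference))
```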

While seismic imaging is the basis of SLIM research, there are other applications for the group’s work. Can you discuss more about this? 

The imaging techniques that the energy industry has been using for decades to image Earth’s subsurface can be applied almost seamlessly to create medical images of human tissue.

Lately, we have been tackling the particularly difficult modality of using high-frequency ultrasound to image through the human skull. In our recent paper, we explore a powerful combination of machine learning and physics-based methods that allows us to speed up imaging while adding uncertainty quantification.
 
We presented the work at this year’s MIDL conference (Medical Imaging with Deep Learning) in July. The medical community was excited about our preliminary results and gave me valuable feedback on how we can help bring this technique closer to clinical viability.

News Contact

Bryant Wine, Communications Officer
bryant.wine@cc.gatech.edu


A scientific machine learning (ML) expert at Georgia Tech is lending a hand in developing an app to identify and help Florida communities most at risk of flooding.

School of Computational Science and Engineering (CSE) Assistant Professor Peng Chen is co-principal investigator of a $1.5 million National Science Foundation grant to develop the CRIS-HAZARD system.

CRIS-HAZARD‘s strength derives from integrating geographic information and data mined from community input, like traffic camera videos and social media posts.  

This ability helps policymakers identify areas most vulnerable to flooding and address community needs. The app also predicts and assesses flooding in real time to connect victims with first responders and emergency managers.

“Successfully deploying CRIS-HAZARD will harness community knowledge through direct and indirect engagement efforts to inform decision-making,” Chen said. “It will connect individuals to policymakers and serve as a roadmap for helping the most vulnerable communities.”

Chen’s role in CRIS-HAZARD will be to develop new ML models for the app’s prediction capability. These assimilation models integrate the mined data with predictions from current hydrodynamic models.

Along with making an immediate impact in flood-prone coastal communities, Chen said these models could have broader applications in the future. These include models for improved hurricane prediction and management of water resources.

The models Chen will build for CRIS-HAZARD derive from past applications aimed at helping communities.

Chen has crafted similar models for monitoring and mitigating disease spread, including Covid-19. He has also worked on materials science projects to accelerate the design of metamaterials and self-assembly materials.

“Scientific machine learning is a very broad concept and can be applied to many different fields,” Chen said. “Our group looks at how to accelerate optimization, account for risk, and quantify uncertainty in these applications.”

Uncertainty in CRIS-HAZARD is what brings Chen to the project, headed by University of South Florida researchers. While the app’s novelty lies in its use of heterogeneous data, inferring predictions can be challenging since the data comes from different sources in varying formats.

To overcome this, Chen intends to build new data assimilation models from scratch powered by deep neural networks (DNNs).

Along with their ability to find connections between heterogeneous data, DNNs are scalable and inexpensive. This beats the alternative of using supercomputers to make the same calculations.

DNNs are also fast and can significantly reduce computational time. According to Chen, DNNs can be hundreds of thousands of times faster than classical models.

Their low cost and speed make it possible to run DNN-based simulations many times, which improves the reliability of real-time predictions once the DNNs are properly trained.

“The data may not be consistent or compatible since there are different models we’re trying to integrate, making prediction uncertain,” Chen said. “We can run these ML models many times to quantify the uncertainty and give a probability distribution or a range of predictions.”
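Chen's description of running ML models many times to get a probability distribution can be illustrated with a small, hypothetical sketch; the data, model sizes, and ensemble approach below are illustrative assumptions, not the CRIS-HAZARD code. The idea is to train several small neural-network surrogates on resampled data and read the spread of their predictions as a rough uncertainty estimate.

```python
# Minimal sketch: an ensemble of small neural-network surrogates whose
# prediction spread serves as an uncertainty estimate (toy data only).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy stand-in for flood observations: inputs x, noisy water levels y.
x = rng.uniform(0, 10, size=(500, 1))
y = np.sin(x).ravel() + 0.1 * rng.normal(size=500)

# Train each surrogate on a bootstrap resample of the data.
ensemble = []
for seed in range(5):
    idx = rng.integers(0, len(x), len(x))
    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=seed)
    model.fit(x[idx], y[idx])
    ensemble.append(model)

# Fast repeated evaluation: mean prediction plus a spread that quantifies uncertainty.
x_new = np.linspace(0, 10, 50).reshape(-1, 1)
preds = np.stack([m.predict(x_new) for m in ensemble])
mean, spread = preds.mean(axis=0), preds.std(axis=0)
print(mean[:5], spread[:5])
```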

CRIS-HAZARD also exemplifies the power of collaboration across disciplines and universities. In this case, machine learning techniques reach across state boundaries to help people that are vulnerable to flooding or other natural disasters.

USF Professor Barnali Dixon leads the project with Associate Professor Yi Qiang; both are geocomputation researchers in the School of Geosciences whose work incorporates data science and artificial intelligence.

Subhro Guhathakurta collaborates with Chen from Georgia Tech. Along with being a professor in the School of City & Regional Planning, Guhathakurta is director of Tech’s Master of Science in Urban Analytics program and the Center for Spatial Planning Analytics and Visualization.

News Contact

Bryant Wine, Communications Officer
bryant.wine@cc.gatech.edu

A stylized glacier (Selena Langner)
Alex Robel (Credit: Allison Carter)

Alex Robel is improving how computer models of melting ice sheets incorporate data from field expeditions and satellites by creating a new open-access software package — complete with state-of-the-art tools and paired with ice sheet models that anyone can use, even on a laptop or home computer.

Improving these models is critical: while melting ice sheets and glaciers are top contributors to sea level rise, there are still large uncertainties in sea level projections at 2100 and beyond.

“Part of the problem is that the way that many models have been coded in the past has not been conducive to using these kinds of tools,” Robel, an assistant professor in the School of Earth and Atmospheric Sciences, explains. “It's just very labor-intensive to set up these data assimilation tools — it usually involves someone refactoring the code over several years.”

“Our goal is to provide a tool that anyone in the field can use very easily without a lot of labor at the front end,” Robel says. “This project is really focused around developing the computational tools to make it easier for people who use ice sheet models to incorporate or inform them with the widest possible range of measurements from the ground, aircraft and satellites.”

Now, a $780,000 NSF CAREER grant will help him to do so. 

The National Science Foundation Faculty Early Career Development Award is a five-year funding mechanism designed to help promising researchers establish a personal foundation for a lifetime of leadership in their field. Known as CAREER awards, the grants are NSF’s most prestigious funding for untenured assistant professors.

“Ultimately,” Robel says, “this project will empower more people in the community to use these models and to use these models together with the observations that they're taking.”
 

Ice sheets remember

“Largely, what models do right now is they look at one point in time, and they try their best — at that one point in time — to get the model to match some types of observations as closely as possible,” Robel explains. “From there, they let the computer model simulate what it thinks that ice sheet will do in the future.”

In doing so, the models often assume that the ice sheet starts in a state of balance, and that it is neither gaining nor losing ice at the start of the simulation. The problem with this approach is that ice sheets dynamically change, responding to past events — even ones that have happened centuries ago. “We know from models and from decades of theory that the natural response time scale of thick ice sheets is hundreds to thousands of years,” Robel adds.

By informing models with historical records, observations, and measurements, Robel hopes to improve their accuracy. “We have observations being made by satellites, aircraft, and field expeditions,” says Robel. “We also have historical accounts, and can go even further back in time by looking at geological observations or ice cores. These can tell us about the long history of ice sheets and how they've changed over hundreds or thousands of years.”

Robel’s team plans to use a set of techniques called data assimilation to adjust, or ‘nudge’, models. “These data assimilation techniques have been around for a really long time,” Robel explains. “For example, they’re critical to weather forecasting: every weather forecast that you see on your phone was ultimately the product of a weather model that used data assimilation to take many observations and apply them to a model simulation.”
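As a loose illustration of the "nudging" idea, the sketch below relaxes a toy model state toward a few observations at the points where data exist; the grid, observation values, and nudging strength are made-up placeholders, not the project's software.

```python
# Minimal sketch of a nudging-style data-assimilation step (illustrative only).
import numpy as np

def nudge(state, observations, obs_indices, strength=0.1):
    """Relax the model state toward observations at the observed grid points."""
    nudged = state.copy()
    nudged[obs_indices] += strength * (observations - state[obs_indices])
    return nudged

# Toy ice-thickness profile on a 1-D grid, plus sparse "observations".
thickness = np.full(100, 1000.0)                 # model first guess: 1000 m everywhere
obs_indices = np.array([10, 50, 90])             # a few observed locations
observations = np.array([950.0, 1020.0, 980.0])  # e.g. from satellite altimetry

for _ in range(50):  # repeated nudging pulls the state toward the data
    thickness = nudge(thickness, observations, obs_indices)

print(thickness[obs_indices])  # now close to the observed values
```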

“The next part of the project is going to be incorporating this data assimilation capability into a cloud-based computational ice sheet model,” Robel says. “We are planning to build an open source software package in Python that can use this sort of data assimilation method with any kind of ice sheet model.”

Robel hopes it will expand accessibility. “Currently, it's very labor-intensive to set up these data assimilation tools, and while groups have done it, it usually involves someone re-coding and refactoring the code over several years.”

Building software for accessibility

Robel’s team will then apply their software package to a widely used model, which now has an online, browser-based version. “The reason why that is particularly useful is because the place where this model is running is also one of the largest community repositories for data in our field,” Robel says.

Called Ghub, this relatively new repository is designed to be a community-wide place for sharing data on glaciers and ice sheets. “Since this is also a place where the model is living, by adding this capability to this cloud-based model, we'll be able to directly use the data that's already living in the same place that the model is,” Robel explains. 

Users won’t need to download data, or have a high-speed computer to access and use the data or model. Researchers collecting data will be able to upload their data to the repository, and immediately see the impact of their observations on future ice sheet melt simulations. Field researchers could use the model to optimize their long-term research plans by seeing where collecting new data might be most critical for refining predictions.

“We really think that it is critical for everyone who's doing modeling of ice sheets to be doing this transient data simulation to make sure that our simulations across the field are all doing the best possible job to reproduce and match observations,” Robel says. While in the past, the time and labor involved in setting up the tools has been a barrier, “developing this particular tool will allow us to bring transient data assimilation to essentially the whole field.”

Bringing Real Data to Georgia’s K-12 Classrooms

The broad applications and user base extend beyond the scientific community, and Robel is already developing a K-12 curriculum on sea level rise in partnership with Georgia Tech CEISMC researcher Jayma Koval. “The students analyze data from real tide gauges and use them to learn about statistics, while also learning about sea level rise using real data,” he explains.

Because the curriculum matches with state standards, teachers can download the curriculum, which is available for free online in partnership with the Southeast Coastal Ocean Observing Regional Association (SECOORA), and incorporate it into their preexisting lesson plans. “We worked with SECOORA to pilot a middle school curriculum in Atlanta and Savannah, and one of the things that we saw was that there are a lot of teachers outside of middle school who are requesting and downloading the curriculum because they want to teach their students about sea level rise, in particular in coastal areas,” Robel adds.

In Georgia, many high schools offer a data science class that is part of the state’s computer science standards. “Now, we are partnering with a high school teacher to develop a second standards-aligned curriculum that is meant to be taught ideally in a data science class, computer class or statistics class,” Robel says. “It can be taught as a module within that class and it will be the more advanced version of the middle school sea level curriculum.”

The curriculum will guide students through using data analysis tools and coding to analyze real sea level data sets, while learning the science behind what causes variations in sea level, what causes sea level rise, and how to predict sea level changes.
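A minimal sketch of the kind of exercise the curriculum describes might look like the following; the tide-gauge numbers here are synthetic placeholders, not SECOORA data or the curriculum's actual materials.

```python
# Toy exercise: fit a linear trend to tide-gauge readings to estimate a
# local rate of sea level rise.
import numpy as np

years = np.arange(1990, 2021)
rng = np.random.default_rng(1)
# Synthetic gauge data: a slow rise plus year-to-year variation (millimeters).
sea_level_mm = 3.2 * (years - 1990) + rng.normal(0, 15, size=years.size)

slope_mm_per_year, intercept = np.polyfit(years, sea_level_mm, 1)
print(f"Estimated trend: {slope_mm_per_year:.1f} mm per year")
```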

“That gets students to think about computational modeling and how computational modeling is an important part of their lives, whether it's to get a weather forecast or play a computer game,” Robel adds. “Our goal is to get students to imagine how all these things are combined, while thinking about the way that we project future sea level rise.”

 

News Contact

Written by Selena Langner

Contact: Jess Hunt-Ralston

Xiuwei Zhang Group AWSOM 2023

The discovery of nucleic acids is a relatively recent event in the history of science, and there is still much to learn from the enigma that is the genetic code.

Advances in computing techniques, though, have ushered in a new age of understanding the macromolecules that form life as we know it. Now, one Georgia Tech research group is receiving well-deserved accolades for its applications of data science and machine learning to single-cell omics research.

Students studying under Xiuwei Zhang, an assistant professor in the School of Computational Science and Engineering (CSE), received awards in April at the Atlanta Workshop on Single-cell Omics (AWSOM 2023).

School of CSE Ph.D. student Ziqi Zhang received the best oral presentation award, while Mihir Bafna, an undergraduate student majoring in computer science, took the best poster prize.

Along with providing computational tools for biological researchers, the group’s papers presented at AWSOM 2023 could benefit broader populations, as the research could lead to improved disease detection and prevention. It could also provide a better understanding of the causes and treatments of cancer and a new ability to accurately simulate cellular processes.

“I am extremely proud of the entire research group and very thankful for their work and our teamwork within our lab,” said Xiuwei Zhang. “These awards are encouraging because it means we are on the right track, developing something that will contribute both to the biology community and the computational community.”

Ziqi Zhang presented the group’s findings of their deep learning framework called scDisInFact, which can carry out multiple key single cell RNA-sequencing (scRNA-seq) tasks all at once and outperform current models that focus on the same tasks individually.

The group successfully tested scDisInFact on simulated and real Covid-19 datasets, demonstrating applicability in future studies of other diseases.

Bafna’s poster introduced CLARIFY, a tool that connects biochemical signals occurring within a cell and intercellular communication molecules. Previously, the inter- and intra-cell signaling were often studied separately due to the complexity of each problem.

Oncology is one field that stands to benefit from CLARIFY. The tool helps researchers understand the interactions between tumor cells and immune cells in cancer microenvironments, which is crucial to the success of cancer immunotherapy.

At AWSOM 2023, the group presented a third paper on scMultiSim. This simulator generates data found in multi-modal single-cell experiments through modeling various biological factors underlying the generated data. It enables quantitative evaluations of a wide range of computational methods in single-cell genomics. That has been a challenging problem due to lack of ground truth information in biology, Xiuwei Zhang said. 

“We want to answer certain basic questions in biology, like how did we get these different cell types like skin cells, bone cells, and blood cells,” she said. “If we understand how things work in normal and healthy cells, and compare that to the data of diseased cells, then we can find the key differences of those two and locate the genes, proteins, and other molecules that cause problems.”

Xiuwei Zhang’s group specializes in applying machine learning and optimization to the analysis of single-cell omics data and scRNA-seq methods. Their main interest is studying the mechanisms of cellular differentiation, the process by which young, immature cells mature and take on functional characteristics.

A growing and effective approach to research in molecular biology, scRNA-seq gives insight into the existence and behavior of different types of cells. This helps researchers better understand genetic disorders, detect mechanisms that cause tumors and cancer, and develop new treatments, cures, and drugs.

If microenvironments filled with various macromolecules and genetic material are considered datasets, the need for researchers like Xiuwei Zhang and her group is obvious. These massive, complex datasets present both challenges and opportunities for the group experienced in computational and biological research.

Collaborating authors include School of CSE Ph.D. students Hechen Li and Michael Squires, School of Electrical and Computer Engineering Ph.D. student Xinye Zhao, Wallace H. Coulter Department of Biomedical Engineering Associate Professor Peng Qiu, and Xi Chen, an assistant professor at Southern University of Science and Technology in Shenzhen, China.

The group’s presentations at AWSOM 2023 exhibited how their work makes progress in biomedical research, as well as advancing scientific computing methods in data science, machine learning, and simulation.

scDisInFact is an optimization tool that can perform batch effect removal, condition-associated key gene detection, and perturbation prediction, made possible by considering the major variation factors in the data. Without considering all these factors, current models can only do these tasks individually, but scDisInFact can do each task better and all at the same time.

CLARIFY delves into how cells employ genetic material to communicate internally, using gene regulatory networks (GRNs) and externally, called cell-cell interactions (CCIs). Many computational methods can infer GRNs and inference methods have been proposed for CCIs, but until CLARIFY, no method existed to infer GRNs and CCIs in the same model.

scMultiSim simulations perform closer to real-world conditions than current simulators that model only one or two biological factors. This helps researchers more realistically test their computational methods, which can guide directions for future method development.

For computer scientists, biologists, and non-academics alike, the advantage of interdisciplinary and collaborative research like that of Xiuwei Zhang’s group is its wide-reaching impact, advancing technology to improve the human condition.

“We’re exploring the possibilities that can be realized by advanced computational methods combined with cutting edge biotechnology,” said Xiuwei Zhang. “Since biotechnology keeps evolving very fast and we want to help push this even further by developing computational methods, together we will propel science forward.”

News Contact

Bryant Wine, Communications Officer
bryant.wine@cc.gatech.edu

Chuck Zhang

Chuck Zhang, GTMI faculty member and the Harold E. Smalley Professor in the H. Milton Stewart School of Industrial and Systems Engineering, is one of five faculty members who will help grow the College of Engineering’s work in high-impact cyber-physical systems security (CPSS) as new Cybersecurity Fellows.

Fellows represent expertise in a variety of areas of CPSS, which addresses risks where cyber and physical worlds intersect. That includes the Internet of Things (IoT), industrial systems, smart grids, medical devices, autonomous vehicles, robotics, and more.

“As devices, systems, and the world continue to become more connected, cyber-related threats that were traditionally limited to the digital domain have made their way to physical systems,” said Raheem Beyah, dean of the College, Southern Company Chair, and a cybersecurity expert. “The College of Engineering has world-renowned cybersecurity and artificial intelligence researchers. This new cohort will continue to expand the College’s breadth of expertise and leadership in CPSS.”

The three-year fellowship was made possible by a gift from Kyle Seymour, a 1982 mechanical engineering graduate who retired as president and CEO of S&C Electric Company in 2020. Seymour wanted to help increase cybersecurity-related research and instruction within the College.

School chairs nominated potential fellows, who were evaluated and selected by a committee of senior cybersecurity researchers and College leaders. 


View the new Cybersecurity Fellows >>

News Contact

Walter Rich

Concept diagram showing satellite capturing and deorbiting a spent rocket fuselage.

Top-down, slow motion view of hands tying a traditional fishing net knot

One hand holding a net of thin black cord in the middle. The net is draped over the person's other hand, below.

Image courtesy of Georgia Tech Research Institute.

Diagram showing concept of active space debris removal. The system is launched from earth and maneuvers to intercept a spent rocket fuselage. It then separates into four components with a net stretched between them. The net wraps around the fuselage, capturing it, and the entire system deorbits safely.

Lisa Marks standing in front of a closed door. The door features a net pattern and the title, "The Algorithmic Craft Lab."


Lisa Marks is launching the ancient craft of fishing villages into space vehicle design. Her work adapting traditional textile handcraft to modern problems created a unique opportunity to collaborate on cleaning up space debris.

According to NASA's Orbital Debris Program Office (ODPO), space debris jeopardizes future space projects. Large objects like rocket bodies and non-functional satellites are the source of fragmentation debris.

The ODPO website says removal of even five of the highest-risk objects per year could stabilize the low Earth orbit debris environment.

A research team with members from the Georgia Tech Research Institute, the Aerospace Systems Design Laboratory, and the Space Systems Design Laboratory has developed a concept using a net to capture and de-orbit large debris.

A mutual connection at Tech's GVU recommended that the team speak to Lisa Marks, assistant professor in the School of Industrial Design, based on her work combining traditional textile handcraft with new materials and methods.

Putting Textiles in Space Requires Creative Expertise

“There’s a lot of different projects on space debris happening all around the world,” Marks said, “and there’ve been a few concept papers talking about using a net.”

“But all the drawings of the net are basic concepts, just a square with a few hatches through it. No one has figured out what that net might be.”

Marks researches ways to combine traditional textile handcraft with algorithmic modeling. “I specialize in analyzing the shape of every stitch and how we can use that stitch differently. Can we create new patterns through coding, or make it larger and out of wood?”

“It allows me to think really creatively about how we can use different textiles.”

This innovative, exploratory approach is a natural fit to create a net for a job no one has ever done. “There's a lot of technical considerations with this,” Marks said.

“It must pack incredibly small, weigh very little, and still be strong enough to capture and drag a rocket fuselage. There are considerations just for a material to exist in space. It needs to have low UV reactivity, low off-gassing.”

“We need to understand every single little aspect of each of these techniques in order to do this.”

Static Nets Catch Fish; Slippery Nets Catch Rockets

Marks is working with Teflon, using the same knots used for fishing nets, but the non-traditional material means the nets work differently than fishing nets, she said. “These knots are made to be static, because you don’t want fish to get through the nets. But because Teflon is so slippery, the knots move around.”

“I think it will help the net’s strength, because the net will deform around irregular shapes before it breaks. What makes it unsuitable for fishing and annoying to work with becomes a huge benefit for what we need it to do.”

Some traditional handcraft techniques are dying out, and Marks sees projects like this as a reason preserving these techniques is important. “We don’t know what problems we’re going to have to solve in the future, and these crafts can be used in really surprising ways.”

“I would not have thought, ‘Netted filet lace, that’s how we’re going to solve a space problem!’ But if we lose this type of lace, we can’t solve space problems with it.”

A blue image of interconnected nodes
A portrait of Anton Bernshteyn. He is standing in front of a chalkboard that is covered with mathematical equations.

Anton Bernshteyn is forging connections and creating a language to help computer scientists and mathematicians collaborate on new problems — in particular, bridging the gap between solvable, finite problems and more challenging, infinite problems. Now, an NSF CAREER grant will help him achieve that goal.

The National Science Foundation Faculty Early Career Development Award is a five-year grant designed to help promising researchers establish a foundation for a lifetime of leadership in their field. Known as CAREER awards, the grants are NSF’s most prestigious funding for untenured assistant professors.

Bernshteyn, an assistant professor in the School of Mathematics, will focus on “Developing a unified theory of descriptive combinatorics and local algorithms” — connecting concepts and work being done in two previously separate mathematical and computer science fields. “Surprisingly,” Bernshteyn says, “it turns out that these two areas are closely related, and that ideas and results from one can often be applied in the other.” 

“This relationship is going to benefit both areas tremendously,” Bernshteyn says. “It significantly increases the number of tools we can use.”

By pioneering this connection, Bernshteyn hopes to link the techniques mathematicians use to study infinite structures (like the dynamic, continuously evolving structures found in nature) with the algorithms computer scientists use to model large – but still limited – interconnected networks and systems (like a network of computers or cell phones).

“The final goal, for certain types of problems,” he continues, “is to take all these questions about complicated infinite objects and translate them into questions about finite structures, which are much easier to work with and have applications in practical large-scale computing.”

Creating a unified theory

It all started with a paper Bernshteyn wrote in 2020, which showed that mathematics and computer science could be used in tandem to develop powerful problem-solving techniques. Since the fields used different terminology, however, it soon became clear that a “dictionary” or a unified theory would need to be created to help specialists communicate and collaborate. Now that dictionary is being built, bringing together two previously-distinct fields: distributed computing (a field of computer science), and descriptive set theory (a field of mathematics). 

Computer scientists use distributed computing to study so-called “distributed systems,” which model extremely large networks — like the Internet — that involve millions of interconnected machines that are operating independently (for example, blockchain, social networks, streaming services, and cloud computing systems).

“Crucially, these systems are decentralized,” Bernshteyn says. “Although parts of the network can communicate with each other, each of them has limited information about the network’s overall structure and must make decisions based only on this limited information.” Distributed systems allow researchers to develop strategies — called distributed algorithms — that “enable solving difficult problems with as little knowledge of the structure of the entire network as possible,” he adds.
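As a loose, sequential illustration of that local-information constraint (not an algorithm from the grant), the sketch below colors a small toy network so that each node's choice depends only on what its immediate neighbors have already chosen, never on the whole graph.

```python
# Toy illustration: each node picks a color using only its neighbors' choices.
import random

# A small hypothetical network: node -> set of neighbors.
graph = {
    0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 4}, 3: {1, 4}, 4: {2, 3},
}

colors = {}
for node in random.sample(list(graph), len(graph)):          # nodes act in some order
    seen = {colors[n] for n in graph[node] if n in colors}   # only neighbors' colors
    colors[node] = next(c for c in range(len(graph)) if c not in seen)

# No two adjacent nodes share a color, even though no node ever saw the full network.
assert all(colors[u] != colors[v] for u in graph for v in graph[u])
print(colors)
```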

At first, distributed algorithms appear entirely unrelated to the other area Bernshteyn’s work brings together: descriptive set theory, an area of pure mathematics concerned with infinite sets defined by “simple” mathematical formulas. 

“Sets that do not have such simple definitions typically have properties that make them unsuitable for applications in other areas of mathematics. For example, they are often non-measurable – meaning that it is impossible, even in principle, to determine their length, area, or volume," Bernshteyn says.

Because undefinable sets are difficult to work with, descriptive set theory aims to understand which problems have “definable”— and therefore more widely applicable— solutions. Recently, a new subfield called descriptive combinatorics has emerged. “Descriptive combinatorics focuses specifically on problems inspired by the ways collections of discrete, individual objects can be organized,” Bernshteyn explains. “Although the field is quite young, it has already found a number of exciting applications in other areas of math.”

The key connection? Since the algorithms used by computer scientists in distributed computing are designed to perform well on extremely large networks, they can also be used by mathematicians interested in infinite problems.

Solving infinite problems

Infinite problems often occur in nature, and the field of descriptive combinatorics has been particularly successful in helping to understand dynamical systems: structures that evolve with time according to specified laws (such as the flow of water in a river or the movement of planets in the Solar System). “Most mathematicians work with continuous, infinite objects, and hence they may benefit from the insight contributed by descriptive set theory,” Bernshteyn adds.

However, while infinite problems are common, they are also notoriously difficult to solve. “In infinite problems, there is no software that can tell you if the problem is solvable or not. There are infinitely many things to try, so it is impossible to test all of them. But if we can make our problems finite, we can sometimes determine which ones can and cannot be solved efficiently,” Bernshteyn says. “We may be able to determine which combinatorial problems can be solved in the infinite setting and get an explicit solution.”

“It turns out that, with some work, it is possible to implement the algorithms used in distributed computing on infinite networks, providing definable solutions to various combinatorial problems,” Bernshteyn says. “Conversely, in certain limited settings it is possible to translate definable solutions to problems on infinite structures into efficient distributed algorithms — although this part of the story is yet to be fully understood.”

A new frontier

As a recently emerged field, descriptive combinatorics is rapidly evolving, putting Bernshteyn and his research on the cutting edge of discovery. “There’s this new communication between separate fields of math and computer science—this huge synergy right now—it’s incredibly exciting,” Bernshteyn says.

Introducing new researchers to descriptive combinatorics, especially graduate students, is another priority for Bernshteyn. His CAREER grant funds will be especially dedicated to training graduate students who might not have had prior exposure to descriptive set theory. Bernshteyn also aims to design a suite of materials, including textbooks, lecture notes, instructional videos, workshops, and courses, to support students and scholars as they enter this new field.

“There’s so much knowledge that’s been acquired,” Bernshteyn says. “There’s work being done by people within computer science, set theory, and so on. But researchers in these fields speak different languages, so to say, and a lot of effort needs to go into creating a way for them to understand each other. Unifying these fields will ultimately allow us to understand them all much better than we did before. Right now we’re only starting to glimpse what’s possible.”

News Contact

Written by Selena Langner