Gaurav Doshi, Assistant Professor in Applied Economics, Georgia Tech School of Economics

Gaurav Doshi, assistant professor in applied economics and a faculty affiliate of the Georgia Tech Energy Policy and Innovation Center, researches, among other topics, ways to make the benefits of large electrification projects more transparent.

It’s a chicken-and-egg question: Should renewable energy projects launch first, in the hope that transmission lines to carry the generated power to distant places will follow? Or should the transmission lines be built first, to attract investment in renewable energy projects? It’s a question that has intrigued Gaurav Doshi, assistant professor in the School of Economics at Georgia Tech, for a while now. His award-winning paper on the topic explores the downstream effects of building power lines.

After bachelor’s and master’s degrees in applied economics from the Indian Institute of Technology Kanpur, Doshi earned his doctorate in the same field from the University of Wisconsin–Madison in 2023. He explored questions in environmental economics as part of his doctoral work.

“Once I started researching energy markets in the U.S., I kept getting deeper and coming up with new questions,” Doshi says. Among the many questions his work explores: What are the effects of infrastructure policies, and how can they help decarbonization efforts? What unintended consequences do policymakers need to think about?

One of his current research projects has roots in his doctoral work: quantifying the benefits of environmental infrastructure projects that are difficult to measure. Case in point: Decarbonization will likely lead to more electrification from renewable energy resources, which will require power lines to transport this energy to places of demand. The costs of such infrastructure are relatively transparent, since they appear in government project funding. The benefits are less so, Doshi points out. To develop effective policy, both the costs and the benefits need clear visibility. “Otherwise the question arises ‘why should we spend billions of dollars of taxpayer money if we don’t know the benefits?’”

Read Full Story on the EPIcenter Webpage

News Contact

Written by: Poornima Apte

Contact: Priya Devarajan || SEI Communications Program Manager

2025 EPIcenter Summer Affiliates

Top (Left to Right): John Kim, Maghfira “Afi” Ramadhani, Mehmet “Akif” Aglar
Bottom (Left to Right): La’Darius Thomas, Yifan Liu, Niraj Palsule

The Energy Policy and Innovation Center (EPIcenter) at Georgia Tech has announced the selection of six students for its inaugural Summer Research Program. The doctoral candidates, pursuing degrees in electrical and computer engineering, economics, computer science, and public policy, will be on campus working full-time on their dissertation research throughout the summer semester and present their findings in a final showcase. 

EPIcenter will provide a full stipend and tuition for the 2025 summer semester to support the students.

“I look forward to hosting a fantastic cohort of early-career energy scholars this summer,” said Laura Taylor, EPIcenter’s director. “The summer research program will not only help the students advance their research while engaging in interdisciplinary dialogue but also offers professional development opportunities to position them for a strong start to their careers.”

The students will work with EPIcenter staff and be provided with on-campus workshops on written and oral communications. Biweekly meetings over the summer will offer the students an opportunity to share their work, progress, and ideas with each other and the EPIcenter faculty affiliates. In addition, the students will have the opportunity to engage with programs and distinguished guests of the center. 

For students interested in presenting their research at a conference, EPIcenter will also provide travel grants of up to $600, contingent on the paper or presentation being posted on the EPIcenter website.

"I applied to the Summer Research Program because its structure and community aligned perfectly with my summer plan on dissertation work in energy policy,” said Yifan Liu. “I aim to finalize key dissertation chapters and engage closely with peers and mentors to prepare me for the job market." 

The program offers students an opportunity to promote their work through the EPIcenter communication channels including the website, news feeds, blogs, and the SEI newsletter.

“I am very excited to spend my summer at EPIcenter exploring how battery storage entry affects competition in the electricity market,” said Maghfira “Afi” Ramadhani, one of the student affiliates selected for the summer research program. “Specifically, I look at how the rollout of battery storage in the Texas electricity market impacts renewable curtailment, fossil-fuel generator markup, and generator entry and exit.”

With a variety of backgrounds and perspectives on energy, each of the students in the summer program brings something unique to EPIcenter.

La’Darius Thomas: “My project explores the potential of peer-to-peer energy trading systems in promoting decentralized, sustainable energy solutions. I aim to contribute to the development of energy models that empower individuals and communities to directly participate in electricity markets.”

Niraj Palsule: “I intend to gain interdisciplinary insights interfacing energy transition technology and policy developments by participating in the EPIcenter Summer Research Program.”

John Kim: “I believe the EPIcenter Summer Research Program will deepen my investigation of how environmental hazards disproportionately affect vulnerable communities through research on power outage impacts and lead contamination. This summer, I hope to refine my analysis and complete research on the socioeconomic dimensions of power reliability and environmental resilience.”

Mehmet “Akif” Aglar: "I applied to the EPIcenter Summer Research Program because it offers the chance to work alongside and learn from a community of highly qualified researchers across various fields. I believe the opportunity to present my work, receive feedback, and benefit from the structure the program provides will be invaluable for advancing my research."

About EPIcenter

The mission of the Energy Policy and Innovation Center is to conduct rigorous studies and deliver high-impact insights that address critical regional, national, and global energy issues from a Southeastern U.S. perspective. EPIcenter is pioneering a holistic approach that calls upon multidisciplinary expertise to engage the public on the issues that emerge as the energy transformation unfolds. The center operates within Georgia Tech’s Strategic Energy Institute.

News Contact

Priya Devarajan || SEI Communications Program Manager

Daniel Molzahn will readily admit he’s a Cheesehead.  

Born and raised in Wisconsin, the associate professor in the School of Electrical and Computer Engineering attended the University of Wisconsin–Madison for his undergraduate and graduate studies. It was also at Madison that he decided to go into the family business: power engineering.

Molzahn’s grandfather was a Navy electrician in World War II and later completed a bachelor’s degree in electrical engineering, eventually becoming plant director at a large coal plant in Green Bay. Molzahn’s dad was also a power engineer and worked at a utility company, focusing on nuclear power.

It was not uncommon for family vacations to include a visit to a coal mine or a nuclear power plant, and that steady immersion in all things power engineering eventually rubbed off. “I remember seeing all the infrastructure that goes into producing energy, and it was endlessly fascinating for me,” he says.

That endless fascination has worked its way into Molzahn’s research today, at the intersection of computation and power systems.

Read Full Story on the EPIcenter Webpage

News Contact

Written by: Poornima Apte
News Contact: Priya Devarajan || SEI Communications Program Manager

CHART Founding Director Bruce Walker

Imagine a future where robotic guide dogs lead the visually impaired, flying cars navigate the skies, and electric self-driving vehicles communicate effortlessly with pedestrians.

That future is being shaped today at Georgia Tech’s Center for Human-AI-Robot Teaming (CHART). Led by Bruce Walker, a professor in the School of Psychology and the School of Interactive Computing, the newly launched Center aims to transform how humans, artificial intelligence, and robots work together. By focusing on the dynamic partnership between humans and intelligent systems, CHART will explore how humans can collaborate more effectively with artificial intelligence systems and robots to solve critical scientific and societal challenges.

“There are wonderful Georgia Tech units like the Institute for People and Technology and the Institute for Robotics and Intelligent Machines that do an incredible job focusing on using and creating intelligent systems and technology,” says Walker. “CHART adds value to this ecosystem with our emphasis on the interactive partnership between humans, AI technology, and robots and machines with agency.”

Based in the School of Psychology, CHART has built an international and interdisciplinary consortium of researchers and innovators from academia and industry. Its impressive membership includes researchers from five Georgia Tech colleges, 18 universities worldwide, industry, public policy organizations, cities, and NASA.

“With expertise encompassing psychology, design, interactive computing, robotics, aerospace engineering, mechanical engineering, public policy, and business, CHART leverages a wealth of knowledge to help us tackle multifaceted challenges — and we’re adding new members every week,” says Walker.

To help shepherd this growth, CHART’s Steering Committee includes School of Psychology Professor Christopher Wiese and Assistant Professor Mengyao Li and School of Mechanical Engineering Assistant Professor Ye Zhao.

Tomorrow’s technology

Several research programs already underway at CHART showcase its vision of deeply transformative, human-centered research:

Robotic guide dogs

Walker co-leads this research with Sehoon Ha, an assistant professor in the School of Interactive Computing. The project explores the partnership between a robotic guide dog and a human as they navigate the physical and social environment. Key concerns include trust, communication, sharing of responsibilities, and how the human-robot team integrates into social settings. The project also addresses practical design issues, like ensuring the robot operates quietly to avoid interfering with auditory cues critical for blind users.

Flying cars

This project investigates how humans will interact with emerging flying vehicle technologies. It explores user interfaces, control systems, and human-machine interaction design, including whether traditional steering controls might evolve into joystick-like mechanisms. Broader issues include how flying cars will fit into current infrastructure, impacts on pilot licensing policy and regulation, and the psychology of adopting futuristic technologies.

Pedestrians and self-driving cars

Researchers are exploring how driverless electric vehicles and pedestrians can communicate to keep our future streets safe, including how vehicles signal their intentions to pedestrians. Teams are also studying implications for safety and public policy, including accident liability and the quiet nature of electric vehicles.

Generative AI in Education

This project examines how students use generative AI like ChatGPT as collaborators in learning. The research explores its effects on outcomes, education policy, and curriculum development.

Meet CHART Founding Director Bruce Walker

Walker is excited about CHART’s future and its role in improving the world.

“We’ve got an ambitious plan and with the caliber of researchers we have assembled from around the world, the possibilities are limitless,” says Walker. “I see Georgia Tech leading the way as a center of gravity in this space.”

His background renders him well-suited to the interdisciplinary nature of the Center. Walker brings a wealth of experience in psychology, human-computer interaction, and related fields, with research interests spanning sonification and auditory displays, trust in automation, technology adoption, human-AI-robot teaming, and assistive technologies. In addition to CHART, he's the director of the Georgia Tech Sonification Lab.

Walker’s academic research has resulted in more than 250 journal articles and proceedings, and he has consulted for NASA, state and federal governments, private companies, and the military. He is also an active entrepreneur, founding startups and working on projects related to COVID diagnosis, skin cancer detection, mental health monitoring, gun safety, and digital scent technology. 

Reflecting on the journey ahead, Walker says, “We’ve come out of the gate strong. I look forward to the innovations ahead and continuing to cultivate a community of future leaders in this field.”

News Contact

Laura S. Smith, writer

A Georgia Tech alum’s dissertation introduced ways to make artificial intelligence (AI) more accessible, interpretable, and accountable. Although it’s been a year since his doctoral defense, Zijie (Jay) Wang’s (Ph.D. ML-CSE 2024) work continues to resonate with researchers.

Wang is a recipient of the 2025 Outstanding Dissertation Award from the Association for Computing Machinery Special Interest Group on Computer-Human Interaction (ACM SIGCHI). The award recognizes Wang’s work on democratizing human-centered AI.

“Throughout my Ph.D. and industry internships, I observed a gap in existing research: there is a strong need for practical tools for applying human-centered approaches when designing AI systems,” said Wang, now a safety researcher at OpenAI.

“My work not only helps people understand AI and guide its behavior but also provides user-friendly tools that fit into existing workflows.”

[Related: Georgia Tech College of Computing Swarms to Yokohama, Japan, for CHI 2025]

Wang’s dissertation presented techniques in visual explanation and interactive guidance to align AI models with user knowledge and values. The work culminated from years of research, fellowship support, and internships.

Wang’s most influential projects formed the core of his dissertation. These included:

  • CNN Explainer: an open-source tool developed for deep-learning beginners. Since its release in July 2020, more than 436,000 global visitors have used the tool.
  • DiffusionDB: a first-of-its-kind large-scale dataset that lays a foundation to help people better understand generative AI. This work could lead to new research in detecting deepfakes and designing human-AI interaction tools to help people more easily use these models.
  • GAM Changer: an interface that empowers users in healthcare, finance, or other domains to edit ML models to include knowledge and values specific to their domain, which improves reliability.
  • GAM Coach: an interactive ML tool that could help people who have been rejected for a loan by automatically letting an applicant know what is needed for them to receive loan approval.
  • Farsight: a tool that alerts developers when the prompts they write for large language model applications could lead to harm or misuse.

“I feel extremely honored and lucky to receive this award, and I am deeply grateful to many who have supported me along the way, including Polo, mentors, collaborators, and friends,” said Wang, who was advised by School of Computational Science and Engineering (CSE) Professor Polo Chau.

“This recognition also inspired me to continue striving to design and develop easy-to-use tools that help everyone to easily interact with AI systems.”

Like Wang, Chau advised Georgia Tech alumnus Fred Hohman (Ph.D. CSE 2020). Hohman won the ACM SIGCHI Outstanding Dissertation Award in 2022.

Chau’s group synthesizes machine learning (ML) and visualization techniques into scalable, interactive, and trustworthy tools. These tools increase understanding and interaction with large-scale data and ML models. 

Chau is the associate director of corporate relations for the Machine Learning Center at Georgia Tech. Wang called the School of CSE his home unit while a student in the ML program under Chau.

Wang is one of five recipients of this year’s award to be presented at the 2025 Conference on Human Factors in Computing Systems (CHI 2025). The conference occurs April 25-May 1 in Yokohama, Japan. 

SIGCHI is the world’s largest association of human-computer interaction professionals and practitioners. The group sponsors or co-sponsors 26 conferences, including CHI.

Wang’s outstanding dissertation award is the latest recognition of a career decorated with achievement.

Months after graduating from Georgia Tech, Forbes named Wang to its 30 Under 30 in Science for 2025 for his dissertation. Wang was one of 15 Yellow Jackets included in nine different 30 Under 30 lists and the only Georgia Tech-affiliated individual on the 30 Under 30 in Science list.

While a Georgia Tech student, Wang earned recognition from big names in business and technology. He received the Apple Scholars in AI/ML Ph.D. Fellowship in 2023 and was in the 2022 cohort of the J.P. Morgan AI Ph.D. Fellowships Program.

Along with the CHI award, Wang’s dissertation earned him awards this year at banquets across campus. The Georgia Tech chapter of Sigma Xi presented Wang with the Best Ph.D. Thesis Award. He also received the College of Computing’s Outstanding Dissertation Award.

“Georgia Tech attracts many great minds, and I’m glad that some, like Jay, chose to join our group,” Chau said. “It has been a joy to work alongside them and witness the many wonderful things they have accomplished, and with many more to come in their careers.”

News Contact

Bryant Wine, Communications Officer
bryant.wine@cc.gatech.edu

Matthew Oliver, Associate Professor, School of Economics, Georgia Tech, EPIcenter Faculty Affiliate

Students in Matthew Oliver’s economics of environment and international energy markets classes likely don’t have a clue about his unusual journey to the lectern: “I was bent on being a rock and roll musician from the time I was 16, and so I ended up dropping out of the University of Memphis after just three semesters,” says Oliver, an associate professor in the School of Economics at the Georgia Institute of Technology. “I was on tour for eight years — and I was starting to feel burned out.” 

At a crossroads, Oliver decided to end his musical career — a choice he credits with launching him into academia. “I was 28 and wondering what to do with my life, so I reenrolled in college and discovered economics.”  With a longtime love of the environment and growing concern for the climate, says Oliver, “I grew fascinated with solar power and other renewables and the new markets emerging around them.”  

Today, his work in energy and environmental economics has implications for policies shaping the energy transition, from subsidies for rooftop solar to the expansion of battery storage. 

“The current frontier of energy economics is electricity and renewables, and these are areas I am passionate about,” he says. 

PVs and amped up electric use 

One of Oliver’s core research thrusts is the solar rebound effect (SRE). This phenomenon involves a quirk of human behavior: When people install solar photovoltaic (PV) panels on the roofs of their homes, they often consume more electricity. “The introduction of solar energy does not perfectly displace grid-supplied energy, but instead reduces demand for grid-supplied energy on a less than one-for-one basis, because the household increases its total electricity consumption,” says Oliver. The bottom line: Solar PV systems may not lead to as much carbon emission reduction as anticipated.  
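As a purely hypothetical illustration of that less-than-one-for-one displacement (the numbers are ours, not from Oliver’s research): suppose a household drew 30 kWh per day from the grid before installing PV, the panels generate 10 kWh per day, and total consumption rises to 33 kWh per day. Grid demand then falls by only 7 kWh:

```latex
\Delta q_{\text{grid}} = (33 - 10) - 30 = -7\ \text{kWh},
\qquad
\text{displacement rate} = \frac{7}{10} = 0.7 < 1
```

The extra 3 kWh of consumption is the rebound, and it is the reason emission reductions can fall short of what the system’s generation alone would suggest.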

Read more on the EPIcenter Webpage

News Contact

Written by: Leda Zimmerman

News Contact: Priya Devarajan, SEI Communications Program Manager

Ashley at the US Capitol Building.

When Ashley Cotsman arrived as a freshman at Georgia Tech, she didn’t know how to code. Now, the fourth-year Public Policy student is leading a research project on AI and decarbonization technologies.

When Cotsman joined the Data Science and Policy Lab as a first-year student, “I had zero skills or knowledge in big data, coding, anything like that,” she said.

But she was enthusiastic about the work. And the lab, led by Associate Professor Omar Asensio in the School of Public Policy, included Ph.D., master’s, and undergraduate students from a variety of degree programs who taught Cotsman how to code on the fly.

She learned how to run simple scripts and web scrapes and assisted with statistical analyses, policy research, writing, and editing. At 19, Cotsman was published for the first time. Now, she’s gone from mentee to mentor and is leading one of the research projects in the lab.

“I feel like I was just this little freshman who had no clue what I was doing, and I blinked, and now I’m conceptualizing a project and coming up with the research design and writing — it’s a very surreal moment,” she said. 
 

Ashley takes a selfie with a friend in front of a poster presentation at a conference.

Cotsman, right, presenting a research poster on electric vehicle charging infrastructure, another project she worked on with Asensio and the Data Science and Policy Lab.

 

What’s the project about?

Cotsman’s project, “Scaling Sustainability Evaluations Through Generative Artificial Intelligence,” uses the large language model GPT-4 to analyze the sea of sustainability reports that organizations in every sector publish each year.

The authors, including Celina Scott-Buechler at Stanford University, Lucrezia Nava at the University of Exeter, David Reiner at the University of Cambridge Judge Business School, and Asensio, aim to understand how favorability toward decarbonization technologies varies by industry and over time.

“There are thousands of reports, and they are often long and filled with technical jargon,” Cotsman said. “From a policymaker’s standpoint, it’s difficult to get through. So, we are trying to create a scalable, efficient, and accurate way to quickly read all these reports and get the information.”

 

How is it done?

The team trained a GPT-4 model to search, analyze, and see trends across 95,000 mentions of specific technologies over 25 years of sustainability reports. What would take someone 80 working days to read and evaluate took the model about eight hours, Cotsman said. And notably, GPT-4 did not require extensive task-specific training data and uniformly applied the same rules to all the data it analyzed, she added.

So, rather than fine-tuning with thousands of human-labeled examples, “it’s more like prompt engineering,” Cotsman said. “Our research demonstrates what logic and safeguards to include in a prompt and the best way to create prompts to get these results.”

The team used chain-of-thought prompting, which guides generative AI systems through each step of their reasoning process with context reasoning, counterexamples, and exceptions, rather than just asking for the answer. They combined this with few-shot learning for misidentified cases, which provides increasingly refined examples for additional guidance, a process the AI community calls “alignment.”

The final prompt included definitions of favorable, neutral, and opposing communications, an example of how each might appear in the text, and an example of how to classify nuanced wording, values, or human principles as well.
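As a rough sketch of how such a prompt might be assembled in code (the category definitions, few-shot examples, and function names below are hypothetical illustrations, not the study’s actual prompt):

```python
# Hypothetical prompt template combining label definitions,
# few-shot examples, and a chain-of-thought instruction.
DEFINITIONS = {
    "favorable": "The text endorses or supports the technology.",
    "neutral": "The text mentions the technology without judgment.",
    "opposing": "The text criticizes or argues against the technology.",
}

# One invented example per label, mirroring the few-shot guidance
# described in the article.
FEW_SHOT = [
    ("We are proud to invest in direct air capture.", "favorable"),
    ("The plant uses carbon capture equipment.", "neutral"),
    ("Carbon capture merely delays real emissions cuts.", "opposing"),
]

def build_prompt(passage: str) -> str:
    """Assemble a classification prompt for one report passage."""
    lines = [
        "Classify the passage's stance toward the technology.",
        "Reason step by step, then give a one-word label.",
        "",
    ]
    for label, definition in DEFINITIONS.items():
        lines.append(f"{label}: {definition}")
    lines.append("")
    for text, label in FEW_SHOT:
        lines.append(f'Example: "{text}" -> {label}')
    lines.append("")
    lines.append(f'Passage: "{passage}"')
    return "\n".join(lines)

print(build_prompt("Our roadmap expands battery storage across all sites."))
```

Each report passage would be appended to a template like this and sent to the model; the step-by-step instruction is what asks the model to reason before committing to a final label.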

It achieved an F1 score of 0.86; the F1 score essentially measures how well a model gets things right, on a scale from zero to one. That is “very high” for a project with essentially zero training data and a specialized dataset, Cotsman said. In contrast, her first project with the group used a large language model called BERT and required 9,000 lines of expert-labeled training data to achieve a similar F1 score.
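For readers unfamiliar with the metric, here is a minimal sketch of how an F1 score is computed for one class; the labels are invented for illustration and are not the study’s data:

```python
# Minimal F1 computation for a single "positive" class.
# Precision: of everything predicted positive, how much was right?
# Recall: of everything truly positive, how much was found?
# F1 is their harmonic mean.
def f1_score(y_true, y_pred, positive="favorable"):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

y_true = ["favorable", "favorable", "neutral", "opposing", "favorable"]
y_pred = ["favorable", "neutral", "neutral", "opposing", "favorable"]
print(round(f1_score(y_true, y_pred), 2))  # -> 0.8
```

Here the model found two of the three truly “favorable” passages and made no false positives, so precision is 1.0, recall is 2/3, and F1 lands at 0.8.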

“It’s wild to me that just two years ago, we spent months and months training these models,” Cotsman said. “We had to annotate all this data and secure dedicated compute nodes or GPUs. It was painstaking. It was expensive. It took so long. And now, two years later, here I am. Just one person with zero training data, able to use these tools in such a scalable, efficient, and accurate way.”  
 

Cotsman posing in front of the US Capitol building in Washington, DC.

Through the Federal Jackets Fellowship program, Cotsman was able to spend the Fall 2024 semester as a legislative intern in Washington, D.C.

 

Why does it matter?

While Cotsman’s colleagues focus on the results of the project, she is more interested in the methodology. The prompts can be used for preference learning on any type of “unstructured data,” such as video or social media posts, especially those examining technology adoption for environmental issues. Asensio and the Data Science and Policy team use the technique in many of their recent projects.

“We can very quickly use GPT-4 to read through these things and pull out insights that are difficult to do with traditional coding,” Cotsman said. “Obviously, the results will be interesting on the electrification and carbon side. But what I’ve found so interesting is how we can use these emerging technologies as tools for better policymaking.”

While concerns over the speed of AI development are justifiable, Cotsman said her research experience at Georgia Tech has given her an optimistic view of the new technology.

“I’ve seen very quickly how, when used for good, these things will transform our world for the better. From the policy standpoint, we’re going to need a lot of regulation. But from the standpoint of academia and research, if we embrace these things and use them for good, I think the opportunities are endless for what we can do.”

News Contact

Di Minardi

Ivan Allen College of Liberal Arts

House with solar panels on the roof

A recent study by Matthew E. Oliver from the Georgia Institute of Technology and his co-authors, Juan Moreno-Cruz from the University of Waterloo and Kenneth Gillingham from Yale University, delves into the solar rebound effect.

The "solar rebound effect" is a phenomenon where households with residential solar photovoltaic (PV) systems end up consuming more electricity in response to greater solar energy generation. This outcome arises because the cost savings from generating their own electricity lead to increased usage. A recent study by Matthew E. Oliver from the Georgia Institute of Technology and his co-authors, Juan Moreno-Cruz from the University of Waterloo and Kenneth Gillingham from Yale University, delves into this effect, providing crucial insights for policymakers and researchers.

The study, titled "Microeconomics of the Solar Rebound under Net Metering," explores how different net metering policies influence the solar rebound effect. Net metering allows households to sell excess electricity generated by their solar panels back to the grid, often at the retail rate. This policy makes solar PV systems more financially attractive but also impacts household behavior.

The authors developed a theoretical framework to understand the solar rebound. They found that under classic net metering, the rebound is primarily an income effect. Households feel wealthier due to the savings on their electricity bills and thus consume more electricity. However, under net billing, where excess electricity is compensated at a lower rate, a substitution effect also comes into play. This means households might change their consumption patterns based on the relative costs of electricity from the grid versus their solar panels.
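A stylized way to see the distinction, using a textbook budget constraint (the notation is ours, not necessarily the paper’s): let m be income, p the retail electricity rate, b < p the export rate under net billing, c electricity consumption, and s solar generation. Spending on all other goods, x, is then:

```latex
\text{Net metering:}\quad x = m - p\,(c - s) = (m + p\,s) - p\,c \\
\text{Net billing:}\quad x = m - p\,\max(c - s,\, 0) + b\,\max(s - c,\, 0)
```

Under net metering, generation s enters only as a lump-sum transfer p·s at an unchanged marginal price p, which is why the rebound is a pure income effect. Under net billing, the marginal price of electricity drops from p to b once consumption falls below generation, and that kink in relative prices is what introduces the substitution effect.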

The study also incorporates behavioral economics concepts like moral licensing and warm glow effects. Moral licensing occurs when people justify increased consumption because they feel they are already doing something good, like generating green energy. Warm glow refers to the positive feelings from contributing to environmental sustainability, which can either increase or decrease consumption depending on the household's values.

One of the key takeaways from the study is the importance of the regulatory environment. Policymakers need to design net metering policies that promote solar adoption while accounting for the possibility that rebound effects may offset the desired outcomes of grid resilience and reduced greenhouse gas emissions. For instance, switching from net metering to net billing might reduce the rebound effect, leading to better environmental outcomes.

The welfare analysis conducted by the authors shows that the solar rebound's impact on social welfare depends on various factors, including the cleanliness of the electricity grid and the external costs of electricity production. In cleaner grids, the rebound might be less detrimental, while in grids reliant on fossil fuels, it could negate some of the environmental benefits of solar adoption.

This research underscores the complexity of energy policy and the need for nuanced approaches that consider both economic and behavioral factors. By understanding the solar rebound effect, stakeholders can make more informed decisions to promote sustainable energy use.

For more detailed insights, you can explore the full study by Matthew E. Oliver and his co-authors. Their work provides a robust foundation for future empirical research and policy development in the field of renewable energy.

This article was written with the assistance of Microsoft Copilot (Jan. 27, 2025) and edited by Georgia Tech EPIcenter's Gilbert X. Gonzalez and Matthew E. Oliver.

News Contact

News Contact: Priya Devarajan || SEI Communications Program Manager

Written by: Gilbert X. Gonzalez, EPIcenter, Matthew Oliver, EPIcenter Faculty Affiliate

Deven Desai and Mark Riedl

Deven Desai and Mark Riedl have seen the signs for a while. 

Two years since OpenAI introduced ChatGPT, dozens of lawsuits have been filed alleging technology companies have infringed copyright by using published works to train artificial intelligence (AI) models.

Academic AI research efforts could be significantly hindered if courts rule in the plaintiffs' favor. 

Desai and Riedl are Georgia Tech researchers raising awareness about how these court rulings could force academic researchers to construct new AI models with limited training data. The two collaborated on an academic paper that examines the landscape of ethical issues surrounding AI and copyright in industry and academia.

“There are scenarios where courts may overreact to having a book corpus on your computer, and you didn’t pay for it,” Riedl said. “If you trained a model for an academic paper, as my students often do, that’s not a problem right now. The courts could deem training is not fair use. That would have huge implications for academia.

“We want academics to be free to do their research without fear of repercussions in the marketplace because they’re not competing in the marketplace,” Riedl said. 

Desai is the Sue and John Stanton Professor of Business Law and Ethics at the Scheller College of Business. He researches how business interests and new technology shape privacy, intellectual property, and competition law. Riedl is a professor at the College of Computing’s School of Interactive Computing, researching human-centered AI, generative AI, explainable AI, and gaming AI. 

Their paper, Between Copyright and Computer Science: The Law and Ethics of Generative AI, was published in the Northwestern Journal of Technology and Intellectual Property on Monday.

Desai and Riedl say they want to offer solutions that balance the interests of various stakeholders. But that requires compromise from all sides.

Researchers should accept they may have to pay for the data they use to train AI models. Content creators, on the other hand, should receive compensation, but they may need to accept less money to ensure data remains affordable for academic researchers to acquire.

Who Benefits?

The doctrine of fair use is at the center of every copyright debate. According to the U.S. Copyright Office, fair use permits the unlicensed use of copyright-protected works in certain circumstances, such as distributing information for the public good, including teaching and research.

Fair use is often challenged when one or more parties profit from published works without compensating the authors.

Any original published content, including a personal website on the internet, is protected by copyright. However, copyrighted material is republished on websites or posted on social media innumerable times every day without the consent of the original authors. 

In most cases, it’s unlikely copyright violators gained financially from their infringement.

But Desai said business-to-business cases are different. The New York Times is one of many daily newspapers and media companies that have sued OpenAI for using its content as training data. Microsoft is also a defendant in The New York Times’ suit because it invested billions of dollars into OpenAI’s development of AI tools like ChatGPT.

“You can take a copyrighted photo and put it in your Twitter post or whatever you want,” Desai said. “That’s probably annoying to the owner. Economically, they probably wanted to be paid. But that’s not business to business. What’s happening with OpenAI and The New York Times is business to business. That’s big money.”

OpenAI started as a nonprofit dedicated to the safe development of artificial general intelligence (AGI) — AI that, in theory, can rival human thinking and possess autonomy.

These AI models would require massive amounts of data and expensive supercomputers to process that data. OpenAI could not raise enough money to afford such resources, so it created a for-profit arm controlled by its parent nonprofit.

Desai, Riedl, and many others argue that OpenAI ceased its research mission for the public good and began developing consumer products. 

“If you’re doing basic research that you’re not releasing to the world, it doesn’t matter if every so often it plagiarizes The New York Times,” Riedl said. “No one is economically benefitting from that. When they became a for-profit and produced a product, now they were making money from plagiarized text.”

OpenAI’s for-profit arm is valued at $80 billion, but content creators have not received a dime, even though the company has scraped massive amounts of copyrighted material as training data.

The New York Times has posted warnings on its sites that its content cannot be used to train AI models. Many other websites offer a robots.txt file that contains instructions for bots about which pages can and cannot be accessed.

Neither of these measures is legally binding, and both are often ignored.
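To illustrate the robots.txt mechanism mentioned above, a site opting out of AI training crawls might publish directives like the following (GPTBot is the crawler OpenAI documents for training data collection; as noted, these directives are advisory rather than legally binding):

```
# Block OpenAI's training crawler from the entire site
User-agent: GPTBot
Disallow: /

# Allow all other crawlers
User-agent: *
Allow: /
```

A compliant bot fetches /robots.txt before crawling and skips disallowed paths; nothing in the protocol forces it to comply.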

Solutions

Desai and Riedl offer a few options for companies to show good faith in rectifying the situation.

  • Spend the money. Desai says OpenAI and Microsoft could have afforded their training data and avoided the hassle of legal consequences.

    “If you do the math on the costs to buy the books and copy them, they could have paid for them,” he said. “It would’ve been a multi-million dollar investment, but they’re a multi-billion dollar company.”
     
  • Be selective. Models can be trained on randomly selected texts from published works, allowing the model to understand the writing style without plagiarizing. 

    “I don’t need the entire text of War and Peace,” Desai said. “To capture the way authors express themselves, I might only need a hundred pages. I’ve also reduced the chance that my model will cough up entire texts.”
     
  • Leverage libraries. The authors agree libraries could serve as an ideal middle ground: a place to store published works and compensate authors for access to those works, though the amount may be less than desired.

    “Most of the objections you could raise are taken care of,” Desai said. “They are legitimate access copies that are secure. You get access to only as much as you need. Libraries at universities have already become schools of information.”
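The “be selective” idea above can be sketched in a few lines: rather than ingesting a full work, a trainer draws a fixed budget of pages at random. This is a minimal illustration of the sampling step only, not any company’s actual pipeline; `sample_pages` and the toy “book” are hypothetical names.

```python
import random

def sample_pages(pages, budget, seed=0):
    """Randomly select a subset of pages from a work, so a model can
    learn an author's style without ingesting the full text."""
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    k = min(budget, len(pages))
    return rng.sample(pages, k)

# Toy corpus: a 1,200-"page" book, one string per page.
book = [f"page {i} text" for i in range(1200)]
subset = sample_pages(book, budget=100)
print(len(subset))  # 100 pages instead of the full 1,200
```

The budget caps how much of any one work enters the training set, which also lowers the chance the model can regurgitate the full text.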

Desai and Riedl hope the legal action taken by publications like The New York Times will send a message to companies that develop AI tools to pump the brakes. If they don’t, researchers uninterested in profit could pay the steepest price.

The authors say it’s not a new problem, but one that is reaching a boiling point.

“In the history of copyright, there are ways that society has dealt with the problem of compensating creators and technology that copies or reduces your ability to extract money from your creation,” Desai said. “We wanted to point out there’s a way to get there.”

News Contact

Nathan Deen

 

Communications Officer

 

School of Interactive Computing


DHS Assistant Secretary for CWMD, Mary Ellen Callahan, speaks to students on the Georgia Tech campus in September. Photo by Terence Rushin, College of Computing

Even though artificial intelligence (AI) is not advanced enough to help the average person build weapons of mass destruction, federal agencies know it could be possible and are keeping pace with next generation technologies through rigorous research and strategic partnerships. 

It is a delicate balance, but as the leader of the Department of Homeland Security's (DHS) Countering Weapons of Mass Destruction Office (CWMD) told a room full of Georgia Tech students, faculty, and staff, there is no room for error. 

“You have to be right all the time, the bad guys only have to be right once,” said Mary Ellen Callahan, assistant secretary for CWMD. 

As a guest of John Tien, former DHS deputy secretary and professor of practice in the School of Cybersecurity and Privacy as well as the Sam Nunn School of International Affairs, Callahan was at Georgia Tech for three separate speaking engagements in late September. 

"Assistant Secretary Callahan's contributions were remarkable in so many ways,” said Tien. “Most importantly, I love how she demonstrated to our students that the work in the fields of cybersecurity, privacy, and homeland security is an honorable, interesting, and substantive way to serve the greater good of keeping the American people safe and secure. As her former colleague at the U.S. Department of Homeland Security, I was proud to see her represent her CWMD team, DHS, and the Biden-Harris Administration in the way she did, with humility, personality, and leadership."

While the thought of AI-assisted WMDs is terrifying, it is just a glimpse into what Callahan’s office handles on a regular basis. The assistant secretary walked her listeners through how CWMD works with federal and local law enforcement to identify and detect the signs of potential chemical, biological, radiological, or nuclear (CBRN) weapons. 

“There's a whole cadre of professionals who spend every day preparing for the worst day in U.S. history,” said Callahan. “They are doing everything in their power to make sure that that does not happen.”

CWMD is also researching ways to implement AI technologies into current surveillance systems to help identify and respond to threats faster. For example, an AI-backed bio-hazard surveillance system would allow analysts to characterize and contextualize the risk of potential bio-hazard threats in a timely manner.

Callahan’s office spearheaded a report exploring the advantages and risks of AI, “Reducing the Risks at the Intersection of Artificial Intelligence and Chemical, Biological, Radiological, and Nuclear Threats,” which was released to the public earlier this year. 

The report was a multidisciplinary effort that was created in collaboration with the White House Office of Science and Technology Policy, Department of Energy, academic institutions, private industries, think tanks, and third-party evaluators. 

During his introduction of the assistant secretary, SCP Chair Michael Bailey told those seated in the Coda Atrium that Callahan’s career is an incredible example of the interdisciplinary path he hopes the school’s students and faculty can use as a roadmap.

“Important, impactful, and interdisciplinary research can be inspired by everyday problems,” he said. "We believe that building a secure future requires revolutionizing security education and being vigilant, and together, we can achieve this goal."

While on campus Tuesday, Callahan gave a special guest lecture to the students in “CS 3237: Human Dimension of Cybersecurity: People, Organizations, Societies” and “CS 4267: Critical Infrastructures.” Following the lecture, she gave a prepared speech to students, faculty, and staff. 

Lastly, she participated in a panel discussion, moderated by Tien, with SCP J.Z. Liang Chair Peter Swire and Jerry Perullo, SCP professor of practice and former CISO of Intercontinental Exchange as well as the New York Stock Exchange.

News Contact

John Popham, Communications Officer II 

School of Cybersecurity and Privacy | Georgia Institute of Technology

scp.cc.gatech.edu | in/jp-popham on LinkedIn

Get the latest SCP updates by joining our mailing list!