For more than seven decades, operations research has been defined by a deceptively simple question: how can mathematics and data be used to make better decisions? 

At Cornell’s School of Operations Research and Information Engineering, that question has always been answered through a powerful combination of optimization, probability, statistics, algorithms and modeling. Now, as artificial intelligence reshapes how data are generated and transformed into actionable information, the school is formally reorganizing its undergraduate curriculum to make AI a visible and coherent part of that tradition. 

Beginning this academic year, the school is launching a new focused elective – Data, Decisions and AI – that brings together courses in machine learning, reinforcement learning, data mining, causal inference and ethics, while explicitly connecting them to the core principles of operations research. The change reflects both a recognition of how AI has transformed decision-making systems and a conviction that the school’s mathematical foundations place it at the heart of that transformation. 

“This is really labeling something that has already been happening for more than 10 years,” said David Shmoys, a longtime faculty member who helped lead the redesign. “The intent is to help structure things and to make students cognizant that OR fundamentals help shape how AI systems behave, because the tools underneath them are tools we already teach.” 


“We’re aware that some of our students are interested both in building machine learning and AI systems and in studying their applications to decision-making in traditional operations research domains,” said David Williamson, school director.  “We’ve built this track for them.” 

At its core, operations research is about building quantitative models that guide intelligent decision-making, Shmoys explained. That process has always involved several interlocking components: defining what “better” means through optimization, modeling uncertainty with probability and statistics, and developing algorithms to find the best course of action.

Those same components now underpin modern AI systems. 

“When you go under the hood and look at how AI does what it does, the prediction mechanisms, the learning, the reasoning – it’s all using algorithms, methods and models that have been central to the OR community for decades,” Shmoys said. 

Training large AI models, for example, is fundamentally an optimization problem. Deep learning methods define a loss function – a mathematical way of measuring how well a model performs – and then use continuous optimization techniques to tune millions or even billions of parameters. While the scale is unprecedented, the underlying ideas are familiar. 

“They’re using the simplest kinds of things we’ve taught in continuous optimization courses for 40 years,” Shmoys said, “just on a scale we would never have imagined.” 
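The idea can be made concrete with a toy example (an illustration of the general principle, not material from any Cornell course): gradient descent on a least-squares loss, the simplest continuous-optimization workhorse, is the same mechanism deep learning applies to billions of parameters.

```python
# Toy illustration of training-as-optimization: fit a single parameter w
# to minimize a least-squares loss L(w) = mean((w*x - y)^2) by plain
# gradient descent. Deep learning does the same thing, just with
# billions of parameters instead of one.

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]   # roughly y = 2x

def loss(w):
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def grad(w):
    # dL/dw = mean(2 * (w*x - y) * x)
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

w = 0.0       # initial guess
lr = 0.01     # learning rate (step size)
for _ in range(500):
    w -= lr * grad(w)   # step downhill against the gradient

print(round(w, 2))  # converges to 1.99, near the true slope of 2
```

The loss function measures how badly the model fits the data, and each update nudges the parameter in the direction that reduces it; at scale, the gradient is computed by backpropagation rather than a hand-derived formula, but the loop is the same.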

In other cases, AI research has effectively rediscovered concepts long taught in operations research classrooms. Reinforcement learning, a cornerstone of modern AI, closely parallels Markov decision processes, a framework that operations researchers have studied for generations. 

“The AI community was bold enough to do this at scales we were probably too timid to think about,” Shmoys said. “But the intellectual roots were already there.” 
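That parallel can also be sketched concretely. Value iteration, the classical dynamic-programming algorithm for Markov decision processes, is the ancestor of value-based reinforcement learning; below is a minimal sketch on a made-up two-state problem (the states, actions and rewards are invented for illustration).

```python
# Toy illustration: value iteration on a tiny Markov decision process,
# the classical OR framework that value-based reinforcement learning
# scales up. Two states (0 and 1), two actions; rewards favor state 1.

# transitions[state][action] = (next_state, reward); deterministic here
transitions = {
    0: {"stay": (0, 0.0), "move": (1, 1.0)},
    1: {"stay": (1, 2.0), "move": (0, 0.0)},
}
gamma = 0.9  # discount factor on future rewards

# Repeatedly apply the Bellman optimality update until values settle.
V = {0: 0.0, 1: 0.0}
for _ in range(200):
    V = {
        s: max(r + gamma * V[s2] for (s2, r) in transitions[s].values())
        for s in V
    }

# Read off the optimal policy: pick the action with the best
# one-step reward plus discounted value of where it leads.
policy = {
    s: max(
        transitions[s],
        key=lambda a: transitions[s][a][1] + gamma * V[transitions[s][a][0]],
    )
    for s in V
}
print(policy)  # {0: 'move', 1: 'stay'}
```

Reinforcement learning tackles the same objective when the transition and reward model is unknown, estimating the values from experience instead of computing them from a known table.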

What has changed most dramatically is not the mathematics, but the data. Today’s AI systems draw from massive, heterogeneous and often unstructured sources – text, images, sensor streams and human behavior at scale. Turning that information into actionable models requires new tools, particularly in machine learning and natural language processing. 

“That layer – going from data ‘out there in the wild’ to models we can manage mathematically – that’s really where modern AI comes in,” Shmoys said. 

The school’s new focused elective is designed to make those connections explicit. Rather than treating AI as an add-on or a black box, the curriculum emphasizes how predictive models interact with decision-making systems, and how choices made in modeling, objectives and constraints shape outcomes. 

The structure builds on courses the school has been offering for years, including statistical data mining, learning with big and messy data, and reinforcement learning with operations research applications. New offerings, such as causal inference, will expand the menu further, creating a coherent pathway rather than a collection of disconnected electives. 

“What we want students to see is not just a little bit here and a little bit there, but the whole panoramic view of this emerging technology,” Shmoys said. 

For now, the AI emphasis takes the form of a focused elective rather than a formal concentration. Students who complete the required sequence will be able to note the distinction on their résumés, and their progress will be visible through Cornell’s degree-tracking systems. Over time, the school anticipates that the focused elective could evolve into an official track within the operations research and information engineering major. 

“This mechanism is a simple way to make it available for students now,” Williamson said, “and then think about how it becomes a transcript-level concentration later.” 

The change aligns with broader efforts across Cornell University to integrate AI into existing disciplines rather than isolate it as a standalone major. A campus-wide AI initiative launched several years ago led to the creation of a technical AI minor and encouraged departments to develop AI-focused pathways within their own curricula. 

For the School of Operations Research and Information Engineering, that integration was a natural fit. 

“Operations research is already about data and decisions,” Shmoys said. “AI just changes the scale and the texture of the data, and gives us new tools for building models and algorithms.” 

The focused elective also requires students to grapple with societal and ethical questions – fairness, accountability and governance – alongside technical content. 

“If you’re going to build these systems and put them into the world, you need to think broadly about their impact,” Williamson said. 


That perspective reflects a broader educational goal: helping students understand not just how to build models, but how to reason critically about them. As AI tools become more accessible, the ability to interpret, evaluate and question outputs becomes more important than rote coding skill. 

“What we’re really teaching is how to think critically about a decision-making setting,” Shmoys said. “That’s at the core of what OR has always been about.” 

Demand for the new focus is already evident. The school has long attracted students interested in applied problem-solving across domains such as transportation, health care, logistics and public policy. Increasingly, those students want to understand how AI systems operate, and how to shape them responsibly. 

“This resonates,” Shmoys said. “It gives students a structure and a language for thinking about how AI fits into what they’re already studying.” 

Ultimately, the curricular shift reflects a belief that operations research has a central role to play in the future of AI, not as a competitor to computer science, but as a discipline that provides the mathematical and conceptual backbone for intelligent systems. 

“If Cornell operations research wants to retain the role it’s earned,” Shmoys said, “we have to stay at the front. This is how we do that.”