Friday Fellow Feature: Matthew Levine

Passionate about mathematics and dynamical systems, Schmidt Center Postdoctoral Fellow Matt Levine focuses on collaboration and knowledge sharing as he applies his computational skills towards biomedical discoveries.
Nadya Karpova
December 20, 2024

As dynamic as the systems that he researches, Matt Levine brings his friendliness, mathematical knowledge, and eagerness to collaborate and help others to the Schmidt Center. 

After studying biophysics at Columbia University, the New Jersey native found his passion for computational projects as opposed to lab work. He joined a research project on diabetes with David Albers, George Hripcsak, and Lena Mamykina, quickly realizing that he enjoyed tackling complex dynamics problems. 

Matt also collaborated with mathematician Andrew Stuart, his future PhD advisor, who showed him the clarifying power of math and encouraged him to pursue a PhD in applied mathematics at Caltech. Realizing he wanted to return to his biological roots, Matt worked with Michael Elowitz on biological computations during his final year of graduate school, then decided to pursue a fellowship at the intersection of applied math, machine learning, statistics, and biomedicine, which led him to the Schmidt Center in the fall of 2023.

In his research on dynamical systems, Matt strikes a balance between traditional, physics-based models and modern AI techniques. “It’s easy for mathematicians to work on theoretical models that seem perfect on paper, but fail in practice,” he says. “This is why we need to ground ourselves in real data and questions to ensure that the theories work in real-world scenarios.” 

Read on to learn more about Matt’s numerous collaborations, this year’s conference presentations, and travel highlights.

Matt at his Caltech graduation.

Tell us about your area of research.

A lot of my research centers around dynamical systems, which are systems that evolve over time. My focus is on understanding these systems by learning from the data we collect, which often involves developing models to explain how these systems work or to predict their future behavior. 

I think about this problem by blending old-school, knowledge-based approaches with newer techniques. The idea is that we often have a solid understanding of a system before we even start collecting data. Rather than relying solely on modern AI approaches that learn everything from scratch, I believe in using a hybrid style. This means we start with what we know and use that knowledge to fill in the gaps where we don’t have information.
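The hybrid style Matt describes can be sketched in a few lines. The example below is purely illustrative (not Matt's actual code): a known physics term, simple exponential decay, is combined with a learned correction, here a one-parameter regression standing in for a neural network, fit to the residual between the observed dynamics and the known model.

```python
import numpy as np

# Hybrid dynamics sketch (illustrative only).
# True system:  dx/dt = -x + 0.5*sin(x)   (the sin term plays "unknown physics")
# Known model:  dx/dt = -x
# We learn the residual from trajectory data.

def true_rhs(x):
    return -1.0 * x + 0.5 * np.sin(x)

def known_rhs(x):
    return -1.0 * x

# Simulate a trajectory with small Euler steps to stand in for collected data
dt, n = 0.01, 2000
xs = np.empty(n)
xs[0] = 2.0
for i in range(n - 1):
    xs[i + 1] = xs[i] + dt * true_rhs(xs[i])

# Estimate dx/dt from the data, subtract the known physics -> residual targets
dxdt = np.gradient(xs, dt)
residual = dxdt - known_rhs(xs)

# Fit residual ~ c * sin(x); a neural net would replace this feature in practice
features = np.sin(xs)
c = np.dot(features, residual) / np.dot(features, features)
print(f"fitted correction coefficient: {c:.2f}")  # close to the true 0.5
```

The point of the hybrid split is that the learned part only has to account for what the known model misses, which typically needs far less data than learning the full dynamics from scratch.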

I apply these principles in an application-agnostic setting; that is, in a broad way, so that the methods are applicable to many different types of problems. However, I also focus on specific applications to make sure that the rubber can meet the road. I test these models on actual problems, observe where they fall short, and analyze why they didn’t work. Once I get things working in that setting, they eventually fail somewhere new, and that’s the loop I like to live in.

A lot of the applications that I've worked on have a biomedical focus. For example, I’ve worked on modeling glucose dynamics in people with diabetes, using real-world data from daily self-monitoring to gain insights into their physiology. I’ve also applied these methods to climate science, aiming to improve climate models by using data to calibrate and refine these models. 

What’s something you wish more people understood about your research?

I wish that more people would recognize the power of mathematics. It’s a powerful language that can serve as a clear, precise, and efficient communication tool. Even when working with experimentalists, writing down the plan and evaluations in rigorous mathematical terms is a clarifying exercise. Putting ideas into such a precise language raises questions that wouldn’t have been asked otherwise, forcing you to look at new things carefully.

Matt has worked substantially in the biomedical sciences, and enjoys collaborating on impactful applied projects.

Tell us about one of your collaborations.

I’m excited about a new collaboration with Luca Pinello, who’s exploring cell fate and differentiation as dynamic processes. He’s interested in how cells evolve over time, but instead of watching a continuous "video" of a cell’s entire lifespan, we work with snapshots—random moments in the cell's life. It’s like taking a snapshot of Earth today and trying to understand aging; you get different snapshots at different times but can't connect one person’s past and future.

This challenge raises interesting methodological questions and has sparked discussions on various dynamical systems and modeling approaches. Our collaboration has produced new mathematical definitions and innovative methods. I'm excited to continue learning from Luca’s group while sharing insights from my work – it’s been rewarding working together.

You presented at several major conferences this year.

I went to two exciting conferences in July. The first was ICML (International Conference on Machine Learning) in Vienna, where I gave an oral presentation on a paper I coauthored with a group from Stanford (Emily Fox, Ramesh Johari, Dessi Zaharieva, Bob Junyi Zou) – Hybrid Neural ODE Causal Modeling and an Application to Glycemic Response. We focused on learning dynamics from data collected from people with diabetes. Because the data was noisy and sparse, we had to use a lot of existing knowledge to improve our methods, which led us to develop some new approaches. 

Before and after the conference, I also spent time in upper Austria on the weekends with a friend from grad school. It was really peaceful – we hiked, swam in lakes, observed the scenery, watched the cows, ate the sausage, and had a great time. 

At the Fourth Symposium on Machine Learning and Dynamical Systems at the Fields Institute in Toronto, I presented work done in collaboration with a group from Caltech (Andrew Stuart, Edoardo Calvello, Nikola Kovachki) during my PhD. We’re exploring how large language models (LLMs) and their Transformer architectures, typically used for sequences like word lists, can be adapted for continuous data such as time series or images. Our approach, grounded in operator learning, reformulates these models to handle data of varying resolutions. This allows our models to process and understand data consistently, even when resolution changes, like in pathological images with different pixel densities.

Matt's never bored as long as he has a board to jot down his ideas.

What’s a memorable experience you had related to your research?

I recently had the opportunity to collaborate with a friend from my Columbia days, Iñigo Urteaga. Although we weren’t in the same research group, we were close friends and colleagues – he was a postdoc while I was still a pre-doc, before I applied to graduate school. 

In the summer of 2023, I visited Iñigo at the Basque Center for Applied Mathematics in Bilbao, Spain. With him starting a new professorship there and me about to begin a postdoc at Broad in the fall of 2023, it was the ideal moment for us to reconnect and tackle a project we had long discussed. Our collaboration focuses on uncertainty quantification in dynamical systems—a field that examines how we can model systems from data and gauge the range of possible models that could explain our observations. This approach helps us understand not just what the data tells us, but how confident we should be about different explanations. We developed a JAX package that’s available on GitHub.
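To make the uncertainty-quantification idea concrete, here is a toy sketch (not the API of their actual JAX package): we bootstrap an estimate of the decay rate of a simple dynamical system from noisy observations, so the spread of estimates indicates how confident we should be in any single fitted model.

```python
import numpy as np

# Toy uncertainty quantification for dx/dt = -k*x (illustrative only):
# bootstrap the decay rate k from noisy observations of the trajectory.

rng = np.random.default_rng(0)
k_true, dt, n = 0.7, 0.1, 30
t = np.arange(n) * dt
x = 2.0 * np.exp(-k_true * t) + rng.normal(0, 0.02, n)  # noisy data

def fit_k(t, x):
    # Log-linear least squares: log x ~ log(x0) - k*t
    A = np.vstack([np.ones_like(t), -t]).T
    coef, *_ = np.linalg.lstsq(A, np.log(np.clip(x, 1e-6, None)), rcond=None)
    return coef[1]

# Resample (t, x) pairs with replacement and refit; the spread of the
# refitted rates is a simple measure of model uncertainty
ks = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    ks.append(fit_k(t[idx], x[idx]))
ks = np.array(ks)
print(f"k ~ {ks.mean():.2f} +/- {ks.std():.2f}")
```

A Bayesian posterior over parameters plays the same role in more sophisticated treatments; the bootstrap is just the simplest way to see a range of plausible models rather than a single point estimate.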

I visited him again last winter, and he’s been a formal visitor at MIT and Broad since October. We’re continuing to work together and he’s had a chance to meet the Schmidt Center fellows and start other collaborations.

Mostly, it was really nice to work in Bilbao. We’d take coffee breaks in cafes, drink our espresso, then have long, Spanish lunches. It was a pleasant, peaceful environment to get work done, making our collaboration particularly rewarding.

"It's very cathartic for me to spend time in the mountains, with the mountain air, in the middle of nowhere," says Matt.

Let’s talk about the Oberwolfach Research Institute for Mathematics (MFO).

Dubbed “Math Camp” by a friend and located in Germany’s Black Forest, the MFO conference program hosts 20–30 researchers for focused seminars and workshops. When I attended, the schedule was relaxed, with lectures, coffee breaks, and group lunches, and every Wednesday we would hike and enjoy Black Forest cake together; it’s a whole tradition. 

During one of these hikes, I was talking to a researcher from Youssef Marzouk’s group about challenges I faced with learning differential equations from partially observed data. That conversation sparked an idea about using data assimilation, which turned out to be a breakthrough. I went back to my room, ran some code, and the next day I had a working solution for something I'd been struggling with for over a year during my PhD. I was very excited and shared my findings with the group in an impromptu talk, and this idea eventually contributed to a paper I was working on. I'm still exploring similar concepts today with Youssef’s group, focusing on uncertainty quantification for machine learning in unobserved systems.
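The data-assimilation idea can be illustrated with a textbook one-dimensional Kalman filter (a toy sketch, not the method from the paper above): a model forecast is repeatedly blended with noisy observations, so the state estimate tracks a hidden system far more accurately than the raw measurements alone.

```python
import numpy as np

# 1D Kalman filter: assimilate noisy observations into a model forecast.
rng = np.random.default_rng(1)
a, q, r = 0.95, 0.01, 0.25     # dynamics, process noise, observation noise
x_true, x_est, p = 1.0, 0.0, 1.0
errors = []
for _ in range(200):
    # The true (hidden) state evolves; we only see a noisy observation of it
    x_true = a * x_true + rng.normal(0, np.sqrt(q))
    y = x_true + rng.normal(0, np.sqrt(r))
    # Forecast step: push the estimate and its variance through the model
    x_est, p = a * x_est, a * a * p + q
    # Analysis step: the Kalman gain weighs forecast against observation
    k_gain = p / (p + r)
    x_est = x_est + k_gain * (y - x_est)
    p = (1 - k_gain) * p
    errors.append((x_est - x_true) ** 2)
print(f"filter MSE: {np.mean(errors):.3f} (observation noise variance: {r})")
```

The filtered mean-squared error comes out well below the observation noise variance, which is exactly the leverage data assimilation offers when parts of a system are unobserved or poorly measured.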

What advice would you give to aspiring researchers?

One of the aspects I truly enjoy about academia is the emphasis on collaboration and knowledge sharing. I find a lot of joy in working with others and believe that a friendly, open environment greatly enhances the productivity and satisfaction of everyone involved. 

When people ask how I’ve managed to build many collaborations, my answer is simple: being friendly goes a long way. I’ve noticed that many successful researchers share this trait. George Hripcsak, the former chair of the Department of Biomedical Informatics at Columbia, is a prime example. His success in leading large consortia and managing diverse projects largely comes from people enjoying working with him. He’s great at what he does and he’s approachable. In my experience, being kind, open, and personable often proves to be more effective than just being a research machine. Be friendly!

What are some of your hobbies?

I love skiing – it’s very cathartic for me to spend time in the mountains, with the mountain air, in the middle of nowhere. Last winter I was an adaptive ski instructor, which was a really rewarding experience, as I worked with people who had specific needs, and tailored their skiing lessons accordingly. Whether it involved using specialized equipment or teaching them one-on-one, the goal was to make skiing accessible and enjoyable for everyone. It was definitely the hardest job I ever had, but incredibly fulfilling. I also like playing tennis and music, specifically the piano and guitar. I enjoy these activities because they help me be present in the moment in a physically engaging way, which is different from my regular job. 

Nothing brings Matt down: after he falls, he bounces back up, with a smile.

Some of Matt’s Favorite Things:

  • Underrated technology: Metronome
  • Game: Chess
  • Movie: Office Space
  • Book: A Visit from the Goon Squad by Jennifer Egan
  • Snack: Cadbury chocolate 
  • Element on the periodic table: Ununennium
  • Anything else? I’m very appreciative of my mentors!
