When Dr. Alyssa Pierson was a child, robots weren’t on her radar; she wanted to be a cartoonist. Her initial interest in art is what led her to biology, engineering, robotics, and research at MIT CSAIL and Boston University. She is also Chief Scientist at one of the world’s most cutting-edge robotics companies, Ava Robotics.
Focusing on Ava’s UV disinfection robot, Dr. Pierson’s research informs product innovation and creative business solutions across the organization. As Ava continues to build upon its platform, her unique perspective and everything she has learned along her career path are invaluable.
From disinfection and telepresence to their latest security robot in partnership with Johnson Controls, applying a robotics platform that can be used in many different ways requires a different research mindset. MassTLC sat down with Dr. Pierson to learn more about that mindset, her work, Ava robots, and what’s keeping her curious about the future of her field.
Tell us more about Ava Robotics and the work you do there. What is your role, and what does that look like day to day?
Ava Robotics is a company that has a mobile robotics platform and brings robots into the workplace to work with and for humans. At the moment, we have three robots. One is a telepresence robot, which offers a virtual presence that can move around using both autonomous navigation and direct control by a human user. We recently launched the Tyco Security Robot powered by Ava with Johnson Controls. But what initially brought me to Ava was the UV disinfection robot, an autonomous mobile robot that disinfects spaces with UVC light without requiring a human to be exposed to it.
More broadly, Ava Robotics is a platform, and the goal is to build many different types of robots on top of it. There is a wide range of possibilities. Our robots are all built from the same ecosystem, and I’m excited to see the robots that we introduce in the future.
My role is to be an advisor to the different teams. I bring expertise about the current state of robotics, including cutting-edge technologies we could bring to our field, and how to best implement those on the robots we have, at scale. I look at what is going on in different academic labs, different universities, and ask, “What are new tools and techniques coming to the field, and how can we bring those over to our robots?” At the same time, another part of my role is looking at new applications or use cases for this mobile platform.
Were you always excited about robotics? Where did your passion and interest come from, and how did that lead to the work that you’re doing today?
I actually got into robotics rather late. I’m a little bit jealous of kids today; they grow up doing things like middle school robotics clubs and LEGO League. I didn’t get into robotics until after my undergraduate career, when I went back to grad school. I had always been interested in engineering, control systems, and system design, but my background was in mechanical engineering, in making things.
When I was in elementary school, I really wanted to be a cartoonist. I loved drawing, I loved art, and my favorite cartoonist was Gary Larson who did “The Far Side.” My parents also noticed I had a proclivity for math and science and said, “Well, you know, Gary Larson wasn’t just a cartoonist, he was a biologist first, and that gave him his ideas for the cartoons.” After that, I wanted to study biology. I was always interested in math and science in high school, and when I went to college, I found engineering.
During undergrad, I worked on a research project that was pretty influential for me, which was doing a kinematic analysis of pigeons in flight. It seems very different from robotics, but later in grad school, I came back to this project and realized what excited me about this was being able to understand how and why they moved. Studying robotics was a way for me to ask, “What if I could control and study how things move in intelligent or autonomous ways?” That led me to my Ph.D. at Boston University, where I worked on multi-robot systems and really got into this field and have stayed in the field since.
What type of research do you focus on in your role at Ava? How does your research inform other aspects of the business?
I just celebrated my one-year work anniversary with Ava Robotics this fall. For the past year, I focused on the UV products, particularly because that was the period when we went from the pilot phase through the product launch.
What I am looking at specifically now is using my expertise in robotics path planning algorithms and autonomous navigation to come up with the first prototype, or proof of concept, of autonomous algorithms to bring to the Ava platform. Then I work with the software and hardware teams to translate those prototypes to the product level.
At the end of the day, we want autonomous robots that don’t need a human supervisor, so my role is first researching and coming up with those initial algorithms and then working with the team to find the direction to take it from a prototype algorithm to something that can be deployed.
Ava has led the way in the development of intelligent, cutting-edge robots. What are you most excited about to see unfolding in the space right now? What’s next for the UV disinfection robot and other Ava innovations?
UVC disinfection is a technology that’s been around for some time, but since the pandemic, we started to think of new ways to use this technology. I’m excited to see it used as an additional tool for keeping workplaces and environments healthier for the humans that occupy them. What we’ve seen over the past year is that preventative measures can keep workers healthier and more confident in their workspaces.
A new application that we’ve introduced is a security robot. Many buildings and commercial sites have security cameras set up currently, but those are single-point fixtures. They can give you a view of what’s going on, but there can be blind spots, or they might not have great resolution. A security robot adds a mobile vantage point. If a motion sensor goes off, the robot can go to that location and provide an additional set of eyes, and it can run a more persistent patrol. Beyond the extra camera viewpoints, the robot will have other sensors, so it could also check things like temperature and humidity, which is important in a climate-controlled warehouse, for example.
It’s a really exciting third application and completely different from telepresence and UV disinfection. For all these applications, my contribution is to ask “How can we make this a smarter platform with advanced autonomous capabilities? How do we give this the brains and the robot know-how to operate in these human environments?”
As a researcher, what would you say to someone maybe who might be hesitant to embrace a future of AI and robotics in general, from an ethics perspective? Ava describes part of its mission as “delivering on a vision of robots working with and for people.” How do you relate that to your work?
What’s very exciting about Ava’s robots is that all our robots are made to improve the worker experience in the space that they’re in. It’s not a worker replacement technology.
Telepresence robots are exciting because they create an opportunity to bring somebody into a space when maybe they couldn’t physically be there. In a world of remote and hybrid work, it’s about bringing somebody into an office that otherwise couldn’t make it into an office. We’ve also seen it used in hospitals so that patients, loved ones, and healthcare providers can see each other without worrying about disease or pathogen transmission.
With UV disinfection robots, the ultimate goal is to make the workplace safer. It is not safe for a human to be exposed to UVC light directly, so without this robot, a human would have to take a UV wand and disinfect the space, which puts that person at risk.
There are a lot of ways in which robots can work for us. We’re not trying to make a robot that humans need to adapt to or figure out how to co-exist with. Our vision is to use our platform to ask, “What are the needs of the humans? And how can we provide that with a robot?”
You have such an inspiring career. What advice would you give to people who want to get kids excited about STEM, especially young girls?
Be curious and stay curious. I think it’s good to be interested in a lot of different things, and it’s okay to not know your one true calling yet. If you do, that’s awesome as well. It’s okay to have one vision, and it’s okay to bounce around and find your calling through experience. It took me a while to find robotics.
In terms of getting people interested in STEM, and especially getting young girls interested, we need to open up how we talk about different technologies and show that different career paths are viable. That means having visible mentors in the field, giving demonstrations, and finding new ways to bring robotics to the elementary school level.
It’s about getting hands-on experience and really just getting to play – that’s what makes robotics exciting.