Zhi Tan

Portrait of Zhi Tan

Assistant Professor
Northeastern University

zhi.tan at northeastern.edu

View My CV

View My Google Scholar Profile

View My GitHub Profile

I am an Assistant Professor of Computer Science at the Khoury College of Computer Sciences at Northeastern University, working on Human-Robot Interaction.

I am actively recruiting multiple PhD students for Fall 2024. If you are interested in pursuing a PhD exploring human-robot interaction questions and want to live in Boston, please apply to Northeastern's CS PhD program and mention my name in your application.

If you are a current Northeastern student (graduate or undergraduate) and interested in working with me, please send me an email with your resume/CV, which of the ongoing projects interest you, and why.


My research explores how autonomous robotic systems can collaborate with multiple robots and other embodied systems to provide assistance to diverse users. We use an interdisciplinary and iterative approach in which we develop heterogeneous robotic systems that work together to complete tasks that no single robot could complete by itself. Through these explorations, we gain insights into human behavior in response to robots, generate new interaction paradigms and design guidelines, and advance robotic systems and algorithms.

I was previously a Postdoctoral Fellow with the NSF AI Institute for Collaborative Assistance and Responsive Interaction for Networked Groups (AI-CARING) at the Georgia Institute of Technology. I received my Ph.D. in Robotics in December 2021 from Carnegie Mellon University and my B.S. in Computer Sciences from the University of Wisconsin-Madison in 2015.

Research

Sequential Human Interaction with Multiple Robots

An example of person transfer. It also shows the research's contributions: (1) a taxonomy of interaction, (2) robot joining strategies, and (3) communication between robots.

Our future likely involves humans interacting with multiple robots, whether because the first robot lacks the capability to complete the task, because another robot is more efficient, or because the service requires multiple robots. In this line of work, we explored how to design sequential interactions with multiple robots. We created a taxonomy for this space [HRI'21], studied how robots should talk to each other [HRI'19], developed strategies for robots joining an existing one-on-one interaction [ICMI'22], and deployed our system in the field [Thesis].

Assistive Navigation Robots in Complex Indoor Spaces

A person with a visual impairment, their face blurred, interacting with a stationary robot. It also shows the robot's gripper, which has a ridge to indicate direction and a button.

Indoor navigation in complex, unfamiliar spaces remains difficult for people with visual impairments. In this line of research, we explored how various types of robots could help people navigate indoor spaces. We explored and designed algorithms for mobile robots [IROS'19, ASSETS'21 LBR], stationary manipulators [ASSETS'19], and small spherical robots [RO-MAN'18].

Leveraging Other Intelligent Systems in Human-Robot Interaction

A Stretch robot reaching for a carton of milk in a fridge. A smart home door sensor is mounted on the fridge.

Future human-robot interactions are likely to take place in environments with other intelligent systems, such as screens and smart home sensors. In my past work, I demonstrated an HRI system that takes advantage of a co-located 50-inch touch screen and manages users' attention between the robot and the touch screen [HRI'20 LBR]. My ongoing work explores how robots can leverage additional knowledge from smart home sensors to improve their efficiency.

Designing Longitudinal Assistive Robots

A person standing next to a Stretch robot, reading a tablet.

In collaboration with other researchers in AI-CARING, we are taking a user-centric approach to designing systems that tackle challenges faced by people with mild cognitive impairment living at home. We are currently (1) designing systems that provide safety instructions, (2) creating methods to summarize events for remote care partners, and (3) developing smart-home simulators to advance AI/ML research.

Other Projects

Robots in Society

A self-portrait of the NASA Mars rover Opportunity, 'Oppy'.

I have collaborated with other researchers on how people anthropomorphize robots they've never met, such as Mars rovers [HRI'20], and how robots can induce bystander intervention when being physically abused [HRI'18].

Robot Identities

Four examples of the different ways a robot's identity can jump between robots: (1) one-for-one, (2) one-for-all, (3) re-embodiment, and (4) co-embodiment.

I have collaborated with other researchers looking into how robots' identities can jump between and co-exist across multiple robots [DIS'19] and how people perceive a dishonest robot [IJSR'21].