
interests: human-computer interaction (HCI), persuasive technology, machine learning (ML)

I am interested in building applications that support people through self-driven behavior change.

Currently, I am an MEng student at MIT.

email: andrewwo [at] mit [dot] edu

resume | github


Professor Randall Davis

Research PI: Spring 2019, Spring 2020-ongoing


Professor Stefanie Mueller

Research PI: Fall 2018


Academic Background

M.Eng. Computer Science: ongoing

B.S. Computer Science: 2019


Professional Experience

US Army National Guard: ongoing

Cyber Warfare Officer

Salesforce: Summer 2018

Software Engineering Intern


Eye Movements in Cognitive Disorder Screening

eye-tracking for partial screening of cognitive disorders such as Alzheimer's

Andrew Wong, Randall Davis
> MIT Multimodal Understanding Group

The human body makes many unconscious, minute movements that can be indicative of cognitive state. For example, Alzheimer's disease affects the movement of the eyes. We explored building a system that partially screens for cognitive disorders using eye-tracker and digitizer data collected during a written cognitive test.


AutoAssemblyAssistant (AAA)

wearable cognitive assistant for model car assembly

Andrew Wong, James Wu, Junjue Wang, Daniel Siewiorek, Roberta Klatzky
> CMU Human Computer Interaction Institute

Cognitive assistants are systems that interact directly with people to aid them through tasks. We believe cognitive assistants can be implemented on existing smart devices without special hardware. We explored this idea with AAA, a cognitive assistant that guides users through the steps of building a model car while detecting and correcting mistakes. User progress is tracked using only a wearable's camera and ML-based object detection running on a cloudlet server.

video | poster | technical report | code


physical objects that adapt to the user's skill level

Dishita Turakhia, Yini Qi, Lotta Blumberg, Or Oppenheimer, Andrew Wong, Kevin Reuss, Stefanie Mueller
> MIT HCI Engineering Group

For tailored learning, we explored physical objects with adaptable levels of difficulty. An example prototype is a basketball hoop that shrinks/widens and gets taller/shorter based on the quantity and quality of baskets the user makes. For this project, I implemented a visual programming interface that generates Arduino code for any adaptive tool.
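As an illustration of what such a visual programming interface might emit, here is a minimal, hypothetical code generator in Python. The pin numbers, threshold, and sketch template are assumptions for illustration, not the actual tool's output.

```python
# Illustrative sketch only: a tiny generator that turns one
# sensor -> actuator rule (as might be drawn in a visual editor)
# into an Arduino sketch string.

ARDUINO_TEMPLATE = """\
const int SENSOR_PIN = {sensor_pin};
const int ACTUATOR_PIN = {actuator_pin};

void setup() {{
  pinMode(ACTUATOR_PIN, OUTPUT);
}}

void loop() {{
  int reading = analogRead(SENSOR_PIN);
  // Drive the actuator whenever the sensor crosses the rule's threshold.
  digitalWrite(ACTUATOR_PIN, reading > {threshold} ? HIGH : LOW);
  delay(50);
}}
"""

def generate_sketch(sensor_pin, actuator_pin, threshold):
    """Fill the template with one rule from the visual editor."""
    return ARDUINO_TEMPLATE.format(sensor_pin=sensor_pin,
                                   actuator_pin=actuator_pin,
                                   threshold=threshold)
```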

code | journal article submitted to TOCHI

Side Projects


Hand2Hold

texting app to keep communities connected during the COVID pandemic

Volunteer effort led by Daniel Jackson

Communities that normally meet in person have struggled to stay connected during the COVID pandemic. Hand2Hold is a texting app that matches members of an existing community with each other for regular check-ins.
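The matching step could be sketched as below; this is a hypothetical pairing function for illustration, not Hand2Hold's actual matching logic.

```python
import random

def make_pairs(members, seed=None):
    """Shuffle community members and pair them for check-ins.
    With an odd member count, the leftover person joins the
    last pair to form a group of three."""
    rng = random.Random(seed)
    shuffled = list(members)
    rng.shuffle(shuffled)
    pairs = [shuffled[i:i + 2] for i in range(0, len(shuffled), 2)]
    if len(pairs) > 1 and len(pairs[-1]) == 1:
        pairs[-2].extend(pairs.pop())
    return pairs
```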



virtual visual aid for military leaders during mission briefings

Final Project for 6.835: Intelligent Multimodal User Interfaces

Today, visual aids for mission briefings are built with physical toy pieces in a sandbox. I explored a digital alternative using hand tracking (Leap Motion) and speech recognition (Web Speech API).

video | report | code

Adaptive Pull-up Trainer

detects when a user is stuck mid pull-up and provides physical support

Final Project for 6.810: Interactive Technologies

Adaptive tools are physical objects that change based on the user. We built an adaptive pull-up trainer, which allows for more tailored workouts. When a flex-sensor armband and ultrasonic sensors on the pull-up bar detect that the user is stuck, the tool spins a high-powered motor on a pulley system to lift the user up.

presentation slides
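The stuck-detection step could be sketched as follows; the threshold values, window size, and function name are illustrative assumptions, not the project's actual parameters.

```python
# Hypothetical sketch of the stuck-detection heuristic: the user is
# considered stuck when the flex sensor shows sustained muscle strain
# while the ultrasonic sensors report the user has stopped rising.

def is_stuck(flex_readings, distance_cm, flex_threshold=600,
             hang_distance_cm=40, window=5):
    """Return True when the last `window` flex readings all exceed the
    strain threshold and the measured distance to the bar says the
    user is hanging too low to finish the pull-up."""
    if len(flex_readings) < window:
        return False  # not enough data yet to decide
    straining = all(r > flex_threshold for r in flex_readings[-window:])
    not_rising = distance_cm >= hang_distance_cm
    return straining and not_rising
```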