About Me

I am a Research Scientist at Apple working with the Perception and Interaction Engineering group under Machine Intelligence Neural Design (AIML). I conduct research in motion, vision, audio, and other sensor technologies that advances applications in health monitoring, ambient intelligence, interactive technologies, and spatial computing experiences.

Prior to Apple, I was a Machine Learning Data Scientist at ŌURA, where I developed computational algorithms to analyze longitudinal physiological data in mobile health, including women's health, leading to the launch of features such as Pregnancy Insights and Symptom Radar.

I completed my Ph.D. at the University of Texas at Austin, where I worked in the Human Signals Lab under the supervision of Prof. Edison Thomaz. During that time, I was affiliated with the Wireless, Networking & Communications Group (WNCG), the Institute for Foundations of Machine Learning (IFML), and the Intelligent Machine Learning Consortium (iMAGiNE). My main research lay at the intersection of mobile and ubiquitous computing and human-centered AI.

Earlier in my academic career, I was a Visiting Student Researcher-Intern at the E. L. Ginzton Laboratory at Stanford University under the supervision of Prof. Butrus Khuri-Yakub, working on transcranial high-intensity focused ultrasound. I also worked on optimizing epileptic seizure prediction and detection as an undergraduate research assistant at the American University of Beirut, supervised by Prof. Zaher Dawy. Alongside academic work, I have industry experience through a residency at X, The Moonshot Factory (formerly Google X) in California and internships at Intel (CA) and Apple (PA).


Recent News