IMPACT Lab: Integrative Multimodal Perception And Cognitive Technologies

Projects

Our projects focus on integrative multimodal perception and intelligent systems, combining machine learning with heterogeneous data sources such as vision, audio, tactile sensing, and physiological signals. Applications span healthcare, assistive technologies, and real-world intelligent systems.

Early Autism Detection from Voice Biomarkers
Deep learning for speech-based screening signals

Research on early autism detection using voice biomarkers and deep learning, with emphasis on robust feature learning and careful evaluation for health applications.
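As an illustration of the kind of acoustic features that speech-based screening pipelines commonly start from, here is a minimal numpy-only sketch. The frame sizes, feature set (energy, zero-crossing rate, spectral centroid), and synthetic signal are illustrative assumptions, not the lab's actual pipeline:

```python
import numpy as np

def frame_signal(x, frame_len=400, hop=160):
    # Slice the waveform into overlapping frames
    # (assumed 25 ms windows with 10 ms hop at 16 kHz).
    n = 1 + (len(x) - frame_len) // hop
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n)])

def voice_features(x, sr=16000):
    frames = frame_signal(x)
    # Log short-time energy per frame.
    energy = np.log(np.sum(frames ** 2, axis=1) + 1e-10)
    # Zero-crossing rate: a rough proxy for voicing vs. noisiness.
    zcr = np.mean(np.abs(np.diff(np.sign(frames), axis=1)) > 0, axis=1)
    # Spectral centroid: the "brightness" of each frame's spectrum.
    mag = np.abs(np.fft.rfft(frames, axis=1))
    freqs = np.fft.rfftfreq(frames.shape[1], d=1.0 / sr)
    centroid = (mag @ freqs) / (np.sum(mag, axis=1) + 1e-10)
    # Summarize per-frame trajectories into a fixed-length clip descriptor
    # (mean and std of each feature), suitable as input to a classifier.
    feats = np.stack([energy, zcr, centroid])
    return np.concatenate([feats.mean(axis=1), feats.std(axis=1)])

# Synthetic one-second "utterance" purely for illustration.
rng = np.random.default_rng(0)
t = np.arange(16000) / 16000
x = np.sin(2 * np.pi * 220 * t) + 0.05 * rng.standard_normal(16000)
v = voice_features(x)  # fixed-length descriptor: 3 features x (mean, std)
```

In practice a learned front end (e.g., a deep network over mel spectrograms) would replace these hand-crafted statistics, but the framing-and-summarization structure is the same.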

Multimodal Perception and Sensor Fusion
Learning representations across heterogeneous data

Methods for integrating multiple data modalities (e.g., vision, audio, tactile signals) to improve robustness and generalization in real-world intelligent systems.
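One simple way to combine heterogeneous modalities is late fusion of per-modality embeddings. The sketch below is a minimal numpy illustration under assumed conditions (fixed 128-dim embeddings, hand-set weights); it is not the lab's method, just the baseline idea:

```python
import numpy as np

def fuse(embeddings, weights):
    # Late fusion: weighted average of per-modality embeddings.
    # Each embedding is L2-normalized first so that no modality
    # dominates purely because of its scale.
    normed = [e / (np.linalg.norm(e) + 1e-10) for e in embeddings]
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize weights to sum to 1
    return sum(wi * e for wi, e in zip(w, normed))

# Hypothetical per-clip embeddings from three modality encoders.
rng = np.random.default_rng(1)
vision = rng.standard_normal(128)
audio = rng.standard_normal(128)
tactile = rng.standard_normal(128)

fused = fuse([vision, audio, tactile], weights=[0.5, 0.3, 0.2])
```

Learned fusion (e.g., attention over modalities) generalizes this by making the weights input-dependent, which helps when one sensor is noisy or missing.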

Other Projects

Additional Projects
More details on other projects coming soon.