Laugh, and your super-accelerated AI laughs with you

Human touch: New efforts to teach machines to better understand emotions (and to learn faster) are underway at New York Tech.
By GREGORY ZELLER //

New York Institute of Technology computer-science researchers are pushing artificial intelligence ever closer to human norms – including efforts to help AI better understand human emotion.

The National Science Foundation has signed off on two fresh New York Tech research grants, each offering a modest sum but both important to their recipients, two assistant professors in the university’s College of Engineering and Computing Sciences.

Researcher Houwei Cao will use her nearly $100,000 award to help machines detect human emotions, in real time, by analyzing facial expressions, body movements, hand gestures and speech patterns.

Her fellow assistant professor, Jerry Cheng, will put his $60,000 NSF grant toward more efficient and secure deep-learning processing machines – “AI accelerators” that can reliably process and interpret super-huge datasets.

Houwei Cao: All the feels.

Large datasets actually factor into both researchers’ efforts. Cao uses them to teach machines to “recognize” emotions and respond accordingly.

In humans, that’s an automatic-recognition process often influenced by subtle expressive behaviors and environmental interactions. Cao’s grant will fund a 12-month research project addressing the challenges of spontaneous emotions and the imperfect audio/video signals of real life, versus the perfectly controlled signals of laboratory settings.

Her goal: a new “multimodal emotion recognition system” ready for practical use.

“Analysis and recognition of spontaneous emotion is a challenging task,” Cao noted. “We aim to design an emotion-recognition system for real-life human-computer interaction applications.”
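For a sense of what “multimodal” looks like in practice, here is a minimal late-fusion sketch in Python. It is illustrative only, not Cao’s system: the emotion label set, the feature sizes and the simple average-the-probabilities fusion are all assumptions made for the example.

```python
# A minimal late-fusion sketch (illustrative only; not Cao's actual system).
# Assumes each modality (here, face and speech) has already been reduced to a
# feature vector; a tiny linear scorer per modality is fused by averaging.
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "neutral"]  # hypothetical label set

rng = np.random.default_rng(0)

def modality_scores(features: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Per-modality emotion probabilities via a linear layer plus softmax."""
    logits = features @ weights
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

# Stand-in feature vectors and randomly initialized weights for each modality.
face = rng.normal(size=16)
speech = rng.normal(size=24)
w_face = rng.normal(size=(16, len(EMOTIONS)))
w_speech = rng.normal(size=(24, len(EMOTIONS)))

# Late fusion: average the per-modality probability distributions, then pick
# the most likely emotion across both channels.
fused = (modality_scores(face, w_face) + modality_scores(speech, w_speech)) / 2
print("Predicted emotion:", EMOTIONS[int(fused.argmax())])
```

Real systems like the one Cao describes must also cope with missing or noisy channels, which is exactly the gap between laboratory signals and real life that her project targets.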

Lofty ambitions also mark the work of Cheng and collaborators at Rutgers, Temple and Indiana universities, who are attempting to build reliable AI accelerators capable of interpreting extremely large-scale deep-learning computations.

Jerry Cheng: Deep thoughts.

“Deep learning” uses neural networks (algorithms loosely modeled on the human brain) to recognize patterns in sensory information, including sounds and visual images, and to store this layered data in numeric form. Each time a deep-learning system repeats a task, the layers are further tuned and the results improve – just like the human brain.
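That tune-with-repetition loop is easy to see in miniature. The sketch below, a from-scratch Python toy and not Cheng’s work, trains a two-layer network on the classic XOR problem; the network size, learning rate and task are all assumptions for illustration. With each pass, both layers’ weights are nudged and the error shrinks, which is the layered tuning described above.

```python
# Toy illustration of layers being tuned by repetition (not Cheng's system):
# a two-layer network learns XOR, and the error shrinks with each pass.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(size=(2, 8))  # first-layer weights
W2 = rng.normal(size=(8, 1))  # second-layer weights
lr = 0.5                      # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(2001):
    # Forward pass through both layers.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    # Backward pass: gradients of the squared error, layer by layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Each repetition nudges every layer's weights -- the "tuning".
    W2 -= lr * h.T @ d_out
    W1 -= lr * X.T @ d_h
    if epoch % 500 == 0:
        print(f"pass {epoch}: mean error {np.abs(out - y).mean():.3f}")
```

Scale that loop up to billions of weights and the computational appetite becomes clear, which is where purpose-built accelerators come in.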

But AI accelerators, which ensure that very large datasets are accurately processed in a timely manner, are few and far between, according to Cheng.

“Today’s state-of-the-art hardware and software do not provide sufficient computing capabilities and resources to ensure accurate deep-learning performance,” the researcher said.

Enter the newly funded collaborators, who are developing what the New York Tech assistant professor called “a scalable and robust system” packing a “low-cost, secure, deep-learning hardware-accelerator architecture and a suite of large-data-compatible deep-learning algorithms.”

It’s a mouthful for sure, and a potentially giant leap for AI, right alongside Cao’s emotional efforts. Both projects will see New York Tech faculty and students working side-by-side, and both speak well of the Old Westbury-based College of Engineering and Computing Sciences’ progress as a recognized leader in its field, according to Dean Babak Beheshti.

“These grants will enable our computer science faculty to perform cutting-edge research that is as important for the challenges it addresses as it is for the opportunities it affords New York Tech students and faculty collaborators at other leading universities,” Beheshti said in a statement.