Qingqing Cao
Research Scientist • Apple AIML
I am a research scientist at Apple AIML. My research interests include efficient NLP, mobile computing, and ML systems. I focus on building efficient, practical NLP systems for both edge devices and the cloud, such as on-device (visual) question answering and faster Transformer models.
Previously, I was a postdoc in the UW NLP group at the University of Washington. I hold a Ph.D. in computer science from Stony Brook University, where I received the Catacosinos Fellowship, and I was named a Rising Star in Data Science at the University of Chicago.
News
May 01, 2024: The APT paper was accepted to ICML 2024 🎉! Congrats to Bowen 👏!

Apr 22, 2024: Check out OpenELM, a new efficient language model family that uses layer-wise scaling to allocate parameters for better accuracy with fewer tokens! Training code is on GitHub, and models are also on Hugging Face.

Feb 15, 2024: Glad to be invited to serve as Action Editor / Area Chair for ACL 2024!

Jan 16, 2024: BTR was accepted to ICLR as a spotlight paper! 🎊

Nov 10, 2023: Gave a talk at the Efficient ML workshop hosted by Google Research.