New Apple study shows LLMs can tell what you’re doing from audio and motion data
9to5mac.com · 1d


Apple researchers have published a study examining how LLMs can analyze audio and motion data to build a clearer picture of a user's activities. Here are the details.

They’re good at it, but not in a creepy way

A new paper titled “Using LLMs for Late Multimodal Sensor Fusion for Activity Recognition” offers insight into how Apple may be considering incorporating LLM analysis alongside traditional sensor data to gain a more precise understanding of user activity.

This, they argue, has great potential to …
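The "late fusion" idea the paper's title refers to can be sketched roughly as follows: each sensor modality is classified by its own model, and only those models' textual outputs (not the raw audio or motion signals) are handed to an LLM to fuse into a single activity judgment. The function, labels, and prompt wording below are illustrative assumptions, not taken from the paper.

```python
def build_fusion_prompt(audio_preds, motion_preds):
    """Format per-modality predictions as a text prompt for an LLM.

    audio_preds / motion_preds: lists of (label, confidence) pairs
    from separate single-modality models. This is late fusion: the
    LLM never sees raw sensor data, only each model's outputs.
    """
    def fmt(preds):
        return ", ".join(f"{label} ({conf:.2f})" for label, conf in preds)

    return (
        "You are fusing outputs from two activity-recognition models.\n"
        f"Audio model top predictions: {fmt(audio_preds)}\n"
        f"Motion model top predictions: {fmt(motion_preds)}\n"
        "Respond with the single most likely user activity."
    )

# Hypothetical per-modality outputs:
audio = [("running water", 0.71), ("speech", 0.18)]
motion = [("stationary", 0.64), ("walking", 0.25)]

prompt = build_fusion_prompt(audio, motion)
print(prompt)
# The prompt would then be sent to an LLM, which might plausibly
# fuse these signals into an answer like "washing dishes".
```

The appeal of this design is that the LLM can combine weak, ambiguous evidence from each modality into a judgment neither single-modality model could make alone.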
