Apple recently unveiled new accessibility features, including Eye Tracking, which lets users with physical disabilities control their iPad or iPhone with their eyes. Additionally, Music Haptics uses the iPhone’s Taptic Engine so that users who are deaf or hard of hearing can experience music through vibration.
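Music Haptics itself is a system feature, but the same Taptic Engine is already available to third-party developers through Apple’s Core Haptics framework. A minimal Swift sketch of playing a single haptic “beat” (the intensity and sharpness values here are illustrative, not taken from Music Haptics):

```swift
import CoreHaptics

func playBeat() throws {
    // Taptic Engine hardware is required (e.g. recent iPhones).
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // One short, strong tap; intensity and sharpness range from 0 to 1.
    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6),
        ],
        relativeTime: 0
    )

    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

Sequencing many such events against a song’s timeline is, in broad strokes, how audio can be rendered as touch.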
Tim Cook, Apple’s CEO, emphasized the company’s commitment to inclusive design, noting that Apple has built accessibility into both its hardware and software for nearly four decades.
Eye Tracking is powered by on-device machine learning and can be set up and calibrated in seconds using the front-facing camera. All data used for setup and control is kept securely on the device and is not shared with Apple. The feature works across iPadOS and iOS apps and requires no additional hardware or accessories.

Other notable features include Vocal Shortcuts, which lets users trigger tasks with custom sounds, and Vehicle Motion Cues, designed to reduce motion sickness when using an iPhone or iPad in a moving vehicle. With Vehicle Motion Cues, animated dots at the edges of the screen represent changes in the vehicle’s motion, reducing sensory conflict without disrupting the main content. These features rely on the sensors built into iPhone and iPad and can be toggled on or off in Control Center.
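Vehicle Motion Cues is a system-level feature, but iOS has long exposed a related motion-sensitivity setting that apps can query and respect. A minimal sketch, assuming a UIKit app, of adapting an animation when the user has Reduce Motion enabled:

```swift
import UIKit

/// Returns an animation duration that respects the user's
/// motion-sensitivity preference: when Reduce Motion is on,
/// the animation is skipped entirely (duration 0).
func preferredAnimationDuration(default duration: TimeInterval) -> TimeInterval {
    UIAccessibility.isReduceMotionEnabled ? 0 : duration
}

/// Example: fade a view in, instantly if the user prefers reduced motion.
func reveal(_ view: UIView) {
    view.alpha = 0
    UIView.animate(withDuration: preferredAnimationDuration(default: 0.3)) {
        view.alpha = 1
    }
}
```

Apps can also observe `UIAccessibility.reduceMotionStatusDidChangeNotification` to react when the user flips the setting while the app is running.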