♪ Mellow instrumental hip-hop ♪

Dan Golden: Hi, I'm Dan from the Accessibility team. I am thrilled to talk about accessibility in spatial computing, alongside my colleague Drew. In this talk, I'll give you an overview of some of the accessibility features available on this platform. Next, I'll dive into some of the specifics of what you can do in your apps to support people who are blind or low vision. Then, I'll hand it off to Drew to discuss motor, cognitive, and hearing accessibility in spatial computing. We've designed this immersive platform for everyone. Let's get started!

While spatial computing experiences are often built with stunning visual features and a variety of hand inputs, that doesn't mean that vision or physical movement are required to engage with them. In fact, these experiences have the potential to be incredibly impactful to people who are blind or low vision, have limited mobility, or have limb differences. For example, someone who is blind could interact with the real world without having to see what's on the displays. Therefore, it's important to keep people of all abilities in mind when you're building out your app, so that everyone can enjoy and benefit from it.

At Apple, we recognize that access to technology is a fundamental human right, and this platform contains the largest list of accessibility features we've ever included in the first generation of a product. You'll recognize many features you already know and love, like Dynamic Type support, Increase Contrast, and Spoken Content, and we've reimagined our flagship assistive technologies specifically for spatial computing. We are so excited about these features, and as a developer, you can help by making sure the experiences that you're building include everyone.

Let's start by talking about the ways you can support people who are blind or low vision in your apps. There are a few things to consider when discussing vision accessibility: VoiceOver support, visual design, and motion. Let's start with VoiceOver support. VoiceOver is the built-in screen reader available on all Apple platforms, and I'm excited to say we've brought it to this one as well.

Drew and I have been working on a really fun app called Happy Beam that utilizes ARKit and RealityKit. In the app, you make heart gestures with your hands to turn grumpy clouds happy. Let's take a look at some of the ways we can improve the VoiceOver experience in this app. We've gone ahead and added VoiceOver to the Accessibility Shortcut in Settings > Accessibility > Accessibility Shortcut, so that whenever we triple-press the Digital Crown, VoiceOver will toggle on or off. This is a great tool when we're testing the accessibility of our app. Let's open the app and toggle VoiceOver on with a triple-press of the Digital Crown.

VoiceOver: Happy Beam.

♪ Ethereal instrumental music ♪

VoiceOver: Choose how you will cheer up grumpy clouds.

Dan: On this platform, VoiceOver uses different finger pinches on different hands to perform different actions. By default, you can move focus to the next item by pinching your right index finger.

VoiceOver: Make a heart with two hands, button. Use a pinch gesture or a compatible device, button.

Dan: To move focus to the previous item, pinch your right middle finger.

VoiceOver: Make a heart with two hands, button.

Dan: To activate an item, pinch your right ring finger or your left index finger. Now that we're familiar with some of VoiceOver's basic controls, let's explore the rest of the app.
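When VoiceOver reads an element in the demo as "Make a heart with two hands, button," that spoken label and trait come from accessibility information the app attaches to its content. As a rough illustration of how a RealityKit entity might expose that kind of information, here is a minimal, hypothetical sketch using RealityKit's AccessibilityComponent; the entity name `cloud` and the label and value strings are assumptions for illustration, not code from the session:

```swift
import RealityKit

// Hypothetical sketch: a stand-in for one of Happy Beam's cloud entities.
let cloud = Entity()

// Describe the entity to VoiceOver via an AccessibilityComponent.
var accessibility = AccessibilityComponent()
accessibility.isAccessibilityElement = true  // expose this entity to VoiceOver
accessibility.label = "Cloud"                // what VoiceOver speaks for the element
accessibility.value = "Grumpy"               // the element's current state
accessibility.traits = [.button]             // announced as "button," as in the demo

cloud.components[AccessibilityComponent.self] = accessibility
```

With a component like this attached, focusing the entity would cause VoiceOver to speak its label, value, and trait, much like the button announcements heard in the Happy Beam walkthrough.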