Apple launches new programme to teach blind children how to CODE


There's something powerful hiding in your iPhone. Buried in the settings – right between the option to make your phone talk to your car and the page that tells you how much space you're using – is a range of features that could be the difference between being able to use an iPhone and not.
They are Apple's accessibility features, and they're ones of which the company is particularly proud. Tapping that option introduces a whole range of different features: tools that allow your iOS device to describe what you're doing if you're not able to see it, and others that let it be controlled entirely by special switches if the fine touch targets of an iPhone prove troublesome.
For some people, those settings and features might go entirely unused: like CarPlay and the language options that they sit alongside, they might just be another option that's interesting but unpressed. But for other people, they might be the difference between being able to use the phone and not, and could provide a life-changing way into new technology.
Now, Apple hopes those features can become part of a new way to pursue a project that has become one of its defining aims: to allow everyone to learn to code. And Apple is clear that truly means everyone, even students who have long thought coding wasn't accessible to them.
The company already offers an app called Swift Playgrounds, available free on the iPad. It provides a fun way for children – and adults – to play through a series of levels that appear to be a game but actually teach the basics of Swift, the programming language that powers much of the iPhone and its apps.
But while that proved to be a hit, something important was still missing. The app included all of Apple's accessibility features – it has support for VoiceOver, for instance, allowing the iPad to tell its user what is happening on the screen – but it posed challenges for children with visual impairments who might find it harder to follow along.
Apple has now announced that, after working with the Royal National Institute of Blind People (RNIB), it is rolling out resources to ensure that the Everyone Can Code curriculum really is accessible to everyone, with new tools for students who are blind or visually impaired.
They include separate books in which graphics of the levels from Swift Playgrounds are rendered in tactile braille and high-contrast ink print. They correspond to the game's levels – meaning it's possible to play along with all of the challenges from the app, bringing it to life for the blind and low-vision students who might otherwise have found themselves discouraged.
It is the first time Apple has introduced such features outside of the US, where it also worked with a charity for the blind and visually impaired.
The RNIB said it had supported the project to ensure that the same resources are available to young people with visual impairment as to their sighted peers.
“Every child, including children with visual impairment, should have the opportunity to learn the programming and computer coding skills that are part of the national school curriculum. This is especially important for future participation in the growing digital economy. However, many of the tools and methods used by schools to introduce children to coding are not accessible to all,” said David Clarke, director of services at RNIB.
“We are delighted to have worked with Apple on this project to make their coding education app, Swift Playgrounds, more accessible for children and young people with visual impairment – so that they are able to access the same resources and information as their sighted peers and can fulfil their potential in the digital age.”
Apple has been publicly and passionately committed to accessibility for more than a decade, and today its director of accessibility programmes, Sarah Herrlinger, tells The Independent that the company believes accessibility to be a "human right" that has become a central part of the way it designs the products that have gone on to take over the world.
In the iPhone, accessibility features arrived in earnest with iOS 3 and the iPhone 3GS, which brought VoiceOver – a technology that describes what's happening on the phone, baking in a tool that had previously required expensive kit, if it was available at all. VoiceOver has continued to flourish and has been joined by a huge number of other accessibility features.
They include smart pieces of software such as artificial intelligence technologies built into the camera, which allow the phone to describe what it is taking a photo of, helping people with visual impairments take photographs. Those are joined by a vast array of hardware: hearing aids that can communicate with and be controlled by iPhones, and switches that mean the entirety of an iPad can be controlled with presses of a button or movements of the head.
Perhaps the most impressive sit between the two, as Apple products often do. A feature called Live Listen allows people to turn their iPhone into a microphone and listen through their wireless AirPods – which has proven so successful that a recent Reddit thread pointing it out was upvoted nearly 5,000 times.
The sheer number and reach of accessibility features has meant that the challenge for Apple and its users has now become as much about helping people find features as it is about creating them.
To do so, it ensures that accessibility is promoted throughout its retail stores all over the world. That happens in the potentially obvious ways, like ensuring that employees are fully briefed as to how to welcome anyone into its stores. But it happens in longer-term ones, too, such as ensuring that retail staff are able to take people through all the accessibility features and running workshops – just as the company throws events to teach people to take the perfect selfie, there are similar lessons on the potentially hidden accessibility features found in iOS devices.
It also means taking those features out to the people who might need them most. For the launch of the new Everyone Can Code resources, The Independent joined Apple engineers and staff at a school in Wandsworth, where parents and students were introduced to the accessibility features of iPads and then set loose to use them to work their way through some parts of Apple's curriculum.
Students tore through the various levels and challenges of Swift Playgrounds with a speed and ease matched only by their enthusiasm. And they did so using a wide variety of different tools, each suited to their needs – from relatively minor adjustments to screen settings all the way through to the detailed braille books and VoiceOver technology.
Apple has made accessibility a central part of the design process: just as with Swift Playgrounds, the products now being cooked up in Apple's secret labs are being made with such considerations in mind. No matter what futuristic and mysterious features are waiting in the iPhones of the future, they will be made available to everyone.
It means that when the Apple Watch arrived in 2015, it came as the first wearable accessible to the blind from the very beginning, borrowing the same familiar gestures and features from the bigger iPhones and iPads. It gradually opened up to support people with a wide array of disabilities, including tools that now allow wheelchair users to accurately track their fitness and activity.
Such an approach is core to what Apple does, says Sarah Herrlinger. Everyone on Apple's engineering staff has that "foundation" because the company is clear about "what our corporate values are and why they're important", she says, but the key role the accessibility team plays ensures it is part of everything Apple does.
"Because it is just one team, they can really pay attention and ensure that when things are developed there's a consistency across them," she says.
But Apple's software for iPhones and iPads is just the beginning of the story. The hardware is also a platform for a wide array of apps from third-party developers, which make up a central part of using the operating system.
Apple has ensured that those developers are catered for by the system as much as possible, so that the breakthroughs it has made in accessibility are available even outside of its own apps. The company has built APIs into the software that developers use to make their apps, for instance, letting the system do much of the heavy lifting to ensure that apps are accessible.
"We're going to pick up as much as we can," says Herrlinger. Developers can tell iOS the name of a certain button, for instance, ensuring that VoiceOver can identify it properly when it's reading it out to a person who uses that feature.
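As a rough sketch of what that looks like in practice – this example is not from the article, and the button and its strings are invented for illustration – a developer using Apple's UIKit framework can name a control for VoiceOver in a few lines of Swift:

```swift
import UIKit

// A hypothetical icon-only button: without a label, VoiceOver has
// nothing meaningful to announce when the user focuses on it.
let shareButton = UIButton(type: .custom)
shareButton.setImage(UIImage(systemName: "square.and.arrow.up"), for: .normal)

// Telling iOS the button's name, as described above: VoiceOver can
// now read out "Share" rather than leaving the control anonymous.
shareButton.isAccessibilityElement = true
shareButton.accessibilityLabel = "Share"
shareButton.accessibilityHint = "Shares the current document"
```

Because standard controls pick up sensible defaults automatically, developers typically only need to add labels like these for custom or image-only elements.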
And when the app is developed, Apple says it is committed to ensuring that the developers who make their apps as open and assistive as possible are rewarded and highlighted. Those aren't just utilities, though there are plenty of those, such as apps that can identify the exact denomination of a bank note or read out a receipt. Apple also highlights the kinds of apps that everyone relies on and has ensured that everyone can rely on them, such as Uber or Starbucks – apps, as it happens, that use exactly the same programming principles and languages found in the pages of Apple's latest resources.
