Google: Creatability
Google Creative Lab New York was tasked with demonstrating Google's commitment to inclusive design. The goal was to move beyond corporate social responsibility messaging and create tangible tools that empowered the disability community. They needed to show how Google's AI technology could solve real-world human problems by making creative expression accessible to everyone, regardless of their physical abilities, through a browser-based experience.
Creative Idea
Created open-source AI experiments allowing users to make art through any physical input.
Google collaborated with the disability community to build open-source, AI-powered tools that allow anyone to create music and art using a webcam, demonstrating that true innovation lies in making creative expression accessible to every human body.
Turning PoseNet Into a Universal Paintbrush
Designing With Not For
The project’s soul was defined by a shift from "designing for" to "designing with" the disability community. Google Creative Lab partnered with the NYU Ability Project and the Henry Viscardi School to ensure the tools met actual needs. Key collaborators included deaf composer Jay Alan Zimmerman, who helped develop "Seeing Music" to visualize sound textures, and blind scientist Josh Miele. This collaborative approach ensured the experiments weren't just technical demos, but functional instruments for creators like Chancey Fleet and Barry Farrimond.
Privacy at the Edge
A defining technical decision was to use TensorFlow.js. By running the PoseNet AI model directly in the browser, the team ensured that all motion tracking happened locally on the user's device. No video or images were ever sent to Google servers, a "privacy-first" production detail that allowed users to experiment freely without data concerns. This architecture also made the tools globally accessible to anyone with a basic webcam and a browser, bypassing the need for expensive specialized hardware.
The Nose Theremin and Beyond
The production focused on "Universal Input," meaning every tool had to respond to voice, body movement, keyboard, or switch devices. This led to the creation of the "Nose Theremin," which became a viral highlight of the campaign by allowing users to play music simply by moving their face. Beyond the immediate web interface, the project’s Component Library was released as open-source on GitHub. This allowed the global developer community to take the "AI for Good" framework and build their own bespoke accessibility tools, extending the campaign's life far beyond its initial launch.
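The "Universal Input" idea can be sketched as a single mapping layer: every input source, whether a tracked nose position, a keyboard, or a switch device, is reduced to the same normalized 0–1 signal before it drives the instrument. The names below (`positionToNote`, `fromNoseX`, `fromKeyboard`) are hypothetical illustrations, not Creatability's actual API.

```javascript
// Notes of a C major pentatonic scale, a forgiving choice for
// free-form input like face tracking.
const C_MAJOR_PENTATONIC = ["C4", "D4", "E4", "G4", "A4", "C5"];

// Map any normalized 0..1 position to a note, regardless of
// which device produced it.
function positionToNote(normalized, scale = C_MAJOR_PENTATONIC) {
  const clamped = Math.min(Math.max(normalized, 0), 1);
  const index = Math.min(Math.floor(clamped * scale.length), scale.length - 1);
  return scale[index];
}

// Adapters: each input device becomes the same 0..1 signal.
const fromNoseX = (noseX, frameWidth) => noseX / frameWidth;
const fromKeyboard = (keyIndex, keyCount) => keyIndex / (keyCount - 1);

console.log(positionToNote(fromNoseX(320, 640))); // "G4"
console.log(positionToNote(fromKeyboard(0, 6)));  // "C4"
```

Keeping the adapters separate from the instrument logic is what lets a new input device (a switch, an eye tracker) be supported by writing one small function rather than a new tool.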
Creative Strategy Deconstructed
Company
Google's advanced AI research and browser-based machine learning capabilities.
Category
Tech companies often treat accessibility as a compliance checkbox or a niche secondary feature.
Customer
Creators with disabilities who want to express themselves but find traditional creative software physically inaccessible.
Culture
The rise of open-source culture and the democratization of AI through web browsers.
Strategy:
Democratize specialized capabilities through universal interfaces to transform passive inclusion into active creative empowerment.
Results
The 'Creatability' project launched as an open-source collection of AI experiments. While the source video does not cite specific reach figures, it documents the successful development and deployment of multiple accessible tools, including Sound Canvas, Seeing Music, and Clarion Lite. The project used TensorFlow.js to run machine learning directly in the browser, removing barriers to entry for users with disabilities. The campaign fostered collaboration between Google Creative Lab, the NYU Ability Project, and organizations like Open Up Music, creating a scalable framework for inclusive design.
Open Source
availability for global developers
Web-Based
no specialized hardware required
AI-Driven
utilizing TensorFlow.js for accessibility
Strategy Technique
Build a Utility, Not an Ad
Instead of just talking about inclusion, Google built a functional suite of open-source tools that solved real creative barriers for people with disabilities, making the brand's values tangible.
Creative Technique
Technology
The campaign leverages browser-based AI and body-tracking technology to remove physical barriers to art, turning the webcam into a universal interface for musical and visual expression.
Craft Breakdown
The campaign's excellence lies in its seamless integration of complex AI technology into simple, intuitive creative interfaces that prioritize human accessibility.
The use of TensorFlow.js to bring real-time AI processing to the browser is a technical feat that directly enables the campaign's core promise.
The UI/UX of the experiments is brilliantly simplified, turning complex inputs like body tracking into joyful creative expressions.
The script effectively translates complex technical concepts into an emotional and accessible narrative about human potential.
The filming captures the intimate relationship between the users and the technology with clarity and warmth.
The synergy between cutting-edge technology and empathetic design transforms a technical demo into a powerful human story.