Bonjour! I just returned from a week in France with some of the Automattic Design team at the Design Biennale in Saint-Étienne, which included a collaboration between our own John Maeda and the Google Material Design team. You can watch our evening of presentations from Automattic and Google designers, including the European premiere of the Open Web Meditation (with a French translation!).
This week’s Six Signals are extra meaty and future-facing, including behavioral concepts for autonomous vehicles, climate change gear as fashion, and AI tools that guide visually impaired people. Enjoy!
01: From “juddering” to captcha street furniture — vocabulary for the autonomous future
My colleague Beau Lebens tipped me off to this fantastic work by Jan Chipchase, who has put together a glossary of speculative terminology about autonomous vehicles and emerging behavior. Some of my favorites include:
- Juddering: “the ripple of a dozen or more cars in a parking lot that react and finally settle to the arrival of a new vehicle.”
- Captcha street furniture: “introduced by residents looking to filter out autonomous vehicles from passing through their neighbourhoods. (The opposite will also be true, with human-drivers filtered out of many contexts).”
- Shy-distance: “the distance by which your vehicle instinctively avoids, shies away from other vehicles on the road and stationary objects.”
02: AI-driven app to guide visually impaired users
Google recently released its Lookout app for Pixel devices, which helps people with visual disabilities make sense of their physical surroundings. “By holding or wearing your device (we recommend hanging your Pixel phone from a lanyard around your neck or placing it in a shirt front pocket), Lookout tells you about people, text, objects and much more as you move through a space.”
03: Dystopian accessories for unbreathable air
As air pollution becomes a more common problem worldwide — from persistent smog in cities like Beijing and Shanghai, to more frequent forest fires in places like California — face masks are becoming a necessity for more people. As a result, companies are beginning to capitalize on this need and turn the face mask into a fashion accessory. Rose Eveleth reports on this emerging reality in Vox:
The near-future of this accessory could depend on who picks up the object first … It could be adopted by streetwear fans (Supreme already sells a face mask, although it doesn’t seem to actually do much in the way of safety or filtration) or by users who prefer the Burning Man aesthetic. Or perhaps the wellness world adopts these masks, in which case the product design would look quite different. “The other direction might be the sort of Lululemon-ification of the masks, if they’re treated as these essential wellness objects and they enter the world of performance fabrics and athleisure and athletic wear.”
04: Regulating algorithms like drugs
As algorithmic systems have a real impact on more aspects of our lives, from our health care to our financial services, we face increasingly pressing questions about how to monitor and interrogate these systems. A recent Quartz article suggests we could take cues from the medical industry and regulate algorithms with approval and monitoring processes similar to those used for prescription drugs. The authors point out several similarities:
- They affect lives
- They can be used as medical treatment
- They perform differently on different populations
- They can have side effects
05: The luxury of human contact
The joy — at least at first — of the internet revolution was its democratic nature. Facebook is the same Facebook whether you are rich or poor. Gmail is the same Gmail. And it’s all free. There is something mass market and unappealing about that. And as studies show that time on these advertisement-supported platforms is unhealthy, it all starts to seem déclassé, like drinking soda or smoking cigarettes, which wealthy people do less than poor people.
The wealthy can afford to opt out of having their data and their attention sold as a product. The poor and middle class don’t have the same kind of resources to make that happen.
06: Designing ethical experiences
The past few years have seen more widespread concern over the “dark patterns” in software design — the ways in which experiences are designed to monetize our attention, extract our data, and exploit addictive tendencies. In response, designer Jon Yablonski has put together a clear and accessible set of resources for “humane design” that is ethical and respectful.
As designers, we play a key role in the creation of such technology, and it’s time we take responsibility for the impact that the products and services we build are having on the people they should serve.
See you in two weeks! If you would like to receive Six Signals in your inbox, sign up for the Automattic.Design mailing list.