Six signals: Reality training and robot furniture


In this week’s signals we are hearing a lot about spaces: real, virtual, hidden, and cramped. The people who use those spaces sometimes get replaced by machines, and those machines are very keen to understand what we do in them. Should we keep telling them?


If you want to get future issues in your inbox, please sign up for our newsletter.



1: Context-aware fill for the real world

This speculative short fiction explores a near future where mixed reality — in the form of ubiquitous “iGlasses” — has become prevalent. What’s most compelling about this piece is the exploration of a version of AR where “augmentation” includes editing out parts of the real world; a context-aware fill for reality. You could easily imagine a future like this where, if you pay extra, you no longer have to see billboards. 

And then, without really knowing what I was doing, I took my iGlasses off. I know, I know. Who does that anymore? It’s so easy to forget you’re wearing them, especially since they delete other people’s glasses from your field of vision. I got this rush of panic when the object tags, memory aides and search bar disappeared. It felt like diving into a pool and not knowing where the surface was.

Keep Your Augmented Reality. Give Me a Secret Garden.


2: Flat-pack robot furniture

Video: IKEA

IKEA has announced a new line of adaptive, robotic furniture that is controlled via touchpad and can convert into a bed, couch, desk, or wardrobe, depending on what you need at the moment. It is intended to maximize the use of space in small urban apartments, appealing to the growing share of the world’s population that lives in high-density cities. The furniture line, called Rognan, will launch in Hong Kong and Japan in 2020.

ROGNAN robotic furniture for small space living


3: Building faces from speech

MIT researchers have published a paper describing an AI that can approximate a face from recordings of the voice that produced it. Trained on millions of YouTube videos, the model makes guesses that are directionally correct, often getting ethnicity, gender, and basic bone structure right. The researchers found that the model picks up on subtle differences in tone and timbre, but they caution that it depends on a limited training set, which (as with most efforts like this) means it works better for populations that are better represented online.

Recordings of voices may be a new frontier in privacy considerations as technologies like these improve. Already Lyrebird is offering synthetic voice generation based on recordings of a person’s voice, and as the article points out, companies like Chase are using voice-matching technologies to identify customers. It’s something to consider the next time your latest conference talk gets uploaded to YouTube.

With this AI, your voice could give away your face


4: Humans are dead, long live humans

Image: Walmart

The Washington Post tells us the dystopian story of “Freddy,” a floor-cleaning robot named after a janitor who was fired to make way for his automated replacement. Walmart’s recent investments in a variety of automated retail robots are changing the way its human workers approach their own work:

Their jobs, some workers said, have never felt more robotic. By incentivizing hyper-efficiency, the machines have deprived the employees of tasks they used to find enjoyable. Some also feel like their most important assignment now is to train and babysit their often inscrutable robot colleagues.

As Walmart turns to robots, it’s the human workers who feel like machines


5: A design solution for better data privacy

Designer Lennart Ziburski has developed and shared a design system for increasing data privacy while still enabling the personalization we’ve come to expect from modern apps. The proposal has two parts: an on-device “circle of trust,” within which apps can share data with each other freely but never beyond the device itself, and a set of “data permits” for when a service does want to share your data.

While compelling and detailed, the proposal may be impossible to enforce technically. Once an app has local access to your calendar or contacts, it is trivial to transmit that information off the device, even if the OS offers a “permit” scheme that developers are merely encouraged to adopt. Those that do comply will likely market compliance as a premium feature for people who can pay to keep their data private, while everyone else unwittingly pays for their “free” services with purchase histories, location data, and social graphs.
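To make the “data permit” idea concrete, here is a minimal sketch of how an OS-level permit check might work. It is our own illustration under stated assumptions, not part of Ziburski’s proposal: every name in it (DataPermit, PermitStore, request_export, send_off_device, “travel-app.example”) is invented, since the proposal is a design concept rather than a published API.

```python
# Hypothetical sketch: an OS records user-granted "data permits" and consults
# them before any data leaves the device. All names are invented for illustration.

from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class DataPermit:
    """A user-granted permission to move one category of data off the device."""
    data_category: str    # e.g. "calendar", "contacts", "location"
    recipient: str        # the service the data may be sent to
    purpose: str          # human-readable reason shown when the permit was granted
    expires_at: datetime  # permits are time-limited rather than open-ended


class PermitStore:
    """OS-level registry that the 'circle of trust' consults before any export."""

    def __init__(self) -> None:
        self._permits: list[DataPermit] = []

    def grant(self, permit: DataPermit) -> None:
        self._permits.append(permit)

    def allows(self, data_category: str, recipient: str) -> bool:
        now = datetime.now()
        return any(
            p.data_category == data_category
            and p.recipient == recipient
            and p.expires_at > now
            for p in self._permits
        )


def send_off_device(recipient: str, payload: bytes) -> None:
    # Stand-in for whatever transport the service would use; irrelevant to the check.
    print(f"sending {len(payload)} bytes to {recipient}")


def request_export(store: PermitStore, category: str, recipient: str, payload: bytes) -> bool:
    """Apps inside the circle of trust share freely; leaving the device needs a permit."""
    if not store.allows(category, recipient):
        return False  # in the proposal, the OS would presumably prompt the user here
    send_off_device(recipient, payload)
    return True


if __name__ == "__main__":
    store = PermitStore()
    store.grant(DataPermit("calendar", "travel-app.example", "suggest departure times",
                           datetime.now() + timedelta(days=30)))
    print(request_export(store, "calendar", "travel-app.example", b"..."))   # True
    print(request_export(store, "contacts", "travel-app.example", b"..."))   # False: no permit
```

Even in this toy form, the weakness noted above is visible: the check only binds apps that route their exports through request_export in the first place.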

Rethinking how technology uses our personal data


6: Making fake-reality to help AI understand real-reality

Facebook Reality Labs (what a name!) has taken high-resolution 3D scans of real offices and homes for use in training their systems to better understand human spaces, identify objects, and practice navigation. Building virtual spaces as inputs to navigation and recognition routines has greatly increased the speed at which these systems can be trained; experiments that once took months can now be accomplished in just a few hours. 

Facebook plans to release their AI Habitat system, along with the underlying Replica data set, so that the wider AI community can benefit as well. The hope is that navigational systems will soon have a deeper understanding of human spaces, eventually leading to “social presence” AI with avatars that know where to sit, stand, and walk.
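For a sense of why simulation compresses months of work into hours, here is a toy, gym-style training loop. The SimulatedApartment grid world below is invented for this sketch and is not Habitat’s actual API; the point is only that a simulator can churn through thousands of navigation episodes per second of wall-clock time, where a physical robot is stuck running in real time.

```python
# Illustrative only: a toy stand-in for training navigation in a simulated space.
# SimulatedApartment and its 5x5 grid are invented; they are not AI Habitat's API.

import random
import time


class SimulatedApartment:
    """A crude stand-in for a scanned 3D scene: a small grid with a goal cell."""

    def __init__(self, size: int = 5):
        self.size = size
        self.goal = (size - 1, size - 1)
        self.agent = (0, 0)

    def reset(self):
        self.agent = (0, 0)
        return self.agent

    def step(self, action: str):
        x, y = self.agent
        moves = {"up": (x, y - 1), "down": (x, y + 1),
                 "left": (x - 1, y), "right": (x + 1, y)}
        nx, ny = moves[action]
        # Walls: the agent stays in place if the move would leave the apartment.
        if 0 <= nx < self.size and 0 <= ny < self.size:
            self.agent = (nx, ny)
        done = self.agent == self.goal
        reward = 1.0 if done else -0.01
        return self.agent, reward, done


def run_episodes(env: SimulatedApartment, n: int) -> float:
    """Run n random-policy episodes and return episodes per second."""
    start = time.time()
    for _ in range(n):
        env.reset()
        done, steps = False, 0
        while not done and steps < 200:
            _, _, done = env.step(random.choice(["up", "down", "left", "right"]))
            steps += 1
    return n / max(time.time() - start, 1e-9)


if __name__ == "__main__":
    # A simulator runs thousands of episodes in seconds; gathering the same
    # experience on physical hardware would take days or months.
    print(f"{run_episodes(SimulatedApartment(), 2000):.0f} episodes/second")
```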

Facebook has built stunning virtual spaces for its AI programs to explore


One more (haptic) robot thing…


If you want to get future issues of Six Signals in your inbox, please sign up for our newsletter.

