Do you remember the scene in Looney Tunes where Wile E. Coyote sets up a black hole in the road for the Road Runner to fall into, only for everything to go wrong for him? There are so many lessons in it, and the coyote learns so few of them!
Learning is Experiential
As a child growing up, you learn everything through experience. Before you understand language, you can't listen to a teacher telling you the correct way of doing things. Before you can read, you can't pick up a book with the best practices of a particular field. So you learn to sit, you learn to walk, and you even learn to listen and to speak through trial and error.
The apparent efficiency of listening to a teacher or reading a book comes with a tradeoff in effectiveness: we retain relatively little of what we hear or read. That is why exercises are so important: they let you verify that you understand, and they train the muscle memory of the steps required to apply particular tools to arrive at a solution. Exercises turn what could be a passive role into active engagement.
Hands-on Design Workshop
In 2010 I organized a workshop as part of Singularity University's Executive Program, held at NASA Ames in California. The attendees were divided into groups of 6-8 people and tasked with designing future products for a pervasive internet of things. Inspired by Bruce Sterling's terminology, I called it the Spime Design Workshop, and went on to organize additional ones at companies like Cisco, and all over the world. What was particularly interesting about this one at NASA, and relevant to our themes of the Metaverse, is that there was an extra group, who participated in the workshop remotely.
I had made the URL of the workshop's live stream publicly available, and a group of people following it self-organized and set themselves up to work in Second Life!
The way I organized the flow of activity required each group to assign specific roles: the timekeeper, the notetaker, the presenter, the illustrator. After my presentation about the Internet of Things and a description of their task, they had one hour to discuss their ideas and illustrate them on simple sheets of paper. I would go around the tables, let them ask questions, and take photos of the papers to project during the presentations. Everyone present would then vote on the groups' work.
When the surprise group announced to me via direct message that they wanted to participate as well, I was delighted.
Not only did they come up with an interesting concept, but thanks to the nature of the environment they shared, they were able to create a crude 3D prototype illustrating the concept and clarifying its value proposition. Their presentation, projecting the Second Life window with their avatars visible to the physical participants in the room, was engaging and ranked among the top.
Hands-on in the Metaverse
The beauty of the unified nature of the Metaverse is that it potentially eliminates the difference between the participants and the world. As an avatar, you are digital, and while inside the digital world you can have unlimited power: root access, as it were, to the universe in which you exist. Certainly there have to be rules, and it is useful to have constraints on our godlike powers; otherwise we all end up like Wile E. Coyote.
Representing the types of interactions and objects that we can manipulate will require an entirely new design practice. We will have to come up with the primitives of object creation and manipulation, which must become as universally understood and intuitive as copy and paste are today. (Just to confirm how far we have to go, not even copy and paste are completely clear to everyone, whether on our personal computers or our smartphones.)
Direct Manipulation of Concepts
If we are smart and ambitious enough, starting from these basic interactions we should be able to progress rapidly toward representing and manipulating concepts of increasing degrees of abstraction in the Metaverse. The efficiency gains, especially once a consolidated set of tools and interfaces becomes second nature, are going to be astonishing. Our AI tools are adept at both understanding spoken commands and explaining themselves to us in natural language. However, we are all aware of the limitations of this narrow communication channel. The full potential of the Metaverse lies in allowing us to communicate much more effectively and broadly with Artificial Intelligences, as well as with each other. Even more so once we gain the technical means to interface directly with our visual cortex, eliminating the need for kludgy visors completely. It will take time, and the path toward it is going to be exciting!