What a desktop might look like in VR

One of the key mechanics in our upcoming VR puzzle game, Ruberg, is building a wacky chain reaction machine out of a variety of gadgets (gadgets being the key objects users can use to build a machine).

A lot of gadgets on a table top

It is thus important that spawning these gadgets be easy and useful. And in a world with no limits, making the final decision on that user experience can be difficult.

This is the final version we came up with:

https://gfycat.com/FewBoldHarborporpoise

To help us narrow down an approach, we looked at the history of graphical user interfaces (GUIs) for inspiration. Human-computer interfaces have more or less shared the same key interface metaphors over the past few decades. One such metaphor is the desktop metaphor, in which the computer screen is treated as the surface of a desk (hence the name, desktop). In this metaphor, documents and folders resting on the desk's surface become embedded inside windows on a computer screen. These windows can be moved around on a two-dimensional plane, much like pushing objects around on a desk's surface. The desktop metaphor underlies today's window managers.

Xerox Alto Interface
Xerox Alto Interface. Source: http://www.computerhistory.org/revolution/input-output/14/347/1859

This desktop metaphor fits well for our VR puzzle game—Ruberg—because:

  1. Grouping objects is as simple as moving them into a window
  2. Windows can be moved and rearranged

These two points allow us to:

  1. Show a sensibly arranged assortment of gadgets for quick perusal
  2. Provide flexibility in how the user wants to organize their workspace

After this stage, we found ourselves asking:

If desktops are inspired by real-world tables, what would this desktop look like in VR?

We came to this question when we realized that the desktop metaphor was created to fit a three-dimensional view (of the table) into a two-dimensional view (for the computer screen). Why then, should we use an abstracted model when we can go directly to the original source of the metaphor? We are, after all, dealing with three dimensions instead of two.

This is the simple idea we came up with:

An image of a table with its legs cut off

We decided to treat the table as a three-dimensional object with its legs cut off, because who needs gravity in VR user interfaces anyway? This gives us the following features:

Feature 1: 3D objects can be placed “on top” of the table surface rather than flattening it into a 2D representation

Feature 2: Tables can be moved around in space

After a few rounds of testing, we realized that there are two modes of using this "table": one where the user takes the table with them, which we've called shortcut access, and another where the user leaves the table floating in space behind them, which we've called area-dependent access.

At this point, we thought the name table had to go. As you can see above, our table isn’t so much a table as it is a panel—and so we decided to call it a panel. Brilliant decision making—I know.

The value behind shortcut access is obvious: by bringing their "panel" with them, users have quick access to any gadget on it.

https://gfycat.com/DefiantFirstKudu

We then thought about how groups of panels would follow the user. Would they float in front of the user? Or would they be attached to one of the controllers? We went with the latter for two reasons: it minimizes the distance between the gadget being grabbed and the hand doing the grabbing, and physically attaching one thing to another is more grounded in reality, which is exactly what the game is about (the building of chain reaction machines, anyway, *spoiler alert*, but not the story). As a bonus, having both hands involved in the process gives a lot more control over how a gadget can be grabbed out of the panel.
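For the curious, here is a rough, engine-agnostic sketch of what that attachment could look like under the hood. It is only an illustration of the idea, not Ruberg's actual code: the names (`Panel`, `Gadget`, `follow_controller`), the per-frame update, and the offsets are all assumptions, and rotation is ignored entirely.

```python
from dataclasses import dataclass, field

# Positions are plain (x, y, z) tuples to keep the sketch self-contained.
Vec3 = tuple[float, float, float]

def add(a: Vec3, b: Vec3) -> Vec3:
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

@dataclass
class Gadget:
    position: Vec3 = (0.0, 0.0, 0.0)
    local_offset: Vec3 = (0.0, 0.0, 0.0)  # where it rests on top of its panel

@dataclass
class Panel:
    position: Vec3 = (0.0, 0.0, 0.0)
    local_offset: Vec3 = (0.0, 0.0, 0.0)  # where it hangs off the controller
    gadgets: list[Gadget] = field(default_factory=list)

def follow_controller(controller_position: Vec3, panels: list[Panel]) -> None:
    """Called every frame while panels are attached (shortcut access):
    the controller drags its panels, and each panel drags its gadgets."""
    for panel in panels:
        panel.position = add(controller_position, panel.local_offset)
        for gadget in panel.gadgets:
            gadget.position = add(panel.position, gadget.local_offset)
```

Dropping a gadget into the world would then just mean removing it from its panel's list, so it stops inheriting the panel's motion.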

After deciding that the panel would connect to the controller, it was natural for us to decide that panels would be physically joined to each other in order to be moved as a group because, again, this is a game grounded in reality! This gave us our third feature:

Feature 3: Panels have a physical form and can "snap" to each other or "unsnap" from each other to form groups or individual panels

https://gfycat.com/FrighteningThoughtfulIrrawaddydolphin

 

https://gfycat.com/OfficialLoathsomeKillifish
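For those wondering how snapping like this might be implemented, below is a heavily simplified sketch of proximity-based snap/unsnap logic. The thresholds, the single anchor point per panel, and the names (`update_snapping`, `SNAP_DISTANCE`) are illustrative assumptions rather than a description of Ruberg's implementation, which would also need real physics joints and orientation handling.

```python
import math

# Illustrative thresholds (assumed values, not the game's):
SNAP_DISTANCE = 0.05    # snap when two panel anchors come within 5 cm
UNSNAP_DISTANCE = 0.15  # break the joint once they are pulled 15 cm apart

class Panel:
    def __init__(self, name: str, anchor: tuple[float, float, float]):
        self.name = name
        self.anchor = anchor  # one snapping point per panel, for brevity

def anchor_distance(a: Panel, b: Panel) -> float:
    return math.dist(a.anchor, b.anchor)

def update_snapping(panels: list[Panel], joined: set[frozenset]) -> None:
    """joined holds frozenset pairs of panel names that are currently snapped."""
    by_name = {p.name: p for p in panels}
    # Break joints whose panels have been pulled far enough apart.
    for pair in list(joined):
        a, b = (by_name[n] for n in pair)
        if anchor_distance(a, b) > UNSNAP_DISTANCE:
            joined.discard(pair)
    # Form new joints between panels whose anchors are nearly touching.
    for i, a in enumerate(panels):
        for b in panels[i + 1:]:
            pair = frozenset((a.name, b.name))
            if pair not in joined and anchor_distance(a, b) <= SNAP_DISTANCE:
                joined.add(pair)
```

Using separate snap and unsnap thresholds gives a bit of hysteresis, so a panel held right at the boundary doesn't flicker between snapped and unsnapped.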

The shortcut access is useful, but what's also useful is being able to leave some panels floating in the air as an alternative way of accessing gadgets. This has some benefits: some users may prefer single-handed operation, and others may want to spam out a huge pile of gadgets to pick from rather than grabbing from the panel and placing one gadget at a time, which is exactly what area-dependent access affords.

It was great fun thinking about what a desktop might look like in VR and how users can use it to be productive. What surprised us was how relevant the old metaphors still are. We may not need the abstraction that comes from converting 3D paradigms into 2D ones, but it was helpful to look at the source of those metaphors to see what early designers wanted to emulate. Modern interfaces have also evolved quite a bit, and we were able to draw inspiration from them despite them still being abstractions.

That’s it for this month. We’ll be releasing another dev blog about another feature in Ruberg next month. In the meantime, if you think you’ll enjoy building wacky chain reaction machines or solving challenging physics-based puzzles, please add Ruberg to your Steam Wishlist!


Here’s a bonus feature (though it’s a bit hard to see in the video):

Feature 4: Dampening of containers

https://gfycat.com/ThornyInfamousIraniangroundjay

 

https://gfycat.com/RewardingDamagedAfricanmolesnake