I’ve been researching the insane non-sequitur play in simulation sandbox games, and the first paper looking into it has just been published. You can find a summary and commentary on it here.
When you create a soundtrack by mixing different sounds into the left and right channels, you have made use of modern stereo sound systems, but your sounds will not take advantage of 5.1 surround sound. And similarly, if you create six different tracks, one for each speaker of a 5.1 system, then what are you going to do when 7.1 sound systems come out?
Rather than place a sound in a specific speaker, the better approach is to place sounds in a 3D space around the user and then let the software play the sound as best it can out of whichever speakers the user has. This system for making audio content is future-proof because as additional speakers are added around and above and below, it will be able to accurately make use of all of them.
This is what we need with haptics. It has to be able to accommodate an arbitrary level of fidelity, so that games made today remain playable on the devices of the future. At the moment it is a real mess: game devs have to hardcode in support for specific devices or they won’t work at all. To cut through all this mess, I think what you need is middleware – a universal haptic interface through which games and haptic devices communicate.
Bubble Effects
These are general locational effects applied primarily to large devices that surround the player, such as wind machines, heaters and floor surfaces, but they can be translated down into local body effects such as gamepad rumble as well.
Bubble effects are specified in terms of a 3D location, direction, and magnitude around the player’s centre of mass. Think of them like sounds that you place in the 3D space to be delivered through a surround sound speaker system. So you can feed in an explosion location and magnitude, a wind and rain direction, rough terrain, slippery mud, snowstorm winds, etc. (For example, the Cyberith omni-directional treadmill has a floor that can rumble.)
If there are no bubble devices then the developer and the player have the option to translate those haptic outputs into body effects instead (or in addition to the bubble effects).
Body Effects
These are local effects that are specified in terms of location on a generic player body. You can specify a single point, or, as when rigging a character for animation, you can paint vertex weights across the mesh to cover a whole area. When an effect is passed through to the player’s hardware, the system searches for the nearest haptic device.
So an effect may occur on your elbow, but since you don’t have any haptic device there, it will automatically look up your arm and into your body, or down your arm to your hand, until it finds the nearest haptic device to use to deliver the effect. You can specify whether it should try to reach in towards the centre of mass, or out towards the limbs in its search.
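A minimal sketch of how that nearest-device search could work, treating the body as a graph of regions. All the region names, the graph layout, and the inward/outward ordering trick are invented for illustration:

```python
from collections import deque

# Hypothetical body-region graph: each region lists its neighbours.
# Inward neighbours (towards the centre of mass) are listed first, so a
# breadth-first search visits them before the limbs; reversing the order
# searches out towards the limbs instead.
BODY_GRAPH = {
    "left_hand":      ["left_forearm"],
    "left_forearm":   ["left_elbow", "left_hand"],
    "left_elbow":     ["left_upper_arm", "left_forearm"],
    "left_upper_arm": ["chest", "left_elbow"],
    "chest":          ["left_upper_arm", "right_upper_arm"],
    "right_upper_arm": ["chest", "right_elbow"],
    "right_elbow":    ["right_upper_arm", "right_forearm"],
    "right_forearm":  ["right_elbow", "right_hand"],
    "right_hand":     ["right_forearm"],
}

def nearest_device(start, devices, prefer_inward=True):
    """Breadth-first search from the effect's body region to the
    closest region that has a haptic device attached."""
    seen = {start}
    queue = deque([start])
    while queue:
        region = queue.popleft()
        if region in devices:
            return region
        neighbours = BODY_GRAPH[region]
        if not prefer_inward:
            neighbours = list(reversed(neighbours))
        for n in neighbours:
            if n not in seen:
                seen.add(n)
                queue.append(n)
    return None  # no haptic devices at all

# An effect lands on the left elbow; the player only has a vest (chest)
# and a gamepad (left hand). Searching inward finds the vest first.
print(nearest_device("left_elbow", {"chest", "left_hand"}))
```

With `prefer_inward=False` the same call would resolve to the gamepad in the left hand instead, which is the “reach in towards the centre of mass, or out towards the limbs” choice described above.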
You can specify whether it should prioritise location or direction. For example if a bullet comes from the right and hits your character’s left arm, but all you have on is a haptic vest, then should the vest deliver an impact on the left (where the arm was hit) or on the right (the direction the effect came from)? If you prioritise location, it will be felt on the left near the arm. If you prioritise direction, it will be felt on the right in the direction it came from.
Just as bubble effects can be translated down into body effects, body effects can be translated up into bubble effects, just in case the player lacks any kind of haptic vest but has windmachines.
Output parameters (every haptic event can include these pieces of data, all sent as a single input to a haptic device):
- Location: All haptic effects can have a position in relation to the player’s body (for body effects) or centre of mass (for bubble effects). Or they can be global and apply everywhere.
- Vector: With direction and magnitude, and possibly torque. If the device is incapable of representing one of these (eg a rumble motor that can only vibrate to convey magnitude), then it will only represent what it can. These force vectors can be used not only for active events like an explosion that pushes the player’s body, but also the passive resistance of objects that prevent the player from moving through them. If the haptic device is capable of limiting motion, then it would be done through vectors.
- Waveform: An audio file. High fidelity haptic devices (eg ViviTouch) can output exactly the “texture” of a waveform, but rumble motors can try their best to match their rumble speed to the waveform.
- Image / Video: Two-dimensional surface-creation devices (eg a floor of an omni-directional treadmill, or a surface under a fingertip) can take a static image or a moving video file and use the brightness of each pixel to determine the height of each section of the surface. Or you could feed in the heightmap of the terrain mesh, or any other 2D array.
- Kinetic: The usual physical forces we associate with haptic feedback.
- Thermal: Changes in temperature that can be achieved with air conditioners, heat lamps, and other such devices to create the heat from an explosion or the chill of a mountaintop.
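As a sketch, the parameters above could be bundled into a single event structure, with devices rendering whichever subset they are capable of. Every field name here is illustrative, not a real API:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class HapticEvent:
    """One event carrying the output parameters described above.
    Every field is optional; a device renders what it can."""
    location: Optional[tuple] = None   # (x, y, z) on body or bubble; None = global
    vector: Optional[tuple] = None     # force direction scaled by magnitude
    torque: Optional[tuple] = None     # optional rotational component
    waveform: Optional[str] = None     # audio file giving the "texture"
    surface: Optional[List[List[float]]] = None  # 2D heightmap (decal, terrain, video frame)
    thermal: Optional[float] = None    # temperature delta, e.g. +40 for an explosion

def renderable_fields(event, capabilities):
    """Return only the parameters this particular device can represent."""
    return {name: getattr(event, name)
            for name in capabilities
            if getattr(event, name) is not None}

# A rumble motor that can only convey a force vector ignores the rest:
explosion = HapticEvent(location=(0, 1, 2), vector=(0, 0, -5), thermal=40.0)
print(renderable_fields(explosion, ["vector", "waveform"]))
```

The point is the degradation path: the same event object is handed to every device, and a heat lamp reads only `thermal` while a ViviTouch pad reads only `waveform`.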
One can imagine in the future a singular haptic module that can take in all these different forms of data and produce all these different effects. There could be a module that straps on, like a PrioVR sensor, for body effects. It is also easy to imagine a surround sound system where each speaker also has a built-in way of blowing hot and cold air, and maybe even of creating the pulse of an explosive shockwave with ultrasonic or infrasound transducers. But for now, the system would simply send the data to whatever devices the player had, and do whatever it could.
What This Means
This system was designed with “the path of least resistance” in mind, with minimal extra work needed from developers, players and hardware manufacturers for haptic effects to be supported. Game events already take place in specific locations and involve physics forces, and all the developer needs to do is feed that data into the system as a haptic event, and it will make it work with whatever devices the player has – no longer do developers need to hardcode in specific support for specific devices. As long as the makers of the hardware include an interface with the system, and game devs support the system, it will work with all current and future devices, in whatever arrangement.
That’s the beauty of this system: If you make a game and then later a new device is released, as long as it interfaces with this haptic system, it will have retroactive compatibility with your game! And existing haptic devices like the Novint Falcon can be updated to include support for this haptic system, and then they will work with all games that support it. That is why this kind of system is future-proof. This is why it might be particularly relevant to OSVR.
Even though these can accommodate very high-fidelity outputs such as waveforms and surface textures, developers don’t necessarily have to worry about making content for these – they can just re-use the assets they have already created: Namely decals, terrain textures and sound effects.
For example, the most common haptic output is taking damage. The quickest and cheapest way to get this running is simply to feed into the Haptic Vector the direction the damage came from and the amount of damage inflicted, feed into the Haptic Waveform the sound file for the damage effect, and feed into the Haptic Surface Texture the decal of the bullet wound that would be applied to the character. This gives different and specific haptic effects for different damage types, magnitudes and directions, but even if the developer failed to feed in one or two of these things, the system would still make use of whatever it was given. If all they fed into the haptic system was damage magnitude, then it could still make use of that. But ideally game studios would one day have a haptic designer in the same way that they have a sound designer today – someone whose job it is to create specific effects for all the events in the game and make sure all those effects are as good as possible.
But even if you don’t have the time and resources to handcraft all of these effects, then you can still just throw in damage direction, magnitude and sound effect. They may not be particularly realistic haptic effects, or as fun as possible, but they will definitely be interesting and different from each other, and that will be far superior to nothing. It’s a good short-term solution.
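A sketch of that quick-and-cheap path, reusing assets the game already has. The `submit` call and every other name here are hypothetical, purely to show the shape of the idea:

```python
def on_damage(direction, amount, sound_file, wound_decal, haptics):
    """Feed a damage event into a hypothetical haptic system, reusing
    the existing sound effect and decal as the haptic 'texture'."""
    event = {
        "vector": tuple(amount * d for d in direction),  # direction * magnitude
        "waveform": sound_file,   # the damage sound, reused as a waveform
        "surface": wound_decal,   # the bullet-wound decal, reused as a heightmap
    }
    # Anything the developer didn't supply is simply dropped;
    # the system makes use of whatever it was given.
    haptics.submit({k: v for k, v in event.items() if v is not None})

class LoggingHaptics:
    """Stand-in for the middleware, just records what it receives."""
    def __init__(self):
        self.events = []
    def submit(self, event):
        self.events.append(event)

# A 25-damage hit from the player's right, with a sound but no decal:
h = LoggingHaptics()
on_damage((1, 0, 0), 25, "bullet_hit.wav", None, h)
print(h.events[0])
```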
Integration with devices is as simple as giving them a default profile of locations on the bubble and / or body. For example, a gamepad’s haptic profile might simply sit at the centre of mass of the player’s body, or maybe it would occupy two points – one in each hand. Either way, if it is the only haptic device you have, then it will be fed every haptic event, so the result is the same. But if you also have a haptic vest, then maybe the gamepad should occupy the hands, so that chest events are sent to the vest and hand events are sent to the gamepad.
Player Options and Customisation:
Allows players to:
- Bind their haptic devices to different haptic outputs, just like you can bind your keyboard and mouse inputs to different game actions like spacebar to jump, left-click to shoot. You get a list of all your detected haptic devices and they show up in their default places on your body/bubble, but you can place them anywhere you want if you want to re-map them because your setup is non-standard.
- For body effects: You can place a point in space on the surface of the mesh or inside the mesh. Or like when rigging a character for animation, you can paint vertex weights across the mesh to cover a whole area.
- For bubble effects: You can specify where your output devices are in relation to your centre of mass, and which direction they are pointing, if any. By default they are assumed to be pointing at your centre of mass.
- Adjust sensitivity, and invert or mute any output
- Add an offset to the direction of vector axes to compensate for differences in how your particular equipment is set up.
For this to be maximally useful, especially in terms of player customisation of sensitivities and output bindings, it would have to work like a gaming peripheral driver: You can create a general default profile that will work for all games unless you have created a different profile for the specific game that you are opening up. Like gaming peripheral drivers, it would detect the game being opened, look for a profile attached to a game of that name, and if it didn’t find any, activate the default profile.
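The profile lookup described above could be sketched like this, assuming profiles are stored as JSON files named after each game (the file layout is an assumption, not a spec):

```python
import json
import os

def load_profile(game_name, profile_dir):
    """Look for a profile named after the detected game; if none exists,
    fall back to the default profile, just like a gaming peripheral driver."""
    candidate = os.path.join(profile_dir, game_name + ".json")
    if not os.path.exists(candidate):
        candidate = os.path.join(profile_dir, "default.json")
    with open(candidate) as f:
        return json.load(f)
```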
There’s also the potential it opens up for prototyping: Anyone can get an arduino and some desk fans or some other homemade device and create an interface for it for the universal haptic system, and then play some games with haptic system compatibility to test out their prototype.
Realistically, the hat switch mode for the joystick gives you access to a maximum of four items. This is a problem for games with more than four weapons (ie, most games).
The lack of a scroll wheel is one of the most bothersome problems with most game controllers.
What if you could scroll by spinning the joystick around like a dial?
Moving the joystick out of the dead zone engages it as a scroll wheel and the weapon list opens up. Clockwise motion is scrolling up, counter clockwise is scrolling down. Releasing the joystick back into the dead zone disengages the scrolling and the weapon menu closes down.
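A rough sketch of that dial-scrolling logic in Python. freePIE scripts are Python, but this is a standalone illustration rather than a working freePIE script, and the dead-zone size, tick spacing, and clockwise sign convention are all assumptions that depend on your axis setup:

```python
import math

DEAD_ZONE = 0.3       # stick magnitude below this disengages scrolling
STEP = math.pi / 4    # radians of rotation per scroll tick (assumed)

class DialScroller:
    """Turns circular thumbstick motion into discrete scroll ticks."""
    def __init__(self):
        self.last_angle = None
        self.accum = 0.0

    def update(self, x, y):
        """Feed the stick position each frame; returns the number of
        scroll ticks crossed (sign depends on rotation direction)."""
        if math.hypot(x, y) < DEAD_ZONE:
            self.last_angle = None  # back in the dead zone: disengage
            self.accum = 0.0
            return 0
        angle = math.atan2(y, x)
        ticks = 0
        if self.last_angle is not None:
            delta = angle - self.last_angle
            # unwrap the jump across the -pi/+pi seam
            if delta > math.pi:
                delta -= 2 * math.pi
            elif delta < -math.pi:
                delta += 2 * math.pi
            self.accum += delta
            while self.accum >= STEP:
                self.accum -= STEP
                ticks += 1
            while self.accum <= -STEP:
                self.accum += STEP
                ticks -= 1
        self.last_angle = angle
        return ticks
```

Sweeping the stick around the rim accumulates angle; every quarter-turn crossed emits a tick, and releasing into the dead zone resets everything, matching the engage/disengage behaviour described above.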
Now the only thing missing is the tactile feedback of being able to feel how close you are to clicking over the threshold to the next item on the list. Some options for achieving this:
You could have motorized thumbsticks that can provide resistance or move on their own. This could let you feel the resistance as you approach a boundary, and the satisfying click when you pass over that threshold.
ViviTouch produces a small high-fidelity haptic pad for thumbsticks that acts like an artificial muscle, opening up a whole new range of possible haptic sensations beyond simple rumble.
Valve is using a haptic feedback touchpad to give control information to your thumbs.
Tactical Haptics uses skin shear to provide directional haptic feedback.
You can copy and paste the freePIE code from this Google doc into a new freePIE script and it should all work:
Sorry for the radio silence. Been very busy recently with unannounced projects.
One of which requires some islands. Exactly how many, how far apart they should be, and how large, we really don’t know. Playtesting will determine. But ideally you want a way to tweak and experiment with those kinds of variables in order to find the best combination for gameplay. With hand-made assets this is very time-intensive to do, as it means prototyping levels by getting artists to make them manually. What if you could just playtest a level, turn some knobs to adjust its layout, then playtest it again? Procedurally generated environments allow for much easier game balancing for exactly this reason.
A simple method for generating a variety of plausible-looking islands that have gameplay-relevant differences in topology occurred to me and I prototyped it with very nice results:
Considering how simple the concept is, I’m surprised at how well these islands turned out. We usually hear people complain that procedurally-generated content is too repetitive, but these each seem to have their own character, entailing different gameplay, and are for the most part (dare I say it) rather pretty.
A while ago it occurred to me that part of the repetition and boredom of PCG might have to do with the absence of the qualitatively different “powers of ten” that Will Wright talked about inspiring Spore. That is, if we have lots of different things going on at various scales, all layered on top of each other, maybe from the player’s perspective it wouldn’t look quite as repetitive.
The islands involve multiple layers of Perlin noise at different scales. Perlin noise gives you a gradient, and we needed sharp cliffs, so I simply set a threshold: if the noise value is over that threshold, it creates a plateau. By varying that threshold, you can make those land features more like rivers or blobs. I’m also doing some simple maths to sharply exaggerate differences in the noise before deciding whether or not it is over the threshold.
To keep things plausible, the height or general magnitude of a feature is randomly determined first, and this value is used to probabilistically affect variables like the scale modifier of the Perlin noise and the threshold (the “riverness-to-blobness” of the feature).
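A minimal sketch of one such layer. Cheap value noise stands in for Perlin noise here, and since the post doesn’t specify the exaggeration maths, the S-curve below is just one plausible choice:

```python
import math
import random

def value_noise(x, y, seed=0):
    """Stand-in for Perlin noise: random lattice values with smooth
    bilinear interpolation. Returns a value in [0, 1]."""
    def lattice(ix, iy):
        return random.Random((ix * 73856093) ^ (iy * 19349663) ^ seed).random()
    x0, y0 = math.floor(x), math.floor(y)
    fx, fy = x - x0, y - y0
    sx = fx * fx * (3 - 2 * fx)  # smoothstep easing
    sy = fy * fy * (3 - 2 * fy)
    top = lattice(x0, y0) * (1 - sx) + lattice(x0 + 1, y0) * sx
    bot = lattice(x0, y0 + 1) * (1 - sx) + lattice(x0 + 1, y0 + 1) * sx
    return top * (1 - sy) + bot * sy

def feature_height(x, y, scale, threshold, plateau, sharpness=4.0, seed=0):
    """One layer of the island generator: exaggerate the noise with an
    S-curve, then clamp everything over the threshold to a flat plateau.
    A high threshold gives thin river-like features; a low one, blobs."""
    n = value_noise(x / scale, y / scale, seed)
    n = n ** sharpness / (n ** sharpness + (1 - n) ** sharpness)
    return plateau if n > threshold else 0.0
```

Layering several calls with different `scale`, `threshold`, and `plateau` values (the “powers of ten” idea) is what keeps the result from reading as one repetitive texture.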
This system allows us to instantly prototype and test any combination of variables like the size, number, and distances between islands without having to waste the time of artists getting them to make entire levels. It also gives us the option of having a virtually infinite world if we decide to go for that.
You might be asking yourself, “Why are there little circles of flat land in the centre of the islands?”
Excellent question. Well spotted. I’m not going to tell you.
Okay, I have already spent far too much time on this thing. I feel kinda guilty. Sometimes I just can’t stop thinking about an idea for a game mechanic until I prototype it, but in this case I already have too many projects running in parallel to justify spending time on this one: A Hoverboard game, where you control your feet using the Razer Hydra.
I’ve always enjoyed console games like Aggressive Inline, Tony Hawk’s Pro Skater and Dave Mirra’s BMX. It was immense fun doing crazy jumps and backflips and other stunts (and seeing the ragdoll when you crashed in Dave Mirra). But the trick controls were never really that intuitive: They required you to string together a sequence of inputs to execute a pre-animated trick like a kickflip.
So, what if you had one-to-one control of a physics-based extreme sports game, set in the future, on a hoverboard, in a multi-leveled cyberpunk city sandbox?
So I’ve got some basic stuff working but overall I’m not satisfied with the “feel” of the board control. It is too twitchy and not as “grounded” as I would like (eg you feel like you have no mass). But like I said, I’ve spent too much time on this already.
My musings while I couldn’t stop thinking about this:
- Your character’s hands and body would automatically react to how you moved the feet to stay balanced (notice in the prototype footage, the upper half is still as a statue because I never got around to implementing this). I think this will really boost the awesomeness factor to the feeling of positioning the board, and maybe make you feel like you have more weight.
- Each analog trigger controls a thruster on that end of the board. So when you put your right foot behind you and pull the right trigger, jets shoot out of what is now the back of the board to push you forward. But if you put your right foot in front, those jets would obviously push you back. And similarly for the left foot.
- Physics-based control: The board is pushing downward to lift you up. Therefore, if you tilt the board one way, it should push you in the opposite direction, as you would intuitively expect. And as a consequence…
- You should be able to “hit the brakes” like a snowboarder, by bringing the board in front of you and tilting it to shoot those hover jets ahead, and thus push you back and slow you down drastically.
- You should be able to build up a lot of speed, given that it is a frictionless form of travel.
- You should be able to strafe and slide and spin on the spot, and spin as you move.
- You should be able to “jump” by doing an actual ollie: If you crouch down (by raising the controllers), and then straighten your legs really fast, then you will push the hoverboard near to the ground (and its repulsive force, like a magnet, will increase exponentially with proximity), pushing you upward. Then you can bring your feet up to your chest again (by raising the controllers) and thus jump over things, or jump off ramps.
- Using this system, grinding should just work, as a result (though you may have to make contact with a part of the board that isn’t a hover jet).
- By default your feet would be stuck to the board, but you could do flip tricks with your board by moving your feet in a particular way, and hitting a “release board” button that would send the board rotating according to the movements you made – all physics-based. So to get your feet back on the board, you’re probably going to have to hit another button to grab the board and pull it back toward your feet before you land. It would be possible for you to kick the board away from you since this is a physics-based trick system.
- There should be other, secondary features integrated into the physics-based gameplay, like:
- Grabbing vertical or horizontal poles to start swinging around them, and releasing the button to let go of the pole.
- Grabbing on to vehicles (hovercars, hoverbikes, trams, police cars, etc) to hitch a ride, and let go when you are ready.
- Ragdoll when you crash. (And ragdoll for pedestrians when you crash into them). Maybe you could even use this to attack pedestrians with your tricks. Or grab them and throw them off buildings using the “grab” button.
- Maybe have a “sticky hover” feature that you have to activate manually, and drains as you use it (like a speed boost in a racing game), and what it does is allow your board to “hover” across any surface, like walls, ceilings, etc.
- Maybe have a “hover flight” feature that allows you to temporarily fly using the hoverboard (which would drain a bar, like the sticky hover). Maybe combine this with the thrusters controlled by the triggers – so you just point your board upward and hit the trigger if you want to get airborne briefly.
- Maybe your hoverboard would act like a parachute, allowing you to fall from any height without getting hurt, as long as you landed with your board positioned beneath you correctly.
- Maybe you could grind on powerlines to supercharge your hoverboard in some way (or maybe just to recharge it).
- The hoverboard should have streaking neon taillights like the bikes in the opening of Akira.
- I don’t know what the core challenges would involve. Maybe graffiti (keeping in the whole “punk” theme of hoverboarding), vandalism, escaping from the cops, racing other skaters…
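The physics-based control in the bullets above – lift that grows sharply with proximity to the ground, and a tilted board pushing you the opposite way it leans – could be sketched as a toy force model. Every constant here is invented:

```python
import math

HOVER_STRENGTH = 20.0  # newtons of lift at zero height (assumed)
FALLOFF = 0.5          # metres over which lift decays by a factor of e (assumed)

def hover_force(height, tilt_angle):
    """Return (lateral, vertical) force from the hover jets.
    height: board's distance above the ground, in metres.
    tilt_angle: radians; 0 = level board, positive = leaning one way."""
    # Repulsion grows exponentially as the board nears the ground,
    # which is what makes the ollie work: slam the board down fast
    # and the spike in lift launches you upward.
    magnitude = HOVER_STRENGTH * math.exp(-height / FALLOFF)
    # A tilted board redirects thrust sideways, pushing you opposite
    # to the lean - the same principle as the snowboarder-style brake.
    lateral = -magnitude * math.sin(tilt_angle)
    vertical = magnitude * math.cos(tilt_angle)
    return lateral, vertical
```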
There definitely aren’t enough hoverboard games. And not enough extreme sports games for PC.
This game would really only work with the Hydra, I must admit. But I don’t care – it would be worth it because it would be amazing. I would also try the Oculus Rift with it, because I think they could be very complementary in this case.
Anyway, I definitely want to play this game more than I want to make it. I don’t even have the time to make it. But I’m happy to give my prototype to anyone interested in tinkering or experimenting or making some kind of game out of it. Just let me know and I’ll send you a copy.
You can finally play my honours thesis game experiment, Microbial Sketchpad, on Kongregate:
Microbial Sketchpad is a simulation sandbox / artistic creativity toy where you can create moving visual art, or conduct experiments on the ecosystem, or both.
Created as part of a thesis investigating the role of exploratory play in understanding emergent systems.
Use current tool / Adjust interface controls: Left Click
Extract: Hold Right Click
Drag camera: Middle Click (or WASD or Arrow Keys)
Zoom: Scroll Wheel
Select different tools: 1 – 7
Show/hide info: i
Show/hide menu: p or Tab
Reset world: r or Backspace
I experimented with some game ideas relating to platformers while at uni, and people seem to agree that the result is quite interesting.
I was trying to see if I could take the platformer genre and make that system of interaction as interesting as possible. The result is a momentum toy that requires quick reactions and judgement calls.
More info (eg Controls) can be found in the readme included.
You can also play it online here, but the resolution isn’t quite right, and it seems to run slower:
I added this and another quick game experiment from a while ago to the About page, for anyone interested.