I sat on Razer's Project Esther gaming cushion for 15 minutes and it messed with my brain in the best possible way
Razer Sensa HD Haptics wants to move gaming beyond the rumble packs of the past
CES 2024 isn't short on exciting concept products that will never see the light of an actual production line, but after my time with Razer's Project Esther, this gamer equivalent of a cabbie's beaded seat cushion absolutely needs to be one of the few that breaks through to the mainstream.
The concept is simple: a cushion you can put on any chair, gaming or otherwise, connect to a PC (or another compatible gaming system like a PS5), and use to experience deeper immersion in games, movies, music, and more through the power of carefully calibrated vibrations. But unlike haptics of the past, which have never really moved beyond rumbles of varying intensity, the Razer Sensa HD Haptics at the core of Project Esther aims to do something different, using directional haptics to better sculpt how we perceive the games we're playing.
I am not a materials engineer or a neurologist, but I have a passing understanding of how the brain really is a weird-as-hell organ, and harnessing that weirdness can create some truly incredible effects and experiences.
It's how optical illusions work, why two people can listen to the same audio recording and argue online for years about whether it's saying "Yanny" or "Laurel", and how that blue-dress-gold-dress image went so viral back in 2015. Perception, my friends, can get really weird, and it can be manipulated pretty easily by skilled hands.
Fortunately for Razer, it acquired a company called Interhaptics in 2022, whose very smart people know a whole lot more than I do about this stuff. They developed a software development kit (SDK) for coding new kinds of directional haptics that can border on a form of haptic illusion, the way a swirling spiral can start getting weird, but in a really fun way.
Through the skillful manipulation of vibration alongside accompanying visual or auditory stimuli, you can prompt your brain into a kind of multisensory integration that translates haptic signals into something deeper than mere vibrations of varying intensity.
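I'm not privy to how the Interhaptics SDK actually does this under the hood, but here's a minimal sketch of the basic idea of directional haptics as I understand it. The actuator layout, angles, and function names are all my own assumptions for illustration, not anything from Razer's or Interhaptics' actual SDK:

```python
import math

# Hypothetical sketch, NOT the Interhaptics API: spread one directional
# haptic event across a ring of actuators around the torso. Actuators
# facing the event's direction vibrate hardest, so the combined pattern
# reads to the brain as "something over there" rather than a flat rumble.

ACTUATOR_ANGLES = [0, 45, 90, 135, 180, 225, 270, 315]  # degrees; 0 = front

def actuator_intensities(source_angle_deg: float, base_intensity: float) -> list[float]:
    """Cosine-falloff panning across the actuator ring."""
    levels = []
    for angle in ACTUATOR_ANGLES:
        # How closely this actuator faces the haptic source
        diff = math.radians(source_angle_deg - angle)
        gain = max(0.0, math.cos(diff))  # actuators facing away stay silent
        levels.append(base_intensity * gain)
    return levels

# An event 45 degrees to the right lights up the front-right actuators most
print([round(level, 2) for level in actuator_intensities(45.0, 1.0)])
```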
Feeling things that aren't there because your brain just makes stuff up
Not to get too mystical, but the world we move through every day is probably, in an objective sense, very different from how we experience it. The various stimuli we sense through sight, hearing, taste, touch, balance, whatever, all reach the brain as electrical signals through our various nerves, and the brain takes it all in and stitches it into the reality we experience.
As a result, the green I see on a car might actually look radically different to you, but since the signal is internally consistent to each of us, we both understand it to be something we each call green, regardless of what "color" it might look like were I to see it through your eyes and you through mine.
Another classic example is how we move through life more or less looking at everything across the visible ridge of our nose and the inner corners of each eye...and our brain just omits that visual information from the final image we see because it's not in any way relevant to our perception of the world. We literally have to think about it and force ourselves to see the ridge of the nose that has always been there.
It's this tendency of the brain to freelance its own reality from impulses that makes Project Esther so compelling. One of the first real demonstrations I experienced was a video of a floating orb on a monitor in front of me. As the orb drifted into the background, the haptic response in Project Esther vibrated at my back with less and less intensity; as it approached, the intensity increased. Pretty standard stuff on its own, but then the orb slowly moved off the edge of the screen, and the haptic sensation at my side felt as if the orb was genuinely moving past me.
It's when the orb moved "behind me" that things seemed to shift. The haptic response at my back was nothing more than a vibration pattern moving in a wave from one side to the other, but because I had seen the orb move to the right of the screen and disappear, the haptics continued the orb's journey around my body. My brain just filled in that bit (a process known as "perceptual constancy"), for no reason other than to maintain a consistent perception of the world around me based on how I expect reality to operate.
Again, this isn't new in itself, but by the time the orb had emerged on the other side of the screen and returned to center, my perception of it as being "in front" of me had also changed along the way.
To be clear, other than the Razer Kraken V3 HyperSense over my ears (which was itself vibrating in close concert with Project Esther), there were no haptics on the front of my body. Yet somehow my brain had gone from treating the haptic wave at my back as something oscillating behind me like a turning fan to translating it into a fuller 360-degree experience of something that was now "in front" of me, despite the actual haptic responses for directly in front and directly behind being the exact same physical vibration.
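That identical-vibration point is worth making concrete. Here's a sketch of the two mappings I think were at work in the orb demo (the hardware layout and numbers are my own guesses, not Razer's spec): intensity falls off with the orb's apparent distance, and direction is just a left/right pan, so "front" and "behind" come out physically identical:

```python
import math

# Toy model of the orb demo under my own assumptions: the cushion only has
# back actuators, so a source dead ahead and a source dead behind produce
# the same output -- the on-screen orb is what tells your brain which it is.

def orb_haptics(distance: float, azimuth_deg: float) -> tuple[float, float]:
    """Return (left, right) back-actuator intensities for the orb."""
    strength = 1.0 / (1.0 + distance * distance)  # fade as the orb recedes
    pan = math.sin(math.radians(azimuth_deg))     # -1 = full left, +1 = full right
    left = strength * (1.0 - pan) / 2.0
    right = strength * (1.0 + pan) / 2.0
    return left, right

# Azimuth 0 (front) and 180 (behind) are physically identical:
for azimuth in (0.0, 90.0, 180.0, 270.0):
    left, right = orb_haptics(1.0, azimuth)
    print(f"{azimuth:5.1f} deg -> left {left:.2f}, right {right:.2f}")
```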
It's something of a parlor trick when done with a wavy green orb, but replace the orb with Lady Dimitrescu stalking you through her castle in the first act of Resident Evil Village, and you can immediately see the potential for something like this.
Next, the Razer haptics expert walking me through the demo played a scene of a rainy street, and with my mind primed by the visuals on the monitor and the rain sounds in the Kraken V3 HyperSense, my brain simply translated the soft pitter-patter vibrations of the haptics in the seat and at my back into the sensation of being rained on.
And not just rained on at my back and bottom, or vibrated at in the cadence of rain, but somehow actually being rained on. This was less than 10 minutes into the demo, and I was pretty much sold at that point. But all of this was really just setup for the actual functional use of this tech in real games.
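If I had to guess at how that rain effect is built, it would look something like this toy sketch, with the timings, strengths, and actuator names all invented for illustration:

```python
import random
import time

# A toy version of the rain effect as I understood it: short, soft pulses
# at random spots and irregular intervals, which a brain primed by rain
# visuals and audio reads as raindrops rather than a buzzing cushion.

ACTUATORS = ["seat_left", "seat_right", "back_low", "back_left", "back_right"]

def rain_burst(duration_s: float) -> None:
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        spot = random.choice(ACTUATORS)
        strength = random.uniform(0.1, 0.3)     # soft pitter-patter, never a rumble
        print(f"pulse {spot} at {strength:.2f}")
        time.sleep(random.uniform(0.02, 0.12))  # irregular cadence, like rainfall

rain_burst(0.5)
```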
Opening up whole new doors of perception from inside a battlemech
The final part of the demo let me experience two visions for immersive haptics in gaming, courtesy of Mortal Kombat 1 and MechWarrior 5: Mercenaries.
Starting with the former, I got to beat up on Scorpion as Sub-Zero, with occasional tag-team action from other characters doling out damage at random intervals.
Since Mortal Kombat 1 did not incorporate the Interhaptics directional haptics SDK, punches and music registered in the haptic controller I was using, in the headset on my head, and in the Project Esther cushion...but it was just rumbles, really. It was really good, don't get me wrong, but we've all been experiencing this kind of thing with games and game accessories over the years. Some are better, some are worse, but ultimately there's nothing different about the sensations themselves, just their intensities.
A special on-rails sequence in MechWarrior 5: Mercenaries, on the other hand, was built with the Interhaptics SDK's directional HD haptics, and everything felt different. All four weapons firing from my mech felt distinct, consistent, and seemingly positioned around my body. Shoulder-mounted rocket volleys registered as if they were above my shoulder. The machine gun arm made short work of enemy tanks, and each staccato tcht-tcht-tcht of the machine gun registered at my right side, more or less where the mech's arm cannon would be.
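I can only guess at how the game side of this plumbing looks, but conceptually it could be as simple as tagging each weapon's fire event with a body-relative direction for the haptics runtime to pan across the cushion. Everything in this sketch, from the event names to the angles, is invented for illustration:

```python
from dataclasses import dataclass

# A guess at how a game might tag weapon fire for a directional haptics
# runtime (names, angles, and the dispatch function are all hypothetical):
# each weapon carries a body-relative direction, so rockets read as "above
# the shoulder" and the arm cannon registers at your right side.

@dataclass
class HapticEvent:
    name: str
    azimuth_deg: float    # 0 = ahead, 90 = right, -90 = left, 180 = behind
    elevation_deg: float  # 0 = torso height, positive = above
    intensity: float      # 0.0 to 1.0

WEAPON_MOUNTS = {
    "rocket_volley": HapticEvent("rocket_volley", 30.0, 45.0, 0.9),    # above right shoulder
    "machine_gun":   HapticEvent("machine_gun",   90.0, 0.0, 0.6),     # right arm cannon
    "laser":         HapticEvent("laser",         -90.0, -15.0, 0.5),  # lower left mount
}

def on_weapon_fired(weapon: str) -> None:
    event = WEAPON_MOUNTS[weapon]
    # A real SDK dispatch call would go here; print stands in for it
    print(f"haptic: {event.name} at {event.azimuth_deg} deg azimuth, "
          f"{event.elevation_deg} deg elevation, intensity {event.intensity}")

on_weapon_fired("machine_gun")
```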
Lasers similarly had a longer, more dopplery shoop feel to them from my left, below where the missiles had felt like they were coming from a moment earlier, and the same went for the single dumbfire missile on my right side (though I don't do dumbfire, so I tried it once, shrugged it off, and went back to my other weapons).
Hits to the mech also registered at different positions and intensities: taking a flanking shot felt like getting hit in the back, while tank shells at my front landed more like nuisance fire than the heavier blows of duking it out with a heavily armed enemy mech.
Again, I'll point out that this isn't new technology. These kinds of haptics have been around for years now, and they've been used effectively to deepen immersion in any number of games I've played, but by the time the on-rails mission was done, it had felt like an actual battle, unlike any similar attempt at haptic immersion I've experienced. Needless to say, should the gamer gods grant us mercy and deliver this concept to actual production, it would be the must-have accessory for every gamer on the planet.
Speaking of chairs...
So all of this was happening within a cushioned seat that anyone could put onto a chair wide enough to support it, but Razer has made a number of the best gaming chairs on the market in recent years, so I asked when we should expect to see these add-on cushions get fully integrated into, say, the Iskur or Enki gaming chairs.
Razer, naturally, demurred. Concept projects like Project Esther are meant to see if something could work, regardless of the cost or production issues that would need to be dealt with if it ever reached the preproduction stage. So even if everyone at Razer is on board with the idea, the best anyone can say is that it works as a cushion placed over your seat; the jury is still out on whether it would ever, or could ever, work built into an actual chair.
Not to mention that there are a lot of steps between where Project Esther is today and where it would need to be to make it to market as a consumer item. But concept projects like Esther don't come along every day, so here's hoping that you too will one day experience something as cool as this.