More than three years ago, Stitch Kingdom broke the news that Disney was looking into a way to control what it considered a genuine threat: abuse of refillable mugs. The news caused an uproar across the internet, sparking further debate over a topic that had already been argued countless times before. While some saw it as a long time coming, many contended that it really wasn’t worth Disney’s time and effort to implement such a system, and others plotted ways around it based on the patent application alone. Three years later, the Walt Disney World Resort has introduced its ‘Rapid Fill’ program which, in effect, does exactly what the company set out to do.
Disney Research today unveiled AIREAL, a low-cost, highly scalable solution that aims to fill a growing void in gaming by bringing tactile feedback to motion-responsive controllers, such as the Microsoft Kinect.
While traditional gaming controllers often use vibration to provide haptic feedback to gamers, those using controllers that respond to physical movement haven’t been afforded that luxury until now.
AIREAL is built around a (mostly) 3D-printed vortex generator in which speaker diaphragms are actuated to push rings of air out of the device. By varying the timing and duration of these pulses, the tactile feedback can change to match the player’s situation and expectations. Furthermore, the AIREAL system is scalable: an array of devices can be arranged to provide feedback in true three-dimensional space.
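Since each vortex is just a brief push of the speaker diaphragm, a haptic “effect” can be thought of as a pulse train whose rate and pulse width set the sensation’s intensity. The sketch below illustrates that idea; the function name, sample rate, and parameter values are assumptions for illustration, not anything from Disney’s actual system.

```python
# Illustrative sketch: encode an AIREAL-style haptic effect as a 0/1
# drive signal for the speaker diaphragm. All names and numbers here
# are invented for the example.

def pulse_train(duration_s, pulse_hz, pulse_width_s, sample_rate=8000):
    """Return a 0/1 drive signal: 1 while the diaphragm pushes air out."""
    samples = int(duration_s * sample_rate)
    period = sample_rate / pulse_hz           # samples per vortex
    width = int(pulse_width_s * sample_rate)  # samples a pulse stays high
    return [1 if (i % period) < width else 0 for i in range(samples)]

# A gentle effect: slow, short puffs. An intense one: rapid, longer puffs.
gentle = pulse_train(1.0, pulse_hz=5, pulse_width_s=0.010)
intense = pulse_train(1.0, pulse_hz=20, pulse_width_s=0.020)
```

Varying these two parameters over time is one plausible way a game could map events (a gust, an explosion, a heartbeat) to distinct-feeling air patterns.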
Airport security screening areas may never be described as happy or magical, but if Disney has any say about it, they may become a lot more traveler friendly: a patent application proposes easing checkpoint congestion by moving much of the screening process outside the checkpoint itself.
The existing problem, according to the patent application, is that complying with existing TSA policies is time consuming: travelers spend much of their time in line removing and isolating certain objects from their carry-ons, emptying pockets, taking off shoes, and so on. The process becomes far more difficult when families have small children who need additional assistance, slowing everything down and creating a chain reaction that directly affects other passengers.
With a patent application titled simply ‘Role-Play Simulation Engine,’ Disney Parks may be looking to use its NextGen technology base to cash in on the cosplay and LARPing — that’s Live Action Role Play — crazes and bring a brand-new experience to its theme parks.
The patent allows for guests to participate in ‘long-form role play’ events in which they interact with performers employed by the park to engage them in the role-playing activities. The performers don’t even need to be human: they can be audio animatronics, for example, or something as simple as a video screen triggered by the guest’s arrival.
While Walt Disney World’s billion-dollar NextGen project has been no secret for quite some time, along with many of its aspects such as extensive use of RFID, the company itself has remained famously mum about the extent of the project, something we have been discussing for well over a year.
Disney Research Pittsburgh has just released the video below, which demonstrates one of its latest projects: an audio animatronic robot that can interact with people by playing catch with them. The system uses an off-the-shelf Microsoft Kinect (according to the video’s narration) along with an external camera system (ASUS Xtion PRO LIVE) to locate balls, and a Kalman filter to predict the ball’s destination and timing. So not only can the robot track a human’s position and size by the location of their head, it can also attempt to move its hand to catch the ball. If the robot misses the catch, it knows it has missed and even responds with one of several humorous animations to elicit a response from the person interacting with it.
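The Kalman-filter step is worth unpacking: the camera gives noisy ball positions each frame, the filter fuses them with a ballistic motion model, and the resulting state estimate is extrapolated to predict when the ball will reach catch height. The sketch below shows that pipeline for the vertical axis only; the matrix values, frame rate, and catch height are illustrative assumptions, not Disney’s actual tuning.

```python
import numpy as np

# Sketch (not Disney's code): a constant-acceleration Kalman filter
# tracks the ball's vertical position and velocity from noisy camera
# measurements, then extrapolates to predict the catch time.

DT = 1 / 30.0   # assumed camera frame interval (s)
G = -9.81       # gravity (m/s^2)

F = np.array([[1.0, DT],
              [0.0, 1.0]])           # state transition for [z, vz]
B = np.array([0.5 * DT**2, DT])      # control input applying gravity
H = np.array([[1.0, 0.0]])           # we only measure position
Q = np.eye(2) * 1e-4                 # process noise (illustrative)
R = np.array([[1e-2]])               # measurement noise (illustrative)

def kalman_step(x, P, z_meas):
    # Predict the state forward one frame under gravity.
    x = F @ x + B * G
    P = F @ P @ F.T + Q
    # Correct with the camera's position measurement.
    y = z_meas - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

def predict_catch_time(x, catch_height=1.0):
    # Solve z0 + v0*t + 0.5*g*t^2 = catch_height for the later root,
    # i.e. the moment the falling ball passes the hand.
    z0, v0 = x
    a, b, c = 0.5 * G, v0, z0 - catch_height
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ball never reaches catch height
    return (-b - np.sqrt(disc)) / (2 * a)
```

Running `kalman_step` once per frame and feeding the current estimate to `predict_catch_time` gives the robot a continuously refined target for its hand.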
Stop us if you’ve heard this one: When is a character not a character? When they’re an ‘experience delivery system.’ Get it? Okay, maybe it’s not that funny, or clever, but according to a patent application for a system known as ‘Managing Experience State to Personalize Destination Visits,’ it’s the future truth — and it’s a key element to the MyMagic+/My Disney Experience coming to the Walt Disney World Resort as part of its NextGen experience.
In an age where Kinect and PlayStation Eye/Move are encouraging less traditional interaction with video game consoles, and cameras are a mainstay in virtually everything, let alone most mobile devices, one man at Disney Interactive sees video game systems moving even further from the path of the familiar by letting console games make their own decisions based on — you guessed it — physical appearance.
Both ‘System and method for number of players determined using facial recognition’ (US Patent Application 20120214585) and ‘Gender and age based gameplay through face perception’ (US Patent Application 20120214584) list Phillippe Paquet as the sole inventor and propose leveraging existing technology in interesting ways.
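The core idea of the two filings is straightforward: a face detector supplies a list of people in front of the camera, and the game configures itself from that list. The sketch below illustrates just the configuration step; the detector itself is out of scope here, and every field name and threshold is invented for the example.

```python
# Hypothetical illustration of the patents' idea: given detected faces
# (with estimated age and gender), configure the game session. Field
# names and thresholds are made up for this sketch.

def configure_session(faces):
    """faces: list of dicts like {"age": 9, "gender": "f"}."""
    players = len(faces)                        # player count from face count
    kid_mode = any(f["age"] < 13 for f in faces)
    return {
        "players": players,
        "difficulty": "easy" if kid_mode else "normal",
        "content_rating": "E" if kid_mode else "T",
    }

# Example: a parent and child detected in front of the console.
session = configure_session([{"age": 9, "gender": "f"},
                             {"age": 41, "gender": "m"}])
```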
After presenting its technique for cloning the human face in an effort to produce more realistic audio animatronics at SIGGRAPH, Disney Research Zurich has released this video which takes a closer look at the process, which we began discussing on here last month.
While Disney Research Zurich prepares to present its face cloning for audio animatronic use at SIGGRAPH today, Disney Research Pittsburgh is demonstrating its own new technology, which turns virtually any plant into an interactive experience by letting computers detect where a human touches the plant.
Dubbed ‘Botanicus Interacticus: Interactive Plant Technology,’ the system, which is based on the Touché technology introduced earlier this year, requires only a single electrical wire inserted into the soil. The wire transmits a frequency sweep between 0.1 and 3 MHz, which allows the area where the plant is being touched to be estimated without causing any damage to the plant itself.
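The swept-frequency trick works because the plant’s electrical response curve changes shape depending on where and how it is touched, so each gesture leaves a distinctive profile across the sweep. A simple nearest-profile classifier is enough to illustrate the recognition step; the profiles below are made-up numbers, not real measurements, and the matching method is an assumption for the sketch.

```python
# Sketch of the swept-frequency sensing idea behind Touché/Botanicus
# Interacticus: match a measured sweep response against stored
# per-gesture profiles. Profiles here are invented example data.

def classify_touch(sweep, profiles):
    """sweep: list of amplitudes, one per frequency bin across the
    0.1-3 MHz range. Returns the label of the closest stored profile
    by smallest squared error."""
    def sq_err(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(profiles, key=lambda label: sq_err(sweep, profiles[label]))

profiles = {
    "no_touch":  [0.9, 0.8, 0.7, 0.6],
    "leaf_tip":  [0.7, 0.5, 0.6, 0.4],
    "stem_grab": [0.3, 0.2, 0.4, 0.3],
}
```

A real system would use many more frequency bins and a trained classifier, but the principle — compare the whole response curve, not a single reading — is the same.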