Disney Research Zurich, along with the University of Zaragoza, has now shared the results of its ‘Stylized Hair Capture’ project. The project aims to improve upon the already popular trend of custom 3D printed figures, which have traditionally been limited to a facial scan plastered onto an existing model, neglecting every other personal attribute. ‘Stylized Hair Capture’ instead sets out to capture the individual as they appear that day, right down to their individual hair style and color. The demo video, embedded below, even goes as far as to show that any object with a hair- or fur-like texture, such as a stuffed animal, can be scanned in digitally and faithfully recreated.
During the ACM Conference on Computer Graphics & Interactive Techniques (SIGGRAPH) event taking place August 10-14, 2014, Disney Research Zurich will present its Spin-It project, designed to optimize the moment of inertia for spinnable objects. In short, it’s a proven method for making balancing toys such as yo-yos and spinning tops.
If you were to ask a dozen people to close their eyes and imagine a traditional spinning top or yo-yo, chances are they'd all visualize the same shape. This is because those shapes are designed for the task: the mass is distributed symmetrically about the spin axis, so the object is perfectly balanced. But what if you wanted to turn an object that isn't symmetrical and balanced into a top or a yo-yo? What if, for example, you wanted to spin a top shaped like an elephant?
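The core test Spin-It has to satisfy can be stated numerically: for a shape to spin stably about an axis, its center of mass must lie on that axis and the off-axis products of inertia must vanish (the paper's optimizer carves out interior material until this holds). As a rough illustration, here is a minimal sketch, assuming a voxelized model and a vertical spin axis through the grid center; the function `spin_imbalance` and its conventions are our own, not from the paper.

```python
import numpy as np

def spin_imbalance(voxels, spacing=1.0):
    """Given a boolean 3D occupancy grid, measure how far the model is
    from being spinnable about the vertical (z) axis through the grid
    centre: the centre of mass must lie on the axis, and the products
    of inertia Ixz and Iyz must vanish."""
    idx = np.argwhere(voxels)                      # occupied voxel coords
    # positions relative to the intended spin axis (grid centre in x, y)
    cx, cy = (voxels.shape[0] - 1) / 2, (voxels.shape[1] - 1) / 2
    x = (idx[:, 0] - cx) * spacing
    y = (idx[:, 1] - cy) * spacing
    z = idx[:, 2] * spacing
    com = np.array([x.mean(), y.mean(), z.mean()])  # centre of mass
    Ixz = -np.sum(x * z)                            # products of inertia
    Iyz = -np.sum(y * z)                            # (unit voxel mass)
    return com, Ixz, Iyz

# A solid cylinder centred on the axis is perfectly balanced:
n = 21
xs, ys, zs = np.meshgrid(np.arange(n), np.arange(n), np.arange(n),
                         indexing="ij")
cyl = (xs - 10) ** 2 + (ys - 10) ** 2 <= 64
com, Ixz, Iyz = spin_imbalance(cyl)
print(np.allclose(com[:2], 0), np.isclose(Ixz, 0), np.isclose(Iyz, 0))
# → True True True
```

An elephant-shaped top fails this test as modeled; Spin-It's contribution is choosing where to hollow out the interior so the same metrics come back to zero.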
Disney Research is expanding into the film business with its first short film, Lucid Dreams of Gabriel, which will be released in August. A teaser trailer for the film was released today, showcasing some of the special effects and filming techniques Disney Research employed using what it calls ‘The Flow-Of-Time,’ consisting of ‘local frame rate variation, local pixel timing and a variety of artistic shutter functions.’
Disney Research, along with Scott Hudson of Carnegie Mellon's Human-Computer Interaction Institute, has demonstrated how it's possible to print soft, interactive objects using a new type of 3D printer. No longer restricted to rigid materials such as plastics, these printers can fabricate objects made of softer materials, such as wool and wool-blend yarn.
Disney Research and Carnegie Mellon University today published their findings on how their team was able to produce and harvest electrical energy by rubbing and even tapping specially formulated paper, a method so simple that even a child can reproduce it, as demonstrated by the sample video.
The only special requirement for the electrical generator is a thin, flexible sheet of polytetrafluoroethylene (PTFE), commonly known as Teflon. That sheet is then placed between two conductive layers, such as sheets of metallized polyester, that serve as electrodes.
Disney Research today unveiled AIREAL, a low-cost, highly scalable solution that aims to fill a growing void in gaming by bringing tactile feedback to motion-responsive controllers, such as the Microsoft Kinect.
While traditional gaming controllers often use vibration to provide haptic feedback to gamers, those using controllers that respond to physical movement haven’t been afforded that luxury until now.
AIREAL uses a (mostly) 3D printed vortex generator driven by speakers: when the diaphragms are actuated, a ring of air is pushed out of the device. By varying the pattern and duration of these vortices, the tactile feedback can change with the player's situation and expectations. Furthermore, the AIREAL system is scalable, so an array of devices can provide feedback in true three-dimensional space.
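To make "varying the pattern and duration" concrete, here is a toy sketch of how a pulse train driving the speaker diaphragms might be synthesized. The half-sine pulse model, the parameter values, and the function names are all our own illustrative assumptions, not AIREAL's published driving signals.

```python
import numpy as np

SR = 44_100  # sample rate of the signal driving the speaker diaphragms

def vortex_pulse(width_ms):
    """One diaphragm excursion, modeled here as a single half-sine
    push that expels one air vortex (an assumed waveform)."""
    n = int(SR * width_ms / 1000)
    return np.sin(np.linspace(0, np.pi, n))

def pulse_train(width_ms, rate_hz, duration_s):
    """Repeat the pulse at rate_hz. Different width/rate combinations
    would feel different on the skin, e.g. distinct taps vs. a steady
    flutter."""
    period = int(SR / rate_hz)
    pulse = vortex_pulse(width_ms)
    out = np.zeros(int(SR * duration_s))
    for start in range(0, len(out) - len(pulse), period):
        out[start:start + len(pulse)] = pulse
    return out

flutter = pulse_train(width_ms=5, rate_hz=20, duration_s=1.0)  # rapid flutter
taps = pulse_train(width_ms=15, rate_hz=4, duration_s=1.0)     # distinct taps
```

Scaling to an array is then a matter of running one such channel per device, each aimed at a different point in space.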
After presenting its technique for cloning the human face in an effort to produce more realistic audio animatronics at SIGGRAPH, Disney Research Zurich has released this video, which takes a closer look at the process we began discussing here last month.
While Disney Research Zurich prepares to present its face cloning for audio animatronic use at SIGGRAPH today, Disney Research Pittsburgh is demonstrating its own new technology, which can turn almost any isolated plant into an interactive experience by letting computers detect where a human touches the plant.
Dubbed ‘Botanicus Interacticus: Interactive Plant Technology’ and based on the Touché technology introduced earlier this year, the system requires only a single electrical wire inserted into the soil. The wire transmits a frequency sweep between 0.1 and 3 MHz, which allows the area in which the plant is touched to be estimated without causing any damage to the plant itself.
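The idea behind the sweep is that touching the plant at different points changes how the plant-plus-body circuit responds across frequencies, so the shape of the measured response profile encodes where the touch happened. A minimal sketch of that classification step, using an entirely synthetic response model and nearest-template matching (the published Touché work trains an SVM on such profiles; all names and numbers below are illustrative):

```python
import numpy as np

FREQS = np.linspace(0.1e6, 3e6, 200)  # the 0.1–3 MHz sweep

def sweep_profile(touch_pos, noise=0.0, rng=None):
    """Toy stand-in for the measured response: touching at a different
    point (touch_pos in [0, 1], base to leaf tip) shifts an assumed
    resonance, changing the amplitude profile across the sweep."""
    if rng is None:
        rng = np.random.default_rng(0)
    resonance = 0.5e6 + touch_pos * 2e6
    profile = 1.0 / (1.0 + ((FREQS - resonance) / 2e5) ** 2)
    return profile + noise * rng.standard_normal(FREQS.size)

# Calibration: record one reference profile per known touch location.
locations = np.linspace(0, 1, 5)
templates = np.stack([sweep_profile(p) for p in locations])

def estimate_touch(profile):
    """Classify a new sweep by its nearest calibration template."""
    dists = np.linalg.norm(templates - profile, axis=1)
    return locations[np.argmin(dists)]

noisy = sweep_profile(0.75, noise=0.02, rng=np.random.default_rng(1))
print(estimate_touch(noisy))  # → 0.75
```

In the real system the profiles come from hardware measurements rather than a formula, but the calibrate-then-classify loop is the same shape.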
Traditional motion capture techniques use cameras to meticulously record the movements of actors inside studios, enabling those movements to be translated into digital models. But by turning the cameras around, mounting almost two dozen outward-facing cameras on the actors themselves, scientists at Disney Research, Pittsburgh (DRP) and Carnegie Mellon University (CMU) have shown that motion capture can occur almost anywhere: in natural environments, over large areas and outdoors.