Virtual Reality Is Coming to Your Eyes
Although virtual reality (VR) for consumer use seems to be popping up everywhere, VR for professional/industrial applications continues to be developed at a less frantic pace.
Many pundits are saying 2016 is the year VR goes mainstream because the twin barriers of computing limitations and high costs have fallen quite a bit. More likely, 2016 will be the year of setting expectations. For gaming applications, headsets will be fabulous. For business and industrial use, VR headsets will have to improve in three significant areas: technology, collaboration and physical comfort.
The latest iteration of VR headsets offers 360-degree tracking, integrated audio, and comfortable, sensible ergonomics for $100 on up to around $400. A high-end PC is extra (minimum $900). So far, VR headset manufacturers have not defined the system specs for a mid-level VR computer at the consumer level. For business/professional applications, expect the specs to exceed present high-end vendor-certified workstations, which currently cost multiple thousands of dollars.
That said, technical breakthroughs for seamless, comfortable and true-to-life VR are still needed, including smaller and faster processors, and better batteries, motion tracking, haptics, visual quality (i.e., pixel resolution), and software (such as for content analysis, artificial intelligence, and voice recognition). “Today, there are many VR solutions on the market,” says Michael Kerausch, VR product marketing manager for ESI North America (esi-group.com). “Many of them claim to be useful at the industrial level and they are always somehow useful.”
Kerausch says that engineers, designers and other professionals want real-time (RT) and real-scale VR. “Off-line calculations, such as for finite element analysis (FEA) simulation, have a little bit more flexibility regarding physical modeling because you’re not needing to ensure RT results. People are doing their preprocessing, pushing the 'start simulation' button, and then waiting for their results.” Such waits are unacceptable in VR.
As for real scale, it is important for understanding the ergonomics of the product or process under investigation. Today, displaying real scale from a desktop is difficult at best.
Generating VR sucks up a lot of compute power, especially as the VR gets closer to matching the entirety of physical laws applying to materials, mechanical interactions, gravity and so on. Computing power is also needed to include environmental factors in the VR, such as lighting, shadows and surface reflections. Such environmental detail determines whether someone can differentiate leather from glass in a VR simulation. Add to that the need for scenery. “You can render life when you’re interacting with scenery,” adds Kerausch.
IC.IDO from ESI Group is a VR program for industrial applications. The software creates real-time, physics-based, immersive 3D views of products. Used in cave automatic virtual environments (CAVEs), IC.IDO can show products in actual size and simulate product behavior in real time. The high-end visualizations can display real-time simulations of collision, friction and gliding; constraints and kinematics; and both flexible and plastic objects. The software is compatible with nearly all 3D hardware, and can be integrated with existing computer-aided design (CAD), computer-aided engineering (CAE) and product lifecycle management (PLM) systems. The software is already being used in virtual engineering, prototyping, building, servicing and product presentation.
Currently, IC.IDO is aimed at CAVEs. “We are not yet 100 percent sure what the productive use cases will look like for VR headsets,” says Kerausch.
While headset-based VR “definitely has the potential to revolutionize desktop workspaces,” says Kerausch, he sees “a big difference” between VR for desktops and VR in a CAVE. The biggest difference is how collaborative the whole experience can be. In a CAVE, people from several disciplines—without wearing head-mounted displays (HMDs)—are “communicating and interacting with each other while looking at life-size models. How does that work where people are wearing HMDs? That will probably not work so easily.”
Besides the interpersonal aspects of collaboration, including all the non-verbals that go into natural conversations, there are also the software and data aspects. For ESI Group, collaborative VR points to a cloud-based system. Stakeholders are doing more than “just” chatting and sharing screens, continues Kerausch. “You have one virtual model, and two or three or more people from all over the world interacting with that model.” Those interactions include adding annotations, turning and zooming in, measuring, manipulating shapes and fasteners, and more.

According to Kerausch, a collaborative VR system could take over the documentation of design changes. People won’t have to, in the worst case, take a series of screenshots from the CAD system, paste them into PowerPoint, write descriptions, then email this documentation to the relevant people. A VR-imbued CAD system would be self-documenting.

Cloud-based systems can make this happen. They are designed to easily load CAD and other large datasets into a browser interface, link those data to workstations, and connect to any other input/output device, including any HMDs yet to come. Cloud systems also ensure that everyone is working on the same dataset, whether a 3D CAD model, an FEA analysis or a production line simulation.
The VR experience often gives users eyestrain, headaches or, worse, makes them nauseated. (Gabe Newell, co-founder and managing director of video game development and online distribution company Valve Corporation [valvesoftware.com], has been quoted as describing headset demonstrations as the “world’s best motion sickness inducers.”) To date, doctors from a broad swath of medical disciplines are still investigating how to make VR comfortable.
Called “simulator sickness” or “cyber-sickness,” this discomfort occurs when a person’s brain tries to reconcile what the eyes are seeing with what the body is physically feeling. In a CAVE, people feel their own motion as they walk around and view the model. A person wearing an HMD while merely sitting or standing in one place feels few, if any, of the movements appearing before his or her eyes. In many cases, people wearing an HMD shift their viewpoints using mouse interactions. Mousing around “breaks the whole immersive experience for some moments,” says Kerausch. Moreover, moving one’s hand to move a mouse just isn’t enough to combat cyber-sickness.
“I can’t imagine people will work eight hours with an HMD in front of their eyes,” says Kerausch. Instead, he imagines people will use VR headsets for a couple of minutes while they perform a quick review—“for certain validation issues or for certain cases where they want to have a realistic impression about a third dimension, to get a more precise product experience.”
Nevertheless, VR headsets are very much in use today. So are CAVEs throughout the automotive industry. “I’m very sure CAVE will survive,” says Kerausch. “There will be no disruptive change due to Oculus and then all CAVEs are gone.” However, expect VR to be more than just a gamer’s paradise. The technology for “generating something that’s plausible,” in the words of Kerausch, will eventually affect business and industrial applications as well.