Honda Introduces "Cooperative Mobility Ecosystem" at CES 2017
LAS VEGAS, Jan. 5, 2017 -- Honda unveiled its Cooperative Mobility Ecosystem concept today at CES 2017 in Las Vegas, connecting the power of artificial intelligence, robotics and big data to transform the mobility experience of the future and improve customers' quality of life. Featuring a number of prototype and concept technology demonstrations at CES, the Honda concept envisions a future where vehicles will communicate with each other and infrastructure to mitigate traffic congestion and eliminate traffic fatalities, while increasing the productivity of road users and delivering new types of in-vehicle entertainment experiences. Vehicles will create new value by autonomously providing services when not in use by their owners.
Experience the interactive Multimedia News Release here: http://www.multivu.com/players/English/7988331-honda-ces-cooperative-mobility-ecosystem
Honda also announced collaborations with Visa, DreamWorks Animation and innovative start-ups through the Honda Developer Studio and Honda Xcelerator open innovation programs based out of Honda Silicon Valley Lab. Further, as part of its effort to accelerate open innovation, Honda has established a new website covering areas including AI, Big Data and Robotics. Interested companies and individuals can access it at the following URL: http://www.honda.co.jp/openinnovation/.
Supporting its Cooperative Mobility Ecosystem theme, Honda introduced the Honda NeuV, an electric automated mini-vehicle concept equipped with an artificial intelligence (AI) "emotion engine"*1 and automated personal assistant, creating new possibilities for human interaction and new value for customers.
The global mobility company also introduced Honda Riding Assist, a concept motorcycle that applies Honda's robotics technology to maintain balance. Visitors to Honda's exhibit (LVCC, North Hall – 7312) also can experience firsthand Honda robotics technology by "test-driving" the UNI-CUB, the company's self-balancing personal mobility device.
"Since our founding, Honda has focused on creating technologies that help people," said Yoshiyuki Matsumoto, president & CEO of Honda R&D Co., Ltd. "Our goal is to showcase a future technology path that results in a redefined mobility experience."
Following is a summary of the product and technology concepts Honda has on display at CES:
Honda Riding Assist motorcycle
In a global debut at CES, Honda unveiled its Riding Assist technology, which leverages Honda's robotics technology to create a self-balancing motorcycle that greatly reduces the possibility of falling over while the motorcycle is at rest. Rather than relying on gyroscopes, as announced by other companies, which add a great deal of weight and alter the riding experience, the Honda Riding Assist motorcycle incorporates technology originally developed for the company's UNI-CUB personal mobility device.
Honda NeuV concept vehicle
Designed to create new possibilities for customers, the NeuV (pronounced "new-v"), which stands for New Electric Urban Vehicle, is a concept vehicle inspired by the fact that privately owned vehicles sit idle 96 percent of the time. The NeuV explores the idea of how to create new value for its owner by functioning as an automated ride sharing vehicle, picking up and dropping off customers at local destinations when the owner is not using the car. The NeuV also can sell energy back to the electric grid during times of high demand when it's not in use. These activities have the potential to create a new business model for enterprising customers.
"We designed NeuV to become more valuable to the owner by optimizing and monetizing the vehicle's down time," said Mike Tsay, principal designer, Honda R&D Americas.
NeuV also functions as a thoughtful and helpful AI assistant utilizing an "emotion engine", an emerging technology developed by Honda and SoftBank (cocoro SB Corp.). In its NeuV application, called HANA (Honda Automated Network Assistant), the "emotion engine" will learn from the driver by detecting the emotions behind the driver's judgments and then, based on the driver's past decisions, make new choices and recommendations. HANA can check on the driver's emotional well-being, make music recommendations based on mood, and support the owner's daily driving routine.
The NeuV features a full touch panel interface enabling both the driver and passenger to access a simple and convenient user experience. The vehicle has two seats, a storage area in back, and an electric skateboard for "last mile" transit. The NeuV also features outstanding outward visibility via a headerless windshield and a dramatically sloping belt line, which makes maneuvering easy.
Safe Swarm
At CES, Honda launched its "Safe Swarm" concept, which utilizes bio-mimicry – replicating the behavior of a school of fish – to create a safer, more efficient and enjoyable driving experience. The Honda Safe Swarm demonstration immerses visitors in a world where vehicles sharing the road communicate with one another using dedicated short range communication (DSRC) to support the driver in negotiating complex driving situations. The Safe Swarm concept allows vehicles to operate cooperatively, enabling more efficient, low-stress and, ultimately, collision-free mobility.
"The autonomous age has dawned, and Honda, like all automakers, is working to refine and advance this technology to achieve our goal for a collision-free society in the 2040 timeframe," said Frank Paluch, president, Honda R&D Americas. "Using vehicle-to-vehicle and vehicle-to-infrastructure communications and drawing upon big data and artificial intelligence, Honda will work with others to create an environment in which road conditions are predicted and managed, and collisions avoided."
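The school-of-fish behavior the Safe Swarm concept references is commonly modeled with "boids"-style flocking rules (separation, alignment and cohesion). The sketch below illustrates only that general idea, applied to neighbor positions and velocities of the kind DSRC could share; the function name, weights and units are hypothetical, and this is in no way Honda's implementation.

```python
# Illustrative sketch of boids-style flocking, the bio-mimicry idea behind
# concepts like Safe Swarm. All names and constants here are hypothetical.
import math

def steer(own_pos, own_vel, neighbors, min_gap=10.0):
    """Blend cohesion, alignment and separation over shared neighbor states.

    own_pos, own_vel: (x, y) tuples for this vehicle.
    neighbors: list of (pos, vel) tuples broadcast by nearby vehicles.
    Returns a new (vx, vy) steering velocity.
    """
    if not neighbors:
        return own_vel
    n = len(neighbors)
    # Cohesion: drift toward the centroid of nearby vehicles.
    cx = sum(p[0] for p, _ in neighbors) / n - own_pos[0]
    cy = sum(p[1] for p, _ in neighbors) / n - own_pos[1]
    # Alignment: match the average neighbor velocity.
    ax = sum(v[0] for _, v in neighbors) / n - own_vel[0]
    ay = sum(v[1] for _, v in neighbors) / n - own_vel[1]
    # Separation: push away from any vehicle closer than min_gap.
    sx = sy = 0.0
    for (px, py), _ in neighbors:
        dx, dy = own_pos[0] - px, own_pos[1] - py
        d = math.hypot(dx, dy)
        if 0 < d < min_gap:
            sx += dx / d
            sy += dy / d
    # Weighted blend; separation dominates so vehicles keep their distance.
    return (own_vel[0] + 0.05 * cx + 0.1 * ax + 1.0 * sx,
            own_vel[1] + 0.05 * cy + 0.1 * ay + 1.0 * sy)
```

With one neighbor directly ahead inside the minimum gap, the separation term outweighs cohesion and the vehicle eases off, which is the "school of fish" effect in miniature.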
Honda UNI-CUB
The Honda UNI-CUB display enables CES attendees to experience a self-balancing personal mobility device that enables the seated rider to control speed, move in any direction and stop, all by simply shifting body weight. Last year, the company opened the UNI-CUB's API, seeking to facilitate the creation of software that can control the device from a smartphone and other devices, with the potential to expand its value and functionality for people. With the ability to freely move forward, backward, side-to-side and diagonally, UNI-CUB can quickly and easily maneuver among people.
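Weight-shift control of this kind can be pictured as mapping the rider's lean vector to an omnidirectional velocity command. The sketch below is purely illustrative: the function, deadband and speed cap are invented for this example and are unrelated to Honda's actual control software or the published UNI-CUB API.

```python
# Hypothetical mapping from a normalized lean vector to a velocity command,
# illustrating omnidirectional weight-shift control. Not Honda's code.
import math

MAX_SPEED = 1.6  # m/s; an assumed cap for illustration only

def lean_to_velocity(lean_x, lean_y, deadband=0.05):
    """Map a lean vector (each axis in -1..1) to (forward, sideways) m/s.

    A small deadband keeps the device stationary under incidental shifts,
    so the rider stops simply by sitting upright.
    """
    mag = math.hypot(lean_x, lean_y)
    if mag < deadband:
        return (0.0, 0.0)  # effectively upright: stop
    # Speed scales with lean magnitude, clamped to MAX_SPEED; direction
    # follows the lean, so diagonal leans give diagonal motion.
    speed = min(mag, 1.0) * MAX_SPEED
    return (speed * lean_x / mag, speed * lean_y / mag)
```

A full forward lean commands full forward speed, while a diagonal lean produces proportionate forward and sideways components, matching the forward/backward/side-to-side/diagonal motion described above.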
Open Innovation and Collaboration
Continuing its pursuit of open innovation and collaboration, Honda also announced initiatives with entrepreneurs, startups and global tech brands via the Honda Silicon Valley Lab.
- Visa – Building on their mobile payment collaboration at last year's Mobile World Congress, Honda is conducting two proof-of-concept demonstrations at CES created through its partnership with Visa. These demonstrations will be the first conducted with infrastructure partners Gilbarco Veeder-Root and IPS Group. The demos will showcase the simplicity and convenience of paying for services such as gasoline purchases and public parking from the comfort and safety of a vehicle.
- DreamWorks Animation – Honda has teamed with DreamWorks Animation to develop new cross-platform, augmented- and virtual-reality content and solutions for the in-vehicle experience. Honda is demonstrating a proof-of-concept version of its Honda Dream Drive in-car virtual reality prototype featuring exclusive DreamWorks Animation content at CES.
- VocalZoom – Through its Silicon Valley Honda Xcelerator incubator program, Honda is working with VocalZoom to apply the company's Human-to-Machine Communications (HMC) optical sensor technology to the creation of a safer, more satisfying in-car voice-control experience. By "reading" physical facial skin vibrations as people speak, the VocalZoom sensor isolates their words from other voices and noise in the background. This enables automotive voice recognition systems to perform far more accurately than has been possible with traditional speech-recognition solutions. VocalZoom's optical sensor has the potential to deliver seamless, near-perfect voice-control performance even in a noisy in-cabin environment.
- LEIA Inc. – Through another Honda Xcelerator collaboration, this one with LEIA, Honda has developed a new driver's display concept that uses LEIA's nano technology to provide three-dimensional images, with seamless transitions between different viewing angles for warnings and driver-assistive systems. Although 3D can be distracting if it isn't designed correctly, LEIA's nanotech approach presents depth in a way that feels natural. Honda sees a number of potential applications for this technology, from navigation to traffic information.
CES attendees can learn more and experience demonstrations of the Honda Cooperative Mobility Ecosystem at the Honda booth (#7312) from January 5-8 at the Las Vegas Convention Center. Videos, images and more details can be found at honda.us/CES2017.
About Honda Technology
Honda is creating technologies and products that advance the company's clean, safe, fun and connected brand values. These efforts include advancements in automated vehicles, connectivity and ultra-low carbon mobility. In North America, the company has more than 300,000 vehicles on the road equipped with the Honda Sensing™ or AcuraWatch™ safety and driver-assistive technologies and more than 400,000 vehicles featuring Apple CarPlay® and Android Auto™ compatibility. Honda also is testing advanced automated vehicle technologies in Japan and North America and is targeting 2020 for the deployment of highly automated vehicles on U.S. highways. Honda also is working to fulfill its environmental and safety vision: "to realize the joy and freedom of mobility and a sustainable society where people can enjoy life." Toward this target, Honda is striving to make two-thirds of its global automobile sales from hybrid, plug-in hybrid and electrified vehicles including fuel cell and battery electric vehicles by around 2030.
*1 The "emotion engine" is a set of AI technologies developed by cocoro SB Corp., which enable machines to artificially generate their own emotions.
CONTACT: Marcos Frommer, firstname.lastname@example.org; Jaymie Robinson, email@example.com