Tuesday, March 18, 2014

My next car will know when I'm in a good mood

It will also know when I'm sad, grumpy, or just plain mad. Okay, it won't quite figure it out on its own — I will have to tell it. How will my car get so smart? Because Gracenote has been working to personalize the music experience in a way that dynamically adapts content to the driver’s mood and musical taste.

It's not just about play-listing locally stored content and displaying album art anymore. The folks at Gracenote can now merge multiple occupants’ music collections and find common ground. They can create stations that incorporate cloud-based content from multiple sources. And they can adapt the nature of the content played in the car to the mood of the occupants through their Mood Grid technology.
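Gracenote's actual Mood Grid is proprietary, but the two ideas in play here (merging occupants' libraries to find common ground, then filtering by a cell in a two-dimensional mood grid) can be sketched in a few lines. Everything below is invented for illustration: the track data, the mood coordinates, and the grid size are assumptions, not Gracenote's data model.

```javascript
// Hypothetical sketch: merge two occupants' libraries and pick tracks
// whose mood falls in the currently selected cell of an n-by-n mood grid
// (axes: energy and valence, in the spirit of mood-descriptor systems).

// Each track carries illustrative mood coordinates in [0, 1].
const driverLibrary = [
  { title: "Track A", energy: 0.9, valence: 0.8 },
  { title: "Track B", energy: 0.2, valence: 0.3 },
];
const passengerLibrary = [
  { title: "Track A", energy: 0.9, valence: 0.8 },
  { title: "Track C", energy: 0.8, valence: 0.9 },
];

// "Common ground": tracks present in both libraries, matched by title.
function commonGround(a, b) {
  const titles = new Set(b.map((t) => t.title));
  return a.filter((t) => titles.has(t.title));
}

// Map a track's mood coordinates onto a cell of an n-by-n grid.
function gridCell(track, n) {
  const clamp = (v) => Math.min(n - 1, Math.floor(v * n));
  return { row: clamp(track.energy), col: clamp(track.valence) };
}

// Keep only tracks whose cell matches the currently selected mood cell.
function tracksForMood(library, mood, n) {
  return library.filter((t) => {
    const c = gridCell(t, n);
    return c.row === mood.row && c.col === mood.col;
  });
}

const shared = commonGround(driverLibrary, passengerLibrary);
const upbeat = tracksForMood(shared, { row: 2, col: 2 }, 3); // high energy, high valence
console.log(upbeat.map((t) => t.title)); // → [ 'Track A' ]
```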

This not only makes the music experience intensely personal but it does it automagically, keeping the driver’s eyes on the road and hands on the wheel. Here’s a short video from 2014 CES that provides a quick overview — on a system powered by QNX technology, of course :-)

Wednesday, March 12, 2014

Crowd-sourced maps: the future of in-car navigation?

Guest post by Daniel Gast, innovation manager, Elektrobit Automotive

Crowdsourcing has become a major trend. Even McDonald’s has been getting into the act, asking consumers to submit new ideas for burgers. In 2013 the company’s “My Burger 3.0” campaign elicited an enormous response in Germany, with more than 200,000 burger ideas and more than 150,000 people voting for their favorites.

From burgers we go to a key component of navigation systems: digital maps. OpenStreetMap (OSM), a well-known, globally crowdsourced project dedicated to creating free worldwide maps, has attracted more than 100,000 registered contributors. These volunteers create digital maps without being paid; you can glimpse their work at www.openstreetmap.org.

Why is the amount of data behind OSM constantly growing?
Creating OSM maps is a kind of charity work: anyone can contribute, and the results are free to use under open licenses. The technology behind it is very user friendly, which helps ensure long-term loyalty among contributors. But probably the most important factor is the fun it brings. Contributing to the project means recording streets, buildings, bridges, forests, points of interest, and other items you would benefit from having in a map. For many OSM editors, this is their favorite hobby; they are "addicts" in the best sense of the word. They love the project and aspire to create a perfect map. That's why the growing body of available map data is of such high quality.

Can automakers and drivers benefit from crowd-sourced map data like OpenStreetMap?
Yes, they can. Because so many people contribute to the project, the amount of data is growing continuously. Every contributor can add or edit content at any time, and changes are integrated into the public OSM database immediately.

In the beginning only streets were collected, but because the data format is extensible, editors can add data like parking spots or pedestrian walkways. For instance, a group of firefighters used OSM's flexibility to define new content and add the hydrants in their region to the map. Automakers could take advantage of the same flexibility to integrate individual points of interest, such as car repair shops, or to support business models with third-party partners, such as couponing.
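To make that tagging model concrete, here is a minimal sketch. Real OSM data is distributed as XML or PBF; the in-memory objects and sample nodes below are simplified stand-ins, though emergency=fire_hydrant and amenity=car_repair are real OSM tags.

```javascript
// Simplified sketch of OSM's extensible tagging model. Each node carries
// free-form key/value tags, which is why new content types (hydrants,
// repair shops) can be added without any schema change.
const nodes = [
  { id: 1, lat: 49.45, lon: 11.08, tags: { highway: "street_lamp" } },
  { id: 2, lat: 49.46, lon: 11.09, tags: { emergency: "fire_hydrant" } },
  { id: 3, lat: 49.47, lon: 11.07, tags: { amenity: "car_repair", name: "Muffler Shop" } },
];

// Any consumer can filter on the tags it cares about: a fire department
// pulls hydrants, an automaker pulls repair shops.
const byTag = (key, value) => nodes.filter((n) => n.tags[key] === value);

const hydrants = byTag("emergency", "fire_hydrant");
const repairShops = byTag("amenity", "car_repair");
console.log(hydrants.length, repairShops.length); // → 1 1
```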

Because it’s free of charge, OSM data could, in the mid to long term, develop into a competitive, low-priced alternative to the databases provided by commercial map data suppliers.

For their part, automakers could easily provide toolkits that allow drivers to edit wrong or missing map data on the go. Or even better, allow them to personalize maps with individual content like preferred parking places or favorite burger restaurants.

Are automotive infotainment systems ready for these new kinds of map data?
From a technical point of view, automotive software like the QNX CAR Platform for Infotainment or EB street director navigation can interpret this new kind of data without modification, since the OSM map data can be converted to a specific format, much like commercial map data. It’s like building your own burger: the bread and meat remain the same, but you opt for tomatoes instead of onions.
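As a rough illustration of that conversion step, the sketch below turns OSM-style ways (ordered node lists plus tags) into the adjacency-list graph a routing engine could search. Real toolchains target dedicated navigation formats such as NDS and handle far more detail; the data and structure here are invented for illustration.

```javascript
// Convert OSM-style ways into a simple routing graph.
// A way is an ordered list of node ids plus tags; "oneway" tags
// determine whether the reverse edge is added.
const ways = [
  { id: 10, nodes: [1, 2, 3], tags: { highway: "residential", oneway: "no" } },
  { id: 11, nodes: [3, 4], tags: { highway: "primary", oneway: "yes" } },
];

function buildGraph(ways) {
  const graph = new Map(); // node id → array of reachable node ids
  const addEdge = (a, b) => {
    if (!graph.has(a)) graph.set(a, []);
    graph.get(a).push(b);
  };
  for (const way of ways) {
    for (let i = 0; i + 1 < way.nodes.length; i++) {
      addEdge(way.nodes[i], way.nodes[i + 1]);
      // Two-way streets get the reverse edge as well.
      if (way.tags.oneway !== "yes") addEdge(way.nodes[i + 1], way.nodes[i]);
    }
  }
  return graph;
}

const graph = buildGraph(ways);
console.log(graph.get(3)); // → [ 2, 4 ] (back along way 10, forward on one-way 11)
```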

That said, some gaps in the OSM data must be filled before it can support full-blown automotive navigation. Features like traffic signs, lane information, and turn restrictions are available, but coverage remains limited. Regional coverage also varies widely: it is much higher in Germany, for example, than in countries in Africa or South America.

From the automaker’s perspective, it could be an interesting challenge to encourage the community to contribute this type of content. One way to support this idea would be to develop an OSM-based navigation system for mobile use first. Once mature, the system could easily be merged into the vehicle, allowing drivers to get premium directions from automotive-approved infotainment systems like EB street director (which we saw at CES in the QNX CAR Platform) for less money.

Daniel Gast has worked for Elektrobit since 2000, initially as a software engineer and later as product manager for EB street director navigation. He subsequently took over responsibility for the navigation solutions business area and now coordinates innovation management for Elektrobit Automotive. Daniel studied computer science in Erlangen.

Keep up to date with Elektrobit's latest automotive news and products by signing up for the EB Automotive Newsletter — Ed.

Tuesday, March 11, 2014

Tackling fragmentation with a standard vehicle information API

Tina Jeffrey
Has it been a year already? In February 2013 QNX Software Systems became a contributing member of the W3C’s Automotive Web Platform Business Group, which is dedicated to accelerating the adoption of Web technologies in the auto industry. Though it took a while to rev up, the group is now in full gear and we’re making excellent progress towards our first goal of defining a vehicle information API for passenger vehicles.

The plan is to establish a standard API for accessing speed, RPM, tire pressure, and other vehicle data. The API will enable consistent app development across automakers and thereby reduce the fragmentation that affects in-vehicle infotainment systems. Developers will be able to use the API for apps running directly on the head unit as well as for apps running on mobile devices connected to the head unit.
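To give a feel for what app code against such an API might look like, here is a hedged sketch. The attribute name (vehicleSpeed) and the promise/callback shape are modeled loosely on the group's draft discussions; the mock vehicle object stands in for a real head-unit implementation and is entirely an assumption of this example.

```javascript
// Mock of a vehicle-data attribute with the two access patterns a
// standard API would typically offer: a one-shot asynchronous read
// and a subscription for continuous updates.
const vehicle = {
  vehicleSpeed: {
    _value: { speed: 96 }, // km/h
    get() {
      return Promise.resolve(this._value);
    },
    subscribe(cb) {
      this._cb = cb; // a real implementation would fire cb on each update
      return 1; // subscription handle
    },
  },
};

// One-shot read
vehicle.vehicleSpeed.get().then((data) => {
  console.log(`speed: ${data.speed} km/h`);
});

// Continuous updates via callback
vehicle.vehicleSpeed.subscribe((data) => {
  if (data.speed > 120) console.log("slow down!");
});
```

The same app code could then run unchanged on any head unit or connected mobile device that exposes this interface, which is exactly the fragmentation problem the standard aims to solve.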

Parallel processing
Let me walk you through our work to date. To get started, we examined API specifications from four member organizations: QNX, Webinos, Intel, and GENIVI. Next, we collected a superset of the attributes from each spec and categorized each attribute into one of several functional groups: vehicle information, running status, maintenance, personalization, driving safety, climate/environment, vision systems, parking, and electric vehicles. Then, we divvied up these functional groups among teams who worked in parallel: each team drafted an initial API for their allotted functional group before sharing it with the members at large.

Throughout this effort, we documented a set of API creation guidelines to capture the intent and reasoning behind our decisions. These guidelines cover details such as data representation, attribute value ranges and increments, attribute naming, and use of callback functions. The guidelines also capture the rules that govern how to grow or extend the APIs, if and when necessary.
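Purely as a hypothetical illustration of guideline-driven attribute definitions: each attribute below records its unit, range, and increment so that implementations can validate values consistently. The specific names and numbers are invented, not taken from the actual guidelines.

```javascript
// Invented attribute metadata in the spirit of "data representation,
// attribute value ranges and increments" guidelines.
const attributeSpec = {
  engineSpeed: { unit: "rpm", min: 0, max: 12000, increment: 1 },
  tirePressure: { unit: "kPa", min: 0, max: 1000, increment: 1 },
};

// A conforming implementation could validate values against the spec
// before delivering them to application callbacks.
function isValid(name, value) {
  const s = attributeSpec[name];
  if (!s) return false;
  return value >= s.min && value <= s.max && (value - s.min) % s.increment === 0;
}

console.log(isValid("engineSpeed", 3000)); // → true
console.log(isValid("tirePressure", -5)); // → false
```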

Driving towards closure
In December the business group editors began to pull the initial contributions into a single draft proposal. This work is progressing and will culminate in a members’ face-to-face meeting in mid-March in Santa Clara, California, where we will review the draft proposal in its entirety and drive this first initiative towards closure.

I’m sure there will be lots more to talk about, including next potential areas of focus for the group. If you're interested in following our progress, here’s a link to the draft API.


Tuesday, March 4, 2014

Self-driving cars? We had ‘em back in ‘56

Quick: What involves four-part harmony singing, control towers in the middle of the desert, and a dude smoking a fat stogie? Give up? It's the world of self-driving cars, as envisioned in 1956.

No question, Google’s self-driving car has captured the public imagination. But really, the fascination is nothing new. For instance, at the 1939 World’s Fair, people thronged to see GM’s Futurama exhibit, which depicted a world of cars controlled by radio signals. GM continued to promote its autonomous vision in the 1950s with the Firebird II, a turbine-powered car that could drive itself by following an "electronic control strip" embedded in the road. Here, for example, is a GM-produced video from 1956 in which a musically adept family goes for an autonomous drive:

Fast-forward to today, when it seems that everyone is writing about self-driving cars. Most articles don’t add anything new to the discussion, but their ubiquity suggests that, as a society, we are preparing ourselves for a future in which we give up some degree of control to our vehicles. I find it fascinating that an automaker was at the avant-garde of this process as far back as the 1930s. Talk about looking (way) ahead.

And you know what’s cool? Comparing the vision of the good life captured in the above video with the vision captured in the “Imagined” video that QNX produced 56 years later. In both cases, autonomous drive forms part of the story. And in both cases, an autonomous car helps to bring family together, though in completely different ways. It seems that, no matter how much technology (and our vision of technology) changes, the things closest to our hearts never do:

One more thing. Did you notice how much the sets in the GM video look like something straight out of The Jetsons, right down to the bubble-domed car? They did to me. Mind you, the video predates The Jetsons by several years, so if anything, the influence was the other way around.