Thursday, June 28, 2012

Video: Talking HTML5 with Audi’s Mathias Halliger

Derek Kuhn
From the elegant look outside to the technology inside, Audi has some of the most advanced cars on the road today. At CES this year, I sat down with Mathias Halliger, head of architecture, MMI system, for Audi AG, to talk about some of this technology and how HTML5 will transform the infotainment systems found in their cars.

Mathias firmly believes that HTML5 is an automotive game changer because of the doors it can open for OEMs, technology providers, app developers, and consumers. So pass the popcorn and without further ado, here is the latest video in our HTML5 video series.


Monday, June 25, 2012

New webinar: Understanding mobile apps for the car

You're an app developer. You're looking for new opportunities. You were hoping, perhaps, that Web-connected refrigerators would be the next big thing. Being first to market with a fridge app — that would have been cool, right? I mean, literally.

Problem is, the market for fridge apps hasn't warmed up yet. I'm sure it will, though. But until then, why not the car? Cars are already connected. Car makers want to make them even more connected. And those cars will need apps, whether those apps are hosted on a phone, in the cloud, or in the car itself.

Interested? Intrigued?
Then mark your calendar for the webinar happening this Thursday, June 28, at 1:00 pm ET. Here's the official synopsis:

Wouldn't your app look good here?
    Understanding Mobile Apps for Automotive
    Today's merger of mobile handsets and automotive platforms is creating a brand-new market for app developers. However, there are many differences between a phone and a car.
    This session provides an introduction to the automotive market for the app developer looking to get into this space. Learn how a car infotainment system is structured, UI considerations that help prevent driver distraction, why HTML5 promises to be the next killer development environment for the car, and more.

On the downside, you won't learn about apps for white goods.
But, because the webinar is hosted by my inimitable colleague Andy Gryc, who has actually written software for cars, you will get the straight goods. Which is, well, cool.

Tuesday, June 19, 2012

In-car displays you hear, rather than see

We still have a lot in common with our caveman ancestors. (Yes, I know, they didn't all live in caves. Some lived in forests, others in savannahs, and still others in jungles. But I'm trying to make a point, so bear with me!)

Take, for example, our sense of hearing. At one time, we used auditory cues to locate prey or, conversely, avoid becoming prey. If a cave bear growled, getting a fix on the location of the growl could mean the difference between life and death. At the very least, it helped you avoid running directly into the bear's mouth.

Kidding aside, the human auditory system has a serious ability to fix the location, direction, and trajectory of objects, be they cave bears or Buicks. And it's an ability that's been honed from time immemorial. So why not take advantage of it when creating user interfaces for cars?

Which brings us to spatial auditory displays. In a nutshell, these displays allow you to perceive sound as coming from various locations in a three-dimensional space. Deployed in a car, they can help you intuitively identify voices and sources of instructions, and help pinpoint the location and relative trajectory of danger. They can also improve reaction times to application prompts and potentially hazardous events.

I know, that's a lot to take in. So let's look at an example.

Locating the emergency vehicle, without really trying
Have you ever been cruising along when, suddenly, you hear an ambulance siren? I don't know about you, but I often spend time figuring out where, exactly, the ambulance is coming from. And I don't always get it right. That's called a location error.

Such errors can occur for a variety of reasons. For example, if the ambulance is approaching from the right, but your left window is open and a building on the left is reflecting sound from the siren, you might conclude that the ambulance is approaching from the left. Your ears correctly register sound arriving from the left; it's the environment that is masking where the ambulance actually is.

A spatial auditory display can help address this problem by controlling the acoustic cues you hear. The degree to which the display can do this depends, in part, on the hardware employed. For example, a display based on a large array of loudspeakers can provide more location information than one based on two loudspeakers.

In any case (and this is important), the display can help you determine the location more quickly and with less cognitive load — which means you may have more brain cycles to respond to the situation appropriately.

Helping the driver locate and track an emergency vehicle

A slight right, not a sharp right
I'm only scratching the surface here. Spatial auditory displays can, in fact, help improve all kinds of driving activities, from engaging in a handsfree call to using your navigation system.

For example, rather than simply say "turn right", the display could emit the instruction from the right side of the vehicle. It could even use apparent motion of the auditory prompt to convey a slight right as opposed to a sharp right.
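How much simpler the idea is than it sounds can be shown with a toy calculation. As an illustrative sketch only (this is a simplification, not the technique from Scott's article, and a real spatial auditory display does far more), here is an equal-power pan law in JavaScript that biases a prompt toward one side of a two-speaker cabin:

```javascript
// Equal-power stereo panning: a simplified stand-in for the richer
// spatialization a real auditory display would perform.
// pan ranges from -1 (hard left) to +1 (hard right).
function panGains(pan) {
  if (pan < -1 || pan > 1) {
    throw new RangeError("pan must be between -1 and 1");
  }
  // Map pan to an angle between 0 and pi/2, then split the power
  // between channels so that gainL^2 + gainR^2 is always 1.
  const angle = (pan + 1) * Math.PI / 4;
  return { left: Math.cos(angle), right: Math.sin(angle) };
}

// A "turn right" prompt biased toward the right speaker:
const g = panGains(0.8);
console.log(g.left.toFixed(3), g.right.toFixed(3));
```

Keeping the total power constant is what makes the prompt seem to move smoothly across the cabin rather than simply getting louder on one side.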

But enough from me. To learn more about spatial auditory displays, check out a new article from my colleague Scott Pennock, whose knowledge of spatial auditory displays far surpasses mine. The article is called Spatial auditory displays: Reducing cognitive load and improving driver reaction times, and it has just been published by Embedded Computing Design magazine.

Monday, June 18, 2012

QNX reference vehicle makes stopover at FTF Americas 2012

Fresh off Telematics Detroit, the QNX reference vehicle is on the road again. And this time, it’s headed to the Freescale Technology Forum (FTF) in San Antonio.

Have you seen photos of the vehicle? If so, you'll know it's a specially modified Jeep Wrangler. From the outside, the Jeep still looks the same, but beneath the hood, something has changed. For the first time, the Jeep’s head unit and instrument cluster, both based on the QNX CAR 2 application platform, are using Freescale i.MX 6 processors. And what better place than FTF to show off this new processor support?

Closeup of Jeep's instrument cluster. See previous post for more photos of vehicle.

As before, the reference vehicle will showcase several capabilities of the QNX CAR 2 platform, including:

  • auto-centric HTML5 framework
  • integration with a variety of popular smartphones
  • one-touch Bluetooth pairing with smartphones using NFC
  • ultra HD hands-free communication
  • DLNA support for phone- and home-based media
  • tablet-based rear-seat entertainment
  • reconfigurable digital instrument cluster
  • Wi-Fi hotspot

The vehicle will also demonstrate several popular third-party technologies, including Pandora, Slacker, and TuneIn Internet radio; TCS navigation; Weather Network; Best Parking; and Vlingo/AT&T Watson voice recognition.

What, more demos?
The reference vehicle isn't the only place to catch QNX technology at FTF. QNX will also showcase:

  • a 3D digital instrument cluster based on a Freescale i.MX 6 quad processor and the QNX Neutrino RTOS, and built with Elektrobit's EB GUIDE Human Machine Interface environment
  • a complete dashboard, including head unit and digital cluster, based on the QNX CAR 2 platform
  • demos for industrial controllers, medical devices, multi-core systems, and advanced graphics, all of which run on the QNX Neutrino RTOS and Freescale silicon

QNX at the podium
Did I mention? QNX experts will also participate in several presentations and panels. Here's the quick schedule:

  • The HTML5 Effect: How HTML5 will Change the Networked Car — June 19, 2:00 pm, Grand Oaks Ballroom A
  • Using an IEC 61508-Certified RTOS Kernel for Safety-Critical Systems — June 20, 2:00 pm, Grand Oaks Ballroom P
  • Embedded Meets Mobility: M2M Considerations and Concepts — June 20, 5:15 pm, Grand Oaks Ballroom E
  • New System Design for Multicore Processors — June 21, 10:30 am, Grand Oaks Ballroom F

Visit the FTF website for details on these and other FTF presentations.

And if you're at FTF, remember to catch the QNX demos at pod numbers 1400 to 1405.

Thursday, June 14, 2012

WIRED Autopia slips into driver's seat of QNX reference vehicle

Chances are, you've seen pictures of the new QNX reference vehicle. You may have even seen the "making of" video that QNX released a few days ago. But have you seen any video of the vehicle in action?

If not, check out this vid by Doug Newcomb of WIRED Autopia. Last week, at Telematics Detroit, Doug met up with Andrew Poliak of QNX for a tour of the vehicle and its various features, including a re-skinnable UI and voice-controlled Facebook integration. The camera was rolling, and here's what it caught:

HTML5 brings new buzz to infotainment system development

QNX to unveil QNX CAR 2 platform on Freescale i.MX 6 at FTF Americas — a guest post from Paul Sykes of Freescale

If you’ve visited the QNX website recently or attended the Telematics Detroit conference last week, then you’ve surely noticed that HTML5 is getting a lot of attention in automotive these days. The buzz around HTML5 focuses on two areas: as an application development and delivery framework, and as an HMI framework. In discussions with many industry participants, my impression is that the application framework part is generally accepted, while the HMI framework part still isn’t well understood.

I don’t intend to discuss these HTML5 aspects in detail; there are experts within the ecosystem who can do a much better job than I can. But I will say that Freescale applications processors will offer the processing and graphics performance to run the desired applications and bring the HMI to life with stunning graphics.

Next week, Freescale will host the annual FTF Americas event in San Antonio, TX. We are very excited about the first public unveiling of the QNX CAR 2 application platform on i.MX 6. Since QNX CAR 2 is based on HTML5, it is particularly fitting to mention in this blog. For those with an interest in understanding more about HTML5 for infotainment systems, QNX and many other ecosystem partners will be on hand at FTF to discuss their thoughts and plans.

Paul Sykes is a member of Freescale’s driver information systems team.


Wednesday, June 13, 2012

The making of the QNX reference vehicle: Jeep Wrangler

Guest post from Nicole Forget of QNX Software Systems
Nicole Forget

Just one week ago, our new reference vehicle was revealed at Telematics Detroit 2012. The Jeep Wrangler features QNX’s digital instrument cluster, which is totally re-skinnable. In fact, the entire user interface of the head unit, which was created using HTML5, can also be re-skinned. The head unit supports loads of functions, too, including the virtual mechanic, all of which are outlined in an earlier post.

The following video gives you some insight into the hard work that was put into the making of the reference vehicle. Check it out!


Monday, June 11, 2012

Moving beyond the browser: HTML5 as an automotive app environment

If you’ve already visited this blog, you’ll know that we are bullish on HTML5 as a way to implement infotainment system HMIs. Not surprisingly, I’ve spent a fair amount of time searching the Web for facts and opinions on using HTML5 in the car, to see how this idea is catching on.

Overall, people see numerous benefits, such as the ability to leverage mobile app development to keep pace with consumer demands, the availability of a large pool of knowledgeable developers, and the attractiveness of a truly open specification supported by many possible vendors.

But when it comes to the challenges of making HTML5 a reality in the car, I found a common thread of questions, mostly rooted in the erroneous belief that an HTML5 application environment is “just a browser.” Everyone is familiar with the concept of a browser, so it’s easy to see why people take this point of view.

So what are the key differences between a browser and an HTML5 application environment? Here’s my quick view.

The experience
Everyone is familiar with the browser experience. You navigate to a web site through bookmarks, a search engine, or direct entry of a URL. The browser implements a user interface (aka the chrome) around a rendering engine and provides bookmarks, URL entry, back and forward, scrolling and panning, and other familiar features.

An automotive HMI based on HTML5 provides a different experience — just look at the accompanying screen shots and decide for yourself if they look like a browser. In fact, the user experience of an HTML5-based HMI is similar to that of any other purpose-built HMI. It can consist of a main screen, window management, navigation controls, and other typical user interface widgets.

A radio tuner and a media player from the QNX CAR 2 application platform. Both apps are based on HTML5, but beyond that, they neither act nor look like a web browser.

A system that uses an HTML5-based HMI can include:

  • core applications that look and act like native applications
  • add-on (downloaded and installed) applications that have controlled interfaces to the underlying hardware
  • “web link” applications that simply link to a cloud-hosted application that can be downloaded on demand and cached

The web link approach makes it easy to update applications: just update the server and the remote client systems will automatically pull the application when needed.
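As a sketch of how that on-demand caching might work with 2012-era HTML5 (the QNX implementation details aren't described here, so treat this as an assumption, and the filenames are invented), a cloud-hosted app could ship an Application Cache manifest that tells the client which resources to store locally:

```
CACHE MANIFEST
# v1.2 - bump this comment to make clients re-fetch the cached app

CACHE:
index.html
app.js
styles.css

NETWORK:
# live data always goes to the server
/api/
```

The page opts in with <html manifest="app.appcache">; after the first visit, the app loads from local storage and is re-fetched only when the manifest on the server changes.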

Local resources
Web browsers pull text, images, and other content from the web and render it on the user’s machine. The process of loading this remote content accounts for much of the user’s wait time. This paradigm changes with a local HTML5 application environment — because resources can exist locally, images and other components can load much more quickly.

What’s more, screens and user interfaces can be designed to fit the platform’s display characteristics. There is no need for panning and scrolling, and only limited need for zooming. Resources such as RAM can be optimized for this experience.

Security and sandboxing
Browsers load content and executable JavaScript code dynamically. This, really, is the power of web technologies. The problem is, dynamically loaded code represents a threat to an embedded platform.

Browsers are designed to be sandboxed. By default, JavaScript code can execute only in the context of a browser engine, and cannot access the underlying operating system primitives and hardware. This approach changes in an HTML5 application environment. To give JavaScript code the ability to behave like a native application, the environment needs interfaces that reach through the underlying OS to the hardware. Plugins are used to implement these HTML5-to-OS interfaces.
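To make the plugin idea concrete, here is a hypothetical sketch in plain JavaScript of the kind of controlled bridge such a plugin might expose. The names (createBridge, vehicleBridge, getSpeed) are invented for illustration and are not the actual QNX CAR interfaces; the native side is simulated so the sketch is self-contained:

```javascript
// Hypothetical plugin bridge: the native layer registers a fixed set
// of entry points, and application JavaScript can call only those.
function createBridge() {
  const registry = new Map();
  return {
    // Called by the (native) plugin layer at startup.
    expose(name, fn) {
      registry.set(name, fn);
    },
    // Called by application code; anything unregistered is rejected.
    invoke(name, ...args) {
      const fn = registry.get(name);
      if (!fn) {
        throw new Error(`"${name}" is not an exposed interface`);
      }
      return fn(...args);
    },
  };
}

// Simulating the native side for this sketch:
const vehicleBridge = createBridge();
vehicleBridge.expose("getSpeed", () => 57); // km/h, canned value here

console.log(vehicleBridge.invoke("getSpeed")); // a permitted call
```

The point of the whitelist is exactly the control described above: scripts get native-like reach, but only through interfaces the platform has deliberately exposed.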

Nonetheless, access to the underlying platform must be carefully controlled. Hence, a security scheme forms a critical component of the HTML5 application environment.

Application packaging
The app experience has become familiar to anyone who owns a smartphone or tablet. An HTML5 application environment in the car can also support this kind of experience: developers create and sign application packages, and users can download those packages from an application store. In an automotive context, authenticity of the applications and control over what they can or cannot do are critical. Again, a security model that enforces this forms a key part of the HTML5 application environment.

So, how should you think of an HTML5 application environment?
From my perspective, an HTML5 environment is like any other traditional HMI toolkit, but with much more flexibility and with inherent support for connected applications. In an HTML5 application environment, you can find technologies similar to those of any proprietary toolkit, including:

  • a rendering engine (HTML5 rendering engine)
  • a set of content authoring and packaging tools
  • layout specifications (HTML5 and CSS3)
  • a programming language (JavaScript)
  • an underlying data model (DOM)

The difference is, these components are developed with a web experience in mind. This, to me, is the most significant benefit: the web platform is open, scalable, and well understood by countless developers.

Wednesday, June 6, 2012

A reference vehicle by any other name

As you know, we’ve been running a contest about the make of our new reference vehicle. We said the first 25 people who guessed it right before the opening of Telematics Detroit would receive a $25 gift card to Starbucks.

Man, do our followers know their cars!

We had 17 correct guesses of a Jeep Wrangler. No one got the exact model right – Jeep Wrangler Sahara – but we think this ride is pretty sweet no matter what it’s called. Congrats to the following winners – we’ll be in touch soon.

Lazarus Long

If you’re at TU Detroit, one demo you won’t want to miss is on mobile connectivity. WARNING: You may leave with one question unanswered: If you could post to Facebook while in your car using simple voice commands, what would you share?

Find out more about the new QNX reference vehicle. And stay tuned for future contests.

Full disclosure: QNX releases first complete photos of new reference vehicle

"Any customer can have a car painted any color... so long as it is black."

We've come a long way since 1909, when Henry Ford penned this now-famous sentence. Not only can modern consumers pick the colors and features they want in a vehicle, but, in many cases, they can order them online. Getting the car you want, with the options you want, has never been easier.

Still, most forms of personalization are baked in. Once you order a car in, say, Barcelona Red (the color of my new Venza), it's hard to reverse the decision. But imagine the day when you can sit behind the wheel and watch your car's instrument cluster automatically reconfigure itself according to your personal preferences. And imagine if the cluster could do the same for everyone else who uses the car.

That's the kind of future QNX is working to make happen.

But you know what? I'm getting ahead of myself. I promised pictures of the new QNX reference vehicle, so let's look at them — especially since they offer tantalizing examples of what I was just talking about. :-)

The vehicle
Up to now, we've only released teaser images of the reference vehicle, with just enough detail to get people guessing as to what it might be. But enough with the mystery. Here's a full-on shot of the vehicle — a Jeep Wrangler Sahara — in all its off-road glory:

Yes, it's a Jeep

By the way, if you were one of the first 25 people in Canada or the US to guess it was a Jeep during our recent teaser campaign, congratulations! We'll identify the winners shortly.

The cluster
Once you get behind the wheel, the first thing you'll see is the digital instrument cluster. Let's zoom in so you can get a good look:

The cluster is implemented entirely in software and can reconfigure itself on the fly to display various types of information. Better yet, you can re-skin the cluster at the tap of a touchscreen button, like so:

As you can see, the cluster communicates with the head unit's navigation system to display turn-by-turn directions. Nice touch.

The head unit
Now look to your right, and you'll see the head unit. It supports a whack of functions (note my deft use of technical language), including one-touch pairing with Bluetooth smartphones, hybrid navigation, text-to-speech, natural speech recognition, streaming Internet radio, weather reporting, parking search, and too many other things to mention here.

In this photo, the head unit displays one of my favorite applications, the virtual mechanic. Intrigued? Check out my description of an early version of this app.

You know what else is cool? The unit's media player can post Facebook updates that list the song currently playing — but only when you tell it to, using voice commands. (Personal control over technology. I like that.) To view these updates later today and tomorrow while the Jeep is at Telematics Detroit, check out the QNX Facebook page.

Here's another photo of the head unit, showing its app tray:

The radio
What car would be complete without a radio? Mind you, in this case, "radio" includes support for streaming Internet radio from Pandora and TuneIn. And keeping in tune with the personalized listening experience these services offer you, the head unit's radio gives you a choice of skins:

In fact, almost every aspect of the head unit can be easily re-skinned. What's more, the underlying code remains the same: only the user interface, created in HTML5, changes from one skin to another. Which means automotive developers can create a single code base and re-use it across multiple vehicle lines. Doing more with less — what could be bad?

That's all I have for now, but before you go, check out the two press releases QNX issued this morning on the Jeep's personalization and Facebook features. Also, check out the QNX Flickr page for even more photos of the Jeep.

Tuesday, June 5, 2012

Cold beer (and a chance encounter) at Telematics Detroit

Last year I was part of QNX’s advance team for Telematics Detroit 2011, which had me in Detroit well before the show. I was just heading out to dinner when I saw a small bar set up on a terrace outside my hotel. I wandered by for a closer look, thinking I might stop by later to enjoy a cool drink in the warm evening air.

A few guys were sitting around drinking beer, and they invited me to join them. Ever social, I agreed. It turned out that the person who extended the invitation was from Cybercom — serendipitous, as Cybercom is a Bluetooth connectivity provider that we were considering for our next-gen QNX CAR application platform.

Fast forward one year exactly. We are getting ready to showcase that next-gen platform, QNX CAR 2, at Telematics Detroit 2012 — in a new reference vehicle, no less. Meanwhile, QNX and Cybercom are now engaged with a number of leading tier one suppliers. In fact, Cybercom has issued a press release about the partnership; read it here.

To Kristian at Cybercom: thanks for showing kindness to a guy on his lonesome, and for being a key contributor and partner in our mutual automotive initiatives.