Wednesday, March 25, 2015

Keeping it fresh for 35 years

By Megan Alink, Director of Marketing Communications for Automotive

Recently, my colleagues Paul Leroux and Matt Young showed off a shiny new infographic that highlights the many ways readers encounter QNX-based systems in daily life (here and here). After three-and-a-half decades in business, we’ve certainly been around the block a time or two, and you might think things are getting a bit stale. As the infographic shows, that couldn’t be further from the truth here at QNX. From up in the stars to down on the roads; in planes, trains, and automobiles (and boats, too); whether you’re mailing a letter or crafting a BBM on your BlackBerry smartphone, the number and breadth of applications in which our customers deploy QNX technology is simply astounding.

For those who like some sound with their pictures, we also made a video to drive home the point that, wherever you are and whatever you do, chances are you’ll encounter a little QNX. Check it out:


Wednesday, March 18, 2015

Building smartphone-caliber connectivity into cars

Paul Leroux
Implementing cellular and Wi-Fi connectivity in a vehicle is never trivial. But with the right technology, the task can become a lot simpler.

When it comes to selling cars, just how important is connectivity? Can the services provided by connected cars, such as Internet radio, remote diagnostics, and real-time traffic information, influence vehicle buying decisions? And if so, how much?

In 2014, telecom giant Telefónica decided to find out. In a survey of 5000 consumers, the company found that 71% of respondents were interested in using, or were already using, connected car services. Other studies report similar findings. Parks Associates, for example, found that 78% of people who already own a connected car will demand connectivity features in their next vehicle.

Of course, “connected car” means different things to different people. It could, for example, refer to a car that has a built-in cellular modem, or to a car that uses the driver’s smartphone to access online services. Moreover, the features offered by my connected car may differ completely from the features offered by your connected car. But no matter what form it takes, or what applications it enables, connectivity in the car can be a challenge to implement. In a recent blog post on LinkedIn, Roger Lanctot of Strategy Analytics attests to this difficulty, stating that nearly every car maker seeking to implement connectivity has stumbled on issues ranging from bad connections and poor user interfaces to interminable delays.

Consider, for example, the challenge of embedding a cellular modem in a vehicle — or any other embedded device, for that matter. Initializing and managing the modem requires a large set of software that, among other things, must:

  • handle modem reset and recovery, because even the best modems crash
  • monitor and manage power consumption to optimize current draw
  • ensure data throughput and reliability
  • reduce or eliminate call-drops and call-setup failures

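To make the first bullet concrete, a modem supervisor is essentially a watchdog loop: poll the modem, and if it stops responding, reset it, with a cap on retries so a dead unit doesn’t get power-cycled forever. Here is a minimal sketch in Python; the `ModemLink` class and its methods are hypothetical stand-ins for a real modem driver interface, not any actual API:

```python
import time

class ModemLink:
    """Hypothetical stand-in for a cellular modem driver interface."""
    def __init__(self):
        self.alive = True
        self.resets = 0

    def responds(self):
        # In a real system this would poll the modem, e.g. via an AT command.
        return self.alive

    def reset(self):
        # Power-cycle or soft-reset the modem; assume it comes back up.
        self.resets += 1
        self.alive = True

def supervise(modem, checks=5, max_resets=3, backoff_s=0.0):
    """Watchdog loop: poll the modem; on failure, reset it, up to a retry cap.

    Returns the total number of resets performed; raises if the cap is hit.
    """
    for _ in range(checks):
        if not modem.responds():
            if modem.resets >= max_resets:
                raise RuntimeError("modem unrecoverable: reset limit reached")
            modem.reset()
            time.sleep(backoff_s)  # give the modem time to re-register
    return modem.resets
```

In practice, a supervisor like this would also re-establish the data session after a reset, manage power states, and log recovery statistics — which is exactly why the full software stack is as large as the bullets above suggest.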
The challenge doesn’t stop there. Network operators, for example, are paying more attention to M2M connections on their networks, thereby increasing the demand for operator-approved modems and modules. Meanwhile, system designers may need to swap out modems to target different regions or price points, or to take advantage of newer, more capable modem technology. The goal, then, is to implement a flexible, future-proofed design that can accommodate such changes with a bare minimum of fuss.

Enter a new webinar hosted by my colleagues Karen Bachman and Leo Forget. In “Applying smartphone wireless technology to connected embedded systems,” they will examine the challenges of embedding wireless connectivity and explore how to address these challenges through software frameworks developed for smartphones and other mobile devices. True to the title, Karen and Leo will look at use cases not just for automotive, but for other industries as well, such as medical and industrial. The bulk of the conversation, though, will focus on common issues that embedded developers face, regardless of the device type they are building.

Attend this webinar to learn about:

  • Applications that stand to benefit the most from wireless connectivity
  • Challenges and complexity of bringing connectivity to cars and other embedded systems
  • Potential security and privacy risks introduced by wireless connectivity, including unauthorized access and unencrypted data transfer
  • The benefits of creating flexible products that easily accommodate advances in modem technology

Here are the webinar coordinates:

Applying smartphone wireless technology to connected embedded systems

Thursday, March 26, 2015
12:00 pm to 1:00 pm EST
Register: TechOnLine


Wednesday, March 11, 2015

Long time, no see: Catching up with the QNX CAR Platform

By Megan Alink, Director of Marketing Communications for Automotive

It’s a fact — a person simply can’t be in two places at one time. I can’t, you can’t, and the demo team at QNX can’t (especially when they’re brainstorming exciting showcase projects for 2016… but that’s another blog. Note to self.) So what’s a QNX-loving, software-admiring car aficionado to do when he or she has lost touch and wants to see the latest on the QNX CAR Platform for Infotainment? Video, my friends.

One of the latest additions to our QNX Cam YouTube channel is an update to a video made just over two and a half years ago, in which my colleague, Sheridan Ethier, took viewers on a feature-by-feature walkthrough of the QNX CAR Platform. Now, Sheridan’s back for another tour, so sit back and enjoy a good, old-fashioned catch-up with what’s been going on with our flagship automotive product (with time references, just in case you’re in a bit of a hurry).

Sheridan Ethier hits the road in the QNX reference vehicle based on a modified Jeep Wrangler, running the latest QNX CAR Platform for Infotainment.

We kick things off with a look at one of the most popular elements of an infotainment system — multimedia. Starting around the 01:30 mark, Sheridan shows how the QNX CAR Platform supports a variety of music formats and media sources, from the system’s own multimedia player to a brought-in device. And when your passenger is agitating to switch from the CCR playlist on your MP3 device to Meghan Trainor on her USB music collection, the platform’s fast detection and sync time means you’ll barely miss a head-bob.

The QNX CAR Platform’s native multimedia player — the “juke box” — is just one of many options for enjoying your music.

About five minutes in, we take a look at how the QNX CAR Platform implements voice recognition. Whether you’re seeking out a hot latte, navigating to the nearest airport, or calling a co-worker to say you’ll be a few minutes late, the QNX CAR Platform lets you do what you want to do while doing what you need to do — keeping your hands on the wheel and your eyes on the road. Don’t miss a look at concurrency (previously discussed here by Paul Leroux) during this segment, when Sheridan runs the results of his voice commands (multimedia, navigation, and a hands-free call) smoothly at the same time.

Using voice recognition, users can navigate to a destination by address or point of interest description (such as an airport).

At eight minutes, Sheridan tells us about one of the best examples of the flexibility of the QNX CAR Platform — its support for application environments, including native C/C++, Qt, HTML5, and APK for running Android applications. The platform’s audio management capability makes a cameo appearance when Sheridan switches between the native multimedia player and the Pandora HTML5 app.

Pandora is just one of the HTML5 applications supported by the QNX CAR Platform.

As Sheridan tells us (at approximately 12:00), the ability to project smartphone screens and applications into the vehicle is an important trend in automotive. With technologies like MirrorLink, users can access nearly all of the applications available on their smartphone right from the head unit.

Projection technologies like MirrorLink allow automakers to select which applications will be delivered to the vehicle’s head unit from the user’s connected smartphone. 

Finally, we take a look at two interesting features that differentiate the QNX CAR Platform — last mode persistence (the song you were listening to when you turned the car off picks up at the same point when you turn the car back on) and fastboot (which, in the case of QNX CAR, can bring your backup camera to life in 0.8 seconds, far less than the NHTSA-mandated 2 seconds). These features work hand-in-hand to ensure a safer, more enjoyable, more responsive driving experience.

Fastboot in 0.8 seconds means that when you’re ready to reverse, your car is ready to show you the way.
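Last mode persistence boils down to checkpointing the media state so it survives a power cycle, and doing so atomically, since the ignition can cut power mid-write. A minimal sketch in Python — the function names and the JSON-file approach are illustrative only; a production infotainment system would use the platform’s own persistence services:

```python
import json
import os
import tempfile

def save_last_mode(path, state):
    """Atomically persist the current media state (track, position, source).

    Write to a temp file and rename it into place, so a power cut
    mid-write never leaves a corrupt state file behind.
    """
    d = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=d)
    with os.fdopen(fd, "w") as f:
        json.dump(state, f)
    os.replace(tmp, path)  # atomic on POSIX filesystems

def load_last_mode(path, default=None):
    """Restore the last media state at boot; fall back if nothing was saved."""
    try:
        with open(path) as f:
            return json.load(f)
    except (OSError, ValueError):
        return default
```

On the next boot, the media player reads the saved state and resumes the same track at the same position — which is why the feature feels instantaneous when paired with a sub-second boot.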

Interested in learning more about the QNX CAR Platform for Infotainment? Check out Paul Leroux’s blog on the architecture of this sophisticated piece of software. To see QNX CAR in action, read Tina Jeffrey’s blog, in which she talks about how the platform was implemented in the reimagined QNX reference vehicle for CES 2015.

Check out the video here:


Wednesday, March 4, 2015

“What do you mean, I have to learn how not to drive?”

The age of autonomous driving lessons is upon us.

Paul Leroux
What would it be like to ride in an autonomous car? If you were to ask the average Joe, he would likely describe a scenario in which he sips coffee, plays video games, and spends quality time with TSN while the car whisks him to work. The average Jane would, no doubt, provide an equivalent answer. The problem with this scenario is that autonomous doesn’t mean driverless. Until autonomous vehicles become better than humans at handling every potential traffic situation, drivers will have to remain alert much or all of the time, even if their cars do 99.9% of the driving for them.

Otherwise, what happens when a car, faced with a situation it can’t handle, suddenly cedes control to the driver? Or what happens when the car fails to recognize a pedestrian on the road ahead?

Of course, it isn’t easy to maintain a high level of alertness while doing nothing in particular. It takes a certain maturity of mind, or at least a lack of ADD. Which explains why California, a leader in regulations for autonomous vehicles, imposes restrictions on who is allowed to “drive” them. Prerequisites include a near-spotless driving record and more than 10 years without a DUI conviction. Drivers must also complete an autonomous driving program, the length of which depends on the car maker or automotive supplier in question. According to a recent investigation by IEEE Spectrum, Google offers the most comprehensive program — it lasts five weeks and subjects drivers to random checks.

1950s approach to improving driver alertness. Source:

In effect, drivers of autonomous cars have to learn how not to drive. And, as another IEEE article suggests, they may even need a special license.

Ample warnings
Could an autonomous car mitigate the attention issue? Definitely. It could, for example, give the driver ample warning before he or she needs to take over. The forward collision alerts and other informational ADAS functions in the latest QNX technology concept car offer a hint as to how such warnings could operate. For the time being, however, it’s hard to imagine an autonomous car that could always anticipate when it needs to cede control. Until then, informational ADAS will serve as an adjunct to, not a replacement for, eyes, ears, and old-fashioned attentiveness.

Nonetheless, research suggests that adaptive cruise control and other technologies that enable autonomous or semi-autonomous driving can, when compared to human drivers, do a better job of avoiding accidents and improving traffic flow. To quote my friend Andy Gryc, autonomous cars would be more “polite” to other vehicles and be better equipped to negotiate inter-vehicle space, enabling more cars to use the same length of road.

Fewer accidents, faster travel times. I could live with that.


2015 approach to improving driver alertness: instrument cluster from the QNX reference vehicle.

Monday, March 2, 2015

Hypervisors, virtualization, and taking control of your safety certification budget

A new webinar on how virtualization can help you add new technology to existing designs.

First things first: should you say “hypervisor” or “virtual machine monitor”? Both terms refer to the same thing, but is one preferable to the other?

Hypervisor certainly has the greater sex appeal, suggesting it was coined by a marketing department that saw no hope in promoting a term as coldly technical as virtual machine monitor. But, in fact, hypervisor has a long and established history, dating back almost 50 years. Moreover, it was coined not by a marketing department, but by a software developer.

“Hypervisor” is simply a variant of “supervisor,” a traditional name for the software that controls task scheduling and other fundamental operations in a computer system — software that, in most systems, is now called the OS kernel. Because a hypervisor manages the execution of multiple OSs, it is, in effect, a supervisor of supervisors. Hence hypervisor.

No matter what you call it, a hypervisor creates multiple virtual machines, each hosting a separate guest OS, and allows the OSs to share a system’s hardware resources, including CPU, memory, and I/O. As a result, system designers can consolidate previously discrete systems onto a single system-on-chip (SoC) and thereby reduce the size, weight, and power consumption of their designs — a trinity of benefits known as SWaP.

That said, not all hypervisors are created equal. There are, for example, Type 1 “bare metal” hypervisors, which run directly on the host hardware, and Type 2 hypervisors, which run on top of an OS. Both types have their benefits, but Type 1 offers the better choice for any embedded system that requires fast, predictable response times — most safety-critical systems arguably fall within this category.

The QNX Hypervisor is an example of a Type 1 “bare metal” hypervisor.


Moreover, some hypervisors make it easier for the guest OSs to share hardware resources. The QNX Hypervisor, for example, employs several technologies to simplify the sharing of display controllers, network connections, file systems, and I/O devices like the I2C serial bus. Developers can, as a result, avoid writing custom shared-device drivers that increase testing and certification costs and that typically exhibit lower performance than field-hardened, vendor-supplied drivers.

Adding features, without blowing the certification budget
Hypervisors, and the virtualization they provide, offer another benefit: the ability to keep OSs cleanly isolated from each other, even though they share the same hardware. This benefit is attractive to anyone trying to build a safety-critical system and reduce SWaP. Better yet, virtualization can help device makers add new and differentiating features, such as rich user interfaces, without compromising safety-critical components.

That said, hardware and peripheral device interfaces are evolving continuously. How can you maintain compliance with safety-related standards like ISO 26262 and still take advantage of new hardware features and functionality?

Enter a new webinar hosted by my inimitable colleague Chris Ault. Chris will examine techniques that enable you to add new features to existing devices, while maintaining close control of the safety certification scope and budget. Here are some of the topics he’ll address:

  • Overview of virtualization options and their pros and cons
  • Comparison of how adaptive time partitioning and virtualization help achieve separation of safety-critical systems
  • Maintaining realtime performance of industrial automation protocols without directly affecting safety certification efforts
  • Using Android applications for user interfaces and connectivity

Webinar coordinates:
Exploring Virtualization Options for Adding New Technology to Safety-Critical Devices
Time: Thursday, March 5, 12:00 pm EST
Duration: 1 hour
Registration: Visit TechOnLine

Monday, February 9, 2015

QNX-powered Audi Virtual Cockpit shortlisted for MWC’s Global Mobile Awards

By Lynn Gayowski

2015 has just started and the QNX auto team is already off to the races. It was only last month at CES that the digital mirrors in our 2015 technology concept car were selected as a finalist for Engadget’s Best of CES Awards, in the category for best automotive tech. Now we’re excited to share some other big, award-related news. Drum roll, please… the QNX-powered Audi virtual cockpit in the 2015 Audi TT has been shortlisted for Mobile World Congress’ prestigious Global Mobile Awards, in the category for best mobile innovation for automotive!

The 2015 Audi TT features a one-of-a-kind, innovative, and just plain awesome instrument cluster — the Audi virtual cockpit — powered by the QNX operating system. With the Audi virtual cockpit, everything is in view, directly in front of the driver. All the functions of a conventional instrument cluster and a center-mounted head unit are blended into a single, highly convenient, 12.3" display. This approach allows users to interact with their music, navigation, and vehicle information in a simple, streamlined fashion. As you may recall, the QNX-powered Audi virtual cockpit also took home first place in CTIA’s Hot for the Holidays Awards late last year.

Props also to our BlackBerry colleagues, who received two nominations of their own for the Global Mobile Awards: BlackBerry Blend in the best mobile service or app for consumers category, and BlackBerry for BBM Protected in the best security/anti-fraud product or solution category.

The winners will be announced on March 3 at the Global Mobile Awards ceremony at Mobile World Congress. We can’t wait to hit Barcelona! In the meantime, check out the video below to see the Audi virtual cockpit in action.




Thursday, February 5, 2015

Have you heard about Phantom Intelligence yet?

If you haven’t, I bet you will. Phantom Intelligence is a startup that is looking to revolutionize LiDAR for automotive. I hadn’t heard of them either until QNX and Phantom Intelligence found themselves involved in a university project in 2014. They had some cool technology and are just all-around good guys, so we started to explore how we could work together at CES 2015. One thing led to another and their technology was ultimately featured in both the QNX reference vehicle and the new QNX technology concept car.

I knew little about LiDAR at the beginning of the partnership. But as I started to ramp up my knowledge, I learned that LiDAR can provide valuable sensor input into ADAS systems. Problem is, LiDAR solutions are big and expensive, and have not, for the most part, provided the kind of sensitivity and performance that automakers look for.

Phantom Intelligence is looking to change all this with small, cost-effective LiDAR systems that can detect not just metal, but also people (handy if you are crossing the street and left your Tin Man costume at home), and that are impervious to inclement weather. As a frequent pedestrian, I find this all music to my ears.

I am still in no way qualified to offer an intelligent opinion on the pros and cons of competing LiDAR technologies, so I’m just going on the positive feedback I heard at CES from customers and other suppliers in the ADAS space. Phantom turned out to be one of the surprise hits this year, and they are just getting started. That’s why I think you’ll be hearing more about them soon.


Both QNX vehicles showcased at CES 2015 use a LiDAR system from Phantom Intelligence to detect obstacles on the road ahead.