Thursday, September 18, 2014

A glaring look at rear-view mirrors

Some reflections on the challenge of looking backwards, followed by the vexing question: where, exactly, should video from a backup camera be displayed?

Mirror, mirror, above the dash, stop the glare and make it last! Okay, maybe I've been watching too many Netflix reruns of Bewitched. But mirror glare, typically caused by bright headlights, is a problem — and a dangerous one. It can create temporary blind spots on your retina, leaving you unable to see cars or pedestrians on the road around you.

Automotive manufacturers have offered solutions to this problem for decades. For instance, many car mirrors now employ electrochromism, which allows the mirror to dim automatically in response to headlights and other light sources. But when, exactly, did the first anti-glare mirrors come to market?

According to Wikipedia, the first manual-tilt day/night mirrors appeared in the 1930s. These mirrors typically use a prismatic, wedge-shaped design in which the rear surface (which is silvered) and the front surface (which is plain glass) are at angles to each other. In day view, you see light reflected off the silvered rear surface. But when you tilt the mirror to night view, you see light reflected off the unsilvered front surface, which, of course, has less glare.

Manual-tilt day/night mirrors may have debuted in the 30s, but they were still a novelty in the 50s. Witness this article from the September 1950 issue of Popular Science:



True to their name, manual-tilt mirrors require manual intervention: You have to take your hand off the wheel to adjust them, after you’ve been blinded by glare. Which is why, as early as 1958, Chrysler was demonstrating mirrors that could tilt automatically, as shown in this article from the October 1958 issue of Mechanix Illustrated:


Images: Modern Mechanix blog

Fast-forward to backup cameras
Electrochromic mirrors, which darken electronically, have done away with the need to tilt, either manually or automatically. But despite their sophistication, they still can't overcome the inherent drawbacks of rear-view mirrors, which provide only a partial view of the area behind the vehicle — a limitation that contributes to backover accidents, many of them involving small children. Which is why NHTSA has mandated the use of backup cameras by 2018 and why the last two QNX technology concept cars have shown how video from backup cameras can be integrated with other content in a digital instrument cluster.

This raises the question: just where should backup video be displayed? In the cluster, as demonstrated in our concept cars? Or in the head unit, the rear-view mirror, or a dedicated screen? The NHTSA ruling doesn’t mandate a specific device or location, which isn't surprising, as each has its own advantages and disadvantages.

Consider, for example, ease of use: Will drivers find one location more intuitive and less distracting than the alternatives? In all likelihood, the answer will vary from driver to driver and will depend on individual cognitive styles, driving habits, and vehicle design.

Another issue is speed of response. According to NHTSA’s ruling, any device displaying backup video must do so within 2.5 seconds of the car shifting into reverse. Problem is, the ease of complying with this requirement depends on the device in question. For instance, NHTSA acknowledges that “in-mirror displays (which are only activated when the reverse gear is selected) may require additional warm-up time when compared to in-dash displays (which may be already in use for other purposes such as route navigation).”

At first blush, in-dash displays such as head units and digital clusters have the advantage here. But let’s remember that booting quickly can be a challenge for these systems because of their greater complexity — many offer a considerable amount of functionality. So imagine what happens when the driver turns the ignition key and almost immediately shifts into reverse. In that case, the cluster or head unit must boot up and display backup video within a handful of seconds. It's important, then, that system designers choose an OS that not only supports rich functionality, but also allows the system to start up and initialize applications in the least time possible.
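One way to think about this startup problem is as a boot-priority budget: bring up only the components the camera needs first, and defer everything else. The sketch below illustrates the idea; the stage names and timings are invented for illustration and don't describe any actual QNX boot sequence.

```python
# Hypothetical staged-boot budget: video can appear as soon as the
# camera-critical stages finish, while the rest of the infotainment
# stack initializes afterward. All names and durations are illustrative.

CAMERA_CRITICAL = [("kernel", 0.4), ("display driver", 0.5), ("camera app", 0.6)]
DEFERRED = [("navigation", 2.0), ("media player", 1.5), ("app store", 3.0)]

def time_to_video(stages):
    """Seconds until backup video can be shown: the critical stages run first."""
    return sum(duration for _, duration in stages)
```

With these made-up numbers, the camera path is ready in 1.5 seconds, inside the 2.5-second window, even though a full cold boot of every service would take far longer.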

Tuesday, September 16, 2014

Ontario tech companies team up to target the connected car

To predict who will play a role in tomorrow's connected vehicles, you need to look beyond the usual suspects.

When someone says “automobile,” what’s the first word that comes to mind? Chances are, it isn’t Ontario. And yet Ontario — the Canadian province that is home to QNX headquarters — is a world-class hub of automotive R&D and manufacturing. Chrysler, Ford, General Motors, Honda, and Toyota all have plants here. As do 350 parts suppliers. In fact, Ontario produced 2.5 million vehicles in 2012 alone.

No question, Ontario has the smarts to build cars. But to fully appreciate what Ontario has to offer, you need to look beyond the usual suspects in the auto supply chain. Take QNX Software Systems, for example. Our roots are in industrial computing, but in the early 2000s we started to offer software technology and expertise to the world’s automakers and tier one suppliers. And now, a decade later, QNX offers the premier platform for in-car infotainment, with deployments in tens of millions of vehicles.

QNX Software Systems is not alone. Ontario is home to many other “non-automotive” technology companies that are playing, or are poised to play, a significant role in creating new automotive experiences. But just who are these companies? The Automotive Parts Manufacturers Association (APMA) of Canada would like you to know. Which is why they've joined forces with QNX and other partners to build the APMA Connected Vehicle.

A showcase for Ontario technology
The purpose of the vehicle is simple: to showcase how Ontario companies can help create the next generation of connected cars. The vehicle is based on a Lexus RX350 — built in Ontario, of course — equipped with a custom infotainment system and digital instrument cluster built on QNX technology. Together, the QNX systems integrate more than a dozen technologies and services created in Ontario, including gesture recognition, biometric security, emergency vehicle notification, LED lighting, weather telematics, user interface design, smartphone charging, and cloud connectivity.

Okay, enough from me. Time to nuke some popcorn, dim the lights, and hit the Play button:



Wednesday, September 10, 2014

QNX-powered Audi Virtual Cockpit drives home with CTIA award

Congratulations to our friends at Audi! The new Audi Virtual Cockpit, which is based on the QNX OS, has just won first prize, connected car category, in the 2014 CTIA Hot for the Holidays awards.

I’ve said it before and I’ll say it again: the Audi Virtual Cockpit is an innovative, versatile, and absolutely ravishing piece of automotive technology. But you don’t have to take my word for it — or the word of the CTIA judges, for that matter. Watch the video and see for yourself:



Created in 2009, the Hot for the Holidays awards celebrate the most desirable mobile consumer electronics products for the holiday season. The winners for this year’s awards were announced this afternoon, at the CTIA Super Mobility event in Las Vegas. Andrew Poliak of QNX Software Systems was on hand and he took this snap of the award:



Visit the CTIA website to see the full list of winners. And visit the Audi website to learn more about the Audi Virtual Cockpit.

Monday, September 8, 2014

Some forward-thinking on looking backwards

The first rear-view camera appeared on a concept car in 1956. It's time to go mainstream.

Until today, I knew nothing about electrochromism — I didn’t even know the word existed! Mind you, I still don’t know that much. But I do know a little, so if you’re in the dark about this phenomenon, let me enlighten you: It’s what allows smart windows to dim automatically in response to bright light.

A full-on technical explanation of electrochromism could fill pages. But in a nutshell, electrochromic glass contains a substance, such as tungsten trioxide, that changes color when you apply a small jolt of electricity to it. Apply a jolt, and the glass goes dark; apply another jolt, and the glass becomes transparent again. Pretty cool, right?

Automakers must think so, because they use this technology to create rear-view and side-view mirrors that dim automatically to reduce glare — just the thing when the &*^%$! driver behind you flips on his high-beams. Using photo sensors, these mirrors measure incoming light; when it becomes too bright, the mirror applies the requisite electrical charge and, voilĂ , no more fried retinas. (I jest, but in reality, mirror glare can cause retinal blind spots that affect driver reaction time.)
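The control logic behind such a mirror is conceptually simple: darken when the sensor reads too much light, clear when it drops again. The sketch below is a toy model of that loop, with made-up threshold values and state names; real mirrors also add hysteresis (two thresholds) so passing headlights don't make the mirror flicker between states.

```python
# Toy model of auto-dimming mirror logic, with hysteresis.
# Threshold values are invented for illustration, not taken from any product.

DIM_ON_LUX = 400    # darken when rearward light exceeds this
DIM_OFF_LUX = 250   # clear again only after light drops below this

def next_state(rear_lux, state):
    """Return the mirror's next state ("dimmed" or "clear") for a sensor reading."""
    if state == "clear" and rear_lux > DIM_ON_LUX:
        return "dimmed"   # apply charge: the electrochromic layer darkens
    if state == "dimmed" and rear_lux < DIM_OFF_LUX:
        return "clear"    # reverse charge: the glass becomes transparent
    return state          # between thresholds, hold the current state
```

The gap between the two thresholds is the design choice that matters: a reading of 300 lux leaves a clear mirror clear and a dimmed mirror dimmed, so brief fluctuations in headlight glare don't cause visible flicker.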

So why am I blabbing about this? Because electrochromic technology highlights a century-old challenge: How do you see what — or who — is behind your car? And how do you do it even in harsh lighting conditions? It’s a hard problem to solve, and it’s been with us ever since Dorothy Levitt, a pioneer of motor racing, counseled women to “hold aloft” a handheld mirror “to see behind while driving.” That was in 1906.

Kludges
For sure, we’ve made progress over the years. But we still fall back on kludges to compensate for the inherent shortcomings of placing a mirror meters away from the back of the vehicle. Consider, for example, the aftermarket wide-angle lenses that you can attach to your rear window — a viable solution for some vehicles, but not terribly useful if you are driving a pickup or fastback.

Small wonder that NHTSA has ruled that, as of May 2018, all vehicles under 10,000 pounds must ship with “rear visibility technology” that expands the driver’s field of view to include a 10x20-foot zone directly behind the vehicle. Every year, backover crashes in the US cause 210 fatalities and 15,000 injuries — many involving children. NHTSA believes that universal deployment of rear-view cameras, which “see” where rear-view mirrors cannot, will help reduce backover fatalities by about a third.

Buick is among the automotive brands that are “pre-complying” with the standard: every 2015 Buick model will ship with a rear-view camera. Which, perhaps, is no surprise: the first Buick to sport a rear-view camera was the Centurion concept car, which debuted in 1956:


1956 Buick Centurion: You can see the backup camera just above the center tail light.

The Centurion’s backup camera is one of many forward-looking concepts that automakers have demonstrated over the years. As I have discussed in previous posts, many of these ideas took decades to come to market, for the simple reason they were ahead of their time — the technology needed to make them successful was too immature or simply didn’t exist yet.

Giving cameras the (fast) boot
Fortunately, the various technologies that enable rear-view cameras for cars have reached a sufficient level of maturity, miniaturization, and cost effectiveness. Nonetheless, challenges remain. For example, NHTSA specifies that rear-view cameras must meet a number of requirements, including image size, response time, linger time (how long the camera remains activated after the car shifts out of reverse), and durability. Many of these requirements are made to order for a platform like the QNX OS, which combines high reliability with very fast bootup and response times. After all, what’s the use of a backup camera if it finishes booting *after* you back out of your driveway?


Instrument cluster in QNX technology concept car displaying video from a backup camera.

Wednesday, September 3, 2014

Domo arigato, for self-driving autos

Lynn Gayowski
When talk moves to autonomous cars, Google's self-driving car is often the first project that springs to mind. However, there are a slew of automakers with autonomous or semi-autonomous vehicles in development — Audi, BMW, General Motors, Mercedes-Benz, and Toyota, to name a few. And did you know that QNX has been involved with autonomous projects since 1997?

Let's begin at the beginning. Obviously the first step is to watch the 1983 Mr. Roboto music video. To quote selectively, "I've come to help you with your problems, so we can be free." As Styx aptly communicated with the help of synthesizers, robots have the potential to improve our lives. Current research predicts autonomous cars will reduce traffic collisions and improve traffic flow, plus drivers will be freed up for other activities.

So let's take a look at how QNX has been participating in the progress to self-driving vehicles.



The microkernel architecture of the QNX operating system provides an exemplary foundation for systems with functional safety requirements, and as you can see from this list, there are projects related to cars, underwater robots, and rescue vehicles.

Take a look at this 1997 video from the California Partners for Advanced Transportation Technology (PATH) and the National Automated Highway System Consortium (NAHSC) showing their automated driving demo — the first project referenced on our timeline. It's interesting that the roadway and driving issues mentioned in this video still hold true 17 years later.



We're estimating that practical use of semi-autonomous cars is still 4 years away and that fully autonomous vehicles won't be available to the general public for about another 10 years after that. So stay tuned to the QNX Auto Blog. I'm already envisioning a 30-year montage of our autonomous projects. With a stirring soundtrack by Styx.

Monday, August 25, 2014

QNX Acoustics for Voice — a new name and a new benchmark in acoustic processing


Tina Jeffrey
Earlier this month, QNX Software Systems officially released QNX Acoustics for Voice 3.0 — the company’s latest generation of acoustic processing software for automotive hands-free voice communications. The solution sets a new benchmark in hands-free quality and supports the rigorous requirements of smartphone connectivity specifications.

Designed as a complete software solution, the product includes both the QNX Acoustics for Voice signal-processing library and the QWALive tool for tuning and configuration.

The signal-processing library manages the flow of audio during a hands-free voice call. It defines two paths: the send path, which handles audio flowing from the microphones to the far end of the call, and the receive path, which handles audio flowing from the far end to the loudspeakers in the car:
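To make the two paths concrete, here is a deliberately loose model of that flow. The stage names and numbers are generic placeholders for illustration; they are not the QNX Acoustics for Voice API, and real send-path processing (multi-microphone echo cancellation, adaptive noise reduction) is far more sophisticated than this.

```python
# Illustrative model of a hands-free call's two audio paths.
# All processing here is a crude stand-in for real acoustic algorithms.

def send_path(mic_frame, far_end_reference):
    """Mic to far end: subtract an echo estimate, then gate residual noise."""
    # Crude echo canceller: assume the loudspeaker signal leaks into the
    # mic at a fixed (invented) coupling of 0.9.
    frame = [m - 0.9 * r for m, r in zip(mic_frame, far_end_reference)]
    # Crude noise gate: zero out very quiet residual samples.
    return [s if abs(s) > 0.01 else 0.0 for s in frame]

def receive_path(far_end_frame, gain=1.0):
    """Far end to cabin loudspeakers: apply receive-side gain."""
    return [gain * s for s in far_end_frame]
```

The key point the model captures is the asymmetry: the send path does the heavy lifting (it must remove both the far-end echo and cabin noise before speech leaves the car), while the receive path is comparatively simple.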





QWALive, used throughout development and pre-production phases, gives developers realtime control over all library parameters to accelerate tuning and diagnosis of audio issues:



A look under the hood
QNX Acoustics for Voice 3.0 builds on QNX Software Systems’ best-in-class acoustic echo cancellation and noise reduction algorithms, road-proven in tens of millions of cars, and offers breakthrough advancements over existing solutions.

Let me run through some of the innovative features that are already making waves (sorry, couldn’t resist) among automotive developers.

Perhaps the most significant innovation is our high efficiency technology. Why? Well, simply put, it saves up to 30% both in CPU load and in memory requirements for wideband (16 kHz sample rate for HD Voice) and Wideband Plus (24 kHz sample rate). This translates into the ability to do more processing on existing hardware, and with less memory. For instance, automakers can enable new smartphone connectivity capabilities on current hardware, without compromising performance:



Another feature that premieres with this release is intelligent voice optimization technology, designed to accelerate and increase the robustness of send-path tuning. This technology implements an automated frequency response correction model that dynamically adjusts the frequency response of the send path to compensate for variations in the acoustic path and vehicle cabin conditions.

Dynamic noise shaping, which is exclusive to QNX Acoustics for Voice, also debuts in this release. It enhances speech quality in the send path by reducing broadband noise from fans, defrost vents, and HVAC systems — a welcome feature, as broadband noise can be particularly difficult for hands-free systems to contend with.

Flexibility and portability — check and check
Like its predecessor (QNX Aviage Acoustic Processing 2.0), QNX Acoustics for Voice 3.0 continues to offer maximum flexibility to automakers. The modular software library comes with a comprehensive API, easing integration into infotainment, telematics, and audio amplifier modules. Developers can choose from fixed- and floating-point versions that can be ported to a variety of operating systems and deployed on a wide range of processors or DSPs.

We’re excited about this release as it’s the most sophisticated acoustic voice processing solution available to date, and it allows automakers to build and hone systems for a variety of speech requirements, across all their vehicle platforms.

Check out the QNX Acoustics for Voice product page to learn more.

Monday, August 18, 2014

QNX Acoustics for Active Noise Control wins a Silver Stevie

Lynn Gayowski

The winners of the 11th annual International Business Awards have been announced and I'm happy to share that QNX Acoustics for Active Noise Control (ANC) has won a Silver Stevie Award in the software category, for Best New Product or Service of the Year! The awards program honours the achievements of organizations and working professionals worldwide, and received more than 3,500 nominations this year from dozens of countries. It feels great to be chosen as a winner among so many entries.

If you're unfamiliar with QNX Acoustics for ANC, it's a software solution that can dramatically reduce unwanted engine harmonic noise inside the cabin of a vehicle. The software's algorithms for noise cancellation can run on an existing CPU or DSP in the infotainment system, eliminating the need for dedicated hardware controller modules. The end result is significant savings for automakers and a quieter ride for drivers and passengers.
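The adaptive-filtering idea at the heart of this kind of noise cancellation can be shown in a few lines. The sketch below is a toy LMS (least mean squares) filter that learns to cancel a scaled engine-harmonic tone from a reference signal; a production ANC system such as QNX Acoustics for ANC involves much more (secondary-path modeling, multiple speakers and microphones, RPM tracking), so treat this purely as an illustration of the adaptation principle.

```python
# Toy LMS adaptive filter: learn a weight w so that w * reference
# cancels a correlated noise component, leaving a small residual.

import math

def lms_cancel(noise, reference, mu=0.05):
    """Adapt a single weight to cancel `noise` using `reference`; return residuals."""
    w = 0.0
    residuals = []
    for n, x in zip(noise, reference):
        e = n - w * x          # residual "heard" after cancellation
        w += mu * e * x        # LMS weight update
        residuals.append(e)
    return residuals

# Cabin noise modeled as a scaled copy of an engine-harmonic reference tone.
ref = [math.sin(0.3 * t) for t in range(1, 2001)]
noise = [2.0 * x for x in ref]
res = lms_cancel(noise, ref)
```

The residual starts large and shrinks toward zero as the filter converges, which is exactly the effect a driver perceives as a quieter cabin.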

This is the second award for QNX Acoustics for ANC, after a win in February at the Embedded World conference's embedded AWARDs. If you want to learn more about the solution, read our white paper titled A Software-Based Approach to Active Noise Control in Automobiles. Congratulations QNX!