Tuesday, May 28, 2013

HTML5 blooper reel

I find bloopers infinitely amusing — mind you I’m talking about those that come on a reel, not those that happen for real. Missed deadlines, cost over-runs, IP disputes — these are the bloopers we all could do without.

Helping customers avoid bloopers is what we do — so to speak. Except it seems, when we put them in front of the camera. <grin>

Seriously though, no customers were hurt in the making of this video.



This compilation of bloopers from the HTML5 series highlights the professionalism of QNX customers, partners, and employees as well as their good nature.
 

Friday, May 24, 2013

#QNXLive Twitter Sessions Return!

Ask us your questions about self-driving cars and the secrets of the QNX Garage

Paul Leroux
We’re back for more of your questions. Back in December, we held our first #QNXLive Twitter sessions leading up to CES 2013; next week, we’re revving up for Telematics Detroit (June 5-6) with not one, but two #QNXLive sessions with experts from the QNX auto team.

Autonomous cars continue to captivate the popular imagination and are quickly becoming a reality. On Tuesday, May 28 at 4pm ET, Justin Moon, global technical evangelist, will give a preview of his Telematics Detroit panel, “The Autonomous Car: The Road to Driverless Driving” in his first #QNXLive session. Justin will share his thoughts on the latest developments in autonomous and assisted driving, how the industry defines “autonomous”, how your car is already autonomous in certain respects, and how self-driving cars will change your driving experience.

On Thursday, May 30 at 1pm ET, Alex James, concept development team software engineer, will take you behind the scenes at the QNX Garage in his #QNXLive session. Have you ever wondered what a day looks like in the QNX Garage for the concept design team? What does the team enjoy most about working in the garage? Alex will give you a behind-the-scenes look at the birthplace of the QNX technology concept car based on a Bentley Continental GT and the reference vehicle based on a Jeep Wrangler Sahara — both will be at Telematics Detroit.

You can submit your questions now or on the day of the Twitter sessions by tweeting @QNX_Auto with the hashtag #QNXLive. As usual, we’ll be sure to give you a shout-out if we select your question.

Be sure to follow @QNX_Auto for next week’s live Twitter sessions – and the latest from Telematics Detroit. I’m looking forward to being your host for #QNXLive.

In the meantime, check out our recent posts on autonomous cars and the following videos:

Meet Justin Moon, product manager turned concept designer (Justin is nothing if not versatile: he's since taken on the role of global technical evangelist.)


Meet the QNX concept team: Alex James, software engineer


QNX technology concept car - Bentley Continental



Wednesday, May 22, 2013

Cisco study: people want a safer, more personalized driving experience

And they're willing to give up some of their privacy to get it.

Paul Leroux
Call me old-fashioned, but my hackles go up every time a web site or business asks me for personal information. My reaction is at once emotional and rational. Emotional because I'm a private person by nature; sharing details about myself simply goes against the grain. Rational because I know that people want this information more for their own benefit than for mine.

Does that mean I never share personal information? Of course not. Even if someone wants it primarily for their benefit, I may still enjoy some benefit in return. That said, I weigh the pros and cons carefully. And I ask questions. For instance, who will have access to the information? And what will they do with it?

In effect, personal information becomes a form of tender — something I barter in exchange for a perceived benefit. And it seems I'm not alone.

Recently, Cisco published the results of a study on what car drivers would be willing to give up in exchange for a variety of benefits. For instance, 60% would provide DNA samples or other biometric information in return for personalized security or car security. And a whopping 74% would let their driving habits be monitored in return for lower insurance or service maintenance costs. Cisco sums it up in this infographic:


In autonomous we trust
The study also found that people are willing to embrace autonomous cars — but the enthusiasm varies significantly by geography. For instance, Canada trails the U.S. by 8 percentage points, but both countries are miles behind India and Brazil.


The study surveyed more than 1,500 consumers across 10 countries. That's only about 150 people per country, so I wouldn't give too much credence to this geographic breakdown. That said, the differences are dramatic enough to suggest that self-driving cars will see faster adoption in some countries than others.

For more on these and other findings, visit the Cisco website.

Monday, May 13, 2013

The great autonomous car debate

Paul Leroux
When it comes to cars that drive themselves, are you for or against? Either way, you're bound to find fodder for your arguments in Six reasons to love, or loathe, autonomous cars, a recent CNET article co-authored by Wayne Cunningham and Antuan Goodwin.

Wayne is for, Antuan is against, and both make good points. For instance, Wayne argues that autonomous cars will reduce accidents and help the elderly remain mobile. Antuan, meanwhile, warns of the potential for reduced privacy and the likelihood that driving will become less random — that last point may not sound like a drawback, but I found myself nodding in agreement.

Actually, I found myself agreeing with both writers on several points. Does that make me a fence-sitter or just someone with a balanced perspective? Read the article and tell me what you think.

Thursday, May 9, 2013

Specs for Cars?

Tina Jeffrey
As Google Glass, the latest in experimental computer wearables, starts to make its way into the hands of select users, a multitude of use cases is popping up. For instance, a WIRED article recently explored the notion of your car being a ‘killer app’ for Google Glass. Now, you may not want to think of your car as a killer app, but let’s contemplate this use case for a moment.

Drivers wearing Glass could pair their new specs to their phone and instantly have a personal heads-up display (HUD) that overlays virtual road signs and GPS information on the scene in front of them. For instance:


Source: Google

Glass also understands voice commands and could take dictation for an email, display turn-by-turn directions, or set up and display point-of-interest destination data based on a simple voice command such as “Find the nearest Starbucks”.

This is all very cool — but does it bring anything new to the driving experience that isn’t already available? Not really. Car makers have already deployed voice-enabled systems to interface with navigation and location-based services; these services either run locally or are accessed through a brought-in mobile device and displayed on the head unit in a safe manner. ADAS algorithms, meanwhile, perform real-time traffic sign detection and recognition to display speed limits on the vehicle’s HUD. All this technology exists today and works quite well.

Catch me if you can
Another aspect to consider is the regulatory uncertainty created by drivers wearing these types of devices. Police can spot a driver with their head down texting on a cellphone or watching a movie on a DVD player. But detecting a driver performing these same activities while wearing a head-mounted display — not so easy. There’s no way of knowing whether the activities a driver is engaged in are driving related or an outright distraction. Unlike a HUD specified by the automaker, which is designed to coordinate and synchronize displayed data based on vehicle conditions and an assessment of cognitive load, a head-mounted display like Glass could give a driver free rein to engage in any activity at any time. This flies in the face of driver distraction guidelines being promulgated by government agencies.

Don’t get me wrong. Glass is cool technology, and I see viable applications for it. For instance, as an alternative to helmet cams when filming a first-person perspective of a ski run down a mountain, or in taking augmented reality gaming to the next level. (You can see many other applications on the Glass site.) But Glass is a personal display that operates as an extension of your cellphone, not as a replacement for a car’s HUD. Cars need well-integrated, usable systems that can safely enhance the driving experience. Because of this, I don’t believe that devices like Glass, as they are currently conceived, will garner a spot in our cars.

Tuesday, May 7, 2013

QNX, Renesas to integrate R-Car SoCs and QNX CAR platform

Paul Leroux
This just in: QNX and Renesas today announced that they are integrating support for Renesas R-Car SoCs into the QNX CAR application platform. The companies also announced that QNX is joining the Renesas R-Car consortium, a partner program designed to help drive rapid development of next-generation in-car systems.

Renesas designed its R-Car SoCs to power a variety of high-end navigation and infotainment systems. The SoCs integrate a 3D graphics processor, an audio processing DSP, and image recognition processing IPs, as well as support for CAN, MOST, USB, Ethernet, SD Card, and other interfaces.

As part of their collaboration, QNX and Renesas intend to include support for the recently announced R-Car H2 SoC. According to Renesas, the H2 is now the world's highest-performance SoC for car information systems, with a Dhrystone benchmark of over 25,000 DMIPS.

For more information, read the press release.

UPDATE: My colleague Kosuke, who writes for the QNX Japan blog, has just sent some photos from the Embedded Systems Expo in Tokyo, where Renesas unveiled the first demonstration of the QNX CAR platform on an R-Car M1A processor.

Here is a shot of the demo:



And here's a closeup of the BOCK-W board that hosts the M1A processor:


Monday, May 6, 2013

Experience the MLOVE!

Justin Moon
The Monterey Peninsula is arguably one of the most inspiring backdrops for a conference on the planet. I couldn’t think of a better place to hold this year’s MLOVE Confestival USA. MLOVE brings together thought leaders, business innovators, and forward-thinking engineers in the mobile space for three days of inspirational talks and thought-provoking workshops. The goal of the conference is quite simply to inspire attendees and spark innovation. This year the QNX team pulled out all the stops — not only did we have a speaking slot but the QNX technology concept car (a specially modified Bentley) was front and center throughout the entire conference.

Day 1
MLOVE was the first outdoor public demonstration of the Bentley, so Mark Rigley and I started the day by putting the car through its paces. Connectivity? Check. Screens visible in the sunlight? Got it. Input, voice, touch, media? All good! In the afternoon we demonstrated the Bentley to delegates from a wide range of mobile companies. Kudos to our talented concept development team for flawless execution.

The evening’s festivities were in the key of inspiration. A very engaging talk by Maurice Conti of Autodesk on future trends was the opening salvo. Steve Brown, producer and director at Spark Pictures, introduced a feature film that showcases passion and empowerment during the Burning Man Festival.

Inside the Bentley
Day 2
The second day was jam-packed with presentations, an open space interactive ideation session, and even a startup competition. The day’s presentations focused on the Internet of Things, wearable computing, artificial intelligence, and transformational media. Highlights included discussions not just on the digitization and connectivity of things, but on the overarching experience contained within the idea. We also learned what space startup Nanosatisfi is doing with lightweight, inexpensive satellites (CubeSats) that users can rent for experiments. Talk about the Internet of Things! Each break in the action saw the Bentley come back into focus with engaging conversations as well as demonstrations.

Day 3
Connected Vehicle day! Day 3 saw a very passionate set of presentations focusing on connected vehicles and how we will interact with them. Topics included direct interaction (the fusion of mobile and automotive technology), inter-vehicle communication, and interaction with city infrastructure. I had the opportunity (and the pleasure) to deliver a talk on reshaping the mobile and automotive user experience. The exclamation point to my presentation was the seamless interaction between my mobile device and the Bentley, which was just outside the conference area. Not only did I speak about the possibilities, I demonstrated them.

Moving the needle
MLOVE is about innovative people coming together to discuss future trends in mobility, to figure out what needs to be done to move the needle, and generally to be inspired by the experience. Ideation was abundant and minds were blown. Thank you MLOVE — now I have so much more to think about! In all seriousness: great conference, great ideas, great people, a great experience.

Wednesday, May 1, 2013

Report from Barcelona: first meeting of the W3C automotive business group

Last week, I had the privilege of attending the first face-to-face meeting of the W3C automotive business group and the honor of being nominated group co-chair. (The other co-chair is Adam Abramski, an open source project manager for Intel.) With more than 70 members, the group has already become the eighth-largest group in the W3C, even though it is barely two months old. Clearly, it’s generating a lot of interest.

The meeting included three presentations and two contributions. I presented on the lessons we’ve learned with the QNX CAR platform, how we think the market is changing, and how these changes should drive HTML5 standardization efforts.

I presented my three “musts” for standardizing HTML5 in the car:
1. Must create something designed to run apps, not HMIs (unless HMIs come along for free)
2. Must focus on mobile developers as the target development audience
3. Must support integration of HTML5 environments with native environments like EB Guide and Qt
I described some of the changes that have resulted from the alignment of the QNX CAR platform with the Apache Cordova framework, and why they are crucial to our HTML5 work. Unfortunately, we didn’t have our W3C contribution ready because of these changes, but members generally agreed that having a standard consistent with mobile development was an appropriate course change.
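To make the second “must” concrete, here is a minimal sketch of what a Cordova-style vehicle-data call might look like to a mobile developer. The `vehicle` object, its `getSpeed` callback signature, and the returned fields are all illustrative assumptions for this post, not the actual QNX CAR or W3C API.

```javascript
// Hypothetical sketch only: "vehicle" and getSpeed() are invented
// for illustration; a real Cordova plugin would expose something
// similar through its native bridge on the head unit.

// Mock provider standing in for the native side of the plugin.
const vehicle = {
  _speedKmh: 72, // canned value; a real head unit would read the CAN bus
  getSpeed(successCb, errorCb) {
    if (typeof this._speedKmh === "number") {
      successCb({ speed: this._speedKmh, unit: "km/h" });
    } else {
      errorCb(new Error("speed unavailable"));
    }
  }
};

// App code: the familiar success/error callback pattern that
// Cordova plugins use, so mobile developers feel at home.
let lastReading = null;
vehicle.getSpeed(
  (data) => {
    lastReading = data;
    console.log(`Current speed: ${data.speed} ${data.unit}`);
  },
  (err) => console.error(err.message)
);
```

The point of the sketch is the shape of the API, not the names: if in-car data looks like any other Cordova plugin, a mobile developer can target the car without learning an automotive-specific programming model.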

Tizen and GenIVI gave presentations about their vehicle APIs. Tizen has contributed its APIs, but GenIVI hasn’t yet — still waiting on final approvals. Webinos contributed its APIs before the meeting, but didn’t deliver a presentation on its contribution; members had reviewed the Webinos work before the meeting.

The meeting was a great chance to sit down with people I don’t normally meet. Overall, the group is moving in the right direction, creating a standard that can help automakers bring the goodness of HTML5 into the car.