Showing posts with label Reference vehicle. Show all posts

Wednesday, March 11, 2015

Long time, no see: Catching up with the QNX CAR Platform

By Megan Alink, Director of Marketing Communications for Automotive

It’s a fact — a person simply can’t be in two places at one time. I can’t, you can’t, and the demo team at QNX can’t (especially when they’re brainstorming exciting showcase projects for 2016… but that’s another blog. Note to self.) So what’s a QNX-loving, software-admiring, car aficionado to do when he or she has lost touch and wants to see the latest on the QNX CAR Platform for Infotainment? Video, my friends.

One of the latest additions to our QNX Cam YouTube channel is an update to a video made just over two and a half years ago, in which my colleague, Sheridan Ethier, took viewers on a feature-by-feature walkthrough of the QNX CAR Platform. Now, Sheridan’s back for another tour, so sit back and enjoy a good, old-fashioned catch-up with what’s been going on with our flagship automotive product (with time references, just in case you’re in a bit of a hurry).

Sheridan Ethier hits the road in the QNX reference vehicle based on a modified Jeep Wrangler, running the latest QNX CAR Platform for Infotainment.

We kick things off with a look at one of the most popular elements of an infotainment system — multimedia. Starting around the 01:30 mark, Sheridan shows how the QNX CAR Platform supports a variety of music formats and media sources, from the system’s own multimedia player to a brought-in device. And when your passenger is agitating to switch from the CCR playlist on your MP3 device to Meghan Trainor on her USB music collection, the platform’s fast detection and sync time means you’ll barely miss a head-bob.

The QNX CAR Platform’s native multimedia player — the “juke box” — is just one of many options for enjoying your music.

About five minutes in, we take a look at how the QNX CAR Platform implements voice recognition. Whether you’re seeking out a hot latté, navigating to the nearest airport, or calling a co-worker to say you’ll be a few minutes late, the QNX CAR Platform lets you do what you want to do while doing what you need to do — keeping your hands on the wheel and your eyes on the road. Don’t miss a look at concurrency (previously discussed here by Paul Leroux) during this segment, when Sheridan runs the results of his voice commands (multimedia, navigation, and a hands-free call) smoothly at the same time.

Using voice recognition, users can navigate to a destination by address or point of interest description (such as an airport).

At eight minutes, Sheridan tells us about one of the best examples of the flexibility of the QNX CAR Platform — its support for application environments, including native C/C++, Qt, HTML5, and APK for running Android applications. The platform’s audio management capability makes a cameo appearance when Sheridan switches between the native multimedia player and the Pandora HTML5 app.

Pandora is just one of the HTML5 applications supported by the QNX CAR Platform.

As Sheridan tells us (at approximately 12:00), the ability to project smartphone screens and applications into the vehicle is an important trend in automotive. With technologies like MirrorLink, users can access nearly all of the applications available on their smartphone right from the head unit.

Projection technologies like MirrorLink allow automakers to select which applications will be delivered to the vehicle’s head unit from the user’s connected smartphone. 

Finally, we take a look at two interesting features that differentiate the QNX CAR Platform — last mode persistence (e.g. when the song you were listening to when you turned the car off starts up at the same point when you turn the car back on) and fastboot (which, in the case of QNX CAR, can bring your backup camera to life in 0.8 seconds, far less than the NHTSA-mandated 2 seconds). These features work hand-in-hand to ensure a safer, more enjoyable, more responsive driving experience.
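The video doesn’t go into implementation details, but the idea behind last mode persistence is straightforward: snapshot the playback state whenever it changes, and read it back at boot. Here is a minimal, purely illustrative Python sketch; the file location and field names are assumptions for illustration, not QNX APIs:

```python
import json
from pathlib import Path

# Fallback used on first boot, before any snapshot exists.
DEFAULT_STATE = {"source": "radio", "track": None, "position_ms": 0}

def save_last_mode(state_file: Path, source: str, track: str, position_ms: int) -> None:
    """Snapshot the active media source and playback position.
    Called on every state change, so an abrupt power-off still
    leaves a recent snapshot on persistent storage."""
    state_file.parent.mkdir(parents=True, exist_ok=True)
    state_file.write_text(json.dumps(
        {"source": source, "track": track, "position_ms": position_ms}))

def restore_last_mode(state_file: Path) -> dict:
    """At boot, resume exactly where the driver left off."""
    if state_file.exists():
        return json.loads(state_file.read_text())
    return dict(DEFAULT_STATE)
```

A production system would write the snapshot to power-safe storage and debounce the writes, but the restore path really is this simple, which is what makes it compatible with a sub-second boot.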

Fastboot in 0.8 seconds means that when you’re ready to reverse, your car is ready to show you the way.

Interested in learning more about the QNX CAR Platform for Infotainment? Check out Paul Leroux’s blog on the architecture of this sophisticated piece of software. To see QNX CAR in action, read Tina Jeffrey’s blog, in which she talks about how the platform was implemented in the reimagined QNX reference vehicle for CES 2015.

Check out the video here:


Wednesday, March 4, 2015

“What do you mean, I have to learn how not to drive?”

The age of autonomous driving lessons is upon us.

Paul Leroux
What would it be like to ride in an autonomous car? If you were to ask the average Joe, he would likely describe a scenario in which he sips coffee, plays video games, and spends quality time with TSN while the car whisks him to work. The average Jane would, no doubt, provide an equivalent answer. The problem with this scenario is that autonomous doesn’t mean driverless. Until autonomous vehicles become better than humans at handling every potential traffic situation, drivers will have to remain alert much or all of the time, even if their cars do 99.9% of the driving for them.

Otherwise, what happens when a car, faced with a situation it can’t handle, suddenly cedes control to the driver? Or what happens when the car fails to recognize a pedestrian on the road ahead?

Of course, it isn’t easy to maintain a high level of alertness while doing nothing in particular. It takes a certain maturity of mind, or at least a lack of ADD. Which explains why California, a leader in regulations for autonomous vehicles, imposes restrictions on who is allowed to “drive” them. Prerequisites include a near-spotless driving record and more than 10 years without a DUI conviction. Drivers must also complete an autonomous driving program, the length of which depends on the car maker or automotive supplier in question. According to a recent investigation by IEEE Spectrum, Google offers the most comprehensive program — it lasts five weeks and subjects drivers to random checks.

1950s approach to improving driver alertness. Source: Modern Mechanix blog

In effect, drivers of autonomous cars have to learn how not to drive. And, as another IEEE article suggests, they may even need a special license.

Ample warnings
Could an autonomous car mitigate the attention issue? Definitely. It could, for example, give the driver ample warning before he or she needs to take over. The forward collision alerts and other informational ADAS functions in the latest QNX technology concept car offer a hint as to how such warnings could operate. For the time being, however, it’s hard to imagine an autonomous car that could always anticipate when it needs to cede control. Until then, informational ADAS will serve as an adjunct, not a replacement, for eyes, ears, and old-fashioned attentiveness.

Nonetheless, research suggests that adaptive cruise control and other technologies that enable autonomous or semi-autonomous driving can, when compared to human drivers, do a better job of avoiding accidents and improving traffic flow. To quote my friend Andy Gryc, autonomous cars would be more “polite” to other vehicles and be better equipped to negotiate inter-vehicle space, enabling more cars to use the same length of road.

Fewer accidents, faster travel times. I could live with that.


2015 approach to improving driver alertness: instrument cluster from the QNX reference vehicle.

Thursday, February 5, 2015

Have you heard about Phantom Intelligence yet?

If you haven’t, I bet you will. Phantom Intelligence is a startup that is looking to revolutionize LiDAR for automotive. I hadn’t heard of them either until QNX and Phantom Intelligence found themselves involved in a university project in 2014. They had some cool technology and are just all-around good guys, so we started to explore how we could work together at CES 2015. One thing led to another and their technology was ultimately featured in both the QNX reference vehicle and the new QNX technology concept car.

I knew little about LiDAR at the beginning of the partnership. But as I started to ramp up my knowledge I learned that LiDAR can provide valuable sensor input into ADAS systems. Problem is, LiDAR solutions are big, expensive, and have not, for the most part, provided the kind of sensitivity and performance that automakers look for.

Phantom Intelligence is looking to change all this with small, cost-effective LiDAR systems that can detect not just metal, but also people (handy if you are crossing the street and left your Tin Man costume at home) and that are impervious to inclement weather. As a frequent pedestrian this is all music to my ears.

I am still in no way qualified to offer an intelligent opinion on the pros and cons of competing LiDAR technologies, so I’m just going on the positive feedback I heard at CES from customers and other suppliers in the ADAS space. Phantom turned out to be one of the surprise hits this year, and they are just getting started. That’s why I think you’ll hear more about them soon.


Both QNX vehicles showcased at CES 2015 use a LiDAR system from Phantom Intelligence to detect obstacles on the road ahead.

Tuesday, January 20, 2015

Driving simulators at CES

CES was just 15 minutes from closing when I managed to slip away from the very busy QNX booth to try out an F1 simulator. Three screens, 6 degrees of freedom, and surround sound came together for the most exciting simulated driving experience I have ever had. I was literally shaking when they dragged me out of the driver’s seat (I didn’t want to stop :-). Mind you, at around $80K for the system, it seems unlikely I will ever own one.

The experience got me thinking about the types of vehicles currently in simulation or in the lab that I fully expect to drive in my lifetime: cars that are virtually impossible to crash, cars that make it painless to travel long distances, and, ultimately, cars that worry about traffic jams so I can read a book.

Re-incarnated: The QNX reference vehicle.

QNX Software Systems had a very popular simulator of its own at CES this year. You may have seen some details on it already, but to recap: it is a new incarnation of our trusty QNX reference vehicle, extended to demonstrate ADAS capabilities. We parked it in front of a 12-foot display and used video footage captured on California’s fabled Highway 1 to provide the closest thing to real-world driving we could create.

The resulting virtual drive showcased the capabilities not only of QNX technology, but of our ecosystem as well. Using the video footage, we provided camera inputs to Itseez’ computer vision algorithms to demonstrate a working example of lane departure warning and traffic sign recognition. By capturing GPS data synchronized with the video footage, and feeding the result through Elektrobit’s Electronic Horizon Solution, we were able to generate curve speed warnings. All this was running on automotive-grade Jacinto 6 silicon from Texas Instruments. LiDAR technology from Phantom Intelligence rounded out the offering by providing collision feedback to the driver.
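Elektrobit’s Electronic Horizon logic is proprietary, but the arithmetic at the heart of a curve speed warning is simple physics: lateral acceleration in a curve is v²/r, so the curve radius from the map data bounds the safe speed. A hedged sketch, where the 3 m/s² comfort limit is an assumed tuning value, not a figure from the demo:

```python
import math

def safe_curve_speed(radius_m: float, max_lateral_accel: float = 3.0) -> float:
    """Highest speed (m/s) that keeps lateral acceleration
    v**2 / r at or below the comfort limit."""
    return math.sqrt(max_lateral_accel * radius_m)

def curve_speed_warning(speed_ms: float, radius_m: float,
                        max_lateral_accel: float = 3.0) -> bool:
    """True when the driver should slow down for the upcoming curve."""
    return speed_ms > safe_curve_speed(radius_m, max_lateral_accel)
```

For a 150 m curve this gives roughly 21 m/s (about 76 km/h); anything faster triggers the warning.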

The lane departure and curve speed warnings in action. Screen-grab from video by Embedded Computing Design.

Meeting the challenge
While at CES, I also had the opportunity to meet with companies that are working to make advanced ADAS systems commercially viable. Phantom Intelligence is one example but I was also introduced to companies that can provide thermal imaging systems and near-infrared cameras at a fraction of what these technologies cost today.

These are all examples of how the industry is rising up to meet the challenge of safer, more autonomous vehicles at a price point that allows for widespread adoption in the foreseeable future. Amazing stuff, really — we are finally entering the era of the Jetsons.

By the way, I can’t remember what booth I was in when I drove the simulator. But I’m willing to bet that the people who experienced the Jeep at CES will remember they were in the QNX booth, seeing technology from QNX and its key partners in this exciting new world.

Wednesday, January 7, 2015

Now with ADAS: The revamped QNX reference vehicle

Tina Jeffrey
Since 2012, our Jeep has showcased what QNX technology can do out of the box. We decided it was time to up the ante...

I walked into the QNX garage a few weeks ago and did a double take. The QNX reference vehicle, a modified Jeep Wrangler, had undergone a major overhaul both inside and out — and just in time for 2015 CES.

Before I get into the how and why of the Jeep’s metamorphosis, here’s a glimpse of its newly refreshed exterior. Orange is the new gray!



The Jeep debuted in June 2012 at Telematics Detroit. Its purpose: to show how customers can use off-the-shelf QNX products, like the QNX CAR Platform for Infotainment and QNX OS, to build a wide range of custom infotainment systems and instrument clusters, using a single code base.

From day one, the Jeep has been a real workhorse, making appearances at numerous events to showcase the latest HMI, navigation, speech recognition, multimedia, and handsfree acoustics technologies, not to mention embedded apps for parking, internet radio streaming, weather, and smartphone connectivity. The Jeep has performed dependably time and time again, and now, in an era where automotive safety is top of mind, we’ve decided to up the ante and add leading-edge ADAS technology built on the QNX OS.

After all, what sets the QNX OS apart is its proven track record in safety-certified systems across market segments — industrial, medical, and automotive. In fact, the QNX OS for Automotive Safety is certified to the highest level of automotive functional safety: ISO 26262, ASIL D. Using a pre-certified OS component is key to the overall integrity of an automotive system and makes system certification much easier.

The ultimate (virtual) driving experience
What better way to showcase ADAS in the Jeep than with a virtual drive? At CES, a 12-foot video screen in front of the Jeep plays a pre-recorded driving scene, while the onboard ADAS system analyzes the scene to detect lane markers, speed signs, and preceding vehicles, and to warn of unintentional lane departures, excessive speed, and imminent collisions with vehicles on the road ahead. Onboard computer vision algorithms from Itseez process the image frames in real time to perform these functions simultaneously.
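Itseez’s actual algorithms aren’t shown here, but once the lane markings have been located in an image frame, the departure decision itself reduces to simple geometry. A purely illustrative sketch, where the pixel coordinates and the 0.8 threshold are assumptions:

```python
def lane_offset(left_x: float, right_x: float, vehicle_x: float) -> float:
    """Vehicle position within the lane, normalized so that 0.0 is
    the lane center, -1.0 the left marking, +1.0 the right marking."""
    center = (left_x + right_x) / 2.0
    half_width = (right_x - left_x) / 2.0
    return (vehicle_x - center) / half_width

def departure_warning(left_x: float, right_x: float, vehicle_x: float,
                      threshold: float = 0.8) -> bool:
    """Warn once the vehicle drifts past `threshold` of the half lane width."""
    return abs(lane_offset(left_x, right_x, vehicle_x)) > threshold
```

The hard part, of course, is the vision pipeline that finds `left_x` and `right_x` reliably at speed; the warning logic that drives the haptic feedback and cluster display is the easy last step.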

Here’s a scene from the virtual drive, in which the ADAS system is tracking lane markings and has detected a speed-limit sign:



If the vehicle begins to drift outside a lane, the steering wheel provides haptic feedback and the cluster displays a warning:



The ADAS system includes EB Assist eHorizon, which uses map data with curve-speed information to provide warnings and recommendations, such as reducing your speed to navigate an upcoming curve:



The Jeep also has a LiDAR system from Phantom Intelligence (formerly Aerostar) to detect obstacles on the road ahead. The cluster displays warnings from this system, as well as warnings from the vision-based collision-detection feature. For example:



POSTSCRIPT:
Here’s a short video of the virtual drive, taken at CES by Brandon Lewis of Embedded Computing Design, in which you can see curve-speed warnings and lane-departure warnings:



Fast-boot camera
Rounding out the ADAS features is a rear-view camera demo that can cold boot in 0.8 seconds on a Texas Instruments Jacinto 6 processor. As you may recall, NHTSA has mandated that, by May 2018, most new vehicles must have rear-view technology that can display a 10-by-20 foot area directly behind the vehicle; moreover, the display must appear no more than 2 seconds after the driver throws the vehicle into reverse. Backup camera and other fastboot requirements such as time-to-last-mode audio, time-to-HMI visible, and time-to-fully-responsive HMI are critically important to automakers. Be sure to check out the demo — but don’t blink or you’ll miss it!

Full-featured infotainment
The head unit includes a full-featured infotainment system based on the QNX CAR Platform for Infotainment and sends information such as the weather, the current song, and turn-by-turn directions to the instrument cluster, where it’s easier for the driver to see.



Infotainment features include:

Qt-based HMI — Can integrate other HMI technologies, including EB Guide and Crank Storyboard.

Natural language processing (NLP) — Uses Nuance’s Vocon Hybrid solution in concert with the QNX NLP technology for natural interaction with infotainment functions. For instance, if you ask “Will I need a jacket later today?”, the Weather Network app will launch and provide the forecast.

EB street director — Provides embedded navigation with a 3D map engine; the map is synched up with the virtual drive during the demo.

QNX CAR Platform multimedia engine — An automotive-hardened solution that can handle:
  • audio management for seamless transitions between all audio sources
  • media detection and browsing of connected devices
  • background synching of music for instant media playback — without the need for the synch to be completed

Support for all smartphone connectivity options — DLNA, MTP, MirrorLink, Bluetooth, USB, Wi-Fi, etc.

On-board application framework — Supports Qt, HTML5, APK (for Android apps), and native OpenGL ES apps. Apps include iHeart, Parkopedia, Pandora, Slacker, and Weather Network, as well as a Settings app for phone pairing, over-the-air software updates, and Wi-Fi hotspot setup.
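Nuance’s VoCon Hybrid engine and the QNX NLP technology are far more sophisticated than this, but the routing step, mapping a free-form utterance like “Will I need a jacket later today?” to the Weather Network app, can be illustrated with a toy keyword-based intent matcher. The intent table below is entirely made up for illustration:

```python
# Toy intent table; a real NLP system uses trained language models,
# not keyword lists.
INTENTS = {
    "weather": {"jacket", "umbrella", "rain", "forecast", "weather"},
    "navigation": {"navigate", "directions", "airport", "route"},
    "media": {"play", "song", "playlist", "pause"},
}

def route_utterance(utterance: str) -> str:
    """Pick the intent whose keyword set best overlaps the utterance."""
    words = set(utterance.lower().replace("?", "").split())
    best = max(INTENTS, key=lambda intent: len(words & INTENTS[intent]))
    if not words & INTENTS[best]:
        return "unknown"
    return best
```

Once an intent is chosen, the platform launches the matching app (the Weather Network app, in the jacket example) and hands it the parsed request.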

So if you’re in the North Hall at CES this week, be sure to take a virtual ride in the QNX reference vehicle in Booth 2231. Beneath the fresh paint job, it’s the same workhorse it has always been, but now with new ADAS tech automakers are thirsting for.

Wednesday, December 17, 2014

One day I’ll be Luke Skywalker

Cyril Clocher
What happens when you blend ADAS with infotainment? Guest post by Cyril Clocher, business manager for automotive processors at Texas Instruments

As we all begin preparing for our trek to Vegas for CES 2015, I would like my young friends (born in the 70s, of course) to reflect on their impressions of the first episode of Lucas’s trilogy back in 1977. For my part, I remember perfectly thinking that one day I would be Luke Skywalker.

Young boys and girls were amazed by this epic space opera, and particularly by the technologies our heroes used to fight the Galactic Empire. You have to remember, it was an era when we still used rotary phones and GPS was in its infancy. So you can imagine how impactful it was for us to see our favorite characters using wireless electronic gadgets with revolutionary HMIs such as natural voice recognition, gesture control, and touch screens; droids speaking and enhancing human intelligence; and autonomous vehicles traveling the galaxy safely while playing chess with a Wookiee. Now you’re with me…

But instead of becoming Luke Skywalker a lot of us realized that we would have a bigger impact by inventing or engineering these technologies and by transforming early concepts into real products we all use today. As a result, smartphones and wireless connectivity are now in our everyday lives; the Internet of Things (IoT) is getting more popular in applications such as activity trackers that monitor personal metrics; and our kids are more used to touch screens than mice or keyboards, and cannot think of on-line gaming without gesture control. In fact, I just used voice recognition to upgrade the Wi-Fi plan from my Telco provider.

But the journey is not over yet. Our generation still has to deliver an autonomous vehicle that is green, safe, and fun to control – I think the word “drive” will be obsolete for such a vehicle.

The automotive industry has taken several steps toward this exciting goal, including the integration of advanced, connected in-car infotainment systems into more models, as well as a number of technologies categorized under Advanced Driver Assistance Systems (ADAS) that can create a safer and unique driving experience. For more than a decade, Texas Instruments has invested in infotainment and ADAS: “Jacinto” and TDAx automotive processors, as well as the many analog companion chips supporting these trends.

"Jacinto 6 EP" and "Jacinto 6 Ex" infotainment processors
What makes TI’s approach unique is our ability to leverage the best of both worlds for non-safety-critical features, and to provide seamless integration of informational ADAS functions into existing infotainment systems, so the vehicle better informs and warns the driver. We announced this capability at SAE Convergence in Detroit in October 2014 with the “Jacinto 6 Ex” processor (DRA756), which combines powerful CPU, graphics, multimedia, and radio cores with differentiated vision co-processors, called embedded vision engines (EVEs), and additional DSPs that perform the complex ADAS processing.

For TI’s automotive team, the CES 2015 show is even more exciting than previous years, as we’ve taken our concept of informational ADAS to the next step. Through joint efforts and hard work from both the TI and QNX teams, we’ve implemented a real informational ADAS system running the QNX CAR™ Platform for Infotainment on a “Jacinto 6 Ex” processor.

I could try describing this system in detail, but just like the Star Wars movies, it’s best experienced in person. Contact your TI or QNX representative today and schedule a meeting in our private suite at the TI Village (N115-N119) at CES, where you can immerse yourself in a combined IVI, cluster, megapixel surround view, and DLP®-based HUD demonstration with augmented reality, all running on a single “Jacinto 6 Ex” SoC. And don’t forget to visit the QNX booth (2231), where you can see the QNX reference vehicle running a variety of ADAS and infotainment applications on “Jacinto 6” processors.

Integrated cockpit featuring DLP powered HUD and QNX CAR Platform running on a single “Jacinto 6 Ex” SoC.
One day I’ll experience Skywalker’s life: I will no doubt have the opportunity to control an intelligent, autonomous vehicle with my biometrics, voice, and gestures while riding with my family to the movie theater, playing chess with my grandkids (not yet a Wookiee).

Monday, December 8, 2014

Cast your vote: which CES show car, past or present, should get a makeover at this year’s show?

Lynn Gayowski
2015 CES is only a few weeks away! This year in addition to showcasing a new technology concept car, we'll have some exciting updates to one of our existing vehicles. Before we unveil which vehicle will receive its CES facelift, we want to hear from you.

Starting today, through Monday, January 5, cast your vote on which CES show car, past or present, from QNX Software Systems you would most like to see revamped at this year's show. We will announce the results on Tuesday, January 6 – the first day of the show. Here is our full list of cars:


What will it be — the BMW Z4 Roadster or the Bentley Continental GT? Perhaps it's the LTE Connected Car based on a Toyota Prius or the Kia Soul that we had on display last year?

Let the voting begin!

Wednesday, December 3, 2014

Words to the wise: discover, integrate, trust, and experience

Lynn Gayowski
It's hard to believe that 2015 CES is right around the corner. And like elves in Santa's workshop, we've been hard at work on our awesome show demos — which include a new technology concept car and updates to one of our reference vehicles (more on that later).

At the heart of our CES presence, from our booth theme to show demos, will be four words that encapsulate the key values that QNX Software Systems delivers — discover, integrate, trust, and experience. Each week leading up to CES, we'll highlight one of these words and outline how it relates to the core of QNX Software Systems and its technologies.

We're kicking off the series tomorrow so be sure to check back to read our latest blog post.

Monday, July 21, 2014

The lost concept car photos

Have you ever rummaged through old boxes in your basement and discovered family photos you had totally forgotten about — or never knew existed? I experienced a moment like that a couple of weeks ago. Except, in this case, no basement was involved. And the box wasn't a box, but a shared drive. And the photos weren't of my family, but of cars. QNX technology concept cars, to be exact.

At least once a year, the QNX concept team retrofits a new vehicle to demonstrate how our technology can help auto companies push the envelope in connectivity, infotainment, and acoustics. And, in every case, we take pictures — sometimes, lots of them. Inevitably, we end up choosing a few images for publicity purposes and filing the others. But as I discovered, the images we don't use are often just as good as the ones we do use. We just don't need all of them!

In any case, stumbling across these photos was great fun. I thought you might enjoy them, too, so here goes...

The Porsche
First up is the QNX technology concept car based on a Porsche 911, which made its debut at 2012 CES. We had originally planned to drive the car back to Ottawa once CES was over — but that was before we spoke to our friends at Texas Instruments, who provided the silicon for the car's instrument cluster and infotainment system. They liked the car so much, they asked if we could bring it to their HQ in Dallas, where the following two photos were taken. All I can say is, Dallas is home to at least one awesome photographer. Because rather than curse the crazy lighting, the photographer used it to create some playful compositions:





If you look below, you'll see another shot of the Porsche, taken just before we shipped it off to CES. The image really doesn't belong in this collection, as it appeared once on a partner website. But it's rare nonetheless, so I decided to include it. And besides, it's cool. Literally.



Did you know? The original Porsche 911, which debuted in the early 60s, was dubbed the 901. Problem was, Peugeot had exclusive rights in France to three-digit car names with a 0 in the middle. And so, the 901 became the 911.



The Bentley
Next up is the QNX technology concept car based on a Bentley Continental GT. In this image, the driver is interacting with the center stack's main control knob, which was mounted directly on a 17" touchscreen. See the row of icons just above the knob? These represented HVAC, music, navigation, hands-free calling and other system functions. The system would automatically display these icons as your hand approached the display; you would then turn the knob to choose the function you wanted. (This image was taken by a BlackBerry employee, whose name I have most ungraciously forgotten.)



As with all our concept vehicles, the intent was to showcase the technology that we had built into the car's dashboard and center stack. Which probably explains why the following image of the car's exterior was never published. Pity, as it's quite lovely — a classic case of flare adding flair.



Did you know? Those wheels aren't just for show. The Bentley comes equipped with a 616 hp W12 engine (yup, three banks of cylinders) that can do 0-60 mph in a little over 4 seconds — it took me way longer than that to type this sentence.



The Jeep
Next up is the Jeep Wrangler, which serves as the QNX reference vehicle. The Jeep plays a different role than the other vehicles highlighted here: instead of demonstrating how QNX technology can help automotive companies innovate, it shows what the QNX CAR Platform for Infotainment can do right out of the box. In this image, you can see the vehicle's main navigation menu:



Did you know? The original infotainment system in the reference vehicle could post Facebook updates that listed the title and artist of the song currently playing. The system performed this magic in response to simple voice commands.



The Vette
The QNX technology concept car based on a Chevrolet Corvette made its debut at SAE Convergence 2010. Among other things, it showed how digital instrument clusters can morph on the fly to provide drivers with context-sensitive information, such as turn-by-turn directions. You can see a slicker, more sophisticated approach to reconfigurable clusters in our most recent technology concept car based on a Mercedes CLA45.



Did you know? We used the Corvette to demonstrate how QNX technology enables automotive companies to create customizable, reskinnable user interfaces. Check out this post on the Corvette's 30-day UI challenge.



The Prius
The first QNX-powered technology concept car was a digitally modded Prius — aka the LTE Connected Car. The car was a joint project of several companies, including QNX and Alcatel-Lucent, who wanted to demonstrate how 4G/LTE networks could transform the driving experience with a host of new in-vehicle applications.

Here's the car with a very proud-looking Derek Kuhn, who spearheaded the LTE Connected Car project while serving as a VP at Alcatel-Lucent. Derek subsequently joined QNX as VP of sales and marketing:



Did you know? When this car was created, telecom companies had yet to light up their first commercial LTE towers. Also, the car had more infotainment systems than any other QNX technology concept car: two in the front (one for the driver and one for the front-seat passenger) and two in the back.



Some things get lost, albeit temporarily. And some you just never see again. Fortunately, all these images belong to the first category. Any favorites?

Tuesday, February 4, 2014

Head to the polls and vote for your favorite CES Car of Fame

Over the last couple of months we have recapped the stars of the QNX garage — our technology concept cars and reference vehicle — in the CES Cars of Fame series. And now, we are opening the floor to you!

Starting today through February 14 you can vote for your favorite vehicle that we have featured at CES. Did the eye-catching Bentley strike your fancy or did the updated Jeep put you into another gear? It’s all up to you. We will announce the fan favorite on Tuesday, February 18.

So once again here is the full list of our CES Cars of Fame blog posts. Have one last look and cast your vote:

Cast your vote here.

Wednesday, December 11, 2013

Is this the most jazzed-up Jeep ever to hit CES?

The fourth installment in the CES Cars of Fame series. Our inductee for this week: a Jeep that gets personal.

Paul Leroux
It might not be as hip as the Prius or as fast as the Porsche. But it's fun, practical, and flexible. Better yet, you can drive it just about anywhere. Which makes it the perfect vehicle to demonstrate the latest features of the QNX CAR Platform for Infotainment.

It's called the QNX reference vehicle, and it's been to CES in Las Vegas, as well as to Detroit, New York City, and lots of places in between. It's our go-to vehicle for whenever we want to hit the road and showcase our latest infotainment technology. It even made a guest appearance at IBM's recent Information On Demand 2013 Big Data conference, where it demonstrated the power of connecting cars to the cloud.

The reference vehicle, which is based on a Jeep Wrangler, serves a different purpose than our technology concept cars. Those vehicles take the QNX CAR Platform as a starting point to demonstrate how the platform can help automakers hit new levels of innovation. The reference vehicle plays a more modest, but equally important, role: to show what the platform can do out of the box.

For instance, we updated the Jeep recently to show how version 2.1 of the QNX CAR Platform will allow developers to blend a variety of application and HMI technologies on the same display. In this case, the Jeep's head unit is running a mix of native, HTML5, and Android apps on an HMI built with the Qt application framework:



Getting personal
We also use the Jeep to demonstrate the platform's support for customization and personalization. For instance, here is the first demonstration instrument cluster we created specifically for the Jeep:



And here's a more recent version:



These clusters may look very different, but they share the same underlying features, such as the ability to display turn-by-turn directions, weather updates, and other information provided by the head unit.

Keeping with the theme of personalization, the Jeep also demonstrates how the QNX CAR Platform allows developers to create re-skinnable HMIs. Here, for example, is a radio app in one skin:



And here's the same app in a different skin:



This re-skinnability isn't just cool; it also demonstrates how the QNX CAR Platform can help automotive developers create a single underlying code base and re-use it across multiple vehicle lines. Good, that.

Getting complementary
The Jeep is also the perfect vehicle to showcase the ecosystem of complementary apps and services integrated with the QNX CAR Platform, such as the (very cool) street director navigation system from Elektrobit:



To return to the question, is this really the most jazzed-up Jeep to hit CES? Well, it will be making a return trip to CES in just a few weeks, with a whole new software build. So if you're in town, drop by and let us know what you think.

Wednesday, October 30, 2013

My top moments of 2013 — so far

Paul Leroux
Yes, I know, 2013 isn’t over yet. But it’s been such a milestone year for our automotive business that I can’t wait another two months to talk about it. And besides, you’ll be busy as an elf at the end of December, visiting family and friends, skiing the Rockies, or buying exercise equipment to compensate for all those holiday carbs. Which means if I wait, you’ll never get to read this. So let’s get started.


We unveil a totally new (and totally cool) technology concept car
Times Square. We were there.
It all began at 2013 CES, when we took the wraps off the latest QNX technology concept car — a one-of-a-kind Bentley Continental GT. The QNX concept team outfitted the Bentley with an array of technologies, including a high-definition DLP display, a 3D rear-view camera, cloud-based voice recognition, smartphone connectivity, and… oh heck, just read the blog post to get the full skinny.

Even if you weren’t at CES, you could still see the car in action. Brian Cooley of CNET, Michael Guillory of Texas Instruments, the folks at Elektrobit, and Discovery Canada’s Daily Planet were just some of the individuals and organizations who posted videos. You could also connect to the car through a nifty web app. Heck, you could even see the Bentley’s dash on the big screen in Times Square, thanks to the promotional efforts of Elektrobit, who also created the 3D navigation software for the concept car.

We ship the platform
We wanted to drive into CES with all cylinders firing, so we also released version 2.0 of the QNX CAR Platform for Infotainment. In fact, several customers in the U.S., Germany, Japan, and China had already started to use the platform, through participation in an early access program. Which brings me to the next milestone...

Delphi boards the platform
The first of many.
Also at CES, Delphi, a global automotive supplier and long-time QNX customer, announced that version 2.0 of the QNX CAR Platform will form the basis of its next-generation infotainment systems. As it turned out, this was just one of several QNX CAR customer announcements in 2013 — but I’m getting ahead of myself.

We have the good fortune to be featured in Fortune
Fast forward to April, when Fortune magazine took a look at how QNX Software Systems evolved from its roots in the early 1980s to become a major automotive player. Bad news: you need a subscription to read the article on the Fortune website. Good news: you can read the same article for free on CNN Money. ;-)

A music platform sets the tone for our platform
In April, 7digital, a digital music provider, announced that it will integrate its 23+ million track catalogue with the QNX CAR Platform. It didn't take long for several other partners to announce their platform support. These include Renesas (R-Car system-on-chip for high-performance infotainment), AutoNavi (mobile navigation technology for the Chinese market), Kotei (navigation engine for the Japanese market), and Digia (Qt application framework).

We stay focused on distraction
Back in early 2011, Scott Pennock of QNX was selected to chair an ITU-T focus group on driver distraction. The group’s objective was serious and its work was complex, but its ultimate goal was simple: to help reduce collisions. This year, the group wrapped up its work and published several reports — but really, this is only the beginning of QNX and ITU-T efforts in this area.

We help develop a new standard
Goodbye fragmentation; hello
standard APIs.
Industry fragmentation sucks. It means everyone is busy reinventing the wheel when they could be inventing something new instead. So I was delighted to see my colleague Andy Gryc become co-chair of the W3C Automotive and Web Platform Business Group, which has the mandate to accelerate the adoption of web technologies in the car. Currently, the group is working to draft a standard set of JavaScript APIs for accessing vehicle data. Fragmentation, thy days are numbered.
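To make the idea concrete, here is a minimal sketch of what a standardized vehicle-data API could look like from an app developer's point of view. The identifiers below (`VehicleData`, `subscribe`, `publish`, `'vehicleSpeed'`) are purely illustrative assumptions, not names from the W3C group's draft; the point is simply that apps subscribe to named signals instead of talking to vendor-specific buses.

```javascript
// Hypothetical sketch of a subscribe/publish vehicle-data interface.
// All names here are invented for illustration — not the W3C draft API.
class VehicleData {
  constructor() {
    this.listeners = new Map(); // signal name -> array of callbacks
  }
  // An app registers interest in a named vehicle signal.
  subscribe(signal, callback) {
    if (!this.listeners.has(signal)) this.listeners.set(signal, []);
    this.listeners.get(signal).push(callback);
  }
  // In a real head unit, new values would arrive from the vehicle bus;
  // here we publish them by hand to show the flow.
  publish(signal, value) {
    (this.listeners.get(signal) || []).forEach((cb) => cb(value));
  }
}

const vehicle = new VehicleData();
const readings = [];
vehicle.subscribe('vehicleSpeed', (v) => readings.push(v));
vehicle.publish('vehicleSpeed', 88);
console.log(readings); // [ 88 ]
```

With a shared signal vocabulary like this, the same app code could run unchanged on any compliant head unit — which is exactly the fragmentation problem the group set out to solve.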

We launch an auto safety program
A two-handed approach to
helping ADAS developers.
On the one hand, we have a 30-year history in safety-critical systems and proven competency in safety certifications. On the other hand, we have deep experience in automotive software design. So why not join both hands together and allow auto companies to leverage our full expertise when they are building digital instrument clusters, advanced driver assistance systems (ADAS), and other in-car systems with safety requirements?

That’s the question we asked ourselves, and the answer was the new QNX Automotive Safety Program for ISO 26262. The program quickly drew support from several industry players, including Elektrobit, Freescale, NVIDIA, and Texas Instruments.

We jive up the Jeep
A tasty mix of HTML5 & Android
apps, served on a Qt interface,
with OpenGL ES on the side.
If you don’t already know, we use a Jeep Wrangler as our reference vehicle — basically, a demo vehicle outfitted with a stock version of the QNX CAR Platform. This summer, we got to trick out the Jeep with a new, upcoming version of the platform, which adds support for Android apps and for user interfaces based on the Qt 5 framework.

Did I mention? The platform runs Android apps in a separate application container, much like it handles HTML5 apps. This sandboxed approach keeps the app environment cleanly partitioned from the UI, protecting both the UI and the overall system from unpredictable web content. Good, that.

The commonwealth’s leader honors our leader
I only ate one piece. Honest.
Okay, this one has nothing to do with automotive, but I couldn’t resist. Dan Dodge, our CEO and co-founder, received a Queen Elizabeth II Diamond Jubilee Medal in recognition of his many achievements and contributions to Canadian society. To celebrate, we gave Dan a surprise party, complete with the obligatory cake. (In case you’re wondering, the cake was yummy. But any rumors suggesting that I went back for a second, third, and fourth piece are total fabrications. Honestly, the stories people cook up.)

Mind you, Dan wasn’t the only one to garner praise. Sheridan Ethier, the manager of the QNX CAR development team, was also honored — not by the queen, but by the Ottawa Business Journal for his technical achievements, business leadership, and community involvement.

Chevy MyLink drives home with first prize — twice
There's nothing better than going home with first prize. Except, perhaps, doing it twice. In January, the QNX-based Chevy MyLink system earned a Best of CES 2013 Award in the car tech category. And in May, it pulled off another coup: first place in the "Automotive, LBS, Navigation & Safe Driving" category of the 2013 CTIA Emerging Technology (E-Tech) Awards.

Panasonic, Garmin, and Foryou get with the platform
Garmin K2 platform: because
one great platform deserves
another.
August was crazy busy — and crazy good. Within the space of two weeks, three big names in the global auto industry revealed that they’re using the QNX CAR Platform for their next-gen systems. Up first was Panasonic, who will use the platform to build systems for automakers in North America, Europe, and Japan. Next was Foryou, who will create infotainment systems for automakers in China. And last was Garmin, who is using the platform in the new Garmin K2, the company’s infotainment solution for automotive OEMs.

And if all that wasn’t cool enough…

Mercedes-Benz showcases the platform
Did I mention I want one?
When Mercedes-Benz decides to wow the crowds at the Frankfurt Motor Show, it doesn’t settle for second best. Which is why, in my not so humble opinion, they chose the QNX CAR Platform for the oh-so-desirable Mercedes-Benz Concept S-Class Coupé.

Mind you, this isn’t the first time QNX and Mercedes-Benz have joined forces. In fact, the QNX auto team and Mercedes-Benz Research & Development North America have collaborated since the early 2000s. Moreover, QNX has supplied the OS for a variety of Mercedes infotainment systems. The infotainment system and digital cluster in the Concept S-Class Coupé are the latest — and arguably coolest — products of this long collaboration.

We create noise to eliminate noise
Taking a sound approach to
creating a quieter ride.
Confused yet? Don’t be. You see, it’s quite simple. Automakers today are using techniques like variable cylinder management, which cut fuel consumption (good), but also increase engine noise (bad). Until now, car companies have been using active noise control systems, which play “anti-noise” to cancel out the unwanted engine sounds. All fine and good, but these systems require dedicated hardware — and that makes them expensive. So we devised a software product, QNX Acoustics for Active Noise Control, that not only outperforms conventional solutions, but can run on the car’s existing audio or infotainment hardware. Goodbye dedicated hardware, hello cost savings.
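The "anti-noise" trick itself is easy to see in a few lines of code. The sketch below is only an idealized illustration of the principle — not QNX's algorithm, which must adapt in real time to imperfect estimates of the engine noise. A pure tone stands in for an engine order; the anti-noise is the same tone phase-inverted, and the two sum to silence at the listener's ear.

```javascript
// Idealized anti-noise demonstration (illustrative only).
const SAMPLE_RATE = 8000; // Hz
const SAMPLES = 64;

// Simulated engine noise: a 100 Hz tone.
const noise = [];
for (let n = 0; n < SAMPLES; n++) {
  noise.push(Math.sin((2 * Math.PI * 100 * n) / SAMPLE_RATE));
}

// Anti-noise: the same signal shifted 180 degrees (i.e., negated).
const antiNoise = noise.map((s) => -s);

// What the listener hears: noise plus anti-noise.
const residual = noise.map((s, n) => s + antiNoise[n]);
const peak = Math.max(...residual.map(Math.abs));
console.log(peak); // 0 — perfect cancellation in this ideal case
```

Real systems never have a perfect inverse to work with, which is why production ANC relies on adaptive filtering — and why doing it well in software, on existing hardware, is a genuine engineering feat.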

And we flub our lines on occasion
Our HTML5 video series has given companies like Audi, OnStar, Gartner, TCS, and Pandora a public forum to discuss why HTML5 and other open standards are key to the future of the connected car. The videos are filled with erudite conversation, but every now and then, it becomes obvious that sounding smart in front of a camera is a little harder than it looks. So what did we do with the embarrassing bits? Create a blooper reel, of course.

Are these bloopers our greatest moments? Nope. Are they among the funniest? Oh yeah. :-)