
Wednesday, March 11, 2015

Long time, no see: Catching up with the QNX CAR Platform

By Megan Alink, Director of Marketing Communications for Automotive

It’s a fact — a person simply can’t be in two places at one time. I can’t, you can’t, and the demo team at QNX can’t (especially when they’re brainstorming exciting showcase projects for 2016… but that’s another blog. Note to self.) So what’s a QNX-loving, software-admiring, car aficionado to do when he or she has lost touch and wants to see the latest on the QNX CAR Platform for Infotainment? Video, my friends.

One of the latest additions to our QNX Cam YouTube channel is an update to a video made just over two and a half years ago, in which my colleague, Sheridan Ethier, took viewers on a feature-by-feature walkthrough of the QNX CAR Platform. Now, Sheridan’s back for another tour, so sit back and enjoy a good, old-fashioned catch-up with what’s been going on with our flagship automotive product (with time references, just in case you’re in a bit of a hurry).

Sheridan Ethier hits the road in the QNX reference vehicle based on a modified Jeep Wrangler, running the latest QNX CAR Platform for Infotainment.

We kick things off with a look at one of the most popular elements of an infotainment system — multimedia. Starting around the 01:30 mark, Sheridan shows how the QNX CAR Platform supports a variety of music formats and media sources, from the system’s own multimedia player to a brought-in device. And when your passenger is agitating to switch from the CCR playlist on your MP3 device to Meghan Trainor on her USB music collection, the platform’s fast detection and sync time means you’ll barely miss a head-bob.

The QNX CAR Platform’s native multimedia player — the “juke box” — is just one of many options for enjoying your music.

About five minutes in, we take a look at how the QNX CAR Platform implements voice recognition. Whether you’re seeking out a hot latte, navigating to the nearest airport, or calling a co-worker to say you’ll be a few minutes late, the QNX CAR Platform lets you do what you want to do while doing what you need to do — keeping your hands on the wheel and your eyes on the road. Don’t miss a look at concurrency (previously discussed here by Paul Leroux) during this segment, when Sheridan runs the results of his voice commands (multimedia, navigation, and a hands-free call) smoothly at the same time.

Using voice recognition, users can navigate to a destination by address or point of interest description (such as an airport).

At eight minutes, Sheridan tells us about one of the best examples of the flexibility of the QNX CAR Platform — its support for application environments, including native C/C++, Qt, HTML5, and APK for running Android applications. The platform’s audio management capability makes a cameo appearance when Sheridan switches between the native multimedia player and the Pandora HTML5 app.

Pandora is just one of the HTML5 applications supported by the QNX CAR Platform.

As Sheridan tells us (at approximately 12:00), the ability to project smartphone screens and applications into the vehicle is an important trend in automotive. With technologies like MirrorLink, users can access nearly all of the applications available on their smartphone right from the head unit.

Projection technologies like MirrorLink allow automakers to select which applications will be delivered to the vehicle’s head unit from the user’s connected smartphone. 

Finally, we take a look at two interesting features that differentiate the QNX CAR Platform — last mode persistence (e.g., the song that was playing when you turned the car off resumes at the same point when you turn the car back on) and fastboot (which, in the case of QNX CAR, can bring your backup camera to life in 0.8 seconds, far less than the NHTSA-mandated 2 seconds). These features work hand-in-hand to ensure a safer, more enjoyable, more responsive driving experience.

Fastboot in 0.8 seconds means that when you’re ready to reverse, your car is ready to show you the way.

Interested in learning more about the QNX CAR Platform for Infotainment? Check out Paul Leroux’s blog on the architecture of this sophisticated piece of software. To see QNX CAR in action, read Tina Jeffrey’s blog, in which she talks about how the platform was implemented in the reimagined QNX reference vehicle for CES 2015.

Check out the video here:


Wednesday, March 4, 2015

“What do you mean, I have to learn how not to drive?”

The age of autonomous driving lessons is upon us.

Paul Leroux
What would it be like to ride in an autonomous car? If you were to ask the average Joe, he would likely describe a scenario in which he sips coffee, plays video games, and spends quality time with TSN while the car whisks him to work. The average Jane would, no doubt, provide an equivalent answer. The problem with this scenario is that autonomous doesn’t mean driverless. Until autonomous vehicles become better than humans at handling every potential traffic situation, drivers will have to remain alert much or all of the time, even if their cars do 99.9% of the driving for them.

Otherwise, what happens when a car, faced with a situation it can’t handle, suddenly cedes control to the driver? Or what happens when the car fails to recognize a pedestrian on the road ahead?

Of course, it isn’t easy to maintain a high level of alertness while doing nothing in particular. It takes a certain maturity of mind, or at least a lack of ADD. Which explains why California, a leader in regulations for autonomous vehicles, imposes restrictions on who is allowed to “drive” them. Prerequisites include a near-spotless driving record and more than 10 years without a DUI conviction. Drivers must also complete an autonomous driving program, the length of which depends on the car maker or automotive supplier in question. According to a recent investigation by IEEE Spectrum, Google offers the most comprehensive program — it lasts five weeks and subjects drivers to random checks.

1950s approach to improving driver alertness. Source: Modern Mechanix blog

In effect, drivers of autonomous cars have to learn how not to drive. And, as another IEEE article suggests, they may even need a special license.

Ample warnings
Could an autonomous car mitigate the attention issue? Definitely. It could, for example, give the driver ample warning before he or she needs to take over. The forward collision alerts and other informational ADAS functions in the latest QNX technology concept car offer a hint as to how such warnings could operate. For the time being, however, it’s hard to imagine an autonomous car that could always anticipate when it needs to cede control. Until then, informational ADAS will serve as an adjunct, not a replacement, for eyes, ears, and old-fashioned attentiveness.

Nonetheless, research suggests that adaptive cruise control and other technologies that enable autonomous or semi-autonomous driving can, when compared to human drivers, do a better job of avoiding accidents and improving traffic flow. To quote my friend Andy Gryc, autonomous cars would be more “polite” to other vehicles and be better equipped to negotiate inter-vehicle space, enabling more cars to use the same length of road.

Fewer accidents, faster travel times. I could live with that.


2015 approach to improving driver alertness: instrument cluster from the QNX reference vehicle.

Monday, January 5, 2015

To infotainment... and beyond! First look at new QNX technology concept car

The new car delivers everything you’d expect in a concept vehicle from QNX. But the real buzz can be summarized in a four-letter word: ADAS

The technology in today's cars is light-years ahead of the technology in cars 10 or 20 years ago. The humans driving those cars, however, have changed little in the intervening years. They still need to focus on a host of mundane driving tasks, from checking blind spots and monitoring road signs to staying within the lane and squeezing into parking spaces. In fact, with all the technology now in the car, including a variety of brought-in devices, some drivers suffer from information overload and perform worse, instead of better, at these crucial tasks.

Advanced driver assistance systems, or ADAS, can go a long way to offset this problem. They come in a variety of shapes and sizes — from drowsiness monitoring to autonomous emergency braking — but most share a common goal: to help the driver avoid accidents.

Which brings us to the new QNX technology concept car. As you’d expect, it includes all the advanced infotainment features, including smartphone connectivity and rich app support, offered by the QNX CAR Platform for Infotainment. But it also integrates an array of additional technologies — including cameras, LiDAR, ultrasonic sensors, and specialized navigation software — to deliver ADAS capabilities that simplify driving tasks, warn of possible collisions, and enhance overall driver awareness.

Mind you, the ADAS features shouldn’t come as any more of a surprise than the infotainment features. After all, QNX Software Systems also offers the QNX OS for Automotive Safety, a solution based on decades of experience in safety-critical systems and certified to ISO 26262, Automotive Safety Integrity Level D — the highest level achievable.

Okay, enough blather. Time to check out the car!

The “I want that” car
If the trident hasn’t already tipped you off, the new technology concept car is based on a Maserati QuattroPorte GTS. I won’t say much about the car itself, except I want one. Did I say want? Sorry, I meant lust. Because omigosh:



The differentiated dash
Before we run through the car’s many features, let’s stop for a minute and savor the elegant design of its QNX-powered digital instrument cluster and infotainment system. To be honest, I have an ulterior motive for sharing this image: if you compare the systems shown here to those of previous QNX technology concept cars (here, here, and here), you’ll see that they each project a distinct look-and-feel. Automakers need to differentiate themselves, and, as a group, these cars illustrate how the flexibility of the QNX platform enables unique, branded user experiences:



The multi-talented digital instrument cluster
Okay, let’s get behind the wheel and test out the digital cluster. Designed to heighten driver awareness, the cluster can show the current speed limit, display an alert if you exceed the limit, and even recommend an appropriate speed for upcoming curves. Better yet, it can display turn-by-turn directions provided by the car’s infotainment system.

Normally, the cluster displays the speed limit in a white circle. But in this image, the cluster displays it in red, along with a red bar to show how much you are over the limit — a gentle reminder to ease off the gas:



Using LiDAR input, the cluster can also warn of obstacles on the road ahead:



And if that’s not enough, the cluster provides intelligent parking assist to help you back into tight spaces. Here, for example, is an impromptu image we took in the QNX garage. The blue-and-yellow guidelines represent the car’s reverse trajectory, and the warning on the right says that you are about to run over an esteemed member of the QNX concept team!



The rear- and side-view mirrors that aren’t really mirrors
By their very nature, car mirrors have blind spots. To address this problem, the QNX concept team has transformed the car’s rear- and side-view mirrors into video displays that offer a complete view of the scene behind and to the sides of the vehicle. As you can see in this image, the side-view displays can also show a red overlay to warn of cars, bikes, people, or anything else approaching the car’s blind zones:



The ADAS display for enhancing obstacle awareness
I don’t have pictures yet, but the car also includes an innovative LED-based display that lets you gauge the direction and proximity of objects to the front, rear, and sides of the vehicle — without having to take your eyes off the road. Stretching the width of the dash, the display integrates input from the car’s ultrasonic and LiDAR sensors to provide a centralized view of ADAS warnings.

The easy-to-use infotainment system
To demonstrate the capabilities of the QNX CAR™ Platform for Infotainment, we’ve outfitted the car with a feature-rich, yet intuitive, multimedia head unit. For instance, see the radio tuner in the following image? That’s no ordinary tuner. To change channels, you can just swipe across the display; if your swipe isn’t perfectly accurate, the radio will automatically zero in on the nearest station or preset.

Better yet, the radio offers “iHeart drive anywhere radio.” If you drive out of range of your favorite AM/FM radio station, the system will detect the problem and automatically switch to the corresponding digital iHeartRadio station. How cool is that?



Other infotainment features include:
  • Natural voice recognition — For instance, if you say “It’s way too cold in here,” the HVAC system will respond by raising the heat.
  • Integration with a wide variety of popular smartphones.
  • Support for multiple concurrent app environments, along with a variety of Android and HTML5 apps, as well as an HMI built with the Qt framework.
  • A backseat display that lets passengers control HVAC functions, navigation, song selection, and other infotainment features.

The oh-so-awesome partners
The car is a testament not only to QNX technology, but to the ecosystem of technology partners that provide complementary solutions for QNX customers. Peek under the hood, and you'll find the latest tech from Elektrobit, iHeart, Nuance, Pandora, Parkopedia, Phantom Intelligence, Qualcomm, RealVNC, Rightware, and TE Connectivity.

The other stuff
Do not, for one minute, think that the Maserati is the only attraction in the QNX booth. Far from it. We will also showcase a significantly revamped QNX reference vehicle, outfitted with lane departure warnings, traffic sign recognition, and other ADAS features, as well as the latest version of the QNX CAR Platform — more in an upcoming post.

Visitors to the booth will also have the opportunity to experience:
  • a 3D navigation solution from Aisin AW
  • a digital instrument cluster designed by HI Corporation
  • two QNX CAR Platform demo systems, one powered by a dual-core Intel Atom E3827 processor, the other by an NVIDIA Tegra Visual Computing Module
  • the latest incarnation of the Oscar-winning Flying Cam SARAH aerial camera system


Wednesday, December 17, 2014

One day I’ll be Luke Skywalker

Cyril Clocher
What happens when you blend ADAS with infotainment? Guest post by Cyril Clocher, business manager for automotive processors at Texas Instruments

As we all begin preparing for our trek to Vegas for CES 2015, I would like my young friends (born in the 70s, of course) to reflect on their impressions of the first episode of Lucas’s trilogy back in 1977. For my part, I perfectly remember thinking that one day I would be Luke Skywalker.

Young boys and girls were amazed by this epic space opera, and particularly by the technologies our heroes used to fight the Galactic Empire. You have to remember, it was an era when we still used rotary phones and GPS was in its infancy. So you can imagine how impactful it was for us to see our favorite characters using wireless electronic gadgets with revolutionary HMIs such as natural voice recognition, gesture controls, and touch screens; droids speaking and enhancing human intelligence; and autonomous vehicles traveling the galaxy safely while playing chess with a Wookiee. Now you’re with me…

But instead of becoming Luke Skywalker, a lot of us realized that we would have a bigger impact by inventing or engineering these technologies and by transforming early concepts into real products we all use today. As a result, smartphones and wireless connectivity are now part of our everyday lives; the Internet of Things (IoT) is growing more popular in applications such as activity trackers that monitor personal metrics; and our kids are more used to touch screens than mice or keyboards, and cannot imagine online gaming without gesture control. In fact, I just used voice recognition to upgrade the Wi-Fi plan with my telco provider.

But the journey is not over yet. Our generation still has to deliver an autonomous vehicle that is green, safe, and fun to control – I think the word “drive” will be obsolete for such a vehicle.

The automotive industry has taken several steps toward this exciting goal, including the integration of advanced, connected in-car infotainment systems into more models, as well as a number of technologies categorized under Advanced Driver Assistance Systems (ADAS) that can create a safer, more distinctive driving experience. For more than a decade, Texas Instruments has invested in infotainment and ADAS: “Jacinto” and TDAx automotive processors, as well as the many analog companion chips supporting these trends.

"Jacinto 6 EP" and "Jacinto 6 Ex"
infotainment processor
s
A unique strength of TI is our capability to leverage the best of both worlds for non-safety-critical features, providing seamless integration of informational ADAS functions into existing infotainment systems so the vehicle better informs and warns the driver. We announced that capability at SAE Convergence in Detroit in October 2014 with the “Jacinto 6 Ex” processor (DRA756), which combines powerful CPU, graphics, multimedia, and radio cores with differentiated vision co-processors, called embedded vision engines (EVEs), and additional DSPs that perform the complex ADAS processing.

For TI’s automotive team, CES 2015 is even more exciting than previous years’ shows, as we’ve taken our concept of informational ADAS to the next step. Through the joint efforts and hard work of both the TI and QNX teams, we’ve implemented a real informational ADAS system running the QNX CAR™ Platform for Infotainment on a “Jacinto 6 Ex” processor.

I could try describing this system in detail, but just like the Star Wars movies, it’s best to experience our “Jacinto 6 Ex” and QNX CAR Platform-based system in person. Contact your TI or QNX representative today to schedule a meeting in our private suite at the TI Village (N115-N119) at CES, where you can immerse yourself in a demonstration that combines IVI, cluster, megapixel surround view, and a DLP®-based HUD with augmented reality, all running on a single “Jacinto 6 Ex” SoC. And don't forget to visit the QNX booth (2231), where you can see the QNX reference vehicle running a variety of ADAS and infotainment applications on “Jacinto 6” processors.

Integrated cockpit featuring a DLP-powered HUD and the QNX CAR Platform running on a single “Jacinto 6 Ex” SoC.
One day I’ll experience Skywalker’s life: I will no doubt have the opportunity to control an intelligent, autonomous vehicle with my biometrics, voice, and gestures while riding with my family to the movie theater, playing chess with my grandkids (not yet with a Wookiee).

Wednesday, November 19, 2014

A question of getting there

The third of a series of posts on the QNX CAR Platform. In this installment, we turn to a key point of interest: the platform’s navigation service.

From the beginning, we designed the QNX CAR Platform for Infotainment with flexibility in mind. Our philosophy is to give customers the freedom to choose the hardware platforms, application environments, user-interface tools, and smartphone connectivity protocols that best address their requirements. This same spirit of flexibility extends to navigation solutions.

For evidence, look no further than our current technology concept car. It can support navigation from Elektrobit:



from Nokia HERE:



and from Kotei Informatics:



These are but a few examples. The QNX CAR Platform can also support navigation solutions from companies like AISIN AW, NavNGo, TCS, TeleNav, and ZENRIN DataCom, enabling automakers and automotive Tier 1 suppliers to choose the navigation solution, or solutions, best suited to the regions or demographics they wish to target. (In addition to these embedded solutions, the platform can also provide access to smartphone-based navigation services through its support for MirrorLink and other connectivity protocols — more on this in a subsequent post.)

Under the hood
In our previous installment, we looked at the QNX CAR Platform’s middleware layer, which provides infotainment applications with a variety of services, including Bluetooth, radio, multimedia discovery and playback, and automatic speech recognition. The middleware layer also includes a navigation service that, true to the platform’s overall flexibility, allows developers to use navigation engines from multiple vendors and to change engines without affecting the high-level navigation applications that the user interacts with.

An illustration is in order. If you look at the image below, you’ll see OpenGL-based map data rendered on one graphics layer and, on the layer above it, Qt-based application data (current street, distance to destination, and other route information) pulled from the navigation engine. By taking advantage of the platform’s navigation service, you could swap in a different navigation engine without having to rewrite the Qt application:



To achieve this flexibility, the navigation service makes use of the QNX CAR Platform’s persistent publish/subscribe (PPS) messaging, which cleanly abstracts lower-level services from the higher-level applications they communicate with. Let's look at another diagram to see how this works:



In the PPS model, services publish information to data objects; other programs can subscribe to those objects and receive notifications when the objects have changed. So, for the example above, the navigation engine could generate updates to the route information, and the navigation service could publish those updates to a PPS “navigation status object,” thereby making the updates available to any program that subscribes to the object — including the Qt application.

With this approach, the Qt application doesn't need to know anything about the navigation engine, nor does the navigation engine need to know anything about the Qt app. As a result, either could be swapped out without affecting the other.
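If you're curious how this looks at the code level, here's a minimal sketch of the subscriber side in C. The object path is hypothetical (each system defines its own PPS objects), but the pattern is the platform's: PPS objects appear as files under /pps, and a subscriber simply opens one and reads it.

/* Minimal PPS subscriber sketch. The object path and attribute
 * names are hypothetical; real systems define their own objects. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    /* "?wait,delta" asks PPS to block each read() until the object
     * changes and to return only the attributes that changed. */
    int fd = open("/pps/services/navigation/status?wait,delta", O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    char buf[1024];
    ssize_t n;

    /* Each read() returns plain-text lines such as "@status"
     * followed by attribute lines like "distance_remaining:n:1250". */
    while ((n = read(fd, buf, sizeof(buf) - 1)) > 0) {
        buf[n] = '\0';
        printf("route update:\n%s\n", buf);  /* hand off to the HMI here */
    }

    close(fd);
    return 0;
}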

Here's another example of how this model allows components to communicate with one another:
  1. Using the system's human machine interface (HMI), the driver asks the navigation system to search for a point of interest (POI) — this could take the form of a voice command or a tap on the system display.
  2. The HMI responds by writing the request to a PPS “navigation control” object.
  3. The navigation service reads the request from the PPS object and forwards it to the navigation engine.
  4. The navigation engine returns the result.
  5. The navigation service updates the PPS object to notify the HMI that its request has been completed. It also writes the results to a database so that all subscribers to this object can read the results.
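To make step 2 a little more concrete, here is a rough sketch, in C, of how an HMI might write such a request to the control object. The path and the msg/id/dat attribute names are illustrative assumptions, not the platform's actual schema.

/* Hypothetical write of a POI search request to a PPS control
 * object (step 2 above). Attribute names are illustrative only. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/pps/services/navigation/control", O_WRONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* "name::value" is the PPS text syntax for a string attribute. */
    const char *req = "msg::poi_search\nid::42\ndat::airport\n";
    if (write(fd, req, strlen(req)) < 0)
        perror("write");

    close(fd);
    return 0;
}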
By using PPS, the navigation service can make details of the route available to a variety of applications. For instance, it could publish trip information that a weather app could subscribe to. The app could then display the weather forecast for the destination, at the estimated time of arrival.

To give developers a jump start, the QNX CAR Platform comes pre-integrated with Elektrobit’s EB street director navigation software. This reference integration shows developers how to implement "command and control" between the HMI and the participating components, including the navigation engine, navigation service, window manager, and PPS interface. As the above diagram indicates, the reference implementation works with both of the HMIs — one based on HTML5, the other based on Qt — that the QNX CAR Platform supports out of the box.


Previous posts in the QNX CAR Platform series:


Wednesday, October 22, 2014

A question of architecture

The second of a series on the QNX CAR Platform. In this installment, we start at the beginning — the platform’s underlying architecture.

In my previous post, I discussed how infotainment systems must perform multiple complex tasks, often all at once. At any time, a system may need to manage audio, show backup video, run 3D navigation, synch with Bluetooth devices, display smartphone content, run apps, present vehicle data, process voice signals, perform active noise control… the list goes on.

The job of integrating all these functions is no trivial task — an understatement if ever there was one. But as with any large project, starting with the right architecture, the right tools, and the right building blocks can make all the difference. With that in mind, let’s start at the beginning: the underlying architecture of the QNX CAR Platform for Infotainment.

The architecture consists of three layers: human machine interface (HMI), middleware, and platform.



The HMI layer
The HMI layer is like a bonus pack: it supports two reference HMIs out of the box, both of which have the same appearance and functionality. So what’s the difference? One is based on HTML5, the other on Qt 5. This choice demonstrates the underlying flexibility of the platform, which allows developers to create an HMI with any of several technologies, including HTML5, Qt, or a third-party toolkit such as Elektrobit GUIDE or Crank Storyboard.

A choice of HMIs
Mind you, the choice goes further than that. When you build a sophisticated infotainment system, it soon becomes obvious that no single tool or technology can do the job. The home screen, which may contain controls for Internet radio, hands-free calls, HVAC, and other functions, might need an environment like Qt. The navigation app, for its part, will probably use OpenGL ES. Meanwhile, some applications might be based on Android or HTML5. Together, all these heterogeneous components make up the HMI.

The QNX CAR Platform embraces this heterogeneity, allowing developers to use the best tools and application environments for the job at hand. More to the point, it allows developers to blend multiple app technologies into a single, unified user interface, where they can all share the same display, at the same time.

To perform this blending, the platform employs several mechanisms, including a component called the graphical composition manager. This manager acts as a kind of universal framework, providing all applications, regardless of how they’re built, with a highly optimized path to the display.

For example, look at the following HMI:



Now look at the HMI from another angle to see how it comprises several components blended together by the composition manager:



To the left, you see video input from a connected media player or smartphone. To the right, you see a navigation application based on OpenGL ES map-rendering software, with an overlay of route metadata implemented in Qt. And below, you see an HTML page that provides the underlying wallpaper; this page could also display a system status bar and UI menu bar across all screens.

For each component rendered to the display, the graphical composition manager allocates a separate window and frame buffer. It also allows the developer to control the properties of each individual window, including location, transparency, rotation, alpha, brightness, and z-order. As a result, it becomes relatively straightforward to tile, overlap, or blend a variety of applications on the same screen, in whichever way creates the best user experience.
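For a feel of what this looks like to an application, here's a small sketch using Screen API calls of the kind that sit beneath the composition manager. Treat it as illustrative: the surrounding setup (context, window, and buffer creation) is omitted, the values are invented, and exact property usage can vary between QNX releases.

/* Sketch: positioning one client window for composition.
 * Error checking and window/buffer setup omitted for brevity. */
#include <screen/screen.h>

void place_window(screen_window_t win)
{
    int pos[2]  = { 800, 0 };    /* x, y on the shared display */
    int size[2] = { 480, 272 };  /* window dimensions in pixels */
    int zorder  = 2;             /* stack above the wallpaper layer */
    int alpha   = 255;           /* fully opaque; lower it to blend */

    screen_set_window_property_iv(win, SCREEN_PROPERTY_POSITION, pos);
    screen_set_window_property_iv(win, SCREEN_PROPERTY_SIZE, size);
    screen_set_window_property_iv(win, SCREEN_PROPERTY_ZORDER, &zorder);
    screen_set_window_property_iv(win, SCREEN_PROPERTY_GLOBAL_ALPHA, &alpha);
}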

The middleware layer
The middleware layer provides applications with a rich assortment of services, including Bluetooth, multimedia discovery and playback, navigation, radio, and automatic speech recognition (ASR). The ASR component, for example, can be used to turn on the radio, initiate a Bluetooth phone call from a connected smartphone, or select a song by artist or song title.

I’ll drill down into several of these services in upcoming posts. For now, I’d like to focus on a fundamental service that greatly simplifies how all other services and applications in the system interact with one another. It’s called persistent publish/subscribe messaging, or PPS, and it provides the abstraction needed to cleanly separate high-level applications from low-level business logic and services.

PPS messaging provides an abstraction layer between system services and high-level applications

Let’s rewind a minute. To implement communications between software components, C/C++ developers must typically define direct, point-to-point connections that tend to “break” when new features or requirements are introduced. For instance, an application communicates with a navigation engine, but all connections enabling that communication must be redefined when the system is updated with a different engine.

This fragility might be acceptable in a relatively simple system, but it creates a real bottleneck when you are developing something as complex, dynamic, and quickly evolving as the design for a modern infotainment system. PPS addresses the problem by allowing developers to create loose, flexible connections between components. As a result, it becomes much easier to add, remove, or replace components without having to modify other components.

So what, exactly, is PPS? Here’s a textbook answer: an asynchronous object-based system that consists of publishers and subscribers, where publishers modify the properties of data objects and the subscribers to those objects receive updates when the objects have been modified.

So what does that mean? Well, in a car, PPS data objects allow applications to access services such as the multimedia engine, voice recognition engine, vehicle buses, connected smartphones, hands-free calling, and contact databases. These data objects can each contain multiple attributes, each attribute providing access to a specific feature — such as the RPM of the engine, the level of brake fluid, or the frequency of the current radio station. System services publish these objects and modify their attributes; other programs can then subscribe to the objects and receive updates whenever the attributes change.

The PPS service is programming-language independent, allowing programs written in a variety of programming languages (C, C++, HTML5, Java, JavaScript, etc.) to intercommunicate, without any special knowledge of one another. Thus, an app in a high-level environment like HTML5 can easily access services provided by a device driver or other low-level service written in C or C++.
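To make that concrete, here's a minimal sketch of the publisher side in C. The object path and attribute name are invented for this example; the point is that an update written here becomes visible to every subscriber, whether that subscriber is written in C, C++, JavaScript, or anything else.

/* Sketch of a PPS publisher: update one attribute on an object.
 * The path and attribute name are hypothetical. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    /* O_CREAT makes PPS create the object if it doesn't exist yet. */
    int fd = open("/pps/qnxcar/radio", O_WRONLY | O_CREAT, 0666);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* ":n:" marks a numeric attribute; "::" would mark a string. */
    const char *update = "tuned_frequency:n:99.1\n";
    if (write(fd, update, strlen(update)) < 0)
        perror("write");

    close(fd);
    return 0;
}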

I’m only touching on the capabilities of PPS. To learn more, check out the QNX documentation on this service.

The platform layer
The platform layer includes the QNX OS and the board support packages, or BSPs, that allow the OS to run on various hardware platforms.

An inherently modular and extensible architecture
A BSP may not sound like the sexiest thing in the world — it is, admittedly, a deeply technical piece of software — but without it, nothing else works. And, in fact, one reason QNX Software Systems has such a strong presence in automotive is that it provides BSPs for all the popular infotainment platforms from companies like Freescale, NVIDIA, Qualcomm, and Texas Instruments.

As for the QNX Neutrino OS, you could write a book about it — which is another way of saying it’s far beyond the scope of this post. Suffice it to say that its modularity, extensibility, reliability, and performance set the tone for the entire QNX CAR Platform. To get a feel for what the QNX OS brings to the platform (and by extension, to the automotive industry), I invite you to visit the QNX Neutrino OS page on the QNX website.

Thursday, October 16, 2014

Attending SAE Convergence? Here’s why you should visit booth 513

Cars and beer don’t mix. But discussing cars while having a beer? Now you’re talking. If you’re attending SAE Convergence next week, you owe it to yourself to register for our “Spirits And Eats” event at 7:00 pm Tuesday. It’s the perfect occasion to kick back and enjoy the company of people who, like yourself, are passionate about cars and car electronics. And it isn’t a bad networking opportunity either — you’ll meet folks from a variety of automakers, Tier 1s, and technology suppliers in a relaxed, convivial atmosphere.

But you know what? It isn’t just about the beer. Or the company. It’s also about the Benz. Our digitally modded Mercedes-Benz CLA45 AMG, to be exact. It’s the latest QNX technology concept car, and it’s the perfect vehicle (pun fully intended) for demonstrating how QNX technology can enable next-generation infotainment systems. Highlights include:

  • A multi-modal user experience that blends touch, voice, and physical controls
  • A secure application environment for Android, HTML5, and OpenGL ES
  • Smartphone connectivity options for projecting smartphone apps onto the head unit
  • A dynamically reconfigurable digital instrument cluster that displays turn-by-turn directions, notifications of incoming phone calls, and video from front and rear cameras
  • Multimedia framework for playback of content from USB sticks, DLNA devices, etc.
  • Full-band stereo calling — think phone calls with CD-quality audio
  • Engine sound enhancement that synchronizes synthesized engine sounds with engine RPM

Here, for example, is the digital cluster:



And here is a closeup of the head unit:



And here’s a shot of the cluster and head unit together:



As for the engine sound enhancement and high-quality hands-free audio, I can’t reproduce these here — you’ll have to come see the car and experience them first hand. (Yup, that's an invite.)

If you like what you see, and are interested in what you can hear, visit us at booth #513. And if you'd like to schedule a demo or reserve some time with a QNX representative in advance, we can accommodate that, too. Just send us an email.

Tuesday, October 7, 2014

A question of concurrency

The first of a new series on the QNX CAR Platform for Infotainment. In this installment, I tackle the a priori question: why does the auto industry need this platform, anyway?

Define your terms, counseled Voltaire, and in keeping with his advice, allow me to begin with the following:

Concurrency \kən-kûr'-ən-sē\ n (1597) Cooperation, as of agents, circumstances, or events; agreement or union in action.

A good definition, as far as it goes. But it doesn’t go far enough for the purposes of this discussion. Wikipedia comes closer to the mark:

“In computer science, concurrency is a property of systems in which several computations execute simultaneously, and potentially interact with each other.”

That’s better, but it still falls short. However, the Wikipedia entry also states that:

“the base goals of concurrent programming include correctness, performance and robustness. Concurrent systems… are generally designed to operate indefinitely, including automatic recovery from failure, and not terminate unexpectedly.”

Now that’s more like it. Concurrency in computer systems isn’t simply a matter of doing several things all at once; it’s also a matter of delivering a solid user experience. The system must always be available and it must always be responsive: no “surprises” allowed.

This definition seems tailor-made for in-car infotainment systems. Here, for example, are some of the tasks that an infotainment system may perform:

  • Run a variety of user applications, from 3D navigation to Internet radio, based on a mix of technologies, including Qt, HTML5, Android, and OpenGL ES
  • Manage multiple forms of input: voice, touch, physical buttons, etc. 
  • Support multiple smartphone connectivity protocols such as MirrorLink and Apple CarPlay 
  • Perform services that smartphones cannot support, including:
    • HVAC control
    • discovery and playback of multimedia from USB sticks, DLNA devices, MTP devices, and other sources
    • retrieval and display of fuel levels, tire pressure, and other vehicle information
    • connectivity to Bluetooth devices
  • Process voice signals to ensure the best possible quality of phone-based hands-free systems — this in itself can involve many tasks, including echo and noise removal, dynamic noise shaping, speech enhancement, etc. 
  • Perform active noise control to eliminate unwanted engine “boom” noise 
  • Offer extremely fast bootup times; a backup camera, for example, must come up within a second or two to be useful
     
Juggling multiple concurrent tasks
The primary user of an infotainment system is the driver. So, despite juggling all these activities, an infotainment system must never show the strain. It must always respond quickly to user input and critical events, even when many activities compete for system resources. Otherwise, the driver will become annoyed or, worse, distracted. The passengers won’t be happy, either.

Still, that isn’t enough. Automakers also need to differentiate themselves, and infotainment serves as a key tool for achieving differentiation. So the infotainment system must not simply perform well; it must also allow the vehicle, or line of vehicles, to project the unique values, features, and brand identity of the automaker.

And even that isn’t enough. Most automakers offer multiple vehicle lines, each encompassing a variety of configurations and trim levels. So an infotainment design must also be scalable; that way, the work and investment made at the high end can be leveraged in mid-range and economy models. Because ROI.

Projecting a unique identity
But you know what? That still isn’t enough. An infotainment system design must also be flexible. It must, for example, support new functionality through software updates, whether such updates are installed through a storage device or over the air. And it must have the ability to accommodate quickly evolving connectivity protocols, app environments, and hardware platforms. All with the least possible fuss.

The nitty and the gritty
Concurrency, performance, reliability, differentiation, scalability, flexibility — a tall order. But it’s exactly the order that the QNX CAR Platform for Infotainment was designed to fill.

Take, for example, product differentiation. If you look at the QNX-powered infotainment systems that automakers are shipping today, one thing becomes obvious: they aren’t cookie-cutter systems. Rather, they each project the unique values, features, and brand identity of each automaker — even though they are all built on the same, standards-based platform.

So how does the QNX CAR Platform enable all this? That’s exactly what my colleagues and I will explore over the coming weeks and months. We’ll get into the nitty and sometimes the gritty of how the platform works and why it offers so much value to companies that develop infotainment systems in various shapes, forms, and price points.

Stay tuned.

POSTSCRIPT: Read the next installment of the QNX CAR Platform series, A question of architecture.

Tuesday, September 16, 2014

Ontario tech companies team up to target the connected car

To predict who will play a role in tomorrow's connected vehicles, you need to look beyond the usual suspects.

When someone says “automobile,” what’s the first word that comes to mind? Chances are, it isn’t Ontario. And yet Ontario — the Canadian province that is home to QNX headquarters — is a world-class hub of automotive R&D and manufacturing. Chrysler, Ford, General Motors, Honda, and Toyota all have plants here. As do 350 parts suppliers. In fact, Ontario produced 2.5 million vehicles in 2012 alone.

No question, Ontario has the smarts to build cars. But to fully appreciate what Ontario has to offer, you need to look beyond the usual suspects in the auto supply chain. Take QNX Software Systems, for example. Our roots are in industrial computing, but in the early 2000s we started to offer software technology and expertise to the world’s automakers and tier one suppliers. And now, a decade later, QNX offers the premier platform for in-car infotainment, with deployments in tens of millions of vehicles.

QNX Software Systems is not alone. Ontario is home to many other “non-automotive” technology companies that are playing, or are poised to play, a significant role in creating new automotive experiences. But just who are these companies? The Automotive Parts Manufacturers Association (APMA) of Canada would like you to know. Which is why they've joined forces with QNX and other partners to build the APMA Connected Vehicle.

A showcase for Ontario technology
The purpose of the vehicle is simple: to showcase how Ontario companies can help create the next generation of connected cars. The vehicle is based on a Lexus RX350 — built in Ontario, of course — equipped with a custom-built infotainment system and digital instrument cluster built on QNX technology. Together, the QNX systems integrate more than a dozen technologies and services created in Ontario, including gesture recognition, biometric security, emergency vehicle notification, LED lighting, weather telematics, user interface design, smartphone charging, and cloud connectivity.

Okay, enough from me. Time to nuke some popcorn, dim the lights, and hit the Play button:



Wednesday, September 10, 2014

QNX-powered Audi Virtual Cockpit drives home with CTIA award

Congratulations to our friends at Audi! The new Audi Virtual Cockpit, which is based on the QNX OS, has just won first prize, connected car category, in the 2014 CTIA Hot for the Holidays awards.

I’ve said it before and I’ll say it again: the Audi Virtual Cockpit is an innovative, versatile, and absolutely ravishing piece of automotive technology. But you don’t have to take my word for it — or the word of the CTIA judges, for that matter. Watch the video and see for yourself:



Created in 2009, the Hot for the Holidays awards celebrate the most desirable mobile consumer electronics products for the holiday season. The winners for this year’s awards were announced this afternoon, at the CTIA Super Mobility event in Las Vegas. Andrew Poliak of QNX Software Systems was on hand and he took this snap of the award:



Visit the CTIA website to see the full list of winners. And visit the Audi website to learn more about the Audi Virtual Cockpit.

Tuesday, July 29, 2014

QNX-powered 2015 Audi TT named best-connected car

Is it innovative, beautiful, versatile, or just plain cool? I haven’t quite decided, so I’m thinking it’s all of the above. The QNX-based virtual cockpit in the 2015 Audi TT is a ravishing piece of automotive technology, and it brings driver convenience to a new level by integrating everything from speed and navigation to music and hands-free calling — all in a single, user-configurable display.

It seems I’m not the only one who's impressed. Because last week, 42,500 readers of “auto motor und sport” and “CHIP” chose the Audi TT as the industry's best-connected car. In fact, Audi took top honors in several categories, including navigation, telephone integration, sound system, entertainment/multimedia, and connected car.

To get an idea of what all the fuss is about, check out our video of the Audi TT’s virtual cockpit in action. We filmed this at CES earlier this year:



For more information on the award and the Audi TT, read Audi's press release.

Monday, July 21, 2014

The lost concept car photos

Have you ever rummaged through old boxes in your basement and discovered family photos you had totally forgotten about — or never knew existed? I experienced a moment like that a couple of weeks ago. Except, in this case, no basement was involved. And the box wasn't a box, but a shared drive. And the photos weren't of my family, but of cars. QNX technology concept cars, to be exact.

At least once a year, the QNX concept team retrofits a new vehicle to demonstrate how our technology can help auto companies push the envelope in connectivity, infotainment, and acoustics. And, in every case, we take pictures — sometimes, lots of them. Inevitably, we end up choosing a few images for publicity purposes and filing the others. But as I discovered, the images we don't use are often just as good as the ones we do use. We just don't need all of them!

In any case, stumbling across these photos was great fun. I thought you might enjoy them, too, so here goes...

The Porsche
First up is the QNX technology concept car based on a Porsche 911, which made its debut at 2012 CES. We had originally planned to drive the car back to Ottawa once CES was over — but that was before we spoke to our friends at Texas Instruments, who provided the silicon for the car's instrument cluster and infotainment system. They liked the car so much, they asked if we could bring it to their HQ in Dallas, where the following two photos were taken. All I can say is, Dallas is home to at least one awesomely cool photographer. Because rather than curse the crazy lighting, the photographer used it to create some playful compositions:





If you look below, you'll see another shot of the Porsche, taken just before we shipped it off to CES. The image really doesn't belong in this collection, as it appeared once on a partner website. But it's rare nonetheless, so I decided to include it. And besides, it's cool. Literally.



Did you know? The original Porsche 911, which debuted in the early 60s, was dubbed the 901. Problem was, Peugeot had exclusive rights in France to three-digit car names with a 0 in the middle. And so, the 901 became the 911.



The Bentley
Next up is the QNX technology concept car based on a Bentley Continental GT. In this image, the driver is interacting with the center stack's main control knob, which was mounted directly on a 17" touchscreen. See the row of icons just above the knob? These represented HVAC, music, navigation, hands-free calling and other system functions. The system would automatically display these icons as your hand approached the display; you would then turn the knob to choose the function you wanted. (This image was taken by a BlackBerry employee, whose name I have most ungraciously forgotten.)



As with all our concept vehicles, the intent was to showcase the technology that we had built into the car's dashboard and center stack. Which probably explains why the following image of the car's exterior was never published. Pity, as it's quite lovely — a classic case of flare adding flair.



Did you know? Those wheels aren't just for show. The Bentley comes equipped with a 616 hp W12 engine (yup, three banks of cylinders) that can do 0-60 mph in a little over 4 seconds — it took me way longer than that to type this sentence.



The Jeep
Next up is the Jeep Wrangler, which serves as the QNX reference vehicle. The Jeep plays a different role than the other vehicles highlighted here: instead of demonstrating how QNX technology can help automotive companies innovate, it shows what the QNX CAR Platform for Infotainment can do right out of the box. In this image, you can see the vehicle's main navigation menu:



Did you know? The original infotainment system in the reference vehicle could post Facebook updates that listed the title and artist of the song currently playing. The system performed this magic in response to simple voice commands.



The Vette
The QNX technology concept car based on a Chevrolet Corvette made its debut at SAE Convergence 2010. Among other things, it showed how digital instrument clusters can morph on the fly to provide drivers with context-sensitive information, such as turn-by-turn directions. You can see a slicker, more sophisticated approach to reconfigurable clusters in our most recent technology concept car based on a Mercedes CLA45.



Did you know? We used the Corvette to demonstrate how QNX technology enables automotive companies to create customizable, reskinnable user interfaces. Check out this post on the Corvette's 30-day UI challenge.



The Prius
The first QNX-powered technology concept car was a digitally modded Prius — aka the LTE Connected Car. The car was a joint project of several companies, including QNX and Alcatel-Lucent, who wanted to demonstrate how 4G/LTE networks could transform the driving experience with a host of new in-vehicle applications.

Here's the car with a very proud-looking Derek Kuhn, who spearheaded the LTE Connected Car project while serving as a VP at Alcatel-Lucent. Derek subsequently joined QNX as VP of sales and marketing:



Did you know? When this car was created, telecom companies had yet to light up their first commercial LTE towers. Also, the car had more infotainment systems than any other QNX technology concept car: two in the front (one for the driver and one for the front-seat passenger) and two in the back.



Some things get lost, albeit temporarily. And some you just never see again. Fortunately, all these images belong to the first category. Any favorites?