
Wednesday, March 4, 2015

“What do you mean, I have to learn how not to drive?”

The age of autonomous driving lessons is upon us.

Paul Leroux
What would it be like to ride in an autonomous car? If you were to ask the average Joe, he would likely describe a scenario in which he sips coffee, plays video games, and spends quality time with TSN while the car whisks him to work. The average Jane would, no doubt, provide an equivalent answer. The problem with this scenario is that autonomous doesn’t mean driverless. Until autonomous vehicles become better than humans at handling every potential traffic situation, drivers will have to remain alert much or all of the time, even if their cars do 99.9% of the driving for them.

Otherwise, what happens when a car, faced with a situation it can’t handle, suddenly cedes control to the driver? Or what happens when the car fails to recognize a pedestrian on the road ahead?

Of course, it isn’t easy to maintain a high level of alertness while doing nothing in particular. It takes a certain maturity of mind, or at least a lack of ADD. Which explains why California, a leader in regulations for autonomous vehicles, imposes restrictions on who is allowed to “drive” them. Prerequisites include a near-spotless driving record and more than 10 years without a DUI conviction. Drivers must also complete an autonomous driving program, the length of which depends on the car maker or automotive supplier in question. According to a recent investigation by IEEE Spectrum, Google offers the most comprehensive program — it lasts five weeks and subjects drivers to random checks.

1950s approach to improving driver alertness. Source: Modern Mechanix blog

In effect, drivers of autonomous cars have to learn how not to drive. And, as another IEEE article suggests, they may even need a special license.

Ample warnings
Could an autonomous car mitigate the attention issue? Definitely. It could, for example, give the driver ample warning before he or she needs to take over. The forward collision alerts and other informational ADAS functions in the latest QNX technology concept car offer a hint as to how such warnings could operate. For the time being, however, it’s hard to imagine an autonomous car that could always anticipate when it needs to cede control. Until then, informational ADAS will serve as an adjunct to, not a replacement for, eyes, ears, and old-fashioned attentiveness.

Nonetheless, research suggests that adaptive cruise control and other technologies that enable autonomous or semi-autonomous driving can, when compared to human drivers, do a better job of avoiding accidents and improving traffic flow. To quote my friend Andy Gryc, autonomous cars would be more “polite” to other vehicles and be better equipped to negotiate inter-vehicle space, enabling more cars to use the same length of road.

Fewer accidents, faster travel times. I could live with that.


2015 approach to improving driver alertness: instrument cluster from the QNX reference vehicle.

Monday, March 2, 2015

Hypervisors, virtualization, and taking control of your safety certification budget

A new webinar on how virtualization can help you add new technology to existing designs.

First things first: should you say “hypervisor” or “virtual machine monitor”? Both terms refer to the same thing, but is one preferable to the other?

Hypervisor certainly has the greater sex appeal, suggesting it was coined by a marketing department that saw no hope in promoting a term as coldly technical as virtual machine monitor. But, in fact, hypervisor has a long and established history, dating back almost 50 years. Moreover, it was coined not by a marketing department, but by a software developer.

“Hypervisor” is simply a variant of “supervisor,” a traditional name for the software that controls task scheduling and other fundamental operations in a computer system — software that, in most systems, is now called the OS kernel. Because a hypervisor manages the execution of multiple OSs, it is, in effect, a supervisor of supervisors. Hence hypervisor.

No matter what you call it, a hypervisor creates multiple virtual machines, each hosting a separate guest OS, and allows the OSs to share a system’s hardware resources, including CPU, memory, and I/O. As a result, system designers can consolidate previously discrete systems onto a single system-on-chip (SoC) and thereby reduce the size, weight, and power consumption of their designs — a trinity of benefits known as SWaP.

That said, not all hypervisors are created equal. There are, for example, Type 1 “bare metal” hypervisors, which run directly on the host hardware, and Type 2 hypervisors, which run on top of an OS. Both types have their benefits, but Type 1 offers the better choice for any embedded system that requires fast, predictable response times — most safety-critical systems arguably fall within this category.

The QNX Hypervisor is an example of a Type 1 “bare metal” hypervisor.


Moreover, some hypervisors make it easier for the guest OSs to share hardware resources. The QNX Hypervisor, for example, employs several technologies to simplify the sharing of display controllers, network connections, file systems, and I/O devices like the I2C serial bus. Developers can, as a result, avoid writing custom shared-device drivers that increase testing and certification costs and that typically exhibit lower performance than field-hardened, vendor-supplied drivers.

Adding features, without blowing the certification budget
Hypervisors, and the virtualization they provide, offer another benefit: the ability to keep OSs cleanly isolated from each other, even though they share the same hardware. This benefit is attractive to anyone trying to build a safety-critical system and reduce SWaP. Better yet, virtualization can help device makers add new and differentiating features, such as rich user interfaces, without compromising safety-critical components.

That said, hardware and peripheral device interfaces are evolving continuously. How can you maintain compliance with safety-related standards like ISO 26262 and still take advantage of new hardware features and functionality?

Enter a new webinar hosted by my inimitable colleague Chris Ault. Chris will examine techniques that enable you to add new features to existing devices, while maintaining close control of the safety certification scope and budget. Here are some of the topics he’ll address:

  • Overview of virtualization options and their pros and cons
     
  • Comparison of how adaptive time partitioning and virtualization help achieve separation of safety-critical systems
     
  • Maintaining realtime performance of industrial automation protocols without directly affecting safety certification efforts
     
  • Using Android applications for user interfaces and connectivity

Webinar coordinates:
Exploring Virtualization Options for Adding New Technology to Safety-Critical Devices
Time: Thursday, March 5, 12:00 pm EST
Duration: 1 hour
Registration: Visit TechOnLine

Monday, January 26, 2015

New to 26262? Have I got a primer for you

Driver error is the #1 problem on our roads — and has been since 1869. In August of that year, a scientist named Mary Ward became the first person to die in an automobile accident, after being thrown from a steam-powered car. Driver error was a factor in Mary’s death and, 145 years later, it remains a problem, contributing to roughly 90% of motor vehicle crashes.

Can ADAS systems mitigate driver error and reduce traffic deaths? The evidence suggests that, yes, they help prevent accidents. That said, ADAS systems can themselves cause harm, if they malfunction. Imagine, for example, an adaptive cruise control system that underestimates the distance of a car up ahead. Which raises the question: how can you trust the safety claims for an ADAS system? And how do you establish that the evidence for those claims is sufficient?

Enter ISO 26262. This standard, introduced in 2011, provides a comprehensive framework for validating the functional safety claims of ADAS systems, digital instrument clusters, and other electrical or electronic systems in production passenger vehicles.

ISO 26262 isn’t for the faint of heart. It’s a rigorous, 10-part standard that recommends tools, techniques, and methodologies for the entire development cycle, from specification to decommissioning. In fact, to develop a deep understanding of 26262 you must first become versed in another standard, IEC 61508, which forms the basis of 26262.

ISO 26262 starts from the premise that no system is 100% safe. Consequently, the system designer must perform a hazard and risk analysis to identify the safety requirements and residual risks of the system being developed. The outcome of that analysis determines the Automotive Safety Integrity Level (ASIL) of the system, as defined by 26262. ASILs range from A to D, where A represents the lowest degree of hazard and D, the highest. The higher the ASIL, the greater the degree of rigor that must be applied to assure the system avoids residual risk.

Having determined the risks (and the ASIL), the system designer selects an appropriate architecture. The designer must also validate that architecture, using tools and techniques that 26262 either recommends or highly recommends. If the designer believes that a recommended tool or technique isn’t appropriate to the project, he or she must provide a solid rationale for the decision, and must justify why the technique actually used is as good as or better than the one recommended by 26262.

The designer must also prepare a safety case. True to its name, this document presents the case that the system is sufficiently safe for its intended application and environment. It comprises three main components: 1) a clear statement of what is claimed about the system, 2) the argument that the claim has been met, and 3) the evidence that supports the argument. The safety case should convince not only the 26262 auditor, but also the entire development team, the company’s executives, and, of course, the customer. No system is safe unless it is deployed and used correctly, however, so the system designer must also produce a safety manual that sets out the constraints within which the product must be deployed.

Achieving 26262 compliance is a major undertaking. That said, any conscientious team working on a safety-critical project would probably apply most of the recommended techniques. The standard was created to ensure that safety isn’t treated as an afterthought during final testing, but as a matter of due diligence in every stage of development.

If you’re a system designer or implementer, where do you start? I would suggest “A Developer’s View of ISO 26262”, an article recently authored by my colleague Chris Hobbs and published in EE Times Automotive Europe. The article provides an introduction to the standard, based on first-hand experience certifying software to ISO 26262, and covers key topics such as ASILs, recommended verification tools and techniques, the safety case, and confidence from use.

I also have two whitepapers that may prove useful: Architectures for ISO 26262 systems with multiple ASIL requirements, written by my colleague Yi Zheng, and Protecting software components from interference in an ISO 26262 system, written by Chris Hobbs and Yi Zheng.

Tuesday, January 13, 2015

Tom’s Guide taps QNX concept car with CES 2015 award

Have you ever checked out a product review on Tom’s Guide? If so, you’re not alone. Every month, this website attracts more than 2.5 million unique visitors — that’s equivalent to the population of Toronto, the largest city in Canada.

The folks at Tom’s Guide test and review everything from drones to 3D printers. They love technology. So perhaps it’s no surprise that they took a shine to the QNX technology concept car. In fact, they liked it so much, they awarded it the Tom’s Guide CES 2015 Award, in the car tech category.

To quote Sam Rutherford of Tom’s Guide, “After my time with QNX’s platform, I was left with the impression there’s finally a company that just ‘gets it’ when it comes to the technology in cars. The company has learned from the success of modern mobile devices and brought that knowledge to the auto world…”

I think I like this Sam guy.

Engadget was also impressed...
A forward-looking approach to seeing behind you.
The Tom’s Guide award is the second honor QNX picked up at CES. We were also shortlisted for an Engadget Best of CES award, for the digital rear- and side-view mirrors on the QNX technology concept car.

If you haven’t seen the mirrors in action, they offer a complete view of the scene behind and to the sides of the vehicle — goodbye to the blind spots associated with conventional reflective mirrors. Better yet, the side-view digital mirrors have the smarts to detect cars, bicycles, and other objects, and they will display an alert if an object is too close when the driver signals a lane change.

In addition to the digital mirrors, the QNX technology concept car integrates several other ADAS features, including speed recommendations, forward-collision warnings, and intelligent parking assist. Learn more here.

Monday, January 5, 2015

Volkswagen and LG Gear up with QNX

Design wins put QNX technology in a wide range of infotainment systems, instrument clusters, and ADAS solutions.

Earlier today, QNX Software Systems announced that infotainment systems powered by the QNX Neutrino OS are now shipping in several 2015 Volkswagen vehicle models, including the Touareg, Passat, Polo, Golf, and Golf GTI.

The systems include the RNS 850 GPS navigation system in the Volkswagen Touareg, which recently introduced support for 3D Google Earth maps and Google Street View. The system also offers realtime traffic information, points-of-interest search, reverse camera display, voice control, Bluetooth connectivity, rich multimedia support, four-zone climate control, a high-resolution 8-inch color touchscreen, and other advanced features.

Bird's eye view: the RNS 850 GPS navigation system for the Volkswagen Touareg SUV. Source: VW

“At Volkswagen, we believe deeply in delivering the highest quality driving experience, regardless of the cost, size, and features of the vehicle,” commented Alf Pollex, Head of Connected Car and Infotainment at Volkswagen AG. “The scalable architecture of the QNX platform is well-suited to our approach, enabling us to offer a full range of infotainment systems, from premium level to mass volume, using a single, proven software base for our Modular Infotainment Modules (MIB) and the RNS 850 system.”

QNX and LG: a proven partnership
QNX also announced that LG Electronics’ Vehicle Components (VC) Company will use a range of QNX solutions to build infotainment systems, digital instrument clusters, and advanced driver assistance systems (ADAS) for the global automotive market.

The new initiative builds on a long history of collaboration between LG and QNX Software Systems, who have worked together on successful, large-volume telematics production programs. For the new systems, QNX will provide LG with the QNX CAR Platform for Infotainment, the QNX Neutrino OS, the QNX OS for Automotive Safety, and QNX Acoustics for Voice.

“QNX Software Systems has been our trusted supplier
for more than a decade... helping LG deliver millions
of high-quality systems to the world’s automakers”

— Won-Yong Hwang, LG's VC Company

“QNX Software Systems has been our trusted supplier for more than a decade, providing flexible software solutions that have helped LG deliver millions of high-quality systems to the world’s automakers,” commented Won-Yong Hwang, Director and Head of AVN development department, LG Electronics’ VC Company. “This same flexibility allows us to leverage our existing QNX expertise in new and growing markets such as ADAS, where the proven reliability of QNX Software Systems’ technology can play a critical role in addressing automotive safety requirements.”

Visit the QNX website to learn more about the Volkswagen and LG announcements.

To infotainment... and beyond! First look at new QNX technology concept car

The new car delivers everything you’d expect in a concept vehicle from QNX. But the real buzz can be summarized in a four-letter word: ADAS

The technology in today's cars is light-years ahead of the technology in cars 10 or 20 years ago. The humans driving those cars, however, have changed little in the intervening years. They still need to focus on a host of mundane driving tasks, from checking blind spots and monitoring road signs to staying within the lane and squeezing into parking spaces. In fact, with all the technology now in the car, including a variety of brought-in devices, some drivers suffer from information overload and perform worse, instead of better, at these crucial tasks.

Advanced driver assistance systems, or ADAS, can go a long way to offset this problem. They come in a variety of shapes and sizes — from drowsiness monitoring to autonomous emergency braking — but most share a common goal: to help the driver avoid accidents.

Which brings us to the new QNX technology concept car. As you’d expect, it includes all the advanced infotainment features, including smartphone connectivity and rich app support, offered by the QNX CAR Platform for Infotainment. But it also integrates an array of additional technologies — including cameras, LiDAR, ultrasonic sensors, and specialized navigation software — to deliver ADAS capabilities that simplify driving tasks, warn of possible collisions, and enhance overall driver awareness.

Mind you, the ADAS features shouldn’t come as any more of a surprise than the infotainment features. After all, QNX Software Systems also offers the QNX OS for Automotive Safety, a solution based on decades of experience in safety-critical systems and certified to ISO 26262, Automotive Safety Integrity Level D — the highest level achievable.

Okay, enough blather. Time to check out the car!

The “I want that” car
If the trident hasn’t already tipped you off, the new technology concept car is based on a Maserati QuattroPorte GTS. I won’t say much about the car itself, except I want one. Did I say want? Sorry, I meant lust. Because omigosh:



The differentiated dash
Before we run through the car’s many features, let’s stop for a minute and savor the elegant design of its QNX-powered digital instrument cluster and infotainment system. To be honest, I have an ulterior motive for sharing this image: if you compare the systems shown here to those of previous QNX technology concept cars (here, here, and here), you’ll see that they each project a distinct look-and-feel. Automakers need to differentiate themselves, and, as a group, these cars illustrate how the flexibility of the QNX platform enables unique, branded user experiences:



The multi-talented digital instrument cluster
Okay, let’s get behind the wheel and test out the digital cluster. Designed to heighten driver awareness, the cluster can show the current speed limit, display an alert if you exceed the limit, and even recommend an appropriate speed for upcoming curves. Better yet, it can display turn-by-turn directions provided by the car’s infotainment system.

Normally, the cluster displays the speed limit in a white circle. But in this image, the cluster displays it in red, along with a red bar to show how much you are over the limit — a gentle reminder to ease off the gas:



Using LiDAR input, the cluster can also warn of obstacles on the road ahead:



And if that’s not enough, the cluster provides intelligent parking assist to help you back into tight spaces. Here, for example, is an impromptu image we took in the QNX garage. The blue-and-yellow guidelines represent the car’s reverse trajectory, and the warning on the right says that you are about to run over an esteemed member of the QNX concept team!



The rear- and side-view mirrors that aren’t really mirrors
By their very nature, car mirrors have blind spots. To address this problem, the QNX concept team has transformed the car’s rear- and side-view mirrors into video displays that offer a complete view of the scene behind and to the sides of the vehicle. As you can see in this image, the side-view displays can also show a red overlay to warn of cars, bikes, people, or anything else approaching the car’s blind zones:



The ADAS display for enhancing obstacle awareness
I don’t have pictures yet, but the car also includes an innovative LED-based display that lets you gauge the direction and proximity of objects to the front, rear, and sides of the vehicle — without having to take your eyes off the road. Stretching the width of the dash, the display integrates input from the car’s ultrasonic and LiDAR sensors to provide a centralized view of ADAS warnings.

The easy-to-use infotainment system
To demonstrate the capabilities of the QNX CAR™ Platform for Infotainment, we’ve outfitted the car with a feature-rich, yet intuitive, multimedia head unit. For instance, see the radio tuner in the following image? That’s no ordinary tuner. To change channels, you can just swipe across the display; if your swipe isn’t perfectly accurate, the radio will automatically zero in on the nearest station or preset.

Better yet, the radio offers “iHeart drive anywhere radio.” If you drive out of range of your favorite AM/FM radio station, the system will detect the problem and automatically switch to the corresponding digital iHeartRadio station. How cool is that?



Other infotainment features include:
  • Natural voice recognition — For instance, if you say “It’s way too cold in here,” the HVAC system will respond by raising the heat.
  • Integration with a wide variety of popular smartphones.
  • Support for multiple concurrent app environments, along with a variety of Android and HTML5 apps, as well as an HMI built with the Qt framework.
  • A backseat display that lets passengers control HVAC functions, navigation, song selection, and other infotainment features.

The oh-so-awesome partners
The car is a testament not only to QNX technology, but to the ecosystem of technology partners that provide complementary solutions for QNX customers. Peek under the hood, and you'll find the latest tech from Elektrobit, iHeart, Nuance, Pandora, Parkopedia, Phantom Intelligence, Qualcomm, RealVNC, Rightware, and TE Connectivity.

The other stuff
Do not, for one minute, think that the Maserati is the only attraction in the QNX booth. Far from it. We will also showcase a significantly revamped QNX reference vehicle, outfitted with lane departure warnings, traffic sign recognition, and other ADAS features, as well as the latest version of the QNX CAR Platform — more in an upcoming post.

Visitors to the booth will also have the opportunity to experience:
  • a 3D navigation solution from Aisin AW
  • a digital instrument cluster designed by HI Corporation
  • two QNX CAR Platform demo systems, one powered by a dual-core Intel Atom E3827 processor, the other by an NVIDIA Tegra Visual Computing Module
  • the latest incarnation of the Oscar-winning Flying Cam SARAH aerial camera system


Tuesday, December 23, 2014

There’s experience, and then there’s experience

Or how a single word can have a trunkful of meanings.

"Liked your blog post. It was so random.” That, believe it or not, is one of the nicest things anyone has ever said to me. You may think it funny that I see this as a compliment. But truth be told, randomness is part of my mental DNA — as anyone who has attempted to hold a conversation with me can attest. Even Google seems to agree. A few years ago, they temporarily closed my Blogger account because, according to their algorithms, my posts consisted of random, machine-generated words. I kid you not.

So why am I going on about this? Well, someone asked me about QNX Software Systems’ experience in the automotive market and, sure enough, my mind went off in several directions all at once. Not that that’s unusual. In this case, however, there was justification for my response. Because when it comes to cars and QNX, experience has a rich array of meanings.

First, there is the deep experience that QNX has amassed in the automotive industry. We’ve been at it for 15 years, working hand-in-hand with car makers and tier one suppliers to create infotainment systems, digital instrument clusters, connectivity modules, and handsfree units for tens of millions of vehicles.

Next, there’s the experience of working with QNX the company. In the auto industry, almost every automaker and tier one supplier has unique demands — not to mention immovable deadlines. As a result, they need a supplier, like QNX, that’s deeply committed to the success of their projects, and that can provide the expert engineering services they need to meet start-of-production commitments. No shrink-wrapped solutions for this crowd.

Then, there’s the experience of using QNX technology to build automotive systems — or any type of system, for that matter. Take the QNX OS, for example. Its microkernel architecture makes it easier to isolate and repair bugs, its industry-standard APIs make it easy to port or reuse existing code, and its persistent publish/subscribe technology offers a highly flexible approach to integrating high-level applications with low-level business logic and services.

And last, there’s the experience of using systems based on QNX technology. One reason we build technology concept cars is because words cannot express the rich, integrated user experiences that our technology can enable — experiences that blend graphics, acoustics, touch interfaces, natural language processing, and other technologies to make driving simpler and more convenient.

Nor can words express the sheer variety of user experiences that our platform makes possible. If you look at the QNX-powered infotainment systems that automakers ship today, it soon becomes obvious that they aren’t cookie-cutter systems. Rather, each system projects the unique values, features, and brand identity of the automaker. For evidence, look no further than GM OnStar and the Audi Virtual Cockpit. They are totally distinct from each other, yet both are built on the very same OS platform.

On a personal note, I must mention one last form of experience: that of working with my QNX colleagues. Because that, to me, is the most wonderful experience of all.

Monday, December 15, 2014

QNX celebrates crystal anniversary in automotive

Long-term success in the auto market relies on a potent mix of passion, persistence, innovation, and quality. And let's not forget trust.

Imagine, for a minute, that you are a bird. Not just any bird, but a bird that can fly 11,000 kilometers, non-stop, without food or rest.

That’s hard to imagine, I know. But the bird in question — the bar-tailed godwit — is very real, and its ability to fly across vast distances is well documented. Every year, as winter approaches, the godwit lifts off from its breeding grounds in Alaska, bears southwest, and doesn't stop beating its wings until it touches down in New Zealand. Total uninterrupted flight time: 216 hours.

The godwit epitomizes indomitable drive, infused with a dose of pure stick-with-it-ness. Qualities that, to me, characterize QNX Software Systems’ success in the auto market — a story that took flight 15 years ago.

Bar-tailed godwit: long-distance champion
© Andreas Trepte
It all started in 1999, when Motorola and QNX unveiled mobileGT, an automotive reference platform based on the QNX Neutrino OS. For the first time, QNX publicly threw its hat into the automotive ring. Mind you, QNX was already busy behind the scenes: 1999 also marked the first year that QNX technology shipped in passenger vehicles. It’s been a steady climb ever since, and you can now find QNX technology in tens of millions of vehicles.

There are many technical reasons why QNX has become a premier software provider for the automotive market. But for automakers and their tier one suppliers, technology alone isn’t enough. They also need to know that, as a supplier, you are deeply committed to the success of their projects — like the flight of the godwit, bailing out halfway isn’t an option. They also need to trust that, when you say you’ll do something, you will. And that you’ll do it on time. Even if you have to cross an ocean to do it.

In short, you might enter this market because of your skills and passion, but you thrive in it because you behave as a real partner, working in concert with your customers and fellow technology suppliers. That’s why I refer to our fifteenth anniversary in the car business with the same language used to describe a fifteenth wedding anniversary. Because we’re committed, we’re passionate, and we’re in for the long haul.

Wednesday, December 10, 2014

The power of together

Bringing more technologies into the car is all well and good. The real goal, however, is to integrate them in a way that genuinely improves the driving experience.

Can we all agree that ‘synergy’ has become one of the most misused and overused words in the English language? In the pantheon of verbal chestnuts, synergy holds a place of honor, surpassed only by ‘best practices’ and ‘paradigm shift’.

Mind you, you can’t blame people for invoking the word so often. Because, as we all know, the real value in things often comes from their interaction — the moment they stop acting alone and start working in concert. The classic example is water, yeast, and flour, a combination that yields something far more flavorful than its constituent parts. I am speaking, of course, of bread.

Automakers get this principle. Case in point: adaptive cruise control, which takes a decades-old concept — conventional cruise control — and marries it with advances in radar sensors and digital signal processing. The result is something that doesn’t simply maintain a constant speed, but can help reduce accidents and, according to some research, traffic jams.

At QNX Software Systems, we also take this principle to heart. For example, read my recent post on the architecture of the QNX CAR Platform and you’ll see that we consciously designed the platform to help things work together. In fact, the platform's ability to integrate numerous technologies, in a seamless and concurrent fashion, is arguably its most salient quality.

This ability to blend disparate technologies into a collaborative whole isn't just a gee-whiz feature. Rather, it is critical to enabling the continued evolution and success of the connected car. Because it’s not enough to have smartphone connectivity. Or cloud connectivity. Or digital instrument clusters. Or any number of ADAS features, from collision warnings to autonomous braking. The real magic, and real value to the consumer, occurs when some or all of these come together to create something greater than the sum of the parts.

Simply put, it's all about the — dare I say it? — synergy that thoughtful integration can offer.

At CES this year, we will explore the potential of integration and demonstrate the unexpected value it can bring. The story begins on the QNX website.

Thursday, December 4, 2014

Beyond the dashboard: discover how QNX touches your everyday life

QNX technology is in cars — lots of them. But it’s also in everything from planes and trains to smart phones, smart buildings, and smart vacuum cleaners. If you're interested, I happen to have an infographic handy...

I was a lost and lonely soul. Friends would cut phone calls short, strangers would move away from me on the bus, and acquaintances at cocktail parties would excuse themselves, promising to come right back — they never came back. I was in denial for a long time, but slowly and painfully, I came to the realization that I had to take ownership of this problem. Because it was my fault.

To be specific, it was my motor mouth. Whenever someone asked what I did for a living, I’d say I worked for QNX. That, of course, wasn’t a problem. But when they asked what QNX did, I would hold forth on microkernel OS architectures, user-space device drivers, resource manager frameworks, and graphical composition managers, not to mention asynchronous messaging, priority inheritance, and time partitioning. After all, who doesn't want to learn more about time partitioning?

Well, as I subsequently learned, there’s a time and place for everything. And while my passion about QNX technology was well-placed, my timing was lousy. People weren’t asking for a deep dive; they just wanted to understand QNX’s role in the scheme of things.

As it turns out, QNX plays a huge role, and in very many things. I’ve been working at QNX Software Systems for 25 years, and I am still gobsmacked by the sheer variety of uses that QNX technology is put to. I'm especially impressed by the crossover effect. For instance, what we learn in nuclear plants helps us offer a better OS for safety systems in cars. And what we learn in smartphones makes us a better platform supplier for companies building infotainment systems.

All of which to say, the next time someone asks me what QNX does, I will avoid the deep dive and show them this infographic instead. Of course, if they subsequently ask *how* QNX does all this, I will have a well-practiced answer. :-)

Did I mention? You can download a high-res JPEG of this infographic from our Flickr account and a PDF version from the QNX website.



Stay tuned for 2015 CES, where we will introduce even more ways QNX can make a difference, especially in how people design and drive cars.

And lest I forget, special thanks to my colleague Varghese at BlackBerry India for conceiving this infographic, and to the QNX employees who provided their invaluable input.

Wednesday, November 19, 2014

A question of getting there

The third of a series of posts on the QNX CAR Platform. In this installment, we turn to a key point of interest: the platform’s navigation service.

From the beginning, we designed the QNX CAR Platform for Infotainment with flexibility in mind. Our philosophy is to give customers the freedom to choose the hardware platforms, application environments, user-interface tools, and smartphone connectivity protocols that best address their requirements. This same spirit of flexibility extends to navigation solutions.

For evidence, look no further than our current technology concept car. It can support navigation from Elektrobit:



from Nokia HERE:



and from Kotei Informatics:



These are but a few examples. The QNX CAR Platform can also support navigation solutions from companies like AISIN AW, NavNGo, TCS, TeleNav, and ZENRIN DataCom, enabling automakers and automotive Tier 1 suppliers to choose the navigation solution, or solutions, best suited to the regions or demographics they wish to target. (In addition to these embedded solutions, the platform can also provide access to smartphone-based navigation services through its support for MirrorLink and other connectivity protocols — more on this in a subsequent post.)

Under the hood
In our previous installment, we looked at the QNX CAR Platform’s middleware layer, which provides infotainment applications with a variety of services, including Bluetooth, radio, multimedia discovery and playback, and automatic speech recognition. The middleware layer also includes a navigation service that, true to the platform’s overall flexibility, allows developers to use navigation engines from multiple vendors and to change engines without affecting the high-level navigation applications that the user interacts with.

An illustration is in order. If you look at the image below, you’ll see OpenGL-based map data rendered on one graphics layer and, on the layer above it, Qt-based application data (current street, distance to destination, and other route information) pulled from the navigation engine. By taking advantage of the platform’s navigation service, you could swap in a different navigation engine without having to rewrite the Qt application:



To achieve this flexibility, the navigation service makes use of the QNX CAR Platform’s persistent publish/subscribe (PPS) messaging, which cleanly abstracts lower-level services from the higher-level applications they communicate with. Let's look at another diagram to see how this works:



In the PPS model, services publish information to data objects; other programs can subscribe to those objects and receive notifications when the objects have changed. So, for the example above, the navigation engine could generate updates to the route information, and the navigation service could publish those updates to a PPS “navigation status object,” thereby making the updates available to any program that subscribes to the object — including the Qt application.

With this approach, the Qt application doesn't need to know anything about the navigation engine, nor does the navigation engine need to know anything about the Qt app. As a result, either could be swapped out without affecting the other.
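To make this concrete, here's a minimal sketch of the subscriber side in C. PPS objects appear as files under /pps, and opening one with the ?wait,delta option makes each read() block until the object changes and return only the attributes that changed. The object path and attribute names below are hypothetical, chosen purely for illustration:

    /* Minimal PPS subscriber sketch. The object path and attribute
     * names are hypothetical; a real system defines its own objects. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* "?wait,delta": each read() blocks until the object changes,
         * then returns only the attributes that changed. */
        int fd = open("/pps/services/navigation/status?wait,delta", O_RDONLY);
        if (fd == -1) {
            perror("open");
            return 1;
        }

        char buf[1024];
        ssize_t n;
        while ((n = read(fd, buf, sizeof(buf) - 1)) > 0) {
            buf[n] = '\0';
            /* Updates arrive as lines like "distance_to_destination:n:4200".
             * A real application would parse these and refresh its display. */
            printf("route update:\n%s", buf);
        }

        close(fd);
        return 0;
    }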

Here's another example of how this model allows components to communicate with one another:
  1. Using the system's human machine interface (HMI), the driver asks the navigation system to search for a point of interest (POI) — this could take the form of a voice command or a tap on the system display.
  2. The HMI responds by writing the request to a PPS “navigation control” object.
  3. The navigation service reads the request from the PPS object and forwards it to the navigation engine.
  4. The navigation engine returns the result.
  5. The navigation service updates the PPS object to notify the HMI that its request has been completed. It also writes the results to a database so that all subscribers to this object can read the results.
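As a rough illustration of step 2 in this flow, the HMI's request can be as simple as writing a couple of attributes to the control object. Again, the object path, attribute names, and message format here are hypothetical:

    /* Sketch of an HMI writing a POI search request to a PPS control
     * object (step 2 above). Path and attribute names are hypothetical. */
    #include <fcntl.h>
    #include <string.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("/pps/services/navigation/control", O_WRONLY);
        if (fd == -1)
            return 1;

        /* One attribute per line: name:encoding:value. "json" marks a
         * JSON-encoded value; an empty encoding means a plain string. */
        const char *req =
            "msg::search-poi\n"
            "dat:json:{\"name\":\"coffee\"}\n";
        write(fd, req, strlen(req));

        close(fd);
        return 0;
    }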
By using PPS, the navigation service can make details of the route available to a variety of applications. For instance, it could publish trip information that a weather app could subscribe to. The app could then display the weather forecast for the destination, at the estimated time of arrival.

To give developers a jump start, the QNX CAR Platform comes pre-integrated with Elektrobit’s EB street director navigation software. This reference integration shows developers how to implement "command and control" between the HMI and the participating components, including the navigation engine, navigation service, window manager, and PPS interface. As the above diagram indicates, the reference implementation works with both of the HMIs — one based on HTML5, the other based on Qt — that the QNX CAR Platform supports out of the box.


Previous posts in the QNX CAR Platform series:
  • A question of concurrency
  • A question of architecture

Wednesday, October 22, 2014

A question of architecture

The second of a series on the QNX CAR Platform. In this installment, we start at the beginning — the platform’s underlying architecture.

In my previous post, I discussed how infotainment systems must perform multiple complex tasks, often all at once. At any time, a system may need to manage audio, show backup video, run 3D navigation, synch with Bluetooth devices, display smartphone content, run apps, present vehicle data, process voice signals, perform active noise control… the list goes on.

The job of integrating all these functions is no trivial task — an understatement if ever there was one. But as with any large project, starting with the right architecture, the right tools, and the right building blocks can make all the difference. With that in mind, let’s start at the beginning: the underlying architecture of the QNX CAR Platform for Infotainment.

The architecture consists of three layers: human machine interface (HMI), middleware, and platform.



The HMI layer
The HMI layer is like a bonus pack: it supports two reference HMIs out of the box, both of which have the same appearance and functionality. So what’s the difference? One is based on HTML5, the other on Qt 5. This choice demonstrates the underlying flexibility of the platform, which allows developers to create an HMI with any of several technologies, including HTML5, Qt, or a third-party toolkit such as Elektrobit GUIDE or Crank Storyboard.

A choice of HMIs
Mind you, the choice goes further than that. When you build a sophisticated infotainment system, it soon becomes obvious that no single tool or technology can do the job. The home screen, which may contain controls for Internet radio, hands-free calls, HVAC, and other functions, might need an environment like Qt. The navigation app, for its part, will probably use OpenGL ES. Meanwhile, some applications might be based on Android or HTML5. Together, all these heterogeneous components make up the HMI.

The QNX CAR Platform embraces this heterogeneity, allowing developers to use the best tools and application environments for the job at hand. More to the point, it allows developers to blend multiple app technologies into a single, unified user interface, where they can all share the same display, at the same time.

To perform this blending, the platform employs several mechanisms, including a component called the graphical composition manager. This manager acts as a kind of universal framework, providing all applications, regardless of how they’re built, with a highly optimized path to the display.

For example, look at the following HMI:



Now look at the HMI from another angle to see how it comprises several components blended together by the composition manager:



To the left, you see video input from a connected media player or smartphone. To the right, you see a navigation application based on OpenGL ES map-rendering software, with an overlay of route metadata implemented in Qt. And below, you see an HTML page that provides the underlying wallpaper; this page could also display a system status bar and UI menu bar across all screens.

For each component rendered to the display, the graphical composition manager allocates a separate window and frame buffer. It also allows the developer to control the properties of each individual window, including location, transparency, rotation, alpha, brightness, and z-order. As a result, it becomes relatively straightforward to tile, overlap, or blend a variety of applications on the same screen, in whichever way creates the best user experience.
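To give a feel for what this looks like in code, here's a hedged sketch that sets a few of those window properties through the Screen API (libscreen) that underlies the composition manager. The specific values, and the assumption that the window handle was created elsewhere, are mine:

    /* Sketch: positioning and blending one application window via the
     * QNX Screen API (libscreen). Values are illustrative and error
     * handling is omitted for brevity. */
    #include <screen/screen.h>

    void place_nav_window(screen_window_t win)
    {
        int pos[2]   = { 400, 0 };   /* x, y on the display */
        int size[2]  = { 880, 720 }; /* width, height in pixels */
        int zorder   = 2;            /* stack above the wallpaper layer */
        int rotation = 0;            /* degrees */
        int alpha    = 255;          /* fully opaque */

        screen_set_window_property_iv(win, SCREEN_PROPERTY_POSITION, pos);
        screen_set_window_property_iv(win, SCREEN_PROPERTY_SIZE, size);
        screen_set_window_property_iv(win, SCREEN_PROPERTY_ZORDER, &zorder);
        screen_set_window_property_iv(win, SCREEN_PROPERTY_ROTATION, &rotation);
        screen_set_window_property_iv(win, SCREEN_PROPERTY_GLOBAL_ALPHA, &alpha);

        /* The changes take effect the next time the window's content is posted. */
    }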

The middleware layer
The middleware layer provides applications with a rich assortment of services, including Bluetooth, multimedia discovery and playback, navigation, radio, and automatic speech recognition (ASR). The ASR component, for example, can be used to turn on the radio, initiate a Bluetooth phone call from a connected smartphone, or select a song by artist or song title.

I’ll drill down into several of these services in upcoming posts. For now, I’d like to focus on a fundamental service that greatly simplifies how all other services and applications in the system interact with one another. It’s called persistent publish/subscribe messaging, or PPS, and it provides the abstraction needed to cleanly separate high-level applications from low-level business logic and services.

PPS messaging provides an abstraction layer between system services and high-level applications

Let’s rewind a minute. To implement communications between software components, C/C++ developers must typically define direct, point-to-point connections that tend to “break” when new features or requirements are introduced. For instance, an application communicates with a navigation engine, but all connections enabling that communication must be redefined when the system is updated with a different engine.

This fragility might be acceptable in a relatively simple system, but it creates a real bottleneck when you are developing something as complex, dynamic, and quickly evolving as the design for a modern infotainment system. PPS addresses the problem by allowing developers to create loose, flexible connections between components. As a result, it becomes much easier to add, remove, or replace components without having to modify other components.

So what, exactly, is PPS? Here’s a textbook answer: an asynchronous object-based system that consists of publishers and subscribers, where publishers modify the properties of data objects and the subscribers to those objects receive updates when the objects have been modified.

So what does that mean? Well, in a car, PPS data objects allow applications to access services such as the multimedia engine, voice recognition engine, vehicle buses, connected smartphones, hands-free calling, and contact databases. These data objects can each contain multiple attributes, each attribute providing access to a specific feature — such as the RPM of the engine, the level of brake fluid, or the frequency of the current radio station. System services publish these objects and modify their attributes; other programs can then subscribe to the objects and receive updates whenever the attributes change.
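For a feel of the publishing side, here's a minimal C sketch of a service updating an attribute on a PPS object. As before, the object path and attribute name are made up for illustration; real vehicle objects vary from system to system:

    /* Sketch of a low-level service publishing vehicle data to PPS.
     * The object path and attribute name are hypothetical. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int publish_rpm(int rpm)
    {
        /* O_CREAT creates the object under /pps if it doesn't exist yet. */
        int fd = open("/pps/qnxcar/vehicle/engine", O_WRONLY | O_CREAT, 0666);
        if (fd == -1)
            return -1;

        /* "n" marks a numeric attribute; every subscriber to this object
         * is notified that the attribute changed. */
        char line[32];
        int len = snprintf(line, sizeof(line), "rpm:n:%d\n", rpm);
        write(fd, line, len);

        close(fd);
        return 0;
    }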

The PPS service is programming-language independent, allowing programs written in a variety of programming languages (C, C++, HTML5, Java, JavaScript, etc.) to intercommunicate, without any special knowledge of one another. Thus, an app in a high-level environment like HTML5 can easily access services provided by a device driver or other low-level service written in C or C++.

I’m only touching on the capabilities of PPS. To learn more, check out the QNX documentation on this service.

The platform layer
The platform layer includes the QNX OS and the board support packages, or BSPs, that allow the OS to run on various hardware platforms.

An inherently modular and extensible architecture
A BSP may not sound like the sexiest thing in the world — it is, admittedly, a deeply technical piece of software — but without it, nothing else works. And, in fact, one reason QNX Software Systems has such a strong presence in automotive is that it provides BSPs for all the popular infotainment platforms from companies like Freescale, NVIDIA, Qualcomm, and Texas Instruments.

As for the QNX Neutrino OS, you could write a book about it — which is another way of saying it’s far beyond the scope of this post. Suffice it to say that its modularity, extensibility, reliability, and performance set the tone for the entire QNX CAR Platform. To get a feel for what the QNX OS brings to the platform (and by extension, to the automotive industry), I invite you to visit the QNX Neutrino OS page on the QNX website.

Tuesday, October 21, 2014

A sweet ride? You’d better 'beleave' it

Is Autumn the best season for a long, leisurely Sunday drive? Well, I don’t know about your neck of the woods, but in my neck, the trees blaze like crimson, orange, and yellow candles, transfiguring back roads into cathedrals of pure color. When I see every leaf on every tree glow like a piece of sunlight-infused stained glass, I make a religious effort to jump behind the wheel and get out there!

Now, of course, you can enjoy your Autumn drive in any car worth its keep. But some cars make the ride sweeter than others — and the Mercedes S Class Coupe, with its QNX-powered infotainment system and instrument cluster, is deliciously caloric.

This isn’t a car for the prim, the proper, the austere. It’s for pure pleasure – whether you take pleasure in performance, luxury, or beauty of design. Or all three. The perfect car, in other words, for an Autumn drive. Which is exactly what the folks at Mercedes thought. In fact, they made a photo essay about it — check it out on their Facebook page.


Source: Mercedes

Thursday, October 16, 2014

Attending SAE Convergence? Here’s why you should visit booth 513

Cars and beer don’t mix. But discussing cars while having a beer? Now you’re talking. If you’re attending SAE Convergence next week, you owe it to yourself to register for our “Spirits And Eats” event at 7:00 pm Tuesday. It’s the perfect occasion to kick back and enjoy the company of people who, like yourself, are passionate about cars and car electronics. And it isn’t a bad networking opportunity either — you’ll meet folks from a variety of automakers, Tier 1s, and technology suppliers in a relaxed, convivial atmosphere.

But you know what? It isn’t just about the beer. Or the company. It’s also about the Benz. Our digitally modded Mercedes-Benz CLA45 AMG, to be exact. It’s the latest QNX technology concept car, and it’s the perfect vehicle (pun fully intended) for demonstrating how QNX technology can enable next-generation infotainment systems. Highlights include:

  • A multi-modal user experience that blends touch, voice, and physical controls
  • A secure application environment for Android, HTML5, and OpenGL ES
  • Smartphone connectivity options for projecting smartphone apps onto the head unit
  • A dynamically reconfigurable digital instrument cluster that displays turn-by-turn directions, notifications of incoming phone calls, and video from front and rear cameras
  • Multimedia framework for playback of content from USB sticks, DLNA devices, etc.
  • Full-band stereo calling — think phone calls with CD quality audio
  • Engine sound enhancement that synchronizes synthesized engine sounds with engine RPM

Here, for example, is the digital cluster:



And here is a closeup of the head unit:



And here’s a shot of the cluster and head unit together:



As for the engine sound enhancement and high-quality hands-free audio, I can’t reproduce these here — you’ll have to come see the car and experience them first hand. (Yup, that's an invite.)

If you like what you see, and are interested in what you can hear, visit us at booth #513. And if you'd like to schedule a demo or reserve some time with a QNX representative in advance, we can accommodate that, too. Just send us an email.

Wednesday, October 15, 2014

Are you ready to stop micromanaging your car?

I will get to the above question. Honest. But before I do, allow me to pose another one: When autonomous cars go mainstream, will anyone even notice?

The answer to this question depends on how you define the term. If you mean completely and absolutely autonomous, with no need for a steering wheel, gas pedal, or brake pedal, then yes, most people will notice. But long before these devices stop being built into cars, another phenomenon will occur: people will stop using them.

Allow me to rewind. Last week, Tesla announced that its Model S will soon be able to “steer to stay within a lane, change lanes with the simple tap of a turn signal, and manage speed by reading road signs and using traffic-aware cruise control.” I say soon because these functions won't be activated until owners download a software update in the coming weeks. But man, what an update.

Tesla may now be at the front of the ADAS wave, but the wave was already forming — and growing. Increasingly, cars are taking over mundane or hard-to-perform tasks, and they will only become better at them as time goes on. Whether it’s autonomous braking, automatic parking, hill-descent control, adaptive cruise control, or, in the case of the Tesla S, intelligent speed adaptation, cars will do more of the driving and, in so doing, socialize us into trusting them with even more driving tasks.

Tesla Model S: soon with autopilot
In other words, the next car you buy will prepare you for not having to drive the car after that.

You know what’s funny? At some point, the computers in cars will probably become safer drivers than humans. The humans will know it, but they will still clamor for steering wheels, brake pedals, and all the other traditional accoutrements of driving. Because people like control. Or, at the very least, the feeling that control is there if you want it.

It’s like cameras. I would never think of buying a camera that didn’t have full manual mode. Because control! But guess what: I almost never turn the mode selector to M. More often than not, it’s set to Program or Aperture Priority, because both of these semi-automated modes are good enough, and both allow me to focus on taking the picture, not on micromanaging my camera.

What about you? Are you ready for a car that needs a little less micromanagement?

Tuesday, October 7, 2014

A question of concurrency

The first of a new series on the QNX CAR Platform for Infotainment. In this installment, I tackle the a priori question: why does the auto industry need this platform, anyway?

Define your terms, counseled Voltaire, and in keeping with his advice, allow me to begin with the following:

Concurrency \kən-kûr'-ən-sē\ n (1597) Cooperation, as of agents, circumstances, or events; agreement or union in action.

A good definition, as far as it goes. But it doesn’t go far enough for the purposes of this discussion. Wikipedia comes closer to the mark:

“In computer science, concurrency is a property of systems in which several computations execute simultaneously, and potentially interact with each other.”

That’s better, but it still falls short. However, the Wikipedia entry also states that:

“the base goals of concurrent programming include correctness, performance and robustness. Concurrent systems… are generally designed to operate indefinitely, including automatic recovery from failure, and not terminate unexpectedly.”

Now that’s more like it. Concurrency in computer systems isn’t simply a matter of doing several things all at once; it’s also a matter of delivering a solid user experience. The system must always be available and it must always be responsive: no “surprises” allowed.

This definition seems tailor-made for in-car infotainment systems. Here, for example, are some of the tasks that an infotainment system may perform:

  • Run a variety of user applications, from 3D navigation to Internet radio, based on a mix of technologies, including Qt, HTML5, Android, and OpenGL ES
  • Manage multiple forms of input: voice, touch, physical buttons, etc. 
  • Support multiple smartphone connectivity protocols such as MirrorLink and Apple CarPlay 
  • Perform services that smartphones cannot support, including:
    • HVAC control
    • discovery and playback of multimedia from USB sticks, DLNA devices, MTP devices, and other sources
    • retrieval and display of fuel levels, tire pressure, and other vehicle information
    • connectivity to Bluetooth devices
  • Process voice signals to ensure the best possible quality of phone-based hands-free systems — this in itself can involve many tasks, including echo and noise removal, dynamic noise shaping, speech enhancement, etc. 
  • Perform active noise control to eliminate unwanted engine “boom” noise 
  • Offer extremely fast bootup times; a backup camera, for example, must come up within a second or two to be useful
     
Juggling multiple concurrent tasks
The primary user of an infotainment system is the driver. So, despite juggling all these activities, an infotainment system must never show the strain. It must always respond quickly to user input and critical events, even when many activities compete for system resources. Otherwise, the driver will become annoyed or, worse, distracted. The passengers won’t be happy, either.

Still, that isn’t enough. Automakers also need to differentiate themselves, and infotainment serves as a key tool for achieving differentiation. So the infotainment system must not simply perform well; it must also allow the vehicle, or line of vehicles, to project the unique values, features, and brand identity of the automaker.

And even that isn’t enough. Most automakers offer multiple vehicle lines, each encompassing a variety of configurations and trim levels. So an infotainment design must also be scalable; that way, the work and investment made at the high end can be leveraged in mid-range and economy models. Because ROI.

Projecting a unique identity
But you know what? That still isn’t enough. An infotainment system design must also be flexible. It must, for example, support new functionality through software updates, whether such updates are installed through a storage device or over the air. And it must have the ability to accommodate quickly evolving connectivity protocols, app environments, and hardware platforms. All with the least possible fuss.

The nitty and the gritty
Concurrency, performance, reliability, differentiation, scalability, flexibility — a tall order. But it’s exactly the order that the QNX CAR Platform for Infotainment was designed to fill.

Take, for example, product differentiation. If you look at the QNX-powered infotainment systems that automakers are shipping today, one thing becomes obvious: they aren’t cookie-cutter systems. Rather, they each project the unique values, features, and brand identity of each automaker — even though they are all built on the same, standards-based platform.

So how does the QNX CAR Platform enable all this? That’s exactly what my colleagues and I will explore over the coming weeks and months. We’ll get into the nitty and sometimes the gritty of how the platform works and why it offers so much value to companies that develop infotainment systems in various shapes, forms, and price points.

Stay tuned.

POSTSCRIPT: Read the next installment of the QNX CAR Platform series, A question of architecture.

Thursday, September 18, 2014

A glaring look at rear-view mirrors

Some reflections on the challenge of looking backwards, followed by the vexing question: where, exactly, should video from a backup camera be displayed?

Mirror, mirror, above the dash, stop the glare and make it last! Okay, maybe I've been watching too many Netflix reruns of Bewitched. But mirror glare, typically caused by bright headlights, is a problem — and a dangerous one. It can create temporary blind spots on your retina, leaving you unable to see cars or pedestrians on the road around you.

Automotive manufacturers have offered solutions to this problem for decades. For instance, many car mirrors now employ electrochromism, which allows the mirror to dim automatically in response to headlights and other light sources. But when, exactly, did the first anti-glare mirrors come to market?

According to Wikipedia, the first manual-tilt day/night mirrors appeared in the 1930s. These mirrors typically use a prismatic, wedge-shaped design in which the rear surface (which is silvered) and the front surface (which is plain glass) are at angles to each other. In day view, you see light reflected off the silvered rear surface. But when you tilt the mirror to night view, you see light reflected off the unsilvered front surface, which, of course, has less glare.

Manual-tilt day/night mirrors may have debuted in the 30s, but they were still a novelty in the 50s. Witness this article from the September 1950 issue of Popular Science:



True to their name, manual-tilt mirrors require manual intervention: You have to take your hand off the wheel to adjust them, after you’ve been blinded by glare. Which is why, as early as 1958, Chrysler was demonstrating mirrors that could tilt automatically, as shown in this article from the October 1958 issue of Mechanix Illustrated:


Images: Modern Mechanix blog

Fast-forward to backup cameras
Electrochromic mirrors, which darken electronically, have done away with the need to tilt, either manually or automatically. But despite their sophistication, they still can't overcome the inherent drawbacks of rear-view mirrors, which provide only a partial view of the area behind the vehicle — a limitation that contributes to backover accidents, many of them involving small children. Which is why NHTSA has mandated the use of backup cameras by 2018 and why the last two QNX technology concept cars have shown how video from backup cameras can be integrated with other content in a digital instrument cluster.

Actually, this raises the question: just where should backup video be displayed? In the cluster, as demonstrated in our concept cars? Or in the head unit, the rear-view mirror, or a dedicated screen? The NHTSA ruling doesn’t mandate a specific device or location, which isn't surprising, as each has its own advantages and disadvantages.

Consider, for example, ease of use: Will drivers find one location more intuitive and less distracting than the alternatives? In all likelihood, the answer will vary from driver to driver and will depend on individual cognitive styles, driving habits, and vehicle design.

Another issue is speed of response. According to NHTSA’s ruling, any device displaying backup video must do so within 2.5 seconds of the car shifting into reverse. Problem is, the ease of complying with this requirement depends on the device in question. For instance, NHTSA acknowledges that “in-mirror displays (which are only activated when the reverse gear is selected) may require additional warm-up time when compared to in-dash displays (which may be already in use for other purposes such as route navigation).”

At first blush, in-dash displays such as head units and digital clusters have the advantage here. But let’s remember that booting quickly can be a challenge for these systems because of their greater complexity — many offer a considerable amount of functionality. So imagine what happens when the driver turns the ignition key and almost immediately shifts into reverse. In that case, the cluster or head unit must boot up and display backup video within a handful of seconds. It's important, then, that system designers choose an OS that not only supports rich functionality, but also allows the system to start up and initialize applications in the least time possible.