Wednesday, May 21, 2014

ZENRIN DataCom integrates mobile navigation app with QNX CAR Platform

Don't know if you've noticed, but a variety of navigation software vendors have been integrating their solutions with the QNX CAR Platform for Infotainment. In the last few months alone, QNX has announced partnerships with Nokia HERE, Kotei Informatics, and AISIN AW — this in addition to its longstanding partnerships with navigation leaders like Elektrobit, TCS, and TeleNav.

The new partnerships are a boon to automakers and Tier 1 suppliers, especially those that target multiple geographies. More than ever, these companies can choose the navigation solution, or solutions, best suited to a given country or region.

The good news continues with ZENRIN DataCom, a leading provider of mapping services and products from Japan. ZENRIN is now integrating its Its-mo NAVI [Drive] 2013 application — which offers fuel prices, nearby parking spots, and other location-based features — with the QNX CAR Platform. In fact, ZENRIN and QNX demonstrated this integration last week at the Smartphone Japan conference in Tokyo.

The choice of venue may seem surprising, but it makes sense: Its-mo NAVI [Drive] is a smartphone app that, thanks to the collaboration between ZENRIN and QNX, can now run on head units as well. More to the point, this integration illustrates the benefit of building support for mobile app environments into a car infotainment platform: automakers can tap into a much larger developer community.

A spokesperson from ZENRIN DataCom says it best: “The automotive market in Japan and the rest of Asia is a vibrant and compelling environment for app developers but market volume is significantly lower than that for smartphones. A cross-platform concept is key as it enables apps to run on both smartphones and vehicle head units with minimal changes. The QNX CAR Platform, with its rich support for mobile application environments, is a very attractive feature for app developers in the mobile world.”

If you’d like to learn more about ZENRIN and its navigation app, I invite you to read the press release and visit the ZENRIN website.

Wednesday, May 14, 2014

Using a smaller BOM to make less boom

Don't know about you, but where I live, the price of gas has rocketed through the troposphere and is fast approaching the upper stratosphere. Which is to say, it has gone through the proverbial roof. It was almost $1.40 a liter (over $5 a gallon) the last time I stopped at a pump and is set to climb even higher, now that summer is approaching.

Small wonder that, for many car buyers, fuel economy is top of mind. Automakers are wise to this and have adopted a variety of measures to make their cars sip gas more slowly. For instance, many cars now deactivate cylinders when engine load is light and use fewer sound-damping materials to shed weight — because schlepping fewer pounds means less work, and less work means less gas.

These techniques save gas all right, but at a price: increased engine “boom” noise that can both annoy and fatigue the driver — not to mention everyone else in the vehicle. That's a problem. To address it, automakers use active noise control, or ANC, which plays noise-cancelling signals over speakers in the vehicle cabin. All well and good, but until now, ANC solutions have used dedicated hardware, which can drive up Bill of Materials (BOM) costs and make it difficult to leverage the latest ANC technologies.

What to do? That's the subject of a recent whitepaper by my inestimable colleague Tina Jeffrey. Tina outlines some design considerations for ANC systems (choosing the right microphones makes a difference, for example) but mostly, she focuses on the advantages of running ANC logic on the processor or DSP of the car's infotainment system — as opposed to on a dedicated ANC module.

The benefits are many, including lower BOM costs, greater design flexibility, better cooperation between various acoustic functions in the car and — here's the one I like — less boom. But why sit there listening to me drone on about this? Download Tina's paper now and get the real deal.




Software-based ANC: a smaller BOM, with less boom.

Tuesday, May 6, 2014

Infotainment gets an upgrade

So, a few weeks ago, we invited a group of Canadian journalists to come visit the QNX garage and get a hands-on intro to our latest concept cars. Sami Haj-Assaad of AutoGuide was one of our esteemed guests, and here's his take on what he experienced:



Did you notice? The floors in the video are remarkably clean for a working garage. That's because we set loose a QNX-powered robot vacuum from Neato Robotics before the cameras started rolling. If you haven't yet seen a Neato Botvac in action, you owe it to yourself to check out this video.

If you're a visual designer, keep reading...
Would you, by any chance, be a creative, experienced, and just plain awesome visual designer? Because if you are, have I got a job for you. Right now, QNX is looking for a designer to help create its next generation of concept vehicles — and lots of other things to boot. This is a one-of-a-kind opportunity to put your mark on some very cool technology; find out more here.

Wednesday, April 30, 2014

Bad idea, good idea

Why equip cars with exterior speakers? I thought you'd never ask. As it turns out, it can be a really bad idea. Or a really good one.

Here, for example, is a case where bad arguably prevails:


Source: Modern Mechanix blog

No doubt, the person who devised this system in 1931 thought it a brilliant, or at least entertaining, idea. Fortunately, common sense prevailed and the era of the "auto speaker," with its potential to scare the living daylights out of pedestrians, never came to pass.

But here's the thing: equipping cars with exterior speakers can be a great idea, when done for the right reasons. For example, some hybrid and electric vehicles are dangerously quiet for bicyclists and visually impaired pedestrians. Adding speakers to emit audible alerts or to project synthesized engine sounds can be just what the doctor ordered. Or rather, what the parliament ordered: earlier this month, members of the European Parliament stated that they want automakers to install acoustic alerting systems in hybrid vehicles by July 2019.

Mind you, safety isn't the only reason to project synthesized engine sounds. For example, fuel-saving techniques can make even powerful engines sound wimpy — a problem when high performance is a key ingredient of a car's branding. In that case, the automaker may wish to project synthesized engine sounds over both external and internal speakers. The external speakers can help preserve the car's wow factor (provided they're not too loud) and the internal speakers, in particular, can make it easier for car owners who drive manual to shift gears by ear. The QNX concept car for acoustics offers a good example of this technology in action.

All of which is to say, engine sound enhancement, also known as ESE, is here to stay. And it's not a bad time to be in the automotive-speaker business, either.

Wednesday, April 23, 2014

The next chapter in car infotainment: seamless mobile integration

Tina Jeffrey
According to a survey from Forrester Research, 50% of North Americans who plan to buy cars in the next 12 months say that technology options will play an important role in their purchasing decisions. The fact is, consumers want to remain connected all the time; they don’t want to park their digital lifestyle while driving. Which raises the question: what’s an automaker to do?

Allow consumers to bring in content and apps on their mobile devices. We are becoming increasingly attached to our smartphones, and this is driving a trend towards mobile-centric car infotainment. The trend is of particular benefit to buyers of low-end vehicles, in which built-in features such as navigation and speech recognition can be cost prohibitive. A smartphone-driven head unit reduces costs by leveraging the existing connectivity and processing power of the mobile device; it also provides easy access to apps the consumer has already downloaded. In fact, integration between the mobile device and head unit offers numerous benefits: it helps the car keep pace with the consumer-device lifecycle, it endows the car with app store capabilities, and it lets the car connect to the cloud through the mobile device, eliminating the need for a built-in connection.

Using the phone's connectivity and processing power to deliver apps and software updates.
Design in-vehicle systems to be compatible with all leading smartphones. To satisfy this requirement, the vehicle must support both proprietary and standards-based connectivity protocols, using Bluetooth, USB, and Wi-Fi. Automakers will need to deliver platforms that include support for CarPlay, iPod Out (for older Apple devices), DLNA (for BlackBerry phones and other devices), MirrorLink, and Miracast, as well as the solution that the Open Automotive Alliance (OAA) promises to reveal later this year. By offering this widespread connectivity, automakers can avoid snubbing any significant portion of their prospective customer base.

Leverage and enable the mobile development community to build the apps consumers want. With companies like Apple and Google now in the fray, native brought-in apps will be a certainty, but automakers should continue to embrace HTML5 as an application platform, given its “write once, run anywhere” mantra. HTML5 remains the most widely used cross-platform application environment and it gives automakers access to the largest pool of developers worldwide. And, as the first W3C vehicle information API specification is ratified, HTML5 application developers will be able to access vehicle information and develop compelling, car-appropriate apps that become an integral part of our daily commute.

Thursday, April 10, 2014

12 autonomous car articles worth reading

You know what's fascinating about autonomous cars? Everything. They raise as many questions as they do answers, and many of those questions drive right to the heart of how we see ourselves and the world around us. For instance, will autonomous cars introduce a new era of independence for the elderly? Will they change the very nature of car ownership? Will they reduce traffic fatalities and help make traffic jams a thing of the past?

Technically, legally, economically, and socially, autonomous cars are a game-changer. I like thinking about them, and I like reading what other people think about them. And just what have I been reading? I thought you'd never ask. Here, in no particular order, are 12 articles that have caught my eye in the last month.

So there you have it. I don't, of course, agree with every point in every article, but they have all taught me something I didn't know or clarified something I already knew. I hope they do the same for you.

Tuesday, April 8, 2014

QNX helps drive new autonomous vehicle project

Have I ever mentioned the QNX-in-Education program? Over the decades, it has supported an array of university research projects, in fields ranging from humanoid robotics to autonomous aircraft. Harvard University, for example, has been a program member for more than 20 years, using QNX technology to measure and analyze ozone depletion in the stratosphere.

So, on the one hand, QNX Software Systems supports scientific and engineering research. On the other hand, it's a leader in automotive software. You know what that means: it was only a matter of time before those two passions came together. And in fact, QNX has just announced its role in DEEVA, a new autonomous car project from the Artificial Vision and Intelligent Systems Laboratory (VisLab) of the University of Parma.

A glimpse of DEEVA (Source: VisLab).

The folks at VisLab already have several autonomous projects under their belts. Last year, for example, they launched a self-driving car that can negotiate downtown rush-hour traffic and complex situations like traffic circles, traffic lights, and pedestrian crossings. DEEVA incorporates the team's latest insights into autonomous drive and features a rich set of sensors that deliver a complete 3D view of the vehicle's surroundings.

With its 30-year history in safety-critical systems, QNX OS technology is a natural choice for a project like DEEVA. According to Professor Alberto Broggi, president and CEO of VisLab, “in the design of our vehicle, we selected building blocks offering high reliability with proven safety records; the operating system powering the vital elements of the vehicle is one of those and is why we chose the QNX OS.”

The QNX OS controls several systems in DEEVA, including path and trajectory planning, realtime fusion of laser data and visual data, and the user interface.

You can read the press release here and see photos of DEEVA here.

Tuesday, March 18, 2014

My next car will know when I'm in a good mood

It will also know when I'm sad, grumpy, or just plain mad. Okay, it won't quite figure it out on its own — I will have to tell it. How will my car get so smart? Because Gracenote has been working to personalize the music experience in a way that dynamically adapts content to the driver’s mood and musical taste.

It's not just about play-listing locally stored content and displaying album art anymore. The folks at Gracenote can now merge multiple occupants’ music collections and find common ground. They can create stations that incorporate cloud-based content from multiple sources. And they can adapt the nature of the content played in the car to the mood of the occupants through their Mood Grid technology.

This not only makes the music experience intensely personal, it does so automagically, keeping the driver’s eyes on the road and hands on the wheel. Here’s a short video from 2014 CES that provides a quick overview — on a system powered by QNX technology, of course :-)



Wednesday, March 12, 2014

Crowd-sourced maps: the future of in-car navigation?

Guest post by Daniel Gast, innovation manager, Elektrobit Automotive

Crowdsourcing has become a major trend. Even McDonald’s has been getting into the act, asking consumers to submit new ideas for burgers. In 2013 the company’s “My Burger 3.0” campaign elicited an enormous response in Germany, with more than 200,000 burger ideas and more than 150,000 people voting for their favorites.

From burgers we go to a key component of navigation systems: digital maps. OpenStreetMap (OSM), a well-known and globally crowdsourced project, is dedicated to creating free worldwide maps and has attracted more than 100,000 registered contributors. These people volunteer their services, creating digital maps without being paid; take a look at their work at www.openstreetmap.org.

Why is the amount of data behind OSM constantly growing?
Creating OSM maps is a kind of charity work, open to all to contribute and to use under free licenses. The technology behind it is very user friendly, which helps ensure long-term loyalty among contributors. But probably the most important factor is the fun it brings. Contributing to the project consists of recording streets, buildings, bridges, forests, points of interest, and other items that you would benefit from having in a map. For many OSM editors, this is their favorite hobby — they are “addicts” in the best sense of the word. They love the project and aspire to create a perfect map. That’s why the ever-growing body of map data is of such good quality.

Can automakers and drivers benefit from crowd-sourced map data like OpenStreetMap?
Yes, they can. Because so many people contribute to the project, the amount of data is growing continuously. Every contributor can add or edit content at any time, and changes are integrated into the public OSM database immediately.

In the beginning only streets were collected, but because the data format is extensible, editors can add data like parking spots or pedestrian walkways. For instance, a group of firemen added hydrants for their region to the map material, using OSM’s flexibility to define and add new content. Automakers could take advantage of this flexibility to integrate individual points of interest like car repair shops or to drive business models with third-party partners, such as couponing activities.

Because it’s free of charge, OSM data could, in the mid to long term, develop into a competitive and low-priced alternative to the databases provided by commercial map data suppliers.

For their part, automakers could easily provide toolkits that allow drivers to edit wrong or missing map data on the go. Or even better, allow them to personalize maps with individual content like preferred parking places or favorite burger restaurants.

Are automotive infotainment systems ready for these new kinds of map data?
From a technical point of view, automotive software like the QNX CAR Platform for Infotainment or EB street director navigation can, without modifications, interpret this new kind of data, since the OSM map data can be converted to a specific format, much like commercial map data. It’s like building your own burger: the bread and meat remain the same, but you opt for tomatoes instead of onions.

That said, some gaps in the OSM data must be filled before it can provide full-blown automotive navigation. Features like traffic signs, lane information, and turn restrictions are available, but coverage remains limited. Also, the regional coverage varies widely — coverage in Germany, for example, is much higher than in countries in Africa or South America.

From the automaker’s perspective, it could be an interesting challenge to encourage the community to contribute this type of content. One way to support this idea is to develop an OSM-based navigation system for mobile use. After reaching maturity, the system could easily be merged into the vehicle, allowing drivers to get premium directions from automotive-approved infotainment systems like EB street director — which we saw at CES in the QNX CAR Platform — for less money.



Daniel Gast has worked for Elektrobit since 2000, initially as a software engineer and later as product manager for EB street director navigation. He then took over responsibility for the navigation solutions business area and now coordinates innovation management for Elektrobit Automotive. Daniel studied computer science in Erlangen.

Keep up to date with Elektrobit's latest automotive news and products by signing up for the EB Automotive Newsletter — Ed.

Tuesday, March 11, 2014

Tackling fragmentation with a standard vehicle information API

Tina Jeffrey
Has it been a year already? In February 2013 QNX Software Systems became a contributing member of the W3C’s Automotive Web Platform Business Group, which is dedicated to accelerating the adoption of Web technologies in the auto industry. Though it took a while to rev up, the group is now in full gear and we’re making excellent progress towards our first goal of defining a vehicle information API for passenger vehicles.

The plan is to establish a standard API for accessing speed, RPM, tire pressure, and other vehicle data. The API will enable consistent app development across automakers and thereby reduce the fragmentation that affects in-vehicle infotainment systems. Developers will be able to use the API for apps running directly on the head unit as well as for apps running on mobile devices connected to the head unit.
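To make the idea concrete, here is a rough sketch of what such an API could look like to an app developer. Everything in it (the VehicleSignal class, the vehicleSpeed and engineSpeed attribute names) is an illustrative assumption of mine, not the W3C draft itself; only the get/subscribe callback style reflects the approach described by the group's guidelines.

```typescript
// A rough sketch of what a standard vehicle information API could look
// like to an app developer. All names here (VehicleSignal, vehicleSpeed,
// engineSpeed) are illustrative assumptions, not the W3C draft itself.

type Listener<T> = (value: T) => void;

// One vehicle attribute: a current value plus change notifications,
// mirroring a get/subscribe style of access.
class VehicleSignal<T> {
  private listeners = new Map<number, Listener<T>>();
  private nextId = 0;
  constructor(private value: T) {}

  get(): T {
    return this.value;
  }

  subscribe(cb: Listener<T>): number {
    const id = this.nextId++;
    this.listeners.set(id, cb);
    return id;
  }

  unsubscribe(id: number): void {
    this.listeners.delete(id);
  }

  // Simulate an update arriving from the vehicle bus.
  update(value: T): void {
    this.value = value;
    this.listeners.forEach((cb) => cb(value));
  }
}

// A couple of attributes from the "running status" functional group.
const vehicle = {
  vehicleSpeed: new VehicleSignal(0),  // km/h
  engineSpeed: new VehicleSignal(800), // RPM
};

// An app reacts to speed changes without polling.
const speedLog: number[] = [];
vehicle.vehicleSpeed.subscribe((kmh) => speedLog.push(kmh));
vehicle.vehicleSpeed.update(50);
vehicle.vehicleSpeed.update(100);
console.log(speedLog); // speedLog now holds [50, 100]
```

The attraction for developers is the pattern rather than the particular names: one consistent way to read a value or react to changes, regardless of which automaker's head unit the app lands on.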

Parallel processing
Let me walk you through our work to date. To get started, we examined API specifications from four member organizations: QNX, Webinos, Intel, and GENIVI. Next, we collected a superset of the attributes from each spec and categorized each attribute into one of several functional groups: vehicle information, running status, maintenance, personalization, driving safety, climate/environment, vision systems, parking, and electric vehicles. Then, we divvied up these functional groups among teams who worked in parallel: each team drafted an initial API for their allotted functional group before sharing it with the members at large.

Throughout this effort, we documented a set of API creation guidelines to capture the intent and reasoning behind our decisions. These guidelines cover details such as data representation, attribute value ranges and increments, attribute naming, and use of callback functions. The guidelines also capture the rules that govern how to grow or extend the APIs, if and when necessary.

Driving towards closure
In December the business group editors began to pull the initial contributions into a single draft proposal. This work is progressing and will culminate in a members’ face-to-face meeting in mid-March in Santa Clara, California, where we will review the draft proposal in its entirety and drive this first initiative towards closure.

I’m sure there will be lots more to talk about, including next potential areas of focus for the group. If you're interested in following our progress, here’s a link to the draft API.

Enjoy!

Tuesday, March 4, 2014

Self-driving cars? We had ’em back in ’56

Quick: What involves four-part harmony singing, control towers in the middle of the desert, and a dude smoking a fat stogie? Give up? It's the world of self-driving cars, as envisioned in 1956.

No question, Google’s self-driving car has captured the public imagination. But really, the fascination is nothing new. For instance, at the 1939 World’s Fair, people thronged to see GM’s Futurama exhibit, which depicted a world of cars controlled by radio signals. GM continued to promote its autonomous vision in the 1950s with the Firebird II, a turbine-powered car that could drive itself by following an "electronic control strip" embedded in the road. Here, for example, is a GM-produced video from 1956 in which a musically adept family goes for an autonomous drive:



Fast-forward to today, when it seems that everyone is writing about self-driving cars. Most articles don’t add anything new to the discussion, but their ubiquity suggests that, as a society, we are preparing ourselves for a future in which we give up some degree of control to our vehicles. I find it fascinating that an automaker was at the avant-garde of this process as far back as the 1930s. Talk about looking (way) ahead.

And you know what’s cool? Comparing the vision of the good life captured in the above video with the vision captured in the “Imagined” video that QNX produced 56 years later. In both cases, autonomous drive forms part of the story. And in both cases, an autonomous car helps to bring family together, though in completely different ways. It seems that, no matter how much technology (and our vision of technology) changes, the things closest to our hearts never do:



One more thing. Did you notice how much the sets in the GM video look like something straight out of The Jetsons, right down to the bubble-domed car? They did to me. Mind you, the video predates The Jetsons by several years, so if anything, the influence was the other way around.


Tuesday, February 25, 2014

QNX drives home (quietly) with embedded award

Every year, the organizers of the Embedded World conference hold the embedded AWARDs to recognize the most innovative software, hardware, and tools for embedded developers. And this year, they selected QNX Acoustics for Active Noise Control, the new QNX solution for eliminating engine "boom" noise in cars, as the winner in the software category.

This marks the third time that QNX Software Systems has taken home an embedded AWARD. The company also won in 2004 for power management technology and in 2006 for its multicore tools and OS — and in 2010, it nabbed a finalist spot for its persistent publish/subscribe messaging. That's a lot of plaques.

QNX Acoustics for ANC eliminates the need for costly dedicated ANC hardware.
So why did QNX Acoustics for ANC get the blue ribbon treatment? I can't speak on behalf of the Embedded World judges, but check out this overview I wrote a few months ago. Or better yet, read this deeper dive from my colleague Tina Jeffrey.

Or skip the middle man entirely and check out the product page, which does a nice job of summarizing what QNX Acoustics for ANC is all about.

Thursday, February 20, 2014

When is a road trip not a road trip?

The über-cool modified Mercedes-Benz CLA45 AMG that QNX unveiled at CES is on its inaugural road trip. Well, sort of. It's actually winging its way to Barcelona for Mobile World Congress 2014.

For those who didn't have a chance to see it at CES, the car incorporates state-of-the-art voice recognition; navigation from Elektrobit, HERE, and Kotei; smartphone connectivity based on Miracast and on MirrorLink from RealVNC; advanced multimedia streaming, including iHeartRadio; and a reconfigurable digital instrument cluster, all delivered in a user-centric, multi-modal experience.

For those who have seen it, it's still worth taking the time for another look, because this time around it's powered by Qualcomm’s Snapdragon S602A Automotive Solution. We announced the relationship with Qualcomm at CES, and just over a month later we're showcasing it in the Mercedes. Imagine what we'll do by Telematics Detroit.

You can check it out in the Qualcomm booth, Hall 3, Mobile World Congress 2014.

Now powered by Qualcomm’s Snapdragon S602A Automotive Solution: the latest QNX technology concept car, based on a Mercedes CLA45 AMG.

Wednesday, February 12, 2014

Frankenstein and the future networked car

So what do Frankenstein and the future networked car have in common, you ask? Simple: both are compelling stories brought to life in Geneva, Switzerland.

In Mary Shelley’s Frankenstein, the creature is seen climbing Mont-Salève after having fled Geneva during a lightning storm:

“I thought of pursuing the devil; but it would have been in vain, for another flash discovered him to me hanging among the rocks of the nearly perpendicular ascent of Mont-Salève.”

Mont-Salève, overlooking Geneva
Photo: Benoit Kornmann
Of course, the future networked car is a very different type of story, but compelling nonetheless. The laboratory in this story is the ITU Symposium on The Future Networked Car, held in conjunction with the Geneva Auto Show on March 5 to 6, where many new ideas will be brought to life by convening leaders and technical experts from the automotive and ICT communities.

The event, organized by the International Telecommunications Union (ITU), will consist of high-level dialogues and several technical sessions; these include a session on integrating nomadic devices in cars, where I will discuss how technology standards can help minimize driver distraction. The dialogues will cover road safety and innovation for the future car, and will feature key leaders such as the presidents of Fédération Internationale de l’Automobile (Jean Todt) and Infiniti (Johan de Nysschen). The technical sessions will explore automated driving, connected car use cases, emergency services, and, of course, nomadic device integration. Speakers for these sessions come from a mix of automakers, tier one suppliers, ICT companies, standards development organizations (SDOs), industry groups, and government agencies.

The symposium also includes a session jointly organized by the ITU and UNECE Inland Transport Committee that deals with the human factors and regulatory issues introduced by automated driving. This session is an encouraging sign that the ITU and UNECE will continue the collaboration they started last June (see my previous post, “UN agencies take major step towards international standards for driver distraction”).

Hope to see you in Geneva!

Monday, February 10, 2014

Why I should have gone to CES this year

No problem, I said, I'll be happy to stay back at the office. After all, somebody has to hold down the fort while everyone is at CES, and it may as well be me.

Of course, I didn't know what Audi was bringing to the show. Because if I did, I wouldn't have been so willing to take one for the team. If you're wondering what I am talking about, it's the new user-programmable instrument cluster for the upcoming 2015 Audi TT. It's based on the QNX CAR Platform for Infotainment, and it's about the coolest thing I've seen in a car, ever — even if I haven't yet had a chance to see it in person.

Roll the tape...





Wednesday, February 5, 2014

My top 10 QNX Auto posts from 2013

Normally, people write this kind of post at the beginning or end of a calendar year. But as an old friend once said, “Paul defines his own kind of normal.” He may have been right, I don’t know. What I do know is that this is definitely a personal list. It consists of posts that either made me laugh, taught me something I didn’t know, or helped me see things in a new light. I hope they do the same for you.

Disclosure: I wrote a couple of the posts in question. Because, sometimes, the best way to learn about something or see it in a new light is to write about it. :-)

Okay, enough preliminaries, let’s get to it…

  • What happens when autonomous becomes ubiquitous? — One question, seventeen answers.
     
  • Top 10 lessons learned from more than a decade in automotive — When it comes to software in the car, John Wall is the man.
     
  • Protecting software components in an ISO 26262 system — Sometimes, software components can be downright delinquent.
     
  • Why doesn’t my navigation system understand me? — Big data might be important, but small data can add a personal touch.
     
  • Top 10 challenges facing the ADAS industry — For ADAS systems to be successful, a safety culture must be embedded in every organization in the supply chain. And that’s just the first challenge.
     
  • Reducing driver distraction with ICTs — Yes, mobile phones can contribute to driver distraction. But they can also help solve the problem.
     
  • A sound approach to creating a quieter ride — Paradoxically, the best way to eliminate engine noise is to generate noise.
     
  • What's the word on HTML5? — If you want to know what experts at Audi, OnStar, Gartner, Pandora, TCS, and QNX think about HTML5 in the car, this is the post with the most (videos, that is).
     
  • A matter of context — A look at how digital instrument clusters can help provide the right information, at the right time.
     
  • My top moments of 2013 — Because this reminds me of the fantastic momentum QNX is building in automotive.
     
  • HTML5 blooper reel — Because laughter.

Oops, I guess that makes 11.

Tuesday, February 4, 2014

Head to the polls and vote for your favorite CES Car of Fame

Over the last couple of months we have recapped the stars of the QNX garage – our technology concept cars and reference vehicle — in the CES Cars of Fame series. And now, we are opening the floor to you!

From today through February 14, you can vote for your favorite of the vehicles we have featured at CES. Did the eye-catching Bentley strike your fancy, or did the updated Jeep put you into another gear? It’s all up to you. We will announce the fan favorite on Tuesday, February 18.

So once again here is the full list of our CES Cars of Fame blog posts. Have one last look and cast your vote:

Cast your vote here.

Thursday, January 30, 2014

QNX acoustics technology shortlisted for 2014 embedded AWARD

Okay, first things first. I didn't get the capitalization wrong. The name of the award really is spelled that way. I thought it odd at first, but I'm getting used to it. And besides, who am I to complain? After all, I spend a good part of my life promoting a product whose name is spelled all uppercase, and... where was I? Oh yes, the award!

Every year, the folks who organize the embedded world Exhibition & Conference hold the embedded AWARDs, which honor the most innovative software, hardware, and tools for embedded developers. And this year, the competition judges selected QNX Acoustics for Active Noise Control as a finalist in the software category.

If you aren’t familiar with our ANC solution, allow me to provide an overview — which will also help explain why the embedded AWARD judges are so impressed.

Automakers need to reduce fuel consumption. And to do that, they employ techniques such as variable engine displacement and operating the engine at lower RPM. These techniques may save gas, but they also result in "boom" noise that permeates the car's interior and can lead to driver distraction. And who needs more distraction?

QNX Acoustics for Active Noise Control can integrate seamlessly into a vehicle's infotainment system.

To reduce this noise, automakers use ANC, which plays “anti-noise” (sound equal in amplitude but opposite in phase to the offending engine tones) over the car's speakers. The problem is, existing ANC systems require dedicated hardware, which adds design complexity, not to mention significant bill-of-materials costs. And who needs more costs?

Enter QNX Acoustics for ANC. Rather than use dedicated hardware, QNX ANC provides a software library that can run on the existing DSP or CPU of the car's head unit or audio system. This approach not only reduces hardware costs but also enables better performance, faster development, and more design flexibility. I could go on, but I will let my colleague Tina Jeffrey provide the full skinny.
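To make the “anti-noise” idea concrete, here is a toy sketch of the underlying principle. This is not the QNX algorithm — a real ANC system continuously adapts its filters against cabin-microphone feedback — but it shows why playing an inverted copy of a known tone cancels it:

```python
import numpy as np

# Toy illustration of the anti-noise principle (not the actual QNX
# algorithm, which adapts to microphone feedback in real time): a
# low-frequency engine "boom" tone is cancelled by an equal-amplitude,
# phase-inverted copy played over the car's speakers.
fs = 48000                      # sample rate in Hz
t = np.arange(fs) / fs          # one second of samples
boom = 0.8 * np.sin(2 * np.pi * 60 * t)   # 60 Hz engine-order tone

anti_noise = -boom              # opposite in phase, equal in amplitude
residual = boom + anti_noise    # what the occupants would hear

print(np.max(np.abs(boom)))       # peak of the offending tone, ~0.8
print(np.max(np.abs(residual)))   # 0.0 -- the tone is cancelled
```

In practice the cancellation is never this perfect, because the engine tone must be estimated from RPM and microphone signals rather than known exactly — which is why the quality of the adaptive algorithm matters so much.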

Did I mention? This wouldn’t be the first time QNX Software Systems has been tapped for an embedded AWARD. The company has won twice so far, in 2004 and 2006, for innovations in multi-core and power-management technology. It was also a finalist in 2010, for its persistent publish/subscribe messaging. Here's to making it a hat trick.

Friday, January 10, 2014

QNX at CES: The media’s take

No, CES isn’t over yet. But the technology concept cars showcased in the QNX booth have already stoked the interest of journalists attending the event — and even of some not attending the event. So here, in no particular order, are examples of what they're saying.

Oh, and I’ve added a couple of stories that aren’t strictly CES-related, but appeared this week. They were too relevant to pass up.

That's it for now. I aim to post more stories and videos early next week. Stay tuned.

Thursday, January 9, 2014

In good company: QNX partner solutions at 2014 CES

Guest post by Peter McCarthy of the QNX global partnerships team

Peter McCarthy
If anyone thinks that creating an infotainment system is easy, they obviously haven’t thought about it hard enough. It is, in fact, a massive undertaking that requires seamless integration of navigation engines, voice technologies, app environments, HMI tools, Internet music services, smartphone connectivity, automotive-hardened processors — the list goes on.

No single company could possibly offer all of these technologies. And even if it could, it still wouldn’t address the needs of automakers and Tier 1 suppliers, who need the power of choice. Any company building an infotainment system needs the flexibility to combine Navigation Engine A with Processor B and Bluetooth Solution C.

Enabling customers to enjoy such choice without worrying about integration issues is something that QNX works very hard at. For evidence, look no further than our latest technology concept car, a modified Mercedes-Benz CLA45 AMG, which debuted this week at our CES booth. The car integrates an array of partner tech, including:

Meanwhile, the head unit in our reference vehicle, also featured in the QNX booth, integrates several partner apps and holds the distinction of being the world’s first in-vehicle implementation of Qualcomm’s Snapdragon Automotive Solutions. And if that’s not enough, our booth contains demos of a navigation engine from Aisin AW and a digital instrument cluster built with HMI tools from HI Corporation.

Mind you, the action isn’t restricted to the QNX booth. Several partners have also gotten into the act and are demonstrating QNX-based systems in their CES booths and meeting rooms. For instance:

  • Elektrobit — Demonstrating a new concept electric vehicle that sports an instrument cluster and infotainment system based on the QNX Neutrino Realtime Operating System.
     
  • Freescale — Demonstrating the QNX CAR Platform for Infotainment on its i.MX 6 Applications Processors for Automotive.
     
  • Gracenote — Demonstrating how its technology can personalize the in-vehicle music experience, using a system based on the QNX Neutrino OS.
     
  • NVIDIA — Demonstrating Audi's newest infotainment system featuring the NVIDIA Tegra processor and the QNX Neutrino OS.
     
  • Qualcomm — Demonstrating the QNX CAR Platform on Snapdragon Automotive Solutions.
     
  • Red Bend Software — Demonstrating virtualization technology that runs the QNX CAR Platform and a digital instrument cluster on dual displays driven by a single processor.
     
  • Texas Instruments — Demonstrating the QNX CAR Platform running on its latest Jacinto processors.

For the full skinny on QNX partner technology at CES, I invite you to check out our press release, along with joint announcements that we have issued with Aisin AW, HERE, HI Corporation, and Qualcomm.



About Peter
When he isn't talking on oversized mobile phones, Peter McCarthy serves as director of global partnerships at QNX Software Systems, where he is responsible for establishing and fostering partnerships with technology and services companies in all of the company's target industries.

Wednesday, January 8, 2014

"I want one"

Yesterday, I bemoaned that words and pictures could never capture the unique experience of being in one of the new QNX technology concept cars. But you know what? Video comes a little bit closer.

Of course, it can't capture everything. For instance, it can't reproduce the richness and clarity of the cars' full-band and wideband phone calls, or the sheer auditory relief offered by QNX active noise control software. But it can capture the reaction of someone experiencing these technologies.

For instance, in this video, I love watching how Adam from CrackBerry.com reacts to our latest innovations in automotive acoustics. Especially the part where he says "I want one."

Did I mention? The clip also contains footage of our infotainment and digital cluster systems in action. Check it out:



A big thanks to Adam and the CrackBerry team for visiting us at CES.

QNX at CES: a key fob on steroids

Have you ever wished that your key fob could do more than lock and unlock doors, and chirp your horn? If so, you’ll be interested in some great tech that QNX Software Systems has developed in partnership with DotLinker and is demonstrating this week at CES.

To show what this technology can do, we’ve created a custom “key fob” app that connects to our Mercedes-Benz CLA45 AMG technology concept car. The app is written in HTML5, our cross-platform language of choice, so it will run on any smartphone. Here is the app’s main menu:



Remote repair, over the air
A really cool feature of this technology is that the connection to your car is hosted on a cloud service, thanks to our DotLinker integration. This approach could allow multiple devices owned by you, your spouse, or your kids to access the vehicle's state simultaneously. It could also allow your dealership to access the vehicle’s state online — with your permission, of course — without you having to bring the car into the shop. The dealer tech could simply pull up a management console on an iPad to see what’s wrong, order the parts you need, and book a single, quick fix-it appointment:



If the problem can be fixed by software, that same technician could make changes over the air. It might be as simple as setting a Bluetooth pairing option that you can’t find (aka remote device management), or downloading new software to the car (aka firmware over-the-air updates):



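Before moving on, here is a rough sketch of the cloud-mediated model described above: the vehicle publishes its state to a cloud service, and any authorized client — the owner's phone, a spouse's tablet, or a dealer console — reads that state without a direct connection to the car. All class and field names here are invented for illustration; they are not the DotLinker API:

```python
# Hypothetical sketch of a cloud-hosted vehicle-state service.
# Names are invented for illustration only; the actual QNX/DotLinker
# integration is not public API shown here.
class CloudService:
    def __init__(self):
        self._state = {}          # vehicle state, keyed by VIN

    def publish(self, vin, state):
        """Called by the vehicle to push its latest state."""
        self._state[vin] = dict(state)

    def get_state(self, vin, client):
        """Called by any authorized device to read the state."""
        return self._state.get(vin, {})

cloud = CloudService()
cloud.publish("WDD1173", {"tire_pressure_low": True, "fuel_pct": 62})

# The owner's phone and the dealer console see the same state at once,
# without either one talking to the car directly.
print(cloud.get_state("WDD1173", client="owner-phone"))
print(cloud.get_state("WDD1173", client="dealer-console"))
```

The design point is that the cloud, not the vehicle, serves reads: the car can sleep or drive out of coverage while every client still sees its last-known state.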
Dodging the vortex
How about a remote start from anywhere? The Buick Enclave with OnStar shows just how nice this can be when it’s bitterly cold outside and you’re beyond traditional key-fob distance. This feature should come in handy this week with the dreaded “Polar Vortex”! Also, you never know when a coworker might need to borrow the ice scraper from your trunk — you can stay nice and warm inside while he or she gets it. And when hot summer days return (and they will, eventually), this same remote access could let you open your car’s sunroof or windows.

The key fob app supports remote start; remote open/close of doors, windows, roof, and trunk; and, for good measure, remote control of turn signals:




Where did I park that thing?
“Hey kids, meet me back at the car!” Finding your vehicle’s location is a modern necessity, especially when the parking lot is bigger than the state of Rhode Island. Okay, the Las Vegas Convention Center isn’t that big, but it sure feels that way by the end of the show:



Backseat DJ
Finally, what about controlling the car’s media player from the phone? Let your kids DJ the car’s playlist from the back seat from their tablets or smartphones to keep the trip to Grandma’s entertaining. Just remember you gave them that power when they dish up their favorite screamo band, “A Scar for the Wicked”.



What if?
Now imagine... what if your next car came with a key-fob app? What features would you hope to see? And what do you think would be the killer key-fob feature of all time? Over-the-air updates? Remote location tracking? Or something completely different?


Tuesday, January 7, 2014

The QNX sound machine at CES

If you’ve ever had the pleasure of attending the Consumer Electronics Show, you’ll know that it’s a crowded place full of lights and noise. In the automotive North Hall, much of the cacophony comes from the legions of car customizers blasting bass from sedan-sized speakers. This year, QNX has brought a new kind of technology concept car to CES, based on a Kia Soul, that offers some subtler forms of sound artistry. (Sorry, hamster fans—I don’t think we’ll have your favorite mascot in the QNX booth.)


A sound ride: the new QNX technology concept car for acoustics

Let’s start with noise. Everyone likes a booming radio, sometimes. But if that’s the only tool you have to drown out engine noise, you’ll go deaf. That’s where Active Noise Control (ANC) comes in. Think of ANC as a more sophisticated version of noise-cancelling headphones that you don’t need to wear. Not only does ANC help keep the car’s cabin quiet, but the QNX solution is software-based and doesn’t require a dedicated hardware module, saving the OEM and the consumer money.

The best part about ANC is that it helps cars become more fuel-efficient. Huh? To keep car interiors quiet, automakers add baffling in the doors and under the floor to help mute engine noise. Dragging around that extra weight costs fuel. So removing the ballast (I mean baffles) lets automakers build more fuel-efficient cars. And with ANC, which helps eliminate the extra noise caused by this approach, everyone wins.

Beyond wideband
Next up: a new level of call quality. If you’ve had the pleasure of conversing between two newer smartphones (BlackBerry Z10 or Z30, iPhone 5, Nokia Lumia 520, Samsung Galaxy S4, ...) you may have noticed that the call sounded better than what you’re used to. That’s because many newer phones support something called wideband audio (or HD Voice), which transmits more audible frequencies to make the call sound clearer. That’s good, but QNX wants to show what’s possible beyond wideband. So in the QNX technology concept car for acoustics, we’re demoing a new audio feature called full-band stereo calling, which is like having phone calls with CD-quality audio. A full-band call has over six times the transmitted frequency range of a standard call, and more than double that of wideband. And as the name suggests, full-band stereo provides two independent channels, adding depth and a sense of presence; the resulting call quality simply has to be experienced.
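The "over six times" and "more than double" figures can be sanity-checked with a bit of arithmetic, assuming typical codec passbands (these band limits are my assumption, not from the post): narrowband telephony carries roughly 300–3400 Hz, wideband (as in AMR-WB) roughly 50–7000 Hz, and full-band audio roughly 20–20,000 Hz.

```python
# Back-of-the-envelope check of the bandwidth claims, assuming typical
# codec passbands: narrowband 300-3400 Hz, wideband 50-7000 Hz (AMR-WB),
# full-band 20-20000 Hz. These limits are assumptions for illustration.
bands = {
    "narrowband": (300, 3400),
    "wideband": (50, 7000),
    "full-band": (20, 20000),
}
width = {name: hi - lo for name, (lo, hi) in bands.items()}

print(width["full-band"] / width["narrowband"])  # ~6.4x a standard call
print(width["full-band"] / width["wideband"])    # ~2.9x a wideband call
```

Under those assumptions the numbers line up: a full-band call carries over six times the frequency range of a narrowband call and nearly three times that of wideband.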

Sound like a V8, sip like a Volt
Lastly — we get to pump up the volume! The technology concept car for acoustics also sports engine sound enhancement (ESE), which plays synthesized engine sounds over speakers inside the car. With ESE, the engine sounds a little more throaty. It may not be obvious, but this is also a fuel-saving technology! As carmakers look for creative ways to turn gasoline slurpers into sippers, they’re implementing technologies that dynamically modify engine cylinder firing. Those changes can sometimes make a perfectly powerful engine sound anemic, which negatively impacts customer first impressions. Most people want a car that sounds and performs like it has a huge V8 even if they expect it to sip gas like a Chevy Volt. Both ANC and ESE can help customers get over this performance anxiety. ESE also lets drivers get in tune with their engine, making it easier to shift by ear.
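For the curious, here is a hypothetical sketch of how synthesized engine sound can track the engine. It is not QNX's implementation; it just illustrates the standard relationship that, for a four-stroke engine, the dominant "firing order" equals half the cylinder count, so the firing frequency follows RPM:

```python
import numpy as np

# Hypothetical sketch of engine sound enhancement (not the QNX
# implementation): synthesize a stack of harmonics locked to the
# engine's firing frequency. For a four-stroke engine, each cylinder
# fires once per two revolutions, so the dominant order is cylinders / 2.
def ese_tone(rpm, cylinders=8, orders=(1, 2, 3), fs=48000, dur=0.1):
    base = rpm / 60.0 * (cylinders / 2)       # firing frequency in Hz
    t = np.arange(int(fs * dur)) / fs
    # Stack a few harmonics, each quieter than the last, for "throat"
    tone = sum((0.5 ** k) * np.sin(2 * np.pi * base * k * t)
               for k in orders)
    return base, tone

base, tone = ese_tone(3000)   # a V8 at 3000 RPM
print(base)                   # 200.0 Hz firing frequency
```

A production system would shape these harmonics against the cabin's acoustics and blend them with the real engine note, but the core trick is the same: the synthesized sound rises and falls with RPM, which is also what makes shifting by ear possible.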

If you’re up for a little fun, you can also use ESE to make your car sound like something completely different. We’re playing the ESE audio outside the car as well as inside it. The Kia is using QNX ESE audio to masquerade as another car. Tweet us at @QNX_Auto if you can guess what it is!


The wraps are off! First look at the new QNX technology concept car

A quick tour of one of the vehicles that QNX is unveiling at 2014 CES

You know what? Writing this post isn’t easy. All I’ve got are words and pictures, and neither could ever do justice to the user experience offered by the new QNX technology concept car. They cannot, for example, recreate the rich, luminous sound of the car’s full-band and wideband hands-free calls. Nor can they evoke how the car blends speech recognition with a touch interface and physical controls to make navigation, Internet radio, and other applications wonderfully easy to use.

But on second thought, words and pictures aren’t that bad. Especially when the car — and the in-dash systems that the QNX concept team created for it — are so downright gorgeous. So what are we sitting around for? Time for a tour!

Actually... hold that thought. I just want to mention that, if you visit our Flickr page, you can find full-resolution versions of most of the images I've posted here. Because why settle for low res? Okay, back to the tour.

The car
I've got two things to say here. First, the car is based on a Mercedes-Benz CLA45 AMG. If you guessed the model correctly based on the teaser images we published on the QNX website, I bow in homage to your eagle eye. Second, while we snapped this photo in the QNX garage, don’t think for a minute that the garage is ever this neat and tidy. On any given day, it’s chock full of drill presses, tool boxes, work tables, embedded boards, and QNX engineers joyously modding the world’s coolest cars — exactly the kind of place you expect it to be. And want it to be! But to humor the photographer, we (temporarily) made this corner clutter-free. We're nice that way.



The dash
Let's get behind the wheel, where you can see the car's custom-built digital instrument cluster and infotainment system. The bold design, the clean layout, the super-easy-to-access controls — they all add up to systems you want to interact with. Just as important, the look-and-feel of the instrument cluster and infotainment system are totally different from the corresponding systems in our previous concept car — an excellent illustration of how the QNX platform can help customers create their own branded experiences.



The multi-talented cluster
Time to zoom in on the digital instrument cluster, which helps simplify driving tasks and minimize distraction with an impressive array of features. Turn-by-turn directions pulled from the navigation system? Check. Video feed from front and rear-view cameras? Check. Notifications of incoming phone calls? Check. Alerts of incoming text messages, which you can listen to at the touch of a steering-wheel button? Check.



The Android app support
Automakers want to tap into the talents of the mobile app community, and the QNX CAR Platform for Infotainment helps them do just that, with built-in support for Android, OpenGL ES, and HTML5. In the concept car, for example, you'll find an Android Jelly Bean version of iHeartRadio, Clear Channel’s digital radio service, running in a secure application container. The QNX CAR Platform takes this same sandboxed approach to running HTML5 apps — perfect for protecting both the HMI and the overall system from unpredictable web content:



Helping you get there in more ways than one
We designed the QNX CAR Platform to give automotive developers the greatest possible choice and flexibility. And that’s exactly what you see when it comes to navigation. For instance, the car supports navigation from Elektrobit:



and from HERE:



and from Kotei Informatics:



If that’s not enough, a demo system in the QNX booth at CES also demonstrates a navigation system from Aisin AW — more on that in an upcoming post.

Pardon me while I barge in
As I alluded to earlier, what you can't see in the new concept car is just as important as what you can see. For instance, if you look at this image, you'll see the infotainment system's media player. But what you can't see is new acoustics technology from QNX that lets you "barge in" and issue voice commands even when a song is playing. How cool is that?



When you find yourself in times of trouble...
... don't let it be, but rather, check and see. And to do that, you can use the infotainment system's virtual mechanic, which keeps tabs on your car's health, including fluid levels, brake wear, and, in this case, low tire pressure:



The cloud connection
Hold on, what's this? It looks like a smartphone app with an interface similar to that of the virtual mechanic, above. In fact, it's a lot more than that, and it touches on some cool (and very new) technology that can help cars become fully managed citizens of the cloud. More on that in an upcoming post.



That's it for now. For more details on what QNX is showcasing this week at CES, check out the press releases posted on the QNX website. And stay tuned to this channel for further updates from 2014 CES — including a profile of our brand-new QNX technology concept car for acoustics.