Krispy Kremes, marketing and the ethics of autonomous cars


In one possible future, your autonomous car takes a different route than usual, steering you right past your favorite fast food joint (say, a Krispy Kreme), because in an increasingly interconnected world, the car can determine that a delicious hot glazed donut is exactly what you want.

How would your car know? Patrick Lin asks in The Atlantic. The on-board computer stores your driving habits and knows that you are a compulsive snacker. You liked Krispy Kreme on social media and visited the company's website, and since your car connects to your home computer, it knows that, too.

A drowsiness detection system, something already available today, recognizes when you need that extra cup of coffee, so a driverless car could reroute you to your favorite pick-me-up place: Krispy Kreme.

That future may be convenient or disturbing, depending on how you see it. Either way, it looked a lot closer after January's Consumer Electronics Show (CES) in Las Vegas, where automakers offered deals on online services and applications for the web-enabled cars of the not-so-distant future.

Of course, wherever there are free or cheap online services, online advertising is never far behind.

Lin suggests that online advertising could be anything from a banner ad scrolling across the windshield to the car automatically steering to a location. Make no mistake, advertising is coming to autonomous automobiles — and it is coming soon.

Google’s recent acquisition of Nest — manufacturer of “smart” thermostats and appliances — is seen as the first step toward the “Internet of Things.” If the web jumps from a personal computer to home appliances, why shouldn’t it also be in your car?

The idea brings up a crucial point: should a driverless car be able to take people where they should not go? With mounds of collected data (GPS coordinates, wireless information and the like), what are the ethics of taking a driver somewhere he or she should not be: to an address that would violate a restraining order, to a bar if the passenger is an alcoholic, or to a fast food restaurant if the passenger has Type 2 diabetes?

And the questions are not just legal ones, either.

Who is responsible if a car drives someone where he or she is forbidden to go, such as a registered sex offender who is not allowed within 2,000 feet of a school? Is it the fault of the driver or of the autonomous car?

Humans are just that: human. They make mistakes and have lapses in judgment. Autonomous cars might handle the road better than the people inside in many cases, especially when passengers give the car orders that are inherently risky.

Refusing a dangerous order from a human is not a defect; it's a feature.

This makes programming autonomous cars an extremely serious decision for manufacturers, Lin writes. Done well, the result can be a reliable and secure product for millions of people to use.

That is why the focus should be on ethics, not just on law, and it is exactly what Lin and his colleagues are doing at the Center for Automotive Research at Stanford (CARS): steering the future of transportation down a responsible path.

Phil Ammann is a St. Petersburg-based journalist and blogger. With more than three decades of writing, editing and management experience, Phil has produced material for both print and online outlets, in addition to founding HRNewsDaily.com. His range includes news, local government and culture reviews for Patch.com, technical articles and profiles for BetterRVing Magazine, and advice columns for a metaphysical website, among others. Phil has served as a contributor and production manager for SaintPetersBlog since 2013. He lives in St. Pete with his wife, visual artist Margaret Juul, and can be reached at phil@floridapolitics.com and on Twitter @PhilAmmann.