LONDON -- Britain's goal to be a leader in adopting self-driving cars could backfire unless automakers and government regulators spell out the current limitations of the technology, insurance companies warn.
Insurers are key players in the shift to automated driving, with some investing in a technology they believe will slash accidents and deaths and save them billions in payouts.
But they are worried drivers might equate today's lower levels of automation with fully self-driving vehicles, potentially causing more accidents in the short term and permanently damaging public confidence in the technology.
"What you describe things as is incredibly important, so people don't use them inappropriately," said David Williams, managing director of underwriting at AXA insurance. "I genuinely believe the world will be a safer place with autonomous vehicles and I really don't want that derailed."
In what would be a world first, Britain is considering regulating the use of Automated Lane Keeping Systems (ALKS) on its roads, possibly even on motorways at speeds of up to 70 mph (113 kph). It is also deciding whether to describe them to the general public as "automated" systems.
It is that one word - automated - that has stirred controversy and put the country at the center of a global debate about self-driving terminology at a sensitive moment in its evolution.
The technology is evolving rapidly and there is no consensus on how to deploy it or what to call some features. Regulations in the Americas, Europe and Asia lag far behind technical developments, and questions over accident liability remain unresolved.
ALKS use sensors and software to keep cars within a lane, accelerating and braking without driver input. They are classed as "Level 3" on the auto industry's automation scale, which runs from Level 0 (no automation) to fully autonomous Level 5 - meaning they can operate without supervision under specific conditions, but require the driver to take back control when prompted.
However, some experts say ALKS should be called "assisted-driving technology" to avoid potentially misleading consumers into believing they can let their attention wander at the wheel.
The danger of drivers misunderstanding the limits of the technology has already become an issue in the U.S., where regulators have been looking into about 20 crashes involving Tesla driver-assistance features, such as its "Autopilot" system - a "Level 2" technology that requires the driver's constant attention.
Britain's Thatcham Research said it had tested cars with the technologies underpinning ALKS and found they could not swerve out of lane to avoid obstacles, see pedestrians emerging from cars at the roadside, or read road signs. The car can alert the driver to resume control, but with a potentially fatal lag at high speeds.
"If this technology was really automated and could do what you or I could do, insurers would welcome it," said Matthew Avery, Thatcham's research director.
"But this will lead to confusion, it's going to lead to unnecessary crashes, and potentially injuries or fatalities" if ALKS are not marketed accurately, he added.
Britain's transport ministry said its primary concern was public safety and that it had not yet decided whether to permit the use of ALKS at high speeds, or whether to call the technology "automated." Its decisions are expected later this year.