As companies like Google Inc. and General Motors Corp. develop self-driving robotic cars, California has proposed rules to govern autonomous vehicles on public roads. Citing studies showing an accident rate among driverless cars roughly double that of human-operated vehicles, the state has drafted rules requiring that a human occupant be present in a robot-operated car and ready to take over operation at all times.
A Google representative who commented on the draft expressed disappointment with the proposed regulations; the company has already developed a prototype self-driving car with no gas pedal or steering wheel.
Some accidents may arise from driverless vehicles' programmed inability to violate traffic laws. Low-speed rear-end collisions account for the bulk of driverless-car accidents, and researchers attribute most of these crashes to aggressive or inattentive human drivers who are unprepared for an encounter with robot-operated cars that always act with extreme caution and never break a road rule. Computer programmers at Carnegie Mellon University and Google continue to debate possible solutions, which could include teaching driverless cars to decide when breaking a rule is the safer choice. For example, briefly exceeding the speed limit might be necessary to merge into freeway traffic or change lanes.
Navigating traffic has proven challenging for computer-programmed machines and humans alike. Sometimes the outright negligence of human drivers causes car crashes, and a person injured in such an accident might wish to speak with an attorney about filing a personal injury lawsuit. A lawyer could interpret police reports and work with an outside investigator to assemble evidence of negligence, then negotiate with an insurance company or an individual in an effort to secure a settlement for the victim.