IEEE IV 2011, June 5, 2011, Baden-Baden
How can new sensor technologies impact next generation safety systems?

Perception Horizon Approach to Accident Avoidance by Active Intervention

Presenter: S. Durekovic, Principal Engineer, NAVTEQ



2

Project overview: History

• October 2006: First ideas for follow-up activities after PReVENT
• December 2006: Workshop in Rüsselsheim; working name for a new project: AVOID
• January 2007: Aachen – Definition of project structure
• March 2007: Brussels (AVOID – interactIVe) – Finalisation of project framework
• May 2007: Submission
• June 2007: Hearing

New start:
• 29 May 2008: EUCAR – Decision on project resubmission
• Submission: 1 April 2009
• Hearing: 9 June 2009
• October 2009: Grant Agreement

3

Project overview: Facts

• Budget: EUR 30 million
• Co-funding by the European Commission: EUR 17 million
• Duration: 42 months (January 2010 – June 2013)
• Coordinator: Aria Etemad, Ford Research and Advanced Engineering Europe
• 10 countries: Czech Republic, Finland, France, Germany, Greece, Italy, Spain, Sweden, The Netherlands, UK

4

Consortium

• OEMs

• Suppliers

• Research

• SMEs

5

Mission: Research concept

• SECONDS – Continuous driver support
• INCA – Collision avoidance
• EMIC – Collision mitigation

6

Objectives

• Create an innovative model and platform for enhancing the perception of the driving situation
• Extend the range of possible scenarios and the usability of ADAS by multiple integrated functions and active interventions
• Improve decision strategies for active safety and driver-vehicle interaction
• Develop solutions for collision mitigation that can improve market uptake within lower-class vehicle segments
• Further encourage the application of standard methodologies for the evaluation of ADAS

7

Project Structure

8

Interactions

9

System Blocks

10

Demonstrators

4 Simulators

11

Demonstrators

12

Timeline

13

14

SP3: IWI strategies (Information/Warning/Intervention)

• Analysis of user needs, expectations, and behaviour
• Definition of detailed use cases based on target scenarios from the application-oriented sub-projects
• Definition of requirements for IWI strategies based on use cases
• Iterative design, prototyping, and user testing of IWI strategies based on the initial requirements

15

IWI strategies – theatre technique

16

17

SP4: SECONDS – Integration of Driver Assistance functions

The Driver Assistance system is conceived as a co-driver that gives active advice when the driver is not following a safe manoeuvre.

18

SECONDS

• Conceive, develop, and test continuous support functions
• Functions: continuous support (integrated longitudinal and lateral support), curve speed control, enhanced dynamic pass predictor, safe cruise
• Innovative concept for splitting the driving task (amount of control) between driver and vehicle
• Build-up of four demonstrator vehicles (passenger cars)
• Implementation of the perception platform
• IWI strategies taking haptic devices into account

19

SECONDS Target Scenarios

• Collision due to drifting
• Lane change collision
• Rear-end collision
• Accidents in corners
• Overtaking situations
• Excessive speed
• Crossing accident
• Collision with pedestrian
• Collision with animal

20

SECONDS – splitting driving task between driver and vehicle

22

SECONDS: Use of the Perception Horizon for planning

Use of an environment description including the road, obstacles, and surroundings.

23

SECONDS: Integration of longitudinal and lateral support

Planning and evaluation of alternative manoeuvres that simultaneously address both lateral and longitudinal control tasks.
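As an illustration of how such combined planning might weigh candidate manoeuvres, here is a minimal Python sketch: it enumerates a few (acceleration, lateral shift) pairs, rejects those predicted to close the gap to a lead obstacle over a short horizon, and picks the cheapest remaining one. The candidate set, cost weights, and horizon are hypothetical; the actual SECONDS planner is not described at this level in the presentation.

```python
# Illustrative sketch of integrated longitudinal/lateral manoeuvre planning.
# All names, weights, and candidates are hypothetical, not SECONDS internals.
from itertools import product

def plan_manoeuvre(ego_speed, obstacle_gap, lane_free):
    """Pick the cheapest (accel, lateral_shift) pair that stays collision-free.

    ego_speed    : ego vehicle speed in m/s
    obstacle_gap : longitudinal gap to the lead obstacle in m
    lane_free    : True if the adjacent lane is free for a lateral shift

    Assumes a static obstacle and that ego speed stays positive over the horizon.
    """
    horizon = 2.0  # planning horizon in seconds (assumed)
    candidates = []
    for accel, shift in product([-4.0, -2.0, 0.0], [0.0, 3.5]):
        if shift != 0.0 and not lane_free:
            continue  # a lateral move is only allowed into a free lane
        # Distance travelled towards the obstacle over the horizon.
        travel = ego_speed * horizon + 0.5 * accel * horizon ** 2
        if shift == 0.0 and travel >= obstacle_gap:
            continue  # staying in lane would close the gap: reject
        # Cost: prefer mild braking and staying in lane (weights assumed).
        cost = abs(accel) + 2.0 * abs(shift) / 3.5
        candidates.append((cost, accel, shift))
    if not candidates:
        return None  # no feasible manoeuvre within this simple candidate set
    _, accel, shift = min(candidates)
    return accel, shift
```

With a large gap the cheapest plan is to do nothing; as the gap shrinks, the planner trades lane position against braking effort, which is the essence of addressing both control tasks at once.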

24

25

SP5: INCA

• Development of integrated collision avoidance and vehicle path control for passenger cars and commercial vehicles
• Combination of lateral and longitudinal active interventions by autonomous braking and steering
• The "vehicle path control" module dynamically evaluates a collision-free trajectory in rapidly changing driving scenarios
• INCA addresses rear-end, lateral, and head-on collision situations
• Three demonstrator vehicles (two passenger cars and one heavy truck) equipped and tested; two of them shared with SECONDS

26

The INCA use cases are based on the following target scenarios:

• Head-on collisions: the most frequent group. Addressing head-on collisions imposes tough requirements on all parts of the system.
• Run-off-the-road accidents: the second largest group. Today's technology (lane departure warning) is already effective, but there is potential for improvement and for reducing nuisance alarms by introducing road edge detection.
• Rear-end collisions: the third largest group. State-of-the-art systems provide collision mitigation, and some systems provide full collision avoidance at low speeds. The INCA active steering approach can extend the range of situations in which the system offers full avoidance.
• Lane change accidents: a fairly small group in the accident statistics. However, due to the large blind spots during a truck lane change, truck drivers are aware of the hazard, and an assistance system is expected to be helpful.
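The claim that active steering can extend full avoidance beyond what braking offers follows from textbook point-mass kinematics: above a certain speed, swerving aside needs less distance than stopping. The sketch below uses assumed acceleration limits and a constant-acceleration model; it is a generic illustration, not INCA's decision logic.

```python
# Braking vs steering as the last-moment avoidance action.
# Deceleration and lateral-acceleration limits are assumed, illustrative values.
import math

MU_BRAKE = 8.0   # available longitudinal deceleration, m/s^2 (assumed)
A_LAT = 5.0      # usable lateral acceleration, m/s^2 (assumed)

def braking_distance(v):
    """Distance to stop from speed v under constant deceleration."""
    return v ** 2 / (2.0 * MU_BRAKE)

def steering_distance(v, lateral_offset):
    """Distance covered while moving aside by lateral_offset metres.

    Time to achieve the offset under constant lateral acceleration:
    y = 0.5 * a_lat * t^2  =>  t = sqrt(2 * y / a_lat).
    """
    t = math.sqrt(2.0 * lateral_offset / A_LAT)
    return v * t

def last_point_action(v, lateral_offset=2.0):
    """Return which intervention still avoids the obstacle closer to it."""
    if braking_distance(v) <= steering_distance(v, lateral_offset):
        return "brake"
    return "steer"
```

Braking distance grows quadratically with speed while steering distance grows only linearly, so at highway speeds steering remains feasible where braking alone can no longer prevent the collision.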

27

INCA Challenges

• Today, braking is used against "longitudinal" threats. Coping with avoidance manoeuvres by steering adds a number of challenges, for example:
• How to decide on and describe the best path to follow in order to avoid a lateral threat, given boundary conditions imposed by the vehicle itself and by the environment, such as road curvature and width?
• How to judge the risk of collision with oncoming objects (or objects from behind), given 1) sensor limitations (accuracy, delays) and 2) the need for a long enough prediction horizon to avoid unrealistic steering torque demands?
• Calculating the risk of passing the road edge is new and has not been done before; the sensing part is considerably more difficult, especially compared to lane tracking.
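The oncoming-traffic question above can be made concrete with a toy time-to-collision check: inflate the measured gap by a worst-case range error (sensor limitation) and require the remaining TTC to cover an assumed prediction horizon before committing to a steer. All figures are illustrative, not INCA's calibrated values.

```python
# Toy worst-case TTC gate for an evasive steer against oncoming traffic.
# range_error and horizon are assumed illustrative numbers.

def time_to_collision(gap, ego_speed, oncoming_speed):
    """TTC against an oncoming object (closing speed is the sum of speeds)."""
    closing = ego_speed + oncoming_speed
    if closing <= 0.0:
        return float("inf")
    return gap / closing

def steer_decision(gap, ego_speed, oncoming_speed,
                   range_error=2.0, horizon=3.0):
    """Allow an evasive steer only if the worst-case TTC fits the horizon.

    range_error : assumed worst-case range measurement error, m
    horizon     : prediction time in s needed for a feasible steering
                  torque profile (assumed)
    """
    worst_gap = max(gap - range_error, 0.0)
    return time_to_collision(worst_gap, ego_speed, oncoming_speed) >= horizon
```

The gate captures both challenges at once: noisier sensors shrink the usable gap, and a longer required horizon raises the TTC bar, so borderline situations correctly fall back to braking-only behaviour.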

28

INCA – accident avoidance manoeuvre

29

30

SP6: EMIC

• Development of cost-efficient collision mitigation systems
• Emergency braking and steering based on frontal surround perception
• Low-cost system architectures or low-cost additions to existing ADAS: specific attention paid to cost-effective hardware and software components
• Build-up of two demonstrator vehicles (emergency steering assistant, and autonomous emergency braking and/or steering system)

31

SP6: EMIC – architecture

[Architecture diagram: Sensor1/Sensor2/Sensor3 feed a Perception block (SP2); Perception and a Driver model feed Situation assessment & action planning (SP6), which drives HMI, Steering, and Braking in the Demonstrator; Target Scenarios and Requirements shape the design, and Evaluation (SP7) closes the loop.]

32

EMIC: Target Scenarios

33

EMIC Challenges & Solutions #1

• Collision mitigation covering a wider range of scenarios
Challenge: crossing traffic
Solution/Approach: detection of the wheels with a monocular camera
• data from the medium-FOV system (monocular stream)
• data from the wide-FOV camera

EMIC Challenges & Solutions #2

•Increase efficiency through early triggering of the function

Challenge: achieve early triggering

Solution/Approach: implementation of a driver model

Driver model inputs and outputs:
• Environment: road condition, weather
• Traffic: traffic flow, construction site
• Driver: driver type, driving manoeuvre, driver capability, risk behaviour, comfort level, drowsiness, distraction
• Outputs: manoeuvre prediction, deviation from normal driving behaviour
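One simple way to operationalize "deviation from normal driving behaviour" is to keep a running baseline of a driver signal and flag statistical outliers against it. The sketch below does this for a steering-rate signal with an assumed z-score threshold; it is an illustration of the idea, not the EMIC driver model.

```python
# Minimal "deviation from normal driving" detector: rolling baseline of a
# steering-rate signal, outlier flag via z-score. Window size and threshold
# are assumed values for illustration, not EMIC parameters.
from collections import deque
from statistics import mean, pstdev

class DriverBaseline:
    def __init__(self, window=50, threshold=3.0):
        self.samples = deque(maxlen=window)
        self.threshold = threshold

    def update(self, steering_rate):
        """Feed one steering-rate sample; return True if it deviates."""
        if len(self.samples) >= 10:
            mu = mean(self.samples)
            sigma = pstdev(self.samples) or 1e-6  # avoid division by zero
            deviant = abs(steering_rate - mu) / sigma > self.threshold
        else:
            deviant = False  # not enough history for a baseline yet
        self.samples.append(steering_rate)
        return deviant
```

A deviation flag raised by such a model can advance the trigger point of the mitigation function before a geometric collision criterion alone would fire.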

35

36

SP7: Evaluation and legal aspects

• Definition of a test and evaluation framework for the assessment of each application with respect to human factors and technical performance
• Development of test scenarios, procedures, and evaluation methods
• Provision of tools for evaluation, such as equipment, test catalogues, procedures, questionnaires, or software, and support for testing
• Definition of test and evaluation criteria
• Analysis of legal aspects for broad exploitation of the applications

37

38

SP2 Perception: Predecessors

• Data fusion plays a central role in current and future ITS
• Stand-alone sensors are not sufficient (physical limitations)
• Fusion of information from heterogeneous sources provides a holistic environment perception:
  – Perception sensors: radars, cameras, laser scanners, etc.
  – Digital maps
  – Wireless communication (V2X)
• Fusion has evolved through European projects: PReVENT – ProFusion2, SAFESPOT, HAVEit, interactIVe
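Why fusing heterogeneous sources beats any stand-alone sensor can be shown with the standard inverse-variance weighting of two independent measurements of the same quantity (say, a radar and a camera range estimate): the fused variance is always below either input's. A minimal sketch with illustrative noise values:

```python
# Standard inverse-variance fusion of two independent measurements.
# Variances below are illustrative, not tied to any particular sensor.

def fuse_range(z1, var1, z2, var2):
    """Optimally combine two noisy measurements of the same range.

    Returns (fused_estimate, fused_variance); the fused variance is
    always smaller than either input variance.
    """
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    est = (w1 * z1 + w2 * z2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return est, var
```

For example, combining a 10 m reading (variance 0.25) with a 14 m reading (variance 1.0) yields an estimate pulled towards the more trusted sensor, with a variance lower than the better sensor alone, which is the physical-limitation argument in one line of algebra.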

39

SP2 Perception

• Specifications for sensor interfaces and fusion modules
• Multi-sensor approaches and sensor data fusion
• Common perception framework for multiple safety applications
• Unified output interface from the perception layer to the application layer
• Integration of different information sources, such as sensors, digital maps, and communications
• Innovative model and platform for enhancement of the perception of traffic situations
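To make the "unified output interface" idea concrete, here is a hypothetical sketch of what a per-cycle perception record handed to the application layer could look like. The field names and the lane-assignment convention are assumptions for illustration; the actual interactIVe interface is not specified in this presentation.

```python
# Hypothetical unified perception-layer output record.
# Field names and conventions are assumed, not the interactIVe interface.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TrackedObject:
    obj_id: int
    x: float          # longitudinal position relative to ego, m
    y: float          # lateral position relative to ego, m
    speed: float      # absolute speed, m/s
    obj_class: str    # e.g. "vehicle", "pedestrian", "unknown"
    lane: int         # lane relative to the ego lane (0 = own lane)

@dataclass
class PerceptionFrame:
    timestamp: float                      # common time base for all modules
    ego_speed: float
    lane_curvature: float                 # from map/camera road fusion
    objects: List[TrackedObject] = field(default_factory=list)

    def in_path(self):
        """Objects assigned to the ego lane ahead of the vehicle, nearest first."""
        own = [o for o in self.objects if o.lane == 0 and o.x > 0.0]
        return sorted(own, key=lambda o: o.x)
```

The point of such a record is that every application (continuous support, avoidance, mitigation) consumes the same typed structure rather than raw per-sensor data, which is what makes one perception platform serve multiple functions.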

40

Perception – Largest Sub-Project

Funding: 3.8 M€
Budget: 6.6 M€
Resources: 495 person months
Duration: 42 months
Start: 1 January 2010
SP leaders: Uri Iurgel (Delphi), Angelos Amditis (ICCS)
Countries: France, Germany, Greece, Italy, Netherlands, Sweden, UK

41

Perception – Sensor Fusion Process

42

Perception Concept

43

Perception Project Structure (Kick-off meeting: January 20, 2010)

• WP21: Technical Management (DEL)
• WP22: Interactions (ICCS)
• WP23: Requirements (DAI)
• WP24: Architecture & Specifications (VTEC)
• WP25: Sensor Data Fusion Research (ICCS)
• WP26: VSP specific Development (DEL)
• WP27: Test & Evaluation (PASS)

45

Perception: Foundation for Applications

Key objectives:

• Design a common perception framework for safety applications from multiple sensor inputs
• Enhancement and research on multi-sensor approaches and sensor data fusion (including digital map – ADASIS v2 – and communication data)

46

Perception: Innovation …

Development of an innovative model and platform for enhancing the perception of the traffic situation in the vicinity of the vehicle (based on PReVENT/PF2 and SAFESPOT experience):

• General interfaces for different sensor types to minimize effort in the next levels of processing
• Integrated internal architecture to serve multiple applications
• Unified output interface from the perception layer to the application layer
• Applicable to different demos and applications with minor adaptation; closer to a plug & play approach
• Reference perception platform implementation

47

Perception: … and Challenges

• Active intervention poses "hard" real-time requirements for data processing & fusion modules
• Reference perception platform (generic I/O interfaces, integration of multiple modules, synchronization)
• Novel multi-sensor data fusion research (attention-focused approach, object classification, road edge detection)

48


49

Perception: Zooming In…

50

[Diagram: Perception Platform. Inputs: GPS, vehicle sensors/gyroscope, camera, digital map (ADASIS Horizon), lidar, ultrasonic, radar, V2X nodes, environment temperature/rain sensors. Modules: Enhanced Vehicle Positioning, Vehicle State Filter, Road Edge Detection, Lane Recognition, Road Data Fusion, EVRP-To-Road, Vehicle Trajectory Calculation, Frontal Near Range Perception, Frontal Object Perception, Side/Rear Object Perception, VRUs Detection, Moving Object Classification, Assignment of Objects to Lanes, Unavoidable Crash Recognition, Free Space Detection. Output: Perception Horizon.]

51

Perception: Outputs

The Perception Horizon collects, synchronizes, and sends out the following semantic categories of driving information [Perception Modules]:

• Road information fused from maps and camera sensor (RDF, exploiting info from the RED/LR module)
• Maps and positioning information (ADASIS Horizon)
• Enhanced positioning information (EVP module)
• Ego-vehicle information with respect to:
  – CAN data (VSF)
  – the current road scenario (EVRP-To-Road module)
  – the future trajectory (VTC module)
• Object information considering:
  – detections in the frontal or side/rear field (FOP and SROP modules, respectively)
  – assignment of objects to lanes, given the road and ego-vehicle trajectory information (AOL module)
  – frontal object classification (MOC module)


52

Perception:

Road Data Fusion Example

53

Perception:

Frontal Object Perception Example

Example of output visualisation in an ADTF display: left, raw video; middle, grid-based fusion; right, 3D scene representation (different classes are colour-coded).

Frontal perception combines information from lidar and radar (grid-based fusion) and from the camera (appearance-based features) to reliably detect objects and their distances to the ego vehicle, and to assign classes to moving objects.
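A toy version of the grid-based part can be written down with the standard log-odds occupancy update, combining independent per-sensor occupancy probabilities for a single grid cell. The probabilities below are illustrative; the project's fusion runs over full lidar/radar grids with calibrated sensor models.

```python
# Standard log-odds occupancy update for one grid cell.
# Sensor hit probabilities are illustrative, not calibrated values.
import math

def logit(p):
    """Log-odds of a probability."""
    return math.log(p / (1.0 - p))

def fuse_cell(detections, prior=0.5):
    """Fuse independent per-sensor occupancy probabilities for one cell.

    detections : iterable of per-sensor occupancy probabilities in (0, 1)
    prior      : prior occupancy probability of the cell
    """
    l = logit(prior) + sum(logit(p) - logit(prior) for p in detections)
    return 1.0 / (1.0 + math.exp(-l))
```

Two moderately confident sensors (0.8 and 0.7) already push a cell past 0.9 occupancy, which is why agreeing lidar and radar evidence yields far more reliable object detection than either sensor alone.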

54

Perception: Instead of a conclusion… (work is still in progress)

Sensor data fusion research in SP2:
• Fusion of heterogeneous information from different sources
• Research on situation refinement based on the combination of Perception Module outputs (high-level information extraction)
• Testing the integration of different applications in interactIVe, exploiting the advanced fusion techniques developed in SP2
• 5 demonstrator vehicles, 2 development vehicles

…Perception of the (highly dynamic) automotive environment is a difficult and challenging task.

55

Thank you.

Dr. Angelos Amditis, ICCS – Lars Bjelkeflo, VTEC
Uri Iurgel, Delphi – Sinisa Durekovic, NAVTEQ

56

Thank you.

This work was also supported by the European Commission under interactIVe, a large-scale integrated project, part of the FP7-ICT for Safety and Energy Efficiency in Mobility.

The authors would like to thank all partners within interactIVe for their cooperation and valuable contribution.