
A Localisation and Navigation System for an
Autonomous Wheel Loader

Robin Lilja
Master’s Thesis, January 2011

Mälardalen University
School of Innovation, Design and Engineering


A Localisation and Navigation System for an Autonomous Wheel Loader

Written by
Robin Lilja

Supervisors
Torbjörn Martinsson (Volvo Construction Equipment)
Doctor (Ph.D) Giacomo Spampinato (Mälardalen University)

Examiner
Professor Lars Asplund

Mälardalen University
School of Innovation, Design and Engineering
Box 883, 721 23 Västerås
SWEDEN
Release Date: 2011-01-17

Edition: First

Comments: This Master's Thesis report is submitted as partial fulfilment of the requirements for the degree of Master of Science in Robotic Engineering. The report represents 30 ECTS points.

Images: Front logotype is a property of Mälardalen University; all others are produced by the author or obtained from Volvo CE.

Rights: © Robin Lilja 2011


Dedicated to my mother for her never-ending encouragement and support.


Abstract
Autonomous vehicles are an emerging trend in robotics, seen in a vast range of applications and environments. Consequently, Volvo Construction Equipment endeavours to apply the concept of autonomous vehicles to one of its main products. In the company's Autonomous Machine project an autonomous wheel loader is being developed. As an objective given by the company, a demonstration proving the possibility of conducting a fully autonomous load and haul cycle was to be performed.

Conducting such a cycle requires the vehicle to be able to localise itself in its task space and navigate accordingly. In this Master's Thesis, methods of solving those requirements are proposed and evaluated on a real wheel loader. The approach taken regarding localisation is to apply sensor fusion, by extended Kalman filtering, to the available sensors mounted on the vehicle, including odometric sensors, a Global Positioning System receiver and an Inertial Measurement Unit.
Navigational control is provided through a developed interface, allowing high level software to command the vehicle by specifying drive paths. A path following controller is implemented and evaluated.
The main objective was successfully accomplished by integrating the developed localisation and navigation system with the system existing prior to this thesis. A discussion of how to continue the development concludes the report; the addition of continuous vision feedback is proposed as the next logical advancement.
Keywords: Autonomous Vehicle, Sensor Fusion, Kalman Filtering, Path Following


Acknowledgements
First of all I would like to thank my nearest colleagues Niclas Evestedt and Jonathan Blom for all the hilarious discussions and moments we shared over the past months, making the long hours spent in the wheel loader endurable. Their technical and analytical feedback is appreciated as well. Special gratitude is directed to my friend and colleague Magnus Saaw for proof-reading this thesis. Special thanks to my supervisor at Mälardalen University, Dr. Giacomo Spampinato, for his competent and insightful discussions on Kalman filters. Dr. Martin Magnusson at Örebro University deserves special recognition for his work and assistance on the vision system. Thanks to Staffan Backen and Ulf Andersson at Luleå University of Technology for lending the DGPS equipment. Finally, thanks to my supervisor at Volvo CE, Torbjörn Martinsson, for his visionary inspiration and enthusiasm shown for the Autonomous Machine project.


Contents

List of Figures

List of Tables

1 Introduction
  1.1 Background
    1.1.1 Autonomous Machine Project
    1.1.2 Volvo Construction Equipment
    1.1.3 Wheel Loaders
  1.2 Problem Specification
  1.3 Objectives
  1.4 Safety
  1.5 Delimitations
  1.6 Thesis Outline
  1.7 Related Work
    1.7.1 Autonomous Ground Vehicles
    1.7.2 Autonomous Mining Equipment
  1.8 Platform

2 Coordinate Systems
  2.1 Local Planar Frame
  2.2 Vehicle Body Fixed Frame

3 Sensors
  3.1 Sensor Measurement Model
  3.2 Odometry Sensors
    3.2.1 Articulation Angle Sensor
    3.2.2 Rotary Encoder
  3.3 Inertial Measurement Unit
  3.4 Global Positioning System
    3.4.1 Dilution of Precision
    3.4.2 Differential GPS

4 Vehicle Modelling
  4.1 Related Work
  4.2 Wheel Slip
    4.2.1 Longitudinal Slip
    4.2.2 Lateral Slip
  4.3 Kinematic Model
    4.3.1 Model Verification

5 Sensor Fusion
  5.1 Multisensor Systems
  5.2 Kalman Filters
    5.2.1 Historical Background
    5.2.2 Linear Dynamic System Model
    5.2.3 Linear Kalman Filter
    5.2.4 Kalman Filter Extensions
    5.2.5 Implementation Methods

6 Localisation
  6.1 Full Model Kalman Filter
    6.1.1 Model
    6.1.2 Covariance Matrices
    6.1.3 Evaluation and Performance
  6.2 Parameter Estimating Kalman Filter
    6.2.1 Model
    6.2.2 Covariance Matrices
    6.2.3 Evaluation and Performance
  6.3 Slip Estimating Kalman Filter
    6.3.1 Model
    6.3.2 Covariance Matrices
    6.3.3 Evaluation and Performance

7 Control
  7.1 Vehicle Control
    7.1.1 Hydraulic Functions
    7.1.2 Speed and Brake
  7.2 Navigation Control
    7.2.1 Path Representation
    7.2.2 Path Following

8 System Design and Integration
  8.1 Communication
    8.1.1 Original Solution
    8.1.2 Revised Solution
  8.2 Interface
  8.3 Realtime System Design
  8.4 Integration

9 Conclusion
  9.1 Results
    9.1.1 Localisation
    9.1.2 Navigation
  9.2 Recommendations and Further Work

References

A Kinematic Model Calculations

B Circular Path Approximation

C Evaluation Course Descriptions
  C.1 Evaluation Course POND
  C.2 Evaluation Course WOODS

D Rehandling Cycle Demonstration

E Heuristic Drift Reduction

F Sensor Specifications
  F.0.1 Calibrated Outputs of Inertial Measurement Unit
  F.0.2 Global Positioning System Receiver


List of Figures

1.1  A wheel loader of the 120F model used as platform in the Autonomous Machine project.
1.2  An overview of the platform hardware.
2.1  Illustration of the local tangential plane and its Cartesian coordinate system.
2.2  The vehicle body fixed coordinate system.
3.1  Illustration of the conceptual idea of GPS positioning.
4.1  Illustration of the lateral slip angle and the related velocity vectors.
4.2  Schematic illustration of an articulated vehicle.
4.3  A schematic description of the soft sensor based on the full kinematic model.
4.4  Comparison between the angular velocity as measured by the gyro and the soft sensor.
4.5  The absolute error between the gyro measurement and the calculated angular velocity.
5.1  Direct pre-filtering implementation scheme.
5.2  Direct filtering implementation scheme.
5.3  Indirect feedforward implementation scheme.
5.4  Indirect feedback implementation scheme.
6.1  Path estimated by the full model filter, compared to the path measured by the GPS.
6.2  Estimation of the gyro bias.
6.3  Estimation of the average wheel radius.
6.4  The estimated path illustrated together with the path as measured by the GPS.
6.5  Estimation of the gyro bias.
6.6  Estimation of the average wheel radius.
6.7  Illustration of the estimated path in comparison with a true ground reference.
6.8  The positional error illustrated together with the horizontal dilution of precision.
6.9  The absolute orientation error illustrated.
6.10 The difference between the integrated gyro orientation and the orientation derived from GPS velocity components.
6.11 Illustration of the estimated lateral body slip together with the articulation angle.
7.1  The velocity measured by the rotary encoder illustrated together with the given setpoint.
7.2  The geometry of the pure pursuit algorithm.
7.3  The geometry relating the curvature to the articulation angle.
7.4  The estimated driven path illustrated together with the given waypoints.
7.5  Comparison between the articulation angle setpoint and the measured angle.
8.1  A schematic description of the implemented system.
8.2  Illustration of the subsystem architecture with a resource access selector.
B.1  Geometry of circular paths.
C.1  An approximation of the evaluation course POND.
C.2  The logged Horizontal DOP value during the three laps.
C.3  The evaluation course denoted as WOODS.
C.4  The logged Horizontal DOP value during the lap.
D.1  The complete autonomous load and haul cycle.
E.1  Heuristic drift reduction implementation scheme.
E.2  Comparison between gyro bias estimations conducted by an EKF and the HDR algorithm.


List of Tables

F.1  Accelerometer specification.
F.2  Rate gyroscope specification.
F.3  Magnetometer specification.
F.4  GPS receiver specification.


Nomenclature
This nomenclature lists the abbreviations and parameters used throughout the thesis; only the most commonly used variables are listed.

Parameters

R     Average wheel radius                                       [m]
L1,2  Distance between the articulation hinge and a wheel axle   [m]
P1,2  Wheel axle midpoint position                               [m, m]

Variables, Greek letters

ϕ     Articulation angle                [rad]
ϕ̇     Articulation angle rate           [rad s⁻¹]
v     Vehicle velocity                  [m s⁻¹]
ω     Wheel angular velocity            [rad s⁻¹]
θ     Vehicle orientation / yaw         [rad]
θ̇     Vehicle angular rate / yaw rate   [rad s⁻¹]
β     Body slip                         [rad]

Variables, Latin letters

b     Gyro bias             [rad s⁻¹]
l     Look-ahead distance   [m]
T     Engine throttle       [%]
E     Eastward position     [m]
N     Northward position    [m]


Abbreviations

Volvo CE  Volvo Construction Equipment
AGV       Autonomous Ground Vehicle
GPS       Global Positioning System
DGPS      Differential Global Positioning System
CEP       Circular Error Probability
HDOP      Horizontal Dilution of Precision
VDOP      Vertical Dilution of Precision
WGS84     World Geodetic System 1984
MEMS      Microelectromechanical Systems
ALVINN    Autonomous Land Vehicle In a Neural Network
ANN       Artificial Neural Network
IMU       Inertial Measurement Unit
LIDAR     LIght Detection And Ranging
ADC       Analogue-to-Digital Converter
DAC       Digital-to-Analogue Converter
GPIO      General Purpose Input/Output
DSP       Digital Signal Processor
KF        Kalman Filter
EKF       Extended Kalman Filter
UKF       Unscented Kalman Filter
LHD       Load-Haul-Dump (vehicle)
IIR       Infinite Impulse Response
PP        Pure Pursuit
SMA       Simple-Moving-Average
PID       Proportional-Integral-Derivative (regulator)
PI        Proportional-Integral (regulator)
UDP       User Datagram Protocol
TCP       Transmission Control Protocol
SLAM      Simultaneous Localisation and Mapping
MATLAB    MATrix LABoratory
IEEE      Institute of Electrical and Electronics Engineers
MDU       Mälardalen University
PIP       Packaged Industrial Personal Computer
GUI       Graphical User Interface
CAN       Controller Area Network
LSB       Least Significant Bit


Chapter 1

Introduction

1.1 Background

1.1.1 Autonomous Machine Project

The Autonomous Machine project is an initiative taken by Volvo Construction Equipment (Volvo CE) to develop and build autonomous wheel loaders and, in the long run, autonomous haulers. The project has been ongoing for three years, and the development has mainly been based on the work of Master's Theses.

1.1.2 Volvo Construction Equipment

Volvo CE is one of the leading companies in the construction equipment and heavy machinery market. The company's history dates back to 1950, when AB Volvo bought an agricultural machine manufacturer that was renamed Volvo BM AB. The company expanded globally during the 1980s and 1990s by purchasing companies in America, Asia and Europe. Today Volvo CE's product range is highly diverse, featuring, for instance, wheel loaders, haulers, crawler and wheeled excavators, motor graders, demolition equipment and pipelayers.

1.1.3 Wheel Loaders

A wheel loader is a versatile vehicle able to perform a wide variety of work tasks in many different environments. In its typical configuration a wheel loader is equipped with an actuator arm with two degrees of freedom, and a tool is attached at the endpoint of the arm. The most common tool is probably a bucket, but a vast selection of tools exists, suited for different situations and tasks.
High mobility and flexibility are obtained by utilising articulated steering, a type of steering where a hydraulic hinge divides the vehicle into two sections that can rotate relative to each other. Another characteristic feature of a wheel loader is the stiff front axle, which makes the vehicle stable and able to handle heavy loads. The rear axle is often freely suspended in a pivot for increased mobility in rough terrain.

Figure 1.1: A wheel loader of the 120F model used as platform in the Autonomous Machine project.

1.2 Problem Specification

The intention of this thesis is to enable the autonomous wheel loader used in the Autonomous Machine project to navigate in a given task space. In short, the problem at hand can be divided into two distinct parts. Firstly, the system needs to be capable of localising itself in its task space using the available sensors mounted on the vehicle. The second problem is to autonomously steer the wheel loader in its task space to and from given positions.
In order to solve both problems, it is necessary to understand how an articulated vehicle such as a wheel loader behaves and responds in terms of steering, i.e. a model needs to be derived accordingly. The localisation of the vehicle will be based on the readings of the vehicle's internal and external sensors. Therefore, it is essential to establish models of the sensors in terms of errors and noise. Another concern is how to combine the information provided by the sensor readings with respect to those errors and noise. Finally, steering the vehicle requires a stable and accurate control law, but also a way to represent and communicate the intended drive path.


1.3 Objectives

The main objective of this thesis is to conceptually demonstrate the possibility of utilising an autonomous wheel loader for conducting a simple and repetitive work task at a production site. The targeted production site in the Autonomous Machine project is an asphalt plant. Material rehandling is the simplest and most repetitive task found at such a site; gravel and similar materials are stockpiled at the site, and a wheel loader is utilised to transport the material from the stock into production. In the case of the asphalt plant, the different materials are unloaded into pockets leading to a conveyor belt feeding the plant.
The motivation of the main objective, and thus of this Master's Thesis, is as follows. A human driver becomes tired and unfocused when performing the same task for the duration of an entire shift. Two consequences arise: a tired driver is potentially dangerous and could cause lethal accidents or severe injuries, and productivity decreases. Another motivation is the ability of an autonomous vehicle to always drive in an economical way when the situation allows it, something that can be overlooked by an unfocused driver.

1.4 Safety

The vehicle is assumed to be operating in an enclosed and secured area. No humans will be present in the vicinity of the autonomous vehicle, nor in its task space, during autonomous operation. A wireless, industrially classified safety stop has been installed in the vehicle, allowing the responsible operator to shut it down; at shutdown the parking brake is automatically activated, bringing the vehicle to a halt.

All personnel involved in the development of this autonomous vehicle have been trained and certified for operating wheel loaders.

1.5 Delimitations

The vehicle's task space is assumed to be moderately planar; as a consequence, only planar motion will be treated. Trajectory drive paths are calculated or stored in strategic high level software, and thus no such planning will be conducted in the following work. Furthermore, obstacles located in the task space are assumed to be static. Obstacle avoidance will therefore not be a subject of this thesis.

1.6 Thesis Outline

This section is intended to give a brief overview of the contents of the chapters constituting this Master's Thesis report.
The coordinate systems used throughout the work presented in this Master's Thesis are defined in the brief chapter 2.
Chapter 3 gives a study of the internal and external sensors mounted on the vehicle. The study includes the modelling of the sensors and a discussion regarding the involved measurement errors.
Chapter 4 starts with a study of related work regarding the modelling of articulated vehicles. The study tries to outline the main aspects to consider during the modelling process. Thereafter a kinematic model is derived based on the findings of the study.
In chapter 5 the reader is introduced to the concept of multisensor systems and sensor fusion with Kalman filters; both the classical linear Kalman filter and the extended Kalman filter are presented. A short review of another common extension, the unscented Kalman filter, is also given. The chapter ends with an overview of different implementation methods and schemes.
Chapter 6 continues the report by applying the previously discussed techniques of sensor fusion to the problem of localising the vehicle in its task space. Thereafter the second problem, steering the vehicle, is addressed in chapter 7. The chapter also describes other aspects of the vehicle's control.
The design and work of integrating the developed solution into the existing system architecture is described in chapter 8; a description of the implemented communication interface is also given.
Chapter 9 finalises the report by discussing the findings and results made through the work of this thesis.

1.7 Related Work

Autonomous vehicles have traditionally been a subject of military usage. However, as with many technologies, they have emerged from the military sector into civilian applications as the technology has become more affordable. Autonomous vehicles are complex systems requiring computational power, and they are often equipped with a wide variety of sensors. The rapid progress of computers becoming smaller, more powerful and affordable enhances the possibility of developing autonomous systems without military funding. Regarding sensors, the technology of microelectromechanical systems (MEMS) has revolutionised sensors, both in price and in size. Today rather advanced sensors such as accelerometers and gyros can be found in cars, mobile telephones and video gaming devices.


The term autonomous is often interchanged with, and equated to, the term unmanned. The latter term refers only to a vessel that carries no human operator; hence it could, by definition, be remotely operated by a human. In contrast, an autonomous vehicle operates without a human operator's intervention.

1.7.1 Autonomous Ground Vehicles

One of the earliest designs of a fully autonomous ground vehicle (AGV) was the Autonomous Land Vehicle In a Neural Network (ALVINN), developed in 1993 at Carnegie Mellon University. The concept was based on training an artificial neural network (ANN) to follow a road. The road was sensed with an image array of 30 × 32 pixels. The system successfully drove 145 kilometres without human assistance, and was capable of driving on both dirt roads and paved roads with multiple lanes.
Probably the most recognised AGVs today are the participants of the DARPA Grand Challenge held in 2004 and 2005, and the DARPA Urban Challenge held in 2007. In the latest Grand Challenge, an off-road course stretching over 210 kilometres was the subject of autonomous driving with a time limit of 10 hours. The winning team from Stanford University completed the course in just under 7 hours [1]. The course was completed by 5 of the 23 participating teams; in the first challenge none of the 15 vehicles managed to complete the course. The Urban Challenge featured a 96 kilometre long urban course to be completed within 6 hours, won by Carnegie Mellon University, which completed the course in a little over 4 hours.
The participating vehicles were standard cars retrofitted with computers and sensors. Typical sensors are Global Positioning System (GPS) receivers, laser rangers (LIDAR), cameras, microwave radar and inertial measurement units (IMU). The large amount of data produced by the vehicles' sensor systems puts high demands on the software architecture; emphasis is on high efficiency and configurability. The high complexity of the involved control and optimisation problems led to the usage of adaptable and learning algorithms.

An interesting, and perhaps more closely related, project known as Autonomous Navigation for Forest Machines is part of the research conducted at Umeå University [2]. The project's purpose is the design and development of algorithms and techniques intended for the navigation of autonomous vehicles in off-road environments. A forest machine has successfully been used to autonomously drive along previously learnt paths. The vehicle is able to localise itself by using GPS in conjunction with laser ranging and odometry.

1.7.2 Autonomous Mining Equipment

One established actor providing autonomous vehicles for use in the underground mining industry is Sandvik, with its AutoMine system [28]. There are several benefits of autonomous vehicles in the mining industry. First of all, safety increases and the working conditions of the personnel improve, as they are removed from the hazardous environment of an underground mine. The more economical benefits are, among others, stated as increased productivity and lower maintenance cost.
The system is not fully autonomous, since the bucket load sequence is tele-operated by a human. However, one human operator is capable of controlling a number of vehicles, since they take turns loading their buckets. The communication is managed over a common wireless local area network with added realtime features.
There are several publications related to the autonomous loaders used in underground mines, especially regarding the modelling of such vehicles. Due to their similarity to wheel loaders, a more in-depth study of related work is given in chapter 4, which treats the modelling of the autonomous wheel loader.

1.8 Platform

The selected platform for the autonomous wheel loader is a conventional wheel loader of the 120F model, a model belonging to the midsize wheel loaders offered by Volvo CE. With a weight of 20 tonnes it is capable of lifting some 6 to 7 tonnes in normal operation, and it is typically used in applications involving material rehandling.
A ruggedised industrial computer, a PIP8 manufactured by MPL AG, featuring a realtime software environment, is used for interfacing the vehicle's different functions and sensors. The sensors are interfaced via RS232 and by the PIP8's analogue-to-digital converters (ADC). The realtime environment, known as xPC Target, is capable of running compiled Simulink models in realtime. The models are developed in the standard Simulink environment and then compiled through C to a single file that is downloaded to the PIP8 computer.
The hydraulic functions powering the actuator and the articulated steering are controlled by the PIP8's digital-to-analogue converters (DAC) via an electrical servo system. The throttle and the brake system are controlled in a similar manner. Utility functions with on-off characteristics are controlled by general purpose input/output (GPIO) ports. Besides the DAC and the GPIO interface, the PIP8 is connected to the vehicle's Controller Area Network (CAN) bus. At the current stage the CAN bus is mostly used for supervision purposes.


An additional computer is utilised for non-realtime tasks, denoted the high level strategic software. The high level software is developed in parallel with this thesis as a separate thesis [3]. The main responsibility of the high level software is to coordinate and command the vehicle to fulfil the current objective, and to provide a graphical user interface (GUI). This computer is connected to the PIP8 computer through a standard Ethernet connection.
Figure 1.2: An overview of the platform hardware (LIDAR and sensors, vision algorithms, strategic software with GUI, the realtime system, and the vehicle hardware).
A third computer, dedicated to the execution of vision algorithms on the data gathered from a SICK MLS291 laser ranger (LIDAR) mounted with a servo on the cab's roof edge, was added to the hardware architecture during the timeline of this thesis. The vision algorithms are able to detect piles and other objects that the vehicle autonomously needs to interact with. The algorithms are developed by Örebro University as a part of their research in the field of autonomous vehicles. Interaction with the vision algorithms is conducted by the high level software over Ethernet, utilising a TCP/IP interface. The interface delivers a vector, relative to the vehicle, to each found object.
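The wire format of this TCP/IP interface is not described in this chapter, so the following Python sketch is purely illustrative: it assumes, hypothetically, that each detected object is delivered as three little-endian doubles (x, y, heading) relative to the vehicle.

```python
import socket
import struct

RECORD_SIZE = 24  # 3 x 8-byte doubles per object -- an assumed wire format

def read_object_vector(sock: socket.socket) -> tuple:
    """Read one object vector (x, y, heading) relative to the vehicle
    from an open TCP socket to the vision computer."""
    payload = b""
    while len(payload) < RECORD_SIZE:
        # recv may return fewer bytes than requested, so loop until a
        # full record has been received.
        chunk = sock.recv(RECORD_SIZE - len(payload))
        if not chunk:
            raise ConnectionError("vision link closed")
        payload += chunk
    return struct.unpack("<3d", payload)  # little-endian doubles
```

In practice the high level software would keep such a connection open and poll it each planning cycle; the record layout above is an assumption for illustration, not the project's actual protocol.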



Chapter 2

Coordinate Systems
2.1 Local Planar Frame


Commonly, positions are given in geodetic coordinates comprised of latitude, longitude and altitude,
where the Earth is approximated by an ellipsoid. A widely known such system is the World Geodetic
System 1984 (WGS84), used for instance by the Global Positioning System (GPS).
However, since the vehicle is intended to travel only within a relatively small geographical region,
rendering the effect of the Earth's curvature rather insignificant, it is feasible to approximate that
region with a planar system. The planar system is given by the tangential plane to the Earth's
surface, fixed to a reference point located in the region of interest.
Figure 2.1: Illustration of the local tangential plane and its Cartesian coordinate system, with axes E (east), N (north) and U (up).


The transformation from longitude λ and latitude φ in the geodetic coordinate system to the eastward
E and northward N coordinates in the local tangential plane is done in accordance with

E = Re ∆λ cos(φ)
N = Re ∆φ

where Re is the Earth's radius at the current latitude, and

∆λ = λ − λreference
∆φ = φ − φreference.
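The projection above can be sketched in a few lines of code. The reference point and the Earth radius below are assumed values for illustration only (the text uses the radius at the current latitude, whereas the sketch takes a fixed mean radius):

```python
import math

# Assumed constants, not taken from the thesis:
R_E = 6371000.0                 # mean Earth radius [m], fixed approximation
LAT_REF = math.radians(59.6)    # hypothetical reference latitude
LON_REF = math.radians(16.5)    # hypothetical reference longitude

def geodetic_to_local(lat_deg, lon_deg):
    """Project geodetic coordinates onto the local tangential plane.

    Implements E = Re * dlon * cos(lat) and N = Re * dlat from the text.
    Returns (east, north) in metres relative to the reference point.
    """
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    east = R_E * (lon - LON_REF) * math.cos(lat)
    north = R_E * (lat - LAT_REF)
    return east, north
```

At the reference point itself the function returns (0, 0); one degree of latitude northwards maps to roughly 111 km of northing, as expected for a spherical approximation.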


2.2 Vehicle Body Fixed Frame

The vehicle body fixed frame is a coordinate system with its origin at the middle point of the
vehicle's rear wheel axle. The rear body of the wheel loader was selected to host the vehicle body
frame simply because all vital sensors are located there. The x-axis points in the forward direction
and is denoted the longitudinal axis. Perpendicular to the x-axis, the y-axis points to the left; it is
also referred to as the lateral axis. Lastly, the z-axis is specified to point skywards, which completes
the right-hand system.
Figure 2.2: The vehicle body fixed coordinate system.



Chapter 3

Sensors
In this chapter the reader is introduced to the sensors installed on the vehicle. Sensors that do not
concern the vehicle's translational movement are, for the sake of simplicity and comprehensibility,
omitted.

3.1 Sensor Measurement Model

It is crucial to understand what a particular sensor actually measures in order to utilise the obtained
data in an appropriate manner. The measurement of a physical quantity is not a direct reflection
of its true value, since it contains errors of different characteristics. In the simplified measurement
model (3.1.1) of an arbitrary physical quantity x, errors such as the scalar error kx, bias error bx,
normally distributed white noise ηx and quantisation error ∆x are taken into account. In reality
other errors exist, but they are assumed to be rather small in comparison to the errors already
accounted for in the model and are therefore neglected.

x̃ = (1 + kx)x + bx + ηx + ∆x        (3.1.1)

3.2 Odometry Sensors

In order to perform simple odometry one needs to measure the articulation angle and the linear
velocity of the vehicle, as shown later in chapter 4. The two sensors described below were mounted
prior to the work of this thesis.


3.2.1 Articulation Angle Sensor

The sensor measuring the articulation angle is a linear potentiometer working in the range 0.5 to 4.5
volts, where 0.5 volts corresponds to an angle of -60 degrees and 4.5 volts to 60 degrees. However,
only a limited span of the available range is utilised, since the mechanical range of the articulation
angle is between -36 and 36 degrees. The model below describes the sensor in terms of degrees

ϕ̃ = ϕ + bϕ + ηϕ        (3.2.1)

where the bias error bϕ has been experimentally determined to be 0.3 degrees and the noise ηϕ has
been approximated to have a standard deviation of 0.03 degrees.
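Model (3.2.1) with these numbers can be simulated directly, which is useful for testing downstream filters before running on the vehicle. The sketch below uses the bias and noise figures quoted above; the function names are hypothetical, not part of the thesis software:

```python
import random

B_PHI = 0.3       # experimentally determined bias [deg], from the text
SIGMA_PHI = 0.03  # noise standard deviation [deg], from the text

def measure_articulation(phi_true_deg, rng=random):
    """Simulate one potentiometer reading according to model (3.2.1):
    true angle plus constant bias plus Gaussian white noise."""
    return phi_true_deg + B_PHI + rng.gauss(0.0, SIGMA_PHI)

def compensate(phi_measured_deg):
    """Remove the known constant bias from a raw reading.
    The remaining error is only the zero-mean noise term."""
    return phi_measured_deg - B_PHI
```

Since the bias is constant and known, subtracting it leaves a reading whose error is dominated by the 0.03-degree noise, which is small relative to the ±36 degree mechanical range.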

3.2.2 Rotary Encoder

An encoder wheel together with two inductive sensors, mounted on the centre drive shaft delivering
power to the front and rear wheel axles, is used to calculate the angular velocity of the wheels. The
outputs from the inductive sensors are counted in a register, where one least significant bit (LSB)
correlates to a drive shaft rotation of 0.148 radians. An 18.37:1 drive shaft to wheel ratio gives a
wheel rotation of 0.008 radians per LSB. The wheel angular velocity derived from the encoder, by
differentiation and averaging, is modelled by

ω̃ = (1 + kω)ω + ∆ω        (3.2.2)


where kω and ∆ω denote a scalar error and a quantisation error, respectively. The scalar error kω is
believed to be rather small due to extensive calibration. The quantisation error ∆ω has been
approximated to ±0.05 radians per second. The vehicle's linear velocity is calculated in accordance
with

ṽ = Rω̃        (3.2.3)

where R denotes the average wheel radius. However, the wheel radius will change over time depending
on tyre wear and the current load of the wheel loader. It would therefore be beneficial to estimate
the radius with, for instance, a Kalman filter (KF), which will be further investigated in chapter 6.
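The chain from count register to linear velocity can be sketched as follows. The count scaling and gear ratio are the figures from the text; the wheel radius is an assumed placeholder value, since the thesis treats it as a quantity to be estimated:

```python
RAD_PER_LSB_SHAFT = 0.148  # drive shaft rotation per count [rad], from the text
GEAR_RATIO = 18.37         # drive shaft to wheel ratio, from the text
RAD_PER_LSB_WHEEL = RAD_PER_LSB_SHAFT / GEAR_RATIO  # ~0.008 rad per count

WHEEL_RADIUS = 0.65  # average wheel radius [m]; assumed value, not from the text

def linear_velocity(delta_counts, delta_t):
    """Differentiate the count register over an interval to get the wheel
    angular velocity, then apply v = R * omega (eq. 3.2.3)."""
    omega = delta_counts * RAD_PER_LSB_WHEEL / delta_t  # [rad/s]
    return WHEEL_RADIUS * omega                         # [m/s]
```

Replacing the constant `WHEEL_RADIUS` with a filtered estimate is exactly the refinement motivated above, since any radius error scales the velocity proportionally.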
