Wireless Vehicular Blind-Spot Monitoring
Method and System
by
Chen Liu
Xiaodong Xu
Final report submitted in partial satisfaction of the requirements for the degree of
Bachelor of Science
in
Electrical and Computer Engineering
in the
Faculty of Engineering
of the
University of Manitoba
Faculty Supervisor:
Prof. Robert McLeod
March 11, 2013
© Copyright by Chen Liu, Xiaodong Xu, 2013
Abstract
The wireless vehicular blind-spot monitoring method and system we designed is a novel smartphone-based platform that helps drivers detect objects within a predefined perimeter of the vehicle. The purpose is to detect and monitor objects encroaching upon a vehicle's blind-spot area and to warn the driver if a potential collision is detected. This blind-spot monitoring system is designed for vehicles that do not have a factory-equipped blind-spot detection system. The system leverages the smartphone as the user interface; therefore, it can be easily installed on any type of car to enhance safety.
The method and system uses two ultrasonic sensors on the vehicle to detect approaching objects, including other vehicles, pedestrians, and other possible obstacles, in the blind-spot area. The feedback received from these sensors is wirelessly communicated to a smartphone application. Once any object is detected within a predefined area or warning distance, the application activates a visual warning to notify the driver that an object is in the blind-spot area and otherwise obstructed from view. When the method and system detects a potential collision, based on the rate of change of the distance between the vehicle and the object, an audio alert is issued to further warn the driver and ensure safe vehicle operation.
Based on initial testing, the designed method and system achieves the desired results. It correctly detects objects in the detection area and issues the visual warning and audio alert promptly in the application. Furthermore, the system is capable of handling the interference caused by slight changes in the distance between the vehicle and the object; as such, the audio alert occurs only when the vehicle moves towards the detected object or vice versa. Therefore, the design goal was achieved. This work demonstrates the feasibility and flexibility of this wireless vehicular blind-spot monitoring method and system.
Contributions
The project describes a novel method for monitoring a vehicle's blind-spot region. The major contribution of this project is to bring a safety feature, currently available only on high-end vehicles, to all vehicles at low cost. In addition, the project demonstrates the idea of combining Bluetooth Low Energy technology with a modern smartphone for a sensor-based application. Unlike other vehicle safety accessories, this project is extremely portable and power-source independent, the manual effort of installation is minimal, and the smartphone application would be accessible for free via an online application store.
Group Members’ Contributions
The development process as a whole was divided into three major categories: the monitoring and warning system, the detection system, and the collision prediction algorithm. For each major category, several subtasks were assigned to each member based on the nature of the task (e.g., hardware or software) and each member's interests.
Chen Liu:
• Bluetooth Low Energy framework implementation.
• Detection system design and implementation.
• Driver circuit design and implementation.
Xiaodong Xu:
• Monitoring and warning system design and implementation.
• Collision prediction algorithm implementation.
Chen Liu & Xiaodong Xu:
• Collision prediction algorithm design and simulation.
• Initial system testing.
Assistance from University of Manitoba
We had several discussions with our supervisor about the collision prediction algorithm, during which we shared ideas about its working principle and optimization. Our supervisor gave us valuable feedback on how we could improve the efficiency and accuracy of the collision prediction algorithm, and also lent us some electronic components used in this project. The ECE Machine Shop gave us advice on designing a printed circuit board and helped us fabricate it. The ECE tech-shop also gave us valuable advice on part selection and lent us some electronic components.
Acknowledgments
The authors of this report would like to thank our supervisor, Prof. Robert McLeod, for his help and comments on our algorithm design, for proofreading this report, and for lending us components for this project. Thanks to the University of Manitoba ECE tech-shop for lending us electronic components, and to the connectBlue support team for providing the software development kit. We also thank Blair Yoshida, Prof. Behzad Kordi, Aidan Topping, and Senthil Thiruppathi for their valuable feedback and comments on this report.
Table of Contents
Abstract
Contributions
Acknowledgments
Table of Contents
List of Figures
List of Tables
Nomenclature
1. Introduction
2. Background
3. System Overview
4. Design Decisions
4.1. Display and Control Unit
4.2. Wireless Technology
4.3. Detection Sensor
4.4. Bluetooth Module and Development Board Selection
4.5. Mounting Method
4.6. Power Source
5. The Monitoring and Warning System
5.1. Bluetooth Low Energy Transceiver Module
5.2. System Server
5.3. Collision Prediction Module
5.4. GPS Service Module
5.5. View Controllers
5.6. Warning Method
5.7. Graphical User Interface
5.8. User Experience Design
6. The Detection System
6.1. SRF05 Ultrasonic Sensor
6.2. Software for OLP425 Module
6.2.1. Software for Bluetooth Low Energy Communication
6.2.2. Software Interacting with the SRF05 Ultrasonic Sensor
6.3. Interfacing the OLP425 and SRF05 Ultrasonic Sensor
6.4. Power Consumption for the Detection System
7. Collision Prediction Algorithm
7.1. Working Principle
7.2. Methods to Avoid False Alarms
8. Project Testing
8.1. Ultrasonic Sensor Test
8.2. Bluetooth Low Energy Transceiver Module Test
8.3. Collision Prediction Algorithm Test
8.4. System Initial Test
9. Problems
9.1. iOS Programming Problems
9.2. Embedded System Programming Problems
10. Conclusion
10.1. Future Work
11. Budget Summary
References
Appendix A: System Server Program Code
Appendix B: Collision Prediction Module Program Code
Appendix C: Blind-spot Monitoring View Controller Program Code
Appendix D: SRF05 Ultrasonic Sensor Control Program Code
Vita
List of Figures
Figure 2-1: Mercedes blind-spot detection indicator shown in the red rectangle
Figure 2-2: Volvo blind-spot detection sensor and indicator
Figure 3-1: System overall architecture
Figure 4-1: SRF05 ultrasonic sensor
Figure 4-2: SRF05 beam pattern
Figure 4-3: Arduino board and OLP425 Bluetooth Low Energy module size comparison
Figure 5-1: Smartphone application software architecture
Figure 5-2: System server
Figure 5-3: View controllers
Figure 5-4: Warning method
Figure 5-5: Application home view
Figure 5-6: Application configuration view
Figure 5-7: Connecting process prompt
Figure 5-8: Connecting failed prompt
Figure 5-9: Detection system connected prompt
Figure 5-10: Blind-spot monitoring scene with GPS disabled/enabled
Figure 5-11: The presence of object detected
Figure 5-12: The potential danger detected
Figure 5-13: Devices found
Figure 5-14: Detection system disconnected
Figure 5-15: Wrong request warning prompt
Figure 6-1: Block diagram of the detection system
Figure 6-2: I/O pins for SRF05 sensor unit
Figure 6-3: SRF05 timing diagram in mode 1
Figure 6-4: Advertising data packet structure overview
Figure 6-5: Flowchart for Bluetooth Low Energy initialization process
Figure 6-6: Flowchart for write attribute function
Figure 6-7: Flowchart for distance measurement process
Figure 6-8: Driver circuit schematic
Figure 6-9: Pulse stream for SRF05 trigger input
Figure 6-10: Pulse stream of echo output with a maximum amplitude of 3.3 V
Figure 7-1: Timeline diagram between two measurements
Figure 7-2: Scenario of another vehicle passing by our vehicle
Figure 7-3: Scenario of our vehicle moving back and forth towards the other vehicle
Figure 8-1: Arduino development board connects with the SRF05 ultrasonic sensor
Figure 8-2: Arduino with the Bluetooth Low Energy shield and ultrasonic sensor
Figure 8-3: Detection system prototype on breadboard
List of Tables
Table 8-1: Results from collision prediction algorithm false positive test
Table 11-1: Budget summary for both software and hardware
Nomenclature
Arduino: An open-source microcontroller development platform.
AA battery: A standard size of battery.
ABC: American Broadcasting Company.
Blind-spot: An area around the vehicle that cannot be directly observed by the driver through the rear-view and side-view mirrors.
Callback: A function passed to an API to be called at a later time; in other words, a function that is invoked through a function pointer.
Digital inverter: A device that implements the logic NOT operation; an input at logic level high is converted to logic level low at the output.
Direction register: A register that controls the direction of a digital pin; it can set a specific pin to be either an input or an output.
ConnectBlue: A Sweden-based company focusing on Bluetooth-enabled products.
ECE: Electrical and Computer Engineering.
Functional test: Testing the application against the application's specification.
Home view: The home screen view of an iOS application.
GATT: Generic Attribute Profile.
GPIO: General-purpose input and output.
Graphical user interface: A type of user interface that allows users to interact with electronic devices using images rather than text commands.
I/O: Input and output.
IAR: A software company focusing on embedded development tools.
mAh: Milliampere-hour, a unit used to indicate battery capacity.
Main function: The function where a program starts execution.
RedBearLab: A company that designs Arduino Bluetooth Low Energy shields.
SDK: Software development kit.
Shoulder check: Looking backwards over one's shoulder while driving, before changing lanes, to see whether any vehicles are in the blind-spot area.
SoC: An integrated circuit (IC) that integrates all components of an electronic system into a single chip.
Uniform velocity: Constant velocity; the moving object has no acceleration.
UUID: Universally unique identifier.
Xcode: An Integrated Development Environment (IDE) containing a suite of software development tools developed by Apple for developing software for OS X and iOS.
1. Introduction
The purpose of this report is to describe the development of a wireless vehicular blind-
spot monitoring method and system for conveying important information to a driver-operator to
ensure safe vehicle operation. This method and system includes an audio and visual vehicle blind-
spot detection and warning system for assisting a driver in checking the status of blind-spot areas.
The method and system is based on methods for detecting the presence and proximity of vehicles,
obstacles, and pedestrians, particularly in the vehicle blind-spot area.
The system comprises wireless sensor and communications means for the transmission of
status and information to a smartphone for subsequent data processing. It further comprises a
graphical, visual indication system established in conjunction with a smartphone application in
order to determine and convey the status of objects around the perimeter of the vehicle.
Operationally, if the distance between the vehicle and a detected object (or objects) falls within a predetermined threshold, the smartphone application activates a visual warning to notify the driver of the proximity of the tracked object. In addition, the smartphone application issues an audio alert for any potential danger of collision or contact if the vehicle and object appear to be approaching each other while in close proximity. The main purpose of this
project is directed toward facilitating safe and controlled vehicle navigation in the presence of
static and moving objects.
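The two-stage warning behaviour described above can be summarized in a small sketch. This is a hypothetical illustration only, not the project's actual implementation: the threshold values and function names are invented for clarity.

```python
# Hypothetical sketch of the two-stage warning logic described above.
# Threshold values are illustrative, not the project's actual settings.
WARNING_DISTANCE_M = 2.0   # visual warning when an object is this close
MIN_APPROACH_SPEED = 0.3   # m/s; audio alert only if the gap is closing

def classify(distance_m, approach_speed):
    """Return the warning level for one measurement.

    distance_m: current distance to the tracked object (metres)
    approach_speed: rate at which the gap is closing (m/s, positive = approaching)
    """
    if distance_m > WARNING_DISTANCE_M:
        return "none"              # object outside the monitored perimeter
    if approach_speed > MIN_APPROACH_SPEED:
        return "visual+audio"      # potential collision: object closing in
    return "visual"                # object present but not approaching
```

The key point is that the audio alert requires both conditions at once: the object must already be inside the warning perimeter and the distance must be shrinking.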
2. Background
Nowadays, road safety is a significant concern as vehicle density increases. Among the various causes of car collisions, the inability to completely check the vehicle blind-spot areas (regions that are difficult for the driver to observe through the mirrors, typically near the rear sides of the vehicle) plays an important role. A research study conducted by the American Psychological Association indicates that failure to check a blind-spot area is the most common driving behavioral mistake [1]. Another study, conducted by Professor Joanne M. Wood, states that blind-spot checking is most often neglected when changing lanes: in that study, drivers failed to check their blind-spot area on 63% of trials. Further compounding the problem, the study also reported that a considerable proportion of participants never, or only occasionally, made an appropriate blind-spot check [2]. According to Accident Exchange, an analysis of 50,000 car accidents revealed that the number of crashes caused by drivers failing to check the vehicle blind-spot area has risen by 50% over the last two years [3]. Furthermore, checking the blind-spot area divides attention between the forward view of the road and awareness of other vehicles beside and behind the driver. This division of attention creates potential dangers even for a driver who routinely checks the blind-spot area.
Therefore, in order to reduce avoidable car accidents and enhance vehicle safety, developing a vehicle blind-spot monitoring system would be greatly beneficial. With the assistance of an automated blind-spot detection system, the driver can know the status of the vehicle blind-spot areas in real time without requiring a shoulder check. Moreover, drivers can concentrate on the forward view of the road without the distraction of turning their heads to check the blind-spot area. This considerably helps drivers ensure safe vehicle operation, especially when considering a lane change.
Nowadays, many car manufacturers have adopted and integrated blind-spot detection systems into their new cars. However, this technology has only been applied to high-end or luxury cars, so it serves only a small percentage of drivers. Furthermore, the blind-spot detection systems they use have to be factory-equipped: the systems must be integrated into the vehicle body during manufacturing. For instance, as shown in Figure 2-1 and Figure 2-2, Mercedes embeds the blind-spot detection indicator into the side mirrors, and Volvo integrates the blind-spot detection sensor into the side mirror frames.
Figure 2-1: Mercedes blind-spot detection indicator shown in the red rectangle [4].
Figure 2-2: Volvo blind-spot detection sensor and indicator [5].
Obviously, these factory blind-spot detection systems cannot easily be applied to other vehicles. Therefore, the blind-spot detection systems in use today reach only a limited number of people. This drawback highlights the advantages of our blind-spot monitoring system: it is flexible and feasible for various vehicles, especially those that do not have a factory-equipped blind-spot detection system. It can be installed and repaired with minimal human effort. In addition, because of its low cost, it is affordable for the majority of people.
3. System Overview
The blind-spot monitoring system consists of two detection systems and one monitoring
and warning system. The overall architecture of the system is illustrated in Figure 3-1.
Figure 3-1: System overall architecture.
These two detection systems are deployed to monitor and detect objects in the blind-spot
areas. The detection systems transfer the associated information of the detected object to the
monitoring and warning system wirelessly. Each detection system comprises a distance sensor,
wireless transceiver, and power supply. The monitoring and warning system is responsible for
processing the data received from detection systems and warning the driver properly. The
monitoring and warning system is established on a display and control unit, which also provides a
user interface for the user to interact with system.
Due to budget limitations, the project as currently designed has only one detection system, which means the present blind-spot monitoring system can monitor the blind-spot area on only one side of the vehicle. In the future, we will implement the other detection system to complete the entire project.
4. Design Decisions
This section describes the design decisions we made during the design phase of this project. These include the selection of the display, wireless technology, detection sensor, Bluetooth module, mounting method, and power source. Each subsection explains the reasons for our selection in detail.
4.1. Display and Control Unit
We decided to adopt a modern smartphone as the display and control unit for the monitoring and warning system. Since smartphones are widely used nowadays, there are three benefits to using one as our display:
1. The smartphone's powerful graphical display capabilities can facilitate and enhance blind-spot area monitoring.
2. The smartphone provides a powerful processor and a comprehensive application programming interface (API) to support the algorithm development.
3. The cost of the project is reduced substantially because the project does not need to include a dedicated display and central control unit.
For the choice of smartphone, we selected the iPhone 4S as our target platform for two primary reasons:
1. Due to budget limitations, we were restricted to the smartphones we already had. In this case, the iPhone 4S and Samsung Galaxy Nexus were the only options.
2. As the project requires the smartphone to support Bluetooth Low Energy (the reason for choosing Bluetooth Low Energy is explained in section 4.2), the iPhone 4S was the only viable choice at the time.
4.2. Wireless Technology
As our project uses wireless communication between the sensor unit and the control/display unit, the choice of wireless technology was an important decision. For a project based on a wireless sensor, the most appropriate wireless technologies are Bluetooth, IEEE 802.11 (Wi-Fi), and ZigBee. Among these options, only the ones supported by smartphones could be considered, since the smartphone must be able to transmit and receive the wireless signal directly to and from the sensor. Therefore, we disregarded ZigBee because it is not built into any smartphone at present. In addition, as our blind-spot detection system is not embedded in the vehicle, an external power source, such as a battery, is required. To achieve longer battery life, the power consumption of the system is a critical issue. Based on these criteria, we chose Bluetooth over IEEE 802.11 (Wi-Fi), as it consumes less power while providing enough bandwidth for data transmission.
Furthermore, in 2011 Bluetooth v4.0 introduced a low-energy technology, Bluetooth Low Energy, enabling devices that can operate for months or even years on tiny coin-cell batteries [6]. With its extremely low power consumption and exceedingly short set-up time, Bluetooth Low Energy is ideal for applications requiring episodic or periodic transfer of small amounts of data [7]. Accordingly, this technology is well suited to our project. Using Bluetooth Low Energy for data transmission not only extends the service life of our detection system but also saves considerable power on the user's smartphone. Therefore, we decided to use Bluetooth Low Energy for the wireless communication.
4.3. Detection Sensor
Several types of sensors are commonly used to measure the distance to an object: infrared sensors, microwave sensors, radar sensors, and ultrasonic sensors. Infrared sensor performance is degraded by sunlight; in addition, readings may vary depending on the object's surface, color, and shade [8]. Therefore, the inaccurate distance measurements of an infrared sensor cannot meet the requirement of conveying correct information to a driver-operator to ensure safe vehicle operation. A microwave sensor may detect undesirable movements due to its strong penetration, meaning that movements behind the object's surface may also be detected [9]. These undesirable detections would interfere with the blind-spot monitoring results and could trigger false alarms. Finally, the radar sensor is far more expensive than the other options and was not selected, as this would violate our design objective of low cost.
Compared to the other options, the ultrasonic sensor best matches the requirements of our project for the following reasons:
1. It is less affected by target materials and surfaces, and it is not affected by color or sunlight.
2. It is resistant to external disturbances such as vibration, infrared radiation, ambient noise, and EMI radiation [10].
3. Discrete distances to moving objects can be detected and measured.
4. Ultrasonic transducers are low cost and widely available.
Therefore, the ultrasonic sensor is ideal for detecting the presence of static and moving objects in our project.
An SRF05 ultrasonic sensor (shown in Figure 4-1) was available, and it had sufficient detection range (detailed in section 5.6) with low power consumption and low cost. Therefore, the SRF05 was selected as our detection sensor. In addition, according to the beam pattern of the SRF05 ultrasonic sensor, shown in Figure 4-2, the beam angle is about 55°. Thus, if the sensor is mounted appropriately (i.e., no object within a 30 cm radius in front of the ultrasonic emitter), the ground and the vehicle's own parts will not block or reflect the ultrasonic beam, so the detection results can be trusted with more certainty.
Figure 4-1: SRF05 ultrasonic sensor.
Figure 4-2: SRF05 beam pattern.
4.4. Bluetooth Module and Development Board Selection
Since the project uses a Bluetooth Low Energy connection, as mentioned in section 4.2, selecting the right Bluetooth Low Energy development board was important. After our initial research, we found that the majority of Bluetooth Low Energy devices operate on the Texas Instruments CC2540, an SoC with a built-in 8051 microprocessor. Texas Instruments offers two development kits for the CC2540: the CC2540 Development Kit and the CC2540 Mini Development Kit. The CC2540 Development Kit costs $299.00; due to our budget limitation, we chose the CC2540 Mini Development Kit, which costs $99.00. However, after researching it, we found that the CC2540 Mini Development Kit does not provide any GPIO pins for development purposes. As a result, we had to give up on the CC2540 Mini Development Kit, because using GPIO pins to interact with the ultrasonic sensor is required for our project. We then found another CC2540-based Bluetooth Low Energy development kit, the OLP425 from connectBlue. The OLP425 Development Kit costs $62.99 and has three easy-to-solder GPIO pins for developers to use. The only drawback of the OLP425 Development Kit is that it needs a CC Debugger from Texas Instruments, which costs $50.00. The combination of the OLP425 Development Kit and the CC Debugger costs $112.99, which is within our budget. At this point, we reported our development board selection to our supervisor and were fortunate to find that he had a CC Debugger and would lend it to us for free.
However, the software license for the OLP425's development tool, IAR Embedded Workbench for 8051, costs more than one thousand dollars. Alternatively, we found a free evaluation license for IAR Embedded Workbench for 8051 with a 30-day time limit. It was obvious that 30 days would not be enough to build the entire project, as it could take a couple of weeks to learn how to program the OLP425, since neither of us had programmed it before. We therefore decided to first build an Arduino-based prototype for testing purposes, since Arduino is an open-source platform and is easy to program. In addition, a Bluetooth Low Energy shield for Arduino is available from RedBearLab for $33.00, and Chen Liu had an Arduino development board that could be used for this project. As a result, we decided to test the iPhone Bluetooth Low Energy communication framework, the ultrasonic sensor, and the collision prediction algorithm on the Arduino platform first. Once the collision prediction algorithm and the iPhone Bluetooth Low Energy framework were working properly, we would focus on programming and interfacing the OLP425 Bluetooth Low Energy module. In this way, we would be able to program the OLP425 module within the 30-day time frame and avoid the high cost of the IAR software license.
Using the Arduino platform for the entire project was considered as well. However, it was not adopted for the following reasons:
1. As shown in Figure 4-3, the Arduino board is much larger than the OLP425 module. Because one of the purposes of this project is to make the sensor unit as portable as possible, the OLP425 module is a much better choice.
2. The CC2540-based OLP425 is much more power efficient than the Arduino board.
3. Arduino is mainly intended for hobbyist projects rather than professional products.
Figure 4-3: Arduino board and OLP425 Bluetooth Low Energy module size comparison.
After consideration, we decided not to use the Arduino platform for our final prototype, since an Arduino-based system would not meet the requirements of our project.
4.5. Mounting Method
Because our blind-spot detection system is not embedded in the vehicle, the system has to be mounted externally; therefore, the mounting method is important for the project. There were two major considerations when developing a mounting method: the attachment method and the mounting position of the sensor unit. Several attachment methods were proposed. One was to use epoxy to attach the sensor unit to the bottom of the vehicle; this method could be used for both passenger vehicles and trucks. Another option, for a passenger vehicle, was a customized frame that attaches to the rear jack point of the vehicle. When using epoxy, the attachment unit would require two separate pieces, because the user must be able to remove the sensor unit to replace the batteries. In addition, the mounting position should differ between passenger vehicles and trucks: for passenger vehicles we considered mounting the unit in front of the rear wheel, while for a truck we would mount it behind the rear wheel. The reasons are the following:
1. Passenger vehicles and trucks have different blind-spot areas.
2. Some passenger vehicles do not have factory-equipped mudguards, so the sensor unit could be damaged by splashing mud if mounted behind the rear wheel.
Because this project focused on developing the detection unit and the smartphone application, no final decision about the mounting method was made. Developing an easy-to-use and reliable mounting method therefore remains future work.
4.6. Power Source
We considered the following criteria when selecting our power source: capacity, availability, and price. Based on these criteria, we decided to use four AA batteries in series as the power source for this project. The biggest advantages of AA batteries are their low cost and high availability. The capacity of a typical AA battery is about 2800 mAh, which is sufficient for a passenger-car use case (a detailed power consumption analysis is in section 6.4).
However, this topic remains open: depending on the use case, such as a commercial vehicle or bus, a higher-performance battery may be required for the system.
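As a rough sanity check on battery life, a back-of-the-envelope estimate divides capacity by average current draw. The 2800 mAh figure is from the text above; the average current value below is a placeholder assumption, not a measured figure (the actual power consumption analysis is in section 6.4).

```python
# Rough battery-life estimate for the detection system.
# CAPACITY_MAH is from the text; AVG_CURRENT_MA is a hypothetical
# average draw for the sensor plus BLE module, used only to illustrate
# the calculation (see section 6.4 for the real analysis).
CAPACITY_MAH = 2800.0
AVG_CURRENT_MA = 30.0  # assumed average current (mA)

hours = CAPACITY_MAH / AVG_CURRENT_MA
print(f"Estimated runtime: {hours:.0f} hours (~{hours / 24:.0f} days)")
```

Even with this conservative assumed draw, four AA cells would last several days of continuous operation, which supports the "sufficient for a passenger-car use case" claim.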
5. The Monitoring and Warning System
The monitoring and warning system is responsible for presenting the status of the vehicle blind-spot areas and issuing visual and audio alerts to properly warn the driver. These alerts are implemented in a smartphone application. The monitoring and warning system comprises five major elements: the Bluetooth Low Energy transceiver module, system server, collision prediction module, GPS service module, and view controllers (Figure 5-1). These elements cooperate to translate the raw data received from the detection system into effective warning events.
Figure 5-1: Smartphone application software architecture.
5.1. Bluetooth Low Energy Transceiver Module
The Bluetooth Low Energy transceiver module provides an interface for Bluetooth Low Energy communication services. It controls the Bluetooth Low Energy connection, disconnection, and data transmission. Once the application navigates to the blind-spot monitoring view, this module automatically searches for the detection systems (on both sides of the vehicle). If the detection systems are found, the Bluetooth Low Energy transceiver module connects to them immediately. If not, the view controller displays a notification message so that the user can reconnect to the detection systems when they become available. After the connection is established, this module listens on the receiving data port to check whether the data has been updated. When new data comes in, it is sent directly to the system server for processing. Meanwhile, if the user requests disconnection, this module disconnects from the detection systems at once.
5.2. System Server
The system server provides a central platform for processing data and handling user
requests. For data processing, the mechanism is briefly described in Figure 5-2. Overall, the
system server needs to accomplish three subtasks. First, since the data received from the
Bluetooth Low Energy transceiver module represents the flight time of the reflected ultrasonic
wave, the system server must convert it to a valid object distance value (refer to section 6.1 for
details). Based on this distance value, the system server determines whether it should send a
signal to the view controller to evoke or clear the visual alert. If the object distance is greater
than the warning threshold, the data is disregarded and the system server does not proceed to
the subsequent subtasks. Second, if the object distance falls within the warning threshold, it is
recorded. From the object's previous distance, its current distance, and the time elapsed
between these two detections, the system server calculates the object's instantaneous approach
speed and passes it to the collision prediction module, where it is taken into account when
calculating the object's average approach speed. Third, from the collision prediction module,
the system server extracts the predicted time remaining before a potential collision or contact
takes place. Based on this predicted time, the system server determines whether it should send a
signal to the view controller to trigger the audio alert.
Moreover, the system server also takes the user’s requests from the view controller and
assigns corresponding tasks for the Bluetooth Low Energy transceiver module. At present, the
application only allows the user to connect and disconnect the detection systems. Refer to
appendix A for the system server program code.
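The three subtasks above can be summarized in a short sketch. The following Python model is purely illustrative (the application's actual code is in appendix A and is not written in Python); it assumes the 340 m/s speed of sound from section 6.1 and the 1.5 m warning threshold from section 5.6, and the function and parameter names are ours:

```python
SPEED_OF_SOUND = 340.0   # m/s (section 6.1)
WARN_DISTANCE_M = 1.5    # visual-alert threshold (section 5.6)

def process_sample(echo_us, prev, now_s):
    """One pass of the server's three subtasks.

    echo_us -- echo pulse duration from the sensor, in microseconds
    prev    -- (distance_m, time_s) of the last recorded sample, or None
    Returns (distance_m, visual_alert, approach_speed_mps_or_None).
    """
    # Subtask 1: convert flight time to object distance (round trip, so halve).
    distance = echo_us * 1e-6 * SPEED_OF_SOUND / 2.0
    if distance > WARN_DISTANCE_M:
        return distance, False, None      # disregarded; visual alert cleared
    # Subtask 2: record the distance and derive the instantaneous approach speed.
    speed = None
    if prev is not None:
        prev_d, prev_t = prev
        if now_s > prev_t:
            speed = (prev_d - distance) / (now_s - prev_t)  # >0 means approaching
    # Subtask 3 (collision prediction and the audio alert) consumes `speed`.
    return distance, True, speed
```

For example, an echo of 10000 µs maps to 1.7 m and is disregarded, while 5000 µs maps to 0.85 m and raises the visual alert.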
Figure 5-2: System server.
5.3. Collision Prediction Module
The collision prediction module calculates the remaining time before any potential
collision. This module implements a weighted moving average algorithm (see section 7 for a
detailed description) to obtain the object's average approach speed, and then uses the current
object distance to calculate the time to collision. The collision prediction module is significant
for the warning system: not only does it determine the audio alert, but its accuracy also
considerably affects the performance of the warning system. If the predicted time is longer than
expected, the warning system will not be able to alert the driver promptly. Conversely, if the
predicted time is shorter than expected, the warning system becomes so sensitive that the
number of false alarms increases. Refer to appendix B for the implementation program code.
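A minimal sketch of such a weighted moving average follows. The report's exact weighting scheme is given in section 7; linearly increasing weights are assumed here only for illustration:

```python
def weighted_average_speed(speeds):
    """Weighted moving average of recorded approach speeds, newest sample last.

    Linearly increasing weights (1, 2, ..., n) are an illustrative assumption:
    recent samples influence the average more than older ones.
    """
    weights = range(1, len(speeds) + 1)
    return sum(w * s for w, s in zip(weights, speeds)) / sum(weights)
```

The module then divides the current object distance by this average speed to obtain the predicted time to collision.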
5.4. GPS Service Module
The GPS service module is used to track the vehicle's speed in order to enable and
disable the audio alert intelligently. Generally, when the vehicle is moving at a low speed,
objects are more likely to be in close proximity to the vehicle, such as when parking. In this
case, the audio alert would be evoked frequently to warn the driver. However, the audio alert is
unnecessary at such times since the vehicle is operating safely. To remove these unnecessary
audio alerts, the vehicle's speed can be used to determine whether the audio alert should be
enabled. For this reason, the GPS service module is integrated into the monitoring and warning
system. Once the GPS service is enabled, it frequently updates the vehicle's speed to the system
server. Based on the vehicle's speed, the system server then determines whether it should block
the signal for evoking the audio alert. With the contribution of the GPS service, the warning
system is able to filter out unnecessary alerts and provide better performance across all
situations.
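The speed gate can be sketched as below. The cutoff value is a hypothetical placeholder, since the report does not fix a specific speed threshold here, and the function name is ours:

```python
LOW_SPEED_CUTOFF_KMH = 20.0   # hypothetical value; not specified in the report

def audio_alert_allowed(gps_enabled, vehicle_speed_kmh):
    """Decide whether the system server should pass the audio-alert signal on.

    With GPS disabled (the default), the audio alert is always allowed,
    matching the behaviour described in section 5.7.
    """
    if not gps_enabled:
        return True
    return vehicle_speed_kmh >= LOW_SPEED_CUTOFF_KMH
```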
5.5. View Controllers
The view controller manages the application's user interface. It controls the visual
appearance and displayed contents, and it also handles transitions between the views. In fact,
the view controller is an essential link between the visual appearance and the application's
internal data. Furthermore, it provides a display view through which the user interacts with the
application.
Figure 5-3: View controllers.
The monitoring and warning system includes three view controllers, for the home view,
configuration view, and blind-spot monitoring view respectively, as shown in Figure 5-3. First,
the home view is the first view shown when the application is launched. The home view
controller primarily provides links to the configuration view and the monitoring view, through
which the user can navigate to the requested view. Second, the configuration view controller
provides an interface to receive the user's request to enable or disable the GPS service; the
system then triggers the GPS service internally according to the received request. Third, the
blind-spot monitoring view controller, which is the most important, controls the visual and
audio alerts while the vehicle's blind-spot areas are being monitored. Furthermore, it provides
an interface for the user to start and stop the entire monitoring and warning system.
Additionally, if the GPS service is enabled, the blind-spot monitoring view controller also
displays the vehicle's speed (see Figure 5-10). Refer to appendix C for the program code of the
blind-spot monitoring view controller.
5.6. Warning Method
The warning method includes a visual alert and an audio alert, which are evoked
according to the level of potential danger. The visual alert notifies the driver that an obstacle
exists in the vehicle's blind-spot areas. When the level of danger escalates, the audio alert is
evoked to warn the driver of the detected potential danger of collision or contact. Figure 5-4
shows the warning method mechanism.
Figure 5-4: Warning method.
The standard urban lane width is 3.6 meters [11], and the total outside width of any
vehicle and its load should not exceed 2.6 meters in North America and Europe [12]. As such,
the normal distance between two vehicles in adjacent lanes should be approximately 1 to 1.5
meters. To provide an effective warning, the warning system only evokes the visual alert if the
detected object distance is less than or equal to 1.5 meters. Otherwise, the detected object is
disregarded, since its position would not pose a potential safety hazard to the vehicle.
After the visual alert is enabled, the warning system starts running the collision
prediction algorithm (discussed in detail in section 7). If the collision prediction algorithm
returns a positive result, meaning it predicts that a collision will happen, the warning system
sounds the audio alert until the collision prediction algorithm returns a negative result.
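Putting the two alert levels together, the warning decision can be sketched as follows. This is an illustrative model, not the application's code; the time-to-collision threshold value is a placeholder, as section 7 discusses the actual prediction criterion:

```python
TTC_THRESHOLD_S = 2.0   # illustrative threshold; see section 7 for the real criterion

def warning_state(distance_m, predicted_ttc_s):
    """Map the detected distance and predicted time-to-collision to alerts.

    Returns (visual_alert, audio_alert). The audio alert can only be active
    while the visual alert is, since prediction starts after the visual alert.
    """
    visual = distance_m <= 1.5
    audio = (visual and predicted_ttc_s is not None
             and predicted_ttc_s < TTC_THRESHOLD_S)
    return visual, audio
```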
5.7. Graphical User Interface
The main purpose of our system's graphical user interface is to provide a simulated
graphical view for monitoring the blind-spot areas and to allow users to interact with the
system. The visual design and interaction design of the graphical user interface directly affect
the user experience. Furthermore, the graphical view of the blind-spot areas directly affects how
well the user understands their status. This section describes the graphical user interface design
for the monitoring and warning system.
The graphical user interface is implemented by a smartphone application. Once the
application starts, it will go to the home view as shown in Figure 5-5. It presents a “Start” button
and a “Setting” icon. The “Start” button is used to start to monitor the vehicle blind-spot areas,
and the “Setting” icon is for application configuration.
Figure 5-5: Application home view.
When the “Setting” icon is pressed, the view switches to the configuration view, shown
in Figure 5-6. At present, the application only allows the user to enable and disable the GPS
service. This view also provides a note informing the user of the role of GPS in the monitoring
and warning system. The GPS is used to track the vehicle's speed so that the audio alert is
disabled at low speed and enabled at high speed. Using GPS removes unnecessary audio alerts
when the vehicle is moving in a narrow space at low speed, such as when parking. If the GPS is
enabled, the vehicle's speed is displayed in the blind-spot monitoring view. However, the GPS
service has high power consumption, and it keeps running even when the application is
switched to the background. If the GPS is disabled
(default setting), the audio alert will be enabled all the time. The “Home” button here is for
switching back to the home view.
Figure 5-6: Application configuration view.
If the “Start” button is pressed in the home view, the application will go to the view for
monitoring blind-spot areas. Once the view is loaded, the application will try to connect to the
blind-spot detection system automatically. The user will be informed of this process by the view
as shown in Figure 5-7.
Figure 5-7: Connecting process prompt.
If the application cannot find the detection system, the application will pop up a prompt
(Figure 5-8) to notify the user that no detection system can be found.
Figure 5-8: Connecting failed prompt.
If the connection is established successfully, the connection indicator will be changed
from a red square to a green square at the top. In addition, a notification message will be
presented (Figure 5-9) to inform the user, and will disappear automatically after 3 seconds.
Figure 5-9: Detection system connected prompt.
Once the blind-spot detection system is connected, the SRF05 ultrasonic sensor is
activated for detection. At this point, the entire system is ready to monitor the vehicle's
blind-spot areas. When the blind-spot areas are clear, the monitoring view is presented as
shown in Figure 5-10.
Figure 5-10: Blind-spot monitoring scene with GPS disabled/enabled.
Whenever the system detects the presence of an object within the predefined perimeter of
the vehicle's blind-spot areas, a visual alert is evoked and maintained until the detected object
exits the predefined warning area. In the present application, a red rectangle is shown on either
side of the vehicle image to indicate the presence of an object (Figure 5-11). When any
potential danger of collision is detected, an audio alert is played until the potential danger no
longer exists. A warning image appears as well to emphasize the detected danger (Figure
5-12).
Figure 5-11: The presence of object detected. Figure 5-12: The potential danger detected.
In the monitoring view, there are four buttons available for controlling the system. The
“Home” button is used to switch back to the home view. The user can use this button to navigate
to the configuration scene in order to change the settings. The “Device” button is for listing the
found devices, as shown in Figure 5-13. The “Connect” and “Disconnect” buttons are used to
connect and disconnect the detection system.
Figure 5-13: Devices found.
Once the detection system is disconnected, the monitoring and warning system stops
working. At the same time, the SRF05 ultrasonic sensor is switched off to save power. The
connection indicator at the top then changes to red, and a prompt as shown in Figure
5-14 pops up to confirm the disconnection.
Figure 5-14: Detection system disconnected.
Furthermore, the application provides a protection mechanism to prevent incorrect
operation. Specifically, the application disregards connect or disconnect requests from the user
if the detection system is already connected or disconnected. A prompt is presented if an
invalid request is made (Figure 5-15).
Figure 5-15: Wrong request warning prompt.
Overall, this graphical user interface provides an easy-to-use operating interface for
users and efficiently presents the monitoring and warning system.
5.8. User Experience Design
Considering the user experience, the application has two additional features. First, the
application can continue running when it is switched to the background, which means the
audio alert can still be evoked for a detected potential danger even while the application is in
the background. The driver can therefore lock the smartphone and rely on the audio alert to
ensure safe vehicle operation. The purpose of this feature is to considerably reduce the
smartphone's power consumption, since the application's view does not have to be displayed all
the time. Moreover, the audio alert provides enough time for the driver to correct a dangerous
vehicle maneuver. Nevertheless, the tradeoff of running the application in the background is
the loss of additional protection, because the audio alert only occurs once the driver's maneuver
has already resulted in a potential danger. To enhance safety, the ideal usage of the application
is to show the blind-spot monitoring view at all times. In this way, the status of the blind-spot
areas is continuously reported to the driver, preventing dangerous maneuvers in advance, and
the visual and audio alerts together provide double protection. Based on this
requirement, the second additional feature of the application is that the smartphone's automatic
screen lock is disabled. Once the application navigates to the blind-spot monitoring view, the
screen stays on until the user exits this view, at which point the automatic screen lock is
enabled again.
6. The Detection System
This section describes the detailed implementation of the detection system. As shown in Figure
6-1, the detection system includes an SRF05 ultrasonic sensor, a driver circuit for interfacing the
OLP425 module with the SRF05 ultrasonic sensor, the OLP425 Bluetooth Low Energy module,
and a battery power source.
Figure 6-1: Block diagram of the detection system.
In this section, we will discuss each component in detail. In addition, we will also discuss the
power consumption and battery life of the detection system.
Furthermore, as mentioned in section 4.4, the Arduino platform based prototype was only for
Bluetooth communication and algorithm testing purposes, so we will not give a detailed
description of it here.
6.1. SRF05 ultrasonic sensor
For this project we selected the SRF05 ultrasonic sensor; the reasons for this choice were
discussed in section 4.3. The SRF05 ultrasonic sensor has two operating
modes: mode 1 and mode 2. In mode 1, the trigger input and the echo output are
separate. In mode 2, the sensor unit uses the same pin for both trigger input and echo output.
For this project we use mode 1, because the digital I/O pins of the OLP425 can only operate at
a logic level of 3.3V, while the SRF05 ultrasonic sensor requires a 5V trigger input and outputs
a 5V pulse on the echo output. Both the trigger input and the echo output therefore require a
driver circuit (described in section 6.3), and they must use separate pins because voltage
amplification and voltage step-down require different driver circuits.
The SRF05 ultrasonic sensor has 5 I/O pins and 5 reserved pins as shown in Figure 6-2.
Figure 6-2: I/O pins for SRF05 sensor unit [13].
To operate in mode 1, pin#4 has to be left unconnected. In addition, the sensor unit requires a 5V
power supply connected to pin#1 and ground to pin#5. Pin#2 is the echo output from the sensor
unit and pin#3 is the trigger input for the sensor unit.
As shown in Figure 6-3, in mode 1 the SRF05 sensor unit requires a 10 microseconds input
pulse for the trigger pin in order to initiate one measurement.
Figure 6-3: SRF05 Timing diagram in mode 1 [13].
After receiving the trigger pulse, the sensor unit sends out an ultrasonic sound wave. At the
same time, the echo output on the SRF05 sensor unit starts generating a pulse at logic 1. When
the sensor unit receives a reflected ultrasonic sound wave, the logic level on the echo output
switches to 0 (with a minimum pulse duration of 100 microseconds). Otherwise, if the sensor
does not receive the reflected wave, it switches the echo output to logic 0 after a 30 millisecond
time period [13].
After we receive the echo output from the ultrasonic sensor, we need to calculate the
detected distance based on the duration of the echo output pulse. For the SRF05, the pulse
duration of the echo output indicates the total travel time of the ultrasonic sound burst that the
sensor sent out. The speed of sound in air is about 340 m/s. To calculate the distance between
the ultrasonic sensor and the detected object, we multiply the total travel time by the speed of
sound to get the total traveling distance of the sound wave, then divide the result by two,
because the sound wave travels twice the distance in total (it travels to the detected object and
is reflected back).
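As a worked check of this conversion (a sketch; the function name is ours):

```python
SPEED_OF_SOUND_M_S = 340.0  # speed of sound in air, as used in the text

def echo_to_distance_m(pulse_us):
    """Convert an SRF05 echo pulse duration (microseconds) to distance (meters).

    The pulse covers the round trip, so the one-way distance is half of the
    travel time multiplied by the speed of sound.
    """
    round_trip_m = (pulse_us * 1e-6) * SPEED_OF_SOUND_M_S
    return round_trip_m / 2.0
```

For example, an echo pulse of about 5882 µs corresponds to roughly 1 m, and the 30 ms timeout pulse corresponds to 5.1 m, well beyond the 1.5 m warning threshold.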
6.2. Software for OLP425 Module
To develop the software for the OLP425 module, we use the SDK provided by the
connectBlue support team. The SDK for the OLP425 module is based on a Bluetooth
Low Energy framework from Texas Instruments (Bluetooth Low Energy stack version 1.2.1).
The SDK from connectBlue provides all necessary functions for Bluetooth Low Energy
physical communication. As mentioned in section 4.4, the OLP425 Bluetooth Low Energy
module is a development board based on Texas Instruments' CC2540, a SoC solution for
Bluetooth Low Energy communication. It packs an 8051 ultra-low-power microcontroller along
with a Bluetooth Low Energy transceiver in a single chip. The Bluetooth Low Energy stack
provided by Texas Instruments gives us a software framework to utilize the Bluetooth Low
Energy functionality within the chip. In addition, the 8051 ultra-low-power microcontroller
provides all the functionality we need for data processing, such as timers and GPIO. Beyond
the CC2540 SoC, the OLP425 module also has a temperature sensor and an accelerometer.
The software built for the OLP425 has two major parts: one part is responsible for
Bluetooth Low Energy communication, and the other is responsible for interacting with the
SRF05 ultrasonic sensor unit.
6.2.1. Software for Bluetooth Low Energy communication
Bluetooth Low Energy uses the GATT protocol for data transmission between a server
device and a client device. In general, a device that holds the data, such as a Bluetooth Low
Energy enabled sensor device, acts as a server, and the device that requests the data acts as a
client. To establish a data communication channel via Bluetooth Low Energy, the server first
needs to enter advertise mode, in which it keeps broadcasting its advertising data packet. The
server's advertising data packet contains a device name, service UUID, and service
characteristic UUIDs. The device name indicates the name of the device. The service UUID is
a unique identifier for each individual service available from the server; each Bluetooth Low
Energy server can contain multiple services. Each service can contain several characteristics,
and each characteristic represents a specific functionality of the service, such as reading data or
writing data. Figure 6-4 shows the basic structure of an advertising data packet.
Figure 6-4: Advertising data packet structure overview.
The length of both the service UUID and the characteristic UUIDs can be either 16 bits or 128
bits. Meanwhile, the client initiates a connection request when it discovers the server. After the
connection has been established, the server turns off its advertise mode and starts interacting
with the client.
For this project, the OLP425 module acts as the server and the smartphone application
acts as the client. Because only one service is available from the server, namely the distance
measurement service, we assign only one service UUID. The distance measurement service
contains two characteristics: one to read the sensor data and the other to identify the position of
the sensor (e.g. left or right). We use 16-bit UUIDs throughout, as this is a prototype project; in
the future we will change the length of our UUIDs to 128 bits to avoid UUID conflicts with
other devices. For the client to read data from the server, we use a Bluetooth Low Energy
feature called notification. Once the server has notifications enabled, the client receives data
from the server whenever the server has new data.
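The service layout described above can be modelled as data. This is only an illustration of the structure; the 16-bit UUID values below are placeholders, not the project's actual assignments:

```python
# Placeholder 16-bit UUIDs; the project's real values are not given in the text.
DISTANCE_SERVICE = {
    "device_name": "BlindSpotSensor",
    "service_uuid": 0xFFF0,
    "characteristics": {
        0xFFF1: "sensor data (read, with notification)",
        0xFFF2: "sensor position ID (left or right)",
    },
}

def advertising_packet(server):
    """Assemble the fields broadcast while the server is in advertise mode."""
    return {
        "device_name": server["device_name"],
        "service_uuid": server["service_uuid"],
        "characteristic_uuids": sorted(server["characteristics"]),
    }
```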
We wrote two software modules to process the Bluetooth Low Energy communication
for our system. The first module is to handle the general Bluetooth Low Energy initialization
process. The second module, which is the sensor service program, handles all the client requests
for the sensor service.
Figure 6-5 illustrates the flowchart for the general Bluetooth Low Energy initialization
process. As shown in the flowchart, we first pass all necessary parameters to the Bluetooth Low
Energy framework. Then we initialize the sensor service program. At that point, we start the
advertise mode for the OLP425 module. Because we want to minimize the power consumption
of the system, the module turns off advertise mode and goes to sleep after a 30 second period.
While the OLP425 module is in sleep mode, it keeps checking its accelerometer reading. The
OLP425 module wakes up and starts advertising again if the accelerometer detects a movement
of the module. The rationale for this design is that whenever a driver gets into the car, the car
makes a small movement, which wakes the Bluetooth Low Energy module up.
Figure 6-5: Flowchart for Bluetooth Low Energy initialization process.
The first part of the sensor service program is service initialization, which is called
from the other module during the general initialization process. Within the service initialization
process, we pass all the UUIDs for our service, including the service UUID and both
characteristic UUIDs, to the Bluetooth Low Energy framework. In addition, we register two
callback functions with the Bluetooth Low Energy framework. The first callback function is
the write attribute function, which is called by the Bluetooth Low Energy framework when
there is a write request or a notification status change request for the sensor service. Figure 6-6
shows the flowchart for the write attribute function.
Figure 6-6: Flowchart for write attribute function.
For this project we only need to check whether the notification status has changed; we can then
turn the periodic sensor-reading task on or off depending on the current notification status. The
other callback function is the read attribute function, which is called by the Bluetooth Low
Energy framework when there is a read request for the sensor service. The read attribute
function checks the incoming request; if the request asks for the sensor position ID, the
function sends back the current sensor position ID. For this prototype, we hard-coded the
sensor position to be on the left. In the future, a hardware interface, such as a button, could be
added to change the sensor's position so the sensor unit can be used on either the left or the
right side of the vehicle.
6.2.2. Software for interacting with the SRF05 ultrasonic sensor
Figure 6-7 illustrates the flowchart of the distance measurement process, which reads
data from the SRF05 ultrasonic sensor. As mentioned in section 6.1, to send a distance sensing
request to the SRF05 ultrasonic sensor we need to send a trigger pulse with a minimum
duration of 10 microseconds. We accomplish this by first setting the trigger output pin to logic
level high. Then a 10 microsecond delay is generated using an empty while loop. After the
delay, we reset the trigger output pin to logic level low, so the SRF05 ultrasonic sensor
receives a 10 microsecond pulse. After sending the pulse, we wait for the sensor echo output
pin to rise to logic level high and start the timer, which means the SRF05 ultrasonic sensor has
started waiting for the sound wave to be reflected back. When the sensor echo output pin
switches back to logic low, the timer is stopped, which means the distance measurement
process is finished. There are three timers available on the OLP425 module; we chose timer 1
at a 1 MHz clock frequency. From section 6.1, we know that the maximum pulse duration from
the SRF05 ultrasonic sensor's echo output is 30 milliseconds. Every count in the counter
register represents 1 microsecond at the 1 MHz clock frequency. As a result, the value within
timer 1's counter register is the duration of the echo pulse from the SRF05 ultrasonic sensor in
microseconds. After we read the timer counter register's value, we send it to the smartphone
application. The detection system then waits 25 milliseconds before starting the next
measurement cycle. The program code for this function can be found in Appendix D.
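The timing arithmetic in this process can be summarized as follows (a sketch; the names are ours, and the actual firmware is in Appendix D):

```python
TIMER_CLOCK_HZ = 1_000_000   # timer 1 runs at a 1 MHz clock
MAX_ECHO_MS = 30             # SRF05 timeout pulse width (section 6.1)
CYCLE_WAIT_MS = 25           # delay between measurement cycles

def counts_to_echo_us(counter_value):
    """At a 1 MHz clock, each timer count equals exactly one microsecond."""
    return counter_value * 1_000_000 // TIMER_CLOCK_HZ

def worst_case_sample_interval_ms():
    """Largest gap between samples reaching the phone (used again in section 7)."""
    return CYCLE_WAIT_MS + MAX_ECHO_MS   # 25 ms wait + 30 ms echo = 55 ms
```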
Figure 6-7: Flowchart for distance measurement process.
6.3. Interfacing the OLP425 and SRF05 ultrasonic sensor
As mentioned in section 6.1, the OLP425 module operates at a logic level of 3.3V and
the SRF05 ultrasonic sensor operates at a logic level of 5V. In addition, the power source, four
AA batteries connected in series, provides a 6V supply. It is therefore important to design a
driver circuit for the system, as the components operate at different voltage levels.
Figure 6-8: Driver circuit schematic.
Figure 6-8 shows the schematic for the entire driver circuit. The first goal of the driver
circuit is to provide a 5V power supply for the SRF05 ultrasonic sensor and a 3.3V power
supply for the OLP425 module. We use two voltage regulators to achieve this. The first voltage
regulator, U1 in the schematic, takes the 6V input and regulates its output to 5V, which we
connect to the power supply pin of the SRF05 ultrasonic sensor. The second regulator, U2 in
the schematic, is fed from the 5V output of U1 and regulates its output to 3.3V. At this point,
both the SRF05 ultrasonic sensor and the OLP425 module have a correctly regulated power
supply. As mentioned in section 6.1, the trigger input of the SRF05 ultrasonic sensor requires a
pulse with an amplitude of 5V, whereas the output from the OLP425 module is at 3.3V. As a
result, we need to amplify this trigger signal to 5V. Because the pulse we generate has a 10 microsecond
duration, the highest frequency of this pulse stream is 100 kHz. We decided to use a 5V
digital inverter as our amplifier for the following reasons:
1. A simple analog amplifier would not be able to operate at 100 kHz.
2. The digital inverter does not require any extra driver circuitry to operate, and it is
very easy to use.
To amplify the signal using the digital inverter, we simply connect one of its output
pins to another of its input pins. As shown in the schematic, the trigger signal from the OLP425
drives the first inverter gate, and that gate's output connects to the input of a second gate. The
second gate's output is then a signal with the same logic level as the trigger input signal, but
with an amplitude of 5V or 0V. As a result, the trigger input signal generated by the OLP425
module is amplified from 3.3V/0V to 5V/0V. Figure 6-9 shows the amplified pulse, with a
pulse duration of 10 microseconds, that we generated for the SRF05 trigger input from the
OLP425 module.
Figure 6-9: Pulse stream for SRF05 trigger input.
The other goal of this driver circuit is to step down the echo output pulse signal of the
SRF05 ultrasonic sensor from 5V to 3.3V. To achieve this, we designed a voltage divider with
three 3.3 kΩ resistors, as shown in the schematic. Figure 6-10 shows a screenshot from the
oscilloscope, which displays an echo output pulse stream from the SRF05 ultrasonic sensor
with a maximum amplitude of 3.3V.
Figure 6-10: Pulse stream of echo output with a maximum amplitude of 3.3V.
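The divider arithmetic can be checked quickly. The resistor placement below (one resistor on top, two in series below the tap) is assumed from the description, since the schematic itself fixes the exact topology:

```python
def divider_output(v_in, r_top, r_bottom):
    """Output voltage of a resistive divider, tapped across r_bottom."""
    return v_in * r_bottom / (r_top + r_bottom)

# One 3.3 kOhm resistor on top, two in series below: 5 V * 2/3, about 3.33 V,
# close enough to the OLP425's 3.3 V logic level.
v_echo = divider_output(5.0, 3300.0, 3300.0 + 3300.0)
```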
6.4. Power consumption for detection system
Because one of the goals of this project is for the system to run as long as possible, power
consumption is critical. To calculate the total power consumption of our detection system, we
need to sum the current draw of the SRF05 ultrasonic sensor and the OLP425 module. The
SRF05 draws 4 mA while operating [13]. For the OLP425, the main power consumption comes
from the CC2540 chip. An application note from Texas Instruments specifies a method for
calculating the average current draw of the CC2540 during a connection event [14]. Based on
this application note, we calculated the average current draw of our OLP425 module during a
connection event to be about 0.92 mA. As a result, the total current draw of the detection
system, while operating, is 4.92 mA. The current draw while the detection system is sleeping
can be neglected, because it is below 1 µA.
As mentioned in section 4.6, we chose four AA batteries in series as the power source.
A typical AA battery has a capacity of about 2800 mAh, meaning it can supply 2800 mA
constantly for an hour. Given the average operating current draw of 4.92 mA, dividing the
battery capacity by the average current draw gives a total of about 569 hours of battery life for
the detection system.
A study from ABC shows that Americans drive about 87 minutes a day on average [15].
As a result, the detection system can work for about 392 days on average, which is a
reasonably long battery life for this project.
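The battery-life arithmetic can be reproduced directly:

```python
CAPACITY_MAH = 2800.0        # typical AA cell capacity
ACTIVE_CURRENT_MA = 4.92     # SRF05 (4 mA) + OLP425 (about 0.92 mA)
DRIVE_MIN_PER_DAY = 87.0     # average US daily driving time [15]

battery_hours = CAPACITY_MAH / ACTIVE_CURRENT_MA          # about 569 hours
battery_days = battery_hours * 60.0 / DRIVE_MIN_PER_DAY   # about 392 days
```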
7. Collision Prediction Algorithm
A false audio alarm can distract the driver's attention. Therefore, developing a reliable collision
prediction algorithm that keeps the false positive rate as low as possible is important for this
project. In this section, we describe the working principle of our collision prediction algorithm
and the methods used to avoid false alarms. The application program code used to implement
this algorithm is in Appendix B.
7.1. Working Principle
The working principle behind our collision prediction algorithm is called the time-to-collision
rule. We developed our time-to-collision rule based on a side-collision avoidance rule called the
time-to-line-crossing rule. Research done by Shannon Hetrick states that the time-to-line-crossing
rule, which is similar to our time-to-collision rule, is the most promising alternative to
the turn-signal rule for vehicle side collision avoidance [16]. Because the turn-signal rule,
which relies on the driver activating the turn signal when changing lanes, is not feasible for this
project, the time-to-line-crossing rule is our best choice. We made a small modification to the
time-to-line-crossing rule used in Hetrick's research. As stated in that research, the time-to-line-crossing
rule depends on the time duration between the current time and the time when the
vehicle crosses the lane boundary. For this project, we adjust this rule to estimate the time
duration between the current time and the time of the potential collision, based on the vehicle's
lateral velocity. The algorithm then compares this time duration with a given threshold, and
predicts a potential collision if the estimated time duration is smaller than the given threshold.
In order to determine the time between the current moment and the moment of a
potential collision, we need two variables: the vehicle's lateral velocity and the distance between
the vehicle and the detected object.
As mentioned in section 6, we can get the distance between our vehicle and the object
from the detection module. We still need the vehicle's current lateral velocity.
As mentioned in section 6, the time interval between distance measurements is 25
milliseconds, and the maximum time for the SRF05 ultrasonic sensor to perform one
measurement is 30 milliseconds. As a result, the maximum time interval between data points
received by the smartphone application is 55 milliseconds. Because 55 milliseconds is a very
short time and the vehicle's lateral velocity changes slowly, the change in velocity within this 55
millisecond window can be neglected. We therefore model the vehicle's lateral velocity as a
uniform velocity between two consecutive distance measurement results. The formula for
uniform velocity during time T is:
V = D / T
In this formula, D represents the object's distance change during time T. To find D, we take the
difference between the two consecutive distance measurement results; this difference is the
vehicle's lateral distance change between the two measurements. We also need the time duration
T between the two measurements. Figure 7-1 shows a timeline diagram of two consecutive
measurement operations.
Figure 7-1: Timeline diagram between two measurements.
As shown in Figure 7-1, the time duration between two consecutive measurements runs
from time point Ta to time point Tb. As mentioned in section 6.2.2, the interval between
measurement tasks is the 25 millisecond delay before each measurement. In addition, T1 and T2
represent the travel times of the ultrasonic sound wave from the detected object back to the
detection system; we can find both T1 and T2 by dividing the corresponding distance by the
speed of sound in air. The time duration between two consecutive distance measurements is
therefore the sum of T1, T2 and 25 milliseconds. Finally, we divide the distance difference by
this time difference between the two consecutive distance measurements, which gives the
vehicle's lateral velocity at a single time point.
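As an illustration, the velocity computation above can be sketched in plain C. The shipped application implements this in Objective-C (see the `processData:` method in Appendix A); the function name and cm-based units here are our own:

```c
#define SOUND_SPEED_M_PER_S 340.0   /* speed of sound in air */
#define MEASURE_DELAY_S     0.025   /* 25 ms delay between measurement tasks */

/* Lateral velocity (cm/s) from two consecutive distance readings (cm).
 * T = T1 + 25 ms + T2, where T1 and T2 are the one-way travel times of
 * the sound wave for the previous and current readings. The result is
 * positive when the object is approaching. */
double lateral_velocity_cm_per_s(double prev_cm, double curr_cm) {
    double t1 = (prev_cm * 0.01) / SOUND_SPEED_M_PER_S;  /* cm -> m, then /340 */
    double t2 = (curr_cm * 0.01) / SOUND_SPEED_M_PER_S;
    double t = t1 + MEASURE_DELAY_S + t2;  /* total time between readings */
    double d = prev_cm - curr_cm;          /* lateral distance change */
    return d / t;
}
```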
Another important variable in the collision prediction algorithm is the timing
threshold. According to Hetrick's research, the timing threshold for the time-to-line-crossing rule
should be more than 1.25 seconds [16]. However, we found that in a smartphone application
there is a small delay between starting the audio function and the smartphone's speaker actually
playing the audio. We therefore set our timing threshold to 2 seconds, so that by the time the
driver hears the audio alert, the time remaining before the collision is still longer than 1.25
seconds.
After we find our vehicle's lateral velocity and its side distance to the detected
object, we divide the distance by the lateral velocity to get the estimated time before any
potential collision. The algorithm then compares this estimated time with the timing threshold to
determine whether a collision is about to happen.
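A minimal sketch of this threshold comparison, assuming the lateral velocity is positive when the object is approaching (the function name and structure are ours, not the application's):

```c
/* Predict a collision when distance / averaged lateral velocity falls
 * below the 2 s threshold (chosen to leave at least 1.25 s after the
 * smartphone's audio start-up delay). Returns 1 for a predicted
 * collision, 0 otherwise. */
#define TTC_THRESHOLD_S 2.0

int collision_predicted(double distance_cm, double avg_velocity_cm_per_s) {
    if (avg_velocity_cm_per_s <= 0.0)  /* object steady or moving away */
        return 0;
    double ttc_s = distance_cm / avg_velocity_cm_per_s;
    return ttc_s <= TTC_THRESHOLD_S;
}
```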
7.2. Methods to Avoid False Alarms
For our collision prediction algorithm, we need to avoid false positives as much as possible so
that the driver is not distracted by unnecessary audio alarms. We found two common scenarios
that would cause false positives with our collision prediction algorithm.
Based on the working principle of the algorithm, if the detection system reports a large
jump between two measurements, the algorithm will predict a collision, because the computed
lateral velocity will be very high. As shown in Figure 7-2, this happens when another vehicle
passes our vehicle on either side.
Figure 7-2: Scenario of another vehicle passing by our vehicle.
The false alarm arises because the detection system returns a large distance value when there is
no vehicle beside us. As shown in Figure 7-2, when another vehicle first enters our detection
range, the difference between the previous distance measurement (no vehicle next to us) and the
current one (the other vehicle has just entered the detection range, discussed in section 5.6) is
large, so a false alarm is generated. To avoid this type of false alarm, we add a conditional check
after receiving each distance value: if the value is outside our detection range, the algorithm
discards the measurement. The prediction process therefore only starts once another vehicle is
actually within the detection range. With this conditional check, we can completely eliminate the
false alarm caused by a passing vehicle.
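The conditional check described above can be sketched as follows; `DETECTION_RANGE_CM` is an assumed placeholder for the range discussed in section 5.6 (1.5 m is the maximum range used in the test of section 8.3):

```c
#define DETECTION_RANGE_CM 150.0  /* assumed maximum detection range (section 5.6) */

/* Returns 1 if the reading should enter the prediction pipeline,
 * 0 if it must be discarded (and the measurement history reset)
 * because the reflection came from outside the detection range. */
int reading_in_range(double distance_cm) {
    return distance_cm > 0.0 && distance_cm <= DETECTION_RANGE_CM;
}
```

Discarding out-of-range readings means the first in-range reading after a gap has no valid predecessor, so no velocity (and hence no alarm) is computed from the large jump.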
As shown in Figure 7-3, the other common scenario that can cause a false alarm is when
our vehicle moves back and forth towards the detected object.
Figure 7-3: Scenario of our vehicle moving back and forth towards the other vehicle.
This type of false alarm is hard to eliminate, because the system cannot easily predict from the
current distance measurements whether our vehicle will keep approaching the detected object or
move away from it. To mitigate it, we decided to apply the weighted moving average technique
to improve the reliability of the prediction result. A weighted moving average is a statistical
technique for finding the weighted average of a data set; unlike a plain average, more recent data
points receive larger weights, so the most recent data has the biggest impact on the average
value. Our collision prediction algorithm computes the weighted moving average of the vehicle's
lateral velocity before performing the timing estimation. We set the averaging buffer size to 15,
meaning we always compute the weighted average over the 15 most recent data points. Because
the averaged lateral velocity partially depends on the previous lateral velocities, it changes more
slowly than the true lateral velocity, so there is a small delay before the averaged value reaches
the vehicle's current lateral velocity. During this delay, if our vehicle starts to move away from
the detected object, the estimated time-to-collision increases before it can drop below the
threshold, because the averaged lateral velocity starts to decrease before reaching its maximum.
In that case, the alarm is not triggered when our vehicle moves away from the detected object
after approaching it. The weighted moving average technique thus partially eliminates the false
alarm caused by our vehicle moving back and forth towards the detected object. However, it
cannot eliminate this false positive completely, because the averaging only helps when the
back-and-forth travel distance is short. As future work, we plan to investigate other numerical
analysis techniques that could yield a more effective algorithm for eliminating the false alarm
caused by our vehicle moving back and forth.
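A sketch of a weighted moving average over a 15-sample buffer is shown below. The report does not specify the exact weighting scheme, so linearly increasing weights (newest sample weighted highest) are assumed here for illustration:

```c
#define WMA_BUF_SIZE 15

typedef struct {
    double buf[WMA_BUF_SIZE];
    int count;  /* number of samples stored so far (up to WMA_BUF_SIZE) */
    int head;   /* index of the next slot to overwrite */
} WmaState;

/* Push a new lateral-velocity sample into the ring buffer and return
 * the weighted moving average of the stored samples. The newest sample
 * gets the largest weight (count), the oldest the smallest (1) --
 * a linear weighting assumption. */
double wma_push(WmaState *s, double sample) {
    s->buf[s->head] = sample;
    s->head = (s->head + 1) % WMA_BUF_SIZE;
    if (s->count < WMA_BUF_SIZE) s->count++;

    double num = 0.0, den = 0.0;
    for (int i = 0; i < s->count; i++) {
        /* i = 0 addresses the newest sample, i = count-1 the oldest */
        int idx = (s->head - 1 - i + WMA_BUF_SIZE) % WMA_BUF_SIZE;
        double w = (double)(s->count - i);  /* newest -> largest weight */
        num += w * s->buf[idx];
        den += w;
    }
    return num / den;
}
```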
8. Project Testing
This section describes the testing and integration work we did for this project.
To ensure our design and component selection met the project requirements, we conducted
the following tests: an ultrasonic sensor test, a Bluetooth Low Energy transceiver module test, a
collision prediction algorithm test and an initial system test. As mentioned in section 4.4, we
performed the ultrasonic sensor test, the Bluetooth Low Energy transceiver module test and the
prediction algorithm test on our Arduino-based prototype. We tested the OLP425-based system
on a breadboard for the initial system test.
8.1. Ultrasonic Sensor Test
In order to test the SRF05 ultrasonic sensor, we configured the Arduino development board
so we could read the sensor's output on our computer. Figure 8-1 shows the Arduino-based
system we used for the ultrasonic sensor test.
Figure 8-1: Arduino development board connected to the SRF05 ultrasonic sensor.
During the ultrasonic sensor test, we moved a piece of cardboard towards the SRF05 ultrasonic
sensor while recording the measured distance between the sensor and the cardboard. In addition,
we used a ruler to track the true distance between the sensor and the cardboard. Comparing the
measurement results with the true distances, we found that the SRF05 ultrasonic sensor provided
accurate distance measurements. We therefore concluded that the accuracy of the SRF05
ultrasonic sensor met our project requirement.
8.2. Bluetooth Low Energy Transceiver Module Test
The smartphone Bluetooth module test was another test we performed while designing
our project. Its purpose was a functional test of our Bluetooth Low Energy transceiver module,
described in section 5.1. To perform this test, we connected the Arduino board to the Arduino
Bluetooth Low Energy shield and connected the shield to the ultrasonic sensor. Figure 8-2 shows
the system we used for this test.
Figure 8-2: Arduino with the Bluetooth Low Energy shield and ultrasonic sensor.
We programmed the Arduino system so it could interact with the smartphone application
over the Bluetooth Low Energy connection. The implementation of the Arduino software is not
discussed here, as it is beyond the scope of this report. We then started the SRF05 ultrasonic
sensor and retrieved its readings in the smartphone application over the Bluetooth Low Energy
connection. We successfully sent and received data from the Arduino platform, and therefore
concluded that our Bluetooth Low Energy transceiver module was functioning properly and
could be used for our project.
8.3. Collision Prediction Algorithm Test
As mentioned in section 7.2, our system requires the collision prediction algorithm to
avoid false alarms caused by our vehicle moving back and forth towards the detected object,
which is why we applied the weighted moving average technique in the algorithm. This section
describes the test we performed to determine whether the weighted moving average technique
could eliminate such false alarms.
To evaluate the effectiveness of the weighted moving average technique, we designed a
test that fed the system false positive input. We moved a piece of cardboard towards the
detection sensor unit with uniform motion, starting 1.5 meters away, which is our maximum
detection range (a detailed discussion of the detection range is in section 5.6). After the
cardboard had moved about 10 cm, we stopped and moved it away from the sensor unit. This
test simulates the situation where our vehicle moves back and forth towards the detected object.
Table 8-1 shows the results of this test.
Table 8-1: Results from collision prediction algorithm false positive test.
Algorithm type                      Total trials    Number of collisions reported
Without weighted moving average     20              20
With weighted moving average        20              3
From Table 8-1, adding the weighted moving average to our collision prediction algorithm
eliminated 85% of the false positives across the 20 trials. Because the moving speed of the
cardboard was not perfectly constant across all 20 trials, we still had a 15% false positive rate.
Adopting the weighted moving average into our algorithm therefore eliminates a high
percentage of the false positives caused by our vehicle moving back and forth towards the
detected object, and we concluded that it increases the reliability of our project.
8.4. System Initial Test
The system test was conducted on the breadboard version of our detection system, shown
in Figure 8-3. After we finished building the system on the breadboard, we performed both true
positive and false positive tests to make sure it functioned as expected.
Figure 8-3: Detection system prototype on breadboard.
The system was able to display the visual warning indication when it detected an
object within the detection range (a detailed discussion of the detection range is in section 5.6).
In conclusion, the overall performance of the system met the goal of our project. We plan to
conduct a field test in the future to further verify the system's performance.
9. Problems
While building the project, we encountered a number of unexpected problems. This
section describes some of the problems we ran into during iOS programming and embedded
system programming.
9.1. iOS Programming
Since the project uses an iPhone for the monitoring and warning system,
iOS programming is essential and crucial to the project's performance. Nevertheless, we had no
prior experience with iOS programming and had to learn it essentially from scratch. We had to
understand the iOS programming language, the structure of an application, the use of Xcode for
building the application, and the methods for designing a graphical view and connecting it to the
application code. For the requirements of our project, we also needed to learn how to use
Bluetooth Low Energy, audio, GPS, and background execution. During this learning process
there was no tutorial book available and no experienced person to consult; the only reference we
had was the Apple iOS documentation, so every problem we encountered had to be solved by
ourselves. Clearly, iOS programming was a big challenge in building this project.
To produce a good graphical user interface, we wanted a custom button for our
application. At first, we built a class that modified the iOS default button and created all buttons
from this class. However, when multiple such buttons existed in the same view, some buttons'
titles would disappear. We reread the Apple iOS documentation in case we had missed
something that would cause this problem, and confirmed that the way we modified the button
was correct. Searching the Internet turned up no useful information on this specific problem. We
then tried different ways of modifying the button and found that if a button's title was set both in
the program and in Xcode's interface builder, the title would sometimes disappear. After we
cleared the buttons' titles in the interface builder, all buttons were presented correctly, as we
expected.
Because of the power consumption of the GPS service (the purpose of the GPS service is
described in section 5.4), we decided to provide an interface allowing the user to enable and
disable it. We created a configuration view to control the GPS service, and the vehicle's speed
provided by the GPS service is displayed in the blind-spot monitoring view (detailed in section
5). To achieve this, we created a class for the GPS service so that both the configuration view
controller and the blind-spot monitoring view controller could use it to execute the required
tasks. However, at first the GPS service could not be enabled and disabled by the user: the
blind-spot monitoring view (Figure 5-10) continued to display the vehicle's speed even after the
GPS was switched off from the configuration view (Figure 5-6), and vice versa. First, we
checked the program against the Apple iOS documentation and confirmed we were using the
correct functions to start and stop the GPS service. We then searched the Internet and found that
many other people had encountered the same problem, but no valid solution was provided.
Subsequently, we added debugging messages to the program to locate the source of the problem.
From the debugging output, we noticed that the object of the GPS service class was lost when
switching views in the application: the object was reinitialized every time the configuration view
was loaded, so it was no longer the same object as the original one. We therefore modified the
initialization of the GPS service class to ensure that it returned a shared instance. In this way, no
matter where the class was created or initialized, the returned object is always the original one,
and our problem was solved.
9.2. Embedded System Programming
As mentioned in section 6.2.2, we needed to generate a 10 microsecond pulse from
the OLP425 module. To generate the pulse, we toggle the output from logic low to logic high
and back to logic low. However, when we started testing our OLP425 module, the pulse did not
appear on the oscilloscope. We double-checked our pin assignment and pin register settings
against the CC2540 technical document and found no error. Confused, we started debugging the
system by simply setting the output pin to 1 and examining the result; the output pin was not
able to deliver logic high to the oscilloscope. At that point we realized there must be a problem
with the pin value assignment. We set a breakpoint right after the assignment of value 1 to the
output pin, ran the program in debug mode, and found that the value of the output pin did not
change to 1 after the assignment. From this observation we suspected a problem with the pin's
direction register, so we reviewed the program's I/O initialization process, but found no error
there either. We then checked the main function, where all the initialization functions are called,
and found the bug: the I/O initialization was called before the device driver initialization
function, although the device driver must be initialized before the I/O pins. To fix the problem,
we swapped the order of the I/O initialization and the device driver initialization. After this
change, we retested our program and it worked as expected.
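The pulse-generation logic and its dependency on the initialization order can be sketched in C. The register names `P0`/`P0DIR` follow 8051-style conventions but are modeled here as plain variables, and the "driver must come first" constraint is modeled as an error return; both are illustrative assumptions, not the actual firmware:

```c
#include <stdint.h>

/* Simulated I/O registers -- on the real hardware these would be
 * memory-mapped SFRs; plain variables are used so the logic can run
 * anywhere. Names are hypothetical. */
static uint8_t P0 = 0;     /* port data register */
static uint8_t P0DIR = 0;  /* port direction register: bit set = output */

static int driver_ready = 0;

void driver_init(void) { driver_ready = 1; }

/* I/O setup only takes effect once the device driver is initialized --
 * calling this before driver_init() models the bug we hit. */
int io_init(void) {
    if (!driver_ready) return -1;  /* setup silently lost on real HW */
    P0DIR |= 0x01;                 /* make P0.0 an output */
    return 0;
}

/* Emit the ~10 us trigger pulse: low -> high -> low on P0.0. */
int send_trigger_pulse(void) {
    if (!(P0DIR & 0x01)) return -1;  /* pin never configured as output */
    P0 |= 0x01;
    /* delay_us(10); -- busy-wait omitted in this simulation */
    P0 &= ~0x01;
    return 0;
}
```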
10. Conclusion
This report presents a wireless vehicular blind-spot monitoring method and system for
conveying important information to a driver to ensure safe vehicle operation. The purpose of this
system is to detect and monitor objects encroaching upon a vehicle’s blind-spot areas and warn
the driver if a potential collision is detected. Therefore, the system facilitates safe and controlled
vehicle navigation in the presence of static and moving objects. The presented blind-spot
monitoring system is designed to be used with vehicles that do not have a factory-equipped blind-
spot detection system.
The motivation for developing this project was to provide a flexible, low-cost blind-spot
detection system with low power consumption for a variety of vehicles. To achieve this goal, the
project uses a smartphone as the display for the monitoring and warning system and Bluetooth
Low Energy for wireless communication. The monitoring and warning system is built as a
smartphone application and provides a graphical user interface for the user to interact with the
system. The detection system comprises an ultrasonic distance sensor and a Bluetooth Low
Energy module for detecting objects and transmitting data, respectively.
At present, due to budget limitations, the project has only one detection system, so it can
only monitor the blind-spot area on one side of the vehicle. However, one detection system is
sufficient to prove the feasibility of our designed method and system. In the initial testing, our
designed method and system achieved the desired results. The system correctly detected objects
within the detection range and evoked the visual and audio alerts properly in the smartphone
application. Moreover, the system was capable of handling the interference caused by slight
changes in the distance between the vehicle and the detected object, so the audio alert only
occurred when the vehicle and the detected object appeared to be approaching each other.
Overall, the design goal was achieved, and the testing results verified the feasibility and
flexibility of this wireless vehicular blind-spot monitoring method and system.
10.1. Future Work
The initial system testing produced positive results, but the testing was conducted only
in the lab. Future work should therefore include a field test to further verify the system's
performance when used on the road. In addition, a second detection system can be added in the
future to monitor the blind-spot areas on both sides of the vehicle. To identify the position of
each detection system on the vehicle (left or right), one suggested method is to add a mechanical
switch to the detection system that sets a position ID. With each detection system's position ID
set to the corresponding value, the monitoring and warning system will be able to display the
visual alert at the correct position in the application.
11. Budget Summary
Table 11-1 summarizes the budget for this project, including both software and
hardware. Components that were borrowed or provided free of charge are also listed. The
budget summary does not include tax or shipping costs.
Table 11-1: Budget summary for both software and hardware.
Component                                          Cost (CAD)   Comment
SRF05 ultrasonic sensor                            27.00        Purchased online
SRF05 ultrasonic sensor                            Free         Borrowed from supervisor
OLP425 Bluetooth Low Energy module                 57.97        Purchased online
X-band motion detector                             38.83        Purchased online
Texas Instruments CC Debugger                      Free         Borrowed from supervisor
Arduino development board                          Free         Provided by Chen Liu
Arduino Bluetooth Low Energy shield                29.89        Purchased online
Voltage regulators, resistors and capacitors       Free         Provided by ECE tech shop
IAR Embedded Workbench for 8051 software license   Free         30-day evaluation license
iPhone 4S                                          Free         Provided by Xiaodong Xu
Battery holder                                     Free         Borrowed from supervisor
Breadboard                                         Free         Provided by ECE tech shop
Apple developer license                            Free         Provided by supervisor
Total cost: 153.69 CAD
References
[1] K.J. Anstey and J. Wood. “Chronological Age and Age-Related Cognitive Deficits Are
Associated With an Increase in Multiple Types of Driving Errors in Late Life”,
Neuropsychology, 2011, Vol. 25, No. 5, 613–621. [American Psychological Association],
[Accessed Feb. 25 2013].
[2] J.M. Wood. “The On-Road Difficulties of Older Drivers and their Relationship with Self-
Reported Motor Vehicle Crashes”, The Journal of the American Geriatrics Society,
57(11), pp. 2062-2069. [Accessed Jan. 10 2013].
[3] CNET. “2011 Mercedes-Benz CL550 4Matic”. [Online]. Available:
http://reviews.cnet.com/coupe-hatchback/2011-mercedes-benz-cl550/4505-10867_7-
34444896-2.html. [Accessed Jan 08, 2013].
[4] Gizmag. “Volvo Launches Blind Spot Information System (BLIS)”. [Online]. Available:
http://www.gizmag.com/go/2937/. [Accessed Jan 08, 2013].
[5] David M. (Sep 2011). “Blind spot crashes increase” [The Telegraph]. Available at:
http://www.telegraph.co.uk/motoring/news/8779153/Blind-spot-crashes-increase.html.
[Accessed Jan 10, 2013].
[6] Bluetooth. “Bluetooth 4.0 with low energy technology paves the way for Bluetooth Smart
devices”. [Online]. Available: http://www.bluetooth.com/Pages/low-energy.aspx.
[Accessed Sep 20, 2012].
[7] Medical Electronics Design. “Bluetooth Low Energy vs. Classic Bluetooth: Choose the
Best Wireless Technology For Your Application”. [Online]. Available:
http://www.medicalelectronicsdesign.com/article/bluetooth-low-energy-vs-classic-
bluetooth-choose-best-wireless-technology-your-application. [Accessed Sep 20, 2012].
[8] Eric. Infrared vs. Ultrasonic - What You Should Know [Online].
Available: http://www.societyofrobots.com/member_tutorials/book/export/html/71.
[Accessed Jan 10, 2013].
! 64!
[9] VSTAR. (2011, September 22). Motion Detectors for movement detection [Online].
Available: http://www.hkvstar.com/news/motion-detectors-for-movement-detection.html
[Accessed Jan 11, 2013].
[10] MigatronCorp. Understanding Ultrasonics [Online].
Available: http://migatron.com/understanding_ultrasonics.htm. [Accessed Jan 12, 2013].
[11] Ingrid B. Potts, Douglas W. Harwood, and Karen R. Richard. “Relationship of Lane
Width to Safety for Urban and Suburban Arterials”. 2007. [Accessed Mar 1, 2013].
[12] California Department of Transportation. “Vehicle Widths”. Aug 19, 2010. [Online].
Available: http://www.dot.ca.gov/hq/traffops/trucks/trucksize/width.htm. [Accessed Mar
1, 2013].
[13] Robot Electronics, “SRF05 – Ultrasonic Ranger,” Technical Document. [Online].
Available: http://www.robot-electronics.co.uk/htm/srf05tech.htm. [Accessed Oct. 2012].
[14] Texas Instruments, “Measuring Bluetooth Low Energy Power Consumption,”
www.ti.com. [Online]. Available: http://www.ti.com/lit/an/swra347a/swra347a.pdf.
[Accessed Feb. 2013].
[15] G. Langer, “Poll: Traffic in the United States,” www.abcnews.go.com, Feb. 13, 2005.
[Online]. Available: http://abcnews.go.com/Technology/Traffic/story?id=485098&page=1.
[Accessed Jan. 2013].
[16] S. Hetrick, “Examination of driver lane change behavior and the potential effectiveness
of warning onset rules for lane change or ‘side’ crash avoidance systems,” M.S. thesis,
Virginia Polytechnic Institute & State University.
Appendix A – System Server Program Code
Header file #import <UIKit/UIKit.h> #import <Foundation/Foundation.h> #import "SensorDiscovery.h" #import "SensorService.h" #import "MovingAverageInfo.h" #define hc_to_i(c) ((c >= '0' && c <= '9')? c-'0': ((c >= 'A' && c <= 'F')? c-'A'+0x0A: ((c >= 'a' && c <= 'f')? c-'a'+0x0A:0))) extern Boolean gpsEnFlag; struct dataStruct { double prevTimestampLeft; double currTimestampLeft; double prevDistanceLeft; double currDistanceLeft; }; // Delegate to provide warnings on BSViewController @protocol ServerDelegate <NSObject> -(void) deviceFound:(CBPeripheral *)newDevice; -(void) deviceConnected:(NSString *)deviceName; -(void) deviceDisconnected:(NSString *)deviceName; -(void) deviceConnectFail:(NSString *)deviceName; -(void) leftObjectPresented:(float)distance; -(void) leftAreaClear; -(void) rightObjectPresented:(float)distance; -(void) rightAreaClear; -(void) leftCollisionPredicted; -(void) leftCollisionPredictionClear; -(void) rightCollisionPredicted; -(void) rightCollisionPredictionClear; @end @interface BSServer : NSObject <SensorDiscoveryDelegate, DistanceSensorProtocol> { NSMutableArray *sensorServiceArray; struct dataStruct dataInfo; NSTimeInterval timestamp; WeightedMovingAverageInfo *mvInfo; float moveAvg; uint32_t rxDataCount; } @property (nonatomic,assign) id <ServerDelegate> delegate;
! 66!
@property (strong, nonatomic) SensorDiscovery *sensorDisc; @property (strong, nonatomic) SensorService *sensorServiceLeft; @property (strong, nonatomic) SensorService *sensorServiceRight; @property (strong, nonatomic) WeightedMovingAverageInfo *mvInfo; + (id) sharedInstance; - (void)connectDevice: (CBPeripheral *)newPeripheral; - (void)disconnectDevice; @end Source file #import "BSServer.h" #import "BLEDefines.h" #define WARNING_DISTANCE_THRESHOLD 100 #define COLLISION_ALERT_IN_SEC 2.0 #define MAX_DATA_SIZE 10 #define SOUND_SPEED_IN_AIR_PER_MS 0.34 Boolean gpsEnFlag = false; @implementation BSServer @synthesize delegate; @synthesize sensorDisc; @synthesize sensorServiceLeft; @synthesize sensorServiceRight; @synthesize mvInfo; static BSServer *instance = nil; + (BSServer *)sharedInstance { if (instance == nil) { instance = [[BSServer alloc] init]; } return instance; } - (id)init { self=[super init]; if(self) { rxDataCount = 0; // Initialization timestamp = [[NSDate date] timeIntervalSince1970];
! 67!
// Initializes all element of dataInfo to 0. memset(&dataInfo, 0, sizeof(dataInfo)); sensorServiceArray = [NSMutableArray new]; //sensorDisc = [[SensorDiscovery alloc] init]; [[SensorDiscovery sharedInstance] setDiscoveryDelegate:self]; [[SensorDiscovery sharedInstance] setPeripheralDelegate:self]; [[SensorDiscovery sharedInstance] startScanning]; mvInfo = [[WeightedMovingAverageInfo alloc] init]; moveAvg = 0.0; } return self; } #pragma mark - BLE delegate -(void) foundNewPeripheral:(CBPeripheral *)newPeripheral; { if(newPeripheral != nil) { NSLog(@"[server]->Peripheral Found"); [[self delegate] deviceFound:newPeripheral]; } } -(void) disableRead { NSLog(@"[Server]-> disableRead"); [[SensorDiscovery sharedInstance] disconnectPeripheral:[sensorServiceLeft peripheral]]; } - (void) sensorServiceDidChangeStatus:(SensorService *)service connectToService:(BOOL )connected { NSLog(@"[Server]-> sensorServiceDidChangeStatus"); if(service != nil) { if([[SensorDiscovery sharedInstance] connectedServices] !=nil) { sensorServiceArray = [[SensorDiscovery sharedInstance] connectedServices]; if(connected) { sensorServiceLeft = [sensorServiceArray objectAtIndex:0]; [[self delegate] deviceConnected:sensorServiceLeft.peripheral.name]; } else {
! 68!
NSLog(@"Start to disconnect"); [[self delegate] deviceDisconnected:sensorServiceLeft.peripheral.name]; } } else{ NSLog(@"[Server] Error: connectedServices is null."); } } } #if ARDUINO_BLE_SHIELD - (void) sensorService:(SensorService *)service updateSensorReadingValue:(NSData *)data; { u_char rxData[MAX_DATA_SIZE]; uint32_t rawData = 0; if(data != nil && service != nil) { rxDataCount++; [data getBytes:rxData length:data.length]; for(int i = 0; i < data.length; i++) { rawData += hc_to_i(rxData[i]) << (data.length-1-i) * 4; } NSLog(@"~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~"); NSLog(@"RX Data Count [%d]", rxDataCount); NSLog(@"[Server]-> Rx Data Int: %u", rawData); [self processData:(((double)rawData)/100)]; } } #else - (void) sensorService:(SensorService *)service updateSensorReadingValue:(NSData *)data; { u_char rxData[MAX_DATA_SIZE]; int rawData = 0; double distanceInCm; double timeInMs; if(data != nil && service != nil) { rxDataCount++; [data getBytes:rxData length:data.length]; rawData = rxData[0] + rxData[1]*256; NSLog(@"~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~");
        NSLog(@"RX Data Count [%d]", rxDataCount);
        NSLog(@"High to Low: [%d - %d]", rxData[1], rxData[0]);
        NSLog(@"[Server]-> Rx Data Int: %d", rawData);
        // Convert the round-trip echo time (us) into a one-way distance in cm.
        timeInMs = ((double)rawData/1000);
        distanceInCm = ((timeInMs * SOUND_SPEED_IN_AIR_PER_MS) / 2) * 100;
        NSLog(@"Distance: %f cm", distanceInCm);
        [self processData:distanceInCm];
    }
}
#endif

- (void)connectDevice:(CBPeripheral *)newPeripheral
{
    if(newPeripheral != nil) {
        NSLog(@"[Server]-> try to connect device");
        [[SensorDiscovery sharedInstance] connectPeripheral:newPeripheral];
    }
}

- (void)disconnectDevice
{
    if(sensorServiceLeft != nil) {
        [sensorServiceLeft stopReadingData];
    }
}

- (void)processData:(double)distanceInCm
{
    double timeDiffInSec;
    double disDiffInCm;
    double speed;
    double collisionTimeInSec;

    if(distanceInCm <= WARNING_DISTANCE_THRESHOLD) {
        [[self delegate] leftObjectPresented:distanceInCm];
        [[self delegate] rightAreaClear];
        dataInfo.prevDistanceLeft = dataInfo.currDistanceLeft;
        dataInfo.currDistanceLeft = distanceInCm;
        // Elapsed time between the two samples: both echo flight times plus a fixed 0.025 s interval.
        timeDiffInSec = ((dataInfo.prevDistanceLeft*0.01)/340) + 0.025 + ((dataInfo.currDistanceLeft*0.01)/340);
        disDiffInCm = dataInfo.prevDistanceLeft - dataInfo.currDistanceLeft;
        if(dataInfo.prevDistanceLeft != 0) {
            speed = disDiffInCm / timeDiffInSec;
            moveAvg = [mvInfo getWeightedMovingAvg:speed];
            NSLog(@"Dis %f || Dis Diff = %f || Time Diff = %f ", distanceInCm, disDiffInCm, timeDiffInSec);
            NSLog(@"-[WMA: ### %f ### Sample: %f ###]-", moveAvg, speed);
            if([mvInfo bufferFull]) {
                // Time to collision = remaining gap / smoothed closing speed.
                collisionTimeInSec = dataInfo.currDistanceLeft / moveAvg;
                NSLog(@"[Server]-> COLLISION TIME: [%f]", collisionTimeInSec);
                if(((collisionTimeInSec <= COLLISION_ALERT_IN_SEC) && (collisionTimeInSec >= 0))
                   && (dataInfo.currDistanceLeft <= WARNING_DISTANCE_THRESHOLD)) {
                    NSLog(@"----****** [Alert] ******-----");
                    [[self delegate] leftCollisionPredicted];
                }
                else {
                    NSLog(@"<<<-------- Clear --------->>>");
                    [[self delegate] leftCollisionPredictionClear];
                }
            }
        }
    }
    else if(distanceInCm <= 200) {
        [mvInfo clearBuffer];
        dataInfo.prevDistanceLeft = 0;
        dataInfo.currDistanceLeft = 0;
        //[[self delegate] rightObjectPresented:distanceInCm];
        [[self delegate] leftCollisionPredictionClear];
        [[self delegate] leftAreaClear];
    }
    else {
        [mvInfo clearBuffer];
        dataInfo.prevDistanceLeft = 0;
        dataInfo.currDistanceLeft = 0;
        [[self delegate] leftCollisionPredictionClear];
        [[self delegate] rightAreaClear];
        [[self delegate] leftAreaClear];
    }
}

@end
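The `#else` branch of `updateSensorReadingValue:` above converts the sensor's raw round-trip echo time into a one-way distance before handing it to `processData:`. A minimal C sketch of that conversion follows; the 0.343 m/ms value for `SOUND_SPEED_IN_AIR_PER_MS` (about 343 m/s at room temperature) and the function name `echoTimeToCm` are our assumptions for illustration, since the constant is defined elsewhere in the project.

```c
/* Speed of sound in air in metres per millisecond; 0.343 m/ms (343 m/s)
 * is an assumed value mirroring the listing's SOUND_SPEED_IN_AIR_PER_MS. */
#define SOUND_SPEED_IN_AIR_PER_MS 0.343

/* Convert a raw round-trip echo time in microseconds into a one-way
 * distance in centimetres. The ultrasonic pulse travels to the object
 * and back, hence the division by two. */
double echoTimeToCm(int rawTimeUs)
{
    double timeInMs = rawTimeUs / 1000.0;                        /* us -> ms */
    double oneWayMetres = (timeInMs * SOUND_SPEED_IN_AIR_PER_MS) / 2.0;
    return oneWayMetres * 100.0;                                 /* m -> cm */
}
```

For example, a 5831 µs round trip works out to roughly 100 cm, which is the kind of value `processData:` then compares against `WARNING_DISTANCE_THRESHOLD`.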
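The alert decision inside `processData:` reduces to a time-to-collision test: divide the current gap by the smoothed closing speed and alert when the quotient lies inside the alert window. A hedged C sketch of that logic follows; the 2.0 s value for `COLLISION_ALERT_IN_SEC` and the function name `collisionPredicted` are illustrative assumptions, not the project's actual constants.

```c
#include <stdbool.h>

/* Alert window mirroring COLLISION_ALERT_IN_SEC in the listing;
 * the 2.0 s value is an assumption for illustration only. */
#define COLLISION_ALERT_IN_SEC 2.0

/* Predict a collision when the time-to-collision (current gap divided
 * by the smoothed closing speed) falls inside the alert window. A
 * non-positive closing speed means the object is holding steady or
 * receding, so no alert is raised. */
bool collisionPredicted(double distanceCm, double closingSpeedCmPerSec)
{
    if (closingSpeedCmPerSec <= 0.0)
        return false;
    double ttcSec = distanceCm / closingSpeedCmPerSec;
    return ttcSec <= COLLISION_ALERT_IN_SEC;
}
```

This matches the listing's guard that `collisionTimeInSec` be non-negative: a negative weighted-average speed would otherwise produce a meaningless negative time-to-collision.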
Appendix B – Collision Prediction Module Program Code
Header file

#import <Foundation/Foundation.h>

#define QUEUE_SIZE 15
#define WEIGHT_UNIT 1

@interface WeightedMovingAverageInfo : NSObject
{
    int start_index;
    int end_index;
    double sample_queue[QUEUE_SIZE];
}

- (void)clearBuffer;
- (Boolean)bufferFull;
- (double)getWeightedMovingAvg:(double)new_sample;

@end
Source file

#import "MovingAverageInfo.h"

@implementation WeightedMovingAverageInfo

- (id)init
{
    self = [super init];
    if(self) {
        // Initialization
        start_index = 0;
        end_index = 0;
    }
    return self;
}

- (void)clearBuffer
{
    start_index = 0;
    end_index = 0;
}

- (Boolean)bufferFull
{
    Boolean rc = false;
    if(((end_index + 1) % QUEUE_SIZE) == (start_index)) {
        rc = true;
    }
    return rc;
}

- (double)getWeightedMovingAvg:(double)new_sample
{
    double wma;

    sample_queue[end_index] = new_sample;
    end_index = (end_index + 1) % QUEUE_SIZE;
    if(end_index == start_index) {
        start_index = (start_index + 1) % QUEUE_SIZE;
    }
    wma = [self calculateWeightedMovingAvg];
    return wma;
}

- (double)calculateWeightedMovingAvg
{
    double weightedMovingAvg = 0.0;
    double sum = 0.0;
    double totalWeight = 0;
    double weight = WEIGHT_UNIT;
    int index = start_index;

    while (index != end_index) {
        //NSLog(@"%f", sample_queue[index]);
        sum += sample_queue[index] * weight;
        index = (index + 1) % QUEUE_SIZE;
        totalWeight += weight;
        weight += WEIGHT_UNIT;
    }
    weightedMovingAvg = sum / totalWeight;
    return weightedMovingAvg;
}

@end
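The two files above implement a weighted moving average over a fixed-size ring buffer, with linearly increasing weights so that the newest speed sample counts most. The same algorithm can be sketched in plain C (the struct and function names here are ours, not part of the project):

```c
#define QUEUE_SIZE 15
#define WEIGHT_UNIT 1.0

/* Plain-C counterpart of WeightedMovingAverageInfo: a ring buffer in
 * which weights grow linearly from WEIGHT_UNIT at the oldest sample
 * up to the newest one. */
typedef struct {
    int start_index;
    int end_index;
    double sample_queue[QUEUE_SIZE];
} WmaInfo;

/* Push a sample and return the current weighted moving average.
 * When the buffer is full the oldest sample is overwritten, as
 * getWeightedMovingAvg does in the listing. */
double wmaPush(WmaInfo *w, double sample)
{
    w->sample_queue[w->end_index] = sample;
    w->end_index = (w->end_index + 1) % QUEUE_SIZE;
    if (w->end_index == w->start_index)               /* full: drop oldest */
        w->start_index = (w->start_index + 1) % QUEUE_SIZE;

    double sum = 0.0, totalWeight = 0.0, weight = WEIGHT_UNIT;
    for (int i = w->start_index; i != w->end_index; i = (i + 1) % QUEUE_SIZE) {
        sum += w->sample_queue[i] * weight;
        totalWeight += weight;
        weight += WEIGHT_UNIT;
    }
    return sum / totalWeight;
}
```

Pushing the samples 1.0, 2.0, 3.0 yields averages 1, 5/3 and 14/6, showing how later samples increasingly dominate the estimate.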
Appendix C – Blind-spot Monitoring View Controller Program Code
Header file

#import <UIKit/UIKit.h>
#import "BSServer.h"
#import "SBTableAlert.h"
#include "CurrentLocationController.h"
#include <AudioToolbox/AudioToolbox.h>
#include <AVFoundation/AVFoundation.h>

@interface BSViewController : UIViewController <ServerDelegate, SBTableAlertDelegate, SBTableAlertDataSource, CurrentLocationDelegate>
{
    UIImage* warningImg;
    UIImage* alertImg;
    UIImage* connectedImg;
    UIImage* disconnectedImg;
}

@property (retain, nonatomic) NSMutableArray *devicesFound;
@property (strong, nonatomic) BSServer *server;
@property (strong, nonatomic) SBTableAlert *devicesFoundWindow;
@property (strong, nonatomic) UIAlertView *connectingWindow;
@property (strong, nonatomic) UIAlertView *connectedPrompt;
@property (strong, nonatomic) IBOutlet UIImageView *warnLeft;
@property (strong, nonatomic) IBOutlet UIImageView *warnRight;
@property (strong, nonatomic) IBOutlet UIImageView *alertLeft;
@property (strong, nonatomic) IBOutlet UIImageView *alertRight;
@property (strong, nonatomic) IBOutlet UIImageView *connectionIndicator;
@property AVAudioPlayer *audioPlayer;
@property (nonatomic, retain) CurrentLocationController *CLController;
@property (strong, nonatomic) IBOutlet UIButton *disconnectButton;
@property (strong, nonatomic) IBOutlet UIButton *connectButton;
@property (strong, nonatomic) IBOutlet UIButton *homeButton;
@property (strong, nonatomic) IBOutlet UIButton *deviceButton;
@property (strong, nonatomic) IBOutlet UILabel *speedLabel;
@property (strong, nonatomic) IBOutlet UILabel *speedValueLabel;

- (IBAction)connectButtonPressed:(UIButton *)sender;
- (IBAction)disconnectButtonPressed:(UIButton *)sender;
- (IBAction)deviceButtonPressed:(UIButton *)sender;

@end
Source file

#import "BSViewController.h"

#define CONNECTING_TIMEOUT 5

@interface BSViewController ()
@end

@implementation BSViewController

@synthesize devicesFound;
@synthesize server;
@synthesize devicesFoundWindow;
@synthesize connectingWindow;
@synthesize connectedPrompt;
@synthesize warnLeft;
@synthesize warnRight;
@synthesize alertLeft;
@synthesize alertRight;
@synthesize connectionIndicator;
@synthesize audioPlayer;
@synthesize CLController;
@synthesize speedLabel;
@synthesize speedValueLabel;
@synthesize disconnectButton;
@synthesize connectButton;
@synthesize homeButton;
@synthesize deviceButton;

static Boolean deviceFoundFlag = false;
static Boolean deviceConnected = false;

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    NSLog(@"->Start");

    //****************************************************************
    // Set button appearance
    //****************************************************************
    // Define resizable images
    UIImage *resizableButton = [[UIImage imageNamed:@"blueButton.png"]
        resizableImageWithCapInsets:UIEdgeInsetsMake(18, 18, 18, 18)];
    UIImage *resizableButtonHighlighted = [[UIImage imageNamed:@"blueButtonHighlight.png"]
        resizableImageWithCapInsets:UIEdgeInsetsMake(18, 18, 18, 18)];
    UIColor *textcolor = [UIColor whiteColor];
    UIColor *dropShadowTextColor = [[UIColor blackColor] colorWithAlphaComponent:0.36];

    // Set disconnect button
    [self.disconnectButton setBackgroundImage:resizableButton forState:UIControlStateNormal];
    [self.disconnectButton setBackgroundImage:resizableButtonHighlighted forState:UIControlStateHighlighted];
    [self.disconnectButton setTitleColor:textcolor forState:UIControlStateNormal];
    [self.disconnectButton setTitleColor:[textcolor colorWithAlphaComponent:0.5] forState:UIControlStateHighlighted];
    [self.disconnectButton setTitleShadowColor:dropShadowTextColor forState:UIControlStateNormal];
    [self.disconnectButton setTitleShadowColor:[dropShadowTextColor colorWithAlphaComponent:0.5] forState:UIControlStateHighlighted];
    [self.disconnectButton setTitle:@"Disconnect" forState:UIControlStateNormal];

    // Set connect button
    [self.connectButton setBackgroundImage:resizableButton forState:UIControlStateNormal];
    [self.connectButton setBackgroundImage:resizableButtonHighlighted forState:UIControlStateHighlighted];
    [self.connectButton setTitle:@"Connect" forState:UIControlStateNormal];
    [self.connectButton setTitleColor:textcolor forState:UIControlStateNormal];
    [self.connectButton setTitleColor:[textcolor colorWithAlphaComponent:0.5] forState:UIControlStateHighlighted];
    [self.connectButton setTitleShadowColor:dropShadowTextColor forState:UIControlStateNormal];
    [self.connectButton setTitleShadowColor:[dropShadowTextColor colorWithAlphaComponent:0.5] forState:UIControlStateHighlighted];

    // Set home button
    [self.homeButton setBackgroundImage:resizableButton forState:UIControlStateNormal];
    [self.homeButton setBackgroundImage:resizableButtonHighlighted forState:UIControlStateHighlighted];
    [self.homeButton setTitle:@"Home" forState:UIControlStateNormal];
    [self.homeButton setTitleColor:textcolor forState:UIControlStateNormal];
    [self.homeButton setTitleColor:[textcolor colorWithAlphaComponent:0.5] forState:UIControlStateHighlighted];
    [self.homeButton setTitleShadowColor:dropShadowTextColor
        forState:UIControlStateNormal];
    [self.homeButton setTitleShadowColor:[dropShadowTextColor colorWithAlphaComponent:0.5] forState:UIControlStateHighlighted];

    // Set device button
    [self.deviceButton setBackgroundImage:resizableButton forState:UIControlStateNormal];
    [self.deviceButton setBackgroundImage:resizableButtonHighlighted forState:UIControlStateHighlighted];
    [self.deviceButton setTitle:@"Device" forState:UIControlStateNormal];
    [self.deviceButton setTitleColor:textcolor forState:UIControlStateNormal];
    [self.deviceButton setTitleColor:[textcolor colorWithAlphaComponent:0.5] forState:UIControlStateHighlighted];
    [self.deviceButton setTitleShadowColor:dropShadowTextColor forState:UIControlStateNormal];
    [self.deviceButton setTitleShadowColor:[dropShadowTextColor colorWithAlphaComponent:0.5] forState:UIControlStateHighlighted];

    //****************************************************************
    // All instance variables initialization
    //****************************************************************
    // Initialize blind spot server.
    server = [BSServer sharedInstance];
    server.delegate = self;

    // Initialize location controller.
    CLController = [CurrentLocationController sharedInstance];
    CLController.delegate = self;

    // Initialize an array for devices found.
    devicesFound = [[NSMutableArray alloc] init];

    // Initialize available devices window.
    devicesFoundWindow = [[SBTableAlert alloc] initWithTitle:@"Device Found" cancelButtonTitle:@"OK" messageFormat:nil];
    [devicesFoundWindow.view setTag:2];
    [devicesFoundWindow setStyle:SBTableAlertStyleApple];
    [devicesFoundWindow setDelegate:self];
    [devicesFoundWindow setDataSource:self];

    // Initialize warning image and alert image.
    warningImg = [UIImage imageNamed:@"warningImg.png"];
    warnLeft.image = warningImg;
    warnLeft.alpha = 0.0;
    warnRight.image = warningImg;
    warnRight.alpha = 0.0;
    alertImg = [UIImage imageNamed:@"alertImg.png"];
    alertLeft.image = alertImg;
    alertLeft.alpha = 0.0;
    alertRight.image = alertImg;
    alertRight.alpha = 0.0;
    connectedImg = [UIImage imageNamed:@"connectedIcon.png"];
    disconnectedImg = [UIImage imageNamed:@"disconnectedIcon.png"];
    connectionIndicator.image = disconnectedImg;

    // Initialize audio player.
    NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:
        [[NSBundle mainBundle] pathForResource:@"alarm1" ofType:@"mp3"]];
    audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:nil];
    // Any negative value loops the sound indefinitely until the stop method is called.
    audioPlayer.numberOfLoops = -1;
    // Allow the audio to keep playing in the background.
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
    [[AVAudioSession sharedInstance] setActive:YES error:nil];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];

    //****************************************************************
    // Initial Actions
    //****************************************************************
    if (!deviceFoundFlag) {
        connectingWindow = [[UIAlertView alloc] initWithTitle:@"Connecting Device\nPlease Wait..."
                                                      message:nil
                                                     delegate:self
                                            cancelButtonTitle:nil
                                            otherButtonTitles:nil];
        [connectingWindow show];

        UIActivityIndicatorView *indicator = [[UIActivityIndicatorView alloc]
            initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleWhiteLarge];
        // Adjust the indicator so it is up a few pixels from the bottom of the alert.
        indicator.center = CGPointMake(connectingWindow.bounds.size.width / 2,
                                       connectingWindow.bounds.size.height - 50);
        [indicator startAnimating];
        [connectingWindow addSubview:indicator];
        [NSTimer scheduledTimerWithTimeInterval:CONNECTING_TIMEOUT
                                         target:self
                                       selector:@selector(closeConnectingWindow)
                                       userInfo:nil
                                        repeats:NO];
    }
}

- (void)viewWillAppear:(BOOL)animated
{
    [[UIApplication sharedApplication] setIdleTimerDisabled:YES];
    if(!gpsEnFlag) {
        speedValueLabel.text = [NSString stringWithFormat:@"Disabled"];
    }
    else {
        speedValueLabel.text = [NSString stringWithFormat:@"0.00 km/h"];
    }
}

- (void)viewWillDisappear:(BOOL)animated
{
    [[UIApplication sharedApplication] setIdleTimerDisabled:NO];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

#pragma mark - Location Delegate

- (void)locationUpdate:(CLLocation *)location
{
    if(location.speed >= 0) {
        // Convert m/s to km/h (multiply by 18/5 = 3.6).
        speedValueLabel.text = [NSString stringWithFormat:@"%.2f km/h", [location speed]*18/5];
    }
    else {
        speedValueLabel.text = [NSString stringWithFormat:@"0.00 km/h"];
    }
}

#pragma mark - Server Delegate

- (void)deviceFound:(CBPeripheral *)newDevice
{
    if(newDevice != nil) {
        NSLog(@"[UI]->device found: %@", newDevice.name);
        deviceFoundFlag = TRUE;
        [devicesFound addObject:newDevice];
        if(!deviceConnected) {
            [server connectDevice:[devicesFound objectAtIndex:0]];
        }
    }
}

- (void)deviceConnected:(NSString *)deviceName
{
    deviceConnected = TRUE;
    connectionIndicator.image = connectedImg;
    if(connectingWindow.visible) {
        [connectingWindow dismissWithClickedButtonIndex:0 animated:YES];
    }
    connectedPrompt = [[UIAlertView alloc]
        initWithTitle:[NSString stringWithFormat:@"%@ Connected", deviceName]
              message:@"The Blind Spot Detection System is successfully connected. The Blind Spot Monitoring System is active."
             delegate:nil
    cancelButtonTitle:nil
    otherButtonTitles:nil];
    [connectedPrompt show];
    [NSTimer scheduledTimerWithTimeInterval:3
                                     target:self
                                   selector:@selector(closeConnectedPrompt)
                                   userInfo:nil
                                    repeats:NO];
}

- (void)deviceDisconnected:(NSString *)deviceName
{
    deviceConnected = FALSE;
    connectionIndicator.image = disconnectedImg;
    if(warnLeft.alpha == 1.0) {
        [UIView animateWithDuration:0.5 delay:0.0
                            options:UIViewAnimationOptionCurveEaseInOut
                         animations:^{ warnLeft.alpha = 0.0; }
                         completion:^(BOOL finished) {}];
    }
    if(warnRight.alpha == 1.0) {
        [UIView animateWithDuration:0.5 delay:0.0
                            options:UIViewAnimationOptionCurveEaseInOut
                         animations:^{ warnRight.alpha = 0.0; }
                         completion:^(BOOL finished) {}];
    }
    if(alertLeft.alpha == 1.0) {
        [UIView animateWithDuration:0.5 delay:0.0
                            options:UIViewAnimationOptionCurveEaseInOut
                         animations:^{ alertLeft.alpha = 0.0; }
                         completion:^(BOOL finished) {}];
    }
    if(alertRight.alpha == 1.0) {
        [UIView animateWithDuration:0.5 delay:0.0
                            options:UIViewAnimationOptionCurveEaseInOut
                         animations:^{ alertRight.alpha = 0.0; }
                         completion:^(BOOL finished) {}];
    }
    if(audioPlayer.playing) {
        [audioPlayer stop];
    }
    UIAlertView *alert = [[UIAlertView alloc]
        initWithTitle:[NSString stringWithFormat:@"%@ Disconnected", deviceName]
              message:@"The Blind Spot Detection System is disconnected"
             delegate:nil
    cancelButtonTitle:@"OK"
    otherButtonTitles:nil];
    [alert show];
}

- (void)deviceConnectFail:(NSString *)deviceName
{
    connectionIndicator.image = disconnectedImg;
    if(connectingWindow.visible) {
        [connectedPrompt dismissWithClickedButtonIndex:0 animated:YES];
        UIAlertView *alert = [[UIAlertView alloc]
            initWithTitle:@"Connection Failed"
                  message:@"The connection with the device failed. Please try again."
                 delegate:nil
        cancelButtonTitle:@"OK"
        otherButtonTitles:nil];
        [alert show];
    }
}

- (void)leftObjectPresented:(float)distance
{
    [UIView animateWithDuration:0.5 delay:0.0
                        options:UIViewAnimationOptionCurveEaseInOut
                     animations:^{ warnLeft.alpha = 1.0; }
                     completion:^(BOOL finished) {}];
}

- (void)leftAreaClear
{
    if(warnLeft.alpha == 1.0) {
        [UIView animateWithDuration:0.5 delay:0.0
                            options:UIViewAnimationOptionCurveEaseInOut
                         animations:^{ warnLeft.alpha = 0.0; }
                         completion:^(BOOL finished) {}];
    }
}

- (void)leftCollisionPredicted
{
    if(!audioPlayer.playing) {
        [audioPlayer play];
    }
    if(alertLeft.alpha == 0.0) {
        [UIView animateWithDuration:0.5 delay:0.0
                            options:UIViewAnimationOptionCurveEaseInOut
                         animations:^{ alertLeft.alpha = 1.0; }
                         completion:^(BOOL finished) {}];
    }
}

- (void)leftCollisionPredictionClear
{
    if(audioPlayer.playing) {
        [audioPlayer stop];
    }
    if(alertLeft.alpha == 1.0) {
        [UIView animateWithDuration:0.5 delay:0.0
                            options:UIViewAnimationOptionCurveEaseInOut
                         animations:^{ alertLeft.alpha = 0.0; }
                         completion:^(BOOL finished) {}];
    }
}

- (void)rightObjectPresented:(float)distance
{
    // Do some actions for right object presented.
}

- (void)rightAreaClear
{
    // Do some actions for right area clear.
}

- (void)rightCollisionPredicted
{
    // Do some actions for collision predicted at right area.
}

- (void)rightCollisionPredictionClear
{
    // Do some actions for collision prediction clear at right area.
}

#pragma mark - SBTableAlertDataSource

- (UITableViewCell *)tableAlert:(SBTableAlert *)tableAlert cellForRowAtIndexPath:(NSIndexPath *)indexPath
{
    UITableViewCell *cell;
    if (tableAlert.view.tag == 0 || tableAlert.view.tag == 1) {
        cell = [[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault reuseIdentifier:nil];
    }
    else {
        // Note: SBTableAlertCell
        cell = [[SBTableAlertCell alloc] initWithStyle:UITableViewCellStyleDefault reuseIdentifier:nil];
    }
    if(indexPath.row < devicesFound.count) {
        [cell.textLabel setText:[[devicesFound objectAtIndex:indexPath.row] name]];
    }
    else {
        [cell.textLabel setText:@"Empty"];
    }
    return cell;
}

- (NSInteger)tableAlert:(SBTableAlert *)tableAlert numberOfRowsInSection:(NSInteger)section
{
    if (tableAlert.type == SBTableAlertTypeSingleSelect)
        return 5;
    else
        return 10;
}

- (NSInteger)numberOfSectionsInTableAlert:(SBTableAlert *)tableAlert
{
    if (tableAlert.view.tag == 3)
        return 2;
    else
        return 1;
}

- (NSString *)tableAlert:(SBTableAlert *)tableAlert titleForHeaderInSection:(NSInteger)section
{
    if (tableAlert.view.tag == 3)
        return [NSString stringWithFormat:@"Section Header %d", section];
    else
        return nil;
}

#pragma mark - SBTableAlertDelegate

- (void)tableAlert:(SBTableAlert *)tableAlert didSelectRowAtIndexPath:(NSIndexPath *)indexPath
{
    if (tableAlert.type == SBTableAlertTypeMultipleSelct) {
        UITableViewCell *cell = [tableAlert.tableView cellForRowAtIndexPath:indexPath];
        if (cell.accessoryType == UITableViewCellAccessoryNone)
            [cell setAccessoryType:UITableViewCellAccessoryCheckmark];
        else
            [cell setAccessoryType:UITableViewCellAccessoryNone];
        [tableAlert.tableView deselectRowAtIndexPath:indexPath animated:YES];
    }
}

- (void)tableAlert:(SBTableAlert *)tableAlert didDismissWithButtonIndex:(NSInteger)buttonIndex
{
    NSLog(@"Dismissed: %i", buttonIndex);
}

- (void)closeConnectedPrompt
{
    [connectedPrompt dismissWithClickedButtonIndex:0 animated:YES];
}

- (void)closeConnectingWindow
{
    if(connectingWindow.visible) {
        // Note: connectingWindow turns invisible only after this function exits.
        [connectingWindow dismissWithClickedButtonIndex:0 animated:YES];
        if(!deviceFoundFlag) {
            UIAlertView *alert = [[UIAlertView alloc]
                initWithTitle:@"Connecting Failed"
                      message:@"No device can be found"
                     delegate:nil
            cancelButtonTitle:@"OK"
            otherButtonTitles:nil];
            [alert show];
        }
        else {
            UIAlertView *alert = [[UIAlertView alloc]
                initWithTitle:@"Timeout"
                      message:@"Connecting timed out. Please try again"
                     delegate:nil
            cancelButtonTitle:@"OK"
            otherButtonTitles:nil];
            [alert show];
        }
    }
}

- (IBAction)connectButtonPressed:(UIButton *)sender
{
    if(deviceConnected) {
        UIAlertView *alert = [[UIAlertView alloc]
            initWithTitle:@"Wrong Command"
                  message:@"The device is already connected"
                 delegate:nil
        cancelButtonTitle:@"OK"
        otherButtonTitles:nil];
        [alert show];
    }
    else {
        if(deviceFoundFlag) {
            [server connectDevice:[devicesFound objectAtIndex:0]];
        }
        else {
            UIAlertView *alert = [[UIAlertView alloc]
                initWithTitle:@"Connecting Failed"
                      message:@"No device can be found"
                     delegate:nil
            cancelButtonTitle:@"OK"
            otherButtonTitles:nil];
            [alert show];
        }
    }
}

- (IBAction)disconnectButtonPressed:(UIButton *)sender
{
    if(!deviceConnected) {
        UIAlertView *alert = [[UIAlertView alloc]
            initWithTitle:@"Wrong Command"
                  message:@"The device is already disconnected"
                 delegate:nil
        cancelButtonTitle:@"OK"
        otherButtonTitles:nil];
        [alert show];
    }
    else {
        [server disconnectDevice];
    }
}

- (IBAction)deviceButtonPressed:(UIButton *)sender
{
    // Show the TableAlertView listing the devices found.
    [devicesFoundWindow show];
}

@end
Appendix D – SRF05 Ultrasonic Sensor Control Program Code
/*********************************************************************
 * @fn      distanceMeasNotify
 *
 * @brief   Prepare and send a distance measurement notification
 *
 * @return  none
 */
static void distanceMeasNotify(void)
{
    uint8 *p = distanceMeas.value;

    CLKCONCMD &= ~0x38;     // 1 MHz
    T1CTL &= ~0x03;         // stop timer
    T1CTL &= ~0x04;
    T1CTL |= 0x08;
    T1CNTL &= 0x00;         // reset counter
    T1CCTL0 &= ~0x40;       // disable channel 0 interrupts

    // Get the ultrasonic reading:
    // first turn on the trigger pin (P0_0, J7-2).
    int tempIndex = 65;
    P0_0 = 1;
    while(--tempIndex > 0)
    {
        asm("NOP");         // delay 10 us
    }
    P0_0 = 0;               // turn off the trigger pin

    // Wait for the echo response on P0_1 (J7-3).
    while(P0_1 == 0) {}
    T1CTL |= 0x01;          // start the timer
    while(P0_1 == 1) {}
    T1CTL &= ~0x03;         // stop the timer

    //low_counter = T1CNTL; //read the counter
    *p++ = T1CNTL;          // read the counter, low byte then high byte
    *p++ = T1CNTH;
    //*p++= (high_counter>>8);
    distanceMeas.len = (uint8)(p - distanceMeas.value);

    // Notify the service that we have a new value.
    DistanceSensor_MeasNotify(connHandle, &distanceMeas);
}