M.A.C.S.: The Mobile Autonomous Containment System
A container that autonomously follows a person around.
Matthew Hoffman, Gage Mondok, Aaron Pulver
10 December 2013
Contents

Overview
   Needs Statement
   Objective Statement
   Description
   Diagram
Requirements Specification
   Needs
   Engineering Requirements
Concept Selection
   Existing Systems
   Concepts Considered
   Body Tracking and Gesture Recognition
      Concepts Considered
      Concept Chosen
      Rationale for Choice
   Personal Identification
      Concepts Considered
      Concept Chosen
      Rationale for Choice
   Cart Movement and Control System
      Concepts Considered/Chosen
Design
   Level 0 Diagram
   Body Tracking and Gesture Recognition Subsystem Overview
   Personal Identification Subsystem Overview
      How This Subsystem Works
      Why This Design Mitigates Risk
      Detailed Design
Constraints and Considerations
   Extensibility
   Manufacturability
   Reliability
   Ethical Issues
   Intellectual Property
Cost Estimates
Testing Strategy
   Body Tracking and Gesture Recognition
   Personal Identification Subsystem
   Test Descriptions
      Unit Tests
      Integration Tests
      Acceptance Tests
Risks
Milestone Chart
Appendix
   Detailed Testing Strategy
References
Overview:
Needs Statement:
Since the late 1930s[8], customers have been pushing shopping carts through stores to
hold the items they have bought. However, this solution to transporting goods from one part of
an establishment to another is inadequate for people who are disabled in such a way that
prevents them from pushing a cart, or for people who simply want to, or need to, do something
else with their hands while navigating a store or other environment. There is a need for a
system that acts like a cart but does not require the user to operate it with his or her hands.
Objective Statement:
The objective of this project is to design and create a system that will allow users to
store items in a container which will follow the user, allowing for hands-free travel around
different environments. The device will be able to maintain a safe and efficient distance from
the user, and will be able to differentiate between people to always follow the correct person.
The user will be able to control various aspects of the device, such as distance of the container
and height of the container, using simple commands. The system must also be easy to use,
requiring limited user input, and must be safe to use by maintaining a safe distance from the
user and by having some basic obstacle avoidance to avoid bumping into other objects.
Description:
In order to follow a user, a Kinect will be used which will be able to track the head and/or
body of the user. The Kinect will also be used to detect arm gestures to send commands to the
system and determine distance to the user. Control can also be established through an Android
application. The Kinect and Android application will be connected to a controller module, a
laptop running Linux, which will handle the logic operations of the system and drive the cart.
The Kinect will send the location, movement, direction, and speed of the person being tracked.
Combining these data with readings from the user’s smartphone sensors, including the accelerometer,
gravity sensor, and magnetic sensor, will enable successful navigation. The system will also possess
various short range sensors in order to avoid collision with other objects while moving. The
physical cart will be a two-wheel drive system capable of carrying a 20 pound load and moving
at a max speed of 4 mph.
Diagram:
Figure 1: Cart Design View #1
[Figure labels: camera for tracking, adjustable-height basket, holder for internal electrical components, sensors, drivetrain]
Figure 2: Cart Design View #2
The camera would be mounted higher to allow for a greater field of view. The basket
(middle level) will be adjustable. The bottom layer will contain electrical (controller) and
mechanical (motors) components. The battery will be placed on the back of the bottom layer to
provide stability and easy access. Wiring will be closely bound to the cart’s frame to avoid
damage and harm to the user.
Requirements Specification:
The following lists of needs and specifications represent the goals of the M.A.C.S. and
what criteria will be used to judge whether the system meets these marketing requirements.
Needs:
1.) Must autonomously follow an individual throughout a flat environment with obstacles.
2.) Must adapt in height for different individuals.
3.) Must identify an individual and follow only him or her.
4.) Must maintain a reasonable distance from the individual.
5.) Must have a reasonable battery life.
6.) Must respond to hand gestures.
7.) Must track small and large adults accurately.
Engineering Requirements
1.) Must be able to move up to 4 mph with a desired average speed of 3 mph.
2.) This speed should be attainable with up to a 20 lb load.
3.) Height adjustment will provide six inches of vertical change.
4.) The M.A.C.S should be no more than two feet from the followed individual at any time.
5.) Battery should last 1 hour
6.) The M.A.C.S should classify four gestures with an accuracy of 65%.
7.) The M.A.C.S should use smartphone sensors as well as depth sensors to find and calibrate
a user within three seconds of losing him or her.
Analysis:
1.) The speed specification is brought about by the fact that 3 mph is the average preferred
human walking speed [1].
2.) Approximately 30 lbs is the maximum comfortable weight that could be loaded into a
handheld shopping basket, so a 20 lb design load stays comfortably within this limit and makes a good target for the max load of the M.A.C.S.
3.) One standard deviation in human height is 6 in [2] so the M.A.C.S should be able to adjust at
least 6 inches.
4.) Human personal space is estimated to average 1.5 ft [3]. The M.A.C.S. should trail
slightly farther behind than this while still allowing closer interaction when an item is being placed
into it.
5.) 1 hr is enough time to complete most grocery store trips.
6.) Gesture recognition is done by machine learning and is very dependent on the person.
Significant data will be needed to get higher rates.
7.) Since the cart will be two feet behind the user, 3 sec is enough time for the cart to turn into or
out of an aisle and find the user.
Concept Selection:
Several existing systems were analyzed. Each of the following systems provides similar
functionality but also has downsides. Each subsystem of the M.A.C.S. is investigated in more
detail below.
Existing Systems:
1.) WPI autonomous shopping cart.
● This similar project is too slow to respond and relies on following the person’s
hand.
● This system uses a laptop to handle the vision and sensor processing.
2.) IS2You wi-Go autonomous shopping cart.
● A similar product which is too slow to follow a person throughout the store. It is also
difficult for disabled people to use because the basket sits relatively high, and it
appears to have trouble with corners as well.
● The wi-Go uses the Kinect and image processing to follow a person around. Both
technologies are being seriously considered.
3.) Whole Foods “Smarter Cart”
● A Kinect powered shopping cart which uses a tablet and barcode scanners to
match items to a shopping list and ring them up.
● Uses a Windows 8 tablet, UPC scanner and RFID to read the items.
4.) San Diego State Robotic Person Following
● Three different versions have been developed. The first used a fixed camera and
fuzzy logic. The second iteration added camera tilt and more robotic control as
far as speed and distance. The current iteration uses a Segway base with an
onboard computer, vision system, and other sensors.
● Currently, it follows a specific color of shirt that a person is wearing.
Concepts considered:
The following are general concepts which were considered as part of the M.A.C.S. There
are four main categories: processing, identification, drivetrain, and the cart/basket itself.
These categories are analyzed independently.
Processor/Operating System
1.) ARM processor (oDroid/Raspberry Pi…) with Linux distro
2.) Inexpensive laptop/tablet with Windows 7/8
3.) A combination of a microcontroller and laptop/tablet
Identification/Tracking/Object avoidance
1.) Image processing and body tracking with a Kinect or similar product
2.) Smart phone application
3.) Proximity sensors
Drivetrain
1.) Modified wheelchair assembly
2.) Custom four-wheel assembly
The cart/basket
1.) Removable basket(s)
2.) Cart/baskets that have an automated lift to adjust height
3.) Stationary basket(s)
Body Tracking and Gesture Recognition:
Concepts Considered:
Three hardware components were considered for body tracking and gesture recognition.
The initial idea was to use a Microsoft Kinect. The Xtion Pro, manufactured by Asus, was also
considered as well as the idea of using another camera and OpenCV to do image processing.
As far as software and image processing, we decided that in order to do gestures and
body tracking, we would like to use skeleton tracking. This concept uses the depth, IR, and RGB
sensors of the Kinect or Xtion Pro to map the skeleton of a user, including his or her major joints
and features such as the torso, hand, and head. This left us with three main options: write it
from scratch, use the Windows SDK, or use open source libraries. Since we ideally want to use
a microprocessor such as a Raspberry Pi or ODroid, the ability to be multi-platform was critical.
After some investigation, two libraries which allow the Kinect to be used on Linux were found:
Libfreenect, and OpenNI.
Table 1
Software    | Ease of use | Platforms      | Skeleton Tracking | Languages
OpenCV      | Difficult   | Linux, Windows | No                | C++, Python, Java (wrappers for other languages)
MS SDK      | Easy        | Windows        | Yes               | C#, VB.NET
Libfreenect | Moderate    | Linux, Windows | No                | C, C++, C#, VB.NET, Java
OpenNI      | Moderate    | Linux, Windows | Yes               | C++ (wrappers for .NET and Java)
Concept Chosen:
The Microsoft Kinect was chosen as the primary sensor to do body tracking and aid in
gesture recognition. OpenNI 1.5.2 was chosen to interface with the Kinect.
Rationale for Choice:
Since the cart will be in motion, image processing alone was not going to be enough to
successfully track a user without crashing; we also need to have a depth map of the
surroundings. Although the Xtion Pro has a slightly larger field of view (58°H and 45°V) and is
compatible with Windows, Linux, and Android, we decided to use a Microsoft Kinect because used
Kinects are much easier to find and are cheaper [7]. A new Xtion Pro and a new Kinect each cost $179,
whereas a used Kinect can be found for as little as $50.
To interface with the Kinect, it was found that Libfreenect is more of a driver than a
processing library. OpenNI, when combined with another open source driver, can be used to do
full skeleton tracking and much more on Ubuntu and other Linux distributions. This led us to
choose OpenNI 1.5.2, a stable open source library created in part by PrimeSense, the
manufacturer of the depth sensor in both the Xtion Pro and Kinect. There is excellent
documentation and example code which has already helped us get off the ground and do some
experimentation.
Personal Identification:
Concepts Considered:
Four different concepts were considered as a risk mitigation technique. The first was to
track the color of the user’s shirt, like the San Diego State Robotic Person Following. This
would allow the person to be identified using only the primary tracking sensor. However, this
could create issues when used under different lighting conditions, or if multiple people were
wearing the same shirt. It also provided no way to track the user if the camera could no longer
see him or her.
The next idea was to have a reflective object the user could wear which would then be
detected by the camera. This would require minimum extra hardware and would remove the
lighting uncertainty of the color tracking. However, this could also require there to be a bright
light on the cart, which could be dangerous or even annoying to anyone in the vicinity.
Furthermore, it does not mitigate the risk of the user’s leaving the camera frame.
The next idea was to create a wearable object that would gather the user’s position data
and wirelessly transmit it back to the cart. This would allow the user to be tracked even if not in
the camera’s vision, while still not requiring anything of the user. However, it would also be very
difficult and expensive to create something compact enough to be unobtrusive to the user.
Finally, it was suggested that the user’s smartphone could perform the same function as
the previous wearable tracker. This has many of the same advantages and disadvantages of
the previous design, but would require the user to have a smartphone and have the knowledge
to connect it to a Bluetooth network.
Table 2
1 – Track color of user’s shirt
   Favorable: Can use the primary tracking sensor; no need for extra hardware.
   Unfavorable: Multiple users wearing the same color shirt can cause issues. Lighting can complicate color tracking. No way to follow the user if he or she leaves the frame.
2 – A dedicated reflective object the user wears which is unique to the cart
   Favorable: Minimal extra hardware required. Simple to implement.
   Unfavorable: Requires the user to wear something, which many may find unfavorable. May require a light on the cart, which could be annoying or even dangerous to others. No way to follow the user if he or she leaves the frame.
3 – A wearable object that tracks the user’s position
   Favorable: User does not need anything to use the cart. Can use position sensors (gyroscope, accelerometer, etc.) to track the user’s movement. Can still be used if the target leaves the field of view of the primary tracking sensor.
   Unfavorable: Very difficult and expensive to create in a form factor small enough for a user to comfortably wear. Requires more complex logic to differentiate users from nonusers.
4 – Get position data from the user’s smartphone and transmit back via Bluetooth
   Favorable: A user’s movements can be tracked and recorded, so even if the user is off frame he or she can still be tracked.
   Unfavorable: Requires the user to have a smartphone and be able to connect to a Bluetooth network. Requires more complex logic to differentiate users from nonusers.
Concept Chosen:
The primary method of person identification will be using concept four, where position
data would be gathered from the user’s phone. This was decided because it seemed the most
plausible and reliable option that could realistically be implemented within the time frame. Time
allowing, concept one may also be implemented as a first-round check.
Rationale for Choice:
Table 3. Personal identification weighting.
Concept                         | Follow user over unfamiliar terrain | Identify the user and only follow that person | Maintain a reasonable distance from the user | Ease to user | Ease to implement | Total
1 – Color tracking              | 0 | 0 | 0 | 2 | 1 | 3
2 – Reflective, wearable object | 0 | 1 | 1 | 1 | 3 | 6
3 – Smart, wearable object      | 1 | 2 | 1 | 1 | 0 | 5
4 – Smartphone integration      | 1 | 3 | 1 | 0 | 2 | 7
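The totals in Table 3 follow from summing each row; a short sketch (our own, not project code) makes the selection reproducible:

```python
# Recompute the totals in Table 3 from the raw scores.
# Criteria order: unfamiliar terrain, identify user, distance keeping,
# ease to user, ease to implement.
scores = {
    "1 - color tracking":      [0, 0, 0, 2, 1],
    "2 - reflective wearable": [0, 1, 1, 1, 3],
    "3 - smart wearable":      [1, 2, 1, 1, 0],
    "4 - smartphone":          [1, 3, 1, 0, 2],
}

totals = {concept: sum(row) for concept, row in scores.items()}
best = max(totals, key=totals.get)  # smartphone integration wins with 7
```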
Cart Movement and Control System:
Concepts Considered/Chosen:
The first step is to decide upon a material for the cart. Metal (steel) is desirable for its
strength and the ability to form it into many shapes while wood is desirable for its light weight
and low cost. Ultimately cost is the limiting factor for this design, so wood wins out unless
suitable scrap can be found for free. The cart, with batteries and load, should weigh about 50 lbs
if made with wood. To move 50 lbs at the target max speed of 5 mph takes approximately 2/3
HP. Two controlled induction motors (CIM) at nearly ½ HP each will reach this target, but there
is still room to reduce the motor power, driving down the cost of the motor itself and, through
reduced current draw, the motor controller as well. A reduction in spec to a max of 3 mph is
in order so that cost goals can be more easily met. With that reduction in spec, the much
cheaper “CCL 9015” motor can be used. This motor only outputs 179W which is ¼ HP vs the
CIM motor’s nearly ½ HP. However, with the reduction in max speed, ¼ HP for each motor is
enough to meet spec. This provides additional value by reducing power consumption and cost
through using a cheaper, lower amperage motor controller. Also, two-in-one controllers are available
at this new lower amperage, halving the cost. Gearboxes for these motors are readily available
online for approximately $50, but a much cheaper one could be built from old, used gears.
Knowing that the wheels are 5” in diameter, the gearing can be calculated exactly to give the top
speed at the motor’s most efficient RPM.
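The gearing arithmetic described above can be sketched as follows; the motor's most-efficient RPM is a placeholder for illustration, not a published CCL 9015 figure:

```python
import math

# Gearing sketch for the drivetrain above: 5-inch wheels, 3 mph top speed.
# MOTOR_EFFICIENT_RPM is an assumed value, not a manufacturer spec.

WHEEL_DIAMETER_IN = 5.0
TOP_SPEED_MPH = 3.0
MOTOR_EFFICIENT_RPM = 3000.0  # illustrative placeholder

speed_in_per_min = TOP_SPEED_MPH * 5280 * 12 / 60             # 3168 in/min
wheel_rpm = speed_in_per_min / (math.pi * WHEEL_DIAMETER_IN)  # ~201.7 RPM
gear_ratio = MOTOR_EFFICIENT_RPM / wheel_rpm                  # ~14.9 : 1
```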
For knowledge of obstacles, the Kinect sensor alone is not sufficient. There may be
obstacles down by the bottom of the cart that the Kinect will not see since it is placed up high.
To correct for this, a proximity sensor will be mounted to the bottom front of the cart, with the data
from the sensor being incorporated into the motor control loop. The sensor works by sending
out an ultrasonic pulse and then measuring how long the pulse takes to return. Since the speed
of sound and time the sound travelled is known, the distance it had to travel can be calculated.
This is the distance the cart is away from an obstacle.
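The echo-time arithmetic above can be written directly; the speed of sound used is the usual room-temperature value:

```python
# Echo-time to distance for the ultrasonic sensor described above.
# The pulse travels to the obstacle and back, hence the divide-by-two.

SPEED_OF_SOUND_M_S = 343.0  # dry air at roughly 20 °C

def echo_to_distance_m(echo_time_s: float) -> float:
    """Distance to the obstacle given the round-trip echo time."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2

# A 5 ms round trip puts the obstacle about 0.86 m from the cart.
```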
Design:
The basis for the design is that the system will take in information about its environment
and the position of the user in order to determine how it will move. The system will take in
tracking information from the primary vision sensor, the user identification subsystem, collision
sensors, and occasionally feedback from the user. Using these data, the cart will attempt to
follow the user within allowable tolerances. The overall system is divided into three main
subsystems: the user tracking subsystem, the personal identification subsystem, and the control
system.
Level 0 Diagram:
Table 4. Overview of M.A.C.S.
Module: M.A.C.S.: Mobile Autonomous Containment System
Inputs: User Tracking Data, User Identification Data, Collision Sensor Data, User Feedback, Power
Outputs: Motor Control Signal
Functionality: Identify and track an individual user to follow him or her around an unknown environment. The system should not collide with other objects and should be able to accept user feedback and control. The system should not be significantly faster nor slower than the user walking.
Figure 3. Overview of system components.
Body Tracking and Gesture Recognition Subsystem Overview:
Tracking the user’s position relative to the system is a crucial element of the M.A.C.S. In
addition to tracking, the M.A.C.S. will rely on gestures for user interaction. Both present
many risks, such as failing to recognize gestures, losing the current user, objects obstructing the
sensors, and selecting the wrong user. The Microsoft Kinect will be used as the main device to
track a specified user and provide his or her location and velocity relative to the robot platform. The
Kinect will also be used to recognize several hand gestures which will be interpreted to control
the robot. Since the Kinect provides a depth sensor, object avoidance can also be enhanced if
deemed necessary. The Kinect was chosen over other options due to its wide acceptance, open
source libraries, and cost.
It was previously determined that the average person walks at 3 mph [1]; therefore, the
body tracking must be able to continuously track a person moving at up to 5 mph, as we have a
tolerance of 2 mph. It was also previously established that human personal space is estimated to
average 1.5 ft, which is why the device must maintain a distance of 1.5–2 ft from the user [2].
As the Kinect has a practical range of 3.9–11 ft, it will be necessary to mount the Kinect on the
back of the cart so that it can successfully track people [4]. In our testing, we found that once
the user is less than 1 m (3.3 ft) away, the skeleton tracking becomes unreliable.
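A minimal sketch of distance keeping under these numbers, assuming a simple proportional controller (the gain and speed cap are illustrative, not tuned project values):

```python
# Proportional following-speed sketch: drive faster when the user pulls
# ahead of the 2 ft target gap, stop entirely when too close.
# KP and MAX_SPEED_MPH are assumed values for illustration.

TARGET_FT = 2.0
MAX_SPEED_MPH = 4.0
KP = 2.0  # mph of commanded speed per foot of gap error (assumed)

def follow_speed_mph(distance_ft: float) -> float:
    """Commanded cart speed given the measured gap to the user."""
    error = distance_ft - TARGET_FT
    return min(max(KP * error, 0.0), MAX_SPEED_MPH)
```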
As far as classifying gestures, the most proven way involves machine learning using
primarily support vector machines or dynamic time warping. These have been shown to have
classification rates as high as 99% [5], [6]. There is a difference between a pose and a gesture.
Static poses, such as a raised hand, are easier to classify than movements which vary
more from person to person, such as a wave. A 65% classification rate might seem low, but
depending on the type and number of gestures we decide to classify, the complexity can
increase significantly.
If a user leaves the field of view of the Kinect, the robot should be able to relocate him or
her and begin tracking him or her again within three seconds. The most likely scenario for this to
occur is when the user, walking up and down the aisles, turns 180° and then goes
out of view behind the aisle wall. We estimate that with a person walking at 3 mph (4.4 ft/s),
three seconds allows for a maximum of 13.2 ft before a timeout is reached. Since the robot is
following approximately 2 ft behind the user and should know his or her last few turns from the
smartphone, 13.2 ft will be more than enough distance for the cart to make a turn left or right,
drive 5–6 ft, turn left or right and regain tracking. The Kinect has a horizontal angular field of
view of 57° and a vertical field of view of 43° with a variation of 27° by tilting the Kinect up or
down with the included motorized pivot [4]. By adjusting the angle of the Kinect, people from 4'
to 6'6" will be able to be fully tracked. Also, since we only need to track the upper body, the Kinect
can be installed at a fixed angle.
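The field-of-view figures above translate into coverage via simple trigonometry; this sketch (our own arithmetic, not project code) computes the vertical span the Kinect sees at a given distance:

```python
import math

# How tall a slice the Kinect's 43-degree vertical field of view covers
# at a given distance, ignoring tilt.

V_FOV_DEG = 43.0

def visible_span_ft(distance_ft: float, fov_deg: float = V_FOV_DEG) -> float:
    """Vertical extent visible at the given distance from the sensor."""
    return 2 * distance_ft * math.tan(math.radians(fov_deg / 2))

# At the ~3.9 ft near edge of the practical range, the camera sees only
# about a 3 ft slice, which is why tilt or a high mounting point is
# needed to frame the upper body of both short and tall users.
```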
There are two risks related to relying on the Kinect: tracking a specific person and
gesture recognition. The main problem with using the Kinect to track a person is obstruction of
the field of view. If the sensors cannot see the person, the data are corrupted and the skeleton
tracking will fail. To help mitigate this, the Kinect will be mounted high on the cart such as near
the handle (Figure 3). The legs and other lower body parts do not need to be tracked, so this
placement is ideal. This placement also provides a larger field of view and moves the Kinect
further away from the person so that if the person is close to the cart, the tracking will not fail.
This will allow the tracking of tall adults as well as children. The other piece that will be
incorporated into this design is the use of a smartphone streaming the user’s motions and
current directions. Should the Kinect lose the person, these data will tell the cart which way the
user went, and then the cart can attempt to move that way and relocate the user. Once the user
is in the field of view, the Kinect will start tracking him or her again. As described in previous
sections, the maximum amount of time between losing a user and regaining complete tracking
is three seconds. If for some reason this time limit elapses and the user cannot be found and
tracked, either a warning will be sent to the user’s phone or the cart will turn on a light or make a
sound. When the user comes back into view, he or she will have to perform a gesture to re-initiate
tracking.
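The lose/search/alert behaviour described above can be sketched as a small state machine; the state names and the gesture flag are illustrative, and timestamps are passed in so the logic can be exercised without a real clock:

```python
# Sketch of the lose/search/alert cycle: after three seconds without
# tracking, the cart stops searching and alerts the user, who must
# gesture to re-initiate tracking.

TIMEOUT_S = 3.0  # maximum search time before alerting the user

class TrackingMonitor:
    def __init__(self):
        self.state = "TRACKING"
        self.lost_at = None

    def update(self, user_visible: bool, gestured: bool, now_s: float) -> str:
        if self.state == "ALERT_USER":
            # Only an explicit gesture re-initiates tracking after a timeout.
            if gestured:
                self.state, self.lost_at = "TRACKING", None
        elif user_visible:
            self.state, self.lost_at = "TRACKING", None
        elif self.state == "TRACKING":
            self.state, self.lost_at = "SEARCHING", now_s
        elif now_s - self.lost_at >= TIMEOUT_S:
            self.state = "ALERT_USER"  # warn the phone, light, or sound
        return self.state
```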
Figure 4. Tracking Module.
Figure 5. Kinect placement.
The second major risk with using a sensor such as the Kinect is relying on gestures to
control the cart and sync users. There were two methods which were investigated. The first was
using pseudo-state machines to track various joint positions in space. This naïve approach
might be acceptable for very simple gestures or poses, but it is not nearly as effective as
machine learning techniques. Since ideally four to five gestures will be recognized, machine
learning will be a much safer and more effective strategy. As previously mentioned, the two most
commonly cited algorithms are dynamic time warping and support vector machines. By using LIBSVM,
an open source library which has been used in several published papers, and recording enough data
to do cross-validation, classification will be successful. Using the process in Figure 6, a 65%
classification rate should be attainable.
Figure 6. Gesture recognition overview.
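To make the classification step concrete, here is a minimal stand-in for the SVM: a nearest-centroid classifier over joint-derived feature vectors. The feature layout and gesture names are invented for illustration; the project itself would train LIBSVM on recorded joint data.

```python
# Nearest-centroid stand-in for the SVM classification step.

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(samples):
    """samples maps gesture name -> list of recorded feature vectors."""
    return {name: centroid(vecs) for name, vecs in samples.items()}

def classify(model, features):
    """Return the gesture whose centroid is nearest the given features."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda name: dist2(model[name], features))
```

With real joint-angle features recorded per user, cross-validating a trained model is what produces classification-rate estimates like the 65% target above.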
Personal Identification Subsystem Overview:
This mitigation technique will be based on keeping track of the user’s movements over a
period of time. When the primary sensor is unsure about the position of the user, it will ask the
user’s phone to send the last three seconds of gathered data over Bluetooth. These data will be
whether the user is in motion, which direction he or she is facing, and possibly his or her speed or
distance covered in that time. From these data, the primary sensor can determine which person
in the frame is the target, or which direction to turn if the user is not in the frame.
How This Subsystem Works:
The phone will monitor and record the data from the accelerometer, gravity sensor, and
magnetic sensor. The accelerometer can be used to determine when a user is turning or
moving. The gravity and magnetic sensors can be used together to determine the direction the
user is facing relative to the initial point. These data will be averaged over some period of time
before being added to a queue which will hold all data over a three-second period. This mitigates
the issues associated with engineering requirement A, that the cart must keep up with the user,
by being able to send a large amount of useful data. The cart can then make an informed
decision on which user to follow based on these data or, if still unsure, request an updated set
of data a short period of time later, until the target is found.
Engineering requirement B, that the cart must follow the correct user, is handled in that
the direction the user is currently travelling can be determined. Therefore, even if the cart is
unsure about which target to follow, it only has to move in the same direction as the user until
the cart finds the target again.
Finally, by keeping a record of the past three seconds of data, engineering requirement
C can be satisfied. Because the cart must be able to find the user within three seconds, the
past three seconds of movement data should be ample to determine where the user was and
predict where he or she will be soon.
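A sketch of the three-second queue described above, assuming the phone averages its sensor readings every 0.25 s (an illustrative interval, not a project specification):

```python
from collections import deque

# Rolling three-second history of averaged sensor samples. A bounded
# deque drops the oldest sample automatically as new ones arrive.

SAMPLE_PERIOD_S = 0.25                 # assumed averaging window
BUFFER_LEN = int(3 / SAMPLE_PERIOD_S)  # 12 entries = 3 seconds

history = deque(maxlen=BUFFER_LEN)

def record(heading_deg: float, moving: bool) -> None:
    """Append one averaged sample; the oldest falls off the front."""
    history.append({"heading_deg": heading_deg, "moving": moving})

def last_three_seconds():
    """What the phone would send the cart on request, oldest first."""
    return list(history)
```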
Why this Design Mitigates Risk:
The basic risk is that the primary sensing device may not, at all times, be able to identify
the correct user to follow. This mitigation strategy of sending previous and current movement to
the cart from the user’s smartphone allows for the cart to be relatively certain of where its user
is, and therefore quickly reacquire the user. Even if the user is lost for any reason, it
should be fairly easy to move the cart and find the user again, and it should be fast, since the
user should not be far out of range of the primary tracking sensor.
Detailed Design: Table 5. Personal identification overview.
Module: PID (Personal Identification) Subsystem
Inputs: Accelerometer Data
Magnetic Sensor Data
Gravity Sensor Data
Outputs: Past 3 seconds of position data
Functionality: The user’s smartphone would gather the data from the three sensors
and store it over three seconds. When requested, it would send
these data to the main system which could use it to determine the
user’s past and present position and movement.
M.A.C.S. 19
Figure 7. Personal identification system diagram.
Table 1. Personal identification Android application overview.
Module: PID (Personal Identification) Subsystem – User Side App
Inputs: Accelerometer Data
Magnetic Sensor Data
Gravity Sensor Data
Data Requests over Bluetooth
Outputs: Past 3 seconds of position data
Functionality: The user’s smartphone would store position data for three seconds
and listen for a request for these data from the main system. When
the request was received, it would transmit the data over Bluetooth.
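The listen-and-respond behaviour in Table 1 might look like the sketch below. The "REQ" message and JSON payload are invented for illustration; the real app would be written in Java against the Android SDK, with the transport being a Bluetooth RFCOMM socket:

```python
import json
from collections import deque

class PidApp:
    """App-side stub: hold ~3 s of samples, serialize them on request."""

    def __init__(self, history_len=12):
        self.samples = deque(maxlen=history_len)   # ~3 s of averaged samples

    def record(self, accel, magnetic, gravity):
        self.samples.append({"accel": accel, "mag": magnetic, "grav": gravity})

    def handle_request(self, message):
        if message != "REQ":                       # ignore anything unexpected
            return None
        return json.dumps(list(self.samples))      # payload sent back over Bluetooth

app = PidApp()
app.record([0.1, 0.0, 9.8], [30.0, 5.0, -40.0], [0.0, 0.0, 9.8])
print(app.handle_request("REQ"))
```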
Figure 8. Personal identification Android application diagram.
Table 6. Personal identification module between cart and Android application.
Module: PID (Personal Identification) Subsystem – Cart Program
Inputs: User’s Position Data
Request from Main System for position information
Outputs: Requests for data from the user
Position Data
Functionality: When the main system is unsure about the user’s position, it would
ask the subsystem for the user’s position data. The program on the
cart would then request these data from the user’s smartphone.
Once the program had the data, it would use these data to determine
where the user is and where it is heading to send to the main system.
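Determining "where the user is heading" ultimately reduces to a compass heading from the magnetic sensor. The sketch below assumes the phone is held roughly level and uses an arbitrary axis convention; a production version would tilt-compensate with the gravity vector (on Android, via SensorManager.getRotationMatrix() and getOrientation()):

```python
import math

def heading_degrees(mag_north, mag_east):
    """Compass heading in degrees: 0 = magnetic north, 90 = east."""
    return math.degrees(math.atan2(mag_east, mag_north)) % 360.0

print(heading_degrees(1.0, 0.0))   # 0.0  (facing north)
print(heading_degrees(0.0, 1.0))   # 90.0 (facing east)
```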
Figure 9. Personal identification software system running on the cart.
Figure 10. Full system flowchart.
Figure 11. PID subsystem main program flowchart.
Figure 12. PID subsystem user app flowchart.
The final physical design of the cart is shown in Figures 13, 14, and 15. The linear
actuator is clearly visible, centered on the cart. The battery and other electrical
components will be placed underneath it, where they will be protected while still allowing
easy access for modification and troubleshooting. The cart will be rear-wheel drive, with
two caster wheels at the front. The Kinect will be mounted high, near the handle of the cart, in
order to provide the largest field of view.
Figure 13. Side view of the cart.
Figure 14. Top view of the cart.
Figure 15. Back view of the cart.
Engineering Standards:
Standard | Where it is used | Why it is used
USB 2.0 | Kinect to PC connection; serial breakout to PC connection | Standard, fast, and efficient way to connect peripheral devices to a computer
Bluetooth 2.0 (IEEE 802.15) | Smartphone to PC wireless connection | Efficient means of short-range wireless communication
NFC | Establishing the Bluetooth connection between smartphone and cart | Simple method to transmit data between two devices that do not know about each other
Linux x86-64 3.0 | On main PC | Standard, up-to-date Linux platform
PWM | Controlling motor speed | Simple way to communicate with motors
Android SDK (Revision 21) | User app for communicating with the cart | Recent version of the Android SDK with the tools required for operation
OpenNI 1.5.2 | Tracking software | Open-source implementation for Kinect tracking
Multidisciplinary Aspects:
Field of Engineering | Where it is used | Details
Mechanical | Cart design | Need to design the body of the cart, as well as the drive system
Electrical | Cart power; cart connectivity | Need to provide power to the various systems on the cart and to connect its sensors
Computer | Tracking code; PID code | Need algorithms to control the tracking and finding of users
Background:
Course | Where it is used
Computer Science I-IV | Writing and designing functional and maintainable code
Software Engineering | Documenting code and writing code within a team
Biorobotics | Machine learning for gesture recognition
IDE | Connecting various digital components to a controller
Applied Programming | Implementing complex algorithms in code
Constraints and Considerations:
Extensibility
The project as a whole may not be very extensible. It would be possible to expand the
number of carts to allow multiple people to have a cart, but there are few ways to extend an
individual cart into other applications. However, its modular design may allow for individual
components to be incorporated into other applications. For example, one could take the
tracking subsystem and apply it to a different application to allow for different autonomous
following devices.
Manufacturability
The cart should be relatively simple to manufacture. The only difference between the
M.A.C.S. and a regular cart is that it requires a place to store the electrical and computer
components. There are no intricacies in the design that would overly complicate manufacturing
the system.
Reliability
The most obvious points of fault for the system are the moving parts: the wheels, the
motors, and the computer’s hard drive. The motors and hard drives in particular will eventually
wear out and need to be replaced. Unlike a regular cart, the M.A.C.S. has many more
components which could be damaged and prevent the cart from operating normally. However,
most of these components should be well protected against usual wear and tear of a cart, and
should also be relatively easy to replace should the need arise.
Ethical Issues
The entire premise of the system is that it tracks a user using visual and position data.
There are many who would find this unacceptable in any situation. The ethical issue is
mitigated by not logging any information about the user or their position, but the issue of a
computer tracking a person can still cause unease.
Intellectual Property
The visual tracking system uses the OpenNI framework for interacting with the Kinect.
This is a free and open-source alternative to Microsoft’s Kinect SDK. Despite being more
complex than the Microsoft SDK, OpenNI allowed the system to be run on Linux, which was the
target platform for the system.
Cost estimates:
Table 2 shows the items which were evaluated for the M.A.C.S., as well as their advantages,
disadvantages, and cost. The bolded items were chosen to be used for the M.A.C.S.
Table 2. Cost analysis.
Hardware | Advantage | Disadvantage | Cost
Webcam/Camera | | Difficult | $25-200 new
Xtion Pro | Moderate difficulty | Backordered | $179 new
MS Kinect | Moderate difficulty | | $109.99 new, $40 used
Wood | Cheap, light | Unattractive, less formable | Free-$25
Metal | Formable, strong | Heavy, expensive | Free-$100
CIM Motor | Powerful | Draws lots of current | $30 x2
Other 12V Motors | Cheaper | May not be powerful enough | $15 x2
CIM Controller (50A) | Powerful | Expensive | $50 x2
Dual 12V Motor Controller (15A) | Cheaper | May not be powerful enough | $50
Gearbox (premade) | Strong, easy to use | Expensive | $50
Gearbox (homemade) | Cheap | Weaker, more difficult to make | Free-$20
USB Serial Board | Cheap, easy to use | Primitive control system | $20
Linear Actuator | Powerful | Expensive, drains battery | $50
Android Smartphone | Moderate difficulty | | Free-$200
Bluetooth Module | Widely available | | Built in laptop-$20
Additional Processing (Raspberry Pi) | Moderate | May not be necessary | $35
Proximity Sensor | Widely available | | $12 x 4
Total Cost | N/A | N/A | $283
Testing Strategy:
Body Tracking and Gesture Recognition:
In order to test the proposed system for user tracking, an empty cart will be used and
asked to follow a specific person. This will be done by syncing the user’s phone to the
laptop/controller. Then the user will walk straight for 20 ft. The user will then turn left around a
corner or wall and continue walking. If the system is able to maintain tracking within the
specified three-second recalibration window, the test will be successful. Assuming the previous
test was successful, the user will continue walking and walk around an aisle to simulate a retail
store environment. If the system continues to track the user, the test is successful. These
tests are dependent on having a mobile platform with a mounted Kinect.
Another high level test will test when multiple users are in the field of view. The system
should continue to track the initial user. One user will be tracked and then another person will
walk into the field of view (unobstructed). This will test the robustness of the tracking/user
selection algorithm. If this test is successful, the next step is to have the two people cross paths
several times where they will obstruct each other. The OpenNI library should handle this case,
but nonetheless it will be tested. Once all of the previous tests have been proven to be
successful, the final test will have a person being tracked, leave the field of view, and then have
another, different person enter the field of view. By using the accelerometer data from the user’s
phone, the new user should not be followed or tracked. When the original user walks back into
the frame the algorithm should detect him or her and start tracking. Should this last test fail, the
user will perform the synchronization gesture to recalibrate the tracking. These tests can be
accomplished once the OpenNI Kinect code and system is merged with the phone data
recording.
Testing the classification rate of the gestures will be done while the machine learning
algorithm is developed. The process involves recording all of the data and dividing it into three
sections: one half and two quarters. The first half is used to train the algorithm. The second
quarter is used for validation: the algorithm is adjusted until it performs equivalently on the
training and validation data. The last quarter of data is used to test the finalized algorithm and
obtain a classification rate. Comparing this rate to the target of 65% will show whether the
gesture recognition is successful.
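The 50/25/25 split and the 65% check can be sketched as follows, using synthetic data in place of the recorded gesture samples:

```python
def split_data(data):
    """Divide recorded data into a half for training and two quarters."""
    n = len(data)
    return data[: n // 2], data[n // 2 : 3 * n // 4], data[3 * n // 4 :]

def classification_rate(predicted, actual):
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

samples = list(range(100))                    # stand-in for recorded gesture data
train, validate, test = split_data(samples)
print(len(train), len(validate), len(test))   # 50 25 25

# Final check of the held-out quarter against the 65% target:
rate = classification_rate([1, 1, 0, 1], [1, 0, 0, 1])
print(rate >= 0.65)                           # True (rate = 0.75)
```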
Personal Identification Subsystem:
The first test performed on this was to do a proof-of-concept test on the system. The
user side of the system saved the last read values from the three sensors of interest:
accelerometer, gravity, and magnetic. The app would then accept requests for the values of
one of the sensors. The system program would run in a loop where every second it would send
a request for the three values. Each value would then be used to determine whether that
sensor determined the user to be facing the center, facing left, or facing right. The program
would then display what direction it determined the user was facing based on these data. This
test showed a very simple demonstration of the basic functionality of the system. It was
demonstrated by having a user turn left and right from a central position, and the output of the
main program was checked to ensure it reported the correct direction. This test was completed
on October 17th.
The next testing milestone would be to create a library which could interface with the
other components of the system. To test this, an executable was created which interfaced with
the library. This tested the ability to pass sensor values from the phone to an application
through a library. This was completed on November 4th.
The next testing milestone would be to establish the position the user is facing relative to
the computer. This would be tested by having a user move around the computer and checking
that the program can correctly identify the user’s relative position. These positions would be
front centered, front left, front right, behind centered, behind left, and behind right. This testing
would be completed by November 18th. When this is completed, the next testing step would be
to expand the previous test to indicate the user’s position as an angle from the center position.
The testing strategy for this would, like the previous test, involve the user’s moving around the
computer while making sure that the correct angle is reported for the user’s position.
Furthermore, this test should also report whether the user is moving or standing still.
Next, the system will be able to determine the user’s relative position and whether or not
the user is moving while the system is in motion. This will be tested by having the system follow
the user and as the user turns, the system will be able to adjust where it believes the user to be
based on the system’s current position. For example, if the user is directly ahead of the cart
and then turns to the right, the cart should also be able to turn right and recognize that the user
is once again directly ahead of it. This will be tested by having the user and the system move
independently and making sure that the angle is still reported correctly.
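The bookkeeping described above amounts to subtracting the cart's own heading from the user's reported world heading; a sketch with a hypothetical helper function:

```python
def relative_bearing(user_heading_deg, cart_heading_deg):
    """Signed bearing of the user relative to the cart's forward axis, in (-180, 180]."""
    diff = (user_heading_deg - cart_heading_deg) % 360.0
    return diff - 360.0 if diff > 180.0 else diff

print(relative_bearing(0.0, 0.0))     # 0.0: user dead ahead, both facing north
print(relative_bearing(90.0, 0.0))    # 90.0: user has turned east, cart has not
print(relative_bearing(90.0, 90.0))   # 0.0: cart turned too; user is dead ahead again
```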
Following this, the system will be expanded to take the previous three seconds of
movement data, instead of just the current data. With this, the cart should be able to follow the
user over more erratic paths. The test for this would be for the user to make two rapid turns
around a corner, so that when the cart turns the first corner the user has already turned the
second.
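One way to use the three-second history on erratic paths is to replay the user's recorded positions as breadcrumbs, so the cart takes both corners the way the user did instead of cutting across. The class below is an illustrative sketch, not the actual design:

```python
from collections import deque

class PathFollower:
    """Store the user's recent positions and steer toward them in order."""

    def __init__(self, spacing=0.5):
        self.waypoints = deque()
        self.spacing = spacing          # metres between stored breadcrumbs

    def record(self, pos):
        if not self.waypoints or self._dist(self.waypoints[-1], pos) >= self.spacing:
            self.waypoints.append(pos)

    def next_target(self, cart_pos, reach=0.3):
        # Drop waypoints the cart has already reached, then aim at the oldest left.
        while self.waypoints and self._dist(self.waypoints[0], cart_pos) < reach:
            self.waypoints.popleft()
        return self.waypoints[0] if self.waypoints else None

    @staticmethod
    def _dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

f = PathFollower()
for p in [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]:   # user turned a corner at (2, 0)
    f.record(p)
print(f.next_target((0.0, 0.0)))   # (1, 0): aim along the path, not straight at (2, 2)
```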
Test Descriptions:
The following tests were developed to thoroughly test the M.A.C.S. individual components, the
integration of the subsystems, and the overall design. The details of each test may be found in
the appendix.
Unit Tests: Table 7. Unit Tests.
Test Case Name | Description
Turning/PIDUT01 | Verify that the PID can correctly determine which direction the user is facing.
Movement/PIDUT02 | Verify that the PID can correctly determine if the user is moving or standing still.
Movement and Turning/PIDUT03 | Verify that the PID can correctly determine if the user is moving or standing still while they are turning.
Effortless Connection/PIDUT04 | Verify that the user can connect to the cart’s Bluetooth effortlessly, entirely through the Android app.
Initialize Tracking/PTGRUT01 | Verify that a user is correctly identified and is tracked once the tracking has been enabled.
Range of Tracking/PTGRUT02 | Verify that users are tracked at various distances from the Kinect sensor.
Basic Gesture Recognition/PTGRUT03 | Verify that the four gestures are recognized within one second of their action.
Recalibration of User/PTGRUT04 | Verify that if a user goes out of the FOV and is not automatically recognized and tracked upon re-entry, they can be recalibrated successfully.
Integration Tests: Table 8. Integration Tests.
Test Case Name | Description
Maximum Speed/PMAUT01 | Verify that the cart can reach the maximum speed.
Turning/PMAUT02 | Verify that the cart can perform turns.
Vertical Actuation/PMAUT03 | Verify that the basket can extend 6” up or down.
Power Draw/PMAUT04 | Verify that the battery life is acceptable and power draw does not overheat the controller.
Basic User Selection via Phone Data/MIT01 | Selecting the correct person to follow when the user leaves the field of view and then re-appears.
Advanced User Selection via Phone Data/MIT02 | Selecting the correct person to follow, with a second person in view, when the user leaves the field of view and then re-appears.
Basic Gesture Recognition Response/MIT03 | Tests the response of the cart when certain gestures are performed.
Initialization/MIT04 | Tests the initialization of the Bluetooth network, PID Android App, and Kinect sensor tracking software.
Gesture Collision Avoidance/MIT05 | Tests the collision avoidance of the cart when responding to gestures.
Collision Detection/MIT06 | Tests the collision detection of the cart while following the user.
Acceptance Tests: Table 9. Acceptance Tests.
Test Case Name | Description
Follow a user in a straight line, Unloaded/MAT01 | The system must be able to move up to 5 MPH and must be no more than two feet from the followed individual at any time.
Follow a user in a straight line, Loaded/MAT02 | The system must be able to move up to 4 MPH; this speed must be attainable with up to a 20 lb. load, and the system must be no more than two feet from the followed individual at any time.
Follow a user along a path/MAT03 | The system must autonomously follow an individual throughout a flat environment with obstacles.
General Use Case/MAT04 | Tests a general use case for the system.
Risks:
As previously described in the design section, there are several risks, both with the
M.A.C.S. as a whole and with the individual subsystems. Personal identification is the
largest risk: correctly choosing the person to follow is vital to the success of the M.A.C.S. By
combining Kinect skeleton tracking with recently recorded smartphone sensor data, the correct
person will be chosen. The second major risk is building the cart. With a desired budget of $300,
the most cost-effective motors and supporting parts must be chosen. Gearboxes and speed
controllers are well over $50 each. This leaves around $150 for the cart, proximity sensors,
wheels, and a power distribution system. The third and final major risk involves gesture
recognition. By using machine learning and optimization techniques, a 65% classification rate is
reasonable, but having little experience with real-time classification leaves some doubts. Table
10 summarizes these risks.
Table 10. Summary of major risks.
Risk | Level | Summary | Proposed Solution
Personal Identification | High | The system needs to be able to identify a specific person. | Use smartphone sensors and skeleton tracking to find, identify, and select the correct user to follow.
Gesture Recognition | Med | The system needs to be able to find and track an individual's body movements, location, and gestures. | Use a Kinect to identify joint locations in real time, then feed these into an optimized SVM for classification.
Cart/Drivetrain | High | The system needs a drivetrain and base which can support the necessary hardware, sensors, and weight of items in the cart while staying in budget. | Use a dual speed controller to drive the motors, with a two-wheel drive system and casters for support.
Milestone Chart: Table 11. Milestones.
Task Name | Scheduled Completion Date | Responsible Team Member(s)
First Stage Android App | 27 January 2014 | Matt
Tracking and PID Integration | 3 February 2014 | Aaron, Matt
Build Cart Base | 3 February 2014 | Gage
Cart Identifies User | 3 February 2014 | Aaron, Matt
Cart Tracks User | 3 February 2014 | Aaron, Matt
Cart Moves Forward | 5 February 2014 | Gage
Cart Turns | 12 February 2014 | Gage
Build Cart Basket | 17 February 2014 | Gage
Complete User Interface | 17 February 2014 | Matt
Full Tracking and PID Integration | 3 March 2014 | Aaron, Matt
Cart Basket Moves | 3 March 2014 | Gage
Cart Avoids Obstacles | 17 March 2014 | Gage, Aaron
Gesture Recognition | 17 March 2014 | Aaron
Finalize Cart | 24 March 2014 | Gage
Finished Product | 29 April 2014 | All
Appendix:
Detailed Testing Strategy
Personal identification Unit Tests
Test Case Name/ID: PID Unit Test 01 – Turning /PIDUT01
Test Writer: Matthew Hoffman
Description: Verify that the PID can correctly determine which direction the user is facing.
Setup: Open the PID Android app and connect to the cart’s Bluetooth network. The cart should
be stationary, and have an indicator of which direction the user is facing. The user should then
turn in a circle and ensure that the cart is reporting the same direction that the user is facing.
Test | User Direction | Expected Output | Pass | Fail | N/A | Comments
1 | Facing the cart | South
2 | Approx. 45° to the left of the cart | South-East
3 | Approx. 90° to the left of the cart | East
4 | Approx. 135° to the left of the cart | North-East
5 | Facing away from the cart | North
6 | Approx. 135° to the right of the cart | North-West
7 | Approx. 90° to the right of the cart | West
8 | Approx. 45° to the right of the cart | South-West
Test Case Name/ID: PID Unit Test 02 - Movement/PIDUT02
Test Writer: Matthew Hoffman
Description: Verify that the PID can correctly determine if the user is moving or standing still.
Setup: Open the PID Android app and connect to the cart’s Bluetooth network. The cart should
be stationary, and have an indicator of user movement. The user should be stationary.
Step | Action | Expected Result | Pass | Fail | N/A | Comment
1 | Walk at a regular pace for a couple of seconds | Cart will indicate movement
2 | Speed up to walk at a fast pace for a couple of seconds | Cart will indicate movement
3 | Stand still for a couple of seconds | Cart will indicate no movement
4 | Move around in a contained area of approx. a two-foot radius | Cart will indicate no movement
5 | Walk at a slow pace for a couple of seconds | Cart will indicate movement
6 | Speed up to walk at a regular pace for a couple of seconds | Cart will indicate movement
Test Case Name/ID: PID Unit Test 03 – Movement and Turning/PIDUT03
Test Writer: Matthew Hoffman
Description: Verify that the PID can correctly determine if the user is moving or standing still
while they are turning.
Setup: Open the PID Android app and connect to the cart’s Bluetooth network. The cart should
be stationary, and have an indicator of user movement and user direction. The user should be
stationary and facing the cart.
Step | Action | Expected Result | Pass | Fail | N/A | Comment
1 | Turn away from the cart | Movement: None; Direction: North
2 | Walk forward at a regular pace for a couple of seconds | Movement: Yes; Direction: North
3 | Without stopping, turn right and walk forward for a couple of seconds | Movement: Yes; Direction: East
4 | Stop and turn right | Movement: None; Direction: South
5 | Turn in a circle | Movement: None; Direction: All
6 | Walk forward at a slow pace until next to the cart | Movement: Yes; Direction: South
7 | Turn right and walk towards the cart | Movement: Yes; Direction: West
8 | Turn to face the cart | Movement: None; Direction: South
Test Case Name/ID: PID Unit Test 04 – Effortless Connection/PIDUT04
Test Writer: Matthew Hoffman
Description: Verify that the user can connect to the cart’s Bluetooth effortlessly, entirely through
the Android app.
Setup: Turn off Bluetooth on the phone. The cart should be on and stationary.
Step | Action | Expected Result | Pass | Fail | N/A | Comment
1 | Open the PID app on the phone | The phone’s Bluetooth will turn on; app will open
2 | Place the phone, with the app open, near the NFC tag on the cart | The app should indicate that it has found an NFC tag
3 | Tap a button in the app to connect to the cart | A message will appear indicating a successful connection
Personal Tracking and Gesture Recognition Unit Tests
Test Case Name/ID: PTGR Unit Test 01 – Initialize Tracking /PTGRUT01
Test Writer: Aaron Pulver
Description: Verify that a user is correctly identified and is tracked once the tracking has been
enabled.
Setup: The Kinect is powered on and connected to the laptop. Tracking is enabled and running.
Test | Action | Expected Output | Pass | Fail | N/A | Comments
1 | One user walks into the field of view | User 1 is tracked/detected
2 | One user is already in the FOV | User 1 is tracked/detected
3 | Two users walk into the FOV | User 1 and User 2 are tracked/detected
4 | Two users are already in the FOV | User 1 and User 2 are tracked/detected
5 | One user is in the FOV and another user walks into the FOV | User 1 and then User 2 are detected/tracked
Test Case Name/ID: PTGR Unit Test 02 – Range of Tracking/PTGRUT02
Test Writer: Aaron Pulver
Description: Verify that users are tracked at various distances from the Kinect sensor.
Setup: The Kinect is on and tracking is enabled and running. A user is selected/initialized and is
being tracked.
Step | User Action | Expected Result | Pass | Fail | N/A | Comment
1 | Walk to 1 m in front of the cart/sensor | The user is still tracked and the average distance reported is 1 m
2 | Walk to 3 m in front of the cart/sensor | The user is still tracked and the average distance reported is 3 m
3 | Walk to 5 m in front of the cart/sensor | The user is still tracked and the average distance reported is 5 m
4 | Walk to 10 m in front of the cart/sensor | The user is still tracked and the average distance reported is 10 m
5 | Walk to 5 m in front of the cart and then 5 m to the right | The user is still tracked and the average distance reported is approximately 7.07 m
6 | Walk to 5 m in front of the cart and then 5 m to the left | The user is still tracked and the average distance reported is approximately 7.07 m
Test Case Name/ID: PTGR Unit Test 03 – Basic Gesture Recognition/PTGRUT03
Test Writer: Aaron Pulver
Description: Verify that the four gestures are recognized within one second of their action.
Setup: The Kinect is connected and tracking is enabled and running. A user is selected/tracked
and is standing 3m from the sensor.
Step | Physical Gesture | Expected Result | Pass | Fail | N/A | Comment
1 | “Move Toward” | “Move Toward” recognized
2 | “Stop Moving” | “Stop Moving” recognized
3 | “Turn Around” | “Turn Around” recognized
4 | “Recalibrate” | “Recalibrate” recognized
Test Case Name/ID: PTGR Unit Test 04 – Recalibration of User/PTGRUT04
Test Writer: Aaron Pulver
Description: Verify that if a user goes out of the FOV and is not automatically recognized and
tracked upon re-entry, they can be recalibrated successfully.
Setup: Connect the Kinect and enable tracking. Disable auto-detection of users. The user
leaves the field of view for 10 seconds.
Step | User Action | Expected Result | Pass | Fail | N/A | Comment
1 | Re-enter the field of view and stand there for 3 seconds | A new user will be detected but not tracked
2 | Re-enter the field of view and perform the re-calibration gesture | A new user will be detected and then tracked
3 | Re-enter the field of view and then leave the field of view | A new user will be detected and then lost after approximately 3 seconds
Physical M.A.C.S. Assembly Unit Tests
Test Case Name/ID: PMA Unit Test 01 – Maximum Speed/PMAUT01
Test Writer: Gage Mondok
Description: Verify that the cart can reach the maximum speed
Setup: Drivetrain is complete, powered and supplied I/O
Test | Action | Expected Output | Pass | Fail | N/A | Comments
1 | Set PWM signal to 100% | Motors spin up
2 | Give 25 ft to reach speed | Speed maxed by 25 ft
3 | Measure time to travel 50 ft at speed | Time should be 11.35 s or less (3 mph)
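The 11.35 s threshold in step 3 follows from unit conversion: 3 mph is 4.4 ft/s, so covering 50 ft at exactly 3 mph takes just over 11.35 s, and finishing within the threshold guarantees the cart is at least that fast:

```python
# Convert 3 mph to ft/s, then compute the time to cover 50 ft at that speed.
ft_per_s = 3 * 5280 / 3600        # 3 mph in feet per second
print(round(ft_per_s, 2))         # 4.4
print(round(50 / ft_per_s, 2))    # 11.36
```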
Test Case Name/ID: PMA Unit Test 02 –Turning/PMAUT02
Test Writer: Gage Mondok
Description: Verify that the cart can perform turns
Setup: Drivetrain is complete, powered and supplied I/O
Test | Action | Expected Output | Pass | Fail | N/A | Comments
1 | Set PWM signal to 25% on the right motor and -25% on the left | Cart turns counter-clockwise about its center
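Equal and opposite duty cycles spin the wheels in opposite directions, turning the cart about its center. A sketch of the differential command (the function name and sign convention are illustrative):

```python
def tank_drive(turn_rate):
    """turn_rate in [-1, 1]; positive = counter-clockwise (right wheel forward)."""
    right_duty = max(-1.0, min(1.0, turn_rate))   # clamp to the valid PWM range
    left_duty = -right_duty
    return left_duty, right_duty

print(tank_drive(0.25))   # (-0.25, 0.25): the command used in this test
```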
Test Case Name/ID: PMA Unit Test 03 –Vertical Actuation/PMAUT03
Test Writer: Gage Mondok
Description: Verify that the basket can extend 6” up or down.
Setup: Linear actuator is powered and mounted under basket
Test | Action | Expected Output | Pass | Fail | N/A | Comments
1 | Add maximum load to basket | N/A
2 | Apply 12 VDC to actuator | Full extension of actuator
3 | Reverse polarity | Full retraction of actuator
Test Case Name/ID: PMA Unit Test 04 –Power Draw/PMAUT04
Test Writer: Gage Mondok
Description: Verify that the battery life is acceptable and power draw does not overheat the
controller
Setup: Drivetrain is complete, powered and supplied I/O
Test | Action | Expected Output | Pass | Fail | N/A | Comments
1 | Set PWM signal to 50% on the left motor and -50% on the right motor | Cart turns clockwise about its center
2 | Measure drain on battery after 5 minutes | N/A
3 | Extrapolate battery drain to 1 hour | Battery will last 1 hour or more
4 | Measure temperature of motor controller heatsink | Temperature is less than the manufacturer-specified maximum
M.A.C.S. Integration Tests
Test Case Name/ID: M.A.C.S. Integration Test 01 – Basic User Selection via Phone Data/MIT01
Description: Selecting the correct person to follow when the user leaves the field of view and
then re-appears.
Setup: Open the PID Android app and connect to the cart’s Bluetooth network. The cart should
be stationary, and have visual contact with the user. The cart should remain stationary. The
user will then leave the field of view.
Step | Action | Expected Result | Pass | Fail | N/A | Comment
1 | The user re-enters the FOV from the left side | The user will be identified and tracked as they re-enter the field
2 | The user re-enters the FOV from the right side | The user will be identified and tracked as they re-enter the field
3 | The user re-enters the FOV from the left and continues across and out of the FOV | The user will be identified and tracked as they re-enter the field, then lost as they leave the FOV
4 | The user re-enters the FOV from the right and continues across and out of the FOV | The user will be identified and tracked as they re-enter the field, then lost as they leave the FOV
Test Case Name/ID: M.A.C.S. Integration Test 02 – Advanced User Selection via Phone
Data/MIT02
Description: Selecting the correct person to follow, with a second person in view, when the user
leaves the field of view and then re-appears.
Setup: Open the PID Android app and connect to the cart’s Bluetooth network. The cart should
be stationary, and have visual contact with the user and a secondary user. The cart should
remain stationary. The user will then leave the field of view.
Step | Action | Expected Result | Pass | Fail | N/A | Comment
1 | The user re-enters the FOV from the left side | The user will be identified and tracked as they re-enter the field. The second user is not tracked.
2 | The user re-enters the FOV from the right side | The user will be identified and tracked as they re-enter the field. The second user is not tracked.
3 | The user re-enters the FOV from the left and continues across and out of the FOV | The user will be identified and tracked as they re-enter the field, then lost as they leave the FOV. The second user is not tracked.
4 | The user re-enters the FOV from the right and continues across and out of the FOV | The user will be identified and tracked as they re-enter the field, then lost as they leave the FOV. The second user is not tracked.
Test Case Name/ID: M.A.C.S. Integration Test 03 – Basic Gesture Recognition
Response/MIT03
Description: Tests the response of the cart when certain gestures are performed.
Setup: Open the PID Android app and connect to the cart’s Bluetooth network. The cart should
be stationary, and have visual contact with the user. The cart should have a fully charged
battery.
Step | Physical Gesture | Expected Result | Pass | Fail | N/A | Comment
1 | “Move Toward” when the cart is stationary | The cart moves toward the user until it reaches the threshold of 2 ft
2 | “Stop Moving” when the cart is moving toward a user or following him | The cart stops moving
3 | “Turn Around” when the cart is stationary | The cart turns 180 degrees
4 | “Recalibrate” at any time | The cart ignores the gesture since the user is already calibrated
Test Case Name/ID: M.A.C.S. Integration Test 04 – Initialization/MIT04
Description: Tests the initialization of the Bluetooth network, PID Android App, and Kinect
sensor tracking software.
Setup: Open the PID Android app and connect to the cart’s Bluetooth network. The user will
stand in front of the cart to be calibrated.
Step Action Expected Result Pass Fail N/A Comment
1 The user stands alone at a distance of 2 m from the Kinect sensor.
The cart will begin calibration and then begin tracking the user.
2 The user stands alone at a distance of 5 m in front of the Kinect sensor.
The cart will begin calibration and then begin tracking the user.
3 The user stands 2 m in front of the Kinect sensor while a second person stands 3 m in front of the Kinect sensor but 1 m to the left (relative to center).
The cart will begin calibration and then begin tracking the user. The second user is recognized but not identified as the person to track.
4 The user stands 2 m in front of the Kinect sensor while a second person stands 3 m in front of the Kinect sensor but 1 m to the right (relative to center).
The cart will begin calibration and then begin tracking the user. The second user is recognized but not identified as the person to track.
5 The user stands 2 m in front of the Kinect sensor while a second person stands 1 m in front of the Kinect sensor but 1 m to the right (relative to center).
The cart will begin calibration and then begin tracking the user. The second user is recognized but not identified as the person to track.
6 The user stands 2 m in front of the Kinect sensor while a second person stands 1 m in front of the Kinect sensor but 1 m to the left (relative to center).
The cart will begin calibration and then begin tracking the user. The second user is recognized but not identified as the person to track.
7 The user stands 2 m in front of the Kinect sensor while a second person stands 1 m in front of the first person.
The second user is tracked. The first user may be recognized but not tracked.
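The user-identification behavior exercised in steps 3 through 7 can be sketched as a nearest-to-calibration match over the skeletons the sensor reports. The following is a minimal illustration in Python; the function name, the (x, z) floor-coordinate representation, and the 0.5 m matching threshold are assumptions for illustration, not the project's actual tracking code.

```python
import math

def select_tracked_user(skeletons, calibrated_pos, max_offset_m=0.5):
    """Return the index of the skeleton closest to the calibrated user's
    last known (x, z) floor position, or None if nobody is close enough.
    Other skeletons are still recognized, just not selected for tracking."""
    best_idx, best_dist = None, max_offset_m
    for idx, (x, z) in enumerate(skeletons):
        dist = math.hypot(x - calibrated_pos[0], z - calibrated_pos[1])
        if dist <= best_dist:
            best_idx, best_dist = idx, dist
    return best_idx

# Step 6 above: the user calibrated at 2 m straight ahead, with a second
# person standing 1 m in front of the sensor and 1 m to the left.
people = [(0.0, 2.0), (-1.0, 1.0)]
print(select_tracked_user(people, (0.0, 2.0)))  # -> 0
```

Step 7 (one person occluding the other) is the case such a simple position match cannot resolve, which is why the expected result there allows the wrong person to be tracked.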
Test Case Name/ID: M.A.C.S. Integration Test 05 – Gesture Collision Avoidance /MIT05
Description: Tests the collision avoidance of the cart when responding to gestures.
Setup: The cart should be stationary, and have visual contact with the user. The cart should
have a fully charged battery. A chair will be placed on all four sides of the cart approximately 5
inches from the cart.
Step Physical Gesture Expected Result Pass Fail N/A Comment
1 “Move Toward” The cart moves to within two inches of the chair in front of it.
2 “Stop Moving” when the cart is moving toward a user or following him.
Cart remains stationary.
3 “Turn Around” when the cart is stationary.
The cart attempts to turn around. If any corner of the cart is within 2 inches of a chair, the operation stops.
4 “Recalibrate” at any time.
The cart remains stationary.
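The behavior in this table amounts to gating each motion gesture behind a per-corner clearance check against the 2-inch stop threshold. A minimal Python sketch follows; the function names, the clearance-list representation, and the gesture strings are illustrative assumptions rather than the project's firmware interface.

```python
def safe_to_move(corner_clearances_in, min_clearance_in=2.0):
    """True if every corner of the cart has more than min_clearance_in
    inches of free space to the nearest obstacle."""
    return all(c > min_clearance_in for c in corner_clearances_in)

def handle_gesture(gesture, corner_clearances_in):
    """Execute motion gestures only when all corners are clear;
    'recalibrate' never moves the cart, so it is always allowed."""
    if gesture == "recalibrate":
        return "recalibrate"
    return gesture if safe_to_move(corner_clearances_in) else "stop"

# Step 3 above: one corner of the cart is within 2 inches of a chair,
# so the turn-around operation stops.
print(handle_gesture("turn_around", [5.0, 5.0, 1.5, 5.0]))  # -> stop
```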
Test Case Name/ID: M.A.C.S. Integration Test 06 – Collision Detection /MIT06
Description: Tests the collision detection and avoidance of the cart while it follows a user.
Setup: The cart should be stationary, and have visual contact with the user. The cart should
have a fully charged battery. The user will walk down a hallway.
Step Action Expected Result Pass Fail N/A Comment
1 The user walks straight.
The cart follows the user and remains approximately 2 ft behind him.
2 The user walks straight, turns right around a corner
The cart follows the user, attempts to turn right around the corner, stops short of the turn because it is too close to the wall.
3 The user walks straight, turns left around a corner
The cart follows the user, attempts to turn left around the corner, stops short of the turn because it is too close to the wall.
4 An object is placed between the user and the cart. The user walks forward.
The cart follows the user until it is within 2 inches of the object. Then it stops.
5 An object is placed behind the cart. The user walks toward the cart.
The cart backs up until it is within 2 inches of the object. Then it stops.
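Steps 1 through 5 exercise one underlying rule: the 2-inch obstacle clearance overrides the 2-foot follow distance in either direction of travel. That rule can be sketched as a single decision function; the names and units below are assumptions for illustration, not the cart's actual control code.

```python
def drive_command(user_dist_ft, obstacle_dist_in=None,
                  follow_dist_ft=2.0, stop_dist_in=2.0):
    """Pick the cart's next motion. The obstacle check wins over the
    follow-distance target, in both directions of travel."""
    if obstacle_dist_in is not None and obstacle_dist_in <= stop_dist_in:
        return "stop"                  # within 2 in of an obstacle
    if user_dist_ft > follow_dist_ft:
        return "forward"               # close the gap back to ~2 ft
    if user_dist_ft < follow_dist_ft:
        return "reverse"               # user walked toward the cart
    return "stop"

# Step 4 above: an object sits between the user and the cart.
print(drive_command(6.0, obstacle_dist_in=1.5))  # -> stop
```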
M.A.C.S. Acceptance Tests
Test Case Name/ID: M.A.C.S. Acceptance Test 01 – Follow a user in a straight line,
Unloaded/MAT01
Description: Tests the engineering requirements: the system must be able to move at up to 5
MPH, and the system must be no more than two feet from the followed individual at any time.
Setup: Open the PID Android app and connect to the cart’s Bluetooth network. The cart should
be stationary, and have visual contact with the user. The cart should have a fully charged
battery. The cart should be empty.
Step Action Expected Result Pass Fail N/A Comment
1 The user should walk forward at a regular pace
The cart will follow while maintaining a two foot distance
2 The user will stop moving
The cart will stop two feet behind the user
3 The user will begin moving again, but at a slow pace
The cart will follow at approx. the same pace, while maintaining a two foot distance
4 The user will gradually speed up to a fast pace (approx. 5 MPH)
The cart will also speed up, and maintain a two foot distance
5 The user will suddenly stop
The cart will stop, maintaining a two foot distance.
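The "PID Android App" named in the setup suggests the two-foot follow distance is held by a PID loop on the range error. The sketch below is a self-contained illustration of that idea; the gains, the 0.1 s loop period, and the 7.3 ft/s cap (about 5 MPH) are assumed values, not the gains tuned for the actual cart.

```python
class PIDFollower:
    """PID loop on the distance error between the cart and the user.
    Positive output means drive forward; output is capped at max_speed."""
    def __init__(self, kp=1.2, ki=0.05, kd=0.3,
                 setpoint_ft=2.0, max_speed_fps=7.3):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint_ft
        self.max_speed = max_speed_fps
        self.integral = 0.0
        self.prev_error = None

    def update(self, measured_dist_ft, dt=0.1):
        error = measured_dist_ft - self.setpoint   # > 0: cart lags behind
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        speed = self.kp * error + self.ki * self.integral + self.kd * deriv
        return max(-self.max_speed, min(self.max_speed, speed))

pid = PIDFollower()
print(pid.update(6.0))   # user 6 ft away -> positive output, drive forward
print(pid.update(2.0))   # back at the setpoint -> output near zero
```

The speed cap is what makes steps 4 and 5 meaningful: the cart matches the user up to the 5 MPH requirement and decelerates through the same loop when the user stops.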
Test Case Name/ID: M.A.C.S. Acceptance Test 02 – Follow a user in a straight line,
Loaded/MAT02
Description: Tests the engineering requirements: the system must be able to move at up to 5
MPH, this speed must be attainable with up to a 30 lb. load, and the system must be no more
than two feet from the followed individual at any time.
Setup: Open the PID Android app and connect to the cart’s Bluetooth network. The cart should
be stationary, and have visual contact with the user. The cart should have a fully charged
battery. There should be a load of approximately 30 pounds in the cart.
Step Action Expected Result Pass Fail N/A Comment
1 The user should walk forward at a regular pace
The cart will follow while maintaining a two foot distance
2 The user will stop moving
The cart will stop two feet behind the user
3 The user will begin moving again, but at a slow pace
The cart will follow at approx. the same pace, while maintaining a two foot distance
4 The user will gradually speed up to a fast pace (approx. 5 MPH)
The cart will also speed up, and maintain a two foot distance
5 The user will suddenly stop
The cart will stop, maintaining a two foot distance.
Test Case Name/ID: M.A.C.S. Acceptance Test 03 – Follow a user along a path/MAT03
Description: Tests the engineering requirement that the system autonomously follow an
individual throughout a flat environment with obstacles.
Setup: Open the PID Android app and connect to the cart’s Bluetooth network. The cart should
be stationary, and have visual contact with the user. The cart should have a fully charged
battery.
Step Action Expected Result Pass Fail N/A Comment
1 The user should walk forward at a regular pace
The cart will follow while maintaining a two foot distance
2 The user will turn a sharp corner that obscures the cart’s view of the user
The cart will move forward and turn the corner
3 The user will turn around and head back towards the cart
The cart will remain stationary
4 The user will move past the cart and move back to the starting position
The cart will turn around and follow the user back to the starting position
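The four steps above can be read as one decision rule evaluated each frame: seek the last seen position when the user leaves the field of view, turn around when the user has walked past the cart, and otherwise follow or hold based on distance. A hedged Python sketch follows; the function name, the bearing convention, and the action strings are assumptions, not the project's actual behavior code.

```python
def cart_action(user_visible, dist_ft, bearing_deg, follow_dist_ft=2.0):
    """One decision step of the follow behavior in this test. bearing_deg
    is the user's angle off the cart's forward axis; |bearing| > 90 means
    the user has moved past the cart (step 4)."""
    if not user_visible:
        return "seek_last_seen_position"   # step 2: go turn the corner
    if abs(bearing_deg) > 90.0:
        return "turn_around"               # step 4: follow the user back
    if dist_ft > follow_dist_ft:
        return "follow"
    return "hold"                          # step 3: user approaching the cart

print(cart_action(False, 5.0, 0.0))  # -> seek_last_seen_position
```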
Test Case Name/ID: M.A.C.S. Acceptance Test 04 – General Use Case/MAT04
Description: Tests a general use case for the system.
Setup: Open the PID Android app and connect to the cart’s Bluetooth network. The cart should
be stationary, and have visual contact with the user. The cart should have a fully charged
battery. There should be a ten to fifteen pound load in the cart.
Step Action Expected Result Pass Fail N/A Comment
1 The user should walk forward at a regular pace
The cart will follow while maintaining a two foot distance
2 The user will turn a sharp corner that obscures the cart’s view of the user. Two non-users will begin walking next to the user
The cart will move forward and turn the corner
3 The user and one non-user will turn right, while the other non-user will continue to walk straight
The cart turns right.
4 The user stops; the non-user continues walking straight
The cart will stop
5 The user will pick something up, weighing approx. five pounds, and walk towards the cart
The cart will remain still
6 The user will place the object in the cart, and then continue moving straight
The cart will follow the user.