
REPORT NUMBER:

Callaghan Innovation Report 343

TITLE:

Investigative Study: Onion Yield Monitoring

CLIENT’S NAME:

Page Bloomer Associates

CONFIDENTIAL TO:

Page Bloomer Associates

DATE ISSUED:

03/05/2016

PREPARED BY:

Chee Kit Wong

PEER REVIEWED BY: Patrick Lim and Michael Langley

Written permission is required if this report is to be reproduced for promotional purposes or in part. The name of Callaghan Innovation Limited may not be used in advertising without consent. Any samples received by Callaghan Innovation Limited in connection with this investigation that are not reclaimed by the client within 8 weeks of the date of issue of this report will be disposed of.


Contents

Introduction
    Objective: Investigative study
        Task 1: Scoping
        Task 2: Measurement under Controlled Environment
        Task 3: On-Harvester Field Measurements
    3D sensing technologies
Experimental Results
    Measurement under Controlled Environment
        a) Static Objects
        b) Measurement of Stationary objects with vibration
        c) Measurement of Moving objects with vibration
    On-Harvester Field Measurements
Conclusions


Introduction

Page Bloomer Associates would like to develop a system to monitor onion yield during

harvesting. Currently, onions are batch-weighed using load cells at the pack house, but this approach requires constant recalibration and is highly affected by dirt build-up in the device. Furthermore, it does not provide information on the yield of the various parts of the farm. The client has therefore proposed to develop an on-harvester yield monitoring system that estimates the volume of onions using 3D sensing technologies. The approach is to use a low-cost 3D sensor to measure the cross-sectional dimensions of each onion. Assuming onions to be spherical (or other appropriate models), the volume and, ultimately, the weight can then be estimated through density calculations.
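The proposed estimation chain (measured cross-section, sphere volume, then weight via density) can be sketched as below. This is a minimal illustration, not the report's implementation; the density value in particular is an assumed placeholder, not a measured property.

```python
import math

def onion_weight_g(diameter_mm: float, density_g_cm3: float = 0.95) -> float:
    """Estimate onion weight from a measured cross-sectional diameter.

    Assumes the onion is a sphere. density_g_cm3 is an assumed
    placeholder value, not a figure from this study.
    """
    radius_cm = diameter_mm / 20.0  # mm diameter -> cm radius
    volume_cm3 = (4.0 / 3.0) * math.pi * radius_cm ** 3
    return volume_cm3 * density_g_cm3
```

For example, a 64mm bulb gives a volume of roughly 137cm³, or about 130g at the assumed density.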

Objective: Investigative study

The objective of this project is to investigate the suitability of using low-cost 3D sensing

devices, and in particular, the Microsoft Kinect camera to measure the dimensions of onions.

The investigative study was broken down into the following tasks; each task is described below, whilst the methodology and experimental results are presented in the experimental section.

Task 1: Scoping

The team from Callaghan Innovation made multiple visits to Wilcox Farm to observe farm

operations and harvesting machinery. From these visits and in consultation with Dan

Bloomer and Justin Pishief, the team formulated the requirements and constraints for this

investigation, thus enabling the development of a sensing platform that is suitable for initial

analysis on a harvesting machine. For this investigation, we focused only on Wilcox’s

harvesting machine.

Task 2: Measurement under Controlled Environment

From a sensing point of view, the first step is to determine device accuracy and repeatability when acquiring measurements under a controlled environment.

Three different situations were investigated to ascertain sensor suitability:

1) Measurement of static objects

2) Measurement of stationary objects with vibration

3) Measurement of moving objects with vibration

For each, onions of different sizes and shapes, as well as control objects (tennis balls) were

measured using the Kinect and compared to actual values.

Task 3: On-Harvester Field Measurements

Based on client requirements and operating constraints, a prototype test system was

developed and built for on-harvester experiments. The objective of this task is to observe

device performance in the practical environment, as well as the validity of any assumptions used

for this investigation. Hence, the primary focus here is to observe data integrity, rather than

measurement validation.

All results were then analysed and findings summarised in this report.


3D sensing technologies

Recent developments in low-cost 3D sensing devices have increased the accessibility of the technology. One of the more mature products on the market is the PrimeSense-based Microsoft Kinect, which costs around NZD$300. This device is popular due to a number of

factors:

1. Cost

2. Reliability (indoors only)

3. Accuracy (indoors only)

4. Data accessibility for application development

5. Minimal sensor distortion

In addition to the RGB camera, the Kinect is also equipped with an IR emitter and a depth

sensor. As IR light beams are emitted, the depth sensor reads any signals that are reflected

back to the sensor. This information is effectively the distance between the sensor and the

object in the field of view, resulting in a depth image.

Other devices in the market that use the PrimeSense sensor include:

1. Xtion by ASUS

2. Structure Sensor

3. Carmine 1.08

4. Carmine 1.09 (Short Range)

In addition, a number of 3D sensors have been released recently that use slightly different technology, but still address the low-cost 3D sensing space. These include:

1. Intel RealSense

2. Microsoft Kinect v2

3. Stereolab ZED camera

4. PMD CamBoard nano

A few of these devices were briefly investigated during this study and are discussed in the

experimental section.

Figure 1: The Microsoft Kinect camera, extracted from the Microsoft MSDN website (https://msdn.microsoft.com/en-us/library/jj131033.aspx)


Experimental Results

This section provides a summary of experimental methodology and results, undertaken to

ascertain the suitability of the Microsoft Kinect to reliably measure onion dimensions.

Measurement under Controlled Environment

a) Static Objects

Objects were placed on a table, directly beneath the Microsoft Kinect at a distance of

approximately 850mm, similar in separation to the on-harvester setup. In total, four objects

were used: three onions of different shapes and sizes and a tennis ball as the control object.

There are a number of methods for estimating onion dimensions. The following approaches

were investigated as part of this experiment:

Relative Depth

This approach estimates the diameter of the onions by finding the difference in depth

between the highest point of the crop (df) and the depth of the background (db) [see Figure

2(a)]. An algorithm was implemented to automatically localise and calculate the cross-

sectional length of the bulb. The volume of each object can then be estimated by assuming

the onion to be a sphere.

For the depth of the background (db), a fixed distance (the physical measurement between the camera plane and the table) was initially used, but this was soon found to be too inaccurate. As can be seen in Figure 3, the volume estimates ranged from 145cm³ to 149cm³, whereas the actual volume is 134cm³. To overcome this problem, a depth image of the table (without any objects) was captured and subsequently used for computing foreground dimensions. This approach significantly increased the accuracy of the system, as can be seen in Figure 4.
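A sketch of this background-image variant of the relative depth approach is given below, assuming per-pixel depth images in millimetres from a downward-facing camera. The foreground threshold and the simple minimum-depth rule are illustrative assumptions, not the algorithm as implemented.

```python
import numpy as np

def diameter_from_relative_depth(depth_mm: np.ndarray,
                                 background_mm: np.ndarray,
                                 min_height_mm: float = 10.0) -> float:
    """Estimate bulb diameter as the difference between the background
    depth (db) and the highest point of the crop (df).

    Pixels closer to the camera than the stored background image by
    more than min_height_mm are treated as foreground.
    """
    height = background_mm - depth_mm    # rise of each pixel above the table
    foreground = height > min_height_mm
    if not foreground.any():
        return 0.0
    df = float(depth_mm[foreground].min())        # closest point of the crop
    db = float(background_mm[foreground].mean())  # local background depth
    return db - df                                # sphere assumption: height ~ diameter

# Synthetic check: a flat table at 850mm with a 64mm "bulb" placed on it.
background = np.full((48, 64), 850.0)
frame = background.copy()
frame[20:28, 30:38] = 850.0 - 64.0
print(round(diameter_from_relative_depth(frame, background)))  # -> 64
```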

To estimate the depth of observed objects (df), two algorithms were implemented and

compared:

a) Peak depth value around the region of interest of the onion (PV)

b) Depth at centroid of detected objects (CV)
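The two estimators for df can be sketched as follows, assuming a boolean mask marks the region of interest of a detected onion. This is an illustrative reconstruction, not the code used in the experiments.

```python
import numpy as np

def peak_depth(depth_mm: np.ndarray, mask: np.ndarray) -> float:
    """PV: the closest (minimum) depth reading inside the region of interest."""
    return float(depth_mm[mask].min())

def centroid_depth(depth_mm: np.ndarray, mask: np.ndarray) -> float:
    """CV: the depth sampled at the centroid pixel of the detected object."""
    rows, cols = np.nonzero(mask)
    r = int(round(float(rows.mean())))
    c = int(round(float(cols.mean())))
    return float(depth_mm[r, c])
```

For a smooth, centred sphere the two estimates coincide; they diverge when the bulb is irregular or its highest point is off-centre, which is consistent with the wider spread observed for the CV approach.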

(a) (b)

Figure 2: Diameter estimate of onions from the relative depth approach. (a) On a flat surface this method is straightforward; however in reality, (b) the onions move along a conveyor that consists of rods and gaps, and hence the depth of the background is not constant, making this approach unreliable.


Figure 3: Static objects close together. Volume estimated automatically using the diameter at the centroid and a fixed background depth. Actual volume is 134cm³.

Once the two depths are known, the dimension of each object can be computed. Results

[see Figure 4] show that this approach is accurate and reliable, in particular when locating

the peak value around the region of interest (in blue). Using the PV approach, the volume ranged from 130cm³ to 136cm³, whilst the CV approach produced volumes ranging from 115cm³ to 134cm³. Note that one anomalous reading was removed, as the tennis ball was partially out of the camera’s field of view, hence producing a smaller-than-actual reading.

However, the approach of using relative depth relies heavily on a stable background. Any

deviation from the known background depth will skew estimates significantly. Therefore,

even though this approach shows promise in the controlled environment, it would not be

reliable in the practical world where:

a) onions are not necessarily in contact with the conveyor – onions were observed to be

“bouncing around” on numerous occasions

b) the conveyor system is not flat (consists of cylindrical rods) and hence onions can lie

in different positions relative to the conveying system [see Figure 2(b)]


Figure 4: Volume estimates of static objects at various locations. The tennis ball is shown at the location of the latest reading; readings without a tennis ball present signify past readings taken before the objects were moved. Blue values are computed using the peak value and red values are computed using the diameter estimate at the centroid. Actual volume is 134cm³.

3D Point Cloud

This approach measures the diameter of each onion by finding the spatial distance between

two points on the edges of the subject. It is envisaged that the algorithm would roughly be:

a) Locate each onion using depth data and morphology operations

b) Compute minor axis of each onion [see Figure 5]

c) Determine intersection points of minor axis to the boundary of the onion [see

Figure 5]

d) Extract 3D data at intersection points

e) Compute spatial distance between the intersection points
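Steps d) and e) above can be sketched with a standard pinhole deprojection. The focal length and principal point below are nominal values assumed for illustration; calibrated Kinect intrinsics would be required in practice.

```python
import math

# Nominal depth-camera intrinsics (assumed for illustration, not calibrated).
FX = FY = 585.0          # focal length in pixels
CX, CY = 320.0, 240.0    # principal point of a 640x480 depth image

def deproject(u: float, v: float, depth_mm: float) -> tuple:
    """Step d): convert a pixel (u, v) and its depth into a 3D point (mm)."""
    x = (u - CX) * depth_mm / FX
    y = (v - CY) * depth_mm / FY
    return (x, y, depth_mm)

def spatial_distance(p1: tuple, p2: tuple) -> float:
    """Step e): Euclidean distance between the two intersection points."""
    return math.dist(p1, p2)

# Two boundary points of a bulb observed 44 pixels apart at ~850mm:
a = deproject(300, 240, 850.0)
b = deproject(344, 240, 850.0)
print(round(spatial_distance(a, b), 1))  # -> 63.9
```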

This approach is straightforward when the onions are individualised (i.e. not in contact with one another). However, when packed together, it is very challenging to accurately determine the boundaries of each object. Part of the problem is that traditional image processing cannot be used to segregate each onion, as the background (conveyor system) is frequently contaminated with dirt [see Figure 6] and hence has the same colour as the onions.

Consequently, the implementation of an algorithm to accurately localise and measure onions is out of the scope of this investigation. The objective of this experiment is to determine whether point cloud data from the Kinect provides a true indication of crop dimensions, assuming the intersection points can be identified. For this experiment, a GUI was developed to display

Kinect colour and depth images, enabling the user to define locations where measurements

should be taken. An example of this is shown in Figure 7, where the selected points are

marked as red dots. Table 1 summarises measurements acquired for each object seen in

Figure 7.


Figure 5: Example of major (red) and minor (blue) axes of an onion; and intersection points between the minor-axis to the boundary of the onion (blue dots).

Figure 6: Conveying system used on the harvester

Figure 7: GUI developed to measure dimensions based on points selected manually by the user. In this screen capture, the length across the bulb of the bottom right onion was measured to be 73mm, compared to the actual length of 76mm.

Table 1: Summary of objects used for the experiment, with actual dimensions and Kinect-measured dimensions

Id   Object               Location       Actual length across bulb (mm)   Kinect length across bulb (mm)
1    Tennis Ball          Top Left       64                               65
2    Round Onion          Top Right      74                               76
3    Small Oblong Onion   Bottom Left    55                               55
4    Large Oblong Onion   Bottom Right   76                               73


Multiple experiments were undertaken with the same objects placed at different positions and orientations relative to the field of view of the Kinect camera, and measurements show that the Kinect was within +/-5mm of the actual cross-sectional length. In part, the error was due to how accurately the user selected the measurement points, but this is a useful indication that the approach depends on reliable localisation of the subject boundary for accurate measurements.

Dropping Onion

One concept investigated was to measure onions just as they drop from the conveyor system into the collecting bin. The advantage of this approach is that the image background can be controlled and is hence free from the conveyor rods. However, there are a

couple of assumptions that may not be practical in the “real world”:

a. Whilst the approach is not confined to one onion per frame, one has to ensure that the onions

have spatial separation during the fall.

b. Certain orientations are more desirable as they provide more precise

measurements of the cross-section length.

To investigate this approach, the Kinect camera was mounted on a tripod, with the image

plane orientated at a right angle to the ground. Whilst the program was continuously capturing

each frame, a tennis ball was dropped in front of the camera field of view. As can be seen in

Figure 8, it was soon evident that the Kinect camera is not able to cope with objects

travelling at speed. As the object drops towards the ground, the depth image captures a trail

of depth readings from the object, forming an elongated observation.

Due to this finding, an investigation was undertaken into the performance of other 3D sensors with respect to subject movement. Various experiments were undertaken with:

a) ASUS Xtion

b) Microsoft Kinect V2

c) Intel RealSense

As expected, the ASUS Xtion camera had similar issues with moving objects. This is not surprising, as the two devices share the same PrimeSense sensor. The Microsoft Kinect V2 and Intel RealSense, though, did not have the same problem. However, their depth data is less accessible than that of the Microsoft Kinect, and a significant amount of work would be required to extract and process the data.

A visit to Wilcox farm was made to determine conveyor speed. Whilst there are no hard

velocity measures, “typical” conveyor travel was observed to be around 0.5m/s. However,

the velocity can be dialled up or down depending on the driver and there are no firm rules on

the operating or maximum conveyor speed. For the purpose of this study, it is assumed that

typical velocity will be 0.5m/s.
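As a rough check of what this speed means for the sensor, the Kinect's depth stream runs at 30 frames per second, so an object on the conveyor moves an appreciable distance between consecutive frames:

```python
FPS = 30.0         # Kinect v1 depth stream frame rate
SPEED_M_S = 0.5    # assumed typical conveyor speed

displacement_mm = SPEED_M_S * 1000.0 / FPS
print(f"{displacement_mm:.1f} mm of travel between consecutive frames")  # -> 16.7 mm
```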

A six-axis robot was used to validate data integrity when capturing objects moving at 0.5m/s, and results showed no data degradation up to 1m/s. Hence, the decision was made to continue

trials using the Microsoft Kinect camera. As for the falling object concept, if the

aforementioned constraints can be met, the concept will work well as objects are physically

segmented in space.


(a) (b)

Figure 8: Effect of velocity on the depth image. (a) RGB image showing a dropped tennis ball moving across the Kinect frame and (b) the elongated trail of the tennis ball, which gives a false impression of the shape of the object.

b) Measurement of Stationary objects with vibration

Due to the environment and application, the sensing system must be capable of operating

effectively in the presence of vibration. The objective of this task is to determine the effects

(if any) of vibration on volume estimation using the collected spatial data.

To achieve this, the objects were placed inside a plastic bin and suspended from a

stationary six-axis robot. A pneumatic vibrator was mounted underneath the bin to provide

vibration with a range of frequencies [see Figure 9]. A number of experiments were

conducted with all results showing no discernible differences compared to measurements

taken of the same objects without vibration.

Figure 9: Physical setup to ascertain effects of vibration on Kinect data

c) Measurement of Moving objects with vibration

The final controlled experiment involves taking measurements of objects in motion, in the

presence of vibration. For this test, the setup is the same as the vibration experiment [Figure 9]. However, this time around, the robot is used to move the bin (with objects) across the

field of view of the Kinect camera. A program was written to move the robot at user-defined speeds ranging from 0.5m/s to 1m/s, whilst another program was written to continuously capture and store data frames for later analysis. Additional objects were also

added in the bin but were not measured in this experiment.

Once the capture was completed, the frame that best centres the bin was picked for analysis. As before, the user was asked to select the two points of interest. The result for the experiment shown in Figure 10 (colour image) and Figure 11 (depth image) is presented in Table 2. The experiment was repeated multiple times, both at 0.6m/s and 0.8m/s, with measurements observed to be within +/-6mm of the actual cross-sectional length. Whilst the experiments produced good results, it is important to note that the measurements are highly dependent on the ability to locate good measuring points. Results observed here verify

that the data produced by the Kinect camera retains its integrity even when objects are

moving at 0.8m/s, but data quality degraded with velocities beyond 1m/s. Essentially, the

objects started to shrink in size and were no longer a true representation of the physical

world.

Table 2: Cross-section length as measured by the Microsoft Kinect through user selection, compared to actual measurements

Id   Object               Actual length across bulb (mm)   Kinect length across bulb (mm)
1    Tennis Ball          64                               64.6
2    Round Onion          74                               73.5
3    Small Oblong Onion   55                               59
4    Large Oblong Onion   76                               75
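The Table 2 readings can be cross-checked against the +/-6mm observation with a short script:

```python
# Actual vs Kinect-measured cross-section lengths from Table 2 (mm).
measurements = {
    "Tennis Ball":        (64, 64.6),
    "Round Onion":        (74, 73.5),
    "Small Oblong Onion": (55, 59.0),
    "Large Oblong Onion": (76, 75.0),
}

errors = {name: kinect - actual for name, (actual, kinect) in measurements.items()}
worst = max(abs(e) for e in errors.values())
print(f"largest error: {worst} mm")  # -> largest error: 4.0 mm
```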

Figure 10: RGB capture of the bin and objects as it moves across the Kinect’s field of view at a velocity of 0.8m/s. The image is blurry due to the movement of the objects.


Figure 11: Depth capture of the bin and objects as it moves across the Kinect’s field of view at a velocity of 0.8m/s. The objects of interest are labelled 1-4, with the corresponding measurements shown in Table 2. For object 1 the sampled points are marked by red dots.

On-Harvester Field Measurements

For on-harvester experiments, an enclosure was designed [see Figure 12], built and

integrated onto the frame of the harvesting conveyor system [see Figure 13]. Primarily, the

enclosure provides shelter from sunlight for the Kinect to operate. Moreover, it also

significantly reduced floating debris obstructing the camera from seeing the objects. Dust

and floating debris was observed to be a significant problem on the farm.

Figure 12: CAD drawing of the enclosure frame integrated into the conveyor system of the onion harvester, with the camera mounted within


Figure 13: Image showing the prototype system integrated onto the harvester, ready for experiments

During the on-harvester experiments, over 2000 sets of colour and depth images were

collected. A sample of three sets of data is shown in Figure 14. From visual observation, the

objects look to be in the right proportion in the depth image. Furthermore, manual

measurements extracted from the point cloud data are 55mm, 51mm and 65mm

respectively. Whilst these values were not verified with physical measurements, they fall

within the typical ranges and therefore seem plausible. Further experiments will be required

to verify these measurements properly. It is also worth noting that depth data completely

disappears when the scene is saturated with sunlight.

(a) (b)


(c) (d)

(e) (f)

Figure 14: Sample of three sets of colour and depth images from the Kinect camera. From visual observation, the onions look proportionally correct in the depth image. Measurements extracted from the point cloud data (red dots) are (b) 55mm, (d) 51mm and (f) 65mm.

Further samples of interesting cases are presented in Figure 15 - Figure 18 to highlight the

difficulties in achieving a reliable system in the practical world. It is important to note that

these problems were observed regularly and therefore should be a vital part of the

considerations in the next phase of development.

(a) (b)

Figure 15: Jumping onion occluding onions beneath.


(a) (b)

Figure 16: Onion leaves were not successfully cropped.

(a) (b)

Figure 17: Onions stacked on top of each other.

(a) (b)

Figure 18: Weeds as part of the harvest.

Conclusions

The objective of this investigation was to determine the suitability of low-cost 3D sensing devices, and in particular the Microsoft Kinect camera, for estimating the yield of onions during harvesting. Whilst the Kinect has shown versatility in numerous applications, it was not known whether it is suitable for the client’s application of interest.


The first step was to investigate the sensor in a controlled environment. The Kinect camera was used to measure onions and control objects when stationary, under vibration and in motion. Results show that the cross-sectional diameter of the onion bulb was measured to within +/-6mm of actual values.

Once this was completed, a full enclosure was designed and fabricated so that real-world experiments could be undertaken on the harvester during the harvesting process. Whilst the Kinect data could not be validated against actual measurements, the captured data showed that the sensor retained its integrity and the objects were of the correct proportions and sizes. Part of the next step forward would be to verify that these values are accurate.

It is important to note that a number of problems were observed regularly during on-

harvester trials. These include:

a. Onions were stacked on top of one another

b. Onions “jumped” around and occluded other onions

c. Leaves of the onion crop were not removed, causing occlusion

d. Foreign objects such as weeds and floating debris were present

e. Dust may cover the camera sensor

Hence, it is important to ensure these problems are solved for any spatial sensing approach

to work reliably.