
Available online at www.sciencedirect.com

ScienceDirect

SoftwareX xx (xxxx) xxx–xxx
www.elsevier.com/locate/softx

anyFish 2.0: An open-source software platform to generate and share animated fish models to study behavior

Spencer J. Ingley a,∗, Mohammad Rahmani Asl b, Chengde Wu b, Rongfeng Cui c, Mahmoud Gadelhak b, Wen Li d, Ji Zhang d, Jon Simpson b, Chelsea Hash e, Trisha Butkowski b, Thor Veen f,g, Jerald B. Johnson a,h, Wei Yan b, Gil G. Rosenthal c

a Evolutionary Ecology Laboratories, Department of Biology, Brigham Young University, Provo, UT, USA
b Department of Architecture, Texas A&M University, College Station, TX, USA
c Department of Biology, Texas A&M University, College Station, TX, USA
d Department of Computer Science, Texas A&M University, College Station, TX, USA
e Lively Disposition, Cohoes, NY, USA
f The Biodiversity Research Centre, University of British Columbia, Vancouver, British Columbia, Canada
g Department of Integrative Biology, University of Texas at Austin, One University Station C0990, Austin, TX, 78712, USA
h Monte L. Bean Life Science Museum, Brigham Young University, Provo, UT, USA

Received 13 May 2015; received in revised form 6 October 2015; accepted 7 October 2015

Abstract

Experimental approaches to studying behaviors based on visual signals are ubiquitous, yet these studies are limited by the difficulty of combining realistic models with the manipulation of signals in isolation. Computer animations are a promising way to break this trade-off. However, animations are often prohibitively expensive and difficult to program, thus limiting their utility in behavioral research. We present anyFish 2.0, a user-friendly platform for creating realistic animated 3D fish. anyFish 2.0 dramatically expands anyFish’s utility by allowing users to create animations of members of several groups of fish from model systems in ecology and evolution (e.g., sticklebacks, Poeciliids, and zebrafish). The visual appearance and behaviors of the model can easily be modified. We have added several features that facilitate more rapid creation of realistic behavioral sequences. anyFish 2.0 provides a powerful tool that will be of broad use in animal behavior and evolution and serves as a model for transparency, repeatability, and collaboration.
© 2015 Published by Elsevier B.V.

Keywords: Animal communication; Animation; Video playback; Teleostei

Code metadata

Current code version: anyFish v. 2.0
Permanent link to code/repository used for this code version: https://github.com/ElsevierSoftwareX/SOFTX-D-15-00014
Legal code license: GNU General Public License (http://www.gnu.org/copyleft/gpl.html)
Code versioning system used: Git (GitHub)
Software code languages, tools, and services used: Unity Personal, Unity Professional, C#, JavaScript, MATLAB
Compilation requirements, operating environments & dependencies: Microsoft Visual Studio, MonoDevelop, Unity
If available, link to developer documentation/manual: http://swordtail.tamu.edu/anyfish/AnyFish Unity Quickstart Guide
Support email for questions: [email protected]


∗ Correspondence to: 401 WIDB, Brigham Young University, Provo, UT 84602, USA. Tel.: +1 352 278 2705; fax: +1 801 422 0090.
E-mail address: [email protected] (S.J. Ingley).
http://dx.doi.org/10.1016/j.softx.2015.10.001
2352-7110/© 2015 Published by Elsevier B.V.


Software metadata

Current software version: anyFish v. 2.0
Permanent link to executables of this version: http://swordtail.tamu.edu/anyfish/AnyFish Editor Program Download; https://github.com/anyFish-Editor/anyFish-2.0
Legal software license: GNU General Public License (http://www.gnu.org/copyleft/gpl.html)
Computing platforms/operating systems: Windows (on a Windows machine, or in a virtual machine application such as Parallels Desktop for Mac with Windows installed)
Installation requirements & dependencies: Windows operating system (as above); MATLAB Runtime 2012a or newer
If available, link to user manual (if formally published, include a reference to the publication in the reference list): http://swordtail.tamu.edu/anyfish/AnyFish User Manual
Support email for questions: [email protected]

1. Motivation and significance

Communication is of fundamental interest in the study of animal behavior [1]. Due to the complex nature of animal communication, teasing apart the role of individual signals is often experimentally difficult. Studies often rely on our ability to use naturally occurring signal variation or to experimentally manipulate and present signals (i.e., video or audio stimuli) to receivers (i.e., live study animals) in a controlled environment. Although desirable in many cases, this is often difficult or impossible to achieve with previously existing technology, and the inability to decouple correlated traits and control the behavior of live stimuli limits researchers to naturally occurring variation.

Despite some limitations, researchers have benefited from technological advances providing the ability to manipulate certain signals and present them to live animals in wild or laboratory conditions using acoustic [2–8] and video playback [9–12]. Recently, computer animations have offered a promising alternative to live animals or video playback [13], and have been implemented in studying communication in a variety of taxa (e.g., spiders [14]; birds [15]; lizards [16]; and fishes [17–22]). Animated stimuli presented to live animals provide the flexibility to manipulate virtually any trait while maintaining other traits constant [13,20]. Although promising, there are several major logistical limitations to the use of animations in behavior research. For example, the complexity of many visual displays currently requires the use of expensive and sophisticated software, often demanding specialized expertise. Thus, computer animations are not feasible for many researchers, restricting their use to those with expertise in these methods or sufficient funding to hire experts. The laborious nature of traditional animation methods also means that exemplars are often based on representative behavior of a single individual, rather than several slightly varied stimuli, which could result in pseudoreplication due to non-independence of trials [23,24]. Using multiple individuals and behavioral sequences is required to avoid pseudoreplication, although this is often prohibitively difficult or time consuming with traditional animation methods. Furthermore, the means by which researchers can share and use computer animations created by sophisticated animation software are lacking. Thus, an inherent limitation to all of these approaches is the inability to reproduce and share the visual signals that are used.

We present anyFish 2.0, a user-friendly, open-source software for creating fish animations for behavioral research [25]. anyFish 2.0 provides an alternative to often expensive and difficult-to-use animation software. The functionality of anyFish means that any researcher can quickly create a variety of behavioral stimuli and share projects through digital repositories (e.g., Dryad), providing a model for transparency and reproducibility in animal behavior research. Such transparency has been a hallmark of other fields for decades, yet animal behaviorists have struggled in this area, often lacking means whereby they can share experimental stimuli and accurately replicate experiments. The free/open-source nature of anyFish also means that anybody can use or modify the software to fit their research needs. Below, we describe how anyFish 2.0 expands the previous version described by Veen et al. [25], which featured only a single fish model, and highlight new features of anyFish 2.0.

2. Software description

Here, we briefly outline the steps required to create 3D animated fish using anyFish 2.0 (Fig. 1). Once anyFish and the appropriate third-party software programs are downloaded and installed, the animation process involves three steps: (1) preparation of geometric morphometric files to determine the fin/body shape of the animated fish; (2) preparation of fin/body ‘textures’ to determine the appearance of the model; and (3) applying motion to the model within the anyFish editor.

2.1. Software architecture

The anyFish 2.0 UI was created in the Unity game engine (https://unity3d.com/) and works as a stand-alone application in Windows (on a Windows machine or in virtual machine software). It is written primarily in C# and JavaScript to enable model locomotion of the fish spine and blending of the motion path through space. The entire system is executed through a series of automated scripts that drive a physics simulation. The Unity game engine was chosen for its simplicity and because it enables physical modeling of objects in a 3D environment.
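To make the script-driven spine locomotion concrete, the following is a minimal Unity C# sketch of the general technique (a traveling sine wave propagated down a chain of spine bones, the standard way to approximate undulatory swimming in a jointed rig). It is illustrative only, not anyFish's actual source: the class name, bone array, and parameter values are our own assumptions.

    using UnityEngine;

    // Illustrative sketch only; anyFish's locomotion scripts are more elaborate.
    public class SpineUndulation : MonoBehaviour
    {
        public Transform[] spineBones;   // ordered head-to-tail
        public float amplitudeDeg = 8f;  // peak yaw per bone, in degrees
        public float frequencyHz = 2f;   // tail-beat frequency
        public float phaseLagRad = 0.6f; // phase offset between adjacent bones

        void Update()
        {
            // A traveling wave: each bone yaws about its local up-axis,
            // lagging the bone ahead of it by a fixed phase.
            float t = 2f * Mathf.PI * frequencyHz * Time.time;
            for (int i = 0; i < spineBones.Length; i++)
            {
                float yaw = amplitudeDeg * Mathf.Sin(t - i * phaseLagRad);
                spineBones[i].localRotation = Quaternion.Euler(0f, yaw, 0f);
            }
        }
    }

In anyFish itself, per-frame bone updates of this sort are combined with the motion-path blending described above.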

The anyFish 2.0 Editor is a standalone Windows Forms application that provides multiple functionalities to the user, such as quantifying morphology and TPS (‘thin plate spline’) Transform.


Fig. 1. Flow chart of primary steps in anyFish. (A) A lateral image (.JPG) of the fish to be modeled should be optimized (Table 1). This image is used to create body and fin textures. (B) Create a TPS file in tpsUtil and assign morphological landmarks (Fig. 3) in tpsDig. The populated TPS file is then copied directly into the project folder (G) and used in step C. (C) TPS-Transformer uses the TPS file generated in B and the image from A to transform the image to fit the anyFish default model. (D) The output from C, which should be copied to the project folder (G). (E) Fin images should be extracted from the starting image (A) and applied to the appropriate fin guide. (F) Final fin textures should be saved as .PNG files and copied into the project folder (G). (G) The project folder contains all of the input data that will be used to create the final model, H. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

Table 1
Recommended third-party programs. These programs are used to perform tasks before (e.g., image manipulation) and after creating an animation in anyFish.

Task                             Software name        URL
Image manipulation               Adobe Photoshop      www.adobe.com
                                 GIMP*                http://www.gimp.org
                                 TPS-transformer*     http://swordtail.tamu.edu/anyfish
Morphometrics                    tpsUtil*             http://life.bio.sunysb.edu/morph
                                 tpsDig*              http://life.bio.sunysb.edu/morph
                                 tpsRelw*             http://life.bio.sunysb.edu/morph
                                 Consensus-to-TPS*    http://swordtail.tamu.edu/anyfish
Creating and playing back video  Adobe Premiere       www.adobe.com
                                 VLC*                 http://www.videolan.org

* Free option; for each task we provide a suggested free and commercial software option.

TPS Transform runs a module developed by the authors in MATLAB and requires an installation of MATLAB 2012 or newer.

The anyFish 2.0 editor program is available for download on the anyFish website (http://swordtail.tamu.edu/anyfish). Prior to creating an animation in anyFish, several steps must be completed which require the use of tools created primarily for the anyFish application and of third-party programs. We provide a list of suggested software programs (commercial and freeware options) in Table 1, and a brief discussion of each task below (more details are found in the anyFish user manual: http://swordtail.tamu.edu/anyfish/AnyFish User Manual; see summary and tutorial videos at https://www.youtube.com/user/anyFishTutorials and in supplementary Video 1). anyFish comes pre-loaded with two fish models. The first is a generic model of a stickleback fish (Gasterosteus spp.), which Veen et al. [25] have discussed in greater detail. anyFish 2.0 includes a second model, which is a generic ‘poeciliid’ model (Poeciliidae).


Fig. 2. ‘Natural’ and manipulated models created in anyFish. (A) Small Danio rerio; (B) Large Danio rerio; (C) ‘Natural’ Xiphophorus birchmanni; (D) X. birchmanni with the body shape of X. malinche; (E) ‘Natural’ X. malinche; (F) X. malinche with an extended caudal sword; (G) Novel male color morph of Poecilia latipinna; (H) Novel male color morph of P. latipinna with exaggerated dorsal fin pigment; (I) ‘Natural’ Brachyrhaphis terrabensis; (J) B. terrabensis with Brachyrhaphis roseni fins. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

The poeciliid model can be used for a variety of poeciliid fishes (e.g., guppies, swordtails; Fig. 2) that are well-studied models in ecology and evolution, or with non-poeciliid fish with similar body forms (e.g., killifish (Cyprinodontidae and Aplocheilidae) and zebrafish (Cyprinidae); Fig. 2(A), (B)). The new model expands the utility of anyFish dramatically by making the program available to individuals working on numerous model systems prevalent in ecology and evolution. Although the stickleback model presented by Veen et al. [25] and the poeciliid model presented here should have broad appeal, anyFish is an open-source platform, facilitating the creation of other fish models outside the range of variation currently offered (e.g., Cichlidae or Gobiidae).


Fig. 3. Geometric morphometric landmark configurations. Landmarking scheme for the ‘Poeciliid’ model (A) without a sword and (B) with a sword. This landmark configuration is used for any fish to be modeled with the ‘Poeciliid’ model.

New models can be integrated and used in the anyFish environment to take advantage of the user-friendly interface. Briefly, this is done by designing and creating a new animation ‘rig’ (a 3D model of the standard shape and texture mapping of the new species). New rigs can be created in Maya, a 3D modeling software program, by adjusting an existing model (i.e., the stickleback or poeciliid model) or by creating a new model following a similar protocol. Based on the landmark configuration that best represents the new species, the texture transformation (using TPS Transform) can be easily achieved by tuning existing procedures. Once a new rig is created, it can be incorporated into the anyFish source code, allowing researchers to create altogether new models and incorporate them into the functionality of anyFish.

2.2. Software functionalities, workflow, and example models

2.2.1. Quantifying and modifying morphology

anyFish provides the ability to manipulate the appearance of the animated fish in several ways, including body and fin size/shape (e.g., Fig. 2). Below, we detail the two general steps required to manipulate the model size and shape.

The use of morphological landmarks and geometric morphometrics to quantify body shape has become standard in ecology and evolution [26]. We adopted methods familiar to many biologists as a basis for quantifying and modifying the shape of animations generated in anyFish. The first step is to acquire standardized digital images (Fig. 1(A)). These images are used both to obtain morphological data and as skin textures to be applied to the rig (i.e., the model ‘skeleton’; see below). We recommend the use of a color standard for post-production color balancing of the fish image. Images should be taken in or converted to JPEG format before applying landmarks to the image.

The second step for quantifying and modifying body and fin shapes is to generate TPS files (a common file format in geometric morphometrics) that contain morphological landmark data (Fig. 1(B)). anyFish uses a set of landmarks (i.e., morphological points on the lateral image of the fish) to capture the key morphological features of the fish (Fig. 3; [25]). TPS files are created using tpsUtil [27]. tpsDig [28] can then be used to populate the TPS file with two-dimensional X–Y coordinates for each landmark. These TPS files are used to modify the shape of the animated fish (see below) and to transform texture images so that they match a built-in coordinate system for use as textures in the anyFish editor (see below; new fish rigs incorporated into anyFish can define custom TPS configurations to maximize animation performance). A population average or ‘consensus’ can also be used. We have accomplished this by creating a program (‘TPS from Consensus’) that applies a scale to a TPS consensus file as generated by tpsRelw [29], and formats the file for use in anyFish. Thus, users can landmark numerous fish and generate a population consensus to define the shape of the animation. Multiple TPS files can be loaded into the anyFish editor, allowing the user to quickly alternate between different body shapes (e.g., Fig. 2(C), (D)). This provides the flexibility required to create a variety of stimuli (e.g., varying body shape while controlling the model’s texture; Fig. 2(C)–(F)), either for use in different experiments or to generate slight variations of an animation to avoid pseudoreplication. TPS files can be modified manually in tpsDig to manipulate traits of interest, and these changes will be reflected in the anyFish model (e.g., exaggerated sword length or pigmentation; Fig. 2(E)–(H)).
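As a point of reference, TPS files of the kind produced by tpsUtil/tpsDig are plain text. A minimal example for a hypothetical four-landmark configuration is shown below; the actual poeciliid configuration uses the landmarks in Fig. 3, and the coordinate values, file name, and scale here are invented for illustration.

    LM=4
    112.0 263.0
    154.0 301.0
    498.0 295.0
    541.0 259.0
    IMAGE=male_01.jpg
    ID=0
    SCALE=0.012579

Each row after LM= gives the X–Y pixel coordinates of one landmark; the optional SCALE= line converts pixels to physical units.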

2.2.2. Applying texture to the animation

The second step involves creating and applying a model ‘texture’. A texture is essentially the ‘skin’ of the 3D animated fish, and can consist of any digital image to which the appropriate landmarks are applied. Lateral digital images of the fish of interest provide an ideal texture. These images can easily be optimized and customized (e.g., changing color parameters; Fig. 2(G)–(H); [20,25]) using an image manipulation program (e.g., Adobe Photoshop). A recent study by Culumber and Rosenthal [20] successfully implemented this functionality by manipulating the tail pigmentation in animated platyfish. Once an image is optimized and landmarked, TPS-transformer is used to ‘transform’ the image to match the default shape of the digital skeleton (Fig. 1(C)–(D)). This transformation step serves to mount the texture to the 3D skeleton. In a later step, the body shape of the final model is specified through the TPS files created in the step described above. Separate fin textures are also required (Fig. 1(E)–(F)). A lateral image of the fin is applied to a fin guide by the user in an image manipulation program (e.g., Adobe Photoshop, GIMP). Fin textures are automatically matched to the model in anyFish. Thus, fin textures can be manipulated and applied independently of the body texture (Fig. 2(I), (J)).
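As background for readers unfamiliar with thin plate splines, the textbook interpolant underlying this family of transformations (stated here for orientation, not as the exact routine inside TPS-transformer) maps each image coordinate through

    f(x, y) = a_0 + a_1 x + a_2 y + \sum_{i=1}^{n} w_i \, U(\lVert (x, y) - (x_i, y_i) \rVert), \qquad U(r) = r^2 \log r^2,

with one such function per output coordinate. The affine coefficients a_j and weights w_i are solved so that the n source landmarks map exactly onto the corresponding landmarks of the default model shape while the bending energy of the warp is minimized; this is what lets texture pixels follow the landmarks smoothly.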


Fig. 4. User menus for anyFish. (A) anyFish Editor project selection menu. (B) anyFish Project Editor menu. (C) The anyFish editor interface.

2.2.3. Creating a swimming path: applying motion with the anyFish editor

Once shape and texture files have been created, the user can create an animation using the anyFish editor. Upon opening anyFish, a menu is provided for specifying the key features of the animation. The user then has several options for guiding the movement of the model. The first option is to select and modify a pre-existing path (e.g., from the anyFish website or from other publications using anyFish, e.g., [20]). The user can modify any parameters of the path, including body position and rotation in the X, Y, and Z-axes, and fin position. The second option for specifying movement is to create a path de novo. The user can manually adjust the position and rotation of the model (Fig. 4), and the position of the fins.


Here, the user can record video of the movement of interest and import the video frames to manually match the position of the model with that of the fish in the video (i.e., ‘rotoscoping’). This does not require specifying the position of the model in each frame. Instead, by ‘keyframing’, or setting important frames in the video animation, one can assign the fish position at intervals and the anyFish physics system will interpolate the model’s position. anyFish 2.0 provides several new functions that increase the speed and accuracy of keyframing, for example, the ‘Pectoral auto movement’ tool, which automatically applies pectoral fin movement to the animation (Fig. 4). We have also implemented copy/paste functions for copying/pasting keyframes of repetitive behaviors.
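To illustrate what interpolation between keyframes involves, here is a compact C# sketch in the Unity style. It is our own illustration of the general technique (linear blending of positions and spherical blending of rotations between bracketing keys), not anyFish's internal physics code; the type and method names are hypothetical.

    using UnityEngine;

    // One keyed pose; keys are assumed sorted by strictly increasing time.
    public struct PoseKey
    {
        public float time;          // seconds into the animation
        public Vector3 position;    // body position at this keyframe
        public Quaternion rotation; // body rotation at this keyframe
    }

    public static class KeyframeSampler
    {
        // Returns the interpolated pose at time t.
        public static void Sample(PoseKey[] keys, float t,
                                  out Vector3 pos, out Quaternion rot)
        {
            // Clamp outside the keyed range.
            if (t <= keys[0].time) { pos = keys[0].position; rot = keys[0].rotation; return; }
            int last = keys.Length - 1;
            if (t >= keys[last].time) { pos = keys[last].position; rot = keys[last].rotation; return; }

            // Find the bracketing pair and blend (lerp position, slerp rotation).
            int i = 1;
            while (keys[i].time < t) i++;
            float u = (t - keys[i - 1].time) / (keys[i].time - keys[i - 1].time);
            pos = Vector3.Lerp(keys[i - 1].position, keys[i].position, u);
            rot = Quaternion.Slerp(keys[i - 1].rotation, keys[i].rotation, u);
        }
    }

Real systems typically add easing or physical constraints on top of this baseline, which is what the anyFish physics system contributes.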

The final option for specifying model movement is to apply motion capture data of a live fish from third-party software. This method allows the user to record the behavior of a live fish using motion capture software and match the movement of the model to the fish that has been tracked. Completed paths are rendered using anyFishVM and subsequently edited in standard video editing software (e.g., Table 1). The rendering process involves two steps: (1) still frames are rendered for each frame of the animation; and (2) the frames are assembled using anyFishVM to create an animation in standard video formats.
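For illustration, the first of these steps is commonly done in Unity with the generic recipe below (not anyFishVM's actual implementation; the class name and file naming are hypothetical). Fixing the capture frame rate decouples simulated time from rendering speed, so every animation frame is written out regardless of how long each frame takes to draw.

    using UnityEngine;

    // Generic Unity offline-rendering recipe (not anyFishVM's code).
    public class FrameDumper : MonoBehaviour
    {
        public int frameRate = 30; // frames per second of the target video

        void Start()
        {
            // Advance game time in fixed steps of 1/frameRate seconds.
            Time.captureFramerate = frameRate;
        }

        void Update()
        {
            // Writes frame0001.png, frame0002.png, ... for later assembly
            // into a standard video container. Newer Unity versions use
            // ScreenCapture.CaptureScreenshot instead.
            Application.CaptureScreenshot(string.Format("frame{0:D4}.png", Time.frameCount));
        }
    }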

3. Impact and uses

anyFish provides an excellent means to create high-quality fish stimuli for behavioral research, and serves as a model for repeatability and transparency (i.e., a permanent and sharable record of the stimulus) in the field of animal behavior. With anyFish 2.0, the construction and manipulation of animations for a variety of model fish systems is made very accessible and will serve a broad research community (e.g., participants in workshops conducted at the Evolution 2014, Behaviour 2015, and Animal Behavior Society 2015 conferences). anyFish will improve the pursuit of existing research questions by allowing researchers to create and manipulate realistic animations in ways that are impossible or extremely difficult to achieve with live fish. Thus, the anyFish workflow will change the daily practice of its users by putting the power to animate in the hands of non-experts, and by allowing users to rapidly and transparently share their models with the scientific community. Below, we briefly discuss several avenues of research that will benefit from anyFish.

3.1. anyFish and the evolution of mating behavior

anyFish provides an ideal tool for studying the evolution of mating behavior. Researchers are often interested in traits that animals use to determine the suitability of potential mates (e.g., body size [30,31]; body shape [32,33]; behavior [34]; color/pigmentation [20,30,34,35]). Determining how specific traits function in reproductive behavior often requires the isolation of such traits in a controlled experimental context. This is done by presenting live animals with stimuli (often in pairs) that differ in a single trait, and measuring the response of the live animal. In the case of animations, computer monitors are commonly used to present paired stimuli to the animal receiver [13]. anyFish provides a powerful tool to focus on the role of a single signal (or the interaction of multiple signals) by providing the ability to vary a single trait while maintaining other traits constant. For example, Culumber and Rosenthal [20] used anyFish to test for the role of mating preferences in the maintenance of a tail spot polymorphism in a species of platyfish. They created fish models that differed in their tail spot coloration and presented pairs of stimuli to live fish, allowing the researchers to test for live fish mating preferences for these traits while controlling all other variables. This functionality could also serve in studying the evolutionary trajectory of signal-receiver dynamics and processes such as sensory bias (e.g., color signals could be added to animated fish and presented to live fish), in which the exploitation of pre-existing sensitivities of the visual system is used to increase mating success [36–39].

3.2. Social behavior, interspecific interactions

The study of social behavior, both inter- and intra-sexual interactions, could also benefit from anyFish. For example, color variation often plays an important role in patterns of male–male aggression [40,41]. Using anyFish, these color patterns could be manipulated and presented to live animals to tease apart the signals used in male aggression. Other intersexual or even interspecific interactions, such as shoaling or sociability tendencies [42], could be studied using anyFish. anyFish will also facilitate studies that will increase our understanding of predator–prey dynamics, including the role of prey color, size, and behavior in predator preference [43]. Such studies of piscivorous predator behavior could even extend to non-fish taxa, such as dragonfly larvae or crustaceans. For example, researchers are currently using anyFish to test prey preferences in dragonfly larvae by presenting larvae with animations of fish that vary in size, color, and behavior.

In the future, existing computer vision techniques and systems could be used to track the motion of real fish through video cameras. These live fish behaviors could be retrieved from the video feed and input into anyFish, so that the virtual fish can effectively “see” and respond in real time to the live fish receiver.

4. Conclusions and future directions

The open-source nature of anyFish lends itself to further technological advances by the anyFish development team and the user community, such as the inclusion of new 3D models, the incorporation of real-time fish tracking and behavioral responses [44], and the adjustment of color balance to account for interspecific variation in visual sensitivity [45,46]. These features would further allow playback studies of visual communication to achieve the same power and robustness as studies of acoustic signals.

Despite the advances provided by anyFish and other animation platforms, using video animations in animal behavior research is not without limitations. For example, many animals can detect light in the ultraviolet (UV) spectrum.


Current screen projection technologies are unable to project UV light, and thus any signals naturally occurring in the UV would be lost in video animations. Further limitations associated with projecting a stimulus on a video screen include the potential absence of cues used by live animals to gauge depth and apparent distance from the observer. This obstacle can be overcome by incorporating appropriate species-specific cues (e.g., shadows and occlusions), but its existence must be noted when designing experiments. We advise care when designing, using, and interpreting results from experiments using animated stimuli, and refer users to previous work that more thoroughly addresses the limitations of these methods [45,47]. Despite the general limitations of using animated stimuli, the functionality of anyFish provides an exciting array of experimental opportunities that will help shed light on the evolution of animal behavior.

Acknowledgments

Funding for the development of anyFish was provided by the US National Science Foundation (IOS-1045226). SJI was supported by a US NSF Graduate Research Fellowship. We thank the large community of animal behaviorists who have provided feedback on the project since its beginning. We thank Arminda Suli for providing the Danio rerio used in Fig. 2, and Luis Arriaga for providing the image of Poecilia latipinna used in the same figure. We thank two anonymous reviewers whose feedback helped us improve this manuscript.

Appendix A. Supplementary data

Supplementary material related to this article can be found online at http://dx.doi.org/10.1016/j.softx.2015.10.001.

References

[1] Bradbury JW, Vehrencamp SL. Principles of animal communication. Sunderland, MA: Sinauer Associates; 2011.
[2] Shaw KL, Lesnick SC. Genomic linkage of male song and female acoustic preference QTL underlying a rapid species radiation. Proc Natl Acad Sci USA 2009;106:9737–42.
[3] Remage-Healey L, Coleman MJ, Oyama RK, Schlinger BA. Brain estrogens rapidly strengthen auditory encoding and guide song preference in a songbird. Proc Natl Acad Sci USA 2010;107:3852–7.
[4] Akre KL, Farris HE, Lea AM, Page RA, Ryan MJ. Signal perception in frogs and bats and the evolution of mating signals. Science 2011;333:751–2.
[5] Kroodsma DE. Suggested experimental designs for song playbacks. Anim Behav 1989;37:600–9.
[6] McGregor P, Catchpole C, Dabelsteen T, Falls JB, Fusani L, et al. Design of playback experiments: The Thornbridge Hall NATO ARW consensus. In: McGregor P, editor. Playback and studies of animal communication. Springer US; 1992. p. 1–9.
[7] Draganoiu TI, Nagle L, Kreutzer M. Directional female preference for an exaggerated male trait in canary (Serinus canaria) song. Proc R Soc B Biol Sci 2002;269:2525–31.
[8] Phelps SM, Ryan MJ, Rand AS. Vestigial preference functions in neural networks and tungara frogs. Proc Natl Acad Sci USA 2001;98:13161–6.
[9] Rosenthal GG, Evans CS, Miller WL. Female preference for dynamic traits in the green swordtail, Xiphophorus helleri. Anim Behav 1996;51:811–20.
[10] Johnson JB, Basolo AL. Predator exposure alters female mate choice in the green swordtail. Behav Ecol 2003;14:619–25.
[11] Clark DL, Uetz GW. Video image recognition by the jumping spider, Maevia inclemens (Araneae, Salticidae). Anim Behav 1990;40:884–90.
[12] Evans CS, Marler P. On the use of video images as social stimuli in birds: audience effects on alarm calling. Anim Behav 1991;41:17–26.
[13] Woo KL, Rieucau G. From dummies to animations: a review of computer-animated stimuli used in animal behavior studies. Behav Ecol Sociobiol 2011;65:1671–85.
[14] Harland DP, Jackson RR. Influence of cues from the anterior medial eyes of virtual prey on Portia fimbriata, an araneophagic jumping spider. J Exp Biol 2002;205:1861–8.
[15] Watanabe S, Troje NF. Towards a “virtual pigeon”: A new technique for investigating avian social perception. Anim Cogn 2006;9:271–9.
[16] Ord TJ, Peters RA, Evans CS, Taylor AJ. Digital video playback and visual communication in lizards. Anim Behav 2002;63:879–90.
[17] Fisher HS, Wong BBM, Rosenthal GG. Alteration of the chemical environment disrupts communication in a freshwater fish. Proc R Soc B Biol Sci 2006;273:1187–93.
[18] Wong BBM, Rosenthal GG. Female disdain for swords in a swordtail fish. Am Nat 2006;167:136–40.
[19] Verzijden MN, Rosenthal GG. Effects of sensory modality on learned mate preferences in female swordtails. Anim Behav 2011;82:557–62.
[20] Culumber ZW, Rosenthal GG. Mating preferences do not maintain the tailspot polymorphism in the platyfish, Xiphophorus variatus. Behav Ecol 2013;24:1286–91.
[21] McKinnon JS. Video mate preferences of female three-spined sticklebacks from populations with divergent male coloration. Anim Behav 1995;50:1645–55.
[22] Kunzler R, Bakker TCM. Female preferences for single and combined traits in computer animated stickleback males. Behav Ecol 2001;12:681–5.
[23] Hurlbert SH. Pseudoreplication and the design of ecological field experiments. Ecol Monogr 1984;54:187–211.
[24] McGregor PK. Playback experiments: design and analysis. Acta Ethol 2000;3:3–8.
[25] Veen T, Ingley SJ, Cui R, Simpson J, Asl MR, et al. anyFish: an open-source software to generate animated fish models for behavioural studies. Evol Ecol Res 2013;15:361–75.
[26] Adams DC, Rohlf FJ, Slice DE. Geometric morphometrics: ten years of progress following the ‘revolution’. Ital J Zool 2004;71:5–16.
[27] Rohlf FJ. tpsUtil, file utility program, version 1.26. Stony Brook, NY: Department of Ecology and Evolution, State University of New York at Stony Brook; 2004.
[28] Rohlf FJ. tpsDig, digitize landmarks and outlines, version 2.05. Stony Brook, NY: Department of Ecology and Evolution, State University of New York at Stony Brook; 2005.
[29] Rohlf FJ. tpsRelw, relative warps analysis, version 1.36. Stony Brook, NY: Department of Ecology and Evolution, State University of New York at Stony Brook; 2003.
[30] Boughman JW, Rundle HD, Schluter D. Parallel evolution of sexual isolation in sticklebacks. Evolution 2005;59:361–73.
[31] Tobler M, Schlupp I, Plath M. Does divergence in female mate choice affect male size distributions in two cave fish populations? Biol Lett 2008;4:452–4.
[32] Langerhans RB, Makowicz AM. Sexual selection paves the road to sexual isolation during ecological speciation. Evol Ecol Res 2013;15:633–51.
[33] Langerhans RB, Gifford ME, Joseph EO. Ecological speciation in Gambusia fishes. Evolution 2007;61:2056–74.
[34] Pauers MJ, McKinnon JS. Sexual selection on color and behavior within and between cichlid populations: Implications for speciation. Curr Zool 2012;58:475–83.
[35] Boughman JW. Divergent sexual selection enhances reproductive isolation in sticklebacks. Nature 2001;411:944–8.
[36] Ryan MJ, Rand AS. The sensory basis of sexual selection for complex calls in the tungara frog, Physalaemus pustulosus (sexual selection for sensory exploitation). Evolution 1990;44:305–14.
[37] Hodgson A, Black AR, Hull R. Sensory exploitation and indicator models may explain red pelvic spines in the brook stickleback, Culaea inconstans. Evol Ecol Res 2013;15:199–211.
[38] Endler JA, Basolo AL. Sensory ecology, receiver biases and sexual selection. Trends Ecol Evol 1998;13:415–20.
[39] Endler JA. Signals, signal conditions, and the direction of evolution. Am Nat 1992;139:S125–53.
[40] Dijkstra PD, Seehausen O, Pierotti MER, Groothuis TGG. Male–male competition and speciation: aggression bias towards differently coloured rivals varies between stages of speciation in a Lake Victoria cichlid species complex. J Evol Biol 2007;20:496–502.
[41] Pauers MJ, Kapfer JM, Fendos CE, Berg CS. Aggressive biases towards similarly coloured males in Lake Malawi cichlid fishes. Biol Lett 2008;4:156–9.
[42] Nomakuchi S, Park PJ, Bell MA. Correlation between exploration activity and use of social information in three-spined sticklebacks. Behav Ecol 2009;20:340–5.
[43] Godin JGJ, McDonough HE. Predator preference for brightly colored males in the guppy: a viability cost for a sexually selected trait. Behav Ecol 2003;14:194–200.
[44] Butkowski T, Yan W, Gray AM, Cui R, Verzijden MN, et al. Automated interactive video playback for studies of animal communication. J Vis Exp 2011.
[45] Fleishman LJ, Endler JA. Some comments on visual perception and the use of video playback in animal behavior studies. Acta Ethol 2000;3:15–27.
[46] Fleishman LJ, McClintock WJ, D’Eath RB, Brainard DH, Endler JA. Colour perception and the use of video playback experiments in animal behaviour. Anim Behav 1998;56:1035–40.
[47] Rosenthal GG. Design considerations and techniques for constructing video stimuli. Acta Ethol 2000;3:49–54.