
PhysFrame: Type Checking Physical Frames of Reference for Robotic Systems

Sayali Kate, Purdue University, USA ([email protected])
Michael Chinn, University of Virginia, USA ([email protected])
Hongjun Choi, Purdue University, USA ([email protected])
Xiangyu Zhang, Purdue University, USA ([email protected])
Sebastian Elbaum, University of Virginia, USA ([email protected])

ABSTRACT
A robotic system continuously measures its own motions and the external world during operation. Such measurements are with respect to some frame of reference, i.e., a coordinate system. A nontrivial robotic system has a large number of different frames, and data have to be translated back-and-forth from one frame to another. The onus is on the developers to get such translation right. However, this is very challenging and error-prone, as evidenced by the large number of questions and issues related to frame uses on developers' forums. Since any state variable can be associated with some frame, reference frames can be naturally modeled as variable types. We hence develop a novel type system that can automatically infer variables' frame types and in turn detect any type inconsistencies and violations of frame conventions. The evaluation on a set of 180 publicly available ROS projects shows that our system can detect 190 inconsistencies with 154 true positives. We reported 52 to developers and received 18 responses so far, with 15 fixed/acknowledged. Our technique also finds 45 violations of common practices.

CCS CONCEPTS
• Software and its engineering → Abstract data types; Software defect analysis.

KEYWORDS
Physical Frame of Reference, Frame Consistency, Type Checking, Static Analysis, z-score Mining, Robotic Systems

ACM Reference Format:
Sayali Kate, Michael Chinn, Hongjun Choi, Xiangyu Zhang, and Sebastian Elbaum. 2021. PhysFrame: Type Checking Physical Frames of Reference for Robotic Systems. In Proceedings of the 29th ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering (ESEC/FSE '21), August 23–27, 2021, Athens, Greece. ACM, New York, NY, USA, 16 pages. https://doi.org/10.1145/3468264.3468608

This is an extended version of an article published in Proc. ESEC/FSE 2021 [12 pages, https://doi.org/10.1145/3468264.3468608].

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).
ESEC/FSE '21, August 23–27, 2021, Athens, Greece
© 2021 Copyright held by the owner/author(s).
ACM ISBN 978-1-4503-8562-6/21/08.
https://doi.org/10.1145/3468264.3468608

1 INTRODUCTION
Robotic systems have rapidly growing applications in our daily life, enabled by the advances in many areas such as AI. Engineering such systems becomes increasingly important. Due to the unique characteristics of such systems, e.g., the need to model the physical world and to satisfy real-time and resource constraints, robotic system engineering poses new challenges to developers. One of the prominent challenges is to properly use physical frames of reference. Specifically, the operation of a robotic system involves moving individual body parts and interacting with the external world. It entails precisely measuring positions and orientations of body parts and external objects. All these measurements are represented with respect to a set of coordinate systems (also called frames of reference, or frames in short). For example, in a three-dimensional coordinate system, the location (1,2,3) is meaningless unless we know the frame to which this location refers. The origin of a frame provides the reference position (0,0,0), whereas the orientation of the three axes tells us the directions in which they point. Further, the origin and orientation of a frame itself may be defined relative to another frame, and so on.

Robotic systems usually follow a modular design, in which body parts and control software components are developed independently by various parties. As such, different components often use different frames. For example, a camera has the camera frame, through which the physical world is measured from the camera's perspective. In this frame, the center of the camera is the origin and the axes follow the orientations of the camera. The body of a robot has the body frame, whose origin is the center of the body and whose axes are pre-defined based on the shape of the robot. Measurements in the camera frame have to be translated to the body frame before they can be used in body-related computation. Since the camera can be mounted at different places and moves during operation, such translation has to be done by the developers in software. More discussion about frames and an example can be found in Section 2.

Most robotic systems are developed in general purpose languages such as C/C++, facilitated by domain specific libraries. Such languages do not have intrinsic support for the additional complexity induced by the use of frames. For example, although popular libraries such as ROS [32] provide functions to facilitate translation between frames, the onus is on the developers to determine the reference frames of program variables, the places where translation is needed, and to correctly implement the concrete translations. Since translation is often done by vector/matrix operations, many developers even implement their own translation functions from scratch without using ROS APIs. As a result, use of frames is error-prone, even for experienced developers. This is evidenced by the fact that a large number of the questions and issues for a robotic system project are usually about reference frames. Figure 1 shows that questions tagged with 'tf' (a keyword for ROS frame support [15, 36]) are among the top ten types of questions by ROS developers, such as "confused about coordinate frames", "tf transforms are confusing", and "is there an easier way to find the coordinates of a point in another frame". Also, the ROS wiki identifies transform issues as one of the top 5 common problems. Besides the difficulties of getting them right during development, misuse of frames may cause robot runtime malfunction [5], code reuse problems [1, 4], and maintenance difficulties after deployment.


Figure 1: Forum posts that show frames in robotic programming are difficult to handle. (Note: the right-side snapshot is taken from the http://wiki.ros.org/ROS/Troubleshooting page, licensed under CC BY 3.0.)

ROS provides a number of tools to help developers debug frame related problems [2]. These are mostly runtime tools that facilitate visualization of frames, the relations between frames, and the details of a transformation between two frames (e.g., when it is created and by which component). However, it is always more desirable to detect problems as early as possible in the development life-cycle. A static tool that can scan and detect frame misuses before running the system would be highly desirable. To the best of our knowledge, there are unfortunately no such tools.

In this paper, we introduce and implement a novel, fully automated static tool to identify frame related problems in ROS-based projects. Developers direct the tool to operate on a project directory, and the tool automatically models the project and checks for potential frame faults. Since each physical-state related variable (e.g., sensor reading, acceleration, velocity, and position) is associated with some frame, our tool builds on such associations, which are similar to variable types by nature. We hence propose a type system to model reference frames. A traditional type system often starts with type annotations that require developers to know variable types in the first place, which is difficult in our context as knowing the right frames is usually difficult. Instead, our technique automatically infers frames and performs type consistency checks, leveraging ROS conventions that can be extracted from its specifications and mined from code repositories. The technique uses frame information sources such as static data values exchanged between components and the ROS frame APIs for assigning frame types, and defines type propagation and checking rules based on the standard coordinate conventions documented in ROS Enhancement Proposals (REPs) and the semantics of frame related operations such as displacements and rotations. As part of our contribution, we present a type language as a vehicle to describe these type rules, which constitute the foundation for our domain-specific static analyzer.

Our contributions are summarized as follows:
• We propose a fully automated type inference and checking technique for physical frames in ROS-based programs to detect frame inconsistencies and convention violations.

• We define frame and transformation abstract types, and our system automatically infers such types for variables. This is achieved by a novel abstraction of frame related program semantics.

• We present a data-driven approach to identifying commonly followed practices that may not be documented anywhere. Our system also checks for violations of such practices.

• We implement a tool, PhysFrame, and evaluate it on 180 ROS projects from Github. It reports 190 type inconsistencies with 154 true positives (81.05%). We reported 52, from projects that are recently active, to developers and have received 18 responses so far, with 15 fixed/confirmed. It also detects 45 violations of common practices.

2 REFERENCE FRAMES AND ROS

2.1 Frames
A coordinate system that is used as a reference for measurements of quantities such as position and velocity is called a frame of reference, or, in short, a frame. It is defined by two components: an origin point and an axes system. A robotic system often consists of many parts, each having at least a body frame with the origin at the center of the rigid shape of the part and axes pointing in some orthogonal directions. It allows measurements in the perspective of the part. The body frame for the base of a robot is often called base_link. It is one of the most important frames in a robot, as measurements in this frame describe how the robot as a whole sees the world. The body frame for a camera is called the camera frame. If the part is a sensor, it often has other frames to denote sensor readings. For example, a camera has multiple optical frames, including the color frame measuring the RGB image, the depth frame measuring the depth image, and the ir frame measuring the infrared image. These frames are centered at the corresponding sensors (e.g., the lens) instead of the camera body. They are different from the camera frame.

There are also the odometry (odom) frame, the map frame, and the earth frame to denote the physical world, i.e., how the world sees the robot and its surroundings. The odom frame represents the robot's initial pose and does not move with the robot. It is like a third person standing at the robot's starting position and observing the motion of the robot in the environment. The map frame projects everything into a localized map, such as the inside of a room. It is like a third person standing somewhere in the room (not the starting point of the robot) and observing the robot and its environment. The earth frame is an earth-centered frame.

During operation, sensor readings are acquired with respect to the corresponding sensor frames. They are then transformed to the body frames of the parts where the sensors are installed, and eventually to the base_link frame and some world frame(s), in which control algorithms use the transformed values to compute actuation signals. These signals may undergo the inverse transformations until they reach the actuators. A simple commodity robot may have up to 81 different frames [35], as shown in Figure 7 in the Appendix, and values have to be correctly transformed between any pair of frames. It is a heavy burden for developers to get all these transformations right.


Figure 2: Frames in a Simple Robot

Figure 2 shows a simple robot. Observe that each of its parts has its own body frame (e.g., camera_link, laser, and base_link). There are also the world frames such as odom and map. Suppose the vehicle locates an obstacle; it then knows the obstacle's position in the camera_depth_optical frame. This position is different from the position of the obstacle in the map frame. In order to answer the question "what is the position of the obstacle in the room?", the vehicle needs to know how the camera_depth_optical frame is positioned and oriented relative to the base_link frame and how the base_link frame is positioned and oriented relative to the map frame, and then has to convert the obstacle position from the camera_depth_optical frame all the way to the map frame. Moreover, the position of the obstacle in the frame of the front right wheel, front_right_wheel, is different from that in the base_link frame. Hence, to know the position with respect to the front right wheel, the vehicle needs to know the position and orientation of the front_right_wheel frame relative to the base_link frame. Different body frames may have different axis orientation conventions. Figure 2 presents two axis orientations, labeled F1 and F2. Each part has a default forward-facing direction (e.g., the nose of an aerial robot and the front of a camera), and the values forward, backward, left, right, etc., are defined with respect to that default direction. F1 and F2 are widely used in ROS. Moreover, various libraries use different conventions. For example, the (x: right, y: up, z: backward) axis orientation is widely used in OpenGL.
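As a concrete illustration of this kind of conversion (our own sketch, not code from the paper), the ROS tf API can carry out the whole chain of frame translations in one call; the frame ids follow Figure 2 and the function name is made up for the example:

    #include <ros/ros.h>
    #include <tf/transform_listener.h>
    #include <geometry_msgs/PointStamped.h>

    // Convert an obstacle position observed in the camera depth optical frame
    // into the map frame. tf walks the TF tree (camera_depth_optical -> ... ->
    // base_link -> odom -> map) and composes the required transforms.
    void obstacleToMap(tf::TransformListener& listener,
                       const geometry_msgs::PointStamped& obstacle_in_camera) {
      geometry_msgs::PointStamped obstacle_in_map;
      try {
        // obstacle_in_camera.header.frame_id is expected to be "camera_depth_optical".
        listener.transformPoint("map", obstacle_in_camera, obstacle_in_map);
        ROS_INFO("Obstacle in map: (%f, %f, %f)", obstacle_in_map.point.x,
                 obstacle_in_map.point.y, obstacle_in_map.point.z);
      } catch (const tf::TransformException& ex) {
        ROS_WARN("Could not transform obstacle position: %s", ex.what());
      }
    }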

2.2 ROS and Frames
The Robot Operating System (ROS) [32] is an open source software framework for building robotic applications. Its modular, distributed nature has made it widely used. Its users range from novice developers working on hobby projects to industry. ROS provides various components such as drivers, localization packages, and odometry packages developed by the community, in addition to the core components. It provides a package called tf or tf2 [15, 36] to support transformation between frames.

ROS Transforms. To specify frames and transformations, developers leverage the tf library to explicitly publish transformations between frames (called transforms in ROS), which also implicitly introduces the frames. A transform consists of a parent frame and a child frame, uniquely identified by their ids. ROS developers often say a transform is from the parent to the child. For example, "map→odom" means a transform from the map frame to the odom frame. It is defined by providing the displacement of the child frame's origin and the rotation of its axes with respect to the parent frame. However, the transform is used to translate states in the child frame to the parent frame (not the other way, as suggested by its name). We unfortunately have to inherit such term ambiguity from ROS. In the rest of the paper, we explicitly distinguish the name of a transform and its use in state translation to avoid confusion.
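For reference, a minimal sketch (ours) of how such a one-step transform is published with the tf library; the displacement and rotation values are placeholders rather than values from any particular robot:

    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>

    // Publish the one-step "map -> odom" transform: parent id "map", child id
    // "odom". Although it is *named* map->odom, it is *used* to translate states
    // expressed in odom into map (see the note above).
    void publishMapToOdom(tf::TransformBroadcaster& broadcaster) {
      tf::Transform transform;
      transform.setOrigin(tf::Vector3(1.0, 0.5, 0.0));  // odom origin in map
      tf::Quaternion q;
      q.setRPY(0.0, 0.0, 0.0);                          // no rotation of the axes
      transform.setRotation(q);
      broadcaster.sendTransform(
          tf::StampedTransform(transform, ros::Time::now(), "map", "odom"));
    }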

After transforms are published, other ROS components, such as standard control algorithms, subscribe to these transforms based on their names and use them in computation without knowing how they are implemented. Hence, these core algorithms can be agnostic to the concrete system composition.

Launch Files. Many parts have fixed and static relative positions, e.g., a camera mounted at a fixed position on an aerial robot. The transforms between their frames can be statically defined in launch files, which are a kind of static initialization file that defines how to start executing a robotic system. The transforms for frames with dynamic relative positions, e.g., when a camera is mounted on a gimbal, have to be published and updated on-the-fly during operation. They are hence implemented in code. An example launch file snippet can be found in Appendix A.

TF Tree. Transforms are potentially needed between each pair of frames. However, specifying/implementing transforms for each pair entails substantial and error-prone effort. ROS has a clean hierarchical design to reduce the complexity. If we consider a frame as a node and a transform between two frames as an edge, ROS requires all published frames and transforms to form a tree, that is, with no cycles and each node having only one parent. We call it a TF tree. As such, the developer only needs to focus on realizing the transforms denoting individual edges. ROS then automatically constructs the transform from an arbitrary frame f1 to another arbitrary frame f2 by first performing the transforms from f1 to the lowest common ancestor frame of the two, and then the transform from the ancestor to f2. It can be easily inferred that such a design substantially reduces the developers' burden. The figure below shows the TF tree for the robot in Figure 2. Observe that while the developers only need to develop and publish 12 transforms (denoted by the edges), ROS allows 156 transforms from any frame to any other frame. The red edges form a transform path between front_right_wheel and camera_depth_optical, used when the right wheel on the front of the robot uses data from the camera depth sensor.
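Continuing the example, the sketch below (ours) asks tf for the automatically composed transform along that path; lookupTransform() throws an exception if the two frames are not connected in the TF tree. The variable name follows the '<child>_in_<parent>' convention discussed in Section 4.2.3:

    #include <ros/ros.h>
    #include <tf/transform_listener.h>

    // Obtain the composed transform that translates states in the
    // camera_depth_optical frame into the front_right_wheel frame, even though
    // no single published transform connects the two frames directly.
    bool lookupCameraInWheel(tf::TransformListener& listener,
                             tf::StampedTransform& camera_in_wheel) {
      try {
        listener.lookupTransform("front_right_wheel", "camera_depth_optical",
                                 ros::Time(0), camera_in_wheel);
        return true;
      } catch (const tf::TransformException& ex) {
        ROS_WARN("No path in the TF tree: %s", ex.what());
        return false;
      }
    }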

2.3 Frame Conventions in ROS
All ROS projects follow certain conventions when manipulating frames. Violations of these conventions would cause integration, maintenance, and reuse problems. Our technique leverages these conventions to automatically infer and check frame types.


There are two sources for extracting conventions: the ROS specification and launch files. The former is explicit and the latter is implicit. Since ROS conventions are often related to static coefficient values (e.g., orthogonal axes orientations), they are difficult to extract directly from code, where coefficients are largely variables.

Explicit Conventions From ROS Specifications. These conventions can be classified into the following three categories.

Naming Conventions. ROS applications use messages to exchange data between components. The frame_id field in a message specifies the frame for the data in that message. These ids follow certain conventions. For example, "map" is used for the map frame [27], and "base_link" for the body frame of the robot's base [27, 28]; optical frames should have an "_optical" suffix [16].

Axis Orientation Conventions. According to [16], body frames should have the (x: forward, y: left, z: up) or FLU axis orientation (i.e., the F1 orientation in Figure 2); optical frames should have the (x: right, y: down, z: forward) orientation (i.e., F2 in Figure 2).

Tree Order Conventions. Recall that in a robotic system, the published frames and transforms are organized in a TF tree. ROS requires TF trees to follow a partial order: earth → map → odom → base_link [27]. Note that while a concrete TF tree for a specific robotic system may omit some of these frames and have additional frames, such an order must be respected. The reason is that the order optimizes the performance of the most commonly seen transforms.

Implicit Conventions from ROS Launch Files. Launch files provide a rich resource for mining implicit conventions. We focus on extracting two kinds of implicit conventions.

Co-occurrence Conventions. Robotic systems of the same kind share similarity in their physical compositions. As such, they often use similar frames and transforms. This is reflected by co-occurrences of transforms in launch files. For example, a ground vehicle robot with two wheels has one wheel on its left and the other wheel on its right. If a transform from base_link to the frame of the left wheel is defined, then a transform from base_link to the frame of the right wheel is also defined.

Value Conventions. Certain transforms are associated with fixed coefficient values such as 0 rotation angles or 0 displacement values. For example, a transform from camera_depth to camera_depth_optical must have 0 displacement values.

3 MOTIVATION
To motivate our technique we use code from the simultaneous localization and mapping (SLAM) component ORB-SLAM2 (links in the caption of Figure 3) [3, 29]. Localization and mapping algorithms like ORB are key to the operation of modern mobile robots, incrementally building a map of the robot's surroundings utilizing its camera as the robot navigates. This component, however, uses different conventions from ROS, and hence the most important task for interfacing them is to perform appropriate frame conversions.

The code snippet in Figure 3 aims to implement a transform to facilitate translation of states from the ROS camera frame to the ROS map frame using the environment model of ORB-SLAM2. The conversion implementation, however, is incorrect, and PhysFrame catches this error by type checking.

Figure 3: Example of incorrect transform between frames (source: https://git.io/JtSHb, fixed source: https://git.io/JtSRt)

Constructing the aforementioned transform is non-trivial and consists of three groups of operations. First, given some ROS state in the child camera frame with the conventional FLU orientation, it should be translated into ORB's RDF orientation, (x: right, y: down, z: forward). Second, the position of the camera frame in the map frame needs to be determined. However, since ORB-SLAM2 models the environment in the camera frame, one can only query the position of the map frame in the camera frame, so it needs to be transposed to acquire the position of the camera frame in the map frame. Third, the transform should translate the results back to ROS's FLU orientation. The code shown fails to accomplish the last step.

Function PublishPositionAsTransform(), starting at line 27, takes ORB-SLAM2's position matrix (denoting the map frame's position in the camera frame), i.e., variable position, and constructs a ROS transform (line 28) by invoking TransformFromMat(), which starts at line 1. In line 4 it acquires the rotation matrix from ORB-SLAM2's position matrix, which includes both displacements and rotation, and performs transposition by function t(). Line 6 clones it to tf_camera_rotation. Line 10 creates a matrix denoting rotation of an FLU axis system counterclockwise by 90 degrees with respect to the x axis (shown to the right of the code). Lines 13 and 17 create two additional rotation matrices. Lines 20-23 compose the three matrices and the earlier rotation matrix. Line 25 creates a ROS transform from the displacements denoted by tf_camera_translation and the rotation denoted by tf_camera_rotation. When the transform is used in state translation, a state is multiplied with tf_camera_rotation, which equals invYZ*Rz*Rx*rotation. The first three matrices are equivalent to transforming the FLU orientation of the state to the RDF orientation, as shown to the right of line 25. The fourth matrix (rotation) then translates the state from the camera frame to the map frame. However, the constructed transform is problematic as it forgets to further translate back to FLU. This bug will cause downstream system misbehaviors, which could be devastating if the camera readings are the dominant source for control decisions. It is fixed by appending the inverse rotation matrices of invYZ, Rz, and Rx to tf_camera_rotation after line 23, to change RDF back to FLU.


Figure 4: Overview of PhysFrame

PhysFrame associates each state variable with a frame type that abstracts both the displacements and the orientation of the frame in its parent frame. It also associates each transform variable with a transform type that denotes the displacements and rotation needed in frame translation. It propagates such types following operation semantics and checks consistency. In the example code, it associates matrix Rx at line 10 with a transform type from the FLU orientation A to the orientation B. The type is derived from the constant matrix values. It further associates matrix Rz at line 13 with a transform type from FLU to the orientation C, and invYZ at line 17 with a type from FLU to D. By modeling the semantics of the matrix multiplications at lines 20, 21, and 23, it determines that the transform at line 25 has the type from FLU to RDF. This yields a type error at lines 30-31 when the transform is published by function sendTransform(), because the ROS convention demands that both the camera frame and the map frame have FLU, whereas the transform converts FLU to RDF.

Observe that it is difficult to get such a transformation right due to its intrinsic subtlety and complexity. While this is only a one-step transform, as mentioned earlier, a non-trivial robotic system has more than 80 different frames, and transforms between any pair may be necessary. The example bug is just one of the many kinds of bugs PhysFrame detects. Others include missing frames and broken TF trees. More discussion can be found in Section 4.2.3.

4 OUR APPROACH

4.1 Overview
Figure 4 presents an overview of PhysFrame. It takes as input a C++ ROS project, and outputs frame inconsistencies and implicit convention violations. PhysFrame consists of three components: an implicit convention miner, a file-processor, and a type checker.

The miner takes a database of static transforms extracted from the launch files of a large repository of over 2,200 mature ROS projects from Github and performs data mining to infer implicit conventions. The miner identifies frequent static patterns, including pairs of transforms that commonly co-exist in a project, transforms commonly having zero displacements, and transforms commonly having zero rotation. Since these patterns may happen coincidentally, the miner computes a frequency measure for each pattern to quantify its certainty (we use the z-score [14] as the measure) and reports a pattern only if its z-score is above a specified threshold. The mining process is executed once on the collected static transform database, but can be updated as new static transforms are incorporated into the database. The produced rule conventions are then used in the analysis of each subject ROS project to identify anti-patterns. Details can be found in Appendix B.
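For reference, the z-score used here is the standard one, z = (x − μ) / σ, where x is a pattern's observed frequency and μ and σ are the mean and standard deviation of the frequencies of the candidate patterns; the exact population over which PhysFrame computes these statistics is described in Appendix B (our paraphrase, not a formula given in this section).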

A ROS project often contains multiple executable subsystems, each having its own TF tree. PhysFrame has to analyze these subsystems one at a time. The file-processor separates a project into subsystems, each consisting of a group of C/C++ source files and launch files that are executed together. A subsystem usually has a separate initial launch file. The file-processor hence starts from these top level launch files, identifies the child launch files and the corresponding CMakeLists files, which indicate the source files involved in each subsystem, and in turn finds the file groups that represent the project's subsystems. Details are elided.

The type checker then takes the patterns and file group information, together with the project files, and types the variables and statements in the C/C++ source files. Type errors are reported when the program cannot be properly typed. PhysFrame reports 7 kinds of type errors and 2 kinds of type warnings as inconsistencies, and reports 3 kinds of implicit convention violation warnings (see Section 4.2.3). Warnings may not lead to broken functionality, but rather to problems in code maintenance and reuse. In the following, we focus on explaining the type system.

4.2 Type System
Before introducing the formal language and type rules, we intuitively explain how frames and transforms are used. Typically, one-step transform objects are explicitly created to facilitate translation of states in a child frame (e.g., positions) to a parent frame. A transform specifies the relative position and orientation of the child frame in the parent frame, denoted by displacements and rotations. It also specifies the (string) ids for the child and parent frames. An id is the unique representation of a frame. The creation of transforms also implicitly defines frames (through the ids). They are implicit as there are no explicit ROS data structures for frames. A state variable can be explicitly associated with some frame id, yielding a stamped (a ROS term) state variable. A transform can be explicitly applied to some stamped variable (in a child frame) to acquire its correspondence in the parent frame. One-step transforms can be published and hence become part of the global TF tree. As such, transforms between arbitrary frames (e.g., those multiple steps apart) can be automatically constructed by traversing the tree (Section 2.2). To define a one-step transform, developers often explicitly specify the entailed rotation. A rotation can be composed from others. For example, a 60 degree rotation can be composed from two 30 degree rotations. Plain state variables (not stamped) can be operated on just like regular C/C++ variables/data structures. As such, even though they (implicitly) belong to some frames, frame inconsistencies cannot be detected. Furthermore, ROS does not provide a type checking mechanism even for stamped variables. The C/C++ type system cannot offer help either, as it does not have any domain knowledge. For example, it can type a variable to a ROS transform data structure, but it cannot identify the specific transform type, i.e., the parent and child frames and the displacements and rotation involved.

4.2.1 A Simplified Language and Types. ROS is based on C/C++ with a set of library functions. While our type system supports the complex syntax of C/C++/ROS and models the frame related APIs, we use a simplified language for brevity of discussion.


The language focuses on modeling frame related semantics and ignores the standard C/C++ types, operations, and statements. It directly models a number of ROS tf APIs as language primitives. Moreover, since our analysis is flow insensitive, control structures like conditional statements and loops are less relevant and hence not modeled.

Types. We define three types: (1) a frame type (Frame) that represents the frame information of a state variable, (2) a transform type (Transform) that represents frame transforms, and (3) a rotation type (Rotation) that represents changes in the axes' directions. Note that these are not ROS types, but rather types in PhysFrame.

Frame Type. This is an abstract type and is always used with some concrete state type such as Point and Pose, indicating points and poses in a specific frame, respectively. For example, ROS uses tf::Stamped<T>(T x, ..., String id) to denote a T-typed state object associated with a frame identified by id. The frame type's definition is as follows.

Frame ::= ⟨ id, pid, x, y, z, d_x, d_y, d_z ⟩        (1)

It is composed of eight fields: id denotes the (string) name of the frame; pid the name of its parent frame; the next three fields, x, y, z, are the positions of the frame's origin with respect to its parent frame, and their values are of type Displacement.

Displacement ::= 𝑅 | ⊤ (2)

where R is a static numerical value and ⊤ represents a value unknown statically. The next three fields, d_x, d_y, d_z, encode the x, y, z axes orientations of a frame, with values of the Orientation type:

Orientation = {left, right, up, down, forward, backward}

They denote the (orthogonal) orientations of the axes with respect to the body part corresponding to the frame. They are independent of the placement of the body part with respect to its base. For example, although a camera may be installed at arbitrary (and even dynamic) angles with respect to a robot's body, the orientation of the camera frame's axes is forward-left-up (FLU), which is orthogonal with respect to the camera. Although axes orientations are orthogonal, different components and robotic frameworks have different orientation conventions, so transformations need to be explicitly performed by developers.

Transform Type. ROS allows developers to create transform objects that denote the transformation from one frame to another (e.g., using the tf::Transformer class). We associate such objects with a transform type that abstracts information to facilitate consistency and convention checks. The transform type is defined as follows.

Transform ::= ⟨ cid, pid, x, y, z, τ_r ⟩        (3)

Here, cid and pid are the child and parent frame ids, respectively; x, y, z are abstract values of the aforementioned Displacement type; and τ_r denotes an abstract rotation, which will be explained next. The displacements and rotation together define a transformation.

Rotation Type. Rotations change the axes orientations of a frame. ROS allows the creation of first-order rotation objects (e.g., through the tf::Transform class) and manipulations of these objects. They can further be used to construct transform objects. In this work, we are interested in orthogonal rotations such as those of 90 or 180 degrees, as they are used in changing axes orientations.

⟨Program⟩         P ::= S
⟨Statement⟩       S ::= S1; S2 | x := v | x := y | x := y op z
                       | fx := stamped(s, x) | fx.id := s
                       | x := get_data(fx) | fx := fy | fx := external()
                       | fx := transform_to(s, fy)
                       | tx := new_transform(s_chd, s_pnt, x, y, z, rx)
                       | sendTransform(tx)
                       | tx := lookupTransform(s_pnt, s_chd)
                       | fx := apply_transform(tx, fy) | rx := v_{3×3}
                       | rx := x | rx := ry ⊗ rz | publish(s_topic, fx)
⟨StateVar⟩         x, y, z
⟨StampedStateVar⟩  fx, fy, fz
⟨TransformVar⟩     tx, ty, tz
⟨RotationVar⟩      rx, ry, rz
⟨ConstVector⟩      v
⟨ConstString⟩      s

Figure 5: Language

We hence define the rotation type as follows.

Rotation ::= ⟨ ds_x, ds_y, ds_z ⟩ | ⊤        (4)

The three fields, ds_x, ds_y, ds_z, are values of the DirSwitch type:

DirSwitch = { "x", "−x", "y", "−y", "z", "−z", ⊤ }

Field ds_x holds a value indicating that the axis denoted by the value points in the same direction as the previous x axis (i.e., the x axis before rotation). Fields ds_y and ds_z work in a similar way. We use the following example to explain the semantics. Let ⟨ "z", "−x", "−y" ⟩ be a Rotation type. It switches the orientations of the axes such that the z axis points in the same direction as the previous x axis' direction, the −x axis points in the same direction as the previous y axis' direction (i.e., the current x axis is the opposite of the previous y axis), and −y points in the same direction as the previous z axis' direction. Assume a variable of frame type ⟨ ..., forward, left, up ⟩. After applying the aforementioned rotation, a new frame type ⟨ ..., right, down, forward ⟩ is derived. The value ⊤ means the rotation is not orthogonal or cannot be determined statically.

Language. Figure 5 presents the language. We call all ROS state variables StateVar, such as tf::Point and tf::Pose. State variables are frame agnostic. When they are explicitly associated with some frame, they become StampedStateVar [1], corresponding to tf::Stamped⟨Point⟩ and tf::Stamped⟨Pose⟩, etc., in ROS. We use x, y, z for StateVar and fx, fy, fz for StampedStateVar. We also use variables tx, ty, tz and rx, ry, rz to denote transform and rotation variables. Note that these variables are not typed. Their types will be resolved by our type rules. We distinguish different kinds of variables just for better readability.

Statement "x := v" denotes a state variable assignment using a constant vector. A scalar variable can be considered as a vector variable with one dimension. Function stamped() explicitly associates x with a frame denoted by a string s, corresponding to tf::Stamped⟨T⟩::Stamped() in ROS. ROS allows stamping a state variable with an empty id and later explicitly setting the id field of the stamped variable. A common error is that the developer stamps an empty frame and later forgets to set the frame. Statement "fx.id := s" models the set-frame operation. Function get_data() retrieves the plain state from a stamped state variable. Function external() models the situations where stamped state variables are published by libraries without source code and hence beyond our analysis.

[1] ROS uses the term "stamped" to denote that a variable is contextualized with a frame. We hence use a similar term here.


Function transform_to() transforms fy to a target frame denoted by s, yielding a new stamped variable fx. It corresponds to ROS functions such as tf::Transformer::transformPoint(). Function new_transform() creates a one-step transform object for translating states in the child frame to their correspondences in the parent frame. It specifies the parent and child frame ids (by s_pnt and s_chd, respectively), the displacements (by state variables x, y, z), and the rotation (by rx). The displacements and rotation specify the linear and angular positions of the child frame with respect to the parent frame. This primitive corresponds to creating a ROS tf::StampedTransform object. One-step transforms can be published by function sendTransform() such that they become edges of the TF tree (Section 2.2). Function lookupTransform() finds a transform from frame s_pnt to frame s_chd. These two frames may not correspond to any published one-step geometric transform. Recall that ROS automatically constructs a transform by finding a path from s_pnt to the lowest common ancestor (in the TF tree) and then to s_chd. It corresponds to tf::Transformer::lookupTransform(). Function apply_transform() applies a transform tx to a stamped variable fy. As such, fy must have the child frame type of tx, and fx has the parent frame type of tx. Statement "rx := v_{3×3}" specifies a constant rotation by a 3×3 matrix. One can also specify a dynamic rotation from a (matrix/vector) variable. Two rotations can be aggregated into one by matrix multiplication ⊗. A stamped variable can be published to a topic. Any (remote) subscribers of the topic will receive the value. As such, the variable's frame must be properly set. Otherwise, remote parties cannot make sense of it.

PhysFrame also supports conditionals, loops, functions, and other frame related ROS APIs; they are elided from our language for brevity.

4.2.2 Type Rules. Table 1 presents the type rules. These rules infer types for state variables and frame related variables and check for any inconsistencies, that is, cases where variables cannot be properly typed, e.g., a variable having more than one type following the rules. We use meta-variables τ, τ_t, and τ_r to range over the infinite sets of Frame, Transform, and Rotation types. In our rules, a variable x : τ means that x has type τ. A statement x := ... : τ means that the statement has type τ, which is equivalent to the left-hand-side variable x having type τ as well. In general, a rule is read as follows: if the premises are satisfied (e.g., type checked), the conclusions are yielded. The rules are driven by the syntax of the language.

Data Assignment Rules. Rule R1 specifies that for a copy statement, if the right-hand side is typed to τ, the left-hand side is typed to τ as well. Rule R2 requires that in a binary operation of state variables, the two operands must have the same type τ. We then conclude that the result of the statement has the same type. Rule R3 specifies that if x is typed to τ, which has s as the frame id, we can conclude that fx has the same type τ. Note that since type inference is flow-insensitive, if x was free (i.e., untyped), the rule types it to τ. Rule R4 types an explicit set-frame statement to the type denoted by the id string s. Rule R5 specifies that if a state variable x is acquired from a stamped variable fx, then x inherits the type of fx. Rule R6 specifies that two stamped variables in a copy statement must have the same type.

Transformation-related Rules. Rule R7 types a transformation statement. Specifically, the meaning of function reachable() is defined in Figure 6.

reachable(s1, s2): s1 and s2 share some common ancestor in the TF tree and hence are reachable from each other.

disp(x) = r, if x holds a constant r ∈ R after constant propagation; ⊤ otherwise.

rotate(⟨d_x, d_y, d_z⟩, ⟨ds_x, ds_y, ds_z⟩) = ⟨n_x, n_y, n_z⟩, with
    n_x = d_x            if ds_x = "x"
        = opposite(d_x)  if ds_x = "−x"
        = d_y            if ds_y = "x"
        = opposite(d_y)  if ds_y = "−x"
        = d_z            if ds_z = "x"
        = opposite(d_z)  if ds_z = "−x"
        = ⊤              otherwise
    n_y and n_z are defined analogously.

tree_form_with_edge(s1, s2): adding a new edge denoted by s1 → s2 does not break the TF tree form.

orthogonalize(v_{3×3}) = ⊤ if ds_x = ⊤ ∨ ds_y = ⊤ ∨ ds_z = ⊤; ⟨ds_x, ds_y, ds_z⟩ otherwise, where (v1 denoting the first row of v)
    ds_x = "x"   if v1 = [1, 0, 0]
         = "−x"  if v1 = [−1, 0, 0]
         = "y"   if v1 = [0, 1, 0]
         = "−y"  if v1 = [0, −1, 0]
         = "z"   if v1 = [0, 0, 1]
         = "−z"  if v1 = [0, 0, −1]
         = ⊤     otherwise
    ds_y and ds_z are defined analogously from rows v2 and v3.

compose(⟨ds′_x, ds′_y, ds′_z⟩, ⟨ds″_x, ds″_y, ds″_z⟩) = ⟨ds_x, ds_y, ds_z⟩, with
    ds_x = ds″_x   if ds′_x = "x"
         = −ds″_x  if ds′_x = "−x"
         = ds″_y   if ds′_x = "y"
         = −ds″_y  if ds′_x = "−y"
         = ds″_z   if ds′_x = "z"
         = −ds″_z  if ds′_x = "−z"
         = ⊤       otherwise
    ds_y and ds_z are defined analogously.

Figure 6: Auxiliary functions used in type rules

It asserts that there is a direct/indirect transform from fy's frame τ1 to the frame τ denoted by s. If so, we conclude that fx has type τ. Rule R8 types a statement that creates a one-step transform from ⟨s_chd, s_pnt, x, y, z, rx⟩. It entails the introduction of an implicit frame type τ_c whose id is s_chd, whose parent id is s_pnt, whose displacements are those abstracted from x, y, z using the disp() function defined in Figure 6, and whose axes orientations result from applying the (abstract) rotation specified by the rotation type τ_r of rx to the orientations of the parent frame. Function rotate() is defined in Figure 6. It takes a 3-dimensional source orientation (with values left, right, forward, etc.) and a 3-dimensional rotation (with values "x", "−y", etc.), and produces the resulting 3-dimensional target orientation. Specifically, the resulting x orientation, denoted as n_x, is determined by searching for the rotation dimension that has value "x" or "−x". If the y rotation has value "x", denoted as ds_y = "x", then n_x has the same orientation as the original y orientation (i.e., d_y). If the value is "−x", the orientation is flipped. An example of such a rotation can be found in the discussion of the rotation type at the end of Section 4.2.1.
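As a worked illustration (ours, using the f/l/u/r/d abbreviations that also appear in Table 2), the rotate() application that later triggers the type error in the motivating example evaluates as follows. Given the parent map frame's conventional orientation ⟨forward, left, up⟩ and the rotation type ⟨"z", "−x", "−y"⟩:

    n_x = opposite(d_y) = opposite(left) = right     (since ds_y = "−x")
    n_y = opposite(d_z) = opposite(up)   = down      (since ds_z = "−y")
    n_z = d_x           = forward                    (since ds_x = "z")

so rotate(⟨f, l, u⟩, ⟨z, −x, −y⟩) = ⟨r, d, f⟩, i.e., the RDF orientation.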

Rule R9 types a statement that publishes a transform tx. It requires that the edge from the parent frame to the child frame does not break the TF tree, e.g., by introducing cycles or having multiple parents for a child. The tree_form_with_edge() function not only checks the condition but also adds the edge to the TF tree if the condition is satisfied. Rule R10 types a statement that looks up a transform from an arbitrary parent frame to an arbitrary child frame in the TF tree. The transform may not be any of the created one-step transforms, but rather is automatically composed by ROS by traversing the tree. It requires reachability between the two frames (in the tree), which implicitly demands the existence of the two frames.


Table 1: Type Rules

R1:  y : τ  ⊢  x := y : τ

R2:  y : τ,  z : τ  ⊢  x := y op z : τ

R3:  τ.id = s,  x : τ  ⊢  fx := stamped(s, x) : τ

R4:  τ.id = s  ⊢  fx.id := s : τ

R5:  fx : τ  ⊢  x := get_data(fx) : τ

R6:  fy : τ  ⊢  fx := fy : τ

R7:  τ.id = s,  reachable(τ.id, τ1.id),  fy : τ1  ⊢  fx := transform_to(s, fy) : τ

R8:  rx : τ_r,  τ_c = ⟨s_chd, s_pnt, disp(x), disp(y), disp(z), d_x, d_y, d_z⟩,  s_chd ≠ s_pnt,  τ_p.id = s_pnt,
     ⟨d_x, d_y, d_z⟩ = rotate(⟨τ_p.d_x, τ_p.d_y, τ_p.d_z⟩, τ_r)
     ⊢  tx := new_transform(s_chd, s_pnt, x, y, z, rx) : ⟨s_chd, s_pnt, disp(x), disp(y), disp(z), τ_r⟩

R9:  tx : τ_t,  tree_form_with_edge(τ_t.pid, τ_t.cid)  ⊢  sendTransform(tx) : τ_t

R10: reachable(s_pnt, s_chd),  τ_t.pid = s_pnt,  τ_t.cid = s_chd  ⊢  tx := lookupTransform(s_pnt, s_chd) : τ_t

R11: tx : τ_t,  fy : τ,  τ_t.pid = τ′.id,  τ_t.cid = τ.id  ⊢  fx := apply_transform(tx, fy) : τ′

R12: τ_r = orthogonalize(v_{3×3})  ⊢  rx := v_{3×3} : τ_r

R13: ⊢  rx := x : ⊤

R14: τ′_r = ⟨ds′_x, ds′_y, ds′_z⟩,  τ″_r = ⟨ds″_x, ds″_y, ds″_z⟩,  ry : τ′_r,  rz : τ″_r,  τ_r = compose(τ′_r, τ″_r)  ⊢  rx := ry ⊗ rz : τ_r

R15: ry : ⟨ds_x, ds_y, ds_z⟩,  rz : ⊤  ⊢  rx := ry ⊗ rz : ⟨ds_x, ds_y, ds_z⟩

R16: fx : τ  ⊢  publish(s_topic, fx) : τ

Rule R11 specifies that when typing a statement applying a transform tx to a stamped variable fy, the transform's child frame must equal fy's frame, and its parent frame is the resulting frame.

Rule R16 types a statement that publishes a stamped variable. It requires that the variable is properly typed.

Rotation-related Rules. Rule R12 types the creation of a rotation from a 3×3 constant vector. Function orthogonalize() is used to abstract the vector to the corresponding DirSwitch abstract values. Its definition can be found in Figure 6. Specifically, an orthogonal 3×3 vector is composed of three orthogonal unit vectors, i.e., each row and each column has only one non-zero element, of value 1 or −1. Now, if we consider that rows 1, 2, 3 represent the initial x, y, z orientations and columns 1, 2, 3 represent the changed x, y, z orientations, then the column position of the non-zero element in a row indicates the axis of the changed orientation that is the same as the axis of the initial orientation represented by the row. For example, value −1 in row 2 and column 1 denotes that the direction of the y axis of the initial orientation becomes the direction of the −x axis of the changed orientation. When the matrix does not denote an orthogonal rotation, the resulting type is ⊤.
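As a worked illustration (ours), applying this reading to the constant matrix Rx from Figure 3, whose rows are [1, 0, 0], [0, 0, −1], and [0, 1, 0]:

    row 1 = [1, 0, 0]   ⇒ ds_x = "x"
    row 2 = [0, 0, −1]  ⇒ ds_y = "−z"
    row 3 = [0, 1, 0]   ⇒ ds_z = "y"

so orthogonalize(Rx) = ⟨x, −z, y⟩, matching the type assigned to Rx at line 10 in Table 2.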

Rule R13 types a rotation creation from a variable (i.e., a non-constant). It usually corresponds to an angular transformation that is needed for state translation across frames, but not for orientation changes. For example, if a camera can spin with respect to the body, its readings (in the camera frame) have to go through a dynamic angular transformation in order to be interpreted with respect to the body frame.

Table 2: Type checking of example in Figure 3

St# | Statement                                        | Type                                                       | Rule
6   | tf_cam_rotation := tmp                           | ⊤                                                          | R13
10  | Rx := [[1,0,0], [0,0,−1], [0,1,0]]               | ⟨x, −z, y⟩                                                 | R12
13  | Rz := [[0,−1,0], [1,0,0], [0,0,1]]               | ⟨−y, x, z⟩                                                 | R12
20  | tf_cam_rotation := Rx ⊗ tf_cam_rotation          | ⟨x, −z, y⟩                                                 | R15
21  | tf_cam_rotation := Rz ⊗ tf_cam_rotation          | ⟨z, x, y⟩                                                  | R14
23  | tf_cam_rotation := invYZ ⊗ ...                   | ⟨z, −x, −y⟩                                                | R14
25  | rtmp := tf_cam_rotation                          | ⟨z, −x, −y⟩                                                |
30  | ttmp := new_transform(s_cam, s_map, ..., rtmp)   | error, because rotate(⟨f, l, u⟩, ⟨z, −x, −y⟩) = ⟨r, d, f⟩  | R8

Note that in this case, the angular transformation is independent of frame orientations. Both the camera and the body frames have orthogonal orientations. Rule R14 types a rotation composition statement. It leverages function compose() to derive a new rotation type. In the function definition in Figure 6, ⟨ds′_x, ds′_y, ds′_z⟩ and ⟨ds″_x, ds″_y, ds″_z⟩ both denote changes in the orientation, where the former change is followed by the latter. For example, the Rz and Rx vectors in our motivating example (Figure 3) represent the ⟨−y, x, z⟩ and ⟨x, −z, y⟩ changes, respectively, as per rule R12. Part A in the accompanying figure shows these orientation changes in sequence, whereas part B shows the orientation change that is computed by aggregating these two changes. Note that both parts result in the same orientation.
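As a worked illustration (ours) of compose(), the composition at St# 21 in Table 2 aggregates the Rz change ⟨−y, x, z⟩ with the previously accumulated change ⟨x, −z, y⟩:

    ds_x:  ds′_x = "−y"  ⇒ take −ds″_y = −("−z") = "z"
    ds_y:  ds′_y = "x"   ⇒ take  ds″_x = "x"
    ds_z:  ds′_z = "z"   ⇒ take  ds″_z = "y"

so compose(⟨−y, x, z⟩, ⟨x, −z, y⟩) = ⟨z, x, y⟩, which is the type listed for line 21 in Table 2.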

Rule R15 types a rotation composition in which one of the rotations is not static or not orthogonal. Note that an arbitrary angular rotation can be integrated with an orthogonal axes-orientation rotation. Since we only model and check axes orientations, the resulting type is the one that represents the orthogonal orientation changes.

Example. Table 2 presents part of the example code in Figure 3 written in our language, together with the types computed by the type rules. The first column shows the original line numbers of the statements. St# 6 is an assignment of a rotation variable from a state variable. The state variable tmp gets a value from the externally provided state variable rotation. Therefore, as per rule R13, tf_cam_rotation is typed to ⊤. St# 10 and 13 are assignments of rotation variables from constant vectors. Rule R12 allows typing Rx and Rz to ⟨x, −z, y⟩ and ⟨−y, x, z⟩, respectively. Further, St# 20 and 21 denote compositions of rotations: the former is typed with the type of Rx by R15, as the type of tf_cam_rotation is ⊤, whereas the latter is typed with ⟨z, x, y⟩, which is computed by composing the types of Rz and tf_cam_rotation using rule R14 [2]. St# 23 and 25 are similarly typed. Finally, St# 30 constructs a new transform object. However, it cannot be properly typed by rule R8, as applying the rotation ⟨z, −x, −y⟩ to the parent map frame with the conventional orientation ⟨forward, left, up⟩, i.e., FLU, yields the orientation ⟨right, down, forward⟩, contradicting the ROS orientation convention FLU for the camera frame. On the other hand, PhysFrame correctly type-checks the fixed version.

[2] PhysFrame uses SSA to handle variable re-definitions.


Soundness and Completeness. The essence of PhysFrame is to leverage flow-insensitive static analysis (like type checking) to find frame related bugs. It is automated and does not require any user type annotations for subject ROS projects. As such, the type system is neither sound nor complete. This is because some type rules are uncertain. For example, in rule R12, a matrix representing an orthogonal angular transformation may not be intended for frame axes orientation changes; rather, it may exist because the parent and child body parts are installed in a way that they form some orthogonal angle(s), while their frame axes orientations are identical (both FLU). However, since there is no way for PhysFrame to know the real meaning of such a rotation without user input, it considers it an axes orientation rotation, which is the most common case. In addition, stamped variables from third party libraries are widely used and PhysFrame may not be able to type them. Therefore, PhysFrame has both false positives and false negatives. On the other hand, we foresee that future robotic system development frameworks may support full-fledged frame types which developers can use to declare their variables. With such explicit type declarations, a sound and complete type system can be developed.

4.2.3 Frame Related Problems. PhysFrame reports 7 kinds of type errors, 2 kinds of type warnings, and 3 kinds of violations of implicit conventions. The first two are also called inconsistencies.

Type Errors. The first kind is TF tree related errors. These errors include a frame having multiple parent frames, frames forming a cycle, and frames not following conventional orders, e.g., a map frame must be an ancestor of a body frame (Section 2.3).

The second kind of type error is incorrect frames or transforms. ROS allows developers to stamp a state variable x with an empty id. If these variables are published, meaning that other components from different parties may receive such data, they should be stamped with a proper frame (rule R16). Moreover, if they are used in transforms, runtime errors will be triggered as well. In these cases, PhysFrame reports a missing frame error. Although not explicitly modeled by our rules, a few ROS data types require explicitly specifying child frame ids. If child frames are not set, PhysFrame reports missing child frame errors.
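A minimal sketch (ours) of the pattern behind the missing frame error, using a standard stamped ROS message type; the publisher and field values are illustrative:

    #include <ros/ros.h>
    #include <geometry_msgs/PoseStamped.h>

    // A stamped message is published without its frame_id ever being set, so
    // downstream subscribers cannot tell which frame the pose refers to.
    void publishGoal(ros::Publisher& goal_pub, double x, double y) {
      geometry_msgs::PoseStamped goal;
      goal.header.stamp = ros::Time::now();
      // BUG: goal.header.frame_id is left empty; it should name a frame such
      // as "map" before the message is published (cf. rule R16).
      goal.pose.position.x = x;
      goal.pose.position.y = y;
      goal.pose.orientation.w = 1.0;
      goal_pub.publish(goal);
    }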

A transform is required to have different parent and child frames (rule 𝑅8); otherwise it is meaningless, and PhysFrame reports a redundant transform. Rule 𝑅8 also requires that the displacements and rotation in a newly created transform yield (relative) positions and axes orientations consistent with the parent frame and ROS conventions (e.g., body frames should have forward-left-up (FLU) orientations). Otherwise, it reports an incorrect transform.
Type Warnings. A sensor transform that defines a sensor frame with respect to some other robotic body part like base_link, wrist, etc. (or vice versa) should have at least some non-zero displacement(s)/rotation(s), as a sensor's center is unlikely to align with a body part's center. PhysFrame reports a warning when a sensor transform has zero displacements and rotations. For a transform from parent to child, ROS naming conventions follow the templates '<parent>_to_<child>' and '<child>_in_<parent>', which give the transform a clear meaning. Developers, however, often invert the parent and the child in transform variable names; PhysFrame hence reports reversed name warnings (see the sketch below). These two are warnings because they may not affect functionality but rather degrade maintainability. Such checks are not explicitly modeled in the type rules, although PhysFrame supports them.
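The following hypothetical sketch (variable and frame names are our own) illustrates the reversed name pattern:

#include <ros/ros.h>
#include <tf/transform_broadcaster.h>

// Hypothetical sketch: the transform goes from parent "base_link" to child
// "camera_link", so conventional names would be base_link_to_camera_link or
// camera_link_in_base_link; the variable name below is reversed and would
// be flagged with a reversed name warning.
void broadcastCameraTransform(tf::TransformBroadcaster& br, const tf::Transform& t) {
  tf::StampedTransform camera_link_to_base_link(   // reversed name
      t, ros::Time::now(), "base_link", "camera_link");
  br.sendTransform(camera_link_to_base_link);
}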

Implicit Convention Violations. As mentioned in Section 2.3, implicit conventions are mined from launch files. As such, they may not be required but are rather commonly seen. PhysFrame reports three kinds of warnings. The first is the null displacement expected warning, meaning that certain transforms or frames are supposed to have null displacements. The second is the null rotation expected warning, meaning that certain transforms are supposed to have null rotations. The third is the co-occurrence violation warning, reported when frames that are commonly seen together (in other projects) are not both present in a project. Such checks are performed as an additional step after frame and transform types are inferred, because it is hard to express implicit conventions as type rules: the mined conventions may change depending on the repositories used and the z-score threshold setting.

5 EVALUATION
We address the following research questions: RQ1. How effective and efficient is PhysFrame in disclosing frame related inconsistencies? RQ2. What is the response from developers to the inconsistencies reported by PhysFrame? RQ3. What kinds of implicit conventions are violated and what is the impact of the z-score threshold on the results?

5.1 Experiment Setup
Implementation. We have implemented a prototype of PhysFrame in around 4800 lines of Python code. It utilizes a third-party component, cppcheck [26], to obtain an XML dump for each C/C++ file, providing an intermediate representation that contains program tokens, symbol tables for scopes, functions and variables, and an AST for each statement. We have also implemented constant propagation to resolve variables to static values when possible. The miner is implemented using a MySQL database and SQL scripts. Implementation details can be found in Appendix C. The database for the implicit convention miner is built from a list of ROS projects that contain at least one launch file with a static_transform_publisher. We use only the mature projects from this list, where we consider a project mature when it has more than 30 commits. The database contains 12.1K static transforms in 5.3K launch files from 2.2K mature projects. The tool and data are available at https://doi.org/10.5281/zenodo.4959920.
Artifacts. We run the type checker on 180 ROS projects from GitHub. They include (a) 7 projects from the ROS Kinetic distribution that contain at least one issue or commit related to frames; (b) 23 randomly selected projects from the ROS Indigo distribution; and (c) 150 randomly selected ROS projects that contain at least one static transform. Among these projects, 69 are mature and 99 are not (with fewer than 30 commits). Note that we consider evaluation on immature projects important as well, because they are likely written by non-expert ROS developers who can benefit more from PhysFrame. Table 7 in the Appendix shows the statistics of these projects. On average, each project has 45 source/launch files and 52,491 LOC, with the largest one having 433 files and over 7 million LOC.
Examination Process. We examine each of the reported inconsistencies by reviewing the corresponding source code and mark it as a True Positive (TP) or False Positive (FP). The manual examination poses a threat to the validity of the results presented in this paper (please refer to Appendix D). We mitigate it by cross-checking, being very conservative in marking TPs, and confirming with developers.


Table 3: Inconsistencies by Category (the first seven are type errors, the last two are type warnings)

Category | Total (#) | TP C/C++ (#) | TP Launch (#) | FP C/C++ (#) | FP Launch (#) | Repos Total (#) | Repos TP (#) | Repos FP (#)
c_MULTIPLE_PARENTS_IN_FRAME_TREE | 4 | 3 | 1 | 0 | 0 | 4 | 4 | 0
c_CYCLE_IN_FRAME_TREE | 1 | 1 | 0 | 0 | 0 | 1 | 1 | 0
td_INCORRECT_FRAME_ORDER_IN_TREE | 7 | 3 | 4 | 0 | 0 | 7 | 7 | 0
c_INCORRECT_TRANSFORM | 17 | 4 | 4 | 0 | 9 | 9 | 3 | 6
td_REDUNDANT_TRANSFORM | 2 | 2 | - | 0 | - | 2 | 2 | 0
td_MISSING_FRAME | 108 | 92 | - | 16 | - | 56 | 49 | 13
td_MISSING_CHILD_FRAME | 12 | 8 | - | 4 | - | 7 | 4 | 3
td_REVERSED_NAME (type warning) | 33 | 1 | 28 | 1 | 3 | 19 | 17 | 4
c_SENSOR_NULL (type warning) | 6 | 0 | 3 | 0 | 3 | 5 | 2 | 3
Total | 190 | 114 | 40 | 21 | 15 | 110 | 89 | 29
(TP total: 154, i.e., 81.05%; FP total: 36, i.e., 18.95%)

Table 4: Summary of True Positive Inconsistencies

Report & Response | Count (#) and Inconsistency Categories
found by searching commit-history or issue reports of repositories | 3 td_MISSING_FRAME; 1 c_INCORRECT_TRANSFORM
fixed by developers before we reported them | 1 td_MISSING_FRAME; 1 td_REDUNDANT_TRANSFORM; 1 c_CYCLE_IN_FRAME_TREE; 1 c_MULTIPLE_PARENTS_IN_FRAME_TREE
reported, and either fixed by developers or ready to accept a pull-request | 2 td_MISSING_FRAME; 1 td_MISSING_CHILD_FRAME; 1 td_REDUNDANT_TRANSFORM; 1 td_INCORRECT_FRAME_ORDER_IN_TREE
reported, and acknowledged by developers (in some cases, confirmed as a useful fix to others reusing the package node) | 1 td_MISSING_FRAME; 5 td_REVERSED_NAME; 2 c_SENSOR_NULL; 2 td_INCORRECT_FRAME_ORDER_IN_TREE
reported, but refused by developers on the grounds that reuse of the package node is not easy, or the package is no longer maintained | 3 td_MISSING_FRAME
reported, but no response from developers yet | 12 td_MISSING_FRAME; 1 td_MISSING_CHILD_FRAME; 4 c_INCORRECT_TRANSFORM; 14 td_REVERSED_NAME; 1 td_INCORRECT_FRAME_ORDER_IN_TREE; 2 c_MULTIPLE_PARENTS_IN_FRAME_TREE
found in a package node that is reused by other developers; the issue is either fixed or acknowledged by other users in the original repository of the node | 2 td_MISSING_FRAME; 4 td_MISSING_CHILD_FRAME
found in backup-copy files in repositories | 5 td_MISSING_FRAME
potential issues | 8 td_MISSING_FRAME
projects not active in the past two years | 55 td_MISSING_FRAME; 2 td_MISSING_CHILD_FRAME; 3 c_INCORRECT_TRANSFORM; 10 td_REVERSED_NAME; 1 c_SENSOR_NULL; 3 td_INCORRECT_FRAME_ORDER_IN_TREE; 1 c_MULTIPLE_PARENTS_IN_FRAME_TREE

Table 5: Implicit Convention Warnings by Category (W# = number of warnings, R# = number of repositories with warnings)

Category | z=1 W# | z=1 R# | z=2 W# | z=2 R# | z=5 W# | z=5 R# | z=10 W# | z=10 R#
w_sig_null_rot_exp | 18 | 9 | 16 | 7 | 14 | 5 | 9 | 2
w_name_null_rot_exp | 20 | 10 | 20 | 10 | 2 | 2 | 0 | 0
w_name_co-occurrence | 7 | 3 | 0 | 0 | 0 | 0 | 0 | 0

The implicit convention violation warnings are, by their nature, more difficult to judge as real bugs. We hence run the tool on the same set of projects with different z-score threshold values and study the reported warnings. All type checking runs are on a Dell PowerEdge R420 Linux server with two Intel Xeon E5-2690 2.90GHz 8-core processors and 192 GB of RAM.

5.2 Results and Observations
RQ1. Effectiveness and Efficiency. PhysFrame reports 190 inconsistencies with 154 true positives, an 81.05% true positive rate. These inconsistencies belong to 100 projects, over 55% of the total we examined. This number is especially significant considering that many more frame related bugs reported in earlier versions of the systems may have been detected by PhysFrame. PhysFrame also reports 45, 36, 16, and 9 implicit convention violation warnings when the z-score threshold is set to 1, 2, 5, and 10, respectively, which corresponds to p-values smaller than 0.15 for z=1 down to p-values smaller than 0.00001 for z=10. The average analysis time is 52 seconds, with a maximum of 1190 seconds. Further details can be found in Table 6 in the Appendix.
Inconsistencies by Category. Table 3 summarizes the inconsistencies by category, with the first seven being type errors and the last two type warnings. The first column lists the categories. Each has either a '𝑐_' or '𝑡𝑑_' prefix, denoting whether it is a "correctness" or a "technical debt" issue, respectively. The "correctness" issues represent inconsistencies that violate ROS conventions, such as maintaining a tree representation of all the frames in an application and having non-null sensor transforms with respect to robot body frames, as well as incorrect transforms. The incorrect transform issues include those that have invalid displacements/rotations (like our motivating example in Section 3), which can lead to problems at runtime, and those that do not follow ROS naming conventions (discussed in Section 2.2) when defining a child frame in the transform, which affect code readability and may cause issues during code reuse. The "technical debt" issues represent inconsistencies that may make code maintenance and understanding difficult and may lead to potential runtime problems during code reuse/collaboration. The next five columns show, for each category, the total number of inconsistencies, the TP cases in C/C++ code and in launch files, and the FP cases in C/C++ code and in launch files, respectively. The last three columns show the number of projects with inconsistencies, with TPs, and with FPs, respectively.

Many of the reported inconsistencies are about missing frames, i.e., developers forgot to set frame ids for published state values. FP cases of this type are observed due to the use of stamped data types for storing non-frame data (e.g., the binary mask of an image), the use of stamped transforms for tf operations that do not need the frame_id field value, and the passing of stamped data type variables to functions whose signatures are not available during analysis. Similar sources of false positives affect other categories. PhysFrame also found many reversed name warnings. For example, ROS expects transform variables to follow the naming convention "⟨parent⟩_to_⟨child⟩"; we find that many developers use the names reversed. This may cause maintenance and reuse issues, as the convention is expected by other parties. PhysFrame also identified 17 frame type inconsistencies similar to our motivating example in Section 3 and 12 TF tree errors. Given that ROS is built to encourage collaborative development, even the issues affecting code readability or reuse can be important. We present three case studies in Appendix E.
RQ2. Developer Responses. Table 4 summarizes the distribution of the true positives across the different reporting actions, developer responses, and inconsistency categories. Row 1 (below the table title) shows that 4 of the 154 true positives PhysFrame uncovered were known issues to the developers. Row 2 shows that another four were fixed by the time we reached the developers. Among the remaining true positives, we reported the 52 that belong to projects that showed activity in the past 2 years (rows 3-6). We received 18 responses, with 15 either acknowledging the issue or fixing it, and 3 declining our reports because either it is not seen as relevant (the project is not intended for external use, hence conventions do not need to be respected) or the project is no longer maintained. Table 4 also shows that PhysFrame found 6 bugs in reused versions of projects that have already been fixed in the original repositories.
RQ3. Implicit Convention Violations. Table 5 shows the three kinds of implicit convention violations PhysFrame has found and the impact of the z-score threshold. W# denotes the number of warnings and R# the number of projects with warnings. The most popular kind is w_name_null_rot_exp, meaning that a transform with a specific name is expected to have zero rotation; w_sig_null_rot_exp is similar, except that a transform is identified by parent and child frame ids instead of by name. Some of the w_sig_null_rot_exp warnings are persistent (across different z-thresholds), meaning that conventions adopted by almost all projects are being violated.

6 RELATED WORK
Component-based Robot Software. Component-based programming [8, 9] has been extensively used in robot development, aiming to facilitate reuse. Following this principle, several robot software frameworks [25], such as Open Robot Control Software (Orocos) [10], ROS [32], Player [18], and Miro [37], have been developed and widely used. These frameworks commonly provide communication infrastructure across components [17] and substantial library support. As such, respecting conventions is critical for these infrastructures. Unfortunately, they provide no compile time support to ensure frame conventions, one of the most error-prone aspects of robot programming. PhysFrame is a type checking technique that systematically detects problems in using frames.
Type Checking in Robotic Systems. Type checking [11, 33] is a well-known technique to improve software reliability. In robotic software, due to its domain-specific characteristics, variables often have physical-world semantics. Additional type checking can hence be performed considering the physical world. Unit type checking [7, 13] is the most common robotic software type system, which types variables with physical unit information. The seminal work [21, 22, 34] on type checking of units and dimensions extends a programming language. Ayudante [19] builds and compares clusters of variables based on dataflow and the meaning of variable names to detect type (e.g., unit) inconsistencies. The recent work on Phriky [30, 31] supports dimensionality type checks in ROS without developer annotations, and Phys [20] extends it with probabilistic type inference to detect more inconsistencies. In this work we extend type checking to reference frames in robotic software, which requires a distinct type system.
Frame Safety. Lowry et al. [23] proposed a general abstract interpretation system for certifying domain-specific properties and provided a case study certifying the frame-safety of NAIF library programs. The case study checks limited properties (e.g., frame consistencies) and the system needs annotations for program inputs. In contrast, ours is a fully automated tool checking many properties for ROS projects (e.g., TF tree shape and frame orientations) without requiring annotations. Our abstract domains are hence different from theirs, and our method is type system based. The work by Basir et al. [6] on automated theorem proving for automatically generated code also presented a case study verifying two frame-safety requirements. The META-AMPHION system [24] helps domain experts build domain-specific program synthesizers. In the context of frame-safety, we believe this work would enable the synthesis of programs that are frame-safe given the correct domain theory. In our work, we focus on finding frame issues in already developed code. Overall, we do not need annotations, have a push-button approach, and provide a broad assessment of real code.

7 CONCLUSION
We develop a fully automated type checking technique for reference frames in robotic systems, one of the most subtle and error-prone aspects of robotic software engineering. We explain the semantics of frames and frame transformations, and propose a novel way to abstract them. Type checking rules are developed based on the domain specific semantics. Applying our technique to 180 ROS projects from GitHub discloses over 190 errors and warnings, with 154 true positives. We reported 52 of them and had received 18 responses by the time of submission, with 15 fixed/acknowledged.

ACKNOWLEDGMENTS
We thank our insightful reviewers. This research was supported, in part, by NSF 1901242, 1910300, 1909414 and 1853374, and ONR N0001417-12045, N000141410468 and N000141712947. Any opinions, findings, and conclusions in this paper are those of the authors only and do not necessarily reflect the views of our sponsors.


REFERENCES
[1] 2015. diff_drive_controller is hard-coded into the 'odom' reference frame #204. https://github.com/ros-controls/ros_controllers/issues/204
[2] 2017. TF: Debugging Tools. http://wiki.ros.org/tf/Debuggingtools
[3] 2019. ORB-SLAM2 ROS node. http://wiki.ros.org/orb_slam2_ros
[4] 2019. T265 frame questions #772. https://github.com/IntelRealSense/realsense-ros/issues/772
[5] 2020. Incorrect frame tree cartographer ros and t265 camera #1599. https://github.com/IntelRealSense/realsense-ros/issues/1599
[6] Nurlida Basir, Ewen Denney, and Bernd Fischer. 2010. Deriving Safety Cases for Hierarchical Structure in Model-Based Development. In Proceedings of the 29th International Conference on Computer Safety, Reliability, and Security (Vienna, Austria) (SAFECOMP'10). Springer-Verlag, Berlin, Heidelberg, 68-81.
[7] Geoffrey Biggs. 2007. Designing an application-specific programming language for mobile robots. Ph.D. Dissertation. ResearchSpace@Auckland.
[8] Alex Brooks, Tobias Kaupp, Alexei Makarenko, Stefan Williams, and Anders Oreback. 2005. Towards component-based robotics. In 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 163-168.
[9] Davide Brugali and Patrizia Scandurra. 2009. Component-based robotic engineering (part i) [tutorial]. IEEE Robotics & Automation Magazine 16, 4 (2009), 84-96.
[10] Herman Bruyninckx. 2001. Open robot control software: the OROCOS project. In Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation (Cat. No. 01CH37164), Vol. 3. IEEE, 2523-2528.
[11] Luca Cardelli. 1996. Type systems. ACM Computing Surveys (CSUR) 28, 1 (1996), 263-264.
[12] crigroup. 2017. cubes_task.launch. https://git.io/JtDhp
[13] Kostadin Damevski. 2009. Expressing measurement units in interfaces for scientific component software. In Proceedings of the 2009 Workshop on Component-Based High Performance Computing. 1-8.
[14] Dawson Engler, David Yu Chen, Seth Hallem, Andy Chou, and Benjamin Chelf. 2001. Bugs as Deviant Behavior: A General Approach to Inferring Errors in Systems Code. In Proceedings of the Eighteenth ACM Symposium on Operating Systems Principles (Banff, Alberta, Canada) (SOSP '01). Association for Computing Machinery, New York, NY, USA, 57-72. https://doi.org/10.1145/502034.502041
[15] Tully Foote. 2013. tf: The transform library. In IEEE International Conference on Technologies for Practical Robot Applications (TePRA), 2013 (Open-Source Software workshop). 1-6. https://doi.org/10.1109/TePRA.2013.6556373
[16] Tully Foote and Mike Purvis. 2010. Standard Units of Measure and Coordinate Conventions. https://www.ros.org/reps/rep-0103.html
[17] Open Source Robotics Foundation. [n.d.]. ROS Topics homepage. http://wiki.ros.org/Topics
[18] Brian Gerkey, Richard T Vaughan, and Andrew Howard. 2003. The player/stage project: Tools for multi-robot and distributed sensor systems. In Proceedings of the 11th International Conference on Advanced Robotics, Vol. 1. 317-323.
[19] Irfan Ul Haq, Juan Caballero, and Michael D Ernst. 2015. Ayudante: Identifying undesired variable interactions. In Proceedings of the 13th International Workshop on Dynamic Analysis. 8-13.
[20] Sayali Kate, John-Paul Ore, Xiangyu Zhang, Sebastian Elbaum, and Zhaogui Xu. 2018. Phys: probabilistic physical unit assignment and inconsistency detection. In Proceedings of the 2018 26th ACM Joint Meeting on European Software Engineering Conference and Symposium on the Foundations of Software Engineering. 563-573.
[21] Andrew Kennedy. 1994. Dimension Types. In Proceedings of the 5th European Symposium on Programming: Programming Languages and Systems (ESOP '94). Springer-Verlag, Berlin, Heidelberg, 348-362.
[22] Andrew Kennedy. 2009. Types for Units-of-Measure: Theory and Practice. In Proceedings of the Third Summer School Conference on Central European Functional Programming School (Budapest, Hungary) (CEFP '09). Springer-Verlag, Berlin, Heidelberg, 268-305.
[23] Michael Lowry, Thomas Pressburger, and Grigore Rosu. 2001. Certifying Domain-Specific Policies. In Proceedings of the 16th IEEE International Conference on Automated Software Engineering (ASE '01). IEEE Computer Society, USA, 81.
[24] Michael R. Lowry and Jeffrey Van Baalen. 1997. META-AMPHION: Synthesis of Efficient Domain-Specific Program Synthesis Systems. Automated Software Engg. 4, 2 (April 1997), 199-241. https://doi.org/10.1023/A:1008637201658
[25] Gergely Magyar, Peter Sinčák, and Zoltán Krizsán. 2015. Comparison study of robotic middleware for robotic applications. In Emergent Trends in Robotics and Intelligent Systems. Springer, 121-128.
[26] Daniel Marjamaeki. 2017. Cppcheck - A tool for static C/C++ code analysis. http://cppcheck.sourceforge.net/
[27] Wim Meeussen. 2010. Coordinate Frames for Mobile Platforms. https://www.ros.org/reps/rep-0105.html
[28] Thomas Moulard. 2011. Coordinate Frames for Humanoid Robots. https://www.ros.org/reps/rep-0120.html
[29] Raúl Mur-Artal and Juan D. Tardós. 2017. ORB-SLAM2: an Open-Source SLAM System for Monocular, Stereo and RGB-D Cameras. IEEE Transactions on Robotics 33, 5 (2017), 1255-1262. https://doi.org/10.1109/TRO.2017.2705103
[30] John-Paul Ore, Carrick Detweiler, and Sebastian Elbaum. 2017. Lightweight Detection of Physical Unit Inconsistencies Without Program Annotations. In Proceedings of the 26th ACM SIGSOFT International Symposium on Software Testing and Analysis (Santa Barbara, CA, USA) (ISSTA 2017). ACM, New York, NY, USA, 341-351. https://doi.org/10.1145/3092703.3092722
[31] John-Paul Ore, Carrick Detweiler, and Sebastian Elbaum. 2017. Phriky-Units: A Lightweight, Annotation-Free Physical Unit Inconsistency Detection Tool. In Proceedings of the 26th ACM SIGSOFT International Symposium on Software Testing and Analysis (Santa Barbara, CA, USA) (ISSTA 2017). Association for Computing Machinery, New York, NY, USA, 352-355. https://doi.org/10.1145/3092703.3098219
[32] Morgan Quigley, Ken Conley, Brian Gerkey, Josh Faust, Tully Foote, Jeremy Leibs, Rob Wheeler, and Andrew Y Ng. 2009. ROS: an open-source Robot Operating System. In ICRA Workshop on Open Source Software, Vol. 3.2. Kobe, Japan, 5.
[33] John C Reynolds. 1974. Towards a theory of type structure. In Programming Symposium. Springer, 408-425.
[34] Mikael Rittri. 1995. Dimension Inference under Polymorphic Recursion. In Proceedings of the Seventh International Conference on Functional Programming Languages and Computer Architecture (La Jolla, California, USA) (FPCA '95). Association for Computing Machinery, New York, NY, USA, 147-159. https://doi.org/10.1145/224164.224197
[35] Clearpath Robotics. 2017. Robots - PR2. http://wiki.ros.org/Robots/PR2
[36] Wim Meeussen, Tully Foote, and Eitan Marder-Eppstein. [n.d.]. ROS tf package homepage. http://wiki.ros.org/tf, http://wiki.ros.org/tf2
[37] Hans Utz, Stefan Sablatnog, Stefan Enderle, and Gerhard Kraetzschmar. 2002. Miro - middleware for mobile robot applications. IEEE Transactions on Robotics and Automation 18, 4 (2002), 493-497.


APPENDIX

Figure 7: PR2 Robot Coordinate Frames (a square node represents a frame)

A AN EXAMPLE LAUNCH FILE SNIPPET
The following shows a snippet from a launch file defining a transform from a base_link frame to an optical depth frame for a Robotiq gripper simulation [12].

<node name="kinect_link_broadcaster" pkg="tf" type="static_transform_publisher"
      args="0.536 0 1 1.5708 3.14159 0 base_link openni_depth_optical_frame 100" />

The name field uniquely identifies the transform, and the args field provides the parameters of the transform: the first three values are the 𝑥, 𝑦, 𝑧 displacements of the origin of the child frame with respect to the parent frame; the next three values are the yaw, pitch, roll rotation angles; they are followed by the parent and child frame ids (the final value is the publishing period in milliseconds).
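For illustration, a rough C++ equivalent of this static_transform_publisher node is sketched below (this code is ours and is not generated from the launch file; it assumes a running ROS node):

#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
#include <tf/transform_datatypes.h>

// Sketch: broadcast the same base_link -> openni_depth_optical_frame
// transform that the launch file above declares.
void broadcastKinectLink(tf::TransformBroadcaster& br) {
  tf::Transform t;
  t.setOrigin(tf::Vector3(0.536, 0.0, 1.0));                          // x, y, z
  t.setRotation(tf::createQuaternionFromRPY(0.0, 3.14159, 1.5708));   // roll, pitch, yaw
  br.sendTransform(tf::StampedTransform(t, ros::Time::now(),
                                        "base_link",                    // parent frame
                                        "openni_depth_optical_frame")); // child frame
}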

B IMPLICIT CONVENTION MINER
The miner is executed only once on the collected static transform database. The produced results are then used in the analysis of each subject ROS project. It identifies the following information.
• Pairs of transforms that commonly co-exist in a project. For example, PhysFrame finds that the transform base_link → left_wheel co-exists with base_link → right_wheel. As such, violations of this implicit convention may potentially be buggy.
• Transforms commonly having zero displacements, such as kinect_link → kinect_depth_frame.
• Transforms commonly having zero rotation, such as world → map.
Since these patterns may happen coincidentally, we use the following z-score [14] to quantify their certainty:

$$ z(n, e) = \frac{e/n - p_0}{\sqrt{p_0 (1 - p_0) / n}} $$

For a relation (pattern) denoted as a pair of premise and conclusion, 𝑛 is the number of times the premise is observed and 𝑒 is the number of times both the premise and the conclusion are observed. For example, in computing the z-score for a transform 𝐴 having zero rotation, 𝑛 is the number of times 𝐴 is observed and 𝑒 is the number of times 𝐴 has zero rotation. The coefficient 𝑝0 denotes the expected likelihood that a conclusion is satisfied; we use 𝑝0 = 0.9 following the literature. Note that 𝑧 grows as 𝑛 increases and 𝑛 − 𝑒 decreases.
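As a concrete (hypothetical) instance of the formula, suppose a transform 𝐴 is observed in 𝑛 = 50 mature projects and has zero rotation in 𝑒 = 49 of them; then

$$ z(50, 49) = \frac{49/50 - 0.9}{\sqrt{0.9 \times 0.1 / 50}} \approx \frac{0.08}{0.0424} \approx 1.89, $$

so the relation would be output as an implicit convention with threshold 𝑡 = 1 but not with 𝑡 = 2 or higher.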

C PROTOTYPE IMPLEMENTATION
The prototype implementation of PhysFrame contains around 4800 lines of Python code and 160 lines of SQL script. It can be invoked on the command line with the project path as input.

The implicit convention miner uses MySQL to store the static transforms (present as static_transform_publisher nodes in launch files) extracted from a list of 6K projects. The list was generated using the GitHub code search API as follows: downloading 55K launch files that contain a static_transform_publisher from public GitHub repositories; downloading an original version of each file that was found to be modified in the commit history of its GitHub repository; removing files that could not be parsed correctly; and removing duplicate files, as launch files are often shared between projects. Each row in the database represents a static transform with the following fields: a launch file id, a project id, each of the attributes and arguments of the static_transform_publisher node, a flag indicating the version of the launch file (original or later commit), and a flag indicating whether the project is mature (more than 30 commits). Note that we skip static transforms that employ macro expansion in the node arguments (which we leave as future work). In total, our database contains 26K static transforms from 13K files in 6K projects, of which 12.1K static transforms are from mature projects. The miner extracts conventions from the 12.1K static transforms.

Further, the miner uses SQL scripts to build a table of observed relations (patterns) for each of the kinds discussed in Appendix B and to compute the 𝑧-scores of those relations. Then, given the z-score threshold 𝑡, it outputs the relations with 𝑧 > 𝑡 as implicit conventions. Note that the execution of the miner is a one-time step; it is not run each time the tool is invoked on a ROS project to be type-checked.

When PhysFrame is invoked on a subject project, the tool first executes the file-processor component to obtain, for each launch file in the project, the group of C/C++ and launch files that are executed together. A C++ ROS project contains a set of packages, a set of nodes in each package and, optionally, a set of launch files which invoke nodes and other launch files. A node can also be invoked on its own. A node is specified using a package name and an executable name. Therefore, the file-processor processes package.xml files to obtain the package names available in the project, and CMakeLists.txt files to obtain the sets of C/C++ files belonging to executables. We use the cmakelists_parsing Python module as a CMakeLists.txt parser. Note that the file-processor considers only those nodes that are present in the project; a launch file may also invoke other projects' nodes, e.g., ROS core nodes. These file groups are used to construct valid TF trees.

Further, recall that our type system encodes the displacement fields in the types with static numerical values, if any. For that, we implement constant propagation, performed before the type checking. At the time of this work, constant propagation is implemented at the function level: it follows a visitor pattern to annotate numerical constants and variables with their values, which can be either a number or a numerical vector, and then propagates the annotated values on the AST of each statement in the function.

Finally, the type checker component applies the type rules to the variables and statements of each C/C++ file present in the project. The implementation supports the transfer of types across functions: from a function's return type to the caller, and from a caller to the function's parameters. At the time of this work, this transfer happens between functions at the file level; extending it to the file-group level is left as future work. The TF trees constructed through the type rules are, however, maintained at the file-group level. In addition, we assign types to static transforms in launch files, add them to the TF trees, and check for any type errors. After inference, PhysFrame performs additional checks on the inferred types in both source code and launch files, which include the sensor_null, reversed_name, and implicit convention violation checks.

D THREATS TO VALIDITY
External Threats. The set of benchmarks used in the evaluation may not be representative. We mitigate this threat by using a large set of 180 randomly selected projects with various levels of maturity and complexity (the most complex having over 7 million LOC). We argue that evaluation on immature projects is equally important, as the large group of non-expert ROS developers (of immature projects) are the ones that suffer the most from frame problems. In the future, we foresee pushing our type checker to become part of a mainstream robotic software development framework so that the technique can be better evaluated in practice and used during development instead of after deployment.
Internal Threats. We manually label true positive type errors, and human errors may lead to evaluation bias. We mitigate this threat by: 1) reporting some of the findings to the developers and the ROS answers forum; 2) cross-checking the labeling results between two authors; and 3) being conservative in labeling true positives. We mark any case that has doubt or inconsistent labels (across authors) as a false positive.

E CASE STUDIES
Case I: Redundant transform. Figure 8 presents a redundant transform detected by PhysFrame. The code invokes the lookupTransform function to obtain the transform between identical parent and child frames ('base_link') from the tf tree, which is essentially an identity transform with zero displacements and zero rotation. Manual analysis found that the transform is used in a number of places to compute the distance between the robot ('base_link') and the target goal; these uses amount to useless computation. After we reported it, the developer acknowledged that, although it does not affect correctness, it is completely unnecessary and can cause confusion.

Figure 8: Example of a redundant transform (source: https://git.io/Jt5Vs)
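A minimal sketch of this pattern is shown below (our own illustration, not the exact code in Figure 8):

#include <ros/ros.h>
#include <tf/transform_listener.h>

// Sketch: looking up the transform between identical parent and child frames
// always yields the identity (zero displacement, zero rotation), which
// PhysFrame reports as td_REDUNDANT_TRANSFORM.
void redundantLookup(const tf::TransformListener& listener) {
  tf::StampedTransform t;
  try {
    listener.lookupTransform("base_link", "base_link", ros::Time(0), t);
  } catch (const tf::TransformException& ex) {
    ROS_WARN("%s", ex.what());
  }
  // Any distance computed from t is meaningless extra work.
}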

Case II: Missing frame. Figure 9 shows the code snippet related to a missing frame error reported by PhysFrame; it was also reported in the swri-robotics/gps_umd GitHub repository. The function on the left of the figure is part of this repository and is responsible for publishing the navigation satellite (i.e., GPS) fix, whereas the function on the right is from another package (repository) called robot_localization and fuses the published fix data into the robot's position estimate. Since robot_localization transforms GPS data into a frame that matches the robot's starting pose (position and orientation) in its world frame, it needs the frame information of the received GPS data. However, the function on the left does not specify the frame information in the published fix, so robot_localization assumes that the navsat device is mounted at the robot's origin. This can be problematic if the navsat device is mounted at a different pose. This illustrates that the frame information of published data becomes important when integrating different packages. Moreover, the missing frame inconsistency makes the code harder to understand and maintain.

Figure 9: Example of missing frame information in the published ROS message (fixed in: https://git.io/JtSJv)

Case III: Cycle in the frame tree. Figure 10 illustrates an example of a loop in the TF tree. Observe that the two published transforms fcu→fcu_frd and fcu_frd→fcu form a cycle. The tf package maintains the parent-child relationships between frames in a directed tree structure, which by definition cannot have cycles. A cycle not only invalidates the tree structure but can also lead to multiple paths between two nodes, resulting in more than one net transform between the corresponding frames, which in turn causes ambiguity when computing a net transform between two frames.

Figure 10: Example of a loop in the tf tree (fixed in: https://git.io/Jt5VB)


Table 6: Inconsistencies and Implicit Convention Violation Warnings
(Implicit Warnings #: a single value 0 means no warnings were reported at any of the thresholds; '-' means the project does not qualify for the implicit convention checks because it contains no launch files with static transforms.)

Github Repository TP(#) FP(#) Implicit(z=1,2,5,10) Run-time(s) | Github Repository TP(#) FP(#) Implicit(z=1,2,5,10) Run-time(s)
pheeno_ros 1 0 - 0 | victim_localization 1 1 0 4
mavros 3 0 0 52 | LAM_SLAM 1 0 0 62
rosflight 3 0 - 2 | ipc_slam_cnn 1 1 1,1,0,0 1
ur_modern_driver 1 0 - 13 | denso_tomato 0 4 0 5
orb_slam_2_ros 1 0 - 291 | mbzirc_task2 2 0 0 4
gps_umd 1 0 - 1 | SpCoSLAM_Lets 0 1 0 135
se306p1 4 0 - 1 | catkin_ws 3 2 0 8
roman-technologies 4 0 - 6 | robotx 2 0 0 9
CRF 1 0 0 8 | IntelligentRobotics 1 0 0 8
roscorobot 2 0 - 20 | riptide2017 1 0 0 0
rotors_simulator_with_drcsim_env 2 0 - 6 | air_manipulator 12 0 0 23
px4_offboard_velocity_control 1 0 - 0 | summit-xl-ros-stack 2 0 0 10
velocity_marker_tracker 2 0 - 0 | pick_n_place 0 0 2,2,0,0 200
MultiRobotExplorationPackages 4 1 0 424 | Mobile-Robotics-Development-Platform 1 0 0 25
body_axis_controller 1 0 - 0 | jaguar-bot 2 2 0 53
body_axis_velocity_controller 1 0 - 0 | semantic_labels_sys 0 0 1,0,0,0 45
marker_tracker 1 0 - 1 | Racecar-Platform 1 1 0 4
Loam_Matlab_SMP 1 0 - 5 | fyp_uav_radar 1 0 0 7
arduino_brushless 1 0 - 1 | PX4_Drone 3 0 0 0
frobit_lego_transporter 1 0 - 1 | depth_tools 1 0 0 0
wiic_twist 1 0 - 1 | IntelligentCar 0 3 0 39
Create_PointCloud-ROS 1 0 - 1 | rapid_pbd 0 1 0 9
autorally 2 0 - 9 | Lidar_Utility 1 0 0 18
ua-ros-pkg 2 0 0 162 | ros_webcam_lidar 1 0 0 0
model_car 3 0 0 41 | lidar_slam_3d 0 1 0 1
door_detection_smr 3 0 - 4 | Autonomous_wheelchair_navigation 3 0 0 0
table_cleaner_ur10 6 0 - 18 | AutoNavQuad 1 0 0 3
ece6460_examples 1 0 0 1 | Mini-Lab 2 1 0 14
lsd_slam 1 0 0 133 | gamesh_bridge 0 1 0 0
los_slam 1 0 0 2 | rockin_logger 1 0 0 1
Robotic 2 0 0 3 | hpp_source_code 0 0 1,1,0,0 9
USV-ROS-Stack 1 2 0 7 | chairbot_selfdrive 0 0 1,1,1,0 2
Inverse-Kinematics 2 0 0 2 | mycomputer 1 0 0 2
ifacWC2020 0 0 3,2,0,0 17 | NWPU_robot 4 0 0 35
gki_pr2_symbolic_planning 0 1 0 7 | match_catch 5 0 2,2,2,0 1190
util 2 0 0 1 | HislandAutonomousRobot 1 0 0 1
derail-fetchit-public 2 1 2,1,1,0 13 | epuck_world 1 0 0 0
WaterGun2016 1 0 0 139 | human_dialogue 0 2 0 22
PiObs 2 0 0 0 | monoodometer 3 0 0 49
16-662_Robot_Autonomy_Project 1 0 0 0 | pushing_benchmark 0 0 5,5,2,0 217
dyros_jet_dg 1 0 0 69 | hrp_mpc_lidar 1 0 4,4,2,2 868
robo-games 2 0 0 2 | scan_tools 0 0 1,1,0,0 3
ROS-Challenges 2 0 0 2 | kill_test 0 1 0 18
RAS-2016 3 1 0 10 | thesis-repo 3 0 0 1
cam_mocap_calib 0 3 0 1 | dashgo 0 0 6,6,0,0 1
bigfoot 0 0 1,1,0,0 2 | iarc-2017 4 0 1,1,0,0 21
ros_project 4 0 0 5 | iarc-mission8 1 0 1,0,0,0 26
ae-scoot 0 0 7,7,7,7 40 | grasping_oru 0 2 0 13
MR2_ROS 0 2 5,0,0,0 1 | navigation_oru-release 4 0 0 457
Fake-LaserScan 0 0 1,1,1,0 0 | Distributed_Collaborative_Task_Tree_ubuntu-version-16.04 0 1 0 14
Total: TP 154 [81.05%], FP 36, Implicit Warnings 45, 36, 16, 9, Run-time 5162


Table 7: Projects used in the evaluation of PhysFrame
(C = C/C++ source files, L = Launch files, LOC = non-blank and non-commented lines of C/C++ source and header files as computed by CLOC)

Github Repository C(#) L(#) LOC | Github Repository C(#) L(#) LOC | Github Repository C(#) L(#) LOC
cam_mocap_calib 2 3 397 | bigfoot 10 14 1282 | ros_project 16 4 3224
ae-scoot 44 15 12405 | jaguar-bot 96 52 20089 | tbot_arm_ws 1 2 65
nbvplanner 11 10 3291 | MR2_ROS 5 6 1459 | depth_tools 4 2 473
IntelligentCar 86 44 46421 | semantic_labels_sys 102 151 21031 | LAM_SLAM 151 30 25638
ipc_slam_cnn 3 3 2769 | rapid_pbd 33 15 7361 | Baxter-Uncertainty 3 3 3783
DaVinci-Haptics 1 3 314 | Racecar-Platform 10 22 3885 | PX4_Drone 3 3 236
A2DR_CHULA 21 37 9221 | csir-manipulation 3 9 465 | robotx 25 7 5698
robond_project_6 4 5 71 | IntelligentRobotics 17 37 7435 | ME495Final 1 6 229
velodyne_driver 17 7 1799 | Robotics 28 30 4814 | crazy_lidar 5 3 1698
crazypi 1 2 471 | RAS-2016 43 34 7835 | denso_tomato 20 12 4458
mbzirc_task2 17 29 3422 | ros_assignments 15 2 926 | catkin_ws 44 66 5872
Robotics-Localization 2 6 28 | team2 7 17 18263 | team3 8 10 18499
AGV_ws 16 31 6799 | 16-662_Robot_Autonomy_Project 3 5 103 | dyros_jet_dg 34 24 16401
kobuki_navigation 1 10 163 | quadrotor_moveit_nav 5 13 336 | robo-games 6 17 1124
ROS-Challenges 8 16 1728 | mbot_arm_DRL 1 3 103 | ArDroneControl 6 7 1585
kuka_lwr_simulation 10 59 2177 | contamination_stack 5 6 47856 | VehicleControl 147 38 31301
riptide2017 2 6 507 | air_manipulator 78 27 13507 | udacity_bot 1 3 28
summit-xl-ros-stack 28 187 173054 | fyp_uav_radar 33 64 445617 | ece6460_examples 29 14 1486
lsd_slam 48 12 16489 | at_home_rsbb 14 6 3873 | los_slam 8 5 1434
Robotic 18 17 2164 | udacity_bot-d 1 3 29 | USV-ROS-Stack 27 16 5323
Inverse-Kinematics 18 20 2380 | ifacWC2020 30 8 9102 | Udacity-RoboND-project-2-Kinematics-Project-master 3 18 1199
gki_pr2_symbolic_planning 49 3 7461 | robot_2d_nav 3 1 362 | NASA-RMC-2020-NorthstarRobotics 21 37 2436
util 4 4 970 | carl_bot 14 17 2745 | carl_moveit 2 9 1118
derail-fetchit-public 39 29 10420 | rail_ceiling 3 11 846 | WaterGun2016 93 14 15151
PiObs 1 5 17 | Where-Am-I-Udacity 2 6 29 | victim_localization 16 3 2667
Fake-LaserScan 1 1 44 | pick_n_place 37 6 21281 | Mobile-Robotics-Development-Platform 74 21 14200
RosBat 6 7 1261 | ultrasonic_to_laserscan 5 1 532 | gps_umd 3 0 452
mavros 69 13 13065 | orb_slam_2_ros 59 5 22460 | pheeno_ros 5 4 549
realsense-ros 3 12 1771 | rosflight 9 0 1850 | ur_modern_driver 22 12 5697
door_detection_smr 17 3 8127 | table_cleaner_ur10 62 50 17235 | ua-ros-pkg 125 123 38418
model_car 22 16 13928 | autorally 35 42 9220 | arduino_brushless 4 0 447
frobit_lego_transporter 2 5 195 | RoboticBugAlgo 8 0 1170 | wiic_twist 3 0 795
Create_PointCloud-ROS 6 0 384 | CS491_Autonomous_Buggy 74 15 28874 | MultiRobotExplorationPackages 206 78 7115053
body_axis_controller 1 0 61 | body_axis_velocity_controller 1 0 59 | marker_tracker 1 3 76
px4_offboard_velocity_control 1 0 214 | rotors_simulator_with_drcsim_env 27 47 4579 | velocity_marker_tracker 1 0 53
Loam_Matlab_SMP 6 3 2209 | roscorobot 47 19 14831 | CRF 23 4 3386
roman-technologies 23 0 13102 | catkin_roboway 46 38 25360 | NWPU_robot 95 59 23834
baxter_project 413 20 79454 | AprilTag_Localization 93 22 196152 | autonomus_transport_industrial_system 2 5 561
gx2019_omni_simulations 5 12 921 | uwb-localization 23 3 6324 | match_catch 65 146 135516
Lidar_Utility 72 41 9300 | ros_leishen_lidar 2 1 253 | ros_webcam_lidar 3 2 253
pose_estimation 3 3 1083 | lidar_slam_3d 4 1 885 | surgical_robot 1 12 17
Autonomous_wheelchair_navigation 1 21 94 | slam_autonomous_navigation 38 14 8450 | F1tenth 5 11 171
AutoNavQuad 17 26 3492 | SpCoSLAM_Lets 63 11 11488 | Mini-Lab 49 66 6941
gamesh_bridge 1 2 74 | rockin_logger 1 2 160 | head_meka 0 1 0
meka_pmb2_ros 4 59 398 | hpp_source_code 0 2 0 | RoboND-Localization-Project 2 6 29
amcl_3d 11 2 1518 | autoware_rc_car 5 6 57767 | rc_car 3 9 200
vehicle_sim 2 7 196 | ros_lecture19_pkg 5 1 179 | keti_ws 76 13 42784
ros-detection 1 3 12822 | chairbot_selfdrive 10 20 1497 | tms_ss_pot 12 10 2565
mycomputer 14 50 5079 | epuck_world 2 3 531 | Distributed_Collaborative_Task_Tree_ubuntu-version-16.04 46 73 8867
human_dialogue 60 68 13760 | ENPM-661-Planning-Projects 1 1 67 | monoodometer 73 16 17056
3D-Mapping-Navigation-of-Quadrotors 2 17 2015 | pushing_benchmark 58 25 62266 | tangentbug 2 1 332
hrp_mpc_lidar 154 95 108073 | pioneer_tools 5 10 230 | turtlesim_pioneer 5 1 266
rcah18_pepper_navigation 1 11 280 | se306p1 16 0 1851 | sequential_scene_parsing 22 4 8755
3d_mapping 6 21 629 | scan_tools 12 8 2346 | ROSBeginner 4 10 125
px4VIO 9 1 1491 | ua-catvehicle-dev 5 24 567 | octagon 24 14 2425
kill_test 50 26 15288 | contactdb_utils 11 12 1620 | thesis-repo 33 5 860
oko_slam 29 23 7381 | dashgo 4 19 3346 | HislandAutonomousRobot 6 6 632
iarc-2017 45 56 10578 | iarc-mission8 27 39 14238 | theLearner 11 1 1290
onigiri_war 1 8 106 | Mini-Lab-8 8 39 626 | grasping_oru 15 16 6911
navigation_oru-release 134 95 84833 | yumi_demos 3 12 917 | orsens_ros 2 4 1137