Available online at www.sciencedirect.com
Journal homepage: www.elsevier.com/locate/dcan
Digital Communications and Networks (2015) 1, 112–124

Building technology platform aimed to develop service robot with embedded personality and enhanced communication with social environment

Aleksandar Rodić a,*, Miloš Jovanović a, Ilija Stevanović a, Branko Karan b, Veljko Potkonjak c

a Mihajlo Pupin Institute, University of Belgrade, Volgina 15, 11060 Belgrade, Serbia
b Institute of Technical Sciences, Serbian Academy of Sciences and Arts, Knez Mihajlova 35, 11000 Belgrade, Serbia
c School of Electrical Engineering, University of Belgrade, Bulevar Kralja Aleksandra 73, 11000 Belgrade, Serbia

Received 24 December 2014; received in revised form 3 March 2015; accepted 12 March 2015; available online 6 April 2015

http://dx.doi.org/10.1016/j.dcan.2015.03.002
2352-8648/© 2015 Chongqing University of Posts and Telecommunications. Production and hosting by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
* Corresponding author. E-mail addresses: [email protected] (A. Rodić), [email protected] (M. Jovanović), [email protected] (I. Stevanović), [email protected] (B. Karan), [email protected] (V. Potkonjak). Peer review under responsibility of Chongqing University of Posts and Telecommunications.

KEYWORDS: Human-centric robot; Personal robot; Artificial emotional intelligence

Abstract. The paper addresses the prototyping of a technology platform aimed at developing an ambient-aware, human-centric indoor service robot with attributes of emotional intelligence that enhance interaction with its social environment. The robot consists of a wheel-based mobile platform with a spinal (segmented) torso, a bi-manual manipulation system with multi-finger robot hands, and a robot head. The prototype was designed to see, hear, speak, and use its multimodal interface for enhanced communication with humans. The robot can demonstrate affective and social behavior by using its audio and video interfaces as well as body gestures. It is equipped with an advanced perception system based on heterogeneous sensors, including a laser range finder, ultrasonic distance sensors and proximity detectors, a 3-axis inertial sensor (accelerometer and gyroscope), a stereo vision system, two wide-range microphones, and two loudspeakers. The device is foreseen to operate autonomously, but it may also be operated remotely from a host computer through a wireless communication link, as well as by a smartphone, based on an advanced client-server architecture. The prototype has embedded attributes of artificial intelligence and utilizes advanced cognitive capabilities such as spatial reasoning, obstacle and collision avoidance, and simultaneous localization and mapping. The robot is designed







in a manner that enables uploading new, or modifying existing, algorithms of emotional intelligence that should provide the robot with human-like affective and social behavior. The key objective of the project presented in the paper is to build an advanced technology platform for research and development of personal robots intended for different purposes, e.g. entertainment robots, butler robots, robots for medical care, security robots, etc. In a word, the designed technology platform is expected to help in the development of human-centered service robots to be used at home, in offices, in public institutions, etc.
© 2015 Chongqing University of Posts and Telecommunications. Production and hosting by Elsevier B.V.

This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

1. Introduction

Human-centered high-tech systems, i.e. personal devices (PCs, smartphones, cars, or personal robots), are designed to be used by humans in everyday life. Personal devices (widely accepted and available to the majority) have been recognized in recent decades as a forthcoming technology of modern society that will initiate a new technological revolution, much as the invention and exploitation of the Internet did in the 1990s. Robotic scholars and practitioners particularly emphasize human-like movement, human-like intelligence, and human-like communication as necessary features of future personal robots [1–3] as the technology of the 21st century.

Recognizing the need to intensify work in human-centric robotics, a group of academic institutions [4] has launched a project of research and development of ambient-intelligent service robots. Development of new laboratory prototypes of intelligent service robots, intended for applications in indoor environments, was designated as one of the main project objectives. Equally important was a variety of software tasks, ranging from modeling and simulation of the motion of service robots to algorithms of multimodal interaction and emotion modeling, including social, i.e. inter-personal (human–robot or robot–robot), interaction. The prototype is developed with two main objectives in mind: first, to serve as a test-bed for scientific research and, second, to provide a basis for possible commercial and noncommercial applications. The intended operation area (indoor space) influenced the selection of the locomotion structure and navigation sensors. Starting from the decomposition of the motion of a service robot into transport (global) motion, by which the robot changes its operating position (using legs or wheels), and local task-specific manipulation, by which the working task is directly performed, a structure consisting of a mobile platform with rolling wheels as the means of transport has been adopted as the basis of the prototype. Similarly, navigation is based on sensors suitable for operation in indoor environments, primarily ultrasonic sensors and low-cost visual depth sensors. In order to bring the manner of carrying out service tasks as close as possible to the human one, the task performer was chosen in the form of a humanoid torso with a head and artificial hands with anthropomorphic characteristics and the ability of multimodal (audio, gesture) communication with human users. The resulting prototype has been assigned the acronym UBMSR (Universal Bi-Manual Service Robot).

The paper is organized as follows. An overview of the prototype as a whole is given in the next section. The most important subsystems (motion, sensory, and control) are elaborated in more detail in Sections 3–5. Development of the cognitive features of the system is also considered in Section 5. In Section 6, some experimental results are presented to verify potential system capabilities. The closing Section 7 contains a short description of the state of the project, conclusions, and planned further developments.

2. System overview

The UBMSR prototype is designed for indoor applications. The robot is intended primarily for research purposes, although it has a universal character with respect to its functionality and, moreover, its modules are of industrial quality. The device can be used in households (as a household appliance, as a butler, or as a socializing or entertainment robot) and in public places (airports, railroad stations, museums, shopping malls, offices, medical institutions, etc.) to provide information and user assistance or to carry loads or medical waste, as well as in industry as a mobile, flexible robot system easily adaptable to changes in the technological process and production program. Thanks to its soft compliant joints, this robotic system may work in cooperation with people without special workplace-safety provisions. As a research tool, the UBMSR may be used for research, development, and evaluation of algorithms for the motion of mobile robots in unstructured environments, evaluation of artificial intelligence algorithms, sound and pattern recognition, development of multimodal human–robot interfaces (HRI), etc.

When designing the system, care has been taken to partly follow anthropomorphic proportions as well as functionalities that should imitate human behavior and activities. For this reason, the device possesses a high level of anthropomorphism, although it integrates particular components that are purely industrial realizations, e.g. lightweight robot arms and hands. In the future, these elements may be replaced by corresponding elements inspired by the human body.

The robot structure consists of several functional modules (Fig. 1): (i) a wheel-driven mobile robot platform; (ii) a segmented torso; (iii) a bi-manual robotic system with three-fingered graspers; (iv) a movable robot head; (v) a sensory-acquisition system; and (vi) a control and communication module. The adopted system structure, displayed in Fig. 1, allows fine system mobility in different indoor environments. When determining the robot dimensions, human proportions have been partly taken into account, since the robot is planned to be used in spaces where people live and work and is expected to cooperate with people in particular tasks. The overall dimensions of the mobile platform, including wheels, are L × W × H = 0.70 × 0.70 × 0.72 m. The torso is placed so that the shoulders are at a level of 1.5 m and the span between the shoulders is 0.475 m, which corresponds to human proportions [5]. The size of the robot head, from chin to crown, is 0.26 m, proportional to the width of the torso, whereas the width of the head falls out of proportion due to the need to install quality microphones at the position of the ears. The dimensions so defined allow easy passage through a standard door frame (0.80 m) in households and public areas. The drive wheels have a diameter of 0.35 m, whereas the diameter of the auxiliary wheels is 0.20 m. These dimensions allow motion of the mobile platform over flat floor surfaces as well as over diverse floor coverings. The robot cannot climb stairs, but it can roll over small obstacles whose height does not exceed 0.07 m. More details about the mechanical structure of the robot are given in the next section.

Fig. 1 The Universal Bi-manual Service Robot as a technology platform for research and development of personal robots and wireless networked systems.

3. Mechanical design

3.1. Mobile robot platform

The mobile robot platform is a motorized trolley with two driving, microprocessor-controlled wheels and two auxiliary wheels ensuring stability and maintenance of the direction of motion. The platform is designed to house a cabinet with two horizontally divided compartments. On the lower level, two 12 V, 120 Ah batteries are placed together with a 12 V/230 V AC adapter/charger, electrical fuses, etc., whereas the upper level houses the control and communication electronics, the “brain” of the device. To unload the actuators of the torso and achieve an optimal distribution of masses, i.e. to place the center of mass as low as possible, care has been taken to seat all electronic boards inside the cabinet rather than in the head or torso of the robot. The system is designed to ensure safety against overturning even in the position of maximally abducted/extended robot arms with the maximum payload of 5 kg per arm.
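The benefit of seating the heavy components low can be illustrated with a simple center-of-mass calculation. The masses and heights below are illustrative assumptions, not values from the actual design:

```python
def center_of_mass_height(parts):
    """Height of the combined center of mass for a list of
    (mass_kg, com_height_m) pairs."""
    total_mass = sum(m for m, _ in parts)
    return sum(m * h for m, h in parts) / total_mass

# Hypothetical mass distribution: batteries and electronics low in the
# cabinet (0.15 m) versus the same mass carried at torso height (1.0 m).
low  = [(40.0, 0.15), (30.0, 0.90), (10.0, 1.50)]   # batteries, torso, head
high = [(40.0, 1.00), (30.0, 0.90), (10.0, 1.50)]
print(center_of_mass_height(low), center_of_mass_height(high))
```

A lower center of mass widens the tipping margin when both arms are fully extended with their 5 kg payloads.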

3.2. Segmented torso

The torso of the robot consists of a lower and an upper part. The upper part carries the robot head and two 6-DOF robot arms with artificial hands. The lower part is a central support pillar ending with a two-axis rotation drive with a slewing-drive transmission, allowing bending (change in elevation) of the upper part of the torso around the horizontal axis in the range [−30, +60] degrees, and turning (rotation about the vertical axis) in the range [−180, +180] degrees. These redundant degrees of freedom significantly expand the operational workspace of the robot (the reachability of its hands).
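The reach gained from the bending joint can be sketched with elementary trigonometry. The pivot height and upper-torso length used here are assumptions, chosen only so that the shoulders sit at 1.5 m when upright:

```python
import math

def shoulder_position(bend_deg, pivot_height=1.0, upper_torso=0.5):
    """Shoulder coordinates (x forward, z up) after bending the upper
    torso by bend_deg about the horizontal axis at the pillar top
    (0 deg = upright; the design allows the range [-30, +60] deg)."""
    a = math.radians(bend_deg)
    return upper_torso * math.sin(a), pivot_height + upper_torso * math.cos(a)

# Upright, the shoulders are at 1.5 m; bending forward by 60 deg moves
# them roughly 0.43 m forward, extending the reach of both arms.
```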

3.3. Bi-manual manipulation system

The bi-manual manipulation system consists of two 6-DOF, 5 kg payload UR5 universal robot arms [6] of industrial quality. Each arm is equipped with a three-fingered Barrett-type robotic hand grasper [7]. The two-arm manipulation system, combined with the ability to bend the torso, provides good manipulation properties of the overall system. The UR5 arms possess compliant joints, facilitating interaction with humans without risk of injury. The arms have their own industrial controllers, which are integrated into the UBMSR control architecture. In the future, the UR5 lightweight robot arms will be replaced by two anthropomorphic arms with poly-articulated joints [8].

3.4. Robot head

The robot head (Fig. 2) contains: (i) a kinematic mechanism with 3 DOF, actuated by three DC electric motors with slewing-drive transmissions; (ii) two industrial-grade cameras for robot vision; (iii) two wide-band, professional-grade microphones with increased sensitivity; and (iv) an optional visual depth sensor placed on the crown of the head. The head has good kinematic abilities of rotation about three axes: vertical, frontal, and sagittal. Thanks to the visual and audio capabilities of the robot, the head can be rotated in the direction of a source of light and/or sound. The robot is also equipped with appropriate LED illumination, facilitating its functioning in conditions of reduced ambient light.
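Turning the head toward a sound source can be based on the arrival-time difference between the two ear microphones. A far-field sketch of that estimate follows; the microphone spacing is an assumed value, not taken from the head design:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def sound_azimuth_deg(delay_s, mic_spacing_m=0.30):
    """Azimuth of a far-field sound source from the inter-microphone
    time delay (0 = straight ahead, sign indicates the side)."""
    s = SPEED_OF_SOUND * delay_s / mic_spacing_m
    s = max(-1.0, min(1.0, s))  # clamp measurement noise into asin's domain
    return math.degrees(math.asin(s))
```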

4. Sensorial system

The sensory and data acquisition system of the robot is heterogeneous. It serves to perceive the physical and social environment. Sensors are embedded almost throughout the whole mechanical structure of the system. The robot has joint position sensors (optical encoders) and velocity sensors (tachogenerators), as well as external distance sensors (a laser range finder and ultrasonic proximity probes), inertial sensors (a 3-axis gyroscope and a 3-axis accelerometer), and optional special-purpose sensors (humidity and temperature sensors, a smoke detector, etc.). The universal robot arms are equipped with joint encoders, torque sensors in the arm and wrist joints, and force/torque sensors in the hand graspers. A tilt sensor is placed on the platform to detect collisions with surrounding objects.

Fig. 2 CAD model of the UBMSR head with the audio and vision systems embedded in the plastic frame.

Two fast, industrial-grade CMOS color cameras [9] and audio equipment (microphones [10] and speakers [11,12]) are embedded within the robot head. It is also possible to place an optional RGBD sensor (a combined color and depth camera) on the crown of the head; it can be used for human gesture recognition and identification of moving obstacles within a range of 1–4 m around the robot. The current version of the employed RGBD sensor is the Kinect for Xbox 360, operating as a structured-light sensor [13], but it is planned to replace it with the more advanced Kinect for Windows v2 time-of-flight sensor [14].

It is possible to set the visual system to operate in different configurations, yielding different performances of vision-based navigation. The basic configuration is bifocal stereo vision consisting of the two CMOS cameras (a monocular system is also feasible, but its depth derivation is possible only when the robot is in motion). The system can be operated in both outdoor and indoor environments, but its effectiveness significantly depends on the texture of surrounding objects. Furthermore, it is possible to improve the robustness and speed of the system by adding a third camera, obtaining in this manner a trifocal sensor. Alternatively, the CMOS cameras can be completely replaced by the RGBD sensor, which is limited to indoor use and is slower than the industrial-grade CMOS cameras, but provides depth maps of better resolution. Further combinations of the RGBD sensor and ordinary cameras are also possible, and they can be tuned to inherit the good features of both sensors.
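For the bifocal stereo configuration, depth is recovered from the disparity between the two camera images via the standard pinhole relation Z = f·B/d. A minimal sketch, where the focal length and baseline are placeholder values rather than the UBMSR calibration:

```python
def depth_from_disparity(disparity_px, focal_px=500.0, baseline_m=0.10):
    """Depth Z of a matched point from its disparity d (pixels), given
    the focal length f (pixels) and stereo baseline B (meters):
    Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_m / disparity_px

# With these parameters a 50 px disparity corresponds to a point 1 m
# away; halving the disparity doubles the estimated depth.
```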

5. Control and communication architecture

5.1. Control, communication and networking

The control module is a hierarchically distributed structure that best suits the individual tasks imposed on the service robot in advance. A controller at the highest hierarchical level coordinates all tasks, data acquisition, and communication traffic, and has subordinated low-level controllers such as: (i) the mobile platform controller, responsible for motion planning, navigation, and path following together with obstacle and collision avoidance, which takes into account different characteristics of friction between the wheels and the ground; (ii) the controller of the manipulation system (in total 12 DOF), including the torso (2 DOF) and the robot head (3 DOF), which also integrates the two separate industrial controllers of the UR5 arms; and (iii) image and sound processing, interpretation of voice commands, etc.
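The hierarchy above can be mirrored in software as a coordinator that routes tasks to subordinate controllers. The sketch below is purely structural; the controller names and command strings are illustrative, not the robot's actual interfaces:

```python
class Subcontroller:
    """A low-level controller that executes the commands it is handed."""
    def __init__(self, name):
        self.name = name
        self.executed = []

    def execute(self, command):
        self.executed.append(command)
        return f"{self.name}: {command}"


class GlobalController:
    """Highest-level coordinator dispatching to subordinate controllers."""
    def __init__(self):
        self.subordinates = {
            "platform": Subcontroller("mobile platform"),
            "manipulation": Subcontroller("manipulation/torso/head"),
            "perception": Subcontroller("image and sound processing"),
        }

    def dispatch(self, target, command):
        return self.subordinates[target].execute(command)
```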

The communication block provides networking of the robot in a LAN using Wi-Fi access or a GSM/GPRS modem (the case of an information-structured environment) and connects the robot to the Internet. It is also planned to command the robot remotely using a smartphone with an Android application, or a standard notebook or PC with a corresponding remote GUI application. This assumes a Wi-Fi hotspot or GSM/GPRS modem on the robot, which brings the additional functionality of remote communication with authorized access and the ability to monitor system parameters or to transmit image and sound remotely.
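Remote commanding from an Android application or a PC GUI requires an agreed message format between client and robot. A hypothetical JSON wire format (the paper does not specify the actual protocol):

```python
import json

def encode_command(action, params, client_id):
    """Serialize a remote command from an authorized client."""
    return json.dumps({"client": client_id, "action": action, "params": params})

def decode_command(message):
    """Parse a command message on the robot side."""
    msg = json.loads(message)
    return msg["client"], msg["action"], msg["params"]

# e.g. a smartphone asking the robot to drive forward at 0.2 m/s:
wire = encode_command("drive", {"speed": 0.2, "heading": 0.0}, "phone-01")
```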

As shown in Fig. 1, which presents the key components of the mobile service robot system, the control system is complex and distributed. Because of that, a modular control system design is adopted, combined with a high-speed CAN bus [15]. This bus is the backbone of the entire mobile service robot control system. Owing to its speed and reliability, the high-speed CAN bus provides proper functioning of all modules, reliable and time-precise data exchange, as well as proper supervision of the whole system

Page 5: Building technology platform aimed to develop service ...core.ac.uk/download/pdf/81164814.pdf · The mobile robot platform is a motorized trolley with two driving, microprocessor

Fig. 3 Global block structure of the control system of the mobile service robot (global system controller ADLE3800PC PCIe/104; vehicle controller NI sbRIO-9636; visual controller VAFER-ULT i1–i7; two UR-5 robot controllers driving the dual-manipulation arms; GSM/GPRS modem; touch-screen control monitor; sound system; interconnected by the high-speed CAN bus and Ethernet).


by the main controller, and also remotely via tablet PC supervision software specially developed for remote supervision, commanding, and control of the robotic system and for global task settings. The global block schematic of the control system, based on the high-speed CAN bus, is presented in Fig. 3.
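A CAN data frame carries at most 8 data bytes, so each subsystem report must be packed into a fixed layout. The sketch below uses a hypothetical wheel-status layout; the robot's actual CAN message map is not given in the paper:

```python
import struct

# Assumed layout: int16 left rpm, int16 right rpm, uint16 motor current
# in mA, uint8 status flags, one reserved byte -- exactly 8 bytes.
_WHEEL_FMT = "<hhHBB"

def pack_wheel_status(left_rpm, right_rpm, current_ma, flags):
    """Pack a wheel-status report into an 8-byte CAN data field."""
    return struct.pack(_WHEEL_FMT, left_rpm, right_rpm, current_ma, flags, 0)

def unpack_wheel_status(payload):
    """Recover the fields on the receiving node."""
    left, right, current, flags, _ = struct.unpack(_WHEEL_FMT, payload)
    return left, right, current, flags
```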

The whole control structure is built around the high-speed CAN bus and is controlled by the ADLE3800PC PCIe/104 processor board [16]. This is a powerful PCIe/104 processor module on which the ROS operating system is implemented [17]. It controls all subsystems connected to the high-speed CAN bus. First among these is the trolley controller, based on the National Instruments single-board RIO architecture; a detailed description of this subsystem is presented below. The machine vision system is another subsystem of the robot's global control structure and is also connected to the high-speed CAN bus. This subsystem is based on a powerful graphics processor in charge of image acquisition and image processing, as well as of decision making according to the results of the machine vision algorithms; its main characteristics are also described below. As presented in Fig. 3, the dual-arm manipulation subsystem is likewise connected to the high-speed CAN bus and is part of the robot control architecture. The dual-arm robotic system is based on the UR-5 lightweight robotic arm produced by Universal Robots [18]. Each arm is directly controlled by its own controller, and both controllers are connected to the CAN bus. On this basis, a prescribed dual-arm manipulation task can be completed without any collision, according to the results of the machine vision algorithms. This global command structure makes it possible to realize complex task movements such as obstacle avoidance and avoidance of dual-arm collision during the desired task.
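A coarse form of the dual-arm collision test can treat each end-effector as a safety sphere and check the clearance between them. The radius used here is an assumed bound, not a value from the actual controllers:

```python
import math

def arms_clear(left_tcp, right_tcp, safety_radius=0.12):
    """True if the two tool-center points, each padded by safety_radius,
    do not overlap (a conservative sphere-sphere test)."""
    return math.dist(left_tcp, right_tcp) > 2 * safety_radius

# The real system would check the whole arm geometry; this tests only
# the end-effectors, as a cheap first filter before finer checks.
```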

The ADLE3800PC PCIe/104 processor module also directly controls the audio system. Using the microphones built into the mobile robot, a generic speech-recognition task is realized together with voice synthesis. Both run under ROS and are responsible for audio command recognition and robot–human voice interaction during the task. The complete human–robot interface (HRI), together with the voice and image data interpreter, is implemented on the touch-screen panel mounted on the chest of the robot. In this way, the audio, visual, and touch facilities of the HRI can be controlled and supervised by an operator. The touch-screen panel is used not only for HR interaction but also for global control of the overall system and its constituent modules. A further possibility for global control and supervision of the mobile service robot is provided remotely, using a smartphone and the wireless GSM/GPRS communication integrated into the control loop. In this way, it is possible to issue remote command settings, supervise the robot status, and transmit real-time image data from the robot to the screen of a smartphone.
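Once an utterance has been recognized, it must be mapped to a robot action. A keyword-matching sketch standing in for the actual ROS interpreter; the phrases and action names are invented for illustration:

```python
# Longer phrases are tried first so "lights on" wins over any substring.
COMMANDS = {
    "lights on": "led_on",
    "stop": "halt_motion",
    "come here": "approach_user",
}

def interpret_utterance(text):
    """Map a recognized utterance to a robot action by keyword matching."""
    text = text.lower()
    for phrase in sorted(COMMANDS, key=len, reverse=True):
        if phrase in text:
            return COMMANDS[phrase]
    return "unknown"
```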

In the text to follow, the other functional subsystems of the global control structure of the mobile robot are presented and analyzed in detail. Primarily, the wheeled trolley subsystem of the robot is in charge of moving the robot in human-populated areas. A detailed control structure of the wheel-based trolley is presented in Fig. 4.

The trolley of the service robot is equipped with a large number of sensors. This implies that the controller must be equipped with a large number of I/O ports to receive and collect all these data. The controller itself must


Fig. 4 Detailed control structure of the wheeled trolley of the service robot (NI sbRIO-9636 vehicle controller linked to the PC/104E board over the high-speed CAN bus; laser range finder over RS232; 8 IR range detectors, 8 ultrasonic range detectors, 3-axis motion sensor, and intelligent LED lamp control over the I2C bus; OTTOBOCK two-axis drive module with motor, brake, and encoder control of the left and right wheels; AMC frontal and sagittal slewing-ring drive modules with motor and encoder control).


have a proper structure ensuring that all data are collected adequately and on time and are available at any moment to the other subsystems for proper supervision and control. Primarily, it is important to have reliable obstacle detection. For that purpose, the trolley of the mobile robot is equipped with both ultrasonic and infrared sensors capable of detecting obstacles at mid-range distances (10–80 cm). At least 8 ultrasonic distance sensors and 8 infrared obstacle detectors are mounted on the trolley, two of each at the edge of every side of the vehicle. By combining the ultrasonic and IR detectors, the accuracy and reliability required for detecting obstacles in various indoor environments are obtained. These sensors are connected to an internal I2C bus providing easy and timely reading of each sensor. By properly reading and combining the sensors' information, it is possible to detect obstacles, measure their proximity, and generate a suitable path to avoid both immobile and mobile obstacles. An accurate laser range finder is mounted on the front panel of the trolley. It is responsible for mapping the whole frontal plane with respect to the path of motion. Precise obstacle detection in the direction of motion is possible with this sensor, and accordingly a desired path between the detected obstacles can be generated. The laser range finder is connected to the controller by an RS232 connection, as presented in Fig. 4. The complete collection and basic processing of the sensor data are carried out by the NI sbRIO-9626 single-board controller by National Instruments [19]. It operates under the LabVIEW [20] real-time OS. This controller is responsible for timely data acquisition from the sensors, basic sensor data processing, and sensor data transmission over the high-speed CAN bus to the supervisory controller. Data received from the sensors are thus promptly available to the global controller of the system. In parallel, the NI sbRIO-9626 controller uses the data collected from the sensors for local servo control of the main wheels during motion. The global robot controller is responsible for navigation and path planning; simultaneous localization and mapping (SLAM) algorithms run on this controller. Besides the proximity sensors and the range finder, the robot has a 3-axis inertial sensor, comprising a 3-axis accelerometer and a 3-axis gyroscope. This sensor is mounted on the trolley of the robot and connected to the NI sbRIO-9626 controller through the internal I2C bus. By appropriate sensor-data fusion, the NI controller and the global controller calculate the global position/orientation (coordinates) and the speed of the robotic system during movement. Complementary to collecting and analyzing the data acquired from the sensors of the robotic system, the NI sbRIO controller manages intelligent operation of the front lights. LED lamps are mounted in the front panel of the trolley and are used for lighting the path ahead in the direction of motion. The light intensity is measured by the visual subsystem, and the acquired information is transmitted through the high-speed CAN bus. The NI sbRIO controller is also in charge of controlling the power-driven wheels of the trolley. Two robust and reliable OTTOBOCK DC brush motors with brakes and incremental encoders are mounted in the trolley for that purpose. Control of the driving motors is realized by the NI sbRIO-9626 architecture, which contains a strong FPGA programmable array. This FPGA is directly included in the control structure to supervise the driving wheels, the motors' brakes, and the incremental encoders of the motors. The current and the speed of the motors are calculated using the data from the incremental encoders and the status of the motors. The FPGA structure inside the NI sbRIO board is programmed to directly provide current-mode and velocity-mode control of the motors. It can acquire the value of the motor current as well as the value of the incremental encoder counter. Combined with the determined global position of the system, it is also possible to establish an odometry-based approach to task accomplishment. The mentioned NI controller is also responsible for controlling the pan and tilt joint actuators of the torso of the robot. These joints are realized using slewing-drive actuators.
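The odometry computation mentioned above combines the incremental encoder readings of the two drive wheels into a pose update. A standard differential-drive dead-reckoning step; the wheel base below is an assumed value consistent with the 0.70 m platform width, not a measured parameter:

```python
import math

def odometry_step(x, y, theta, d_left, d_right, wheel_base=0.6):
    """Advance the pose (x, y, theta) by one pair of incremental wheel
    displacements (meters), using the midpoint heading for the update."""
    d = (d_left + d_right) / 2.0                 # distance of platform center
    dtheta = (d_right - d_left) / wheel_base     # change of heading
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    theta = (theta + dtheta + math.pi) % (2.0 * math.pi) - math.pi
    return x, y, theta
```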
The FPGA structure inside the controller is also used to directly control these two slewing-drive joints. The encoders used in the two joints are of the absolute type, so their values are directly readable as accurate positions in the task space. Current-mode and velocity-mode control are implemented in the structure, so the inclination of the torso can be harmonized with the desired contact task or with the desired two-arm movement, together with the machine

Page 7: Building technology platform aimed to develop service ...core.ac.uk/download/pdf/81164814.pdf · The mobile robot platform is a motorized trolley with two driving, microprocessor

Visual controllerVAFER-ULT I1-I7

USB2.0

USB3.0 Hub

USB3.0

Hi speed CAN busConnection to PC104 E Board

Fig. 5 Control structure of the system of machine vision withservice robot.

A. Rodić et al.118

vision algorithm. As it has been already mentioned that NIcontrol board is directly connected to the high speed CANbus architecture, then direct flow of the data in real timefrom and to global controller and others subsystems arecompletely provided.

The vision system is a robot subsystem based on the employment of machine vision algorithms and vision-based task operation (navigation and/or manipulation). The detailed control structure of this subsystem is presented in Fig. 5.

The visual system is based on a stereovision system consisting of two high-speed USB3 cameras. A KINECT camera is also mounted on the top of the head of the mobile service robot for precise depth vision. The KINECT camera is connected to the controller through a USB2 port. By combining data from the stereovision system and from the KINECT depth-image sensor, accurate information can be obtained about the object(s) in the surroundings: their shape, structure and distance. A powerful VAFER-ULT I1 controller [21] is in charge of processing the information from the vision system. It has an INTEL I7 processor and a powerful NVIDIA graphics accelerator and imaging processor, which allow maximal speed of image processing. The data collected from the image processor can be used for further analysis on the main controller or at a remote station (smart phone, tablet, etc.). The images can be transferred to the remote user, by wireless GSM/GPRS communication, for real-time supervision or for further processing in order to support task realization. The vision subsystem controller is also connected to the other subsystems in the control loop by the high-speed CAN bus architecture.
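Combining the stereovision depth with the KINECT depth image, as described above, can be illustrated by a simple inverse-variance weighted average of two registered depth maps. The variance values and the fallback rule are assumptions for illustration only, not the method actually implemented on the vision controller:

```python
import numpy as np

def fuse_depth(stereo_depth, kinect_depth, stereo_var=0.04, kinect_var=0.01):
    """Inverse-variance weighted fusion of two registered depth maps (meters).

    Pixels where one sensor has no reading (NaN) fall back to the other.
    The variances are hypothetical per-sensor noise levels.
    """
    w_s, w_k = 1.0 / stereo_var, 1.0 / kinect_var
    fused = (w_s * stereo_depth + w_k * kinect_depth) / (w_s + w_k)
    fused = np.where(np.isnan(stereo_depth), kinect_depth, fused)  # stereo hole
    fused = np.where(np.isnan(kinect_depth), stereo_depth, fused)  # KINECT hole
    return fused

# One valid pair of readings and one stereo hole
stereo = np.array([[2.0, np.nan]])
kinect = np.array([[2.1, 1.5]])
fused = fuse_depth(stereo, kinect)
```

The lower assumed variance of the depth sensor pulls the fused estimate toward the KINECT reading; where stereo matching fails, the depth image fills the gap.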

5.2. Remote monitoring and supervisory control

The remote control system is accomplished via GSM/GPRS or Wi-Fi communication. This kind of remote connection enables the transmission of commands, sensor data, audio signals (sound) and images to/from the robot. Monitoring can be done via computer, mobile phone or tablet. Two-way communication allows remote control of the task(s), reading the status of the service robot, and transmission of sound and images from the robot to the smartphone and from the smartphone to the robot. Thus, it is possible to give the robot voice commands directly, without invoking specific commands on the screen. The supervisory system was implemented as a cloud application that allows maximum reliability during data transmission, real-time video transmission and maximum availability of information, as well as real-time information transfer. The realized supervisory structure thus enables updating of the data and their reception at any time. The data are always available to the operator, and it is possible to simultaneously share, analyze and modify them in real time. This especially applies to the data of the sensor system, which are of great importance both for accomplishing the service robot tasks and for monitoring of the robot by the human operator.
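The two-way supervisory link described above (commands down; status, sound and images up) could be framed, for illustration, as newline-delimited JSON messages. This framing, and the `kind`/`payload` field names, are a hypothetical sketch, not the actual protocol of the realized cloud application:

```python
import json

def encode_frame(kind, payload):
    """Serialize one supervisory message (command, status, or sensor sample)."""
    assert kind in ("command", "status", "sensor")
    return (json.dumps({"kind": kind, "payload": payload}) + "\n").encode("utf-8")

def decode_frame(raw):
    """Parse one received frame back into (kind, payload)."""
    msg = json.loads(raw.decode("utf-8"))
    return msg["kind"], msg["payload"]

# Operator side sends a drive command; robot side decodes it
frame = encode_frame("command", {"op": "drive", "v": 0.3, "w": 0.0})
kind, payload = decode_frame(frame)
```

A text-based frame like this is trivial to route through a cloud relay and to log for later analysis, which matches the supervisory requirements stated above.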

A specially designed voice (audio) control application, implemented on a smartphone under the ANDROID operating system, allows the user to directly impose voice commands and then listen to the robot's voice response from a remote site. After receiving the command(s), the robot proceeds to execute them, and the user can directly supervise the robot operation via the video stream.

5.3. Cognitive EI interface

Since the UBMSR is aimed at accomplishing service tasks that require interaction with people, a special cognitive interface, the so-called EI (Emotional Intelligence) controller, is designed, allowing the robot to achieve better integrative performance and better social acceptance by humans. The EI robot interface (EI-controller) should provide the robot with the ability to perceive/acquire human emotions (affective state), to understand emotions, and to control its own affective and social behavior as well as its interpersonal interaction with people. These capabilities shall be developed on the basis of embodied attributes of artificial emotional intelligence in robots, using a human personality profile model that assumes a type of personality, temperament and human character. The basis for this functionality is research from the humanities, and especially psychology.

The research conducted in this paper should enable development of the specialized EI interface that represents a basis for the design of the robot EI-controller. The research is based on experimental work with human examinees. A set of 237 human examinees of different age (adolescents 53.58%, junior adults 24.47%, senior adults 21.52% and elderly persons 0.63%), gender (males 41% and females 59%), degree of education (low-educated 2.53%, intermediate 76.79% and high-educated 20.68%) and exposure to different social frames (e.g. family education, companions, religion, etc.) underwent corresponding tests identifying their attributes of psychological behavior in different life situations, marked as “trigger events”. Initially, the anonymous examinees were required to fill out on-line questionnaires available on the Internet for acquisition of their personality types and temperaments [22,23]. Then, the same examinees were required to fill out a corresponding questionnaire acquiring their socio-economic conditions as well as their current psycho-physical condition (interior state). After that, the examinees were asked to give their opinion (assessment) of their physical and psychological reactions to imposed scenarios (trigger events) taken from everyday life (e.g. from jolly and



funny emotions to frightening, tragic or disgusting ones). The examinees ranked the degree to which they experienced the imposed trigger events on a scale of 0–100%, assessing their degree of effectiveness. Based on these tests, corresponding statistical processing of the results was accomplished and knowledge generalization was done. Knowledge fusion was performed by merging the obtained results with the personality profiles acquired from the examinees. Based on these outputs, a corresponding mathematical model of human affective and social behavior [24] was developed. The model (software) was implemented in the controller of the service robot [25]. The EI-controller should ensure better social acceptance of the device by potential customers, as well as more effective and intuitive communication between the robot and other persons. Starting from the theory of personality and the experimental results acquired in advance, a mathematical model of human affective and social behavior was developed, whose fundamentals are given in the text to follow.
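The statistical processing described above can be sketched minimally as averaging the 0–100% ratings per emotional-state attribute across the sample. The ratings and attribute names below are invented placeholders; the real study aggregated the responses of 237 examinees:

```python
from statistics import mean

# Hypothetical 0-100% ratings of two ES attributes given by a handful of
# examinees for one trigger event (placeholder data, not the study's results)
ratings = {
    "sadness": [70, 55, 80, 60],
    "fear":    [30, 45, 20, 35],
}

# Average intensity per attribute across the sample, i.e. the kind of
# generalized, sample-wide state the text later calls "non-profiled"
average_es = {attr: mean(vals) for attr, vals in ratings.items()}
```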

While artificial intelligence (AI) copes with tasks such as pattern recognition, spatial reasoning, context reasoning, decision making, etc., emotional intelligence (EI) is responsible in humans for self-awareness, self-management, awareness of others and relationship management [26–28]. According to the theory of personality from the humanities, individual emotional conditions primarily depend on three factors [29]: (i) personality type, (ii) personality character, and (iii) type of temperament. These factors are determined by the genetic code of an individual and are acquired at birth. Besides, the behavior of individuals is strongly influenced by additional exterior and interior factors (Fig. 6): (i) the “exterior world”, i.e. the physical and social environment, (ii) “interior stimuli”, and (iii) “socio-economic factors”. The personality character, although an important factor, was disregarded in this research since it relates mostly to certain pathological states, which are not of interest here: this study of a model of human affective and social behavior concerns only healthy, regular human behavior [24].

Fig. 6 High-level model description: trigger/stimuli—personality profile—behavior—reactions.

Fig. 7 Structure of the three-stage EDB-model with fuzzy blocks: emotional state generator, emotional state modulator and behavior interpreter.

Under the term “exterior world”, our physical and social environment (which includes trigger events, situations, acts and relationships with other persons) is assumed. The “interior stimuli” mainly concern human physical and psychological conditions (regarding body and psyche), such as senses of fatigue, pain, disease, etc. Under the same term, feelings like love, depression, anger, hatred, fear, satisfaction, etc. are assumed, too. The “socio-economic factors” refer to the conditions that influence some features of individual behavior. The “social factors” include family and school education, the influence of companions, religion and affiliation to social groups (e.g. political parties, civil communities, clubs, fans), economic conditions, etc. Bearing in mind the previous facts, the psychological (affective) behavior of an individual is determined by plenty of different exterior and interior factors. To derive a generic model of human psychological behavior, the following assumptions are made. The considered biological system can be approximated by a deterministic, dynamic, non-linear, multi-input–multi-output (MIMO) system of arbitrary complexity. The input and output variables of the emotion-driven behavior (EDB) model are hardly measurable, but they can be quantified by using corresponding psychological questionnaires [22,23], as described previously.

The variables of interest required for modeling EDB (Fig. 7) include qualitative information regarding the personality profile, the physical and social environment, emotional and health conditions, social factors, etc. They are quite heterogeneous, and linguistic and descriptive rather than measurable and numeric. Generally observed, humans tend to behave (in the psychological sense) in a fuzzy manner. That means people use their sensory-motor skills, symbolic body and facial gestures, as well as phonetic forms, to express their psychological state – behavior attributes, mood, etc. Due to the fuzzy nature of human behavior, the modeling approach assumed in this project consequently belongs to the knowledge-based type [22,23]. It assumes utilization of a fuzzy inference system, i.e. a fuzzy logic structure. Accordingly, the appropriate structure of the EDB model is represented by a three-stage block structure (Fig. 7). It consists of three fuzzy inference systems, FIS-1 to FIS-3, aligned in a cascade with the aim of propagating individual behavior attributes towards the output of the model. The achieved behavior attributes depend on the information entered into the model by the input variables: “personality profile”, “exterior world”, “interior stimuli” and “social factors”.
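The three-stage cascade of fuzzy blocks can be sketched as follows. This is a minimal single-input, single-output Mamdani illustration with placeholder rule bases, a shared Gaussian membership width and a coarse output grid; the actual model uses vector-valued inputs (“personality profile”, “exterior world”, “interior stimuli”, “social factors”) and much richer rule bases:

```python
import numpy as np

def gauss(x, c, sigma):
    """Gaussian membership function, as assumed for the FIS blocks."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

class MamdaniFIS:
    """Minimal single-input, single-output Mamdani block (illustrative only).

    rules: list of (input_center, output_center) pairs; one shared sigma.
    """
    def __init__(self, rules, sigma=0.15):
        self.rules = rules
        self.sigma = sigma
        self.y = np.linspace(0.0, 1.0, 201)  # discretized output universe

    def infer(self, x):
        agg = np.zeros_like(self.y)
        for c_in, c_out in self.rules:
            w = gauss(x, c_in, self.sigma)                             # firing strength
            clipped = np.minimum(w, gauss(self.y, c_out, self.sigma))  # min implication
            agg = np.maximum(agg, clipped)                             # max aggregation
        return float(np.sum(self.y * agg) / np.sum(agg))               # centroid defuzzification

# Three blocks in cascade, mirroring FIS-1 (ES generator) -> FIS-2 (ES modulator)
# -> FIS-3 (behavior interpreter); all rule bases are placeholders
fis1 = MamdaniFIS([(0.0, 0.2), (0.5, 0.5), (1.0, 0.8)])
fis2 = MamdaniFIS([(0.0, 0.1), (0.5, 0.5), (1.0, 0.9)])
fis3 = MamdaniFIS([(0.0, 0.0), (0.5, 0.5), (1.0, 1.0)])

behavior = fis3.infer(fis2.infer(fis1.infer(0.8)))
```

As in the proposed architecture, the cascade exposes the intermediate outputs of each block, so the non-personalized and personalized states can be monitored separately.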

The biological mechanism of human affective behavior produces emotive reactions based on three excitation signals (information carriers): (i) the “trigger” of behavior (which arouses different psychological reactions), (ii) the “profiler” (which shapes the event-driven emotional state (ES) to fit the personality profile of the individual), and (iii) the behavior “booster” or behavior “inhibitor” (which increases or decreases the expressiveness of the individual's affective manifestations).

The trigger of psychological behavior (mood) is the “exterior world”, according to Fig. 6. The exterior world represents our physical and social environment (type of event, situation, inter-personal relationship, interaction or act). The initiated emotional state is then gained by the influence of “interior stimuli” and “socio-economic factors”. According to Fig. 7, the trigger variables are led to the input of the FIS-1 block named the “ES generator”. This block produces at its output the “non-profiled” ES. It can also be called “non-personalized”, since it does not reflect an individual affective behavior but one estimated by experience, based on the emotional states of a large sample of individuals. For example, a sad or a frightening event (situation) certainly causes in the majority of persons a corresponding feeling of sadness or fear as the normal psychological reaction to be expected. But the expressiveness of feelings differs from person to person, depending on their personality profiles (personality type and temperament), interior stimuli and social factors. Because of that, the individual traits (personality type and temperament) represent the so-called “ES profiler”, while the “interior stimuli” and “social factors” represent the

“ES booster” or “ES inhibitor”, depending on whether they increase or decrease the energy of emotions. Both the emotional state “profiler” and the “booster” (or “inhibitor”) shape the “non-profiled” ES, turning it into a “profiled” one, also called “personalized”, since its ES attributes are individually specific. The fuzzy block FIS-2 (Fig. 7) is called the “ES modulator” because the adjustment (modulation) of the non-profiled ES is achieved in this block.
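The profiler/booster (or inhibitor) mechanism can be illustrated as simple scaling of a non-profiled ES vector. The gain coefficients and attribute names below are hypothetical stand-ins for what the FIS-2 “ES modulator” actually computes by fuzzy inference:

```python
def modulate_es(non_profiled, profiler_gain, booster_gain):
    """Shape a non-profiled ES (attribute -> intensity in [0, 1]) into a
    personalized one.

    profiler_gain stands in for the personality type/temperament influence;
    booster_gain > 1 boosts, < 1 inhibits (interior stimuli, social factors).
    """
    return {attr: min(1.0, v * profiler_gain * booster_gain)
            for attr, v in non_profiled.items()}

# Placeholder non-profiled state for an irritating trigger event
non_profiled = {"anger": 0.6, "impatience": 0.7}

# A strongly Choleric profile amplifies the state; mild fatigue boosts it further
personalized = modulate_es(non_profiled, profiler_gain=1.3, booster_gain=1.1)
```

The clipping at 1.0 reflects the bounded 0–100% scale on which the examinees rated their emotional experiencing.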

The fuzzy model blocks FIS-1 to FIS-3 (Fig. 7) are assumed to be of Mamdani type. The input and output vectors are of arbitrary dimension, depending on how complex (general) a model one wishes to develop. Since the considered biological system is non-linear, the corresponding input and output membership functions (MFs) of the FIS-1 to FIS-3 blocks are assumed to be Gaussian curves or sigmoidal functions [22]. The number of fuzzy rules in the corresponding rule bases can vary depending on how finely the designed model is intended to fit the behavior properties. If one requires the model to provide high resolution (distinction) between similar psychological states, then a larger number of rules is required in the corresponding rule base. The proposed model architecture (Fig. 7), with fuzzy blocks aligned in a cascade, enables easy monitoring of the intermediate model states (ES attributes that correspond to the non-personalized or personalized states) generated at the particular FIS blocks. Such a modular structure also enables easier analysis of the separate or synergetic influence of the “interior stimuli” and “social factors” upon the individual affective behavior.

6. Experiments and model simulation

The prototype of the personal robot presented in Fig. 1 is still in progress. Hardware integration is expected to be ready soon. Since the main novelty presented in the paper regards the problems of EI-model design described in Section 5, selected results are presented here to verify the concept and feasibility of the approach. In that sense, test results obtained with human examinees, as well as corresponding simulation results of modeling human affective behavior, are presented in this Section. The model of emotion-driven behavior (Fig. 7) is the basis for building the EI-controller of the service robot presented in Fig. 1. In order to verify feasibility, two characteristic independent demo-scenarios were selected. These demo-scenarios are used as characteristic trigger events with the aim of exciting the human emotional state and causing affective reactions. The selected test scenarios are described in the following way:

Scenario 1 (irritating trigger event): You are in the supermarket, waiting in front of the cash register to pay for the things you have chosen. You are late and in danger of missing a scheduled appointment very important for your future professional career. Some leisured, boring and talkative person slows down the cashier and does not care about anybody in his/her surroundings.

Scenario 2 (unpleasant trigger event): At the dinner table, someone tells a funny joke. While laughing, you suddenly burp loudly.

How do you feel and what are your reactions in bothscenario cases?


Fig. 8 Representation of human emotionally-driven behavior caused by the trigger event described in Scenario 1; (a) Emotional states acquired from the persons of ESFJ(a) and ESFJ(b) type with Choleric temperament; (b) Emotional state acquired from the person of INTP type with Choleric temperament.


Within the frame of the experimental verification of the approach proposed in this paper for building the EI-controller, the human examinees (237 of them) were asked to report how much the chosen demo (virtual) Scenarios 1 and 2 affect their emotional experience. The largest group of examinees, 22.14% of them, were categorized as the ESFJ (Extraverted-Sensing-Feeling-Judging) personality type, and 44.25% of them were identified as having a temperament of the Choleric type. With the aim of examining the sensitivity of the emotion-driven behavior model, i.e. its ability to fit fine differences in affective behavior caused by small variations of personality attributes, the experimental results obtained from two examinees belonging to the group “ESFJ and Choleric” are compared first. The expression of their personality attributes was identified in percentages: (i) person ESFJ(a) is E(7%), S(39%), F(13%), J(19%) and Choleric 77%; (ii) person ESFJ(b) is E(36%), S(14%), F(25%), J(9%) and Choleric 30%. The obtained emotional states of the persons ESFJ(a) and ESFJ(b) are presented in Fig. 8a by corresponding graphs shown in an emotional state plane. In the same graph, the so-called “NON-PROFILED” emotional state is also presented; it was determined as the average state derived from the emotional states acquired from the 237 examinees. After that, the emotionally-driven behavior model from Fig. 7 was simulated. In the simulation, the personal attributes of the previously described examinees ESFJ(a) and ESFJ(b) and the demo Scenario 1 (irritating trigger event) were introduced as inputs to the model. The obtained model outputs (emotional states) are shown in Fig. 9a. These results demonstrate relatively fine congruence with the test results shown in Fig. 8a that were acquired from the human examinees. This verifies the assumption that the designed emotionally-driven behavior model fits human affective behavior well. This test also confirms that the model developed in the paper is sufficiently sensitive to variations of personality attributes and that it differentiates within the same personality category (e.g. ESFJ). Experimental results with human examinees, as well as corresponding results of model simulation for the unpleasant trigger event (Scenario 2), are presented in Figs. 10 and 11, respectively. Both personality types, ESFJ as well as INTP, were taken into account there, too.
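The congruence between acquired and simulated emotional states can be quantified, for instance, by the root-mean-square error over the ES attributes. The vectors below are invented for illustration and are not the measured ESFJ(a) data:

```python
import math

def rmse(acquired, simulated):
    """Root-mean-square error between two ES vectors (same attribute order)."""
    assert len(acquired) == len(simulated)
    return math.sqrt(sum((a - s) ** 2 for a, s in zip(acquired, simulated))
                     / len(acquired))

# Hypothetical ES intensities (0-1) for one examinee vs. the model output
acquired  = [0.80, 0.65, 0.30]
simulated = [0.75, 0.70, 0.25]
error = rmse(acquired, simulated)  # small error = good congruence
```

A single scalar like this makes the visual comparison of Figs. 8 and 9 reproducible, and could serve as the cost function if the fuzzy membership functions are tuned automatically.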

Validation tests have been accomplished not only with the two scenarios presented here but also with further scenario examples. The obtained results prove the required generality of the model developed in the paper. This makes us believe that the designed model can be extended to fit a broader domain of human affective states. The model can be used effectively with the service robot to enhance its social and inter-personal compatibility and communication skills.

In the second test, simulation of the human emotionally-driven behavior model was accomplished for the case of an INTP (Introverted-iNtuitive-Thinking-Perceiving) personality type and the Choleric temperament. The chosen person has the attributes I(36%), N(16%), T(51%), P(5%) and Choleric temperament 55%. The emotional states acquired by examination of the person with these personality traits are presented in Fig. 8b. The corresponding simulation results of the model for these personality traits are shown in Fig. 9b. By comparison of the emotional states ESFJ and INTP presented in Figs. 8a and 8b, significant differences are noticeable, due to the differences in personality type attributes. The emotional states presented in Figs. 8b and 9b show satisfactory congruence, as do the examples presented previously in Figs. 8a and 9a.

Affective reactions generated by the emotion-driven behavior model proposed in the paper are demonstrated by means of a simple graphical interface in the form of the feeling charts presented in Fig. 12.


Fig. 9 Simulation results plotted in an ES plane obtained by use of the emotionally-driven behavior model (Fig. 7) excited by the trigger event described in Scenario 1. (a) Emotional states obtained by model simulation for the personality profiles ESFJ(a) and ESFJ(b). (b) Emotional state obtained by model simulation for the personality profile INTP.

Fig. 10 Representation of human emotionally-driven behavior caused by the trigger event described in Scenario 2; (a) Emotional states acquired from the persons of ESFJ(a) and ESFJ(b) type with Choleric temperament; (b) Emotional state acquired from the person of INTP type with Choleric temperament.


The simulation results presented in Figs. 9 and 11 verify the assumption that the model proposed in this paper fits human affective behavior with satisfactory accuracy. Additional improvements can be achieved by further tuning of the fuzzy membership functions using the extensive experimental results obtained from the human examinees. With additional tuning of the fuzzy rules in the model, as well as with certain modifications of the aggregation algorithms, the accuracy of the model can be further improved. The next step in the project will be a microprocessor implementation of the model algorithms within the robot EI-controller. The obtained results make us believe that the model of emotion-driven behavior proposed in this paper can be effectively implemented in the robot from Fig. 1 to enable better emotional awareness of itself and others.


Fig. 11 Simulation results plotted in the ES plane obtained by use of the emotionally-driven behavior model (Fig. 7) excited by the trigger event described in Scenario 2. (a) Emotional states obtained by model simulation for the personality profiles ESFJ(a) and ESFJ(b). (b) Emotional state obtained by model simulation for the personality profile INTP.

Fig. 12 Example of feeling charts used in the graphical interface. Mood rating scale with respect to sadness and happiness.

Fig. 13 Personal entertainment robot [30]. Robot design derived from the basic model presented in Fig. 1.


7. Conclusion and future development

The paper describes the architecture and design details of a technology platform aimed at developing a laboratory prototype of a human-centric service robot for indoor applications. Development of such a system is an important step towards the integration of future “smart” home technology with social robots developed for personal purposes.

During the work done so far, many individual modules have been developed and/or separately tested. As an interim outcome of the project, a personal entertainment robot named Personal Robot Entertainer was developed for the purpose of exhibition at the Festival of Science 2014, Belgrade, Serbia [30]. This model has fewer technical capabilities than the basic platform presented in Fig. 1, but can be regarded as one of the possible robot modifications (Fig. 13).

The laboratory prototype (Fig. 1) will be completed, tested and evaluated in typical household tasks, which should open up possibilities for its commercial application. The designed robot has an original hardware and software architecture and is especially interesting due to its original cognitive EI robot interface, necessary for social and service robots operating with and interacting with people.

Acknowledgment

The work presented in this paper is supported by the Serbian Ministry of Education, Science and Technology Development within project no. TR-35003, entitled “Research and Development of Ambient Intelligent Service Robots of Anthropomorphic Characteristics”. The researchers are partially supported by the Alexander von Humboldt Foundation (Germany), under the title “Building attributes of artificial emotional intelligence aimed to make robots



feel and sociable as humans (Emotionally Intelligent Robots - EI robots)”, contract no. 3.4-IP-DEU/112623. The authors thank the staff of the intermediate education center “Goša” from Smederevska Palanka (Serbia) for the meticulously administered questionnaires with the human examinees regarding the acquisition of personality attributes.

References

[1] R.A. Brooks, C. Breazeal, R. Irie, C.C. Kemp, M. Marjanović, B. Scassellati, M.M. Williamson, Alternative essences of intelligence, in: Proceedings of the AAAI'98/IAAI'98, American Association for Artificial Intelligence, Menlo Park, CA, USA, 1998, pp. 961–968.

[2] T. Fukuda, R. Michelini, V. Potkonjak, S. Tzafestas, K. Valavanis, M. Vukobratović, How far away is artificial man, IEEE Robot. Autom. Mag. 8 (1) (2001) 66–73.

[3] B.R. Duffy, Anthropomorphism and the social robot, Robot.Auton. Syst. 42 (3–4) (2003) 177–190.

[4] M. Vukobratović, Belgrade school of robotics, Institute MihailoPupin, Belgrade, 2002.

[5] HPC - Human Proportion Calculator (ideal). online at: ⟨http://hpc.anatomy4sculptors.com⟩ (accessed 16.11.14).

[6] Universal Robots A/S, Universal robot UR5. online at: ⟨http://www.universal-robots.com/GB/Products/Downloads.aspx⟩(accessed 16.11.2014).

[7] Barrett Technology Inc., Barrett hand, online at: ⟨http://www.barrett.com/robot/products-hand.htm⟩ (accessed 16.11.2014).

[8] A. Rodić, B. Miloradović, Đ. Urukalo, On developing anthropomimetic robot-arm, in: Proceedings of the 17th International Multiconference Information Society (IS 2014), Volume F, Robotics, Institut “Jožef Štefan”, Ljubljana, Slovenia, October 8th, 2014, pp. 29–32.

[9] Point Grey Research Inc., Flea3 1.3 MP color USB 3.0 camera (e2v EV76C560). online at: ⟨http://www.ptgrey.com/flea3-13-mp-color-usb3-vision-e2v-ev76c560-camera⟩ (accessed 16.11.14).

[10] Sennheiser Electronic GmbH, Sennheiser MK 4—condenser microphone for professional studio recordings, online at: ⟨http://en-de.sennheiser.com/condenser-microphone-studio-recordings-professional-mk-4⟩ (accessed 16.11.14).

[11] ProElectronic, Piezo visokotonski zvučnik KHS106 (Piezo tweeter KHS106), online at: ⟨http://www.proelectronic.rs/PIEZO-VISOKOTONAC-KHS106-KHS106-4367.htm⟩ (accessed 16.11.14).

[12] ProElectronic, Mini zvučnik 45021 (Mini speaker 45021), onlineat: ⟨http://www.proelectronic.rs/MINI-ZVUCNIK-45021-MINI-ZVUCn17-13181.htm⟩ (accessed 16.11.14).

[13] Microsoft Corporation, Microsoft XBOX Kinect, online at:⟨http://www.microsoft.com/education/ww/products/Pages/kinect.aspx⟩ (accessed 16.11.14).

[14] Kinect for Windows team, Revealing Kinect for Windows v2 hardware, online at: ⟨http://blogs.msdn.com/b/kinectforwindows/archive/2014/03/27/revealing-kinect-for-windows-v2-hardware.aspx⟩ (accessed 16.11.14).

[15] CAN with Flexible Data-Rate, BOSCH user manual, Specification Version 1.0 (released April 17th, 2012) ⟨http://www.bosch-semiconductors.de/media/pdf_1/canliteratur/can_fd_spec.pdf⟩.

[16] ⟨http://adl-usa.com/products/detail/103/adle3800pc⟩(accessed 16.11.14).

[17] ⟨http://www.ros.org/⟩ (accessed 16.11.14).

[18] ⟨http://www.universal-robots.com/GB/Products.aspx⟩ (accessed 16.11.14).

[19] ⟨http://www.ni.com/pdf/manuals/373378c.pdf⟩ (accessed 16.11.14).

[20] ⟨http://www.ni.com/labview/⟩ (accessed 16.11.14).

[21] ⟨http://www.ieiworld.com/product_groups/industrial/content.aspx?gid=00001000010000000001&cid=09050665109937740140&id=0D270425292726476606.VIlWUMu85aQ⟩ (accessed 16.11.14).

[22] On-line personality test: ⟨http://www.16personalities.com/free-personality-test⟩ (accessed 20.12.2014).

[23] On-line temperament test: ⟨http://www.neoxenos.org/wp-content/blogs.dir/1/files/temperaments/temperament_test.htm⟩ (accessed 20.12.14).

[24] A. Rodić, K. Addi, Mathematical modeling of human affective behavior aimed to design of robot EI-controller, in: A. Rodic, D. Pisla, H. Bleuler (Eds.), New Trends in Medical and Service Robots: Challenges and Solutions, Mechanisms and Machine Science, vol. 20, Springer Publishing House, 2014, pp. 141–163. http://dx.doi.org/10.1007/978-3-319-05431-5, ISSN: 2211-0984.

[25] A. Rodić, M. Jovanović, How to make robots feel and social ashumans, in: Proceedings of the 6th IARIA International Con-ference on Advanced Cognitive Technologies and Applications(COGNITIVE), Venice, Italy, May 25th–29th, 2014, pp. 133–139.

[26] D.H. Kluemper, Trait emotional intelligence: The impact of core-self evaluations and social desirability, Personal. Individ. Differ. 44 (6) (2008) 1402–1412.

[27] J.D. Mayer, P. Salovey, What is emotional intelligence?, in:P. Salovey, D. Sluyter (Eds.), Emotional Development andEmotional Intelligence: Implications for Educators, BasicBooks, New York, 1997, pp. 3–31.

[28] K.V. Petrides, R. Pita, F. Kokkinaki, The location of traitemotional intelligence in personality factor space, Br. J.Psychol. 98 (2007) 273–289.

[29] D.G. Myers, Psychology, 9th ed., Worth Publishers, New York,2010.

[30] Personal entertainment robot video: ⟨http://youtu.be/wM2eaViT_i0⟩ (accessed 20.12.2014).