motionEAP: An Overview of 4 Years of Combining Industrial Assembly with Augmented Reality for Industry 4.0

Markus Funk¹, Thomas Kosch¹, Romina Kettner¹, Oliver Korn², Albrecht Schmidt¹

¹University of Stuttgart (Pfaffenwaldring 5a, 70569 Stuttgart, Germany)
²University of Applied Sciences Offenburg (Badstr. 24, 77652 Offenburg, Germany)

[email protected]¹, [email protected]²

ABSTRACT
With our society moving towards Industry 4.0, an increasing number of tasks and procedures in manual workplaces are augmented with a digital component. While the research area of Internet-of-Things focuses on combining physical objects with their digital counterpart, the question arises how the interface to human workers should be designed in such Industry 4.0 environments. The project motionEAP focuses on using Augmented Reality for creating an interface between workers and digital products in interactive workplace scenarios. In this paper, we summarize the work that has been done in the motionEAP project over the run-time of 4 years. Further, we provide guidelines for creating interactive workplaces using Augmented Reality, based on the experience we gained.

ACM Classification Keywords
H.5.m. Information Interfaces and Presentation (e.g. HCI): Miscellaneous

Author Keywords
Augmented Reality, Assistive Systems, Industry 4.0, Order Picking, Assembly Workplace

INTRODUCTION
With the proliferation of Internet of Things (IoT) technology, digital information is becoming more and more integrated into everyday life. In manufacturing, too, digital components are playing a larger role, e.g., for quality assurance, configuring products, ordering products, and learning how to produce them. Especially when training new workers, technology can make a large impact, as it can provide cognitive assistance in the workplace [16].

Starting in 2013, the German Federal Ministry for Economic Affairs and Energy funded the project motionEAP¹, which investigated using motion recognition algorithms and Augmented Reality for recognizing a worker's actions and work steps, and further for providing feedback according to the sensed actions directly at the workplace.

¹ motionEAP website: http://www.motioneap.de


Figure 1. Our assistive system for the manual assembly workplace uses in-situ projection to provide assembly instructions and uses motion recognition to become context-aware.

The consortium consists of the industrial partners Audi AG, Schnaithmann Maschinenbau GmbH, GWW - Gemeinnützige Werkstätten und Wohnstätten GmbH, BESSEY Tool GmbH & Co. KG, KORION Simulation & Assistive Technology GmbH, and Robert Bosch GmbH, and the research partners University of Applied Sciences Esslingen and University of Stuttgart. The project has a run-time until the end of 2016. The project addresses three main aspects: manual assembly, order picking, and ethical considerations regarding assistive technology at the workplace. The ethical considerations are described by Behrendt et al. [3]. The contribution of this paper is a summary of the motionEAP project and a set of 8 guidelines that we learned from the manual assembly scenario and the order picking scenario.

BUILT PROTOTYPES AND CONDUCTED STUDIES
In the context of the motionEAP project, we built different prototypes, each investigating a different aspect of using Augmented Reality for cognitive assistance at the workplace. The two large areas are prototypes for manual assembly workplaces and order picking tasks.

The Manual Assembly Workplace
For augmenting the manual assembly workplace, we built an assistive system using in-situ projection and activity recognition. The system is capable of projecting in-situ instructions for communicating to the workers how to perform a work step.

Figure 2. Four systems that were built in the motionEAP project: (a) a single workplace, (b) a multi-user workplace consisting of three workplaces aligned in a U-shape, (c) OrderPickAR, a cart-mounted order picking system, and (d) HelmetPickAR, a user-mounted order picking system.

Simultaneously, the system is capable of recognizing a worker's performed actions, e.g., picking a part from a bin or assembling a part at the workpiece carrier. The workplace can be used in two ways: as a single workplace (see Figure 2a) or as a multi-user workplace (see Figure 2b). Apart from presenting instructions and recognizing assembly steps, our system is, e.g., capable of using tangibles as input for digital functions [8], or finding appropriate spots to project the feedback based on surface suitability [12]. As user levels differ from beginner to expert, we implemented a concept for providing adaptive assembly instructions. The system can provide three levels of visual feedback: beginner mode, advanced mode, and expert mode [5, 7]. For creating instructions, we use a Programming by Demonstration (PbD) concept, where an expert worker demonstrates how a product is assembled correctly. Afterwards, the system automatically generates a visual instruction for beginner and advanced users [9].
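As an illustration of the adaptation concept, the following minimal sketch selects one of the three feedback levels from a worker's number of error-free repetitions; the thresholds are hypothetical and do not reproduce the actual adaptation logic from [5, 7].

```python
from enum import Enum

class FeedbackLevel(Enum):
    BEGINNER = 1   # full contour projection and pick indication
    ADVANCED = 2   # reduced visual feedback
    EXPERT = 3     # feedback only when an error is detected

# Hypothetical thresholds; the actual adaptation logic is described in [5, 7].
def select_feedback_level(error_free_repetitions: int) -> FeedbackLevel:
    if error_free_repetitions < 5:
        return FeedbackLevel.BEGINNER
    if error_free_repetitions < 20:
        return FeedbackLevel.ADVANCED
    return FeedbackLevel.EXPERT
```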

We conducted several studies for manual assembly workplaces with both cognitively impaired workers and non-impaired workers. Considering the cognitively impaired workers, we first conducted a study for finding the most appropriate visual instruction visualization [4]. Thereby, we found that a contour-based instruction performs best for all levels of cognitive impairment. Afterwards, we compared haptic, auditory, and visual error feedback [23]. We found that visual error feedback is preferred by the cognitively impaired workers, while haptic feedback leads to anxiety among the workers. In another study, we found that, using our assistive system, cognitively impaired workers can assemble products containing up to 48 parts without a noticeable drop in performance [15]. To further motivate the workers, we experimented with different gamification concepts to increase worker motivation [18, 19, 20, 21]. Finally, we investigated the long-term impact of in-situ projected instructions on cognitively impaired workers [2].

For workers without impairments, we conducted a study evaluating the instructions created by PbD by comparing them to traditional instructions [13]. Another study focused on finding the best error feedback modality for non-impaired workers, which revealed that a combination of haptic and visual feedback is the most promising [6].

Order Picking
In the domain of order picking, we developed three different systems investigating different design dimensions: a stationary system [1], a cart-mounted system called OrderPickAR [17] (see Figure 2c), and a user-worn system called HelmetPickAR [14] (see Figure 2d). All systems have been compared to several state-of-the-art instruction methods.

A Benchmark for AR Instructions
As a result of our research in using Augmented Reality (AR) for assembly instructions, we introduced a method called the General Assembly Task Model (GATM) to evaluate assembly instructions more accurately, and provided a reference assembly task [10]. Further, we used our GATM method to compare our assistive system to other AR technologies for providing assembly instructions at the manual assembly workplace [11].
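For illustration, a reference assembly task for such a benchmark can be represented as a sequence of typed work steps. The field names below are hypothetical and only indicate the kind of information such a task model records; the actual GATM is defined in [10].

```python
from dataclasses import dataclass
from typing import List

@dataclass
class WorkStep:
    # Hypothetical fields illustrating what a benchmark task model
    # could record per step; the actual GATM is defined in [10].
    step_id: int
    action: str        # e.g., "pick" or "assemble"
    source_bin: int    # bin to pick from (pick steps)
    target_zone: str   # where the part is placed (assembly steps)

# A toy reference task: pick a part, then mount it on the carrier.
reference_task: List[WorkStep] = [
    WorkStep(1, "pick", source_bin=3, target_zone=""),
    WorkStep(2, "assemble", source_bin=0, target_zone="carrier-left"),
]
```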

LESSONS LEARNED AND GUIDELINES
Based on the experience we gained from using our assistive system in different scenarios and with different user groups, we propose guidelines and recommendations for designing and implementing assistive systems for the workplace. Although these guidelines and recommendations were inspired by using assistive systems in workplace scenarios, we believe that they can be transferred to other scenarios involving assistive systems, e.g., a smart kitchen scenario (cf. [9]).

1. Keep the shown feedback simple:
When designing instructions for assistive systems, two requirements should be considered: First, the instructions should contain all information relevant for the task. Second, the instructions should be as simple as possible. As the two requirements conflict, the minimum trade-off that still fulfills both needs to be found. In our studies [4, 15], we found that displaying text should be avoided, as some workers are not able to read and some may have foreign native tongues. Further, we found that videos and complex pictures should also be avoided, as they transfer a lot more information than necessary to complete a task. As the best trade-off between the two requirements, we found that displaying contour information for showing assembly steps is a good way of fulfilling both requirements (see [4]).

2. Display direct feedback:
Assistive systems have many possibilities to display feedback and instructions to users. We learned that good feedback should fulfill two requirements: it should be displayed directly at the position where an action is required, and it should be context-aware to match the performed actions. When feedback is displayed on a screen located next to where the actions are performed, users have to transfer the information shown on the screen to the real world. This requires cognitive effort and consumes time. In our studies [13, 15], we found that using in-situ projection to display feedback directly on the area where the action has to be performed is faster and less cognitively demanding. Secondly, instructions and feedback should be pre-selected based on context. Where in a paper-based reference manual the correct page needs to be found first, an interactive system can select the appropriate feedback or instruction automatically.
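A minimal sketch of this pre-selection, assuming hypothetical detect_action and project callables rather than the actual APIs of our system: the loop always presents the instruction for the current step and reacts to mismatching sensed actions with context-aware error feedback.

```python
from typing import Callable, List

def run_assistance(
    task: List[str],                       # expected actions, in order
    detect_action: Callable[[], str],      # hypothetical activity recognizer
    project: Callable[[str], None],        # hypothetical in-situ projector
) -> None:
    """Pre-select and project the instruction for the current step
    automatically, rather than making the user search for it."""
    for expected in task:
        project(f"next: {expected}")        # shown directly at the action position
        while detect_action() != expected:  # context-aware error feedback
            project(f"wrong action, expected: {expected}")
```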

3. Design for hands-free usage:
Operating an assistive system should not result in additional effort and should not limit the user in performing their tasks. Therefore, we require assistive systems to be designed for hands-free usage. If the user had to carry a remote control for interacting with the assistive system, the user's hands would always be occupied by the controller. We argue for using environment-mounted activity recognition for interacting with assistive systems. For explicitly triggering functions or entering different modes, touch screens or gestures can be used.

4. Equip the environment rather than the user:
When analyzing the design space of assistive systems, the dimension of where to put the technology is very important. Assistive systems can be mounted in the environment or can be carried by the user. Hybrid approaches also exist, where assistive systems are placed away from the user, but the user can carry the technology on demand (e.g., the OrderPickAR cart [17]). Based on many studies with expert workers and cognitively impaired workers [4, 13, 15], we recommend equipping the environment with the technology of the assistive system rather than equipping the users. We learned that users do not want to wear any additional piece of technology when performing a work task. Further, users sometimes leave the work area to perform other tasks; they would then have to take off the technology and put it on again when resuming the task. Also, when assistive systems are placed in the environment, multiple users can benefit from them.

5. Strive for intuitive natural interaction:
For integrating an assistive system seamlessly into a scenario in the physical world, the interaction with the assistive system should also happen in the physical world. We have to distinguish two scenarios: interacting with the system regularly and programming procedures or workflows for the system. As the users of assistive systems sometimes have problems using a computer or cannot deal with a complex GUI, interaction with the system should be based on gestures and detected activity. Further, as designers of assistive systems, we should keep in mind that users who teach workflows or procedures to assistive systems usually do not have a programming background either. Although these users are experts in the task they are teaching, this does not necessarily mean they are experts in using computers. Therefore, we suggest also using natural interaction for programming assistive systems.

6. Design for personalized feedback:
During our studies, we tried to find the best feedback both for communicating work instructions [4] and for presenting errors [6, 23]. Although we found that a contour visualization of assembly positions is in general the best way to communicate assembly instructions, there were still users who preferred watching an assembly video or looking at pictures of the assembly. Also, for communicating errors that occur at the workplace, we found that a combination of visual and haptic feedback is a good trade-off between privacy and error-awareness for non-impaired workers. Despite these results, some users of the assistive system preferred auditory feedback over the visual and haptic feedback. Further, cognitively impaired users liked receiving positive feedback after a work step was performed. As designers of assistive systems, we should take these facts into account. Therefore, the default feedback should represent the best feedback for the overall population. However, we should provide the opportunity to personalize the feedback and adjust it to the preferences of the users.
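One way to realize such personalization is a per-user preference record that overrides a population-wide default. The following is a minimal sketch under that assumption; the field and modality names are illustrative, not taken from our system.

```python
from dataclasses import dataclass

@dataclass
class FeedbackPreferences:
    # Defaults follow our study results: contour instructions [4] and
    # combined visual + haptic error feedback [6] for the overall population.
    instruction_style: str = "contour"               # or "video", "picture"
    error_modalities: tuple = ("visual", "haptic")   # or ("auditory",)
    positive_feedback: bool = False                  # praise completed steps

# A worker who prefers auditory error feedback and likes positive feedback.
prefs = FeedbackPreferences(error_modalities=("auditory",),
                            positive_feedback=True)
```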

7. Enable users to control the speed:
Especially when using assistive systems for augmenting work processes, it is important not to rush the user. Instead, users should be able to perform the steps at their own pace. This results in one important design decision: all actions that advance instructions or feedback should be triggered by the user and not by the system. Only if the user initiates the process does the user feel in control (cf. [24]). If the system advanced instructions after a defined amount of time, the system would set the pace of the task. We assume that this would lead to more stress at the workplace. Additionally, for explicitly interacting with the system, e.g., for skipping a work step or replaying the previous work step, we integrated a foot pedal that can be pressed by the user to control the feedback.
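This design decision maps onto a simple event-driven control loop in which the instruction pointer moves only on user-initiated events, never on a timer. A minimal sketch, assuming a hypothetical blocking next_event() that reports recognized step completions and pedal presses:

```python
from typing import Callable, List

def run_paced_by_user(
    task: List[str],
    next_event: Callable[[], str],   # hypothetical: blocks until a user event
    show: Callable[[str], None],     # hypothetical: displays an instruction
) -> None:
    """Advance instructions only on user-initiated events, never on a timer."""
    index = 0
    while index < len(task):
        show(task[index])                  # current step, at the user's pace
        event = next_event()               # 'step_done', 'pedal_skip', 'pedal_back'
        if event in ("step_done", "pedal_skip"):
            index += 1                     # user completed or skipped the step
        elif event == "pedal_back":
            index = max(0, index - 1)      # replay the previous work step
```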

8. Add motivating quantified-self information:
Occasionally, participants asked how many items were left in the current task and how fast they were. Some participants suggested having this information always present during their work tasks. This could be as simple as displaying a progress bar which fills up as more work steps are completed, or as complex as a leaderboard that shows which worker made the fewest errors or produced the most parts. Displaying quantified-self information can be closely linked to adding gamification elements to enhance work processes, which was suggested by Korn et al. [19, 20, 22]. We believe that designing assistive systems in a way that users can always view their quantified-self information will lead to higher motivation during monotonous tasks.
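Such a progress indicator needs only the step counts that a context-aware assistive system tracks anyway. A minimal sketch of a textual progress bar:

```python
def progress_bar(completed_steps: int, total_steps: int, width: int = 20) -> str:
    """Render a textual progress bar from the step counts the
    assistive system already tracks."""
    filled = round(width * completed_steps / total_steps)
    return "[" + "#" * filled + "-" * (width - filled) + f"] {completed_steps}/{total_steps}"

print(progress_bar(12, 48))  # "[#####---------------] 12/48"
```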

CONCLUSION
In this paper, we presented an overview of the project motionEAP, which focuses on using Augmented Reality for providing cognitive assistance at the workplace. We summarized the research prototypes that were implemented during the project run-time and provided an overview of the performed studies. Finally, we presented a set of eight guidelines to support interaction designers, engineers, and researchers in creating assistive systems for the workplace.

ACKNOWLEDGMENTS
This work was funded by the German Federal Ministry for Economic Affairs and Energy in the project motionEAP, grant no. 01MT12021E.

REFERENCES
1. Andreas Baechler, Liane Baechler, Sven Autenrieth, Peter Kurtz, Georg Kruell, Thomas Hoerz, and Thomas Heidenreich. 2016a. The Development and Evaluation of an Assistance System for Manual Order Picking - Called Pick-by-Projection - with Employees with Cognitive Disabilities. In Proc. ICCHP'16. Springer, 321–328.
2. Liane Baechler, Andreas Baechler, Markus Funk, Sven Autenrieth, Georg Kruell, Thomas Hoerz, and Thomas Heidenreich. 2016b. The Use and Impact of an Assistance System for Supporting Participation in Employment for Individuals with Cognitive Disabilities. In Proc. ICCHP'16. Springer, 329–332.
3. Hauke Behrendt, Markus Funk, and Oliver Korn. 2015. Ethical Implications Regarding Assistive Technology at Workplaces. In Collective Agency and Cooperation in Natural and Artificial Systems. Springer, 109–130.
4. Markus Funk, Andreas Bächler, Liane Bächler, Oliver Korn, Christoph Krieger, Thomas Heidenreich, and Albrecht Schmidt. 2015a. Comparing Projected In-Situ Feedback at the Manual Assembly Workplace with Impaired Workers. In Proc. PETRA'15. ACM, 1.
5. Markus Funk, Tilman Dingler, Jennifer Cooper, and Albrecht Schmidt. 2015b. Stop Helping Me - I'm Bored! Why Assembly Assistance Needs to Be Adaptive. In Adjunct Proc. UbiComp'15. ACM, 1269–1273.
6. Markus Funk, Juana Heusler, Elif Akcay, Klaus Weiland, and Albrecht Schmidt. 2016. Haptic, Auditory, or Visual? Towards Optimal Error Feedback at Manual Assembly Workplaces. In Proc. PETRA'16.
7. Markus Funk, Oliver Korn, and Albrecht Schmidt. 2014a. Assistive Augmentation at the Manual Assembly Workplace Using In-Situ Projection. In Proc. CHI'14 EA.
8. Markus Funk, Oliver Korn, and Albrecht Schmidt. 2014b. An Augmented Workplace for Enabling User-Defined Tangibles. In Proc. CHI'14 EA. ACM, 1285–1290.
9. Markus Funk, Oliver Korn, and Albrecht Schmidt. 2015. Enabling End Users to Program for Smart Environments. In Proc. CHI'15 Workshops.
10. Markus Funk, Thomas Kosch, Scott W. Greenwald, and Albrecht Schmidt. 2015. A Benchmark for Interactive Augmented Reality Instructions for Assembly Tasks. In Proc. MUM'15. ACM, 253–257.
11. Markus Funk, Thomas Kosch, and Albrecht Schmidt. 2016. Interactive Worker Assistance: Comparing the Effects of Head-Mounted Displays, In-Situ Projection, Tablet, and Paper Instructions. In Proc. UbiComp'16.
12. Markus Funk, Thomas Kosch, Katrin Wolf, Pascal Knierim, Sven Mayer, and Albrecht Schmidt. 2016a. Automatic Projection Positioning Based on Surface Suitability. In Proc. PerDis'16. ACM, 75–79.
13. Markus Funk, Lars Lischke, Sven Mayer, Alireza Sahami Shirazi, and Albrecht Schmidt. 2016b. Assistive Augmentation at the Workplace: A System for Creating Semantically-Rich Assembly Instructions. In Assistive Augmentation: Cognitive Science and Technology (to appear). Springer.
14. Markus Funk, Sven Mayer, Michael Nistor, and Albrecht Schmidt. 2016. Mobile In-Situ Pick-by-Vision: Order Picking Support Using a Projector Helmet. In Proc. PETRA'16, Vol. 4.
15. Markus Funk, Sven Mayer, and Albrecht Schmidt. 2015. Using In-Situ Projection to Support Cognitively Impaired Workers at the Workplace. In Proc. ASSETS'15. ACM, 185–192.
16. Markus Funk and Albrecht Schmidt. 2015. Cognitive Assistance in the Workplace. IEEE Pervasive Computing 14, 3 (2015), 53–55.
17. Markus Funk, Alireza Sahami Shirazi, Sven Mayer, Lars Lischke, and Albrecht Schmidt. 2015. Pick from Here! An Interactive Mobile Cart Using In-Situ Projection for Order Picking. In Proc. UbiComp'15. ACM, 601–609.
18. Oliver Korn, Markus Funk, Stephan Abele, Thomas Hörz, and Albrecht Schmidt. 2014. Context-Aware Assistive Systems at the Workplace: Analyzing the Effects of Projection and Gamification. In Proc. PETRA'14. 38.
19. Oliver Korn, Markus Funk, and Albrecht Schmidt. 2015a. Design Approaches for the Gamification of Production Environments: A Study Focusing on Acceptance. In Proc. PETRA'15. ACM, 6.
20. Oliver Korn, Markus Funk, and Albrecht Schmidt. 2015b. Towards a Gamification of Industrial Production: A Comparative Study in Sheltered Work Environments. In Proc. EICS'15. ACM, 84–93.
21. Oliver Korn, Peter Muschick, and Albrecht Schmidt. 2016. Gamification of Production? A Study on the Acceptance of Gamified Work Processes in the Automotive Industry. In Proc. AHFE'16.
22. Oliver Korn, Albrecht Schmidt, and Thomas Hörz. 2013. The Potentials of In-Situ-Projection for Augmented Workplaces in Production: A Study with Impaired Persons. In Proc. CHI'13 EA. ACM, 979–984.
23. Thomas Kosch, Romina Kettner, Markus Funk, and Albrecht Schmidt. 2016. Comparing Tactile, Auditory, and Visual Assembly Error-Feedback for Workers with Cognitive Impairments. In Proc. ASSETS'16. ACM.
24. Ben Shneiderman. Designing the User Interface: Strategies for Effective Human-Computer Interaction. Vol. 3.