
Deliverable 4.3 Low-fidelity Prototypes of mixed pipeline for videogame development

CR-PLAY "Capture-Reconstruct-Play: an innovative mixed pipeline for videogames development"

Grant Agreement ICT-611089-CR-PLAY
Start Date 01/11/2013
End Date 31/10/2016


Document Information
Deliverable number: 4.3
Deliverable title: Low-fidelity Prototypes of mixed pipeline for videogame development
Deliverable due date: 31/10/2014
Actual date of delivery: 31/10/2014
Main author(s): Ivan Orvieto, Matteo Romagnoli (TL)
Main contributor(s): George Drettakis, Jerome Esnault (INRIA), Michael Goesele, Jens Ackermann, Fabian Langguth, Stefan Guthe (TUD), Corneliu Ilisescu, Gabriel Brostow (UCL)

Version: 1.0

Versions Information
0.1 (16/09/2014): Structure and contributors
0.2 (26/10/2014): First draft of Unity Editor Tools
0.3 (02/10/2014): Reconstruction, IBR, VBR contributions
0.4 (10/10/2014): Pre-final version produced and sent to partners for comments
0.5 (23/10/2014): Final version produced for internal proof-reading
1.0 (29/10/2014): Final version

Dissemination Level PU Public X

PP Restricted to other programme participants (including the Commission Services)

RE Restricted to a group specified by the consortium (including the Commission Services)

CO Confidential, only for members of the consortium (including the Commission Services)

Deliverable Nature R Report

P Prototype X

D Demonstrator

O Other

CR-PLAY Project Information
The CR-PLAY project is funded by the European Commission, Directorate General for Communications Networks, Content and Technology, under the FP7-ICT programme. The CR-PLAY Consortium consists of:

Coordinator:
1. Testaluna S.R.L. (TL), Italy

Other Beneficiaries:
2. Institut National de Recherche en Informatique et en Automatique (INRIA), France
3. University College London (UCL), UK
4. Technische Universitaet Darmstadt (TUD), Germany
5. Miniclip UK Limited (MC), UK
6. University of Patras (UPAT), Greece
7. Cursor Oy (CUR), Finland


Summary

This is the third deliverable of Work Package 4: "Design and Development (Requirements, Functional Specifications, Prototypes)". The leader of this work package is TL, with the involvement of all other partners. The objective of this work package is to gather end-user needs, turn them into functional specifications, and create the prototypes of the mixed pipeline for videogame development. This WP puts into practice the user-centred design approach adopted by CR-PLAY, ensuring that the technologies developed will result in tools that are effective and usable for professional and semi-professional use.

This deliverable, "D4.3 Low-fidelity Prototypes of mixed pipeline for videogame development", describes the results of Task 4.3 "Low-fidelity Prototypes". D4.3 accompanies the Low-fidelity prototype, explaining the main components of the mixed pipeline, their integration and communication in the common platform, and the related risks and contingency actions.

The structure of this deliverable is as follows:

Section 1 presents the mixed pipeline developed from user requirements (D4.1) and general architecture (D4.2), focusing on the main components and tools that provide end-users with the CR-PLAY experience. It presents the Reconstruction Tool, the IBR and VBR integration, and the Editor Tools developed to support game developers during game creation.

Section 2 presents the main risks and related contingency actions that are foreseeable at this stage of the project. These will be updated continuously as development activities progress during the project lifetime.

Finally, Section 3 draws conclusions and describes the next steps of the development activities in WP4.

Technical instructions on how to install, set up and use the pipeline are provided as an Annex at the end of this deliverable.


Table of Contents

Summary
Table of Contents
Abbreviations and Acronyms
Introduction
1. The Low-fidelity prototype
   1.1 Reconstruction tool
   1.2 Image Based Rendering integration into Unity
   1.3 Video Based Rendering integration in Unity
   1.4 Unity Editor Tools
2. Risks and contingency plan
3. Conclusion
   3.1 Plan for next period
References
Annexes
   Low-fidelity Prototype: instruction manual
      Guidelines for capturing datasets
      How to use the Reconstruction Tool
      How to use CR-PLAY mixed pipeline in Unity


Abbreviations and Acronyms

IBR: Image Based Rendering

VBR: Video Based Rendering

WP: Work Package

MVE: Multi-View Environment

VSFM: Visual Structure from Motion

PMVS: Patch-based Multi-View Stereo

GLSL: OpenGL Shading Language

GUI: Graphical User Interface


Introduction

The development of a mixed pipeline for videogame creation is a complex process that involves strategic planning, the implementation of features, and their technical validation. After the first six months, devoted to requirements collection (D4.1) and system definition (D4.2), the effort of WP4 focused on integrating into Unity i) the reconstruction tools (WP1) and ii) the first releases of the IBR and VBR algorithms (WP2 and WP3). Unity is the common platform for game development chosen for CR-PLAY, as described in D4.2.

The result of this integration is the Low-fidelity prototype, a first release of the CR-PLAY mixed pipeline that will allow end-users (game developers) to start using the CR-PLAY system in their common workflows.

This document describes all the components that comprise the CR-PLAY architecture (as defined in Task 4.2), and the data-flow from Capture to Play, passing through the Reconstruction Tool and the Unity Editor Tools, that together support game developers during the span of the game creation process.

Figure 1: General Architecture from D4.2

1. The Low-fidelity prototype

As a result of the activities of WP4 during the first year, the Low-fidelity prototype represents the initial implementation of the mixed pipeline for videogame development.


It combines and integrates technologies developed in WP1 (starting from the reconstruction algorithm developed by TUD [Goesele07, Fuhrmann14, libMVE]), WP2 (starting from the depth-synthesis and local warp Image-Based Rendering algorithm of INRIA [Chaurasia2013]) and WP3 (starting from the Video-Based Rendering techniques implemented based on Schoedl et al.'s method [Schoedl00]).

This section describes software components that have been created in WP4 to allow such technologies to be integrated and used within Unity, together forming the Low-fidelity prototype.

1.1 Reconstruction tool

As part of the Low-fidelity prototype, significant effort went into developing the communication between the 3D Reconstruction from images (TUD) and the depth-synthesis step for Image Based Rendering (INRIA). The result is the Reconstruction tool, a centralized and automatic process that supports the operations needed to build datasets. This work is mentioned here for completeness, but is fully described in D1.1 as part of the WP1 work in Year 1.

1.2 Image Based Rendering integration into Unity

The Image Based Rendering runtime application is divided into three parts:

- Loading of dataset files into internal data structures (camera parameters and superpixel depths).
- Loading of input images and textures as data structures into the graphics card, and preparing the warping structures.
- Launching the rendering loop (OpenGL based):
  1. Select the best 4 input cameras based on the novel camera position/orientation.
  2. Solve the linear system (solver step) and then render 4 warped images (warp shader).
  3. Select the best values to be used for the novel view's image (blend shader).
  4. Fill holes of the novel view's image using Poisson synthesis (poisson shader).

IBRManager is the component (UnityBehavior) which centralizes IBR data in the Unity environment, and is responsible for loading input parameters of images and cameras. Other dataset files, such as depth maps and superpixels, are loaded by the Unity C++ plugin that combines these data to create 32bit runtime textures needed by IBR shaders. As Unity does not support 32bit texture assets, the plugin is also called to convert those data into standard 8bit RGBA textures that can be easily stored among other standard Unity assets. The Unity C++ plugin also initializes the linear solver for warped sparse matrix resolution, and returns the 2D triangulated mesh of each input image for the warp rendering step.
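To illustrate why 8bit RGBA assets can carry the 32bit data mentioned above, the following C# sketch shows the general packing trick: a normalized float value is spread over the four 8-bit channels of a Color32 and recovered later. This is only an illustration of the technique; the channel layout actually used by the CR-PLAY plugin may differ.

using UnityEngine;

// Illustrative helper: packs a normalized float into four 8-bit channels and back.
// This shows the general idea of storing 32-bit data in standard RGBA32 texture assets;
// the encoding scheme of the CR-PLAY plugin may differ.
public static class FloatPacking
{
    // Encode a float in [0, 1] into 4 bytes (R holds the most significant bits).
    public static Color32 Encode(float value)
    {
        value = Mathf.Clamp01(value);
        uint bits = (uint)(value * 4294967295.0);   // spread the value over 32 bits
        return new Color32((byte)(bits >> 24), (byte)(bits >> 16), (byte)(bits >> 8), (byte)bits);
    }

    // Decode the 4 bytes back into the original normalized float.
    public static float Decode(Color32 c)
    {
        uint bits = ((uint)c.r << 24) | ((uint)c.g << 16) | ((uint)c.b << 8) | c.a;
        return bits / 4294967295.0f;
    }
}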

Once the IBR is initialized, the actual rendering operations are managed by the IBRRenderer Unity component. It starts by selecting the 4 most suitable input cameras based on their position and rotation relative to the game camera. It then asks the Unity C++ plugin to solve the sparse linear system for these selected images, to warp the images, and to return the result used for the first warping pass performed by the Unity shaders. Finally, Unity applies the blending pass followed by the hole-filling final rendering step, and outputs the resulting image on screen.
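A minimal C# sketch of the camera-selection step follows. It is not the actual IBRRenderer implementation, and the scoring weight is arbitrary: the captured input cameras are ranked by a score that combines distance and viewing-direction agreement with the current game camera, and the best four are kept for the warp and blend passes.

using System.Linq;
using UnityEngine;

// Minimal sketch (not the actual IBRRenderer code): rank the captured input cameras by
// distance and viewing-direction agreement with the game camera, keep the best four.
public class InputCameraSelector : MonoBehaviour
{
    // Poses of the captured input cameras, filled elsewhere (e.g. by the dataset loader).
    public Transform[] inputCameras;

    public Transform[] SelectBestFour(Camera gameCamera)
    {
        Vector3 pos = gameCamera.transform.position;
        Vector3 fwd = gameCamera.transform.forward;

        return inputCameras
            .OrderBy(c => Vector3.Distance(c.position, pos)            // prefer nearby cameras
                          + (1f - Vector3.Dot(c.forward, fwd)) * 5f)   // and similar orientation (weight is arbitrary)
            .Take(4)
            .ToArray();
    }
}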


Figure 2: IBR component integration scheme

To allow real-time control over the rendering loop, IBR parameters are available through the Unity Inspector window associated with the IBRRenderer component.

Figure 3: IBR parameters in the Unity Inspector window

1.3 Video Based Rendering integration in Unity

VBR capabilities have been added to Unity through a number of scripts that both prepare the data produced by the VBR pre-processing pipeline and correctly visualize them in a Unity scene.

The present VBR pre-processing pipeline is an external tool (not yet integrated in the CR-PLAY mixed pipeline) that takes a video and extracts video textures as a set of images that, when played, show a (looping) dynamic object (e.g. a flag fluttering in the wind). Images are then aggregated in a single zip file and added to the Unity project, where a custom script takes care of uncompressing the contents to the correct location.

Once VBR data are loaded in Unity, the VBRBehaviour component is responsible for showing them on dedicated game objects, giving game developers the possibility to control the video playback, indefinitely swapping the images to play the user-defined video texture. Each VBR dedicated game object contains another specific component that makes it behave as a billboard video-texture.


This way, it is possible to improve the sense of immersion: the billboard automatically turns to face the game camera, ensuring the user can never walk around it to see its side or back.
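The following C# sketch merges the two behaviours described above into one component for brevity: it cycles through the extracted video-texture frames at a fixed framerate and keeps the quad facing the game camera. The component and field names are illustrative, not the actual VBRBehaviour API.

using UnityEngine;

// Rough sketch combining the playback and billboard behaviours described above.
// Field names and the component name are illustrative, not the actual VBRBehaviour API.
[RequireComponent(typeof(Renderer))]
public class BillboardVideoTexture : MonoBehaviour
{
    public Texture2D[] frames;      // frames extracted by the VBR pre-processing pipeline
    public float framerate = 25f;   // playback speed in frames per second
    public Camera targetCamera;     // usually the game camera

    private Renderer rend;

    void Start()
    {
        rend = GetComponent<Renderer>();
        if (targetCamera == null) targetCamera = Camera.main;
    }

    void Update()
    {
        if (frames == null || frames.Length == 0) return;

        // Loop indefinitely through the frames of the video texture.
        int index = (int)(Time.time * framerate) % frames.Length;
        rend.material.mainTexture = frames[index];

        // Billboard: keep the quad upright and turned towards the camera.
        Vector3 toCamera = targetCamera.transform.position - transform.position;
        toCamera.y = 0f;
        if (toCamera.sqrMagnitude > 0.0001f)
            transform.rotation = Quaternion.LookRotation(toCamera);   // flip the sign if your quad faces the other way
    }
}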

1.4 Unity Editor Tools

The main objective of the Unity Editor Tools is to integrate IBR and VBR technologies in a common workflow. This workflow allows developers to integrate IBR and VBR assets in a Unity project, and to deploy the game on a target platform. In this section, tools developed to achieve this objective are described and contextualized in a general development workflow.

IBR and VBR assets consist of heterogeneous files organized in specific folder hierarchies and aggregated into a single zip file (one for the IBR data and one for the VBR data). This choice was adopted for two main reasons: to avoid having developers deal with more than one file per dataset, and to let developers import datasets without necessarily knowing how a particular dataset is organized.

For this purpose, an Importer Tool has been created that is able to scan specific folders looking for available datasets. Developers can then choose the dataset to be imported, and the Importer Tool will automatically decompress all the needed files into the correct locations for the project. Since the IBR C++ Plugin also needs to access the IBR dataset at runtime (in the deployed game), the dataset is imported into the StreamingAssets folder, a special Unity folder that is automatically copied within the final executable package when the game is deployed. This way, possible errors and asset misplacement are minimized. VBR datasets do not need to be accessible by external plugins, so they can be imported into a normal folder and then loaded into the scene at a later stage.

Figure 4: Dataset Importer Tool
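As a rough sketch of what such an importer does, the following editor script (to be placed in an Editor folder) extracts a selected dataset zip into StreamingAssets and refreshes the asset database. The menu path, method names and the use of System.IO.Compression are assumptions for illustration; the actual Importer Tool works from its own Import window, as described in the Annex.

using System.IO;
using System.IO.Compression;   // assumes a .NET profile where ZipFile is available
using UnityEditor;
using UnityEngine;

// Hypothetical importer sketch: extract an IBR dataset zip into StreamingAssets so that
// the native IBR plugin can read it at runtime, then let Unity pick up the new files.
public static class DatasetImporterSketch
{
    [MenuItem("CR-PLAY Sketch/Import IBR Dataset")]
    public static void ImportIbrDataset()
    {
        string zipPath = EditorUtility.OpenFilePanel("Select IBR dataset", "", "zip");
        if (string.IsNullOrEmpty(zipPath)) return;

        // IBR data must live in StreamingAssets so it is copied into the deployed game.
        string targetDir = Path.Combine(Application.streamingAssetsPath,
                                        Path.GetFileNameWithoutExtension(zipPath));
        Directory.CreateDirectory(targetDir);
        ZipFile.ExtractToDirectory(zipPath, targetDir);

        AssetDatabase.Refresh();   // make the imported files visible in the Project view
        Debug.Log("Imported IBR dataset to " + targetDir);
    }
}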

Due to their intrinsic differences, IBR and VBR data need to be handled in different ways. To save developers from having to create game objects from scratch, different Prefabs (pre-configured game objects) are provided, and are ready to be directly instantiated in the editor's scene.

IBR game objects are provided with two different tools: the IBR Dataset Loader and the Superpixel Texture Creator. The IBR Dataset Loader loads the IBR scene proxy model and input cameras into the editor's scene, allowing developers to have a precise reference for the IBR position and size. These are useful for placing traditional content alongside IBR assets, and for understanding which areas in the scene are preferable for placing game cameras. This tool also loads the initial data structures needed for subsequent IBR rendering steps. At runtime, it shows the currently active cameras, visually conveying which positions are more suitable for certain points of view.


Figure 5: IBR Dataset Loader. Proxy 3D model of captured scene and cameras are displayed.
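A small sketch of the kind of editor feedback shown in Figure 5: given the poses of the captured input cameras, a component can draw a frustum gizmo for each, so developers can see where game cameras are likely to render well. The component and field names are illustrative, not the IBR Dataset Loader's actual implementation.

using UnityEngine;

// Sketch of editor feedback similar to Figure 5: draw a frustum gizmo for every captured
// input camera so developers can judge where game cameras will render well.
public class InputCameraGizmos : MonoBehaviour
{
    public Transform[] inputCameras;   // filled by the dataset loader
    public float fieldOfView = 60f;    // assumed capture field of view, for visualization only

    void OnDrawGizmos()
    {
        if (inputCameras == null) return;

        Gizmos.color = Color.cyan;
        foreach (Transform cam in inputCameras)
        {
            if (cam == null) continue;
            // Draw the frustum in the local space of each captured camera pose.
            Gizmos.matrix = Matrix4x4.TRS(cam.position, cam.rotation, Vector3.one);
            Gizmos.DrawFrustum(Vector3.zero, fieldOfView, 2f, 0.1f, 1.5f);
        }
    }
}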

Once the scene is correctly loaded, developers need to use the Superpixel Texture Creator, which creates the intermediate 8bit RGBA textures (4 for each input image + 3 for the superpixels' graphs). It stores them in the Resources folder, a special Unity folder containing data that are not loaded by Unity at startup but that need to be explicitly loaded by the game in later stages. When the game starts, these 8bit textures are merged into two float Render Textures that are fed into the runtime shaders. The Superpixel Texture Creator does not need to be run every time the scene is loaded because, once generated, the float textures can be used to render IBR scenes at any time; they need to be regenerated only when the input dataset changes.

Figure 6: Superpixel Texture Creator
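The runtime merge step can be pictured with the following C# sketch, which loads two stored 8bit textures from Resources and combines them into a single float-format texture using the FloatPacking helper sketched in Section 1.2. The real tool merges four textures into two float render textures and its channel layout may differ; this only illustrates the mechanism under that simplifying assumption.

using UnityEngine;

// Illustrative runtime merge: rebuild float data from 8-bit RGBA textures stored in Resources.
// Texture names, channel layout and the one-float-per-texture assumption are simplifications.
public class FloatTextureMerger : MonoBehaviour
{
    public Texture2D MergeToFloatTexture(string resourcePathA, string resourcePathB)
    {
        // The source textures must be marked readable in their import settings.
        Texture2D a = Resources.Load<Texture2D>(resourcePathA);
        Texture2D b = Resources.Load<Texture2D>(resourcePathB);

        var result = new Texture2D(a.width, a.height, TextureFormat.RGBAFloat, false);
        Color32[] pa = a.GetPixels32();
        Color32[] pb = b.GetPixels32();
        Color[] dst = new Color[pa.Length];

        for (int i = 0; i < pa.Length; i++)
        {
            // Decode one float from each 8-bit texture and store them in separate channels.
            dst[i] = new Color(FloatPacking.Decode(pa[i]), FloatPacking.Decode(pb[i]), 0f, 1f);
        }

        result.SetPixels(dst);
        result.Apply(false);
        return result;
    }
}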

The VBR game object is manipulated through only one editing tool: the VBR Dataset Loader. The VBR Dataset Loader loads the videotextures into the internal data structures of the VBR game object. Using the scripts attached to the VBR game object, it is possible to control the video texture playback and set VBR related parameters as explained in the previous section.

Figure 7: VBR Dataset Loader


2. Risks and contingency plan

Despite the inclusion of end-users (game developers) in every phase of the project and the experience of the partners in their respective domains, there is a risk that project activities will face problems and unforeseeable issues. Beyond the initial list of risks already presented in the DoW and in previous deliverables, the first real implementation of the CR-PLAY mixed pipeline highlighted additional technical risks, which derive from the practical development work and its technical evaluation. A table summarizing the additional risks related to the development activities of CR-PLAY is presented here, with an estimate of their likelihood and impact, and the related contingency actions.

Risks and contingency actions table

Risk 4.1
Risk description: Deployment of games on different platforms (web, mobile, desktop) could be limited due to the use of a native C++ plugin for the implementation of IBR core features.
Probability: Med/High. A more accurate analysis will be performed in Year 2.
Impact: Med. Deployment options could be limited to certain selected platforms.
Contingency action: A decision about which platforms will be addressed in CR-PLAY will be taken in Year 2. For those platforms it will be necessary to replace the single native plugin (as it is now), porting its code inside Unity (taking advantage of Unity's multi-platform capabilities).

Risk 4.2
Risk description: Related to the Risk 4.1 contingency action: porting plugin code to Unity could cause performance issues, due to moving native code to a higher-level architecture.
Probability: Med/High. Unity scripts are not compiled as native code and are likely to be less efficient than the C++ native plugin.
Impact: Med. Deployment options could be limited to certain selected platforms.
Contingency action: If this is confirmed, no porting into Unity will happen; rather, separate plugins for different platforms will be created, in order to avoid performance issues.

Risk 4.3
Risk description: Since IBR preparation time depends on the number of input pictures, game loading time might be long depending on computing power.
Probability: Med. Depending on the size of the captured scene, IBR could need a large number of input images.
Impact: High. The loading phase could take considerable time, especially for large input datasets.
Contingency action: Loading algorithms and IBR data will be optimized in order to speed up the loading phase.

Risk 4.4
Risk description: Since IBR and VBR quality depends on input asset quality, the deployment packet (the set of files needed to run the game on player machines) might be big, due to high-resolution input assets.
Probability: Med. Both IBR and VBR final quality depend on the quality of the input assets.
Impact: High. Deployment packet size could be considerably big, especially for large input datasets.
Contingency action: Game developers will decide on a trade-off between quality and size of the deployed game. Analysis of asset optimization will be done in Year 2.


Risk 4.5
Risk description: The reconstruction can generate arbitrarily scaled results, because the correct dataset scale (size) cannot be derived from the input images alone.
Probability: High. This is a known risk that we are already working to mitigate.
Impact: Med. End-users would have to adapt the game scale factor to the reconstruction size.
Contingency action: Additional information can be fed into the reconstruction process in order to get properly scaled datasets. As an alternative, a scaling feature can be added to allow game developers to change the scale of reconstruction results.

Risk 4.6
Risk description: Game developers could provide disruptive feedback on usability and on technical choices taken during the development of the Low-fidelity prototype.
Probability: Low. We are confident that the development choices and usability of the Low-fidelity prototype are in line with the initial user requirements (collected from game developers) and with market standards.
Impact: Med. The mixed pipeline would not be in line with user and market requirements.
Contingency action: User feedback will be addressed in the development of the High-fidelity prototype.

3. Conclusion

This document presents the Low-fidelity prototype software, describing the main components developed to support the different phases of the CR-PLAY mixed pipeline. As the main result of the work of WP4 in the first year of the project, it integrates the latest technological advancements achieved in WP1, WP2 and WP3, providing end-users with the tools and methods needed to create initial prototypes of games within the evaluation framework set up in WP5 and the technical validation performed in Task 4.6.

3.1 Plan for next period

Following the three-tier schema that defines the development tasks in WP4 (Low-fidelity, High-fidelity and Final prototypes of the mixed pipeline), the next step is to collect the feedback coming from MS13 "Result of Low-fidelity prototype evaluation and recommendations for next design iteration" (WP5), in order to organize and prioritize the integration and development tasks that will lead to the creation of the High-fidelity prototype in Year 2. Partners involved in the development WPs will have specific roles, and the work will be performed according to the plan outlined first in the DoW and then in D4.2. More detailed activities will be defined within sub-groups, depending on specific needs within the main tasks of the WPs, towards the achievement of the main project objectives.


References

[Chaurasia2013] Chaurasia, G., Duchene, S., Sorkine-Hornung, O., & Drettakis, G. (2013). Depth synthesis and local warps for plausible image-based navigation. ACM Transactions on Graphics (TOG), 32(3), 30.

[Fuhrmann14] Fuhrmann, S., & Goesele, M. (2014). Floating scale surface reconstruction. ACM Transactions on Graphics (Proceedings of SIGGRAPH 2014).

[Goesele07] Goesele, M., Snavely, N., Curless, B., Hoppe, H., & Seitz, S. M. (2007). Multi-view stereo for community photo collections. Proceedings of ICCV 2007, Rio de Janeiro, Brazil, October 14-20, 2007.

[libMVE] http://www.gris.informatik.tu-darmstadt.de/projects/multiview-environment/

[Schoedl00] Schoedl, A., Szeliski, R., Salesin, D. H., & Essa, I. (2000). Video textures. Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH '00, 489-498.


Annex A

Low-fidelity Prototype: instruction manual

Guidelines for capturing datasets

Camera settings
- If your camera has a resolution setting, choose the highest resolution possible. If needed, the algorithms will downscale the images later.
- Set a fixed aperture, either using aperture priority or manual mode (consult the camera manual if necessary).
- Take images that are well-exposed (neither too dark nor too bright). This might require you to change the exposure time or ISO speed if you are in manual mode. In aperture priority mode, check that the camera selected sensible values for these settings (consult the camera manual if necessary).
- If the depth of field is too small, try setting a larger f-number (smaller aperture), but be aware that this might darken the image if the exposure time is not increased accordingly.
- Focus! If the scene or object is out of focus, reconstruction will fail. It is ok to refocus in each of the images.

Camera placement
- Take pictures with sufficient overlap. If in doubt, just capture images at small intervals. This takes more time but ensures that everything has been seen with sufficient overlap. Each point on the surface should be visible in at least four images.
- Do not take panoramas. The reconstruction will not work if you just rotate the camera while standing still. The algorithms need to observe the surface from different viewpoints. The further away from the object, the larger the translation between shots can be.
- It makes sense to take "connecting shots". For example, if you take images from farther away and then move closer to capture some details, the algorithm might not be able to connect the details to the large-scale surface even though they overlap. Try to take some additional shots while moving towards the details.
- It is allowed (and can actually be quite beneficial) to take images from different heights. So, if you have the chance to climb on a nearby wall, do not hesitate.

Scene content
- The reconstruction is based on the assumption of a static scene. Try to avoid moving parts such as leaves, shadows, or pedestrians as much as possible. A certain amount of movement is tolerable, but those parts will be reconstructed less accurately or not at all.
- Surfaces with texture variation work well. If your object does not display a lot of texture, it can help if at least some other textured surfaces are visible in the images.
- Transparent and mirroring objects do not work.

Best practices (if possible) and additional information
- Try to do two (semi-)circles with different radii around the object.
- Surface points that are observed under grazing angles in all images are hard to reconstruct.
- Reconstruction might work better on overcast days (even illumination, no moving shadows, ...), but relighting requires direct sun illumination on the object.
- Take overview images as well as close-ups for specific details (but remember the connecting shots).

How to use the Reconstruction Tool

Installation

- Download and install the following external tools to build the service:
  o CMake-GUI (http://www.cmake.org)
  o 7-zip compression/uncompression tool (http://www.7-zip.org)
  o An image processing tool such as ImageMagick (http://www.imagemagick.org) or GraphicsMagick (http://www.graphicsmagick.org)
- Move all the taken pictures into a folder.
- Download the CR-PLAY project sources.
- Create a build directory.
- Open CMake-GUI and set the following values:
  o Source directory: the root folder of the CR-PLAY project.
  o Binary directory: the build directory created above.
- Configure the solution and choose the platform on which you want to execute the process:
  o On Linux it is Makefiles.
  o On Windows it is Visual Studio (10, 11 or 12).


You are now ready to configure and use the dataset reconstruction tool.

Usage

- From CMake-GUI, check the DATASET_COMPUTE option.
- Fill the DATASET_COMPUTE_IMG_DIR field with the path of the pictures folder.
- Click on Configure to let CMake show the new custom options.

The table below describes the main options available in CMake to customize the reconstruction process.

DATASET_VERBOSE: ON by default; prints more information at runtime about which process is going to run.

DATASET_ALL_IN_ONE_TARGET: ON by default; each dataset-* custom target has its dependent dataset-* custom targets executed before it, and the final dataset-all target allows everything to be computed. Otherwise, you can run each dataset-* custom target independently.

DATASET_PREPARE_FORCE: ON by default; forces the preparation step to overwrite its output. Otherwise, if the images already exist, this step is skipped.

DATASET_RECONSTRUCT_PROCESS [Bundler/PMVS, Bundler/MVE, VSFM/PMVS, VSFM/MVE]: VSFM/PMVS by default; chooses which tools to use for reconstruction (they will be automatically downloaded by CMake at runtime into the build directory).

DATASET_IBR_BIN_DIR: where CMake can find local binaries for IBR (leave empty if you want CMake to download them automatically).

DATASET_IBR_CLEAN: cleans useless files when finished.

DATASET_IBR_SPIX_N: 1250 by default; defines the number of superpixels for the oversegmentation step.

DATASET_IBR_SPIX_WC: 20 by default [range 10-40]; defines the compactness factor for the oversegmentation step.

DATASET_IBR_SPIX_THREADS: 20 by default; number of threads used to compute superpixels.

DATASET_IBR_DEPTH_MVS_THRESHOLD: 0.8 by default; the minimum MVS point confidence threshold, used for back-projection. Try 0.0 if black depth maps are generated.


DATASET_IBR_DEPTH_SYNTH_PROCESS [Matlab, MatlabStandalone]

DATASET_IBR_DEPTH_SYNTH_SAMPLES: indicates how many additional depth samples will be found in the depth map images.

DATASET_IBR_DEPTH_SYNTH_SKY: OFF by default; check to take the sky into account for the depth synthesis step.

DATASET_ZIP_FINISH [zip&keepFiles, zip&deleteFiles, keepFiles]

DATASET_VSFM_BIN_DIR: VisualSFM binary folder (leave empty if you want CMake to download it automatically).

DATASET_VSFM_CLEAN: ON by default; cleans intermediate VSFM files when finished.

DATASET_MVE_BIN_DIR: MVE binary folder (leave empty if you want CMake to download it automatically).

DATASET_MVE_SCALE: 2 by default; scale used for the whole reconstruction pipeline.

DATASET_MVE_FSS_THRESHOLD: threshold on geometry confidence: N, in a range between 0 and the number of images. 3D points are taken into account for surface reconstruction only if they are visible from at least N cameras.

DATASET_MVE_FSS_COMP_SIZE: minimum number of vertices per component in the mesh reconstruction step.

DATASET_MVE_FSS_NO_CLEAN: prevents cleanup of degenerate faces in the mesh reconstruction step.

DATASET_MVE_REPLACE_UNDISTORD: ON by default; tries to undistort images and use them as input for MVE.

DATASET_MVE_CLEAN: ON by default; cleans useless generated files when finished.

Once the CMake generation has finished:
  o On Linux, go to your CMake binary directory (build folder) and build the makefiles.
  o On Windows, open the generated solution (build folder) with Visual Studio.

A few targets are available to build (in order):
  o Dataset-prepare
  o Dataset-cameras
  o Dataset-reconstruct
  o Dataset-ibr
  o Dataset-all (only available if you checked the DATASET_ALL_IN_ONE_TARGET option)

Build Dataset-all if present, or build each single target in the order given above.


How to use CR-PLAY mixed pipeline in Unity

Installation
To install the CR-PLAY Mixed Pipeline in Unity, import the Unity package cr-play.unitypackage, which automatically installs all the needed resources in the selected Unity project. This package contains both the IBR and VBR scripts and resources.

Import datasets
Before working with IBR and VBR, datasets must be imported into the Unity project using the dedicated tools provided. The import operation automatically takes the dataset zip produced by the Reconstruction Tool and imports the needed assets into the Unity project folders.
- Copy and paste the dataset file produced by the Reconstruction Tool into the Datasets folder (create this folder if needed). [The default datasets folder can be changed from the CR-PLAY/Options menu.]
- Select the CR-PLAY/Import menu to open the Import window.
- Check the dataset you want to import and click Import IBR, to import IBR datasets, or Import VBR, to import VBR datasets.
- IBR datasets are automatically imported into the Assets/StreamingAssets/<dataset_name> folder and VBR datasets into Assets/VBRAssets/<dataset_name>. [StreamingAssets is a special Unity folder that is automatically copied into the deployment packet; IBR assets need to be in the deployment packet because the IBR C++ Plugin uses them during rendering.]

Setup IBR scene
To set up the IBR scene, load the pre-configured IBRScene prefab (Assets/Prefabs/) into the current scene and initialize the associated behaviours to let the IBR scene load a particular dataset.
- Drag and drop the IBRScene prefab into the scene view (or into the hierarchy view).


- Attach the IBRRenderer behaviour to the Main Camera by dragging and dropping the IBRRenderer.cs script onto the main camera game object.
- Select the IBRScene game object in the Hierarchy View to open its Inspector View.
- Go to the IBRSceneLoader behaviour and set the "Dataset Name Folder" field to the <dataset_name> to load.
- To load the IBR scene in Unity, click the "Load IBR scene" button in the IBRSceneLoader behaviour.
- To generate the superpixel textures, click the "Create spixel textures" button in the IBRSceneLoader behaviour. [This operation needs to be performed only the first time a dataset is loaded, or whenever the dataset changes.]

A scripted equivalent of this setup is sketched below.
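The sketch below assumes the CR-PLAY package (which defines the IBRRenderer component) is already imported into the project; the bootstrap class itself and its field are hypothetical, and the dataset name is still configured on the IBRSceneLoader as described above.

using UnityEngine;

// Hypothetical convenience script reproducing the manual setup steps from code.
public class IbrSceneBootstrap : MonoBehaviour
{
    public GameObject ibrScenePrefab;   // assign the IBRScene prefab (Assets/Prefabs/) in the Inspector

    void Start()
    {
        // Place the pre-configured IBR scene in the current scene.
        Instantiate(ibrScenePrefab);

        // Attach the IBR renderer to the main camera, as done via drag-and-drop in the editor.
        if (Camera.main.GetComponent<IBRRenderer>() == null)
            Camera.main.gameObject.AddComponent<IBRRenderer>();
    }
}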


Setup VBR scene
To set up the VBR scene, load the pre-configured VBRObject prefab (Assets/Prefabs/) into the current scene and initialize the associated behaviours to let the VBR scene load a particular dataset.
- Drag and drop the VBRObject prefab into the scene view (or into the hierarchy view).
- Select the VBRObject/VideoTexture game object in the Hierarchy View to open its Inspector View.
- Go to the VBRBehaviour behaviour and set the "Dataset Name" field to the <dataset_name> to load, and the "Video Framerate" field to the frame rate of the videotextures.
- To load the videotextures, click the "Load Dataset" button in the VBRBehaviour behaviour.
