Transforming the UI—Designing Tomorrow's Interface Today (1 of 5): Game Developers Push the Edge with Intel® RealSense™ Technology

By Garret Romaine and John Tyrrell

The ways we interact with games—and how they interact with us—are in a state of constant flux as new technologies enable different experiences. The gaming industry has always been quick to explore the potential of new interactive technologies, and it is now on the cusp of an exciting future as interfaces evolve and a new golden age of game development continues to deliver remarkable ideas. Playing a part in this future is Intel® RealSense™ technology, which combines state-of-the-art 3D camera sensing with advanced voice recognition built into the hardware. Using the Intel® RealSense™ SDK (beta), developers are just beginning to explore its potential to deliver immersive interactive entertainment experiences that go beyond the confines of classic hardware controllers.

This article, the first of five about perceptual computing and Intel RealSense technology, provides insights from four innovative developers who are putting the technology to work.

Reshaping the Fantasy World

"We are interested in using gestural control to create a feeling of transformation and a real sense of tactile wonder in the space between you and your computer screen," said Robin Hunicke of Funomena, describing the studio's upcoming game built with Intel RealSense technology.


Based on a belief in the positive impact of games, Hunicke co-founded Funomena in 2013. One of the studio's aims is to explore the limits of emotional interaction between players and technology. "Our game follows a character on a journey of personal transformation and was heavily influenced by the art of origami. We love the idea of a camera that can see subtle changes in the way a player positions their hands." High-fidelity gesture tracking makes the input much more nuanced, and Intel RealSense technology will enable games to become truly responsive to how players move in front of the camera. "We can use that information to let them manipulate and shape objects in real time."

While still under wraps, the game will use the precision of the Intel® RealSense™ 3D camera to read gestures, letting players connect directly with the game world and explore its puzzles and environments in new ways. "We're always interested in pushing the boundaries of what games can express. But even as we're building something that challenges your assumptions, we're actively embracing your input as a player. You will literally use your hands to unlock the mysteries of this world piece by piece and customize each level to reflect your vision of what the final world should look like," said Hunicke.

Motion Foundations

Israel-based Side-Kick Games has been working with motion controllers for several years, including 2D webcams (using PointGrab*, XTR*, and EyeSight* middleware), Kinect*, and PrimeSense*. That accumulated knowledge made adopting Intel RealSense technology a straightforward process.
"We built a layer in our games for the motion controller so that we can accept many different middleware inputs and offer a standard motion-control interface," explained chief operating officer Tal Raviv. "Using that infrastructure—and our experience with motion control—the transition to Intel RealSense technology from full-body, long-range, and short-range motion control was relatively smooth. The Intel RealSense technology interface is very straightforward, and it has great tracking capabilities. There were very few issues to overcome compared to other technologies."

In Side-Kick's upcoming game Warrior Wave, players use their hands to bring soldiers to safety and protect them from enemies. The team used different Intel RealSense technology capabilities according to the in-game context. "We use two types of controls. The first is Silhouette, a branch of the SDK that lets the game 'see' the shape of a hand but has no 'knowledge' of its construction (palm, fingers, thumb). Skeletal is the tracked part of the SDK that gives information about the structure of the hand (the locations of each fingertip and the center of the palm)," Raviv continued. "The basic game mechanics work with Silhouette, but other things such as menus work with Skeletal."
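The two-mode split Raviv describes can be sketched as a small routing layer. This is only an illustration of the idea, not Intel RealSense SDK code: the `HandFrame` type, `silhouette_mask`, and `fingertips` names are hypothetical stand-ins for whatever a middleware layer actually exposes.

```python
# Hypothetical sketch of a two-mode hand-input layer: silhouette (shape
# only) for core gameplay, skeletal (fingertip positions) for menus.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class HandFrame:
    """One frame of hand data, as a middleware layer might expose it."""
    silhouette_mask: List[List[int]]  # binary hand-shape image
    fingertips: Optional[List[Tuple[float, float]]] = None  # skeletal data, if tracked

def pick_input(frame: HandFrame, in_menu: bool):
    """Route to skeletal data for menus, silhouette shape for gameplay."""
    if in_menu and frame.fingertips:
        # Menus need a precise point to hover and select: use a fingertip.
        return ("skeletal", frame.fingertips[0])
    # Core mechanics only need the hand's overall shape, not its structure.
    return ("silhouette", frame.silhouette_mask)

frame = HandFrame(silhouette_mask=[[0, 1], [1, 1]], fingertips=[(0.42, 0.17)])
print(pick_input(frame, in_menu=True))   # skeletal fingertip for the menu
print(pick_input(frame, in_menu=False))  # raw shape for gameplay
```

The design choice mirrors Raviv's point: falling back to the silhouette whenever skeletal tracking is unavailable keeps the game playable even when the richer data drops out.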


Side-Kick Games is also ensuring that the context is appropriate for applying Intel RealSense technology, implementing it alongside existing interface technologies in Warrior Wave to deliver an optimal experience. "Players use the touch screen to navigate the menus and activate things because this is the most intuitive method. They use motion control while in the game and get a more immersive experience because they're not tied to traditional mouse and touch controls," explained Raviv. "This combination works best for players now. In the future, the combination of voice and motion control could be more significant for the UI, but the main advantage of motion control today is in the gameplay itself."

Following Orders

The team at Iridium Studios is using both the motion and voice capabilities of Intel RealSense technology to deliver a more realistic interface for its forthcoming real-time tactical strategy game There Came an Echo. "In the past, when directing a small squad of units in a real-time strategy game, you would use the mouse," said Iridium founder Jason Wishnov. "But in reality you don't draw a box around people and move them. You have to communicate with them, and you do that with your voice and gestures."

In There Came an Echo, voice commands are vital to delivering a more realistic and immersive experience. Accurate voice commands create extremely engaging gameplay that more closely mimics a real-life scenario. Equally important for Wishnov are the benefits voice recognition brings to the storytelling.
"We spent a lot of time and energy building the narrative and establishing these characters as actual people with real motivations, fears, and character flaws. Voice recognition helps connect the player to the characters in a narrative sense," explained Wishnov.
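Communicating with a squad by voice, as Wishnov describes, ultimately means turning a recognized utterance into a structured order. The sketch below is a hypothetical illustration of that mapping; the grammar, verbs, and unit names are invented for this example and are not There Came an Echo's actual command set.

```python
# Hypothetical mapping from recognized speech to a squad order:
# "unit, verb [to] target" -> (unit, action, target).
VERBS = {"move": "MOVE", "fire": "FIRE", "hold": "HOLD"}

def parse_order(utterance: str):
    """Turn an utterance like 'Corrin, move to bravo' into an order tuple."""
    words = utterance.lower().replace(",", "").split()
    unit = words[0]  # the addressed character comes first
    verb = next((VERBS[w] for w in words if w in VERBS), None)
    # Only movement and fire orders take a target; the last word names it.
    target = words[-1] if verb in ("MOVE", "FIRE") else None
    return (unit, verb, target)

print(parse_order("Corrin, move to bravo"))  # ('corrin', 'MOVE', 'bravo')
print(parse_order("squad hold"))             # ('squad', 'HOLD', None)
```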


Meanwhile, the motion-sensing abilities of the Intel RealSense 3D camera lend themselves to non-vocal order-giving in the game's tactical combat. "Certain military hand gestures that map directly to in-game commands are also fun. For example, closing your fist to mean go or mark, and holding up your hand and closing it to control soldiers, is very cool," said Wishnov.

Beyond the accuracy of the technology itself, the key to making motion control work for the player is context. "If you shoehorn a set of gestures into a game just because you can, players won't know why they're doing something and it doesn't feel cohesive," said Wishnov. "It's very important to make sure gestures feel both relevant and natural."

Taking a Pulse

One project that clearly exemplifies the detail and precision of the Intel RealSense 3D camera is the biofeedback-enhanced adventure game Nevermind. In the upcoming title from Flying Mollusk, players take the role of a Neuroprober who is able to venture into the minds of psychological trauma victims, solve puzzles, and overcome the brain's defense mechanisms to restore their psyches. The game detects a player's stress using biofeedback technology. When stress is detected, the environment dynamically reacts to the player's fear and stress levels, becoming more challenging and punishing as anxiety rises. Conversely, as the player relaxes, Nevermind becomes gentler and more forgiving. In this way, Nevermind aims to help players become more aware of, and better able to manage, their stress levels both in and outside the game world.
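The feedback loop Nevermind describes—harsher when stressed, gentler when calm—can be sketched as a simple mapping from a stress estimate to a difficulty multiplier. The thresholds and scaling below are illustrative assumptions, not Flying Mollusk's actual tuning.

```python
# A sketch of a biofeedback difficulty loop: heart rate elevated above
# resting raises a clamped difficulty multiplier; relaxing lowers it.
def difficulty_scale(resting_hr: float, current_hr: float) -> float:
    """Map heart rate above resting into a 1.0-2.0 difficulty multiplier."""
    # Relative elevation over baseline; never negative when calm.
    stress = max(0.0, (current_hr - resting_hr) / resting_hr)
    # Clamp so a racing pulse can't push the game into unwinnable territory.
    return min(2.0, 1.0 + stress * 2.5)

print(difficulty_scale(65, 65))   # calm: baseline difficulty
print(difficulty_scale(65, 95))   # elevated pulse: a harsher world
```

Clamping both ends of the scale reflects the design goal in the article: the game punishes anxiety but stays playable, and rewards relaxation without becoming trivial.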


"I've always been interested in the capabilities of biofeedback and gaming," said Erin Reynolds, creative director at Flying Mollusk. She began researching biofeedback in 2009 as a way to create a deeper connection between the player and the game. Although several biofeedback options existed, none were readily accessible to the majority of consumers. Intel RealSense technology is changing that.

"The Intel RealSense 3D camera can detect heart rate, which is what we use to detect a player's fear and stress. Nevermind is based on this," said Reynolds. "Previously, when people played Nevermind, they had to wear a chest strap under their shirt. Now, with the camera, they can just sit down at their computer and play. It makes for a more seamless and intuitive experience."

The Intel RealSense 3D camera's ability to take a pulse simply by 'looking' at a person's head opens up a realm of possibilities that ignites Reynolds's imagination. "It means that developers can make biofeedback a viable part of their offerings, whether it's games, medical tools, or communication applications," said Reynolds. "We're beyond excited about the possibilities."

"Another biofeedback feature of the camera is emotion detection. Developers are very enthused about this feature, which has a lot of potential to change how games will be played in the future," added Chuck McFadden, Intel product manager for Intel RealSense technology.

The Next Level

The use of Intel RealSense technology depends on the specific needs of the game in delivering the appropriate experience to players, as demonstrated by the different approaches of these developers. Other possible uses for Intel RealSense technology are just now emerging.
"We host hackathons and game jams with our technology, traveling around the world to hotbeds of game development," said McFadden. "We give developers the code and the Intel RealSense 3D camera, and turn them loose."


Intel is also hosting the Intel® RealSense™ App Challenge 2014, described by McFadden as a "million-dollar pot" that pushes innovators and inventors to come up with creative ideas for the new technology. "Some ideas are things that I never would have dreamed of, such as a new way to play a musical instrument," said McFadden. "Watching Intel RealSense technology being taken to the next level is very exciting."

"As developers, we're no longer confined to the interface conventions of thirty years ago," said Reynolds, highlighting the real benefit of Intel RealSense technology. "It's giving us a chance to look at the needs of today's society and change the way we interact with our computers."

"I don't know of many people who are taking advantage of this new technology yet," said Hunicke of the UI possibilities now emerging as a result of Intel RealSense technology. "We're at the cusp. We have the ability to define what those interfaces will be and what that future will look like. We can set the context, create emotional connections with users, and get them out of their comfort zone," concluded Hunicke. "We're getting them used to the future."

Resources and More Info

Explore Intel® RealSense™ technology further, learn about the Intel® RealSense™ SDK for Windows Beta, and reserve a Developer Kit here.

Read a geek's perspective on why Intel RealSense technology is really important.

Soar through the air without a controller here.

Is your project ready to demonstrate? Join the Intel® Software Innovator Program. It supports developers who have forward-looking projects and provides speaking and demo opportunities. Get started at the Developer Resource Center.