
Talk given by Etic Lab’s Stephanie Moran at Goldsmiths Visual Cultures Department on 24th October 2019.

KRAKEN? What does it mean to communicate with an Alien Intelligence, and how might we try to do it?

 

 

 

Astrobiology and Michael Arbib's octoplus

Astrobiology, a form of future- and outer-space-oriented speculative evolutionary biology, already considers this question.

 

Like Olaf Stapledon’s bacterial / microbial alien intelligence in his sci-fi Last and First Men, we may not even recognise another intelligence’s arrival.

 

Michael Arbib’s [2011] octoplus was a thought experiment in how language could have evolved differently, given a different set of sensory and cognitive apparatus, based on the alien entity of the octopus. Arbib asks,

 

“what might be some of the properties of a language evolved from the basis of chromatophores and body texture rather than visual control of the hand?”

 

Arbib develops what he calls an Octoplus – an imagined octopus-being that has evolved differently in the different environment of another planet. Most octopuses live only long enough to reproduce, about two years, and die shortly after; they do not raise their young, so the young learn for themselves how to inhabit the world. They live alone for most of their lives, shaped by solitary hunting and danger. They camouflage, communicate, hunt and avoid predators with displays of colour, texture and pattern that can be breathtakingly varied. So far as biologists understand, these are produced by a system of light-sensing, light-reflecting and colour-producing cells in their skin, and despite apparent colour blindness, or lack of colour vision as found in humans – their eyes, which evolved separately from the human line of eye evolution but are remarkably similar in many ways, appear to see in black and white. Combined with their rapid shapeshifting capacity, their lack of fixed form, their distributed neural network, and the highly sensitive suckers on their arms that taste and feel, their plasticity and visual system afford them a high degree of intelligence from and in relation to their environment.

 

 

 

While there are certain patterns and signals marine biologists have presumed to interpret – the obvious cases of camouflage, and mating / warning / threatening signals – a whole range of octopus display remains uninterpretable or incomprehensible to biologists, although it may communicate on an aesthetic level. There are rapidly changing displays that have been called passing cloud, which could be dynamic marine camouflage or tactics for confusing predators or prey, imitating the effect of shadows passing overhead; more mysteriously, there are what scientists call chromachatter and other seemingly random displays, out of the context of obvious communication, hunting or hiding. Octopuses also appear to dream while emitting rapidly changing displays.

 

Arbib imagines that an evolution emphasizing sociality and longer lives could lead to the development of a complex language, moulded by their morphology and a different set of environmental affordances and evolutionary responses.

 

The mirror system hypothesis offers a gestural basis for language; it is one of several theories suggesting that language evolved from manual dexterity and humans’ opposable thumbs, and originates in pantomimed representations.

 

Arbib proposes that octopuses’ colour-, pattern- and texture-matching “with the statistics of what it sees around it” constitutes an equivalent “sensorimotor transformation” to that of the human hand gesture. He posits the octoplus language on an expanded ability to control this, their capacity for imitation forming the basis for the visual miming of objects from memory, in conjunction with movements of their arms. Grammar then develops from the separation out of action-mimes and object-mimes, based in the abstraction of existing split communications – as when a male octopus courts a female with visual signalling on one side of his body while simultaneously emitting an aggression colour signal towards other males on the side facing away from her. Arbib describes this as a “video screen model” where the display becomes “an assemblage of subdisplays”. He suggests that concepts of action and object are universal to human language, although verb and noun categorisation varies.

 

The experience-world of an organism is subjective, “a consequence of its specialized receptor and effector apparatus”, and of its particular embodiment and social structure. Think about the ways in which humans’ sensory-perceptual apparatus, with its frequent emphasis on sight and sound, frames how humans inhabit their worlds (rather than, for example, dogs’ perceptual emphasis on smell in reading their environments and social relationships); and the ways in which the structure of human societies also structures relationships to and perceptions of our environment(s). Similarly, octopus concepts and syntax, formed from their perceptions of their environments and relationships, are likely to be entirely different from human ones.

 

[As Arbib says,] “any intelligence with which we are likely to establish communication will have vision, language and a sophisticated knowledge of applied mathematics, … [but] can we posit convergent cultural evolution in expressiveness?”

 

What would octopus geometry consist of, for a being with no bodily angles, eight semi-autonomous arms and a body with no fixed form, that inhabits a world with no fixed horizon line and three dimensions of travel? Geometry is etymologically derived from and originated in land measurement, rather than measurement in a fluid, all-round 360 degrees. Would their version of geometry be grounded in non-linear, non-Euclidean structures? Is it possible to reverse-engineer a speculative nonhuman language based on this hypothesis? Is this all based on very anthropocentric ideas about what constitutes language or communication?

 

Octopuses’ short life spans, and for the most part lack of communal living, are due to environmental pressures, which in turn have formed their quick intelligence / cognition as predatorial and predated beings in a fast-moving, dangerous environment. There is, though, evidence of longer-lived octopuses in the deeper water we know little about; and there are recent discoveries of octopus colonies cohabiting covetable areas of ocean real estate – vast seafloor accumulations of discarded shells that provide enough material cover for concealing camouflaged octopuses to offer a degree of security in a particularly hazardous space shared with sharks and other large predators.

 

As in Adrian Tchaikovsky’s sci-fi novel Children of Time, where an anthropogenic virus is released to speed up evolutionary processes and enhance other species’ intelligence, there’s an assumption that other species do not already have sophisticated communication systems; that a human-style language is the desirable pinnacle of evolution; that most meaningful communication occurs via this kind of formal language; and that the capacity for language in this way is an indicator of intelligence. What do we mean by intelligence anyway? Kevin may address this in more detail later, but for now I’m going to propose cognitive psychologist Ulric Neisser’s definition of cognition as “all the processes by which sensory input is transformed, reduced, elaborated, stored, recovered and used” (1976). For Etic Lab a working definition might also include the degree of connectivity with others, and the capacity for an entity to change itself as a result of an encounter and to have an impact on the world.

 

Although Arbib recognizes that there is no guarantee any exolanguage will contain any surface properties associated with human language, he does not view any nonhuman terrestrial communication as constituting a language as such. Such views, like the IQ test, ground the prerequisites for judging intelligence in an inherent and unquestioned assumption of the superiority of those doing the testing; which is as ridiculous as the assumption that there can be an equivalent of Star Trek’s universal translator that converts an alien language into English. In experimental testing from an imperialist perspective, we’re in danger of just testing or reflecting on ourselves, not other cultures or species. Arbib assumes that rich language development and advanced technologies are prerequisites of communication with extraterrestrials, but who knows how a terrestrial microorganism might communicate with an extraterrestrial one, in what exchange of electrical or chemical signals.

 

 

Biosemiotics

 

Marine biologist Jennifer Mather’s definition of intelligence, or cognitive ability, is about obtaining and using information, in ways afforded by environments and evolved sensory apparatus. Biosemiotics is a biology-based discipline that borrows from literary theory and Peircean semiotic analysis, which models a triadic relation rather than the binary of the Saussurean signifier-signified: a more fluidly dynamic relationship between an object, representamen (means of representation) and interpretant (receiver of the message), which can, for example, begin or end with the object.

Biosemiotics refers to this act of perceiving and communicating biological information. This includes signs between organisms (literally, communication), such as human gesture, speech and sign-making, or octopus displays, gesture and sign-production; but also signs encoded in DNA and activated through interactions in environments – e.g. the fact we develop legs before we need them indicates that we, mostly, intend to / will walk. More abstractly, the rate at which different species perceive time is a form of biological information: evidence suggests that distinct species experience passing time on different scales. (A 2013 study in Animal Behavior found that body mass and metabolic rate determine how animals of different species perceive time.) “[For] fish, which live on fast-moving prey, processes of motion may appear more slowly in their environment, as in slow motion.”

 

From a biosemiotic perspective, human cultural production is also a form of biosemiotics; it encodes signs, communication and information, and is an expression of relationships. This works the other way too: cultural production as biosemiotics affects our perceptions of the world, and what we give attention to.

 

For Biosemioticians, “interpretation is the defining form of semiosis and life”;

 

“the ability to reach a conclusion from sensory inputs whose result can vary according to circumstances, memory, experience, and learning. In a way, it is the ability to “jump-to-conclusions”… from a limited number of data, with results that may not be perfect but are good enough for the purpose of survival… [it is] a form of semiosis because it involves signs and meanings.” (Barbieri)

 

“The transformation of the signals received by the sense organs into mental images, or high-level neural states, is based on sets of rules that are often referred to as neural codes, because neurobiology has made it abundantly clear that there are no necessary connections between sensory inputs and mental, or neural, images… What organisms interpret… is not the world but representations of the world, and neural representations are formed by neural networks made of many different types of cells”.

 

AI combines code and networks; machine learning is largely based on the computational aspect of how biological brains may work.

 

   

 

Machine Learning and Interpretation

 

As artists and designers we are interested in the phenomenology of things, the way their operations structure experience. Although some of you may be much more familiar with the principles of machine learning than I am, I am assuming many of you are not, so I’m going to give a very basic description of how it works.

 

There are a number of machine learning techniques. I’ll go through a few of them briefly here.

 


   

 

Predicting [regression]

 

Supervised learning algorithms make predictions based on a set of examples, a task known as regression. The programme is trained on the data to find the equation that produces the desired end state. It starts by filling in the missing value, the multiplier, with a random value.

 

Image from Tariq Rashid’s Make Your Own Neural Network


 

It calculates the result and the error margin, adjusts the multiplier correspondingly, and tries again, and again, iteratively, until it has the correct, or close enough to correct, output.

 

This is the basis of how a machine learns, using a model to adjust parameters based on known values. In this case, it’s a predictor model because it takes an input and predicts what the output should be.
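That iterative adjustment can be sketched in a few lines of Python. This is a minimal illustration in the spirit of Rashid’s predictor example, not code from the talk or the book; the kilometres-to-miles pair, the learning rate and the step count are my own assumptions.

```python
# Minimal sketch of the iterative predictor described above: guess a
# multiplier, measure the error against a known example, and nudge the
# guess until the output is close enough to correct.

def train_multiplier(x, target, learning_rate=0.1, steps=200):
    c = 0.5  # start with an arbitrary value for the missing multiplier
    for _ in range(steps):
        error = target - c * x          # how far off the prediction is
        c += learning_rate * error / x  # adjust the multiplier accordingly
    return c

# Example: recover the kilometres-to-miles factor (~0.6214) from the
# single known pair 100 km ≈ 62.14 miles.
factor = train_multiplier(100, 62.14)
```

Each pass shrinks the remaining error by a fixed fraction, so the multiplier converges on the right value rather than jumping straight to it – the same moderated, step-by-step adjustment that neural networks rely on.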

 

 

Neural Networks, Deep Learning and human brains

 

But there are usually a number of layers of code, called neural networks as they are modelled on neuronal computation in biological brains and their networks of neurons. The ML neural net passes data in parallel through a sequence of layers, based on an idea of how computation works in a biological brain. Layers of input-output nodes are processed through a threshold activation function that mimics neuronal output thresholds.

 

   


   

   

Images from Neurocomic, a graphic novel about how the brain works by neuroscientists Matteo Farinella and Hana Ros.

 


In biological brains, neurons process information from the environment (light, sound, touch, etc.); each neuron is an individual unit that receives inputs, processes the information, and, if the input is large enough, passes it on to more neurons. In humans these are most concentrated in the brain. The human brain has about 100 billion neurons. African elephants have 257 billion neurons in their brains, although most of those are in the cerebellum rather than the cerebral cortex, where the highest concentration is found in humans. Octopuses have around 500 million neurons, but more than half are in their arms. Yet even animals with much lower neuron counts can outperform computers on many complex tasks. It’s not all about the neurons or the processing power, although that is what neural networks are modelled on and what I’ll be talking about today; biological brains and nervous systems have a lot more going on.

 

In biological brains, networks of neurons pass information from one to another; whether each receiver processes and passes on the information depends on whether the information passes a certain threshold: that is, they suppress the input until it becomes large enough to trigger an output. And they process signals in parallel. Each neuron receives inputs from many others, and passes its signal on to many others when it outputs.

 

Sigmoid function, from Tariq Rashid’s Make Your Own Neural Network


 

When a neural net node receives a signal, it too suppresses that information until it reaches a threshold, which is calculated by an activation function. In order to imitate the biological threshold, the activation function chosen is generally one that produces a smooth gradient, such as the s-shaped sigmoid function [y = 1 / (1 + e^-x)].
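As a quick sketch of that function (my own illustration, with arbitrary test values), the sigmoid squashes any input into the range 0 to 1:

```python
import math

# The s-shaped sigmoid activation, y = 1 / (1 + e^(-x)): small or
# negative inputs are suppressed towards 0, large inputs saturate
# towards 1, mimicking a neuron's smooth firing threshold.
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# sigmoid(0) is exactly 0.5, the midpoint of the smooth threshold.
```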

 

Neural net, from Tariq Rashid’s Make Your Own Neural Network

 

 

In each layer, inputs are weighted in various combinations, summed, and passed on to the next layer (fed forwards) if they pass the threshold.

 


   

The weighting for each input is randomly assigned at the start. These weights are important to machine learning: they are what moderates the learning, and they are what is adjusted to get the desired output. As with the simpler example earlier, the computer uses the margin of error compared with the training data to refine its results. In this case the error needs to be distributed across the whole network in order for it to continue to function as a neural net. One way of doing this is what’s called backpropagation, where the error margins are fed back across the network, divided between the links in proportion to their weights.
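A sketch of that proportional split, for a single node with two incoming links (my own illustration; real implementations do this with matrix operations across whole layers at once):

```python
# Divide a node's output error between its two incoming links in
# proportion to their weights, as in the backpropagation step above.
def split_error(error, w1, w2):
    total = w1 + w2
    return error * w1 / total, error * w2 / total

# A link carrying three times the weight receives three times the blame:
e1, e2 = split_error(0.8, w1=3.0, w2=1.0)  # e1 = 0.6, e2 = 0.2
```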

   


   

Backpropagating errors

 

This is done iteratively, in very small steps:

 

   

The whole process mimics the feeding forward and backwards in biological neuronal networks. This combination of simple calculations results in the ability to learn sophisticated class boundaries and data trends; many-layered networks of this sort perform what is known as “deep learning”. Deep learning is useful for data resilience – e.g. moderating imperfect signals: errors or anomalies in the data are evened out across the network’s distributed learning; it offers more sophistication, and speed gained from parallel processing.

 

 

Biological Brains

 

There are some broad differences from biological wetware as I understand it. The computational model above is based on the morphology of biological brains, but generally in ordered, uniform layers, which are easier for computation, rather than an organic distribution; in a biological brain, neurons wouldn’t be structured in a grid – signals would pass through multiple neurons in a much less linear way. AI arguably has a form of electrophysiology (the use of electrical signals to communicate) and uses synchronicity in cognitively tying together particular experiences (understandings of cause and effect), but not in the same way the human brain sends electrical signals to communicate and is thought to synchronise experience (which we don’t entirely understand anyway); it also doesn’t use memory in the same way. And it doesn’t have a correlate for biological pharmacology (internal chemistry) or neuroplasticity.

 

Where, as Catherine Malabou suggests, the human brain could be said to represent the intersection between the symbolic and the biological, what represents thought in octopuses and AIs? What might the specific morphology of AI and its digital environment afford, and what might speculative mechanistic modelling based on an octopus brain reveal to inquisitive humans? What might it say about alien and artificial intelligences?

 

Octopuses, as has often been pointed out, come from a branch of the evolutionary tree that diverged from the human one millions of years ago. They belong to a branch we class as molluscs, a group of soft-bodied water dwellers ranging from very simple organisms to complex and sophisticated ones, including squid and cuttlefish.


 

 

Developing an AI Octopus Using Reinforcement Learning

 

In reinforcement learning, the algorithm learns from unknowns. It is hard-coded to enjoy inhabiting a particular state, and will try to return to that state as often as it can by manipulating its environment.

 

action → environment → observation, reward → agent

 

The algorithm modifies its strategy in order to achieve the highest reward, as in the basic reinforcement learning algorithm Etic Lab’s Alex Hogan has constructed (see the reinforcement learning link in the blog text).
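A toy version of that action → environment → observation / reward → agent loop might look like the following. This is my own minimal illustration, not Alex Hogan’s code: the agent is hard-coded to “enjoy” state 1 and gradually learns which action returns it there.

```python
# Toy reinforcement learning loop: the agent acts on the environment,
# observes the new state, receives a reward, and updates its estimate
# of how much each action is worth.

def environment(state, action):
    return action  # in this toy, the action directly sets the next state

def reward(state):
    return 1.0 if state == 1 else 0.0  # state 1 is the preferred state

values = {0: 0.0, 1: 0.0}  # the agent's running estimate per action
state = 0
for step in range(100):
    if step < 10:
        action = step % 2  # warm-up: try both actions
    else:
        action = max(values, key=values.get)  # then exploit the best one
    state = environment(state, action)
    r = reward(state)
    values[action] += 0.1 * (r - values[action])  # moderated update
```

After the warm-up the agent settles on the action that keeps returning it to its preferred state; the 0.1 factor is the same kind of moderated adjustment as the learning rate in the earlier predictor example.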

 

If we made a digital octopus in a digital environment using reinforcement learning, it could manipulate its environment to induce changes in itself. It would learn through reinforcement, in response to and constrained by the affordances of its digitally modelled environment. It chooses an action in response to each data point, and is rewarded by a signal indicating how ‘good’ the decision was. A digital octopus is a set of variables, representing for example skin colouration and arm movements (8 of them!). We could add more variables to make it more complex, representing e.g. skin pattern, skin texture and jet propulsion. Together these could respond to variations in its digital environment – based on another set of variables representing, for example, depth, light, flow, etc.

The octopus and its environment would constitute a black box, which returns mathematical output signals. We could convert, synaesthetically translate, or transduce the variation in output – the resulting signals – into a set of shifting aesthetic outputs, remixing for example image, colour and sound from a database of objects created by Maggie. As the algorithm learns, some signals will be repeated more frequently; then, as the digital environment changes, the octopus AI will have to adapt its behaviour to return to its preferred state.
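To picture the transduction step, here is a purely hypothetical mapping, not the project’s actual pipeline: three numbers from the black box become a colour and a pitch.

```python
# Hypothetical transduction: clamp each output signal to [0, 1], then
# map the values to RGB colour channels and an audible frequency.
def transduce(signal):
    s = [max(0.0, min(1.0, x)) for x in signal]
    rgb = tuple(int(x * 255) for x in s)  # colour channel values 0-255
    freq = 220.0 + s[0] * 660.0           # pitch between 220 and 880 Hz
    return rgb, freq

colour, pitch = transduce([0.5, 0.2, 0.9])
```

As the model’s output signals shift, the colour and pitch shift with them, giving the learning process an aesthetic surface.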

 

Expressive Machine by Laura Dekker, installation shot at the V&A, image courtesy the artist.

 

Artist Laura Dekker describes a similar but partially physically manifest process in her Expressive Machine project, using outputs from multiple, decentralized software and hardware interfaces that form installations with elements responsive to, for example, visitors’ sound, touch and webfeeds, which are transduced “across multiple modes” into stream-of-consciousness text displays, icons and sound. As she says: “from touch to sound, to word, to vision, to taste, to uniquely machinic states with no particular human analogue. These stimuli are processed in various interpretations, elaborations, in a relatively unstructured ‘data soup’. Asynchronous processes consume data from the soup. When trigger conditions for a particular expressive process are satisfied, the machine produces externalized outputs in various forms: sound, shift of attention, fragments of narrative, and so on.”

 

Similarly, what we ultimately want, rather than constructing an interpretation of a digital octopus in a digital environment, is an AI based on real-world interactions. We want a real octopus to programme the AI. Kevin will talk more about that next.

 

The way Laura Dekker sees it, “what can be considered as creativity arises as an emergent property – a serendipitous by-product of the machine working through its experiences, rather than an explicit creative process.” (EVA London 2019 conference proceedings)

 

Etic Lab has another perspective on computational creativity, around the effects of algorithmic intelligence on human culture. We suggest, from our experience working on recent commercial projects, that algorithmic AI is a new, creative actor in the field of online identity-construction and cultural production; that is, it is actively affecting human culture.

 

We’ve written a paper about this, Guru Code, which you can read on the Etic Lab website, so I won’t go into detail here, but just to summarise: iteratively responsive metrics produced by an algorithm can turn the perspective of its producer’s assumptions, and the datasets it is based on, into the ground truth, altering or reifying the subjective realities of participating users – acting as a kind of guru. There have been studies of online identity construction since the 90s (Friedman and Schultermandl 2016; Papacharissi 2011; Turkle 1995), but we have entered a new phase of online identity construction mediated by machine learning algorithms. Online identities are now constructed on an ever-shifting ground, the longer-term effects of which are yet to be seen. ML algorithms have not been analysed so much from this perspective; research has tended to focus either on technologies of surveillance or on bias in relation to algorithmically-impacted identities (Cheney-Lippold 2017; Eubanks 2018; Noble 2018; O’Neil 2016; Zuboff 2019). [something Ramon has written about] The algorithm creates new subjective realities, reshaping and framing markets. Like any futures market or guru, this operates largely on belief in a brand or cult leader’s mastery of knowledge, expertise, charisma and capacity to influence. What we’re calling Guru Codes have the capacity to responsively alter or reify subjective realities through their entanglement with human users, in ways that were not possible before the introduction of sophisticated machine learning algorithms. Moreover, we suggest that these Guru Codes operate between human and machine perception-worlds, environments that function on very different principles; these form a constitutive miscommunication and misapprehension about what is happening, which counter-intuitively contributes to their effectiveness.

 

 

Embodied Interpretation and cognition

 

To go back to my starting point of the octoplus and embodied cognition in relation to environments: humans are physically, bodily surface-bound, with free movement on the horizontal plane but limited movement vertically, and this is reflected in spatial semantics.

As linguist Arthur Holmer points out:

 

“all human languages possess concepts… [that] distinguish upwards from downwards and possess corresponding verbs such as rise and fall. Meanwhile, they do not universally distinguish directions of horizontal motion: some languages do, but the distinctions are entirely language-specific. Horizontal motion verbs focus on the manner of movement: e.g. walking, running, crawling, or floating… our categorization of spatial semantics and motion verbs is determined by our lifeworld.” (Arthur Holmer, Greetings Earthling!, in History and Philosophy of Astrobiology, p. 178)

 

By this logic, if we imagine a species whose lifeworld is characterized by weightlessness, concepts of up and down are much less relevant for them. AI does not possess a human-like set of sensory apparatus; it lacks defined morphology, pharmacology and embodiment, but it can be hard-coded with teleological purpose; as in humans, this can give it the emergent property of appearing to have a conscious purpose.

 

 

Decentring narratives

 

We’re interested in decentring anthropocentric narratives; the nonhuman intelligence of AI may both help and hinder this. Kevin is going to speak next about how we intend to use AI to communicate with an octopus, and how an octopus might program a very different kind of AI.

 

 

 

 

 

 

© Stephanie Moran and Etic Lab 2019