

    Adaptive compressed sensing - a new class of self-organizing coding models for neuroscience, paper

    Jie Fu

    https://sites.google.com/site/bigaidream/

    Keywords: compressed sensing, adaptive

    Adaptive compressed sensing


    Simulation experiments with adaptive compressed sensing

    All coding circuits contained n=432 neurons, thereby producing representations of the original

    data that were three times over-complete.
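    As a quick check on the dimensions (only n=432 and the factor of three come from the digest; the patch interpretation is an assumption), three-times overcompleteness implies 144-dimensional input data:

        n_neurons = 432
        overcompleteness = 3
        data_dim = n_neurons // overcompleteness   # 144-dimensional inputs, e.g. 12 x 12 pixel patches (assumed)
        print(data_dim)                            # 144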

    [KEY] Since the ACS model learns a dictionary of the compressed data rather than the original

    data, the original image cannot be reconstructed from the adapted matrix.

    [KEY] Computing the data dictionary D from the learned dictionary A requires an ill-posed step of matrix factorization: A = ΦD, where Φ is the fixed random projection that compresses the data and A is the dictionary learned on the compressed data.
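    To make the ill-posedness concrete, here is a minimal numpy sketch (dimensions and notation are assumptions for illustration, not values from the paper): any matrix whose columns lie in the null space of the projection can be added to a candidate data dictionary without changing the dictionary seen in compressed space.

        import numpy as np

        rng = np.random.default_rng(0)
        d, m, n = 144, 72, 432                           # data dim, measurement dim (m < d), coding neurons (assumed split)
        Phi = rng.standard_normal((m, d)) / np.sqrt(m)   # fixed random projection
        D_true = rng.standard_normal((d, n))             # one candidate data dictionary
        A = Phi @ D_true                                 # the dictionary as it appears in compressed space

        # Any D' = D_true + N with the columns of N in the null space of Phi gives the same A,
        # so D cannot be recovered from A and Phi alone: the factorization A = Phi @ D is ill-posed.
        null_basis = np.linalg.svd(Phi)[2][m:].T         # (d, d-m) basis of the null space of Phi
        D_alt = D_true + null_basis @ rng.standard_normal((d - m, n))
        print(np.allclose(Phi @ D_alt, A))               # True: a different dictionary explains A equally well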

    ACS is able to form representations of sensory data that convey their essential structure, even though the coding network receives only a subsampled (compressed) version of the data.
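    A minimal sketch of this pipeline, assuming a standard sparse-coding circuit (ISTA inference plus a gradient dictionary update); the digest only specifies "fixed random projection, learning on the compressed data", so the update rule, dimensions, and the noise stand-in for natural image patches are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        d, m, n = 144, 72, 432                              # data dim, measurement dim, coding neurons
        Phi = rng.standard_normal((m, d)) / np.sqrt(m)      # fixed random projection ("subsampling")
        A = rng.standard_normal((m, n))                     # dictionary of the compressed data
        A /= np.linalg.norm(A, axis=0)

        def sparse_code(y, A, lam=0.1, steps=80):
            """ISTA: find a sparse code s with y ~ A @ s."""
            L = np.linalg.norm(A, 2) ** 2                   # Lipschitz constant of the gradient
            s = np.zeros(A.shape[1])
            for _ in range(steps):
                s = s - (A.T @ (A @ s - y)) / L             # gradient step on the reconstruction error
                s = np.sign(s) * np.maximum(np.abs(s) - lam / L, 0.0)  # soft threshold (sparsity)
            return s

        eta = 0.05
        for _ in range(1000):                               # unsupervised learning on compressed inputs only
            x = rng.standard_normal(d)                      # stand-in for a natural image patch
            y = Phi @ x                                     # the circuit never sees x, only y
            s = sparse_code(y, A)                           # sparse representation of the subsampled data
            A += eta * np.outer(y - A @ s, s)               # Hebbian-style dictionary update in compressed space
            A /= np.maximum(np.linalg.norm(A, axis=0), 1e-8)

    Because all learning acts on y, the circuit adapts to the statistics of the compressed stream without ever needing access to the original data.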

    Discussion and Conclusions

    The coding and learning scheme of ACS can be formulated as a neural network, building on an earlier sparse coding model [A network that uses few active neurons to code visual input predicts the diverse shapes of cortical receptive fields].

    In this paper, learning takes place in the weights of the coding circuit while the random projection is kept fixed, as opposed to a previous suggestion that optimizes compression performance by learning in the random projection itself [Forming sparse representations by local anti-Hebbian learning].

    The scheme of ACS suggests that representations in the brain can be both sparse [Non-Gaussian membrane potential dynamics imply sparse, synchronous activity in auditory cortex] [Sparse representation of sounds in the unanesthetized auditory cortex] and dense [Noise, neural codes and cortical organization] [Is there a signal in the noise?], with the type of code being lamina-specific.

    ACS suggests that if the input subsamples the data, then feedback becomes essential in shaping the receptive fields for coding efficiency.