What I learned today

By NeuroChi · Nov 12, 2009
  1. NeuroChi
    A growing list...

    'Mach bands' are the result of lateral inhibition: horizontal and amacrine cells suppress the signals of neighbouring photoreceptors, which sharpens edge perception.
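    The edge-sharpening effect of lateral inhibition can be sketched in a few lines. This is an illustrative toy, not a retinal model; the inhibition weight of 0.2 and the step stimulus are arbitrary choices:

```python
# Toy 1-D lateral inhibition: each receptor's output is its own input
# minus a fraction of its two neighbours' inputs. Near a light/dark edge
# this produces the undershoot/overshoot perceived as Mach bands.

def lateral_inhibition(inputs, w=0.2):
    out = []
    for i, x in enumerate(inputs):
        left = inputs[i - 1] if i > 0 else x
        right = inputs[i + 1] if i < len(inputs) - 1 else x
        out.append(x - w * (left + right))
    return out

# A step edge: dark (1.0) to bright (2.0)
stimulus = [1.0] * 5 + [2.0] * 5
response = lateral_inhibition(stimulus)
# The receptor just before the edge dips below its neighbours, and the
# one just after overshoots, exaggerating the perceived edge.
```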

    'Parallel processing' makes it possible for higher-order cognitive processes in the association cortex, such as past experiences, to exert a top-down influence on sensory perception.

    'NMDA receptors' play a key role in a phenomenon called long-term potentiation: by detecting the coincidental firing of other neurons, they trigger the formation of long-term connections between neurons, which in turn facilitates the storage of memory.

    'Serotonin', when present in certain areas of the brain, plays a role in the inhibition of feeding, reducing overall food intake and shifting preference away from highly palatable food. This fits with anecdotal reports of the bizarre perceptual changes psilocin evokes in the way a subject perceives their own interaction with food.

    Practically speaking, 'habituation' is the opposite of 'sensitization', but the mechanisms differ: the former is a short-term result of neurotransmitter depletion, producing a reduced response after repeated stimulation, while the latter results from long-term anatomical changes in the synapse, mediated by serotonin, producing a faster and stronger response to each stimulation.

    Dopaminergic neurons in the ventral tegmental area fire in response to a surprising reward when no conditioned stimulus predicts it. Once the subject has learned to associate a predictor with the reward, the neurons respond to the predictor rather than the reward itself.
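    The "surprise" part of this can be sketched as a prediction error in the Rescorla-Wagner style: model the dopamine burst as (reward - prediction), and let the prediction learn. This is only a toy illustration of the idea, with an arbitrary learning rate of 0.3:

```python
# Minimal prediction-error sketch: early on, the reward is fully
# surprising (large error); as the cue comes to predict it, the error
# at reward delivery shrinks toward zero.

def train(trials=20, lr=0.3, reward=1.0):
    prediction = 0.0  # value the cue currently predicts
    errors = []
    for _ in range(trials):
        error = reward - prediction  # the "dopamine burst" at reward time
        prediction += lr * error     # the cue absorbs the prediction
        errors.append(error)
    return errors

errors = train()
# errors[0] == 1.0 (fully surprising); errors[-1] is close to 0
```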

    There are high-affinity choline transporters and low-affinity choline transporters for bringing choline back into the pre-synaptic terminal, though I'm not sure yet why. It seems the Na+ ion dependent HACT kicks in when acetylcholine levels are depleted or need quick replenishing, while the LACT just keeps pumping along to maintain baseline levels of ACh.



  1. Synchronium
    NMDA-induced LTP fascinated me for a while.

    The magnesium block (removed by depolarisation) acts like an AND logic gate, requiring at least two inputs to propagate the signal. They're also permeable to calcium, which can encourage transcription of new NMDA receptors (I think...), hence the potentiation.
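    The AND-gate analogy can be written out as a toy truth table (a sketch, not biophysics; the -50 mV unblock threshold is an arbitrary illustrative value): the channel conducts only when glutamate is bound AND the membrane is depolarised enough to expel the Mg2+ block.

```python
# NMDA receptor as an AND gate: both conditions must hold for current
# (including Ca2+, which drives the potentiation) to flow.

def nmda_conducts(glutamate_bound, membrane_mv, mg_unblock_mv=-50.0):
    depolarised = membrane_mv > mg_unblock_mv  # Mg2+ block relieved
    return glutamate_bound and depolarised

print(nmda_conducts(True, -70.0))   # False: glutamate alone, still blocked
print(nmda_conducts(False, -40.0))  # False: depolarised but no glutamate
print(nmda_conducts(True, -40.0))   # True: both inputs -> channel conducts
```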
  2. NeuroChi
    Hm... interesting comparison between a specific receptor and a specific mechanism in digital circuits.

    I usually bash the 'brain is like a computer' analogy because it's so general, and there are far more differences than similarities, but this is a specific example that sounds about right to me (given what I know about digital circuits, and neurotransmitter receptors).
  3. Synchronium
    I suppose the only difference between the NMDA receptor and the AND gate would be that the NMDA receptor would open its channel if it received two inputs a short time apart from each other (from one or more neurones), rather than from two simultaneous inputs.

    Actually, AMPA receptors are also involved. They also bind glutamate and open first, depolarising the post-synaptic neuron, which is what removes the Mg2+ block from the NMDA receptor.
  4. Crazy Insane Sanity
    Interesting stuff...keep it comin, it'll help you remember, and help me learn ahead!

    And for what it's worth, I think the brain is just like a computer...just unlike any computer that's currently in existence. :)
  5. Crazy Insane Sanity
    I'm highly disappointed that this list has not grown, as you said it would in the blog. :p
  6. NeuroChi
    Thanks for the reminder, I had a bunch more to add on the topic of mechanisms of learning. Digging it up now... ;)
  7. Wanderer
    Try looking into "Perceptrons", a concept from the 1950s, I believe, that is still relevant today, except that today we build Artificial Neural Networks. They work much like axons and dendrites: a neuron fires once its input reaches some threshold value, which then cascades to the next neuron, and so on. The weights on the connections and the threshold at which a neuron "fires" can be adjusted through training. This is typically done by presenting patterns, comparing the output with the expected output, and then modifying the weights with some sort of training algorithm.
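    The training loop described above fits in a few lines. A minimal sketch of a single perceptron learning the AND function; the learning rate, epoch count, and starting weights are arbitrary choices:

```python
# Single perceptron: weighted inputs, a firing threshold (the bias),
# and weight updates driven by (target - output) on each pattern.

def step(x):
    return 1 if x >= 0 else 0

def train_perceptron(data, lr=0.1, epochs=20):
    w = [0.0, 0.0]
    bias = 0.0  # plays the role of the (negated) firing threshold
    for _ in range(epochs):
        for (x1, x2), target in data:
            out = step(w[0] * x1 + w[1] * x2 + bias)
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            bias += lr * err
    return w, bias

and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
# After training, the perceptron reproduces AND on all four patterns.
```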

    Here's one interesting part: training seems to go quicker and achieve a better result when "noise" is introduced into the training data. No one is quite sure why this is so.

    The brain, with all its chemical transmitters, goes further, though: each neuron is probably like a tiny parallel processor, adapting itself on the basis of numerous inputs and producing many outputs as well.

    This is all interesting stuff, and I've been fascinated since I first read about it more than 30 years ago. Very interesting topic, and one which I could ponder for quite a while.