Artificial intelligence decodes fruit fly vision, paving the way for human vision

summary: Researchers have developed an artificial intelligence model of the fruit fly brain to understand how vision guides behavior. By genetically silencing specific visual neurons and monitoring the resulting changes in behavior, they trained the AI to accurately predict neural activity and behavior.

Their findings reveal that multiple groups of neurons, rather than single types, process visual data in a complex “population code.” This achievement paves the way for future research into the human visual system and related disorders.

Key facts:

  • CSHL scientists have created an artificial intelligence model of the fruit fly brain to study vision-guided behavior.
  • Artificial intelligence predicts neural activity by analyzing changes in behavior after silencing specific visual neurons.
  • The research revealed a complex “population code” in which multiple groups of neurons process visual data.

source: CSHL

We have been told: “The eyes are the window to the soul.” Well, windows work both ways. Our eyes are also our windows to the world. What we see and how we see it helps determine how we move through the world. In other words, our vision helps guide our actions, including social behaviors.

Now, a young scientist from Cold Spring Harbor Laboratory (CSHL) has uncovered key evidence about how this works. He did this by building a special artificial intelligence model of the brain of the common fruit fly.

CSHL Assistant Professor Benjamin Cowley and his team fine-tuned their AI model through a technique they developed called “knockout training.” First, they recorded a male fruit fly’s courtship behavior, in which he chases and sings to the female.

Next, they genetically silenced specific types of visual neurons in male flies and trained their AI to detect any resulting changes in behavior. By repeating this process with many different types of visual neurons, they got the AI to accurately predict how a real fruit fly would behave in response to the sight of a female.
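
The paper's actual model is a deep network fit to real behavioral recordings; as a rough illustration only, here is a minimal "knockout training" loop on toy data (all shapes, module groupings, and targets are invented for this sketch, not CSHL's model). Hidden units are grouped into modules standing in for neuron types, and on each training step a randomly chosen module is zeroed out while the network is fit to the behavior observed under that same silencing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "visual stimuli": 200 trials, 4 features each (invented for this sketch).
X = rng.normal(size=(200, 4))

# Six hidden units grouped into 3 "modules", each standing in for a neuron type.
modules = {0: [0, 1], 1: [2, 3], 2: [4, 5]}
INTACT = 3  # code meaning "no silencing"

def forward(X, W1, w2, knockout=INTACT):
    """One-hidden-layer network; a knockout zeroes one module's units."""
    h = np.maximum(X @ W1, 0.0)          # ReLU hidden activity
    if knockout != INTACT:
        h = h.copy()
        h[:, modules[knockout]] = 0.0    # mimic genetic silencing
    return h, h @ w2                     # hidden activity, behavioral output

# Simulate "recorded behavior" for the intact fly and each silenced neuron type.
W1_true, w2_true = rng.normal(size=(4, 6)), rng.normal(size=6)
targets = {k: forward(X, W1_true, w2_true, k)[1] for k in range(4)}

# Knockout training: apply a random (possibly null) knockout each step and
# fit the model to the behavior observed under that same perturbation.
W1, w2 = rng.normal(size=(4, 6)) * 0.1, rng.normal(size=6) * 0.1
mse_before = np.mean((forward(X, W1, w2)[1] - targets[INTACT]) ** 2)
lr = 0.01
for step in range(5000):
    k = int(rng.integers(4))
    h, pred = forward(X, W1, w2, knockout=k)
    err = pred - targets[k]
    grad_w2 = h.T @ err / len(X)                         # MSE gradient
    grad_W1 = X.T @ (np.outer(err, w2) * (h > 0)) / len(X)
    w2 -= lr * grad_w2
    W1 -= lr * grad_W1
mse_after = np.mean((forward(X, W1, w2)[1] - targets[INTACT]) ** 2)
```

Because silenced units contribute zero activity and zero gradient, each perturbed experiment constrains which modules can carry which part of the behavior, pulling the model toward a one-to-one alignment between its modules and real neuron types.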

“We can actually predict neural activity computationally and ask how particular neurons contribute to behavior,” Cowley says. “This is something we haven’t been able to do before.”

Through new artificial intelligence, Cowley’s team discovered that the fruit fly brain uses a “population code” to process visual data. Instead of a single type of neuron associating each visual feature with a single action, as previously assumed, many groups of neurons were needed to sculpt the behavior.
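
As a toy numerical illustration of that distinction (all numbers invented; this is not the paper's data): under a one-neuron-per-feature "labeled line" code, silencing the single channel that drives a behavior abolishes it while silencing any other channel does nothing, whereas under a population code, silencing any one channel degrades the behavior only partially.

```python
import numpy as np

rng = np.random.default_rng(1)

# Activity of 5 visual "channels" (neuron types) across 100 stimuli.
channels = rng.normal(size=(100, 5))

def silencing_effect(weights, silenced):
    """Fraction of behavioral-output variance lost when one channel is zeroed."""
    full = channels @ weights
    kept = channels.copy()
    kept[:, silenced] = 0.0              # mimic genetic silencing
    return np.var(full - kept @ weights) / np.var(full)

# Labeled-line readout: behavior driven entirely by channel 0.
labeled = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
# Population readout: the same behavior shaped by every channel.
population = np.array([0.4, 0.3, 0.2, 0.3, 0.25])

labeled_effects = [silencing_effect(labeled, i) for i in range(5)]
population_effects = [silencing_effect(population, i) for i in range(5)]
# Labeled line: silencing channel 0 removes all of the behavior; the rest, none.
# Population code: every silencing has a partial effect, none is catastrophic.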

The layout of these neural pathways looks like an incredibly complex subway map that would take years to decipher. However, it gets us where we need to go. It enables Cowley’s AI to predict how real-life fruit flies will behave when presented with visual stimuli.

Does this mean that AI can one day predict human behavior? Not so fast. Drosophila brains contain about 100,000 neurons. The human brain has approximately 100 billion.

“And this is what it looks like in the fruit fly. You can imagine what our own visual system is like,” Cowley says, pointing to the subway map.

However, Cowley hopes that his AI model will one day help us decipher the calculations behind the human visual system.

“This will be decades of work. But if we can figure this out, we will be ahead of the game,” Cowley says. “By learning these [fly] computations, we can build a better artificial visual system. Most importantly, we will understand disorders of the visual system in much better detail.”

How much better? You’ll have to see it to believe it.

About this artificial intelligence and neuroscience research news

author: Sarah Giarnieri
source: CSHL
contact: Sarah Giarnieri – CSHL
image: The image is credited to Neuroscience News

Original research: Open access.
“Mapping modularity of visual neurons reveals population code for social behavior” by Benjamin Cowley et al. Nature

Abstract

Mapping modularity of visual neurons reveals population code for social behavior

The rich diversity of behaviors observed in animals arises through the interplay between sensory processing and motor control. To understand these sensorimotor transformations, it is useful to build models that predict not only neural responses to sensory input, but also how each neuron contributes to behavior.

Here we demonstrate a new modeling approach to identify a one-to-one mapping between internal modules in a deep neural network and real neurons by predicting behavioral changes that arise from systematic perturbations of more than a dozen neuron types.

The main component we introduce is “knockout training,” which involves perturbing the network during training to match real neuronal perturbations during behavioral experiments. We apply this approach to model the sensorimotor transformations of Drosophila melanogaster males during complex, visually guided social behavior.

Visual projection neurons located at the interface between the optic lobe and the central brain form a set of discrete channels, and previous work suggests that each channel encodes a specific visual feature to drive a particular behavior.

Our model arrives at a different conclusion: groups of visual projection neurons, including neurons involved in antisocial behaviors, drive male-female interactions, forming a rich population code of behavior.

Overall, our framework integrates the behavioral effects of different neural perturbations into one unified model, providing a map from stimulus to neuron type to behavior, and enabling future incorporation of brain wiring diagrams into the model.
