Posture controls mechanical tuning in the black widow spider mechanosensory system, bioRxiv, 2018-12-05

Spiders rely on mechanical vibration sensing for sexual signalling, prey capture and predator evasion. The sensory organs underlying vibration detection, called slit sensilla, resemble cracks in the spider's exoskeleton and are distributed all over the spider's body. Those crucial to sensing web-borne and other substrate-borne vibrations are called lyriform organs and are densely distributed around leg joints. Forces that cause bending at leg joints have been shown to activate these lyriform organs. Little is known, however, about how the biomechanics of a freely-suspended spider's body in its natural posture interact with vibrations introduced into the body, and how this affects vibration perception. Female black widow spiders, in particular, have a striking body form: long thin legs support a large pendulous abdomen. Here, we show that in their natural posture, the large abdominal mass of black widow females interacts with the spring-like behaviour of their leg joints and determines the mechanical behaviour of different leg joints. Furthermore, we find that adopting different body postures enables females to alter both the level and tuning of the mechanical input to lyriform organs. We therefore suggest that posture may be used to flexibly and reversibly focus attention on different classes or components of web vibration. Postural effects thus emphasize the dynamic loop of interactions between behaviour and perception, i.e. between 'brain' and body.

biorxiv animal-behavior-and-cognition 100-200-users 2018
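The mass-plus-spring picture in this abstract can be made concrete with the textbook harmonic-oscillator relation (an illustration only; the symbols below are generic, not measurements from the paper):

```latex
% Natural frequency of a mass suspended on a spring-like support:
%   k = effective stiffness of the leg joints
%   m = abdominal mass
f_0 = \frac{1}{2\pi}\sqrt{\frac{k}{m}}
```

Under this simple model, a posture that stiffens the joints (larger $k$) shifts mechanical tuning toward higher frequencies, while the large abdominal mass $m$ lowers it, which is consistent with the abstract's claim that posture can alter both the level and the tuning of vibratory input.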

On the adaptive behavior of head-fixed flies navigating in two-dimensional, visual virtual reality, bioRxiv, 2018-11-05

A navigating animal’s sensory experience is shaped not just by its surroundings, but by its movements within them, which in turn are influenced by its past experiences. Studying the intertwined roles of sensation, experience and directed action in navigation has been made easier by the development of virtual reality (VR) environments for head-fixed animals, which allow for quantitative measurements of behavior in well-controlled sensory conditions. VR has long featured in studies of Drosophila melanogaster, but these experiments have typically relied on one-dimensional (1D) VR, effectively allowing the fly to change only its heading in a visual scene, and not its position. Here we explore how flies navigate in a two-dimensional (2D) visual VR environment that more closely resembles their experience during free behavior. We show that flies’ interaction with landmarks in 2D environments cannot be automatically derived from their behavior in simpler 1D environments. Using a novel paradigm, we then demonstrate that flies in 2D VR adapt their behavior in a visual environment in response to optogenetically delivered appetitive and aversive stimuli. Much like free-walking flies after encounters with food, head-fixed flies respond to optogenetic activation of sugar-sensing neurons by initiating a local search behavior. Finally, by pairing optogenetic activation of heat-sensing cells to the flies’ presence near visual landmarks of specific shapes, we elicit selective learned avoidance of landmarks associated with aversive “virtual heat”. These head-fixed paradigms set the stage for an interrogation of fly brain circuitry underlying flexible navigation in complex visual environments.

biorxiv animal-behavior-and-cognition 0-100-users 2018
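The difference between 1D and 2D VR described above comes down to what gets integrated each frame: 1D closed loops update only heading, while 2D VR also integrates translation, so the fly's distance to landmarks changes. A minimal sketch of that per-frame update (illustrative only; names and signature are assumptions, not the authors' software):

```python
import math

def step_2d_vr(x, y, heading, v_forward, v_angular, dt):
    """Advance the fly's virtual pose by one frame.

    In 1D VR only `heading` would change; a 2D environment also
    integrates forward translation along the current heading, so the
    fly's position relative to visual landmarks can vary.
    """
    heading = (heading + v_angular * dt) % (2 * math.pi)
    x += v_forward * math.cos(heading) * dt
    y += v_forward * math.sin(heading) * dt
    return x, y, heading
```

Feeding the update with the fly's measured rotational and translational velocities on a spherical treadmill would close the loop between the animal's movement and the rendered scene.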

Proximity sensors reveal social information transfer in maternity colonies of Common noctule bats, bioRxiv, 2018-09-20

Summary

1. Bats are a highly gregarious taxon, suggesting that social information should be readily available for decision-making. Social information transfer in maternity colonies might be a particularly efficient mechanism for naïve pups to acquire information on resources from informed adults. However, such behaviour is difficult to study in the wild, particularly in elusive and small-bodied animals such as bats.
2. The goal of this study was to investigate the role of social information in acquiring access to two types of resources that are crucial in the life of a juvenile bat: suitable roosting sites and fruitful feeding grounds. We hypothesized that fledging offspring make use of social information by following informed members of their social groups to unknown roosts or foraging sites.
3. In the present study we applied, for the first time, the newly developed miniaturized proximity sensor system ‘BATS’, a fully automated system for documenting associations among individual bats both while roosting and while on the wing. We quantified associations among juveniles and other group members while switching roosts and during foraging.
4. We found clear evidence for information transfer while switching roosts, mainly among juveniles and their genetically identified mothers. Anecdotal observations suggest intentional guidance behaviour by mothers, indicated by repeated commuting flights between the pup and the target roost. Infrequent, short meetings with colony members other than the mother indicate local enhancement at foraging sites, but no intentional information transfer.
5. Our study illustrates how advances in technology enable researchers to solve long-standing puzzles. Miniaturized proximity sensors facilitate the automated collection of continuous data sets and represent an ideal tool to gain novel insights into the sociobiology of elusive and small-bodied species.

biorxiv animal-behavior-and-cognition 0-100-users 2018
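The kind of dyadic analysis the summary describes, distinguishing repeated mother–pup associations from infrequent meetings with other colony members, reduces to counting contacts per pair of animals. A minimal sketch (illustrative only; the real ‘BATS’ system's output format and the IDs below are assumptions):

```python
from collections import Counter

def dyadic_meeting_counts(contacts):
    """Count meetings per dyad from proximity-sensor detections.

    `contacts` is a list of (bat_id_a, bat_id_b) records; IDs are
    sorted within each pair so (a, b) and (b, a) count as one dyad.
    """
    return Counter(tuple(sorted(pair)) for pair in contacts)

# Hypothetical log: repeated mother-pup contacts vs. one stray encounter
log = [("mother1", "pup1"), ("pup1", "mother1"), ("pup1", "adult7")]
counts = dyadic_meeting_counts(log)
```

In a real analysis, timestamps and signal strength would additionally be used to merge bursts of detections into discrete "meetings" before counting.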

Fast animal pose estimation using deep neural networks, bioRxiv, 2018-05-25

Recent work quantifying postural dynamics has attempted to define the repertoire of behaviors performed by an animal. However, a major drawback to these techniques has been their reliance on dimensionality reduction of images, which destroys information about which parts of the body are used in each behavior. To address this issue, we introduce a deep learning-based method for pose estimation, LEAP (LEAP Estimates Animal Pose). LEAP automatically predicts the positions of animal body parts using a deep convolutional neural network with as little as 10 frames of labeled data for training. This framework consists of a graphical interface for interactive labeling of body parts and software for training the network and fast prediction on new data (1 hr to train, 185 Hz predictions). We validate LEAP using videos of freely behaving fruit flies (Drosophila melanogaster) and track 32 distinct points on the body to fully describe the pose of the head, body, wings, and legs with an error rate of <3% of the animal’s body length. We recapitulate a number of reported findings on insect gait dynamics and show LEAP’s applicability as the first step in unsupervised behavioral classification. Finally, we extend the method to more challenging imaging situations (pairs of flies moving on a mesh-like background) and movies from freely moving mice (Mus musculus) where we track the full conformation of the head, body, and limbs.

biorxiv animal-behavior-and-cognition 500+-users 2018
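Pose-estimation networks of this kind typically output one confidence map per tracked body part, with each part's predicted position taken as the peak of its map. A generic sketch of that decoding step (an assumption about the approach, not LEAP's exact implementation):

```python
import numpy as np

def decode_confidence_maps(maps):
    """Turn per-part confidence maps into (row, col) peak coordinates.

    `maps` has shape (n_parts, height, width); the argmax of each
    part's map is that part's predicted pixel position.
    """
    n_parts, h, w = maps.shape
    flat_peaks = maps.reshape(n_parts, -1).argmax(axis=1)
    return np.stack(np.unravel_index(flat_peaks, (h, w)), axis=1)

# Toy example with two body parts on a 4x5 image
maps = np.zeros((2, 4, 5))
maps[0, 1, 2] = 1.0  # part 0 peaks at row 1, col 2
maps[1, 3, 0] = 1.0  # part 1 peaks at row 3, col 0
coords = decode_confidence_maps(maps)
```

Tracking 32 such points per frame, as in the fly experiments above, would yield a 32×2 coordinate array per video frame for downstream gait or behavioral analysis.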


Created with the audiences framework by Jedidiah Carlson
