Abstract
A key function of the brain is to move the body through a rich, complex environment. When rodents engage their environment, they move their whiskers as they extract tactile information. Even though the study of whisking has a long history, the details of what mice move when they actively move a whisker to touch are still unknown. Here we trained head-fixed mice in a simple go-cue task to move a whisker on one side of the face to touch a sensor, and tracked facial movements. Our analysis shows that mice specifically control the movement of the whisker they use to touch, and that as they move their whiskers, they move their nose and apply forces on the head post in a manner that reflects the behavioral epoch. Importantly, mice controlled the setpoint, amplitude and frequency of whisker movement bilaterally and individually. Additionally, even though mice achieved the goal of the task -- to touch the sensor within 2 seconds -- how they did so, that is, how they coordinated movement of the nose and forces on the head post with movement of individual whiskers, was stereotyped, specific and related to the distance they needed to move a whisker to touch the sensor. Our work shows how stereotyped mouse behavior can be, and it emphasizes both the level of fine motor control mice can exert over individual whiskers and the extent of facial movements in a goal-directed whisking-to-touch task.
Introduction
Over the course of evolution, and over the lifetime of an animal, the motor sequences used to achieve a particular goal are refined, and the movements of many parts of the body are linked together into a whole that is optimal for achieving that goal. While we know that most natural behavior is multi-modal, involving many sensory and motor modalities, how various parts of the body move to generate a particular behavior, how variability in the movement of one part of the body affects movement of other parts, and how movement generates activity in a variety of sensory systems are only now beginning to be studied (Churchland, Afshar & Shenoy, 2006; Ghazanfar & Schroeder, 2006; Kelso, 2009; Krakauer et al., 2017; McElvain et al., 2018; Murakami & Mainen, 2015; Waschke et al., 2021).
Even in a model system like the rodent whisker system, where the goal of whisking is first and foremost to simply palpate surfaces and use tactile input to guide the animal around its environment, we are only on the cusp of understanding how rodents coordinate their facial movements (Vincent, 1912; Carvell & Simons, 1990; Hartmann, 2011, 2001; Sachdev, Sato & Ebner, 2002; Berg & Kleinfeld, 2003; Knutsen, Pietr & Ahissar, 2006; Grant et al., 2009; Haidarliu et al., 2010, 2012, 2015; Deschênes, Moore & Kleinfeld, 2012; McElvain et al., 2018; Adibi, 2019). For example, recent work has revealed that facial expressions in mice can convey emotions like pain, disgust or joy (Langford et al., 2010; Finlayson et al., 2016; Dolensek et al., 2020), and that facial movements can explain variance in single unit activity in surprisingly widespread areas of the brain (Avitan & Stringer, 2022; Stringer et al., 2019; Zagha et al., 2022; Steinmetz et al., 2019; Musall et al., 2019).
Traditionally, work in the whisker system focused on simple organizational features. First, input from each whisker forms a labelled line from the trigeminal brainstem to cortex (van der Loos & Woolsey, 1973). Second, while the whiskers sit in the whisker pad, which can be moved by facial muscles, each whisker also has a single muscle slung around it (Dörfl, 1982; Wineski, 1983, 1985; Haidarliu et al., 2010). Third, whisking is associated with active behavior and can be directional, reflecting the direction of movement of the animal (Knutsen, Derdikman & Ahissar, 2005; Towal & Hartmann, 2006; Grant, Mitchinson & Prescott, 2012; Grant, Sperber & Prescott, 2012; Arkley et al., 2014; Saraf-Sinik, Assa & Ahissar, 2015; Schroeder & Ritt, 2016; Dominiak et al., 2019; Bergmann et al., 2022). Fourth, whisking and breathing are linked together by a central pattern generator in the brainstem (Welker, 1964; Cao et al., 2012; Moore et al., 2013; Moore, Kleinfeld & Wang, 2014; McElvain et al., 2018; Tantirigama et al., 2020). Finally, while whisking is central for navigation, mice also move their eyes as they navigate, and they coordinate their movement with the movement of their whiskers, nose and eyes (Dominiak et al., 2019; Bergmann et al., 2022). Despite the relative sensory “isolation” of each whisker (but see Severson et al. 2017; Severson et al. 2019) and the potential for independent control of movement of each whisker (Sachdev, Sato & Ebner, 2002; Grant et al., 2009; Haidarliu et al., 2010), the ability of rodents to control the movement of individual whiskers and to coordinate this motion with movement of other parts of the face remains largely unexplored.
Here we begin the task of specifically measuring the coordination, variability and individuality that define mouse whisking behavior, in a simple go-cue triggered whisking-to-touch task. Our work shows that in this task mice move individual whiskers distinctly on the two sides of the face, and that they coordinate the motion of the nose and the forces on the head post in a specific manner. This work also shows that, from day to day, the individual movement patterns of each mouse can be measurably stereotyped in a goal-directed manner.
Results
Whisking, nose motion and forces on head post
Sixteen mice were trained to use the right-side C2 whisker to contact a sensor within 2 seconds of an auditory cue (Fig 1, sVideo1-2). In the course of 5964 correct trials over 84 sessions, the movement of the nose and of the painted C1 and C2 whiskers on both sides of the face was tracked as mice whisked to touch the sensor, waited for the reward, and licked (Fig 1A). The maximum intensity projection for a single session (n=40 trials) from one mouse shows the extent of movement of two whiskers on each side of the face, and of the nose, on all successful trials (Fig 1B). Additionally, a strain gauge was used to measure the up and down forces mice applied on the head post (Fig 1C). The movement and position of the whiskers and nose, and the forces mice applied on the head post on each trial, were converted into raster displays reflecting the position and movement of whiskers, the side-to-side motion of the nose, and the extent of the forces applied on the head post in each trial (Fig 1D). This display shows the timing and extent of whisker and nose motion, and the dynamics of the coordination between movements and the forces on the head post on every trial. The median positions for the 40 trials in the session are plotted below each raster plot.
This representative session reveals that, even though mice are highly trained to move the C2 whisker on the right side into contact with the sensor, and they reliably contact the sensor with just this one whisker, the mouse invariably moves all the tracked whiskers bilaterally on every trial. Even though mice move their whiskers bilaterally on every trial, the movement of the whiskers on the two sides of the face -- the contacting side and the non-contacting side -- was not identical. On the non-contacting side, the C1 and C2 whiskers move together and show similar movement patterns: fast protractions and large movements. On the contacting side, the C1 whisker moves much less than the C2, and both whiskers show a different pattern of movement than on the non-contacting side. Part of the difference in motion on the two sides of the face simply reflects the presence of the contact sensor, which blocks large movements of the front, C2 whisker; but part of the movement, especially of the C1 whisker in the back, reflects independent control of this whisker. These analyses also show the trial-by-trial and session-long stereotypy of the side-to-side nose motion and the head strain, as the mouse touches the contact sensor and licks for reward.
Side-to-side differences in multi-whisker behavior
To elucidate the behavioral strategies mice used in this task, we first examined the movement of individual whiskers during the task in more detail (Figs 2 and 3). The positions of whiskers were converted into an animal-centric coordinate system, using the position of the eyes, (Fig 2A). Whisker movement was converted into angular motion and separated into whisking setpoint and amplitude by examining the envelope of the whisking motion (Fig 2B).
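The envelope-based decomposition of whisker angle into setpoint and amplitude can be sketched as below. This is a minimal illustration, assuming rolling maximum/minimum envelopes, a 200 Hz frame rate and a 100 ms window; the paper's actual envelope extraction may differ (e.g. peak interpolation or a Hilbert transform).

```python
import numpy as np

def decompose_whisking(angle, fs=200.0, win_s=0.1):
    """Split a whisker-angle trace into a slow setpoint and a
    peak-to-peak amplitude using rolling upper/lower envelopes.
    Window length and frame rate are illustrative assumptions."""
    win = max(1, int(win_s * fs))
    n = len(angle)
    upper = np.empty(n)
    lower = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - win // 2), min(n, i + win // 2 + 1)
        upper[i] = angle[lo:hi].max()
        lower[i] = angle[lo:hi].min()
    setpoint = (upper + lower) / 2.0   # slow offset of the whisk cycle
    amplitude = upper - lower          # peak-to-peak size of the whisk
    return setpoint, amplitude

# Synthetic check: 10 Hz whisking, 20 deg peak-to-peak, around a 70 deg setpoint.
fs = 200.0
t = np.arange(0.0, 2.0, 1.0 / fs)
angle = 70.0 + 10.0 * np.sin(2.0 * np.pi * 10.0 * t)
sp, amp = decompose_whisking(angle, fs)
```

In this scheme, the setpoint captures the slow postural component of whisking and the amplitude captures the whisk-by-whisk vigor.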
The maximal intensity projections of tracked whiskers (Fig 2A) indicated that the resting positions of the tracked whiskers on the two sides of the face were different. Examination of individual videos of mice engaged in the task supported this possibility (Video). To quantify this difference, the mode of the setpoint for each whisker during a given behavioral session was calculated and considered to be the resting position (Fig 2C). This analysis shows that the resting position of the whiskers on the non-contacting side (relative to the level of the eyes) was more protracted than the resting position of the whiskers on the contacting side (Fig 2C). Comparison across multiple sessions confirmed that the resting whisker position on the non-contacting side was significantly more protracted than on the contacting side (Fig 2D; front whiskers, p=0.0001***; back whiskers, p=0.0001***; Wilcoxon signed rank test, N=29 sessions of 6 animals).
Next, we examined the frequency components of whisking on the two sides of the face (Fig 2E-F). The power spectra of whisking traces for a single session on the contacting side for the front and back whiskers showed a bimodal / skewed distribution, with a peak at ∼25 Hz. Note that the peak at 25 Hz does not result from contact: the C1 whisker (back), adjacent to the contacting whisker, does not contact the sensor at all (Fig 2E, light blue traces). The histogram of positions for the C1/back whisker was not affected by the contact sensor (Fig 2C, top right, gray area). This high frequency whisking band was prominent on the contacting side (Fig 2E).
To compare the frequency components detected across all sessions from all animals, we normalized the power of the low (3–10 Hz) and the high (10–30 Hz) frequency bands relative to the baseline (1–3 Hz) power. This analysis indicates that there was significantly more power at higher frequencies of whisking on the contacting side than on the non-contacting side (Fig 2F, right) (3–10 Hz, p=0.0623, NS; 10–30 Hz, p=0.0000***; Wilcoxon signed rank test, N=85 sessions of 16 animals). There were no significant differences in side-to-side power at lower frequencies of whisking (Fig. 2F, left).
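This band-power normalization can be sketched as below, using the band edges given in the text (1–3 Hz baseline, 3–10 Hz low, 10–30 Hz high). The plain-FFT periodogram and the summation over bins are illustrative assumptions; the paper's exact spectral estimator may differ.

```python
import numpy as np

def band_power_ratio(trace, fs=200.0):
    """Power in the low (3-10 Hz) and high (10-30 Hz) whisking bands,
    each normalized to the 1-3 Hz baseline band."""
    trace = trace - trace.mean()
    psd = np.abs(np.fft.rfft(trace)) ** 2
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
    def band(f0, f1):
        sel = (freqs >= f0) & (freqs < f1)
        return psd[sel].sum()
    base = band(1.0, 3.0)
    return band(3.0, 10.0) / base, band(10.0, 30.0) / base

# Synthetic check: a trace dominated by a 25 Hz component (with a weaker
# 2 Hz baseline component) should show a high-band ratio well above 1.
fs = 200.0
t = np.arange(0.0, 5.0, 1.0 / fs)
trace = 2.0 * np.sin(2.0 * np.pi * 2.0 * t) + 10.0 * np.sin(2.0 * np.pi * 25.0 * t)
low_r, high_r = band_power_ratio(trace, fs)
```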
Together, these results highlight the side-to-side asymmetry in whisking. Compared to the whiskers on the non-contacting side, the contacting side whiskers tended to be retracted during the task and tended to move with more vigor at higher frequencies.
Behavioral phase-dependent single-whisker movement dynamics
Next, we examined how changes in setpoints and whisking amplitudes characterize the whisking behavior of the mice during the task, taking into account the four whiskers on the two sides of the face (Fig 3).
Single session data illustrate the distinct setpoint of individual whiskers (Fig 3A). Note that the setpoint data shown here were defined as the deviation of the whisker position from the resting whisker position during the session. This definition avoids the bias introduced by day-to-day or whisker-to-whisker (C2, C1 positions) differences in resting positions. We see that on average, the setpoint of the whisker used to touch the sensor was more protracted than the setpoint of the adjacent whisker (Fig 3A, left). By comparison, whiskers on the non-contacting side moved almost identically, with their setpoints being similar to each other (Fig 3A, right).
To examine the behavioral state dependent changes in whisking in more detail, each trial was divided into distinct 750 ms epochs (Table 1). At cue onset, mice adjusted the setpoint of whiskers bilaterally; they protracted their whiskers and kept them protracted until the licking phase of the trial (Fig 3B, Tables 2 and 3, p<0.05, Kruskal–Wallis test, followed by Dunn’s post hoc pairwise tests with Bonferroni correction).
On the contacting side, during the cued and the licking phases of the trial, adjacent whiskers were positioned distinctly, with the front whisker (C2) being kept significantly more protracted than the back whisker (C1) (Fig 3B, top; Table 2, Front vs Back; p<0.05, Dunn’s post hoc pairwise tests with Bonferroni correction). On the non-contacting side, mice protracted their whiskers, but there were no significant differences in the positioning of the front and back whiskers in any of the behavioral phases examined (Fig 3B, bottom; Table 3, Front vs. Back; p>0.05, Dunn’s post hoc pairwise tests with Bonferroni correction).
Note that during the cued epoch, the difference in the whisking setpoint of the adjacent (front-back) whiskers was not likely to arise from side-to-side differences in whisking. In this epoch on average, there were no significant differences between the whisking setpoints on the contacting and non-contacting side whiskers (Fig 3C; pre-cue, p=0.5148, NS; cued, p=0.2371, NS; trigger, p=0.9156, NS; lick, p=0.0318*; end-of-trial, p=0.8664, NS; Mann–Whitney U-test, N=85 sessions from 16 animals). Taken together, these results indicate that, when mice are engaged in the task, they protract their whiskers differentially on the two sides of the face and specifically control the spread between whiskers.
Next, we examined whisking amplitude. Analysis of average whisking amplitude across sessions demonstrated that, compared to the pre-cue period, it increased significantly on both sides of the face from the onset of the auditory cue until the end of the trial (Fig 3D & 3E; Tables 4 and 5; p<0.001, Kruskal–Wallis test, followed by Dunn’s post hoc pairwise tests with Bonferroni correction). Additionally, there were significant, behavioral-epoch-dependent differences in the amplitude of whisking for adjacent whiskers on the contacting side. During both the pre-cue and the end-of-trial phases, the whisking amplitude of the front, C2 whisker on the contacting side was significantly larger than that of the adjacent, back whisker (Table 4, Front vs. Back). There were no significant differences in the amplitude of whisking of the two whiskers on the non-contacting side (Table 5, Front vs. Back).
Whisking amplitudes on the two sides of the face did not differ significantly between the contacting and non-contacting sides, except in the contact trigger phase (Fig 3F) (pre-cue, p=0.0854, NS; cued, p=0.7836, NS; trigger, p=0.0023**; lick, p=0.3082, NS; end-of-trial, p=0.3156, NS; Mann–Whitney U-test, N=85 sessions of 16 animals). The difference during the contact trigger phase is likely to be related to the presence of the contact sensor, which blocks motion of the front whisker on the contacting side.
Taken together, our analysis indicates that mice exert control over the movement of individual adjacent whiskers on each side of the face and they display distinct movement patterns on the two sides of the face. These differences in individual whisker positions primarily reflect the effect on the setpoint. When mice use a whisker to touch the sensor, the spread of whiskers -- the distance between adjacent whiskers -- changes, and the whiskers spread apart, but only on the side actively used in this tactile behavior.
Nose movement and forces on the head-post during whisking
Head movement, nose movement, and sniffing have each previously been linked to whisking (Towal & Hartmann, 2006; Deschênes, Moore & Kleinfeld, 2012; Moore et al., 2013; Cao et al., 2012; Kurnikova et al., 2017; Dominiak et al., 2019). Here we examined the relationship between whisker motion, nose movement, and the forces mice apply on the head post when whisking to touch in our goal-directed task. Linear regression models were fit to estimate the median nose position, or the median head strain, in each behavioral phase from the median positions of the four whiskers (Fig 4A). Interestingly, whisker positions were most closely and significantly linked to nose position immediately after cue onset -- in the cued phase (compared to the pre-cue phase; cued, p=0.0000***; trigger, p=0.0745, NS; lick, p=0.6653, NS; end-of-trial, p=1.0000, NS; Wilcoxon signed rank test with Bonferroni correction, N=64 sessions from 14 animals) (Fig 4B).
Similarly, whisker position and head post strain were also best linked together after cue onset (compared to the pre-cue phase; cued, p=0.0086**; trigger, p=1.0000, NS; lick, p=0.5093, NS; end-of-trial, p=1.0000, NS; Wilcoxon signed rank test with Bonferroni correction, N=69 sessions from 15 animals) (Fig 4C). These results imply that mice recruit nose and head movement into their motor plan when they attempt to move their whiskers to actively touch the contact sensor. The results of this analysis are consistent with previous studies linking left-right differences in whisking with nose movement (Dominiak et al., 2019).
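The regression analysis described above can be sketched as an ordinary least-squares fit of per-trial median nose position on the four median whisker positions. The variable names, the plain least-squares solver, and the synthetic data are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def fit_nose_from_whiskers(W, nose):
    """Ordinary least-squares fit of per-trial median nose position
    on the median positions of the four tracked whiskers.
    W: (n_trials, 4) whisker positions; nose: (n_trials,) positions."""
    X = np.column_stack([W, np.ones(len(W))])   # add an intercept column
    coef, *_ = np.linalg.lstsq(X, nose, rcond=None)
    pred = X @ coef
    ss_res = np.sum((nose - pred) ** 2)
    ss_tot = np.sum((nose - nose.mean()) ** 2)
    return coef, 1.0 - ss_res / ss_tot          # weights + intercept, R^2

# Synthetic check: nose position as a noisy linear readout of whisker positions.
rng = np.random.default_rng(0)
W = rng.normal(size=(200, 4))
nose = W @ np.array([0.5, -0.2, 0.1, 0.3]) + 0.05 * rng.normal(size=200)
coef, r2v = fit_nose_from_whiskers(W, nose)
```

The same fit, with head-post strain in place of nose position, would give the whisker-strain models of Fig 4C.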
Trial-to-trial stereotypy and animal to animal and day to day variability
While we found a few common patterns of whisker and nose movement across behavioral sessions of different animals, we also found considerable variability across animals, sessions and trials. To examine the source of this variability, and whether trial-to-trial variability could be attributed to differences in the strategies that individual mice used to achieve the goal of the task, we used a principal component approach to examine the data. Each trial was represented as a set of features: the median whisking setpoints, median amplitude values, and median nose positions during 3 behavioral phases (the contact trigger, lick and end-of-trial phases). Each trial was visualized using t-SNE, which revealed that from session to session and animal to animal, these features could form small clusters (Fig 5A, right). The clusters corresponded to trials belonging to a single behavioral session from an animal, or to multiple sessions from a single animal (Fig 5A).
To examine the state of clustering in detail, we introduced a metric of distance between trials. The extent of the cluster in a behavioral session, within the trial space, was defined as the mean distance between any pair of trials belonging to the session. The distance between a pair of behavioral sessions was defined as the mean distance between any pair of trials out of these two sessions. Animal- or session-based distance matrices revealed a set of bright pixels along the diagonal, indicating measurable similarities between trials belonging to an individual animal or to sessions from a single animal. This tendency appeared clearer for the session-based distance matrix than for the animal-based one (Fig 5B). The aggregation index, i.e. the ratio of inter-cluster distances to the size of clusters (see Materials and Methods for details), was larger for the actual dataset compared to the shuffled datasets, suggesting that clustering arose from the details of movement in trials belonging to single animals or to sessions from individual animals (Fig 5C) (p<0.01, based on comparison with 500 shuffled datasets).
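The aggregation index can be sketched by taking the verbal definition above literally: the ratio of the mean inter-cluster trial-to-trial distance to the mean intra-cluster distance (the "size" of clusters). The exact formula is in the paper's Materials and Methods and may differ from this sketch.

```python
import numpy as np

def aggregation_index(trials, labels):
    """Ratio of mean inter-cluster trial distance to mean intra-cluster
    distance, per the verbal definition of the aggregation index.
    trials: (n, d) feature vectors; labels: session/animal id per trial."""
    labels = np.asarray(labels)
    diff = trials[:, None, :] - trials[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))     # pairwise Euclidean distances
    same = labels[:, None] == labels[None, :]
    off_diag = ~np.eye(len(trials), dtype=bool)
    intra = dist[same & off_diag].mean()         # within-session spread
    inter = dist[~same].mean()                   # between-session distance
    return inter / intra

# Two well-separated synthetic "sessions" give an index well above 1;
# shuffling the session labels pushes the index toward 1.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.5, size=(20, 2)),
               rng.normal(10.0, 0.5, size=(20, 2))])
labels = np.array([0] * 20 + [1] * 20)
ai = aggregation_index(X, labels)
shuffled_ai = aggregation_index(X, rng.permutation(labels))
```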
Our definition of trial-to-trial distance included 3 behavioral features: whisking setpoints, whisking amplitudes and nose positions, in the distinct behavioral phases. To determine the contribution of individual behavioral features to trial-to-trial variability, subsets of behavioral features were shuffled, one after another, and the change in the aggregation index was computed (Fig 5D). Our analysis revealed that the animal’s behavior during the licking phase contributed most to the aggregation index compared to the other behavioral phases (p<0.01**, Kruskal–Wallis test and Dunn’s post hoc pairwise tests with Bonferroni correction). Furthermore, nose motion had a larger per-feature contribution to the aggregation index than the whisker-related features (p<0.01**, Kruskal–Wallis test and Dunn’s post hoc pairwise tests with Bonferroni correction).
Together, these results imply that behavioral variability is reflected by movement of individual body parts, during each behavioral phase, and that the movement is linked to “uninstructed” behavior -- nose motion -- not to the goal of the task per se, i.e. whisking to touch. Trial-to-trial variability was related to epochs where mice were not engaged in the actual task, i.e. when they were licking.
Factors mediating behavioral stereotypy and variability
What factors contribute to behavioral variability across animals and sessions? One possibility was that animals use different motor strategies depending on difficulty of the task i.e. on the spatial configuration of the contact sensor and whiskers. To examine this possibility, we tested whether the inter-session distance co-varied with the distance of the contact sensor from the whiskers (sVideo1 and sVideo2). From session to session, mice could position their whiskers closer (sVideo1) or farther from the contact sensor (sVideo2). We found that the t-SNE coordinates of individual sessions correlated with the distance of the contact sensor from the resting position of the contacting whisker (R2 = 0.6949, linear regression) (Fig 5E). Comparison of the dataset to the 3000 shuffled datasets, where the correspondence between the sensor position and the t-SNE coordinates was randomized, revealed that the effect was significant (p<0.001***; shuffled R2, min=0.0000, median=0.0169, max=0.1834). This result supports the idea that during cue triggered whisking mice take into account the variability in position of their bodies relative to elements in the environment that they touch. The spatial environment around the face could be one of the most important sources of session-to-session behavioral variability in this task.
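The shuffle comparison above can be sketched as a permutation test on R2. For simplicity, this sketch regresses a single coordinate on sensor distance; the paper's analysis related sensor position to 2-D t-SNE coordinates and used 3000 shuffles, so this is an illustration of the logic, not the actual computation.

```python
import numpy as np

def r2(x, y):
    """Coefficient of determination of a 1-D linear fit."""
    pred = np.polyval(np.polyfit(x, y, 1), x)
    return 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)

def permutation_p(x, y, n_shuffles=3000, seed=0):
    """Observed R^2 and the fraction of label shuffles that meet or
    beat it, mirroring the shuffle comparison described in the text."""
    rng = np.random.default_rng(seed)
    observed = r2(x, y)
    null = [r2(x, rng.permutation(y)) for _ in range(n_shuffles)]
    return observed, float(np.mean([v >= observed for v in null]))

# Synthetic check: a clear linear relation should beat nearly all shuffles.
rng = np.random.default_rng(2)
x = np.arange(30.0)
y = 0.5 * x + rng.normal(0.0, 1.5, size=30)
obs, p = permutation_p(x, y, n_shuffles=500)
```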
Finally, we examined neural mechanisms that underlie trial-to-trial stereotypy and variability. We took advantage of the Vgat-Cre; FLEX-ChR2 (Vgat-ChR2) transgenic mouse line to optogenetically suppress activity in primary motor cortex (M1) during the auditory cue. Even under unilateral optogenetic suppression of M1 activity, the animals performed the task at similar rates as in unsuppressed, control trials, presumably due to the activity remaining in other cortical areas (sVideo3 and sVideo4). But when we performed the t-SNE analysis, the suppressed trials tended to be located on the rim of the session-specific cluster of trials, i.e. suppression of M1 resulted in a less clustered state of the session (Fig 6A). When we computed trial-to-trial distances in the t-SNE space, significantly larger median trial-to-trial distances were observed for M1-suppressed trials than for control trials in Vgat-ChR2 mice (Fig 6B, right) (p=0.02*; Mann–Whitney U-test, N=9 sessions of 3 animals). No significant differences were observed in trial-to-trial distances of ChR2(-) mice (Fig 6B, left) (p=0.35, NS; Mann–Whitney U-test, N=26 sessions of 7 animals). Taken together, these results indicate that activity in M1 contributes to trial-to-trial behavioral stereotypy in this task.
Discussion
Our work shows that mice can control how they move and position individual whiskers. They can control the extent of movement and can even independently control the vigor of whisking on the two sides of the face. They can effectively control and coordinate, in a behavioral-state-dependent manner, the spread between whiskers. These results suggest an impressive level of fine spatiotemporal control over single facial muscles. In the course of a trial, mice do not just move a single whisker -- a whisker that they have learned to use in a tactile behavior -- they invariably move whiskers bilaterally, move their nose side to side, and apply up and down forces on the head post. Finally, in a single session mice exhibit an extraordinary level of coordination and stereotypy in moving their whiskers and nose to achieve the goal of touching a sensor. A priori, it was possible that from trial to trial and day to day mice adjusted how they coordinated their movements, but our work shows a remarkable consistency in what mice move, how they move, and how they adapt their movements to small changes in the external environment.
From an anatomical perspective these results are not surprising. The two sides of the face have independent muscles with independent points of attachment on the skull, and each mystacial whisker is associated with a sling muscle (Dörfl, 1982; Wineski, 1983; Haidarliu et al., 2010). It should therefore be possible for rodents to exert control over the movement of each mystacial vibrissa independently. Despite this anatomical organization, basic questions have lingered: when do rodents control whiskers on just one side of the face? How and when do rodents control single whiskers, and how is this control reflected? Do side-to-side differences in whisking simply reflect the direction of head or body movement (Towal & Hartmann, 2006; Dominiak et al., 2019; Bergmann et al., 2022)?
Our work here highlights how anatomy is transformed into function. First, even when mice are highly trained to move a whisker on one side of the face into contact with a sensor, they invariably move whiskers bilaterally on every trial. This result suggests that facial movements in rodents often have a bilateral component. Second, even though whiskers move bilaterally, movement on the two sides of the face is not identical: whiskers spread apart more on the contacting side, whiskers are positioned distinctly on the two sides, and whiskers move with distinct vigor at high frequencies on the two sides of the face. Third, mice control the movement of the whisker they use to touch distinctly from the movement of the adjacent whisker. Note that earlier work in head-fixed rats, using a very similar paradigm, came to the conclusion that rats could move their whiskers independently and that this was mostly evident in the spread between whiskers (Sachdev, Sato & Ebner, 2002). In another study, of freely moving rats (Grant et al., 2009), the authors also concluded that rats control movement of individual whiskers and that this was evident in the spread of whiskers. But in their study, whisker spread decreased before contact. In our hands, mice do something very different -- they increase the spread of whiskers on the contacting side. The different results of the two studies are likely to be related to context. The goal of whisking in our study was for mice to hit a sensor with enough force to get a reward. In the earlier work, rats were not water deprived, there was no task, and rats were simply exploring their environment (Grant et al., 2009). Additionally, in our own earlier work, in the course of the complex task of navigating a plus maze, side-to-side differences in the setpoint of whisking and in movement of the nose were evident on every trial as mice planned and executed turns.
Whisking direction predicted movement direction and was coordinated with the direction of eye movement (Bergmann et al., 2022; Dominiak et al., 2019). In a different set of studies, in head-fixed mice, we have shown that the spread between whiskers could potentially be used as a real-time parameter to trigger rewards (Sehara et al., 2021). Taken together, these studies highlight the variety of neural-behavioral strategies that rodents use to control movement of individual whiskers bilaterally.
The present work emphasizes that a number of facial movements are coordinated together in the course of a mouse actively using a single whisker in a tactile behavior. On the one hand, there is complexity, with multiple variables to account for -- individual whiskers, the nose, the head post, i.e. the neck muscles. On the other hand, the stereotypy is remarkable: multiple aspects of behavior are coordinated from trial to trial, in some animals even from day to day. A significant portion of the variance in coordination can be explained by small changes in how far mice have to move a whisker.
The coordination and stereotypy in this behavior are remarkable, but they point to several fundamental issues for systems neuroscientists. The first point, emphasized in recent papers by many labs, is that variance in neural activity can arise from unexpected sources, like uninstructed movements (Steinmetz et al., 2019; Stringer et al., 2019; Musall et al., 2019). A second point that our work makes is that behavior is fundamentally multimodal, involving multiple dimensions that are coordinated together, and that the activity of a particular neuron or circuit can be related to any component of the overall behavior. Dissecting this -- understanding what a spike or spikes in a circuit relate to, what triggers the activity or what the activity causes -- requires additional analytical approaches.
In the course of a simple action, the brain, or the animal, actively controls and coordinates the movement of a number of parts of the body. In the present work, the movement of the whisker used to touch is the only instructed movement; all other behaviors -- the movement of adjacent whiskers, the movement of whiskers on the other side of the face, the movement of the nose, and the forces mice apply on the head post -- are uninstructed. Even though it is likely that we have under-sampled the actual number of uninstructed behaviors that occur when mice move a whisker to touch (for example, the posture and limbs probably also change position in a coordinated fashion), it is clear that mice move more than just the whisker.
What are the neural mechanisms underlying behavioral variability? One explanation is that it arises from variability in circuit activity. While it is likely that variability in cortical and subcortical activity contributes to behavior, it is highly unlikely that individual cortical or subcortical neurons modulate or coordinate the whole spectrum of facial movements. What is more likely is that variability in neural activity arises from the interaction between multiple coordinated action sequences -- the behavior -- and the neural circuits that control and respond to the movement. The timing and kinematics of a particular action -- a whisk -- vary because of variability in the other aspects of behavior that modulate activity in the brain.
Understanding how the brain controls active behavior requires an understanding of the actual details of behavior, the moment-by-moment actions taken to achieve a goal. Our expectation is that knowing what moves when mice move a single whisker to touch will help to better explain both variance in cortical activity and variance in the sensory-motor actions mice take in performing tactile behaviors.
Materials and Methods
Animals and surgery
We performed all procedures in accordance with protocols for the care and use of laboratory animals approved by the Charité–Universitätsmedizin Berlin and the Berlin Landesamt für Gesundheit und Soziales (LaGeSo). Adult C57BL/6 mice (n=16), weighing 25 to 32 g, were anesthetized with Ketamine/Xylazine (90 mg/kg / 10 mg/kg). Lightweight aluminium head posts were affixed to the skull using Rely X and Jet Acrylic (Ortho-Jet) black cement (Ebner et al., 2019; Dominiak et al., 2019). In the two days after surgery, analgesia was provided by Buprenorphine and Carprofen injections.
Behavioral task
We trained head-fixed mice to use their whiskers to touch a piezo film-based contact sensor. An auditory cue (piezo buzzer) initiated the trial, and the cue stayed on for 2 seconds or until whisker contact with the sensor. The contact sensor was placed in front of the C2 whisker of the mouse, with the animal’s whiskers rostral to the C2 trimmed back to 2 mm. The signal from the contact sensor was fed through a custom-built amplifier, and the analogue waveform was thresholded to emit a transistor-transistor logic (TTL) signal. If the contact-evoked waveform exceeded the threshold, the auditory cue was turned off and a reward was delivered. The duration of each trial varied and depended on the reaction and movement time after cue onset.
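The thresholding step above can be sketched in software as follows. The sample rate, threshold value and refractory period are illustrative assumptions; the actual TTL trigger was generated by the custom-built hardware.

```python
import numpy as np

def contact_events(waveform, threshold, fs=20000.0, refractory_s=0.05):
    """Turn an amplified piezo waveform into discrete contact times by
    threshold crossing. Sample rate, threshold and refractory period
    are illustrative; the real trigger was generated in hardware."""
    above = np.abs(waveform) > threshold
    # rising edges: samples that cross the threshold from below
    edges = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    events, last = [], -np.inf
    for idx in edges:
        t = idx / fs
        if t - last >= refractory_s:   # ignore ringing after a contact
            events.append(t)
            last = t
    return events

# Synthetic check: a single brief transient at 0.5 s yields one event.
fs = 20000.0
wave = np.zeros(int(fs))        # one second of baseline
wave[10000:10020] = 1.0         # contact transient
events = contact_events(wave, threshold=0.5, fs=fs)
```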
Behavioral training
One week after surgery, mice were habituated to being handled, and habituated to the behavioral apparatus. In the first days of habituation, mice were acclimated to having their head post handled by the experimenter, and to short (up to several minutes) head fixation. In the course of the first week of habituation, the duration of head fixation was gradually increased from 5 to 40 minutes. In the course of habituation to head fixation, mice were also habituated to having their whiskers painted. During whisker painting, mice were rewarded with sweetened condensed milk.
After a week of habituation, mice were water deprived and trained to respond to an auditory cue by licking the lick tube. Over the course of 2-3 weeks, mice were introduced to the piezo sensor positioned in front of the painted whiskers on the right side of the face. Initially, the sensor was positioned close to the C2 whisker on the right side, with the whiskers rostral to C2 trimmed back to 2 mm. On subsequent training days, the sensor was positioned further away from the mouse. To achieve a consistent success rate for each mouse, the position of the sensor was varied by a few millimeters from day to day.
Behavioral tracking
To track whisker and nose positions, behavioral data was recorded at 200 Hz with a Basler acA1920-155uc USB 3.0 camera and an f=25 mm / F1.4 objective mounted above the animal. The C1 and C2 whiskers on both sides, as well as the tip of the nose, were painted (UV glow, https://www.uvglow.co.uk/) and illuminated by UV torches. Videos were acquired in the proprietary format from Matrox Imaging (https://www.matrox.com/) and later converted into the H.264 format. The acquisition and conversion were done with ZR-view, a custom software (Robert Zollner, Eichenau, Germany). Camera triggers were generated either with the reward trigger or at the offset of the auditory cue. A five-second ring buffer was used to record frames ±2.5 seconds around each camera trigger.
Whisker and nose positions were estimated from the videos using a custom Python script (videobatch, https://doi.org/10.5281/zenodo.3407666). The top-view videos first underwent a maximum intensity projection using the videobatch script. Regions of interest (ROIs) for tracking were selected manually and separately for the whiskers on both sides of the face and for the nose, using Fiji's freehand selection tool (Dominiak et al., 2019; Sehara et al., 2019). Using the Python script, pixels belonging to a particular hue value were collected and their luma-weighted average position was computed. For frames where the algorithm failed for any reason, values were dropped and later filled in by linear interpolation.
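The hue-based tracking step can be sketched as follows. This is a hypothetical reimplementation for illustration only: the function name `track_color`, the hue tolerance parameter, and the use of the HSV value channel as the luma weight are our assumptions, not details of the actual videobatch script.

```python
import numpy as np

def track_color(frame_hsv, roi_mask, hue, tol=10):
    """Luma-weighted centroid of pixels near a target hue inside an ROI.
    Returns (x, y), or (nan, nan) if no matching pixel is found."""
    h = frame_hsv[..., 0].astype(float)
    v = frame_hsv[..., 2].astype(float)   # value channel used as a luma proxy
    # circular hue difference, assuming an OpenCV-style hue range of 0..179
    diff = np.abs(((h - hue + 90.0) % 180.0) - 90.0)
    sel = roi_mask & (diff <= tol) & (v > 0)
    if not sel.any():
        return np.nan, np.nan             # dropped frame, interpolated later
    ys, xs = np.nonzero(sel)
    w = v[sel]
    return float((xs * w).sum() / w.sum()), float((ys * w).sum() / w.sum())
```

Returning NaN for failed frames matches the interpolation step described above: the gaps can later be filled with `numpy.interp` or `pandas.Series.interpolate`.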
Licking was monitored with a piezo sensor attached to the lick tube. The signal from the sensor was amplified using a custom-built amplifier, and the licking waveform was exported to a CED-1401 interface. Head strain was monitored with a strain gauge built into the head-post holding rod. Its signal was amplified using a custom amplifier before being fed into the data acquisition interface.
Behavioral sessions were monitored and recorded using Spike2 (CED, Cambridge, UK) equipped with a Power1401 data acquisition interface. During each session, the duration of the auditory cues, reward triggers, camera triggers and lick event data were acquired, along with analog waveforms of the output of the contact sensor, piezo-based lick sensor and strain gauge. The acquired data was saved as compressed Spike2 files and exported as tab-separated value (TSV) files for analysis.
General analytical procedures
Analytical procedures were performed using Python (https://www.python.org/, version 3.7.6), along with several standard modules for scientific data analysis (NumPy, https://www.numpy.org/, version 1.18.1; Scipy, https://www.scipy.org/, version 1.4.1; matplotlib, https://matplotlib.org/, 3.1.3; pandas, https://pandas.pydata.org/, 1.0.1; scikit-learn, https://scikit-learn.org/stable/, version 0.22.1; Bottleneck, https://pypi.org/project/Bottleneck/, 1.3.2; Statsmodels, https://www.statsmodels.org/, version 0.13.0; h5py, https://www.h5py.org/, version 3.6.0) and non-standard packages (sliding1d, https://github.com/gwappa/python-sliding1d, 1.0; fitting2d, https://doi.org/10.5281/zenodo.3782790, 1.0.0a2). Kruskal–Wallis tests and Dunn's post hoc tests were performed using a custom Python script based on the one found at https://gist.github.com/alimuldal/fbb19b73fa25423f02e8. Data was sorted based on trial duration. Trials on which mice failed to contact the sensor within 2 seconds were not analyzed.
Behavioral data preprocessing
The whisker positions tracked during each behavioral session were converted to whisking angles. Using the fitting2d Python library, a circle was fit to the set of two-dimensional positions of each whisker, and the position at each timepoint was converted into polar coordinates on the fitted circle. Whisking setpoint and amplitude were then estimated from these whisking angles. Upper and lower envelopes of the whisker positions were estimated by smoothing the whisker positions with a sliding-window mean over a 300 ms window. The lower envelope was defined as the setpoint; the difference between the upper and lower envelopes was defined as the whisking amplitude. The mode of whisking setpoints was estimated from the histogram of whisking offsets during each behavioral session. The lower half (up to the 50th percentile) of the histogram was fitted with a cubic spline, and the mode was defined as the whisker position giving the maximal density of setpoint positions.
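The angle conversion and envelope estimation can be sketched as follows. This is a minimal illustration, not the fitting2d implementation: we substitute an algebraic (Kåsa) least-squares circle fit, and the way the min/max envelopes are combined with the sliding-window mean is our reading of the description.

```python
import numpy as np
from scipy.ndimage import maximum_filter1d, minimum_filter1d, uniform_filter1d

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit, solving the linearized
    equation x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)."""
    A = np.column_stack([2.0 * x, 2.0 * y, np.ones_like(x)])
    (cx, cy, c), *_ = np.linalg.lstsq(A, x**2 + y**2, rcond=None)
    return cx, cy, np.sqrt(c + cx**2 + cy**2)

def whisking_angle(x, y, cx, cy):
    """Angle (degrees) of each tracked position around the fitted circle."""
    return np.degrees(np.arctan2(y - cy, x - cx))

def setpoint_amplitude(angle, fps=200, win_ms=300):
    """Setpoint (smoothed lower envelope) and amplitude (upper minus lower
    envelope) from a whisking-angle trace."""
    win = max(1, int(fps * win_ms / 1000))
    upper = uniform_filter1d(maximum_filter1d(angle, win), win)
    lower = uniform_filter1d(minimum_filter1d(angle, win), win)
    return lower, upper - lower
```

For an 8 Hz sinusoidal whisk oscillating between 5 and 15 degrees, this sketch returns a setpoint near 5 degrees and an amplitude near 10 degrees, matching the definitions above.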
Unless otherwise indicated, this mode of whisking setpoints was considered the resting position of the whisker and was used as the zero-position of the whisking angles. In the specific cases where the resting positions of individual whiskers were compared, whisking angles were calculated relative to the position of the eyes. The corners of the left and right eyes were annotated manually on a video frame of each behavioral session, and a line connecting the corners of the two eyes was drawn. The crossing point between this line and the whisking circle was defined as the eye level, which was then used to calculate the whisking angles.
The position of the contact sensor relative to the mouse could change slightly from day to day, as each animal was positioned anew in the behavioral rig. The sensor position for each day / each session was measured post hoc using the maximal density of whisker positions evident in the video data.
Nose position data were projected onto the axis of their first principal component to obtain a one-dimensional position signal.
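This projection is a standard PCA step; a minimal sketch using NumPy's SVD (the function name is ours):

```python
import numpy as np

def project_first_pc(nose_xy):
    """Project 2-D nose positions onto their first principal component,
    returning a centered 1-D position signal (sign is arbitrary)."""
    centered = nose_xy - nose_xy.mean(axis=0)
    # the first right singular vector of the centered data is the axis
    # of maximal variance
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[0]
```

For nose motion along a single direction, this recovers the position along that direction up to an overall sign.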
Analysis of behavioral epochs
Data from each trial was first parsed into behavioral epochs: a pre-cue phase, in which mice waited for cue onset; an auditory cue phase, which initiated whisker movement; a trigger epoch around threshold-crossing whisker contact with the sensor (500 ms before to 250 ms after touch); a lick/reward epoch; and a post-reward epoch (Table 1).
To examine nose-whisker and head-whisker coordination, linear regression models were used to predict the nose position and the head strain based on whisker dynamics. In doing so, time points belonging to a single behavioral phase of all the trials during a single behavioral session were pooled. The values of the whisker setpoints and whisking amplitudes were used to predict the nose position and the head strain at the same time point. The models were fit using the least-squares approach. A four-fold cross-validation scheme was used: the linear model was trained on 75% of the data, prediction accuracy was tested on the remaining 25% using the coefficient of determination (cross-validated R2, or cvR2), and the mean of the four cvR2 scores was used as the cvR2 value of the given session.
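A minimal sketch of this regression using scikit-learn (the fold assignment is not specified in the text, so a plain shuffled KFold split and ordinary least squares are assumed; variable names are ours):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import KFold

def cv_r2(whisker_features, target, n_splits=4, seed=0):
    """Mean cross-validated R^2: train on 75% of time points, test on
    the remaining 25%, averaged over the four folds."""
    scores = []
    folds = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    for train, test in folds.split(whisker_features):
        model = LinearRegression().fit(whisker_features[train], target[train])
        scores.append(r2_score(target[test], model.predict(whisker_features[test])))
    return float(np.mean(scores))
```

When the target is genuinely a linear function of the whisker features, cvR2 approaches 1; for an unrelated target it falls to around zero or below, which is what makes it a useful measure of coordination.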
To estimate trial-to-trial variability, we used 9 features of behavior tracked in 3 behavioral epochs (trigger / lick / end-of-phase) to generate a 27-dimensional vector. The 9 features consisted of: (a) the median whisking offsets of the four tracked whiskers, (b) the median whisking amplitudes of the four tracked whiskers, and (c) the median nose position. Trial-to-trial variability was mapped in two ways. The t-SNE method (van der Maaten & Hinton, 2008; https://jmlr.org/papers/v9/vandermaaten08a.html) was used to visualize the distribution of individual trials in a 2-dimensional space, whereas Euclidean distances were used to quantify trial-to-trial differences. Before computing distances, all 27 features were standardized as Z scores, i.e. their means were subtracted and the result was divided by their standard deviations. The Euclidean distance in the 27-dimensional space of Z scores was used as the distance metric between trials.
Euclidean trial-to-trial distances were used to estimate the diameters of groups (single animals and single sessions) and inter-group distances. The diameter of the group of trials belonging to a single animal or session was determined as the mean distance between two randomly chosen trials from the group. The inter-group distance was determined as the mean distance between randomly chosen pairs of trials, one trial drawn from each of the two groups. Because it was computationally demanding to compute distances across all pairs of the >4000 trials, we randomly sampled 100,000 pairs of trials and computed their inter-trial distances.
To examine the degree of clustering according to groups of trials, we devised the aggregation index (AI): the ratio of the mean inter-group distance (i.e. the mean trial-to-trial distance between different groups) to the mean group diameter (i.e. the mean trial-to-trial distance within a single group).
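The distance sampling and aggregation index can be reconstructed as follows. This is a sketch under stated assumptions: the pair-sampling scheme is simplified to a single pooled sample, the features are assumed to be z-scored already, and the function name is ours.

```python
import numpy as np

def aggregation_index(features, labels, n_pairs=100_000, seed=0):
    """Ratio of the mean between-group distance to the mean within-group
    distance, estimated from randomly sampled trial pairs.

    features : (n_trials, 27) array of z-scored behavioral features
    labels   : (n_trials,) group label of each trial (animal or session)
    """
    rng = np.random.default_rng(seed)
    i = rng.integers(0, len(features), n_pairs)
    j = rng.integers(0, len(features), n_pairs)
    keep = i != j                          # discard self-pairs
    i, j = i[keep], j[keep]
    d = np.linalg.norm(features[i] - features[j], axis=1)
    same = labels[i] == labels[j]
    return d[~same].mean() / d[same].mean()
```

An AI well above 1 indicates that trials cluster by group; an AI near 1 indicates no grouping structure, which is the behavior recovered after label shuffling.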
To relate trial-to-trial variability to the position of the contact sensor, we computed 2-D coordinates of each behavioral session in the t-SNE space as the median of all the trials belonging to the session. A linear regression model was trained using the least-squares approach to predict the sensor position during each session based on the 2-D coordinates of the session.
To examine the effects of optogenetics on trial-to-trial variability, we used trial-to-trial distances in the 2-D t-SNE coordinate space. For each session, trial-to-trial distances were normalized relative to the median distance of all the possible pairs of trials.
Sample size and statistics
For the analysis of whisking setpoints and frequency power, we annotated 48 behavioral sessions of 7 animals in total. Comparisons were made using Wilcoxon’s signed-rank test.
For the analysis of whisker and nose movements, two criteria were used to select sessions: there had to be more than 10 trials in a session and success rate in each session had to be more than 50%. As a result, we used 85 sessions from 16 animals in total. In one behavioral session nose positions were not tracked and this data was excluded from the analysis.
For each trial, behavioral phases were used only when the whole duration of the phase was recorded in the 5-second video. Unless otherwise noted, comparisons were made using the Kruskal–Wallis test, followed by Dunn's post hoc pairwise tests with Bonferroni correction. The analysis of trial-to-trial variability was based on three epochs: the trigger onset, licking and end-of-trial phases. The data related to these epochs could be easily extracted from the video. To test statistically whether a given aggregation index was significantly higher than chance level, we prepared 500 different datasets in which the assignment of each trial to its group (animal or session) was shuffled. All resulting aggregation indices were expressed as deviations from the median value of the shuffled data, and the p-value was computed from where the index of the true data lay relative to the shuffled indices.
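The rank-based shuffle test can be sketched as follows; this is our paraphrase of the procedure (deviations measured from the shuffle median, with an add-one correction so p is never exactly zero), and the function name is ours:

```python
import numpy as np

def shuffle_p_value(true_value, shuffled_values):
    """P-value from where the true statistic lies among shuffled statistics,
    measured as the absolute deviation from the shuffle median."""
    shuffled = np.asarray(shuffled_values, dtype=float)
    dev = np.abs(shuffled - np.median(shuffled))
    true_dev = abs(true_value - np.median(shuffled))
    # add-one correction: count the true value among the shuffles
    return (np.sum(dev >= true_dev) + 1) / (len(shuffled) + 1)
```

With 500 shuffles, the smallest attainable p-value is 1/501, so an aggregation index far outside the shuffle distribution is reported as p < 0.01 rather than p = 0.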
To examine the contribution of different groups of parameters to trial-to-trial variability, we prepared 500 datasets (per group) where the given group of parameters were shuffled. The resulting reduction in the aggregation index (∆AI) was considered to be the contribution of the shuffled parameters to trial-to-trial variability. Comparisons between shuffled data were performed using Kruskal–Wallis test, followed by Dunn’s post-hoc pairwise tests with Bonferroni correction.
To examine the statistical significance of the correlation between trial-to-trial variability and the contact sensor position, the R2 value of the linear regression model was used. Apart from the model based on the actual dataset (N=84 sessions), 3000 models based on random datasets were prepared, and the correspondence between the shuffled 2-D session coordinates and the sensor position was examined. Statistical testing was based on comparing the R2 value of the model of the actual data with those of the 3000 shuffled models; the R2 values were sorted by their distance from the median value of the shuffled models, and the p-value for the true dataset was computed from the percentile at which its R2 value was located.
To statistically examine the effect of optogenetic stimulation, Mann–Whitney's U-test was performed between normalized trial-to-trial distances of non-stimulated and stimulated trials. We used N=26 sessions from 7 animals (no-ChR2 condition) and 9 sessions from 2 animals (Vgat-ChR2 condition).
Funding
European Union’s Horizon 2020 research and innovation program and Euratom research and training program 2014-2018 (under grant agreement No. 670118 to MEL); Deutsche Forschungsgemeinschaft (Exc 257 NeuroCure, Grant No. LA 3442/3-1 & Grant No. LA, Project number 327654276 SFB1315); European Union Horizon 2020 Research and Innovation Program (72070/HBP SGA1, 785907/HBP SGA2, 785907/HBP SGA3, 670118/ERC Active Cortex).
Author contributions
RS and KS planned and designed the analysis and experiments and wrote the manuscript. MS, NB, and SD performed the experiments. MS, KS, NB analyzed the data. MEL designed experiments and wrote the manuscript.
Conflict of interests
The authors declare no competing financial interests.
Acknowledgements
We would like to thank the Charité workshop, in particular Jan-Erik Ode, Alexander Schill and Daniel Deblitz, for machine and electronics work.
Footnotes
https://gin.g-node.org/Staab_TouchTask/Figures/src/master/reports/Figure1/Figure1.png
https://gin.g-node.org/Staab_TouchTask/Figures/src/master/reports/Figure1/Figure1.svg
https://gin.g-node.org/Staab_TouchTask/Figures/src/master/reports/Figure2/Figure2.png
https://gin.g-node.org/Staab_TouchTask/Figures/src/master/reports/Figure2/Figure2.svg
https://gin.g-node.org/Staab_TouchTask/Figures/src/master/reports/Figure3/Figure3.png
https://gin.g-node.org/Staab_TouchTask/Figures/src/master/reports/Figure3/Figure3.svg
https://gin.g-node.org/Staab_TouchTask/Figures/src/master/reports/Figure4/Figure4.png
https://gin.g-node.org/Staab_TouchTask/Figures/src/master/reports/Figure4/Figure4.svg
https://gin.g-node.org/Staab_TouchTask/Figures/src/master/reports/Figure5/Figure5.png
https://gin.g-node.org/Staab_TouchTask/Figures/src/master/reports/Figure5/Figure5.svg