A high-speed, modular display system for diverse neuroscience applications

Visual stimulation of animals in the laboratory is a powerful technique for studying sensory control of complex behaviors. Since commercial displays are optimized for human vision, we established a novel display system based on custom-built modular LED panels that provides millisecond refresh, precise synchronization, customizable color combinations, and varied display configurations. This system simplifies challenging experiments. With variants of this display, we probed the speed limits of motion vision and examined the role of color vision in behavioral experiments with tethered flying Drosophila. Using 2-photon calcium imaging, we comprehensively mapped the tuning of visual projection neurons across the fly’s field of view. Finally, using real-time behavior analysis, we developed low-latency interactive virtual environments and found that flying flies can independently control their navigation along two dimensions. This display system uniquely addresses most technical challenges of small animal vision experiments and is thoroughly documented for replicability.

The modular LED display system described previously13 (referred to as "G3" for 3rd generation) has received many incremental updates to improve its functionality, although it is based on a ~15-year-old design. We have developed and validated the "G4" (4th generation) modular display system that improves on all aspects of the prior design. G4 incorporates modern technological advances and a simplified, yet more integrated, experimental control system. In this paper, we describe the hardware architecture and design strategy of the modular G4 system (Figure 1) that enables full-display brightness control at 1 kHz refresh rates and precise synchronization with external equipment. We then introduce new software tools that we developed to simplify the use of a G4 display for end users (Figure 2). We summarize our thorough validation of the system, first using technical benchmarks, and then using different G4 displays in a series of experiments.

The G4 modular display system

The G4 display system (Figure 1A) is composed of LED panels arranged in columns that are grouped to form arenas of different shapes, such as flat walls or cylinders. These arenas are quite compact and can therefore be integrated into a variety of experimental setups (Figure 1B).

Each panel is a 'sandwich' of a driver board housing the LEDs and a communication board that connects to other panels. G4 has ~2.5 times the pixel density of the G3 LED modules13 (16×16 LEDs over 40×40 mm vs. 8×8 LEDs over 32×32 mm), yielding higher-resolution visual arenas. Faster data transmission through SPI interfaces replaces the I2C used in the prior G3 design. The G4 display is controlled by custom software and an FPGA that drives 12 parallel SPI busses, achieving sustained per-panel data rates that are >20× faster than G3, supporting display refresh rates of 500 or 1000 Hz.
By using surface-mount LEDs of widely available standard sizes, G4 panels can be customized with a variety, and even combinations, of LEDs.

A typical 9 column × 4 row, 36-panel G4 arena reliably operates at a 500 Hz refresh rate while streaming visual patterns with 16 brightness levels, and at a 1 kHz refresh rate for patterns containing 2 brightness levels. In our display system the refresh rate describes the number of unique frames shown per second (Figure 1C). Fine-scale control of brightness is achieved by modulating the LEDs' duty cycle within a frame through the 'stretch' value (Figure 1C). The controller streams patterns directly to the arena, eliminating the need for intermediate buffers, and serves as a flexible data acquisition (DAQ) system, logging up to 8 analog inputs at user-specified sample rates (up to 200 kHz). Additionally, the controller implements up to 8 analog outputs that are precisely synchronized with display refresh (Figure 1D).
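The duty-cycle logic behind the 'stretch' value can be sketched as follows. This is an illustrative Python stand-in, not the G4 firmware: the fractional encoding of stretch and the assumption that brightness scales linearly with on-time are simplifications.

```python
def led_on_time_us(stretch_frac, refresh_hz=500):
    """Approximate LED on-time within one display frame.

    Brightness is assumed to scale with duty cycle: the 'stretch' value
    (here a fraction of the frame) sets how long the LED is lit within
    the 1/refresh_hz frame window. The actual G4 stretch resolution and
    encoding are not specified in this section.
    """
    frame_us = 1e6 / refresh_hz  # frame duration in microseconds
    return frame_us * stretch_frac

# At 500 Hz a frame lasts 2000 µs; half brightness lights the LED for 1000 µs.
print(led_on_time_us(0.5))
```

Because the stretch modulation happens within a frame, brightness control does not reduce the number of unique frames shown per second, consistent with the refresh-rate definition above.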

To allow researchers to rapidly use the G4 system without directly programming lower-level components, we developed graphical user interfaces and high-level software tools in MATLAB.

The G4 display's combination of fine brightness control, fast refresh rates, and precise synchronization between the display and input/output channels, all controlled through high-level software, enables a range of experiments not previously possible. In each of the following sections, we introduce a biological question about Drosophila vision along with the technical challenges required to address it. We then describe our experiments that use unique features of the G4 display system to address these open questions.
The optomotor response of flies, in which the animals orient their head and/or body to compensate for large-field visual motion, has been a critical behavioral tool for understanding the computational properties of insect motion vision4,28,29 and has recently been used to establish specific roles for individual visual neurons30-33. However, many substantial issues remain unresolved, such as whether the speed tuning of visual system neurons can explain the speed tuning of optomotor behavior34,35 and whether all axes of visual motion are identically speed tuned. In prior studies, the limited speed of the display prevented experiments testing behavioral responses to high-velocity motion31,36, and consequently the maximum stimulus speeds that flying flies can perceive are not well established.

Using the G4 display's 1 kHz refresh rate, we presented grating stimuli moving over a wide range of speeds (>1000°/s) with corresponding temporal frequencies (the frequency of alternating bright/dark bars in a drifting grating) of up to 62.5 Hz (Figure 3A), which approaches the flicker fusion frequency of Drosophila24. Since the display's refresh rate is independent of the pattern's complexity (another improvement over G3), we examined responses to motion stimuli simulating rotational and translational motion around and along the 3 cardinal axes (6 flow fields, 2 directions each; Figure 3B). We presented these patterns in a randomized order to thorax-tethered, flying flies while recording the amplitude and frequency of their wingbeats.

In response to bilaterally asymmetric stimuli (yaw and roll rotations, lateral slip translation), flies mainly turned (quantified as the difference between left and right wingbeat amplitudes, 'L-R' in Figure 3B,C), while responses to bilaterally symmetric stimuli (pitch rotation and lift translation) were primarily changes in the symmetric sum of wingbeat amplitudes ('L+R') as well as the wingbeat frequency. These results are largely in agreement with responses to impulsive visual motion along the 6 directions37, although noteworthy differences in the relative amplitude of the responses were found: the impulsive stimuli resulted in the largest turns to slip and smaller yaw and roll responses, while our stimuli resulted in steady-state response levels (Figure 3A) that were similar for these 3 turn-inducing motion patterns (Figure 3B). These results clearly show that different visual motion patterns result in distinct responses and that flying flies steer by changing different combinations of wing stroke sum and difference (Figure 3C and Movie S1).
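The relationship between grating speed and temporal frequency used above reduces to a single division. A minimal sketch (the 60° spatial wavelength follows the methods; the 480°/s speed is an illustrative value):

```python
def temporal_frequency_hz(speed_deg_per_s, spatial_wavelength_deg):
    """Temporal frequency of a drifting grating: the number of bright/dark
    cycles passing a fixed point on the display per second."""
    return speed_deg_per_s / spatial_wavelength_deg

# A 60° wavelength grating drifting at 480°/s alternates at 8 Hz,
# near the behavioral optimum reported in the text.
print(temporal_frequency_hz(480, 60))  # → 8.0
```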
Importantly, we have (for the first time) recorded the behavioral reactions of flying Drosophila to different visual motion stimuli over a speed range that includes stimuli faster than the flies' ability to detect motion (Figure 3C). By comparing the speed tuning curves for the 5 (of 6) motion types that produced large behavioral reactions (thrust reactions were far more subtle; Figure S1), we find that temporal frequency tuning to different visual stimuli is rather consistent (Figure 3C, right), peaking at ~8 Hz and with a remarkably similar roll-off in response to higher motion speeds. These findings are largely in agreement with the peak tuning of the yaw and thrust behaviors of walking flies38, although we did not vary spatial frequency as was done in that study, and walking flies appear to be much more sensitive to thrust motion than we found in our flight measurements. The diversity of response levels to slower stimuli may reflect gain differences or contributions from visual pathways with distinct tuning. Nevertheless, the similar temporal frequency optimum for flight reactions to different patterns of visual motion provides strong support for the view that a limited set of neurons (presumably the directionally selective T4 and T5 cells) sets the tuning properties of diverse behavioral responses.

The receptive fields of Visual Projection Neurons (VPNs) can be mapped using a small moving stimulus at many points around the animal12,39. However, not all neurons are sufficiently driven by these hyper-local stimuli, and more complex combinations of input stimuli cannot be tested with this approach. We designed a motorized G4 display that overcomes these limitations by presenting larger, more flexible visual stimuli across the field of view, while working within the highly constrained space of a typical 2-photon imaging microscope.
Applying this technique to the recently described, regressive-motion-selective cell type LPC140,41, we confirmed that regressive (i.e. back-to-front) motion presented to the eye ipsilateral to the imaged LPC1 glomerulus resulted in strong LPC1 calcium responses (Figure 4D,E top), while motion on the contralateral eye did not (Figure 4F, blue). Additionally, when a regressive (i.e. activating) motion stimulus on the ipsilateral eye was simultaneously paired with a 2nd motion stimulus in a different location, we found that progressive (i.e. front-to-back) motion on the contralateral eye consistently reduced LPC1 activation (Figure 4E, bottom). By mapping the activating and modulating motion directions in multiple visual field locations on the ipsilateral eye, contralateral eye, and binocular overlap region (Figure 4F, individual fly responses in Figure S3), we found the LPC1 population to be highly selective not just to regressive ipsilateral motion, but to the global pattern of optic flow produced by back-to-front translational motion (Figure 4G). This method, relying on a motorized G4 display and a low-occlusion head mount, can be used to map the excitatory and inhibitory directional receptive fields of any visual projection neurons (or their targets in the central brain) over a wide field of view.

Commercial displays designed for humans are inadequate for studying color vision in animals that see wavelengths outside of the human visible spectrum, notably into the UV. G4 display panels are designed for standard surface-mount LEDs (0603 package), so panels can be assembled with different LED combinations. We built an arena using 2-color panels with a checkerboard arrangement of short-wavelength UV (365 nm) and green (570 nm) LEDs to explore the relationship between color and motion vision in Drosophila (Figure 5A).
This combination of LEDs is also well suited to mouse and zebrafish spectral sensitivities21-23. Since LEDs provide uniform illumination for each color channel and from each panel, we arranged the panels into experimentally convenient arenas (e.g. a hemi-cylinder) without the need to compensate for the significant differential scattering of short vs. long illumination wavelengths that complicates the implementation of projected displays44.

Using our new 2-color display, we tested the contribution of UV and green wavelengths to the perception of visual motion by asking whether they can be balanced at a point of isoluminance, at which the relative motion of objects rendered with different wavelengths cannot be detected. Pixelwise intensity control makes the G4 system ideal for asking whether intensity combinations exist where the opposing motion of UV and green patterns exhibits isoluminance. We found that for bright green and dim UV combinations, flies turn in the direction of the green pattern motion; similarly, for bright UV and dim green they turn in the direction of the UV pattern motion (Figure 5C). However, we could not find an isoluminance point with these modest steps in relative intensity between the green and UV patterns tested. To see whether we could reliably establish isoluminant conditions, we used "stretch" modulation of a UV-only and a green-only grating pattern displayed on alternating refresh cycles (halving the effective refresh rate of this 2-brightness-level pattern to 500 Hz). While holding the green brightness constant across trials, we adjusted the UV brightness in <1% increments. With this fine intensity control we found that UV/green isoluminance is indeed a reliable feature of the D. melanogaster optomotor visual response (Figure 5D).

Tethered flying flies reliably fixated a dark stripe in closed loop (Figure 6B). Surprisingly, this behavior was unaffected by substantially increased I/O latency.
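The fine-grained UV brightness sweep described above can be sketched as follows. The step mapping (a 0-255 range, so one step is ~0.4% of full brightness) is an assumption for illustration; the G4 hardware's actual stretch resolution may differ.

```python
def uv_sweep(max_stretch=255, step=1):
    """Generate candidate UV brightness levels for an isoluminance search.

    Green brightness is held constant across trials while UV brightness is
    stepped in increments of <1% of the full range, as in the text. The
    0..255 encoding here is hypothetical.
    """
    return [uv / max_stretch for uv in range(0, max_stretch + 1, step)]

levels = uv_sweep()
print(len(levels))       # 256 candidate UV intensity levels
print(levels[1] < 0.01)  # True: each step is <1% of full brightness
```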
Flies could reliably orient towards the bar when its position updated at 50 Hz or 33 Hz, with correspondingly longer delays (an additional 10 ms and 20 ms) between the animal's measured behavior and the display's update. However, the orienting behavior noticeably degraded with 25 Hz updates, only modest tracking was seen with 15 Hz updates, and no tracking at all was seen with 7.5 Hz updates (Figure 6B). For stripe fixation behavior in flying flies, we find that a closed-loop update rate of 33 Hz (~30 ms I/O latency) or faster is compatible with stable object tracking, while longer delays apparently break the 'realism' of this virtual reality experiment. Flight behavior is notoriously fast, and we expect that more complex tasks, such as those defined not by the position of features but by their motion (such as optic flow), may require lower closed-loop latency. Even these requirements should be well within the capability of the G4 system, which we demonstrated is more than fast enough to accommodate VR experiments in Drosophila.
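The added delays quoted above follow directly from the update period. A small sketch, assuming the baseline is the 100 Hz closed-loop streaming rate mentioned in the display-modes description:

```python
def added_latency_ms(update_hz, baseline_hz=100):
    """Extra input-to-display delay from slowing the closed-loop update
    rate, relative to a baseline streaming rate (100 Hz per the text:
    a 10 ms update period)."""
    return 1000.0 / update_hz - 1000.0 / baseline_hz

print(added_latency_ms(50))   # 10.0 ms extra, matching the text
print(added_latency_ms(33.3)) # ≈ 20 ms extra
```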

Using the G4's ability to flexibly render scenes and record multiple channels of behavioral responses in near real-time, we set out to create more complex, naturalistic closed-loop experiments by extending the stripe fixation closed-loop task to 2 dimensions. The environment (generated using 16 brightness levels) was composed of a dark vertical stripe over a green background consisting of a bright sky and a slightly darker ground, which are typical characteristics of sunlit open woodlands51. Tethered, flying flies controlled the azimuthal rotational velocity through L-R differences (Figure 6C, top). Building on this, we then allowed flies to control the pitch position of the pattern, allowing the horizon to drift up and down through changes to L+R, which we showed is related to pitch reactions (Figure 3B) and is known to contribute to body pitch control in flight52. Flies flying in this virtual environment controlled their position in both dimensions: they mainly oriented towards the stripe, while also holding a relatively constant pitch angle, with a preferred position for the horizon (Figure 6C, middle). We further enhanced this closed-loop simulation by replacing the green-colored 'sky' with UV of a similar apparent brightness (informed by the isoluminance experiment, Figure 5). Under this UV-sky condition, stripe fixation was reduced and the pitch angle was much less stable, with flies often pitching fully downward (Figure 6C, bottom). Across all 3 conditions, the flies' motor output was similar, except that the 'UV sky' condition resulted in strong reductions in flight vigor (measured by L+R, Figure 6D), suggesting perhaps that the sky is too bright.
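One closed-loop step of this 2D task can be sketched as a simple update rule: the L-R signal drives azimuthal rotation and deviations of L+R from a resting level drive pitch. All gains and rest values below are hypothetical; the paper does not report the exact mapping.

```python
def update_viewpoint(yaw_deg, pitch_deg, l_amp, r_amp, dt,
                     yaw_gain=-1.0, pitch_gain=0.5,
                     lr_rest=0.0, lpr_rest=3.0):
    """One update of the 2D virtual environment sketched in the text.

    L-R (wingbeat amplitude difference) sets azimuthal rotational
    velocity; L+R (the sum) moves the rendered horizon up and down.
    Gains, signs, and rest values are illustrative assumptions.
    """
    yaw_deg += yaw_gain * ((l_amp - r_amp) - lr_rest) * dt
    pitch_deg += pitch_gain * ((l_amp + r_amp) - lpr_rest) * dt
    return yaw_deg % 360.0, pitch_deg  # azimuth wraps around the arena
```

With symmetric wingbeats (L-R = 0) the azimuth holds steady, while a weak L+R (reduced flight vigor, as in the 'UV sky' condition) would drive the horizon in one pitch direction.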
By pre-rendering the individual elements of these complex scenes (sky, stripe, horizon, and ground) and then arranging and coloring them in real time according to the experimental condition and the animal's estimated heading in the virtual environment, frame rendering times are <1 ms, maintaining the system at near-minimum total latency (~17 ms) and demonstrating the G4 system's flexibility in powering these elaborate, interactive VR experiments.

In open-loop modes, the display presents frames in a pre-set order stored as a "position function" (see "Display system software" section). In closed-loop modes, an external signal acquired as an analog input determines which frames of a pattern are presented (after some scaling) in the following display updates (see Figures 5 and 6 for an example of closed-loop landmark fixation).

The G4 display supports several display modes:

Position function: uses a position function (an array of frame indices) to determine which frame of a pattern is displayed at each refresh cycle of the display.
Constant frame rate: cycles through the pattern in order at a constant frame rate.
Static frame: shows only a single frame from the pattern.
Closed-loop sets frame rate: cycles through pattern frames at a rate set by the ADC0 analog input voltage (modified by a gain and offset), with negative values cycling from higher to lower frame indices.
Closed-loop + position function: combines "closed-loop sets frame rate" with the pattern frame position set by a position function.
Closed-loop sets frame index: displays pattern frames by scaling the analog input voltage to the pattern frame indices.
Closed-loop streaming: displays frames streamed from the high-level control software over the TCP/IP connection as fast as new frames can be delivered, up to 100 Hz, while sending 4 analog inputs back at the same rate to be used by the control software to generate new frames.

Commands from the high-level software are received by the host application via TCP/IP and passed on to the FPGA. The host application also provides a simple GUI to interact with the system, which internally sends the same TCP/IP commands specified in the protocol. In the next section we describe a user-friendly, MATLAB-based high-level method to interact with the host application.
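The "closed-loop sets frame index" mode above amounts to a scaled voltage-to-index mapping. A minimal sketch (the gain, offset, and frame-count values are hypothetical, and a real implementation would work in fixed point on the FPGA):

```python
def frame_index_from_voltage(v, gain=36.0, offset=0.0, n_frames=192):
    """Scale an analog input voltage (e.g. the wingbeat analyzer's L-R
    signal) to a pattern frame index, wrapping around a cyclic pattern
    such as a stripe circling a cylindrical arena. Values illustrative."""
    idx = int(round(v * gain + offset))
    return idx % n_frames  # wrap so the stripe can rotate continuously

print(frame_index_from_voltage(0.0))   # centered input selects frame 0
print(frame_index_from_voltage(-1.0))  # negative voltages wrap around
```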

High-level display system software

For increased user-friendliness, the TCP/IP protocol commands of the host application, which ultimately controls the FPGA and the SPI commands sent to the panels, are accessible via high-level MATLAB control software. MATLAB can be used to design, render, and save patterns prior to an experiment and to handle the logic of the experimental structure. In addition, MATLAB can access analog input data during the experiment (down-sampled to 100 Hz), for example to plot data during an experiment or to create custom closed-loop streaming experiments (see Figure 6).

The software tools developed for the G4 display system are organized to follow the typical process of creating visual patterns ("G4 Pattern Generator"), defining their temporal arrangement ("G4 Function Generator"), defining an experimental protocol ("G4 Protocol Designer"), running experiments ("G4 Experiment Conductor"), and processing logged data (Figure 2). We note that these tools can also be used independently to simplify specific tasks in cases where an advanced user wishes to further customize their experiments.

The "G4 Pattern Generator" determines what a pattern (either a single frame or a multi-frame sequence) will look like, based on user-defined parameters. This GUI tool allows for rapid visualization and generation of various commonly used patterns such as lines, edges, gratings, and star fields, parameterized for example by size, brightness, and movement direction. Using a customizable 3D coordinate-based pixel map of the arena, patterns can be generated that simulate rotation around or translation along a fixed point within the arena. The resulting patterns are presented either on the entire display (e.g. see Figure 3) or in restricted spatial locations within the simulated visual field (e.g. see Figure 4). To extend the functionality of the G4 Pattern Generator, users can modify, combine, or create new patterns from scratch using scripts.
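The output of such a pattern generator is ultimately an array of per-LED brightness levels per frame. A minimal Python sketch for a vertical square-wave grating (the grid size and wavelength here are illustrative, not the G4 Pattern Generator's actual defaults, and the real tool works on a 3D arena pixel map rather than a flat grid):

```python
def grating_frame(n_cols=144, n_rows=64, wavelength_px=24, phase_px=0,
                  levels=16):
    """Render one frame of a vertical square-wave grating as a 2D array
    of brightness levels (0 to levels-1). Shifting phase_px frame by
    frame produces apparent horizontal motion."""
    row = [(levels - 1)
           if ((x + phase_px) % wavelength_px) < wavelength_px // 2
           else 0
           for x in range(n_cols)]
    return [row[:] for _ in range(n_rows)]  # identical rows: vertical bars

frame = grating_frame()
print(len(frame), len(frame[0]))  # 64 rows × 144 columns of LEDs
```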

The "G4 Function Generator" can be used to specify when and how long each frame of a pattern is shown on the display. These "Position Functions" set the temporal "position" of each pattern frame relative to the display's refresh rate; when using 16-brightness-level patterns at a 500 Hz refresh rate, frame positions are set every 2 ms, while for patterns with two brightness levels at a 1 kHz refresh rate they are set every 1 ms. Position Functions determine the order in which frames of a pattern are displayed, as well as the parameters defining pattern motion (e.g. constant-speed drifts or sinusoidal motion). Additionally, the "G4 Function Generator" tool can generate Analog Output (AO) Functions: just as Position Functions control the displayed frame for every refresh cycle, AO Functions control the voltages of 4 (extendible to 8) analog output channels for every refresh cycle, allowing precise synchronization of external equipment.
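A Position Function is simply one frame index per refresh cycle. A sketch of generating sinusoidal pattern motion this way (parameter values are illustrative, not Function Generator defaults):

```python
import math

def sinusoidal_position_function(n_frames, refresh_hz=500, freq_hz=1.0,
                                 duration_s=2.0):
    """Build a Position Function: one pattern-frame index per display
    refresh cycle, tracing back-and-forth sinusoidal motion across the
    pattern's frames."""
    n_updates = int(duration_s * refresh_hz)
    return [int((n_frames - 1) * 0.5 *
                (1 + math.sin(2 * math.pi * freq_hz * t / refresh_hz)))
            for t in range(n_updates)]

pos = sinusoidal_position_function(n_frames=24)
print(len(pos))  # 1000 updates: one per 2 ms frame at 500 Hz over 2 s
```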

The "G4 Protocol Designer" is used to define the order in which the previously generated patterns are sent to the display as part of an experiment protocol. A "G4 protocol" can be as simple as an ordered list of patterns to be streamed to the display. A more sophisticated protocol may contain associated Position or AO Functions, queueing patterns in randomized repeating blocks, with different display modes, separated by delay periods, custom inter-trial patterns, and closed-loop trials. Each trial can be previewed to verify that the selected pattern is correctly presented, either in a simulated display on the PC screen or on the G4 arena. When saving an experiment, the "G4 Protocol Designer" generates the information and folder structure accessible by the host application and FPGA (via DMA) that are necessary to run the experiment. The actual experiments can be initiated through another GUI application, the "G4 Experiment Conductor".

The "G4 Experiment Conductor" is a GUI for running experiments. It asks the user for additional metadata about the current instance of the experiment, such as the experimenter's username, timestamp, and details of the experimental animal. After the experiment starts, a progress bar provides feedback. The Conductor can also process and plot streamed analog input data once individual trials are completed, providing an additional quality-control mechanism by visualizing the collected data as it is being logged.

When an experiment has completed, the acquired data, including timestamped frame positions and analog input channel data, are logged as .tdms files (Technical Data Management Streaming, a National Instruments file format). We provide additional MATLAB data-processing scripts to load these files into the MATLAB workspace, convert the data to the .mat file format, and plot basic analyses in several standard ways (e.g., time series, histograms, and trial averages). If the desired plot formats are already known during protocol design, the parameters for processing and plotting data can be pre-determined from within the G4 Protocol Designer tool and executed immediately after an experiment has completed. This standardization greatly simplifies organizing experiments across setups and experimenters.

506
System Validation

Operation and performance of the G4 system were assessed using our typically sized 9 column, 4 row cylindrical arena, using standard rotational grating and single-stripe patterns (e.g. Figure 5B). Display brightness and LED spectra (Figures 1C, 5A) were measured using a spectrometer (Ocean Optics USB4000) with a fiber-optic cable positioned over a single LED in the center of the cylindrical arena. The timing of display refresh cycles and analog outputs relative to the internally logged pattern positions (Figure 1D) was calculated from the measurements of a fast photodetector (Thorlabs PDA10) logged as an analog input channel, with an analog output wired back into the G4 system on a second analog input channel. Closed-loop latency was measured using an external function generator to deliver TTL pulses into a G4 analog input to switch pattern frames; the delay between the logged input and photodetector changes was measured.

Experimental Animals

Flies were reared under standard conditions: 25°C, 60% humidity, 16 h light / 8 h dark, cornmeal agar diet. Wild-type flies were from a strain originating in Dickinson's lab, from which the Reiser lab established a copy at Janelia in 2007. This strain has been used in dozens of behavioral studies and has been referred to as 'DL'14. For calcium imaging experiments (Figure 4), flies expressing GCaMP6m in LPC1 cells were used40. Cell-type-specific expression was achieved using the Split-GAL4/UAS expression control system53.

Preparation for tethered flight

Flies were prepared for flying behavior experiments as previously detailed13,54 and summarized here: cold-anesthetized flies were glued by the thorax to a tungsten rod (catalog #71600; A-M Systems) with UV-curing glue (KOA 300-1; Kemxert, Poly-Lite, York, PA, USA) and were given at least 30 minutes of recovery time at room temperature and brightness before testing. All experiments were completed within 5 hours of tethering.

Preparation for in vivo calcium imaging

Flies were prepared for 2-photon imaging experiments similarly to previously described experiments42 with some modifications, summarized here: cold-anesthetized flies were tethered to a fine tungsten wire using UV-curing glue. The two most anterior legs (T1) were severed and sealed with glue to prevent the fly from grooming and obstructing its visual field. The tethered fly was positioned at the opening of a custom-machined PEEK plastic conical mount to allow access to the back of the head for dissection and imaging. The back of the head was covered with fly saline (103 mM NaCl, 3 mM KCl, 1.5 mM CaCl2, 4 mM MgCl2, 26 mM NaHCO3, 1 mM NaH2PO4, 8 mM trehalose, 10 mM glucose, 5 mM TES), and a hole in the cuticle was cut using a fine tungsten needle and holder (Fine Science Tools #10130-05, #26018-17) to expose the PLP region of the brain. Muscles 1 and 1655 were severed to reduce motion of the brain within the head capsule, and excess fat was removed from the surface of the brain.

Visual Stimuli

Visual stimuli were presented to tethered flies using two configurations of the G4 display system. A diffusing screen was used to prevent reflections from the filter. All patterns displayed on the cylindrical LED arenas were generated using the G4 Pattern Generator tools for parameterizing and visualizing moving grating patterns for this display system.
Displaying these patterns at multiple speeds/temporal frequencies was accomplished by changing the rate at which the frames of the patterns are cycled through, defined by Position Functions.

All grating stimuli consisted of square-wave gratings with 60° spatial wavelength and 100% contrast; other parameters of the grating patterns (e.g. temporal frequency and color) are as described in the text. For tethered flight optomotor experiments (Figures 3 and 5B), a single trial of open-loop grating stimuli began with a 2 second pre-trial of a closed-loop condition with a dark vertical bar (30° width) on a green background, called "stripe fixation". This was followed by a 2 second trial of the moving grating stimulus, followed by another 2 second post-trial of stripe fixation. The stripe fixation pre- and post-trial segments were interspersed between all moving grating trials to maintain robust flying behavior for the duration of the experiment. For UV-green competing grating stimuli, the closed-loop stripe fixation inter-trial only used the green LEDs for the stripe pattern. For calcium imaging experiments (Figure 4), a single trial consisted of a 2 second pre-trial of a uniform medium-brightness frame, followed by a 1 second trial of moving gratings, followed by another 2 second post-trial of the uniform frame, after which the arena was turned dark for 4 seconds to allow the arena to pitch to its new position. The increased time between trials in calcium imaging experiments relative to behavioral experiments allowed more time for the GCaMP6m fluorescence signal to return to baseline levels.
For stripe fixation and VR behavioral experiments (Figures 5B and 6), each trial consisted of 15 seconds of closed-loop stripe fixation beginning with the stripe in a random orientation around the fly, after which a 5 second inter-trial of stripe fixation took place at default pattern settings (green background), beginning with the stripe directly in front of the fly. All trials of visual stimuli to be presented to a fly were pre-generated, their order of presentation was randomly permuted, and every trial was repeated 3 times in a new permuted order (random block trial structure), after which a mean response for each experimental condition was calculated for each fly individually by averaging the 3 repetitions. These experiments were implemented with custom scripts, and these protocols formed the basis for the standardized functionality that is now supported by the new software tools described in the section "High-level display system software."
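The per-fly averaging over the 3 repetitions of each condition can be sketched as follows (a generic per-condition trial average; the data layout is an assumption, not the published analysis code):

```python
def mean_response_per_condition(trials):
    """Average repeated trials per condition.

    trials maps a condition name to a list of equal-length response
    traces (3 repetitions per condition in the text); the result maps
    each condition to its pointwise mean trace.
    """
    return {cond: [sum(samples) / len(samples) for samples in zip(*reps)]
            for cond, reps in trials.items()}

means = mean_response_per_condition(
    {"yaw_cw": [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]})
print(means["yaw_cw"])  # → [3.0, 4.0]
```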

Pitching Arena
A (2/3) cylindrical arena with 230 mm interior diameter and panels in 12 columns (covering 240° of azimuth) and 3 rows was attached on one side to a motorized rotary stage (Zaber X-RSW-E, Zaber Technologies) using a custom mount to pitch the arena around a tethered fly (Figure 4). The rotary stage mount was aligned to the center of the column 2, row 2 LED panel, while a free-rotating mount was aligned to the center of the column 11, row 2 LED panel, creating a stable axis of rotation through the center of the arena (mounts at ±90° to the sides of the fly). Rotation commands were sent to the rotary stage using a MATLAB plugin, allowing the arena pitch and the G4 display system to be controlled within the same script.

Tethered flying equipment and analysis

The methods used for behavioral experiments have been detailed previously8,13 but are summarized here. Tethered flies were positioned in a hovering posture (60° body angle) in the center of the LED arena. Wing motion was measured using a "wingbeat analyzer" that measures the amplitude of the left and right wingbeats with an optical detector of the size of the shadow created by an infrared LED placed above the fly. The relevant behavioral outputs of the wingbeat analyzer are the instantaneous measurements of the wingbeat amplitude of the fly's left and right wings, the instantaneous difference between the left and right wingbeat amplitudes (left minus right, "L-R"), and the frequency of wingbeats. These analog signals were acquired and logged at 1 kHz using the on-board data logging of the FPGA-based reconfigurable I/O PCIe card used as the controller of the G4 display system (see section "Low-level PC control of display system"). The L-R signal was processed separately, either using the on-board fast (1 kHz frame rate) closed-loop mode of the G4 system (Figures 3, 5) or in MATLAB for closed-loop streaming experiments (Figure 6). All flies were positioned such that the wingbeats measured by the wingbeat analyzer and visualized on an oscilloscope were of similar shape and amplitude, and such that the flies were able to fixate a stripe in closed-loop operation, where the L-R signal (a reliable measurement of yaw rotational torque) controlled the apparent rotation of a dark vertical stripe (30° wide) on a bright arena background.

Using the logged left and right wingbeat amplitude data, a time series of "L-R" was calculated by subtracting the right wingbeat amplitude from the left at every timepoint, and a time series of "L+R" was calculated by adding the left and right.
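A minimal sketch of this wingbeat-signal processing, including the per-fly normalization to the 98th-percentile L-R value described in the following paragraph (the plain-Python percentile here stands in for MATLAB's `prctile`, and whether the published analysis took the percentile of signed or absolute L-R is an assumption):

```python
def lr_lpr(left, right):
    """Per-timepoint wingbeat-amplitude difference (L-R) and sum (L+R)."""
    lr = [l - r for l, r in zip(left, right)]
    lpr = [l + r for l, r in zip(left, right)]
    return lr, lpr

def normalize_lr(lr_traces):
    """Scale a fly's mean L-R traces by the 98th percentile of |L-R|
    across all conditions (a robust estimate of the maximum), so values
    span roughly -1 (max leftward turn) to +1 (max rightward turn)."""
    vals = sorted(abs(v) for trace in lr_traces for v in trace)
    p98 = vals[int(0.98 * (len(vals) - 1))] or 1.0  # guard against 0
    return [[v / p98 for v in trace] for trace in lr_traces]
```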
The time-series of L-R, L+R, and wingbeat frequency (WBF) measurements in response to multiple repetitions of each visual stimulus were averaged together to produce a mean response for each fly, to each visual stimulus. The mean time-series of L-R for each fly was then normalized to the 98th percentile (a robust estimator of the maximum) of the L-R measurements across all conditions, which scaled the values to range from approximately -1 to +1, where -1 is a near-maximal leftward (counter-clockwise) yaw turn and +1 is a near-maximal rightward (clockwise) yaw turn. Time-series and summary data are plotted as the mean of all flies ± SEM. In some cases (Figure 3C, Figure 5B,C,D), a further averaging step is taken, in which bilaterally symmetric conditions, e.g. clockwise and counter-clockwise versions of the same stimulus condition, are averaged, with one set of behavioral data also sign-inverted (to preserve the turning direction).

Ultra II) and a 40x objective (Nikon CFI APO 60XW). The excitation power varied between 18-22 mW at the sample. T-series of z-stacks (z-series) were acquired at 128×128×5 pixel resolution (0.446 µm/pixel along the x and y axes; 5 µm/pixel in z) at a rate of 2.02 Hz; the x and y dimensions were scanned with galvo-galvo scanning and the z dimension with piezoelectric scanning. For each experimental condition, the acquisition of 10 z-stacks was triggered to start precisely 2 seconds before the start of the visual stimulus using the on-board analog outputs of the display system: 4 z-stacks were recorded in the 2 seconds prior to the experimental stimulus, 2 during the 1-second stimulus, followed by another 4 after the stimulus had ended.

Z-series data were converted into a t-series by taking a mean z-projection of each z-stack. Forming a t-series by collapsing z-stacks minimizes the effects of brain motion in the z-direction.
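The behavioral time-series processing described above can be sketched as follows. This is an illustrative Python sketch, not the authors' MATLAB code; array shapes, function names, and the use of the absolute value of L-R for the percentile-based normalization are assumptions.

```python
import numpy as np

# Illustrative sketch of the wingbeat time-series analysis: per-timepoint
# L-R and L+R, trial averaging, and normalization of L-R by its 98th
# percentile (a robust estimate of the maximum) so values span roughly
# -1 (leftward turn) to +1 (rightward turn).

def process_wingbeat(left, right):
    """left, right: wingbeat amplitude time-series sampled at 1 kHz."""
    l_minus_r = left - right
    l_plus_r = left + right
    return l_minus_r, l_plus_r

def mean_response(trials):
    """Average repetitions of one stimulus; trials is (n_reps, n_samples)."""
    return np.mean(trials, axis=0)

def normalize_lmr(l_minus_r_all_conditions):
    """Scale L-R by the 98th percentile of its magnitude across conditions
    (taking the absolute value here is an assumption)."""
    scale = np.percentile(np.abs(l_minus_r_all_conditions), 98)
    return l_minus_r_all_conditions / scale

lmr, lpr = process_wingbeat(np.array([2.0, 3.0]), np.array([1.0, 1.0]))
norm = normalize_lmr(lmr)   # values now lie in roughly [-1, +1]
```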
Motion in the x and y directions was corrected using "imregister" in the MATLAB Image Processing Toolbox: each image was first Gaussian filtered (alpha = 3 pixels) to improve registration of noisy images, then registered using the 'multimodal' metric and 'rigid' transformation. Using this motion-stabilized t-series, an ROI was manually selected from the entire LPC1 glomerulus, excluding any additional axon tract that did not overlap with the glomerulus. The mean pixel value within the entire ROI was used as the population response (Figure 4E). The average response to each direction and the effect of modulation by stimulation at a second location are converted to polar coordinates, and the simple vector sum is used to summarize these responses (Figure 4F).
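The vector-sum summary of directional responses can be sketched as follows. This is an illustrative Python sketch (the authors' analysis was in MATLAB); the direction angles and response values in the example are made up for illustration.

```python
import numpy as np

# Illustrative sketch of summarizing directional tuning with a vector sum:
# each motion direction's mean response is treated as a polar vector
# (angle = motion direction, length = response amplitude), and the vectors
# are summed to give a single preferred direction and magnitude.

def vector_sum(directions_deg, responses):
    """Return (angle_deg, magnitude) of the summed response vectors."""
    angles = np.deg2rad(np.asarray(directions_deg, dtype=float))
    responses = np.asarray(responses, dtype=float)
    x = np.sum(responses * np.cos(angles))
    y = np.sum(responses * np.sin(angles))
    return np.rad2deg(np.arctan2(y, x)) % 360.0, np.hypot(x, y)

# Example: a cell responding most strongly to rightward (0 deg) motion
angle, magnitude = vector_sum([0, 90, 180, 270], [1.0, 0.2, 0.1, 0.2])
# the summed vector points near 0 deg with magnitude near 0.9
```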

Visible area in white, occluded area in gray (new design compared to the prior design in Figure S2). Spatial locations targeted for receptive field mapping of the fly's un-occluded visual field (bottom). C) The spatial positions (in Mercator projection) of the pitching LED arena used to stimulate the 9 receptive field locations: north for presenting motion in spots N1-3; equatorial for E1-3; south for S1-3. D) The split-GAL4 line used to target LPC1 neurons.

Baseline response to E1 rightward motion in gray; modulated response by simultaneous E3 motion in magenta.

Example stimulus for E3 right is shown. Data plotted as mean ± SEM, N = 8 flies. The combined vector sum (right)

amplitudes (L+R), and the wingbeat frequency (WBF) in response to front-to-back and back-to-front drifting gratings ("thrust" motion) at 8 temporal frequencies. Each trial began with 2 seconds of stripe fixation (data not shown, see Methods for details), followed by 2 seconds of the specified drifting grating (gray shaded region), followed by another 2 seconds of stripe fixation. Responses are shown as mean ± SEM of N = 21 flies, with reactions to front-to-back thrust in red and back-to-front thrust in blue.

Temporal frequency tuning curve summaries (mean ± SEM during the 2-second stimulus window, N = 21 flies) of the behavioral reactions.

controls the yaw rotational velocity of the environment, while its pitch behavior (measured as L+R) controls the pitch angle of the environment, capped at ±45°. 2D (pitch and yaw) orientation histograms of flying flies in this environment, with separate yaw and pitch histograms shown above and to the right, respectively. Data are combined from N = 13 flies, collected as part of the same series as in Figure 6.

closed-loop stripe-fixation trials. This movie is played back at 50% speed and is composed of two different synchronized videos combined to show the display arena as well as the fly's reactions. The fly is suspended below an infrared LED that casts a shadow onto a pair of detectors below. These shadows are measured to quantify the amplitude of each wing's stroke. Flies respond to right (clockwise) and left (counter-clockwise) yaw rotational motion stimuli by differentially modulating their wingbeat amplitudes (L-R), with little effect on the sum of left and right wingbeat amplitudes (L+R). Conversely, flies respond with large changes in L+R to up and down lift translation motion stimuli, with minimal changes in L-R. During closed-loop stripe fixation, the fly actively orients towards the dark vertical stripe by turning; the L-R signal controls the rotational velocity of the stripe.

environment featuring a UV sky, green ground, dark green horizon, and a single dark vertical stripe. As the fly modulates its wingbeat amplitude difference (L-R) and sum (L+R), the yaw rotational position of the stripe and the pitch position of the horizon change accordingly. (bottom) The 2D histogram of the fly's yaw (x-axis) and pitch (y-axis) position, accumulated from the trial start over the course of the trial.
The center of the histogram represents the [0,0] orientation of the visual scene (when the vertical stripe is straight ahead and the horizon is level). More time spent at any specific orientation results in a brighter color at its associated position in the histogram.
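The two-dimensional closed-loop rule described in the legends above can be sketched as a single update step. This is an illustrative Python sketch, not the authors' MATLAB streaming code; the gains, the L+R baseline, and the update rate are assumptions, and only the ±45° pitch cap comes from the text.

```python
import numpy as np

# Illustrative sketch of the 2D closed-loop rule: L-R sets the yaw
# rotational *velocity* of the environment, while L+R sets the pitch
# *angle* directly, clamped to +/-45 degrees (the cap is from the text;
# all other constants are assumed).

YAW_GAIN = -120.0      # deg/s of scene yaw per unit of L-R (assumed)
PITCH_GAIN = 15.0      # deg of scene pitch per unit of (L+R - baseline) (assumed)
LPR_BASELINE = 4.0     # assumed resting L+R level
DT = 0.02              # assumed streaming update interval (s)

def closed_loop_step(yaw_deg, l_minus_r, l_plus_r):
    """One update of the virtual environment's yaw and pitch."""
    yaw_deg = (yaw_deg + YAW_GAIN * l_minus_r * DT) % 360.0   # velocity control
    pitch_deg = float(np.clip(PITCH_GAIN * (l_plus_r - LPR_BASELINE), -45.0, 45.0))
    return yaw_deg, pitch_deg

# Example: a strong increase in total wingbeat amplitude saturates pitch
yaw, pitch = closed_loop_step(0.0, l_minus_r=0.0, l_plus_r=8.0)
# pitch is clamped at +45 deg; yaw is unchanged
```

The asymmetry between the two axes (velocity control for yaw, position control for pitch) is what lets the fly hold a heading while setting a pitch attitude independently.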