Protocol

Real-Time Tracking of Multiple Moving Mosquitoes

Florian T. Muijres2

1Institute of Biology I & Bernstein Center Freiburg, Faculty of Biology, Albert-Ludwigs-Universität Freiburg, 79085 Freiburg im Breisgau, Germany
2Department of Experimental Zoology, Wageningen University, 6708 PB Wageningen, the Netherlands
3Correspondence: straw@bio.uni-freiburg.de

Abstract

Tracking mosquitoes in real time, as opposed to recording video files and performing the tracking step later, is useful for two reasons. The first is efficiency: real-time tracking requires less storage because video images do not need to be saved to disk and processed in a subsequent tracking step. The second is that the tracking data can be used to interact with the animal, for example by triggering the approach of a looming object. In this protocol, we discuss the use of Braid, free software for performing real-time, multicamera, multianimal tracking. We describe a setup with four cameras capable of tracking the three-dimensional (3D) position of mosquitoes at 100 frames per second in a volume of 30 cm × 30 cm × 60 cm with millimeter accuracy. The specific hardware configuration is flexible; different or additional components can be substituted to adjust the tracking parameters as needed.

Materials

It is essential that you consult the appropriate Material Safety Data Sheets and your institution's Environmental Health and Safety Office for proper handling of equipment and hazardous materials used in this protocol.

Equipment

Arduino Nano to generate camera synchronization pulses

Braid

Cameras (four), C-mount, global shutter, externally triggered, 2.3-MP, monochrome, 160-fps (Basler a2A1920-160umBAS)

  • Braid version 0.10.1 supports only Basler cameras.

  • Each camera should have a USB3 cable and 1/4-20 tripod mount.

Camera lenses (four), varifocal 4–13-mm, f/1.5, near-infrared corrected, C-mount (Tamron M118VM413IR)

Checkerboard with flat and rigid backing for camera calibration

Computer, operating system, and camera software

  • Minimum specifications: Intel 64-bit CPU, 3-GHz or higher; 8-GB RAM or more; four USB3 ports on two separate USB3 busses (e.g., two USB3 ports on motherboard, two additional USB3 ports using PCIe card); Ubuntu version 20.04 desktop operating system (https://ubuntu.com); Braid version 0.10.1 (https://strawlab.org/braid).

Hardware trigger cables with M8 six-pin connector for camera synchronization

  • Available with cameras. Solder cable pin 4 (black) to Nano pin GND and cable pin 6 (pink) to Nano pin D9.

Near-infrared lights for illumination (850-nm)

White or red light-emitting diode (LED) for camera calibration (small, handheld, battery-powered)

METHOD

Figure 1. The workflow for real-time tracking experiments and analysis.

Lighting, Camera, and Lens Setup

Proper alignment and setup of all cameras is important from the start of the experiment (see Protocol: Designing a Generic Videography Experiment for Studying Mosquito Behavior [Muijres et al. 2022]). Whenever the camera orientation, zoom, aperture, or focus is changed, the camera calibration must be reconsidered, both for lens distortions and for the stereo calibration of the multiple cameras in the tracking system. The pylon Viewer software from Basler (2021) offers useful features for camera alignment, such as a grid overlay, a histogram, and a focus assistant.

1. Launch pylon Viewer.

2. Connect all cameras and start the video stream for each. Set the camera trigger mode to “off” and use free-run mode.

3. Arrange the infrared lights such that the entire tracking volume is well lit for all cameras.

4. Fully open the lens aperture.

5. Adjust the exposure time so that the scene is well lit and the reference points in the setup are clearly visible; adjust the focus if needed. The frame rate is not yet important at this step.

  • Use the camera's histogram to check whether the entire dynamic range is used. Sensor gain or pixel binning can optionally be used to increase image brightness.

6. Adjust the camera orientation and lens zoom such that as much as possible of the tracking volume (or the intended part of it) is in view.

  • Stop down the lens aperture to increase the depth of field.

7. Repeat Steps 1–6 for each camera. Ensure that each point in the tracking volume is visible in at least two camera views. A 90° angle between cameras maximizes stereo information. Additional cameras improve tracking performance.

Synchronization Cables and Trigger Box Setup

8. Connect each camera to the Arduino Nano with its trigger cable (a minimal firmware sketch follows this step).
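The cameras are driven by a common pulse train on Nano pin D9, wired as described in Equipment. In practice, use the trigger firmware referenced in the Braid documentation, which also handles clock synchronization with the host computer; the sketch below is only a minimal illustration of how such a pulse train can be generated with the Nano's Timer1 hardware, assuming the 100-fps frame rate of this protocol and a stock 16-MHz Nano.

```cpp
// Minimal illustration (not the Braid trigger firmware): generate a fixed
// 100 Hz camera trigger signal on Arduino Nano pin D9 using Timer1 hardware.
// Assumes a stock 16 MHz ATmega328P Nano; the rate is an example value.

const unsigned long TRIGGER_HZ = 100;

void setup() {
  pinMode(9, OUTPUT);               // D9 is the OC1A output pin of Timer1
  TCCR1A = _BV(COM1A0);             // toggle OC1A on each compare match
  TCCR1B = _BV(WGM12) | _BV(CS11);  // CTC mode, prescaler 8 (2 MHz timer clock)
  // Toggling at twice the trigger rate yields a square wave at TRIGGER_HZ:
  // OCR1A = 2 MHz / (2 * 100 Hz) - 1 = 9999.
  OCR1A = (F_CPU / 8) / (2 * TRIGGER_HZ) - 1;
}

void loop() {
  // Nothing to do: the timer hardware generates the pulse train on its own.
}
```

With this wiring, each camera must later be switched from the free-run mode used during alignment back to external hardware triggering before tracking.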

Configuring Braid to Connect to Your Cameras

9. Prior to running Braid, generate a configuration file with the identities of the cameras, a calibration (if available), and tracking parameters. Any values not explicitly set in the configuration file will default to reasonable values.

10. To see all possible configuration items and their default settings, type “braid default-config” at the command line. Save the output to a TOML file such as “config.toml”, then edit the file to include the individual names of your cameras (an illustrative fragment is shown after Step 11). Remove all values for which the defaults are acceptable.

11. Ensure your configuration is correct by running “braid show-config config.toml”.
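As an illustration only, a trimmed configuration might look like the fragment below. The authoritative keys and defaults come from your own “braid default-config” output; the section names, paths, and camera names shown here are placeholder assumptions.

```toml
# Illustrative fragment only; generate the authoritative template with
# "braid default-config" and keep just the entries you change.

[mainbrain]
output_base_dirname = "/home/user/braid-data"  # where .braidz files are saved
cal_fname = "/home/user/cal.xml"               # multicamera calibration, once available

[[cameras]]
name = "Basler-40022057"  # placeholder; use your own camera's name

[[cameras]]
name = "Basler-40022058"

[[cameras]]
name = "Basler-40022059"

[[cameras]]
name = "Basler-40022060"
```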

Camera Intrinsic Parameter Calibration

12. Start Braid with “braid run config.toml”.

  • Two stages of camera calibration are required, and they must be performed in sequential order. The first stage calibrates the intrinsic parameters of each camera: the focal length, the pixel coordinates of the optical center, and any image distortion. The intrinsic parameters do not include the position of the camera.

13. From the Braid main page, launch Strand Cam to open its user interface for each camera.

14. Disable object detection by unchecking the “object detection” checkbox and go to the “checkerboard calibration” panel.

15. Enter the checkerboard size in the user interface.

16. Click “enable checkerboard calibration.”

17. Collect at least 10 checkerboard images by holding the checkerboard in front of the camera. To accurately calibrate the camera's distortion, which is greatest near the corners of the image, ensure that the checkerboard is also visible in the corners.

18. Click the “perform and save calibration” button. Repeat Steps 13–18 for all cameras (a sketch of the equivalent offline computation follows this step).
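Strand Cam computes and stores this calibration for you. The sketch below is not part of the protocol; it shows an equivalent standalone computation using OpenCV's standard checkerboard routine, in case you want to sanity-check intrinsics from saved images. The file pattern, board size, and square size are placeholder assumptions.

```python
# Equivalent offline intrinsic calibration with OpenCV (illustration only;
# Strand Cam performs this step internally). Board size, square size, and
# file names below are placeholder assumptions.
import glob

import cv2
import numpy as np

PATTERN = (7, 6)        # inner corners per row/column of the checkerboard
SQUARE_SIZE_MM = 25.0   # edge length of one checkerboard square

# 3D corner coordinates in the board's own frame (z = 0 plane).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE_MM

obj_points, img_points, image_size = [], [], None
for fname in glob.glob("checkerboard_*.png"):  # hypothetical saved images
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Fit focal length, optical center, and distortion coefficients.
rms, K, dist, _, _ = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print(f"RMS reprojection error: {rms:.3f} px")
print("camera matrix:\n", K)
print("distortion coefficients:", dist.ravel())
```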

Multicamera Calibration

Here, the MultiCamSelfCal calibration routine (Svoboda et al. 2005) is used to create a single calibration for all cameras.

19. Reduce the camera exposure duration and set the object detection parameters in Strand Cam so that the LED is detected when turned on, but no other points are detected.

  • Turn off the infrared lights used for illumination.

20. Begin recording a data file in the Braid user interface.

21. Wave the LED smoothly through the tracking volume and ensure that it is tracked by all cameras, with at least some frames in which it is visible from three or more cameras.

  • Turn the LED on and off intermittently while moving it through the tracking volume to validate synchronization.

22. Stop saving the data.

23. Run the MultiCamSelfCal calibration code to generate an initial multicamera calibration that is not yet “aligned.”

24. Align the new calibration to the correct coordinate frame with the “flydra analysis calibration align GUI” tool (a sketch of the underlying computation follows this step).

  • Use the LED to trace a path along known positions in the coordinate frame.
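The GUI lets you find this alignment interactively. Conceptually, the alignment is a similarity transform (scale, rotation, translation) mapping the tracked LED positions onto their known coordinates; the function below is a minimal sketch of such a fit using Umeyama's method, not part of the Braid tooling.

```python
# Least-squares similarity transform (Umeyama): find scale s, rotation R, and
# translation t with dst ≈ s * R @ src + t. Illustration only; the flydra
# alignment GUI provides this functionality interactively.
import numpy as np

def fit_similarity(src: np.ndarray, dst: np.ndarray):
    """src, dst: (N, 3) arrays of corresponding 3D points."""
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_src, dst - mu_dst
    cov = dst_c.T @ src_c / len(src)  # cross-covariance of the point sets
    U, S, Vt = np.linalg.svd(cov)
    # Reflection guard: force a proper rotation with det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U) * np.linalg.det(Vt))])
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / src_c.var(axis=0).sum()
    t = mu_dst - s * R @ mu_src
    return s, R, t
```

Here `src` would hold the tracked LED positions in the unaligned calibration frame and `dst` the corresponding known positions in the desired coordinate frame.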

Data Collection

25. In the Braid user interface, begin recording a data file.

26. Perform your experiment.

27. Stop saving the data. Braidz files can be viewed using the viewer at https://braidz.strawlab.org/ and analyzed starting with the example scripts available in the User's Guide for Braid and Strand Camera (Straw et al. 2011; Straw 2021); a minimal loading sketch follows.
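As a starting point for analysis, note that a .braidz file is a ZIP archive containing gzipped CSV tables, among other files. The sketch below, modeled on the approach in the example scripts, loads the estimated 3D trajectories with pandas; the internal file name and column names reflect the braidz layout at the time of writing and should be checked against your own files.

```python
# Load 3D trajectories from a .braidz file (a ZIP archive of gzipped CSVs).
# The internal path "kalman_estimates.csv.gz" and the column names are
# assumptions to verify against your own data.
import zipfile

import pandas as pd

with zipfile.ZipFile("20230101_120000.braidz") as archive:  # placeholder name
    with archive.open("kalman_estimates.csv.gz") as f:
        # Header lines starting with "#" hold metadata and are skipped.
        df = pd.read_csv(f, compression="gzip", comment="#")

# One row per tracked object per frame, with positions in the calibrated
# (aligned) coordinate frame.
for obj_id, traj in df.groupby("obj_id"):
    print(f"object {obj_id}: {len(traj)} frames, "
          f"mean z = {traj['z'].mean():.3f}")
```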

ACKNOWLEDGMENTS

This work was supported by a grant from the Momentum Programme of the Volkswagen Foundation to A.D.S.

Footnotes

From the Mosquitoes collection, edited by Laura B. Duvall and Benjamin J. Matthews.

REFERENCES
