Abstract
For the information content of microscopy images to be appropriately interpreted, reproduced, and meet FAIR (Findable, Accessible, Interoperable and Reusable) principles, they should be accompanied by detailed descriptions of microscope hardware, image acquisition settings, image pixel and dimensional structure, and instrument performance. Nonetheless, the thorough documentation of imaging experiments is significantly impaired by the lack of community-sanctioned, easy-to-use software tools to facilitate the extraction and collection of relevant microscopy metadata. Here we present Micro-Meta App, an intuitive open-source software tool designed to tackle these issues. It was developed in the context of nascent global bioimaging community organizations, including BioImaging North America (BINA) and QUAlity Assessment and REProducibility for Instruments and Images in Light Microscopy (QUAREP-LiMi), whose goal is to improve reproducibility, data quality and sharing value for imaging experiments. The App provides a user-friendly interface for building comprehensive descriptions of the conditions utilized to produce individual microscopy datasets as specified by the recently proposed 4DN-BINA-OME tiered system of Microscopy Metadata specifications. To achieve this goal, the App provides a visual guide for a microscope user to: 1) interactively build diagrammatic representations of the hardware configuration of a given microscope that can be easily reused and shared with colleagues needing to document similar instruments; 2) automatically extract relevant metadata from image files and collect missing image acquisition settings and calibration metrics associated with a given experiment; and 3) output all collected Microscopy Metadata to interoperable files that can be used for documenting imaging experiments and shared with the community.
In addition to significantly lowering the burden of quality assurance, the visual nature of Micro-Meta App makes it particularly suited for training users who have limited knowledge of the intricacies of light microscopy experiments. To ensure wide adoption by microscope users with different needs, Micro-Meta App closely interoperates with MethodsJ2 and OMERO.mde, two complementary tools described in parallel manuscripts.
Background
The establishment of community-driven, shared documentation and quality control specifications for light microscopy would make it possible to appropriately document imaging experiments, minimize errors and quantify any residual uncertainty associated with each step of the procedure 1–6. In addition to providing essential information about the provenance (i.e., origin, lineage) 7,8 of microscopy results, this would allow scientific claims to be faithfully interpreted, facilitate comparisons within and between experiments, foster reproducibility, and maximize the likelihood that data can be re-used by other scientists for further insight 5,6,9,10. First and foremost, such information would facilitate the compilation of accurate Methods sections for publications that utilize the quantitative power of microscopy experiments to answer scientific questions 11–13. Furthermore, it would provide clear guidance to the manufacturers of microscopy instruments, hardware components, and processing software about what information the scientific community requires to ensure scientific rigor, so that this information can be automatically recorded during acquisition and written in the headers of image files. Last but not least, machine-actionable versions of the same information 14 could be provided alongside image datasets on the growing number of public image data resources 3 that allow the deposition of raw image data associated with scientific manuscripts, promising to emulate for light microscopy the successful path that has led to community standards in the field of genomics 15–19 (e.g., the IDR 20, EMPIAR 21, and BioImage Archive 22 hosted at EMBL-EBI; the European Movincell 23; the Japanese SSBD hosted by RIKEN 24; and, in the USA, the NIH-funded Cell Image Library 25,26, the BRAIN Initiative's imaging resources 27, the Allen Cell Explorer 28, and the Human Cell Atlas 29–32).
In order to promote the development of shared community-driven Microscopy Metadata standards, the NIH-funded 4D Nucleome (4DN) 33,34 and the Chan Zuckerberg Initiative (CZI)-funded BioImaging North America (BINA) Quality Control and Data Management Working Group (QC-DM-WG) 35 have recently proposed the 4DN-BINA-OME (NBO) tiered system of Microscopy Metadata specifications 36–39. The 4DN-BINA-OME specifications lay the foundations for upcoming community-sanctioned standards being developed in the context of the Metadata Working Group (WG7) of the QUAlity Assessment and REProducibility for Instruments and Images in Light Microscopy (QUAREP-LiMi) initiative (quarep.org) 4,40. Their purpose is to provide a scalable, interoperable and Open Microscopy Environment (OME) 41–43 Next-Generation File Format (NGFF) 44 compatible framework for light microscopy metadata, guiding scientists as to what provenance metadata and calibration metrics should be provided to ensure the reproducibility of different categories of imaging experiments.
Despite their value in indicating a path forward, guidelines, specifications and standards on their own lack the one essential feature that would make them actionable by experimental scientists faced with the challenge of producing well-documented, high-quality, reproducible and re-usable datasets: namely, easy-to-use software tools or, even better, automated pipelines to extract all available metadata from microscope configurations and image data files.
While some advances have been proposed, such as OMERO.forms 45, PyOmeroUpload 46 and MethodsJ 5, these tools offer only limited functionality, are not integrated with community standards and are not per se future-proof. To provide a way forward, in this and in two related manuscripts, we present a suite of three interoperable software tools (Supplemental Figure 1) that were developed to provide highly complementary, intuitive approaches for the bench-side collection of Image Metadata, with particular reference to Experimental Metadata and Microscopy Metadata 37,38. In two related manuscripts, we describe: 1) OMERO.mde, which is highly integrated with the widely used OMERO image data management repository and emphasizes the development of flexible, nascent specifications for experimental metadata 47–49; and 2) MethodsJ2 50, which is designed as an ImageJ plugin and emphasizes the consolidation of metadata from multiple sources and the automatic generation of Methods sections of scientific publications.
In this manuscript, we present Micro-Meta App (Figure 1), which works both as a stand-alone app on the user's desktop and as an integrated resource in third-party web data portals. It offers a visual guide to navigate through the different steps required for the rigorous documentation of imaging experiments (Figures 2-4) as sanctioned by community specifications such as the 4DN-BINA-OME (NBO) Microscopy Metadata specifications that were recently put forth to extend the OME Data Model 36–38,51.
Methods: Implementation and Availability
Micro-Meta App is available in two JavaScript (JS) implementations. The first was designed to facilitate the incorporation of the software in existing third-party web portals (e.g., the 4DN Data Portal) 34,52 and was developed using the JavaScript React library, which is widely used to build web-based user interfaces. Starting from this version, a stand-alone version of the App was developed by wrapping the React implementation using the JavaScript Electron library, with the specific purpose of lowering the barrier to adoption of the tool by labs that do not have access to, or prefer not to use, imaging databases. More details about the implementation of Micro-Meta App are available in Supplemental Material.
In order to promote the adoption of Micro-Meta App, its incorporation in third-party data portals and the re-use of the source code by other developers, the executables and source code for both the JavaScript React and Electron implementations of Micro-Meta App are available on GitHub 53,54. In addition, a website describing Micro-Meta App 55 was developed alongside complete documentation and tutorials 56.
Results + Discussion
Micro-Meta App: an intuitive, highly visual interface to facilitate microscopy metadata collection
While the establishment of data formats, metadata standards and QC procedures is important, it is not per se sufficient to ensure that reporting and data quality guidelines are adopted by the community. In order to ensure their routine utilization, it is therefore necessary to produce software tools that expedite QC procedures and image data documentation and make it straightforward for investigators to reproduce results and to decide whether specific datasets are useful for addressing their specific questions. However, despite the availability of the OME Data Model and Bio-Formats, the lack of enforced standards has resulted in limited adoption of minimal information criteria and, as a result, the metadata provided by instrument and software manufacturers remain scarce (Supplemental Tables I and II).
Micro-Meta App was developed to address this unmet need. It consists of a Graphical User Interface (GUI)-based, open-source and interoperable software tool to facilitate and (when possible) automate the annotation of fluorescence microscopy datasets. The App provides an interactive approach to navigate through the different steps required for documenting light microscopy experiments based on available OME-compatible, community-sanctioned tiered systems of specifications. Thus, Micro-Meta App is capable of adapting not only to varying levels of imaging complexity and user experience but also to evolving data models that might emerge from the community. At the time of writing, the App implements the Core of the OME Data Model and the tiered 4DN-BINA-OME Basic extension 36–38,51. Efforts to implement the current Confocal and Advanced as well as the Calibration and Performance 4DN-BINA-OME extensions are underway (see Future Directions). To achieve this goal, Micro-Meta App is organized around two highly related data processing flows (Figure 1):
In the Manage Instrument modality, the App guides the user through the steps required to interactively build a diagrammatic representation of a given Microscope (Figures 2A and 3) by dragging and dropping individual components onto the workspace and entering the relevant attribute values based on the tier level that best suits the microscope modality, experimental design, instrument complexity, and image analysis needs 38.
From this, Micro-Meta App automatically generates structured descriptions of the microscope Hardware Specifications and outputs them as interoperable Microscope.JSON files that can be saved locally, used by existing third-party web-portals 52, integrated with other software tools (MethodsJ2) and shared with other scientists, thus significantly lowering the manual work required for rigorous record-keeping and enabling rapid uptake and widespread implementation.
When the user is ready to collect metadata to document the acquisition of specific image datasets, the Manage Settings section of the App automatically imports Hardware Specifications metadata from previously prepared Microscope.JSON files and uses the Bio-Formats library 43 to extract available, OME-compatible metadata from an image data file of interest. On this basis, the App interactively guides the user to enter missing metadata specifying the tier-appropriate Settings used for a specific Image Acquisition session (Figures 2B and 4).
As a final step, the App generates interoperable paired Microscope- and Settings-JSON files that together contain comprehensive documentation of the conditions utilized to produce individual microscopy datasets and can be stored locally or integrated by third-party data portals (e.g., the 4D Nucleome Data Portal) 57.
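To give a concrete sense of this output, a Microscope.JSON file pairs general information about the Microscope Stand with a list of hardware components and their tier-appropriate attributes. The fragment below is a simplified, hand-written sketch of that shape; all field names and values are illustrative and do not reproduce the actual 4DN-BINA-OME schema:

```json
{
  "Name": "Example Inverted Widefield Microscope",
  "Tier": 2,
  "MicroscopeStand": {
    "Manufacturer": "ExampleCorp",
    "Model": "EC-1000",
    "Type": "Inverted"
  },
  "components": [
    {
      "Category": "Objective",
      "Manufacturer": "ExampleCorp",
      "Magnification": 60,
      "NumericalAperture": 1.4,
      "ImmersionMedium": "Oil"
    },
    {
      "Category": "LightSource",
      "Type": "Laser",
      "Wavelength_nm": 488
    }
  ]
}
```

Because the file is plain JSON, it can be inspected, versioned and shared independently of the App itself, which is what enables its re-use by MethodsJ2 and third-party data portals.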
Depending on the specific implementation of the Micro-Meta App being used (see Implementation section), the workflow varies slightly. The discussion below refers specifically to the stand-alone version of Micro-Meta App implemented in JavaScript Electron.
Manage Instrument Hardware
The purpose of this section of Micro-Meta App is to guide microscope users and custodians in the creation of accurate but at the same time intuitive and easy-to-generate visual depictions of a given microscope. In parallel, the App collects relevant information for each hardware component at a level of detail that scales with the experimental intent, instrument complexity and analytical needs of individual imaging experiments, as determined by the tier levels sanctioned by the 4DN-BINA-OME Microscopy Metadata specifications 36–38,51. Specifically, the workflow (Figure 3) is composed of the following steps:
After launching the application, the user selects an appropriate Tier to be used (Figure 3A) to document a given imaging experiment, as determined by following the 4DN-BINA-OME tiered specifications 36–38,51, and launches the Manage Instrument modality of Micro-Meta App by clicking the appropriate button (Figure 3B). Because Micro-Meta App was specifically designed to be tier-aware, it automatically displays only the metadata fields that 4DN-BINA-OME specifies as belonging to the tier that was selected upon launching the App (Figure 3A), thus substantially reducing the documentation burden. In addition, to increase flexibility, the tier level utilized for validation can be modified dynamically after opening the main Manage Instrument workspace. This way, the user can, for example, be presented with all Tier 2-appropriate fields while being required to fill in only Tier 1 fields for validation (see also point 3 ii).
Once in the Manage Instrument section, the user is given the option of selecting one of three different methods for managing an instrument (Figure 3C and D):
By selecting one of the two Create from scratch options (i.e., Create Inverted from scratch or Create Upright from scratch), the user is presented with a blank microscope canvas of the selected type to work with (Figure 3E).
When selecting Load from file, the user is asked to select a pre-existing Microscope.JSON file from the local file system. Such Microscope.JSON files could, for example, be a template file that was created by the microscopy platform, shared by a colleague, or downloaded from a repository for local use.
Finally, a pre-existing Microscope.JSON file that has already been processed and saved to the local Micro-Meta App’s Home Folder can be loaded for further editing using the Load from repository modality. Here existing Microscope.JSON files are listed by Manufacturer and by Instrument Name to facilitate the selection of the appropriate file (Figure 3D).
Regardless of the chosen Manage Instrument modality, in the next step, the user is presented with the main instrument management workspace where they can perform the following actions:
In the top bar, the user can select a different tier-level for validation with respect to the one that had been selected upon entering the current Micro-Meta App session (Figure 3E). This feature allows the user to fill additional fields while not being required to provide all mandatory fields for a given tier level.
By clicking the Edit microscope button (Figure 3E), the user can then enter attributes that refer to the instrument in general and that allow the description of the Microscope Stand the instrument is built upon. Upon exiting the Edit Microscope GUI, the application signals validation by changing the color of the dot on the button from red (i.e., incomplete) to green (i.e., complete and validated).
By clicking the sidebar Hardware Navigation selection menus, the user can identify a given hardware component and drag it to the appropriate position on the instrument canvas, or to one of the top right generic "drawers" that can accommodate free-floating components that do not have a pre-existing position on the canvas (e.g., additional Objectives that do not fit in the objective turret). In the example depicted, a blank Objective component is selected from the Magnification menu (Figure 3E, [1]) and dragged to one of the objective slots on the canvas, where it "snaps in place" upon release (Figure 3E, [2]).
Once a component has been placed on the canvas, the icon is highlighted with a red dot to signal that the attributes have not yet been filled in and validated. Thus, the user is alerted to which microscope components need attention and can quickly identify which icon to work on. By clicking on the icon (Figure 3F, [3]), the user gains access to different tabs, each containing simple forms that display the required, tier-appropriate metadata fields sanctioned by 4DN-BINA-OME (Figure 3F, [4]). To further increase usability, the Confirm button found at the bottom of each window can be used to automatically jump to mandatory fields that have to be filled before exiting. Upon completing all required fields, the icon is highlighted with a green dot, making it easier for the user to assess documentation progress.
Once all appropriate microscope hardware components for a given instrument have been added to the canvas and appropriately attended to, the resulting tier-specific microscope Hardware Specification descriptions are then output as structured and interoperable Microscope.JSON files. These files can be Saved locally (Figure 3G) or used by existing third-party databases, such as the 4DN Data Portal 34,57 (Figure 6), for later utilization during the Manage Settings modality of the Micro-Meta App (see next section). In addition, such files can be imported in MethodsJ2 and used to automatically generate the Methods and Acknowledgement sections of scientific publications as described in a parallel manuscript 50. Finally, the same files could be associated with a Research Resource ID (RRID) 58 to acknowledge the work of microscope custodians, used by reviewers of manuscripts, shared with other users of the same microscope or with colleagues who need to document similar instrumentation, and broadly disseminated through appropriate data portals 33,34.
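The tier-aware display and validation behavior described in the steps above can be sketched in a few lines of JavaScript. This is a minimal illustration only: the field names and tier assignments below are made up for the example and do not reproduce the actual 4DN-BINA-OME schema, in which tier membership is defined by the specifications themselves.

```javascript
// Sketch of tier-aware display and validation logic, loosely modeled
// on Micro-Meta App's behavior. Field names and tier assignments are
// illustrative, not the actual 4DN-BINA-OME schema.
const objectiveSchema = [
  { name: "Manufacturer", tier: 1 },
  { name: "Magnification", tier: 1 },
  { name: "NumericalAperture", tier: 1 },
  { name: "ImmersionMedium", tier: 2 },
  { name: "WorkingDistance", tier: 3 },
];

// Only fields at or below the selected tier are displayed and required.
function fieldsForTier(schema, tier) {
  return schema.filter((f) => f.tier <= tier).map((f) => f.name);
}

// A component validates (its dot turns green) once every
// tier-appropriate field carries a non-empty value.
function isValidated(schema, tier, component) {
  return fieldsForTier(schema, tier).every(
    (name) => component[name] !== undefined && component[name] !== ""
  );
}

const objective = {
  Manufacturer: "ExampleCorp",
  Magnification: 60,
  NumericalAperture: 1.4,
};
console.log(isValidated(objectiveSchema, 1, objective)); // → true
console.log(isValidated(objectiveSchema, 2, objective)); // → false (ImmersionMedium missing)
```

Decoupling the display tier from the validation tier, as the App does, amounts to calling `fieldsForTier` with one tier for rendering and `isValidated` with a lower tier for the green/red dot.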
Manage Image Acquisition Settings
This modality is used to produce metadata-rich descriptions of the Image Acquisition Settings utilized to produce a given image dataset (Figure 4). In this modality, Micro-Meta App: 1) imports an existing Microscope.JSON file describing the instrument used to acquire the image data to be documented; 2) utilizes Bio-Formats 43 to read the available OME-specified microscopy metadata stored in the header of the desired image file by the manufacturer of the microscope or of the acquisition software (Supplemental Tables I and II); 3) interactively guides the user through the collection of the missing, instrument-specific, tier-appropriate, 4DN-BINA-OME-sanctioned Image Acquisition Settings utilized to produce the selected image data; and 4) produces structured Settings.JSON files that can be stored locally or in appropriate data portals alongside the images and the corresponding Microscope.JSON file. In more detail, the Manage Settings modality of Micro-Meta App articulates around the following steps:
After selecting a documentation tier (Figure 3A) and launching the App in the Manage Settings modality (Figure 4A), the user selects an available Microscope.JSON file from the local file system or a suitable repository (Figure 4B), selects an available image dataset to be annotated (Figure 4C) and either creates a new Settings.JSON file or opens an existing one to edit (Figure 4D). The integration of the Bio-Formats API permits Micro-Meta App to interpret image file headers, extract available metadata and populate instrument-specific, OME-compatible, tier-appropriate metadata fields to facilitate metadata annotation.
As a result of the previous step, a diagrammatic representation of the Image Acquisition Settings is displayed to the user, and components or fields containing missing metadata values are highlighted in red to solicit the user's attention (Figure 4E). After attending to all missing values, the user can then produce a validated, 4DN-BINA-OME-compatible and tier-appropriate output Settings.JSON file, which, coupled with the corresponding Microscope.JSON file, can be associated with the relevant image datasets in the local file system or on an appropriate repository, such as the 4DN Data Portal 34,52. The Manage Settings section of the App consists of four types of user interaction interfaces:
Simple buttons with associated tabbed data entry forms, such as those that allow inspection, editing or entry of general information about the image Pixel structure (Figure 4E, [1.1] and [1.2]).
An interface for selecting one of the available hardware components, adding it to the Settings.JSON file and editing the associated settings metadata fields. This type of user interface is used for Edit Objective Settings (Figure 4E, [3.1], [3.2] and [3.3]) and is also used for Edit Imaging Environment, Edit Microscope Table Settings, Edit Microscope Stand Settings and Edit Sample Positioning Settings.
Specialized Plane-management interface (Figure 4E, [2.1], [2.2] and [2.3]). This interface is used either to inspect and, if necessary, edit the automatically imported Planes metadata or to record such metadata in case none was available in the header of the image data file to be annotated.
Specialized Channel-management interface. Special attention was dedicated to the development of the GUI utilized to define the configuration and settings of the Light Path (i.e., Light Source → FilterSet → Detector) associated with each individual Channel (Figure 4F and G). To this aim, an intuitive Channel GUI (Figure 4F, [4.1] and [4.2]) is organized graphically around a visual representation of the Fluorescence Light Path where users can select among different light sources, filters, and detectors available in the underlying Microscope.JSON file and provide the appropriate settings to configure a given Channel. For example, the user would first select a given Light Source among those available (Figure 4F, [6.1]), and then enter the appropriate Light Source Settings (Figure 4F, [6.2]). The same Channel-specific interface can also be used to manage advanced Light Path features, such as in cases in which a custom-developed microscope has to be described (Figure 4G).
Once all components have been selected and configured, the Image Acquisition Settings are compiled into a structured Settings.JSON file and saved either locally or remotely, as desired.
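As an illustration of the Light Path structure described above, a single Channel entry in the resulting Settings file ties references to components defined in the Microscope.JSON file together with the settings applied during acquisition. The fragment below is a simplified sketch; all field names and values are illustrative and do not reproduce the actual 4DN-BINA-OME schema:

```json
{
  "Channel": {
    "Name": "GFP",
    "IlluminationType": "Epifluorescence",
    "LightSource": {
      "ComponentRef": "Laser-488",
      "Settings": { "SetPower_mW": 5, "Attenuation": 0.9 }
    },
    "FilterSet": {
      "ExcitationFilter": "BP 470/40",
      "DichroicMirror": "LP 495",
      "EmissionFilter": "BP 525/50"
    },
    "Detector": {
      "ComponentRef": "sCMOS-1",
      "Settings": { "ExposureTime_ms": 100, "Gain": 2.0 }
    }
  }
}
```

Keeping component descriptions in the Microscope file and only references plus settings in the Settings file is what allows one microscope description to be reused across many acquisition sessions.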
Beta testing
Micro-Meta App was developed in the context of community efforts organized around the 4DN Consortium need for imaging data dissemination and integration with omics datasets 33,34 and the BINA 35 effort to improve rigor, quality control and reproducibility in light microscopy. As part of this effort, several core facilities were identified to serve as reference beta-testing sites for Micro-Meta App (Supplemental Table III). To this aim, the stand-alone JavaScript Electron implementation of Micro-Meta App was locally deployed, and microscope custodians at individual beta-testing sites were trained both on the use of the App and on bug and feature request reporting. Such feedback was collected either directly or by taking advantage of the GitHub issue-reporting feature and was incorporated into the main development branch in a close iterative cycle ahead of the release of the initial production version of the software.
Case Studies
Utilization at core facilities
In response to the significant interest we observed in the community and after beta testing, microscope custodians at several light microscopy facilities (Supplemental Table III) volunteered to serve as case studies on the use of Micro-Meta App to document both microscope instrumentation and example published image datasets produced in microscopy platforms 59–78. The application was utilized by microscope custodians at 16 sites to document one of their microscopes alongside the settings utilized for published images based on the 4DN-BINA-OME model Core + Basic Extension 37,38,51 (Figure 5 and Supplemental Figures 2-16).
As indicated, the microscopes whose hardware was documented using Micro-Meta App comprised advanced custom-built microscopes, widefield microscopes and microscope stands associated with confocal systems, produced by all four major manufacturers. As a further testament to the robustness of the approach, several major categories of imaging experiments were covered in this case study, including:
Immunofluorescence imaging of the three-dimensional distribution of HIV-1 retroviral particles in the nucleus of infected human cells (Figure 5).
Three-dimensional visualization of superhydrophobic polymer-nanoparticles (Supplemental Figure 2).
Immunofluorescence imaging of a cryosection of mouse kidney (Supplemental Figure 4).
Three-dimensional immunofluorescence imaging of the phagocytic activity of human rhinovirus 16 (HRV16) infected macrophages (Supplemental Figure 7).
Live-cell imaging of protoplasts derived from N. benthamiana leaf cells transiently expressing YFP-tagged P. chromatophora proteins (Supplemental Figure 9).
Single-particle tracking of Halo-tagged PCNA in Lox cells (Supplemental Figure 12).
Live-cell imaging of bacterial cells expressing PopZ tagged with super-folder-GFP (Supplemental Figure 15).
Transmitted light brightfield visualization of swimming spermatocytes (Supplemental Figure 16).
In the case of commercial microscopes, and given the type of experimental question and imaging modality, the most appropriate reporting tier level was 4DN-BINA-OME Tier 2 38. The exception was one case in which no quantitative analysis was necessary, for which Tier 1 was sufficient (Supplemental Figure 2). On the other hand, two custom-built systems were documented at Tier 3 (Figure 5 and Supplemental Figure 16), again as sanctioned by the 4DN-BINA specifications 38.
The most striking result of these use cases was that, in comparison with the metadata reporting baseline represented by the metadata fields made available using Bio-Formats alone, the use of Micro-Meta App significantly increased the uniformity of reported metadata fields. This facilitates the comparison of image data within and across different microscopes and imaging experiments. In addition, because the underlying data model utilized by Micro-Meta App is dynamically defined on the basis of shared guidelines that can evolve depending on needs and technological developments, the use of this documentation method maximized reproducibility, quality and value while minimizing effort on the part of individual scientists. The JSON files produced are available as part of the Supplemental Material.
Integration with the 4DN Data Portal
One of the initial motivations for the development of Micro-Meta App was the need to expedite and, when possible, automate the rigorous reporting of imaging experiments and quality control procedures. In this context, Micro-Meta App was developed to be directly integrated into the 4DN Data Portal 34,57. For this use, the App's data flow was adapted to allow the direct integration of the content of the Microscope- and Settings-JSON files into the portal database. This allows individual fields to be utilized for filtering and searching purposes and to be visualized directly on the portal (Figure 6). In addition, to maximize flexibility, interfaces were developed to allow the import of pre-existing files the user might have produced using the desktop version of the App.
Micro-Meta App, microscopy platforms and teaching
Micro-Meta App provides a digital representation of freely configurable microscopes. This makes it ideal not only for microscopy platform staff wishing to provide users with a detailed inventory of all available microscopes but also for teaching purposes (Supplemental Video 1). Two major teaching use cases have been explored in the context of the Foundations in Biomedical Science (BBS 614) course (https://www.umassmed.edu/gsbs/academics/courses/) administered to first-year students at the Graduate School of Biomedical Sciences at the University of Massachusetts Medical School: 1) Micro-Meta App was used for students to work on specific problem sets; and 2) Micro-Meta App was used for self-driven exploration of microscope components, functions and imaging modalities. In both cases, it is advisable to create specific teaching Microscope.JSON files that students can load and work on. Specifically, the features and complexity of these teaching Microscope.JSON files need to be aligned with course level and content by choosing the most appropriate tier level among those available and, if necessary, by structuring the file without adhering to any one specific tier 36,79. For example, a problem set might be assigned that focuses on choosing the most appropriate filter set for a given imaging experiment. Specifically, students are instructed to choose an appropriate light source and then specify each filter to be associated with the filter set. In this case, the depth of information associated with Tier 3 might be needed for the filters, and a short list of possible light sources might be provided for the students to choose from (e.g., a laser combiner with the wrong laser lines for the experiment and a broadband source). At the same time, the rest of the Microscope.JSON file could be kept at a very basic level to reduce grading complexity. In another example, a course might start on Day 1 with a Microscope.JSON file that only has a few components at Tier 1. On subsequent days, more components might be added, and the tier level might be raised up to Tier 3 depending on the specific course teaching goals.
In addition to the tested use cases, Micro-Meta App might be used in teaching modules available online (e.g., core facility user training) or in flipped-classroom settings, where it would support the practical application of individually learned microscopy concepts. Specifically, Micro-Meta App might be used to create snapshots of microscopes or as a guided sequence for familiarizing the user with the intricacies of specific instrument hardware configurations. As with any teaching material, alignment of the teaching microscope in Micro-Meta App with course content, intended use and grading complexity is critical for success.
Future Directions
We will work closely within the context of 4DN 33,34, BINA 35, Global Bioimaging 3,80 and QUAREP-LiMi’s WG7 on Metadata (https://quarep.org/working-groups/wg-7-metadata/) on the following fronts:
Implementation of additional microscopy modalities: The current version of Micro-Meta App implements the Core of the OME data model and the 4DN-BINA-OME Basic extension 38,39. One of the most urgent next steps will be to implement the Confocal and Advanced extensions and subsequently, in close collaboration with QUAREP-LiMi 4,40, the Calibration and Performance extension 38,39.
Instrument Performance and Calibration implementation: The 4DN-BINA-OME Microscopy Metadata Specifications 38,39 include a Calibration and Performance extension that is not currently implemented in Micro-Meta App. In close collaboration with several of the core working groups of QUAREP-LiMi, we will work to integrate Quality Control metadata (procedure descriptions and output metrics) into Micro-Meta App. As a starting point, illumination and detector calibration metrics calculated using the open-hardware Meta-Max 81 calibration tool will be automatically imported and used to annotate imaging datasets.
Further integration with MethodsJ2: Future development will: 1) integrate the Settings.JSON files produced by Micro-Meta App so that they can serve as a source of Microscopy Metadata for MethodsJ2 50; and 2) adapt Micro-Meta App to generate Methods text directly, so that researchers can use the platform of their choice.
Further OMERO and OMERO.mde integration: As part of the initial Micro-Meta App development endeavor, a partial-functionality, prototype OMERO-plugin version of the App was developed and integrated into the OMERO instance available at the University of Massachusetts Medical School (UMMS)82. To facilitate the wide adoption of Micro-Meta App by the imaging community, integration with OMERO will be completed, including extracting experimental metadata as specified by extended specifications developed using OMERO.mde 49, and saving 4DN-BINA-OME metadata 38,39 as collections of key-value pairs associated with individual image datasets.
Creation of Instrument and Hardware component databases: While we engage with microscope hardware manufacturers to ensure the full automation of data provenance and quality-control reporting for light microscopy, it will be necessary to further reduce the documentation burden imposed on bench-side scientists, thereby maximizing their adoption of community standards. To this end, we will work in the context of BioImaging North America to develop sharable databases of Microscope.JSON files. This effort could be integrated with the RRID effort 58 and would have the added advantage of promoting the recognition of microscope configurations as an essential and quantifiable scientific output, therefore providing credit to the work of imaging scientists and, in particular, microscope custodians.
Outreach and education effort: We will work to involve microscope users, custodians and manufacturers alike and to promote their adoption of Micro-Meta App. In particular, Micro-Meta App offers manufacturers an opportunity to produce pre-filled JSON files describing individual hardware components, to make them available to the community from dedicated sites (similar to Fiji plugin sites), and thereby to automate the production of Microscope.JSON files.
FPbase spectra-viewer integration: The customized spectra-viewer feature of FPbase 83,84 will be integrated into Micro-Meta App so that users can produce interactive, microscope-specific spectral representations of each optical configuration. Such representations can, for example, be used to calculate the excitation and collection efficiency of a given Channel/Fluorophore combination. Vice versa, the possibility of integrating Micro-Meta App-generated Microscope.JSON files into the FPbase microscope-configurations feature will be explored. The interoperability of these two tools will improve the capacity of core facilities to provide realistic teaching scenarios, thus facilitating user training.
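To make the efficiency calculation concrete, the following sketch estimates the fraction of a fluorophore's excitation spectrum that falls within an excitation filter band, which is the kind of per-optical-configuration figure an FPbase-style spectra viewer can report. The Gaussian spectrum is synthetic and for illustration only, not real fluorophore data.

```python
import math

def gaussian_spectrum(peak_nm, width_nm, start=350, stop=750):
    """Synthetic excitation spectrum sampled at 1-nm steps
    (illustrative stand-in for a measured fluorophore spectrum)."""
    return {nm: math.exp(-((nm - peak_nm) / width_nm) ** 2)
            for nm in range(start, stop + 1)}

def excitation_efficiency(spectrum, band_lo, band_hi):
    """Fraction of total excitation amplitude inside the filter band."""
    total = sum(spectrum.values())
    in_band = sum(v for nm, v in spectrum.items() if band_lo <= nm <= band_hi)
    return in_band / total

spec = gaussian_spectrum(peak_nm=488, width_nm=25)
eff = excitation_efficiency(spec, band_lo=470, band_hi=490)
print(f"Excitation efficiency for a 470-490 nm band: {eff:.0%}")
```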
Conclusions
For this work to have a broad impact, tools such as Micro-Meta App are required so that biological scientists can quickly adopt metadata recommendations and incorporate them into their everyday work, without an extensive time commitment and regardless of their imaging expertise. For the proposed guidelines and metadata standards to be implemented broadly, such tools will need to be developed to be accessible and straightforward to use. Ultimately, this will lead to automating all aspects of the process by which members of the community annotate and upload metadata-rich imaging datasets to both local repositories such as OMERO 85 and public repositories such as the Image Data Resource (IDR) 20 or other public image archives 22. In addition to the development of community standards for microscopy, reaching this goal will entail the automated interpretation of metadata stored in image file headers, the development of a community-wide repository for the storage of metadata specifications for commercial and custom-made microscopy hardware commonly utilized by imaging laboratories, and the automated annotation of imaging datasets to be uploaded to imaging data portals or disseminated through other means.
Availability, requirements and documentation
Project name: Micro-Meta App
Project home page: https://github.com/WU-BIMAC/MicroMetaApp.github.io
Documentation: https://micrometaapp-docs.readthedocs.io/en/latest/index.html
Executable available at:
Desktop application (Javascript Electron): https://github.com/WU-BIMAC/MicroMetaApp-Electron/releases/latest
Source code available at:
Desktop application (Javascript Electron): https://zenodo.org/record/4750765
Dataportal integratable application (Javascript React): https://zenodo.org/record/4751438
Operating system(s): macOS and Windows
Programming language: Javascript and Java
Other requirements: Java v1.8.0. In addition, see dependencies listed in Supplemental Material.
License: GNU GPL v3 (https://www.gnu.org/licenses/gpl-3.0.html)
Authors contributions
Author contributions categories utilized here were devised by the CRediT initiative 86,87.
Alessandro Rigano: Conceptualization, Methodology, Software, Validation, Investigation; Shannon Ehmsen: Resources, Visualization; Serkan Utku Ozturk: Software; Joel Ryan: Validation, Resources, Data Curation; Alexander Balashov: Conceptualization, Methodology, Software; Mathias Hammer: Conceptualization, Validation; Koray Kirli: Validation; Kevin Bellvé: Resources; Ulrike Boehm: Validation, Resources, Data Curation, Writing - Review & Editing; Claire M. Brown: Validation, Resources, Data Curation, Writing - Review & Editing; James Chambers: Validation, Resources, Data Curation; Andrea Cosolo: Validation, Writing - Review & Editing; Robert Coleman: Validation, Resources, Data Curation; Kevin Fogarty: Resources; Orestis Faklaris: Validation, Resources, Data Curation, Writing - Review & Editing; Thomas Guilbert: Validation, Resources, Data Curation, Writing - Review & Editing; Anna B. Hamacher: Resources; Michelle S. Itano: Validation, Resources, Data Curation, Writing - Review & Editing; Daniel P. Keeley: Validation, Data Curation; Susanne Kunis: Resources; Judith Lacoste: Validation, Resources, Data Curation, Writing - Review & Editing; Alex Laude: Validation, Resources, Data Curation, Writing - Review & Editing; Willa Ma: Validation, Data Curation; Marco Marcello: Validation, Resources, Data Curation, Writing - Review & Editing; Paula Montero-Llopis: Validation, Resources, Data Curation, Writing - Review & Editing; Glyn Nelson: Validation, Resources, Data Curation, Writing - Review & Editing; Roland Nitschke: Validation, Resources, Data Curation, Writing - Review & Editing; Jaime A.
Pimentel: Validation, Resources, Data Curation, Writing - Review & Editing; Stefanie Weidtkamp-Peters: Validation, Resources, Data Curation; Peter Park: Supervision, Project administration, Funding acquisition; Burak Alver: Conceptualization, Validation, Resources, Writing - Review & Editing, Supervision, Project administration, Funding acquisition; David Grunwald: Conceptualization, Methodology, Investigation, Resources, Writing - Review & Editing, Supervision, Project administration, Funding acquisition; Caterina Strambio-De-Castillia: Conceptualization, Methodology, Software, Validation, Investigation, Resources, Data Curation, Writing - Original Draft, Writing - Review & Editing, Visualization, Supervision, Project administration, Funding acquisition.
Acknowledgments
We would like to thank Lawrence Lifshitz at the Biomedical Imaging Group of the Program in Molecular Medicine at the University of Massachusetts Medical School for invaluable intellectual input, countless fruitful discussions, friendship, advice, and steadfast support throughout the development of this project. We acknowledge Matteo Luban for critically reading and editing the manuscript.
This project could never have been carried out without the leadership, insightful discussions, support and friendship of all OME consortium members, with particular reference to Jason Swedlow, Josh Moore, Chris Allan, Jean Marie Burel, and Will Moore. We are massively indebted to the RIKEN community for their fantastic work to bring open science into biology. We would like to particularly acknowledge Norio Kobayashi and Shuichi Onami for their friendship and support.
We thank all members of BioImaging North America, German BioImaging, Euro-BioImaging (in particular Antje Keppler and Federica Paina) and QUAREP-LiMi (in particular all members of Working Group 7 - Metadata; quarep.org) for invaluable intellectual input, fruitful discussions and advice. We are also indebted to the following individuals for their continued and steadfast support: Jeremy Luban, Roger Davis, and Thoru Pederson at the University of Massachusetts Medical School; Burak Alver, Joan Ritland, Rob Singer, and Warren Zipfel at the 4D Nucleome Project; Ian Fingerman, John Satterlee, Judy Mietz, Richard Conroy, and Olivier Blondel at the NIH.
This work was supported by NIH grant #1U01EB021238 and NSF grant 1917206 to D.G., NIH grant # U01CA200059 to C.S.D.C and D.G., NIH grant # 5R01GM126045-04 to R.C. and by grant #2019-198155 (5022) awarded to C.S.D.C. by the Chan Zuckerberg Initiative DAF, an advised fund of Silicon Valley Community Foundation, as part of their Imaging Scientist Program. D.S. was funded in part by NIH/NCI grants U54CA209988 and U2CCA23380. C.M.B. was funded in part by grant #2020-225398 from the Chan Zuckerberg Initiative DAF, an advised fund of Silicon Valley Community Foundation. R.N. was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) grant number Ni 451/9-1 MIAP-Freiburg. C.S.S. was supported by the Netherlands Organisation for Scientific Research (NWO), under NWO START-UP project no. 740.018.015 and NWO Veni project no. 16761. T.G. is a member of RTmfm network and IMAG’IC core facility is supported by the National Infrastructure France BioImaging (grant ANR-10-INBS-04) and IBISA consortium. S.W.-P. was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) project-ID 267205415 – SFB 1208, project INF. The UNC Neuroscience Microscopy Core (RRID:SCR_019060) is supported, in part, by funding from the NIH-NINDS Neuroscience Center Support Grant P30 NS045892 and the NIH-NICHD Intellectual and Developmental Disabilities Research Center Support Grant P50 H103573. M.S.I. was supported by grant number 2019-198107, from the Chan Zuckerberg Initiative DAF, an advised fund of Silicon Valley Community Foundation.
Footnotes
# Members of the BioImaging North America Quality Control and Data Management Working Group
Abbreviation list
- BINA: BioImaging North America
- 4DN: 4D Nucleome
- FAIR: Findable, Accessible, Interoperable and Reusable
- OME: Open Microscopy Environment
- QUAREP-LiMi: QUAlity Assessment and REProducibility for Instruments and Images in Light Microscopy
References