The Canyons MCZ DY200 drop camera & AUV data analysis
Description
JNCC requires the analysis of seabed imagery (stills and video) collected on the DY200 survey from the drop-down camera and Autonomous Underwater Vehicle (AUV). As the survey has not been completed yet, estimates of still image numbers and video length have been provided. These numbers may be subject to change after the survey and will be discussed with the relevant contractor.
Table 2 Estimated number of stills and video data to be analysed from DY200.
Data type | Quantity
Drop camera video | 50 hours
Drop camera stills | 5,000 images
AUV stills | 1,000 images
The following will be supplied to the successful contractor:
• Access to BIIGLE project with all stills, video and label trees.
o Please email the contacts for technical information (see page 1) if you would like to be added as a guest to the BIIGLE project to review the imagery.
• Epibiota Quality Assurance Framework Proforma spreadsheets
4. Project Objectives
1. Analysis of stills and video taken with the drop-camera system
2. Filtering, processing and analysis of a subset of AUV imagery
3. Provide substrate and taxonomic image reference collections for each substrate type and taxon identified from imagery.
4. Produce a final analysis report including, at a minimum, sections detailing the methodology, results, and all QA work undertaken, together with any remedial action deemed necessary. The report should be no longer than 10,000 words including all tables and appendices and should be provided electronically via email as a Microsoft Word document.
5. Produce a report comparing the ability of the AUV and the drop camera to identify the following deep-sea features (if present in the AUV data), with recommendations on the most appropriate survey technique for each:
o Coral gardens
o Seapen and burrowing megafauna
o Deep Sea Sponge aggregations (if present in the data collected)
6. Create a subset of stills annotated in a way suitable to serve as training data for artificial intelligence models, as described below.
The contractor must:
• Undertake the analysis as set out below, adhering to the NMBAQC epibiota interpretation guidelines (Turner et al., 2016). Please note these guidelines are currently being updated by JNCC; the contractor must check with JNCC whether the updated guidelines are available when the contract starts.
• Use BIIGLE to annotate video and stills as described below. Alternative image annotation software may be used subject to agreement with the project officer.
• Ensure that stills and video references used in analysis outputs are identical to those used in the naming of the original media to enable future reconciliation between data and media. If identical naming is not possible, a suitable alternative should be agreed with JNCC.
Some information, where specified, may be recorded directly into the proformas provided. The majority will be recorded first into BIIGLE and then used to populate the proforma. No analysis additional to what is described in this document is required. Any deviation from this methodology should be approved in writing by the project officer.
4.1. Analysis of video data from drop-camera
Video should be analysed in BIIGLE using the label trees shown in Table 3. A high-level review should be conducted as described in section 2.1 of Turner et al. (2016). Annotations can be added to videos as either tier 1 or tier 2 annotations depending on the label tree used. More details on the video annotation tiers and how they should be applied are provided in Appendix A.
Video will be analysed to extract the following information (all information should be recorded using the provided BIIGLE label trees, unless specified otherwise):
1. Video should be segmented into areas of continuous broadscale seabed habitat type (detailed in step 2) greater than or equal to 5 m along-transect distance; JNCC will provide positional information for this purpose (a sketch of the segment-length check is given after this list). The segment label tree should be used to delineate these segments, and labels from other trees should be attached to each segment using the "add label" tool in BIIGLE.
2. The Marine Habitat Classification of Britain and Ireland (v 22.04) will be used, and a new segment should be started if the habitat classification changes.
3. Each segment will be assigned image quality scores using labels from the following two label trees. Further analysis of video segments will be dependent on the image quality score. For example, if a segment is given a score of zero, no further analysis should be carried out for that segment.
a. NMBAQC image quality: a summary of these scores is shown in Table 4 and described in more detail in section 2.1 of Turner et al. (2016).
b. JNCC image quality: a summary of these labels is shown in Table 5.
4. Identify evidence of anthropogenic impacts on the seabed:
a. Use the litter label tree to record the presence of litter using the categories listed in Annex 5.1 of the Joint Research Centre's Guidance on Monitoring of Marine Litter in European Seas.
b. Use the Anthropogenic label tree to annotate trawl marks or anthropogenic impacts other than litter. This label tree is not exhaustive, and new labels may need to be added.
5. Use the biotope label tree to assign biotopes, up to level 6 of the Marine Habitat Classification of Britain and Ireland hierarchy and in accordance with Parry (2019).
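As an illustration of the 5 m along-transect check in step 1, the following is a minimal Python sketch. It assumes the positional information supplied by JNCC takes the form of ordered latitude/longitude fixes covering a candidate segment; the function names, threshold handling and example coordinates are illustrative assumptions, not part of the specified workflow.

import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two WGS84 fixes.
    r = 6371000.0  # mean Earth radius (m)
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def segment_length_m(fixes):
    # Cumulative along-transect distance over consecutive (lat, lon) fixes
    # spanning one stretch of continuous broadscale habitat.
    return sum(haversine_m(a[0], a[1], b[0], b[1]) for a, b in zip(fixes, fixes[1:]))

MIN_SEGMENT_M = 5.0  # segments shorter than this are not delineated
fixes = [(48.100000, -9.500000), (48.100020, -9.500010), (48.100045, -9.500025)]
print(segment_length_m(fixes) >= MIN_SEGMENT_M)  # True: ~5.3 m covered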
Table 3 Label trees for video annotation and the annotation type which should be used for each label tree
Video analysis label tree | Video annotation type
Segment | Tier 1
Marine habitat classification of Britain and Ireland (v 22.04) | Tier 1
JNCC image quality | Tier 1
NMBAQC image quality | Tier 1
Biotope | Tier 1
Coral Gardens (Henry and Roberts, 2014a) | Tier 1
Seapen and Burrowing Megafauna (please note the definition of this FOCI is currently being updated by JNCC - see Appendix B) | Tier 1
Deep Sea Sponge Aggregations (Henry and Roberts, 2014b) | Tier 1
Coral Reef (if present) | Tier 1
Burrows | Tier 1
Litter | Tier 2
Anthropogenic | Tier 2
Table 4 Summary of NMBAQC image quality categories (Turner et al., 2016)
Table 5 JNCC image quality categories
Imagery quality level | Description
Fauna | Most fauna can be identified (including smaller taxa such as brittlestars)
Conspicuous fauna | Large and conspicuous fauna can be identified (e.g. sponges, soft corals)
Substrate | The substrate type can be identified, but the fauna cannot (e.g. the water column is obscured / the camera is too high off the seabed)
Zero | No visibility of the seabed; substrate cannot be identified
4.2. Analysis of stills data from drop-camera and AUV
Stills should be analysed in BIIGLE using the label trees shown in Table 6. Some tier 1 labels may be added directly into the proforma (see below). More details on the stills annotation types and how they should be applied are provided in Appendix A.
Table 6 Label trees for stills annotation and the annotation type which should be used for each label tree
Stills analysis label tree | Stills annotation type
NMBAQC image quality | Tier 1
JNCC image quality | Tier 1
Substrate | Tier 1
Biota | Tier 2
Laser points | Tier 2
Litter | Tier 2
Anthropogenic | Tier 2
Stills will be analysed to extract the following information:
1. Each still will be assigned image quality scores using labels from the following two label trees. Further analysis of the still image will be dependent on the image quality score. For example, if a still image is given a score of 'very poor' (NMBAQC) and 'substrate' (JNCC), no taxonomic identification should be carried out for that image.
a. NMBAQC image quality: a summary of these scores is shown in Table 4 and described in more detail in section 2.1 of Turner et al. (2016).
b. JNCC image quality: a summary of these labels is shown in Table 5.
2. Identification and enumeration of epifauna. JNCC will provide a reference collection built from images previously taken at these sites. Annotations should be made using the Biota label tree. For each still image analysed, identify and quantify all:
a. Solitary and/or erect epifaunal species present.
b. Bioturbation traces using counts (achieved by using the point annotation tool within BIIGLE).
c. Colonial and/or encrusting epifaunal species present, as far as possible, using percentage cover (achieved by using the polygon, magic wand, or brush tools within BIIGLE).
3. Record substrate type and Faunal Turf cover to the nearest 10% directly into the proforma.
4. Use the laser points label tree to annotate lasers and calculate image area in BIIGLE (a worked sketch of the area calculation follows this list). Analysts are not to use the automated laser detection tool in BIIGLE.
5. Identify evidence of anthropogenic impacts on the seabed:
a. Use the litter label tree to record the presence of litter using the categories listed in Annex 5.1 of the Joint Research Centre's Guidance on Monitoring of Marine Litter in European Seas.
b. Use the Anthropogenic label tree to annotate trawl marks or anthropogenic impacts other than litter. Note that new labels may need to be added to the label tree.
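Steps 2c and 4 reduce to simple scale arithmetic, sketched below in Python. The sketch assumes a rig with two parallel lasers a known distance apart (0.10 m is an assumed value, not the DY200 calibration) and annotation coordinates in pixels; all names are hypothetical, and in practice the area values would come from BIIGLE itself.

import math

LASER_SEPARATION_M = 0.10  # assumed laser spacing; use the survey calibration

def metres_per_pixel(laser_a_px, laser_b_px):
    # Image scale from the pixel distance between the two laser dots.
    dx = laser_b_px[0] - laser_a_px[0]
    dy = laser_b_px[1] - laser_a_px[1]
    return LASER_SEPARATION_M / math.hypot(dx, dy)

def image_area_m2(width_px, height_px, scale):
    # Seabed area covered by the frame, assuming the camera is roughly
    # perpendicular to a flat seabed.
    return (width_px * scale) * (height_px * scale)

def percent_cover(polygon_px, width_px, height_px):
    # Percentage cover of one polygon annotation: shoelace area in px^2
    # over total image area in px^2 (the metric scale cancels out).
    area2 = 0.0
    for (x1, y1), (x2, y2) in zip(polygon_px, polygon_px[1:] + polygon_px[:1]):
        area2 += x1 * y2 - x2 * y1
    return abs(area2) / 2.0 / (width_px * height_px) * 100.0

scale = metres_per_pixel((1020, 1080), (1420, 1080))  # laser dots 400 px apart
print(image_area_m2(4096, 3072, scale))               # ~0.79 m2 at this scale
print(percent_cover([(100.0, 100.0), (300.0, 100.0), (200.0, 260.0)], 4096, 3072))  # ~0.13 %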
4.3. Developing training data for artificial intelligence
In addition to providing costs for the analysis described above, please provide a quote for the following preparation of 100 stills from the drop-camera in BIIGLE, to form a training dataset suitable for artificial intelligence models:
1. Stills will be re-annotated using bounding boxes. Images with an image quality of zero (NMBAQC equivalent) should be excluded:
a. All points and polygons must be converted into rectangles suitable for use in an AI model (see the sketch after this list).
b. Example box annotations for fauna annotated with a point will be provided to the successful contractor to enable an average box size (in pixels) to be estimated.
2. JNCC will provide 100 images randomly selected from across the dataset and from varying time points in a transect.
3. Re-annotated images will be stored in a separate BIIGLE volume within the 2023 PBR survey project.
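A minimal sketch of the conversion in step 1a, assuming annotations arrive as shape/pixel-coordinate pairs (as in a BIIGLE export) and that an average box size has already been estimated from the example annotations in step 1b; the data structure, names and sizes here are hypothetical.

def polygon_to_box(points):
    # Axis-aligned bounding rectangle (x, y, w, h) for a polygon, magic
    # wand, or brush annotation given as [(x, y), ...] pixel vertices.
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys)

def point_to_box(x, y, box_w, box_h):
    # Rectangle of the estimated average size centred on a point annotation.
    return x - box_w / 2.0, y - box_h / 2.0, box_w, box_h

AVG_W, AVG_H = 96.0, 96.0  # assumed average box size in pixels (per taxon)
annotations = [
    {"shape": "point", "points": [(512.0, 300.0)]},
    {"shape": "polygon", "points": [(10.0, 10.0), (60.0, 12.0), (40.0, 80.0)]},
]
for a in annotations:
    if a["shape"] == "point":
        (x, y), = a["points"]
        print(point_to_box(x, y, AVG_W, AVG_H))
    else:
        print(polygon_to_box(a["points"]))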
CPV Codes
71700000 - Monitoring and control services
71900000 - Laboratory services
73112000 - Marine research services