
Marine Habitat Mapping Technology for Alaska, J.R. Reynolds and H.G. Greene (eds.) Alaska Sea Grant College Program, University of Alaska Fairbanks. doi:10.4027/mhmta.2008.11


Video Analysis, Experimental Design, and Database Management of Submersible-Based Habitat Studies
Brian N. Tissot
Washington State University, School of Earth and Environmental Science, Vancouver, Washington

Abstract
Submersibles are used in a diverse array of scientific studies such as geophysical mapping and interpretation, physical and biological oceanographic studies, and fishery investigations. Studies using submersibles deserve careful attention to planning, design, and implementation in order to be effective in meeting their objectives. The most important first step in these studies is to carefully articulate and plan the experimental design of the project from start to finish. Special attention should also be paid to statistical issues, such as randomization, replication, independence, and power, as they can have a large impact on the usefulness of the data to meet the objectives of the study. The processing and analysis of videotapes resulting from submersible dives involves a high level of training and quality assurance with staff who are comfortable with taxonomic identification, survey methods, and the wide array of technical tools used to analyze these data. Issues associated with this video processing include training, minimizing observer bias, using the proper equipment, and entering and verifying data. Large, complex projects should always be managed using a relational database that can effectively integrate complex data types, can validate data types and ranges, and is significantly less prone to errors than spreadsheets. Given the high cost of conducting submersible studies, it is imperative that sufficient attention is focused on issues that impact data quality and that projects be well thought out in their entirety prior to going to sea.

Introduction
Scientific studies using submersibles are widely used to investigate the abundance and distribution of fishes, benthic invertebrates, and their associated habitats. The vehicles used include human-occupied submersibles (HOVs), remotely operated vehicles (ROVs), towed sleds, and more recently autonomous underwater vehicles (AUVs). Projects involving observations from submersibles may involve fishery biologists, marine ecologists, and marine geologists working in collaboration. Because these projects may integrate a diverse array of studies including geophysical mapping and interpretation, physical and biological oceanographic studies, and fishery investigations (e.g., Reynolds et al. 2001, Nasby et al. 2002, Wakefield et al. 2005), they typically generate a large quantity of video data that require complex and time-consuming post-processing and data management (Somerton and Glendhill 2005).

Although there are numerous examples in the literature describing the results of such studies (e.g., Stein et al. 1992, O'Connell et al. 2002, Yoklavich et al. 2000, Jagielo et al. 2003, Tissot et al. 2007), there are few published accounts that capture the complex issues associated with the design and implementation of these studies, especially post-processing and data management issues. Thus, the goal of this paper is to describe the design, implementation, and data management of products typically resulting from submersible studies. I will focus on the technical and logistical issues that cascade through these types of projects and how they can be managed effectively and efficiently both before and after fieldwork has been completed.

Experimental design
Before beginning any study the most important step is to carefully articulate and plan the experimental design of the project from start to finish (Green 1979). Detailed planning at the start can save significant time, as later changes in design will cascade through multiple levels of the project such as field surveys, video logging, database design, and geospatial and statistical analyses. Initially, the objectives of the study need to be clarified in as much detail as possible. Based on the objectives, specific hypotheses should be developed from the outset in order to clarify the statistical tests to be used, their assumptions, and statistical power issues (Underwood 1991). The field sampling strategy can then be derived from the objectives and hypotheses by taking into account the areas to be sampled, time at sea, data collection capabilities, and other logistical constraints (Krebs 1999). Once field variables and the various types of data to be collected are defined, data types and structures can be specified for the relational database, which can then be designed for data entry and validation. Special attention should be paid to statistical issues from the outset as they can have a large impact on the usefulness of the data to meet the objectives of the study (Hurlbert 1984, Eberhardt and Thomas 1991). The principal issues include randomization, replication, independence, and power. If samples are large, random selection of study sites from all possible sampling locations should be used to minimize sampling bias. Most sampling strategies use a stratified design to focus on particular depths and/or habitat types. Appropriately sized grids are then overlaid on maps of the study area and randomly selected grids are used to select sampling sites (e.g., Jagielo et al. 2003, Yoklavich et al. 2007) (Fig. 1).
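As a concrete illustration of stratified random site selection, the following minimal Python sketch draws grid cells at random within each depth-by-habitat stratum. The cell identifiers, strata, and sample sizes are invented for the example and are not taken from any of the studies cited here.

```python
# Hypothetical sketch of stratified random grid selection; cells, depth
# strata, and habitat labels are illustrative, not from the original study.
import random

# Each candidate cell is (cell_id, depth_stratum, habitat_stratum).
cells = [
    ("A1", "75-150 m", "rock"), ("A2", "75-150 m", "mixed"),
    ("B1", "150-300 m", "rock"), ("B2", "150-300 m", "mixed"),
    ("B3", "150-300 m", "rock"), ("C1", "75-150 m", "rock"),
]

def stratified_sample(cells, n_per_stratum, seed=42):
    """Randomly pick n cells from every (depth, habitat) stratum."""
    random.seed(seed)
    strata = {}
    for cell in cells:
        strata.setdefault(cell[1:], []).append(cell)
    selected = []
    for stratum, members in strata.items():
        k = min(n_per_stratum, len(members))   # guard against small strata
        selected.extend(random.sample(members, k))
    return selected

print(stratified_sample(cells, n_per_stratum=1))
```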


Figure 1. Stratified sampling design using depth (contours in m) and habitat type (various colors) to randomly select sampling locations for cowcod rockfish (Sebastes levis) on Tanner and Cortez Banks, California. A 1.5 × 1.5 km sampling grid is overlaid on mixed sediment and rocky substrata at 75-300 m depths. Dive tracks are indicated by green lines, each subdivided into three 20 min transects (yellow symbols). Inset shows a typical dive track with transects. Based on Yoklavich et al. (2007).

If the number of possible grids is small relative to the number being sampled, it is better to select sites systematically to ensure appropriate interspersion over the study area (Hurlbert 1984). Replication should occur both within and among strata (e.g., depth, habitat) to minimize confounding of spatial variation. If the study area has significant variation in habitat, the area should be stratified and subsampled with nested replicates to further account for spatial variation (Underwood 1997). If replicates within grids lack independence (i.e., transect segments within a dive) they should either be pooled or treated as non-independent repeated measures. Finally, and importantly, the power of the statistical design to reject the null hypotheses should be examined if the study is focused on developing baselines, detecting change over time, or on control-impact studies (Mapstone 1996, Krebs 1999). To address this important design issue, power analyses should be conducted. Power analysis requires some measure of the variability in the measured traits, which can be derived from pilot studies or related studies, and the amount of change to be detected, or the effect size (example in Tissot et al. 2007).
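A minimal sketch of such a power analysis is shown below, using a two-sample t-test power calculation from the statsmodels package. The pilot densities, effect size, and significance level are illustrative assumptions, not values from the studies cited above.

```python
# Hedged sketch: approximate number of transects needed to detect a given
# change in mean fish density with a two-sample t-test. The density values
# and effect size below are illustrative, not from the paper.
import numpy as np
from statsmodels.stats.power import TTestIndPower

pilot_densities = np.array([4.2, 6.1, 3.8, 5.5, 7.0, 4.9])  # fish per 100 m^2, pilot transects
sd = pilot_densities.std(ddof=1)
change_to_detect = 2.0                       # absolute change in mean density to detect
effect_size = change_to_detect / sd          # Cohen's d

n_per_group = TTestIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.80, alternative="two-sided"
)
print(f"~{np.ceil(n_per_group):.0f} transects per group for 80% power")
```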

Survey preparations
Prior to fieldwork it is imperative to provide training for field observers and to plan for the types of data to be collected and their management, both at sea and in the lab. One of the major advantages of HOVs over ROVs is that observers in the sub can often see, resolve, and record (via the audio track) many more animals than are visible from the video cameras. However, training submersible observers is time consuming and essential to providing accurate counts of marine organisms. Training usually involves a combination of pre-dive reviews of identification, counts, and/or size estimates; underwater surveys supervised by an experienced observer (which could initially be by scuba in some habitats); reviews of logged video with experienced observers; and/or comparisons to physical fish models (e.g., plastic fish) of known size and abundance. In general, observer bias in the band-transect approaches common to submersible surveys is lower for conspicuous, slow-moving taxa and greater for fast-moving and/or cryptic species (Williams et al. 2006). However, with experience observer bias tends to diminish rapidly (Williams et al. 2005), and differences from values recorded by well-trained individuals can be quite small (Yoklavich and O'Connell 2008).

During the cruise, detailed notes should be taken of all procedures, including lists of data types collected (e.g., tapes, images, navigation files, samples), where the raw data files are physically located, and the distribution of backup copies. Videotapes from the submersible are a critical product of the study and should be handled with special care. Ideally, two copies of the raw video feed should be made so one tape can be stored as a master and the other used to make additional copies as needed. Not only is this procedure good practice for protecting original data, but videotapes (analog or digital) can lose luminance after only a few plays. The navigation system (USBL, DVL, or other) needs to be time-synced with the time code on the video, ideally by encoding on the audio channel and/or the video overlay system (a time-matching sketch is given below). Any differences in the time codes of the various data sources need to be carefully checked and logged in order for data derived from the videotapes to be accurately linked to geospatial information (and thus habitat and bathymetric maps) and other information derived from the submersible (e.g., oceanographic data, depth, distance off bottom).

During the submersible surveys it is important to record as much supportive information as possible for interpretation of survey data and video back in the lab. Still photos, voucher specimens, and comments recorded on the video audio channel all contribute to a better understanding of the various products from the study. Some towed sleds and ROVs, such as ROPOS (Shepherd and Wallace 2002), have surface operations that allow real-time frame grabs and/or data loggers where ship-based observers can record observations, navigation information, and other data that can assist in video interpretation later. Be sure to back up everything from the cruise, such as videotapes, still photos, navigation, and bridge logs, and distribute copies to colleagues for safekeeping. For longevity, data should be archived with NOAA's National Ocean Data Center (NODC) or other appropriate centers.
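The sketch below illustrates one way such time-matching might be done once the clock offset between the video time code and the navigation log is known, using pandas. The column names, timestamps, and the 3-second offset are all hypothetical.

```python
# Illustrative sketch of linking video observations to navigation fixes by
# time after applying a known clock offset; column names and the 3 s offset
# are assumptions for the example, not values from the paper.
import pandas as pd

nav = pd.DataFrame({
    "time": pd.to_datetime(["2007-08-01 10:00:00", "2007-08-01 10:00:05",
                            "2007-08-01 10:00:10"]),
    "lat": [44.0010, 44.0012, 44.0014],
    "lon": [-124.5010, -124.5012, -124.5014],
    "depth_m": [102.0, 102.4, 103.1],
})
obs = pd.DataFrame({
    "video_time": pd.to_datetime(["2007-08-01 10:00:06", "2007-08-01 10:00:11"]),
    "species": ["Sebastes levis", "Antipathes dendrochristos"],
})

obs["time"] = obs["video_time"] - pd.Timedelta(seconds=3)  # logged video/nav clock offset
linked = pd.merge_asof(obs.sort_values("time"), nav.sort_values("time"),
                       on="time", direction="nearest",
                       tolerance=pd.Timedelta(seconds=5))
print(linked[["species", "lat", "lon", "depth_m"]])
```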

Video analysis and data logging


The actual analysis and data logging from the videotapes can be very time consuming, so it is important to plan and budget accordingly.


Depending on the amount and complexity of the information to be extracted from the tapes, analysis time can exceed submersible time by as much as 5:1 or 10:1. Thus, a cruise that collects 100 hours of videotape can take 500-1,000 hours of lab analysis (or 3-6 months of full-time work for a single individual). The process of video analysis involves a high level of training and quality assurance with staff who are comfortable with taxonomic identification, survey methods, and a wide array of technical tools. Staff need to be familiar with the various data types generated during fieldwork (e.g., navigation, oceanographic information) and the equipment used to analyze the data, such as video equipment, a time code generator, and the relational database. If the same people are also involved in data analysis, they will need to be familiar with GIS, graphing, and statistical analysis software.

For consistency, and to minimize observer bias, it is essential to develop detailed protocols that describe each of the tasks to be completed. These should be developed in cooperation with all team members involved in the project and modified as the project progresses and new situations emerge. Protocols should describe the general procedure for logging data, with specific examples embedded in the text. Exceptions, caveats, and variations in the procedure due to different submersibles should also be included. Once the protocol becomes stable it can be used to train new individuals to conduct the work. An example of a procedure to classify physical habitats, based on the method in Stein et al. (1992), is partially listed in Table 1.

Training and quality assurance are essential for teams of individuals working together on a project, as the level of human bias in classifying habitat and in identifying, counting, and sizing fishes and invertebrates can be significant. Ideally, individuals directly involved in field identification and counting should do the video analysis; otherwise new individuals must be trained from the beginning. Training to conduct video logging should be done using a range of habitat types and organisms likely to be encountered, and using tapes that have previously been analyzed by experienced observers. Initially, the trainer should lead the trainee through the process, following the protocol and discussing issues as they arise. Trainees are then free to log initial video on their own. After a few transects have been completed, the trainer should review the video and trial data with the trainee, using the protocol to discuss similarities and differences between the trial run and data collected by an experienced observer. The trial data can then be quantitatively compared to previous data for overall similarity in habitat coding, taxonomic identification, counts, sizes, and other variables (a simple agreement check is sketched below).

This quality assurance process, from sample logging to quantitative analysis, should be repeated until the trainee can produce data with a reasonable level of similarity to that from an experienced observer. In our lab, 90% similarity is considered an acceptable level of consistency, and trainees can then, for the most part, be left on their own to log data. However, for long-term data quality, projects should be conducted within the quality assurance cycle: plan the analysis, do the work using the protocol, check periodically for data consistency, and act to update the protocol and retrain individuals as needed.
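A minimal sketch of the kind of trainee-versus-expert agreement check described above might look like the following. The habitat-code records and the simple patch-by-patch matching rule are illustrative assumptions, while the 90% threshold follows the text.

```python
# Minimal sketch of a trainee-vs-expert agreement check; the records and the
# matching rule are illustrative, the 90% threshold follows the paper.
expert  = ["RR", "RB", "CB", "MM", "SM", "RR", "BC", "MM", "MM", "RP"]
trainee = ["RR", "RB", "BC", "MM", "SM", "RR", "BC", "MS", "MM", "RP"]

def percent_agreement(a, b):
    """Percentage of habitat-patch codes on which two observers agree."""
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / len(a)

score = percent_agreement(expert, trainee)
print(f"Agreement: {score:.0f}%  ->  {'pass' if score >= 90 else 'retrain'}")
```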

Figure 2. Typical video analysis laboratory set-up illustrating digital video player, high-resolution monitor, counter, and computer with database.

There are a variety of ways to staff a video analysis laboratory. In academic institutions where graduate students are the primary source of labor, overlapping appointments allow more experienced students to train newer ones, with the faculty supervisor providing overall quality assurance. Other models, also used in academic settings but more common in state and federal agencies, are to use permanent staff (technicians, biologists, or other) to conduct video analyses, which ensures more long-term consistency, and/or to use independent specialized consultants.

When logging data from videotapes it is important to have a comfortable workstation with easy access to all the necessary equipment. At a minimum there should be a high-quality video player (VCR or digital video player) with audio (unless videotapes are digitized, see below), a high-resolution monitor, and a computer. Data should be entered directly into a relational database for validation. Other equipment that can aid in the analysis might include key counters for enumerating several different taxa simultaneously and a time code generator, such as a Horita time code wedge (Fig. 2).

Depending on the complexity of the work, the workstations can be occupied by a single observer, which is common for fish observations and habitat classification, or by 1-3 observers dividing up taxa into manageable pieces, which is common for benthic invertebrates. The video can be analyzed in a single pass or in multiple viewings, with each run focusing on different taxa or making different observations (e.g., size, counts, associations). In some set-ups, videotapes from multiple submersible cameras can be watched simultaneously to improve resolution and size measurements (e.g., Harvey et al. 2005). Video may also be randomly sampled to shorten post-processing time. Subsampling of video usually requires stratifying the sample by depth and/or habitat to avoid biasing the sample, and a power analysis or analysis of species-area curves to decide on the number of subsamples to use (Krebs 1999).


Table 1. Example protocol for classifying benthic habitats. Methods are based on Stein et al. (1992) and Greene et al. (1999).

Protocol for Continental Shelf Habitat Classification


Revised: 10/18/2007

A. Recording habitat patches
1. Record the start and end time of each habitat patch (sections of contiguous habitat types). Start/end times correspond to when the sizing lasers touch the next habitat type. If the lasers swing up and down onto and off of the new patch, wait until the lasers stay on the patch to record the time. End time corresponds with the beginning of the next unique habitat type.
2. Transitions to new habitat types are difficult to delineate when gradients occur; be consistent and move through the video several times to delineate changes; add a comment to the habitat entry if the delineation was difficult.

B. Identifying habitat types
1. Habitat patches must be at least 10 seconds long for Delta and 30 seconds long for ROPOS, with 10% error to allow for play with the lasers. This means the absolute minimum patch length for Delta is 9 seconds and for ROPOS is 27 seconds. Any patches of shorter duration should be excluded from the analysis.
2. Patches must have two habitat codes, e.g., BC, MP. The first code is ≥50% and <80% cover (primary habitat type). The second code is ≥20% and <50% cover (secondary habitat type). If two habitat components are in relatively equal proportions, classify that habitat patch with the smaller grain size of the two as dominant. For example, if you were in an area of boulder and cobble in equal proportions, the code would be CB, not BC, since cobbles are smaller than boulders. In addition, if you were in a ridge-dominated area with both pebble and cobble as secondary components in equal proportions, the code would be RP, not RC. Habitat codes are as follows:
R = Ridge. Description: High relief rocky areas. Criteria: Generally ≥1 m relief; solid rock substrate; angle ≥30° and <80°. Classify the habitat type as Ridge even if you are in a ridge trough and it is relatively flat. If the trough is filled with sand, cobble, boulder, and such, and those dominate, then the patch description should reflect this.

F = Flat rock. Description: Moderate relief rocky areas. Criteria: <1 m relief; substrate angle <30°. These are flat rock areas that are away from ridges. Not to be confused with flat trough areas between ridges.

B = Boulder. Description: Moderate relief boulder. Criteria: ≥25.5 cm and <1 m. These are not connected to ridges. Isolated, and may show evidence of rolling.

C = Cobble. Description: Moderate relief cobble. Criteria: ≥6.5 cm and <25.5 cm.

P = Pebble. Description: Moderate relief pebble. Criteria: ≥2 cm and <6.5 cm.

G = Gravel. Description: Low relief gravel. Criteria: ≥4 mm and <2 cm. Not always possible to classify.

S = Sand. Description: Low relief sand. Criteria: Small particle size. Lighter color than mud. Generally shallower water.

M = Mud. Description: Low relief mud. Criteria: Small particle size. Darker color than sand. Generally deeper water.

U = Unconsolidated. Description: Low relief unconsolidated sediment with indeterminate grain size. Criteria: Small particle size that could be anywhere in grain size from the finest mud to the coarsest sand.

C. Quality Assurance
1. Individuals classifying habitats should be trained in identification based on frame grabs from previously classified sample bottom types. Training should include:
   a. The use of lasers to estimate geological sizes of habitat categories in the plane of the lasers.
   b. How to estimate percent cover of different categories; standard images should be available for comparison.
2. Training should encompass both habitat type classifications and percent cover estimates on tapes that have been classified by experienced users.
3. A quality assurance test should be done on the trainee's data to assure accuracy is reasonable (>90% agreement).
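The coding rules in Table 1 lend themselves to a simple automated check. The sketch below is one possible implementation of the two-character coding and minimum-duration rules; the function names and percent-cover inputs are hypothetical, and the grain-size ordering follows the classification described later in the text.

```python
# Sketch of the two-character coding rule from Table 1 (primary >=50% cover,
# secondary >=20%, ties broken toward the smaller grain size) plus the
# minimum patch-duration check; the percent-cover inputs are hypothetical.
GRAIN_ORDER = ["M", "S", "G", "P", "C", "B", "F", "R", "T"]  # small -> large/high relief

def habitat_code(cover):
    """cover: dict of substratum code -> percent cover for one patch."""
    # Rank by descending cover, breaking ties toward the smaller grain size.
    ranked = sorted(cover.items(),
                    key=lambda kv: (-kv[1], GRAIN_ORDER.index(kv[0])))
    primary, secondary = ranked[0], ranked[1]
    if primary[1] < 50 or secondary[1] < 20:
        raise ValueError("patch does not meet the 50%/20% cover criteria")
    return primary[0] + secondary[0]

def patch_long_enough(duration_s, platform):
    minimum = {"Delta": 9, "ROPOS": 27}   # 10 s / 30 s with 10% allowance
    return duration_s >= minimum[platform]

print(habitat_code({"C": 50, "B": 50}))   # CB: cobble wins the tie
print(patch_long_enough(12, "Delta"))     # True
```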


The video can be viewed directly from the tapes or accessed in digital form on a computer. One advantage of the latter is the ability to instantly locate any position in the video for analysis. A disadvantage is the time required to convert video to digital storage and the large amount of hard drive space needed to accommodate the files.

Habitat classification and visual surveying


One common method for classifying habitats, widely used on the U.S. West Coast, is based on the procedure described in Hixon et al. (1991) and Stein et al. (1992). For examples and variations on this method see Anderson and Yoklavich (2007), Love and Yoklavich (2007), Yoklavich et al. (2000, 2007), Tissot et al. (2006, 2007), and Wakefield et al. (2005). This method classifies physical habitats using a combination of nine different categories of substrata and standard geological definitions (e.g., Greene et al. 1999). In order of increasing particle size or relief, these substrata are: mud (code M), sand (S), gravel (G), pebble (P), cobble (C), boulder (B), continuous flat rock (F), rock ridge (R), and pinnacles (T). A two-character code is assigned each time a distinct change in substratum type is noted in the video, thus delineating habitat patches of uniform type. The first character in the code represents the substratum that accounted for at least 50% of the patch, and the second character represents the substratum accounting for at least 20% of the patch (e.g., RM represents a patch with at least 50% cover by rock ridge and at least 20% cover by mud). In some studies, habitat patches may also be assigned a code based on the degree of three-dimensional structure as defined by the vertical relief of the physical substrata relative to the seafloor (Tissot et al. 2006, Love and Yoklavich 2007).

The area of each habitat patch can be determined using navigation data in ArcGIS to calculate habitat patch length and multiplying by the width of the transect, as determined empirically in the field or by paths delineated using sizing lasers. The use of lasers to delineate transect boundaries works best in low-relief habitats. In more complex, high-relief habitats lasers can become problematic because edge effects increase with the uncertainty in defining the boundaries of the transect. Other methods to delineate transect width include measuring the area of view with lasers fixed at a set distance (Yoklavich et al. 2000) and photogrammetric methods that grab frames of precisely defined area for analysis (Jagielo 2004). Line transect methods (Buckland et al. 2001) can also be used when transect widths are uncertain or when the target species is uncommon (e.g., O'Connell et al. 2002, Yoklavich et al. 2007).

Fishes and benthic invertebrates are commonly surveyed using methods adapted from the visual belt transects used for scuba surveys on shallow reefs (e.g., Sale and Sharp 1983).

HOV observations can be made through either forward or starboard portholes, which are mirrored by corresponding video cameras that overlap with the observer's field of view. ROVs, towed sleds, and AUVs typically face forward, downward, or at an angle in relation to the substratum. Transect areas of known width are commonly delineated by lasers or other devices that allow the calculation of organism density (a minimal density calculation is sketched below). In HOVs, or ROVs where video is streamed in real time to scientists on the surface, observers can verbally tape-record observations about the species, size or size class, abundance, and behavior of individuals visible within the transect area. For invertebrates, which are frequently too numerous to count, the focus may be on individuals larger than 5 cm in size, or megafaunal invertebrates (Tissot et al. 2006). Similarly, fishes can be grouped and recorded at varying taxonomic levels depending on the objectives of the study. Once in the lab, video logging of data is facilitated by reference to audio logs of the dive, written notes, and/or voucher specimens identified by taxonomic experts. Additional observations may include microhabitat utilization (Hart 2004), behavior (Puniwai 2002), and/or associations between fish, habitat, and structure-forming invertebrates (Love and Yoklavich 2007; Tissot et al. 2006, 2007).
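As a minimal illustration, a patch-level density calculation reduces to count divided by patch length times transect width. The numbers below are made up for the example.

```python
# Illustrative density calculation: habitat patch area from patch length
# (e.g., derived from navigation in GIS) times transect width (e.g., from
# sizing lasers); all numbers are invented for the example.
def patch_area_m2(patch_length_m, transect_width_m):
    return patch_length_m * transect_width_m

def density_per_100m2(count, patch_length_m, transect_width_m):
    area = patch_area_m2(patch_length_m, transect_width_m)
    return 100.0 * count / area

# 14 rockfish counted on a 230 m long patch with a 2 m wide transect.
print(f"{density_per_100m2(14, 230.0, 2.0):.1f} fish per 100 m^2")
```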

Database management
Large, complex projects should always be managed using a relational database, such as MS Access. In addition to allowing multiple users to simultaneously access the data on a network, a relational database effectively integrates complex data types (included digital images and video), validates data types and ranges, and is significantly less prone to errors than flat-file databases such as spreadsheets. Through querying, relational databases can allow seamless integration with a wide variety of other programs including graphical and statistical programs, and with ArcGIS (Wright et al. 2007). Relational databases can link several different kinds of data using key fields in data tables. Data tables can include cruise metadata, vessel and submersible data, fish and invertebrate survey data, taxonomic information, habitat data, and navigation and oceanographic information. Through referential integrity, cross-referencing of (for example) dive numbers and taxonomic fields insures that valid information connects across multiple data sources. An example of a database structure used to manage data from a submersible project is illustrated in Fig. 3.
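The sketch below shows such a structure using SQLite rather than MS Access, purely for illustration; the table names, fields, and constraints are assumptions and not the author's actual schema.

```python
# Hedged sketch of a relational structure with referential integrity and
# simple validation, using SQLite for illustration (the paper uses MS Access);
# table and field names are assumptions, not the author's actual schema.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")   # enforce referential integrity
con.executescript("""
CREATE TABLE dives (
    dive_id INTEGER PRIMARY KEY,
    vessel TEXT, dive_date TEXT, site TEXT
);
CREATE TABLE habitat_patches (
    patch_id INTEGER PRIMARY KEY,
    dive_id INTEGER NOT NULL REFERENCES dives(dive_id),
    start_time TEXT, end_time TEXT,
    habitat_code TEXT CHECK (length(habitat_code) = 2)   -- validate the code format
);
CREATE TABLE fish_observations (
    obs_id INTEGER PRIMARY KEY,
    patch_id INTEGER NOT NULL REFERENCES habitat_patches(patch_id),
    species TEXT NOT NULL,
    n_individuals INTEGER CHECK (n_individuals >= 0)      -- validate the range
);
""")
con.execute("INSERT INTO dives VALUES (1, 'Velero IV', '2007-08-01', 'Heceta Bank')")
con.execute("INSERT INTO habitat_patches VALUES (10, 1, '10:00:00', '10:03:40', 'RB')")
con.execute("INSERT INTO fish_observations VALUES (100, 10, 'Sebastes levis', 3)")
```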

Querying and data integration


One of the major strengths of a relational database is the ability to manage large quantities of data and to selectively integrate and extract multiple types of data through database querying. Unlike conventional flat-file databases, which require sorting and/or rearranging data to provide data summaries, relational databases do not change the structure of the underlying data but instead provide new data tables that represent joins of existing data from multiple sources (Fig. 4).


Figure 3. Relationships among data tables in an MS Access relational database illustrating common fields and links among tables to establish referential integrity.

Fish data, for example, can be summarized by location, species, habitat type, or time of day to make graphs or run statistical analyses (a join-and-summarize sketch is given below). By varying the level of querying (e.g., by dive, habitat patch, depth, and/or time), data can be extracted at the proper level of replication for complex statistical analyses, such as repeated-measures ANOVA. Biological data can also be integrated with data collected at multiple spatial and/or temporal scales, such as navigation, temperature, and depth, in order to create maps, examine correlations, stratify data, or perform any number of analyses depending on the hypotheses being tested.

An additional major strength of relational databases is the ease of integrating geographically referenced data with external applications such as ArcGIS and other spatial tools. ArcGIS, for example, can directly read MS Access tables and create maps that integrate geospatial layers from multiple data types (Fig. 5). These types of maps can also generate spatial data (e.g., nearest neighbor distances) that can be used in statistical analyses to examine the randomness of biological associations (Pirtle 2005, Tissot et al. 2006). Further, geospatial tools such as kriging can be used to examine large-scale distributional patterns that can subsequently
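The following sketch mimics such a join-and-summarize query using pandas rather than an Access query; the table contents and field names are invented for the example.

```python
# Illustrative join-and-summarize query, shown here with pandas; in practice
# this would be an MS Access/SQL query. Table contents are made up.
import pandas as pd

patches = pd.DataFrame({"patch_id": [10, 11, 12],
                        "dive_id": [1, 1, 2],
                        "habitat_code": ["RB", "MM", "RB"]})
fish = pd.DataFrame({"patch_id": [10, 10, 11, 12],
                     "species": ["Sebastes levis", "S. paucispinis",
                                 "S. levis", "S. levis"],
                     "count": [3, 1, 2, 5]})

joined = fish.merge(patches, on="patch_id")                  # relational join
summary = (joined.groupby(["habitat_code", "species"])["count"]
                 .sum()
                 .reset_index())
print(summary)
```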

contribute to management plans (Pirtle 2005, Wakefield et al. 2005). Kriging is a geospatial technique that uses least-squares algorithms to interpolate the value of a variable at an unobserved location from observations of its value at nearby locations. This method is illustrated in Fig. 6, which extrapolates the results of a multivariate analysis that included data on the abundance of fish, invertebrates, and habitats derived from a submersible study at Cordell Bank, California.
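A rough sketch of this kind of interpolation is given below using a Gaussian process regressor, which is closely related to ordinary kriging; the coordinates, scores, and kernel settings are invented, and scikit-learn is an assumed dependency rather than a tool named in the paper.

```python
# Minimal kriging-style interpolation sketch using a Gaussian process
# (GP regression and ordinary kriging are closely related); the coordinates
# and multivariate scores below are invented for the example.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Observed locations (easting, northing in km) and first-dimension scores.
xy = np.array([[0.0, 0.0], [1.0, 0.2], [2.0, 1.0], [0.5, 2.0], [1.8, 2.2]])
scores = np.array([-1.2, -0.4, 0.3, 0.8, 1.1])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-3)
gp.fit(xy, scores)

# Predict onto a regular grid for mapping.
gx, gy = np.meshgrid(np.linspace(0, 2, 5), np.linspace(0, 2.5, 5))
grid = np.column_stack([gx.ravel(), gy.ravel()])
predicted = gp.predict(grid).reshape(gx.shape)
print(predicted.round(2))
```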

Summary and conclusions


Effective and efficient management of data generated from submersible studies clearly requires careful thought, planning, and training at multiple levels of the project. The goal of this paper has been to illustrate important considerations that arise at various stages of a project, particularly with respect to experimental design, video analysis and logging, and database management. If proper attention is given to the collection and management of data, and to the training of personnel involved in post-processing, studies can be conducted that clearly meet their stated objectives and provide relatively seamless integration among the spatial, graphical, and statistical tools necessary to conduct the work.


Figure 4. Example of a query in MS Access that joins linked data tables (top) and the result of the query (left).


Figure 5. Map created with ArcGIS showing the distribution of the black coral Antipathes dendrochristos (green triangles) along Delta submersible transects (line segments) in relation to species of rockfish observed, which are indicated by species codes (e.g., YOY, BANK). Depth contour shown in meters. The geospatial link between fish and invertebrate data was used to conduct a nearest neighbor analysis of fish-invertebrate associations (Tissot et al. 2006).


Figure 6. Kriging analysis prediction map for Cordell Bank, California, produced using first dimension multivariate scores derived from Correspondence Analysis of fish, invertebrate, and habitat data along submersible transects (line segments). The map displays a spatial pattern from shallow, hard substrates (lighter shades) to deeper, unconsolidated substrates (darker shades) (from Pirtle 2005).


Given the high cost of conducting submersible studies, it is imperative that sufficient attention is focused on issues that impact data quality and that projects be well thought out in their entirety prior to going to sea.

Acknowledgments
I would especially like to thank my colleagues who are part of the West Coast submersible research group and who have directly or indirectly contributed to this paper: Tara Anderson, Mark Amend, Joe Bizzarro, Julie Clemons, Bob Embley, Gary Greene, Mark Hixon, Milton Love, Bill Pearcy, Susan Merle, Natalie Reed, Linda Snook, Rick Starr, David Stein, Waldo Wakefield, Curt Whitmire, and Mary Yoklavich. My graduate students have been especially helpful and I have learned more from them than I have given: Camelia Bianchi, Jen Blaine, Jennifer Bright, Kaitlin Graiff, Noelani Puniwai, Jodi Pirtle, and Keri York. The projects described in this paper were funded by the West Coast and Polar Regions Undersea Research Center of the NOAA National Undersea Research Program, NOAA's Office of Ocean Exploration, NOAA's Office of Habitat Conservation, NOAA Northwest Fisheries Science Center and Southwest Fisheries Science Center, NOAA Pacific Marine Environmental Laboratory, and Washington State University. I'd also like to acknowledge all of the folks who operated the submersibles, ROVs, and ships involved in the various projects during which these methods were developed (Mermaid II, Delta, ROPOS, F/V McGaw, NOAA ship Ronald Brown, and F/V Velero IV). The paper benefited from comments from Jennifer Reynolds and three anonymous reviewers. North Pacific Research Board (NPRB) publication no. 171.

References
Anderson, T.J., and M.M. Yoklavich. 2007. Multiscale habitat associations of deepwater demersal fishes off central California. Fish. Bull. U.S. 105:168-179.
Buckland, S.T., D.R. Anderson, K.P. Burnham, J.L. Laake, D.L. Borchers, and L. Thomas. 2001. Introduction to distance sampling: Estimating abundance of biological populations. Oxford University Press, Inc., New York.
Eberhardt, L.L., and J.M. Thomas. 1991. Designing environmental field studies. Ecol. Monogr. 61(1):53-73.
Green, R.H. 1979. Sampling design and statistical methods for environmental biologists. Wiley, New York. 257 pp.
Greene, H.G., M.M. Yoklavich, R.M. Starr, V.M. O'Connell, W.W. Wakefield, D.E. Sullivan, J.E. McRea, and G.M. Cailliet. 1999. A classification scheme for deep seafloor habitats. Oceanol. Acta 22:663-678.
Hart, T. 2004. Diel activity patterns in demersal fishes on Heceta Bank, Oregon. M.S. thesis, Oregon State University, Corvallis.
Harvey, E., J. Seager, M. Shortis, S. Robson, and M. Cappo. 2005. Use of stereo-video photography in remote camera and diver transect assessments of fish populations. In: D.A. Somerton and C.T. Glendhill (eds.), Report of the National Marine Fisheries Service Workshop on Under Water Video Analysis. NOAA Tech. Memo. NMFS-F/SPO-68, pp. 44-48.
Hixon, M.A., B.N. Tissot, and W.G. Pearcy. 1991. Fish assemblages of rocky banks of the Pacific northwest (Heceta, Coquille, and Daisy banks). U.S. Minerals Management Service, OCS Study 91-0052, Camarillo, California.
Hurlbert, S. 1984. Pseudoreplication and the design of ecological field experiments. Ecol. Monogr. 54(2):187-211.
Jagielo, T. 2004. Remote sensing using laser projection photogrammetry for underwater surveys. In: D.A. Somerton and C.T. Glendhill (eds.), Report of the National Marine Fisheries Service Workshop on Under Water Video Analysis. NOAA Tech. Memo. NMFS-F/SPO-68, pp. 56-57.
Jagielo, T., A. Hoffmann, J. Tagart, and M. Zimmermann. 2003. Demersal groundfish densities in trawlable and untrawlable habitats off Washington: Implications for the estimation of habitat bias in trawl surveys. Fish. Bull. U.S. 101(3):545-565.
Krebs, C.J. 1999. Ecological methodology, 2nd edn. Addison-Wesley and Benjamin Cummings. 620 pp.
Love, M.S., and M.M. Yoklavich. 2007. Habitat characteristics of juvenile cowcod, Sebastes levis (Scorpaenidae), in southern California. Environ. Biol. Fishes. doi:10.1007/s10641-007-9290-x.
Mapstone, B. 1996. Scalable decision criteria for environmental impact assessment: Effect size, Type I, and Type II errors. In: R.J. Schmitt and C.W. Osenberg (eds.), Detecting ecological impacts: Concepts and applications in coastal habitats. Academic Press, San Diego, pp. 67-79.
Nasby, N., R. Embley, B.N. Tissot, M.A. Hixon, D. Wright, S. Merle, and W.W. Wakefield. 2002. Integration of historical groundfish habitat data with new high resolution remotely sensed data on Heceta Bank, Oregon. Fish. Bull. U.S. 100(4):739-751.
O'Connell, V.M., C. Brylinsky, and D. Carlile. 2002. Demersal shelf rockfish stock assessment for 2003. Alaska Department of Fish and Game, Division of Commercial Fisheries, Regional Information Report No. 1J02-44.
Pirtle, J. 2005. Habitat-based assessment of megafaunal invertebrates and fishes on Cordell Bank, California. M.S. thesis, Washington State University. 74 pp.
Puniwai, N.F. 2002. Spatial and temporal distribution of the crinoid Florometra serratissima on the Oregon continental shelf. M.S. thesis, Washington State University. 34 pp.
Reynolds, J.R., R.C. Highsmith, B. Konar, C.G. Wheat, and D. Doudna. 2001. Fisheries and fisheries habitat investigations using undersea technology. Marine Technology Society/IEEE Oceans 2001, Conference Proceedings, MTS 0-933957-29-7.
Sale, P.F., and B.J. Sharp. 1983. Correction for bias in visual transect censuses of coral reef fishes. Coral Reefs 2:37-42.
Shepherd, K., and K. Wallace. 2002. Deep precision deployment and heavy package recoveries by the ROPOS R.O.V. system. Marine Technology Society, MTS 0-933057-28-9, 1-6.


Somerton, D.A., and C.T. Glendhill (eds.). 2005. Report of the National Marine Fisheries Service Workshop on Under Water Video Analysis. NOAA Tech. Memo. NMFS-F/SPO-68. 69 pp.
Stein, D.L., B.N. Tissot, M.A. Hixon, and W. Barss. 1992. Fish-habitat associations on a deep reef at the edge of the Oregon continental shelf. Fish. Bull. U.S. 90:540-551.
Tissot, B.N., M.A. Hixon, and D. Stein. 2007. Habitat-based submersible assessment of macroinvertebrate and groundfish assemblages at Heceta Bank, Oregon, from 1988 to 1990. J. Exp. Mar. Biol. Ecol. 352:50-64.
Tissot, B.N., M.M. Yoklavich, M.S. Love, K. York, and M. Amend. 2006. Benthic invertebrates that form habitat structures on deep banks off southern California, with special reference to deep sea coral. Fish. Bull. U.S. 104:167-181.
Underwood, A.J. 1991. The logic of ecological experiments: A case history from studies of the distribution of macro-algae on rocky intertidal shores. J. Mar. Biol. Assoc. U.K. 71:841-866.
Underwood, A.J. 1997. Experiments in ecology: Their logical design and interpretation using analysis of variance. Cambridge University Press.
Wakefield, W.W., C.E. Whitmire, J.E.R. Clemons, and B.N. Tissot. 2005. Fish habitat studies: Combining high-resolution geological and biological data. In: P.W. Barnes and J.P. Thomas (eds.), Benthic habitats and the effects of fishing. Am. Fish. Soc. Symp. 41:138-199.

Williams, I.D., W.J. Walsh, B.N. Tissot, and L.E. Hallacher. 2006. Impact of observers' experience level on counts of fishes in underwater visual surveys. Mar. Ecol. Prog. Ser. 310:185-191.
Wright, D.J., M.J. Blongewicz, P.N. Halpin, and J. Breman. 2007. Arc Marine: GIS for a blue planet. ESRI Press, Redlands, California. 202 pp.
Yoklavich, M.M., and V. O'Connell. 2008. Twenty years of research on demersal communities using the Delta submersible in the northeast Pacific. In: J.R. Reynolds and H.G. Greene (eds.), Marine habitat mapping technology for Alaska. Alaska Sea Grant, University of Alaska Fairbanks. (This volume.)
Yoklavich, M.M., M.S. Love, and K.A. Forney. 2007. A fishery-independent assessment of an overfished rockfish stock, cowcod (Sebastes levis), using direct observations from an occupied submersible. Can. J. Fish. Aquat. Sci. 64:1795-1804.
Yoklavich, M.M., H.G. Greene, G.M. Cailliet, D.E. Sullivan, R.N. Lea, and M.S. Love. 2000. Habitat associations of deepwater rockfishes in a submarine canyon: An example of a natural refuge. Fish. Bull. U.S. 98:625-641.
