Item talk:Q147509

Developing a quality assurance plan for telemetry studies: A necessary management tool for an effective study

Telemetry has been used to answer various questions associated with research, management, and monitoring programs and to monitor animal behavior and population dynamics throughout the world. Many telemetry projects have been developed to study the passage, behavior, and survival of migrating adult and juvenile salmonids at hydroelectric projects on the mainstem Columbia and Snake rivers (Skalski et al. 2001a, 2001b; Skalski et al. 2002; Keefer et al. 2004; Goniea et al. 2006; Plumb et al. 2006). Telemetry-based field evaluations of salmon survival through hydroelectric projects are costly because of the technology (tags, telemetry systems, infrastructure, etc.) and personnel required to conduct them. Given the cost of implementing these projects, and the financial and conservation implications of the decisions made from the research results (e.g., forgone electricity production and conservation of threatened and endangered animals), it is paramount to ensure that quality data are collected: all procedures, training, and data checks must be documented, and sound protocols and quality assurance and control procedures must be in place.

Telemetry studies can pose unique data collection, processing, and analysis challenges. For instance, inferences about entire populations of animals are made from study animals that are captured, held, and tagged at disparate locations. Consequently, great care must be taken to ensure that any potential biases arising from field procedures are minimized (Peven et al. 2005). Released study animals are interrogated remotely by telemetry systems throughout the study area. The continuous recording of these systems can produce large numbers of detections over a short time frame, including false-positive detections from weak or erroneous records. There is thus the potential to generate large data sets (many thousands of lines) that require significant postprocessing. Data reduction can be done using software or programming code within a software package, or manually, to separate noise from valid data and extract the pertinent information for analysis. In either case, consistent, well-documented procedures need to be in place to ensure quality results and allow for repeatability of study methods.
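As a minimal sketch of the kind of data-reduction step described above (not the procedure used in the study), the example below filters a hypothetical detection table with two assumed rules: a minimum signal-strength threshold and a minimum number of closely spaced hits per tag and receiver. The column names (tag_id, receiver, timestamp, signal_strength), threshold values, and file names are illustrative assumptions only.

<syntaxhighlight lang="python">
# Sketch of telemetry data reduction: flag likely false-positive detections
# using two simple, assumed rules. All column names and thresholds are
# hypothetical and would be set and documented in a project's QA plan.
import pandas as pd

MIN_SIGNAL = 200                   # assumed minimum signal strength for a valid record
MIN_HITS = 2                       # assumed minimum detections of a tag at a receiver
MAX_GAP = pd.Timedelta(minutes=5)  # assumed window for grouping consecutive hits

def reduce_detections(raw: pd.DataFrame) -> pd.DataFrame:
    """Return detections judged valid under the assumed rules above."""
    df = raw.copy()
    df["timestamp"] = pd.to_datetime(df["timestamp"])
    df = df.sort_values(["tag_id", "receiver", "timestamp"])

    # Rule 1: drop weak records.
    df = df[df["signal_strength"] >= MIN_SIGNAL]

    # Rule 2: keep only tag/receiver groups with enough hits close in time,
    # since isolated single hits are more likely to be noise.
    def enough_hits(group: pd.DataFrame) -> bool:
        if len(group) < MIN_HITS:
            return False
        gaps = group["timestamp"].diff().dropna()
        return bool((gaps <= MAX_GAP).any())

    keep = df.groupby(["tag_id", "receiver"]).filter(enough_hits)
    return keep.reset_index(drop=True)

if __name__ == "__main__":
    raw = pd.read_csv("detections.csv")  # hypothetical input file
    valid = reduce_detections(raw)
    valid.to_csv("detections_reduced.csv", index=False)
    print(f"kept {len(valid)} of {len(raw)} records")
</syntaxhighlight>

Whatever rules are chosen, documenting them (thresholds, grouping windows, software versions) is what makes the reduction repeatable and auditable, which is the point the abstract emphasizes.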
