Temporal trends in intake category data for animal shelter and rescue organizations in Colorado from 2008 to 2018

Author(s):  
Sloane M. Hawes ◽  
Tess M. Hupe ◽  
Jaci Gandenberger ◽  
Kevin N. Morris

Abstract OBJECTIVE To investigate trends in animal shelter and rescue organization intake for dogs and cats in Colorado from 2008 to 2018. SAMPLE 482 animal shelters and rescue organizations that reported annual intake data to the State of Colorado Department of Agriculture for 1,086,630 dogs and 702,333 cats. PROCEDURES Total intake, intake for each of 5 Pet Animal Care and Facilities Act categories (stray, owner surrender, intrastate transfer, interstate transfer, or other), and community-based intake (total intake after exclusion of transfers) of dogs and cats were assessed in total and for each organization type (shelter or rescue organization). The number taken in per year, number taken in/1,000 capita (human residents)/y, and number in each intake category as a percentage of total intake for the same species per year were analyzed with linear regression models. RESULTS Trend lines indicated that total dog intake increased over the study period, but there was no change when these data were adjusted for the human population. Cat intake decreased over time according to both of these measures. Total community-based intake decreased, whereas total intake by interstate transfer from other organizations increased for both species during the study period. CLINICAL RELEVANCE Increased transfer of dogs and cats across state lines into regions with low community-based shelter intake suggested that regional and national animal disease trends could potentially impact disease profiles for recipient areas. Findings supported efforts toward collecting animal shelter and rescue organization intake and outcome data across larger systems.
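As a minimal sketch of the trend analysis this abstract describes, the snippet below fits an ordinary least squares model of an annual intake measure on year; the intake values are invented placeholders, not data from the study, and statsmodels is one reasonable tool choice rather than the authors' stated software.

```python
# Hypothetical illustration of a linear trend test on annual intake data.
import numpy as np
import statsmodels.api as sm

years = np.arange(2008, 2019)  # the 2008-2018 study period
# Placeholder values for cat intake per 1,000 human residents per year:
intake_per_1000 = np.array([21.5, 21.1, 20.4, 19.8, 19.0, 18.3,
                            17.6, 17.2, 16.5, 16.0, 15.4])

X = sm.add_constant(years)              # intercept + year term
fit = sm.OLS(intake_per_1000, X).fit()
print(fit.params[1])                    # slope: estimated annual change
print(fit.pvalues[1])                   # significance of the trend
```

A negative, significant slope on the year term would correspond to the declining cat intake the abstract reports.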

2019 ◽  
Vol 254 (3) ◽  
pp. 363-372 ◽  
Author(s):  
Sloane M. Hawes ◽  
Bridget A. Camacho ◽  
Philip Tedeschi ◽  
Kevin N. Morris

Animals ◽  
2020 ◽  
Vol 10 (8) ◽  
pp. 1395
Author(s):  
Daniel D. Spehar ◽  
Peter J. Wolf

The number of cats and dogs impounded and euthanized at animal shelters in the USA has declined dramatically in recent decades. The Humane Society of the United States reported that in 1973 an estimated 13.5 million cats and dogs were euthanized nationwide; according to Best Friends Animal Society, in 2018 that number had been reduced to approximately 733,000. A disproportionate number of animals euthanized at shelters today are free-roaming feral and stray cats, who most often face euthanasia due to their temperament or a lack of shelter space. Over the past decade, two new management tactics—return-to-field (RTF) and targeted trap-neuter-return (TNR)—have exhibited the capacity to contribute to significant reductions in feline euthanasia and intake. The present study examines changes in feline euthanasia and intake, as well as impacts on additional metrics, at a municipal animal shelter in Jefferson County, KY, USA, after an RTF program was added to an ongoing community-based TNR program. A combined total of 24,697 cats were trapped, sterilized, vaccinated, and returned over 8 years as part of the concurrent RTF and TNR programs. Feline euthanasia at Louisville Metro Animal Services (LMAS) declined by 94.1% and feline intake dropped by 42.8%; the live-release rate (LRR) increased by 147.6% due primarily to reductions in both intake and euthanasia. The results of the present study corroborate prior research on the effectiveness of combining RTF and TNR and exemplify the flexibility available to communities in configuring such programs to align with their particular needs and resources.
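As a rough sketch of the metrics quoted above, the snippet below computes a percent change and a live-release rate. The LRR is taken here as live outcomes divided by total outcomes, which is one common shelter convention and may not match the study's exact definition; all counts are hypothetical.

```python
# Hypothetical shelter metrics; none of these counts come from LMAS data.
def pct_change(before: float, after: float) -> float:
    """Percent change relative to the baseline value."""
    return (after - before) / before * 100.0

def live_release_rate(live_outcomes: int, total_outcomes: int) -> float:
    """Share of outcomes that were live releases, as a percentage."""
    return live_outcomes / total_outcomes * 100.0

print(pct_change(before=3400, after=200))  # ~ -94%, a decline like the one reported
print(live_release_rate(live_outcomes=5200, total_outcomes=5500))  # ~94.5%
```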


2012 ◽  
Vol 21 (2) ◽  
pp. 141 ◽  
Author(s):  
Brian R. Miranda ◽  
Brian R. Sturtevant ◽  
Susan I. Stewart ◽  
Roger B. Hammer

Most drivers underlying wildfire are dynamic, but at different spatial and temporal scales. We quantified temporal and spatial trends in wildfire patterns over two spatial extents in northern Wisconsin to identify drivers and their change through time. We used spatial point pattern analysis to quantify the spatial pattern of wildfire occurrences, and linear regression to quantify the influence of drought and temporal trends on the annual number and mean size of wildfires. Analyses confirmed drought as an important driver of both occurrences and fire size. When both drought and time were incorporated in linear regression models, the number of wildfires showed a declining trend across the full study area, despite housing density increasing in magnitude and spatial extent. Fires caused by campfires and debris-burning did not show any temporal trends. Comparison of spatial models representing biophysical, anthropogenic and combined factors demonstrated human influences on wildfire occurrences, especially human activity, infrastructure and property values. We also identified a non-linear relationship between housing density and wildfire occurrence. Large wildfire occurrence was predicted by variables similar to those for all occurrences, except that the direction of influence changed. Understanding these spatial and temporal drivers of wildfire occurrence has implications for land-use planning, wildfire suppression strategies and ecological goals.
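A minimal sketch of the regression idea described above, with fabricated data: annual wildfire counts modeled on a drought index and year together, so that a temporal trend can be separated from drought-driven year-to-year variation. Variable names and values are assumptions for illustration only.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
year = np.arange(1985, 2008)
drought = rng.normal(size=year.size)  # standardized drought index (fabricated)
# Simulated counts with a declining trend plus a drought effect:
n_fires = (120 - 1.5 * (year - year[0]) + 10 * drought
           + rng.normal(scale=5, size=year.size))

X = sm.add_constant(np.column_stack([drought, year]))
fit = sm.OLS(n_fires, X).fit()
print(fit.params)  # a negative year coefficient = decline after accounting for drought
```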


2005 ◽  
Vol 7 (2) ◽  
pp. 109-119 ◽  
Author(s):  
Michael J. Bannasch ◽  
Janet E. Foley

Upper respiratory tract infection (URI) propagates readily among cats in shelters and often results in euthanasia of affected cats. In a case-control evaluation of 573 cats in eight shelters in California in 2001 and 2002, the prevalence of feline calicivirus (FCV) ranged from 13% to 36%, that of feline herpesvirus (FHV) from 3% to 38%, and that of Bordetella bronchiseptica, Chlamydophila felis, and Mycoplasma species from 2% to 14%. Cats with URI tended to be housed in isolation, to be dehydrated, to be younger than cats without URI, and to be infected with FHV, Mycoplasma species, FCV, or C. felis. Shelters differed in pathogen prevalence, and many cats became positive for infection after about 1 week of sheltering. Understanding the risk factors associated with URI helps shelters evaluate the costs and benefits of treatment and improve their procedures to decrease the incidence of URI within their facilities. Antiherpetic and antimycoplasmal drugs may be beneficial for individual animal care. The results document the utility of comprehensive URI surveillance and of herd management targeted at the pathogens typical of each shelter.
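To make the case-control framing concrete, here is a minimal sketch of how one pathogen's association with URI could be tested from a 2x2 table; the counts are invented, and the study's actual analysis may have differed.

```python
import numpy as np
from scipy.stats import fisher_exact

# Hypothetical counts of FHV infection among cats with and without URI:
#                  FHV+  FHV-
table = np.array([[ 45,   60],    # cases (URI)
                  [ 15,  180]])   # controls (no URI)

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```

An odds ratio well above 1 with a small p value would indicate that FHV infection is more common among URI cases, consistent with the risk-factor pattern the abstract describes.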


2020 ◽  
Vol 71 (Supplement_2) ◽  
pp. S96-S101
Author(s):  
Franziska Olgemoeller ◽  
Jonathan J Waluza ◽  
Dalitso Zeka ◽  
Jillian S Gauld ◽  
Peter J Diggle ◽  
...  

Abstract Background Typhoid fever remains a major source of morbidity and mortality in low-income settings. Its most feared complication is intestinal perforation. However, due to the paucity of diagnostic facilities in typhoid-endemic settings, including microbiology, histopathology, and radiology, the etiology of intestinal perforation is frequently assumed but rarely confirmed. This poses a challenge for accurately estimating the burden of disease. Methods We recruited a prospective cohort of patients with confirmed intestinal perforation in 2016 and performed enhanced microbiological investigations (blood and tissue culture, plus tissue polymerase chain reaction [PCR] for Salmonella Typhi). In addition, we used a Poisson generalized linear model to estimate excess perforations attributable to the typhoid epidemic, using temporal trends in S. Typhi bloodstream infection and perforated abdominal viscus at Queen Elizabeth Central Hospital from 2008 to 2017. Results We recruited 23 patients with intraoperative findings consistent with intestinal perforation. Of the patients recruited, 50% (11/22) were culture or PCR positive for S. Typhi. The case fatality rate from typhoid-associated intestinal perforation was substantial at 18% (2/11). Our statistical model estimates that culture-confirmed cases of typhoid fever led to an excess of 0.046 perforations per clinical typhoid fever case (95% CI, .03–.06). We therefore estimate that typhoid fever accounted for 43% of all bowel perforations during the period of enhanced surveillance. Conclusions The morbidity and mortality associated with typhoid abdominal perforations are high. By placing clinical outcome data from a cohort in the context of longitudinal surgical registers and bacteremia data, we describe a valuable approach to adjusting estimates of the burden of typhoid fever.
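As a sketch of the modeling step, the snippet below fits a Poisson generalized linear model of annual perforation counts on annual culture-confirmed typhoid counts. An identity link is used so the slope reads directly as excess perforations per typhoid case, matching how the abstract reports its estimate; the exact specification and all counts here are assumptions, not the authors' model or data.

```python
import numpy as np
import statsmodels.api as sm

# Fabricated annual counts for 2008-2017 (placeholders only):
typhoid_cases = np.array([10, 12, 15, 40, 120, 260, 310, 280, 240, 200])
perforations  = np.array([ 8,  9, 10, 12,  15,  21,  23,  21,  19,  17])

X = sm.add_constant(typhoid_cases)
family = sm.families.Poisson(link=sm.families.links.Identity())
fit = sm.GLM(perforations, X, family=family).fit()
print(fit.params[1])      # excess perforations per culture-confirmed case
print(fit.conf_int()[1])  # 95% CI for that excess
```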


2017 ◽  
Vol 46 (6) ◽  
pp. 488-497 ◽  
Author(s):  
Finnian R. Mc Causland ◽  
Brian Claggett ◽  
Marc A. Pfeffer ◽  
Emmanuel A. Burdmann ◽  
Kai-Uwe Eckardt ◽  
...  

Background: The pathogenesis of chronic kidney disease-associated anemia is multifactorial and includes decreased production of erythropoietin (EPO), iron deficiency, inflammation, and EPO resistance. To better understand the trajectory of these parameters, we described temporal trends in hemoglobin (Hb), ferritin, transferrin saturation, C-reactive protein (CRP), and darbepoetin dosing in the Trial to Reduce Cardiovascular Events with Aranesp Therapy (TREAT). Methods: We performed a post hoc analysis of 4,038 participants in TREAT. Mixed effects linear regression models were used to determine the trajectory of parameters of interest prior to end-stage renal disease (ESRD). Likelihood ratio tests were used to determine overall differences in biomarker values and differences in trajectories between those who did and did not develop ESRD. Results: Hb declined precipitously in the year prior to the development of ESRD (irrespective of treatment assignment) and was on average 1.15 g/dL (95% CI –1.26 to –1.04) lower in those who developed ESRD versus those who did not, at the time of ESRD/end of follow-up. Simultaneously, the mean darbepoetin dose and CRP concentration increased, while serum ferritin and transferrin saturation remained above 140 μg/L and 20%, respectively. Conclusions: Our analyses provide descriptive insights regarding the temporal changes of Hb, darbepoetin dose, and related parameters as ESRD approaches in participants of TREAT. Hb declined as much as 1–2 years prior to the development of ESRD, without biochemical evidence of iron deficiency. The most precipitous decline occurred in the months immediately prior to ESRD, despite administration of escalating doses of darbepoetin and in parallel with an increase in CRP.
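A minimal sketch, with simulated placeholder data, of the mixed-effects structure the Methods describe: hemoglobin modeled over time before ESRD/end of follow-up, with a random intercept per participant and an interaction that lets trajectories differ by ESRD status. Variable names are assumptions, not TREAT variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n, visits = 200, 6
df = pd.DataFrame({
    "id": np.repeat(np.arange(n), visits),
    "months_before_end": np.tile([30, 25, 20, 15, 10, 5], n),
    "esrd": np.repeat(rng.integers(0, 2, n), visits),
})
subject = np.repeat(rng.normal(scale=0.4, size=n), visits)  # per-person shift
# Simulated Hb that declines toward ESRD only in the ESRD group:
df["hb"] = (12.5 - df["esrd"] * (1.5 - 0.04 * df["months_before_end"])
            + subject + rng.normal(scale=0.5, size=len(df)))

fit = smf.mixedlm("hb ~ months_before_end * esrd", df, groups=df["id"]).fit()
print(fit.summary())  # interaction term captures the diverging trajectories
```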


Author(s):  
Elenuel T. Genova ◽  
Mario N. Abeto ◽  
Noel N. Lebrilla

In 2012, the Philippine National Aquasilviculture Project (PNAP) was forged and formally launched by the Department of Agriculture - Bureau of Fisheries and Aquatic Resources (DA-BFAR). To implement the PNAP, a Memorandum of Agreement (MOA) was executed between BFAR and the Commission on Higher Education (CHED) on December 16, 2011. This study was conducted to assess the status and development of the PNAP program implemented in Southern Negros, across four (4) municipalities and two (2) cities. BFAR released a total fund of Php 10,148,812.50 for the four (4) phases of the program, of which Php 6,422,762.50 was spent on resource rehabilitation (mangrove planting), Php 845,000.00 on aquasilviculture, Php 1,140,000.00 on a multi-species hatchery, and Php 1,741,050.00 on administrative costs. A total of 978,000 paid mangrove propagules, plus a 30% buffer, were planted in Southern Negros; 1,144,260 of the planted propagules survived across 130.4 hectares of coastal land, benefiting 673 direct beneficiaries. Another 274 fisherfolk beneficiaries augmented their income through the aquasilviculture project, while a total of 1,284 berried wild blue crabs, estimated to produce up to 2 million eggs, were reared in the multi-species hatchery. A conservative estimate of 1% survival in the natural habitat under natural conditions represents the project's contribution to the beleaguered blue crab capture fisheries. Based on their responses on key factors of service delivery to the community, the beneficiaries appreciated the efforts of CHMSC-Binalbagan as program implementer.
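As a quick arithmetic check, the reported component costs do sum to the reported total fund; the snippet below simply reproduces that addition.

```python
# Component costs as reported in the abstract (Php):
components = {
    "resource rehabilitation (mangrove planting)": 6_422_762.50,
    "aquasilviculture": 845_000.00,
    "multi-species hatchery": 1_140_000.00,
    "administrative costs": 1_741_050.00,
}
total = sum(components.values())
print(f"Php {total:,.2f}")  # Php 10,148,812.50, matching the reported total
```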


Pathogens ◽  
2021 ◽  
Vol 10 (11) ◽  
pp. 1409
Author(s):  
Lance D. Erickson ◽  
Dawson W. Hedges ◽  
Bruce L. Brown ◽  
Bradley Embley ◽  
Shawn D. Gale

Several viral, bacterial, and parasitic diseases have been associated with cognitive function and neuropsychiatric outcomes in humans, including human T-cell lymphotropic virus 1 (HTLV-1). In this study, we sought to further generalize previously reported associations of cognitive function and depression with HTLV-1 seropositivity and serointensity using a community-based sample of adults aged approximately 40 to 70 years (mean = 55.3 years) from the United Kingdom. In this sample, the results of adjusted linear regression models showed no associations of HTLV-1 seropositivity or serointensity with reasoning, pairs-matching, or reaction-time cognitive tasks or with depression. In addition, neither age, sex, educational attainment, nor income moderated associations of HTLV-1 seropositivity or serointensity with cognitive function or depression. In this middle-aged to older middle-aged adult community sample, HTLV-1 seropositivity and serointensity do not appear to be associated with reasoning, pairs-matching, and reaction-time tasks or with depression.
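A minimal sketch of the adjusted and moderation analyses described above, using fabricated data and assumed variable names (none come from the study's dataset): one OLS model estimates the covariate-adjusted association, and an interaction term tests whether age moderates it.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({                        # fabricated placeholder data
    "reaction_time": rng.normal(550, 80, n),
    "htlv1_pos": rng.integers(0, 2, n),
    "age": rng.uniform(40, 70, n),
    "sex": rng.integers(0, 2, n),
    "education_yrs": rng.integers(8, 21, n),
})

adjusted = smf.ols("reaction_time ~ htlv1_pos + age + sex + education_yrs", df).fit()
moderated = smf.ols("reaction_time ~ htlv1_pos * age + sex + education_yrs", df).fit()
print(adjusted.params["htlv1_pos"])        # covariate-adjusted association
print(moderated.pvalues["htlv1_pos:age"])  # test of moderation by age
```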


2021 ◽  
Vol 83 (4) ◽  
pp. 1665-1678
Author(s):  
Cuiling Wang ◽  
Mindy J. Katz ◽  
Katherine H. Chang ◽  
Jiyue Qin ◽  
Richard B. Lipton ◽  
...  

Background: The Uniform Data Set, Version 3 Neuropsychological Battery (UDSNB 3.0), from the database of the University of Washington's National Alzheimer's Coordinating Center (NACC), is widely used to characterize cognitive performance in clinical and research settings; however, norms for underrepresented community-based samples are scarce. Objective: We compared UDSNB 3.0 test scores between the Einstein Aging Study (EAS), composed of racially/ethnically diverse, community-dwelling older adults aged ≥70 years, and the NACC, and report normative data from the EAS. Methods: Analyses included 225 cognitively normal EAS participants and comparable data from 5,031 NACC database participants. Linear regression models compared performance between the samples, adjusting for demographics (sex, age, education, race/ethnicity), depressive symptoms, and whether English was the first language. Separate linear regression models in the EAS and the NACC examined demographic factors (age, sex, education, and race/ethnicity) as predictors of the neuropsychological test scores and were used to create a demographically adjusted z-score calculator. Results: Cognitive performance across all domains was worse in the EAS than in the NACC after adjusting for age, sex, education, race/ethnicity, and depression, and the differences remained for visuo-construction, visuospatial memory, confrontation naming, visual attention/processing speed, and executive functioning after further adjusting for whether English was the first language. In both samples, non-Hispanic Whites outperformed non-Hispanic Blacks, and more education was associated with better cognitive performance. Conclusion: Differences observed in demographic, clinical, and cognitive characteristics between the community-based EAS sample and the nationwide NACC sample suggest that separate normative data that more accurately reflect non-clinic, community-based populations should be established.
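A minimal sketch of what a demographically adjusted z-score calculator can look like: regress a test score on demographics in the normative sample, then score an individual as (observed - predicted) divided by the residual standard deviation. Variable names and data are assumptions, not the EAS variables or published coefficients.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
norm = pd.DataFrame({                      # fabricated normative sample
    "score": rng.normal(25, 4, 225),
    "age": rng.uniform(70, 90, 225),
    "education_yrs": rng.integers(6, 21, 225),
    "female": rng.integers(0, 2, 225),
})
fit = smf.ols("score ~ age + education_yrs + female", norm).fit()
resid_sd = np.sqrt(fit.scale)              # residual standard deviation

def adjusted_z(score, age, education_yrs, female):
    """Z-score relative to demographically predicted performance."""
    new = pd.DataFrame({"age": [age], "education_yrs": [education_yrs],
                        "female": [female]})
    return (score - fit.predict(new).iloc[0]) / resid_sd

print(adjusted_z(score=20, age=81, education_yrs=12, female=1))
```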

