Segmenting the Mature Travel Market with Data Mining Tools

Author(s):  
Yawei Wang

The graying of America is one of the most significant demographic changes to the present and future of the United States (Moisey & Bichis, 1999). As more baby boomers enter their 50s and 60s, the mature travel market has become a fast-growing segment and has begun to attract attention from many tourism researchers and professionals. The significant increases in the size and wealth of the older population make the mature travel market a strong component of the general travel market (Reece, 2004). Understanding the mature market, as well as mature travelers’ motivations, is vital to the success of the travel industry (Brewer, Poffley, & Pederson, 1995; Hsu, Cai, & Wong, 2007). Today’s mature travel market can be generalized as being “different, diverse and demanding” (Harssel, 1994, p. 376). Faranda and Schmidt (1999) suggest that mature tourism marketers must recognize three critical components: the aging process comprehended from multiple disciplines, the acknowledged “heterogeneity and dynamic nature” of the mature market, and the “necessity for sound segmentation methods” (p. 24). Fully understanding the mature travel market is no simple task for marketers. To better understand and serve this diverse market, tourism professionals will need to use data mining (DM) tools and techniques to discover the hidden patterns and characteristics of the mature travel market. According to Pyo, Uysal, and Chang (2002), DM can be applied to many areas of tourism research, including destination quality control and perceptions, environmental scanning and optimization, travel behavior, tourism forecasting, and market segmentation and positioning. Therefore, the purpose of this study is to review and analyze the segmentation methods reported in the literature on the mature travel market during the past seven years and to explore the application of DM tools to the segmentation of the mature travel market in the near future.
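Segmentation of this kind is typically operationalized with clustering. As a minimal sketch (not any of the methods the study reviews), the pure-Python k-means below groups hypothetical mature-traveler survey records into spending-based segments; the records, the three features, and k = 2 are all illustrative assumptions:

```python
# Hypothetical survey records: (age, trips per year, avg spend per trip in USD).
# A real application would standardize features first; here raw spend
# deliberately dominates the distance, so two spend-based segments emerge.
travelers = [
    (55, 2, 800), (58, 3, 900), (62, 1, 400),
    (67, 6, 2500), (70, 5, 2200), (74, 1, 300),
]

def kmeans(points, k=2, iters=10):
    # Deterministic seeding for k=2: lowest- and highest-spend records.
    centroids = [min(points, key=lambda p: p[-1]),
                 max(points, key=lambda p: p[-1])]
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: sum(
                (a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        # Recompute each centroid as the column-wise mean of its cluster.
        centroids = [
            tuple(sum(col) / len(col) for col in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

centroids, clusters = kmeans(travelers)
for c, cl in zip(centroids, clusters):
    print(f"segment of {len(cl)} travelers, centroid spend ~ {c[-1]:.0f} USD")
```

With these toy records the algorithm separates a small high-spend segment from a larger budget segment; in practice the choice of features, scaling, and number of clusters drives what the "segments" mean.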

Author(s):  
Rebecca Sanders

This chapter explores shifting patterns of intelligence surveillance in the United States. The Fourth Amendment protects Americans from unreasonable search and seizure without a warrant, but foreign spying is subject to few constraints. During the Cold War, surveillance power was abused for political purposes. Operating in a culture of secrecy, American intelligence agencies engaged in extensive illegal domestic spying. The intelligence scandals of the 1970s revealed these abuses, prompting new laws, notably the Foreign Intelligence Surveillance Act. Fearing further recrimination, the national security establishment increasingly demanded legal cover. After 9/11, Congress expanded lawful surveillance powers with the PATRIOT Act. Meanwhile, the Bush administration directed the National Security Agency to conduct warrantless domestic wiretapping. To justify this program, officials sought to redefine unconstrained foreign surveillance to subsume previously protected communications. The Obama administration continued to authorize mass surveillance and data mining programs and legally rationalize bulk collection of Americans’ data.


2021
Vol. 118 (52)
pp. e2110347118
Author(s):  
Ray Block
Charles Crabtree
John B. Holbein
J. Quin Monson

In this article, we present the results from a large-scale field experiment designed to measure racial discrimination among the American public. We conducted an audit study on the general public—sending correspondence to 250,000 citizens randomly drawn from public voter registration lists. Our within-subjects experimental design tested the public’s responsiveness to electronically delivered requests to volunteer their time to help with completing a simple task—taking a survey. We randomized whether the request came from either an ostensibly Black or an ostensibly White sender. We provide evidence that in electronic interactions, on average, the public is less likely to respond to emails from people they believe to be Black (rather than White). Our results give us a snapshot of a subtle form of racial bias that is systemic in the United States. What we term everyday or “paper cut” discrimination is exhibited by all racial/ethnic subgroups—outside of Black people themselves—and is present in all geographic regions in the United States. We benchmark paper cut discrimination among the public to estimates of discrimination among various groups of social elites. We show that discrimination among the public occurs more frequently than discrimination observed among elected officials and discrimination in higher education and the medical sector, but less frequently than discrimination in housing and employment contexts. Our results provide a window into the discrimination that Black people in the United States face in day-to-day interactions with their fellow citizens.
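The within-subjects logic of the design can be illustrated with a toy tally. The records below are synthetic placeholders, not the study’s 250,000-person sample; each subject is scored on whether they replied to the Black-sender and the White-sender request, and the discordant pairs carry the paired (McNemar-style) comparison:

```python
# Synthetic within-subjects data: (replied_to_black, replied_to_white)
# for each subject who received one request from each ostensible sender.
subjects = [
    (True, True), (False, True), (False, False), (True, True),
    (False, True), (False, False), (True, True), (False, True),
]

n = len(subjects)
rate_black = sum(b for b, _ in subjects) / n
rate_white = sum(w for _, w in subjects) / n

# Discordant pairs are the informative ones in a paired design:
# subjects who replied to one sender but not the other.
only_white = sum((not b) and w for b, w in subjects)
only_black = sum(b and (not w) for b, w in subjects)

print(f"reply rate: Black sender {rate_black:.2f}, White sender {rate_white:.2f}")
print(f"discordant pairs: white-only={only_white}, black-only={only_black}")
```

An imbalance in the discordant counts (here all discordant subjects replied only to the White sender) is what a McNemar test would evaluate; the study’s actual estimation strategy may differ.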


Author(s):  
R. J. Engel ◽  
P. J. Tyler ◽  
L. R. Wood ◽  
D. T. Entenmann

Westinghouse has been a strong supporter of Reliability, Availability, and Maintainability (RAM) principles during product design and development. This is exemplified by the actions taken during the design of the 501F engine to ensure that high reliability and availability were achieved. By building upon past designs, utilizing those features most beneficial, and improving other areas, a highly reliable product was developed. A full range of RAM tools and techniques was utilized to achieve this result, including reliability allocations, modelling, and effective redesign of critical components. These activities began during the conceptual design phase and will continue throughout the life cycle of these engines until they are decommissioned.
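One common building block of such RAM modelling is inherent availability, A = MTBF / (MTBF + MTTR), with components in series multiplying together. The sketch below uses entirely hypothetical component names and figures, not Westinghouse 501F data:

```python
# Inherent availability A = MTBF / (MTBF + MTTR). For components in series,
# all must be up, so system availability is the product of component values.
# The (MTBF hours, MTTR hours) figures here are illustrative only.
components = {
    "compressor": (8000.0, 24.0),
    "combustor":  (6000.0, 48.0),
    "turbine":    (7000.0, 36.0),
}

def availability(mtbf, mttr):
    return mtbf / (mtbf + mttr)

system_a = 1.0
for name, (mtbf, mttr) in components.items():
    a = availability(mtbf, mttr)
    system_a *= a
    print(f"{name}: A = {a:.4f}")
print(f"series system availability: {system_a:.4f}")
```

This is the simplest series model; real RAM programs also allocate reliability targets downward and use redundancy (parallel) models where the design provides it.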


Hadmérnök
2020
Vol. 15 (4)
pp. 141-158
Author(s):  
Eszter Katalin Bognár

In modern warfare, the most important innovation to date has been the utilisation of information as a weapon. The basis of successful military operations is the ability to correctly assess a situation based on credible collected information. In today’s military, the primary challenge is not the actual collection of data; it has become more important to extract relevant information from that data. This requirement cannot be met without the necessary improvements in tools and techniques to support the acquisition and analysis of data. This study defines Big Data and its concept as applied to military reconnaissance, focusing on the processing of imagery and textual data, and brings to light modern data processing and analytics methods that enable effective processing.


Author(s):  
Alejandro Henao ◽  
Wesley E. Marshall

Millions of people in the United States travel by personal automobile to attend professional sports matches played at various stadiums. Engineering and planning publications lack information on parking provisions for major sporting events. The results from this paper on parking outcomes suggest that the current parking provisions are not efficient. This case study examines parking supply, parking utilization, event auto occupancy, and event auto modal share at four major professional sports venues in the Denver, Colorado, region. The ratio of parking supply to parking demand was calculated for several surveyed games in terms of average attendance, and parking utilization was evaluated during nonevent periods. In general, the surveys of the games indicated that more parking was provided than was necessary, even when attendance was higher than typical. For an event with average attendance, parking utilization was as low as 65%, with 2.2 persons per vehicle. In contrast, when parking occupancy was high, auto occupancy increased to 3.0 persons per vehicle. With such different carpool rates, as well as evidence suggesting that spectators who travel to some facilities are willing to park and walk farther than a half-mile, the results suggest that parking supply and travel behavior are endogenous and should not be treated independently. This study also considered parking occupancy at nonevent times and found wholesale underutilization, even in downtown locations with high opportunity costs.
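The supply-versus-demand arithmetic behind these utilization figures is straightforward: vehicles needed = attendance × auto mode share ÷ persons per vehicle. In the sketch below, the attendance, mode share, and supply numbers are hypothetical; only the two carpool rates (2.2 and 3.0 persons per vehicle) come from the paper:

```python
# Back-of-envelope event parking demand (illustrative inputs, not the study's).
def parking_demand(attendance, auto_share, persons_per_vehicle):
    """Vehicles arriving = attendees driving / average vehicle occupancy."""
    return attendance * auto_share / persons_per_vehicle

supply = 12000          # hypothetical stadium-area parking spaces
attendance = 30000      # hypothetical event attendance
auto_share = 0.85       # hypothetical share of attendees arriving by car

for occ in (2.2, 3.0):  # carpool rates reported in the paper
    vehicles = parking_demand(attendance, auto_share, occ)
    print(f"{occ} persons/vehicle -> {vehicles:.0f} vehicles, "
          f"utilization {vehicles / supply:.0%}")
```

Even this toy calculation shows why occupancy matters: moving from 2.2 to 3.0 persons per vehicle cuts the number of arriving vehicles by more than a quarter, which is the endogeneity between supply and travel behavior the paper points to.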


Author(s):  
Eric Eidlin

Los Angeles, California, is generally considered the archetypal sprawling metropolis. Yet traditional measures equate sprawl with low population density, and Los Angeles is among the densest and thereby the least sprawling cities in the United States. How can this apparent paradox be explained? This paper argues that the answer lies in the fact that Los Angeles exhibits a comparatively even distribution of population throughout its urbanized area. As a result, the city suffers from many consequences of high population density, including extreme traffic congestion, poor air quality, and high housing prices, while offering its residents few benefits that typically accompany this density, including fast and effective public transit, vibrant street life, and tightly knit urban neighborhoods. The city's unique combination of high average population density with little differentiation in the distribution of population might best be characterized as dense sprawl, a condition that embodies the worst of urban and suburban worlds. This paper uses Gini coefficients to illustrate variation in population density and then considers a number of indicators, most relating either to the provision of transportation infrastructure or to travel behavior, that demonstrate the effects of low-variation population distribution on the quality of urban life in Los Angeles. This approach offers researchers, practitioners, and policy makers in Los Angeles and in smaller cities that are evolving in similar ways a useful and user-friendly tool for identifying, explaining, measuring, and addressing the most problematic aspects of sprawl.
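The Gini-coefficient measure the paper relies on can be sketched directly: a value near 0 means population density is spread evenly across tracts (the Los Angeles pattern described above), while a value near 1 means population is concentrated in a few tracts. The tract densities below are invented for illustration:

```python
# Gini coefficient of density values using the sorted-rank formula:
# G = (2 * sum(i * x_i)) / (n * sum(x)) - (n + 1) / n, x sorted ascending.
def gini(values):
    xs = sorted(values)
    n = len(xs)
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return (2.0 * weighted) / (n * sum(xs)) - (n + 1.0) / n

# Hypothetical tract densities (people per square mile).
even_city = [7000, 7200, 6900, 7100, 7050]    # LA-like: uniformly dense
peaked_city = [30000, 9000, 4000, 1500, 800]  # dense core, thin edges
print(f"even:   {gini(even_city):.3f}")
print(f"peaked: {gini(peaked_city):.3f}")
```

The uniformly dense city scores near zero while the core-dominated city scores well above 0.5, which is exactly the contrast the paper uses to distinguish "dense sprawl" from a conventional monocentric city.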


2021
Author(s):  
Jeffrey Basara
Stuart Edris
Jordan Christian
Bradley Illston
Eric Hunt
...  

Flash droughts occur rapidly (on roughly a one-month timescale) and have produced significant ecological, agricultural, and socioeconomic impacts. Recent advances in our understanding of flash droughts have resulted in methods to identify and quantify flash drought events and overall occurrence. However, while it is generally understood that a flash drought consists of two critical components, (1) anomalous, rapid intensification and (2) the subsequent occurrence of drought, little work has been done to quantify the spatial and temporal occurrence of the individual components, their frequency of covariability, and null events. Thus, this study applied the standardized evaporative stress ratio (SESR) method of flash drought identification to the North American Regional Reanalysis (NARR) to quantify the individual components of flash drought from 1979 to 2019. Individual case studies were examined, and the drought component was assessed using the United States Drought Monitor for 2010 to 2019. Additionally, the flash component was assessed using the results of previous flash drought studies. Further, the correlation coefficient and composite mean difference were calculated between the flash component and the identified flash droughts to determine what regions, if any, experienced rapid intensification but did not fall into flash drought. The results showed that SESR represented the spatial coverage of drought well for regions east of the Rocky Mountains, with mixed success regarding the intensity of the drought events. The flash component tended to agree well with other flash drought studies, though some differences existed, especially for areas west of the Rocky Mountains, which experienced rapid intensification at high frequencies but did not achieve drought designations due to hyper-aridity.
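The SESR idea can be sketched in a few lines: compute the evaporative stress ratio ET/PET, standardize it against a climatology, and flag periods with rapid drops as the "flash" component. All values, the climatology, and the drop threshold below are illustrative, not the study's:

```python
# Sketch of the SESR approach (simplified; real criteria involve percentile
# thresholds over a multi-decade climatology, not the toy z-score used here).
def sesr_series(et, pet, clim_mean, clim_std):
    esr = [e / p for e, p in zip(et, pet)]          # evaporative stress ratio
    return [(r - clim_mean) / clim_std for r in esr]

# Hypothetical pentad ET and PET (mm/day) sliding into flash drought.
et  = [3.0, 2.9, 2.5, 1.8, 1.2, 0.9]
pet = [4.0, 4.1, 4.3, 4.6, 4.8, 5.0]
sesr = sesr_series(et, pet, clim_mean=0.65, clim_std=0.15)

# "Flash" component: rapid intensification = large drops between pentads.
deltas = [b - a for a, b in zip(sesr, sesr[1:])]
rapid = [d < -0.5 for d in deltas]                   # illustrative threshold
print("SESR:", [round(s, 2) for s in sesr])
print("rapid-intensification pentads:", sum(rapid))
```

In this toy series, SESR swings from above normal to far below normal with several consecutive rapid drops, so the flash criterion fires; the "drought" component would then be checked separately, which is exactly the decomposition the study quantifies.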

