Visual Fixation in NBA Free-Throws and the Relationship to On-Court Performance

2020 ◽  
Vol 2 (1) ◽  
pp. e1-e7
Author(s):  
Daniel Laby

Purpose: Although hitting a baseball is often described as the most difficult task in all of sports, shooting baskets during a game likely ranks a close second. Previous studies have described the role of vision in basketball; more specifically, a concept termed the “quiet eye” has been shown to be related to basketball performance. How a shooter visualizes the target, how consistent their visual fixation is, and how long they maintain that fixation have all been correlated with shooting success. Although the majority of previous reports have included non-professional basketball shooters, we evaluated NBA (National Basketball Association) players to determine whether this skill is significant at the professional level. Materials and Methods: We evaluated 16 professional NBA players prior to the 2018-2019 NBA season. All players shot 30 consecutive free throws while wearing Tobii Pro eye-tracking glasses. Following the completion of the task, several metrics were calculated, including shooting success rate as well as four measures of the position and duration of ocular fixation just prior to, during, and immediately after ball release for each shot of each player. Additionally, player performance statistics from the 2018-2019 season were recorded and compared to the visual fixation data. Descriptive statistics as well as correlations between the visual fixation metrics and on-court performance metrics were calculated. Results: NBA shooters averaged a 79% success rate in free-throw shooting (SD = 14%, min = 56%, max = 100%) during the study. Moderate, statistically significant correlations were found between the percentage of successful free throws and the four measures of visual fixation (r = 0.539 to 0.687). In addition, visual fixation measures were found to be correlated with on-court metrics, suggesting that shooters who had more frequent, as well as longer, fixations on the rim were more likely to have lower USG% and ORB% as well as higher FG3%. The percentage of successful shots in the study was compared to the on-court FT% and found to be moderately correlated (r = 0.536). Conclusions: The need to maintain ocular fixation on the rim as one shoots seems elementary but in fact varies greatly among NBA players, as noted in these results. Our data suggest that players who visually fixate longer and more frequently on the rim are more likely to be successful in free throws, as well as more successful in 3-point shooting. Likely because of their distance from the basket, they do not make as many offensive rebounds. This data set appears to describe basketball guards, in contrast to forwards/centers, and supports previous research on non-professional basketball players.
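As a rough illustration of the correlation analysis reported above, the sketch below computes Pearson correlations between free-throw success and four fixation metrics. The column names and values are invented for illustration; they are not the study's data or variable definitions.

```python
# Hypothetical sketch of the correlation analysis described above;
# the metrics and values are illustrative, not the study's dataset.
import pandas as pd
from scipy.stats import pearsonr

# One row per player: free-throw success rate plus four fixation metrics.
df = pd.DataFrame({
    "ft_pct":        [0.56, 0.72, 0.79, 0.88, 1.00],
    "fix_duration":  [0.41, 0.52, 0.63, 0.70, 0.81],  # s of rim fixation per shot
    "fix_frequency": [1.2, 1.6, 1.9, 2.3, 2.8],       # fixations per shot
    "fix_onset":     [0.30, 0.28, 0.25, 0.22, 0.18],  # s before ball release
    "fix_stability": [0.55, 0.61, 0.68, 0.74, 0.83],  # share of gaze samples on rim
})

for metric in ["fix_duration", "fix_frequency", "fix_onset", "fix_stability"]:
    r, p = pearsonr(df[metric], df["ft_pct"])
    print(f"{metric}: r = {r:.3f}, p = {p:.3f}")
```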

2017 ◽  
Vol 10 (2) ◽  
pp. 169-174 ◽  
Author(s):  
Moin Khan ◽  
Kim Madden ◽  
M. Tyrrell Burrus ◽  
Joseph P. Rogowski ◽  
Jeff Stotts ◽  
...  

Background: Professional basketball players in the National Basketball Association (NBA) subject their lower extremities to significant repetitive loading during both regular-season and off-season training. Little is known about the incidence of lower extremity bony stress injuries and their impact on return to play and performance in these athletes. Hypothesis: Stress injuries of the lower extremity will have a significant impact on performance. Study Design: Case series. Level of Evidence: Level 4. Methods: All bony stress injuries from 2005 to 2015 were identified from the NBA. The number of games missed due to injury and performance statistics were collected from 2 years prior to injury to 2 years after the injury. A linear regression analysis was performed to determine the impact of injury for players who returned to sport. Results: A total of 76 lower extremity bony stress injuries involving 75 NBA players (mean age, 25.4 ± 4.1 years) were identified. Fifty-five percent (42/76) involved the foot, and most injuries occurred during the regular season (82.9%, 63/76), with half occurring within the first 6 weeks. Among players who sustained a fifth metatarsal stress fracture, 42.9% were unable to return to professional play. Players who sustained stress injuries had reduced play performance, specifically related to number of games played (P = 0.014) and number of steals per game (P = 0.004). Players who had surgery had significantly better performance at 2 years than those who were managed nonoperatively, independent of the type of injury (β = 4.561; 95% CI, 1.255-7.868). Conclusion: Lower extremity bony stress injuries may significantly affect both short- and long-term player performance and career length. Stress injuries result in decreased player performance, and surgical intervention results in improved performance metrics compared with those treated using conservative methods. Clinical Relevance: Stress injuries result in decreased player performance, and surgical intervention results in improved performance metrics.
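A minimal sketch of the kind of linear regression described above, estimating the effect of operative management on performance 2 years after injury while adjusting for pre-injury performance. The variable names and data are hypothetical, and this is not the authors' actual model specification.

```python
# Illustrative regression sketch; data and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "perf_2yr_post": [12.1, 8.4, 15.3, 6.2, 11.8, 9.9, 14.0, 7.5],
    "perf_pre":      [13.0, 9.1, 14.2, 8.0, 12.5, 10.4, 13.1, 9.0],
    "surgery":       [1, 0, 1, 0, 1, 0, 1, 0],  # 1 = operative management
})

# Performance 2 years post-injury, adjusted for pre-injury performance;
# the coefficient on `surgery` plays the role of the reported beta.
model = smf.ols("perf_2yr_post ~ surgery + perf_pre", data=df).fit()
print(model.summary())
print(model.conf_int().loc["surgery"])  # 95% CI for the surgery effect
```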


Author(s):  
Jordan L. Fox ◽  
Robert Stanton ◽  
Aaron T. Scanlan ◽  
Masaru Teramoto ◽  
Charli Sargent

Purpose: To investigate the associations between sleep and competitive performance in basketball. Methods: A total of 7 semiprofessional male players were monitored across the in-season. On nights prior to competition, sleep duration and quality were assessed using actigraphs and sleep diaries. The data were accumulated over 1 (night 1), 2 (nights 1–2 combined), 3 (nights 1–3 combined), and 4 (nights 1–4 combined) nights prior to competition. Performance was reported as player statistics (field goal and free-throw accuracy, rebounds, assists, steals, blocks, and turnovers) and composite performance statistics (offensive rating, defensive rating, and player efficiency). Linear regression analyses with cluster-robust standard errors using bootstrapping (1000 replications) were performed to quantify the association between sleep and performance. Results: The night before competition, subjective sleep quality was positively associated with offensive rating and player efficiency (P < .05). Conclusions: Strategies to increase subjective sleep quality the night before competition should be considered to increase the likelihood of successful in-game performance, given its association with composite performance metrics.
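A minimal sketch of a regression with standard errors clustered on player, in the spirit of the analysis above. It uses simulated data and analytic cluster-robust errors rather than the paper's 1000-replication bootstrap; all names and values are illustrative.

```python
# Cluster-robust regression sketch; data are simulated, not the study's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 70  # 7 players x 10 games
df = pd.DataFrame({
    "player": np.repeat(np.arange(7), 10),
    "sleep_quality": rng.integers(1, 6, n),  # subjective 1-5 rating
})
df["off_rating"] = 95 + 2.0 * df["sleep_quality"] + rng.normal(0, 5, n)

# Clustering on player accounts for repeated observations per player.
fit = smf.ols("off_rating ~ sleep_quality", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["player"]}
)
print(fit.summary())
```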


2020 ◽  
pp. 1-14
Author(s):  
Esraa Hassan ◽  
Noha A. Hikal ◽  
Samir Elmuogy

Nowadays, Coronavirus (COVID-19) is considered one of the most critical pandemics on Earth, owing to its ability to spread rapidly between humans as well as animals. COVID-19 is expected to break out around the world; around 70% of the world's population might become infected in the coming years. Therefore, an accurate and efficient diagnostic tool is highly required, which is the main objective of our study. Manual classification was mainly used to detect different diseases, but it takes too much time and carries the probability of human error. Automatic image classification reduces doctors' diagnostic time, which could save lives. We propose an automatic classification architecture based on a deep neural network, called the Worried Deep Neural Network (WDNN) model, with transfer learning. Comparative analysis reveals that the proposed WDNN model outperforms three pre-trained models, InceptionV3, ResNet50, and VGG19, in terms of various performance metrics. Due to the shortage of COVID-19 data, data augmentation was used to increase the number of images in the positive class, and normalization was then applied to bring all images to the same size. Experimentation was done on a COVID-19 dataset collected from different cases, with a total of 2623 images (1573 training, 524 validation, 524 test). Our proposed model achieved 99.046%, 98.684%, 99.119%, and 98.90% in terms of accuracy, precision, recall, and F-score, respectively. The results are compared with both traditional machine learning methods and those using convolutional neural networks (CNNs), and they demonstrate the ability of our classification model to serve as an alternative to the current diagnostic tools.
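To make the transfer-learning setup concrete, here is a generic sketch of fine-tuning a frozen InceptionV3 backbone with augmentation for binary COVID-19 image classification. This is not the authors' WDNN code; the directory layout (`data/covid/...`), image size, and hyperparameters are all assumptions.

```python
# Generic transfer-learning baseline in the spirit of the approach above;
# NOT the WDNN model. Paths and hyperparameters are assumed for illustration.
import tensorflow as tf
from tensorflow.keras import layers, models

base = tf.keras.applications.InceptionV3(
    weights="imagenet", include_top=False, input_shape=(299, 299, 3)
)
base.trainable = False  # freeze pre-trained features (transfer learning)

model = models.Sequential([
    layers.RandomFlip("horizontal"),        # augmentation: enlarge positive class
    layers.RandomRotation(0.1),
    layers.Rescaling(1.0 / 127.5, offset=-1),  # InceptionV3 expects [-1, 1]
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(1, activation="sigmoid"),  # COVID-19 positive vs. negative
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy", tf.keras.metrics.Precision(),
                       tf.keras.metrics.Recall()])

train = tf.keras.utils.image_dataset_from_directory(
    "data/covid/train", image_size=(299, 299), batch_size=32, label_mode="binary")
val = tf.keras.utils.image_dataset_from_directory(
    "data/covid/val", image_size=(299, 299), batch_size=32, label_mode="binary")
model.fit(train, validation_data=val, epochs=10)
```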


2021 ◽  
Vol 11 (3) ◽  
pp. 1225
Author(s):  
Woohyong Lee ◽  
Jiyoung Lee ◽  
Bo Kyung Park ◽  
R. Young Chul Kim

Geekbench is one of the most referenced cross-platform benchmarks in the mobile world. Most of its workloads are synthetic, but some of them aim to simulate real-world behavior. In the mobile world, its microarchitectural behavior has rarely been reported, since hardware profiling features are of limited availability to the public; although Geekbench is a popular mobile performance workload, its microarchitectural characteristics on mobile devices are hard to find. In this paper, a thorough experimental study of Geekbench performance characterization is reported with detailed performance metrics. This study also identifies mobile system-on-chip (SoC) microarchitecture impacts, such as the cache subsystem, instruction-level parallelism, and branch performance. Through the study, we could identify the bottleneck of the workloads, especially in the cache subsystem: a change in data set size directly and significantly impacts the performance score on some systems and can undermine the fairness of the CPU benchmark. In the experiment, Samsung's Exynos9820-based platform was used as the test device, with binaries built using the Android Native Development Kit (NDK). The Exynos9820 is a superscalar processor capable of dual-issuing some instructions. To aid performance analysis, we enabled the collection of performance events with performance monitoring unit (PMU) registers. The PMU is a set of hardware performance counters built into microprocessors to store counts of hardware-related activities. Throughout the experiment, functional and microarchitectural performance profiles were fully studied. This paper describes the details of these mobile performance studies. In our experiment, the ARM DS-5 tool was used for collecting runtime PMU profiles, including OS-level performance data. After this comparative study, users will understand more about mobile architecture behavior, which will help in evaluating which benchmark is preferable for a fair performance comparison.
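For illustration, the sketch below derives the usual microarchitectural ratios (IPC, cache miss rate, branch misprediction rate) from raw PMU event counts. The counter values are made up; on a real device they would come from a profiler such as ARM DS-5, as described above, and event names vary by core.

```python
# Derived microarchitectural metrics from hypothetical PMU event counts.
counts = {
    "cycles":            4_200_000_000,
    "instructions":      6_300_000_000,
    "l1d_cache_access":  1_900_000_000,
    "l1d_cache_refill":     95_000_000,
    "branch_retired":      820_000_000,
    "branch_mispred":       24_600_000,
}

ipc        = counts["instructions"] / counts["cycles"]
l1d_miss   = counts["l1d_cache_refill"] / counts["l1d_cache_access"]
br_mispred = counts["branch_mispred"] / counts["branch_retired"]

print(f"IPC: {ipc:.2f}")                 # > 1 suggests dual issue is paying off
print(f"L1D miss rate: {l1d_miss:.1%}")  # cache-subsystem bottleneck indicator
print(f"Branch mispredict rate: {br_mispred:.1%}")
```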


2021 ◽  
Vol 9 (6) ◽  
pp. 232596712110152
Author(s):  
Rafael Sanchez ◽  
Blake H. Hodgens ◽  
Joseph S. Geller ◽  
Samuel Huntley ◽  
Jonathan Kaplan ◽  
...  

Background: Achilles tendon (AT) ruptures are devastating injuries that are highly prevalent among athletes. Despite our understanding of the effect of AT rupture, and in particular its relationship to basketball, no study has examined the effects of AT rupture and repair on performance metrics in collegiate basketball players. Purpose: To evaluate the effect of AT rupture and subsequent surgical repair on performance metrics in National Collegiate Athletic Association (NCAA) Division I basketball players who return to play after injury. Study Design: Descriptive epidemiology study. Methods: NCAA Division I basketball players who sustained an AT rupture and underwent subsequent surgical repair between 2000 and 2019 were identified by systematically evaluating individual injury reports from databases comprising NCAA career statistics and individual school statistics; 65 male and 41 female players were identified. Athletes were included if they participated in at least one-half of the games of 1 collegiate season before tearing the AT and at least 1 season after operative repair. A total of 50 male and 30 female athletes were included. Each injured athlete was matched to a healthy control by conference, position, starter status at time of injury, class year, and number of games played. Matched controls were healthy players and experienced no significant injuries during their NCAA careers. Results: After AT repair, male athletes had significantly more minutes per game and points per game compared with before injury. Total blocks significantly decreased after injury. Female athletes scored significantly more points per game but demonstrated a significantly lower 3-point shooting percentage after return to play. Despite undergoing AT rupture and repair, 14% of male players played in the National Basketball Association, and 20% of injured female athletes played in the Women’s National Basketball Association. Conclusion: After returning to play, men demonstrated a significant drop-off in performance only in regard to total blocks. Female athletes after AT repair demonstrated a significant improvement in points per game but had a significant drop-off in 3-point shooting percentage.
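A minimal sketch of the matched-control selection described above: exact matching on conference, position, starter status, and class year, then the closest games-played count among remaining candidates. The players and columns are hypothetical, not the study's dataset.

```python
# Matched-control selection sketch; all data are hypothetical.
import pandas as pd

players = pd.DataFrame({
    "name":       ["A", "B", "C", "D", "E", "F"],
    "injured":    [True, False, False, True, False, False],
    "conference": ["ACC", "ACC", "ACC", "SEC", "SEC", "SEC"],
    "position":   ["G", "G", "G", "F", "F", "F"],
    "starter":    [True, True, False, True, True, True],
    "class_year": [2, 2, 2, 3, 3, 3],
    "games":      [30, 28, 31, 25, 27, 24],
})

injured = players[players.injured]
controls = players[~players.injured]

def best_match(row):
    # Exact match on the categorical criteria, then nearest games played.
    pool = controls[(controls.conference == row.conference) &
                    (controls.position == row.position) &
                    (controls.starter == row.starter) &
                    (controls.class_year == row.class_year)]
    if pool.empty:
        return None
    return pool.iloc[(pool.games - row.games).abs().argmin()]["name"]

matches = {row["name"]: best_match(row) for _, row in injured.iterrows()}
print(matches)  # e.g. {'A': 'B', 'D': 'F'}
```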


2015 ◽  
Vol 26 (6) ◽  
pp. 619-623 ◽  
Author(s):  
Michael G. Azzam ◽  
Thomas W. Throckmorton ◽  
Richard A. Smith ◽  
Drew Graham ◽  
Jim Scholler ◽  
...  

2009 ◽  
Vol 2009 ◽  
pp. 1-8 ◽  
Author(s):  
Janet Myhre ◽  
Daniel R. Jeske ◽  
Michael Rennie ◽  
Yingtao Bi

A heteroscedastic linear regression model is developed from plausible assumptions that describe the time evolution of performance metrics for equipment. The inherent motivation for the related weighted least squares analysis of the model is an essential and attractive selling point for engineers with an interest in equipment surveillance methodologies. A simple test for the significance of the heteroscedasticity suggested by a data set is derived, and a simulation study is used to evaluate the power of the test and compare it with several other applicable tests that were designed under different contexts. Tolerance intervals within the context of the model are derived, thus generalizing well-known tolerance intervals for ordinary least squares regression. Use of the model and its associated analyses is illustrated with an aerospace application where hundreds of electronic components are continuously monitored by an automated system that flags components suspected of unusual degradation patterns.
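A short sketch of the workflow the abstract describes: fit ordinary least squares, test for heteroscedasticity, and refit with weighted least squares under an assumed variance model. The Breusch-Pagan test here is a standard stand-in, not the paper's own derived test, and the degradation data are simulated.

```python
# WLS with a heteroscedasticity check; simulated data, Breusch-Pagan as a
# stand-in for the paper's derived significance test.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(1)
t = np.linspace(1, 100, 200)              # operating time for a component
y = 0.05 * t + rng.normal(0, 0.02 * t)    # error variance grows with time

X = sm.add_constant(t)
ols = sm.OLS(y, X).fit()
lm_stat, lm_pval, _, _ = het_breuschpagan(ols.resid, X)
print(f"Breusch-Pagan p-value: {lm_pval:.4f}")  # small p => heteroscedasticity

# If heteroscedasticity is present, weight by the assumed variance model.
wls = sm.WLS(y, X, weights=1.0 / t**2).fit()    # assumes Var(e) proportional to t^2
print(wls.params)
```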

