Variable Data Rate in Optical LEO Direct-to-Earth Links: Design Aspects and System Analysis

Author(s):  
Pantelis-Daniel Arapoglou ◽  
Giulio Colavolpe ◽  
Tommaso Foggi ◽  
Nicolò Mazzali ◽  
Armando Vannucci

In the frame of ongoing efforts between space agencies to define an on-off-keying-based optical low-Earth-orbit (LEO) direct-to-Earth (DTE) waveform, this paper offers an in-depth analysis of the Variable Data Rate (VDR) technique. VDR, in contrast to the currently adopted Constant Data Rate (CDR) approach, enables the optimization of the average throughput during a LEO pass over the optical ground station (OGS). The analysis addresses critical link-level aspects, such as receiver (time, frame, and amplitude) synchronization, and demonstrates the system-level benefits of employing VDR, which amount to a throughput gain of around 100% compared to a CDR transmission approach.
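The magnitude of such a VDR gain can be sanity-checked with a toy link-geometry model: the slant range to a LEO satellite shrinks as elevation rises, received power grows as 1/d², and in a power-limited OOK link the achievable data rate scales roughly with received power, so a rate-adaptive (VDR) scheme beats a fixed rate sized for the worst case (CDR). The sketch below is our own simplified illustration, with an assumed 500 km altitude, a 10° elevation mask, and equal dwell time per elevation sample; it is not the paper's model.

```python
import math

RE_KM, ALT_KM = 6371.0, 500.0  # Earth radius and an assumed LEO altitude

def slant_range_km(elev_deg):
    """Slant range from ground station to satellite at a given elevation."""
    e = math.radians(elev_deg)
    r = RE_KM + ALT_KM
    return math.sqrt(r**2 - (RE_KM * math.cos(e))**2) - RE_KM * math.sin(e)

# Sample a pass from 10 to 90 degrees elevation (assumed elevation mask).
elevations = [10 + i for i in range(81)]

# Received power scales as 1/d^2; in a power-limited OOK link the achievable
# rate scales roughly linearly with received power. Normalize to zenith = 1.
rel_rate = [(slant_range_km(90) / slant_range_km(e)) ** 2 for e in elevations]

cdr_rate = min(rel_rate)                  # CDR: one rate sized for worst case
vdr_rate = sum(rel_rate) / len(rel_rate)  # VDR: adapts each sample, so average
gain = vdr_rate / cdr_rate - 1.0
print(f"VDR average-throughput gain over CDR: {gain:.0%}")
```

Under these crude assumptions the gain comes out well above 100%; a figure around 100%, as reported in the paper, reflects realistic pass dynamics, margins, and a finite set of selectable rates.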

2021


Author(s):  
Dhaneesh R. Tahilramani ◽  
Juliet Hitchins

For the past decade, Cummins Inc. has increased its use of standard Finite Element Analysis (FEA) techniques to drive the design of its products. However, because FEA models do not scale to the limits of the hardware, running traditional FEA reliably and within a reasonable time frame proved impossible, especially on large High Horse Power (HHP) engine structure assemblies. This led to carrying out numerous analyses with fewer parts and assumed boundary conditions, a strategy that ignores the effects of system vibration on the assembly. To reduce the risk of failures on complex assemblies, high-speed engines required a more accurate analytical prediction of modal stresses at the system level. To increase the capacity for running system-level analyses, a structured approach was followed and the Model Reduction Techniques Functional Excellence mini team was set up to develop methods and train analysts. The team has been using Six Sigma tools [1] to carry out voice-of-the-customer interviews in order to define the analytical requirements for running models of large, complex structures (>20 million degrees of freedom), brainstorming concepts and selecting solutions based on advanced analytical substructuring techniques that best fit those requirements. The benefits of the new process include a significant reduction in solve time, the ability to carry out system analysis, an efficient working practice based on a modular approach, global parallel processing, and secure intellectual property rights when working with suppliers and customers of Cummins Inc. products. This paper shares experience in applying model reduction techniques following a structured approach and highlights computing and training resources for an analysis team.
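To illustrate the kind of mathematics behind such substructuring, the sketch below shows classic static (Guyan) condensation, which eliminates interior ("slave") degrees of freedom from a stiffness matrix so that only interface ("master") DOFs remain; component mode synthesis methods such as Craig-Bampton build on this idea. It is a generic textbook illustration in pure Python, not Cummins' process; the two-spring test case and all names are our own.

```python
def solve(A, B):
    """Solve A X = B (lists of lists) by Gauss-Jordan elimination
    with partial pivoting; returns X as a list of rows."""
    n = len(A)
    M = [A[i][:] + B[i][:] for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        d = M[col][col]
        M[col] = [v / d for v in M[col]]
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [row[n:] for row in M]

def guyan_reduce(K, masters):
    """Static (Guyan) condensation: K_r = K_mm - K_ms K_ss^-1 K_sm.
    Exact for static loads applied at master DOFs; an approximation
    for dynamics, and the starting point for Craig-Bampton."""
    n = len(K)
    slaves = [i for i in range(n) if i not in masters]
    Kmm = [[K[i][j] for j in masters] for i in masters]
    Kms = [[K[i][j] for j in slaves] for i in masters]
    Ksm = [[K[i][j] for j in masters] for i in slaves]
    Kss = [[K[i][j] for j in slaves] for i in slaves]
    X = solve(Kss, Ksm)  # K_ss^-1 K_sm
    return [[Kmm[a][b] - sum(Kms[a][s] * X[s][b] for s in range(len(slaves)))
             for b in range(len(masters))] for a in range(len(masters))]

# Two springs of stiffness k in series (ground--k--u1--k--u2): condensing
# out the interior DOF u1 must recover the series stiffness k/2 at u2.
k = 10.0
K = [[2 * k, -k], [-k, k]]
print(guyan_reduce(K, masters=[1]))  # -> [[5.0]], i.e. k/2
```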


Sensors ◽  
2021 ◽  
Vol 21 (9) ◽  
pp. 2915
Author(s):  
Matteo Bertolucci ◽  
Riccardo Cassettari ◽  
Luca Fanucci

In recent years there have been significant developments in satellite transmitter technology to keep pace with the rapid innovation of sensors on board new satellites. The CCSDS 131.2-B-1 standard for telemetry downlink, released in 2012, is part of the next generation of standards that aim to support the increased data rates these improvements in resolution demand. Owing to its relative novelty, this standard currently lacks in-depth analysis by researchers, yet it is strongly supported by the European Space Agency (ESA) for future missions. For these reasons, it is important to evaluate how major receiver sub-components, such as timing recovery and carrier frequency correction, can be designed and implemented in new receivers supporting this standard. Timing error detectors (TED) and frequency error detectors (FED) were therefore studied against the specific peculiarities of CCSDS 131.2-B-1 in its usual Low Earth Orbit (LEO) environment. The estimators were evaluated, highlighting the performance, trade-offs, and peculiarities of each with respect to the corresponding architectural choices. Finally, a receiver architecture derived from these considerations is proposed with the aim of supporting very different mission scenarios. Specifically, the realized architecture employs a parallel feedforward estimator for the timing recovery section and a novel multi-algorithm feedback frequency correction loop to efficiently cover both low symbol rates (5 Mbaud) and high data rates (up to 500 Mbaud). This solution represents a good trade-off for supporting these scenarios in a very compact footprint by pushing the clock frequency to the FPGA limit. The FPGA resource occupation on a Zynq Ultrascale+ RFSoC XCZU28DR FPGA is 5202 LUT, 4851 FF, 5 BRAM, and 21 DSP for the timing recovery part, while the frequency recovery section occupies 1723 LUT, 1511 FF, 2.5 BRAM, and 32 DSP.
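As a concrete example of what a timing error detector computes, the sketch below implements the classic Gardner TED, a widely used non-data-aided detector operating at two samples per symbol; it is offered as a generic illustration, not necessarily the estimator this receiver adopts.

```python
def gardner_ted(samples):
    """Classic Gardner timing error detector, 2 samples per symbol.

    `samples` is complex (or real) baseband with even indices at (near)
    symbol strobes and odd indices at half-symbol offsets. Returns one
    timing error per symbol interval; its sign drives the loop filter
    and interpolator/NCO of a feedback timing recovery loop.
    """
    strobes = samples[0::2]
    mids = samples[1::2]
    errs = []
    for k in range(min(len(strobes) - 1, len(mids))):
        # e[k] = Re{ x(kT + T/2) * [x((k+1)T) - x(kT)]* }
        errs.append((mids[k] * (strobes[k + 1] - strobes[k]).conjugate()).real)
    return errs
```

With perfect timing, the mid-symbol samples sit on the zero crossings between opposite symbols, so the detector output averages to zero; a residual timing offset skews the output's sign, which the feedback loop then corrects.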


2012 ◽  
Vol 114 (11) ◽  
pp. 1-48 ◽  
Author(s):  
Julie A. Marsh

Background/Context In recent years, states, districts, schools, and external partners have recognized the need to proactively foster the use of data to guide educational decision-making and practice. Understanding that data alone will not guarantee use, individuals at all levels have invested in interventions to support better access to, interpretation of, and responses to data of all kinds. Despite the emergence of these efforts, there has been little systematic examination of research on them. Purpose/Objective/Research Question/Focus of Study This article synthesizes what we currently know about interventions to support educators' use of data, ranging from comprehensive, system-level initiatives, such as reforms sponsored by districts or intermediary organizations, to more narrowly focused interventions, such as a workshop. The article summarizes what is known across studies about the design and implementation of these interventions, their effects at the individual and organizational levels, and the conditions shown to affect implementation and outcomes. Research Design Literature review. Data Collection and Analysis This review entailed systematic searches of electronic databases and careful sorting to yield a total of 41 books, peer-reviewed journal articles, and reports. Summaries of each publication were coded to identify the study methods (design, framework, sample, time frame, data collection), intervention design (level of schooling, focal data and data user, leverage points, components), and findings on implementation, effects, and conditions. Findings/Results The review uncovers a host of common themes regarding implementation, including promising practices (e.g., making data "usable" and "safe," targeting multiple leverage points) and persistent challenges (e.g., developing support that is generic but also customized, sustaining sufficient support). The review also finds mixed results and varying levels of research evidence on the effects of interventions, with relatively more evidence on effects on educators' knowledge, skills, and practice than on organizations and student achievement. The article also identifies a set of common conditions found to influence intervention implementation and effects, including intervention characteristics (capacity, data properties), broader context (leadership, organizational structure), and individual relationships and characteristics (trust, beliefs and knowledge). Conclusions/Recommendations The review finds that the current research base is limited in quantity and quality. It suggests the need for more methodologically rigorous research and greater attention to the organizational and student-level outcomes of interventions, comparative analyses, interventions that help educators move from knowledge to action, and the specific ways in which the quality of data and leadership practices shape the effectiveness of interventions.


2014 ◽  
Vol 1003 ◽  
pp. 230-234
Author(s):  
Feng Long Fan ◽  
Xu Li ◽  
Xu Sheng Yu ◽  
Li Wang

With the development of computer technology, and especially the rapid development and broad application of database and computer network technology, data volumes are increasing sharply, with large amounts of data accumulating, and growing fast, in many fields. Enterprise databases and data warehouses store large quantities of customer data; these data cover many aspects of customer information and also reflect the strengths and weaknesses of the enterprise's operation. If these data can be analyzed quickly, efficiently, and in depth to find rules and patterns and to extract the necessary knowledge, they can help enterprises make better decisions. The system described here successfully mines the consumption tendencies and habits of students and staff and analyzes their satisfaction with each restaurant. The results of the system analysis help school students and teachers understand and analyze their daily consumption, so the establishment of this system has great practical value.
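As a minimal sketch of what "mining consumption habits" can mean in practice, the toy aggregation below computes visit counts and average spend per restaurant from hypothetical canteen card transactions; all record names and values are invented for illustration.

```python
from collections import defaultdict

# Hypothetical canteen card transactions: (card_id, restaurant, amount)
txns = [
    ("s001", "north_hall", 3.50),
    ("s001", "north_hall", 4.00),
    ("s002", "south_hall", 6.20),
    ("t001", "north_hall", 5.10),
]

def restaurant_stats(txns):
    """Visit count and average spend per restaurant: a crude proxy for
    popularity (satisfaction) and for diners' consumption habits."""
    totals, counts = defaultdict(float), defaultdict(int)
    for _, rest, amount in txns:
        totals[rest] += amount
        counts[rest] += 1
    return {r: (counts[r], totals[r] / counts[r]) for r in counts}

print(restaurant_stats(txns))
```

A real system would further segment by cardholder type (student vs. staff), meal time, and trend over the term.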


Author(s):  
Todd D. Jack ◽  
Carl N. Ford ◽  
Shari-Beth Nadell ◽  
Vicki Crisp

A causal analysis of aviation accidents by engine type is presented. The analysis employs a top-down methodology that performs a detailed analysis of the causes and factors cited in accident reports to develop a "fingerprint" profile for each engine type. This is followed by an in-depth analysis of each fingerprint that produces a sequential breakdown. Analysis results of National Transportation Safety Board (NTSB) accidents, both fatal and non-fatal, that occurred during the period 1990–1998 are presented. Each data set comprises all accidents involving aircraft with the following engine types: turbofan, turbojet, turboprop, and turboshaft (including turbine helicopters). During this time frame there were 1461 accidents involving turbine-powered aircraft; 306 of these involved propulsion malfunctions and/or failures. Analyses are performed to investigate the sequential relationships between propulsion system malfunctions or failures and other causes and factors for each engine type. Other malfunctions or events prominent within each data set are also analyzed. Significant trends are identified. The results from this study can be used to identify areas for future research into intervention, prevention, and mitigation strategies.
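At its core, the per-engine-type "fingerprint" amounts to tallying how often each cause or factor is cited across that engine type's accident records. The sketch below is a toy illustration with invented records, not NTSB data.

```python
from collections import Counter

# Hypothetical accident records: (engine_type, [cited causes/factors])
records = [
    ("turbofan",  ["propulsion failure", "crew error"]),
    ("turbofan",  ["weather"]),
    ("turboprop", ["propulsion failure", "maintenance"]),
    ("turboprop", ["propulsion failure"]),
]

def fingerprint(records, engine):
    """Fraction of an engine type's accident records citing each cause."""
    cited = Counter(c for eng, causes in records if eng == engine
                    for c in causes)
    n = sum(1 for eng, _ in records if eng == engine)
    return {cause: cnt / n for cause, cnt in cited.items()}

print(fingerprint(records, "turboprop"))
# propulsion failure cited in 2 of 2 turboprop records -> 1.0
```

Comparing such profiles across engine types surfaces which causes are disproportionately associated with each, before drilling into the sequential breakdown.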


Stroke ◽  
2016 ◽  
Vol 47 (suppl_1) ◽  
Author(s):  
J A Oostema ◽  
Maria Tecos ◽  
Deborah Sleight ◽  
Brian Mavis

Introduction: Ischemic stroke patients who arrive by emergency medical service (EMS) receive faster emergency department evaluations and improved rates of thrombolytic treatment. However, EMS stroke recognition and compliance with prehospital stroke quality measures are inconsistent. We hypothesized that EMS stroke care is shaped by a complex interaction of knowledge, beliefs, and system-level variables that influence behavior. Methods: Focus groups of paramedics from a single urban/suburban county were assembled to discuss their experiences identifying and transporting stroke patients. Focus groups were conducted using a semi-structured interview format and audio recorded. Transcripts were qualitatively analyzed to identify themes, subthemes, and patterns of paramedic responses. The Clinical Practice Guidelines Framework provided the initial coding scheme, which three coders modified during the coding process using grounded theory methods, coming to consensus on which codes to apply. Results: Three focus groups (n=13) were conducted to reach theme saturation. Overall, paramedics reported high confidence in their clinical gestalt for assessing stroke patients and a strong desire to "do the right thing," but were unfamiliar with published guidelines. Paramedics identified variability in the clinical presentations of stroke, inadequate or inconsistent hospital guidance, and lack of feedback regarding care as the principal barriers to ideal prehospital stroke care. Participants reported conflicting hospital guidance regarding the appropriate time frame for a high-priority transport and hospital prenotification. Feedback regarding final diagnosis was viewed as critical for developing improved clinical acumen. Direct-to-CT protocols were cited as an effective way to integrate EMS into hospital stroke response.
Conclusion: In this qualitative analysis, paramedics expressed a desire for clear, hospital-directed guidance and consistent feedback regarding outcomes for suspected stroke patients.

