logarithmic decay
Recently Published Documents


TOTAL DOCUMENTS: 47 (five years: 8)

H-INDEX: 14 (five years: 1)

2021 ◽  
Vol 931 ◽  
Author(s):  
Peter A. Monkewitz

The scaling of different features of streamwise normal stress profiles $\langle uu\rangle^+(y^+)$ in turbulent wall-bounded flows is the subject of a long-running debate. Particular points of contention are the scaling of the ‘inner’ and ‘outer’ peaks of $\langle uu\rangle^+$ at $y^+ \approx 15$ and $y^+ = O(10^3)$, respectively, their infinite Reynolds number limit, and the rate of logarithmic decay in the outer part of the flow. Inspired by the thought-provoking paper of Chen & Sreenivasan (J. Fluid Mech., vol. 908, 2021, p. R3), two terms of an inner asymptotic expansion of $\langle uu\rangle^+$ in the small parameter $Re_{\tau}^{-1/4}$ are constructed from a set of direct numerical simulations (DNS) of channel flow. This inner expansion is for the first time matched through an overlap layer to an outer expansion, which not only fits the same set of channel DNS to within 1.5% of the peak stress, but also provides a good match of laboratory data in pipes and the near-wall part of boundary layers, up to the highest $Re_{\tau}$ values of $10^5$. The salient features of the new composite expansion are, first, an inner $\langle uu\rangle^+$ peak that saturates at 11.3, the approach to this limit scaling as $Re_{\tau}^{-1/4}$. This inner peak is followed by a short ‘wall log law’ with a slope that becomes positive for $Re_{\tau}$ beyond $O(10^4)$, leading up to an outer peak, followed by the logarithmic overlap layer with a negative slope going continuously to zero for $Re_{\tau}\to\infty$.
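
For orientation, the structure of such a matched two-term expansion can be sketched schematically; the functions $f_0, f_1, g_0, g_1$ and the coefficients $A, B$ below are generic placeholders, not the fitted forms of the paper.

\[
\langle uu\rangle^+ \simeq f_0(y^+) + Re_{\tau}^{-1/4}\, f_1(y^+) \qquad (\text{inner layer}),
\]
\[
\langle uu\rangle^+ \simeq g_0(Y) + Re_{\tau}^{-1/4}\, g_1(Y), \quad Y \equiv y^+/Re_{\tau} \qquad (\text{outer layer}),
\]
with the two expansions reducing, in the overlap region $1 \ll y^+ \ll Re_{\tau}$, to a common logarithmic form
\[
\langle uu\rangle^+ \simeq A(Re_{\tau})\,\ln y^+ + B(Re_{\tau}),
\]
whose slope $A(Re_{\tau})$ is the quantity reported above to remain negative and tend to zero as $Re_{\tau}\to\infty$.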


Sensors ◽  
2020 ◽  
Vol 20 (10) ◽  
pp. 2780 ◽  
Author(s):  
Pang-jo Chun ◽  
Tatsuro Yamane ◽  
Shota Izumi ◽  
Naoya Kuramoto

It is necessary to assess damage properly for the safe use of a structure and for the development of an appropriate maintenance strategy. Although many efforts have been made to measure the vibration of a structure to determine the degree of damage, the accuracy of such evaluations is still not high enough, and it is difficult to say that vibration-based damage evaluation has been put to practical use. In this study, we propose a method to evaluate damage by measuring the acceleration of a structure at multiple points and interpreting the results with a Random Forest, a form of supervised machine learning. The proposed method uses the maximum response acceleration, standard deviation, logarithmic decay rate, and natural frequency to improve the accuracy of damage assessment. We propose a three-step Random Forest method to evaluate various damage types based on the results of these multi-point measurements. The accuracy of the proposed method is then verified through cross-validation and a vibration test on an actual damaged specimen.
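
As a hedged illustration of the modelling step, the sketch below trains a single Random Forest on the four acceleration-derived features named in the abstract (maximum response acceleration, standard deviation, logarithmic decrement and natural frequency). It is not the authors' three-step method: the signals, damage labels and thresholds are synthetic placeholders, and only the feature set and classifier type follow the abstract.

```python
# Minimal sketch, not the paper's three-step pipeline: classify a synthetic
# "intact vs damaged" label from acceleration-derived features with a Random Forest.
import numpy as np
from scipy.signal import find_peaks
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 200.0                            # sampling rate in Hz (assumed)
t = np.arange(0.0, 5.0, 1.0 / fs)

def free_decay(f_n, zeta):
    """Synthetic free-decay acceleration record of a 1-DOF oscillator."""
    return np.exp(-zeta * 2.0 * np.pi * f_n * t) * np.sin(2.0 * np.pi * f_n * t)

def features(acc):
    """Maximum response acceleration, standard deviation,
    logarithmic decay rate (decrement) and natural frequency."""
    peaks, _ = find_peaks(acc, height=0.05 * acc.max(), distance=8)
    delta = np.mean(np.log(acc[peaks][:-1] / acc[peaks][1:]))  # log decrement per cycle
    f_n = fs / np.mean(np.diff(peaks))                         # frequency from peak spacing
    return [acc.max(), acc.std(), delta, f_n]

X, y = [], []
for _ in range(200):
    damaged = int(rng.integers(0, 2))
    f_n = rng.normal(10.0 - 2.0 * damaged, 0.3)        # damage lowers the natural frequency
    zeta = rng.normal(0.02 + 0.02 * damaged, 0.003)    # damage increases damping
    acc = free_decay(f_n, zeta) + 0.01 * rng.standard_normal(t.size)
    X.append(features(acc))
    y.append(damaged)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```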


Mathematics ◽  
2020 ◽  
Vol 8 (5) ◽  
pp. 715 ◽  
Author(s):  
Luc Robbiano ◽  
Qiong Zhang

In this paper, we analyze the long-time behavior of the wave equation with local Kelvin-Voigt damping. By introducing a suitable class of symbols and the associated pseudo-differential calculus, we obtain a Carleman estimate and then establish an estimate on the corresponding resolvent operator. As a result, we show a logarithmic decay rate for the energy of the system without any geometric assumption on the subdomain on which the damping is effective.
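
For context, results of this type generally fit the following chain, sketched here only in generic form; the precise exponent and norms depend on the resolvent estimate established in the paper. A resolvent bound of at most exponential growth along the imaginary axis,
\[
\|(i\lambda - \mathcal{A})^{-1}\|_{\mathcal{L}(\mathcal{H})} \le C\, e^{C|\lambda|}, \qquad \lambda\in\mathbb{R},\ |\lambda|\to\infty,
\]
classically yields, for initial data in the domain of the generator, an energy decay of logarithmic type,
\[
E(u,t) \le \frac{C}{\bigl(\log(2+t)\bigr)^{2}}\,\|(u_0,u_1)\|_{D(\mathcal{A})}^{2}, \qquad t>0.
\]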


2020 ◽  
Vol 101 (2) ◽  
Author(s):  
Hiroyuki Kitamoto ◽  
Yoshihisa Kitazawa ◽  
Takahiko Matsubara

Water SA ◽  
2019 ◽  
Vol 45 (2 April) ◽  
Author(s):  
Talia Tokyay ◽  
Can Kurt

A three-dimensional numerical model built in ANSYS Fluent (2011) was employed to study mid- to high-discharge supercritical two-phase flow over a single-slope spillway with a single step for aeration of the flow. In this study, 18 simulations were conducted using the Volume of Fluid (VOF) method for air-water interface tracking and the k-ε model for turbulence closure. Submerged circular pipes located at the bottom of the step were used as aerators. The analyses concentrate on the air-entrainment phenomenon and the jet length of the flow from the step to the re-attachment point. The variables considered in the study are the discharge, aerator size, aerator arrangement, Froude number of the flow, and the presence of a ramp before the step and its angle. The jet-length values observed in this study were compared with two sets of empirical formulae from the literature for code validation. The cross-sectional average of the air concentration due to bottom aeration was determined in the streamwise direction downstream of the re-attachment of the jet. The air concentration is observed to follow a logarithmic decay in the flow direction within the de-aeration zone.
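
As an illustration of the reported trend only, the short fit below recovers a logarithmic decay law $C(x) \approx a + b\,\ln x$ for the cross-sectionally averaged air concentration; the distances and concentrations are synthetic placeholders, not values from the study.

```python
# Illustrative sketch only: fit a logarithmic decay law to cross-sectionally
# averaged air concentration downstream of jet re-attachment.
# The data points below are synthetic, not results from the study.
import numpy as np

x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])      # streamwise distance from re-attachment [m] (assumed)
C = np.array([0.42, 0.33, 0.26, 0.18, 0.11])  # mean air concentration [-] (assumed)

b, a = np.polyfit(np.log(x), C, 1)            # least squares in ln(x): C ≈ a + b*ln(x)
print(f"C(x) ≈ {a:.3f} {b:+.3f}·ln(x)   (b < 0 indicates de-aeration)")
```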


Author(s):  
Ben D. Sawyer ◽  
Peter A. Hancock

Objective: This work assesses the efficacy of the “prevalence effect” as a form of cyberattack in human-automation teaming, using an email task. Background: Under the prevalence effect, rare signals are more difficult to detect, even when their proportionally low occurrence is taken into account. This decline represents a diminished human capability to both detect and respond. As signal probability (SP) approaches zero, accuracy exhibits logarithmic decay. Cybersecurity, a context in which the environment is entirely artificial, provides an opportunity to manufacture conditions that enhance or degrade human performance, such as prevalence effects. Email cybersecurity prevalence effects have not previously been demonstrated, nor intentionally manipulated. Method: The Email Testbed (ET) provides a simulation of clerical email work involving messages containing sensitive personal information. Using the ET, participants were presented with 300 email interactions and received cyberattacks at rates of 1%, 5%, or 20%. Results: The results demonstrated the existence and power of prevalence effects in email cybersecurity. Attacks delivered at a rate of 1% were significantly more likely to succeed, and the overall pattern of accuracy across declining SP exhibited logarithmic decay. Application: These findings suggest a “prevalence paradox” within human-machine teams. As automation reduces attack SP, the human operator becomes increasingly likely to fail to detect and report the attacks that remain. In the cyber realm, the potential to artificially inflict this state on adversaries, hacking the human operator rather than algorithmic defenses, is considered. Specific and general information security design countermeasures are offered.
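
As a hedged illustration of the reported accuracy pattern, the sketch below fits a logarithmic model of detection accuracy against signal probability at the three attack rates used in the experiment; the accuracy values are hypothetical, not the study's results.

```python
# Hedged illustration: the hit rates below are hypothetical, not the study's data.
# They only illustrate modelling accuracy as a logarithmic function of signal
# probability (SP), the pattern described in the abstract.
import numpy as np

sp = np.array([0.01, 0.05, 0.20])        # attack rates used in the experiment
acc = np.array([0.55, 0.70, 0.82])       # hypothetical detection accuracies

b, a = np.polyfit(np.log10(sp), acc, 1)  # accuracy ≈ a + b*log10(SP)
print(f"accuracy ≈ {a:.2f} {b:+.2f}·log10(SP)")
# Extrapolating toward SP → 0 illustrates the steep loss of detection at rare-attack
# prevalence that the abstract describes as the 'prevalence paradox'.
```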

