Passive Control of Radiation Directivity for Auditory Signals

2017 ◽ Vol 2017 (0) ◽ pp. 424
Author(s):  
Yoshihisa HONDA
1970 ◽ Vol 83 (3, Pt. 1) ◽ pp. 458-464
Author(s):  
Sydney J. Segal ◽  
Vincent Fusella

2006 ◽ Vol 32 (4) ◽ pp. 483-490
Author(s):  
Kristy L. Lindemann ◽  
Colleen Reichmuth Kastak ◽  
Ronald J. Schusterman

1989 ◽  
Author(s):  
John D. Charlton ◽  
James J. Brickley

Author(s):  
Guilherme Silva Prado ◽  
Heinsten Frederich Leal dos Santos

Author(s):  
Y Madhusudan Rao ◽  
Gayatri P ◽  
Ajitha M ◽  
P. Pavan Kumar ◽  
Kiran Kumar

The present investigation comprises the study of ex vivo skin flux and in vivo pharmacokinetics of thiocolchicoside (THC) from transdermal films. The films were fabricated by the solvent-casting technique employing a combination of hydrophilic and hydrophobic polymers. Fluxes of 18.08 µg/cm²·h and 13.37 µg/cm²·h were achieved for optimized formulations containing 1,8-cineole and oleic acid, respectively, as permeation enhancers. The observed flux values were higher than that of the passive control (8.66 µg/cm²·h). The highest skin permeation was observed when 1,8-cineole was used as the chemical permeation enhancer; it considerably improved THC transport across rat skin (2-2.5 fold). In vivo studies were performed in rabbits, and samples were analysed by LC-MS/MS. The mean area under the curve (AUC) of the transdermal film showed a statistically significant (p < 0.05) improvement in bioavailability of about 2.35-fold compared with oral administration of THC solution. The developed transdermal therapeutic systems using chemical permeation enhancers are suitable for drugs such as THC in the effective management of muscular pain.
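As a quick cross-check of the figures quoted above, the enhancement ratio (flux with enhancer divided by passive flux) and the relative bioavailability can be reproduced directly from the reported values. The short Python sketch below assumes only the numbers given in the abstract; the variable names and printout are illustrative and not part of the original study.

```python
# Minimal sketch: enhancement ratio and relative bioavailability from the
# values quoted in the abstract (illustrative inputs, not raw study data).

flux_passive = 8.66    # ug/cm^2.h, passive control
flux_cineole = 18.08   # ug/cm^2.h, 1,8-cineole formulation
flux_oleic = 13.37     # ug/cm^2.h, oleic acid formulation

# Enhancement ratio ER = J_enhancer / J_passive
er_cineole = flux_cineole / flux_passive   # ~2.09-fold
er_oleic = flux_oleic / flux_passive       # ~1.54-fold

# Relative bioavailability F_rel = AUC_transdermal / AUC_oral.
# The abstract reports the ratio (~2.35) but not the AUC values themselves,
# so only the reported ratio is echoed here.
f_rel_reported = 2.35

print(f"ER (1,8-cineole): {er_cineole:.2f}")
print(f"ER (oleic acid):  {er_oleic:.2f}")
print(f"Relative bioavailability (reported): {f_rel_reported:.2f}")
```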


1994 ◽  
Author(s):  
David M. Green

Symmetry ◽ 2020 ◽ Vol 12 (10) ◽ pp. 1718
Author(s):  
Chien-Hsing Chou ◽  
Yu-Sheng Su ◽  
Che-Ju Hsu ◽  
Kong-Chang Lee ◽  
Ping-Hsuan Han

In this study, we designed a four-dimensional (4D) audiovisual entertainment system called Sense. The system comprises a scene recognition system and hardware modules that provide haptic sensations for users when they watch movies and animations at home. In the scene recognition system, we used Google Cloud Vision to detect common scene elements in a video, such as fire, explosions, wind, and rain, and to further determine whether the scene depicts hot weather, rain, or snow. Additionally, for animated videos, we applied deep learning with a single-shot multibox detector to detect whether the animated video contained scenes with fire-related objects. The hardware module was designed to provide six types of haptic sensations, arranged line-symmetrically, for a better user experience. Based on the object-detection results from the scene recognition system, the system generates the corresponding haptic sensations. The system integrates deep learning, auditory signals, and haptic sensations to provide an enhanced viewing experience.
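To make the scene-to-haptics pipeline concrete, the sketch below shows how per-frame label detection with the Google Cloud Vision client library could be mapped to haptic cues. The LABEL_TO_HAPTIC table and the trigger_haptic() placeholder are assumptions for illustration; the actual Sense system also uses an SSD detector for animated content and a dedicated hardware driver, neither of which is reproduced here.

```python
# Minimal sketch, assuming Google Cloud Vision credentials are configured
# and the google-cloud-vision package is installed.
from google.cloud import vision

# Hypothetical mapping from detected scene elements to haptic channels.
LABEL_TO_HAPTIC = {
    "fire": "heat",
    "explosion": "impact",
    "wind": "airflow",
    "rain": "mist",
    "snow": "cold",
}


def detect_scene_labels(frame_path: str, min_score: float = 0.6) -> list[str]:
    """Return scene labels for one video frame via Cloud Vision label detection."""
    client = vision.ImageAnnotatorClient()
    with open(frame_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    return [
        label.description.lower()
        for label in response.label_annotations
        if label.score >= min_score
    ]


def trigger_haptic(channel: str) -> None:
    """Placeholder for the hardware module; prints instead of driving actuators."""
    print(f"haptic cue -> {channel}")


def process_frame(frame_path: str) -> None:
    """Detect scene elements in a frame and fire the matching haptic cues."""
    for label in detect_scene_labels(frame_path):
        if label in LABEL_TO_HAPTIC:
            trigger_haptic(LABEL_TO_HAPTIC[label])
```

In a real deployment, frames would be sampled from the video stream at a fixed interval and the detected cues debounced before being sent to the actuators; those details are omitted here for brevity.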

