external information
Recently Published Documents


TOTAL DOCUMENTS

739
(FIVE YEARS 300)

H-INDEX

35
(FIVE YEARS 6)

Publications ◽  
2022 ◽  
Vol 10 (1) ◽  
pp. 2
Author(s):  
Francisca Suau-Jiménez ◽  
Francisco Ivorra-Pérez

The recent COVID-19 pandemic has triggered an enormous stream of information. Parascientific digital communication has pursued different avenues, from mainstream media news to social networking, at times combined. Likewise, citizens have developed new discourse practices, with readers as active participants who claim authority. Based on a corpus of 500 reader comments from The Guardian, we analyse how readers build their authorial voice on COVID-19 news, as well as their agentive power and its implications. Methodologically, we draw upon stance markers, depersonalisation strategies, and heteroglossic markers, from the perspective of discursive interpersonality. Our findings reveal that stance markers are central for readers to build authority and produce content. Depersonalised and heteroglossic markers are also drawn upon, reinforcing readers’ authority with external information that mirrors expert scientific communication. Conclusions suggest a strong citizen agentive power that can either support news articles, spreading parascientific information, or challenge them, thereby contributing to the production of pseudoscientific messages.


Econometrics ◽  
2022 ◽  
Vol 10 (1) ◽  
pp. 4
Author(s):  
Chung-Yim Yiu ◽  
Ka-Shing Cheung

The age–period–cohort problem has been studied for decades but without resolution. Many solutions have been suggested to make the three effects estimable, but they mostly exploit non-linear specifications and may therefore suffer from misspecification or omitted-variable bias. This paper is a practically oriented study that aims to empirically disentangle age–period–cohort effects by providing external information on the actual depreciation of the housing structure rather than taking age as a proxy. It is based on appraisals of the improvement values of properties in New Zealand to estimate the age-depreciation effect. This research method provides a novel means of solving the identification problem of the age, period, and cohort trilemma. Based on about half a million housing transactions from 1990 to 2019 in the Auckland Region of New Zealand, the results show that traditional hedonic price models using age and time dummy variables can, ceteris paribus, result in unreasonably positive depreciation rates. The improvement-values model can help improve the accuracy of home value assessment and reduce estimation biases. This method also has important practical implications for property valuations.
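
To make the contrast concrete, the sketch below sets up the two hedonic specifications the abstract compares: a log-price regression with an age term and sale-year dummies, versus a model that replaces the age proxy with the appraised improvement (structure) value. This is a minimal illustration only; the file name, column names, and control variables are hypothetical and do not reproduce the paper's actual specification.

```python
# Hypothetical sketch of the two hedonic specifications contrasted above.
# Column names (sale_price, age, sale_year, floor_area, land_area,
# improvement_value) are illustrative, not the paper's variable set.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

sales = pd.read_csv("auckland_sales.csv")  # hypothetical transaction file

# (1) Traditional hedonic model: age plus time dummies. The age coefficient
# is the implied depreciation rate, but it is confounded with cohort effects.
m_age = smf.ols(
    "np.log(sale_price) ~ age + C(sale_year) + np.log(floor_area) + np.log(land_area)",
    data=sales,
).fit()

# (2) Improvement-value model: replace the age proxy with the appraised
# improvement (structure) value, which carries external information on
# actual depreciation and helps separate age, period, and cohort effects.
m_impr = smf.ols(
    "np.log(sale_price) ~ np.log(improvement_value) + C(sale_year) + np.log(land_area)",
    data=sales,
).fit()

print(m_age.params["age"])                            # a positive value is the 'unreasonable' sign
print(m_impr.params["np.log(improvement_value)"])     # elasticity w.r.t. structure value
```

The point of the comparison is that a well-behaved depreciation schedule should show prices falling with structure depreciation, which the improvement-value specification captures directly rather than through the confounded age term.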


2022 ◽  
pp. 1-20
Author(s):  
Rongjian Xie ◽  
Yucai Jia ◽  
Yuanmei Wu ◽  
Peiyun Zhang

During major epidemics, monitoring vaccine quality helps ensure public health and social stability. Considering that social media has become an important way for the public to obtain external information during an epidemic, we developed a dual regulatory system for vaccine quality with the government in the leading role and the participation of We Media, constructed a four-party evolutionary game model (government regulatory agency, We Media, vaccine industry groups, and the public), and analyzed the stability of each player’s strategy choice. The system’s possible equilibrium points are identified using Lyapunov’s first method. The game trajectories among the stakeholders are then simulated in MATLAB, and the effects of initial intentions and parameter values on the evolutionary process and outcomes are analyzed. The results show that, to ensure the quality and safety of vaccines and stabilize online public opinion during epidemics, the government should invest in an effective supervision mechanism. Strengthening responsibility, increasing penalties, and reducing supervision costs effectively raise the probability that vaccine industry groups provide high-quality vaccines. Restricting the behavior of We Media and supervising vaccine industry groups to reduce speculation lowers the cost of government supervision and improves its efficiency.
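
Four-party dynamics of this kind are typically analysed with replicator equations; the sketch below shows how such a system can be simulated numerically (the paper uses MATLAB; this is an equivalent Python sketch). The payoff-gap expressions are placeholders, not the paper's payoff matrix, so the trajectory is purely illustrative.

```python
# A minimal replicator-dynamics sketch of a four-party game (government
# regulator, We Media, vaccine industry groups, public). The payoff terms
# below are placeholders; the actual payoff structure comes from the paper.
import numpy as np
from scipy.integrate import solve_ivp

def payoff_gap(x, y, z, w):
    """Expected-payoff advantage of the 'cooperative' strategy for each player
    (placeholder linear forms, not the paper's values)."""
    dg = 2.0 - 3.0 * x + 1.5 * z            # government: strict vs. lax regulation
    dm = 1.0 + 2.0 * x - 2.5 * y            # We Media: report vs. speculate
    dv = -1.0 + 4.0 * x + 1.0 * w - 2.0 * z # industry: high- vs. low-quality vaccines
    dp = 0.5 + 1.0 * y - 1.0 * w            # public: participate vs. stay passive
    return dg, dm, dv, dp

def replicator(t, s):
    x, y, z, w = s
    dg, dm, dv, dp = payoff_gap(x, y, z, w)
    # Standard replicator equation for a binary strategy: p' = p (1 - p) * (payoff gap)
    return [x * (1 - x) * dg, y * (1 - y) * dm, z * (1 - z) * dv, w * (1 - w) * dp]

sol = solve_ivp(replicator, (0, 50), [0.3, 0.4, 0.2, 0.5], dense_output=True)
print(sol.y[:, -1])  # long-run strategy probabilities under these placeholder payoffs
```

With the paper's actual payoffs substituted in, the equilibrium points of this system are the candidates whose local stability is checked via Lyapunov's first method.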


2022 ◽  
Vol 2022 ◽  
pp. 1-10
Author(s):  
Hongbin Chen

With the continuous advancement of science and technology and the rapid development of robotics, it has become an inevitable trend for domestic robots to enter ordinary households. Elderly people and people with special needs may require the help of domestic robots because of limited mobility, so target localization based on monocular vision and an understanding of the NAO robot are of great importance. This paper studies target recognition and positioning during the NAO robot’s grasping process and proposes a recognition algorithm based on quantized component statistical information. First, the region of interest containing the target is extracted from the image and interference from the surrounding scene is eliminated to achieve target recognition. Because the NAO’s cameras have almost no common field of view and only one camera can be used at a time, the target is located using the monocular vision principle: the detection algorithm builds on the structure of the robot’s head, establishes the relationship between the change in head height and the tilt angle, and thereby improves monocular target detection on the NAO robot. Experiments show that the positioning accuracy at close range can be kept below 1 cm, and the robot completes grasping and handing over the target. Since roughly 80% of the external information that humans can perceive comes from vision, the approach also offers advantages such as high efficiency and good stability.
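
The geometric core of the ranging step, as described, relates the camera's height above the floor and the total downward pitch to the horizontal distance of a target lying on the ground plane. The sketch below is a flat-floor approximation under assumed camera parameters; the field of view, image size, and numerical values are illustrative, not the NAO calibration used in the paper.

```python
# A flat-floor monocular ranging sketch: given the camera height and the total
# depression angle to a ground target, the horizontal distance follows from
# simple trigonometry. The vertical field of view and image height below are
# illustrative assumptions.
import math

def ground_distance(camera_height_m, head_pitch_rad, pixel_v,
                    image_h=480, vfov_rad=math.radians(47.6)):
    """Estimate the horizontal distance to a target lying on the floor."""
    # Angle of the target below the optical axis, from its vertical pixel position.
    angle_in_image = (pixel_v - image_h / 2) / image_h * vfov_rad
    total_depression = head_pitch_rad + angle_in_image
    if total_depression <= 0:
        raise ValueError("Target is not below the horizon; cannot range on the floor plane.")
    return camera_height_m / math.tan(total_depression)

# Example: camera 0.45 m above the floor, head pitched down 20 degrees,
# target imaged 60 pixels below the image centre.
print(round(ground_distance(0.45, math.radians(20.0), 300), 3))
```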


Diagnostics ◽  
2022 ◽  
Vol 12 (1) ◽  
pp. 105
Author(s):  
Fallon Branch ◽  
Isabella Santana ◽  
Jay Hegdé

When making decisions under uncertainty, people in all walks of life, including highly trained medical professionals, tend to resort to using ‘mental shortcuts’, or heuristics. Anchoring-and-adjustment (AAA) is a well-known heuristic in which subjects reach a judgment by starting from an initial internal judgment (‘anchored position’) based on available external information (‘anchoring information’) and adjusting it until they are satisfied. We studied the effects of the AAA heuristic during diagnostic decision-making in mammography. We provided practicing radiologists (N = 27 across two studies) with a random number that we told them was a previous radiologist’s estimate of the probability that the mammogram they were about to see was positive for breast cancer. We then showed them the actual mammogram. We found that the radiologists’ own estimates of cancer in the mammogram reflected the random information they were provided and ignored the actual evidence in the mammogram. However, when the heuristic information was not provided, the same radiologists detected breast cancer in the same set of mammograms highly accurately, indicating that the effect was solely attributable to the availability of heuristic information. Thus, the effects of the AAA heuristic can sometimes be so strong as to override the actual clinical evidence in diagnostic tasks.


BMJ Open ◽  
2022 ◽  
Vol 12 (1) ◽  
pp. e051764
Author(s):  
Dane Lansdaal ◽  
Femke van Nassau ◽  
Marije van der Steen ◽  
Martine de Bruijne ◽  
Marian Smeulers

Objective: This study aims to obtain insight into the facilitators and barriers experienced when implementing a tailored value-based healthcare (VBHC) model in a Dutch university hospital, from the perspective of physicians and nurses. Method: A descriptive qualitative study was conducted with 12 physicians, nurses and managers from seven different care pathways who were involved in the implementation of a tailored VBHC methodology. Thematic content analysis was used to analyse the data, guided by the factors of the Consolidated Framework for Implementation Research (CFIR). Findings: The method designed for the implementation of the tailored VBHC methodology was regarded as a structured guide for the process. Throughout the implementation process, leadership and team dynamics were considered important for success. Sharing experiences with other value teams and cooperation with external Information Technology (IT) teams in the hospital were also mentioned as desirable. Patient involvement, which is part of the VBHC methodology, was considered useful for decision-making and for improving the care process because it gave better insight into topics that matter to patients. The time-consuming nature of the implementation process was named as a barrier to the VBHC methodology. In addition, shaping patient involvement and coping with ongoing changes in departments were found to be difficult. Finally, working with the Electronic Health Record and acquiring the necessary digital skills were often forgotten and thus hindered implementation. Conclusion: Clinical healthcare organisations implementing a tailored VBHC methodology will benefit from a structured implementation method, a well-led strong team and cooperation with (external) teams and patients. However, shaping patient involvement, alignment with other departments and attention to digitisation were seen as the most important concerns in implementation and require further attention.


2021 ◽  
Vol 11 (1) ◽  
pp. 28
Author(s):  
Ana Bárbara Cardoso ◽  
Bruno Martins ◽  
Jacinto Estima

This article describes a novel approach to toponym resolution with deep neural networks. The proposed approach does not involve matching references in the text against entries in a gazetteer; instead, it directly predicts geo-spatial coordinates. Multiple inputs are considered in the neural network architecture (e.g., the surrounding words are considered in combination with the toponym to be disambiguated), using pre-trained contextual word embeddings (i.e., ELMo or BERT) as well as bi-directional Long Short-Term Memory units, both of which are regularly used for modeling textual data. The intermediate representations are then used to predict a probability distribution over possible geo-spatial regions, and finally to predict the coordinates for the input toponym. The proposed model was tested on three datasets used in previous toponym resolution studies, specifically the (i) War of the Rebellion, (ii) Local–Global Lexicon, and (iii) SpatialML corpora. Moreover, we evaluated the effect of using (i) geophysical terrain properties as external information, including information on elevation or terrain development, among others, and (ii) additional data collected from Wikipedia articles, to further help with the training of the model. The obtained results show improvements over previous approaches with the proposed method, specifically when BERT embeddings and additional data are involved.
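
A schematic version of the described architecture is sketched below: pre-computed contextual embeddings (e.g., BERT or ELMo outputs) for the toponym and its surrounding words pass through a bidirectional LSTM, one head predicts a distribution over coarse geographic regions, and a second head regresses latitude/longitude conditioned on that distribution. The dimensions, the region grid size, and the exact way the two heads are combined are assumptions for illustration, not the paper's settings.

```python
# Schematic PyTorch sketch of a toponym resolver that predicts a region
# distribution and then geo-spatial coordinates from contextual embeddings.
import torch
import torch.nn as nn

class ToponymResolver(nn.Module):
    def __init__(self, emb_dim=768, hidden=128, n_regions=400):
        super().__init__()
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.region_head = nn.Linear(2 * hidden, n_regions)      # coarse region distribution
        self.coord_head = nn.Linear(2 * hidden + n_regions, 2)   # (latitude, longitude)

    def forward(self, context_embeddings):
        # context_embeddings: (batch, seq_len, emb_dim), e.g. BERT/ELMo outputs
        _, (h, _) = self.encoder(context_embeddings)
        h = torch.cat([h[-2], h[-1]], dim=-1)           # final forward + backward states
        region_logits = self.region_head(h)
        region_probs = torch.softmax(region_logits, dim=-1)
        coords = self.coord_head(torch.cat([h, region_probs], dim=-1))
        return region_logits, coords

model = ToponymResolver()
dummy = torch.randn(4, 32, 768)           # a batch of 4 contexts, 32 tokens each
region_logits, coords = model(dummy)
print(region_logits.shape, coords.shape)  # torch.Size([4, 400]) torch.Size([4, 2])
```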


Author(s):  
Hai Wang ◽  
Baoshen Guo ◽  
Shuai Wang ◽  
Tian He ◽  
Desheng Zhang

Rising concern about mobile communication performance has driven growing demand for mobile network signal maps, which are widely used in network monitoring, spectrum management, and indoor/outdoor localization. Existing approaches such as site surveys are time-consuming and labor-intensive, and struggle to maintain an up-to-date, fine-grained signal map over a large area. The mobile crowdsensing (MCS) paradigm is a promising alternative for building signal maps because collecting large-scale MCS data is low-cost and requires little extra effort. However, the dynamic environment and the mobility of the crowd cause spatio-temporal uncertainty and sparsity in MCS data. In this work, we leverage MCS as an opportunity to construct city-wide mobile network signal maps. We propose a fine-grained city-wide Cellular Signal Map Construction (CSMC) framework to address two challenges: (i) missing and unreliable MCS data, and (ii) spatio-temporal uncertainty of signal propagation. In particular, CSMC captures the spatio-temporal characteristics of signals from both inter- and intra-cellular base stations and recovers missing signals with Bayesian tensor decomposition to build large-area fine-grained signal maps. Furthermore, CSMC develops a context-aware multi-view fusion network to make full use of external information and enhance signal map construction accuracy. To evaluate the performance of CSMC, we conduct extensive experiments and ablation studies on a large-scale dataset with over 200 GB of MCS signal records collected in Shanghai. Experimental results demonstrate that our model outperforms state-of-the-art baselines in the accuracy of signal estimation and user localization.
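
The missing-signal-recovery step can be illustrated with a simplified stand-in: the paper uses Bayesian tensor decomposition, while the sketch below uses a plain CP decomposition with EM-style imputation over a synthetic (base station × grid cell × time slot) signal tensor. The shapes, rank, and observation rate are illustrative assumptions.

```python
# Simplified stand-in for tensor-based missing-signal recovery: fit a low-rank
# CP model to the observed entries and fill the missing ones with its
# reconstruction, iterating a few times (EM-style imputation).
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)

# Synthetic low-rank "signal map" tensor: base stations x grid cells x hours (dBm).
A, B, C = rng.normal(size=(20, 3)), rng.normal(size=(50, 3)), rng.normal(size=(24, 3))
signal = -90.0 + 5.0 * np.einsum("ir,jr,kr->ijk", A, B, C) + rng.normal(0, 1.0, (20, 50, 24))
observed = rng.random(signal.shape) > 0.6          # sparse crowdsensed coverage mask

filled = np.where(observed, signal, signal[observed].mean())
for _ in range(10):
    cp = parafac(tl.tensor(filled), rank=5, n_iter_max=50)
    recon = tl.to_numpy(tl.cp_to_tensor(cp))
    filled = np.where(observed, signal, recon)     # keep observations, impute the rest

rmse = np.sqrt(np.mean((filled[~observed] - signal[~observed]) ** 2))
print(f"recovery RMSE on unobserved cells: {rmse:.2f} dB")
```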


2021 ◽  
Author(s):  
Qingxing Cao ◽  
Wentao Wan ◽  
Xiaodan Liang ◽  
Liang Lin

Despite their significant success in various domains, data-driven deep neural networks compromise feature interpretability, lack global reasoning capability, and cannot incorporate the external information that is crucial for complicated real-world tasks. Since structured knowledge provides rich cues for recording human observations and commonsense, it is desirable to bridge symbolic semantics with learned local feature representations. In this chapter, we review works that incorporate different kinds of domain knowledge into the intermediate feature representation. These methods first construct a domain-specific graph that represents the relevant human knowledge. They then characterize node representations with neural network features and perform graph convolution to enhance these symbolic nodes via a graph neural network (GNN). Lastly, they map the enhanced node features back into the neural network for further propagation or prediction. By integrating knowledge graphs into neural networks, one can couple feature learning and graph reasoning under the same supervised loss function and achieve a more effective and interpretable way to introduce structural constraints.
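
The three-step pattern reviewed here can be sketched as a small module: project local network features onto the nodes of a domain knowledge graph, propagate them with one graph-convolution step over the graph's normalised adjacency, and map the enhanced node features back into the feature space. The graph size, dimensions, and toy adjacency below are illustrative assumptions, not any specific reviewed model.

```python
# Minimal sketch of feature -> knowledge-graph -> feature integration.
import torch
import torch.nn as nn

class KnowledgeGraphModule(nn.Module):
    def __init__(self, adjacency, feat_dim=256, node_dim=128):
        super().__init__()
        A = adjacency + torch.eye(adjacency.size(0))              # add self-loops
        d = A.sum(dim=1)
        self.register_buffer("A_hat", A / torch.outer(d.sqrt(), d.sqrt()))  # symmetric normalisation
        self.to_nodes = nn.Linear(feat_dim, node_dim)             # features -> symbolic nodes
        self.gcn = nn.Linear(node_dim, node_dim)                  # one graph-convolution step
        self.to_features = nn.Linear(node_dim, feat_dim)          # nodes -> feature space

    def forward(self, local_features):
        # local_features: (batch, n_nodes, feat_dim), one slot per graph node
        h = torch.relu(self.to_nodes(local_features))
        h = torch.relu(self.A_hat @ self.gcn(h))                  # propagate over the knowledge graph
        return local_features + self.to_features(h)               # residual mapping back

n_nodes = 40
adj = (torch.rand(n_nodes, n_nodes) > 0.9).float()
adj = ((adj + adj.T) > 0).float()                                 # symmetric toy graph
module = KnowledgeGraphModule(adj)
out = module(torch.randn(8, n_nodes, 256))
print(out.shape)  # torch.Size([8, 40, 256])
```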


2021 ◽  
Vol 3 (3) ◽  
pp. 31-44
Author(s):  
Nenubari Ikue John ◽  
Emeka Nkoro ◽  
Jeremiah Anietie

There is a pool of techniques and methods for addressing dynamic behaviour in high-frequency data, prominent among them the ARCH/GARCH family. In this paper, the various types and assumptions of the ARCH/GARCH models were tried in examining the dynamics of the exchange rate and international crude oil prices in Nigeria. It was observed that the behaviour of the Nigerian foreign exchange rate did not conform to the assumptions of the ARCH/GARCH models, hence this paper adopted the Lag Variables Autoregressive (LVAR) technique originally developed by Agung, together with the Heij multiplier, to examine the dynamic response of the Nigerian foreign exchange rate to crude oil prices. The Heij coefficient was used to calculate the dynamic multipliers, while the Engle & Granger two-step technique was used for the cointegration analysis. The results revealed an insignificant long-term dynamic response of the exchange rate to crude oil prices within the periods under review, and the coefficient of dynamism was insignificant in most of the sub-periods. The paper equally revealed that the significance of the dynamic multipliers depends greatly on external information about both markets, whose indicators interact in both directions. Thus, the paper recommends periodic intervention in the foreign exchange market by the monetary authorities to stabilize the market against shocks in the international crude oil market, since crude oil is the main source of foreign exchange in Nigeria.
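
The empirical steps named above can be sketched as follows: an Engle & Granger two-step cointegration check between the exchange rate and crude oil prices, followed by a simple lag-variable regression whose lag coefficients serve as short-run dynamic multipliers. The data file and column names are hypothetical placeholders, and the lag structure is a simplified stand-in for the paper's LVAR specification.

```python
# Hedged sketch: Engle-Granger cointegration test plus a lag-variable regression.
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

data = pd.read_csv("ngn_fx_oil.csv", parse_dates=["date"], index_col="date")  # hypothetical file
fx, oil = data["exchange_rate"], data["brent_price"]

# Engle & Granger two-step: regress fx on oil, then test the residuals for a unit root.
t_stat, p_value, _ = coint(fx, oil)
print(f"Engle-Granger p-value: {p_value:.3f}")   # small p-value suggests cointegration

# Lag-variable regression: current and lagged oil prices plus the lagged exchange rate.
X = pd.concat({"oil": oil, "oil_l1": oil.shift(1), "oil_l2": oil.shift(2),
               "fx_l1": fx.shift(1)}, axis=1).dropna()
model = sm.OLS(fx.loc[X.index], sm.add_constant(X)).fit()
print(model.params[["oil", "oil_l1", "oil_l2"]])  # short-run dynamic multipliers
```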

