RSS feeds: Recently Published Documents

TOTAL DOCUMENTS: 132 (five years: 7)
H-INDEX: 8 (five years: 1)

Author(s): Jane Igie Aba, Theresa Osasu Makinde

This chapter addresses the relevance of Web 2.0 for library services in the digital era. Web 2.0 tools play a crucial role in librarians' effective service delivery. The study covers the awareness, utilization, benefits, and challenges that affect librarians' use of Web 2.0 for effective service delivery. The concepts generally imply that Web 2.0 can be used by librarians as information acquisition tools to gather information from sources outside libraries (e.g., blogs and wikis), information dissemination tools (such as RSS feeds), information organization tools that facilitate the storage and subsequent retrieval of information (social bookmarking and tagging), and information sharing tools that facilitate the bilateral flow of information between libraries and patrons (social networking and media sharing sites). The chapter also examines the concept of digital libraries and highlights the major features of a digital library and how it can be used. The potential of the digital library is crucial as a means of access to the knowledge and information that facilitate development.
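The chapter treats RSS feeds as an information dissemination tool, so a minimal sketch may help make that concrete: the Python snippet below reads the newest items from a feed, as a library service might when gathering or redistributing updates. The feed URL and the use of the third-party feedparser package are assumptions for illustration, not anything prescribed in the chapter.

# Minimal sketch: consuming an RSS feed as a dissemination/acquisition channel.
# The feed URL is a hypothetical placeholder.
import feedparser  # third-party: pip install feedparser

FEED_URL = "https://example-library.org/new-titles.rss"  # hypothetical

def latest_items(url, limit=5):
    """Return (title, link) pairs for the newest entries in the feed."""
    feed = feedparser.parse(url)
    return [(entry.get("title", ""), entry.get("link", ""))
            for entry in feed.entries[:limit]]

if __name__ == "__main__":
    for title, link in latest_items(FEED_URL):
        print(title, "->", link)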


Author(s): Khalid Mahboob, Fayyaz Ali, Hafsa Nizami

2019, Vol 23 (5), pp. 801-816
Author(s): Niamh Kirk

Irish emigration has resulted in large and highly organised diasporas in the United States, the United Kingdom and Australia, which sustain commercially successful ethnic news organisations serving those communities' informational and cultural needs. Some of these titles have operated in print for decades and expanded their operations as they transitioned online. Diaspora journalism plays an important role in recreating ethnic identity among deterritorialised Irish audiences. However, little is understood about which aspects of homeland culture diaspora news media represent, how 'Irishness' is characterised, or the extent to which these representations can be regarded as homogeneous across different hostlands. This research focuses on Irish diasporic news organisations, comparing how news titles in each of the regions represented Irish identity over six months in 2016. Using RSS feeds and automated data entry, it maps the news flows from Ireland to the digital diaspora press in each region, revealing differences in the salience of news categories and topics. In addition, a comparative frame analysis of how the 1916 Centenary in Ireland was covered revealed differences in the conceptualisation and representation of this part of Irish culture. This article highlights the complexity of diaspora news media's role in representing ethnic identities as they respond to and republish homeland current affairs. It reveals unbalanced news flows to the diaspora press and divergences among Irish diasporic news media over how transnational Irish culture is conceptualised and represented.
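The abstract does not detail the collection pipeline, so the following Python sketch only illustrates the general idea of RSS-based collection with automated data entry: items are pulled from a set of hostland feeds and tallied into rough topic categories. The feed URLs and keyword buckets are hypothetical, not the study's actual instrument.

# Sketch: collect items from diaspora news feeds and tally simple topic
# categories per hostland. Feed URLs and keywords are hypothetical.
from collections import Counter
import feedparser  # third-party: pip install feedparser

FEEDS = {  # hypothetical hostland feeds
    "US": "https://example-irish-us-news.com/rss",
    "UK": "https://example-irish-uk-news.com/rss",
    "AU": "https://example-irish-au-news.com/rss",
}
CATEGORIES = {  # crude keyword buckets, for illustration only
    "politics": ("election", "government", "dail"),
    "sport": ("gaa", "hurling", "rugby"),
    "culture": ("1916", "centenary", "music"),
}

def categorise(title):
    lowered = title.lower()
    for category, keywords in CATEGORIES.items():
        if any(word in lowered for word in keywords):
            return category
    return "other"

counts = {region: Counter() for region in FEEDS}
for region, url in FEEDS.items():
    for entry in feedparser.parse(url).entries:
        counts[region][categorise(entry.get("title", ""))] += 1

for region, tally in counts.items():
    print(region, dict(tally))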


Comunicar, 2019, Vol 27 (59), pp. 29-38
Author(s): Concha Edo, Juan Yunquera, Helder Bastos

The growing expansion of Internet access and the mass-scale use of social networking platforms and search engines have forced digital newspapers to confront several challenges: the need to constantly update news, the increasing complexity of sources, the difficulty of exercising their function as gatekeepers in a fragmented environment in which the opinions, biases and preconceptions of pundits, their followers, Twitter users, etc. have taken on a new and decisive weight, and the mounting pressure to publish certain news items simply because they sell. They must also share audiences with aggregators devoted to the business of disseminating content produced by digital news publishers, blogs and RSS feeds, content chosen on the basis of search engine algorithms, the votes of users or the preferences of readers. The fact that these computerized systems of news distribution seldom employ the criteria upon which journalism is based suggests that the work of gatekeeping is being reframed in a way that progressively eliminates journalists from the process of deciding what is newsworthy. This study of these trends entailed a 47-point assessment of 30 news aggregators currently providing syndicated content and eight semi-structured interviews with editors of quality mass-distribution digital newspapers published in the U.S., Spain and Portugal.
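To make the aggregation mechanism the article describes more concrete, here is a small Python sketch of an aggregator that merges items from publishers' RSS feeds and orders them by a non-journalistic signal such as user votes; the feed URLs and the vote store are hypothetical placeholders, and real aggregators combine far more signals.

# Sketch: merge publisher RSS feeds and rank items by popularity rather
# than editorial judgement. Feeds and votes are hypothetical.
import feedparser  # third-party: pip install feedparser

PUBLISHER_FEEDS = [
    "https://example-daily.com/rss",
    "https://example-herald.com/rss",
]
user_votes = {}  # link -> vote count, e.g. loaded from a database

def aggregate(feeds, votes):
    items = []
    for url in feeds:
        for entry in feedparser.parse(url).entries:
            link = entry.get("link", "")
            items.append((votes.get(link, 0), entry.get("title", ""), link))
    return sorted(items, reverse=True)  # highest-voted items first

for score, title, link in aggregate(PUBLISHER_FEEDS, user_votes)[:10]:
    print(score, title, link)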


2019, Vol 37 (8_suppl), pp. 77-77
Author(s): Craig Whittington, Todd Feinman, Sandra Zelman Lewis, Greg Lieberman, Michael del Aguila

Background: In February 2018, ASCO published a guideline on how clinicians should manage immune-related adverse events (irAEs) in cancer patients treated with immune checkpoint inhibitors (ICPis). Recommendations were based on informal consensus due to a lack of "high-quality" evidence. Our objective was to determine whether DOC Search, a cloud-based AI search engine, could be used to rapidly determine if any new evidence matches the inclusion criteria of the guideline. Methods: PubMed, ASCO abstracts, and 85 RSS feeds were queried within DOC Search to identify publications since the guideline search was last conducted. DOC Search automatically includes comprehensive synonym lists for the search terms entered, and annotates co-occurring characteristics, interventions and outcomes. Results: Between 11/1/2017 and 10/31/2018, 1178 published references were identified (85.7% from PubMed, 13.8% from ASCO, and 0.5% from official RSS feeds). Title/abstract screening of a sample of the most recent articles indicated that 44% were relevant, and of these, 8% specifically reported research on the management of irAEs. Through automated term indexing of search results, some of the most frequently reported terms were melanoma, corticosteroids, and colitis (Table). Conclusions: DOC Search employs robust ontology mapping (including UMLS, ASCO's proprietary toxonomy, and more), which obviated the need for complex search strings. The machine learning and natural language processing technology provided real-time analysis and automated term indexing of search results, improving our understanding of the rapidly changing evidence landscape. This analysis met the objective to use DOC Search for rapid identification and review of new published evidence for an existing guideline. [Table: see text]
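DOC Search itself is proprietary, so the Python sketch below only illustrates, under stated assumptions, the general approach the abstract describes: expanding a search term with synonyms and restricting RSS feed items to the guideline's date window. The feed URL and synonym list are hypothetical and are not ASCO's ontology.

# Sketch: synonym-expanded keyword search over RSS items within a date
# window. Feed URL and synonyms are hypothetical placeholders.
import calendar
from datetime import datetime, timezone
import feedparser  # third-party: pip install feedparser

FEEDS = ["https://example-oncology-journal.org/rss"]  # hypothetical
SYNONYMS = ["irae", "immune-related adverse event", "immune related toxicity"]
START = datetime(2017, 11, 1, tzinfo=timezone.utc)
END = datetime(2018, 10, 31, 23, 59, 59, tzinfo=timezone.utc)

def matches(entry, terms):
    text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
    return any(term in text for term in terms)

def in_window(entry):
    parsed = entry.get("published_parsed")  # a UTC struct_time, if present
    if parsed is None:
        return False
    published = datetime.fromtimestamp(calendar.timegm(parsed), tz=timezone.utc)
    return START <= published <= END

hits = [entry.get("title", "")
        for url in FEEDS
        for entry in feedparser.parse(url).entries
        if in_window(entry) and matches(entry, SYNONYMS)]
print(len(hits), "candidate references")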


2019, pp. 761
Author(s): نضال زكى العمارين, خلدون خليل سليم الحباشنة
Keyword(s):

2018, Vol 10 (1)
Author(s): Heather Baker, Asher Grady, Collin Schwantes, Emily Iarocci, Rachel Campbell, ...

Objective: The National Biosurveillance Integration Center (NBIC) is deploying a scalable, flexible open source data collection, analysis, and dissemination tool to support biosurveillance operations by the U.S. Department of Homeland Security (DHS) and its federal interagency partners.

Introduction: NBIC integrates, analyzes, and distributes key information about health and disease events to help ensure the nation's responses are well-informed, save lives, and minimize economic impact. To meet its mission objectives, NBIC utilizes a variety of data sets, including open source information, to provide comprehensive coverage of biological events occurring across the globe. NBIC Biofeeds is a digital tool designed to improve the efficiency of analyzing large volumes of open source reporting and increase the number of relevant insights gleaned from this dataset. Moreover, the tool provides a mechanism to disseminate tailored, electronic message notifications in near-real time so that NBIC can share specific information of interest with its interagency partners in a timely manner.

Methods: NBIC intends to implement operational use of the capability in FY 2018. The core components of the system are data collection, curation, and dissemination of information deemed important by NBIC subject matter experts. NBIC Biofeeds has captured information from more than 70,000 unique sources published around the globe and presents, on average, 9,000 new biosurveillance-relevant articles to users each day. NBIC leverages a variety of data feeds, including third-party aggregators like Google and subscription-based feeds such as HealthMap, as well as Really Simple Syndication (RSS) feeds and web scraping of highly relevant sources. The NBIC biosurveillance taxonomy embedded in the tool consists of more than 600 metadata targets that cover key information for understanding the significance of an active biological event, including etiologic agents, impact on humans and animals (e.g., infection severity, healthcare workers involved, type of host), social disruption, infrastructure strain, countermeasures engaged, and 'red flag' characteristics (e.g., pathogen appearance in a new geographic area, unusual clinical signs). This taxonomy serves as a foundation for data curation and can be tailored by NBIC partners to more directly meet their own mission objectives. At this time, metadata is predominantly captured by NBIC analysts, who manually tag information, which triggers the population of three automatically disseminated products from the tool: 1) the NBIC Daily Biosurveillance Review, 2) immediate and daily summary email notifications, and 3) custom-designed RSS feeds. These products are meant for individual recipients in the federal interagency and for consumption by other biosurveillance information technology systems, such as the Department of Defense, Defense Threat Reduction Agency (DTRA) Biosurveillance Ecosystem (BSVE). NBIC is working in partnership with DTRA to integrate NBIC Biofeeds as an application directly into the BSVE and to further develop the BSVE as an all-in-one platform for biosurveillance data analytics. To improve the efficiency and effectiveness of gaining insights using NBIC Biofeeds, developers of the tool at the Pacific Northwest National Laboratory (PNNL) are researching and testing a variety of advanced analytics techniques focused on: 1) article relevancy ratings to improve the review of queried data, 2) significance ratings to elucidate the perceived severity of an event based on reported characteristics, 3) full-text article retrieval and storage for improved machine tagging, and 4) anomaly detection for emerging threats. Testing and implementation of new analytic capabilities in NBIC Biofeeds is planned for this fiscal year.

Results: NBIC Biofeeds was developed to serve as a sophisticated and powerful open source biosurveillance technology of value to the federal government by providing information to stakeholders conducting open source biosurveillance as well as those consuming biosurveillance information. In FY 2018, NBIC Biofeeds will begin operational use by NBIC and an initial set of users in various federal agencies. User accounts for testing purposes will be available to other federal partners, and a broad scope of federal stakeholders can receive products directly from NBIC Biofeeds based on their interests.

Conclusions: NBIC Biofeeds is expected to enable more rapid recognition and enhanced analysis of emerging biological events by NBIC analysts. NBIC anticipates that other federal agencies with biosurveillance missions will find this technology of value and intends to offer use of the platform to those federal partners that can benefit from access to the tool and the information generated by NBIC Biofeeds.
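The internals of NBIC Biofeeds are not public beyond this outline, so the Python sketch below only illustrates the collect, tag, and notify pattern the abstract describes, under stated assumptions: the source feed, the taxonomy slice, and the notification hook are hypothetical stand-ins for the real system's feeds, 600-target taxonomy, and dissemination products.

# Sketch: poll an RSS feed, tag items against a small taxonomy of
# metadata targets, and emit a notification for tagged items.
# Everything named here is a hypothetical placeholder.
import feedparser  # third-party: pip install feedparser

SOURCE_FEEDS = ["https://example-health-ministry.example/outbreaks.rss"]
TAXONOMY = {  # tiny illustrative slice of metadata targets
    "etiologic_agent": ["influenza", "ebola", "anthrax"],
    "red_flag": ["new geographic area", "unusual clinical signs"],
}

def tag(entry):
    """Return the taxonomy targets whose keywords appear in the item text."""
    text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
    return {target: [kw for kw in keywords if kw in text]
            for target, keywords in TAXONOMY.items()
            if any(kw in text for kw in keywords)}

def notify(entry, tags):
    # stand-in for the email and custom RSS dissemination products
    print(f"ALERT: {entry.get('title', '')} -> {tags}")

for url in SOURCE_FEEDS:
    for entry in feedparser.parse(url).entries:
        tags = tag(entry)
        if tags:
            notify(entry, tags)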


2018, Vol 7 (1), pp. 47-64
Author(s): Fekade Getahun, Richard Chbeir
Keyword(s):
