Transcription Profiling of the Metal-hyperaccumulator Thlaspi caerulescens (J. & C. PRESL)

2005 · Vol 60 (3-4) · pp. 216-223
Author(s): Markus Plessl, Diana Rigola, Viivi Hassinen, Mark G. M. Aarts, Henk Schat, et al.

Thlaspi caerulescens is a well-studied metal hyperaccumulator of zinc, cadmium, and nickel belonging to the Brassicaceae family. Moreover, it is one of the few hyperaccumulators that occur on different metalliferous soil types as well as on nonmetalliferous soils. We are interested in developing systems to improve the phytoremediation of metal-contaminated soils through improved metal accumulation. About 1900 cDNAs isolated from T. caerulescens roots were hybridized with reverse-transcribed RNA from zinc-treated T. caerulescens plants of two accessions originating from two different soil types. This comparative transcript profiling of T. caerulescens plants resulted in the identification of genes that are affected by heavy metals. The developed microarray proved to be an appropriate tool for large-scale analysis of gene expression in this metal-accumulating species.
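The abstract does not describe the downstream analysis, but the core step of such a comparison is flagging cDNAs whose hybridization signal differs between treatments or accessions. Below is a minimal, hypothetical fold-change screen in Python; the function name, threshold, and placeholder data are illustrative assumptions, not the authors' pipeline.

```python
# Hypothetical sketch: flag metal-responsive cDNAs by fold change
# between zinc-treated and control hybridization intensities.
import numpy as np

def flag_responsive_genes(treated, control, fc_cutoff=2.0):
    """Return indices of cDNAs whose treated/control intensity ratio
    exceeds the fold-change cutoff in either direction."""
    treated = np.asarray(treated, dtype=float)
    control = np.asarray(control, dtype=float)
    # +1.0 avoids log(0) for cDNAs with zero signal.
    log_ratio = np.log2(treated + 1.0) - np.log2(control + 1.0)
    return np.flatnonzero(np.abs(log_ratio) >= np.log2(fc_cutoff))

# Placeholder data for 1900 cDNAs (random, not the study's measurements).
rng = np.random.default_rng(0)
control = rng.lognormal(8, 1, 1900)
treated = control * rng.lognormal(0, 0.5, 1900)
print(len(flag_responsive_genes(treated, control)), "cDNAs flagged")
```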

1999 · Vol 39 (12) · pp. 63-67
Author(s): B. L. Turner, P. M. Haygarth

Phosphorus (P) transfer from agricultural land to surface waters can contribute to eutrophication, excess algal growth, and associated water-quality problems. Grasslands have a high potential for P transfer, as they receive P inputs as mineral fertiliser and in concentrates cycled through livestock manures. The transfer of P can occur through surface and subsurface pathways, although the capacity of most soils to fix inorganic P has meant that subsurface P transfer by leaching has often been perceived as negligible. We investigated this using large-scale monolith lysimeters (135 cm deep, 80 cm diameter) to monitor leachate P under four grassland soil types. Leachate was collected during the 1997–98 drainage year and analysed for a range of P fractions. Mean concentrations of total P routinely exceeded 100 μg l⁻¹ for all soil types and therefore exceeded the P concentrations above which eutrophication and algal growth can occur. The majority of the leachate P was in algal-available, molybdate-reactive (inorganic) forms, although a large proportion occurred in unreactive (organic) forms. We suggest that subsurface transfer by leaching can represent a significant mechanism for agricultural P transfer from some soils and must be given greater consideration as a potential source of diffuse P pollution to surface waters.
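The fraction arithmetic implied by the abstract is simple: unreactive (organic) P is estimated as total P minus molybdate-reactive (inorganic) P, and mean total P is compared against a eutrophication threshold. A small sketch, using made-up sample values and the 100 μg l⁻¹ figure from the abstract as the threshold:

```python
# Illustrative sketch only; sample values below are invented, not the
# study's lysimeter data.
EUTROPHICATION_THRESHOLD_UG_L = 100.0  # ug P per litre, from the abstract

def summarise_leachate(total_p, reactive_p):
    """Estimate organic P as total P minus Mo-reactive P, and check the
    mean total P against the eutrophication threshold (all in ug/l)."""
    organic_p = [tp - rp for tp, rp in zip(total_p, reactive_p)]
    mean_tp = sum(total_p) / len(total_p)
    return {
        "mean_total_p_ug_l": mean_tp,
        "exceeds_threshold": mean_tp > EUTROPHICATION_THRESHOLD_UG_L,
        "mean_organic_p_ug_l": sum(organic_p) / len(organic_p),
    }

# Invented weekly samples from one lysimeter:
print(summarise_leachate([120.0, 150.0, 95.0, 180.0],
                         [80.0, 100.0, 60.0, 120.0]))
```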


2020 · pp. 1-26
Author(s): Qinwen Hu, Muhammad Rizwan Asghar, Nevil Brownlee

HTTPS refers to an application-specific implementation that runs HyperText Transfer Protocol (HTTP) on top of Secure Sockets Layer (SSL) or Transport Layer Security (TLS). HTTPS is used to provide encrypted communication and secure identification of web servers and clients for purposes such as online banking and e-commerce. However, many HTTPS vulnerabilities have been disclosed in recent years. Although many studies have pointed out that these vulnerabilities can lead to serious consequences, domain administrators seem to ignore them. In this study, we evaluate the HTTPS security level of Alexa's top 1 million domains from two perspectives. First, we explore which popular sites are still affected by well-known security issues. Our results show that less than 0.1% of HTTPS-enabled servers in the measured domains are still vulnerable to known attacks, including Rivest Cipher 4 (RC4), Compression Ratio Info-leak Made Easy (CRIME), Padding Oracle On Downgraded Legacy Encryption (POODLE), Factoring RSA Export Keys (FREAK), Logjam, and Decrypting Rivest–Shamir–Adleman (RSA) with Obsolete and Weakened eNcryption (DROWN). Second, we assess the security level of the digital certificates used by each measured HTTPS domain. Our results highlight that fewer than 0.52% of domains use an expired certificate, 0.42% of HTTPS certificates contain a hostname that does not match the domain, and 2.59% of HTTPS domains use a self-signed certificate. The domains we investigate cover five regions (corresponding to the ARIN, RIPE NCC, APNIC, LACNIC, and AFRINIC registries) and 61 categories, such as online shopping, banking, educational, and government websites. Although our results show that the problem persists, we find that changes have taken place as HTTPS vulnerabilities were disclosed. Through this three-year study, we found that more attention has been paid to the use and configuration of HTTPS. For example, more and more domains have enabled the HTTPS protocol to ensure a secure communication channel between users and websites. In the first measurement, we observed that many domains were still using the TLS 1.0, TLS 1.1, SSL 2.0, and SSL 3.0 protocols to support clients on outdated systems. As previous studies revealed the security risks of these protocols, in the subsequent measurements we found that the majority of domains had updated their TLS protocol in time. Our 2020 results suggest that most HTTPS domains use the TLS 1.2 protocol, but they also show that some HTTPS domains remain vulnerable to known attacks. As academics and industry professionals continue to disclose attacks against HTTPS and recommend secure HTTPS configurations, we found that the number of vulnerable domains is gradually decreasing every year.
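A per-domain check of the kind described above, recording the negotiated TLS version and testing whether the certificate passes standard trust checks (expiry, hostname match, chain of trust, which a self-signed certificate fails), can be sketched with Python's standard ssl module. This is an illustrative reconstruction, not the authors' measurement tool.

```python
# Sketch of a single-domain HTTPS probe using only the standard library.
import socket
import ssl

def probe_https(hostname, port=443, timeout=5.0):
    """Connect to hostname:port over TLS, record the negotiated protocol
    version, and capture any certificate verification failure."""
    ctx = ssl.create_default_context()  # verifies chain, expiry, hostname
    result = {"hostname": hostname, "tls_version": None, "cert_error": None}
    try:
        with socket.create_connection((hostname, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
                result["tls_version"] = tls.version()  # e.g. "TLSv1.2"
    except ssl.SSLCertVerificationError as exc:
        # Raised for expired, self-signed, or hostname-mismatched certs.
        result["cert_error"] = exc.verify_message
    return result

print(probe_https("example.com"))
```

Running such a probe across a large domain list (with concurrency and retries added) would reproduce the two measurement perspectives the abstract describes: protocol version distribution and certificate hygiene.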


2021 · pp. 089443932110068
Author(s): Aleksandra Urman, Mykola Makhortykh, Roberto Ulloa

We examine how six search engines filter and rank information in relation to queries on the U.S. 2020 presidential primary elections under default (that is, nonpersonalized) conditions. To do so, we utilize an algorithmic auditing methodology that uses virtual agents to conduct a large-scale analysis of algorithmic information curation in a controlled environment. Specifically, we look at the text search results for the queries "us elections," "donald trump," "joe biden," and "bernie sanders" on Google, Baidu, Bing, DuckDuckGo, Yahoo, and Yandex during the 2020 primaries. Our findings indicate substantial differences in search results between search engines, as well as multiple discrepancies within the results generated for different agents using the same search engine. This highlights that whether users see certain information is partly decided by chance, due to the inherent randomization of search results. We also find that some search engines prioritize different categories of information sources with respect to specific candidates. These observations demonstrate that algorithmic curation of political information can create information inequalities between search engine users even under nonpersonalized conditions. Such inequalities are particularly troubling given that search results are highly trusted by the public and, as demonstrated by previous research, can shift the opinions of undecided voters.
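One simple way to quantify the between-agent discrepancies the abstract reports is the mean pairwise overlap of the top-N result lists returned to different agents for the same query. The Jaccard metric and the sample data below are illustrative choices, not necessarily the measures or results used in the paper.

```python
# Hypothetical sketch: mean pairwise Jaccard overlap between the
# top-N result lists served to different virtual agents.
from itertools import combinations

def jaccard(a, b):
    """Set overlap between two result lists (1.0 = identical sets)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def mean_pairwise_overlap(result_lists):
    pairs = list(combinations(result_lists, 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Invented top-5 domains served to three agents for one query:
agents = [
    ["nytimes.com", "cnn.com", "wikipedia.org", "foxnews.com", "ap.org"],
    ["cnn.com", "wikipedia.org", "nytimes.com", "reuters.com", "ap.org"],
    ["wikipedia.org", "bbc.com", "cnn.com", "nytimes.com", "politico.com"],
]
print(f"mean pairwise overlap: {mean_pairwise_overlap(agents):.2f}")
```

An overlap well below 1.0 for agents querying the same engine at the same time is what the abstract describes as randomization-driven discrepancy.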


2021 · Vol 33 · pp. 258-269
Author(s): Matilda Holmes, Richard Thomas, Helena Hamerow
