Defab: Architecture for a Circular Economy

2021
Author(s): Gerard Finch
Mainstream construction practices result in the production of large quantities of toxic waste at all stages of a building’s life cycle. This can be attributed to the widespread adoption of irreversible fixing methods that prioritise rapid assembly, bespoke design practices and the increased use of ‘low-value’ materials. Unprecedented levels of consumption and waste production are set to continue as demand for residential housing in New Zealand grows rapidly. In response to these concerns, this thesis aims to develop innovative construction methods that facilitate the development of a Circular Economy for the building industry. The resulting design proposal is a modular architectural construction system with integrated jointing capacity, redundant expansion potential and details that enable the effective separation of discrete building layers. This proposed assembly specification calls for the mass-standardisation of structural components to promote economically viable material retrieval and resale at the end of a building’s useful life. Computer-aided manufacturing technologies are used to facilitate the incorporation of sophisticated reusable assembly parameters into connection details on a large scale. Analysis of the proposed solution indicates that waste over an entire building’s life can be reduced by more than 94% through the deployment of alternative architectural assemblies. Additionally, optimised assemblies enable deconstruction times to be reduced by up to 30% versus conventional light timber framing.


2020
pp. 1-26
Author(s): Qinwen Hu, Muhammad Rizwan Asghar, Nevil Brownlee

HTTPS refers to an application-specific implementation that runs the HyperText Transfer Protocol (HTTP) on top of Secure Sockets Layer (SSL) or Transport Layer Security (TLS). HTTPS is used to provide encrypted communication and secure identification of web servers and clients for purposes such as online banking and e-commerce. However, many HTTPS vulnerabilities have been disclosed in recent years. Although many studies have pointed out that these vulnerabilities can lead to serious consequences, domain administrators seem to ignore them. In this study, we evaluate the HTTPS security level of Alexa’s top 1 million domains from two perspectives. First, we explore which popular sites are still affected by well-known security issues. Our results show that less than 0.1% of HTTPS-enabled servers in the measured domains are still vulnerable to known attacks, including Rivest Cipher 4 (RC4), Compression Ratio Info-Leak Mass Exploitation (CRIME), Padding Oracle On Downgraded Legacy Encryption (POODLE), Factoring RSA Export Keys (FREAK), Logjam, and Decrypting Rivest–Shamir–Adleman (RSA) using Obsolete and Weakened eNcryption (DROWN). Second, we assess the security level of the digital certificates used by each measured HTTPS domain. Our results highlight that fewer than 0.52% of domains use an expired certificate, 0.42% of HTTPS certificates contain a hostname that does not match the domain, and 2.59% of HTTPS domains use a self-signed certificate. The domains we investigate cover five regions (ARIN, RIPE NCC, APNIC, LACNIC, and AFRINIC) and 61 categories, such as online shopping, banking, educational, and government websites. Although our results show that the problem still exists, we find that changes have taken place as HTTPS vulnerabilities were disclosed. Through this three-year study, we found that more attention has been paid to the use and configuration of HTTPS. For example, more and more domains have enabled the HTTPS protocol to ensure a secure communication channel between users and websites. In the first measurement, we observed that many domains were still using the TLS 1.0, TLS 1.1, SSL 2.0, and SSL 3.0 protocols to support clients running outdated systems. As previous studies had revealed the security risks of these protocols, in subsequent measurements we found that the majority of domains updated their TLS configuration in time. Our 2020 results suggest that most HTTPS domains use the TLS 1.2 protocol, but show that some HTTPS domains are still vulnerable to known attacks. As academics and industry professionals continue to disclose attacks against HTTPS and recommend secure configurations, we find that the number of vulnerable domains is gradually decreasing every year.
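As an illustration of the kind of per-domain check such a survey relies on, the sketch below uses Python's standard ssl and socket modules to record the negotiated TLS protocol version and whether a host's certificate verifies. It is a minimal sketch under stated assumptions, not the authors' measurement pipeline, and the hostname example.com is purely illustrative.

```python
# Minimal sketch (assumption: a survey-style spot check, not the authors' pipeline).
# Reports the negotiated TLS version and basic certificate health for one host.
import socket
import ssl
import time

def inspect_https(hostname: str, port: int = 443, timeout: float = 5.0) -> dict:
    """Connect over TLS and report protocol version and certificate status."""
    context = ssl.create_default_context()
    try:
        with socket.create_connection((hostname, port), timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                cert = tls.getpeercert()
                expires = ssl.cert_time_to_seconds(cert["notAfter"])
                return {
                    "tls_version": tls.version(),  # e.g. 'TLSv1.2' or 'TLSv1.3'
                    "cert_valid": True,
                    "days_to_expiry": int((expires - time.time()) // 86400),
                }
    except ssl.SSLCertVerificationError as err:
        # Expired, self-signed, or hostname-mismatched certificates fail
        # verification and end up here; a full survey would classify them
        # by inspecting the verification error rather than just recording it.
        return {"cert_valid": False, "verify_error": err.verify_message}

if __name__ == "__main__":
    print(inspect_https("example.com"))  # hostname is illustrative only
```

Scaling such a check to a million domains is then largely a matter of running it concurrently and aggregating per-protocol and per-certificate statistics.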


2021
pp. 089443932110068
Author(s): Aleksandra Urman, Mykola Makhortykh, Roberto Ulloa

We examine how six search engines filter and rank information in relation to queries on the U.S. 2020 presidential primary elections under default, that is, nonpersonalized, conditions. To do so, we use an algorithmic auditing methodology that relies on virtual agents to conduct large-scale analysis of algorithmic information curation in a controlled environment. Specifically, we look at the text search results for the queries “us elections,” “donald trump,” “joe biden,” and “bernie sanders” on Google, Baidu, Bing, DuckDuckGo, Yahoo, and Yandex during the 2020 primaries. Our findings indicate substantial differences in the search results between search engines and multiple discrepancies within the results generated for different agents using the same search engine. This highlights that whether users see certain information is partly decided by chance, owing to the inherent randomization of search results. We also find that some search engines prioritize different categories of information sources with respect to specific candidates. These observations demonstrate that algorithmic curation of political information can create information inequalities between search engine users even under nonpersonalized conditions. Such inequalities are particularly troubling given that search results are highly trusted by the public and, as previous research has demonstrated, can shift the opinions of undecided voters.
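To make the between-agent comparison concrete, the fragment below shows one simple way such discrepancies could be quantified: the Jaccard overlap between the top-ranked URLs returned to two agents for the same query. This is a hedged illustration; the metric and the example URLs are assumptions, not the authors' exact analysis.

```python
# Illustrative sketch: quantify how much two agents' result lists for the
# same query overlap (metric and example URLs are assumptions, not the
# study's exact analysis pipeline).
def jaccard_overlap(results_a: list[str], results_b: list[str], top_n: int = 10) -> float:
    """Share of URLs common to both agents' top-N results (1.0 = identical sets)."""
    a, b = set(results_a[:top_n]), set(results_b[:top_n])
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical result lists collected by two virtual agents for one query.
agent_1 = ["nytimes.com/a", "cnn.com/b", "wikipedia.org/c"]
agent_2 = ["cnn.com/b", "foxnews.com/d", "wikipedia.org/c"]
print(f"overlap: {jaccard_overlap(agent_1, agent_2):.2f}")  # prints 'overlap: 0.50'
```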


2021
Author(s): Mehdi A. Beniddir, Kyo Bin Kang, Grégory Genta-Jouve, Florian Huber, Simon Rogers, ...

This review highlights the key computational tools and emerging strategies for metabolite annotation, and discusses how these advances will enable integrated large-scale analysis to accelerate natural product discovery.

