Open Source Timber Systems as a Democratic Tool for Construction

Author(s):  
Ulrich Dangel

Architecture, as it exists today, is deeply rooted in perceptions established during the Renaissance, which credited the architect as the sole author of creative thinking processes and the resultant design ideas. Since then, the architectural profession has sought to develop new and innovative ways of building, often without being bound by traditions, the environment, or other constraints and limitations. This approach has frequently failed to address the needs and concerns of many, and as a result, architects have not succeeded in effecting significant social change that is valuable to large portions of the population. In contrast, many other industries have adopted shared design and production practices for the benefit of the masses, warranting further exploration into how architectural practice might evolve its current modes of operation. Wood as a building material has many beneficial characteristics, specifically its widespread availability, versatility, and ease of workability, which make it particularly suitable for investigating shared authorship and collective production methodologies. As an alternative to steel and concrete for mid-rise and high-rise buildings, mass timber construction in particular has experienced significant advancements in recent years, resulting in the development of entirely new building processes that rely on innovative engineered wood products, digital manufacturing, and prefabrication techniques. However, this has frequently led to expensive one-off proprietary solutions that are limited in their application. To foster innovation and disseminate knowledge, an open source culture of designing and sharing is necessary. To this end, this paper presents approaches for open source mass timber construction systems that can be applied to a wide range of scenarios and settings, with the aim of ultimately increasing the acceptance and market share of wood construction for the benefit of society at large.

2021
Author(s):
Susi Lehtola,
Antti Karttunen

Abstract Long in the making, computational chemistry for the masses [J. Chem. Educ. 1996, 73, 104] is finally here. Our brief review of various free and open source software (FOSS) quantum chemistry packages points out the existence of software offering a wide range of functionality, all the way from approximate semiempirical calculations with tight-binding density functional theory to sophisticated ab initio wave function methods such as coupled-cluster theory, for both molecular and solid-state systems. Combined with the remarkable increase in the computing power of personal devices, which now rivals that of the fastest supercomputers of the 1990s, we demonstrate that a decentralized model for teaching computational chemistry is now possible thanks to FOSS computational chemistry packages, enabling students to perform reasonable modeling on their own computing devices in the bring your own device (BYOD) scheme. FOSS can be made trivially simple to install and keep up to date, eliminating the need for departmental support, and also enables comprehensive teaching strategies, as the actual implementations of various algorithms can be used in teaching. We exemplify what kinds of calculations are feasible with four FOSS electronic structure programs, assuming only extremely modest computational resources, to illustrate how FOSS packages enable decentralized approaches to computational chemistry education within the BYOD scheme. FOSS has further benefits as well: open access to the source code of FOSS packages democratizes the science of computational chemistry, and FOSS packages can be used without limitation beyond education, for example in academic and industrial applications. For these reasons, we believe FOSS will become ever more pervasive in computational chemistry.
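The claim that actual algorithm implementations can double as teaching material is easy to illustrate outside any particular package: a Hückel (tight-binding) π-electron calculation, one of the simplest semiempirical models, fits in a few lines of Python with NumPy. This is a pedagogical sketch for illustration only, not code from the four reviewed programs.

```python
import numpy as np

# Hückel model for butadiene: 4 conjugated p orbitals in a chain.
# In units where the Coulomb integral alpha = 0 and the resonance
# integral beta = -1, the Hamiltonian couples only adjacent atoms.
n = 4
H = np.zeros((n, n))
for i in range(n - 1):
    H[i, i + 1] = H[i + 1, i] = -1.0  # beta

energies = np.linalg.eigvalsh(H)  # orbital energies, ascending order
n_electrons = 4                   # one pi electron per carbon
# Doubly occupy the lowest orbitals to get the total pi energy.
total_pi = 2.0 * energies[: n_electrons // 2].sum()

print(energies)   # approx [-1.618, -0.618, 0.618, 1.618]
print(total_pi)   # approx -4.472 (in units of |beta|)
```

A calculation at this scale runs instantly on any student laptop, which is precisely the BYOD point the abstract makes.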


Author(s):
Todd Beyreuther,
Darren Griechen

Mass timber is an emergent building assembly technology that advances themes of prefabrication, modularization, parametric design, and renewable materials in architectural practice and education. Mass timber is a collective term for several engineered heavy panel wood products, including cross-laminated timber (CLT), nail-laminated timber (NLT), glued laminated timber (GLT), laminated veneer lumber (LVL), laminated strand lumber (LSL), and parallel strand lumber (PSL).


Forests
2021
Vol 12 (1)
pp. 91
Author(s):
R. Dan Seale,
Rubin Shmulsky,
Frederico Jose Nistal Franca

This review primarily describes nondestructive evaluation (NDE) work at Mississippi State University during the 2005–2020 time interval. Overall, NDE is becoming increasingly important as a means of optimizing the value (economic, engineering, utilitarian, etc.) of every tree that comes from the forest. For the most part, the work focuses on southern pine structural lumber, but other species such as red pine, spruce, Douglas fir, red oak, and white oak, and other products such as engineered composites, mass timber, and non-structural lumber, are included where appropriate. Much of the work has been completed in conjunction with the U.S. Department of Agriculture, Forest Service, Forest Products Laboratory as well as the Agricultural Research Service, with the overall intent of improving lumber and wood products standards and valuation. To increase the future impact and adoption of this NDE-related work, graduate students have contributed to the research wherever possible. As such, a stream of trained professionals is a secondary output of this work, though it is not specifically detailed herein.
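The abstract does not detail individual methods, but a staple NDE calculation in lumber grading is the dynamic modulus of elasticity estimated from density and acoustic (stress-wave) velocity, MOE = ρv². The sketch below uses this textbook relationship with illustrative values, not measurements from the review.

```python
# Dynamic MOE from longitudinal stress-wave velocity: MOE = rho * v^2.
# Input values below are illustrative, not data from the review.
def dynamic_moe(density_kg_m3: float, velocity_m_s: float) -> float:
    """Return dynamic modulus of elasticity in pascals."""
    return density_kg_m3 * velocity_m_s ** 2

# Plausible southern pine figures: ~550 kg/m^3, wave speed ~5000 m/s.
moe_pa = dynamic_moe(550.0, 5000.0)
print(moe_pa / 1e9)  # 13.75 GPa, within the usual structural-lumber range
```

Because both density and velocity can be measured without damaging the piece, this single relationship underpins much of acoustic lumber sorting.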


2021
Author(s):
Farhan Ali

Thinking creatively is a necessary condition of the design process, enabling designers to transform ideas into novel solutions and break barriers to creativity. Although there are many techniques for stimulating designers' creative thinking, this paper adopts SCAMPER (an acronym for Substitute, Combine, Adapt, Modify or Magnify, Put to another use, Eliminate, and Reverse or Rearrange) to integrate sustainability concepts into the architectural design process. Many creative artifacts have been designed by consciously or unconsciously adopting SCAMPER strategies, such as rehabilitation and reuse projects that improve the functional performance or aesthetic quality of an existing building. SCAMPER is recognized as a divergent thinking tool used during the initial ideation stage; it aims to depart from the usual way of thinking in order to generate a wide range of new ideas that lead to new insights, original ideas, and creative solutions to problems. The research focuses on applying this method in architectural design, where it has rarely been studied, by reviewing seven examples that were designed consciously or unconsciously using SCAMPER techniques. The paper aims to establish a starting point for further research to deepen the method and examine its potential for solving architectural design problems.


2021
Author(s):
Jason Hunter,
Mark Thyer,
Dmitri Kavetski,
David McInerney

Probabilistic predictions provide crucial information regarding the uncertainty of hydrological predictions, which is a key input for risk-based decision-making. However, they are often excluded from hydrological modelling applications because suitable probabilistic error models can be challenging both to construct and to interpret, and the quality of results often depends on the objective function used to calibrate the hydrological model.

We present an open-source R package and an online web application that achieve two aims. First, these resources are easy to use and accessible, so users need no specialised knowledge in probabilistic modelling to apply them. Second, the probabilistic error model we describe provides high-quality probabilistic predictions for a wide range of commonly used hydrological objective functions, made possible by a new innovation that resolves a long-standing issue with model assumptions which had previously prevented such broad application.

We demonstrate our methods by comparing the new probabilistic error model with an existing reference error model in an empirical case study that uses 54 perennial Australian catchments, the hydrological model GR4J, 8 common objective functions, and 4 performance metrics (reliability, precision, volumetric bias, and errors in the flow duration curve). The existing reference error model introduces additional flow dependencies into the residual error structure when used with most of the study objective functions, which in turn leads to poor-quality probabilistic predictions. In contrast, the new probabilistic error model achieves high-quality probabilistic predictions for all objective functions used in this case study.

The new probabilistic error model, open-source software, and web application aim to facilitate the adoption of probabilistic predictions in the hydrological modelling community, and to improve the quality of predictions and of the decisions made using them. In particular, our methods can be used to achieve high-quality probabilistic predictions from hydrological models calibrated with a wide range of common objective functions.
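The abstract does not name the authors' error model or R package, but the general idea of a residual-error post-processor can be sketched generically: transform the flows, add Gaussian noise in transformed space, and back-transform to get an ensemble whose spread grows with flow magnitude. The Box-Cox transform and all parameter values below are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(42)

def boxcox(q, lam=0.2):
    """Box-Cox transform; lam = 0.2 chosen for illustration only."""
    return (q ** lam - 1.0) / lam

def inv_boxcox(z, lam=0.2):
    # Clip at zero so back-transformed flows stay non-negative.
    return np.maximum(lam * z + 1.0, 0.0) ** (1.0 / lam)

# Deterministic streamflow predictions from some hydrological model
# (synthetic values here, in m^3/s).
q_pred = np.array([1.2, 3.4, 10.0, 6.5, 2.1])

# Residual error model: Gaussian noise in Box-Cox space, so the
# predictive uncertainty is heteroscedastic after back-transformation.
sigma = 0.3
n_reps = 1000
z = boxcox(q_pred) + rng.normal(0.0, sigma, size=(n_reps, q_pred.size))
q_prob = inv_boxcox(z)  # ensemble of probabilistic predictions

# 90% predictive interval per time step
lo, hi = np.percentile(q_prob, [5, 95], axis=0)
print(lo.round(2), hi.round(2))
```

Reliability and precision metrics of the kind used in the case study are then computed by comparing such an ensemble against observed flows.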


2018
Author(s):
Fabien Maussion,
Anton Butenko,
Julia Eis,
Kévin Fourteau,
Alexander H. Jarosch,
et al.

Abstract. Despite their importance for sea-level rise, seasonal water availability, and as a source of geohazards, mountain glaciers are one of the few remaining sub-systems of the global climate system for which no globally applicable, open source, community-driven model exists. Here we present the Open Global Glacier Model (OGGM, http://www.oggm.org), developed to provide a modular and open source numerical model framework for simulating past and future change of any glacier in the world. The modelling chain comprises data downloading tools (glacier outlines, topography, climate, validation data), a preprocessing module, a mass-balance model, a distributed ice thickness estimation model, and an ice flow model. The monthly mass balance is obtained from gridded climate data and a temperature index melt model. To our knowledge, OGGM is the first global model to explicitly simulate glacier dynamics: it relies on the shallow ice approximation to compute the depth-integrated flux of ice along multiple connected flowlines. In this paper, we describe and illustrate each processing step by applying the model to a selection of glaciers before running global simulations under idealized climate forcings. Even without in-depth calibration, the model shows very realistic behaviour. We are able to reproduce earlier estimates of global glacier volume by varying the ice dynamical parameters within a range of plausible values. At the same time, the increased complexity of OGGM compared to other prevalent global glacier models comes at a reasonable computational cost: several dozen glaciers can be simulated on a personal computer, while global simulations run in a supercomputing environment take up to a few hours per century. Thanks to the modular framework, modules of various complexity can be added to the codebase, allowing new kinds of model intercomparison to be run in a controlled environment. Future developments will add new physical processes to the model as well as tools to calibrate it more comprehensively. OGGM spans a wide range of applications, from ice-climate interaction studies at millennial time scales to estimates of the contribution of glaciers to past and future sea-level change. It has the potential to become a self-sustained, community-driven model for global and regional glacier evolution.
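The core of a temperature-index mass-balance scheme of the kind described above can be sketched in a few lines: accumulation from precipitation falling as snow, ablation proportional to positive temperatures. All parameter values and climate inputs below are illustrative assumptions, not OGGM's calibrated defaults.

```python
import numpy as np

def monthly_mass_balance(temp_c, precip_mm, melt_factor=6.5,
                         t_melt=0.0, t_solid=2.0):
    """Toy temperature-index mass balance in mm water equivalent per month.

    Accumulation: precipitation falling as snow (temperature below t_solid).
    Ablation: melt_factor (mm w.e. per degree-month, illustrative) times
    the positive part of temperature above the melt threshold t_melt.
    """
    temp_c = np.asarray(temp_c, dtype=float)
    precip_mm = np.asarray(precip_mm, dtype=float)
    accumulation = np.where(temp_c < t_solid, precip_mm, 0.0)
    melt = melt_factor * np.maximum(temp_c - t_melt, 0.0)
    return accumulation - melt

# One synthetic year of monthly climate at a point on a glacier:
temps = [-8, -6, -3, 0, 4, 8, 11, 10, 5, 1, -4, -7]     # deg C
precip = [120, 100, 90, 80, 60, 50, 40, 45, 70, 90, 110, 130]  # mm
mb = monthly_mass_balance(temps, precip)
print(mb.sum())  # annual balance in mm w.e. (positive = glacier gains mass)
```

In a full model such as OGGM, these point balances are evaluated over the glacier's elevation distribution and feed the ice flow component.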


Author(s):
Oleksiy Pastukhov

The purpose of the article is to substantiate the theoretical principles of using specific approaches in the process of training performers of modern dance. The research methodology is based on an interdisciplinary synthesis of scientific methods and approaches drawn from pedagogy, art history, and psychology. General scientific methods were also used: analysis, synthesis, and generalization. The scientific novelty of the article lies in the theoretical substantiation of specific approaches to the preparation of modern dance performers, in particular approaches that take into account the latest technologies and the psycho-emotional and mental characteristics of the performer. Conclusions. Alongside traditional methods of teaching modern dance, it is necessary to develop and implement innovative methods and approaches that meet the requirements of the latest technological development. In particular, these relate to the peculiarities of distance education and to the ability to use computer programs to simulate biomechanical models of movement and to hone kinematic technique, which largely determines the aesthetic and visual quality and complexity of modern dance compositions. It is also important to take into account the psycho-emotional characteristics of the modern dancer, based on the development of creative and innovative thinking and improvisation, as well as the socio-communicative component, which involves the ability to convey a wide range of potential emotional expressions and social signals from performer to spectator. Keywords: modern dance, innovative thinking, creative thinking, new approaches, teaching choreography, hand biomechanics, psycho-emotional state.


2019
Author(s):
Scott A. Longwell,
Polly M. Fordyce

Microfluidic devices are an empowering technology for many labs, enabling a wide range of applications spanning high-throughput encapsulation, molecular separations, and long-term cell culture. In many cases, however, their utility is limited by a ‘world-to-chip’ barrier that makes it difficult to serially interface samples with these devices. As a result, many researchers are forced to rely on low-throughput, manual approaches for managing device input and output (IO) of samples, reagents, and effluent. Here, we present a hardware-software platform for automated microfluidic IO (micrIO). The platform, which is uniquely compatible with positive-pressure microfluidics, comprises an ‘AutoSipper’ for input and a ‘Fraction Collector’ for output. To facilitate widespread adoption, both are open-source builds constructed from components that are readily purchased online or fabricated from the included design files. The software control library, written in Python, allows the platform to be integrated with existing experimental setups and to coordinate IO with other functions such as valve actuation and assay imaging. We demonstrate these capabilities by coupling both the AutoSipper and the Fraction Collector to a microfluidic device that produces beads with distinct spectral codes; analysis of the collected bead fractions establishes the ability of the platform to draw from and output to specific wells of multiwell plates with no detectable cross-contamination between samples.
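The abstract describes the control library only at a high level. The orchestration pattern it implies, drawing from a source well, processing on-chip, and routing effluent to a destination well, might look roughly like the following sketch, in which every class and method name is a hypothetical stand-in, not the actual micrIO API.

```python
# All names below are hypothetical stand-ins, not the actual micrIO API.
from dataclasses import dataclass, field

@dataclass
class Well:
    plate: str
    position: str  # e.g. "A1" on a multiwell plate

class AutoSipper:
    """Hypothetical input stage: draws a sample from a multiwell plate."""
    def draw(self, well: Well) -> str:
        return f"sample from {well.plate}:{well.position}"

@dataclass
class FractionCollector:
    """Hypothetical output stage: routes effluent to a destination well."""
    collected: list = field(default_factory=list)
    def collect(self, fraction: str, well: Well) -> None:
        self.collected.append((fraction, f"{well.plate}:{well.position}"))

def run_io_cycle(sipper: AutoSipper, collector: FractionCollector,
                 src: Well, dst: Well) -> None:
    sample = sipper.draw(src)           # world-to-chip input
    effluent = sample + " (processed)"  # on-chip processing placeholder
    collector.collect(effluent, dst)    # chip-to-world output

sipper, collector = AutoSipper(), FractionCollector()
run_io_cycle(sipper, collector, Well("P1", "A1"), Well("P2", "B3"))
print(collector.collected)
```

Keeping input and output as separate well-addressed stages is what lets a loop over `run_io_cycle` serially process entire plates without manual intervention, which is the barrier the platform targets.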


2020
Author(s):
Arcangela Bollino,
Anna Maria Marotta,
Federica Restelli,
Alessandro Regorda,
Roberto Sabadini

Subduction is responsible for surface displacements and deep mass redistribution. This rearrangement generates density anomalies across a wide spectrum of wavelengths, which in turn cause important anomalies in the Earth's gravity field, visible as lineaments parallel to the arc-trench systems. In these areas, when the traditional analysis of the deformation and stress fields is combined with analysis of the perturbation of the gravity field and its slow time variation, new information on the background environment controlling the tectonic loading phase can be disclosed.

Here we present the results of a comparative analysis between the geodetically retrieved gravitational anomalies, based on the EIGEN-6C4 model, and those predicted by 2D thermo-chemical mechanical modeling of the Sumatra and Mariana complexes.

The 2D model accounts for a wide range of parameters, such as the convergence velocity, the shallow dip angle, and the different degrees of coupling between the facing plates. The marker-in-cell technique is used to compositionally differentiate the system. Phase changes in the crust and mantle, as well as mantle hydration, are also allowed. To be compliant with the geodetic EIGEN-6C4 gravity data, we define a normal Earth model considering the vertical density distribution at the margins of the model domain, where the masses are not perturbed by the subduction process.

Model predictions are in good agreement with the data, in terms of both the wavelengths and the magnitude of the gravity anomalies measured in the surroundings of the Sumatra and Mariana subduction zones. Furthermore, our modeling supports the view that the differences in the style of the gravity anomaly observed in the two areas are attributable to the different environments, ocean-ocean versus ocean-continent subduction, which drive significantly different dynamics in the wedge area.
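The link between a buried density anomaly and its surface gravity signal can be illustrated with a textbook 2D building block: the vertical gravity anomaly of an infinite horizontal cylinder, g_z(x) = 2Gλz/(x² + z²) with line density λ = Δρ·πR². This is a classical potential-field formula with illustrative parameter values, not the authors' thermo-chemical subduction model.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def cylinder_anomaly(x_m, depth_m, radius_m, drho_kg_m3):
    """Vertical gravity anomaly of an infinite horizontal cylinder (2D body).

    g_z(x) = 2 G lambda z / (x^2 + z^2), with line density
    lambda = drho * pi * R^2. A textbook building block, not the
    authors' thermo-chemical model.
    """
    lam = drho_kg_m3 * np.pi * radius_m ** 2
    x = np.asarray(x_m, dtype=float)
    return 2.0 * G * lam * depth_m / (x ** 2 + depth_m ** 2)

# Slab-like density excess: +80 kg/m^3, radius 10 km, centred 50 km deep.
x = np.linspace(-200e3, 200e3, 401)
gz = cylinder_anomaly(x, depth_m=50e3, radius_m=10e3, drho_kg_m3=80.0)
print(gz.max() * 1e5)  # peak anomaly in mGal (~6.7), directly above the body
```

Even this simplest case shows the key property exploited in the paper: the anomaly's width scales with source depth, so the wavelength content of the observed field constrains where the perturbing masses sit.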

