A Map Is a Living Structure with the Recurring Notion of Far More Smalls than Larges

2020 ◽  
Vol 9 (6) ◽  
pp. 388
Author(s):  
Bin Jiang ◽  
Terry Slocum

The Earth’s surface or any territory is a coherent whole or subwhole, in which the notion of “far more small things than large ones” recurs across levels of scale, from the smallest of a couple of meters to the largest of the Earth’s surface or of the territory. The coherent whole has an underlying character called wholeness or living structure, a physical phenomenon that exists pervasively in our environment and can be defined mathematically under the new third view of space conceived and advocated by Christopher Alexander: space is neither lifeless nor neutral, but a living structure capable of being more alive or less alive. This paper argues that both the map and the territory are living structures, and that it is the inherent hierarchy of “far more smalls than larges” that constitutes the foundation of maps and mapping. It is the underlying living structure of geographic space or geographic features that makes maps and mapping possible: larges are retained while smalls are omitted, in a recursive manner (note: larges and smalls should be understood broadly, in terms not only of size, but also of topological connectivity and semantic meaning). Thus map making is largely an objective undertaking governed by the underlying living structure, and maps portray the truth of that living structure. Based on the notion of living structure, a map can be considered an iterated system, which means that the map is the map of the map of the map, and so on endlessly. The word endlessly refers to the continuum of map scales between two discrete ones, just as there are endless real numbers between 1 and 2. The iterated map system implies that each subsequent small-scale map is a subset of the single large-scale map; not a simple subset, but one subject to various constraints that keep all geographic features topologically correct.
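The recursive retention of larges over smalls echoes Jiang's head/tail breaks classification. A minimal Python sketch of that recursion, run on a hypothetical heavy-tailed dataset, shows how a few larges emerge from far more smalls (the 40% head-size cutoff is the commonly used stopping threshold, assumed here; the data are illustrative, not the paper's):

```python
def head_tail_breaks(values, max_iter=10):
    """Recursively split values around the mean, keeping the 'head'
    (values above the mean) as long as it remains a clear minority."""
    breaks = []
    head = sorted(values, reverse=True)
    for _ in range(max_iter):
        mean = sum(head) / len(head)
        breaks.append(mean)
        new_head = [v for v in head if v > mean]
        # Stop when the head is no longer "far fewer" than the tail.
        if not new_head or len(new_head) / len(head) > 0.4:
            break
        head = new_head
    return breaks

# A heavy-tailed toy dataset: far more small values than large ones.
sizes = [1] * 60 + [2] * 25 + [5] * 10 + [20] * 4 + [100]
print(head_tail_breaks(sizes))
```

Each break isolates a smaller "head" of larges, mirroring how successive small-scale maps retain only the larges of the previous level.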

Author(s):  
A. Brychtová ◽  
A. Çöltekin ◽  
V. Pászto

In this study, we first develop the hypothesis that existing quantitative visual complexity measures will, overall, reflect the level of cartographic generalization, and we test this hypothesis. Specifically, we first selected common geovisualization types (i.e., cartographic maps, hybrid maps, satellite images and shaded relief maps) and retrieved examples as provided by Google Maps, OpenStreetMap and SchweizMobil by swisstopo. The selected geovisualizations vary in cartographic design choices, scene contents and levels of generalization. We then applied one of Rosenholtz et al.’s (2007) visual clutter algorithms to obtain quantitative visual complexity scores for screenshots of the selected maps. We hypothesized that visual complexity should be constant across generalization levels; however, the algorithm suggested that the complexity of small-scale (less detailed) displays is higher than that of large-scale (more detailed) ones. We also observed vast differences in visual complexity among map providers, which we attribute to their varying approaches to cartographic design and the generalization process. Our efforts will contribute towards recommendations for how visual complexity algorithms could be optimized for cartographic products, and eventually be utilized as part of the cartographic design process to assess visual complexity.
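Rosenholtz et al.'s feature-congestion measure is not reproduced here, but the idea of scoring a raster's visual complexity can be illustrated with a much cruder proxy, mean local luminance variance, on toy 8×8 "maps" (pure Python; both the proxy and the toy data are illustrative assumptions, not the authors' method):

```python
def clutter_proxy(img, win=1):
    """Crude visual-complexity proxy: average local variance of luminance
    over (2*win+1)-sized neighbourhoods. A toy stand-in, not the
    feature-congestion measure of Rosenholtz et al. (2007)."""
    h, w = len(img), len(img[0])
    total = 0.0
    for y in range(h):
        for x in range(w):
            patch = [img[j][i]
                     for j in range(max(0, y - win), min(h, y + win + 1))
                     for i in range(max(0, x - win), min(w, x + win + 1))]
            m = sum(patch) / len(patch)
            total += sum((p - m) ** 2 for p in patch) / len(patch)
    return total / (h * w)

flat = [[0.5] * 8 for _ in range(8)]                         # uniform map area
busy = [[(x + y) % 2 for x in range(8)] for y in range(8)]   # checkerboard
print(clutter_proxy(flat), clutter_proxy(busy))
```

Even this crude score separates a uniform region from a busy one; the study's point is that a principled measure should likewise track the amount of retained detail across generalization levels.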


Urban Science ◽  
2019 ◽  
Vol 3 (3) ◽  
pp. 96 ◽  
Author(s):  
Bin Jiang

Discovered by Christopher Alexander, living structure is a physical phenomenon through which the quality of the built environment or of artifacts can be judged objectively. It has two distinguishing properties, just like a tree: “far more small things than large ones” across all scales from the smallest to the largest, and “more or less similar things” on each scale. As a physical phenomenon and mathematical concept, living structure is essentially empirical, discovered and developed from minute observation of natural and human-made things, and it affects our daily lives in practical ways, such as where to put a table or a flower vase in a room, helping us to make beautiful things and environments. Living structure is not only empirical, but also philosophical and visionary, enabling us to see the world and space in more meaningful ways. This paper is intended to defend living structure as a physical phenomenon and a mathematical concept, clarifying some common questions and misgivings surrounding Alexander’s design thought, such as the objective or structural nature of beauty, the building styles advocated by Alexander, and the mysterious nature of his concepts. For this purpose, we first illustrate living structure—essentially the organized complexity advocated by the late Jane Jacobs (1916–2006)—which is governed by two fundamental laws (scaling law and Tobler’s law) and generated in a step-by-step fashion by two design principles (differentiation and adaptation) through the 15 structural properties. We then show why living structure is primarily empirical, drawing evidence from Alexander’s own work as well as from our case studies applied to the Earth’s surface, including cities, streets, and buildings, and two logos. Before reaching conclusions, we concentrate on the most mysterious part of Alexander’s work—the luminous ground or the hypothesized “I”—as a substance that pervasively exists everywhere, in space and matter including our bodies, in order to make better sense of living structure in our minds.


1992 ◽  
Vol 238 ◽  
pp. 325-336 ◽  
Author(s):  
M. Germano

Explicit or implicit filtered representations of chaotic fields, like spectral cut-offs or numerical discretizations, are commonly used in the study of turbulence and particularly in so-called large-eddy simulations. Peculiar to these representations is that they are produced by different filtering operators at different levels of resolution, and they can be hierarchically organized in terms of a characteristic parameter like a grid length or a spectral truncation mode. Unfortunately, for a general implicit or explicit filtering operator the Reynolds rules of the mean are no longer valid, and the classical analysis of turbulence in terms of mean values and fluctuations is not so simple. In this paper a new operatorial approach to the study of turbulence, based on the general algebraic properties of the filtered representations of a turbulent field at different levels, is presented. The main results of this analysis are the averaging invariance of the filtered Navier–Stokes equations in terms of the generalized central moments, and an algebraic identity that relates the turbulent stresses at different levels. The statistical approach uses the idea of a decomposition into mean values and fluctuations, and the original turbulent field is seen as the sum of different contributions. This operatorial approach, by contrast, is based on the comparison of different representations of the turbulent field at different levels, and, in the opinion of the author, it is particularly suited to studying the similarity between the turbulence at different filtering levels. The best field of application of this approach is the numerical large-eddy simulation of turbulent flows, where the large scale of the turbulent field is captured and the residual small scale is modelled. It is natural to define and extract from the resolved field the resolved turbulence, and to use the information it contains to adapt the subgrid model to the real turbulent field. Following these ideas, this approach has been applied to the large-eddy simulation of turbulent flow (Germano et al. 1991), yielding a dynamic subgrid-scale eddy viscosity model that samples the resolved scale and uses this information to adjust the Smagorinsky constant locally to the local turbulence.
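The algebraic identity relating the turbulent stresses at different filtering levels, now known as the Germano identity, can be stated compactly in its standard textbook form (overbar: grid filter; tilde: test filter; this is the form familiar from the LES literature, not transcribed from the paper):

```latex
% Subgrid stress at the grid level and at the combined test level:
\tau_{ij} = \overline{u_i u_j} - \bar{u}_i\,\bar{u}_j ,
\qquad
T_{ij} = \widetilde{\overline{u_i u_j}} - \tilde{\bar{u}}_i\,\tilde{\bar{u}}_j .

% Germano identity: the difference involves only the resolved field.
\mathcal{L}_{ij} \equiv T_{ij} - \tilde{\tau}_{ij}
  = \widetilde{\bar{u}_i \bar{u}_j} - \tilde{\bar{u}}_i\,\tilde{\bar{u}}_j .
```

Because \(\mathcal{L}_{ij}\) depends only on the resolved velocities \(\bar{u}_i\), it can be evaluated during a simulation; equating it with the modelled stresses at the two levels is what allows the dynamic procedure to calibrate the Smagorinsky constant locally.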


2021 ◽  
Vol 51 (4) ◽  
pp. 1283-1300
Author(s):  
Qunshu Tang ◽  
Zhiyou Jing ◽  
Jianmin Lin ◽  
Jie Sun

The Mariana Ridge is one of the prominent mixing hotspots of the open ocean. The high-resolution underway marine seismic reflection technique provides spatiotemporally continuous mapping of ocean turbulent mixing. Using this novel technique, this study quantifies the diapycnal diffusivity of subthermocline (300–1200-m depth) turbulence around the Mariana Ridge. The autotracked wave fields on seismic images allow us to derive the dissipation rate ε and diapycnal diffusivity Kρ based on the Batchelor model, which relates the horizontal slope spectra with +1/3 slope to the inertial-convective turbulence regime. Diffusivity is locally intensified around the seamounts, exceeding 10⁻³ m² s⁻¹, and gradually decreases to 10⁻⁵–10⁻⁴ m² s⁻¹ within a ~60-km range, a distance that may be associated with the emanating paths of internal tide beams. The overall pattern suggests that a large portion of the energy dissipates locally and a significant portion dissipates in the far field. Empirical diffusivity models Kρ(x) and Kρ(z), varying with the distance from the seamounts and the height above the seafloor, respectively, are constructed for potential use in ocean model parameterization. Geographic distributions of both the vertically averaged dissipation rate and diffusivity show tight relationships with the topography. Additionally, a strong agreement between the dissipation results of seismic observation and numerical simulation is found for the first time. Such an agreement confirms the suitability of the seismic method for turbulence quantification and suggests an energy cascade from large-scale tides to small-scale turbulence via possible mechanisms of local direct tidal dissipation, near-local wave–wave interactions, and far-field radiating and breaking.
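The step from a dissipation rate ε to a diapycnal diffusivity Kρ is conventionally made with the Osborn relation, Kρ = Γε/N². A minimal sketch, assuming the standard mixing efficiency Γ ≈ 0.2 and typical open-ocean values (the numbers are illustrative, not the paper's data):

```python
def osborn_diffusivity(epsilon, n_squared, gamma=0.2):
    """Diapycnal diffusivity from the dissipation rate via the Osborn
    relation K_rho = gamma * epsilon / N^2, with mixing efficiency
    gamma ~ 0.2 (the customary value; assumed here)."""
    return gamma * epsilon / n_squared

# Typical background ocean values: eps ~ 1e-9 W/kg, N^2 ~ 1e-5 s^-2
k = osborn_diffusivity(1e-9, 1e-5)
print(f"K_rho = {k:.1e} m^2/s")
```

With these inputs the relation returns a diffusivity of order 10⁻⁵ m² s⁻¹, consistent with the background values the abstract reports away from the seamounts.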


1996 ◽  
Vol 23 (1) ◽  
pp. 3-24 ◽  
Author(s):  
D M Mark ◽  
A U Frank

In this paper human experience and perception of phenomena and relations in space are studied. This focus is in contrast to previous work where space and spatial relations were examined as objective phenomena of the world. This study leads in turn to a goal: to identify models of space that can be used both in cognitive science and in the design and implementation of geographic information systems (GISs). Experiential models of the world are based on sensorimotor and visual experiences with environments, and form in individual minds, as the associated bodies and senses experience their worlds. Formal models consist of axioms expressed in a formal language, together with mathematical rules to infer conclusions from these axioms. In this paper we will review both types of models, considering each to be an abstraction of the same ‘real world’. The review of experiential models is based primarily on recent developments in cognitive science, expounded by Rosch, Johnson, Talmy, and especially Lakoff. In these models it is suggested that perception and cognition are driven by image-schemata and other mental models, often language-based. Cross-cultural variations are admitted and even emphasized. The ways in which people interact with small-scale (‘tabletop’) spaces filled with everyday objects are in sharp contrast to the ways in which they experience geographic (large-scale) spaces during wayfinding and other spatial activities. We then address the issue of the ‘objective’ geometry of geographic space. If objectivity is defined by measurement, this leads to a surveyor's view and a near-Euclidean geometry. These models are then related to issues in the design of GISs. To be implemented on digital computers, geometric concepts and models must be formalized. The idea of a formal geometry of natural language is discussed and some aspects of it are presented. 
Formalizing the links between cognitive categories and models on the one hand, and between geometry and computer representations on the other, is a key element of the research agenda.


Water ◽  
2021 ◽  
Vol 13 (6) ◽  
pp. 805
Author(s):  
Wenya Gu ◽  
Xiaochen Zhu ◽  
Xiangrui Meng ◽  
Xinfa Qiu

Terrain plays an important role in the formation, development and distribution of local precipitation and is a major factor leading to locally abnormal weather. Although small-scale topography has little influence on the spatial distribution of precipitation, it interferes with precipitation fitting. Because small-, medium- and large-scale terrain combine arbitrarily into complex terrain distributions, small-scale terrain cannot be clearly defined and removed. Based on the idea of bidimensional empirical mode decomposition (BEMD), this paper extracts small-scale terrain data layer by layer to smooth the terrain and constructs macroterrain models at different scales for Central China. Based on a multiple-regression precipitation distribution model, precipitation models (B0, B1, B2 and B3) at different scales are constructed. The 18-year monthly average precipitation data of each station are compared with the precipitation simulated under the different terrain scales and with TRMM precipitation data, and the influence of different levels of small-scale terrain on the precipitation distribution is analysed. The results show that (1) in Central China, the accuracy of model B2 is much higher than that of TRMM model A and of monthly precipitation model B0; the comprehensive evaluation indexes improve by 3.31% and 1.92%, respectively. (2) Different levels of small-scale terrain influence the precipitation distribution differently: the first- and second-order small-scale terrain interferes with precipitation fitting, while the third-order small-scale terrain enhances precipitation. Overall, however, the effect of small-scale topography on the precipitation distribution is mainly interference.
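BEMD itself operates on two-dimensional surfaces; as a hedged one-dimensional stand-in, the layer-by-layer extraction of small-scale terrain can be sketched with moving averages at widening windows (the window sizes and the synthetic elevation profile are assumptions for illustration, not the paper's method or data):

```python
import math

def moving_average(profile, win):
    """Smooth an elevation profile with a centred moving average."""
    n = len(profile)
    out = []
    for i in range(n):
        lo, hi = max(0, i - win), min(n, i + win + 1)
        out.append(sum(profile[lo:hi]) / (hi - lo))
    return out

def decompose_terrain(profile, windows=(1, 4, 16)):
    """Peel off progressively larger-scale components, in the spirit of
    BEMD sifting: each residual layer holds detail finer than the
    current window; what remains is the smoothed macroterrain."""
    layers, current = [], profile
    for win in windows:
        smooth = moving_average(current, win)
        layers.append([c - s for c, s in zip(current, smooth)])
        current = smooth
    return layers, current

# Synthetic profile: a large-scale swell plus small-scale roughness.
dem = [100 + 20 * math.sin(i / 10) + 3 * math.sin(i) for i in range(100)]
layers, macro = decompose_terrain(dem)
```

After decomposition, the macroterrain is visibly smoother than the raw profile, while the stripped layers hold the small-scale detail that, per the study, mostly interferes with precipitation fitting.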


2000 ◽  
Vol 45 (4) ◽  
pp. 396-398
Author(s):  
Roger Smith

2020 ◽  
Vol 1 (1) ◽  
pp. 1-10
Author(s):  
Evi Rahmawati ◽  
Irnin Agustina Dwi Astuti ◽  
N Nurhayati

Integrated science (IPA) provides a setting for students to study themselves and the surrounding environment as applied in daily life, giving them direct experience through the use and development of scientific skills and attitudes. Its importance demands that learning be packaged well: preparing modules combined with a learning strategy can maximize the learning process in school. At SMP 209 Jakarta, the integrated science scores of 34 students showed 10 students passing and 24 not passing, as the latter scored below the KKM (minimum mastery criterion) of 68. This research is a development study using the ADDIE model (Analysis, Design, Development, Implementation, and Evaluation). The KPS-based (Science Process Skills) integrated science module on the theme of the rainbow phenomenon obtained average validation scores of 84.38% from media experts, 82.18% from material experts, and 75.37% from linguists; the average across all aspects, 80.55%, makes it fit for use and for testing with students. Teacher responses yielded a score of 88.69%, an excellent rating. Student responses averaged 85.19% (strongly agree) in the small-scale trial and 86.44% (strongly agree) in the large-scale trial. It can thus be concluded that the module received a good response from teachers and students.


2019 ◽  
Vol 61 (1) ◽  
pp. 5-13 ◽  
Author(s):  
Loretta Lees

Gentrification is no longer, if it ever was, a small-scale process of urban transformation. Globally, gentrification is more often practised as large-scale urban redevelopment. It is state-led or state-induced. The results are clear: the displacement and disenfranchisement of low-income groups in favour of wealthier in-movers. So why has gentrification come to dominate policy making worldwide, and what can be done about it?

