Fluid interaction for information visualization

2011 ◽  
Vol 10 (4) ◽  
pp. 327-340 ◽  
Author(s):  
Niklas Elmqvist ◽  
Andrew Vande Moere ◽  
Hans-Christian Jetter ◽  
Daniel Cernea ◽  
Harald Reiterer ◽  
...  

Despite typically receiving little emphasis in visualization research, interaction in visualization is the catalyst for the user's dialogue with the data, and, ultimately, the user's actual understanding and insight into these data. There are many possible reasons for this skewed balance between the visual and interactive aspects of a visualization. One reason is that interaction is an intangible concept that is difficult to design, quantify, and evaluate. Unlike for visual design, there are few examples that show visualization practitioners and researchers how to design the interaction for a new visualization in the best manner. In this article, we attempt to address this issue by collecting examples of visualizations with ‘best-in-class’ interaction and using them to extract practical design guidelines for future designers and researchers. We call this concept fluid interaction, and we propose an operational definition in terms of the direct manipulation and embodied interaction paradigms, the psychological concept of ‘flow’, and Norman's gulfs of execution and evaluation.

Info ◽  
2014 ◽  
Vol 16 (6) ◽  
pp. 8-23 ◽  
Author(s):  
Ellen Wauters ◽  
Verónica Donoso ◽  
Eva Lievens

Purpose – This article aims to reflect on ways to optimise current information provision and make it more transparent to users. In particular, it discusses the benefits (and challenges) of using more user-centred approaches to inform users in a more transparent way. Design/methodology/approach – The paper analyses individual as well as contextual factors (e.g. cognitive differences, time constraints, specific features of social networking site [SNS] platforms) which may affect the way users deal with Terms of Use, privacy policies and other types of information provision typically made available on SNS platforms. Possible ways of improving current practices in the field are then discussed, in particular the benefits (and challenges) of a user-centred approach to informing users in a way that is more meaningful to them. Finally, the paper discusses how user-centred approaches can act as mechanisms to increase transparency in SNS environments and how (alternative) forms of regulation could benefit from such an approach. Findings – The authors argue that it is necessary to focus on users’/consumers’ needs, expectations and values in order to develop visualisation tools that make the law (more) meaningful to users/consumers, giving them better insight into their rights and obligations and guiding them towards truly informed decisions about their online choices and behaviour. Originality/value – By examining techniques such as visual design and the timing of information, the article contributes to the discussion on how people can be made more aware of legal documents and encouraged to actually read them.


2020 ◽  
Vol 179 ◽  
pp. 02013
Author(s):  
Yi Zou ◽  
Na Qi

The visual design of an infographic compresses complex information and presents it to the audience in an intuitive, easy-to-understand form, so that viewers can absorb its content effectively. With the continuing development of science and of information visualization technology, the production methods and presentation forms of infographics have become increasingly rich, evolving from two-dimensional charts towards multi-dimensional and dynamic infographics. This paper approaches the topic from the perspective of user experience and proposes optimisation suggestions for the current state of infographic visual design.


1995 ◽  
Vol 117 (1) ◽  
pp. 47-52 ◽  
Author(s):  
V. R. Dhole ◽  
J. P. Zheng

Pinch technology has developed into a powerful tool for thermodynamic analysis of chemical processes and associated utilities, resulting in significant energy savings. Conventional pinch analysis identifies the most economical energy consumption in terms of heat loads and provides practical design guidelines to achieve it. However, in analyzing systems involving heat and power, for example, steam and gas turbines, pure heat load analysis is insufficient. Exergy analysis, on the other hand, provides a tool for heat and power analysis, although at times it does not yield clear practical design guidelines. An appropriate combination of pinch and exergy analysis can provide a practical methodology for the analysis of heat and power systems. The methodology has been successfully applied to refrigeration systems. This paper introduces the application of a combined pinch and exergy approach to commercial power plants with a demonstration example of a closed-cycle gas turbine (CCGT) system. An efficiency improvement of about 0.82 percentage points (from 50.2 to 51.02 percent) can be obtained by applying the new approach. More importantly, the approach can be used as an analysis and screening tool for various design improvements and is generally applicable to any commercial power generation facility.
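The key quantity that distinguishes an exergy-based view from a pure heat-load view is the work potential of a heat stream. The sketch below illustrates this with the standard Carnot-factor formula for the exergy of heat delivered at a given temperature; the function name and the numerical values are illustrative assumptions, not figures from the paper.

```python
# Illustrative sketch: the exergy (maximum work potential) of a heat
# load Q supplied at absolute temperature T, relative to ambient T0,
# is Q * (1 - T0 / T). A combined pinch/exergy analysis weights heat
# loads by this Carnot factor instead of treating all heat as equal.

def heat_exergy(q_kw: float, t_kelvin: float, t0_kelvin: float = 298.15) -> float:
    """Exergy content (kW) of heat q_kw delivered at t_kelvin."""
    return q_kw * (1.0 - t0_kelvin / t_kelvin)

# The same 1000 kW of heat carries far more work potential at 900 K
# than at 400 K -- a distinction pure heat-load pinch analysis misses.
high_grade = heat_exergy(1000.0, 900.0)  # about 668.7 kW
low_grade = heat_exergy(1000.0, 400.0)   # about 254.6 kW
```

This is why heat-and-power systems such as gas turbines need the combined approach: two heat loads that look identical on a composite curve can have very different values as sources of shaft work.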


1968 ◽  
Vol 26 (3_suppl) ◽  
pp. 1043-1046
Author(s):  
R. G. Merrill ◽  
D. R. Metcalf

Subjective evaluations of television systems are usually avoided because they involve individual differences that are considered unpredictable and uncontrollable. The psychological concept of visual cognitive styles can lead to measures of individual differences in visual perception and therefore give insight into the nature of the human information-handling system. Evaluation of cognitive styles can thus provide psychological controls in the interpretation of subjective evaluations of television systems; it may also provide guidance for solutions to the television bandwidth reduction problem.


2021 ◽  
Vol 3 ◽  
Author(s):  
Reyhan Pradantyo ◽  
Max V. Birk ◽  
Scott Bateman

The visual design of antagonists—typically thought of as “bad guys”—is crucial for game design. Antagonists are key to providing the backdrop to a game's setting and motivating a player's actions. The visual representation of antagonists is important because it affects player expectations about the character's personality and potential actions. Particularly important is how players perceive an antagonist's morality. For example, an antagonist appearing disloyal might foreshadow betrayal; a character who looks cruel suggests that tough fights are ahead; or, a player might be surprised when a friendly looking character attacks them. Today, the art of designing character morality is informed by archetypal elements, existing characters, and the artist's own background. However, little work has provided insight into how an antagonist's appearance can lead players to make moral judgments. Using Mechanical Turk, we collected participant ratings on a stimulus image set of 105 antagonists from popular video games. The results of our work provide insights into how the visual attributes of antagonists can influence judgments of character morality. Our findings provide a valuable new lens for understanding and deepening an important aspect of game design. Our results can be used to help ensure that a particular character design has the best chance to be universally seen as “evil,” or to help create more complex and conflicted emotional experiences through carefully designed characters that do not appear to be bad. Our research extends current research practices that seek to build an understanding of game design and provides exciting new directions for exploring how design and aesthetic practices can be better studied and supported.


Author(s):  
Terry Griffiths ◽  
Scott Draper ◽  
David White ◽  
Liang Cheng ◽  
Hongwei An ◽  
...  

The on-bottom stability design of subsea pipelines is important to ensure safety and reliability but is challenging to achieve, particularly in Australia due to onerous metocean and seabed conditions, and the prevalence of light gas pipelines. This challenge has been amplified by the fact that industry design guidelines have given no guidance on how to incorporate the potential benefits of seabed mobility, which can lead to lowering and self-burial of the pipeline on a sandy seabed. In this paper, we review the learnings of the STABLEpipe Joint Industry Project (JIP), which was initiated with the aim of developing new design guidelines to assess the on-bottom stability of pipelines on mobile seabeds. The paper summarises the new research undertaken within the STABLEpipe JIP to better predict sedimentation and scour, pipe-fluid interaction and pipe-soil interaction. New design methods to assess the on-bottom stability are also outlined, which have been developed based on the new research. These methods have been adopted in a DNVGL guideline authored by the JIP researchers in collaboration with DNVGL and presently available for use by the JIP participants. Finally, applications of the STABLEpipe JIP outcomes and focus areas for further work are discussed.


Entropy ◽  
2020 ◽  
Vol 22 (9) ◽  
pp. 958
Author(s):  
Stella Civelli ◽  
Marco Secondini

Probabilistic amplitude shaping—implemented through a distribution matcher (DM)—is an effective approach to enhance the performance and flexibility of bandwidth-efficient coded modulations. Different DM structures have been proposed in the literature; typically, both their performance and their complexity increase with the block length. In this work, we present a hierarchical DM (Hi-DM) approach based on the combination of several DMs of different possible types, which provides the good performance of long DMs with the low complexity of several short DMs. The DMs are organized in layers: each upper-layer DM encodes information on a sequence of lower-layer DMs, which are used as “virtual symbols”. First, we describe the Hi-DM structure, its properties, and the encoding and decoding procedures. Then, we present three particular Hi-DM configurations, providing some practical design guidelines and investigating their performance in terms of rate loss and energy loss. Finally, we compare the system performance obtained with the proposed Hi-DM structures and with their single-layer counterparts: a 0.19 dB SNR gain is obtained by a two-layer Hi-DM based on constant-composition DMs (CCDM) compared to a single-layer CCDM with the same complexity; a 0.12 dB gain and a significant complexity reduction are obtained by a Hi-DM based on minimum-energy lookup tables compared to a single-layer DM based on enumerative sphere shaping with the same memory requirements.
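The rate loss that the abstract uses as a performance metric has a simple closed form for a CCDM, and the sketch below computes it. This is a generic textbook-style illustration of why long blocks perform better (the trade-off the Hi-DM targets), not code from the paper; the compositions chosen are arbitrary examples.

```python
# Illustrative sketch: rate loss of a constant-composition distribution
# matcher (CCDM). A CCDM of block length n emits only sequences with a
# fixed symbol composition (n_i occurrences of symbol i); it can address
# k = floor(log2(M)) input bits, where M is the multinomial count of
# such sequences. Rate loss = H(P) - k/n, which shrinks as n grows.
import math

def ccdm_rate_loss(composition: list[int]) -> float:
    n = sum(composition)
    # number of distinct sequences with this composition: n! / prod(n_i!)
    m = math.factorial(n)
    for c in composition:
        m //= math.factorial(c)
    k = math.floor(math.log2(m))         # addressable information bits
    p = [c / n for c in composition]
    entropy = -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return entropy - k / n               # bits/symbol lost vs. ideal shaping

# Same target distribution (3/4, 1/4), increasing block length:
loss_short = ccdm_rate_loss([3, 1])      # n = 4
loss_long = ccdm_rate_loss([48, 16])     # n = 64, much smaller loss
```

A hierarchy of short DMs aims to recover most of the long-block rate advantage while keeping the per-layer encoding and decoding cheap.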

