Design Concept Structures in Massive Group Ideation

Author(s):  
Shi Ying Candice Lim ◽  
Bradley Adam Camburn ◽  
Diana Moreno ◽  
Zack Huang ◽  
Kristin Wood

Empirical work in design science has highlighted that the process of ideation can significantly affect design outcomes. Exploring the design space with both breadth and depth increases the likelihood of achieving better design outcomes. Furthermore, iteratively attempting to solve challenging design problems in large groups over a short time period may be more effective than protracted exploration by an isolated set of individuals. There remains a substantial opportunity to explore the structure of various design concept sets. In addition, many empirical studies cap analysis at sample sizes of fewer than one hundred individuals. This has provided substantial, though partial, models of the ideation space. This work explores new territory in large-scale ideation. Two conditions are evaluated. In the first condition, an ideation session was run with 2400 practicing designers and engineers from one organization. In the second condition, 1000 individuals ideated on the same problem in a completely distributed environment, without awareness of each other. We compare properties of the solution sets produced by each of these groups and activities. Analytical tools from network modeling theory are applied alongside traditional ideation metrics such as concept binning with saturation analysis. Structural network modeling is applied to evaluate the interconnectivity of design concepts. This is a strictly quantitative, and at the same time graphically expressive, means of evaluating the diversity of a design solution set. Observations indicate that the group condition approached saturation of distinct categories more rapidly than the individual, distributed condition. The total number of solution categories developed in the group condition was also higher. Additionally, individuals generally provided concepts across a greater number of solution categories in the group condition. The indication for design practice is that groups of just under forty individuals would provide category saturation within group ideation for a system-level design, while distributed individuals may provide additional concept differentiation. This evidence can support the development of more systematic ideation strategies. Furthermore, we provide an algorithmic approach for quantitative evaluation of variety in design solution sets using network analysis techniques. These methods can be applied to complex or wicked problems and to system development where the design space is vast.
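To make the network-based variety measure concrete, here is a minimal sketch, assuming concepts have already been binned into solution categories. The concept names, categories, and the choice of networkx are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch: estimate the variety of a design solution set by
# building a concept network and inspecting its structure, in the spirit
# of the structural network modeling described above.
import itertools
import networkx as nx

# Toy data: each concept is tagged with the solution categories ("bins")
# assigned during concept binning. Names are illustrative only.
concepts = {
    "c1": {"solar", "storage"},
    "c2": {"solar"},
    "c3": {"kinetic"},
    "c4": {"kinetic", "storage"},
}

G = nx.Graph()
G.add_nodes_from(concepts)
# Connect two concepts when they share at least one solution category.
for a, b in itertools.combinations(concepts, 2):
    if concepts[a] & concepts[b]:
        G.add_edge(a, b)

# A sparser, more fragmented network suggests a more diverse solution set.
print("density:", nx.density(G))
print("components:", nx.number_connected_components(G))
```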

2013 ◽  
Vol 136 (3) ◽  
Author(s):  
Jie Hu ◽  
Masoumeh Aminzadeh ◽  
Yan Wang

In complex systems design, multidisciplinary constraints are imposed by stakeholders. Engineers need to search the feasible design space for a given problem before searching for the optimum design solution. Searching the feasible design space can be modeled as a constraint satisfaction problem (CSP). By introducing logical quantifiers, the CSP is extended to a quantified constraint satisfaction problem (QCSP) so that more semantics and design intent can be captured. This paper presents a new approach to formulate design space search problems as QCSPs in a continuous design space based on generalized intervals, and to solve them numerically for feasible solution sets, where the lower and upper bounds of design variables are specified. The approach includes two major components. One is a semantic analysis that evaluates the logical relationships of variables in generalized interval constraints based on Kaucher arithmetic; the other is a branch-and-prune algorithm that takes advantage of the logical interpretation. The new approach is generic and can be applied to the case when variables occur multiple times, which is not supported by other QCSP solving methods. A hybrid stratified Monte Carlo method that combines interval arithmetic with Monte Carlo sampling is also developed to verify the correctness of the QCSP solution sets obtained by the branch-and-prune algorithm.
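As a rough illustration of the branch-and-prune skeleton only (the paper's solver additionally uses generalized intervals and Kaucher arithmetic to handle quantifiers), the sketch below certifies sub-boxes of a 2-D design space against a single constraint. The constraint and all tolerances are invented for illustration.

```python
# A minimal branch-and-prune sketch over ordinary (classical) intervals.
# Boxes that provably satisfy x^2 + y^2 <= 1 are accepted, boxes that
# provably violate it are pruned, and undecided boxes are bisected.

def interval_sq(lo, hi):
    """Interval extension of x**2."""
    candidates = [lo * lo, hi * hi]
    return (0.0 if lo <= 0.0 <= hi else min(candidates), max(candidates))

def feasible_boxes(box, tol=0.05):
    (xlo, xhi), (ylo, yhi) = box
    sx, sy = interval_sq(xlo, xhi), interval_sq(ylo, yhi)
    lo, hi = sx[0] + sy[0], sx[1] + sy[1]
    if hi <= 1.0:
        return [box]                      # entire box feasible: accept
    if lo > 1.0:
        return []                         # entire box infeasible: prune
    if max(xhi - xlo, yhi - ylo) < tol:
        return []                         # undecided but small: give up
    # Branch: bisect the widest variable and recurse on both halves.
    if xhi - xlo >= yhi - ylo:
        m = 0.5 * (xlo + xhi)
        halves = [((xlo, m), (ylo, yhi)), ((m, xhi), (ylo, yhi))]
    else:
        m = 0.5 * (ylo + yhi)
        halves = [((xlo, xhi), (ylo, m)), ((xlo, xhi), (m, yhi))]
    return [b for h in halves for b in feasible_boxes(h, tol)]

inner = feasible_boxes(((0.0, 1.0), (0.0, 1.0)))
print(len(inner), "boxes certified feasible")
```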


Author(s):  
Pauline Jacobson

This chapter examines the currently fashionable notion of ‘experimental semantics’ and argues that most work in natural language semantics has always been experimental. The oft-cited dichotomy between ‘theoretical’ (or ‘armchair’) and ‘experimental’ work is bogus and should be dropped from the discourse. The same holds for dichotomies like ‘intuition-based’ (or ‘thought experiments’) vs. ‘empirical’ work (and ‘real experiments’). The so-called new ‘empirical’ methods are often nothing more than collecting large-scale ‘intuitions’ or doing multiple thought experiments. Of course, the use of multiple subjects could well allow for a better experiment than the more traditional single- or few-subject methodologies. But whether or not this is the case depends entirely on the question at hand. In fact, the chapter considers several multiple-subject studies and shows that the particular methodology in those cases does not necessarily provide important insights, and it argues that some of its claimed benefits are incorrect.


Author(s):  
Andrew Reid ◽  
Julie Ballantyne

In an ideal world, assessment should be synonymous with effective learning and reflect the intricacies of the subject area. It should also be aligned with the ideals of education: to provide equitable opportunities for all students to achieve and to allow both appropriate differentiation for varied contexts and students and comparability across various contexts and students. This challenge is made more difficult in circumstances in which the contexts are highly heterogeneous, for example in the state of Queensland, Australia. Assessment in music challenges schooling systems in unique ways because teaching and learning in music are often naturally differentiated and diverse, yet assessment often calls for standardization. While each student and teacher has individual, evolving musical pathways in life, the syllabus and the system require consistency and uniformity. The challenge, then, is to provide diverse, equitable, and quality opportunities for all children to learn and achieve to the best of their abilities. This chapter discusses the design and implementation of large-scale curriculum as experienced in secondary schools in Queensland, Australia. The experiences detailed explore the possibilities offered through externally moderated school-based assessment. Also discussed is the centrality of system-level clarity of purpose, principles, and processes, and the provision of supportive networks and mechanisms to foster autonomy for a diverse range of music educators and contexts. Implications for education systems that desire diversity, equity, and quality are discussed, and the conclusion provokes further conceptualization and action on behalf of students, teachers, and the subject area of music.


Author(s):  
Miguel Ángel Hernández-Rodríguez ◽  
Ermengol Sempere-Verdú ◽  
Caterina Vicens-Caldentey ◽  
Francisca González-Rubio ◽  
Félix Miguel-García ◽  
...  

We aimed to identify and compare medication profiles in populations with polypharmacy between 2005 and 2015. We conducted a cross-sectional study using information from the Computerized Database for Pharmacoepidemiologic Studies in Primary Care (BIFAP, Spain). We estimated the prevalence of therapeutic subgroups in all individuals 15 years of age and older with polypharmacy (≥5 drugs for ≥6 months) using the Anatomical Therapeutic Chemical classification system, level 4, by sex and age group, for both calendar years. The most prescribed drugs were proton-pump inhibitors (PPIs), statins, antiplatelet agents, benzodiazepine derivatives, and angiotensin-converting enzyme inhibitors. The greatest increases between 2005 and 2015 were observed in PPIs, statins, other antidepressants, and β-blockers, while the prevalence of antiepileptics almost tripled. We observed increases in psychotropic drugs in women and cardiovascular medications in men. By age group, there were notable increases in antipsychotics, antidepressants, and antiepileptics (15–44 years); antidepressants, PPIs, and selective β-blockers (45–64 years); selective β-blockers, biguanides, PPIs, and statins (65–79 years); and statins, selective β-blockers, and PPIs (80 years and older). Our results revealed important increases in the use of specific therapeutic subgroups, such as PPIs, statins, and psychotropic drugs, highlighting opportunities to design and implement strategies to analyze the appropriateness of such prescriptions.
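A minimal sketch of the kind of stratified prevalence computation described, assuming a prescription table already restricted to patients meeting the polypharmacy definition; the column names and toy data are hypothetical (A02BC is, however, the real ATC level-4 code for proton-pump inhibitors).

```python
# Illustrative sketch (not the authors' code): prevalence of one ATC
# level-4 subgroup among polypharmacy patients, by sex and age group.
import pandas as pd

rx = pd.DataFrame({             # one row per patient-drug record
    "patient": [1, 1, 2, 3, 3],
    "atc4":    ["A02BC", "C10AA", "A02BC", "N05BA", "C10AA"],
    "sex":     ["F", "F", "M", "F", "F"],
    "age_grp": ["65-79", "65-79", "45-64", "80+", "80+"],
})

# Denominator: distinct polypharmacy patients per stratum.
denom = rx.groupby(["sex", "age_grp"])["patient"].nunique()
# Numerator: distinct patients with at least one PPI prescription.
numer = (rx[rx["atc4"] == "A02BC"]
         .groupby(["sex", "age_grp"])["patient"].nunique())
prevalence_pct = (numer / denom * 100).fillna(0)
print(prevalence_pct)
```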


2021 ◽  
Vol 20 (1) ◽  
Author(s):  
Jingru Zhou ◽  
Yingping Zhuang ◽  
Jianye Xia

Abstract Background Genome-scale metabolic models (GSMMs) are a powerful tool for the study of cellular metabolic characteristics. With the development of multi-omics measurement techniques in recent years, new methods that integrate multi-omics data into the GSMM have shown promising effects on predicted results. Such integration not only improves the accuracy of phenotype prediction but also enhances the reliability of the model for simulating complex biochemical phenomena, which can promote theoretical breakthroughs in specific gene target identification or a better system-level understanding of cell metabolism. Results Based on the basic GSMM iHL1210 of Aspergillus niger, we integrated large-scale enzyme kinetics and proteomics data to establish a GSMM with enzyme constraints, termed a GEM with Enzymatic Constraints using Kinetic and Omics data (GECKO). The results show that enzyme constraints effectively improve the model’s phenotype prediction ability and extend the model’s potential to guide target gene identification by predicting metabolic phenotype changes of A. niger under simulated gene knockout. In addition, enzyme constraints significantly reduced the solution space of the model, i.e., flux variability was significantly reduced for over 40.10% of metabolic reactions. The new model also showed versatility in other respects, such as estimating large-scale $k_{cat}$ values and predicting the differential expression of enzymes under different growth conditions. Conclusions This study shows that incorporating enzyme abundance information into the GSMM is very effective for improving model performance with A. niger. The enzyme-constrained model can be used as a powerful tool for predicting the metabolic phenotype of A. niger by incorporating proteome data. In the foreseeable future, with the fast development of measurement techniques and more precise and rich quantitative proteomics data being obtained for A. niger, the enzyme-constrained GSMM will find even broader system-level applications.
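The core of the enzyme-constraint idea can be sketched on a toy network: each flux $v_i$ is capped by $v_i \le k_{cat,i} \cdot e_i$, where $e_i$ is the measured enzyme abundance. The two-reaction network and all numbers below are invented; this is not the iHL1210 model or the GECKO toolbox itself.

```python
# Toy sketch of enzyme-constrained flux balance analysis: maximize a
# biomass flux subject to steady state (S v = 0) and kcat-based caps.
import numpy as np
from scipy.optimize import linprog

# Stoichiometry for a toy network: A -> B (v1), B -> biomass (v2);
# the single row enforces steady state for the internal metabolite B.
S = np.array([[1.0, -1.0]])
kcat = np.array([100.0, 50.0])    # 1/s, illustrative kinetic data
enzyme = np.array([0.02, 0.01])   # mmol/gDW, illustrative proteomics data

# Maximize v2 by minimizing -v2, with 0 <= v_i <= kcat_i * e_i.
res = linprog(c=[0.0, -1.0],
              A_eq=S, b_eq=[0.0],
              bounds=list(zip([0.0, 0.0], kcat * enzyme)))
print("max biomass flux:", -res.fun)   # limited by the tighter enzyme cap
```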


Author(s):  
Zsolt Lattmann ◽  
Adam Nagel ◽  
Jason Scott ◽  
Kevin Smyth ◽  
Chris vanBuskirk ◽  
...  

We describe the use of the Cyber-Physical Modeling Language (CyPhyML) to support trade studies and integration activities in system-level vehicle design. CyPhyML captures parameterized component behavior using acausal models (i.e., hybrid bond graphs and Modelica) to enable automatic composition and synthesis of simulation models for significant vehicle subsystems. Generated simulations allow us to compare performance between different design alternatives. System behavior and evaluation are specified independently of the specifications for design-space alternatives. Test bench models in CyPhyML are given in terms of generic assemblies over the entire design space, so performance can be evaluated for any selected design instance once automated design-space exploration is complete. Generated Simulink models are also integrated into a mobility model for interactive 3-D simulation.
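The separation of test benches from design-space alternatives can be illustrated with a small sketch: the evaluation function is written once and then applied to every design instance produced by exhaustive exploration. The components, attributes, and metric below are invented and far simpler than CyPhyML's acausal models.

```python
# Conceptual sketch, loosely mirroring the CyPhyML workflow described
# above: a single test bench evaluates any instance of the design space.
from itertools import product

motors    = {"M1": {"power_kW": 150}, "M2": {"power_kW": 200}}
batteries = {"B1": {"mass_kg": 300}, "B2": {"mass_kg": 450}}

def test_bench(design):
    """Evaluation defined once, independently of the alternatives."""
    m, b = design
    return motors[m]["power_kW"] / batteries[b]["mass_kg"]  # kW per kg

# Automated exploration: evaluate every alternative in the design space.
ranked = sorted(product(motors, batteries), key=test_bench, reverse=True)
for design in ranked:
    print(design, round(test_bench(design), 3))
```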


Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 598
Author(s):  
Jean-François Pratte ◽  
Frédéric Nolet ◽  
Samuel Parent ◽  
Frédéric Vachon ◽  
Nicolas Roy ◽  
...  

Analog and digital SiPMs have revolutionized the field of radiation instrumentation by replacing both avalanche photodiodes and photomultiplier tubes in many applications. However, multiple applications require greater performance than current SiPMs can deliver, for example, timing resolution for time-of-flight positron emission tomography and time-of-flight computed tomography, and mitigation of the large output capacitance of SiPM arrays for large-scale time projection chambers in liquid argon and liquid xenon experiments. In this contribution, the case is made that 3D photon-to-digital converters, also known as 3D digital SiPMs, offer potentially superior performance to analog and 2D digital SiPMs. A review of 3D photon-to-digital converters is presented along with various applications where they can make a difference, such as time-of-flight medical imaging systems and low-background experiments in noble liquids. Finally, the key design choices that must be made to obtain an optimized 3D photon-to-digital converter for radiation instrumentation, more specifically the single-photon avalanche diode array, the CMOS technology, the quenching circuit, the time-to-digital converter, the digital signal processing, and the system-level integration, are discussed in detail.
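As a back-of-the-envelope illustration of one such design choice, the sketch below combines a hypothetical SPAD front-end jitter with the TDC quantization contribution, using the standard LSB/√12 rms figure for uniform quantization; the numbers are illustrative, not measured device values.

```python
# Rough timing-resolution budget for a photon-to-digital converter:
# independent contributions are combined in quadrature.
import math

spad_jitter_ps = 60.0             # SPAD + quenching front-end jitter (rms)
tdc_lsb_ps = 15.0                 # TDC least significant bit
tdc_quant_ps = tdc_lsb_ps / math.sqrt(12)   # rms of uniform quantization

total_ps = math.sqrt(spad_jitter_ps**2 + tdc_quant_ps**2)
fwhm_ps = 2.355 * total_ps        # Gaussian FWHM ~= 2.355 * sigma
print(f"quantization: {tdc_quant_ps:.1f} ps rms, total: {fwhm_ps:.0f} ps FWHM")
```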


Author(s):  
Sudhakar Y. Reddy

Abstract This paper describes HIDER, a methodology that enables detailed simulation models to be used during the early stages of system design. HIDER uses a machine learning approach to form abstract models from the detailed models. The abstract models are used for multiple-objective optimization to obtain sets of non-dominated designs. The tradeoffs between design and performance attributes in the non-dominated sets are used to interactively refine the design space. A prototype design tool has been developed to assist the designer in easily forming abstract models, flexibly defining optimization problems, and interactively exploring and refining the design space. To demonstrate the practical applicability of this approach, the paper presents results from the application of HIDER to the system-level design of a wheel loader. In this demonstration, complex simulation models for cycle-time evaluation and stability analysis are used together for early-stage exploration of the design space.
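The non-dominated filtering step that HIDER relies on can be sketched in a few lines; the designs and (cycle time, instability) objective values below are invented, with lower taken as better for both objectives.

```python
# Minimal sketch of non-dominated (Pareto) filtering over two objectives.
def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

designs = {"d1": (12.0, 0.25), "d2": (10.5, 0.35),
           "d3": (11.0, 0.28), "d4": (13.0, 0.40)}

pareto = {name: objs for name, objs in designs.items()
          if not any(dominates(other, objs)
                     for o, other in designs.items() if o != name)}
print(pareto)   # d4 drops out: d1 beats it in both objectives
```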

