Together in cyberspace

2021 ◽  
pp. 162-167
Author(s):  
Ryan Kirkbride

A recent musical practice that has emerged from the twenty-first century’s rapidly developing technological landscape is live coding: the act of writing computer code that generates music in front of an audience while the performer projects their screen. As the number of live coders performing together increases, so too does the number of screens required to project all of the ensemble’s code. This well-documented problem is addressed in this chapter, which introduces a live coding editor built for collaborative improvisation and reflects on its impact on group creativity and ensemble interaction. The editor, Troop, displays all performers’ code in a single window, simplifying the technical setup and making inter-performer communication visible to the audience. This case study explores the technological design parameters that allow live-coding composers to compose music collaboratively in real time and discusses what means of interaction and collaboration these afford.
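
The core design idea here, a single shared window containing every performer's code, can be pictured with a small relay server that rebroadcasts each performer's edits to the whole ensemble. The sketch below is only indicative: the newline-delimited JSON message format, the port number and the field names are assumptions made for this example, not Troop's actual protocol.

```python
# Minimal sketch of the shared-buffer idea behind a collaborative live coding
# editor: every edit is sent to a relay server, which broadcasts it to all
# connected performers so that each local window shows the same combined code.
# Hypothetical message format and port; not Troop's real implementation.
import json
import socket
import threading

HOST, PORT = "0.0.0.0", 57890   # assumed port for the example

clients = []                    # connected performer sockets
lock = threading.Lock()

def broadcast(message: bytes) -> None:
    """Send an edit event to every connected performer (including the sender)."""
    with lock:
        for conn in list(clients):
            try:
                conn.sendall(message)
            except OSError:
                clients.remove(conn)

def handle_performer(conn: socket.socket) -> None:
    """Relay each newline-delimited JSON edit event to the whole ensemble."""
    with conn, conn.makefile("rb") as stream:
        for line in stream:
            edit = json.loads(line)   # e.g. {"row": 3, "col": 0, "text": "p1 >> pluck()"}
            broadcast((json.dumps(edit) + "\n").encode())

def serve() -> None:
    with socket.create_server((HOST, PORT)) as server:
        while True:
            conn, _addr = server.accept()
            with lock:
                clients.append(conn)
            threading.Thread(target=handle_performer, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    serve()
```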

1997 ◽  
Vol 36 (8-9) ◽  
pp. 331-336 ◽  
Author(s):  
Gabriela Weinreich ◽  
Wolfgang Schilling ◽  
Ane Birkely ◽  
Tallak Moland

This paper presents results from an application of a newly developed simulation tool for pollution-based real-time control (PBRTC) of urban drainage systems. The Oslo interceptor tunnel is used as a case study. The paper focuses on reducing the overflow loads of total phosphorus (Ptot) and ammonia-nitrogen (NH4-N) discharged to the receiving waters by means of optimized operation of the tunnel system. With PBRTC the total reduction of the Ptot load is 48% and of the NH4-N load 51%; compared with the volume-based RTC scenario, the additional reductions are 11% and 15%, respectively. These further reductions could be achieved with a relatively simple extension of the operation strategy.
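
To make the contrast between volume-based and pollution-based control concrete, the sketch below compares two simple decision rules for choosing where to allow an overflow: one looks only at flow, the other weighs pollutant load. The site names, concentrations and combined load index are invented for illustration and are not the control strategy used in the Oslo case study.

```python
# Illustrative contrast between volume-based and pollution-based RTC decisions
# for choosing which overflow structure to spill first when the interceptor
# tunnel is full. All numbers and names below are made up for demonstration.
from dataclasses import dataclass

@dataclass
class Overflow:
    name: str
    inflow_m3s: float        # current inflow to the structure
    p_tot_mg_l: float        # total phosphorus concentration
    nh4_n_mg_l: float        # ammonia-nitrogen concentration

def volume_based_choice(candidates: list[Overflow]) -> Overflow:
    """Spill where the hydraulic relief is largest, ignoring water quality."""
    return max(candidates, key=lambda o: o.inflow_m3s)

def pollution_based_choice(candidates: list[Overflow]) -> Overflow:
    """Spill the flow carrying the lowest pollutant load (concentration x flow)."""
    def load(o: Overflow) -> float:
        # simple combined load index in g/s (1 mg/L * 1 m3/s = 1 g/s)
        return o.inflow_m3s * (o.p_tot_mg_l + o.nh4_n_mg_l)
    return min(candidates, key=load)

if __name__ == "__main__":
    sites = [
        Overflow("industrial branch", inflow_m3s=2.0, p_tot_mg_l=4.0, nh4_n_mg_l=9.0),
        Overflow("stormwater branch", inflow_m3s=2.5, p_tot_mg_l=0.8, nh4_n_mg_l=1.5),
    ]
    print("volume-based spills at:   ", volume_based_choice(sites).name)
    print("pollution-based spills at:", pollution_based_choice(sites).name)
```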


Energies ◽  
2020 ◽  
Vol 14 (1) ◽  
pp. 156
Author(s):  
Paige Wenbin Tien ◽  
Shuangyu Wei ◽  
John Calautit

Because of extensive variations in occupancy patterns and electrical equipment use within office environments, accurate detection of occupants’ behaviour is valuable for reducing building energy demand and carbon emissions. Using the collected occupancy information, a building energy management system can automatically adjust the operation of heating, ventilation and air-conditioning (HVAC) systems to meet the actual demand of each conditioned space in real time. The existing, commonly used ‘fixed’ schedules for HVAC systems are insufficient because they cannot adjust to dynamic changes in the building environment. This study proposes a vision-based occupancy and equipment-usage detection method based on deep learning for demand-driven control systems. A model based on a region-based convolutional neural network (R-CNN) was developed, trained and deployed to a camera for real-time detection of occupancy activities and equipment usage. Experimental tests within a case-study office room gave overall detection accuracies of 97.32% and 80.80%. To predict the energy savings attainable with the proposed approach, the case-study building was simulated. The simulation results revealed that heat gains could be over- or under-predicted when static or fixed profiles are used. Under the set conditions, the equipment and occupancy gains were 65.75% and 32.74% lower when using the deep learning approach. Overall, the study demonstrates the capability of the proposed approach to detect and recognise multiple occupants’ activities and equipment usage, and to provide an alternative means of estimating internal heat emissions.
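
As a rough illustration of such a detection pipeline, the sketch below applies a stock pretrained region-based detector (Faster R-CNN from torchvision) to webcam frames and counts detections of two COCO classes. The study's own model, classes and camera setup differ; the label indices, score threshold and use of a COCO-pretrained network are assumptions made here for the example.

```python
# Minimal sketch of real-time detection with a pretrained region-based detector
# applied to camera frames, in the spirit of the demand-driven detection above.
# A stock COCO model and two COCO classes (person, laptop) stand in purely for
# illustration; the paper trained its own model on occupancy and equipment use.
import cv2
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

LABELS = {1: "person", 73: "laptop"}   # torchvision COCO indices (assumed subset)
SCORE_THRESHOLD = 0.7

model = fasterrcnn_resnet50_fpn(weights="DEFAULT")   # torchvision >= 0.13
model.eval()

cap = cv2.VideoCapture(0)              # default camera
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # OpenCV delivers BGR uint8; the model expects float RGB in [0, 1]
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
        with torch.no_grad():
            detections = model([tensor])[0]   # dict with boxes, labels, scores
        counts = {}
        for label, score in zip(detections["labels"], detections["scores"]):
            name = LABELS.get(int(label))
            if name and float(score) >= SCORE_THRESHOLD:
                counts[name] = counts.get(name, 0) + 1
        print(counts)                          # e.g. {'person': 2, 'laptop': 1}
        cv2.imshow("detections", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to stop
            break
finally:
    cap.release()
    cv2.destroyAllWindows()
```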


Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 405
Author(s):  
Marcos Lupión ◽  
Javier Medina-Quero ◽  
Juan F. Sanjuan ◽  
Pilar M. Ortigosa

Activity Recognition (AR) is an active research topic focused on detecting human actions and behaviours in smart environments. In this work, we present DOLARS (Distributed On-line Activity Recognition System), an on-line activity recognition platform in which data from heterogeneous sensors, including binary, wearable and location sensors, are evaluated in real time. Different descriptors and metrics from the heterogeneous sensor data are integrated into a common feature vector, which is extracted with a sliding-window approach under real-time conditions. DOLARS provides a distributed architecture in which: (i) the stages of AR data processing are deployed on distributed nodes; (ii) temporal cache modules compute metrics that aggregate sensor data to build feature vectors efficiently; (iii) publish-subscribe models both spread data from the sensors and orchestrate the nodes (communication and replication) for computing AR; and (iv) machine learning algorithms classify and recognise the activities. A successful case study of daily-activity recognition carried out in the Smart Lab of the University of Almería (UAL) is presented in this paper. The results show encouraging performance in recognising sequences of activities and demonstrate the need for distributed architectures to achieve real-time recognition. The sliding-window feature extraction is illustrated in the sketch below.
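
In this sketch, events from heterogeneous sensors are buffered for a fixed time span and summarised into a fixed-length feature vector ready for a classifier. The event format, the 30-second window and the count/mean features are assumptions made for the example rather than DOLARS's actual descriptors.

```python
# Illustrative sliding-window feature extraction over a stream of heterogeneous
# sensor events, similar in spirit to the temporal aggregation performed before
# classification. Event format, window length and features are assumptions.
from collections import deque

WINDOW_SECONDS = 30.0

class SlidingWindow:
    """Keeps the events of the last WINDOW_SECONDS and summarises them."""

    def __init__(self) -> None:
        self.events = deque()   # (timestamp, sensor_id, value)

    def add(self, timestamp: float, sensor_id: str, value: float) -> None:
        self.events.append((timestamp, sensor_id, value))
        # drop events that have slid out of the window
        while self.events and timestamp - self.events[0][0] > WINDOW_SECONDS:
            self.events.popleft()

    def feature_vector(self, sensor_ids: list[str]) -> list[float]:
        """One activation count and one mean value per sensor, in a fixed order."""
        features = []
        for sid in sensor_ids:
            values = [v for (_, s, v) in self.events if s == sid]
            features.append(float(len(values)))                            # activation count
            features.append(sum(values) / len(values) if values else 0.0)  # mean value
        return features

if __name__ == "__main__":
    window = SlidingWindow()
    window.add(0.0, "door_kitchen", 1.0)       # binary sensor fired
    window.add(5.0, "wrist_accel", 0.8)        # wearable magnitude
    window.add(12.0, "location_kitchen", 1.0)  # location beacon
    sensors = ["door_kitchen", "wrist_accel", "location_kitchen"]
    print(window.feature_vector(sensors))      # ready to feed a classifier
```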

