Distributed Training Technology Selection Advisor (TECHSELECT): User's Manual

1988 ◽  
Author(s):  
Joseph D. Hagman ◽  
Dewey I. Dykstra Jr.
2013 ◽  
Author(s):  
Farshad Madani ◽  
Benjamin Stolt ◽  
Greg Wease ◽  
Phaneendra Rampalli

1999 ◽  
Vol 39 (8) ◽  
pp. 177-184 ◽  
Author(s):  
Derin Orhon ◽  
Seval Sözen ◽  
Erdem Görgün ◽  
Emine Ubay Çokgör ◽  
Nazik Artan

Coastal tourist areas should be classified as environmentally sensitive areas, and effective nutrient control should be implemented to safeguard the quality of the receiving waters. In this context, the applicable discharge limitations are reviewed with specific reference to European directives, and criteria developed for small coastal residential areas in Turkey are reported; wastewater characterization and its impact on treatment technology selection are reviewed; and appropriate treatment technologies are evaluated both for selecting new applications and for upgrading and retrofitting existing systems.


Entropy ◽  
2021 ◽  
Vol 23 (4) ◽  
pp. 460 ◽  
Author(s):  
Samuel Yen-Chi Chen ◽  
Shinjae Yoo

Distributed training across several quantum computers could significantly improve the training time, and if we could share the learned model rather than the data, it could potentially improve data privacy, as the training would happen where the data are located. One potential scheme to achieve this property is federated learning (FL), in which several clients or local nodes learn on their own data and a central node aggregates the models collected from those local nodes. However, to the best of our knowledge, no work has yet been done on quantum machine learning (QML) in a federated setting. In this work, we present federated training of hybrid quantum-classical machine learning models, although our framework could be generalized to pure quantum machine learning models. Specifically, we consider a quantum neural network (QNN) coupled with a classical pre-trained convolutional model. Our distributed federated learning scheme achieved almost the same trained-model accuracy while delivering significantly faster distributed training, demonstrating a promising research direction for both scaling and privacy.
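The aggregation scheme the abstract describes — clients training locally on private data and a central node averaging the collected models — can be sketched as federated averaging. This is a minimal illustrative sketch, not the paper's implementation: it substitutes a plain linear model trained by gradient descent for the hybrid QNN, and all function names are hypothetical.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training step: gradient descent on a linear
    model with squared loss (an illustrative stand-in for the QNN)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # d/dw of mean squared error
        w -= lr * grad
    return w

def federated_round(global_w, client_data):
    """Central node: collect each client's updated weights and average
    them -- only model parameters travel, never the raw data."""
    updates = [local_update(global_w, X, y) for X, y in client_data]
    return np.mean(updates, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Two clients, each holding its own private data slice.
clients = []
for _ in range(2):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(20):           # 20 communication rounds
    w = federated_round(w, clients)
```

After the rounds complete, `w` is close to `true_w` even though neither client ever shared its data, which is the privacy property the abstract highlights.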

