CORBA-based Runtime Environments for Standardized Distributed Learning Architectures

10.28945/2412 ◽  
2001 ◽  
Author(s):  
Luis Anido Rifon ◽  
Martin Llamas Nistal ◽  
Manuel J. Fernandez Iglesias ◽  
Manuel Caeiro Rodríguez ◽  
Juan M. Santos Gago ◽  
...  

The learning technology standardization process is taking the lead role in research efforts into computer-based education. Institutions like the IEEE and the US Department of Defense have set up committees to deliver recommendations and specifications in this area, with the aim of providing interoperability between different educational systems. The first part of this paper presents an up-to-date survey of this field. In the second part we present our contribution to this area: a distributed architecture for developing interoperable educational frameworks over a CORBA domain interface. Our system aims at standardizing the development of distributed educational environments from reusable software components. We focus our attention on the runtime environment, which is responsible for content delivery, student tracking and course routing.
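As a rough illustration of what such a runtime environment exposes to client tools, the Python sketch below models the three responsibilities named above (content delivery, student tracking and course routing). All names, types and the in-memory implementation are assumptions for illustration; they are not the authors' actual CORBA IDL.

```python
# Illustrative sketch only: a facade for the runtime environment described
# above (content delivery, student tracking, course routing). Names and
# signatures are assumptions, not the authors' actual CORBA IDL.
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class TrackingRecord:
    student_id: str
    completed_units: list = field(default_factory=list)
    score: float = 0.0


class RuntimeEnvironment(ABC):
    @abstractmethod
    def deliver_content(self, course_id: str, unit_id: str) -> bytes:
        """Return the learning content for one unit of a course."""

    @abstractmethod
    def track(self, course_id: str, record: TrackingRecord) -> None:
        """Store the progress and scores reported by the content player."""

    @abstractmethod
    def next_unit(self, course_id: str, record: TrackingRecord) -> Optional[str]:
        """Course routing: decide which unit the student should see next."""


class InMemoryRuntime(RuntimeEnvironment):
    """Toy in-process implementation standing in for a CORBA servant."""

    def __init__(self, course_units: dict):
        self.course_units = course_units   # course_id -> ordered list of unit ids
        self.records = {}                  # (course_id, student_id) -> TrackingRecord

    def deliver_content(self, course_id: str, unit_id: str) -> bytes:
        return f"<content of {course_id}/{unit_id}>".encode()

    def track(self, course_id: str, record: TrackingRecord) -> None:
        self.records[(course_id, record.student_id)] = record

    def next_unit(self, course_id: str, record: TrackingRecord) -> Optional[str]:
        remaining = [u for u in self.course_units[course_id]
                     if u not in record.completed_units]
        return remaining[0] if remaining else None
```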

2011 ◽  
Vol 4 (2) ◽  
Author(s):  
Jean Underwood ◽  
Arne Dahlberg ◽  
Simon FitzPatrick ◽  
Malcolm Greenwood

The STILE Project (Students' and Teachers' Integrated Learning Environment) is one of 76 projects set up under the UK Government's Teaching and Learning Technology Programme (TLTP) initiative sponsored by the British Higher Education Funding Councils (HEFCs). The STILE Project uses hypermedia to provide greater opportunities for independent and flexible modes of learning, both in a campus situation and for distance learning. The approach is resource-based. STILE provides a mechanism for both tutors and learners to discover and access relevant resources when they need them, together with facilities that enable users readily to use and re-use existing materials, to integrate them, and to add further materials of their own in a way that seems natural to them (see Ruggles et al., 1995). The result is not a closed and finished product, but a set of tools and services and a continually developing resource base. The effect is to ease the load on academic staff in maintaining and supporting student access to resources, and to enrich the set of resources available to both staff and students.
DOI: 10.1080/0968776960040205


2011 ◽  
Vol 4 (3) ◽  
Author(s):  
Gabriel Jacobs

I am currently a member of a working party set up in my own university to look into a range of IT matters, including learning technology, with the aim of producing a medium-term institutional plan. At many of the meetings I attend, I hear about the urgency of focusing our CAL effort, but the conviction around the table is often tempered by lecturers' complaints that the off-the-shelf courseware they have tried either does not work well, or does not fit their particular needs, or both. So a suggestion is made: we should move in the direction of developing our own high-quality educational software tailored to our individual requirements. And since these requirements are very diverse over the whole campus, we should establish a Centre for Educational Technology, a Courseware Resources and Advice Unit, a Virtual Learning Development Laboratory, an Institute for Computer-Based Academic Practice . . . call it what you will. It should be staffed by experts who can advise departments and produce for them, or help them to produce, the exact software they require. It should be supported by a battalion of technicians, and should not only be equipped with white-hot multimedia but also backed by sufficient financial resources to ensure continuous upgrades so as to remain in a permanent state of state-of-the-art. The bank balance is not as healthy as it might be (whose is?), but the university management must nevertheless somehow be convinced of the necessity of spending money on the project.
DOI: 10.1080/0968776960040301


2021 ◽  
pp. 263300242110244
Author(s):  
Alice M. Greenwald ◽  
Clifford Chanin ◽  
Henry Rousso ◽  
Michel Wieviorka ◽  
Mohamed-Ali Adraoui

How do societies and states represent the historical, moral, and political weight of the terrorist attacks they have had to face? Having suffered in recent years from numerous terrorist attacks on their soil originating from jihadist movements, often carried out by actors who were also their own citizens, France and the United States have set up, or are seeking to set up, places of memory whose functions, conditions of creation, modes of operation, and nature of the messages sent may vary. Three of the main protagonists and initiators of two museum-memorial projects linked to terrorist attacks have agreed to share their visions of the role of these projects and of the political, social, and historical context in which they have emerged. By allowing us to observe similarities and differences between the American and French approaches, this interview sheds light on the place of memory and feeling in societies struck by tragic events and seeking to cure their ills through memory and commemoration.


Author(s):  
P. Kadochnikov ◽  
M. Ptashkina

The US and the EU are negotiating a comprehensive Trans-Atlantic Trade and Investment Partnership (TTIP). The main purposes of the agreement are to stimulate economic growth and employment, to facilitate trade and investment, and to raise competitiveness on both sides of the Atlantic. The US and EU are each other's biggest trade and investment partners, as well as the most important partners for a number of other countries. The Trans-Atlantic free trade agreement would not only facilitate bilateral cooperation, but also has the potential to set up new, more advanced international trade and investment rules and practices. The agreement is aimed, among other points, at resolving some of the existing problems in bilateral relations, such as differences in regulatory practices, market access conditions, government procurement, intellectual property rights (IPR) and investor protection. However, some of these differences are deeply inherent in the regulatory systems and have become the reasons for numerous disputes. Despite the fact that the negotiations on TTIP are still in progress, it is already possible to identify and assess the underlying differences that could potentially hamper the creation of deep provisions in the future agreement. The paper aims at analyzing the most difficult areas of negotiation and offering predictions for the future provisions. Firstly, the paper gives an overview of the scope and structure of bilateral relations between the US and EU. Secondly, the authors give a detailed analysis of the most important points of the negotiation agenda, emphasizing the underlying differences in domestic regulation and assessing the depth of those differences. The conclusions are as follows. While some areas, such as tariffs, labor and environment, SMEs, state enterprises and others, are relatively easy to agree upon, as both economies are striving to achieve high standards, negotiations on other issues, such as government procurement, NTM regulation and IPR, are less likely to achieve high standards.


Author(s):  
Susan Murray

In response to a growing demand from the public for health information resources, North American public libraries have provided varying levels of consumer health information (CHI) services since the 1970s. Due to the availability of funding in the US, many American public libraries have provided CHI services, although the majority of these have been offered in partnership with health sciences libraries or via the "Go Local" programs. In Canada, where no specific funding has been available for CHI services, few public libraries have set up CHI services; health information has generally been provided by augmenting health collections or "virtually," i.e., by providing links to recommended electronic resources via the library's Web site.


2020 ◽  
Vol 245 ◽  
pp. 07010
Author(s):  
Marcelo Vogel ◽  
Mikhail Borodin ◽  
Alessandra Forti ◽  
Lukas Heinrich

This paper describes the deployment of the offline software of the ATLAS experiment at the LHC in containers for use in production workflows such as simulation and reconstruction. To achieve this goal we are using Docker and Singularity, both lightweight virtualization technologies that can encapsulate software packages inside complete file systems. The deployment of offline releases via containers removes the interdependence between the runtime environment needed for job execution and the configuration of the computing nodes at the sites. Docker or Singularity would provide a uniform runtime environment for the grid, HPCs and a variety of opportunistic resources. Additionally, releases may be supplemented with the detector's conditions data, thus removing the need for network connectivity at computing nodes, which is normally quite restricted for HPCs. In preparation for this goal, we have built Docker and Singularity images containing single full releases of ATLAS software for running detector simulation and reconstruction jobs in runtime environments without a network connection. Unlike similar efforts to produce containers by packing all possible dependencies of every possible workflow into heavy images (≈ 200 GB), our approach is to include only what is needed for specific workflows and to manage dependencies efficiently via software package managers. This approach leads to more stable packaged releases where the dependencies are clear, and the resulting images have more portable sizes (≈ 16 GB). In an effort to cover a wider variety of workflows, we are deploying images that can be used in raw data reconstruction. This is particularly challenging due to the high database resource consumption during access to the experiment's conditions payload when processing data. We describe here a prototype pipeline in which images are provisioned only with the conditions payload necessary to satisfy the jobs' requirements. This database-on-demand approach would keep images slim, portable and capable of supporting various workflows in a standalone fashion in environments with no network connectivity.
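A rough Python sketch of the workflow-specific provisioning idea described above: copy into the image build context only the software packages and conditions payload that a given workflow actually needs. The manifests, directory layout and file names are invented for illustration and are not the real ATLAS release or conditions database layout.

```python
# Illustrative only: stage a slim build context containing just the packages
# and conditions payload one workflow needs. Manifests, paths and file names
# are assumptions for illustration, not the real ATLAS layout.
import shutil
from pathlib import Path

# Hypothetical per-workflow manifests describing what each job type needs.
WORKFLOW_MANIFESTS = {
    "simulation": {
        "packages": ["CoreRelease", "SimulationAlgs"],
        "conditions": ["geometry.db"],
    },
    "raw_reconstruction": {
        "packages": ["CoreRelease", "RecoAlgs"],
        "conditions": ["geometry.db", "calibration_run350000.db"],
    },
}


def stage_build_context(workflow: str, release_dir: Path,
                        conditions_dir: Path, out_dir: Path) -> None:
    """Copy only the files required by `workflow` into the image build context."""
    manifest = WORKFLOW_MANIFESTS[workflow]
    (out_dir / "conditions").mkdir(parents=True, exist_ok=True)
    for pkg in manifest["packages"]:
        shutil.copytree(release_dir / pkg, out_dir / "sw" / pkg, dirs_exist_ok=True)
    for payload in manifest["conditions"]:
        shutil.copy2(conditions_dir / payload, out_dir / "conditions" / payload)


# Example usage (paths are placeholders):
# stage_build_context("raw_reconstruction", Path("/opt/release"),
#                     Path("/data/conditions"), Path("./build-context"))
```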


2019 ◽  
Vol 4 (2) ◽  
Author(s):  
Rozali Toyib ◽  
Ardi Wijaya

Abstract: Data stored on storage media is often lost or opened by certain irresponsible parties, which is very detrimental to the owner of the data. It is therefore necessary to secure the data so that it cannot be opened by unauthorized parties. The RC5 and RC6 algorithms are message digest algorithms, sometimes also known as hash functions: algorithms whose input is a message of arbitrary length and whose output is a message digest of exactly 128 bits. An RC6 password is one form of protection for the user in securing data on a PC. Based on the results, the following conclusions are drawn: in the experiments carried out on the RC5 algorithm, the execution time for key generation (key set-up) is very fast, about 9-10 ns; in the experiments carried out on the RC6 algorithm, the execution time for key generation (key set-up) is faster, at 10-11 ns. In the encryption and decryption processes, execution time depends on the size of the plaintext file: the larger the plaintext file, the longer the execution time.
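To make the key set-up timing discussion concrete, here is a minimal Python sketch of the standard RC5 key-expansion schedule (RC5-32, 12 rounds) with a simple timer around it. The parameters, test key and timing harness are assumptions for illustration only; Python timings will not match the figures reported above.

```python
# Minimal sketch of the standard RC5-32 key-expansion (set-up key) schedule,
# timed with time.perf_counter(). Test key and parameters are illustrative.
import time

W = 32                                 # word size in bits (RC5-32)
MASK = (1 << W) - 1
P32, Q32 = 0xB7E15163, 0x9E3779B9      # standard RC5 magic constants


def rol(x: int, s: int) -> int:
    """Rotate a W-bit word left by s positions."""
    s %= W
    return ((x << s) | (x >> (W - s))) & MASK


def rc5_key_setup(key: bytes, rounds: int = 12) -> list:
    """Expand a user key into the RC5 round-key table S (standard schedule)."""
    u = W // 8
    c = max(1, -(-len(key) // u))                 # ceil(len(key) / u)
    L = [0] * c
    for i, b in enumerate(key):                   # load key bytes little-endian
        L[i // u] |= b << (8 * (i % u))
    t = 2 * (rounds + 1)
    S = [(P32 + i * Q32) & MASK for i in range(t)]
    A = B = i = j = 0
    for _ in range(3 * max(t, c)):                # mix key material into S
        A = S[i] = rol((S[i] + A + B) & MASK, 3)
        B = L[j] = rol((L[j] + A + B) & MASK, A + B)
        i, j = (i + 1) % t, (j + 1) % c
    return S


start = time.perf_counter()
rc5_key_setup(b"0123456789abcdef")
print(f"RC5 key set-up took {(time.perf_counter() - start) * 1e6:.1f} microseconds")
```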


2016 ◽  
Vol 5 (2) ◽  
Author(s):  
Rainer Baule ◽  
Hannes Wilke

This paper bridges two recent studies on the role of analysts in providing new and relevant information to investors. On the one hand, the contribution of analysts to long-term price discovery in the US market is rather low. Considering earnings-per-share forecasts as the main output of analysts' reports, their information share amounts to only 4.6% on average. On the other hand, trading strategies built on these EPS forecasts are quite profitable. Self-financing portfolios yield excess returns of more than 5% over the S&P 100 index for a time period of 36 years, which persist after controlling for the well-known risk factors. In this paper, we discuss the link between the low information shares and the high abnormal returns. We argue that the information shares of analysts cannot be higher, because otherwise their forecasts would lead to excessively profitable trading strategies, which are very unlikely to persist over such a long period of time.


Author(s):  
Tanja Bueltmann ◽  
Donald M. MacRaild

This chapter moves beyond the St George's societies that scholars portray as proof that the English principally indulged in elite civic activism rather than ethnic behaviour. A second tier of English association developed in the 1870s, catering specifically for independent working-class migrants. The Order of the Sons of St George (OSStG; 1870) and the Sons of England (1874) represented something different. Clearly, working-class Englishmen and women in the US and Canada felt the need for another type of organization: one whose fees they could afford and that provided them with mutual aid. These English ethnic friendly societies drew upon homeland traditions. In the US, they also took shape within an American culture of associating. Such organizations were structured by the imperatives of class solidarity and ethnic togetherness. Indeed, ethnicity also sponsored (and was sponsored by) tension and competition with the Irish. This chapter traces these developments with a particular view to the context in which they were founded, and where they were set up. The OSStG, for instance, came about in part as a coordinated response to a heightened ethnic consciousness.


Author(s):  
Anton Michlmayr ◽  
Philipp Leitner ◽  
Florian Rosenberg ◽  
Schahram Dustdar

Service-oriented Architectures (SOA) and Web services have received a lot of attention from both industry and academia. Services, as the core entities of every SOA, change regularly for various reasons. This poses a clear problem in distributed environments, since service providers and consumers are generally loosely coupled. Using the publish/subscribe style of communication, service consumers can be notified when such changes occur. In this chapter, we present an approach that leverages event processing mechanisms for Web service runtime environments, based on a rich event model and different event visibilities. Our approach covers the full service lifecycle, including runtime information concerning service discovery and service invocation, as well as Quality of Service attributes. Furthermore, besides subscribing to events of interest, users can also search historical event data. We show how this event notification support was integrated into our service runtime environment VRESCo and give some usage examples in an application context.
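As a rough illustration of the publish/subscribe notification style described here, the Python sketch below pairs a per-event visibility flag with an in-memory event history that can be queried afterwards. The event topics, fields and API are assumptions for illustration; they are not VRESCo's actual event model.

```python
# Illustrative publish/subscribe sketch with event visibility and a searchable
# event history. Topics, fields and API are assumptions, not VRESCo's model.
from collections import defaultdict
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable


@dataclass
class Event:
    topic: str                    # e.g. "service.published", "service.invoked"
    payload: dict
    visibility: str = "public"    # "internal" events go only to internal subscribers
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class EventBus:
    def __init__(self) -> None:
        self._subscribers = defaultdict(list)   # topic -> [(handler, is_internal)]
        self._history = []                      # kept so past events can be searched

    def subscribe(self, topic: str, handler: Callable[[Event], None],
                  internal: bool = False) -> None:
        self._subscribers[topic].append((handler, internal))

    def publish(self, event: Event) -> None:
        self._history.append(event)
        for handler, is_internal in self._subscribers[event.topic]:
            if event.visibility == "internal" and not is_internal:
                continue                        # hide internal events from external subscribers
            handler(event)

    def search(self, topic: str, since: datetime) -> list:
        """Query historical events, mirroring the search-in-event-history feature."""
        return [e for e in self._history if e.topic == topic and e.timestamp >= since]


bus = EventBus()
bus.subscribe("service.published", lambda e: print("new service:", e.payload["name"]))
bus.publish(Event("service.published", {"name": "OrderService", "qos": {"latency_ms": 40}}))
```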

