Multidimensional Programming

Author(s):  
E. A. Ashcroft ◽  
A. A. Faustini ◽  
R. Jaggannathan ◽  
W. W. Wadge

This book describes a powerful language for multidimensional declarative programming called Lucid. Lucid has evolved considerably in the past ten years. The main catalyst for this metamorphosis was the discovery that Lucid is based on intensional logic, a logic commonly used in studying natural languages. Intensionality, and more specifically indexicality, has enabled Lucid to implicitly express multidimensional objects that change, a fundamental capability whose consequences are explored in this book. The authors cover a broad range of topics, from foundations to applications, and from implementations to implications. The role of intensional logic in Lucid, as well as its consequences for programming in general, is discussed. The syntax and mathematical semantics of the language are given, and its ability to serve as a formal system for transformation and verification is presented. The use of Lucid in both multidimensional applications programming and software systems construction (such as a parallel programming system and a visual programming system) is described. A novel model of multidimensional computation--eduction--is described, along with its serendipitous practical benefits for harnessing parallelism and tolerating faults. As the only volume that reflects the advances of the past decade, this work will be of great interest to researchers and advanced students involved with declarative language systems and programming.

1986 ◽  
Vol 15 (2) ◽  
pp. 283-328
Author(s):  
Francis Renaud

In this paper, we propose a formal system, directly implementable on a computer, that allows a detailed analysis of temporal expressions. The use of a functional language facilitates the construction of meaning representations that take into account the fuzziness of natural languages and the dynamic character of the understanding process (such as the dynamic processing of the reference point à la Kamp).


1992 ◽  
Vol 28 (6) ◽  
pp. 943-944
Author(s):  
A. V. Palagin ◽  
V. P. Boyun ◽  
A. S. Yurchenko

Author(s):  
Alan Reed Libert

Artificial languages—languages which have been consciously designed—have been created for more than 900 years, although the number of them has increased considerably in recent decades, and by the early 21st century the total figure probably was in the thousands. There have been several goals behind their creation; the traditional one (which applies to some of the best-known artificial languages, including Esperanto) is to make international communication easier. Some other well-known artificial languages, such as Klingon, have been designed in connection with works of fiction. Still others are simply personal projects. A traditional way of classifying artificial languages involves the extent to which they make use of material from natural languages. Those artificial languages which are created mainly by taking material from one or more natural languages are called a posteriori languages (which again include well-known languages such as Esperanto), while those which do not use natural languages as sources are a priori languages (although many a posteriori languages have a limited amount of a priori material, and some a priori languages have a small number of a posteriori components). Between these two extremes are the mixed languages, which have large amounts of both a priori and a posteriori material. Artificial languages can also be classified typologically (as natural languages are) and by how and how much they have been used. Many linguists seem to be biased against research on artificial languages, although some major linguists of the past have been interested in them.


Computers ◽  
2020 ◽  
Vol 9 (1) ◽  
pp. 10 ◽  
Author(s):  
Johannes Grohmann ◽  
Nikolas Herbst ◽  
Avi Chalbani ◽  
Yair Arian ◽  
Noam Peretz ◽  
...  

Failure prediction is an important aspect of self-aware computing systems. Therefore, a multitude of different approaches has been proposed in the literature over the past few years. In this work, we propose a taxonomy for organizing works focusing on the prediction of Service Level Objective (SLO) failures. Our taxonomy classifies related work along the dimensions of the prediction target (e.g., anomaly detection, performance prediction, or failure prediction), the time horizon (e.g., detection or prediction, online or offline application), and the applied modeling type (e.g., time series forecasting, machine learning, or queueing theory). The classification is derived based on a systematic mapping of relevant papers in the area. Additionally, we give an overview of different techniques in each sub-group and address remaining challenges in order to guide future research.


Author(s):  
Badri Gvasalia

We consider the problem of determining the optimal values of the PI controller parameters by a quadratic integral criterion, taking into account the damping coefficient of the transient process. The problem is formulated as a nonlinear mathematical programming problem. The global minimum of the objective function is found by random search, and the extended amplitude-phase frequency response of the system is used to determine the PI controller parameters. An attempt is made to provide designers with an effective tool for solving the above problem. The given graphs and tables facilitate the selection of parameters. The computer program is written in VBA (Visual Basic for Applications). Concrete examples are given.
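The random-search step described in the abstract can be sketched in Python. This is an illustrative minimal version, not the author's VBA program; the box bounds and the quadratic stand-in criterion below are assumptions introduced for the example.

```python
import random

def random_search(objective, bounds, n_iter=10_000, seed=0):
    """Pure random search for the global minimum of `objective`
    over a box given as a list of (low, high) bounds."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_iter):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        f = objective(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Hypothetical stand-in for the quadratic integral criterion over the
# PI gains (Kp, Ki); its minimum is placed at Kp = 2.0, Ki = 0.5.
def criterion(params):
    kp, ki = params
    return (kp - 2.0) ** 2 + (ki - 0.5) ** 2

(kp, ki), j = random_search(criterion, [(0.0, 10.0), (0.0, 5.0)])
```

In the setting of the paper, the objective would be evaluated from the system's extended amplitude-phase frequency response rather than from a closed-form expression as here.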


10.28945/2486 ◽  
2002 ◽  
Author(s):  
Julia Alpert Gladstone

This paper examines the various regimes that are used to protect databases to suggest that the continued progress of science and technology that has enabled economic prosperity will be fostered by less regulation. The diversity between and within each of these regimes reflects fundamentally different views of intellectual property. Technology, specifically the digitalization that has facilitated the creation, replication, and easy dissemination of information, has changed the value of information and threatens to create a striated society of information "haves" and "have-nots" due to enclosure mechanisms. As technology advances, the laws we implement to build upon the existing intellectual property infrastructure must be developed with care to preserve the careful balance of the public good and private interest that has maintained the past 200 years of "progress of science and useful arts." The author suggests ways to structure a database to encourage or reward database developers while simultaneously fostering the advancement of science.

Web technology has changed conventional Information Systems (IS) and conventional Information Technology (IT) as we know them. There is no doubt that web technology will provide the foundation for most future software systems. IS curricula therefore need to be brought up to date to reflect this reality. In this paper we update our earlier research leading to the design of a graduate model curriculum for Information Systems and describe a generic web-centric Information Systems Masters curriculum model. It is strong on web technology, and its goal is to produce students who are comfortable with both today's technology and the technology of the future. Universities and colleges can adapt this curriculum model to design a new Masters in IS curriculum or simply to bring any existing IS/IT curriculum up to date. The model suggests new core concentration courses and concentration electives.


2020 ◽  
Vol 102 (4) ◽  
pp. 5-7
Author(s):  
Teresa Preston

In this monthly column, Kappan managing editor Teresa Preston explores how the magazine has covered current issues in the past. This column looks at how authors have discussed the best ways to educate the most advanced students. Ensuring that students are appropriately challenged at school has interested Kappan authors since as far back as the 1940s. Programs have had the goal of improving students’ abilities, helping them with social challenges, and building a better workforce. Although some have decried these programs for elitism, educators have long sought ways to expand the definition of giftedness to bring in more students.


Author(s):  
Дмитрий Алексеевич Диденко

The article discusses a method of teaching programming in 9th-grade computer science lessons using the Lazarus system and the "Lazarus online" simulator developed by the author of the article.


Author(s):  
E. A. Ashcroft ◽  
A. A. Faustini ◽  
R. Jaggannathan ◽  
W. W. Wadge

The intensional programming language Lucid, described in Chapter 1, is based directly on intensional logic, a family of mathematical formal systems that permit expressions whose value depends on hidden contexts or indices. Our use of intensional logic is one in which the hidden contexts or indices are integers or tuples of integers. Intensional logic, as used to give semantics to natural language, uses a much more general notion of context or index. Of course, intensional logic is hardly the first example of a formal system of interest to both logicians and computer scientists. The language LISP (invented by McCarthy and others in the early sixties [34]) was originally intended to be an adaptation of the lambda calculus, although it diverged in its treatment of variable binding and higher-order functions. Shortly after, however, Landin produced ISWIM, the first true functional language [30]. These “logical” programming languages, such as ISWIM, are in many respects vastly superior to the more conventional ones. They are much simpler and better defined, yet at the same time more regular and more powerful. These languages are notationally closer to ordinary mathematics and are much more problem-oriented. Finally, programs are still expressions in a formal system, and are still subject to the rules of that system. It is therefore much easier to reason formally about their correctness, or to apply meaning-preserving transformations. With these languages, programming really is a respectable branch of applied mathematical logic. These logic-based (or declarative) languages at first proved difficult to implement efficiently, and interest in declarative languages declined soon after the promising initial work of McCarthy and Landin.
Fortunately, the advent of large-scale integration and new compiling technology reawakened interest in declarative languages and brought about a series of new “second generation” declarative languages, such as Prolog [12] and Miranda [44]. Lucid itself was one of these second generation declarative languages. Lucid is based not so much on classical logical systems as on the possible-worlds approach to intensional logic, itself a relatively new branch of logic [43] which reached maturity during the period (1965-75) in which declarative programming languages were in eclipse.
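The intensional view of a Lucid stream, an expression whose value depends on a hidden integer index, can be sketched in Python by representing a stream as a function from a time index to a value. This is an illustrative sketch, not from the chapter; `const`, `fby`, `nxt`, and `add` are hypothetical helpers mimicking Lucid's operators.

```python
# A stream is an intension: a function from a time index t to a value.
def const(c):
    return lambda t: c

def fby(x, y):
    # Lucid's "followed by": the first value of x, then the stream y.
    return lambda t: x(0) if t == 0 else y(t - 1)

def nxt(x):
    # Lucid's "next": shift the stream one step to the left.
    return lambda t: x(t + 1)

def add(x, y):
    return lambda t: x(t) + y(t)

# Lucid:  nat = 0 fby nat + 1
def nat(t):
    return fby(const(0), lambda s: nat(s) + 1)(t)

# Lucid:  fib = 0 fby (1 fby (fib + next fib))
def fib(t):
    return fby(const(0), fby(const(1), add(fib, nxt(fib))))(t)

print([nat(t) for t in range(5)])   # [0, 1, 2, 3, 4]
print([fib(t) for t in range(8)])   # [0, 1, 1, 2, 3, 5, 8, 13]
```

The point of the sketch is only the shift in perspective: a variable denotes the whole history of its values, and operators like `fby` manipulate the hidden index rather than any explicit state.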

