PC grammar systems versus some non-context-free constructions from natural and artificial languages

Author(s): Adrian Chiţu

2016 ◽ Vol 8 (2) ◽ pp. 113-170
Author(s): Mary Sarah Ruth Wilkin ◽ Stefan D. Bruda

Parallel Communicating Grammar Systems (PCGS) were introduced as a language-theoretic treatment of concurrent systems. A PCGS extends the concept of a grammar to a structure consisting of several grammars that work in parallel, communicate with each other, and so contribute to the generation of strings. PCGS are usually more powerful than a single grammar of the same type; PCGS with context-free components (CF-PCGS) in particular were shown to be Turing complete. However, this result only holds when a specific type of communication (which we call broadcast communication, as opposed to one-step communication) is used. We expand the original construction that established Turing completeness so that broadcast communication is eliminated, at the expense of introducing a significant number of additional helper component grammars. We thus show that CF-PCGS with one-step communication are also Turing complete. In the process we introduce several techniques that may be usable in other constructions and may be capable of removing broadcast communication in general.
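
To make the communication idea concrete, the following Python sketch simulates a single communication step of a returning PCGS over a toy symbol encoding. The encoding, the `communicate` helper, and the way the broadcast/one-step distinction is modeled are illustrative assumptions, not the construction used in the paper.

```python
# Minimal sketch of one communication step in a returning PCGS.  Components are
# numbered 0..n-1, each holding a sentential form (a list of symbols); a query
# symbol is encoded as the pair ("Q", j), asking for component j's string.
# broadcast=True: every pending query is served in the same step and each queried
# component is then reset to its axiom; broadcast=False models a "one-step"
# variant in which only a single query is satisfied per step (an assumption about
# the terminology).  Nested or circular queries are ignored for simplicity.

def communicate(forms, axioms, broadcast=True):
    """Return the sentential forms after one communication step."""
    forms = [list(f) for f in forms]
    # Pending queries as (querier index, position, queried component index).
    queries = [(i, p, sym[1])
               for i, f in enumerate(forms)
               for p, sym in enumerate(f)
               if isinstance(sym, tuple) and sym[0] == "Q"]
    if not broadcast:
        queries = queries[:1]                      # serve a single query only
    served = set()
    # Splice right-to-left within each form so earlier positions stay valid.
    for i, p, j in sorted(queries, key=lambda q: (q[0], -q[1])):
        forms[i][p:p + 1] = forms[j]
        served.add(j)
    for j in served:                               # returning mode: reset to axiom
        forms[j] = list(axioms[j])
    return forms

# Component 1 queries component 2, whose current string is "ab".
axioms = [["S0"], ["S1"], ["S2"]]
forms = [["S0"], ["x", ("Q", 2), "y"], ["a", "b"]]
print(communicate(forms, axioms))   # [['S0'], ['x', 'a', 'b', 'y'], ['S2']]
```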


2011 ◽ Vol 22 (01) ◽ pp. 203-212
Author(s): Erzsébet Csuhaj-Varjú ◽ Marion Oswald ◽ György Vaszil

We introduce PC grammar systems in which the components form clusters and query symbols refer to clusters rather than to individual grammars, i.e., the addressee of a query is not precisely identified. We prove that if the same component replies to all queries issued to a cluster within a rewriting step, then non-returning PC grammar systems with 3 clusters and 7 context-free components can generate any recursively enumerable language. We also present open problems and directions for future research.
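
A minimal sketch of how cluster-addressed queries might be resolved, assuming only the informal description above: one responder is chosen per cluster for the whole step and answers every query aimed at that cluster, and, being non-returning, keeps its own string. The data layout and the `resolve_cluster_queries` helper are hypothetical.

```python
import random

# Hypothetical sketch of cluster-addressed queries in a non-returning setting:
# a query symbol ("Q", c) names a cluster c, and one component of that cluster
# -- the same one for every query aimed at c within the step -- supplies its
# current string.

def resolve_cluster_queries(forms, clusters, rng=random):
    """Replace each ("Q", c) with the string of the chosen responder of cluster c."""
    responder = {c: rng.choice(sorted(members))    # one responder per cluster/step
                 for c, members in clusters.items()}
    resolved = []
    for f in forms:
        new = []
        for sym in f:
            if isinstance(sym, tuple) and sym[0] == "Q":
                new.extend(forms[responder[sym[1]]])   # splice the reply in place
            else:
                new.append(sym)
        resolved.append(new)
    return resolved

clusters = {"A": {1, 2}}           # cluster "A" groups components 1 and 2
forms = [["x", ("Q", "A"), ("Q", "A")], ["a"], ["b"]]
print(resolve_cluster_queries(forms, clusters))   # both queries get the same reply
```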


2013 ◽ Vol 47 (2) ◽ pp. 59-67
Author(s): V. V. Gribova ◽ A. S. Kleschev ◽ D. A. Krylov

1987 ◽ Vol 16 (222)
Author(s): Kurt Nørmark

A syntax-directed editing environment intended for the development of artificial languages, e.g. programming languages, specification languages, and grammar definition languages, is presented. Various applications of a simple syntactic transformation facility are central to the work. It is described how most syntax-directed editing operations can be implemented and understood as transformations. It is furthermore demonstrated how documents, which are represented as abstract syntax trees, can be kept consistent with a grammar that is under development. A multi-formalism transformation technique is also described. Abstract presentation of documents on a screen is another central topic: two simple presentation formalisms that allow documents to be shown as trees and graphs are proposed. As a basis for the whole work, a new formalism for the description of context-free languages has been worked out, based on generalization/specialization hierarchies of syntax domains.
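
As an illustration of the editing-as-transformation idea (not of Nørmark's actual system), the sketch below represents a document as an abstract syntax tree whose nodes are tagged with syntax domains and applies one syntactic transformation bottom-up; the `Node` type and the placeholder-expansion rule are invented for the example.

```python
from dataclasses import dataclass, field

# Illustrative only: a document as an abstract syntax tree tagged with syntax
# domains, and one editing operation expressed as a bottom-up transformation.

@dataclass
class Node:
    domain: str                        # syntax domain, e.g. "Stmt" or "Expr"
    children: list = field(default_factory=list)
    text: str = ""                     # leaf content, if any

def transform(node, rule):
    """Rebuild the tree bottom-up, letting `rule` replace nodes (None = keep)."""
    rebuilt = Node(node.domain,
                   [transform(c, rule) for c in node.children],
                   node.text)
    return rule(rebuilt) or rebuilt

def expand_hole(node):
    """Editing operation: turn an expression placeholder into a concrete addition."""
    if node.domain == "Expr" and not node.children and node.text == "<hole>":
        return Node("Add", [Node("Num", text="0"), Node("Num", text="1")])
    return None

doc = Node("Stmt", [Node("Expr", text="<hole>")])
print(transform(doc, expand_hole))
```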


1997 ◽ Vol 08 (01) ◽ pp. 67-80
Author(s): Sorina Dumitrescu ◽ Gheorghe Păun ◽ Arto Salomaa

We compare the power of two (fairly different) recently investigated language-identifying devices: patterns and parallel communicating (PC) grammar systems. The simulation of multi-patterns by context-free PC grammar systems is rather obvious, but, unexpectedly, it can also be realized by (non-centralized) PC grammar systems with right-linear components. Moreover, infinite multi-patterns forming a regular set can also be simulated by PC grammar systems with right-linear components, whereas PC grammar systems with context-free components can simulate context-free multi-patterns with context-free domains for variables.
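
For readers unfamiliar with patterns: a pattern is a string of terminals and variables, and its language consists of all words obtained by substituting the same (here nonempty) string for every occurrence of a variable; a multi-pattern takes the union over a set of patterns. The membership sketch below assumes this non-erasing convention and uses a naive backtracking search; it is background illustration, not part of the paper's constructions.

```python
# Pattern-language membership by naive backtracking.  A pattern is a list whose
# items are terminal characters (lowercase) or variable names (uppercase); every
# occurrence of a variable must be replaced by the same nonempty string (the
# non-erasing convention).  A multi-pattern would be checked by taking the
# disjunction over a set of such patterns.

def matches(pattern, word, binding=None):
    binding = dict(binding or {})              # local copy: bindings may backtrack
    if not pattern:
        return word == ""
    head, rest = pattern[0], pattern[1:]
    if head.islower():                         # terminal symbol
        return word.startswith(head) and matches(rest, word[len(head):], binding)
    if head in binding:                        # variable already bound
        value = binding[head]
        return word.startswith(value) and matches(rest, word[len(value):], binding)
    for k in range(1, len(word) + 1):          # try every nonempty value for head
        binding[head] = word[:k]
        if matches(rest, word[k:], binding):
            return True
    return False

# The pattern X a X (variable X, terminal 'a') generates exactly the words w a w.
print(matches(["X", "a", "X"], "bbabb"))   # True  (X = "bb")
print(matches(["X", "a", "X"], "bbab"))    # False
```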


2007 ◽ Vol 18 (06) ◽ pp. 1313-1322
Author(s): Andreas Malcher ◽ Bettina Sunckel

A generalization of centralized and returning parallel communicating grammar systems with linear components (linear CPC grammar systems) is studied. It is known that linear CPC grammar systems are more powerful than regular CPC grammar systems, and that CPC grammar systems with context-free components are more powerful than linear CPC grammar systems. Here, the intermediate model of metalinear CPC grammar systems is studied: a CPC grammar system in which the master is allowed to use metalinear rules whereas the remaining components are restricted to linear rules. It turns out that metalinear CPC grammar systems are more powerful than linear CPC grammar systems and less powerful than CPC grammar systems with context-free components. Furthermore, it is shown that all languages generated by metalinear CPC grammar systems are semilinear.
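
The rule shapes involved can be stated concretely. Assuming the standard definitions (linear: at most one nonterminal on the right-hand side; metalinear: start rules of bounded width S → A1 … Ak over nonterminals other than S, with all remaining rules linear), the sketch below classifies rules accordingly; the helper names are illustrative, not taken from the paper.

```python
# Assumed standard rule shapes: a rule is linear if its right-hand side contains
# at most one nonterminal; a metalinear master may additionally use start rules
# S -> A1 ... Ak of bounded width k, where each Ai is a nonterminal different
# from S, while all other rules stay linear.

def is_linear(rhs, nonterminals):
    """At most one nonterminal occurrence on the right-hand side."""
    return sum(sym in nonterminals for sym in rhs) <= 1

def is_metalinear_start(lhs, rhs, start, nonterminals, width):
    """Start rule of bounded width whose right-hand side avoids the start symbol."""
    return (lhs == start
            and 1 <= len(rhs) <= width
            and all(sym in nonterminals and sym != start for sym in rhs))

N = {"S", "A", "B"}
print(is_linear(["a", "A", "b"], N))                    # True: one nonterminal
print(is_linear(["A", "B"], N))                         # False: two nonterminals
print(is_metalinear_start("S", ["A", "B"], "S", N, 3))  # True: width <= 3
```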

