The Future

Author(s):  
Lance Fortnow

This chapter explores some of today's great challenges of computing, including parallel computation, dealing with big data, and the networking of everything. The chapter then argues that P versus NP goes well beyond a simple mathematical puzzle. The P versus NP problem is a way of thinking, a way to classify computational problems by their inherent difficulty. P versus NP also brings communities together. There are NP-complete problems in physics, biology, economics, and many other fields. Physicists and economists work on very different problems, yet those problems share enough common structure that each community can benefit greatly from the other's tools and techniques. Tools developed to find the ground state of a physical system can help find equilibrium behavior in a complex economic environment. Ultimately, the inherent difficulty of NP problems leads to new technologies.

Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-14
Author(s):  
David Orellana-Martín ◽  
Luis Valencia-Cabrera ◽  
Bosheng Song ◽  
Linqiang Pan ◽  
Mario J. Pérez-Jiménez

Over the last few years, a new methodology to address the P versus NP problem has been developed, based on searching for borderlines between non-efficient computing models (those that can solve in polynomial time only problems in class P) and presumably efficient ones (those able to solve NP-complete problems in polynomial time). These borderlines can be seen as frontiers of efficiency, which are crucial in this methodology. "Translating," in some sense, an efficient solution in a presumably efficient model into an efficient solution in a non-efficient model would give an affirmative answer to the P versus NP problem. In the framework of Membrane Computing, the key to this approach is to detect the syntactic or semantic ingredients needed to pass from a non-efficient class of membrane systems to a presumably efficient one. This paper deals with tissue P systems with communication rules of the symport/antiport type that allow the evolution of the objects triggering the rules. In previous works, frontiers of efficiency were found in these kinds of membrane systems, both with division rules and with separation rules. However, since those frontiers were not optimal, it is worthwhile to refine them. In this work, optimal frontiers of efficiency are obtained in terms of the total number of objects involved in the communication rules used by these membrane systems. Such optimized frontiers could be easier to translate, if possible, into efficient solutions in a non-efficient model.


Symmetry ◽  
2019 ◽  
Vol 11 (11) ◽  
pp. 1412
Author(s):  
Hao ◽  
Liu

The Boolean propositional satisfiability (SAT) problem is one of the most widely studied NP-complete problems and plays an outstanding role in many domains. Membrane computing is a branch of natural computing that has been shown to solve NP problems in polynomial time through a parallel computing mode. This paper proposes a new algorithm for the SAT problem that combines the traditional membrane-computing algorithm for SAT with a classic simplification rule, the splitting rule, which divides a clause set into two axisymmetric subsets, processes them separately and simultaneously, and recovers the solution of the original clause set from the symmetry of their solutions. The new algorithm is shown to reduce space complexity by repeatedly distributing clauses with the splitting rule, and to reduce both time and space complexity by applying the one-literal rule and the pure-literal rule as many times as possible.
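The three simplification rules named in this abstract can be sketched outside the membrane-computing framework as plain set operations on CNF clauses. The following is a hedged illustration of the classical rules (one-literal, pure-literal, splitting), not the paper's parallel algorithm; the clause representation as frozensets of signed integers is an assumption for the sketch.

```python
# CNF clauses as frozensets of integer literals: positive = variable,
# negative = its negation. A formula is a set of such clauses.

def one_literal(clauses):
    """One-literal (unit) rule: a single-literal clause {l} forces l true,
    so drop clauses containing l and delete -l from the rest."""
    while True:
        unit = next((next(iter(c)) for c in clauses if len(c) == 1), None)
        if unit is None:
            return clauses
        clauses = {frozenset(c - {-unit}) for c in clauses if unit not in c}

def pure_literal(clauses):
    """Pure-literal rule: a literal whose negation never occurs can be set
    true, so every clause containing it is removed."""
    lits = {l for c in clauses for l in c}
    pure = {l for l in lits if -l not in lits}
    return {c for c in clauses if not (c & pure)}

def split(clauses, var):
    """Splitting rule: branch on var, yielding the two simplified subsets
    that the membrane-computing variant would process simultaneously."""
    pos = {frozenset(c - {-var}) for c in clauses if var not in c}
    neg = {frozenset(c - {var}) for c in clauses if -var not in c}
    return pos, neg

def sat(clauses):
    """Decide satisfiability by simplifying, then splitting recursively."""
    clauses = pure_literal(one_literal(clauses))
    if not clauses:
        return True                 # every clause satisfied
    if frozenset() in clauses:
        return False                # empty clause: contradiction
    var = abs(next(iter(next(iter(clauses)))))
    pos, neg = split(clauses, var)
    return sat(pos) or sat(neg)
```

For example, `sat({frozenset({1, 2}), frozenset({-1, 2}), frozenset({-2, 3}), frozenset({-3})})` returns `False`: unit propagation on the clause `{-3}` cascades to an empty clause.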


Author(s):  
Abhay Kumar Bhadani ◽  
Dhanya Jothimani

With the advent of Internet of Things (IoT) and Web 2.0 technologies, there has been tremendous growth in the amount of data generated. This chapter emphasizes the need for big data and surveys the technological advancements, tools, and techniques being used to process it. Technological improvements and the limitations of existing storage techniques are also presented. Since traditional technologies like Relational Database Management Systems (RDBMS) have limitations in handling big data, new technologies have been developed to manage it and to derive useful insights. This chapter presents an overview of big data analytics and its applications, advantages, and limitations. A few research issues and future directions are also presented.


Author(s):  
Alasdair Urquhart

The theory of computational complexity is concerned with estimating the resources a computer needs to solve a given problem. The basic resources are time (number of steps executed) and space (amount of memory used). There are problems in logic, algebra and combinatorial games that are solvable in principle by a computer, but computationally intractable because the resources required by relatively small instances are practically infeasible. The theory of NP-completeness concerns a common type of problem in which a solution is easy to check but may be hard to find. Such problems belong to the class NP; the hardest ones of this type are the NP-complete problems. The problem of determining whether a formula of propositional logic is satisfiable or not is NP-complete. The class of problems with feasible solutions is commonly identified with the class P of problems solvable in polynomial time. Assuming this identification, the conjecture that some NP problems require infeasibly long times for their solution is equivalent to the conjecture that P≠NP. Although the conjecture remains open, it is widely believed that NP-complete problems are computationally intractable.
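The "easy to check but may be hard to find" property described above can be made concrete with satisfiability itself: verifying that a proposed truth assignment satisfies a propositional formula takes a single linear pass over the formula, even though no polynomial-time procedure for finding such an assignment is known. A small sketch (the clause encoding is an assumption, not from the text):

```python
def verify(clauses, assignment):
    """Polynomial-time certificate check: confirm that every clause
    contains at least one literal made true by the assignment.
    Literals are signed integers; assignment maps variable -> bool."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )

# (x1 or not x2) and (x2 or x3)
formula = [[1, -2], [2, 3]]
certificate = {1: True, 2: False, 3: True}
```

Here `verify(formula, certificate)` returns `True` in time linear in the formula size; finding `certificate` in the first place is the part conjectured to be intractable.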


2000 ◽  
Vol 3 ◽  
pp. 86-95 ◽  
Author(s):  
Bernd Borchert ◽  
Lane A. Hemaspaandra ◽  
Jörg Rothe

One way of suggesting that an NP problem may not be NP-complete is to show that it is in the promise class UP. We propose an analogous new method—weaker in strength of evidence but more broadly applicable—for suggesting that concrete NP problems are not NP-complete. In particular, we introduce the promise class EP, the subclass of NP consisting of those languages accepted by NP machines that, whenever they accept, have a number of accepting paths that is a power of two. We show that FewP, bounded-ambiguity polynomial time (which contains UP), is contained in EP. The class EP applies as an upper bound to some concrete problems for which previous approaches have never succeeded, for example the negation equivalence problem for OBDDs (ordered binary decision diagrams).


Author(s):  
Lance Fortnow

This chapter looks at some of the hardest problems in NP. Most of the NP problems people considered in the mid-1970s either turned out to be NP-complete or were given efficient algorithms placing them in P. However, some NP problems refused to be so nicely and quickly characterized. Some would be settled years later, and others remain open. These problems include graph isomorphism, one of the few problems whose difficulty seems to lie somewhere above P yet below that of NP-complete problems like Hamiltonian paths and max-cut. Others include primality and factoring, and linear programming. The linear programming problem has good algorithms in theory and in practice—they just happen to be two very different algorithms.
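The intermediate status of graph isomorphism can be illustrated with the naive decision procedure: trying every relabelling of the vertices is easy to state but takes factorial time, and no polynomial-time algorithm is known. A hedged sketch, not from the chapter itself (the adjacency representation is an assumption):

```python
from itertools import permutations

def isomorphic(edges_a, edges_b, n):
    """Brute-force graph isomorphism test on undirected graphs with
    vertices 0..n-1: try every vertex relabelling of A and check whether
    it maps A's edge set exactly onto B's. Runs in O(n! * |E|) time."""
    ea = {frozenset(e) for e in edges_a}
    eb = {frozenset(e) for e in edges_b}
    if len(ea) != len(eb):
        return False
    for perm in permutations(range(n)):
        if {frozenset((perm[u], perm[v])) for u, v in ea} == eb:
            return True
    return False
```

For example, a path on four vertices is isomorphic to any relabelled path but not to a star, even though both have three edges; the brute-force search settles each case only by exhausting permutations.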


2018 ◽  
Vol 27 (5) ◽  
pp. 808-828 ◽  
Author(s):  
LEONID A. LEVIN ◽  
RAMARATHNAM VENKATESAN

NP-complete problems should be hard on some instances but those may be extremely rare. On generic instances many such problems, especially related to random graphs, have been proved to be easy. We show the intractability of random instances of a graph colouring problem: this graph problem is hard on average unless all NP problems under all samplable (i.e. generatable in polynomial time) distributions are easy. Worst case reductions use special gadgets and typically map instances into a negligible fraction of possible outputs. Ours must output nearly random graphs and avoid any super-polynomial distortion of probabilities. This poses significant technical difficulties.


Author(s):  
Manbir Sandhu ◽  
Purnima ◽  
Anuradha Saini

Big data is a fast-growing technology with the scope to mine huge amounts of data for use in various analytic applications. With large amounts of data streaming in from a myriad of sources (social media, online transactions, and the ubiquity of smart devices), Big Data is garnering attention across stakeholders in academics, banking, government, health care, manufacturing, and retail. Big Data refers to an enormous amount of data generated from disparate sources, along with the data analytic techniques used to examine this voluminous data for predictive trends and patterns, to exploit new growth opportunities, to gain insight, to make informed decisions, and to optimize processes. Data-driven decision making is the essence of business establishments. The explosive growth of data is steering business units to tap the potential of Big Data to fuel growth and gain a cutting edge over their competitors. The overwhelming generation of data brings with it its share of concerns. This paper discusses the concept of Big Data, its characteristics, the tools and techniques organizations deploy to harness its power, and the daunting issues that hinder the adoption of Business Intelligence in Big Data strategies.


2019 ◽  
Vol 10 (4) ◽  
pp. 106
Author(s):  
Bader A. Alyoubi

Big Data is rapidly gaining popularity in the e-commerce sector across the globe. There is a general consensus among experts that Saudi organisations are late in adopting new technologies. It is generally believed that the lack of research into the latest technologies specific to Saudi Arabia, which is culturally, socially, and economically different from the West, is one of the key factors behind the delay in technology adoption there. Hence, to fill this gap to a certain extent and create awareness about Big Data technology, the primary goal of this research was to identify the impact of Big Data on e-commerce organisations in Saudi Arabia. The Internet has changed the business environment of Saudi Arabia too, and e-commerce is set to achieve new heights thanks to the latest technological advancements. A qualitative research approach was used, with interviews conducted with highly experienced professionals to gather primary data. Using multiple sources of evidence, this research found that traditional databases are not capable of handling massive data, and that Big Data is a promising technology that e-commerce companies in Saudi Arabia can adopt. Big Data's predictive analytics will certainly help e-commerce companies gain better insight into consumer behaviour and thus offer customised products and services. The key finding of this research is that Big Data has a significant impact on e-commerce organisations in Saudi Arabia across verticals such as customer retention, inventory management, product customisation, and fraud detection.

