Unconventional computing is a niche of interdisciplinary science, a cross-breeding of computer science, physics, mathematics, chemistry, electronic engineering, biology, materials science and nanotechnology.
This book explores a different, pragmatic approach to algorithmic complexity, rooted in and motivated by the theoretical foundations of algorithmic probability. It examines the relaxation of necessary and sufficient conditions in the pursuit of numerical applicability, with some of these approaches entailing greater risks than others in exchange for greater relevance and applicability.

Established and novel techniques for applying algorithmic (Kolmogorov) complexity currently coexist for the first time, ranging from the dominant ones based on popular statistical lossless compression algorithms (such as LZW) to newer approaches that advance and complement them while posing limitations of their own. Evidence is presented that these different methods complement each other in different regimes, and that, despite their many challenges, some of them are better grounded in, or motivated by, the principles of algorithmic information. The authors propose that the field can make greater contributions to science, causation, scientific discovery, networks and cognition, to mention a few among many fields, instead of remaining either a technical curiosity of purely mathematical interest or a statistical tool collapsed into an application of popular lossless compression algorithms. This book thus goes beyond popular statistical lossless compression and introduces a different methodological approach to dealing with algorithmic complexity.

For example, graph theory and network science are classic subjects in mathematics, widely investigated in the twentieth century, that have transformed research in many fields of science, from economics to medicine. However, it has become increasingly clear that the challenge of analyzing these networks cannot be addressed by tools relying solely on statistical methods; model-driven approaches are needed. Recent advances in network science suggest that algorithmic information theory could play an increasingly important role in breaking the limits imposed by traditional statistical analysis (entropy or statistical compression) in modeling evolving or interacting complex networks. Further progress on this front calls for new techniques for an improved mechanistic understanding of complex systems, and hence for increased interaction between systems science, network theory and algorithmic information theory, to which this book contributes.
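To illustrate the compression-based estimation that the dominant approaches rely on, here is a minimal sketch (not taken from the book; Python's zlib stands in for any statistical lossless compressor such as LZW) in which the compressed length of a string serves as a crude upper-bound proxy for its algorithmic complexity:

    import random
    import zlib

    def compressed_length(s: str) -> int:
        # Length in bytes of the zlib-compressed string: a purely statistical
        # upper-bound proxy for the algorithmic (Kolmogorov) complexity of s.
        return len(zlib.compress(s.encode("utf-8"), 9))

    random.seed(0)
    regular = "01" * 500                                           # highly structured
    irregular = "".join(random.choice("01") for _ in range(1000))  # pseudo-random

    print(compressed_length(regular))    # small: the repetition is captured
    print(compressed_length(irregular))  # much larger, despite equal string length

The book's point, as summarised above, is that such estimates capture statistical regularities only, whereas methods motivated by algorithmic probability aim to go further.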
This book presents state-of-the-art solution methods and applications of stochastic optimal control. It is a collection of extended papers discussed at the traditional Liverpool workshop on controlled stochastic processes, with participants from both the east and the west. New problems are formulated and the progress of ongoing research is reported. Topics covered include theoretical results and numerical methods for Markov and semi-Markov decision processes, optimal stopping of Markov processes, stochastic games, problems with partial information, optimal filtering, robust control, Q-learning, and self-organizing algorithms. Real-life case studies and applications, e.g., queueing systems, forest management, control of water resources, marketing science, and healthcare, are presented. Scientific researchers and postgraduate students interested in stochastic optimal control, as well as practitioners, will find this book an appealing and valuable reference.
Helping readers connect these fields, it appeals to a wide audience, including computer scientists, engineers, mathematicians, physicists, biologists and economists. The book is richly illustrated, and its basic concepts are accessible to readers with basic training in science.
On the other hand, the problem is that even at the sensing stage each unicellular organism can be regarded as a logic gate in which the number of inputs (means of perceiving signals) greatly exceeds the number of outputs (responses it can produce).
This book is a tribute to Kenichi Morita's ideas and achievements in theoretical computer science, reversibility and computationally universal mathematical machines. It offers a unique source of information on universality and reversibility in computation and is an indispensable book for computer scientists, mathematicians, physicists and engineers.

Morita is renowned for his works on two-dimensional language accepting automata, complexity of Turing machines, universality of cellular automata, regular and context-free array grammars, and undecidability. His high-impact works include findings on parallel generation and parsing of array languages by means of reversible automata, construction of a reversible automaton from Fredkin gates, solving a firing squad synchronization problem in reversible cellular automata, self-reproduction in reversible cellular spaces, universal reversible two-counter machines, solution of nondeterministic polynomial (NP) problems in hyperbolic cellular automata, reversible P-systems, a new universal reversible logic element with memory, and reversibility in asynchronous cellular automata.

Kenichi Morita's achievements in reversibility, universality and theory of computation are celebrated in over twenty high-profile contributions from his colleagues, collaborators, students and friends. The theoretical constructs presented in this book are amazing in their diversity and depth of intellectual insight, addressing: queue automata, hyperbolic cellular automata, Abelian invertible automata, number-conserving cellular automata, Brownian circuits, chemical automata, logical gates implemented via glider collisions, computation in swarm networks, picture arrays, universal reversible counter machines, input-position-restricted models of language acceptors, descriptional complexity and persistence of cellular automata, partitioned cellular automata, firing squad synchronization algorithms, reversible asynchronous automata, reversible simulations of ranking trees, Shor's factorization algorithms, and power consumption of cellular automata.
Software implementations include distance-vector algorithms for distributed path computation in dynamic networks, parallel solutions of the constrained shortest path problem, and application of the shortest path solutions in gathering robotic swarms.
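As a hedged illustration of the distance-vector idea mentioned above (a generic, synchronous Bellman-Ford-style relaxation written for this note, not one of the implementations described in the book), every node repeatedly adopts the cheapest "neighbour's estimate plus edge cost" it can see:

    from math import inf

    def distance_vector(nodes, edges, source):
        # edges: dict mapping a directed link (u, v) to its cost.
        dist = {n: inf for n in nodes}
        dist[source] = 0
        for _ in range(len(nodes) - 1):       # at most |V| - 1 rounds to converge
            updated = False
            for (u, v), cost in edges.items():
                if dist[u] + cost < dist[v]:  # relax the edge u -> v
                    dist[v] = dist[u] + cost
                    updated = True
            if not updated:                   # estimates are stable, stop early
                break
        return dist

    nodes = ["a", "b", "c", "d"]
    edges = {("a", "b"): 1, ("b", "c"): 2, ("a", "c"): 5, ("c", "d"): 1}
    print(distance_vector(nodes, edges, "a"))  # {'a': 0, 'b': 1, 'c': 3, 'd': 4}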
This book introduces characteristic features of the protein structure prediction (PSP) problem. It focuses on systematic selection and improvement of the most appropriate metaheuristic algorithm to solve the problem based on a fitness landscape analysis, rather than on the nature of the problem, which was the focus of past methodologies. Protein structure prediction is concerned with the question of how to determine the three-dimensional structure of a protein from its primary sequence. Recently a number of successful metaheuristic algorithms have been developed to determine the native structure, which plays an important role in medicine, drug design, and disease prediction. This interdisciplinary book consolidates the concepts most relevant to PSP through global non-convex optimization. It is intended for graduate students from fields such as computer science, engineering, and bioinformatics, and serves as a reference for researchers and practitioners.
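As a purely illustrative, hedged sketch of the kind of metaheuristic search the book builds on (a generic stochastic hill climber on a made-up one-dimensional fitness function, not any algorithm from the book), the search perturbs a candidate solution and keeps the perturbation only when fitness improves:

    import math
    import random

    def fitness(x):
        # A made-up rugged one-dimensional landscape, standing in for the
        # (far higher-dimensional) energy function of a protein conformation.
        return -(x ** 2) + 3 * math.sin(5 * x)

    def hill_climb(steps=1000, step_size=0.1, seed=42):
        random.seed(seed)
        x = random.uniform(-3, 3)
        best = fitness(x)
        for _ in range(steps):
            candidate = x + random.gauss(0, step_size)  # local perturbation
            f = fitness(candidate)
            if f > best:                                # accept improvements only
                x, best = candidate, f
        return x, best

    print(hill_climb())

Fitness landscape analysis, in the spirit of the book, asks how rugged such a landscape is in order to select the metaheuristic best suited to it.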
This book is dedicated to Professor Selim G. Akl to honour his groundbreaking research achievements in computer science over four decades. The book is an intellectually stimulating excursion into emergent computing paradigms, architectures and implementations. World-leading experts in computer science, engineering and mathematics overview exciting and intriguing topics: algorithms for generating musical rhythms, the computational power of random walks, dispelling a myth of computational universality, computability and complexity at the microscopic level of synchronous computation, descriptional complexity of error detection, quantum cryptography, context-free parallel communicating grammar systems, fault tolerance of hypercubes, finite automata theory of bulk-synchronous parallel computing, dealing with silent data corruption in high-performance computing, parallel sorting on graphics processing units, mining for functional dependencies in relational databases, cellular automata optimisation of wireless sensor networks, connectivity-preserving network transformers, constrained resource networks, vague computing, parallel evolutionary optimisation, emergent behaviour in multi-agent systems, vehicular clouds, epigenetic drug discovery, dimensionality reduction for intrusion detection systems, physical maze solvers, computer chess, parallel algorithms for string alignment, and detection of community structure. The book is a unique combination of vibrant essays that will inspire scientists and engineers to exploit natural phenomena in designing the computing architectures of the future.
This treatise in unconventional computing appeals to readers from all walks of life, from high-school pupils to university professors, and from mathematicians, computer scientists and engineers to chemists and biologists.
This book explores Probabilistic Cellular Automata (PCA) from the perspectives of statistical mechanics, probability theory, computational biology and computer science.
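As a rough, hedged illustration of the object of study (a toy one-dimensional example written for this note, not a model from the book), a probabilistic cellular automaton updates every cell synchronously, but each cell follows its local rule only with some probability:

    import random

    def pca_step(cells, p=0.8):
        # One synchronous update of a toy 1-D probabilistic cellular automaton:
        # with probability p a cell adopts the majority of its 3-cell
        # neighbourhood (periodic boundary), otherwise it keeps its state.
        n = len(cells)
        new = []
        for i in range(n):
            total = cells[i - 1] + cells[i] + cells[(i + 1) % n]
            majority = 1 if total >= 2 else 0
            new.append(majority if random.random() < p else cells[i])
        return new

    random.seed(1)
    state = [random.randint(0, 1) for _ in range(20)]
    for _ in range(5):
        print("".join(map(str, state)))
        state = pca_step(state)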
This fascinating, colourful book offers in-depth insights and first-hand working experiences in the production of art works, using simple computational models with rich morphological behaviour, at the edge of mathematics, computer science, physics and biology.
This book is devoted to the slime mould Physarum polycephalum, a large single cell capable of distributed sensing, concurrent information processing, parallel computation and decentralized actuation. The ease of culturing and experimenting with Physarum makes this slime mould an ideal substrate for real-world implementations of unconventional sensing and computing devices.

The book is a treatise of theoretical and experimental laboratory studies on the sensing and computing properties of slime mould, and on the development of mathematical and logical theories of Physarum behavior. It shows how to make logical gates and circuits and electronic devices (memristors, diodes, transistors, wires, chemical and tactile sensors) with the slime mould. The book demonstrates how to modify the properties of Physarum computing circuits with functional nanoparticles and polymers, to interface the slime mould with field-programmable arrays, and to use Physarum as a controller of microbial fuel cells. A unique multi-agent model of slime mould is shown to serve well as a software slime mould capable of solving problems of computational geometry and graph optimization. The multi-agent model is complemented by cellular automata models with parallel accelerations. The mathematical models inspired by Physarum include a non-quantum implementation of Shor's factorization, structural learning, computation of shortest path trees on dynamic graphs, supply chain network design, p-adic computing and syllogistic reasoning.

The book is a unique composition of vibrant and lavishly illustrated essays which will inspire scientists, engineers and artists to exploit natural phenomena in designs of future and emergent computing and sensing devices. It is a 'bible' of experimental computing with spatially extended living substrates: it spans topics from the biology of slime mould, to bio-sensing, to unconventional computing devices and robotics, non-classical logics, and music and arts.
High-density memristive data storage, combined with memristive circuit-design paradigms and computational tools applied to solve NP-hard artificial intelligence problems, as well as memristive arithmetic-logic units, certainly paves the way for a very promising memristive era in future electronic systems.
The book examines and models the behaviour of a simple organism capable of remarkable biological and computational feats that seem to transcend its simple component parts.
This book examines some of the major contributions of Stephen Wolfram's best-selling classic, A New Kind of Science, ten years after its publication.
More precisely, readers will encounter the structural complexity of vortex flows, the use of chaotic dynamics within evolutionary algorithms, complexity in synthetic biology, the types of complexity hidden inside evolutionary dynamics and possible methods of controlling them, the complexity of rugged landscapes, and more.