Apart from any fair dealing for the purposes of research or private study, or criticism or review, as permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms and licenses issued by the CLA. Enquiries concerning reproduction outside these terms should be sent to the publishers at the undermentioned address:
ISTE Ltd
27-37 St George’s Road
London SW19 4EU
UK
www.iste.co.uk
John Wiley & Sons, Inc.
111 River Street
Hoboken, NJ 07030
USA
www.wiley.com
© ISTE Ltd 2020
The rights of Alain Cappy to be identified as the author of this work have been asserted by him in accordance with the Copyright, Designs and Patents Act 1988.
Library of Congress Control Number: 2019957598
British Library Cataloguing-in-Publication Data
A CIP record for this book is available from the British Library
ISBN 978-1-78630-472-8
I wish to thank my colleagues, Virginie Hoel, Christophe Loyez, François Danneville, Kevin Carpentier and Ilias Sourikopoulos, who have accompanied my work on neuro-inspired information processing. This book would not have been possible without our numerous discussions on this new research theme.
I would also like to thank Marie-Renée Friscourt for her diligent and efficient proofreading of the manuscript, and for her many insightful remarks that helped improve it.
The invention of the transistor in 1947 was undoubtedly the most significant innovation of the 20th century, and our day-to-day lives have come to depend on it entirely. Since that date, which we will come back to later, the world has “gone digital”, with virtually all information processed in binary form by microprocessors.
In order to reach the digital world we know today, several steps were essential, such as the manufacture of the first integrated circuit in 1958. It soon became apparent that integrated circuits enabled the processing not only of analog signals, such as those used in radio, but also of digital signals. Such digital circuits were used in the Apollo XI mission that led humankind to the Moon on July 21, 1969. The astronauts had only very limited computing means at their disposal to achieve this spectacular feat. The flight computer was a machine that we might consider very basic by today’s standards: composed of 2,800 integrated circuits, each comprising two three-input “NOR” gates, with 2,048 words of RAM and 38,000 words of ROM for its programs, it ran at a clock frequency of 80 kHz, weighed no more than 32 kg and consumed 55 W.
The exploit was thus essentially based on “human” or “cortical” information processing: sheer processing power, so often put forward today, is not always the sine qua non of success!
In order to reduce the weight of processing systems, while improving their performance, it is necessary to incorporate a large number of logic gates into the same circuit. In 1971, this integration pathway led to a veritable revolution: the development of the first microprocessor. Since then, digital information processing technologies have witnessed tremendous progress, in terms of both their technical performance and their impact on society.
The world in which we live has become one of a “data deluge”, a term coined to describe the massive growth in the volume of data generated, processed and stored by digital media (audio and video), business transactions, social networks, digital libraries, etc. Every minute, for example, the Internet handles almost 200 million e-mails, 40 million voice messages, 20 million text messages and 500,000 tweets. In 2016, the size of the digital universe, defined as the amount of data created, digitized and stored by human beings, was estimated at 16 ZB (zettabytes), and this figure is predicted to roughly double every two years, reaching 44 ZB in 2020 and 160 ZB in 2025. What a leap in just half a century!
This progression, symbolized by the famous “Moore’s law”, which predicted a doubling of microprocessor power every 18 months, occurred at constant price: a modern microprocessor costs much the same as the 1971 microprocessor, even though its performance has improved by more than five orders of magnitude.
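As a purely illustrative aside, not taken from the original text, this exponential trend can be written with an assumed doubling time $T_2$:
\[
P(t) = P_0 \, 2^{(t - t_0)/T_2}, \qquad T_2 \approx 1.5 \ \text{years}.
\]
Under this simple model, an improvement of five orders of magnitude corresponds to $\log_2(10^5) \approx 16.6$ doublings, i.e. roughly 25 years at the historical pace quoted above.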
This remarkable evolution was only made possible by the existence of a universal model of information processing machines, the Turing machine, and a technology capable of physically implementing these machines, that of semiconductor devices. More specifically, the “binary coding/Von Neumann architecture/CMOS technology” triplet has been the dominant model of information processing systems since the early 1970s.
Yet two limits have now been reached: that of miniaturization, with devices measuring no more than a few nanometers, and that of dissipated power, with a barrier of the order of 100 watts when the processor is working intensively.
As long as performance improved steadily, the search for new information processing paradigms was never a priority. With the foreseeable saturation of processor performance in the medium term, and with the emergence of new application domains such as connected objects and artificial intelligence, the question of an information processing paradigm offering both (i) high energy efficiency and (ii) performance superior to that of current systems on certain classes of problems is resurfacing as a matter of some urgency.
This book, dedicated to neuro-inspired information processing, reflects these considerations. Its purpose is to offer students and researchers interested in this fascinating topic a general overview of current knowledge and the state of the art, while heightening awareness of the innumerable questions posed and problems that remain unresolved.
The subject matter addressed covers a wide variety of fields, combining neuroscience, information technology, semiconductor physics and circuit design, as well as mathematics and information theory.
To enable readers to progress through this book uninterrupted, they are regularly reminded of the basic concepts or referred to the list of reference documents provided. Wherever possible, mathematical models of the phenomena studied are proposed, in order to enable an analysis that, while simplified, offers a quantitative picture of the influence of the various parameters. This aid to thinking, based on analytical formulations, is, we believe, the condition for a sound understanding of the physics of the phenomena involved.
This book is organized into four essentially independent chapters:
– Chapter 1 introduces the basic concepts of electronic information processing, in particular coding, memorization, machine architecture and CMOS technology, which constitutes the hardware support for such processing. As one of the objectives of this book is to expand on the link between information processing and energy consumption, various ways of improving the performance of current systems are presented – particularly neuro-inspired processing, the central topic of this book. A fairly general comparison of the operating principles and performance of a modern microprocessor and of the brain is also presented in this chapter.
– Chapter 2 is dedicated to the known principles of the functioning of the brain, and in particular those of the cerebral cortex, also known as “gray matter”. In this part, the approach is top-down, i.e. the cortex is first looked at from a global, functional perspective before we study its organization into its basic processing units, the cortical columns. An emblematic example, vision and the visual cortex, is also described to illustrate these different functional aspects.