2014-03-24

Digital dependence/addiction


Admittedly, the irresistible pull we feel toward these digital links seems new. But it merely reflects the newborn’s experience within the parental circle and the lasting organic and psychic nostalgia we have felt unconsciously since the separation of birth, when the umbilical cord was cut. Each of us feels the desire to be plugged back in, to the point where this connection evokes the umbilical cord linking the fetus to the placenta. We will call it the “digital umbilical cord.” The web then becomes an ersatz of the maternal body. Personal growth and physical and psychic satisfaction pass through it. The organic metaphor of nature also holds for the human community, to which we feel this security-seeking need to belong, and from which we cannot bear to be excluded. The success of social networks amplifies the importance of this imaginary. The hyperlinks we invoke metaphorically when speaking of navigating the web are, admittedly, point-to-point electronic links across the networks, but they are also affective links, for they too partake of this psychic need, this unquenchable thirst for a solidarity that is organic and not merely mechanical, to use Durkheim’s distinction, which we feel as isolated social atoms lost in the mass.

The elemental myth of lost unity is decisive in the image of the world that every child creates. It endures and still gives rise, in the adult, to powerful compensatory representations that will shape his fundamental behaviours and desires.

2014-03-23

The human myth


With the emergence of the digital age, the human myth becomes a mutant myth, that of CyberPrometheus. It excites man’s instinct for power – man himself becomes a god and takes the place of the old monotheistic refrain – and it invites him to create hyperhumanism, a variation of the new human myth centred on the promotion of a planetary ethics.

2014-03-14

The law of divergence: a definition


The law of divergence stands in opposition to the Darwinian law of adaptation and natural selection, which in its time had the merit of freeing us from creationism, but which falls far short of explaining evolution. Divergence is what occurs and could not have been foreseen, except in hindsight. Divergence escapes linear inductive or deductive thinking and even Hegelian dialectics; it belongs to arabesque thinking. Divergence is the opposite of repetition: it is creation. In human history it has often depended on individual geniuses who rebelled against the ideas of the majority. They were most often marginalized, even martyred, before being celebrated after the fact. The Darwinian law of adaptation cannot explain the evolution of the human species, whose defining stages have been ruptures, leaps, mutations, projects that each time propelled it in new directions nobody had foreseen. This is obvious in the history of art, of ideas, of science. But nature itself also evolves through leaps in every direction, ruptures, the creation of new and unexpected genetic configurations. It tries out every biological scenario. Nature is creative and adventurous. The Darwinian law explains only the anecdotal, the details. It remains too linear and mechanistic to grasp the essential: nature itself evolves through mutations and divergences. Divergence makes its way toward the unprecedented, the unknown, and always involves taking a risk, possibly a fatal one, which is the very opposite of adaptation. Ethics, for example, is an invention of the human species; it is the protection of the weakest; it stands at the opposite pole from the Darwinian law of the jungle and is nowhere to be found in the state of nature. It contradicts in every respect the law of competition and adaptation.
With the emergence of the digital age, we have the privilege of experiencing in real time a formidable divergence in our human evolution.

2014-03-05

Singularity University: an inevitably disappointing concept


Without rehashing the worn-out debates on chance and necessity, or holding forth on the entirely legitimate and pertinent models studied by specialists in catastrophes, or on the risk calculations at which insurance companies have become all-category champions, or recalling the intellectual vertigo over complexity at the Santa Fe Institute, I will give more attention to the “wall of the singularity.” “Wall of the future,” wall of the singularity (the word’s English origin no doubt lends it more credibility): for nearly fifteen years now we have been on the eve of the great tipping point of the world. The idea goes back to the 1950s, probably to John von Neumann, whose very name announces the prophecy! The description given of it by Irving John Good in 1965 is often cited; his name, for its part, is reassuring. He invokes the development, not to say the explosion, of artificial intelligence: “Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.” After that, the intelligent machine takes charge of man’s evolution – or rather of his disappearance for lack of usefulness.
We hear about it again and again today. It haunts forward-looking minds as something inescapable. For beyond this “wall” there is no visibility. With our small human brains we lose understanding and control. It is the end of man as we know him, the end of nature and of carbon: we are about to enter the new age of silicon and of wall-to-wall artifice. This great and radical divergence being prophesied to us even gave rise, in 2008, to the founding of a Singularity University, in California of course, financed by deities of the cyberworld and of finance: Google (the world’s second-largest market capitalization in 2014), Nokia, Cisco, Autodesk and NASA. Nothing will ever be the same again: artificial intelligence, taking control of our species, will subject us to algorithms of wisdom and reason; innovation will dominate our activities constantly and everywhere. New human beings will be printed on the new 3D printer.
When we hear talk of this “singularity,” it is not without a certain mischievous pleasure that we can set the record straight. Science fiction cultivated this concept of the singularity to designate the wall of the future, beyond which a radical change will take place that we are incapable of thinking. There lies the great outlet for our boldest futurist minds. But we will have to admit it when we reach the age in which this Divergence of our evolution is supposed to occur: this opaque wall of no return keeps receding before our steps like a rainbow. Ray Kurzweil’s positivist ingenuousness deduced it from Moore’s law, which doubles the capacity of our computers every eighteen months. Its date, inevitably imminent, was first set for 2025, then for 2050, once our prophet realized that evolution is less hurried than the progress of our computers. By the time we get there, he will also have to understand that the singularity is only a screen-word designating our inability to think rationally about the fear or the redemption with which we colour our future. In mathematical terms, this concept of singularity has for more than half a century designated a limit of our programmatic arabesques, beyond which Alan Turing, Irving John Good or Carl Sagan felt they had to lay down their arms, so far did the complexities of ever more abstract calculations exceed them and end up beyond any real grasp.

From the metaphysical point of view – for that is obviously where this concept belongs – the singularity is merely a fantasy about which one can prophesy without restriction, or a simple commonplace that can be stated plainly as follows: we are incapable of thinking the future beyond the limits of our knowledge. By its very definition, the “singularity” cannot be thought. Divergence cannot be programmed. When we read attentively the program of activities of Singularity University, we are surprised to find, in fact, only announcements of laboratories and seminars by recognized experts in medicine, economics or urban planning, who certainly demonstrate excellence and a spirit of innovation, but who could not possibly diverge from current modes of thought. They develop bold linear deductions and venture nonsense, but not new meanings. One can only be disappointed, but ultimately one should not be surprised, by the banality of the thinking and the prospective research on offer. The very name Singularity University may be a promotional find, but it is a self-contradictory concept. If I take the concept of singularity literally, it ought to be a university conceived and run solely by robots combining artificial intelligences that the human brain would be incapable of sharing. This admission of modesty marks the limits of our deductions and our predictions. And we observe, as in science-fiction films, that our futurist imagination most often pours itself into archaism. The figures of cyborgs evoke Hercules and the Titans. Good and Evil share an elementary universe. All these technical innovations come out of a children’s bag of party favours: streamers, candies, soap-bubble wands, magic wands, flint stones, confetti, glitter and stars of every colour. Anything but a divergence. And it is far from certain that beyond the wall of the singularity there will be a party.



2014-03-04

A digital plan for culture in Quebec


Of course, the announcement of initial funding by Québec’s Minister of Culture, Mr. Maka Kotto, for the digital development of Québec culture comes during an election period. But at least this opportunism gives us a first signal that the request of the 13 Étonnés for a digital plan for Québec is being heard. Of course, it is only one part of the Plan, but the Ministry of Culture has no mandate to fund commerce, education and so on.
This announcement comes on the heels of the admission of powerlessness and the denunciation I made during a round table at the Palais des congrès de Montréal a few days ago, when I pointed out that no party was listening to us, not even the P.Q., which had nonetheless shown great leadership on this subject in Mr. Landry’s day.
We must therefore welcome this announcement, even if it is spread over several years and remains very modest. Nor will it have any great knock-on effect; it merely responds to an urgent, basic need. But I hope it can take concrete form after the elections and that a government that is finally in the majority will be able to do much more. For we still have work to do to get the government to assume its digital responsibilities, and therefore to become truly aware of them. I remain hopeful that Minister Jean-François Lisée will play a role in this awareness-raising. Let us hope that we will be able to hold an Estates General on the digital and be heard.

2014-03-03

McLuhan: The Last Great Thinker of the Fire Age


We are here to pay due tribute to Marshall McLuhan. I was among the first to teach his theories at the Sorbonne Paris V at the beginning of the 1970s. He was a remarkable philosopher of technology and communications, particularly of the sociological implications of oral culture, printing and electronic media. But while taking nothing away from that tribute, we need to situate McLuhan in his era and avoid crediting him with things he could not have known. He has been cast in the role of pioneer and visionary of the digital world, which is highly debatable. Curiously, we still have no better a sense than he did of the radical paradigm shift involved in moving from the energy age to the information age. And this is something we should give more thought to, in light of the misunderstandings and misinterpretations it has generated.

I – McLuhan’s thermal metaphors

We need to take another look at the origins of the digital revolution. The technological power of binary code is what makes this revolution so surprising, but its genesis was long and clearly dates back to the invention of the first phonetic alphabets that followed pictograms, well before the Christian era. Of course, only today are we able to understand it, and only provided we challenge McLuhan’s famous theories, in particular in The Gutenberg Galaxy, which came out in 1962 and has misled all of us in our understanding of digital technology today. We admired Marshall McLuhan’s provocative intelligence in his role as a pioneering analyst of the new electric media – telephone, radio, television and audiovisual technologies in general – and therefore of the beginning of the information society we know today. He had the ingenious insight that the media are not bridges between us and nature, but rather a new environment. And many of us, including me, would agree that Marshall McLuhan, the new communication media theorist, the famous author of The Gutenberg Galaxy, the gifted provocateur, was a visionary. But his lexicon shackles him historically. Of course, in the last chapter of Understanding Media, which looks at automation, McLuhan tries to analyze cybernetics as one of the major applications of the new electric age. And yet, the major rupture in Western civilization was not, as he suggests, electricity, which is merely a stage in the fire age, but cybernetics, which rings in a new era of humanity: the digital age. We should even contrast the age of energy (including water, wind, fire, electricity and nuclear energy) with the digital age.

What first grabs your attention is the unexpected opposition – which I call thermal – that McLuhan establishes between hot and cool media. The analogy is hard to sustain and quickly appears arbitrary. His analysis of the effects of the invention of print is insightful, but the word-images, or word-objects, he proposes in Counterblast have more in common with Gutenberg’s molten lead than with the binary code of digital information. And one can’t help but be surprised by the very physical, almost mechanical, character of his idea of the massage: the dramatic bombardment of TV viewers by electrons from the TV screen creates quite an image, but it is not terribly convincing, given how physically mundane the phenomenon actually is. Furthermore, massage also suggests mechanical heat, which creates physical euphoria. We have entered the territory of deep theoretical musings! And this leads him to maintain that the medium is the message, which he sees as a massage. At a time when humanist tradition prevented us from understanding that technology has a significant impact on our cognitive structures, not to mention on our social structures, the idea was a new one. Now we can clearly see that the cell phone and the internet are extremely powerful instruments of socialization, even though the content they carry can be quite banal, as is the case with chatting or young people and their trivial exchanges on cell phones:
 - Where are you?
- Here.
- Me too.
- Talk later.
That said, McLuhan’s assertion was as brilliant as it was simplistic, and it is high time that we once again stress the importance of content, if we are to resist the alienating “massage” of the mass media. On this point, McLuhan was certainly right to get worked up about what has become a failing of our times. We need to recognize the value of content and critical thinking, today more than ever, given the extent of the danger.

In the information age, this famous axiom – the medium is the message – no longer holds true. In other words, digital technology, although it uses electricity, processes and disseminates information and not energy. We are in an information society, which, as the term suggests, attaches a great deal of importance to content and has pushed electricity into the shadows. One must not confuse the two as McLuhan did. Digital technology is an entirely different beast from electricity, and infinitely more powerful. Energy actually destroys information. McLuhan’s theoretical musings belong to the age of fire and of the mechanized and electrical industry. This example shows that it is also at the simple level of language metaphors that we can spot the myths that underpin a broad theoretical narrative, not only in the fascinating figures of mythical epics. And if he were still with us, McLuhan would no doubt be the first to point out, as he often did, that we are interpreting today’s world incorrectly using yesterday’s concepts and content; and that is what he did himself, including in some of his basic premises, which are no longer relevant and have in fact become counterproductive.

II – Theoretical errors arising from McLuhan’s provocative ideas

We should not, however, out of uncritical enthusiasm, invent non-existent ruptures that we find satisfying. McLuhan thought that the electronic media would take us back to a multisensory state similar to that of oral culture before the invention of print. The reign of Gutenberg, or of print, to which the West owes so much, would thus have lasted only a few centuries; it would wind up being a mere parenthesis in our evolution. It is true that we are now calling print newspapers and books into question, making them available online and on e-readers and tablets. And the pulp and paper industry is in decline. We would, supposedly, be moving toward a paperless society. So we need to ask whether these grand ideas of Western modernity, in particular individualism, the critical mind and rationalism, which McLuhan attributes to the development of print, are in turn threatened by the rise of digital technology, with its new culture based on time rather than space and on multimedia rather than the visual, and its event-based, emotive, playful sensibility of rapid consumption rather than sustained effort. Won’t the mass societies of the digital age become much easier to manipulate and prone to a new obscurantism? It’s not out of the question. But before we can judge, we first have to expose some more common errors.

In foretelling the end of the “Gutenberg galaxy” in the name of electricity, McLuhan got it wrong. The digital revolution, on the contrary, ensured its triumph. And today, when we think about the digital revolution through the lens of McLuhan’s theories, we are in the wrong era, committing a twofold error: McLuhan’s and our own. We have all taken digital technology for a new application of electricity, even though it is based on binary code. But in spite of appearances and contrary to what we keep hearing, we should not confuse revolutions. The move from analogue ideographic writing to the phonetic alphabet is what lies behind the evolution toward binary code. We should have seen the digital shock coming.
There are at least five errors that originate with McLuhan that have been mutually reinforcing and have kept us groping in the dark.

Error number one. Contrary to what we hear in this so-called new age of oral culture, we are creating text more than ever. We have never written or read so much. We spend hours every day on websites, writing and reading emails and messages on our phone screens, sending texts, chatting online, reading television news crawls, writing and sending resumés, posting and reading information on social media platforms, producing information online, etc. We create websites, which have to be constantly updated, we write one or more blogs, sometimes daily. We write more on screen than we ever did on paper. Just 15 years ago, we would send the odd letter. We are now prolific in our daily email correspondence. Even young people have become hyperactive, learning by necessity to master a modicum of spelling and clever shorthand codes. We have more than one machine for writing. Estimates place the number of active computer keyboards on the planet in the billions, not to mention keyboards on phones, tablets, e-readers and countless other pieces of equipment and gadgets. 

The volume of voice traffic carried on phone networks has become marginal compared with written data traffic. Voice-activated technology remains the exception rather than the rule. And the computer keyboard is nothing more than an electronic typewriter that replaces and improves upon Gutenberg’s old drawers of movable lead characters by automating their manipulation. Of course, now I can disseminate sounds and movements and combine them with text and images. But in no way does this mimic the oral character of the larynx or the movement of the body, as such claims implicitly suggest; it is instead a remarkable extension of writing. The internet has become a printing shop for music, film and television. The fact that I can download thousands of films and songs is the culmination of reproduction and dissemination, whose golden age Gutenberg inaugurated. And for works in the public domain, downloads are free, instantaneous and often of excellent quality. So thousands of films, including many well-known feature films, have become available in your home for free. The same goes for television. Even better, we have invented the 3D printer, which makes it possible to remotely write, disseminate and reproduce in three dimensions, with extreme precision, any object that can be manufactured, or even a human organ.

Error number two. Web 2.0, chatting, online forums and discussions on social networks are supposedly taking us back to the social interaction and collective rituals of oral culture of old, while reading books fostered individualism. We should not underestimate the impact of multimedia, but its screen-based multisensory virtues are perhaps not as great as originally thought. A lot more has been suggested with less. The abundance of images and sounds often impedes the momentum of the imagination. As for interactivity, its virtues are undeniable when it comes to convenience, but questionable when it comes to cultural creation.

We cannot forget that this networking occurs online, in other words, remotely and generally anonymously. One cannot deny the success of multi-user games, online karaoke, video ping pong and the other sports marketed under brands such as Wii and Kinect. But these electronic rituals remain very limited and cannot be compared with the gestural nature of being in someone’s presence and the visual, olfactory and tactile interactivity of tribal celebrations. Online voodoo may be for another day.

Error number three. We forget that this upheaval started 6000 years ago with the creation of the phonetic alphabet, which succeeded ideographic writing. We must remember that Gutenberg’s genius was not in inventing printing, which already existed, the first known example being the Chinese Diamond Sutra, a book dating back to the 9th century and now held in the British Library. What Gutenberg invented is movable type, which accelerated the power of printing. And contrary to what people say, the binary code of digital technologies is not a rupture with the phonetic alphabet. The phonetic alphabet, with its 26 or 30 letters depending on the language, had already broken with the analogue character of ideographic writing. It imposed itself as an abstract, instrumental code. Digital code takes this simplification further. In digital transmissions, as in cathode-ray displays, whether for text, images, movement or sound, bytes and pixels are like movable type reduced to its simplest form, more versatile than the characters of the alphabetic or musical codes. This reduction to two signs, 1 and 0, gave binary code the power and speed of electricity and established the convergence of media. It is the outcome of the invention of Gutenberg’s movable type.
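A purely illustrative sketch, in Python for instance, is enough to show this reduction of the alphabet to the two signs 0 and 1:

    # Illustrative only: each letter of the alphabet reduces to a string of 0s and 1s,
    # much as movable type reduced words to a small set of recombinable characters.
    for letter in "GUTENBERG":
        print(letter, format(ord(letter), "08b"))  # 8-bit binary code of the character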

Error number four. To finish up with Gutenberg, people also point to the commercial success of e-readers and tablets, which are growing in number. But this success, which was a long time coming, is growing only to the extent that new cathode-ray media are getting better at imitating the trusty old paper book and its ergonomics: the matte aspect of paper and ink, the pocket format, sounds of virtual pages turning, the lightness, the user-friendly manipulation of pages, even the smell of printer ink in a diffusing sachet, the lower prices, etc. The Japanese are even marketing an e-reader that mimics the manual interaction and free movement of pages when you lean one way or another. Ironically, the printed book remains the benchmark for its electronic imitators.

Error number five. People keep forecasting the demise of the book. And we keep hearing that publishing houses and news organizations are in crisis. Yet virtual libraries are growing. And libraries still carry books. They even offer access to books that are no longer available, whether old, out of print, protected in image libraries, sold only in far-off countries, or simply sold in the city when the reader lives in the country. We can no longer avoid the digital networks of the information society we are immersed in. Websites are now counted in the billions of pages. We swim, surf and dive in an ocean of letters of the alphabet. Ironically, and flying in the face of McLuhan’s prophecies, we are witnessing a second phase in the development of literacy, much more widespread than the first, and this time immersive.


There were elated announcements that in 2012 there would be as many digital books as hardcover books sold worldwide. For some this heralds a mutation and the end of the book. Instead, we should be rejoicing in this impressive upswing in the book industry. The fact that one third of books sold today are digital, all categories combined, is good news, because rather than a mere substitution of publishing media, this represents a one-third increase in overall book production. Whether the books are on screens or on paper does not change the fact that they are books. In fact, digital is not the opposite of paper, nor its mortal enemy bent on replacing it. As paradoxical as this statement may seem to digital fundamentalists, in this area, digital technology is simply a complement of paper. And efforts are afoot to invent “cathodic paper.” Adding sound and movement enriches text, obviating its disappearance. The fact that Amazon and other large corporations benefit from electronic technology and can sell books on paper or on screen is major technological progress in printing and a great opportunity for the book. The internet also offers new power for promoting, distributing and selling the… paper book ― an instance of remarkable complementarity. One could even call the internet the “21st century print shop.” No doubt, but we should be careful of making sweeping statements, because the paper book that comes off the press is far from in decline. It remains the dominant form by far; the digital book is actually the one not living up to its promise, at times showing flagging growth. The most recent studies show that it has lost some steam, no matter what its champions claim: only 16% of Americans to date have bought a digital book, whereas 89% of regular readers still buy paper books (Bowker Market Research, 2012). It looks like the digital book will not replace the paper book, but will offer an additional medium for specific activities like practical reading and light entertainment.

Regardless, the “Gutenberg parenthesis” is not closing; on the contrary, with digital technology, what we are witnessing is the triumph of electronic movable-type printing. We even worry about preserving this new continent of letters of the alphabet. Because the flip side of the coin of this second lettrist revolution is how fragile this electronic heritage is. At least the oral culture of yore cultivated memory and is responsible for the conservation of texts that date back several thousand years. The same cannot be said of what we are entrusting today to the memory of so-called “hard” discs, which are much more volatile than the clay tablets of yesteryear.

One could ask whether we should be concerned about a return to obscurantism, for want of writing and reading. We shouldn’t; on the contrary. We could also ask whether, with the advent and meteoric rise of digital communication, we will experience two forms of illiteracy rather than one, the traditional one and the digital one, with the economic and social divide remaining a major factor in both. Quite the opposite. Paradoxically, digital technology will help reduce the number of illiterates. On the one hand, this “second-generation illiteracy” ― as the inability to use digital technology is called, due both to the generation gap and the economic divide ― will gradually decline as new generations appear. On the other hand, the appeal of new digital tools, more playful and powerful than the traditional book, will draw illiterate users to them. And digital technology is already generating much more widespread and immediate use of reading and writing than Gutenberg’s invention did.

Digital technology is therefore not a new problem on top of traditional illiteracy. It is part of the solution. It is turning out to be our greatest ally – including in schools – for reducing illiteracy. It is an effective tool in that it motivates young people to read and write through social media. Its allure – its playful magic – is also encouraging young people who left school prematurely to go back, dropouts who have no time for traditional academic pedagogy and the efforts it entails. And thanks to appealing, effective courseware, digital technology also helps teachers in their work and the task of learning to read and write in adult education centres. Digital technology offers a new interactive, playful and multimedia pedagogy that is more appealing and certainly more effective in helping socially disadvantaged groups, whether young people or adults, make the sustained effort that literacy requires, and enjoy the social inclusion that modern life demands. After Gutenberg’s invention, digital technology is what now offers us hope for new human progress. We have become digitally well read. All hail e-Gutenberg!

(translation from the French by Rhonda Mullins)



2014-03-02

The link is the meaning


There are many days when I believe that thinking, searching and dialoguing are a necessity only for oneself, and that publishing is nothing but an illusory vanity of the utmost uselessness. A misery. A wasted effort. And how could it be otherwise?
One must draw wisdom from this and refocus its daily benefit on oneself alone. Change oneself without claiming to change the world.

But I catch myself at once. For this resignation vanishes as soon as I am confronted with scandal. Only love and scandal motivate us and give meaning to life. And so I affirm it: it is the link that counts. Living for oneself alone has no meaning. The link is the meaning.