Digital Aesthetics: Introduction
Claudia Giannetti
The early twentieth century saw the formation in various fields of new theoretical approaches sharing a skeptical attitude towards the fundamental certainties that had profoundly influenced occidental culture and science. Towards the mid-twentieth century concepts like truth, reality, reason and knowledge became central in an intensive contest between rationalism and relativism. In the course of this debate, several theories were dissociated from the self-referential character of their scientific disciplines and increasingly placed in correlation with other fields. Examples of metadisciplinary models include the cybernetic analysis of message transmission and man-machine communication or, more recently, postmodernist philosophy and its notion of ‹contaminated,› ‹weak› thinking. [1] This relativism manifested itself in various aspects of art: as an essential component in the process of producing experimental art from the first avant-garde movements onward; in the radical transformation of the forms of art reception; in the tendency to interconnect and establish interchange among various art genres (discernible in interventionist and interdisciplinary works or ‹mixed media›); and finally in the intensified exchange among art, science, and technology. Artistic practice appropriated new media—initially photography and film, later video and computer—and new communication systems—post and telephone, followed by television and the Internet. Against this background, and above all from the 1960s onward, a gradual shift set in away from academic, orthodox positions that attempted to confine art to traditional techniques, and aesthetics to ontological foundations.
However, the profound transformations resulting from these new approaches did not invariably meet with understanding, let alone acceptance, from artists. If one further takes into consideration the recently re-ignited controversy about the long-predicted crises of art and philosophical aesthetics, as well as the widespread discourse among postmodernist writers which was linked to tendencies in technological and academic theory, then everything does in fact seem to point toward a disintegration of art and aesthetics. Yet a large part of such polemics can be attributed to the fact that aesthetic theory and artistic practice have gone separate ways. Artists’ increasing use of technology is bringing to light a far-reaching and on-going discrepancy between artistic perception, art theory, and aesthetics, which are seen to be notably diverging instead of developing synchronously and congruently. This gulf between theoretical «corpus» and artistic practice culminates in a paradox that without doubt leads to the often proclaimed end of art.
Nevertheless the conviction remains that certain symptoms of transition cannot be immediately equated with the radical disintegration of the fields involved. It is rather the case that new intellectual approaches and modes of experiencing must be found in order to enable the analysis and assimilation—as opposed to rejection—of the contemporary phenomena. One access route to these new forms is shown by the theory and practice of media art, and of interactive media art especially, whose renewing concepts are discernible in the fact that aesthetic theory is no longer focussed exclusively on the art object itself, but on its process, on system and contexts, on the broad linkage of different disciplines, and on reformulating the roles of the maker and the viewer of a work of art.
The complex process of transformation undergone by art and aesthetics, as well as the closely intermeshed interdisciplinary relationships, can be understood only by investigating those phenomena and theories which have so far driven forward the syntopy [2] of art, science, and technology, and in the future will continue to do so. It is not sufficient to describe the current state of art by concentrating on its epicenter; instead one must expand the horizon of consideration to adjacent fields and trace the historical developments in which corresponding changes and contemporary phenomena can be discerned. One aim of this hypertext monograph is to work out an aesthetic concept inherently formed by the context and creative experience of interactivity-based works, as well as their presentation and reception. The intention is to show potential paths towards a renewal of aesthetic discourses: paths already smoothed by those pioneers and artists whose tracks this essay follows. In this way various concepts of science, technology, and art are linked with a view to revising the notions of art, aesthetics, and spectator.
Without a doubt the artistic use of new technologies and the specific current forms of interlocking science and art lead to diverse formulations of questions—of practical and formal, as well as conceptual and philosophical nature—to which only future developments will deliver an answer. The «Aesthetics of the Digital» addresses several of these principal questions. Some contain possible answers, others lead to new questions that open up space for further considerations.
Translation: Tom Morrison
ART, SCIENCE, AND TECHNOLOGY
Claudia Giannetti
Art – science – art
Deliberations on the connection between art and science have various points of departure. The most general considerations are limited to the assumption of a parallel development. In his writings published in 1970, Werner Heisenberg, who along with Max Planck counts as a founding father of quantum theory, stated that the tendencies towards abstraction in the sciences were comparable with those in the field of art. According to Heisenberg, new artistic and scientific forms can result only from new content, but the converse does not apply. To renew art or to revolutionize science, he wrote, meant to create new content and concepts, and not just new forms. [1]
A question more complex than that of parallels between art and science is the extent to which art influences the sciences. According to Peter Weibel, this question can be answered only methodologically, that is by applying a comparison which views art and science as methods. While science, says Weibel, is distinctly methodological in character, art is generally not regarded as a method: «This is our first claim: art and science can only reasonably be compared if we accept that both are methods. This does not mean that we declare that both have the same methods. We only want to declare that both have a methodological approach, even if their methods are or can be different.» [2]
Accordingly it would be permissible to view art and science as convergent in the methodological sense. As Weibel sees it, science is influenced by art in regard to its methods, but not by its products and references: «Because any time science develops the tendency for its methods to become too authoritarian, too dogmatic, science turns to art and to the methodology of art which is plurality of methods.» [3] Objective nature exists neither within the framework of the sciences nor in culture independently of social construction: «art and science meet and converge in the method of social construction.» [4]
This position finds its most radical expression in the science-theoretical contributions of Paul Feyerabend. As a critic of scientific rationalism, he develops new interpretations and connections among the arts and sciences. He is of the opinion that artists and scientists developing a style or theory frequently pursue a secondary intention, namely that of representing ‹the› truth or ‹the› reality. However, artistic styles are closely connected with styles of thought.
What a specific form of thought understands by concepts such as truth or reality is what that form of thought asserts as truth. When one decides in favor of a style, a reality, or a form of truth, one always chooses a human-made construction. In other words, Feyerabend negates the possibility of absolute rationality and logic in regard to that which is created by the human mind. He asserts that this relativist, and in a certain sense irrational, factor inherent in every branch of science places science in the proximity of art. According to Feyerabend, the sciences are not an institution of objective truth, but are arts along the lines of a progressive understanding of art. [5]
Feyerabend’s line of argument reflects the skepticism that deeply influenced occidental culture and science well into the twentieth century. The aforementioned questions of truth, reality and reason are central components of the contest between rationalism and relativism affecting art no less directly than science. If the nature of science were to be considered a research method under the premises of reality, plausibility, and dialectics, then whoever attempted to identify these three principles by strictly observing the complexity of the objects would, according to the Spanish scientist Jorge Wagensberg, reach the conclusion that the object resisted the method. The only manner of proceeding would be to «soften up» the method, with the result that «science is transformed into ideology.» «At its core ideology means not research, but faith. It follows from this consideration that one must stop with ideology all the holes which science has itself failed to stop. […] If the knowledge towards which we aspire is ruled not by laws but by world-views, then it would seem expedient to take our leave of scientific methods, and perhaps even adopt principles radically opposed to the latter. Precisely that is the case in art, in a kind of knowledge whose creators have not the least interest in distancing themselves from their creation.» [6]
Of particular relevance to the understanding of a new interpenetration of art and science is the generative nature of both fields, each of which brings forth worlds or world-views of its own. For that reason, «the worlds of art and science are ideologically no longer opposites,» as Ilya Prigogine states, «the variety of the significates and the basic opacity of the world are reflected in new languages and new formalisms.» [7]
The origins of information theory
The technological revolution received its fundamental impetus from the first industrial revolution in the nineteenth century. By starting up a process of mechanization, the industrial revolution triggered the phenomenon of crises of control. [8] The mounting production levels resulting from mechanization led to the need for control systems to accelerate the flow of information. Researchers began to seek solutions in feedback techniques, automatic control systems, and information processing.
Under the title «On Governors» in 1868, Clerk Maxwell presented the first theoretical study towards an analysis of control and feedback mechanisms, so ushering in the radical transformation in automatic control engineering. By the late nineteenth century, a series of developments and technical innovations were paving the way for what came to be called the control revolution.
The control revolution not only produced feedback techniques and a new hierarchization of media, but also revolutionized the forms of cultural reproduction in society. [10] This included areas like communications and art, since the technologies exercised a direct influence on the forms of sociocultural (re)production.
Until then, however, the themes associated with control mechanisms and automation had been discussed in connection with only one common parameter, namely energy. As the basic concept of Newtonian mechanics, energy held the same central position in the natural sciences and in research fields like acoustics, electrical science, and optics. The invariant of ‹mass› similarly occupied a central position in physics. However, as production techniques continued to be improved, the relationship between human and machine likewise began to change, raising questions about new terms and theories capable of making this communication process between biological and technological systems the object of targeted research.
The constitution of two new disciplines: cybernetics and artificial intelligence
That «society can only be understood through a study of the messages and the communication facilities which belong to it; and that in the future development of these messages and communication facilities, messages between man and machines, between machines and man, and between machine and machine, are destined to play an ever increasing part,» [11] was the key idea of the American mathematician Norbert Wiener (1894–1964), which he elaborated in his book «The Human Use of Human Beings: Cybernetics and Society,» published in 1950 after a first technical study, «Cybernetics, or Control and Communication in the Animal and the Machine,» of 1948. In 1950 likewise, the British mathematician Alan Turing (1912–1954) raised the question of the feasibility of logical thought by machines. In his essay «Computing Machinery and Intelligence,» published in volume 59 of the philosophical journal «Mind,» Turing proceeds from the basic question with which his text begins: «Can machines think?»
Until the mid-twentieth century no more than a few researchers working in isolation were concerned with subjects such as communication between dissimilar systems (for instance, biological and technical systems), or with the feasibility of technically designing thinking machines. In addition to Wiener and Turing, their ranks included Charles Babbage, Claude Shannon, Warren Weaver, and Hermann Schmidt. However, from the 1950s on these subjects rapidly became two fields of basic research: cybernetics and Artificial Intelligence. [12] The two aforementioned texts triggered a flood of publications containing speculation and analysis on these subjects. In the first three years after 1950 alone, more than a thousand published essays dealt with intelligence and with communication with and between machines. Yet when Turing published his essay there existed no more than four digital computers worldwide (Mark I and EDSAC in England, ENIAC and BINAC in the USA). [13] Although Turing’s theorem—everything the human mind can do in the form of an algorithm can also be carried out by a Universal Turing Machine—was based on models so far investigated only as a hypothetical experiment, several researchers were inspired to empirically confirm or disprove it by building machines.
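The notion of the Turing machine invoked in this theorem can be made concrete with a small sketch. The interpreter and rule table below are illustrative additions to this edition (they are not taken from Turing's paper or from Giannetti's text): a toy, non-universal Turing machine in Python whose invented transition table increments a binary number, showing how an algorithm reduces to a finite list of (state, symbol) → (write, move, next state) rules applied to a tape of symbols.

```python
# Illustrative sketch: a toy Turing machine interpreter.
# The rule table is an invented example that adds 1 to a binary number,
# showing how an algorithm reduces to a finite table of transitions.

def run_turing_machine(tape, rules, state="start", blank="_", max_steps=1000):
    """Apply (state, symbol) -> (write, move, next_state) rules to a tape."""
    cells = dict(enumerate(tape))      # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Invented rule table: walk to the rightmost digit, then add 1 with carry.
increment_rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),   # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("1", "L", "halt"),    # 0 + carry -> 1, done
    ("carry", "_"): ("1", "L", "halt"),    # overflow: prepend a leading 1
}

print(run_turing_machine("1011", increment_rules))   # -> "1100" (11 + 1 = 12)
```

A universal machine differs from this toy only in that the rule table itself is written on the tape and interpreted by a fixed set of rules; it is in this sense that one machine can, in principle, carry out any algorithm.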
Communication
The approach of cybernetics—a name derived from the Greek term ‹kybernetes› (steersman)—consists in transferring the theory of control and message transmission, whether in the machine or in a living being, to the fields of communication and machine control. The objective is to investigate the relationships between animal and machine and, in the case of the machine, its specific mode of behavior as a characteristic of the performance to be expected. [14]
On the basis of the observation of certain analogies between machines and living organisms, Wiener asserts that there is actually no reason not to make a machine resemble a human being, since both develop tendencies toward decreasing entropy; that is, both are examples of local anti-entropic phenomena.
Turing likewise gave priority to the subject of communication. His famous experiment—the imitation game, as he called it, known today as the Turing Test—for verifying the intelligence of a computer was concerned less with the actual construction of such a machine than with simulating with machines the human capability of communication. Turing was here acting in line with a tradition of measuring the faculty of thought by the ability to use human language. Descartes had already presented the logically semantic usage of language as a criterion for identifying thinking beings. For a long time, the mastery of semantics would remain a basic problem of Artificial Intelligence.
Information
In contrast to that tradition Wiener’s cybernetics sought operational ways of developing a specific language that would enable communication between dissimilar systems, and aimed to adapt semantics to specific goals in the process. Viewed from this perspective, Wiener’s theory replaced the notion of energy with that of information as the elementary parameter of communication, and thus postulated the definition of this new invariant for cybernetic science as a whole, which is a basic prerequisite for understanding the range of the cybernetic approach.
Unlike Newton’s mechanics, which operates with closed systems, information is applied to open systems. In this way it must be seen as a key enabling linkage and communication between dissimilar systems, and between the latter and the external world. ‹Mass› and ‹energy› are directly related to matter in the natural sciences, whereas ‹information› is not conveyed by any substance, but is based on variable properties: information can be reproduced (duplicated or copied), destroyed (erased), or reiterated (repeated). «Information is a name for the content of what is exchanged with the outer world as we adjust to it, and make our adjustment felt upon it. The process of receiving and of using information is the process of our adjusting to the contingencies of the outer environment, and of our living effectively within that environment.» [15] To this extent, it is not the possible quantity of circulating information that is crucial to the effectiveness of communication, but the degree to which this information is integrated into communication. Along the lines of cybernetics, then, significant information is not the entirety of all information transmitted, but that information which passes through the ‹filters.›
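To illustrate the point that information is measured independently of any substance that conveys it, the following sketch computes the average information per symbol of a message in bits, in the statistical sense the term received from Shannon and Wiener. The Python code and the sample messages are hypothetical additions made for this edition, not drawn from the text.

```python
# Illustrative sketch: information measured in bits, independent of any
# physical carrier. H = -sum(p * log2(p)) over the symbol probabilities.

from collections import Counter
from math import log2

def bits_per_symbol(message: str) -> float:
    """Average information per symbol of a message."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A message built from many equally likely symbols carries more information
# per symbol than a highly repetitive one.
print(bits_per_symbol("abcdefgh"))   # 3.0 bits per symbol
print(bits_per_symbol("aaaaaaab"))   # about 0.54 bits per symbol
```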
Feedback
In the field of information and communication Wiener devoted particular attention to the question of automatons and the development of feedback models. His core interest lay in investigating machines capable of evaluating input and of integrating the stored experience into further feedback loops. In this respect, feedback is a method of making systems self-regulating: the results of preceding activities are fed back into the procedural sequence, enabling corrections to be made continuously at runtime. To this end, machines must be capable of learning processes.
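The mechanism described above can be sketched in a few lines: a loop that compares each result with a goal and feeds the deviation back into the next cycle as a correction. The thermostat scenario, the variable names, and the gain value below are hypothetical choices made for illustration; they stand in for any self-regulating system of the kind Wiener describes.

```python
# Illustrative sketch of a negative-feedback loop (assumed example: a
# thermostat). Each cycle's output is fed back as the next cycle's input,
# and the correction is proportional to the measured error, so the system
# adjusts itself at runtime instead of following a fixed plan.

def regulate(temperature, setpoint=20.0, gain=0.5, cycles=10):
    history = []                           # stored 'experience' of past states
    for _ in range(cycles):
        error = setpoint - temperature     # compare current state with goal
        temperature += gain * error        # feed the correction back in
        history.append(round(temperature, 2))
    return history

print(regulate(temperature=10.0))
# -> [15.0, 17.5, 18.75, ...] the system converges on the setpoint
```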
Although his approaches and conclusions are very different from those of Wiener, Turing in his essay likewise clearly indicated the necessity of developing systems capable of learning. Devoted to the subject of learning machines, the essay takes as its starting point the principle that «education can take place» only where a communication channel exists between teacher and pupil—a requirement significant not only for pedagogy but beyond that for the development of interactive, digital systems. This communication channel permits bi-directional information exchange, and therefore also learning processes. On the basis of this method, Turing repudiated the thesis set up by Ada Lovelace in 1842. [22] Using investigations made with Charles Babbage’s ‹Analytical Engine,› Lady Lovelace had claimed that a machine can do only that which it is instructed to do, and therefore is never capable of producing anything truly new. [23] Turing contradicted this thesis with the question «who can be certain that ‹original work› that he has done was not simply the growth of the seed planted in him by teaching, or the effect of following well-known general principles?» [24] He further pointed out that the machine must be to a certain degree ‹undisciplined› or random-controlled in order for its behavior to be considered intelligent. [25]
Precisely this element of chance was what lent the machine ‹creative› ability, namely the ability to solve problems. Although discrete machines that could pass the Turing Test are feasible, they would succeed not because they were replicas of the human brain but because they would have been programmed accordingly. As Turing himself realized, the basic problem lies in the area of programming. In fact, it was not necessary to wait the fifty years assumed by Turing in order to program «computers, with a storage capacity of about 10⁹, to make them play the imitation game so well that an average interrogator will not have more than a 70 per cent chance of making the right identification after five minutes of questioning.» [26] The programs have been written, and have passed the Turing Test with a high degree of interactivity. One might therefore conclude that the problem is not solely confined to investigating the possibilities of Artificial Intelligence. [27]
Viewed from the contemporary perspective, cybernetics and AI cannot be reduced to solely scientific, economic, or technical interests. Since these theories belong to a socio-technical field in which communication structures, views of the world, and views of the human being are formed and transformed, they are concerned with philosophical issues of perception, cognition, language, ethics, and aesthetics. If information technology is basically working towards the automation of mental processes, then it directly or indirectly reaches into disciplines concerned with human cognition or creativity.
Translation by Tom Morrison