During the recent Annual General Membership meeting of the International Committee on Fundraising Organizations (ICFO) held in Madrid, Spain in May 2012, the issue of technology and its implications for the civil society sector came up. But then, in our modern digital, new media era, doesn't it always seem to come up?
One of the projects we are considering at the ICFO Centre for Research and Public Policy addresses the use of new media in the nonprofit sector for branding and raising funds, and whether such use is truly compatible with ideals of transparency and accountability. If it is not, what then?
Although I have addressed some of this theme before in various talks on media ecology, I would like to address it here again with a specific invitation to readers to submit comments that can be considered as part of our research project.
Here are some of the questions:
1. In what way are the new media technologies good for civil society organizations?
2. Is it possible to distinguish between effective use of these new technological tools and the effect they have on what we think and do?
3. Does new media technology enhance or hinder communication and relationship-building between civil society organizations and their donors?
4. Are new media tools for raising money effective, and if so, in what way are they effective?
5. What are some of the cost trade-offs that should be considered in the use of new media technologies?
6. How do these new media technological tools enhance or inhibit transparency and accountability?
7. Are transparency and accountability all that important in the Internet age, with its increased use of new technologies, including digital technologies, to effectively and efficiently raise funds for the public benefit purposes for which civil society organizations exist?
Maybe the readers can think of other questions as well.
But first, this to set the stage. Many years ago, during a symposium on technology, the French philosopher, sociologist, law professor, and lay theologian Jacques Ellul told an audience about a friend of his, a very competent surgeon, who was asked during a discussion on technology whether he was aware of the progress in technology and science relative to the medical profession. The surgeon responded with an answer that was both humorous and serious:
I am, of course, very familiar with progress in medical technology and science. But ask yourself this question: Currently we carry out heart transplants, liver transplants, kidney transplants. But where do those hearts, those livers, those kidneys come from? They must be healthy organs, not affected by illness or damage. They must be fresh. They can only come from one place: from people who died in traffic accidents. So to carry out these operations, these transplants, we need more traffic accidents. If we make driving and traffic safer, we will have fewer organs with which to carry out these wonderful operations.

Of course, everyone was surprised by his answer, and also quite shocked.
In what many consider to be his most important work, The Technological Society (1964), Jacques Ellul set forth seven characteristics of modern technology that make efficiency a necessity. Several of these are relevant to what this post is attempting to address. They include rationality, artificiality, and autonomy.
The rationality of technique, for example, enforces logical and mechanical organization through the division of labor, the setting of production standards, and the like. It creates an artificial system which "eliminates or subordinates the natural world." Instead of technology being subservient to humanity, human beings have to adapt to it and accept total change. For example, people came to question the value of learning ancient languages, history, religion, philosophy, and anything relating to the humanities, which on the surface do little to advance their financial and technical state. This emphasis on the world of information, on being able to work with computers, is invading the whole intellectual domain and also that of conscience.
One of the illusions we have is that technology gives us more freedom. But free to do what? Freer to go more places, see more things, experience more things, and, as charitable organizations, to do more things. We can acquire more knowledge about more of the world, and yes, that is wonderful. Yet I wonder how many of the people trying to travel on the Beijing metro system thought of the wonder of technology, and the glory of the freedom it gave them, as they all tried to go someplace at once without having coordinated their plans for that particular day.
But with that freedom there is another factor that must be considered. In a technological society such as ours, it is impossible for a person or group of people to be responsible for anything. Consider a dam that one day bursts and floods the valley below. Geologists surveyed the terrain and decided that the rock could hold the dam. Engineers designed the dam and supervised its construction. Workers built it. And politicians decided that the dam was needed and where it should be constructed. Who is responsible? No one.
In our technological society, work is fragmented, bureaucratically determined, and controlled. No one is responsible, and no one is free either. Everyone has a specific task and does not have the freedom to go beyond that task.
Now the argument here is not about how we use technology and new media, and whether or not that use is good or bad. Rather, it is about how modern technology, and specifically new media, informs our thinking and thought processes, and how it affects the way we act. This has specific application to how we communicate and form or maintain relationships. Indeed, it has application to and implications with respect to the questions posed above.
Since Heidegger and Habermas, technique has become a primary theme of philosophy. Many philosophers and social scientists are trying to understand the phenomenon or see what kind of influence it has on the world. The warning here is that people need to be alert to the future potential of technique and to the risks entailed by its growth so that they might be able to react and master it, or at least remain in mastery and control of their lives and their organizations.
When Ellul speaks of a "technological bluff," he refers to a problem of language: the word "technology" is commonly used for the actual process, whereas in a strict sense technology is discourse on "technique." It involves the study of a technique, a philosophy or sociology of technique, instruction in a technique. It is, as Ellul writes, the gigantic bluff in which discourse on technique envelops us, making us believe anything and, far worse, changing our whole attitude toward techniques.
What this discussion is about is not what technology is and can be. Rather, it is a discussion of techniques, particularly those used in communication. The risk consists of essentially rearranging everything in terms of technical progress, which, according to Ellul, "with prodigious diversification offers us in every direction such varied possibilities that we can imagine nothing else. So, any discussion on technique is not justification of techniques, but a demonstration of the prodigious power, diversity, success, universal application, and impeccability of techniques."
And when I say bluff, it is because so many successes and exploits are ascribed to the techniques (without regard for the cost or utility or risk), because technique is regarded in advance as the only solution to collective problems (unemployment, Third World misery, pollution, war) or individual problems (health, family life, even the meaning of life), and because at the same time it is seen as the only chance for progress and development in every society. There is a bluff here because the effective possibilities are multiplied a hundredfold in such discussions and the negative aspects are radically concealed. . . . Thus it transforms a technique of implicit and unavowed last resort into a technique of explicit and avowed last resort. It also causes us to live in a world of diversion and illusion which goes far beyond that of ten years ago. It finally sucks us into this world by banishing all of our ancient reservations and fears.

Now, isn't this much of what the task of government, and really of civil society, is all about? And isn't this really what we see happening in the civil society sector, where the Internet, social media, and text messaging are where all the effective action in branding and fundraising finds its central role?
One must consider and realize that in the use of digital technology and new media, especially the Internet, there are certain Internet designs that tend to pull us into patterns that gradually degrade the ways in which each of us exists as a human individual. As Jaron Lanier wrote in You Are Not a Gadget: A Manifesto, these patterns "are more oriented toward treating people as relays in a global brain," deemphasizing personhood and the intrinsic value of an individual's unique internal experience and creativity. He wrote:
For instance, the idea that information should be "free" sounds good at first. But the unintended result is that all the clout and money generated online has begun to accumulate around the people close to only certain highly secretive computers, many of which are essentially spying operations designed to gain information to sell advertising and access, or to pull money out of a marketplace as if by black magic. The motives of people who comprise the online elites aren't necessarily bad . . . but nevertheless the structure of the online economy as it has developed is hurting the middle class, and the viability of capitalism for everyone in the long term.
The implications of the rise of "digital serfdom" couldn't be more profound. As technology gets better and better, and civilization becomes more and more digital, one of the major questions we will have to address is: Will a sufficiently large middle class of people be able to make a living from what they do with their hearts and heads? Or will they be left behind, distracted by empty gusts of ego-boosting puffery?
The issue here is that software is subject to an exceptionally rigid process of "lock-in." Web designs were developed to attract use by more and more people and were introduced as minimalist and accessible to all. But as a program grew in size, the software became a maze, and when other programmers got involved, it took great effort to modify it. Because of the brittle character of maturing computer programs, digital designs become frozen into place by the process known as "lock-in," which occurs when many software programs are designed to work with an existing program. "Lock-in" removes design options based on what is easiest to program, what is politically feasible, what is fashionable, or what was created by chance. It removes ideas that do not fit into a particular winning digital representational scheme. But it also narrows the ideas it immortalizes, cutting away the many meanings that distinguish a word in natural language from a command in a computer program.
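Lanier's own canonical illustration of lock-in is MIDI, which froze music software around discrete key-press events. The mechanism can be sketched in a few lines of code; the function names below are hypothetical, invented only to show how a representational choice hardens once other programs depend on it:

```python
# Illustrative sketch of "lock-in" (hypothetical names, not a real system).
# An early program picks a representation: a musical note is just an
# integer pitch, as in the MIDI convention Lanier describes. Continuous
# glides, timbre -- everything "between the keys" -- cannot be expressed.

def encode_note(pitch: int) -> dict:
    """Early design choice: a note is a single integer pitch."""
    return {"pitch": pitch}

# A later program is written against that representation...
def transpose(note: dict, semitones: int) -> dict:
    """Depends on the {"pitch": int} scheme chosen above."""
    return {"pitch": note["pitch"] + semitones}

# ...and then more programs depend on *that* one, and so on. Once a web
# of software assumes {"pitch": int}, replacing it with a richer model
# (say, a continuous frequency curve) would break every dependent
# program. The original design choice is now locked in.

note = encode_note(60)       # middle C in the MIDI convention
higher = transpose(note, 7)  # a fifth up
print(higher["pitch"])       # 67
```

The point of the sketch is not the arithmetic but the dependency structure: each new layer of software raises the cost of revisiting the representation beneath it, until the representation stops being a choice at all.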
There is another "locked-in" idea: the concept of the file. UNIX, the Mac, and Windows all have "files," which are now part of our lives. As Jaron Lanier put it in You Are Not a Gadget:
The file is a set of philosophical ideas made into eternal flesh. The ideas expressed by the file include the notion that human expression comes in severable chunks that can be organized as leaves on an abstract tree -- and that the chunks have versions and need to be matched to compatible applications.
* * *
It's worth trying to notice when philosophies are congealing into locked-in software. For instance, is pervasive anonymity or pseudonymity a good thing? It's an important question, because the corresponding philosophies of how humans can express meaning have been so ingrained into the interlocked software designs of the internet that we might never be able to fully get rid of them, or even remember that things could have been different.
* * *
The rise of the web was a rare instance when we learned new, positive information about human potential. Who would have guessed (at least at first) that millions of people would put so much effort into a project without the presence of advertising, commercial motive, threat of punishment, charismatic figures, identity politics, exploitation of fear of death, or any of the other classic motivators of mankind? In vast numbers, people did something cooperatively, solely because it was a good idea, and it was beautiful.
But not all surprises have been happy. This digital revolutionary still believes in most of the lovely deep ideals that energized our work so many years ago. At the core was a sweet faith in human nature: if we empowered individuals, we believed, more good than harm would result.
The way the internet has gone sour since then is truly perverse. The central faith of the web's early design has been superseded by a different faith in the centrality of imaginary entities, epitomized by the idea that the internet as a whole is coming alive and turning into a superhuman creature.
The designs guided by this new, perverse kind of faith put people back in the shadows. The fad for anonymity has undone the great opening of everyone's windows of the 1980s. While the reversal has empowered sadists to a degree, the worst effect is a degradation of ordinary people.

Every element in the system, whether a computer, a person, or a bit, comes to depend on detailed adherence to a common standard, a common point of exchange. Instead of people being treated as the sources of their own creativity, commercial aggregation and abstraction sites presented anonymized fragments of creativity as products that might have fallen from the sky or been dug up from the ground, obscuring the true sources. As Lanier writes, "The groupthink problem I'm worried about isn't so much in the minds of the technologists themselves, but in the minds of the users of the tools the cybernetic totalists are producing."
The problem is that we start to care more about the abstraction of the network than about the real people who are networked, even though the network by itself is meaningless. Only people were ever meaningful.
So, as Jacques Ellul writes, the computer as such implies a network, and a businessman [or the leader of a charitable organization] has no choice: he or she cannot acquire a computer just because he or she likes progress. The computer brings a whole system with it. The difference is that the technical system has now become strongly integrated. Offices, means of distribution, personnel, and production, including the production of information or funding, all have to be adapted to it. If they are not, the organization runs the risk not merely of losing the advantages brought by this wonderful gadget, but also of causing unimaginable disorder by introducing computers without making their proper use possible.
Everything is challenged. Can we adapt physically, socially, and intellectually to the computer? Can we adapt morally to what the computer allows us to do, and maybe even forces us to do? Are we as humans simply redundant? These are some of the questions posed by both Jacques Ellul and Jaron Lanier. What is at stake is the social link itself. The media have now confused what used to be clearly the domains of social and private life.
There are always those who would argue that technological progress is good, especially with respect to the techniques employed in science, in medicine, in business, in government, and in the civil society sector as it seeks to raise funds and pursue public benefit with social impact. Maybe it is just neutral, rather than good or bad. However, one of the great weaknesses of those who separate the good results of technique from the bad is that they constantly think of people as wise, reasonable, in control of their desires and instincts, serious, and moral. But, as experience has shown, the growth of technical powers has not made us more virtuous.
Moreover, every economic, administrative, and managerial operation becomes more and more complex as a result of the multiplication of techniques. With the expansion and extension of techniques comes increased specialization. Processes grow increasingly refined, complex, and subtle. With that comes a demand for regulation, by which we think we can control the proliferation of possibilities opened up by the varieties of techniques available for almost every facet of life. We draw up rules, make organizational charts, set up groups, and convince ourselves that we can clearly see what we are doing and control it. The result is a multiplicity of regulations that are both finicky and often contradictory, for there is no longer any possibility of synthesis. The regulations become totally detached from reality, and by their density, scope, and complexity, they become real hindrances to any meaningful action and sources of further hindrances. Machines cannot make our decisions. We, as people, must make them. But the decisions become increasingly inadequate and confused, and administrators and managers are increasingly crushed under their weight.
So, to engage our discussion of the nature and use of the techniques for producing information and communications, as well as the work of simply doing what we are called to do, we must analyze the impact and import of these technologies on the way we think and act. In addition to the questions posed at the beginning of this post, there are four propositions that might help us in our thinking.
1. First, all technical progress has its price. What is that price?
2. Second, at each stage it raises more and greater problems than it solves. What are some of those problems?
3. Third, its harmful effects are inseparable from its beneficial effects. What are the harmful and what are the beneficial effects?
4. Fourth, it has a great number of unforeseen effects. What kind of effects might be unforeseen?