David Gelernter is professor of computer science at Yale, chief scientist at Mirror Worlds Technologies, contributing editor at the Weekly Standard, and a member of the National Council on the Arts.[…]

A conversation with the writer, artist, and Yale computer scientist.

Question: What balance do you strike between your teaching, writing, science, and art?

David Gelernter: Nowadays, I spend my time mainly painting. I have an exhibition coming up. Generally speaking, I spend more time painting than doing anything else, except for writing. I've been writing pieces for—some pieces connected with DLD, where I got to meet Frank Schirrmacher, who has been associated with the meeting for some time, and is an editor and publisher at the Frankfurter Allgemeine Zeitung, and a remarkable guy. So, I agreed to write a series of pieces for them. He's a wonderful guy, and I think Europe is more interested in the implications, as opposed to the immediate market meaning, of technology. I mean, I don't think people are better educated or more thoughtful or any different; they've just got a somewhat different focus. I think the nature of the European market, and the origin of so much of the technology in the United States, gives them one degree of remove, which I think is useful. There are a lot of thoughtful people over there.

Question: What is the focus of your new art exhibition?

David Gelernter: Well, let's see, this is the latest of a series of exhibits at Yale, which is a good place for me to exhibit. I like to sell paintings not from galleries but in a more informal, one-on-one way, and so a non-commercial gallery space in which to exhibit is, for me, very useful. I mainly—I've been trying for many years, I should say for many decades at this point, to figure out what Jewish art is, if there is such a thing. It's come to seem to me that Jewish art is paintings of words. Not just paintings in which words appear, or words on a wall, but paintings in which the words themselves have meaning and decorative significance and conceptual weight. It's hard to describe an image, especially one that is somewhat idiosyncratic, but anyway. That's the general idea.

Question: How does Judaism shape your work?

David Gelernter: Genetically, to begin with. When I do think up pictures—my own job description is an image thinker, as many people have been, and what I do is a matter of the images that float through my head. Many people think in images; it's hard to say how many. Certainly many people think in images some of the time. Some people think in images virtually all the time. When I'm working in software, I'm thinking of the picture that needs to appear on the screen, or that needs to appear in the user's head, in order to make sense of the software. In the studio, more directly, I try, as any painter does, as any artist does, to take what is in my head and make it concrete—which is a constant struggle, which isn't easy, but is what art has always been about.

When I write, I tend to write vividly, or try to write vividly, and it's also a matter of the images that drift through one's head. So, this is the way I deal with the world, picture-wise.

Question: Why do you believe Judaism is the central intellectual development in Western history?

David Gelernter: It seems to me, beyond doubt, that Judaism is the most important intellectual development in Western history, for two reasons: one having to do with the aesthetic and spiritual, and the other having to do with the ethical. If I begin with ethical and moral issues, Judaism invented the idea of mankind as an entity. So we see striking differences between ancient Israelite literature and Greek literature, let's say in the first 1,000 years, the first millennium B.C. There is a word in Greek that has no equivalent in Hebrew, namely "barbarian"—a Greek word meaning someone who babbles, who doesn't speak Greek, who is foreign, who is culturally inferior by definition and of very little interest. Not only different, but boring. Judaism, meanwhile, in developing the idea of a single god—which was a revolutionary and bizarre idea at the time it first emerged, 3,000-some-odd years ago—figured that if there really is only one god in the world, he has to be everybody's god. Everybody should have the right to say, "This is my god." Everybody must have that right. And then if you look at who that community, who the faithful, are in principle, it's everybody. So, Judaism has the idea that ethical laws, moral rules and strictures, apply to everybody—not that everybody has the same liability to carry them out. There were stricter requirements of Jews, or Israelites, than there are of people in general. Judaism has never been a proselytizing religion. It doesn't really care—as a matter of fact, it is indifferent—whether people become Jews or join the Jewish community, but it is very clear on what the basic moral obligations of mankind are: respect for life, respect for justice, kindness to animals, and familial—what should I say—sexual fidelity and refraining from sexual crimes. These are the so-called "Seven Laws of the Sons of Noah," meaning that they apply to everybody.

So, without going into a lengthy disquisition, Judaism has the idea that there is a simple moral code which goes not only for the Israelite people, or the Israelite nation, but is applicable to everybody, and has the revolutionary idea that not only is there one god, but there is essentially one man—one mankind, the whole world. So on festival occasions at the Temple in Jerusalem, 70 sacrifices would be brought at certain points. It was thought that there were 70 nations in the world—one sacrifice for each nation.

Judaism has an aesthetic and spiritual side also, of course. Judaism is obsessed with imagery. One often finds that its stereotypes are either basically right or exactly wrong; they are rarely in between. Judaism is often described as being hostile to imagery. But we know that can't be right, because the Hebrew Bible underlies Western literature. Hebrew poetry—the poetry of the Psalms, the Prophets, the Book of Job—is the basis of Western literature. Hebrew prose narrative is the basis of Western narrative. There is no such thing as great poetry without imagery; the idea is absurd. There is no such thing as great writing that isn't vivid and vibrant, and that means based on images. And we find, in fact, that the imagery of the Bible is the imagery that recurs throughout Western literature and Western art, from... the split-open Red Sea, to the handwriting on the wall, to the chariot of fire. These are images that are not only painted in the developing tradition of medieval art and Western realist painting, but they recur in Western literature of all languages down to this afternoon.

So for both of these reasons, Judaism has a commanding role in the creation of the culture and civilization that we've occupied for several thousand years, and especially so with the emergence of the idea of the liberal nation—the liberal modern nation, which is a sort of joint invention of the United States and of Great Britain in the 17th and 18th centuries. These were Christian nations, but the Christianity of early America and of Britain in the Elizabethan age, and especially the age of the civil wars and Cromwell, is what is often called "Hebraic Christianity," or "Old Testament Christianity." It was a profoundly Hebrew-inspired sort of Christianity. Not that people thought of themselves as Jews, because they did not, but both the early United States and early Britain repeatedly referred to themselves as "the New Israel," and the idea of freedom and liberty emerges in the United States on the basis of the story of the Exodus. The biblical verse "Let my people go," which is repeated many times by Moses to Pharaoh, becomes fundamental in American history—not only when religious zealots who were persecuted in England immigrate in the 17th century to America, but when the United States declares its own independence and freedom as a nation; during the Civil War, when the North becomes gradually resolved under President Lincoln to free the slaves; and then in the Civil Rights Movement of the late '50s and '60s.

So, the notion of freedom, the notion of equality, is derived by the founders of English and American liberalism from the opening of the Bible, which says that all men are created in God's image—therefore you're not allowed to make distinctions on the basis of race, color, and creed. All men, being in God's image, are to be treated justly and fairly. Abraham Lincoln put that most concisely. And interestingly, the idea of democracy too: if you read the early literature in the United States developing the idea of modern democracy in the 1600s—especially in New England and, to some extent, in Virginia—biblical verses are quoted constantly. Not only the ones in which Moses sets up what is described as a Jewish commonwealth, where he's told essentially to let each tribe furnish its own leaders—to tell Moses who its leaders will be—but it's also the case that the Hebrew Bible is an aggressively anti-monarchy book. There are vivid denunciations of the idea of a king, the rights of kings, an absolute king. Prophets in the Bible confront kings, demanding of them in the name of God that they be fair and just and honorable, and in fact Israel was told that if it had any sense, it wouldn't have a king to begin with.

So in lots of ways—and this is something that used to be well known—the last couple of generations in Western culture, I would say since the Second World War, have been secularizing generations in which we were more apt to look at ancient Greece than ancient Israel. But as a matter of historical record, it's easy to trace these ideas, also in the philosophy of the English Enlightenment. It's easy to open a book of Locke and notice that he keeps quoting the Bible—or Hobbes, or Selden, or others of the English philosophers who provided the intellectual counterweight to the active and pragmatic liberalism of the founding fathers.

Question: What is cloud computing, and what do you believe its future will be?

David Gelernter: The idea of the cloud is that I compute on many platforms in many places. I use many different machines—either because I have a machine at home and a machine at work, because I have a couple of laptops, maybe I have a cell phone which is itself a computing platform, a pod, a pad, a BlackBerry, or whatever it is—there are a lot of different platforms. I travel; I need to compute in a lot of different places. So, for practical reasons, rather than taking my information and putting it in the file system on my new laptop, or on my machine at home, or on my cell phone, or something like that, it's much easier for me just to let the information float off somewhere so it's always sort of overhead, in some intangible place around me, and I can tune it in—in the sense that I can tune in C-SPAN from any cable-connected TV, I want to be able to tune in my information and see it from any internet-connected computer. It's important in terms of portability, and it has other major pragmatic advantages, some of which have not yet been fully realized. It is still an enormous nuisance to buy a new computer, which is absurd. When I get a new computer, it sits in my front hall for three weeks while I work up the courage to install it. I usually wait until one of my sons is home so he can do the work for me—because, although it should be trivial, what I want is to get a new computer, take it out of the box, plug it in, take a sledgehammer and smash the old one to bits, and I'm online. But because the cloud doesn't really function the way I want it to yet, one has to copy, painfully, the entire file system from one computer to the other, and even if one rigs up a special connection, that's a nuisance. One always winds up missing things.

So, anyway, you need a cloud because you have a lot of computers. You need a cloud because you often get new computers that are born empty. Maybe most important, you need a cloud for security. More and more of people's lives is going online. For security and privacy, I need my information to get the same sort of serious protection that my money gets in a bank. If I have money, I'm not going to shove it in a drawer under my bed and protect it with a shotgun or something like that. I'm just going to assume that there are institutions that I can trust—reasonably trustworthy—to take care of the money for me. By the same token, I don't want to worry about these issues, particularly with machines that are always on, that are always connected to the network, easy to break into. I don't want to manage the security on my machine. I don't want to worry about encryption; I don't want to worry about other techniques to frustrate thieves and spies. If my information is out on the cloud, not only can somebody else worry about encrypting and encoding it, not only can somebody else worry about barriers and logon protections, but—going back to Linda and the idea of parallelism, of a network server existing not on one machine but spread out over many—I'd like each line of text that I have to be spread out over a thousand computers, let's say, or over a million.

So, if I'm a hacker and I break into one computer, I may be able to read a vertical strip of a document or a photograph, which is meaningless in itself, and I have to break into another 999,999 computers to get the other strips. Or it may be more computers than that. The cost of computers is going asymptotically to zero; of course it will always cost money to connect them and keep them running and so forth. But not only for matters of convenience, which are very important—I need to be able to get my data anywhere on any platform—but even more for privacy and security: when people talk about a cloud, they mean information that's available on any platform, managed not by me but by an organization in which I can place as much trust as I place in the institutions of my community or my city that patrol the streets, that bank my money, that generally keep civilization running. They need to do the same thing with respect to the information landscape and privacy and security and so forth.
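The scheme he is gesturing at (each document split into interleaved strips, so that no single machine holds anything readable) can be sketched in a few lines of Python. This is a minimal toy, not any real cloud API: the `store`/`fetch` helpers, and the in-memory dicts standing in for servers, are assumptions made purely for illustration.

```python
# Toy sketch of "vertical striping": scatter each document across N
# servers so that any single server holds only every Nth character.
# A real system would layer encryption and redundancy on top of this.

N = 5  # number of servers; Gelernter imagines thousands or millions

def store(doc_id, text, servers):
    """Write character i of `text` to server i mod len(servers)."""
    for i, ch in enumerate(text):
        servers[i % len(servers)][(doc_id, i)] = ch

def fetch(doc_id, length, servers):
    """Reassemble the document by collecting every server's strip."""
    return "".join(servers[i % len(servers)][(doc_id, i)]
                   for i in range(length))

servers = [{} for _ in range(N)]
secret = "Attack at dawn."
store("memo", secret, servers)

# A hacker who breaks into server 0 sees only a meaningless strip
# (every 5th character of the document):
strip = "".join(ch for _, ch in sorted(servers[0].items()))
print(strip)

# Reading the whole document requires cooperation from all N servers.
assert fetch("memo", len(secret), servers) == secret
```

The point of the sketch is only the security property Gelernter describes: compromising one machine yields an unreadable every-Nth-character strip, and reconstruction requires access to all of them.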

Question: How can we fix ingrained software flaws rather than accepting them?

David Gelernter: Well, there are a bunch of big, juicy flaws one could talk about. I think the most important flaw that haunts us today is the way that software is stuck in 1984. The hardware world has changed dramatically. The world of computer users—everything has changed—but when I turn on a computer I see a desktop with windows and icons and a menu, and I have a mouse. This so-called desktop UI, or GUI (graphical user interface), was a brilliant invention at the Xerox Palo Alto Research Center in the 1970s. A group led by Alan Kay and others developed this idea of the GUI for the first personal computer, called the Alto. So, in 1976, '77, '78, this was an extraordinary invention. When Apple saw the Xerox demo—in, I think, 1979—and first implemented this thing on the Lisa in 1983, and in a more commercially successful form on the Mac in 1984, the brilliant move, the brilliant idea, was the GUI. However, 1984 was a long time ago. It's a full generation ago. If I sat a freshman in one of my classes today down with a 1984 Mac, he wouldn't even know it was a computer. I mean, it's got this tiny little screen, it was sort of like an upright shoebox, and it's obviously wildly different in terms of its capabilities—it doesn't have a hard drive in it; it has no internet connection, ordinarily. However, if I turn it on for him and he looks at the display, he's going to be right at home, because it's exactly what he sees on his Mac or his PC, or whatever platform he's got today. He's got windows, and they overlap, and there are menus and icons, and you drag the mouse around.

And so this interface—the windows interface, the desktop interface—was a brilliant innovation, the most important in computing, when Xerox came up with it in the 1970s; it was a tremendously important step forward when Apple popularized and commercialized it with the Mac in 1984; it was still reasonably fresh and new when Microsoft took over the world with Windows 3.0 in 1990. I mean, Microsoft was already a company, but that's the product that made Microsoft the dominant power in computing for the next period of years. It was still pretty good in 1990. By 1995, the Web was up and running, the uses of computers were becoming different, and email was changing the way in which people communicated. By the late '90s, Web commerce was emerging—the Web became essential to the economy in a lot of ways. Cell phones were becoming ubiquitous as computing platforms of another sort; there were all sorts of new ways to communicate. In the last 10 years, the emergence of social networking sites and things like that has once again changed the picture.

So, users today are a radically different group of people. They use computers for different purposes than they did in 1984. Today I use a computer, first of all, as a communication device, and second of all to manage the information that I have. It's essential for me in dealing with the world in a lot of different ways, in finding out what's going on, in keeping my own information up to date. I certainly don't use a computer for computing. Back in the 1980s, people still did compute with computers. The major users were people who had reasonably computationally serious applications at the time: either they were running spreadsheets or simulations—and video games have always been major users of cycles, going back then too. But the emergence of the network, the radical increase in the power of the hardware, the radical increase in the size of the user base, the radical change in the purposes to which computers are put, the radical difference in graphical capabilities—a modern high-definition screen as opposed to the small, low-definition, not even grayscale but black-and-white screen that the first Mac had and that Xerox was working with—all this suggests that 1984 software is probably not going to be the right solution for 2010 computing.

However, the industry—what can you say about an industry that makes so much money so fast, has so many stockholders, is responsible to so many owners and employees—is naturally reactionary. You know, all the companies that depend on Microsoft, all the companies that depend on Apple—and there are a few other producers at the edges dealing with Linux machines and things like that—they have heavy, what do you want to call it, fiduciary responsibilities that make them reactionary. I mean, if you are a very successful company, you're slow to change except at the edges. You want everybody to think you're leading edge, you want to do your best to look as if you're changing, but nonetheless you've got your windows, and you've got your menus, and you've got your mouse; you've got a display which treats the screen as an opaque surface, as a flat surface, as a desktop, with windows sort of tacked to it as if it were a bulletin board.

What I want is not an opaque surface; I want to think of the screen as a viewport that I can look through. I want to be able to look through it to a multi-dimensional information landscape on the other side. I don't want to have a little bulletin board or a little desktop to put stuff on. That was a brilliant idea in 1984; today it's a constraining view, and it's obsolete.

Question: How can a grasp of emotional subtext be built into artificial intelligence?

David Gelernter: AI used to be a completely separate field, in effect. If you were working in artificial intelligence—which had an obvious interface to the neurosciences, to cognitive science, to cognitive psychology, experimental psychology, and philosophy of mind—that was a very different field from software, software engineering, software design, software architecture. Today, the fields are increasingly in contact, as techniques developed in AI are ubiquitous in modern software. Generally not very fancy techniques, but systems of rules—what are called, or what used to be called, expert systems. They're fundamental in a lot of ways. If you look at the heart of the problems of AI, artificial intelligence, going back to the 1950s, the deeper question AI was interested in—and certainly it did want to build useful software, and it still does; that remains an important goal of AI—was the philosophical or scientific question: How does the mind work? What does it mean to think? Finding a definition for thinking, understanding thinking, had proven enormously elusive over the centuries. Since Descartes, whose opening bid essentially created the field of philosophy of mind in the 17th century, it has been very difficult to understand what the mind is—what consciousness is, what understanding is, and what it means to understand and communicate as opposed to merely use language in a robotic or mechanical way.

Now, AI people, in trying to understand thought and mind, generally approach it by attempting to build a working model in software. The computer has seemed mind-like to people since 1950. During the 1950s it was widely called an electronic brain, and while people don't use that phrase anymore, they still think of the computer in the same terms. It was an interesting idea, trying to build a working model of human thought out of software—because if I could get a computer to act as if it were thinking, then I could look at the software and maybe learn something. Not necessarily—because software is very different from brain-ware—but maybe get some idea of what thought really consists of, and maybe what consciousness consists of.

Now, my argument with the direction that AI has taken is that, borrowing from the standard approaches in philosophy of mind—in most cases, not in every case—and borrowing from approaches in cognitive psychology and experimental psychology, AI has tended to say, "We want to know about thinking; then we'll move on to emotion, or maybe consciousness." Thinking is what we want to understand. The really important activity the mind does is think—solve problems. A lot of people in AI used problem-solving as an equivalent of thinking. They said, "Here's what we're working on. We're working on artificial thought, artificial intelligence, problem-solving software." But it's obvious to anybody that the mind does much more than solve problems. It's very rare for anybody to go about solving a problem formally. Certainly we don't do a lot of mathematical problem-solving, or problem sets in physics, most of the time, and to the extent that we are confronted with problems to solve, we almost always first have recourse to experience, and we think, "Well, what did I do the last time?" But in a more fundamental way, it is obvious to anybody—maybe obvious to anybody who is not in AI—that if I am working at my computer and I get tired and I lean back and look out the window and just watch the passing scene, I'm still thinking; my mind hasn't shut down. I watch what's happening; I react in more subtle, cognitive ways to what I see. It's obvious that when I get tired, when my mind starts to drift, when I move into the free-associative state that was studied by Freud and that we know precedes falling asleep—free association is a kind of thinking also. My mind doesn't shut off, but I'm certainly not solving problems; I'm wandering around. And we also know that when we sleep, we think. Sleep thought is different from waking thought. Sleep thought is not solving problems in mathematics, or solving any kind of problems in a methodical way. Sleep thought is image thought, for the most part, and sleep thought is hallucinatory: I see things that aren't there.

So, we need to understand the connection, the spectrum, that connects wide-awake, focused, alert, problem-solving thought with what happens to my mind as I get tired, as my focus decreases, as I approach sleep. Actually, the brain goes through several oscillations like this during the day, but there's a continuous spectrum connecting my most focused, sharpest kind of analytical thought—the sharpest of which I am capable—on the one hand, and the lowest-focus kind of thought, in which my mind drifts and ultimately I find myself asleep and dreaming.

The field of artificial intelligence has studied only the very top end of the spectrum, and still tends to study only the very top end. It tends to say: what is thinking? It's this highly focused, wide-awake, alert, problem-solving state of mind. But not only is that not the whole story—the biggest unsolved problem that has tended to haunt philosophy of mind, cognitive psychology, and AI is creativity. People have always been fascinated: what makes for a creative person? What explains a creative leap, which is a well-defined psychological event? People know when it happens to them. There is general agreement that to be creative is to have the ability to invent new analogies—to connect two things that are not obviously related, but once you have made the connection, you can see that, yes, there is a relationship, and other people can see the relationship too, and creativity flows from that.

Now, we know that the invention of analogy has to do not with highly focused, analytic, problem-solving thought. Creating analogy means connecting thoughts—letting your mind drift, essentially, from one thought to another. So, not only do we need to study the entire cognitive spectrum, or cognitive continuum, because that's what human beings are—they are not problem-solving machines; they do many kinds of thinking that are not problem-solving—but if we ever want to know what creativity is, if we ever want to know what goes into the invention of a new analogy, we're going to have to study the free-associative states in which the mind drifts from state to state to state.

And finally, to add in a question which has to do with emotion: how do we connect thoughts? I've argued—and I won't go through this now—that as your focus declines, as you become less alert, less focused, as you look out the window, as your mind starts to wander, as you start to get drowsy, emotion plays a more and more important role in your thought. Emotion is what allows us to take two thoughts or ideas that seem very different and connect them together, because emotion is a tremendously subtle thing—a subtle kind of code, or tag, that can be attached to a very complicated scene. If I say, "What is your emotion on the first really warm day in April or March, when you go out and you don't need a coat and you can smell the flowers blooming, and there may be remnants of snow but you know it's not going to snow anymore, and there's a certain springiness in the air—what do you feel?" It's not that you feel happy, exactly. There are a million kinds of happiness. It's a particular shade of emotion. Or what you feel on a million other occasions: you go to the mailbox and you see a letter from a girlfriend from 15 years ago, and you haven't heard from her since. Or you read in the newspaper something about blah, blah, blah, or your computer breaks just when you need to do something. There are many, many circumstances in which I can say I'm happy, I'm sad, but I need to go much further, beyond language. I can't use ordinary language to describe the nuanced emotions that I feel on these occasions. They're much more subtle than elation or depression. But the mind knows what they are even if they can't be reduced to words, and I can connect two scenes that seem very different, that virtually are very different—where the actors are different, the scenes are different, the colors are different, the sounds are different—but nonetheless, if they made me feel the same way, I can connect them and create analogies. So, that is how it ultimately fits together.

Question: How has surviving the Unabomber attack changed your life?

David Gelernter: Zero. It was my responsibility—I think it would be anybody's who had been attacked in a particularly cowardly and despicable fashion—to go on. If I had said, "This attack has changed me in the following 10 ways"... I'm not interested in being changed by criminals, murderers, and terrorists. I'm interested in being whoever I was destined to be as a member of my family and my community, and that's what I've been doing. It slowed me down, presented physical challenges, but it didn't change my worldview, or the sort of broader sense... Worldview in the sense—there's a tremendously useful German word used in philosophy: Weltanschauung. Worldview meaning not just looking around, but how to make sense of things, how I put it all together in a coherent way. So, my worldview is the same.

Question: Has being the victim of an attack changed your feelings about terrorism?

David Gelernter: I'm not a victim. I never was, never will be. Victimhood is something you choose, or something you reject. I, as so many others have done before me and are doing today, reject it. I hate the tendency of society to glorify victimhood and to speak of oppression and victimhood and persecution as some sort of badge of honor, or something of that sort. I'm not a victim.

On terrorism, on the other hand, I guess it's fair to say that I had a close-up, personal look at terrorism. I don't think my views have changed any. The fact is, many a member of the American Jewish community has relatives who lived through the Holocaust and, more important, has relatives or close friends in Israel who were either attacked themselves or whose families have experienced terrorist attack—because terrorism goes back many centuries, but it has always been a weapon of choice of Jew-haters and Israel-haters... So, the tragic fact is that the reality of terrorism—its fundamental cowardliness, its fundamental anti-human character—is familiar to everybody. I should say not just in the American Jewish community; the fact is that America is unique in its sympathy for Israel. Europe certainly doesn't feel this way; Asia doesn't feel this way. This is not a feeling only of American Jews. In fact, in many cases, the Christian community has shown itself to be much more interested in Israel's fate and well-being than the Jewish community, which has its own political axes to grind. I think America in general has felt close to Israel, in some ways because the states are so similar—there is no nation in the world set up by people with Bibles in their back pockets as a New Israel, no nation that has been set up on that basis, aside from the United States and Israel. So there has always been that sympathy, and growing up, one had a feeling in this country—I mean back in the 1960s and '70s—that terrorist attacks on Israel were hitting close to home. It was impossible not to be aware of the nature of terrorism, the threat of terrorism. It's something that I've always lived with, tragically, as has everybody who has felt close to Israel.

Question: What makes you optimistic about the century ahead?

David Gelernter: We're looking at—in one word, maybe I should say graphics. Not only computer graphics or animation, but the enormously increased scope for pictures: for showing pictures, for seeing pictures, for seeing things. Seeing is a source of wisdom and pleasure in a lot of ways. Mankind really has had no vocabulary to discuss color, because if you look at art history, until two generations ago nobody knew what paintings looked like. They could be reproduced in black and white going back to the 19th century; before then they couldn't be reproduced in any way at all. But until, say, the 1930s, '40s, '50s, there was no way to—you could say Titian is a great colorist, or Velázquez has extraordinarily subtle browns, or the reason the 13th-century glass at Chartres is unique is the blue, the special blue. But you couldn't see it. You had to travel to France or to, you know, Venice—wherever—to see it. And not only that, once you were there, unless you stayed, you're not going to stay planted in front of a picture in a museum, nor are you going to camp out in a cathedral. But computers have not only made displaying on their screens, but also printing on paper—color printing—vastly better and inexpensive.

The possibility that we have now of seeing what mankind has done—the art that has been made, the cities that have been built, the landscapes that have drawn people—is tremendously exciting. And to see each other, because ultimately what people want to see most of all is other people. That's exciting. It opens up a new world that mankind has longed for ever since he has said, "Colors are good, and I want to make my world colorful, and I want to see my fellow human beings, and I want to build things, and I want the horizons to be further than what I can see from my front door."

Recorded on April 1, 2010.
