<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>genius &#8211; Zócalo Public Square</title>
	<atom:link href="https://legacy.zocalopublicsquare.org/tag/genius/feed/" rel="self" type="application/rss+xml" />
	<link>https://legacy.zocalopublicsquare.org</link>
	<description>Ideas Journalism With a Head and a Heart</description>
	<lastBuildDate>Mon, 21 Oct 2024 07:01:54 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
		<item>
		<title>Einstein&#8217;s Genius Wasn&#8217;t in His Brain; It Was in His Friends</title>
		<link>https://legacy.zocalopublicsquare.org/2020/02/20/albert-einsteins-brain/ideas/essay/</link>
		<comments>https://legacy.zocalopublicsquare.org/2020/02/20/albert-einsteins-brain/ideas/essay/#respond</comments>
		<pubDate>Thu, 20 Feb 2020 08:01:52 +0000</pubDate>
		<dc:creator>Sal Restivo</dc:creator>
				<category><![CDATA[Essay]]></category>
		<category><![CDATA[connectome]]></category>
		<category><![CDATA[creativity]]></category>
		<category><![CDATA[Einstein's brain]]></category>
		<category><![CDATA[genius]]></category>
		<category><![CDATA[myth of lone genius]]></category>
		<category><![CDATA[social brain theory]]></category>
		<category><![CDATA[social networks]]></category>

		<guid isPermaLink="false">https://legacy.zocalopublicsquare.org/?p=109636</guid>
	<description><![CDATA[<p>In 2017, the “Genius” issue of <i>National Geographic</i> credited the power of Albert Einstein’s “own thoughts” with his prediction of gravitational waves, a century before they were detected using highly sophisticated technologies. Does this prove that Einstein really was, as many have claimed, the “genius of all geniuses”? </p>
<p>Einstein and his brain are iconic objects—a sacred scientific hero and a sacred relic—but thinking differently about him now can help us revise outdated ideas about genius and about ourselves. There are several reasons to question Einstein’s genius: First, the very idea of “genius” has come under critical scrutiny in contemporary research on creativity. Second, a new view of the social basis of creativity has emerged in the last quarter century; new ideas are created in social networks, not in individuals or individual brains. Third, the idea of a biological brain is being superseded by a new paradigm that sees the brain in a social context. </p>
<p>The post <a rel="nofollow" href="https://legacy.zocalopublicsquare.org/2020/02/20/albert-einsteins-brain/ideas/essay/">Einstein&#8217;s Genius Wasn&#8217;t in His Brain; It Was in His Friends</a> appeared first on <a rel="nofollow" href="https://legacy.zocalopublicsquare.org">Zócalo Public Square</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>In 2017, the “Genius” issue of <i>National Geographic</i> credited the power of Albert Einstein’s “own thoughts” with his prediction of gravitational waves, a century before they were detected using highly sophisticated technologies. Does this prove that Einstein really was, as many have claimed, the “genius of all geniuses”? </p>
<p>Einstein and his brain are iconic objects—a sacred scientific hero and a sacred relic—but thinking differently about him now can help us revise outdated ideas about genius and about ourselves. There are several reasons to question Einstein’s genius: First, the very idea of “genius” has come under critical scrutiny in contemporary research on creativity. Second, a new view of the social basis of creativity has emerged in the last quarter century; new ideas are created in social networks, not in individuals or individual brains. Third, the idea of a biological brain is being superseded by a new paradigm that sees the brain in a social context. It has become increasingly clear in the life and social sciences that humans are the most social of social species. We can now say with some confidence that the “I” is a grammatical illusion. We all, as Walt Whitman claimed in <i>Song of Myself</i>, contain multitudes; the self is a mosaic, not a unitary ego, in a scientific sense as well as a poetic one. </p>
<p>This doesn’t challenge the uniqueness of Einstein and his achievements but it does change our understanding of that uniqueness.  </p>
<p>When we identify Einstein as a genius, we learn more about ourselves and our culture than we do about Einstein. The term “genius” rests on the concept of the individual as an entity that stands apart from society, history, and culture—even outside of time and space. Culturally, genius is also gendered and divinely inspired—so to meet a genius is to meet a male god. The element of the male divine spins the genius right out of the world into a sacred space. It sets Einstein and his brain apart from the rest of us.</p>
<p>In the real world, there is no such thing as the lone wolf genius. Every genius, like every person, is a social network. And every genius stands on the shoulders of a social network, not the shoulders of giants. For the commonly accepted concept of “genius” to be meaningful, it would have to be rooted in genes, neurons, or both. In that case, geniuses would appear at random, scattered across intellectual and cultural landscapes. On the contrary, the most comprehensive studies of genius by social scientists have demonstrated that geniuses do not appear at random. Instead, genius <a href="https://legacy.zocalopublicsquare.org/2017/02/21/never-get-one-isolated-great-thinker-time/ideas/nexus/" target="_blank" rel="noopener noreferrer">clusters</a>.</p>
<p>The fact that creative acts and actors cluster was recognized in the ancient world. Modern research shows that creative clusters appear predictably during times of rapid decline or rapid growth within civilizations. We also know that new ideas, theories, and technologies emerge simultaneously in different places in the same cultural neighborhoods and share a family resemblance. The <a href="https://legacy.zocalopublicsquare.org/2017/02/22/genius-alone-doesnt-advance-big-ideas/ideas/nexus/" target="_blank" rel="noopener noreferrer">particular version that prevails</a> and the person or persons who get credit for the innovation hinges on negotiation, politics, public relations, personalities, connections, and in some cases (take, for example, the electrical engineer Nikola Tesla) the outcomes of patent disputes.   </p>
<p>The notion that Einstein’s “own thoughts” were responsible for his insights into gravitational waves ignores his collaborations with Michele Besso and Marcel Grossmann during the construction of the general theory. It was Grossmann, for example, who helped Einstein with the geometry and the concept of tensors he needed to formalize the theory. In the same way, the portrait of Einstein as a lone wolf patent clerk who published the revolutionary 1905 papers leaves out a network of his influences—from Newton to Lorentz, and Poincaré to Minkowski. It also obscures the roles of his friends, teachers, and colleagues in physics, of his first wife Mileva Marić, and of his math assistant Walther Mayer. </p>
<p>The important point is not that Einstein worked with and depended on others. It is that Einstein <i>is</i> those others—they are embodied in his self as a social network. When you understand all the people who went into Einstein being Einstein, does the label “genius” really help us understand him or is it merely a representation of untutored awe and worship?   </p>
<div class="pullquote">Einstein and his brain are iconic objects—a sacred scientific hero and a sacred relic—but thinking differently about him now can help us revise outdated ideas about genius, and about ourselves.</div>
<p>What did Einstein’s genius cluster look like? Einstein’s 1905 papers came in the midst of a cultural flowering of ideas, inventions, and discoveries across the full spectrum of the arts, humanities, and sciences between 1840 and 1930. Einstein’s genius cluster in physics included such luminaries as Planck, Tesla, Marconi, Westinghouse, Marie Curie, the Wright Brothers, Emmy Noether, and Edison. The two great innovations that would remain at the core of physics throughout the twentieth and into the twenty-first century—relativity theory and quantum mechanics—were born in the early 1900s. </p>
<p>Expanding that genius cluster to encompass music brings in such names as Sibelius, Puccini, Debussy, Schoenberg, Stravinsky, and Charles Ives. Innovations in literature included the rise of the novel, American Transcendentalism, Realism, Stream of Consciousness, various forms of Modernism, Naturalism, the growth of children’s literature, and the Harlem Renaissance of the 1920s. There was a sympathetic mutuality that linked Cubism (represented by Picasso’s “Les Demoiselles d’Avignon,” 1907) and relativity theory. Both involved challenges to conventions regarding absolute time and space. </p>
<p>The period 1840-1930 also witnessed a veritable Copernican revolution: the emergence and crystallization of the social sciences. This period can be considered the classic Age of the Social. It ushered in the idea that we are through and through social beings. </p>
<p>Ultimately, by looking at the myth of Einstein’s brain, we can understand how the myth of individualism is at odds with the evolutionary reality that humans are always, already, and everywhere social. Einstein’s singular status is not a matter of genes, neurons, quantum phenomena, or the biological brain; the architecture of his brain reflected his experiences in the world, all of the social networks he encountered in his life. Since the 1990s, developments in social neuroscience, studies of brain plasticity, epigenetics, and network theory have fueled the development of an explanation for Einstein’s genius—a social brain paradigm. </p>
<p>The idea that we have social brains arose from hypotheses about the connection between brain size and social complexity. Beginning in the 1920s and then more systematically in the 1950s, these hypotheses were explored in studies of non-human primates. Two conflicting hypotheses fueled this research: larger brains led to larger and more dense social networks; or larger and more dense social networks led to larger brains. Over time, it seemed more reasonable to hypothesize that brain size, and the size and density of social networks, were coupled in co-evolution.  </p>
<p>All of this led to the crystallization of the social brain hypothesis, which entered the neuroscience literature in 1990. This hypothesis initially identified specific regions of the brain (including, for example, the amygdala and the insula) as “the social brain.” More recent studies suggest that the whole brain must be considered a social and cultural entity. In other words, the brain is a complex organ that originates and functions at the nexus of biological, environmental, and social forces. By the 2000s, the social brain hypothesis was finding its way into studies of autism, schizophrenia, and other classic topics in psychiatry.  </p>
<p>The story of pathologist Thomas Harvey removing Einstein’s brain during the autopsy in 1955 is well known. However, there were no studies of Harvey’s brain slides between 1955 and 1985, and those done between 1985 and the early 2000s proved, in the end, to be sterile. The noteworthy features of Einstein’s brain some researchers identified were controversial, and many experts who studied Einstein’s brain found nothing unusual. One brain scientist said it was just an old, diseased brain. These studies were guided by the false assumption that the mind is the brain, and by an inability to “see” social life as the locus of causal forces that shape our behaviors, emotions, and thoughts. </p>
<p>And yet, the myth that we are our brains lives on in science, politics, and the culture. It is the basis for Bush’s proclamation of the 1990s as the Decade of the Brain, Obama’s 2013 BRAIN initiative, and comparable policy pronouncements in Europe, the Middle East, and China. Brain research remains haunted by the myth of individualism, which is at its root the myth of the brain in a vat. (<i>The Matrix</i> is an artistic gloss on this metaphor.) The social brain, though, proposes a far more powerful concept: Network thinking, which is capable of connecting the smallest parts, such as neurons, across multiple scales to the global network of information and communication. Don’t think of a brain in a vat, but of a connectome—in which everything from cells and neurons to neural nets, to the body, its microbiome and its organs, and to social relations and the environment are linked by a circulation of information. </p>
<div class="signup_embed"><div class="ctct-inline-form" data-form-id="3e5fdcce-d39a-4033-8e5f-6d2afdbbd6d2"></div><p class="optout">You may opt out or <a href="https://www.zocalopublicsquare.org/contact-us/">contact us</a> anytime.</p></div>
<p>It’s been 65 years since Einstein’s brain was removed during the autopsy, and still the most insightful discussion of it has come not from the halls of science and philosophy but from TV land. On July 21, 1999, members of David Letterman’s audience were allowed to ask questions of “Einstein’s brain,” a model brain in a beaker of green gelatin. After they presented their questions, they were told that due to Einstein’s death in 1955, they were addressing dead tissue, which could not answer. This comedic vignette did more for neuroscience than all of the papers and lectures on Einstein’s brain.</p>
<p>The post <a rel="nofollow" href="https://legacy.zocalopublicsquare.org/2020/02/20/albert-einsteins-brain/ideas/essay/">Einstein&#8217;s Genius Wasn&#8217;t in His Brain; It Was in His Friends</a> appeared first on <a rel="nofollow" href="https://legacy.zocalopublicsquare.org">Zócalo Public Square</a>.</p>
]]></content:encoded>
			<wfw:commentRss>https://legacy.zocalopublicsquare.org/2020/02/20/albert-einsteins-brain/ideas/essay/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Genius Alone Doesn’t Advance Big Ideas</title>
		<link>https://legacy.zocalopublicsquare.org/2017/02/22/genius-alone-doesnt-advance-big-ideas/ideas/nexus/</link>
		<comments>https://legacy.zocalopublicsquare.org/2017/02/22/genius-alone-doesnt-advance-big-ideas/ideas/nexus/#respond</comments>
		<pubDate>Wed, 22 Feb 2017 08:01:51 +0000</pubDate>
		<dc:creator>Neil Gross</dc:creator>
				<category><![CDATA[Essay]]></category>
		<category><![CDATA[Nexus]]></category>
		<category><![CDATA[Berggruen Institute]]></category>
		<category><![CDATA[genius]]></category>
		<category><![CDATA[ideas]]></category>
		<category><![CDATA[philosophy]]></category>
		<category><![CDATA[social climate]]></category>
		<category><![CDATA[thinkers]]></category>

		<guid isPermaLink="false">https://legacy.zocalopublicsquare.org/?p=83712</guid>
		<description><![CDATA[<p>Where do big new ideas come from—the kind that break the mold and change how we see the world? As a sociologist, this has long been an interest of mine. So I was excited to read Michael Lewis’ new book <i>The Undoing Project</i>, which tells the story of Daniel Kahneman and Amos Tversky, the Israeli psychologists whose work on decision-making helped convince economists—and everyone else—that people aren’t nearly as rational as economic theory would predict. Kahneman and Tversky’s research on the cognitive errors to which we’re all prone was so transformative for economics that it scored Kahneman a Nobel Prize in 2002. (Tversky died in 1996.) How’d they come up with it? And what can we learn from their experience? </p>
<p>Lewis gives us a fast-moving tale with all the verve you’d expect from the author of <i>Flash Boys</i> and <i>Moneyball</i>. It focuses on how—for a time—two geniuses became joined at the hip. </p>
<p>The post <a rel="nofollow" href="https://legacy.zocalopublicsquare.org/2017/02/22/genius-alone-doesnt-advance-big-ideas/ideas/nexus/">Genius Alone Doesn’t Advance Big Ideas</a> appeared first on <a rel="nofollow" href="https://legacy.zocalopublicsquare.org">Zócalo Public Square</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>Where do big new ideas come from—the kind that break the mold and change how we see the world? As a sociologist, this has long been an interest of mine. So I was excited to read Michael Lewis’ new book <a href="http://books.wwnorton.com/books/The-Undoing-Project/"><i>The Undoing Project</i></a>, which tells the story of Daniel Kahneman and Amos Tversky, the Israeli psychologists whose work on decision-making helped convince economists—and everyone else—that people aren’t nearly as rational as economic theory would predict. Kahneman and Tversky’s research on the cognitive errors to which we’re all prone was so transformative for economics that it scored Kahneman a Nobel Prize in 2002. (Tversky died in 1996.) How’d they come up with it? And what can we learn from their experience? </p>
<p>Lewis gives us a fast-moving tale with all the verve you’d expect from the author of <i>Flash Boys</i> and <i>Moneyball</i>. It focuses on how—for a time—two geniuses became joined at the hip. </p>
<p>Kahneman, nervous and self-critical, was a teenager when he fled occupied France with his family during World War II. Settling in Jerusalem before the 1948 war, he was part of the first wave of students to study psychology at the Hebrew University. He conducted psychological research for the Israeli army before heading off to graduate school at UC Berkeley.</p>
<p>Tversky, the child of Russian émigrés, had the self-confidence Kahneman lacked. A dazzling scholar with a mathematical mind, Tversky was also a paratrooper who’d been awarded a medal of bravery by Moshe Dayan, the Israeli general. After completing his military service, Tversky went to the University of Michigan for his Ph.D.</p>
<p>In 1969 Kahneman asked Tversky to speak in a graduate seminar he was leading in Israel. Tversky discussed a line of experimental research he’d learned about in Michigan. Subjects were presented with 20 poker chips, one at a time, said to have been drawn at random from bags containing different mixes of red and blue chips. (Lewis offers a slightly different description of the experimental setup.)</p>
<p>Researchers found that as people were presented with more reds or blues, they gave different estimates of the mix of each bag. If they were presented with a lot of reds, for example, they guessed the bag contained mostly red chips. Their guesses didn’t correspond perfectly with statistical theory—they were more conservative than the chip data warranted—but to the researchers who carried out the experiment, and to a young Amos Tversky, the lesson was that people had an instinctive understanding of probability. </p>
<p>Kahneman wasn’t impressed—and he told Tversky so. He thought it was obvious that if someone sees a lot of red chips, the guess is going to be “mostly red bag.” This isn’t necessarily because the human brain is hard-wired to understand odds. It could just be generalizing from something you’ve observed, with no clue at all as to the underlying mathematics.</p>
<p>Tversky was dumbstruck. Of course Kahneman was right. How had he not seen it before? But if humans don’t grasp odds effortlessly, how do they go about making decisions in situations where it’s important to understand probability, from investors gauging portfolio growth to doctors relying on imaging to make cancer diagnoses?</p>
<p>The answer, the two men came to conclude, is by relying heavily on intuition, perception, and cognitive shortcuts. This often leads to choices that are suboptimal from the perspective of strict rationality. Kahneman and Tversky began an intensely productive collaboration. In paper after paper they explored the errors and biases to which the human mind is prone. Together they laid the groundwork for an intellectual revolution not just in psychology, but also in economics, eventually providing the foundation for the new field of behavioral economics, which uses psychology to help understand economic behavior. </p>
<p>Kahneman and Tversky spent hours huddling together, devising new experiments, writing, being funny. As Lewis describes it, their innovative scholarship was one part the magic of collaborative synergy, one part sheer brilliance, and one part Israeli gumption: Getting human behavior right matters when your country faces existential threat every day. </p>
<p>This makes for a great story. But it left me wondering about the broader social roots of Kahneman and Tversky’s ideas.</p>
<p>You might not think where ideas come from is the purview of sociology. But early thinkers in the social sciences like Karl Marx or Max Weber—the German sociologist best known for his writings on the Protestant work ethic—recognized that what intellectuals do and say can affect the course of society. Marx argued that intellectuals often develop systems of thought that advance the interests of the economic class they belong to. Weber held that the cultural values of an intellectual’s social group guide the direction of his or her thinking.</p>
<div class="pullquote"> You might not think where ideas come from is the purview of sociology. But early thinkers in the social sciences like Karl Marx or Max Weber … recognized that what intellectuals do and say can affect the course of society.  </div>
<p>Since these early theories, other ways of understanding the social origins of thought have arisen. For example, in a 1998 book the sociologist <a href="https://legacy.zocalopublicsquare.org/2017/02/21/never-get-one-isolated-great-thinker-time/ideas/nexus/">Randall Collins</a> considered how the social networks of philosophers—who they know—influence their ideas. Collaborators are part of one’s network, but so are teachers, students, colleagues, and confidants.</p>
<p>One of the many philosophers Collins wrote about is Jean-Paul Sartre, who gave us existentialism, the mid-20th-century French philosophy extolling the virtues of authenticity. Sartre belonged to an elite intellectual network in Paris. Collins showed that existentialism took shape as Sartre brought together concepts and ideas that were already floating around in the café-society milieu. </p>
<p>Other sociologists study how ideas spread and make an impact. Marion Fourcade-Gourinchas and Sarah Babb, for instance, published <a href="http://www.journals.uchicago.edu/doi/abs/10.1086/367922">a study in 2002</a> on why many countries abandoned Keynesian economic policies in the 1970s and 1980s, embracing free market theories and policies in their place. </p>
<p>Looking at developments in Britain, France, Mexico, and Chile, Fourcade-Gourinchas and Babb showed that policy makers glommed onto free market ideas partly because they thought deregulation could help their countries weather the economic turbulence of that period. But political reasons also mattered. In Chile, for instance, laissez-faire caught on after the 1973 military coup. General Augusto Pinochet, who wanted to get rid of the last vestiges of socialism, put the nation’s economic policies in the hands of Chilean economists who’d been trained at the University of Chicago, a center of free market thought. Free market thinking also had the backing of Chilean business groups concerned about their bottom line. If it hadn’t been for a complicated mix of economic, political, and cultural factors, these ideas wouldn’t have been able to make headway. </p>
<p>It almost goes without saying that if Lewis had written about Kahneman and Tversky from a sociological point of view, his book wouldn’t have been quite as exciting. Escaping from Nazis, jumping out of airplanes, forming platonic intellectual love affairs, showing up snide, complacent psychologists with pithy remarks—that’s tough stuff for sociological analysis to compete with. But something’s lost when you don’t pay attention to intellectuals’ social environments.</p>
<p>When I was reading <i>The Undoing Project</i>, I kept thinking about how economics became the queen of the social sciences in the post-war period. Scholars in other fields were jealous. Fame awaited anyone who could credibly show that the rationality assumptions at the heart of modern economics were, well, a stretch. (And that you could explain stuff better if you relaxed those assumptions.)</p>
<p>If you look at the <a href="http://blogs.lse.ac.uk/impactofsocialsciences/2016/05/12/what-are-the-most-cited-publications-in-the-social-sciences-according-to-google-scholar/">all-time most cited social science articles</a>, you’ll see two pieces by Kahneman and Tversky, but also a 1973 article called <a href="https://sociology.stanford.edu/sites/default/files/publications/the_strength_of_weak_ties_and_exch_w-gans.pdf">“The Strength of Weak Ties”</a> by sociologist Mark Granovetter and an article published the same year by economist Oliver Williamson (another Nobel winner) titled <a href="https://books.google.com/books?id=pg-wGL12BjUC&#038;lpg=PA106&#038;ots=vUVwhyFa9l&#038;dq=markets%20and%20hierarchies&#038;lr&#038;pg=PA106#v=onepage&#038;q=markets%20and%20hierarchies&#038;f=false">“Markets and Hierarchies.”</a> Both articles argued—against economic orthodoxy—that markets just aren’t places where efficiency and unbounded rationality reign supreme. </p>
<p>Around the same time, scholars of law and society were busy showing that law on the books is one thing; law as it’s actually practiced by judges, lawyers, and cops is quite another. And historian Thomas Kuhn had already come out with his <a href="https://www.amazon.com/Structure-Scientific-Revolutions-Thomas-Kuhn/dp/0226458083">famous account of paradigm change</a>—claiming that even in science rationality doesn’t always prevail.</p>
<p>Kahneman and Tversky must have sensed that they’d discovered a potent line of attack against standard economic thinking. Their ideas thus had a lot to do with the state of the intellectual field at the time, the social and academic world they navigated: It provided powerful incentives for their collaborative work, and gave them an opening they seized.</p>
<p>As to why their ideas have gained so much traction recently with policymakers and the public (Kahneman’s 2011 book, <a href="http://us.macmillan.com/thinkingfastandslow/danielkahneman/9780374533557/"><i>Thinking, Fast and Slow</i></a>, was a runaway best-seller), one hypothesis is that knowing something about behavioral economics has become a mark of status in business and political circles. On the heels of wild stock market valuations in the 1990s, the crazy real estate bubble of the early 2000s, and the subsequent crash, conventional economic models haven’t been looking so good. Behavioral economics is a plausible alternative, and if you devise a new marketing campaign or a set of public policies based on behavioral premises, you’ll look pretty smart.</p>
<p>And there’s a side benefit. If you take the position that markets don’t work perfectly because people are irrational, it gets you out of having to face the possibility that markets don’t work perfectly because markets are inherently exploitative and prone to crisis.</p>
<p>Stories about geniuses with big ideas are fun to read, but they’re not necessarily the whole story. Unless you have a sense for the social context in which ideas develop and spread, it’s impossible to get a proper handle on how much intrinsic value an idea may have versus how much it may <i>seem</i> to have in light of social factors surrounding it. </p>
<p>The post <a rel="nofollow" href="https://legacy.zocalopublicsquare.org/2017/02/22/genius-alone-doesnt-advance-big-ideas/ideas/nexus/">Genius Alone Doesn’t Advance Big Ideas</a> appeared first on <a rel="nofollow" href="https://legacy.zocalopublicsquare.org">Zócalo Public Square</a>.</p>
]]></content:encoded>
			<wfw:commentRss>https://legacy.zocalopublicsquare.org/2017/02/22/genius-alone-doesnt-advance-big-ideas/ideas/nexus/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Name That Tune: Da-Da-Da-DUM</title>
		<link>https://legacy.zocalopublicsquare.org/2012/12/13/name-that-tune-da-da-da-dum/events/the-takeaway/</link>
		<comments>https://legacy.zocalopublicsquare.org/2012/12/13/name-that-tune-da-da-da-dum/events/the-takeaway/#respond</comments>
		<pubDate>Thu, 13 Dec 2012 13:00:49 +0000</pubDate>
		<dc:creator>Sarah Rothbard</dc:creator>
				<category><![CDATA[The Takeaway]]></category>
		<category><![CDATA[creativity]]></category>
		<category><![CDATA[genius]]></category>
		<category><![CDATA[music]]></category>

		<guid isPermaLink="false">https://legacy.zocalopublicsquare.org/?p=43449</guid>
	<description><![CDATA[<p>They may be the most easily recognizable four notes of music ever composed, but the opening of Beethoven’s Fifth Symphony confounds orchestras and conductors with its very notation: a rest that <em>Boston Globe</em> music critic Matthew Guerrieri, author of <em>The First Four Notes: Beethoven’s Fifth and the Human Imagination</em>, called “a little slice of nothing.”</p>
<p>Zócalo editor T.A. Frank opened his conversation with Guerrieri by asking the crowd at the Heard Museum in Phoenix, at an event co-presented by the Arizona State University Center for Science and the Imagination, to guess the correct time signature of the piece. About half of the audience correctly identified it as 2/4. When Frank prodded Guerrieri to conduct the audience in singing those infamous first four notes in unison, again just about half of the group was successful.</p>
<p>Guerrieri explained that the easiest thing for a conductor to do in order to ensure a harmonious opening is to give the orchestra a beat before the rest. </p>
<p>The post <a rel="nofollow" href="https://legacy.zocalopublicsquare.org/2012/12/13/name-that-tune-da-da-da-dum/events/the-takeaway/">Name That Tune: Da-Da-Da-DUM</a> appeared first on <a rel="nofollow" href="https://legacy.zocalopublicsquare.org">Zócalo Public Square</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>They may be the most easily recognizable four notes of music ever composed, but the opening of Beethoven’s Fifth Symphony confounds orchestras and conductors with its very notation: a rest that <em>Boston Globe</em> music critic Matthew Guerrieri, author of <em>The First Four Notes: Beethoven’s Fifth and the Human Imagination</em>, called “a little slice of nothing.”</p>
<p>Zócalo editor T.A. Frank opened his conversation with Guerrieri by asking the crowd at the Heard Museum in Phoenix, at an event co-presented by the <a href="http://csi.asu.edu/">Arizona State University Center for Science and the Imagination</a>, to guess the correct time signature of the piece. About half of the audience correctly identified it as 2/4. When Frank prodded Guerrieri to conduct the audience in singing those infamous first four notes in unison, again just about half of the group was successful.</p>
<p>Guerrieri explained that the easiest thing for a conductor to do in order to ensure a harmonious opening is to give the orchestra a beat before the rest. But that has other hazards: “It’s the sort of thing that other conductors will regard you as less of a conductor for doing,” he said. “It’s become this odd sort of contest to see who can do the most grand gesture.”</p>
<p>So what, besides their familiarity and difficulty, makes these notes so special—what makes them genius?</p>
<p>Guerrieri explained that when Beethoven wrote the piece, there was a tradition of symphonies beginning with “slow, grand statements.” Beethoven, instead, starts the Fifth with a fast tempo, only to pull the rug out from underneath listeners’ feet by stopping immediately—then stopping and starting and stopping again before the piece starts up in earnest. It was shocking in Beethoven’s day, said Guerrieri, for its abruptness, for being “nakedly aggressive,” and for being “such a direct piece of music.”</p>
<p>But beyond shock value, asked Frank, what is it that makes Beethoven a genius?</p>
<p>Guerrieri said that it’s a combination of traits, the first being the almost athletic genius of playing the piano: “It’s muscles, it’s using your body, and it’s making your body do something on command,” he said. But it’s also about a “daisy chain of connections,” from physical talent with the instrument to translating that talent onto a page of written instructions to mastering that talent and taking it somewhere new. On top of all this, Beethoven had these skills at a young age, which set him apart from his contemporaries.</p>
<p>But he also had another form of brilliance: “the genius of making a career out of it.” Beethoven, said Guerrieri, “very consciously manipulated and leveraged his fame, which at the time was an unusual thing.” We take it for granted because we live in a celebrity-driven culture, but in Beethoven’s day, lasting celebrity—particularly for a composer—was something new.</p>
<p>Beethoven, like a lot of geniuses, was something of an outsider, said Frank—it’s a thread that seems to run through the stories of brilliant people.</p>
<p>Guerrieri said that this idea—of an artist or genius being apart from everyone else—is a Romantic one that in fact started with Beethoven. The German Romantics adopted him as their example of a genius with more privileged access to the sublime and to ideas beyond conventional human understanding. The Romantics were also the first to consider madness to be a symptom of genius—and you can see this start with Beethoven, who was so legendarily irascible and antisocial.</p>
<p>Beethoven was also celebrated because he was deaf—and thus perceived as being shut off from the world so he could receive divine inspiration without interference. This idea was reinforced when, in the 1860s, his body was transferred to a new tomb, and his skull was discovered to be unusually thick—which, wrote Richard Wagner, also helped him get inspiration from above without the rest of the world interrupting.</p>
<p>Today, said Frank, the study of creativity and genius is an interdisciplinary field, with neuroscientists and social scientists and music teachers offering studies and instructions on unleashing the genius within all of us. Yet we haven’t produced a Beethoven recently. What, he asked, is wrong with us?</p>
<p>Guerrieri said he didn’t know if the cultural world will ever produce a figure like Beethoven again. His incredible fame came in part because of the small circle he inhabited—he could “encompass the entire scope of the area in which he could become famous.” Today, there are so many more niches and styles and genres that the same level of cultural saturation is impossible. “There probably are geniuses of Beethoven’s level out there,” he said. “But they will not become the sort of overpowering, universal figure that Beethoven did.” It’s not losing something, he added—it’s our cultural life getting richer.</p>
<p>Frank quoted a 1970s Paul Masson advertisement proclaiming that the amount of time it took Beethoven to compose the Fifth Symphony—four years—is proof that “some things can’t be rushed: good music and good wine.” Is being slow or labored part of genius?</p>
<p>It can be, said Guerrieri, and it certainly was for Beethoven. But that plodding pace is about the homework and discipline that genius requires. If you look at geniuses, they put in the work, he said—they’re honing their craft all the time, whether they’re James Joyce or Beethoven or George Gershwin. The one great example of Beethoven’s genius, said Guerrieri, is that there are no shortcuts to it.</p>
<p>In the question-and-answer session, an audience member asked how Beethoven would feel about tonight’s discussion: What would he think about being labeled a genius—and did he consider himself to be one?</p>
<p>“My guess is he would’ve scolded you for calling him a genius, but he would have agreed with you,” said Guerrieri. In a conversation late in life, Beethoven’s nephew told him he was famous for being deaf—and Beethoven “smacked him down a little.” Beethoven felt that he was a genius, but he also felt he worked hard for it. And he didn’t want to become a novelty act—the deaf composer. “I think,” Guerrieri concluded, “he knew how good he was.”</p>
<p>The post <a rel="nofollow" href="https://legacy.zocalopublicsquare.org/2012/12/13/name-that-tune-da-da-da-dum/events/the-takeaway/">Name That Tune: Da-Da-Da-DUM</a> appeared first on <a rel="nofollow" href="https://legacy.zocalopublicsquare.org">Zócalo Public Square</a>.</p>
]]></content:encoded>
			<wfw:commentRss>https://legacy.zocalopublicsquare.org/2012/12/13/name-that-tune-da-da-da-dum/events/the-takeaway/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Misbehave, Kids, So You Can Become a Genius</title>
		<link>https://legacy.zocalopublicsquare.org/2012/12/12/misbehave-kids-so-you-can-become-a-genius/ideas/up-for-discussion/</link>
		<comments>https://legacy.zocalopublicsquare.org/2012/12/12/misbehave-kids-so-you-can-become-a-genius/ideas/up-for-discussion/#respond</comments>
		<pubDate>Wed, 12 Dec 2012 08:01:43 +0000</pubDate>
		<dc:creator>Zocalo</dc:creator>
				<category><![CDATA[Up For Discussion]]></category>
		<category><![CDATA[creativity]]></category>
		<category><![CDATA[genius]]></category>
		<category><![CDATA[imagination]]></category>

		<guid isPermaLink="false">https://legacy.zocalopublicsquare.org/?p=43360</guid>
		<description><![CDATA[<p>Few subjects have received more attention and study than creativity. But it also takes, well, creativity to put research into practice, and there’s no consensus that 21st century society has become more creative. In advance of &#8220;How Do We Make Sense of Genius?&#8221;, a Zócalo event on the nature of genius, we asked experts: Has any of the vast literature on creativity actually made us more creative?</p>
<p>The post <a rel="nofollow" href="https://legacy.zocalopublicsquare.org/2012/12/12/misbehave-kids-so-you-can-become-a-genius/ideas/up-for-discussion/">Misbehave, Kids, So You Can Become a Genius</a> appeared first on <a rel="nofollow" href="https://legacy.zocalopublicsquare.org">Zócalo Public Square</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p>Few subjects have received more attention and study than creativity. But it also takes, well, creativity to put research into practice, and there’s no consensus that 21<sup>st</sup> century society has become more creative. In advance of &#8220;<a href="https://legacy.zocalopublicsquare.org/event/">How Do We Make Sense of Genius?</a>&#8221;, a Zócalo event on the nature of genius, we asked experts: Has any of the vast literature on creativity actually made us more creative?</p>
<p>The post <a rel="nofollow" href="https://legacy.zocalopublicsquare.org/2012/12/12/misbehave-kids-so-you-can-become-a-genius/ideas/up-for-discussion/">Misbehave, Kids, So You Can Become a Genius</a> appeared first on <a rel="nofollow" href="https://legacy.zocalopublicsquare.org">Zócalo Public Square</a>.</p>
]]></content:encoded>
			<wfw:commentRss>https://legacy.zocalopublicsquare.org/2012/12/12/misbehave-kids-so-you-can-become-a-genius/ideas/up-for-discussion/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>The Silence Before the Symphony</title>
		<link>https://legacy.zocalopublicsquare.org/2012/12/12/the-silence-before-the-symphony/books/readings/</link>
		<comments>https://legacy.zocalopublicsquare.org/2012/12/12/the-silence-before-the-symphony/books/readings/#respond</comments>
		<pubDate>Wed, 12 Dec 2012 08:01:16 +0000</pubDate>
		<dc:creator>Zocalo</dc:creator>
				<category><![CDATA[Readings]]></category>
		<category><![CDATA[Beethoven]]></category>
		<category><![CDATA[genius]]></category>
		<category><![CDATA[music]]></category>

		<guid isPermaLink="false">https://legacy.zocalopublicsquare.org/?p=43412</guid>
		<description><![CDATA[<p><em>The opening notes to Beethoven’s Fifth might be one of the most easily recognizable musical passages ever written. But to unaccustomed ears, the symphony’s opening can be confounding, explains </em>Boston Globe <em>critic Matthew Guerrieri. Guerrieri visits Zócalo to discuss what makes this symphony and its composer special. Below is an excerpt from his book, </em>The First Four Notes: Beethoven’s Fifth and the Human Imagination.</p>
<p>The first thing to do on arriving at a symphony concert is to express the wish that the orchestra will play Beethoven’s Fifth. If your companion then says “Fifth what?” you are safe with him for the rest of the evening; no metal can touch you. If, however, he says “So do I”—this is a danger signal and he may require careful handling.</p>
<p>—Donald Ogden Stewart, <em>Perfect Behavior</em> (1922)</p>
<p>Jean-François Le Sueur was not quite sure what to make of Beethoven’s Fifth. Le Sueur was a </p>
<p>The post <a rel="nofollow" href="https://legacy.zocalopublicsquare.org/2012/12/12/the-silence-before-the-symphony/books/readings/">The Silence Before the Symphony</a> appeared first on <a rel="nofollow" href="https://legacy.zocalopublicsquare.org">Zócalo Public Square</a>.</p>
]]></description>
				<content:encoded><![CDATA[<p><em>The opening notes to Beethoven’s Fifth might be one of the most easily recognizable musical passages ever written. But to unaccustomed ears, the symphony’s opening can be confounding, explains </em>Boston Globe <em>critic Matthew Guerrieri. Guerrieri <a href="https://legacy.zocalopublicsquare.org/event/how-do-we-make-sense-of-genius/">visits Zócalo</a> to discuss what makes this symphony and its composer special. Below is an excerpt from his book, </em>The First Four Notes: Beethoven’s Fifth and the Human Imagination.</p>
<p>The first thing to do on arriving at a symphony concert is to express the wish that the orchestra will play Beethoven’s Fifth. If your companion then says “Fifth what?” you are safe with him for the rest of the evening; no metal can touch you. If, however, he says “So do I”—this is a danger signal and he may require careful handling.</p>
<p>—Donald Ogden Stewart, <em>Perfect Behavior</em> (1922)</p>
<p><a href="https://legacy.zocalopublicsquare.org/2012/12/12/the-silence-before-the-symphony/books/readings/attachment/the-first-four-notes/" rel="attachment wp-att-43417"><img decoding="async" class="alignright size-full wp-image-43417" style="margin: 10px;" title="The First Four Notes" src="https://legacy.zocalopublicsquare.org/wp-content/uploads/2012/12/The-First-Four-Notes.jpg" alt="" width="125" height="187" /></a>Jean-François Le Sueur was not quite sure what to make of Beethoven’s Fifth. Le Sueur was a dramatic composer, a specialist in oratorios and operas, and the Parisian taste for such fare (along with Le Sueur’s career) had persisted from the reign of Louis XVI through the Revolution, through Napoléon, through the Restoration. For audiences suddenly to be whipped into a frenzy by <em>instrumental</em> music—as they were in 1828, when a new series of orchestral concerts brought Paris its first sustained dose of Beethoven’s symphonies—was something curious. Le Sueur, nearing 70, was too refined to fulminate, but he kept a respectful distance from the novelties—that is, until one of his students, an up-and-coming enfant terrible named Hector Berlioz, dragged his teacher to a performance of the Fifth. Berlioz later recalled Le Sueur’s postconcert reaction: “Ouf! I’m going outside, I need some air. It’s unbelievable, wonderful! It so moved and disturbed me and turned me upside down that when I came out of my box and went to put on my hat, for a moment I didn’t know where my head was.”</p>
<p>Alas, in retrospect, it was too much of a shock: at his lesson the next day, Le Sueur cautioned Berlioz that “All the same, that sort of music should not be written.”</p>
<p>In 1920, Stefan Wolpe, then an 18-year-old student at the Berlin Hochschule für Musik, organized a Dadaist provocation. He put eight phonographs on a stage, each bearing a recording of Beethoven’s Fifth Symphony. He then played all eight, simultaneously, with each record turning at a different speed.</p>
<p>A socialist and a Jew, Wolpe would flee Nazi Germany; he eventually ended up in America, cobbling together a career as an avant-garde composer and as a teacher whose importance and influence belied his lack of fame. (The jazz saxophonist Charlie Parker, shortly before he died, approached Wolpe about lessons and a possible commissioned piece.) In a 1962 lecture, Wolpe recalled his Dada years, revisiting his Beethoven collage; in a bow to technological change, this performance used only two phonographs, set at the once-familiar 33 and 78 r.p.m. Wolpe then spoke of “one of the early Dada obsessions, or interests, namely, the concept of unforeseeability”:</p>
<p>That means that every moment events are so freshly invented,<br />
so newly born,<br />
that it has almost no history in the piece itself<br />
but its own actual presence.</p>
<p>*</p>
<p>If today we regard Le Sueur’s frazzled confusion as quaint, it is at least in part because of the subsequent ubiquity of the Fifth Symphony. The music’s immediacy has been forever dented by its celebrity. Wolpe’s eightfold distortion can be heard as a particularly outrageous attempt to re-create Le Sueur’s experience of the Fifth, to conjure up a time when the work’s course was still unforeseeable. It is an uphill battle—in the two centuries since its 1808 premiere, Beethoven’s Fifth has become so familiar that it is next to impossible to re-create the disorientation that it could cause when it was newly born.</p>
<p>The disorientation is built right into the symphony’s opening. Or even, maybe, <em>before</em> the opening: the symphony begins, literally, with silence, an eighth rest slipped in before the first note. A rest on the downbeat, a bit of quiet, seems an inauspicious start. Of course, every symphony is surrounded by at least theoretical silence. Though, in reality, preconcert ambient noise, or at least its echoes—overlapping conversations, shifting bodies, rustling programs, air-conditioning, and so on—may in fact bleed into the music being performed, we nonetheless create a perceptive line between nonmusic and music, enter into a conspiracy between performers and listeners that the composer’s statement is self-contained, that there is a sonic buffer zone between everyday life and music. (Like most conspiracies, it thrives on partial truths.) The obvious interpretation is that silence functions as a frame for the musical object. The less obvious (and groovier) interpretation is that the music we hear is but one facet of the silence it comes out of.</p>
<p>This is almost certainly not what Beethoven was thinking about when he put a rest in the first measure of the Fifth Symphony. But, were Beethoven really trying to mess around with the boundary between his symphony and everything outside of it, he would have been anticipating the French philosopher Jacques Derrida, the guru of deconstruction, by nearly 200 years. Derrida talks about frames in his book <em>The Truth in Painting</em>, noting that when we look at a painting, the frame seems part of the wall, but when we look at the wall, the frame seems part of the painting. Derrida terms this slipstream between the work and outside the work a parergon: “a form which has as its traditional determination not that it stands out, but that it disappears, buries itself, effaces itself, melts away at the moment it deploys its greatest energy.”</p>
<p>Our minds dissolve the frame as we cross the Rubicon into Art. But Beethoven drags the edge of the frame into the painting itself, stylizing it to the point that, for anyone reading the score, at least, this parergon refuses to go quietly, as it were. Beethoven waits until we’re ready, then gruffly asks if we’re ready yet.</p>
<p>We can <em>see</em> the silence on the page, in the form of the rest. But do we hear it in performance? The rest completes the meter of 2/4—two beats per measure, with the quarter note getting the beat—which, normally, would mean that the second of the three following eighth notes would get a little extra emphasis. But most readings give heavy emphasis to all three eighth notes, steamrolling the meter (which is really only one beat to a bar anyway—more on that in a minute). Paleobotanist, artist, and sometime composer Wesley Wehr recalled one consequence of such steamrolling:</p>
<p>Student composer Hubbard Miller, as the story goes, had once been beachcombing at Agate Beach. He paused on the beach to trace some musical staves in the sand, and then added the opening notes of Beethoven’s Fifth Symphony. Hub had, however, made a slight mistake. Instead of using eighth notes for the famous “da, da, da, <em>dum</em>!,” Hub had written a triplet. He had the right notes, but the wrong rhythm—an easy enough mistake for a young lad to make. Hub looked up to find an elderly man standing beside him, studying the musical misnotation. The mysterious man erased the mistake with one foot, bent down, and wrote the correct rhythmic notation in the sand. With that, he smiled at Hub and continued walking down the beach. Only later did Hub learn that he had just had a “music lesson” from Ernest Bloch.</p>
<p>Knowledge of the rest is like a secret handshake, admission into the guild. (Bloch, best known for his 1916 cello-and-orchestra “Rhapsodie hébraïque” <em>Schelomo</em>, was also a dedicated photographer who liked to name his images of trees after composers: “Bloch sees ‘Beethoven’ invariably as a single massive tree appearing to twist and struggle out of the soil.”)</p>
<p>Indeed, one practical reason for the rest is to reassure the performers of the composer’s professionalism. Beethoven knew that any conductor would signal the downbeat anyway, so he put in the rest as a placeholder for the conductor’s gesture. And it’s liable to be a fairly dramatic gesture at that. The meter indicates two beats to the bar, but no conductor actually indicates both beats, as it would tend to bog down music that needs speed and forward momentum. Instead, the movement is conducted “in one,” indicating only the downbeat of every bar.</p>
<p>So the conductor has one snap of the baton to get the orchestra up to full speed. And the longer the Fifth Symphony has retained its canonical status, the more that task has come to be seen as perilous. For the two leading pre-World War I pundits of conducting, Richard Wagner and Felix Weingartner, starting the Fifth was no big deal. Wagner takes ignition for granted, being far more concerned with the lengths of the subsequent holds, while Weingartner scoffs at his colleague Hans von Bülow’s caution: “Bülow’s practice of giving one or several bars beforehand is quite unnecessary.” But jump ahead to the modern era, and one finds the British conductor Norman Del Mar warning of “would-be adopters of the baton” suffering “the humiliation of being unable to start the first movement at all.” Gunther Schuller, American composer and conductor, is equally dire, calling the opening “one of the most feared conducting challenges in the entire classical literature.” Del Mar reaches this conclusion: “It is useless to try and formulate the way this is done in terms of conventional stick technique. It is direction by pure force of gesture and depends entirely on the will-power and total conviction of the conductor.”</p>
<p>The post <a rel="nofollow" href="https://legacy.zocalopublicsquare.org/2012/12/12/the-silence-before-the-symphony/books/readings/">The Silence Before the Symphony</a> appeared first on <a rel="nofollow" href="https://legacy.zocalopublicsquare.org">Zócalo Public Square</a>.</p>
]]></content:encoded>
			<wfw:commentRss>https://legacy.zocalopublicsquare.org/2012/12/12/the-silence-before-the-symphony/books/readings/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
