
Ubiquitous Helvetica

If, like me, you have enjoyed writing for a long time, you are no doubt sensitive to the way particular fonts reflect certain moods and expressive forms. Words provide the contour (ideas) and fonts the coloring (tone), jointly bringing another dimension to the creative process of writing. Used wisely, the combination can inordinately enhance the consumption of thoughts and wonderfully enrich the interaction between author and reader. An amusing 2007 documentary simply named “Helvetica” traces this minimalist Swiss font from its humble 1950s roots to its gradual rise and now ubiquity in graphic design. Helvetica emerged in the idealism of the European postwar era of the 1950s, reflecting a sense of enlightened optimism for a peaceful, rational and aesthetically pure future purged of the clutter and turbulence of the past. The overarching principle of graphic designers in these quintessentially Modern times was to instill a sense of neutrality and cleanliness in visual presentation. It was in this context that two Swiss designers, Eduard Hoffmann and Max Miedinger, then working for the Haas type foundry near Basel, invented Neue Haas Grotesk, or what would simply come to be known as Helvetica (from Helvetia, the Latin name for Switzerland, a not-so-subtle tribute to their national identity, ironic indeed given the font’s professed neutrality). Helvetica’s sans-serif purity reflected a novel approach to producing fonts: not only well-balanced, precise lines and curves, but also an emphasis on the space between letters to hold their shape. This combination brought structure and order to what had previously been the cursive-happy, formless, visually saturated advertising of the 1940s and early 1950s.
The documentary, overlaid with the funky music of Mogwai, consists of interviews with a range of prominent graphic and typeface designers on their attitudes towards this controversial font (revered by some, much-maligned by others), with a good balance maintained between the opposing viewpoints. First among these is the towering master of Modernism, Massimo Vignelli of Italy, designer of the 1972 New York subway map and of iconic logos for American Airlines and the United Colors of Benetton, who extols the virtues of simplicity, clarity, and minimalism in visual presentation while denouncing the vulgarity of postmodernism. Postmodernism arose in the 1970s as a reaction against the prevailing idealism, via a counterculture obsessed with disorder, subjectivity and the flouting of conventions (the period was also marred by violent anti-war protests that emblematically challenged authority). [My book review of Steven Pinker’s Blank Slate explored these themes from a cognitive science perspective.] Vignelli, dressed in a hip, tight-fitting black shirt and sleek designer glasses in line with his minimalist aesthetic, points to the return of Modernist design principles in the current era as vindication of their timelessness (the previous post on Fumihiko Maki’s new work on the MIT campus is an obvious example). Two other Modernist vanguards interviewed are Wim Crouwel of the Netherlands and Matthew Carter of the UK, both men now of advanced age, who nostalgically reminisce about an era when typefaces were crafted on paper by hand and manufactured using embossed metallic blocks, although both have wholeheartedly embraced the digital revolution and sing its praises for elevating their art form and democratizing the profession altogether. On the opposing side are, somewhat paradoxically, two German-speaking designers – Erik Spiekermann and Stefan Sagmeister – who both denounce Helvetica as uniform, bland, and devoid of personality.
Spiekermann shows little restraint in his scathing remarks when he quips “Helvetica has no contrast, no rhythm, every letter tries to look the same,” while Sagmeister simply declares that “Modernism got boring” (Sagmeister explains his design philosophy in this amusing TED talk). Herein lies the classic tension between the individual artist seeking to impart as much of his own creativity into his work as possible and the instinctual human tendency towards clarity in form. The transforming attitudes towards Helvetica are an embodiment of the larger debate surrounding human aesthetics and nature. My only criticism of this documentary is that it did not focus enough on typeface design and what makes a font visually appealing; it abstained from even briefly exploring what the universal measures of beauty are and how they are reflected in a particular typeface. Instead, what was presented was interview after interview of the principally older typeface pioneers disparaging the work of the naive and irreverent younger generation, and vice versa. Nevertheless, “Helvetica” is entertaining, educational and worth checking out.


Fumihiko Maki’s Media Lab Extension opens

Any visitor to MIT’s campus will be readily struck by the unusual architecture of many of its buildings. There is Steven Holl’s Simmons Hall, Frank Gehry’s Stata Center, Kevin Roche’s Zesiger Sports and Fitness Center, Eero Saarinen’s Kresge Auditorium and nearby chapel, Alvar Aalto’s Baker House, I.M. Pei’s Media Lab, Charles Correa’s Brain and Cognitive Sciences Complex, the Broad Institute and the soon-to-be-completed Koch Center for Cancer Research (I happen to work in one such marvelous building). All of the recent buildings were initiated as part of the multi-billion-dollar “Evolving Campus” project spearheaded by former President Charles Vest during his 1990-2004 tenure, a project that has added a staggering nearly one million square feet of new lab and office space. The emphasis was to continue the tradition of placing the Institute at the forefront not only of technology but also of architecture. The latest and final project of this very successful construction frenzy was unveiled yesterday with the opening of the extension to the Media Lab, designed by Pritzker Prize-winning Japanese architect Fumihiko Maki. I attended the open house and took my time exploring this magnificent six-floor, 163,000-square-foot new work, characterized by its abundant, airy public spaces basking in wonderful natural lighting, enhanced on an unusually clear and sunny day. The building is an extension of I.M. Pei’s original 1984 Media Lab, and the juxtaposition vividly captures the transformation that computing technology has undergone in the 25 years since. The original Media Lab, as explained in today’s symposium by William J. Mitchell, former Dean of the School of Architecture and author of “Imagining MIT”, was an opaque, austere, monastic building characterized by the absence of windows and dark interior lighting.
It was designed in an era when computers were solitary, disconnected devices whose displays required dark surroundings to be seen clearly, and when the scientific culture was by and large centered on the individual working in isolation, thus requiring solemn, private office space. Times have sure changed. The last 25 years have seen a remarkable worldwide trend towards globalization, interconnectedness and collaboration mediated by disruptive digital technologies, and a corresponding movement towards open-source, decentralized, interdisciplinary, team-based approaches to problem solving. This is precisely what is embodied in Mr. Maki’s design for the Media Lab’s extension, where glass is ubiquitous. What I love most about this new work is the prevalent sense of structure evidenced by its elegant rectilinear form, capped by the wondrous curves of its “crown” that forms the roof of the sixth-floor atrium with its sublime views of the Charles River and the Boston Back Bay skyline. In a presentation this morning on the design principles behind his work, Mr. Maki emphasized many things: the use of horizontal aluminum girders on the front surface for “light composition”; the intentional selection of red, blue and yellow for the staircases as a tribute to Modernism (think Mondrian); the notion that the building would function as a house, making it incumbent that one always be aware of one’s own location wherever one happens to be situated; and the particular emphasis on fitting as much functionality and efficient design into as small a space as possible (where he referenced his Japanese roots as vindication of his direct commission for the project). At 81 years old, Mr. Maki displayed his eloquence and charisma in full as he provided anecdotes of his days in the graduate school of design at Harvard in the early 1950s, when he first became aware of the site – in particular a hideous, imposing brutalist housing project constructed across the street from the current Media Lab, unceremoniously designed by MIT architects. Mr. Maki even mentioned a conversation during yesterday’s reception with a Japanese graduate student at MIT who as a child had attended an elementary school Mr. Maki designed in 1970; the student claimed that his spatial experience in both buildings was the same (a fact in which Mr. Maki took special pride). However, there was one point in particular that I had been thinking about during my investigation of the building’s interior and which I had the chance to bring up during the Q&A session afterwards. Given the recent trend of designing buildings to promote collaboration and teamwork as a sign of the broader movement towards interconnectedness, how is the individual able to find solitary, private spaces to escape to during personal times? While I very much enjoy the wide, airy and light features of such buildings, I find it disturbing that anyone anywhere in the building can be easily observed. In a densely connected, always-on, wired world we are increasingly losing the means to find our own independent voice, to think our own thoughts, and to disconnect and unwind with nothing but silence. William Mitchell answered my question by mentioning that he too had thought along these lines, and made reference to the purposeful offsetting of adjacent floor levels that serves as a separator dividing research divisions. Nonetheless, Mr. Maki’s new building is a gem, a marvelous work of art, and the finest building on the MIT campus.

Mamoru Hosoda’s “Summer Wars”

Rising anime star Mamoru Hosoda was on the MIT campus Monday night this week to debut his new film “Summer Wars” to North American audiences. While I would not consider myself a mega-fan of the genre, I decided to attend mainly to see this critically acclaimed film on 35mm print and to participate in the Q&A session afterwards. Needless to say, the auditorium in 26-100 was filled to capacity as a huge throng of colorful nerds from all over the Boston area showed up in full force (more on these creatures in a minute), along with several pockets of smartly-dressed Japanese fans and a number of members of the older Cambridge intelligentsia curious to see for themselves what all the fuss in the posters plastered around campus was about. Summer Wars was a captivating film that masterfully fused a rich storyline with jaw-dropping special effects, elements that Japanese anime houses have perfected into an art form. I had never really seen anime on the big screen, and my first experience three nights ago left me mesmerized, vividly engrossed in a make-believe world of sumptuously realistic cartoon characters, anthropomorphic avatars, and the near-complete abandonment of the laws of physics. Briefly, the story revolves around a teenage girl, Natsuki, who makes a trip to the Japanese countryside to attend the 90th birthday of her grandmother as part of a large family gathering. Natsuki brings along a male friend and classmate, Kenji, a math prodigy (later pretending that the two are dating to appease her family). The serene, idyllic landscape surrounding this intimate family reunion is violently disrupted when it is discovered that a malicious virus has taken over OZ, a massive virtual reality world very much akin to the internet, threatening to bring about utter societal chaos. Kenji is implicated in the breach of the once-impenetrable cyber security, and the entire clan is then drawn into the gripping mayhem to try and restore order.
There are many important themes explored in this engaging and entertaining flick: the interaction between the older generation and the young (a recurring topic in Japanese literature and film, stemming from a culture steeped in tradition), the blurring boundary between direct face-to-face communication and that mediated by social media, the dual capacity of technology for both good and evil, and the role of the family in an increasingly classless, ageless and solitary world. The stunning visual effects, coupled with the jolting audio track, made for a delightfully visceral experience. After the film, the nerd armada and hardcore anime fiends in attendance were quite eager to impress upon one another their encyclopedic knowledge of anime during the Q&A with the director. Although Hosoda spoke through an interpreter, many of the nerds laughed at his jokes and smirked at his references before the translation; as I later found out, many anime fans have used the genre as a means to learn Japanese. One such person posed his question in Japanese, citing his need to directly convey his thoughts and appreciation to Hosoda, which everyone in the audience found amusing. The choice of MIT as the venue to preview this film was very appropriate, and the crowd no doubt deeply appreciated both the screening and the rare opportunity to interact with Hosoda.

The Singapore Model

Singaporeans are vastly over-represented at MIT. I am not aware of the exact statistics, but the number of them I have met during my years here has far exceeded the country’s proportional representation on the global stage. Singapore, a tiny city-state on the southern tip of the Malay peninsula just 85 miles north of the Equator, continues to solidify its standing as a regional epicenter of research, finance, tourism and travel (just recently MIT announced an agreement to establish Singapore’s fourth university, with former Dean of Engineering and Institute Professor Thomas Magnanti serving as the founding President of the Singapore University of Technology & Design). To appreciate how remarkable this is, one need only consider that a little more than 50 years ago this island speck was nothing but a destitute swamp: a former British colonial backwater deprived of human and material resources, merely a gateway for moving goods into the region. I started thinking more about Singapore after I recently met the grandson (currently an MIT undergrad) of Singapore’s founder Lee Kuan Yew, an uncompromising, hard-nosed leader who is widely revered in East Asia for the miraculous transformation of Singapore he made possible (albeit with the help of the brilliant Dutch economist Albert Winsemius): the country started as a large manufacturing base for low-cost commodities and eventually rose rapidly up the value chain to research & development. Singapore’s meteoric rise to the pantheon of developed countries is also mirrored in its increasingly globally aware and integrated citizens, many of whom are sponsored by the government to pursue academic degrees at the world’s leading institutions and promised stable employment upon their return (a prominent example is A*STAR, familiar to most Singaporeans). Yet the defining characteristic that for me explains Singapore’s success is unquestionably the determined stewardship of its government.
The government has acted with a heavy hand to dramatically remake society, employing social engineering in pursuit of a more utopian vision of statehood and confronting social taboos that had severely held back progress: for example, ethnic quotas are meticulously maintained on living quarters all over the minuscule island, dictating who can live where, so as to discourage racial enclaves and promote integration. Consider even the daunting challenges of becoming a politician: members of Singapore’s parliament are thoroughly vetted for their educational background and technical acumen, and are given a “probationary” period during which they are constantly evaluated before being allowed to participate in meaningful decision making. This has meant that some of the best and brightest minds serve in the highest levels of government, making far-reaching decisions on public and economic policy; yes, Singapore is run by some of the most highly educated, supremely accomplished and utterly loyal technocrats around. This model could hardly be more at odds with the United States Congress, many of whose members are lawyers, lack academic pedigree, and seem to pledge allegiance more to their parties than to their country. Western democracies operate under a diametrically opposed modus operandi: the maddening gridlock and stifling partisanship that are the unmistakable signature of democratic governments are, in some respects, the price of empowering individual citizens to take responsibility for their own lives – the very quality that has made America so successful. The Eastern model works from the top down: a citizenry grounded in Confucian culture respects authority and reveres order, drawing on vastly different cultural underpinnings that emphasize teamwork, working for the greater good, and a collective spirit to improve the lot of the ordinary citizen through direct state intervention.
China is no doubt headed down this technocratic path, and Singapore may very well present a glimpse into its future.

The Enchanting Piano of Dustin O'Halloran

Sometimes the simplicity of modern, slow-moving, pianissimo tonality is enough to convey deep, solitary, sublime emotion. Isolated yet rounded, somber yet pensive ivory notes, devoid of the flowering fullness of symphonic or vocal accompaniment, are all that is necessary to evince the sobering grandness of being alive. This is particularly the case with the endearing piano music of the little-known American composer Dustin O’Halloran (I’ve posted some of his work here). His touching Opus 23, overlaid with Marco Morandi’s whimsical animation, is wondrous strange. Absorbing Mr. O’Halloran’s delightfully simple compositions also reminded me how effectively music can enhance the visceral qualities of film-making in astute ways. Some prominent examples that readily come to mind are Julian Schnabel’s masterpiece “The Diving Bell and the Butterfly,” Florian Henckel von Donnersmarck’s gem “The Lives of Others,” and Stephen Daldry’s faulty enigma “The Reader” – three films that explore tragic human suffering on different levels, connected in key scenes by the undertones of hauntingly beautiful piano music. O’Halloran’s music also draws slight comparisons to the venerable composer Ennio Morricone, renowned for his scores in such legendary films as A Fistful of Dollars and the other Sergio Leone spaghetti westerns, Cinema Paradiso and The Untouchables, among many others. Kudos to Mr. O’Halloran and other such talented contemporary composers for their remarkable ability to distill the garrulousness of (much of) everyday life to its minimalist, wholesome core.

Operation Ketsugo & the Japanese Warrior Code

The greatest tragedy of the 20th century is universally acknowledged to be the calamity of World War II, which wrought indescribable destruction on the continent of Europe and was primarily the responsibility of arch-villain Adolf Hitler and his fanatical henchmen in the National Socialist party. Yet many people fail to remember that what really ended the unspeakable devastation of WWII was the unconditional surrender of Imperial Japan under the terms of the Potsdam Declaration on August 15th, 1945, more than three months after the downfall of Nazi Germany. Indeed, the Pacific theater of WWII was arguably witness to the most gruesome fighting in the history of modern warfare, yet somewhat curiously it is scarcely regarded as such by the public at large. An engrossing documentary produced by the History Channel entitled “Victory in the Pacific” dramatically narrates, using incisive commentary, actual war footage, and probing interviews with historians, the epic clash between the United States and the Empire of Japan. The fateful showdown commenced with the infamous Japanese surprise attack on Pearl Harbor at dawn on December 7, 1941, when Japanese carrier pilots sought to destroy a significant portion of the American naval fleet so as to reign supreme over vast expanses of Pacific ocean territory. The attack led to FDR’s famous declaration of war before Congress the next day and, in the words of later generations, “awoke the giant from its slumber.” The first major encounter between the two military titans occurred on June 4, 1942 during the Battle of Midway, when the Americans, armed with the immeasurable advantage of cracked Japanese communication codes forewarning them of the impending assault, ambushed the Imperial Navy off the tiny islands 1200 miles from the coast of Hawaii and destroyed four aircraft carriers, permanently weakening the Japanese military and spelling its eventual demise.
This crucial turning point of the war also brought about the gradual emergence of a radically different war strategy on the part of the Japanese, termed “ketsugo”: acknowledging the reality that the war could not be won, the Japanese instead decided to wage a war of attrition, a fight to the bloody death, in order to eventually secure favorable peace terms with the Allied powers (terms that, most importantly, would preserve the Imperial institutions and national sovereignty). Ketsugo stemmed from the samurai code of honor, deeply embedded in the Imperial government, which revered an honorable death over a disgraceful surrender. It was this fateful decision to pursue such an audacious plan that laid the groundwork for some of the fiercest fighting the world had ever witnessed: pivotal battles for minuscule island terrain saw wave after wave of ferocious Japanese suicide attacks meant to demoralize the Americans and blunt their enthusiasm for prolonging the war. The legendary Battles of Iwo Jima and Okinawa produced massive casualties on both sides; the Americans were aghast at the savagery of their adversaries (one marine interviewed spoke of witnessing mutilated corpses of American soldiers on the gored battlefield who had been beheaded, had missing fingers, and in some cases had their own genitals grimly stuffed into their mouths). Curtis LeMay, the famed American air force commander who conceived and led the firebombing of Tokyo in March 1945 that claimed the lives of 100,000 Japanese civilians, once astutely remarked that “all war is immoral, and if you let it get to you then you are not a good soldier.” (Another brilliant documentary, produced by Errol Morris and entitled “The Fog of War,” offers telling insight into the stalwart LeMay from Robert McNamara, who would later become JFK’s Defense Secretary.)
Rampant racial propaganda on both sides fueled the demagoguery, demonizing and dehumanizing the enemy so as to heighten the zeal for bloodletting. The Japanese Imperial Army had, through the cunning manipulation of Shinto religious symbols, reinforced the divine link of the Emperor to Japan’s founding deities to further indoctrinate its civilians into enduring the punishing shock of war and making whatever sacrifices were necessary in the name of defending the homeland from evil foreign aggression. Civilians were led to believe that capture by the Americans would result in savage rape, cruel torture and vile murder. This mindset was tragically manifest during the last days of the Battle of Okinawa when, with the military defense crumbling, many civilians (with their children in tow) leapt from steep rocky cliffs to their deaths in order to avoid the indignities thought to be awaiting them at American hands. Such was the ferocity of the fighting that American war planners concluded that a full-scale invasion of the Japanese home islands (codenamed Operation Downfall and planned for November 1945) would certainly cause staggering American casualties, enough to shake the public’s conviction to continue the battle until unconditional surrender had been won. It was precisely in this gloomy context that the decision to use the atomic bombs on Hiroshima and Nagasaki was made, with little deliberation or objection by President Truman and the Joint Chiefs, for fear of the unspeakable price the Japanese were going to make the Americans pay when they came ashore on Kyushu for the planned invasion (the final showdown that the Emperor had been pressing his generals for in the midst of collapsing morale).
It must have come as a total surprise, then, when Emperor Hirohito finally broke the deadlock in his divided cabinet and accepted the Potsdam Declaration (even withstanding an attempted coup d’état that night by fanatical nationalists who refused his surrender). In a highly unusual gesture unbecoming of his ceremonious post, he ambiguously announced Japan’s surrender by radio broadcast, without accepting responsibility for the nation’s turmoil, to his exhausted citizens, who were stunned to hear his voice for the first time; thus ended his tragic quest to preserve his throne on the back of what would surely have been the utter annihilation of his country. (An intriguing follow-up BBC documentary investigates why Emperor Hirohito never faced a military tribunal for war crimes in the aftermath of Japan’s surrender, when many of the other members of his government were tried and executed. It argues that the Americans, in particular the decorated general Douglas MacArthur who oversaw the post-war occupation, deemed that removal of the Emperor would spell chaos in a shattered Japanese civil society and would likely give rise to a dreaded Communist insurgency. History has proved that preserving the Emperor, albeit symbolically, was an extraordinarily wise policy for stabilizing Japan as it recovered from the rubble, and that it contributed to the country’s rebuilding as part of the league of democratic nations. Yet the decision to preserve and even exonerate the Emperor also engendered widespread anger that he had evaded punishment for the war.) Both documentaries are very engaging and worth checking out.

Canada’s unnatural obsession with hockey

With the Winter Olympics in Vancouver fast approaching, I was reminded recently of all the hysteria that must surely be building up north of the border. No doubt the preparations are humming along smoothly, the city looking immaculate after major cosmetic enhancements, the final touches being placed on the massive construction projects all finished well ahead of schedule, and Canada looking to roll out the red carpet for the world. But I bet I know where the entire focus of the nation has been concentrated: on the men’s national hockey team. You see, Canada has an all-consuming obsession with hockey that singularly trumps everything else in the public consciousness (comparable, perhaps, to the Brazilian lust for football). It is hard for me to convey the extent of the Canadian infatuation, nay compulsion, for this peculiar winter sport played on ice with sticks and skates. Hockey news, however trite, is front-page news. The fate of the men’s team at the Olympics will weigh crucially on the Canadian psyche and its self-perception among the elite nations of the world: a win would reinforce Canadian supremacy over the sport and breathe confidence into the national mood in these particularly troubling times (in addition to the Great Recession, Canadian forces have suffered many casualties in Afghanistan); a loss would be utterly devastating and would no doubt usher in a prolonged bout of soul searching marked by lingering feelings of inferiority. This is precisely what happened at the 1998 Winter Olympics in Nagano, Japan, when for the first time superstars from the NHL (the professional North American hockey league) were allowed to represent their respective countries at this supreme sporting event. And thus when the much-vaunted Canadian men’s hockey team was shockingly eliminated in the semi-finals (the semi-finals!)
by a lightly regarded team from the Czech Republic (on the back of some incredible goaltending by the legendary Dominik Hasek and the superb skills of wunderkind Jaromir Jagr), not only were the entire hopes of a nation dashed, but the loss triggered mass introspection into the country’s increasingly diminished standing on the world stage. The Canadian press lamented the long-lost glory days (the pinnacle being when Paul Henderson scored “the goal” (endlessly replayed on television) to beat the Soviet Union in the final game of the 1972 Summit Series), and some politicians bore the brunt of the public anger by being voted out of office; the whole country was feeling low. In response, the immortal hockey god Wayne Gretzky was summoned to oversee the rebuilding of the men’s program in preparation for the 2002 games in Salt Lake City, and with the unbearable weight of the country pressing down upon him amidst unrelenting second-guessing, he became the savior the country had been lusting for: Canada crushed the United States on its home soil in the gold medal game, 5-2. Global order had been restored. Mass euphoria erupted on city streets as hordes of red-clad, maple-leaf-waving citizens poured out in droves to bask in the renewed glory of their country (I witnessed the raucous celebrations first-hand in downtown Toronto during my undergrad days). Representing the men’s hockey team is the dream of many youngsters even as they take their first steps on the ice as toddlers draped in oversized team jerseys. In Canada, hockey players are idols to be worshiped who, with missing teeth, scar-laced faces, and sweaty brows, incoherently utter banalities peppered with “eh”s to awestruck TV crews during post-game interviews in the locker room. Perhaps Leo Strauss would approve of this potent unifying element in society, citing its profound ability to bring such a vast country together.
But I must say, it is a little childish: a nation so fixated on the ability of its citizens to place a rubber puck into a net as a barometer of its international stature is no great society at all. (Update 3/5: thankfully Canada won gold; all is well for at least another 4 years.)

Robert Sapolsky on the Uniqueness of Human Beings

Many individuals who deny the truth of evolution and instead profess faith in the divine origins of our species seem universally to point to our putative unique position in the animal kingdom: our complex emotions, idiosyncratic tendencies, and enlightened virtuosity, so the argument goes, distinguish humanity from all other life forms on our planet. In a very amusing and entertaining response (delivered as part of the Class Day lectures to graduating seniors at Stanford University), neuroscientist and primatologist Robert Sapolsky masterfully dispels these myths by presenting a compelling list of distinctly “human” behavioral traits that he has observed throughout his lifelong research into other primate species, most notably baboons and chimpanzees. Specifically, he discusses aggression, theory of mind, empathy, the golden rule and culture – all once thought to be particular to our species yet also observed in primates – citing: chimpanzees’ violent tendency to kill one another for status, revenge or sport; a community of baboons showing sympathy to a fellow member who has been innocently physically harassed by a superior; and even the unusual emergence of grooming among male baboons in an environment artificially selected to remove the most aggressive individuals (thus demonstrating the advent of a non-genetic behavioral trait that is learned and passed on to future generations, one that had never before been observed in the wild). Sapolsky distills his argument into the simple phrase “similar design, novel uses” to compare humans and primates. In other words, the main distinguishing characteristic of human beings, Sapolsky argues, is that while we share most of our fundamental genes and behaviors with other species, we express them with a far higher degree of complexity. Sapolsky concludes his talk with an unusual insight into the perplexing primate trait of holding out hope for something that is utterly implausible.
Here he presents evidence from chimpanzee studies in which individuals were given a signal to perform some kind of work and, once it was properly completed, were duly rewarded, all while dopamine levels in the brain were recorded. The findings contradicted the intuitive expectation that dopamine levels would spike once the reward was received; instead, dopamine levels markedly increased and remained elevated while the work was being completed, in anticipation of the reward. Such anticipation is a powerful stimulant, driving the chimpanzees to perform their work regardless of whether a reward was ultimately given, and Sapolsky cleverly draws an analogy with casino-goers who, cognizant of the infinitesimal odds of winning, nevertheless devote themselves to spending money with oftentimes reckless abandon. Professor Sapolsky’s wit and clear passion for his work are evident; I look forward to enjoying more of his findings in his popular books.

Ten days in Chicago

I returned last night from a delightful trip to Chicago, where I spent my winter break this year. Chicago is a teeming, thriving city of nearly nine million souls (including suburbs) situated on the southwestern shore of Lake Michigan; it is by far the largest city in the midwestern United States and one that I had never before explored. The trip was immeasurably enhanced by the generosity of my hosts who, with the luxury of an automobile and an enthusiasm for sightseeing, made exploring this vast, sprawling city a painless affair in the biting cold. Among the significant points of interest I visited were: the Green Mill jazz club (famous as the haunt of Al Capone and his cronies during the Prohibition era); a comedy show at the famous Second City theater; the Museum of Science & Industry (housed in the glorious former Palace of Fine Arts of the 1893 World’s Columbian Exposition); the Art Institute of Chicago (a magnificent building housing a wonderful collection of Impressionist art; the Caravaggio exhibit was exquisite, and the new wing designed by Renzo Piano, which opened in May 2009, is a stunning gem); the Chicago Cultural Center and its brilliant glass dome; the Shedd Aquarium with its impressive array of marine life; the Field Museum and its splendiferous dinosaur collection; the Adler Planetarium, to behold the grandness of the cosmos in 3D; the Willis Tower (formerly the Sears Tower) Skydeck on the 103rd floor, with its breathtaking glass ledges jutting into thin air; the skydeck cafe of the John Hancock Center, with gorgeous views of the Chicago skyline and waterfront; the Museum of Contemporary Art (one of the largest in the United States); the bopping neighborhoods of Lincoln Park and Wicker Park; the University of Chicago and the surrounding Hyde Park area of the South Side; as well as the bustling downtown core of the ‘Loop’ and the Magnificent Mile on Michigan Avenue. 
Underappreciated by me prior to my arrival in the Windy City was Chicago’s pre-eminence in modern architecture: this was the city where the 20th-century giants Frank Lloyd Wright and Ludwig Mies van der Rohe once resided and where some of their most prominent works still stand. My friends and I spent an afternoon exploring the suburban Oak Park neighborhood where Wright was based early in his career, and had a chance to see from the outside his home/studio as well as the Winslow House, the very first example of what later came to be known as the ‘Prairie School’. I even took a tour of Wright’s masterpiece, the Robie House, in Hyde Park and was impressed by the prescient modernity of its design (considering that it was built in 1910!). The Bauhaus master Mies van der Rohe became head of architecture at the Illinois Institute of Technology (where he took up the appointment in the 1930s after fleeing the Nazis in his native Germany) and established one of the most prestigious design schools in the United States; indeed, the legacy of the Bauhaus movement is embodied in his S.R. Crown Hall, built in the 1950s, which currently houses the architecture school. Not even the bone-chilling winds could dampen the charm of this majestic city for me, though I would recommend visiting in the slightly warmer seasons if you can. (see photos here)

Steve James' "Hoop Dreams"

If you want a first-hand look at several key issues at the core of contemporary American society, seek it in the brilliant 1994 documentary Hoop Dreams, directed by Steve James. Hoop Dreams follows the lives of two aspiring young black boys in the Chicago inner city – William Gates & Arthur Agee – as they pursue their dream of making it to the NBA (the professional basketball league where marquee players earn astronomical salaries). It tells the tale of the quintessential youthful yearning for the American dream of rags to riches; of the intense and ruthless competition for success at the upper echelons; of the grinding, debilitating poverty afflicting black communities; and of a seemingly callous white America that exploits those lowest on the socio-economic chain for profit and later discards them when convenient. The documentary project had humble beginnings (and ambitions) as a half-hour PBS segment in the early 90s, but after the director had amassed over 250 hours of film shot over five years of following his two main subjects, he realized that there was a much deeper story to be told, and indeed there was. What is most grand about the storytelling, and what really defines its success in bringing the many complex underlying societal issues to the fore, is the sheer access that Mr. James has to each of the boys’ families. There are memorable scenes shot around the routine family dinner table, candid private conversations with each boy’s parents and basketball coaches, and a light narrative to keep the story moving. A recurring theme is the boys’ broken homes as a major impediment to their success: in both families the fathers are either missing entirely (as in William’s case, where the father has fled the home) or destructively preoccupied (as in Arthur’s case, where the father becomes a drug addict). The role of the family anchor thus falls to the mothers who, despite their relative youth, nevertheless assume their heavy responsibilities with full resolve. 
Hoop Dreams is one of those films that resonates with me every time I watch it (going back to when I first saw it in the mid-90s as a basketball-crazed early teen), and its depiction of the individual zeal for triumph in modern America despite innumerable setbacks is uplifting: neither of the two boys achieves his ultimate dream of making it to the pros, but both use their newfound fame from the documentary to better their lives and find their own success. A must-see!