Friday, February 28, 2014

The New Tour Bus

Lucie Lozinski



My brother Ian, who studies computer science at Rutgers, constantly tries to share his life as a programmer with me. He wants me to learn to code, start building apps, and get a job at a start-up. I tell him I don’t want a job in technology, but he won’t hear it. “It’s not just a tech thing,” he says. “Coding is a life skill that everyone should have.”

When I saw the coding community firsthand, I started to see his point. With other students from his university’s computer science department, Ian attends hackathons around the country, competing and often winning prizes in the form of money or new gadgets. My mom called to tell me he had won something at HackNY, a 24-hour hackathon hosted by NYU and Columbia. I envisioned him and a roomful of skinny, pale nerds typing on their laptops in a dark room overnight, tirelessly competing to see who could hack into the most secure networks: someone gets into the White House’s online databases, for example, and can see the President’s top secret files; another kid hacks into Google and exposes famous people’s search histories. It’s probably obvious, but this is not how hackathons work.

In October, Ian came with a busload of computer science students from Rutgers to compete in HackMIT in Cambridge. Excited to see him while he was in town, I agreed to meet him at the hackathon, thinking we’d get lunch together at Anna’s Taqueria after I’d watched him receive a medal. It’s never so simple. My brother, who can write JavaScript far more clearly than he can make plans, went ahead and registered me as part of his team in the hackathon.

Ian, his friend Eddie, and Eddie’s silent girlfriend arrived in Boston late on Friday night. They declined the HackMIT staff’s offer to sleep on air mattresses in frat houses and instead took a cab, paid for by HackMIT travel reimbursements, to stay with me at Wellesley. Despite getting to sleep long after midnight, the four of us would be a well-rested team compared to the rest. On the bus ride back to Cambridge in the morning, I asked how Eddie had met Ian. He told me stories about getting through challenging classes together, including one time when they and some others had shotgunned beers before a difficult final exam. “We’re programmers,” Eddie said and giggled. After thirty seconds of silence, he added, “I don’t think I said that right. We’re bro-grammers, I meant. Like bros who program together.” I laughed, mostly at the clarification. Their camaraderie surprised and inspired me. College didn’t seem to stress them out. Studying computer science was like joining a club.

Eddie and Ian vaguely answered some of my questions about what we’d be doing throughout the day. The first step, they explained, was brainstorming. We had to think of a project, something to make, like an app. “It doesn’t have to be an app,” Ian said. “Just something useful. You remember those girls from that art school who made grandma robot?” Ian asked Eddie. They both laughed nostalgically. “It was this robot thing that played a song and knitted, and this smoky incense came out of it. It was pretty cool, actually. The point is to create something cool.”

Cut to go-time: in a hockey rink at MIT, rows of foldable tables fill the space. Over a thousand students flood the room to claim tables for their teams, dumping their duffel bags and backpacks and plugging their laptops and other gear into outlets. Stiff extension cords spread across the floor, refusing to stay duct-taped down. They run rampant from the walls, floors, and a few centrally located boxes that overflow with cables and wires. So that we don’t overcrowd the Wi-Fi network, the staff encourage some teams to use ethernet. Each table already looks alive, or at least on life support, strung to different sources of power. People trip on cords often.

In the stands around the rink, sponsors set up camp with boxes full of “swag,” or free stuff. From Google to Etsy to General Electric to Uber, representatives hand out branded apparel, headphones, snacks, water bottles, pillows, and reduced-price gadgets. Some sponsors also have equipment available for hackers’ free use in the competition. I’m sent immediately to get an AirView pack (3D-touch hover technology) from Synaptics. I memorize the phrase, run to spit it at one of the men at the Synaptics table, and exchange my driver’s license for a package.

Back in the rink, team members take turns holding down the table-fort while the others stock up on swag. One senior from Rutgers, a metal-band-looking boy in baggy jeans with hair past his shoulders, returns to the table next to ours, his arms full, and says, “Well, looks like I just got a new wardrobe.” I point out that it’s composed of branded t-shirts, a look that probably won’t work for him after college. He retorts, “I go to every interview dressed exactly like I am now. If they don’t like it, then I don’t want to work for them.” I wonder if he’ll run out of options with that mentality, but Ian tells me that he’s a talented programmer and has already turned down a handful of job offers.

We have thirty hours. We think about what to make, try to create it, encounter an obstacle too big, and start over. We go through several ideas: a lie detector test app that measures pulse, body heat, and eye movements; a Mad Libs style karaoke game for computers and mobile devices; an app that generates and plays raps based on a specific topic. My teammates don’t impress me with their work ethic: Eddie’s studying for a test and Ian is joking with his friends on other teams. Not having the skills to contribute to the project past the idea stage, I get frustrated with our lack of progress as the sun sets. Sponsor reps, fellow hackers, and curious MIT students float through the tables, asking to see what various teams are working on. I purposely give vague answers so that no one steals our project, but Ian and Eddie clarify with details, always adding, “We’re just waiting for that pivot.” Everyone throws around the word “pivot” to describe the inevitable breakthrough each team will have at some point during the night.

At one point, we ask another Rutgers student to watch our table so that we can take a break outside as a team. We toss a Frisbee with some company’s name on it and try to loosen up our bodies and brains. I’m not used to this type of collaborative, patient, wait-for-inspiration atmosphere, especially in a competitive setting. Except for the running to claim a table, the hackathon never feels like a competition. It’s more about making something to share with other people and hopefully having good ideas acknowledged.

Back inside, hours pass with no progress, and no one seems worried. Eddie’s girlfriend takes a nap on the air mattress under the table while he and Ian play games on the Internet, drink Red Bull from the heaping stash of snacks in the bleachers, and occasionally work on homework with Rutgers classmates on other teams. Some students aren’t on a team at all but have come along just to support. I try to catch up in The House of Mirth, but someone asks me what I’m reading every time I get into it. Exhausted, bored, and unproductive, I don’t see any reason for me to be here. I fight off protests from Ian and his friends and take the bus back to my bed at Wellesley.

I return in the morning to the rink, which reminds me of a gymnasium serving as a shelter and clinic after a natural disaster, with tables and beds set up in neat lines and half-dead-looking people stumbling around sleepless. I find my team just in time to see them present the final product to the judges making their rounds. It’s called “PinDB: your girlfriend’s database,” a nerdy, clever hack that misuses Pinterest’s image upload feature to store any file (photo, music, whatever) injected into a JPG with steganography. Basically, they’ve exploited Pinterest by using it as a file-sharing service. I barely understand and wonder how our lie detector app has gotten so lost.

“We worked on that assignment until we finished, and then finally pivoted at around three or four,” Eddie explains, remarkably cheerful for not having slept. “We thought no one would use Pinterest’s API, which increases our chances of winning their prize. When we realized Pinterest has a loophole where they don’t sanitize the data their users upload, we came up with PinDB.” The judges smile and nod and finally ask why it’s “your girlfriend’s” database. Eddie says, “The only people we know using Pinterest are girls, specifically our girlfriends—Pinterest is really only female users.” The judges ask me if I use Pinterest. I don’t, and I tell them so, which feels like an awkward rejection. Two free-agent hackers joined the team in the middle of the night. They all revel in their success with high fives and unbrushed smiles. I feel somewhat disappointed about being left out, but I can’t help my excitement when the PinDB team wins Pinterest’s prize: iPad minis.
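For the technically curious, the simplest version of the trick is easy to sketch. I never saw the team’s code, so the Python below is only a hypothetical illustration of the general idea as they described it, smuggling a file’s bytes inside a JPG; the function names and file paths are mine, not theirs. It relies on one fact: a JPEG stream ends with a two-byte end-of-image marker, and image viewers ignore whatever comes after it.

# Hypothetical sketch of the PinDB idea (the team's real code wasn't
# published). A JPEG ends with the End-Of-Image marker 0xFFD9. Viewers
# stop rendering there, so bytes appended after the marker ride along
# untouched, as long as the host stores the upload verbatim instead of
# re-encoding or sanitizing it.

JPEG_EOI = b"\xff\xd9"

def embed(cover_path, payload_path, out_path):
    """Write the cover JPEG plus the payload's raw bytes to out_path."""
    with open(cover_path, "rb") as f:
        cover = f.read()
    with open(payload_path, "rb") as f:
        payload = f.read()
    with open(out_path, "wb") as f:
        f.write(cover + payload)  # still opens as a normal image

def extract(stego_path, out_path):
    """Recover the bytes hidden after the cover image's EOI marker.

    Naive: assumes the cover JPEG contains exactly one EOI marker
    (no embedded thumbnails) and the payload follows it directly.
    """
    with open(stego_path, "rb") as f:
        blob = f.read()
    end = blob.index(JPEG_EOI) + len(JPEG_EOI)
    with open(out_path, "wb") as f:
        f.write(blob[end:])

The exploit, then, isn’t really the hiding; it’s the host. If Pinterest stores uploads verbatim rather than re-encoding them, as Eddie’s “they don’t sanitize the data” remark suggests, every pin becomes free file storage.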

Among the other winners are a better way to control PowerPoint presentations from your smartphone or Pebble wristwatch, a Grand Theft Auto computer game built on Google Earth so that you can play in any geographical location with real-time traffic and weather, and a 3-D drawing app.

A closing ceremony encourages students to ditch graduate school, avoid becoming a pawn in a big corporation, and, instead, join the start-up nation. The founder of Rap Genius calls us to become a part of the new frontier, the harnessing of a new resource we can’t possibly use up: cyberspace. The coding, hacking community is by no means underground, but it’s still a somewhat obscure club. It sees itself, paradoxically, as our generation’s version of a subculture sticking it to the man. Instead of going on tour with a rock band in hopes of getting rich, twenty-somethings take buses with their computer science departments. In both cases, developing the necessary skills might demand hard work, yet the climb to wealth and fame is as much about having fun as it is about meeting the goal.

Wednesday, February 26, 2014

The Future Of Meat

By Claudina Yang

Devour is what I would like to do to steak. Succulent, juicy, creamy, and fatty, the taste and aroma of red meat make my senses scream, “I must have it now.” As with the reaction one might have to freshly baked cookies or freshly percolated Robusta, there is a certain potency to the smell of cooked meat that makes me go weak at the knees. However, I was forced to reconsider my all-consuming love for juicy steak when I took an Environmental Ethics course in the fall. Upon declaring that I was the sole and proud meat eater in the room, I was only mildly surprised to be the main target of dirty looks and verbal assaults for the rest of the semester. Perhaps I shouldn’t have quipped that some species of animals were virtually useless, but you’d think I drank the tears of pandas, judging by all the appalled looks I received when I said nobody could pay me any amount of money to give up eating meat. Nonetheless, it later became acutely apparent to me that my meaty dreams may soon have to be crushed, or at least suppressed, if I want to be a conscientious citizen of a changing world.

Unfortunately for us meat lovers, the signs and research all warn that drastic changes to food production and diet are needed by 2050 to combat further global climate change. Some estimate that meat eaters in developing countries will have to cut back consumption by almost 50% to avoid the worst consequences of future environmental change. Reducing food emissions while still producing enough food for a growing population, estimated to reach 9 billion by 2050, is arguably the most difficult challenge of combating climate change. Presently, around 10 billion land animals in the US are raised for dairy, meat, and eggs each year. Factory farming accounts for 37% of annual methane emissions and contributes to air pollution by releasing other compounds like hydrogen sulfide and ammonia. Animal waste from the farms causes dangerous levels of phosphorus and nitrogen in the water supply, and the use of fossil fuels to raise and feed the animals emits around 90 million tons of CO2 annually. Furthermore, global deforestation for grazing and feeding animals emits another 2.4 billion tons of CO2 every year.

Looking at the current research, it’s hard to deny the consequences of our meat-eating habits if we continue to consume as much as we do now. Global warming might be the controversy of yesterday, but climate change is upon us, and it will only get worse if we don’t take individual responsibility for our changing environment. Nowadays, with people growing more aware of the imminent problems we face, vegetarianism and veganism seem to be increasingly common alternatives. However, in many developing countries, particularly in South America, ranching provides a large source of income for farmers. A vegetarian diet is also often considered a privileged lifestyle that is too expensive to maintain, and buying processed meat provides a cheaper way for people to get the protein they need.

Consequently, despite the seemingly irrefutable claims of this research, I still had a hard time coming to terms with what it would mean for my lifestyle. Why would I implement changes that I consider entirely unnatural to my way of life? So while proudly defending my fellow meat-eating brethren in class, I also brought topics like natural predation and biological evolution to the table to back up my end of the argument. I’m a proponent of regulated hunting, and I believe that natural predation is necessary if we want to maintain healthy animal populations. The food chain is a natural, biological structure; entire ecosystems can’t be maintained if we give up meat consumption altogether. Although this is true, it isn’t a strong enough argument to justify the sheer scale of our current meat consumption.

But some might ask, “Isn’t it human nature, and natural, for us to eat meat?” Meat-eating proponents argue that there is little scientific proof that our bodies can stay completely healthy without meat in the long term. And look at our teeth: one could argue that evolutionary biology dictates that the shape and structure of our incisors evolved to chomp on meat. However, if we take this line of argument, then we have to ask ourselves what it means to be natural. Is it natural for us to be speciesist and deny equal consideration to other animals? Is it natural for us to destroy the environment we live in just to supplement our “natural” diet? This isn’t to suggest that we give up eating meat entirely, or that we must now view amoebas as precious lives to be preserved, but as a longtime defender of meat eating myself, I can no longer deny the dire consequences my lifestyle can have on the environment. So what alternatives are out there?

In August, when London tasters bit into the first lab-grown burger, one taster declared, “It’s close to meat.” Another said “the bite feels like a conventional hamburger,” but that it tasted like “an animal protein cake.” The taste isn’t necessarily the point, though. Grown from stem cells obtained from cow muscle, this lab-made hamburger provides one view of the future of meat. Dr. Mark Post, the Dutch researcher behind the project, presented it as a way to provide high-quality protein for the world’s growing population while combating many of the environmental and animal-rights issues surrounding conventional food production. Although it may take about ten years for cultured meat to become commercially viable, the feat could lead to less use of land, water, and energy resources while also reducing methane and other greenhouse gas emissions. It would also placate those concerned about animal welfare, since animals don’t necessarily have to be killed to make the meat. Other approaches to combating climate change include renewable energy technology and emerging research in climate engineering and even human engineering. So maybe there’s hope yet for the future of meat.


With researchers making headway on new ways to supplement the protein in our diets, perhaps one day beefeaters will be able to eat beef that is both environmentally friendly and ethical. Although the thought of giving up meat thoroughly pains me, I now realize how glib my comments may have been at the beginning of class, when I proudly declared myself an unapologetic meat eater. Nowadays, whenever I think about what to eat, I am at least more aware of what consuming that food means for the environment. Maybe someday I can reconcile my love of red meat with the implications of eating it and jump on the train with the other vegetarians. For now, I choose to curb the desire to eat meat more often than not. Asking meat lovers to eliminate meat from their diets might not be reasonable, but at a minimum, we should all consider eating less of it, or at least swap factory-farmed meat for alternatives like grass-fed, organic, or free-range meat once in a while. Nothing beats genuine, prime USDA beef in my book, but if we want to combat global climate change and sustain our growing populations, conscientious meat eating, such as reducing portion size and frequency of consumption, could go a long way.

Through the Looking Glass and Into the Creationism Museum

by Chelsea Ennen, who regrets that "Ham on Nye" was taken before she could use it.


Like a fair percentage of people my age, I grew up watching Bill Nye in school. The lanky, bow-tie-clad former aeronautics consultant specialized in teaching the sciences to kids on his clever, engaging show Bill Nye the Science Guy. I could probably call any one of my childhood friends, say “Richie, eat your crust!” and they’d immediately remember his plate tectonics episode. Nye has a gift for teaching, so I was thrilled to see he would be participating in a major scientific debate against creationist Ken Ham. Each presenter would be given half an hour for opening arguments, followed by a question and answer session. While the official topic was “Is creation a viable model of origins in today’s scientific era?” the big question hanging over both presenters’ heads was “Should creation be taught in public schools instead of evolution?” To question whether or not creationism belongs in schools is to question whether or not religion belongs in schools, and both Nye and Ham often mentioned the importance of what we teach the next generation. While I’m sure he was pleased with himself, Ham certainly didn’t make much of a case for creationism to be taken seriously.
President of both the Creation Museum that hosted the debate and the organization Answers in Genesis, Ken Ham used his half hour to try to convince the audience of rather labored and disjointed theories. He quoted several Bible verses as responses to scientific findings and went on and on about the difference between “historical” science (his term for mainstream science) and “observational” science, saying that something like the big bang theory shouldn’t be taught in schools because no one was there to observe it happen and prove it to be true. He based many arguments on fundamental misunderstandings of scientific facts, including the second law of thermodynamics, a concept often cited by creationists as proof against evolution. Basically, this law refers to the idea that in isolated systems disorder, or entropy, will always increase. Ken Ham used it to refute evolution, in which organisms become more and more advanced and complex. However, as Bill Nye explained in his rebuttal, the earth is not an isolated system; we receive energy from the sun. I find much of upper-level science difficult to follow, but even I could see that Ham blatantly bent the facts to serve his argument. If you don’t have three spare hours for the full debate, you need only visit www.answersingenesis.org for a full record of Ham’s beliefs, beliefs which it is important to remember are not held by the majority of Christians.
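To put Nye’s rebuttal in symbols (my own gloss, not language from the debate): the second law only constrains isolated systems, requiring that their entropy change satisfy

\Delta S_{\text{isolated}} \ge 0.

Earth, constantly fed energy by the sun, is not isolated, so its entropy can fall locally as long as the total still rises:

\Delta S_{\text{Earth}} + \Delta S_{\text{surroundings}} \ge 0.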
If anything got me through watching this debate, it was Bill Nye. His hair was graying, but he looked mostly the same as I remembered him: complete with bow tie, effervescent charm, and a knack for explaining scientific concepts to humanities brains like mine. When I found out the debate was actually about evolution versus creationism, I wondered why Nye was speaking instead of someone like evolutionary biologist and outspoken atheist Richard Dawkins, but since the debate was aimed at the general public, Nye’s clear and accessible teaching style was very appropriate. He discussed carbon dating, fossils, astronomy, and, of course, evolution with impressive articulation and poise. I highly recommend his portions of the debate as an engaging lecture in their own right.
However, I couldn’t help but wonder, as the hours ticked by, how this event even came to be. I was watching it for the spectacle, but why was a renowned scientist dignifying Ham’s arguments with a response? It occurred to me that if Richard Dawkins had been there instead of Bill Nye, he probably would have tackled Ham to the ground within minutes of his opening remarks out of sheer frustration and rage, as would a good deal of other big names in the scientific community. In a piece written for CNN’s religion blog, Nye addressed those of us who questioned his decision to accept Ham’s invitation.
"In short, I decided to participate in the debate because I felt it would draw attention to the importance of science education here in the United States...The facts and process of science have enabled the United States to lead the world in technology and provide good health for an unprecedented number of our citizens. Science fuels our economy. Without it, our economic engine will slow and eventually stop." -Bill Nye
Science has given us vaccines, MRI machines, pacemakers, warm houses in winter, the internet, smartphones, planes, trains, automobiles, everything we touch in our day-to-day lives. Not too long ago it was assumed that influenza was an unavoidable evil, but today a new vaccine comes out every year. A bit further back in time, but still in relatively recent history, pneumonia was a death sentence; now antibiotics have reduced it to a couple of weeks in bed with chicken soup. Now that science has taught us the consequences of suntanning, people are putting on sunscreen (also a result of scientific research) and preventing deadly melanomas. There are thousands of examples of how science has served humanity, in ways Ken Ham has benefited from whether he agrees with them or not, but I doubt he has stopped to think about how science makes it possible for him to broadcast his debate to the world and to create his website.
A passionate advocate for education, Nye asked the audience to consider these advancements in health and predict what would happen if a whole generation of medical researchers were not taught science correctly.  I was immediately disturbed by the question’s implications: what if the scientist destined to cure cancer or Alzheimer’s or AIDS didn’t get the education they needed because of someone like Ken Ham?  So many people wouldn't be alive now without the scientific advancements we've already made, but what about the ones we have yet to make?  On the other hand, don’t the people who agree with Ham just as passionately as I disagree with him have a right to be heard?  Where do we draw the line between what parents teach their children outside of school and what is acceptable for curricula nationwide?
The problem Ham refused to recognize, the real reason that creationism shouldn’t be in textbooks even though it is anyone's right to believe in it, beyond questions of adverse effects in future research and development, is that creationism is based on the Bible and therefore relies on a belief in God.  There is no such thing as an atheistic creationist: to believe in creationism you must believe in a creator, but a belief in evolution does not necessarily eliminate faith in God.  Bill Nye even discussed this fact and pointed out what was really the crux of the debate: 
“there are billions of people around the world who are religious and who accept science…the exception is you Mr. Ham…you want us to take your word for what’s written in this ancient text to be more compelling than what we see around us, the evidence for a higher power, for spirituality, is, for me, separate.” –Bill Nye  
 While it would have been a shame to miss out on Bill Nye’s lecture, he really could have started and ended the debate with this single statement.  Ken Ham will only respond to criticism by quoting a religious text, so the scientific community can only address him by saying that proving God’s existence is not science’s responsibility; it is a separate issue.
That word “separate” is problematic for people who share Ham’s beliefs when it’s included in the phrase “Separation of Church and State,” which refers to the idea that religion must stay out of the government and, by extension, public schools. Whenever this principle is brought up in creationism/evolution discussions, someone inevitably cries out that their freedom of religion is being violated by the exclusion of creationism from schools. On the contrary, “Separation of Church and State” is meant to preserve freedom of religion, and freedom of religion also means freedom from religion. It means if you believe in Judaism, Buddhism, Hinduism, Islam, Christianity, Wicca, or the Flying Spaghetti Monster of Pastafarianism, you have the right to practice your beliefs without being coerced into anyone else’s. It also means that atheists shouldn’t have to put up with religious doctrines in a public school setting. Teaching science shouldn’t be about teaching religion; it should be about teaching the mainstream science that does not implicitly rely on faith in God.
Teachers aren’t telling kids that evolution disproves the existence of God, and no one is stopping parents from teaching the creation model outside of school. Ken Ham’s arguments stretched so far past the limits of sense that I almost expected him to jump off the stage and follow a white rabbit back to Wonderland, where his logic would be at home. But, really, it doesn’t matter what I think about Ken Ham, because my opinions of him hinge on our differing opinions of God’s existence, and no one has the right to force their religious beliefs, or lack thereof, on anyone, especially not children. Bill Nye the Science Guy has always been a valuable classroom tool, and always should be, because what Nye teaches, fact-based evolution, has everything to do with science and nothing to do with religion, unlike Ham’s creation model. So to all the Christian parents out there: feel free to keep this article, with all its heretical mean-spiritedness, out of schools. Just don’t use Ham’s lecture either.

Tuesday, February 25, 2014

Great Album, Stratospheric Review

Beck's new record:

http://www.beck.com/index.php/homepage

and one of the giddiest raves I've ever read:

http://www.newyorker.com/arts/critics/musical/2014/02/17/140217crmu_music_frerejones


(It is great.)


Monday, February 24, 2014

The Adaptation of Art (and Vice Versa)

by Kayleigh Butler

The societal compulsion to translate pre-existing art to the silver screen today seems almost innate. Once something, whether a book, a theater production, or even an individual experience, has gained a certain amount of public attention, the film industry’s response is almost immediately to scoop up the rights and begin production. And while true art devotees may resist the conversion of whatever stands to be transformed, arguing the perfection of the original, the process undeniably feeds the public’s insatiable desire for more.

Musical theater, in particular, predates the practice of film adaptation: early theater and film productions alike almost instinctively used catchy songs to tell their narratives. Mainstream film eventually diverged from this technique, but has over time repaired its relationship with the genre by popularizing existing musicals through film adaptations. Notably, films like Chicago (2002), Rent (2005), Mamma Mia! (2008), and most recently Les Misérables (2012) have brought Broadway to the big screen, inciting widespread public adoration for a subculture otherwise exclusive to those already exposed to it.

Determining the success of such film adaptations, however, presents another challenge, as filmgoers unfamiliar with the originals, filmgoers familiar with them, and professional critics all stand to have varying opinions of the end products. Unfamiliar filmgoers may go either way, depending on their depth of knowledge and affection for musicals in general, but none are as volatile as viewers predisposed toward the original, whether critics or familiar filmgoers, and understandably so. When someone loves a piece of art, or on the contrary hates it, that individual inevitably approaches an adaptation with expectations, entering the cinema with either overexcitement or a premature, begrudging distaste at the thought of a Hollywood translation.

Volatility is an understatement when it comes to responses to the resurrection of the Kander & Ebb musical Chicago, an adaptation that exemplifies the fickleness of filmgoers (specifically those familiar with the 1996 revival production, which won the Tony Award for Best Revival of a Musical). Despite being nominated for thirteen Academy Awards and taking home six, including Best Picture, the movie provoked displeasure from experienced theater patrons.

Film critic Stanley Kauffmann of The New Republic laments that “the net effect of the incessant dazzle is depressing” and that “most of the lead performances are weak.” Note: among the thirteen Academy Award nominations were Renée Zellweger for Best Actress, John C. Reilly for Best Supporting Actor, and Queen Latifah for Best Supporting Actress; among the winners was Catherine Zeta-Jones for Best Supporting Actress.

Anthony Lane writes for The New Yorker, “The setting is so stylized, so shamelessly grounded in a hundred other shows and films, that ‘Chicago’ barely qualifies as a period piece.” He goes on to critique the director’s editing decisions, noting, “Rob Marshall cuts away furiously during every song and this chronic wish to glance aside makes us wonder: could the performers not weather the camera’s unstinting gaze?” Note: Rob Marshall was also among the Academy Award nominees, for Best Director, and Martin Walsh won the Oscar for Best Film Editing.

And so on. Certainly, there exists tremendous value in criticizing stylistic choices in films, especially in adaptations; and, certainly, this particular adaptation, with its many accolades and nominations, represents an outlier in its ability to withstand criticisms like those above. But that these critics, holding minority yet powerful opinions, specifically condemn aspects of the adaptation that were otherwise applauded seems to point toward a pre-existing prejudice against the film.

Even those who admittedly prefer the theater production and criticize Marshall’s direction acknowledge its successes: Joe Morgenstern of the Wall Street Journal writes, “Rob Marshall’s screen version of the near-venerable show looks great, in its razzly-dazzly neo-Fosse way, and sounds good, especially when Renée Zellweger’s gorgeous Roxie Hart is singing her heart out.” But the best reviews are those that acknowledge the film for what it is: an adaptation. Associated Press writer Ben Nuckols commends the adaptive process: “The path from stage to screen was arduous and the director could hardly be greener, but the Broadway-spawned ‘Chicago’ is every inch a movie. It’s kinetic, dynamic and always entertaining.”

That the film industry not only embraces musical adaptations but competes for the rights to make them has a net positive effect, for audience members as well as for the original musicals themselves. It is a saddening truth that for the general population Bob Fosse and his cohort are strangers, foreign entities typically associated with high-culture enthusiasts. Even worse, the Great Recession has facilitated an ongoing depletion of arts programs across primary and secondary schools nationwide, leaving the responsibility of instilling an appreciation for art on the shoulders of parents too often preoccupied with maintaining or finding stable work. For their children, film adaptations of musicals, or of plays for that matter, provide access to the art of Broadway, which at its least powerful is educational, and at its most, inspirational.

Surely, familiar filmgoers, both hopeful and cynical, sincerely hope that the industry will do justice to their favorite productions. And to this hope they are indeed entitled. But to directly compare any adaptation to its original undermines the art itself. Film adaptations are extensions of originals, meant to capture some essence or portray some understanding of the material rather than to meticulously replicate it. In this way, film adaptations are themselves works of art that should be regarded as entities separate from their predecessors. Take Shakespeare’s Hamlet, for example: a theatergoer attending a production of that work expects something unique, an insightful yet unexpected take on the classic text. And on Broadway, shows that run for years straight must continually reinvent themselves to stay relevant and exciting.

Why are films treated differently? Perhaps the prospect of unlimited funds raises expectations, and the thought of A-list actors playing renowned Broadway characters raises doubts. Rightfully so, since an adaptive failure could potentially turn the general public against the original, the opposite of the desired effect. But those fears seem misplaced: directors of recent musical adaptations appear genuinely dedicated to finding the right balance between paying homage and exercising artistic license. By doing so, they bring Broadway home for viewers of all backgrounds, educating minds young and old about culturally significant productions and inspiring some to look further into the dazzling realm of theater. And therein lies the true measure of an adaptation’s success.

The Trouble with Toast: In defense of West Coast coffee-shop culture



I want to talk to you about toast. Toast, and why paying $4 for a slice might not be so unreasonable after all.


Let’s back up a bit. Last week, I ran across an article on my Twitter feed about San Francisco’s “artisanal toast” trend. These days, “artisanal” is often just a buzzword for food that’s trendy and expensive. According to a food blogger from Chow.com, in the case of San Francisco’s toast this means thick slices of homemade bread, spread with small-batch, locally produced condiments like almond butter or cinnamon-sugar, going for upwards of $3 a slice. A little digging revealed that this food fad has recently become a rallying point for those who claim that overpaid tech-industry yuppies are sucking the soul out of San Francisco. The article I read came on the tail end of this internet frenzy and took a distinctly different approach. Rather than bemoaning the coming hipster apocalypse, the author attempted to hunt down the origins of the West Coast toast phenomenon. Although most other articles had focused on The Mill bakery and cafe, author John Gravois claims that the real toast pioneers were at “a little spot called Trouble.”


“Trouble Coffee & Coconut Club” was founded in 2007 by a woman named Giulietta Carrelli, and it’s her story that makes the article so striking. Carrelli describes a long struggle with undiagnosed mental illness. As a result of what she now knows to be schizoaffective disorder (a combination of schizophrenia and bipolar disorder), she struggled to stay afloat beginning in her late teens. Episodes that could “shut her down with little warning for hours, days, or, in the worst instances, months” made it nearly impossible for her to maintain relationships, hold down a job, and find reliable housing. After finding stability and purpose in a job as a barista and a new network of friends in San Francisco, Carrelli scraped together the money to open Trouble because, as she put it, it was time to “build her own damn house.” The shop’s unusual four-item menu (just coffee, whole coconuts, grapefruit juice, and cinnamon-sugar toast) reflects the way certain foods have shaped Carrelli’s life, and the business is both a community fixture and a personal safety net. Having a large group of people who recognize her and rely on the space she has created helps Carrelli hang onto her sense of self. As the article describes it, Trouble is a place that fosters face-to-face connections: it’s hard not to strike up a conversation with someone when you’ve each just ordered toast and a coconut, and it’s impossible not to smile at the gregarious, eccentric barista who hands them to you.


Trouble makes for a great story, but that doesn’t change the fact that a lot of people still scoff at the idea of the toast fad and what it represents. One article, entitled “$4 Toast: Why the tech industry is ruining San Francisco,” neatly sums up most of the reactions I ran across. That author explains the issue this way: “We don’t go to the opera: we overspend on the simplest facets of life...Bake your own bread. Buy regular coffee. Save your money. Aspire to be wise rather than just knowledgeable.”


I get it. I agree that we should aspire to wisdom over knowledge, and that’s why I think all this righteous indignation at $4 toast, cupcakes, lattes, etc. betrays a deep misunderstanding of the culture that created them. Knowledge is knowing that you’re paying several times the value of the ingredients for whatever artisanal snack you’re ingesting. Wisdom is understanding that that isn’t the point.

**********************

I don’t deny that food fads tend to spiral out of control quickly. However much I enjoy them, I’m not here to defend places like Georgetown Cupcake or the price of a Starbucks pumpkin spice latte. I haven’t personally witnessed the toast trend, so I can’t tell you whether or not it has become a trendy scam that’s pricing people out of their neighborhoods. The tech industry may well be ruining San Francisco, but the tone of this argument reminds me of another conversation I often have about my hometown of Seattle. The widespread irritation at an “overpriced” menu item echoes the derisive comments I’ve heard directed at Seattle’s latte-obsessed coffee culture. The problem, as I see it, is a misunderstanding of what $4 toast really stands for.


I don’t blink at spending $4 on a good cup of coffee in the right coffee shop. I’ve happily paid for the “artisanal” cupcakes and, yes, even the toast that has so many people rolling their eyes. Although I love trading stories about obscure new places or strange new menu items, it’s not about the bragging rights, and definitely not about flaunting disposable income. If I just wanted a slice of toast, I would make my own damn toast at home. I don’t go to coffee shops because I don’t know how to make coffee (I do), and I don’t order lattes because they’re a superior caffeine delivery system to cheap drip (they’re not). When choosing between mega-corporations to hand my money to, I don’t choose Starbucks over McDonald’s because the coffee is better or cheaper (hint: it’s neither). When I spend between $4 and $9 on coffee and a pastry, or a slice of toast, I’m paying for a lot more than caffeine and calories.


I am really, really passionate about coffee shops. I was born in Seattle just around the time that Starbucks was becoming a national phenomenon. I don’t hate Starbucks, but to me it’s just a watered-down, corporate version of one of the most important parts of the lifestyle I grew up with. Starbucks has made billions of dollars off the “third place” concept. Briefly, that’s the idea that most people spend most of their time at home or at work; the company wanted to create a “third place” where people could come and just be. You can sit and read, do work, or have a conversation in a space that combines features of the public and the private.


I grew up surrounded by an infinite variety of “third places.” In Seattle, coffee shop culture is about a lot more than finding a place to grab a drink or hide out from the rain. Coffee shops are communities, places that combine familiarity and anonymity. I have my favorites in each neighborhood I frequent, places where I’ll go if I have a few minutes or a few hours to kill. Then there are the new faces, shops opening up every few months which I’ll trek across town to visit. Picolino’s is my home base, an Italian-run cafe around the corner from my house with excellent homemade pastries and espresso that varies in quality from barista to barista. The folks behind the counter are a friendly, odd group, and I’m there enough that I can order “the usual.” It’s a long, bright room with plenty of natural light, closely set tables that seat one or two, and walls decorated with old maps. Anchored Ship Coffee, another favorite close to home, is wedged into an oddly shaped space on a hip block. They serve stellar pour-over coffee ($5), there are plenty of seats at the window counters, and the decor is all repurposed building materials and vintage kitchen cabinets. The Muddy Cup, near a bus stop on my way home from school, is in the living room of a tiny repurposed house, with a full menu of exotic latte flavors (e.g., “fruit loop”), giant comfy chairs, and battered board games. Herkimer has perfect espresso and wide, laptop-friendly tables. Cherry Street Coffee is downtown, with cozy window seats full of vintage rugs and an impressive tea menu. Slate serves only a single, rotating menu item and doesn’t stock sugar or to-go cups, but rumor has it that they have some of the friendliest baristas in the city. My favorites are the places that have perfected some aspect of the experience to the point where I want to tell everyone I know to give it a try. Even more important than finding a single perfect coffee shop, though, is knowing that I can always find one when I need it.


To me, coffee shops mean safety. I can walk into one, buy a drink, and know that no one is going to bother me or expect anything of me as long as I’m there. One of my favorite shops growing up (now, sadly, closed) had a large, unpretentious space and a handful of tables outside on the sidewalk. A handful of local homeless people, some with dogs, would regularly spend days there. The staff left them alone as long as they bought something when they walked in, and in turn they had a warm place to read the paper and sit unmolested. When I was in my early teens and riding the bus alone through downtown Seattle, I always kept a few dollars in my bag so that if I ever had time to kill or needed to sit somewhere safe for a while, I could buy a cup of coffee and sit at Cherry Street Coffee or Seattle Coffee Works. Later, when a close friend of mine was dealing with a bad home situation, she would often slip out to sit at Richmond Beach Coffee Co. until they closed, a two-minute walk that meant hours of peace.


I’m sure there are other places that fulfill this same function in cities with a less established coffee culture than Seattle or San Francisco. But honestly, I’ve never found a substitute. When I first came to Wellesley, I was shocked to find that Boston doesn’t have the kind of coffee shops I grew up with. I’ve found a handful of places in the Boston area that achieve the basics of what I look for in a coffee shop, but the culture isn’t there; walking down the street, I don’t feel the sense of safety and welcome that coffee culture brings. Trouble and Giulietta Carrelli’s story are perfect examples of what coffee shop culture can be at its best. Baristas and the people who own coffee shops are some of the most interesting people I’ve ever met. A city with a lot of unusual, independent coffee shops is also going to be full of people with non-traditional visions of success. Sure, some of them will be Portlandia-style hipster cliches, young tattooed white people who are “in a band” or “working on a screenplay” and seem to have a suspicious amount of their parents’ money at their disposal. But a lot of them will be people with “useful skills in tangible situations” (Carrelli’s phrase for herself and her employees) who are invested in community spaces.


Of course, sometimes the coffee is overpriced, and there is a limit to how much I’ll pay for the perfect slice of toast, no matter how beautiful the coffee shop or how friendly the barista. But I’m tired of hearing people scoff at the culture that created things like hipster toast and artisanal cupcakes. Coffee shop culture is about community, dialogue, and finding a way for anyone to feel safe and at home, no matter where they are or what their actual home looks like. I’ll pay $4 for that.