NEWS & VIEWS


It seems hard to believe that this week the Oval Office, and the world, says farewell to Barack Obama and hello to who knows what. Eight years ago, I wrote a piece for the United Church Observer reflecting on this new, articulate president, all the more eloquent since he was replacing George W. Bush who, if you recall, suffered from syntax problems. The essay has been posted on the magazine’s website under the Retrospect category: www.ucobserver.org/retrospect/2017/01/preacher_president


Preacher president: Those who study oratory hear the power of the pulpit in Barack Obama's cadence


By Larry Krotz

Editor's note: Eight years ago this month, Illinois Senator Barack Obama made history when he became the first African-American to assume the office of president of the United States. The Observer commemorates the anniversary of Obama's first inauguration by republishing this story from January 2009.

When Americans elected Barack Obama their 44th president, they were hoping he would bring to the White House something amorphously called “change.” While the jury on this might need to stay out a bit longer, what America got for sure was oratory. From the very beginning, Obama’s candidacy was built on his extraordinary ability to stir crowds with the spoken word. We can safely assume that oratory will be a defining feature of his presidency, starting with his inaugural address this month.

Obama delivers oratory, or rhetoric, of a kind heard only rarely in our culture. Lincoln, Churchill, Kennedy and King hold iconic positions, but a great deal of time has elapsed since the last of them held sway. So it was a delight to have this skinny fellow with roots in Kansas and Kenya emerge unexpectedly in 2008. The rhythm, the soar of the cadence, the timing — listening to Obama as he campaigned, even on television, was like being at a Pavarotti concert. Even if you couldn’t understand English, it would still sound good.

Obama’s abilities were matched only by the public’s hunger to hear him. Through him, we realized how starved many of us were for the spoken word delivered in front of great audiences. By the time he got to his big speeches — the one delivered in Philadelphia in response to the troublesome questions surrounding his former pastor Rev. Jeremiah Wright; the one in the Denver football stadium when he accepted the Democratic nomination; the one in Chicago’s Grant Park on election night — we were no longer surprised that he was hitting home runs. The crowds grew with each event, reaching 80,000 in Denver and 150,000 in Chicago. The pundits who examined these performances set their expectations high and dissected each speech to determine whether it had been the barnburner they’d hoped for.

Obama’s strength, believes Rev. Bruce McLeod, former United Church moderator and no slouch himself as a speaker, is that his words come from the heart; his sincerity is never in doubt. “The power of spoken words goes more from heart to heart than from head to head. His passion for hope was always infectious. Even when talking about mundane things like taxes, he kept everything in the context of heading for the Promised Land, a biblical connection, and he hooked us.” McLeod, who has taught preaching in seminaries, points out how the rhythm of a great speech or sermon is directed by the pulsing of the speaker’s blood. “So it truly is from the heart.”

Rev. Anthony Bailey, team minister at Ottawa’s Parkdale United, agrees with McLeod. “The tremendously positive impact of his oratorical prowess stems from the fact that it is woven inextricably with his character, his integrity,” he says. Bailey was born in Barbados and raised in Montreal at Union United, a predominantly black congregation. He has lived and ministered in Kenya and Jamaica and has studied the African-American preaching style under Rev. Dr. James Forbes of Riverside Church in New York City. One might ask him whether there is a parallel to preaching in Obama’s oratory. According to Bailey, “In terms of his particular oratorical style, it must be acknowledged that he has benefited from the cadence-specific genre of the African-American preaching tradition. In the formative period, when Barack was exploring his Christian faith and his identity as a biracial person, his chosen congregation at Trinity [an African-American congregation of the United Church of Christ] in Chicago were mentors in that undertaking. Barack has tailored this oratorical genre for broader appeal.”

Robert Reid of Iowa’s University of Dubuque studies public speaking for a living. As chair of the university’s communications department and a former Baptist minister, Reid is awed by the agility with which Obama functions, his “masterful ability to adapt his communicative style to the medium and context.” But with regard to the public soapbox, Reid pronounces Obama “a charismatic speaker who draws on elements of the black preaching tradition balanced with a plain-speech style.” If Obama reminds some of a great preacher, they are also reminded that he has come at a time when traditional preaching has reached a nadir. With the aging and retirement of Rev. Billy Graham, even the big-time evangelists seem in short supply, whereas mainline churches, many argue, gave up on preaching decades ago. Maurice Boyd, who held United Church congregations rapt in Sarnia and London, Ont., as well as congregations in New York City, once lamented the demise of preaching: “In the 1970s, word came down that preaching was passé. They had decided it did positive harm, it was something uncongenial to our present culture.”

The larger context is that the spoken word does not have an easy go of it in our culture. Part of us, perhaps after being cautioned about Hitler and other demagogues, doesn’t trust it. Some of us are scornful. “Oh, he is eloquent,” an exasperated John McCain complained while aimlessly roaming the stage during the presidential debates. “There he goes again.”

Of course, there are lots of words spoken in the classrooms and lecture halls of schools and universities, from the pulpits of churches. But it is rarely oratory, the call that soars and infects, inspires and moves. When I was a child, Canadian politics had John Diefenbaker, whose gift for the platform often descended into a harangue. So much of our political oratory falls short: the mangled syntax of George W. Bush or Jean Chrétien; the lectures of Stephen Harper; the blatant falsehoods that poor Colin Powell had to deliver to the United Nations in his argument to invade Iraq.

As for Obama, people are quick to point out that his words, and everything else about him, may have raised expectations too high. Obama (maybe like Jesus) allowed the public to invest him with whatever they most wanted and needed, and they would inevitably be let down. Reid finds Obama “most alive, most vibrant when given an opportunity to inspire others to the possibilities of what government can accomplish for people. He has found a way to embody these performative leadership skills when he speaks, which permits listeners to see him, not as a mere manager, but as a leader.”

But as with great orators before him, the issue is perhaps not what he can do but how he makes his listeners believe in what they can do. Bailey points out that “People, deep down inside, want the most honourable and noble aspirations of their souls to be animated. That doesn’t necessarily mean that their behaviour, attitudes and lifestyle will change to follow suit. But it does mean that many will host that possibility because of someone whose integrity, character, vision, faith, intellect and commitment has been conveyed in a substantive and inspiring way.”

Barack Obama’s real gift is to make even the most jaundiced and cynical of us want to shout, “Yes We Can!”



Forever

Originally published: United Church Observer, May 2015

How close are scientists to helping us live a great deal longer? Researchers are using cutting-edge science in a quest to defy mortality. How far should they go?


In 1513, when the Spanish explorer Ponce de León was searching for the fabled fountain of youth, he went to Florida. After wading through malarial swamps, the legend goes, he came upon some disease-free spring water around what would become St. Augustine. He declared it the fountain of youth, and it made his fame. Five centuries later, when I am looking for something similar — an explanation for aging and an answer to the question of whether we might do something about it — I get on a plane for California. My destination is the Buck Institute, a stunning building of glass and white travertine stone designed by I.M. Pei and set like a jewel on 200 hectares of wooded hillside in Marin County, north of San Francisco.

In 1900, the average life expectancy for a Canadian was 50. Some, of course, lived much longer, but those who died as infants or in childbirth or of various infections kept down the averages. By 1990, with improved sanitation, antibiotics, vaccines and better practices during childbirth, average life expectancy had risen to 74 for males and 81 for females. By 2012, it was 77 and 82. Could it improve any further? The low-hanging fruit has already been plucked. Further gains would require crossing a new frontier: making old people live even longer — perhaps dramatically longer.

The Buck Institute bills itself as the world’s leading private institution dedicated to the study of aging, and its scientists even coined a term for their investigation — “geroscience.” They dare to ask: Why do we age? And is it possible to slow, stop or even reverse that process? The questions challenge assumptions that hold sacred places in our culture: that aging is a natural and inevitable process, and death is as certain as taxes. How much should we tinker with what most of us believe is a set limit to the human lifespan?

As I started my research, I felt unsettled. Apparently I’m not totally enthralled by the possibilities of science transcending nature. I couldn’t escape a feeling of disloyalty to the proper rhythms of the universe. The Psalmist pegged the ideal human lifespan at threescore years and ten, or even fourscore for the strong (that’s 70 or 80). We are told that Moses lived to 120, Noah to 950 and Methuselah to 969. But I always assumed those numbers were exaggerated — or somehow from a different calendar. The longevity record in modern times is 122, held by a French woman named Jeanne Calment, who died in 1997.

It’s hard to argue that longer life is a negative thing. Yet there have always been opposing camps. Aristotle described the elderly as those whose “passions have slackened, and they are slaves to the love of gain.” The centuries-long quest to turn nature on its head has included some wacky practices, such as implants of goat testicles and ingestion of radium-laced elixirs. In 1934, poet W.B. Yeats emerged from what was in effect a vasectomy at age 68 claiming to be frisky as a colt. Scientific research into aging has roots in the 1960s, but it got going in earnest in the early 1990s with a generation of scientists who remain active today — several of them at the Buck.

The Buck Institute was established by the estate of Marin County philanthropists Beryl and Leonard Buck. Leonard, a pathologist whose family owned an oil company, died in the 1950s. His widow, Beryl, died in 1975 expressing the desire “to extend help towards the problems of the aged.” Twenty-four years and several lawsuits later, the Buck Institute opened its doors, funded by income from the estate. Kris Rebillot, director of communications, explains that 17.5 percent of the institute’s present annual budget of US$32.5 million comes from the Buck Trust. To support the work of the 229 employees, each of the 23 faculty members augments this funding through grants.

Rebillot takes me below the gleaming upper floors into an underworld of laboratories chock-full of expensive equipment. Microscopes, freezers and cell-sequencing apparatus share space with thousands of research subjects: mice and fruit flies, yeasts and microscopic worms. These tiny specimens are where the strategies that will someday be applied to humans are tested. Institute CEO Brian Kennedy has spent the past 25 years looking through his microscope at yeast. His colleague, Gordon Lithgow, has spent equivalent time with tiny worms called C. elegans. Judith Campisi and her post-doctoral students work with mice, while Pejmun Haghighi spends his days with fruit flies. It’s not that we are reduced to these tiny beings, but the other way round: to understand how the cells in more complex beings function, you need to start with the single-cell organisms and work up.

Geroscience is different from research into the various diseases that hit us when we get old, though Haghighi (who is also on the faculty of McGill University in Montreal) looks for clues to Alzheimer’s in the proteins of his fruit flies, and Campisi (who has a posting at the University of Alberta by virtue of supervising one of its PhD students) hopes to postpone age-related cancer through what she is learning from the cells of her mice. Geroscience also differs from gerontology, the examination of everything to do with aging, including the socio-economics.

Lithgow, a Scot who maintains his soft brogue, explains that the beauty of his worms is their 20-day life cycle. “Things go on in the worm during those 20 days that take 80 years in us.” The highlight of his career came in 2000 when he and a colleague changed a single gene and managed to make one of their worms live not for 20 days, but 40.

Campisi cranks up a PowerPoint presentation to explain what she has learned about how our cells age and the mischief that leads to heart and vascular conditions, neuro-degeneration, diabetes, organ failures, osteoporosis and cancer. “The occurrence of these diseases rises exponentially after age 50 or 60, and the evidence is mounting that there is something about the process of aging itself that is driving them all,” Campisi says. “That is our hypothesis, and if we’re right, we will absolutely revolutionize modern medicine.” What she predicts is that the silo model, where specialists look at one disease at a time, will collapse and geriatricians will instead treat aging itself. The focus will be on avoiding or postponing disease by creating a generally more healthy body that deteriorates more slowly. “It’s still a dream that we will be able to tackle diseases in this way,” Campisi adds, “but it’s no longer science fiction.”

What Campisi, who hails from New York, has figured out over 35 years is that as we get older, multitudes of useless “senescent” cells hamper our bodies’ ability to fight off all the above-mentioned diseases. In collaboration with colleagues at the Mayo Clinic, she has been able to inject powerful drugs that help mice get rid of senescent cells and, as a result, clear their bodies of cancer tumours. It’s only a matter of time, she believes, until drugs achieve the same for humans.

Integral to many theories of longevity is something called caloric restriction (CR). In the 1930s, scientists at Cornell University observed that severely restricting dietary calories in lab rats — while maintaining micronutrient levels — resulted in life extension. In subsequent decades, experiments have been carried out with dogs and primates, in all cases achieving extended lifespans.

Severe calorie restriction, however, has side effects. No one wants to feel cold and experience slower healing of wounds or reduced sex drive, acknowledges Brian Kennedy. “People don’t want to be malnourished, and few want to put in the effort to gather the knowledge to do it [CR] properly. You can’t eat two Snickers bars and call it a day.”

The trick, then, is finding a drug that will mimic what was achieved in the rat studies while letting humans continue to eat normally — what’s known as a “mimetic.” Writer David Stipp, whose book The Youth Pill details research initiatives into human aging, states that CR mimetics from the early 2000s began “to transform the anti-aging quest from an endless guessing game into a fairly routine exercise in drug development.”

Some initiatives have stumbled after promising debuts. An exciting moment in the early 1990s was identifying the so-called French paradox: despite rich diets loaded with sauces and cheese, the French were discovered to have healthier hearts than Americans. Could it be because of the red wine they consumed?

Researchers soon isolated an ingredient called resveratrol. When tested on yeasts and finally on mice, it extended both lifespan and energy. Then it was acknowledged that the average person would have to drink 300 glasses of wine per day to replicate the dosage and effect.

Something currently considered promising is rapamycin, a member of a class of drugs called mTOR inhibitors, already shown to be capable of counteracting aging and delaying age-related diseases in mice and other animals. This is where Kennedy is directing much of his research energy. In one trial, mouse lifespan increased by 38 percent. In 2014, the pharmaceutical company Novartis showed rapamycin to be safe for humans in a six-week trial with older people who were receiving the influenza vaccine. The drug zeroed in on a genetic signalling pathway associated with immune function and aging, and improved the participants’ immune response to the vaccine by as much as 20 percent.

The 25 years of Kennedy’s scientific career match the trajectory of modern aging research. Now 51, the Kentucky-born scientist has been, since 2010, CEO of the Buck Institute after stints at MIT in Cambridge, Mass., and the University of Washington in Seattle. “Skeptics have always argued that aging is a natural process, and you could never test drugs for aging in humans because you shouldn’t give drugs to healthy humans,” he says. He sees the work of his colleagues, both at the institute and in the increasing number of research programs in universities around the world, as preventative medicine. “We don’t want people living longer if it’s just prolonging their time suffering the diseases of aging. We want to see if we can delay or postpone those. People imagine we’re trying to make rich people live longer. We’re not doing anything different than people targeting one disease at a time. We’re just trying to come up with an intervention that will prevent multiple diseases at one time, something that will prevent cancer that, by the way, might also prevent Alzheimer’s.”

While the Buck Institute scientists seek pharmaceutical keys to extending longevity, that goal has already been achieved elsewhere through more natural means. A decade ago, National Geographic writer Dan Buettner set out to find “blue zones,” places around the globe where people seemed to live extraordinarily long lives — naturally. Buettner identified five: a corner of the Mediterranean island of Sardinia, Okinawa in Japan, Ikaria island in Greece, the Nicoya Peninsula of Costa Rica and — the next stop on my journey — Loma Linda, Calif.

Each blue zone community had its defining characteristic. What distinguishes Loma Linda is the fact that most of the 23,000 people who live there are Seventh-day Adventists — the church founded in the 1860s that, among other things, follows Old Testament rules (no tobacco, alcohol or pork) and observes a strict Saturday Sabbath. Adventists are also deeply committed to health, having established hospitals all over the world. One of their largest health-care facilities is in Loma Linda where, in 1905, Ellen G. White founded a nursing school that became Loma Linda University; a community grew up around it. White’s 1905 book, The Ministry of Healing, became a bible of proper nutrition.

Well before the arrival of Buettner, Adventists realized they had something special going on. In 1960, 23,000 Californian members of the church joined the Adventist Mortality Study. Through it, they discovered that if the death rate from coronary heart disease in non-Adventists is taken as 100 percent, then deaths among Adventist men were only 66 percent of what was expected (and 98 percent for women). Overall cancer deaths, again with the non-Adventist population as the standard, were 60 percent for Adventist men and 76 percent for Adventist women. Mortality was also lower with specific types of cancer: the lung cancer death rate among Adventists was 21 percent; the colo-rectal cancer death rate was 62 percent; the breast cancer death rate for Adventist women was 85 percent; and the prostate cancer death rate for Adventist men was 92 percent.

In 1974, another study of 34,000 California Adventists began, this time to determine what components of their lifestyle contributed to the lower rates of diseases. Yet another study was launched in 2002, with 96,000 Adventists from across the United States and Canada. This one, which is still ongoing, is exploring the links between lifestyle, diet and disease.

Larry Beeson is a cheerfully relaxed professor at Loma Linda University’s School of Public Health. The 68-year-old, who sports steel-framed glasses and mutton-chop whiskers, classifies himself as a “lifer.” Beeson came to the university as a student and was welcomed onto the faculty in 1973. He’s been there ever since, teaching generations of students and joining the research into their one big question: Why do Adventists live to such a ripe old age? Male Adventists outlive other men in California by 7.3 years, and female Adventists outlive other women by 4.4 years. And not, emphasizes Beeson, seven and four dragged-out years but, by and large, vigorous, healthful, productive years. “We die in the end of the same diseases — heart and stroke and cancer — but we do it later and live well up until that end.”

Adventist studies compare the health and longevity effects of various diets. Beeson (who consumes eggs and milk but not meat) notes that vegans have the lowest rates of Alzheimer’s disease and dementia, a number that climbs as more animal products are introduced. A vegetarian diet is recommended for church members but not enforced. The culture of the town supports such a lifestyle with lots of vegetarian options and soy milk on restaurant menus and in grocery aisles. School cafeterias are vegetarian.

The conclusions of the studies reveal no mysteries to someone like Beeson who rhymes off the essentials: “Eat a plant-based diet; maintain normal body weight — avoid being obese; don’t smoke or drink, except lots of water; eat lots of nuts; maintain a faith base — bring God into the picture.” This last point particularly intrigues him. “When Dan Buettner came here,” Beeson observes, “the one thing that separated Loma Linda from the other blue zones was the 24-hour Sabbath. That is, the time to de-stress, put my work aside and replace it with concentrating on my relationship to God, to my family, to my friends.” In the current ongoing health study, a sub-study, designed by psychologists and members of the faculty of religion, will try to quantify the effect of being a religious practitioner on health and longevity.

What is it like to live in a town with a substantial population of seniors? As I drive around the spanking-clean streets, I note that the civic centre is promoting a scrapbooking workshop for seniors. The organic produce supermarket has basket-mounted scooters for mobility-challenged customers, as does the campus food market. But this is not a seniors’ community like you might find in Florida; Loma Linda is definitely mixed. When I peek in the door of the university gym (which is open to the public), I note as many grey-haired persons as students sweating on the machines. While seniors make up 14 percent of Loma Linda’s population (compared to the California average of 11 percent), there are also the 4,500 students at its medical schools (many of whom are Asian), as well as the university’s faculty and staff members and their families. It is not a homogenous place.

Except in the religious sense. Rev. Randy Speyer, one of 17 pastors at the 6,500-member Loma Linda University Church, tells me that there are almost no churches other than Adventist inside the town limits, and 30 more Adventist churches in a 10-mile radius beyond the town. With those kinds of numbers, it’s not hard to keep the Sabbath customs. Speyer spent 20 years as a counsellor in northern California before coming to Loma Linda’s church last year. This is the first time he’s been in such a monolithic community, as well as one with such a preponderance of older people. Five of LLUC’s 17 pastors are dedicated to ministries with seniors. But some noteworthy things are observable: the generations mix, and seniors teach children in Sabbath school. Seniors also take leadership roles on committees and are committed to continued education. When people are spry and vigorous in their 80s and 90s, it redefines the term “seniors.” Lots of 65-year-olds, Speyer observes, remain working as teachers and doctors, enjoying their professions and the contributions they make.

As a man of a certain age, I’ve decided to embrace John Burroughs’s dictum that “old age is always 10 years older than I am.” Yet, although I’d be happy to see aging postponed, I also feel I must be prepared to gracefully leave the stage — not hog it at the expense of those coming after me. Nothing in the healthy mix of generations in Loma Linda, nor in the modest and reasonable goals of the Buck Institute scientists, offended that instinct. As Kennedy assured me, they’re not trying to make “rich people live longer.” In fact, they know they have big challenges still ahead.

Lithgow acknowledged that what has most surprised him in the 15 years since the eureka moment when his worm doubled its life is how slowly things have moved. “I guess we thought that would be a watershed moment and the walls would come tumbling down,” he said. “Money would come into the field; pharmaceutical companies would pop up recognizing that aging and disease were going hand in hand. It didn’t happen.” Instead, geroscience remains at the edge of respectability, with the pharmaceutical industry slow to invest because government regulators aren’t yet on board with anti-aging drugs.

“The complexities of humans still encompass huge unknowns,” Lithgow admitted. “There are holes in our understanding. The big challenge remains translating what we know from studying worms into something of import for humans. That is painstaking, time-consuming and expensive.”

Coincidentally, while I was travelling through California last February, Time magazine put human aging in the public eye, featuring a baby on its cover with the provocative headline, “This baby could live to be 142 years old.” The sensational pronouncement played into the hands of enthusiasts, who love to trumpet the possibilities of 90-year-olds who look and feel 45.

We already take some things for granted. Today’s medicine prolongs life each time it treats or cures an individual’s disease. But will a combination of drugs and geroscience prolong life for the population as a whole? What about the more extreme possibilities of turning back the clock — actually making people younger? This has already been achieved in the lab: blood from young mice has been transfused into old mice, heralding a futuristic horror show where the old, like vampires, might feed on the young. And despite Lithgow’s gloominess about financing, some very big players are showing up. In 2013, the Internet giant Google set up Calico (California Life Company) with an investment reputedly in the range of $1 billion. The specifics of Calico remain secret, but its project is generally to “harness advanced technologies to increase our understanding of the biology that controls lifespan.”

This brought me rudely back to reality. Science will plow forward, but so will the accompanying implications. In her lead essay in Time, Laura Carstensen, founding director of Stanford University’s Center on Longevity, postulates that “our aging society presents challenges every bit as fundamental and pervasive as climate change and globalization.” Carstensen’s centre, established in 2006, has charted how quickly the demographic structure has been reshaped. Once a pyramid, with a small quotient of elderly at the top, it now looks more like a rectangle, with the aged population equal in number to the young. The implications of this shifting demographic are both financial and structural (things like architecture with no stairs or automobiles that drive themselves). But they’re also ethical. How will the changing demographics affect things like global population pressures, demand for resources and political tensions between the young and the old?

The idea of a fountain of youth retains its imaginative power, and the possibilities are more enticing than ever. But the issues at play are not simply scientific. And we’re going to have to start talking about them.



Who curates evil?



Originally published: United Church Observer, March 2014

There is something inherently perverse about the as-yet-unfinished Canadian Museum for Human Rights, set to open later in 2014, rising from the flat plain between a parking lot and a baseball stadium at Winnipeg’s Forks. When you get right down to it, the $351-million dream of the late media mogul Izzy Asper, the only national museum outside Ottawa, is being built to celebrate evil. Celebrate is not the right word, of course: itemize, or document, or memorialize, perhaps. There will be celebration, but it will be of survival against horrible odds, of endurance against the atrocities that human beings inflict on one another. If you and your people are able to come through the worst of horrors, that in itself is cause for celebration. But, another perversity, getting to be part of the museum’s litany of narratives has become, almost naturally, a kind of race to the bottom. Equally bad things happened to me as to you. Or, as the museum’s developers have been hearing since almost the day they started work, ‘the story you are proposing to tell about me is not nearly so bad as it should be.’ Disputes during the half-decade the museum has been taking shape have almost constantly overshadowed what its promoters most wanted to highlight: the symbolism of the building’s stark, rugged Tyndall stone; its glass atrium suggestive of the enfolding dove wings of care; the spiral staircases leading up to a light-filled, 100-metre-tall Tower of Hope. Possibly, a museum of something as touchy as human rights should expect this. It is a museum of grievances, and it is very hard to make the aggrieved happy.

The contentiousness has all come from groups feeling their particular stories weren’t going to be given sufficient scope and play. The Ukrainian community lamented that Holodomor (the starvation imposed by Stalin) exhibits were going to be too close to the washrooms; Palestinians objected to being left out entirely; even Jews, whom Asper envisioned as central to the museum, were reportedly upset that the founding of the state of Israel was not going to be commemorated. But the nascent museum’s most heated controversy has been the growing insistence that exhibits depicting the story of First Nations peoples need to carry the word ‘genocide’ in their titles, and the museum’s resistance, so far, to doing so.

The government of Canada currently recognizes five genocides: the Holocaust, the Holodomor, the Armenian genocide of 1915, the Rwandan atrocities of 1994 and the Bosnian ethnic cleansing. The aim of activists is to add one more to the list. For them, the museum is a testing ground, and when their overtures are resisted, they are angry. The Manitoba Assembly of Chiefs claims that when the Southeast Tribal Council donated $1 million (profits from its casino) in 2009, it did so “with the understanding that the true history of the treatment of First Nations people would be on exhibit.” When that didn’t happen, Chief Murray Clearsky wrote scathingly to museum CEO Stuart Murray this past summer: “It is now abundantly clear that Canada is choosing to sanitize the true truth and continue with their agenda of minimizing the many attempts of genocide perpetrated against the many peoples of this land.”

The project to delineate much of what happened to First Nations peoples after European contact as ‘genocide’ is, of course, much bigger than the museum. Last July, a potent shot was fired through an op-ed column in Canada’s largest newspaper. “Canadians need to face the sad truth that the country engaged in a deliberate policy of attempted genocide against First Nations people,” former Assembly of First Nations national chief Phil Fontaine, along with two active members of the Jewish community, Michael Dan and Bernie Farber, wrote in the Toronto Star. Their arguments were built on now-standard history, residential schools and the transporting of prairie peoples onto reserves, along with some recently surfaced documentation of things like experiments conducted as late as the 1950s, in which residential school children were effectively starved in the interests of nutritional research. These, said Michael Dan, reminded him chillingly of “Nazi medicine.” The authors consider it self-evident that the many nasty actions perpetrated against native peoples over three centuries since European contact fit inside the definition of genocide as the United Nations constituted it in 1948. Dan, a physician, says being a doctor makes him clinical about it: “The UN definition is there, so you look; something either fits the criteria or doesn’t. Many things that happened to native people fit the criteria.”

And while this position is shared by growing groups in academia, the media and elsewhere, it is problematic and carries profound implications. Such a reassessment of Canada’s history is unsettling not least because, if it becomes commonly accepted, we’ll be acknowledging as perpetrators not far-distant Nazis or Stalin or Ottoman Turks, but the Canadian government and its colonial predecessors. Ourselves and our ancestors. Even our churches, which, as in the residential schools, committed evil while believing they were doing good. The most troubling of constructs.

In our world, ‘genocide’ is absolutely the worst thing you can say about an action undertaken by individuals or groups. To have to confront it as something your own government might have participated in or engineered is truly horrible. So horrible, in fact, that many events that carry the characteristics of genocide fail, or struggle, to get named as such. Behind all this is a substantial problem with the word itself. The horrific things that happened not to people but to peoples all down through history essentially went without a name until Raphael Lemkin, a Polish-born jurist who lost the whole of his family to the Holocaust, seized on the Greek geno, meaning race or tribe, as the root for a new term, declaring that genocide occurred when you and your group were targeted not because of what you had done, but because of who you were. Ironically, formalizing genocide as a crime in some ways seemed to augment rather than solve problems. After much lobbying and debate, the newly formed United Nations in 1948 passed Resolution 260, which defined genocide as something both to be prevented and punished. But many resisted, including the United States, whose Senate took nearly four decades to ratify it. Possibly this was because those who failed to prevent genocide were also at risk of punishment.

Another problem: which horrific events would be allowed to claim the name? What happened to the Armenians at the hands of the Turks in 1915 was retroactively termed a genocide, a label still much protested by Turkey. Meanwhile, whether the violent slaughter that rent Rwanda in 1994 was actually genocidal in nature continues to be disputed in some circles. One issue seems to be that the term itself has been elevated onto such a special tier of evil that its use is both jealously guarded and jealously coveted, franchised out, if you will, to specific victims. Harvard scholar Samantha Power (now US ambassador to the UN), in her 2002 Pulitzer Prize-winning book A Problem from Hell: America and the Age of Genocide, describes how for almost the entire hundred days it took the Rwandan catastrophe to play itself out, the UN Security Council and the various arms of the U.S. government were locked in a semantic debate about whether to use “the G word.”

Genocide is a legal as well as a descriptive term. It is also duelled over by politically motivated activists on one side and, on the other, scholars who need to be rigorous about history. To them, levels of intent are important, as is the notion that one size doesn’t necessarily fit all. William Schabas, a Canadian-born international law scholar at Middlesex University in the U.K., told a CBC radio interviewer that the term carries “a special stigma that distorts a debate.” This is a view shared by the first academic I approached for an interview, a historian who hastened to tell me the topic has become so politicized he didn’t wish to go on record. Like Schabas, he does not deny that awful things happened to native peoples in Canada, but argues that the use of the term genocide makes it difficult to look with any degree of precision at even the awful things. Conventional wisdom and political correctness take over, and the resulting chill prevents historians from discussing what they should discuss, and what is useful to the rest of us: the implications of the choices made in our public discourse.

The application of the term ‘genocide’ to what happened in North America goes back to books from the 1970s (The Genocide Machine in Canada: The Pacification of the North, by Robert Davis and Mark Zannis), the 1990s (American Holocaust: Columbus and the Conquest of the New World, by David Stannard) and 2004, when Dean Neu and Richard Therrien published Accounting for Genocide: Canada’s Bureaucratic Assault on Aboriginal People. Andrew Woolford, a professor of criminology at the University of Manitoba specializing in genocide studies, predicts that the term will become ever more central to discussions about Canada. In 2004, he was the only Canadian scholar presenting at a genocide conference; nine years later, there were seven papers on Canada at an international conference. “There is a generational shift,” he says, “where younger academics want to look at Canada through the critical genocide studies lens.” Still, he laments that people would fear being labelled ‘denialists’ should they disagree with even a part of the thesis presented. “The role of scholarship should be to complicate rather than simplify things.”

On the positive side, Woolford argues that should it become general consensus that acts of genocide were perpetrated against Native people, the results would in actuality be beneficial. “For the survivors, recognition is important. From a more general perspective, my angle is that thinking about ourselves as a nation born out of genocide gives us a point to reinvent ourselves, to think about how we can de-colonize Canada and be different as a nation.”

For Michael Dan, one of the Star op-ed’s authors, it should be about healing. As a Jew, he says, he has spent a lot of time trying to process ideas of genocide. “In Canada we have trouble processing the idea we are capable of it. It doesn’t go with our being peacekeepers, a nice country that is apologizing all the time. But in order to heal we have to acknowledge that we did this.” Phil Fontaine sees acceptance as closing a still-gaping circle. “Some people say it’s going to be just another money-grab,” he allows. “Not so. It was never intended as something that would extract more money from the government. But there has to be a series of conversations with Canadians so together we can write the missing chapter in Canadian history, one that would have to include this notion of genocide.”

Something else about genocide is the thorny question of responsibility and guilt. As Hannah Arendt famously wrote of the Second World War Holocaust, in terms quite possibly appropriate to the Aboriginal situation in Canada, racist policy, though sometimes couched in the language of good intentions, bears a basic responsibility. But then, along with some unarguably violent assaults, the majority of destructive actions are carried out by people simply doing their jobs (in Canada’s case, possibly a lot of church people), blind, sometimes willfully, to the implications of a bigger picture. This, as Arendt termed it, is the banality of evil.

Rt. Rev. Stanley McKay says that the idea of there having been genocide is difficult for our society to wrestle with. “We are completely caught up in the Canadian concept that somehow we were doing good; the church in particular had the interests of the First Nations in our minds and hearts when we did these things.” The first Aboriginal moderator of the United Church, now retired north of Winnipeg, says he feels that though it will encounter strong resistance, the project to identify some actions as genocide is important. “Years ago those of us who lived on reserves and went to residential schools experienced racism without having any idea what to do,” he says. “We had no idea we had rights to have things different.” When asked if the Church has a role in the emerging discussion, he answers quickly: “Yes, a fundamental role. The credibility of the Christian community is on the line as this information becomes more widely available and people can no longer claim ignorance. The future of the church rests on its capacity to engage and develop right relations.”

Rev. James Scott, General Council’s Officer for Residential Schools for the past eleven years, has observed the term ‘genocide’ gain traction over time, with “more Aboriginal people now using the word.” In response, he says that in the fall of 2013 his staff flagged the importance of having a conversation about the use of the term within the Church. “We need to move as a settler society to grapple with the breadth and depth of what harm we did,” he told me. Still, he pleads for caution. “Genocide is a very incendiary word that sometimes might be a barrier in the way of having people talk about important things that really happened. If you scare people away, they won’t want to hear the truth.”

Says Scott, “There may be gradations of how blunt we can be, but those gradations need to move forward. We need to learn and help others understand the profound brokenness we created.”

Everybody struggles in this manner. In Winnipeg, museum staff wrestle with what they see as their proper responsibility. “If the museum were to use the word genocide, it would make a declaration it has no right to make,” museum spokesperson Maureen Fitzhenry told the Winnipeg Free Press. “We are not a court that adjudicates,” Clint Curle, head of stakeholder relations, told me, “but a place to hold the conversation. We believe this is our proper and also welcome role. Education may be more effective than adjudication in helping Canadians grapple with the human rights issues in our past. That is also more in keeping with the museum’s capacity.”

What should Canadians feel? Should we be appalled by efforts to lump us in with history’s more vile regimes, or should we welcome a more blunt interpretation of our national story? Is it important and necessary that we feel a greater shame than we already carry with regard to the history of relations with native peoples?

Andrew Woolford quotes the German philosopher Jürgen Habermas, who acknowledged of his own country, “we are participants in a form of life that made genocide possible and this is why we need to critically interrogate the past and our present.” Applying this to our own time and place, he adds, “I think we need to interrogate the Canadian past and the Canadian present and work towards social change.”