By Dr. Sander A. Diamond, professor of history
At exactly 11 o’clock on the morning of Nov. 11, 1918, the Great War, which began in 1914, came to an end. By the time the Armistice was signed, more than 10 million had been killed. Most military scholars agree that 6.8 million men died in combat and another 3 million-plus from accidents, disease, or in POW camps.
During the war, there were two major fronts: the East, where the Russians and Germans fought, and the West, in Flanders and Northern France. Before the war destroyed the landscape, the fields looked like a van Gogh painting, filled with flowers, cows, and sheaves of wheat mixed in with red poppies. In 1915, John McCrae published the most famous poem of the war: In Flanders Fields. In time, every school child memorized the poem that began:
In Flanders Fields the poppies blow
Between the crosses, row on row…
The British Legion chose the red poppy, fabricated out of wire and red paper, as the symbol of remembrance. Poppies were worn at the first Armistice Day and have been worn ever since.
The first Armistice Day was commemorated Nov. 11, 1919, at 11 a.m. in all Allied capitals amid hushed crowds, bells ringing out the hour. In towns and cities large and small, groups of veterans and their fellow citizens gathered around makeshift memorials to mark the hour. Nearly all of the veterans wore their uniforms, their chests decorated with medals. In the United Kingdom and its possessions, they called it Remembrance Day. It was called Armistice Day in the United States until changed to Veterans Day in the 1950s to honor all who served in later conflicts.
The reality of the loss was everywhere to be seen: women in mourning dresses; fathers with black armbands mixing with veterans horribly deformed, many in wooden wheelchairs or on crutches with missing limbs, some with faces so disfigured that they wore masks; those whose lungs had been poisoned by gas, still coughing.
The bodies of those who fell in battle were collected and many put in temporary grave sites near where the battles were fought. Families were given the option of having the remains sent home or buried in Europe. It was a painstaking task; often, the remains had to be exhumed. Of the 116,708 soldiers who died as part of the American Expeditionary Force in World War I, 30,921 are buried in Europe.
The bodies that were returned home were buried in local or military cemeteries. In the inter-war years, memorials were erected in many towns and cities. In 1921, Congress created a single memorial in Arlington National Cemetery, on the property once owned by General Robert E. Lee. In France, the bodies of four unknown soldiers were exhumed and brought to a small chapel. A U.S. soldier was handed a bouquet of white roses and asked to place it on one of the four coffins. The one he selected was sent to Arlington and placed in an above-ground sarcophagus made of white marble: the Tomb of the Unknown Soldier. In 1931, a more elaborate tomb replaced the original; after World War II, crypts for the unknowns of later wars were added, and today the memorial is known as the Tomb of the Unknowns. Each Nov. 11, a solemn ceremony is held, with the president placing a wreath on the tomb. It is our most sacred ground, guarded day and night.
Today, as we approach the 100th anniversary of the outbreak of World War I, not one service person remains. The last American, Frank Buckles, died in February 2011 and the last Brit, Harry Patch, just three years ago, victims of what Lincoln called “the silent artillery of time.”
On Nov. 11, the Keuka College community will gather at the World War II memorial that stands near Hegeman Hall. It was dedicated May 9, 2005, the 60th anniversary of the end of World War II. It was a gift to the College and community from the students in the History/Political Science Club. On one face of the monument, all of the theaters of war are listed; on the other, a testament to the Keuka College nurses and the program created in the darkest days of World War II. Nursing graduates have served in all wars since the founding of the program. A few yards away is another small monument remembering the 50th anniversary of the end of World War II. Rising above it is an oak tree, the symbol of the College.
On Nov. 11, the College will remember all who have served our nation and continue to serve. We will also remember the 124,913 Americans who are buried overseas. A formation of veterans from the local VFW will fire their rifles at the end of the commemoration just as veterans have since the first Armistice Day in 1919.
We must take time to remember and honor all of those who served and continue to serve on land, on the sea, and in the air. We also must remember those who are still in the line of fire— in Afghanistan and other regional conflicts. When our service personnel return home, many will return to civilian life as they knew it. Others will enter college and will be welcomed as students on our campus, the two World War II monuments reminding them that Keuka College has always been a welcoming community.
There is also one final nugget from our past, namely, the Field Period program, which was, in part, created in 1942 so some of our students could help bring in the harvest since “the boys” were overseas.
On the surface, hip hop music isn’t something that would warrant serious scholarly investigation.
But when you dig deeper, as Athena Elafros did, it most assuredly does.
“The sociological study of hip hop culture teaches a great deal about culture and society in an increasingly globalized and interconnected world,” said Elafros, assistant professor of sociology at Keuka College.
Her doctoral dissertation, Global Music, Local Culture: Popular Music Making in Canada and Greece, was completed at McMaster University in Hamilton, Ontario, Canada. It featured 62 interviews, as well as song lyrics, in order to analyze how global cultural forms, such as rap music, are rearticulated within local contexts in Toronto and Vancouver, Canada, and Athens, Greece.
“Hip hop music began as a predominantly African-American, Puerto Rican and Latino youth culture in the South Bronx in the mid-1970s,” said Elafros, who earned her undergraduate degree at the University of Toronto. “The loss of good-paying factory jobs within the South Bronx contributed to the poor social and economic conditions within which hip hop culture developed.”
By Dr. Sander A. Diamond, professor of history
Conventional weapons—most of them used in accord with the rules of war and the Geneva Conventions—have killed millions. In the Syrian civil war, an estimated 125,000 people have been killed by conventional weapons; 1,500 died from gas poisoning. Yet gas and other weapons of mass destruction (WMDs) are prohibited by international law, whether used on the battlefield or against one’s own people.
At the root of the prohibition against unconventional weapons is the long-held belief that WMDs are a departure from the norms of war, a cruel and inhuman way to kill either a handful of people or millions. Soldiers and civilians have conditional protection against conventional weapons. This is not the case with WMDs. When sarin-loaded missiles struck a suburb of Damascus, the civilians had no protection. Death by gas is a horror in its own constellation.
In late April 1915, French and British troops were defending the medieval Flemish city of Ypres. For the second time, the Germans had mounted a massive assault; the first attack, in October-November 1914, had failed. Amid the German shelling, which filled the air with grayish-black smoke and the smell of spent gunpowder, yellow clouds appeared. The French troops were the first to inhale what was soon identified as chlorine gas. They began coughing violently, their lungs filling with blood. Disoriented and blinded by the chemical mix, many of the defenders were dead in minutes; others suffered agonizing deaths. Clouds of German-made gas followed the air currents, and the British and French soldiers dropped like flies. An estimated 6,000 were killed; many who initially survived died horrible deaths in field hospitals. Survivors had to live with scarred lungs, and many were blinded. Others went insane or endured a lifetime of deep depression, dubbed in those years “shell shock.”
World War I claimed the lives of nearly 15 million soldiers on all fronts and left more than 20 million wounded and maimed. Of those killed, roughly 90,000 died from gas poisoning, and 10 times that number were disabled. As we remember the Great War, the slaughter that took place at the battles of the Marne, Verdun, and the Somme is recalled with horror. A generation perished. But we will recall the Second Battle of Ypres with a special horror since a chemical weapon was introduced to the battlefield for the first time.
When the war ended in 1918, the Allies dismantled the German military machine, and the gas-filled artillery shells were dumped into the Baltic Sea. In the inter-war period, military planners went to work developing technologies and new inventions that would permit mobile warfare. But gas was deemed taboo, and in 1925 the Geneva Protocol banned its use on the battlefield; the ban was later extended to its use against civilians. Later, biological and bacteriological weapons were also banned.
The prohibition against the use of gas has been violated, but fewer times than most people believe. While gas may have been used in the fog of war on many occasions, the evidence reveals that the Italians used it in the African Campaign in the 1930s, the Japanese in China prior to and during World War II, the Soviets in Afghanistan in the 1970s, Saddam against the Kurds, and most recently, Bashar al-Assad in the brutal Syrian civil war.
Hitler did not open his huge stockpile of chemical weapons on the battlefield during World War II. But this was not the case when it came to the Jews and others the Nazis deemed “life unworthy of life.”
At the time the U.S. Constitution was drafted, little was said about the idea of an administrative state, owing to the Framers’ distrust of centralized power and of the type of bureaucracy that existed in Britain at the time.
“But it has evolved in the U.S. as a uniquely American enterprise; it is still quite small, in relation to the bureaucracies of other industrialized democracies, and arguably more responsive than many of them,” said Dr. Angela Narasimhan, assistant professor of political science.
Narasimhan marked Constitution Day (Sept. 17) in her Public Policy (POL 331) course with a discussion of “Public Administrative Theory and the Separation of Powers,” published in 1983 by David H. Rosenbloom.
“Rosenbloom’s work explores the interaction between the three federal branches of government in the implementation and evaluation of public policy and serves as a useful reminder of how public agencies, unlike private corporations, are constrained by their role as democratic institutions,” she said. “This work was important in that it was part of the new public administration movement that challenged the notion that public administration could be neutral and efficiency-driven like organizations in the private sector.”
That view, outlined in the scientific management paradigm, was articulated in the early 1900s by such scholars as Max Weber and Luther Gulick “in the hopes that bureaucracy could be cleansed of corrupting political influences and made more efficient,” she explained.
“As Gulick noted, a hierarchical organization would help achieve these goals, and underlying that hierarchy is a clear chain of command, with bureaucrats only ‘serving one master’ in order to clarify responsibility and promote accountability.”
However, explained Narasimhan, new public administration theorists like Rosenbloom contend that public agencies have more than one master: they answer not only to the top level of management, including the head of the agency and the president, but also have direct relationships with Congress, which makes the policy they implement; the Judiciary, which reviews their actions and can challenge implementation; and the public, as their consumers.
“We reject the scientific management paradigm,” said Narasimhan, “because we now know that public administration is inherently political; it operates under the democratic structure of the Constitution.”
Mike Rogoff’s college career got off to an inauspicious start.
He blew up the chemistry lab at Hofstra University.
“I started my college career as a pre-med major because I planned to become a psychiatrist,” explained Rogoff, Keuka College’s 2012-13 Professor of the Year who delivered the keynote address at academic convocation today (Aug. 27). “Things went pretty well in my biology courses, but chemistry was another story. I was barely making it through chemistry lecture with a D-minus average, but the big problem came when I blew up the chemistry lab.”
No one was hurt, but “‘Big Boom Boy’ got bounced from pre-med,” recalled Rogoff.
But then he “bounced back.” Rogoff changed his major to psychology and the rest is history.
“It’s great if you start with a major that fits you right away and you stay with it throughout college,” said Rogoff, who joined Keuka faculty in 1971, “but don’t feel like a failure if your first major just doesn’t fit your talents and interests. These changes help you build your personal and professional identity. They help you find out who you are, what you’re good at, and what you really want to do.”
Rogoff credited one of his teachers (Dr. Vane) and his adviser (Dr. Cohen) for helping him “grow into my new major” and building his “confidence as a learner.”
“I needed that support,” he said. “I didn’t feel good about myself when I bombed out of pre-med. As a matter of fact, I felt downright stupid. But my adviser helped me flip things around. He reminded me that I had done pretty well in my biology courses even though I had a hard time in chemistry.”
According to Rogoff, he also got a lot of support from the upperclass psychology majors, and by the time he finished at Hofstra, he was on the Dean’s List and admitted to all seven of the graduate schools to which he applied.
“Not too shabby for the ‘Big Boom Boy,’” quipped Rogoff, who holds master’s and doctoral degrees from Cornell University.
Said Rogoff: “My main message here is that you’ll have the same opportunities as you grow into your career. Here at Keuka, you’ll have access to many circles of support and that will help you continue to develop your competencies and interests. You’ll continue to gain insight into who you are, what you can do, and what you want to do.
“Let us help you accomplish your dream,” added Rogoff. “Let us help you develop your competencies, and let us help you build your support and bounce-back skills. Increasingly, you’ll be able to put into place the circles of support. You’ll be able to help others build resilience. All of this can help make the world a better place.”
Academic convocation marked the official opening of the 2013-14 academic year, and College President Dr. Jorge L. Díaz-Herrera and Robert Schick, chair of the Board of Trustees, welcomed new students to campus.
Schick urged the students to get involved.
“Make a new friend every day: another student, faculty member, any of the staff of the College. Immerse yourself into the very fabric of the College by joining clubs and participating in sports as a participant or fan.”
Friendship was also on the president’s mind. He told the students they could “expect to make plenty of friends, many of whom will become lifelong friends. You can definitely expect to make memories that will last a lifetime.”
He also said they can expect to make a difference—both on campus and in the larger community.
“Community service at Keuka is important,” he said. “Last year alone, our students devoted more than 60,000 hours to service.”
By Professor of History Dr. Sander Diamond
We learned from Edward Snowden’s illegal and unprecedented breach of national security that, in the Age of Terrorism, the National Security Agency’s (NSA) most sophisticated computers have been collecting and storing nearly all of our electronic communications in the name of national security.
The magnitude of this Congress-approved project (named PRISM) is so great that it can only be expressed in numbers beyond our comprehension; the data will soon be stored in a building complex larger than a shopping mall. The NSA hopes to ferret out nuggets of information which, after analysis, will thwart another terrorist attack in the post-9/11 environment. We have been assured that the NSA’s program has already been successful in preventing new attacks. We have also been assured that the NSA and other agencies that share this information are not reading our mail, so to speak, unless there are red flags that indicate the planning or coming execution of an assault against the United States or its allies. We have been asked to trust the government that this information will remain secure and sacrosanct.
While millions of Americans find this more than mildly distasteful, they are willing to give the government a pass since the events on that pristine morning in the late summer of 2001. Others believe that what the NSA is doing is in violation of the Fourth Amendment which, in part, reads: “The right of the people to be secure in their persons, houses, papers and effects, against unreasonable searches and seizures, shall not be violated… but upon probable cause…” This is at the heart of the ACLU’s legal challenge to the NSA’s program. What is being done in peacetime in the name of national security may in time be argued before the Supreme Court.
Governments have known a great deal about their people since the advent of civilization. Moses was commanded to tally up the Hebrews’ Twelve Tribes (the Priestly tribe was exempted) as they walked to the Promised Land. What Moses was ordered to do was very precise: “Take a census of all the congregation of the people of Israel, by families, by fathers’ houses, according to the number of names, every male, head by head…”
In our promised land, local, state and federal agencies have been collecting information for 237 years. In general, the courts have upheld the right of the government to collect data. Throughout our history, many Americans have had a deep suspicion of government and carefully guard their privacy. The Bill of Rights was added to the Constitution to ensure that the new government did not abuse its powers, and that our inherent rights were protected. In wartime, these rights were not suspended, but laws were passed to ensure the security of the nation. During the Civil War, telegraph lines were tapped.
It is unlikely surveillance will end in an age when the United States is a prime target for attacks of all kinds. In a word, this is the future. However, the government has to do far more to allay the suspicions of its people. Merely saying “trust us” is not enough. For starters, multi-layer firewalls have to be put in place to make certain that hackers and foreign governments do not gain access to our private worlds. The government should stop out-sourcing the collection of data and must do a better job of vetting its employees. The Snowdens and Private Mannings of the world are not fighting for our freedoms.
It is also time for Congress to place time limits on how long information can be stored, deleting it after a reasonable amount of time unless a person or group has been red-flagged under the probable-cause mandate of the Fourth Amendment. And Congress has to be pressured to make certain that the litmus test of probable cause is adhered to.
By Mary Leet ’16
The Faculty Development Committee recognizes faculty for excellence in experiential learning, teaching, and academic achievement through an awards program. All three awards include a $500 prize. Here is a capsule look at the 2012-13 recipients:
Excellence in Experiential Learning Award: Dr. Patricia Pulver
The Excellence in Experiential Learning Award goes to a faculty member who has demonstrated an effective practice or activity that allows students to learn through their experiences.
And that is precisely what Dr. Patricia Pulver, professor of education, does through her Master Teacher Insight Project.
Pulver believes that by observing teachers in the classroom, then discussing relevant issues and reflecting on their actions, students gather first-hand knowledge and experience that shapes them into effective teachers “a lot faster than reading textbooks.”
In addition to observing current teachers teaching, students conduct four separate interviews over the course of a semester with a teacher they know and consider a “master teacher.” Students discuss what came out of the interviews with classmates and then compose a reflective paper that summarizes what they learned.
They must identify common themes and provide “specific illustrative examples.”
Through this project, “students are able to articulate what they learned about the process and any ‘take away’ strategies that they might utilize in their future classroom,” said Pulver.
Excellence in Teaching Award: Dr. Christopher Leahy
While the traditional history lecture is still important, “students learn history best—and enjoy it more—when they actually do what historians do,” said Assistant Professor of History Dr. Christopher Leahy.
Leahy employs the historical method to teach all his classes, effectively turning what can otherwise be a dry subject into a discipline that requires critical reading, logical thinking, and persuasive and effective writing.
Students respond enthusiastically to this unique approach, calling Leahy “interesting,” “captivating,” and “the best professor I have ever had.” Shelby Seeley ‘13 noted that “Dr. Leahy is a teacher who can make even the most tedious topics interesting and intriguing.” “His classes are the ones that the students are truly excited to take,” according to Diane DePrez ‘13. “It has often been said… that it is a sad semester when you don’t have a Leahy class.”
By using primary sources and working with students to interpret them, Leahy’s students say that he makes history accessible and understandable on a relevant level.
“[He] always strives to give his students a deeper understanding not just that an event happened, but how it happened, why it had to happen, what brought it about, and what might have happened if it never did occur,” said Josh Beaver ’13.
Excellence in Academic Achievement Award: Dr. William Brown
Assistant Professor of Biology Dr. William Brown isn’t hesitant to involve students in research or have them present at professional conferences.
Recently, Brown and collaborators from Kutztown University presented a poster based on data generated almost entirely by students in his biostatistics classes.
Brown attended the annual meeting of the Rochester Academy of Science last fall, accompanied by undergraduates Kelsey Morgan ’15 and Amber De Jong ’16. Morgan presented her research at that meeting, and De Jong recently completed a research project of her own, “Temporal Changes in Red-shouldered Hawk Morphology,” which she will present at the 2013 meeting of the Rochester Academy of Science this fall.
In January 2012, a peer-reviewed paper co-authored by Janelle Davidson ’12, Brown, and ecologist Marion Zuefle was published in the Journal of Applied Animal Welfare Science, the leading peer-reviewed journal on the science of animal welfare. Titled “Effects of Phenotypic Characteristics on the Length of Stay of Dogs at Two No Kill Animal Shelters,” it has been read more than 800 times, making it the most-read paper published in the journal.
By Dr. Sander A. Diamond, professor of history
The Middle East has long been the epicenter of complex problems that wash like waves into other regions. No different from his predecessors as far back as Truman, President Obama is faced with a complex nexus of interrelated problems and hard choices.
The Arab Spring has brought with it change, much of it unwelcome in Washington. Under the leadership of the Muslim Brotherhood, the future of Egypt is uncertain. An Iran armed with atomic weapons is a dismal prospect. Yemen is in disarray, with al-Qaeda sowing its usual destruction. The prospect of an independent Palestinian state on the West Bank has moved to the back burner. The Sinai has fallen into total disorder. Hamas and Hezbollah, each fed by Iran, will surely take advantage of the region’s problems.
And then there is Syria, which presents an unimaginable host of problems that can destabilize the region even further. Much of Syria is in ruins, its ancient cities along the coastline demolished by the Bashar al-Assad regime’s air force and tanks, and by fighting between paramilitary units and the regular army. Syria is sinking as a nation, and the ability of its leader to rule over what was Syria two years ago is limited. But al-Assad still has a monopoly of power. He is being resupplied by the Russians and the Iranians and is using Hezbollah fighters to put down the opposition in the most brutal way.
Barring a bolt out of the blue, Syria is shaping up as the greatest foreign policy challenge of Obama’s presidency. One suspects his humanitarian instincts push him in one direction, his geo-political instincts in another.
After two wars in the region, he is mindful of the perils of getting involved in a civil war. In the words of The New York Times’ Tom Friedman, “If you torch it, you own it.” After a decade in Iraq and Afghanistan, this is the last thing Obama wants. However, if the U.S. continues to stand on the sidelines and wait to see how the crisis unfolds, Obama will be accused of being heartless or too tepid.
Like many, Obama would like to see al-Assad pack and leave, but if he falls, the gates of hell will open in a region not known for moderation. There is the fear that once al-Assad is gone, Syria could be dismembered, further collapsing into chaos with Hezbollah moving into its western and southern sections, which Israel would not tolerate. Given the prospect of Syria falling into total chaos, dismembered, or worse, one has to question if some of Obama’s advisers want al-Assad to leave anytime soon.
Obama may be taking a leaf out of the Israeli handbook on the region: better the devil you know than the one you do not. If al-Assad emerges relatively intact and manages to reassert control, it will take years to rebuild a broken Syria. From a humanitarian point of view, this is hard to swallow. But it just may be a very hard reality, a bitter pill. As one Israeli said, the choice is between plague and cholera, both horrific, but at least we know al-Assad, so to speak.
Letting the situation play itself out, standing on the sidelines, may be the best choice, however distasteful. But al-Assad has to be put on notice: opening his arsenal of weapons of mass destruction and committing genocide on an unimaginable scale would be a game changer.
By Dr. Sander A. Diamond, Professor of History
At approximately 8:50 a.m. on Sept. 11, 2001, the first of two planes slammed into the World Trade Center. The Age of Terrorism had arrived on our shores. On April 15, we were once again reminded that despite our best efforts to insulate ourselves from terrorism, we live in an age where our safety is conditional.
The perpetrators of the Boston attack, brothers Tamerlan and Dzhokhar Tsarnaev, were ethnic Chechens; Chechnya is located in the North Caucasus region of the Russian Federation. Exactly when and why the older brother, Tamerlan, became a terrorist is a key element in the investigation. What we do know is that Chechnya and Dagestan are the epicenters of Islamic jihadism in the region. Their geography places them closer to Tehran than to Moscow, far to the north. During the Second World War, when the fate of Russia was conditional and the Battle of Stalingrad was raging in the winter of 1942-43, Moscow alleged that some in Chechnya were on the wrong side. Their punishment was collective: Stalin uprooted them and shipped them to Siberia, and they were allowed to return only in the 1950s. Decades later, when the USSR imploded, Chechnya attempted to leave the Russian Federation, and the insurrection was put down with the full might of the Russian military. Chechen terrorists responded by blowing up a subway train in Moscow, while the so-called Black Widows seized a theater with 800 people in it and threatened to blow it up.
Meanwhile, in the years that followed, some of the people of Chechnya and Dagestan turned to Islamic extremism. Perhaps Tamerlan was predisposed to the jihadist mindset before he arrived on our shores, concluding that, however different Russia and the United States are, they share a common hatred of Muslims. Others suspect he was in contact with Islamic extremists and was, in the words of the FBI, radicalized during a six- or seven-month stay in the region not long ago. We also know that his computer is filled with materials downloaded from radical Islamic sites, so perhaps he and his brother were radicalized on the web. Whatever the case, one does not become a terrorist overnight.
By the same token, terrorist acts are not spur-of-the-moment decisions. In the case of the Boston bombers, they bought fireworks, dismantled them, and used the black powder to build bombs. They also planned to use pipe bombs in Times Square. For seasoned terrorist organizations such as al-Qaeda with its numerous so-called franchises, leaders place a high value on bleeding their victims in the financial sense. For bin Laden and his cohorts, Sept. 11 was a nickel-and-dime operation. What we have spent repairing the damage and trying to protect ourselves from future attacks is beyond comprehension. The Tsarnaev brothers spent perhaps $200 building the bombs. When the cost of finding them, the commerce lost in Boston for several days, and the hospital bills are finally added, the total will be staggering. Given the cost of health care, the $20 million fund that has been established to help cover hospital expenses for those injured may be peanuts, barely making a dent in the final tally. For many of the survivors, the costs of restoring their health over a lifetime will also be staggering. As for the coming trial of the younger brother, the cost will be in the millions.
What appears to link all terrorists is a deep hatred of their perceived enemies and what they represent. For them, killing is a perfectly rational act in keeping with their religious or ideological beliefs. Their victims, not them, are the incarnation of evil.
By Sander A. Diamond, Professor of History
In 1959, British comic Peter Sellers starred in the farce “The Mouse That Roared,” a comedy about a mini-nation that somehow acquired an atomic bomb.
Fifty years later, we have a case of life imitating art. North Korea is a roaring mouse. Labeled “The Hermit Kingdom,” a description that conforms to its isolation from the main current of world events, it is a totalitarian regime led by Kim Jong-un, the proverbial loose cannon.
The entire North Korean economy supports the military establishment; a serf-like labor force is confined to collective farms and factories, where weapons are produced and exported overseas. While other communist states such as Vietnam and China have enjoyed prosperity, North Korea remains poor. Just across the 38th parallel is South Korea, where a population of 49 million enjoys a high standard of living, with an average per capita income of $28,000.
Though smaller than Mississippi, North Korea is armed to the teeth with an unknown number of atomic bombs and the ability to deliver them. The image it projects in countless propaganda clips seen on TV in recent days is a leaf out of another age. In the old Soviet Union and Mao’s China in the 1950s, we saw generals, chests filled with medals in off-green uniforms, clapping and shouting in unison when their venerated leader appeared. Today, we see Kim Jong-un looking down in a Red Square-type setting on his troops, 1.1 million in all, as they parade past followed by Soviet-style missile carriers and heavy guns, the types the Russians used in the siege of Berlin in April-May 1945. Even the capital of North Korea, Pyongyang, has the stamp of the old USSR: high-rise buildings the Russians used to call Stalinist Modern.
Although North Korea is modeled after the world of Stalin and Mao, it differs from its ideological mentors in one very significant way. In the USSR and China, leaders began their careers in the nascent years of the revolutionary movement and those who followed worked their way through the ranks of the party bureaucracy. North Korea is ruled by a dynasty established by the current leader’s grandfather, who began his career as a revolutionary and came to power in 1945. When he died, his son assumed the leadership of the state and the party and recently, the torch was passed to his son.
The entire world is trying to divine the intentions of the new 28-year-old leader, who talks about war as if it were a parlor game or a video game. Whether all of the blustering and military action is being used to consolidate his grip on military power or pry economic concessions from the United States, no one can say with certainty, since few people outside “The Hermit Kingdom” know exactly what is going on behind the drawn curtain. Here, the ghost of Stalin is alive and well.
Short of a highly unlikely military coup, we have to take Kim Jong-un at his word. And if and when this crisis passes, we can expect Kim Jong-un to repeat his antics. At 28, he has a lifetime ahead of him to threaten the world.