By Ken Zurski
As brothers growing up in Rochester, New York, William and Francis Church were raised in a strict but loving household. Their father, Pharcellus Church, was a newspaper publisher and Baptist minister. He demanded nothing but the best from his boys, who, in return, each earned a college degree and joined their father in the newspaper business.
In 1862, however, at the onset of the Civil War, the two brothers followed separate paths. William resigned his post at the New York Times to become a full-time soldier while Francis continued on as a civilian war correspondent.
William earned the rank of Lieutenant Colonel but left after a year. His superior at the time, General Silas Casey, suggested he start a newspaper devoted strictly to the war. William liked the idea, so he mustered out and asked his brother to join him. Together they published The Army and Navy Journal and Gazette of the Regular and Volunteer Forces, a weekly filled with articles on everyday aspects of the war, soldiers’ viewpoints, and criticism too. “There is not a shadow of a doubt that Fort Sumter lies a heap of ruins,” read the first sentence of the first issue on August 29, 1863.
While the two brothers continued to edit the Journal, and eventually collaborated on a monthly literary magazine, The Galaxy, their legacies are vastly different.
William would go on to become the founder and first president of the National Rifle Association (NRA), while Francis became posthumously known for an editorial he wrote in response to a little girl’s inquisitive letter.
The editorial appeared without a byline and was buried deep in New York’s The Sun on September 21, 1897.
Only after Francis’ death in 1906 was it revealed that the former war correspondent had penned the famous line: “Yes, Virginia, there is a Santa Claus.”
By Ken Zurski
In April of 1979, NBC rolled out a new one-hour weekly TV show called “Real People” that was unlike anything viewers had seen before. At first, no one was quite sure what to make of it. Was it a news show or an entertainment show? Viewers quickly discovered that while the program loosely incorporated a “60 Minutes” type format, it was done mostly for laughs. The struggling NBC network was desperately in need of a hit, and it got one in the unlikely “Real People.” The next day, people would be talking about what they had seen on the show.
“Real People” of course stood for, well, real people. The program aired short videotaped segments highlighting real people doing real, but often oddball and hilarious, things. In between the segments, hosts who were mostly unknown at the time, like local L.A. talk show host Sarah Purcell and comedian Skip Stephenson, would engage in humorous banter about what viewers had just seen. (John Barbour, Bill Rafferty and Byron Allen rounded out the original cast.)
The show was filled with fast-cut editing, silly sound effects, and just enough cheesy commentary to get in and out of each segment and commercial break.
Comedy variety shows were already popular on television, like the hillbilly-themed “Hee Haw” and the trippy “Rowan and Martin’s Laugh-In,” but “Real People” was different. There were no staged skits, not technically at least. The taped segments took up most of the hour, but there were also in-studio bits featuring funny newspaper headlines and bizarre photos. Many of these features would later become staples of late-night television, such as Jay Leno’s “Tonight Show.”
Some of the taped segments were just downright weird, like the man who could run up a 20-foot brick wall or the elderly woman who could put the tip of her nose in her mouth. Naked skydivers, a beauty pageant for pigs, a married couple who both wanted sex change operations, and a cat that used the “commode” instead of the litter box were some of the more curious and interesting pieces.
In one episode, two car enthusiasts showed off their “push me-pull me” vehicle, which fused together the fronts of two Packards so there was an engine on both ends. You can get in either side and go, they explained. In a Halloween-themed episode that aired in October of 1980, a medium was featured who tried to reach Elvis Presley.
The opening of the show teased each segment: “Meet a modern day King Kong,” an excited announcer would say over jumpy theme music, “who enters the Empire State Building and goes completely ape!” On screen, a man in a gorilla suit danced in front of actual visitors to the iconic New York skyscraper. Although the show was filmed in front of a live studio audience, a laugh track helped accentuate the hilarious response.
While the sillier segments defined the show, more serious topics were also explored. They included a Canadian man, an amputee, who ran cross-country and called it the Marathon of Hope, and a reunion of World War II American POWs who, while held captive, had bunched up their clothing to make an American flag. Several of the men broke down while recalling the story. So did the audience.
“‘Real People’ focuses on weirdos, eccentrics and unusually admirable Americans,” syndicated TV critic Gary Deeb wrote in November 1979. “Particularly the unsung types who never make the local paper or the 6 o’clock news.”
“Real People” was a really big hit, often topping the Nielsen ratings for the week and leading to several mildly successful imitators like “That’s Incredible” and “Games People Play.” It was a success, many agreed, because it served as an alternative to the other top-rated shows of the time, or “cheesecake programs,” as Deeb called them. They included “Three’s Company,” “Mork & Mindy” and “Charlie’s Angels.”
“It’s a welcome antidote to the glitter and tackiness that characterize so many TV variety shows,” Deeb wrote.
After the success of “Real People,” several spin-offs, “Speak Up America” and “Real Kids,” were launched but failed to ignite like their predecessor. The original “Real People” eventually signed off in July of 1984.
But the concept did not die. Nearly a decade later, when Americans could videotape their own silly segments of pratfalls and embarrassing blunders, “America’s Funniest Home Videos” debuted on the ABC network in 1989.
Although its inception was attributed to a Japanese television show, the influence of “Real People” is evident. A host or hosts introduce each segment in front of a live audience, music and a laugh track amp up the silliness, and the videos, albeit shorter, are still real people doing really funny things. Only this time, the audience is encouraged to laugh at them, not with them.
Like “Real People,” AFV, as it was called, became an unexpected hit.
By Ken Zurski
In November of 1944, Franklin Delano Roosevelt was reelected to a fourth term as president of the United States, an unprecedented, but not unexpected, achievement for the New York businessman turned politician who had garnered the increasing support of the American people during his twelve years in the Oval Office.
Although a handful of past presidents had tried, none had served more than two terms, a limit the nation’s first president, General George Washington, had advised others to follow. But at the time, there were no legal restrictions. FDR, as he was famously called, broke new ground when he won a third term. A fourth term, he felt, was just as important during a time of war.
The voting public agreed. Roosevelt, a Democrat, beat Republican challenger Thomas Dewey in what can be considered, even by today’s standards, an overwhelming victory.
The voters, however, had no idea, at least not officially, that they had elected back into office a man living on borrowed time.
In the months, even years, leading up to the 1944 election, the American people heard rumors and speculation about the president’s health. Roosevelt suffered from polio, which limited his mobility, but in 1944 his appearance seemed to worsen. He looked feeble and weak, his eyes were often red and swollen, and his movements were slow and calculated.
Behind the scenes, there were concerns, but no immediate panic. Dr. Frank Howard Lahey, a respected surgeon known for opening a multi-specialty group practice in Boston, was brought in for a consultation; his work with the Navy’s consulting board had led him to the White House. After a careful examination, Lahey informed Roosevelt that he was in the advanced stages of cardiac failure and should not seek a fourth term. He even went so far as to warn Roosevelt that if he did win reelection, he would likely die in office. Roosevelt listened but did not follow Lahey’s advice. He felt it was his duty to continue.
In April of 1945, less than three months after being sworn in for the fourth time, Roosevelt was dead.
The president’s death took most Americans by surprise. That’s because shortly after the reelection, Roosevelt’s personal physician at the time, the surgeon general of the U.S. Navy, Dr. Ross McIntire, helped quell public fears by proclaiming that FDR was feeling fine. But others could plainly see the president’s decline.
At the White House, Vice President Harry Truman was sworn in and questions were asked: Why didn’t the voting public know the truth about Roosevelt’s health?
In hindsight, Lahey’s report seemed the most truthful and forewarning. But information between a doctor and patient is private and confidential. The White House had only asked Lahey to consult on the president; whether the details were released was up to Roosevelt and his staff. The report was concealed and only came to light nearly six decades later. Lahey himself could have spoken up, but he chose to remain silent and honor doctor-patient confidentiality.
Instead, what was disclosed to the public was mostly misleading. It included a glaringly deceptive assessment of the president’s condition in the months before the election.
In March 1944, the White House announced a report by Dr. McIntire, which claimed the 62-year-old Roosevelt looked “tired and haggard” due to the stress and strain of the war years and nothing more.
“In my opinion,” McIntire added, “Roosevelt is in excellent condition for a man of his age.”
He was either astonishingly wrong or lying.
By Ken Zurski
Abraham Lincoln, the 16th President of the United States, was the first commander in chief to have facial hair.
By being the first to sport a beard, Lincoln started a trend that lasted nearly 50 years. But even Lincoln’s beard was an afterthought. Lincoln had never worn facial hair as an adult and only let his whiskers grow after receiving a letter from an 11-year-old girl named Grace Bedell, who suggested the president-elect grow one. “For your face is so thin,” she wrote.
Lincoln reluctantly obliged.
After Lincoln, in the eleven presidencies that followed, only Andrew Johnson and William McKinley chose to go clean shaven. The rest had a beard, a mustache, or both. Chester Arthur, the 21st president, had a classic version of sidewhiskers, an extreme variation of the muttonchop in which the side hair is connected by a mustache.
But it didn’t last.
The last president to have facial hair was William Howard Taft, who wore a mustache and took office in 1909.
Woodrow Wilson, always impeccably coiffed and dressed, was next. Wilson shaved every day and ended the trend.
Many claim the invention of Gillette’s safety razor in the early 1900s had something to do with the change. Suddenly shaving was easier, and facial hair in general went out of style. The military banned beards too. That was not the case during the Civil War or the Spanish-American War, the latter fought in part by a future president, Teddy Roosevelt, who sported a bushy mustache.
Regardless of why it ended, from Wilson on, hardly a stitch of facial hair has been spotted on a president’s face. (You can add vice presidents to that list too.) And despite a surge in the popularity of beards today, that likely won’t change with the election of the 45th president. Donald Trump has never sported facial hair, and Hillary Clinton, who could become the first woman president, makes the point moot.
But even something as trivial as a beard has controversy.
Some argue that John Quincy Adams, not Lincoln, should be considered the first president to have facial hair. If so, that would push the history of presidents and hair growth back nearly four decades.
But it was not to be.
While Adams certainly had hair on his face, his chops, which extended off his ears and sloped down to his chin, were considered sideburns instead.
By Ken Zurski
In the mid-19th century, the rail cars used to carry large amounts of coal between multi-leveled underground work chambers had no braking system. If one went down an incline too quickly, it simply kept going like a roller coaster until it derailed, spilling its contents in the process. Since miners were paid by the weight of the cars they unloaded, this cost them time and money. To keep it from happening, boys as young as six years of age were hired to help bring the cars to a stop.
They were called “spraggers.”
A sprag was a stick of wood, not quite as long as a baseball bat. Each boy carried several and was positioned where the cars, sometimes eight in a row, would roll down the slope. The boys would run alongside and jab the sprags into the spokes of the wheels. The sprags acted as brakes, slowing the car until it stopped.
If this sounds dangerous, it certainly was. A sprag could get caught in the spokes and take an arm with it. Some boys lost fingers, or even hands. Some of the more adventurous types would jump on the side of a rolling car for fun and hold on while another boy jabbed a sprag into the wheel. This broke up the monotony of the day, but if the car failed to stop, the unfortunate passenger usually went with it, careening out of control until it broke from the rail and smashed into a wall. Of course, being a “spragger” meant you were one of the fastest and most agile of the young crew. Other boys, called “breakers,” worked in the picking room sorting refuse from the coal; still others, called “nippers,” opened and closed heavy doors as the coal cars approached.
For many boys of this era, working in the coal mine was an honor bestowed upon them by their ancestors. After all, their fathers and grandfathers had grown up in the mines, and in all likelihood their own futures as miners were already set. You can imagine the mothers, even if they protested, had little say in the matter. The coal mining industry was a well-oiled machine that worked only if all parts were in place, even at the expense of using children to keep it moving. “He never got used to the noise, the dust, the threat of danger,” writes author Susan Campbell Bartoletti in her book Growing Up in Coal Country. “He was proud to earn money for his family. That was the life of a miner’s son.”
Although no safety records were kept at the time, we can assume there were deaths, perhaps many. Eventually, in the late 1800s, state laws were passed that prevented children under twelve from working in a mine. In 1902, the age was raised to fourteen. But in many tight-knit mining communities there were no birth certificates, so boys younger than fourteen were passed off as simply “small for their age.”
Although the practice of using children in dangerous workplaces was already being condemned by early trade unions and women’s groups, the movement gained more traction in 1912. That’s when the Children’s Bureau was created within the Department of Commerce and Labor; it was later transferred to the newly created Department of Labor.
By then, reports of children being maimed or worse were surfacing. One boy, Manus McHugh, whose job was to oil the mining breaker machinery, reportedly wanted to finish the day so badly that he attempted to oil the machine while it was still running. His arm got caught first. In the investigation that followed, McHugh’s death was blamed on disobedience. “Boys will be boys and must play,” it stated, “unless they are held in strict discipline.” No legal action was taken.
The first federal child labor law, the Keating-Owen Child Labor Act, was signed by President Woodrow Wilson on September 1, 1916, a Friday and, fittingly, the start of the long Labor Day weekend, since Labor Day always fell on the first Monday of September. But the bill regulated child labor only by banning the sale of products made by factories that employed children under fourteen. It was ruled unconstitutional in 1918.
A federal minimum working age for minors, part of the Fair Labor Standards Act, was not mandated until 1938.
By Ken Zurski
In 1674, while exploring the Illinois River for the first time, French Jesuit missionary Father Jacques Marquette wrote in his journal: “We have seen nothing like this river that we enter, as regards its fertility of soil, its prairies and woods; its cattle, elk, deer, wildcats, bustards, swans, ducks, parroquets, and even beaver. There are many small lakes and rivers. That on which we sailed is wide, deep, and still, for 65 leagues.”
Certainly the reference to parroquets, or perroquets (French for parrot), raises some eyebrows. But a species called the Carolina Parrot, now extinct, did inhabit portions of North America, as far north as the Great Lakes, as early as the 16th century.
More puzzling, however, is the mention of the bustard.
Even the Illinois State Museum in the state capital of Springfield questions this unusual reference.
“What is a Bustard?” the Museum asks in an exhibit showcasing birds native to Illinois, then answers: “We’re not sure.”
Of course, the bustard is a real bird, more commonly known in Europe and Central Asia. In North America? It just doesn’t exist. But did it at one time? According to the Museum’s notes, several French explorers described bustards as common game birds of Illinois that resembled “large ducks.” Large, indeed, since a Great Bustard can stand 2 to 3 feet tall and weigh up to 30 pounds, making it one of the heaviest living animals able to fly. Its one distinctive feature, besides its size, is the gray whiskers that sprout from its beak in the winter.
Marquette was more a man of the cloth than a scientist. His mission was to preach to the Illinois Indians, or “savages” as he called them. Along the way, however, he described the scenery and game in detail. The “bustard” comes up quite often in his journal. He even refers to hunting them, and possibly eating them too. “Bustards and ducks pass continually,” he wrote.
The Illinois State Museum speculates that perhaps what Marquette was referring to was not a bustard at all, but the Canada Goose which is similar in size and appearance to the Great Bustard.
But, as the Museum concedes, even that is “open to question.”
By Ken Zurski
For a man whose mission was to relinquish his entire fortune before his death, Andrew Carnegie still had plenty of money left when he died in 1919 at the age of 83. That’s no indictment of a man who built a massively successful business, became the richest man in America, and devoted his later years to giving it all back. It was a noble thing to do. But Carnegie had made so much money that even he found it difficult to allocate the funds sufficiently.
So he asked for help.
Carnegie grew up poor in Scotland, came to America, and amassed millions in the steel industry. Along the way, he made just as many enemies as dollars. Like many so-called tycoons of his time, Carnegie was accused of cutthroat practices that sacrificed workers’ rights for the bottom line. The Homestead Strike of 1892 stemmed from a dispute between steel workers at Carnegie’s Homestead, Pennsylvania, plant and management, which refused to raise workers’ pay despite a windfall in profits. The riot that followed remains one of the bloodiest labor confrontations in American history. Ten men were killed in the melee, and Carnegie, who continued production with nonunion workers, was blamed for the uprising.
Carnegie viewed it differently than the workers did. He believed that reducing production costs meant lower prices for consumers; therefore, he theorized, the community as a whole profited, not just the unions. It was a slippery slope. But, many asked, was it worth men dying for?
Carnegie, of course, thought of himself as a benefactor and did not apologize for becoming a wealthy man. When he retired, however, he made it clear that being rich was only relative: “Man must have no idol and the amassing of wealth is one of the worst species of idolatry! No idol is more debasing than the worship of money! Whatever I engage in I must push inordinately; therefore should I be careful to choose that life which will be the most elevating in its character.”
Carnegie didn’t hand out money haphazardly. He spent it on things and places that moved him. Among other worthy causes, the most prominent were funds for more schools – especially in low income communities – and the building or expansion of public libraries. In each case, he released the money only after specific demands were met, each one designed to make sure none of it went to waste. Carnegie had final approval.
In 1908, at the age of 72, with millions more left to give, Carnegie wrote a letter to people he admired. It was in effect an offer disguised as a question: “If you had say five or ten million dollars (close to 5-billion today) to put to the best possible use,” Carnegie asked, “what would you do with it?” Many of the correspondents were business leaders, and some were presidents of institutions already bearing the Carnegie name. Most responded in kind that the money should be used to continue fellowships. The letters were an indication that the burden of giving away a fortune was weighing heavily on Carnegie’s mind.
“The fact is that after spending about $50-million on libraries, the great cities are generally supplied and I am groping for the next field to cultivate,” Carnegie wrote President Theodore Roosevelt. “You have a hard task as present but the distribution of money judiciously is not without its difficulties also and involves harder work than ever acquisition of wealth did.”
Carnegie wrapped up the letter by pointing out the absurdity of that last line. “I could play with that and laugh,” he noted.
In the end, of course, Carnegie left enough money behind to take care of his wife and daughter. His loyal servants and caretakers were awarded pensions and his closest friends received substantial annuities.
Carnegie gave away an estimated $350 million, but for the rest he had one final request. After the bequests in his will were divided up, nearly $20 million remained in stocks and bonds.
He bequeathed that amount to the Carnegie Corporation, the organization he proudly founded, which still exists today.
(Sources: Andrew Carnegie by David Nasaw; various internet sites)