By Ken Zurski
Amy Johnson is often considered the English counterpart to American Amelia Earhart, a fair comparison since both pioneering women aviators had similar adventures and equally mysterious fates.
Johnson, however, is not quite as well known.
Born in Yorkshire, Johnson earned an economics degree at the University of Sheffield and received her pilot’s license through the London Aeroplane Club in July of 1929. Flying was just a hobby for the 26-year-old Johnson at first. But that would change. “I have an immense belief in the future of flying,” she wrote. Soon enough she had set her sights on breaking solo flying records in Europe, similar to what Charles Lindbergh and Earhart were doing in the States.
Her father and her biggest supporter, Lord Charles Wakefield, helped finance her flights. The wealthy Wakefield, a well-known figure in London, had founded an oil lubricant company that bore his name (later renamed Castrol). Wakefield helped buy Johnson her planes, which carried family trademark nicknames, like “Jason.”
In May of 1930, aboard “Jason,” Johnson set off alone from Croydon, England, and 19 days later landed in Darwin, Australia, a distance of 11,000 miles. She was given a hero’s welcome upon returning home.
Later, along with several co-pilots, including her husband, Scottish aviator Jim Mollison, Johnson completed more globetrotting flights and set distance records from London to Moscow and to Japan, accomplishments that received international recognition. Even Lindbergh, the most famous pilot in the world, sent Johnson a letter of praise.
Then in 1941, while working for the Royal Air Force, Johnson was lost during a flight over the Thames Estuary, where the River Thames meets the North Sea. The weather was especially bad that day, and she reportedly went off course, ran out of fuel, and ditched the plane.
Johnson was last spotted parachuting into the water.
Even today, her fate is debated and rumors circulate about why the plane went down. Some claim Johnson forgot – or failed – to give the correct security codes needed to identify herself as a British pilot and was shot down by friendly fire. Others report she and her aircraft were mistaken for a German bomber. Still others believe she may have been on a secret government mission. All speculation, of course. Like Earhart, who vanished in 1937 while flying over the Pacific, Johnson, who was presumed drowned, never resurfaced.
A search for her body turned up nothing.
By Ken Zurski
It was late on March 31, 1918, a Saturday night soon to be a Sunday morning and the beginning of a new month, when Daylight Saving Time officially became a household phrase in America.
It started with a crowd of gawkers lining the streets surrounding the Metropolitan Building in Manhattan. At exactly midnight, the crowd strained their necks and looked up at the huge lighted clocks, each 26 feet in diameter, one on each side of the building. A signal was given and the lights shut down—the largest four-dial clock tower in the world went dark.
A hush came over the crowd.
Time was literally at a standstill.
Then there was a cheer!
The crowd was festive despite the late hour and the start of Easter Sunday. A community chorus sang the “Star Spangled Banner,” and the New York Police band and Borough of Manhattan bands took turns playing “Over There.” The excited throng kept staring at the clock dials frozen in time. It just couldn’t be, they thought.
It was deep in the tower’s belly where all the work was taking place. Hired mechanics had made their way up to the tower’s inner workings and begun the arduous task of advancing the 13-foot hour hands manually. They had two hours to get the job done. Then promptly at 2 a.m., the lights flickered on again. Like magic, the clock tower was once again illuminated. Hundreds of late-night souls strained their necks again to see the clock dials’ hour hands in the glowing beams. The hands were pointing to the number … 3.
Three! It was 3 o’clock! For the first time in history, the nation had moved itself ahead one hour. The crowd shouted and cheered. Daylight Saving Time had officially begun.
“Blasé New Yorkers for whom New Year’s Eve celebrations have lost their thrill,” wrote a reporter for the New York Times, “rubbed their eyes and marveled at the novelty of an Easter Sunday of only twenty-three hours.”
The idea for daylight saving is most often attributed to Benjamin Franklin during his years as an American delegate in Paris in the late 1700s. Thanks to the oil lamp, Franklin would stay up late, usually playing chess, and sleep until noon the next day. One morning, quite early, he was awakened by a sudden noise. He threw open the tight window shutters and was even more startled by the amount of daylight coming into his room. “I looked at my watch,” Franklin later wrote in an article that appeared in the Journal de Paris on April 26, 1784, “which goes very well, and found that it was but six o’clock; and still thinking it something extraordinary that the sun should rise so early, I looked at the almanac, where I found it to be the hour given for his rising on the day.” Perhaps with a mix of astonishment and dry humor, Franklin wrote “that having repeated this observation the three following mornings, I found precisely the same result.”
Ever resourceful, Franklin had an intriguing thought. If he had slept six hours until noon through daylight and “lived” six hours the night before in candlelight, then wasn’t that just a waste of precious light and expense? “This event has given rise in my mind to several serious and important reflections,” he wrote. Franklin went to work figuring out the math. Assuming that 100,000 Parisians burned half a pound of candles per hour for an average of seven hours a day, and calculating the average time during summer months between dusk and the time Parisians went to bed, Franklin concluded that the amount saved, as he put it, would be an “immense sum.” Franklin proposed that all Parisians rise with him, when the sun rises, and to compel the naysayers, he proposed “a tax [be laid] per window, on every window that is provided with shutters to keep the light out.”
“Let guards be posted after sunset to stop all the coaches that would pass the streets,” he bravely declared. “Let the church bells ring every time the sun rises. Let cannon be fired in every street, to wake the sluggards effectually, and make them open their eyes to see their true interest.”
Franklin’s dry wit and humor notwithstanding, his scheme alarmed Parisians who weren’t ready for change. They thought Franklin’s idea was madness, and a surprising one at that, coming from an American intellectual so well liked in France. After he left Paris, Franklin mulled over the idea and marveled at “inhabitants,” this time Londoners, who continued “to live much by candlelight and sleep by sunshine.” Franklin used economy as his argument, saying residents had little regard for the costs of candlewax and tallow. “For I love economy exceedingly,” he explained.
Eventually the idea of extending the day during the summer months was proposed. Instead of getting up by daylight, usually too early, why not just move daylight later in the day and prolong the evening sun?
“Everyone appreciates the long, light evenings,” wrote William Willett, a London builder who is credited with the idea of extending daylight. “Everyone laments their shortage as Autumn approaches; and everyone has given utterance to regret that the clear, bright light of an early morning during Spring and Summer months is so seldom seen or used.”
Like Franklin, Willett was struck by the number of people who kept their blinds shut in the morning hours even though the sun was fully out. If getting up too early was the sticking point, thought Willett, then why not just stretch the light into the evening hours? European countries would adopt the idea first, in an effort to conserve fuel supplies during World War I.
On April 1, Easter Sunday of 1918, Americans did as well.
(Excerpt from “The Wreck of the Columbia: A Broken Boat, a Town’s Sorrow & the End of the Steamboat Era on the Illinois River” © 2012 by Ken Zurski. Used with permission from Amika Press.)
By Ken Zurski
In the 1930s, radio programs were popular for their slapstick humor and sound effects. The Vic and Sade show was different. It relied on snappy, witty, and intelligent dialogue to carry the stories along.
Eventually, the dry humor caught on, but it took time. And while most programs showcased the silly antics of their actors, the Vic and Sade show was praised for the work behind the scenes, specifically the man penning the scripts: Paul Rhymer.
Rhymer was a journalist and writer from Bloomington, Illinois, who created the show based on his own Midwest sensibilities. The protagonists, Vic and Sade Gook, were a married couple living in “a small house halfway up the next block.”
Rhymer gave the couple a folksy slang, cleverly writing each episode and carrying storylines over like a serial.
Listeners especially enjoyed Rhymer’s knack for clever words and names. Ruthie Stembottom, Mrs. Applerot, Oyster Crecker and Charlie Razorscum were just a few of the colorful characters.
And the cities were mentioned too. Some you wished actually existed, like East Brain, Oregon; Sick River Junction, Missouri; and one strangely dark place only Rhymer could explain, but everyone else could only imagine: Dismal Seepage, Ohio.
Unfortunately, the Vic and Sade show is mostly forgotten today.
Why is difficult to explain. At the height of its popularity, the Vic and Sade show had a reportedly devoted listening audience of seven million. It was also briefly adapted to television in the 1950s. But its stars were mostly faceless, and while most of the popular radio shows at the time ran in the evening, Rhymer’s show never got out of the afternoons. It had an audience of mostly women, like television soap operas, but after ending its 14-year run in 1946, it failed to capture the cult-hero status that the prime-time radio shows did.
In fact, it was another Midwestern couple, similar to Vic and Sade, but more physically expressive, who ruled the airwaves.
Airing in the evening, and coming into homes from a fictional place called Wistful Vista, the stars, Jim and Marion Jordan of Peoria, Illinois, were better known to their large and devoted fans as Fibber McGee and Molly.
By Ken Zurski
In the early 1900s, an enterprising high school dropout from Norfolk, Nebraska, named Joyce C. Hall began selling perfume door-to-door. Soon he expanded the business to include postcards, specifically the importing, printing, and selling of foreign postcards, a popular item at the time.
The possibilities were endless, but not in Norfolk, Nebraska. So Hall boarded a train to Kansas City, Missouri, armed with boxes full of postcards.
A Hall biographer continues the story this way: “As business picked up, he [Joyce] ventured to the towns served by the railroads running in all directions from the Midwestern rail center. Soon brother Rollie joined him, and they opened a specialty store in downtown Kansas City, dealing in post cards, gifts, books and stationery.”
Then tragedy struck. In 1915, the store was destroyed by fire, the entire inventory wiped out by the devastating blaze.
The Halls were determined, however. With help from a third brother, William, who ran a bookstore back in Norfolk, the three siblings pooled their resources, bought a small engraving firm in Kansas City, and began making and marketing their own postcards.
The holiday season was especially busy and the Halls would sell Christmas postcards and tissue paper out of the store. When the tissue paper sold out, they searched the supply room and found a replacement in a stack of “fancy French paper” meant for display only.
They sold it for 10 cents a sheet.
And it too sold out.
So the next year the brothers offered a similar lining paper as a choice. And once again, the more decorative sheets were a big hit. So in 1919, Hall and his brothers began producing and selling their own printed paper for gift wrapping. The paper carried their brand name: Hall Brothers.
That same year they began experimenting with cards that had no distinctive purpose other than just to say hello or wish someone good luck. They called them “everyday cards.” The cards sold well, but were especially popular for special celebrations like birthdays, anniversaries, and Valentine’s Day.
The company took off, and Joyce Hall made most of the business decisions. He proposed a change that others, including his brothers, thought was ill-advised: he wanted to change the name. “Hall often went against conventional wisdom. In the 1920s, he wanted to replace ‘Hall Brothers Company’ on the back of greeting cards with the phrase, ‘A Hallmark Card.’ Everybody in the place was against it, he said, but he made the change.”
Also, while others said he was wasting money, Hall began to create and run ads. And soon, Hallmark, the brand, became “the most recognizable in the industry.”
Joyce Hall ran Hallmark for 56 years, eventually giving the president and COO title to his son in 1966. He continued as chairman of the board until his death in 1982.
Today, February 14, or Valentine’s Day, is considered a “Hallmark holiday.” It’s easy to see why: in 2010 the U.S. Greeting Card Association estimated that approximately 190 million Valentine’s Day cards were sent each year in the United States alone.
Even more impressive: the total number of cards produced every year likely tops a billion if you count the valentines exchanged by schoolchildren.
Happy Hallmark Day!
By Ken Zurski
Avonia Stanhope Jones was born in 1839, and in her teens and early 20s was considered an accomplished actress. Thank her parents for that. Both were theater types, and Avonia often played roles opposite her mother. In one instance, mother and daughter toured together in “Romeo and Juliet,” with Avonia playing Juliet and her mother playing Romeo. More details of that “strange” production are not known. And for the most part, neither is Jones – known, that is.
According to several internet sources, Jones married young, had no children, and died at the age of 28. Her name is not well remembered, but as an actress, she was important enough to warrant a sitting with the leading photographer of the time, Mathew Brady.
So there’s that.
But as history goes, Jones did nothing extraordinary or notorious that would have elevated her name or status. As for acting, the New York Times wrote this in her obituary: “Her understanding of mimic character was quick and thorough, and her intellectual attainments of a high order. Few actresses at the present day have had so much experience and received so much praise at so early an age.”
One biographical source claims the harshest words against her concerned a “declamation of the war,” meaning the Civil War, which one can assume she talked about a lot, one-sided or not. More on that in a moment.
In November of 2012, the movie “Lincoln” opened in theaters. The highly anticipated film, directed by Steven Spielberg, was a commercial and critical smash. It was, in essence, history come to life, thanks in part to Daniel Day-Lewis, who channeled his vision of the title role into an Oscar win for Best Actor.
The movie itself was based on historian Doris Kearns Goodwin’s book “Team of Rivals,” and the screenplay was written by Tony Kushner, the Tony Award-winning writer of “Angels in America.” Kushner was nominated for an Oscar for his work on “Lincoln.”
Kushner also brought back the name Avonia Jones.
In an interview with Smithsonian Magazine shortly before the movie was released, Kushner said this about Jones: “I thought, I’ve discovered another member of the conspiracy!” Kushner explains that he was looking for a play Lincoln might have seen in early March of 1865. “I found a ‘Romeo and Juliet’ starring Avonia Jones.”
Kushner says Jones, from Richmond, was rumored to be a Confederate sympathizer. “She left the country immediately after the war, went to England and became an acting teacher.” (Note: Jones returned to America in 1867 and died that same year of consumption).
According to Kushner, the backstory is this: “One of her [Jones] pupils was Belle Boyd, a famous Confederate spy. And the guy who was supposed to be in ‘Romeo and Juliet’ with her was replaced at the last moment by John Wilkes Booth—who was plotting then to kidnap Lincoln.” So according to Kushner, Jones could have been a member of Booth’s team of conspirators.
It’s all speculation, of course, and Kushner doesn’t elaborate. Still, for a writer, a “new” discovery is always worth exploring, and in a Hollywood production, where the facts can be loosely defined by the words “based on a true story,” Kushner had hoped to introduce movie audiences to Jones.
But it was not to be.
Jones and her story never made it into the final script.
However, Kushner had another historical figure he claims was found through good research: William N. Bilbo, a crafty Nashville lawyer and lobbyist for Lincoln. Bilbo tried to bribe “swayable” Democrats to vote with Republicans on the Thirteenth Amendment, the central plot point of the movie.
Bilbo was another forgotten soul; even Goodwin’s book ignores him. Yet, much to Kushner’s liking, Bilbo remained in the script, and actor James Spader brought the real-life character back to life.
Avonia Jones will have to wait.
By Ken Zurski
Long before Jim Henson became famous as the man behind the legendary Muppets, his early puppet creations were popular thanks to stints in television commercials, on the Tonight Show, and on The Jimmy Dean Show, where a furry dog named Rowlf (pronounced “Ralph”) became nearly as popular as the folksy TV host himself.
Jimmy Dean didn’t seem to mind, and neither did Jim. It was, after all, the characters who were in the spotlight, not the performers. So Henson and his team, including fellow puppeteer Frank Oz, were virtually unseen and unknown at the time.
This was between 1962 and 1969, the same time an English rock sensation known as the Beatles took over America. The Muppets played a completely different role than the lads from Liverpool, but in one respect they shared a rather innocuous connection with the Fab Four.
Author Brian Jay Jones points this out in his book Jim Henson: The Biography.
In a chapter titled “A Crazy Little Band,” Jones writes that “it wasn’t Jim’s name on the door or company letterhead, but rather THE MUPPETS.” Even Jim’s son Brian Henson would later admit, “The Muppets were known,” but as for his father: “He wasn’t.”
Apparently, without a face, there was uncertainty as to who or what the Muppets actually were. Plus, if you didn’t know what the name stood for (a Henson-invented combination of “marionette” and “puppet”), the confusion was twofold.
So the name baffled some. Many thought Jim and his crew, listed on guest lists simply as “The Muppets,” were a rock band similar to other one-name bands like the Troggs, the Animals, the Hollies, or the Beatles.
In addition, Jones writes, Jim had somewhat long shaggy hair “like a businessman beatnik” and a beard. He was also tall and lanky and walked with long strides, similar to the look and style that the Beatles would make famous on the cover of “Abbey Road.”
Add to that the Muppet characters who were transported in black boxes which resembled instrument cases. If you didn’t know who the Muppets were, Jones explains, you might have mistaken them for a rock group.
Even Frank Oz conceded to the confusion. “We were just kind of this crazy little band at the time,” he wrote. “We were the Muppets, but like an act.”
This confusion led to an embarrassing incident after a performance in Los Angeles when a stubborn hotel manager refused to give Henson and his crew a room for the night fearing a rock band would trash the place.
Henson, of course, would get the last laugh. He attempted to correct the problem by having a “serious conversation” with the manager. Jim’s real voice resembled Kermit the Frog’s in tone, and was quiet, calm, and reassuring. He rarely swore.
The manager was likely convinced without Henson having to take out one of his “instruments” as proof.
By Ken Zurski
In the late 18th century, George, Prince of Wales, soon to serve as Prince Regent due to his father’s illness, was told by the royal physicians to take better care of his health, specifically with more baths, or “dips,” in the restorative salt water of the sea.
The Prince chose the coastal village of Brighton, England, a once-rundown fishing town that was turning into a seaside retreat for the wealthy. Not only was Brighton close to London, but its warmer climate and proximity to the English Channel were perfect for those, like the Prince, who were ordered to take these so-called restorative “dips.”
Whether the Prince stuck to his doctors’ orders is not known. Regardless of his health, and befitting his reputation as a royal glutton, Brighton suited him just fine. Soon he sought to build a grand palace there, complete with a “glass domed roof, hand-painted Chinese wallpaper, a 62-horse stable, and a Great Kitchen.” Work began in earnest in 1787.
Although the Marine Pavilion, soon to be transformed into the Royal Pavilion, was the brainchild of the Prince Regent (the future King George IV), the finished product, a mixture of many styles and influences, was the work of architect John Nash.
Nash’s design suited the future King, but hardly anyone else. “A masterpiece of bad taste” was one icy reception, while another described it as a “mad house.” Even Queen Victoria, niece and successor of King William IV, King George’s successor, was unmoved, calling it “odd” and demeaning its purpose. “Most of the rooms are low and I can see a morsel of the sea from one of my sitting windows,” she bemoaned, refusing to spend much time there.
Born in London in 1752, Nash earned a reputation for designing houses, castles really, for the rich. Eventually, his work caught the eye of the Prince, and in 1806 Nash became his personal architect. The re-imagining of the Royal Pavilion was the product of their partnership.
To make it even more unique, between 1815 and 1822 Nash added flourishes to the Pavilion, including an elaborate second wing and domes that made its outer appearance similar to India’s most famous architectural statement, the Taj Mahal.
The English shoreline never shone so brightly. But the building was different, and not very British, especially for royalty, and the biting condemnations quickly followed.
But attitudes toward the Royal Pavilion would change.
In 1841, a rail line made it more accessible. Now more people could come and roam the grounds and enjoy the scenic location for themselves. To the British commoner, the Royal Pavilion was a work of art.
Unfortunately, the man who endured constant jabs about his work from his peers never lived to see it appreciated. In 1835, shortly before the Pavilion became a popular tourist attraction, Nash died at the age of 83.
More than a decade after his death, Nash would be vindicated again when the Royal Pavilion was paid the ultimate compliment by an American entrepreneur and visionary who not only admired the uniqueness of the building but sought to copy it too.
In 1848, a mansion went up in the scenic countryside of Connecticut that looked oddly out of place for its location. Not only was it very large, occupying 17 acres of land, but the building itself, with its exotic Indian-influenced architecture, looked like something you might spot in far-off Mumbai or New Delhi, not Fairfield, near Bridgeport, the state’s largest city.
All this was the creation of one man who commissioned the building as a “permanent residence” for his family.
His name was Phineas Taylor Barnum, better known as P.T. Barnum.
In admiration, Barnum patterned the design of his new home in the style of the Royal Pavilion in Brighton.
He named it the Iranistan.
For more on the history of P.T. Barnum’s Iranistan click here: https://unrememberedhistory.com/2018/01/02/the-greatest-showmans-home-was-everything-you-might-imagine-it-to-be-and-more/