October 04, 2022

My Sunday newspaper carried a story about a ship, located last week, that had sent a wireless radio message warning the Titanic about the presence of icebergs on that fateful day in 1912. The Titanic received the Mesaba's message, but the warning never reached the captain on the bridge, and on April 15 the "unsinkable" liner hit an iceberg and sank with the loss of over 1,500 lives. The SS Mesaba itself sank six years later, on September 1, 1918, torpedoed by a German U-boat in the Irish Sea while sailing in a convoy from Liverpool to Philadelphia; twenty people were lost. The wreck of the merchant vessel was identified by researchers at Bangor University in Wales using multibeam sonar. Sonar and echolocation are essentially the same process: echolocation is the use of echoes to detect objects as observed in living creatures (biosonar), while sonar is its human-made, typically nautical, counterpart.
When I looked online, I found that echolocation is used by bats and other animals to determine the location of objects using reflected sound. It allows bats to navigate in pitch darkness to hunt, identify friends and enemies, and avoid obstacles, letting them fly at night and in dark caves; the skill seems to have developed for locating night-flying insects. Bats make the sounds in their larynxes and emit them through their mouths. Fortunately, most of these calls are too high-pitched for humans to hear, because some bats can scream at up to 140 decibels, roughly as loud as a jet engine 100 feet (30 m) away. Bats can use echolocation to detect an insect up to about 16 feet (5 m) away, work out the insect's size and hardness, and avoid limbs and wires as fine as a human hair. A bat cranks up its calls to pinpoint the prey as it closes in for the kill, and to avoid being deafened by its own calls, it turns off its middle ear just before calling, then restores its hearing to listen for the echoes.
While echolocation is produced naturally by animals, human-made sonar uses machines to produce the sound waves that measure the distance between a sound source and the objects around it. Humans use sonar for navigation, communication, and mapping, frequently from underwater vessels. The active sonar used by the Bangor researchers to map the seabed and identify the Mesaba wreckage involves emitting pulses of sound and listening for echoes. The speed of sound in water is roughly constant, so by measuring the time between emitting a chirp and hearing its echo, a vessel can calculate the distance to the reflecting object: the sound travels out and back, so the distance is half the round-trip time multiplied by the speed of sound. The multibeam sonar used by the Bangor team sweeps many beams across the seabed at once, enabling mapping at a level of detail that makes structures such as shipwrecks visible. One of the Bangor researchers described multibeam sonar as "a game-changer for marine archaeology."
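The time-of-flight calculation behind active sonar is simple enough to sketch in a few lines. Below is a minimal illustration (not the Bangor team's software); the nominal speed of sound in seawater of about 1,500 m/s is a standard approximation, since the true value varies with temperature, salinity, and depth.

```python
# Active-sonar ranging: distance from the round-trip time of a ping's echo.
# Nominal speed of sound in seawater; varies with temperature, salinity, depth.
SPEED_OF_SOUND_SEAWATER = 1500.0  # meters per second

def range_from_echo(round_trip_seconds: float,
                    speed_of_sound: float = SPEED_OF_SOUND_SEAWATER) -> float:
    """One-way distance (meters) to the reflecting object.

    The pulse travels out and back, so the distance is half the
    round-trip time multiplied by the speed of sound.
    """
    return speed_of_sound * round_trip_seconds / 2.0

# An echo returning 0.2 s after the ping indicates an object ~150 m away.
print(range_from_echo(0.2))  # 150.0
```

Multibeam systems apply this same calculation to hundreds of narrow beams fanned out across the track of the survey vessel, which is what produces the detailed seabed maps the researchers used.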
THOUGHTS: "Echolocation" was coined by zoologist Donald Griffin in 1944, but reports of blind people being able to locate silent objects date back to 1749. During the 1940s, experiments staged at Cornell University showed that sound and hearing, not pressure changes on the skin, were what drove human echolocation. Some blind people passively use natural environmental echoes to sense details about their surroundings, while others actively produce mouth clicks; both passive and active echolocation help blind people sense their environment. Sighted people tend not to perceive the echoes because of echo suppression, but with training, sighted individuals with normal hearing can learn to avoid obstacles using only sound. Echolocation is a general human ability. The human brain receives millions of stimuli every second from our surroundings; the question is what to ignore and when to pay attention. When it comes to creating unity with others, we need to actively pay attention. Act for all. Change is coming and it starts with you.