If you dangle a string just out of reach of a cat’s paw, the cat will do whatever it can to get it. If the string is taken away and dangled again, the cat will continue to pounce on it, jump for it, and even try to gnaw at it. But if the cat gets hold of it, interest fades within a few minutes, and the cat wanders off to find something else to do, or lies down until something else piques its interest.
Cat string theory is rooted in a marketing principle known as the law of scarcity. If you tune in to a commercial and hear that “supplies are limited, order today!” for a new facial cream or vacuum cleaner, you may (or may not) be surprised to discover that consumers actually believe what the announcer is saying. The law of scarcity increases the perceived value of the item. Then, when the consumer finally obtains it, the newness wears off, and buyer’s remorse sometimes sets in.
And isn’t that the way it is with many “trends” in education? Something “new” is proffered as a solution to a persistent issue, and it’s tossed about, debated, and voted on; then, when it’s finally attained, interest and excitement for it are lost for several reasons:
- The realization of how much work is necessary to fully implement the solution (as if the solution were the tip of the iceberg, and the implementation process encompassed the other 8/9ths);
- So much time was used “playing with the string” that the solution has evolved into a different “string,” one with many of the same characteristics, but with a different element that necessitates additional discussion and research before further progress can be made; or
- By the time the solution is implemented, the next generation of the solution has become available, or a more effective solution has presented itself in the marketplace.
It’s usually when one of these three things happens that the “cat” moves on to the next “bright and shiny” string.
The problem: we over-analyze. Why do we do it? Because it’s required by funding sources or certification organizations. Then, by the time we have all the “i”s dotted and the “t”s crossed, a newer solution has made its way to the forefront, the funding organizations that were on board suddenly have second thoughts, and what was once seen as progress is viewed as regression in the blink of an eye.
We’ve seen this phenomenon at the consumer level, fueled by the rapid acceleration of change enabled by technology. You may have purchased an iPad 2 in 2011. Three years later, just as you’ve become used to it, you’re surprised to discover that it no longer works with the new software upgrade. There is no “fix” available, and the only solution is to purchase the new model – which has different peripheral devices and protective cases – while your current device is practically worthless. I’m on my 4th-generation iPad, and the iPad I purchased in 2014 is now an effective paperweight. Computers are moving toward the same planned obsolescence. A computer I thought was 4 years old is actually only 2.5 years old…but it has only 1.5% of its storage capacity available. Time for a new one!
Technology manufacturers know this. “Planned obsolescence” is not a devious attempt to frustrate the consumer but an inherent characteristic of technology development. It’s why cell phone providers recommend a two-year contract, with the opportunity to upgrade 18 months in. The problem? Such a mindset sets up a head-on collision with a company’s technology policy when that policy is changing from a three-year equipment refresh cycle to a four-year cycle in the interest of saving money.
One might think the message for education today is not to play with the string for too long: the system either must embrace and implement it, or wait for the next string to embrace and implement. Unfortunately, neither approach is practical today, because of the unintended consequences that may result, or the resistance from those who are not “early adopters” when it comes to new initiatives.
The problem is not in the “string,” however. The problem is seeking out “Best Practices” in a time of “Innovation.” Because of technology, “Best Practices” have become subjective rather than a benchmark. What is best thinking one day (such as “The earth is the center of the universe”) becomes erroneous when a different theory is proven to be true (“The sun is the center of the solar system, and the earth revolves around it”). Centuries of thinking were disproved in a fraction of that amount of time based on new technologies of the day, such as the telescope.
Today, we are keenly aware that what we purchase may not last for decades, like the heirlooms passed down through generations in our families. We secretly hope it will. But even if it does, and it is considered “best” at the time, it will not be “best” 5 to 10 years down the road.
Also today, “Best Practices” in technology are being tested continuously and ever more insidiously. One only needs to read the headlines about data breaches at companies that thought the processes they had in place were secure. Unfortunately, when a company claims to have the highest standards of cybersecurity in place, that claim is viewed as an invitation – a challenge – to the next hacker who wants to prove their abilities to the world.
What’s the solution? It’s not an easy one. It requires bi-location, so to speak – the recognition of the principle of duality. Within the last 100 years, quantum physicists have entertained the idea of being in two places at the same time, realizing that things can be two “things” at once. Consider the Complementarity Principle:
- In its totality, therefore, nature is dual. None of its components can only be considered as a particle or as a wave. To understand this fact, Niels Bohr introduced in 1923 the Complementarity Principle: simply put, every component in nature has a particle, as well as a wavelike character, and it depends only on the observer which character he sees at any given time. (Read more at: http://phys.org/news7144.html#jCp)
Our task, though, is much simpler. “We” don’t have to be in two places at the same time…but our minds do. They have to be focused on the present to know what’s “Best” now, while also remaining keenly aware of what’s “Next” when it comes to planning and preparing solutions for problems that haven’t yet surfaced. What makes the “Next” component difficult, however, is the current practice of limited terms for contracts, elected officials, and appointed positions. Two-year terms do not offer sufficient time to effect long-term solutions, which usually take three to five years to come to fruition. Individuals who have been in their positions for some time and may be looking to retire in the near future aren’t necessarily interested in initiating new processes that will require significant amounts of energy, time, talent, and consensus-building.
Perhaps if we let the left side of the brain focus on what’s “Best” and the right side on what’s “Next,” then, working together, the two can devise the proper solution to the question of timing. Then again, that’s looking at time as a function of “Chronos” – that is, “our time,” with our perceived time constraints. There are those who see time as a function of “Kairos,” or “God’s time,” which is vastly different from any timeline a human can construct.