Wednesday, March 29, 2006

Are Mentorships being Disintermediated?

The basis for learning is changing. In the pre-microchip generation, lessons were learned from directly experienced real-world consequences. The tutelage of mentors was prized because they had learned the tricks and traps of the trade through personal or vicarious experience and handed them down through education. The most valuable mentors were guardians of institutional memory who could act like harbor pilots, guiding novices to a solution because they knew where the shallow straits lay, perhaps because they had once run aground themselves. (An interesting sidelight here is the pressure for increased self-reliance caused by the disappearance of institutional memory, a result of force realignments and the inevitable attrition of the seasoned veterans who once stood as mentors.)

Power of Prototyping

By contrast, the information age has facilitated virtual reality--self-discovery through simulation of actual situations, events, and problems. Given data on the real-world parameters, computers can model sensory cues and contingencies, support gaming of choice alternatives, and simulate appropriate consequences. Moreover, these simulations can harmlessly mimic reality so that actual disaster is never experienced as a consequence. This is their key advantage, of course. Incremental learning or proficiency can be developed by stopping or restarting the process at will, completely erasing the damage of a previous mistake.
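The erase-and-retry property described above can be made concrete with a toy sketch (entirely illustrative; the scenario and names are invented): a simulator that checkpoints its full state before each attempt, so a "crash" can be rolled back as if the mistake never happened.

```python
import copy

class Simulator:
    """Toy trainer: a vehicle must stop before a wall. Mistakes are erased by rollback."""

    def __init__(self, distance_to_wall=10):
        self.state = {"position": 0, "crashed": False}
        self.distance_to_wall = distance_to_wall
        self._checkpoints = []

    def save(self):
        # Snapshot the full state so any later mistake can be undone.
        self._checkpoints.append(copy.deepcopy(self.state))

    def restore(self):
        # Roll back to the last checkpoint, erasing the consequences of error.
        self.state = copy.deepcopy(self._checkpoints[-1])

    def advance(self, steps):
        self.state["position"] += steps
        if self.state["position"] >= self.distance_to_wall:
            self.state["crashed"] = True  # harmless: only an icon crashes

sim = Simulator()
sim.save()
sim.advance(12)   # overshoot: a disaster, but only a simulated one
sim.restore()     # the mistake never "happened"; try again from the checkpoint
```

The same save/restore pattern underlies incremental proficiency building: each attempt starts from a known-good state, and only the lesson, not the damage, carries forward.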

Infinite Memory

As data storage and processing power increase, the boundary between simulation and reality becomes even harder to distinguish. Witness the power of computer animation in such popular films as Jurassic Park, Forrest Gump, and Twister, or the realism of mechanical simulation in the new Boeing 777 training module. In military affairs, this capability to simulate reality has been employed to construct a "virtual theater of war" which seamlessly combines real units and simulated ones to test doctrinal concepts and materiel effectiveness through simulated operational and tactical maneuvers.

When reality and simulation become indistinguishable, is there an indelible effect on the player that desensitizes him to the damage of real-world catastrophes? When harm and pain exist chiefly in cyberspace as immaterial states, might a player fail to develop sympathy for real-world consequences? The allure of virtual reality could have a dark side--the blunting of human sensibilities toward a real war fought abstractly, in a manner not much different from its recreational counterparts. Could advanced technology, the increasing digitization of the battlefield, and the automation of combat systems transform the experience of war into another video arcade game, an abstraction defined by the movement and deletion of computer icons? In my opinion, the answers are not clear-cut, but today's most popular computer games have martial themes, and more are on the way. The story is told of an officer who, in the pitch of virtual battle, swore at his terminal, "Damn, I lost an icon!" as an overrun battalion was flagged by the computer. Even if the story is apocryphal, the potential for the response clearly exists in cyberwar.

Systematic decisionmaking is eclipsing intuition. Computers that beat chess masters prove that artificial intelligence and knowledge-based systems are capable of extremely sophisticated decisionmaking. Programs that apply experts' collective rules of thumb have even been shown to make more consistent, reliable decisions than humans in similar problem-solving situations. Yet most expert systems operate from data-hungry mathematical models. They illustrate the inseparable relationship between knowledge and intelligence--the more you know, the smarter you become; so to become smarter, you must know even more.
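As an illustration of how such systems apply experts' rules of thumb (the rules and names below are invented, not drawn from any fielded system), an expert system can be sketched as a loop that keeps firing rules until no new conclusions emerge:

```python
# Each rule pairs a set of required conditions with a conclusion -- a
# codified "rule of thumb" handed down from human experts.

def diagnose(facts, rules):
    """Forward-chain: fire every rule whose conditions are all established."""
    conclusions = set()
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= (facts | conclusions) and conclusion not in conclusions:
                conclusions.add(conclusion)
                changed = True
    return conclusions

RULES = [
    ({"engine_hot", "coolant_low"}, "leak_suspected"),
    ({"leak_suspected"}, "halt_vehicle"),
]

result = diagnose({"engine_hot", "coolant_low"}, RULES)
```

Because the rules fire the same way every time, the system's judgments are perfectly consistent, which is precisely the reliability advantage such programs have shown over human problem-solvers.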

If the quantity of evidence determines the certainty of a hypothesis, then how much evidence is enough? The answer depends on the amount of ambiguity in the problem, because computers reason in all-or-nothing terms and have limited tolerance for partial evidence. Uncertainty is resolved through redundant observations: more data is collected until the ambiguity is reduced to an acceptable level. Ironically, systems that can scan a situation in great depth and analyze it with great precision can provide a decisionmaker with so much capability that he becomes addicted to the information and consequently paralyzed by it. Recently, the Army Times described a computer-assisted exercise during which a battle staff had not noticed it was being overrun by the enemy because the commander was preoccupied with obtaining more data from his battlefield computer.
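The way redundant observations resolve uncertainty can be sketched with a toy Bayesian update (the sensor reliability figures are invented for illustration): each additional detection multiplies the odds in favor of the hypothesis, so ambiguity shrinks as data accumulates.

```python
def update(prior, likelihood_true, likelihood_false):
    # Bayes' rule for a binary hypothesis given one positive observation.
    numerator = likelihood_true * prior
    return numerator / (numerator + likelihood_false * (1 - prior))

belief = 0.5               # initially ambiguous: 50/50
p_detect_if_true = 0.8     # invented sensor hit rate
p_detect_if_false = 0.3    # invented false-alarm rate

history = []
for _ in range(5):         # five redundant "detections" from the sensor
    belief = update(belief, p_detect_if_true, p_detect_if_false)
    history.append(belief)
```

The sketch also hints at the addiction the paragraph describes: confidence climbs with every observation but never reaches certainty, so there is always a reason to ask the computer for one more report.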

Through technology, it is not only possible to suffer paralysis by analysis, but also to neglect the intuitive skills that give commanders an important advantage in ambiguous situations. The author of a recent Military Review essay observes that intuition allows a commander to focus rapidly on feasible solutions when time for systematic analysis is unavailable. This capacity is particularly important in peacekeeping, where the traditional combat decisionmaking model does not fit. But, as that writer notes, intuition comes largely from experience with a broad base of situations. Overreliance on structured systems might, indeed, stunt the growth of this intellectual capacity and severely limit a commander's options in unfamiliar scenarios. One of the worst possible outcomes would be the erosion of a leader's ability to use his own eyes and ears. While decision systems might present an unparalleled opportunity to eliminate risks, they could obscure a strategic leader's awareness of key inputs to the decisionmaking process which exist outside the range of data available through computers. Assessment of political and environmental conditions, for example, will rarely be carried out through the systems of sensors that will generate most of the input to Force XXI computers.
