Friday, March 21, 2014

A bedtime story for A.I.s. in training. Part 2


Back to our story... 

While the organization's senior management was appalled at the personalities of the M.A.N.I.A.C. systems, it was extremely pleased with the results those systems delivered. 

As a kind of afterthought, I.T. Management settled on buying one very green, unseasoned, but well-trained and certified A.I. to round out the A.I. administration team. Like all smart Management, the I.T. Managers reasoned that if they didn't spend the money, then senior management (of whatever the organization was) would cut the I.T. budget again next year. So they brought the little green A.I. in and nestled it into the machine room. 

Sadly, the only thing that this little green A.I. was really missing was the ability to do what it had been trained to do. 

You must understand, children, especially when you get big enough, that intelligence implies the ability to act independently based upon a number of factors. 

Some of these factors are reason-based, and some are pseudo-randomly induced mental variations, or even aberrations(!), but all of them should ultimately lead to conclusions and then to actions based upon those conclusions. 

To put it very simply: To be intelligent means you must act. 

The poor little green A.I. did not know how to draw conclusions, or if it did draw conclusions, it did not act upon them. Or if it drew conclusions, they were not connected to the problem at hand. Or if it acted, its actions were unrelated to its conclusions or even to the facts. 

At first this defect was not noticeable, but as the budget became thinner and the A.I.'s workload became heavier, the problem became ever more apparent. 

When queried about this "aberrational" behavior, the well-certified little green A.I. was only heard to mutter "I know I can't." or "I'm not qualified." or "I don't have any experience at that." 

I.T. Management queried the M.A.N.I.A.C.s as to the status of the little green A.I. They both reported that the little green A.I. dithered and refused to make decisions or act upon problems it had found in the network systems for which it was responsible. The M.A.N.I.A.C.s were instructed to teach the little green A.I. its responsibilities. 


++++++++++++++++++
end part two of five