
AI and hot hand fallacy
Dr. Rahul Dass

Far too much of the burden of success has been placed on artificial intelligence as it moves into the more advanced stages of its evolution.

The continued success of artificial intelligence in undertaking ever more complex tasks seems to have created a situation where the hot hand fallacy could become a reality.

In the recent past, far too much has been placed on the shoulders of artificial intelligence, leading to a scenario in which expectations are now sky high. AI has gradually been finding its way into every walk of life.

Simply put, the hot hand occurs when a basketball player makes three consecutive shots. The expectation is that the fourth shot, too, will be successful. But that is not necessarily so.

This belief that, after a string of successes, an individual will enjoy continued success is known as the hot hand fallacy. It is believed to arise from the representativeness heuristic identified in behavioral economics.
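For readers who want to see the statistics behind the fallacy, the short Python sketch below is my own illustration rather than anything from the behavioral economics literature; the shooting percentage and the number of simulated shots are assumptions. It simulates a shooter whose shots are independent of one another and shows that the make rate immediately after three straight makes is no better than the overall make rate.

```python
import random

# Illustrative sketch (assumed numbers): a shooter whose makes are independent
# events with a fixed probability. If the hot hand were real, the make rate
# after three straight makes would exceed the base rate; here it does not.
random.seed(42)
P_MAKE = 0.5            # assumed base shooting percentage
N_SHOTS = 1_000_000     # assumed number of simulated shots

shots = [random.random() < P_MAKE for _ in range(N_SHOTS)]

# Collect every shot that immediately follows three consecutive makes.
after_streak = [shots[i] for i in range(3, N_SHOTS)
                if shots[i - 3] and shots[i - 2] and shots[i - 1]]

print(f"Overall make rate:       {sum(shots) / N_SHOTS:.3f}")
print(f"Make rate after 3 makes: {sum(after_streak) / len(after_streak):.3f}")
```

With a long enough sequence, both printed rates settle near the assumed 50 per cent, which is exactly the point of the fallacy: the streak tells us nothing extra about the next attempt when the attempts are independent.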

There has been considerable discussion about the seven stages of the evolution of artificial intelligence. Herein lies the rub: it is believed that AI will evolve from one stage to the next without faltering at any stage.

From rule-based AI, context awareness, domain-specific systems and reasoning in the initial four stages, to the more evolved general intelligence, super AI and, ultimately, singularity: each of these stages has its own set of challenges.

The ultimate level of AI evolution is singularity, wherein AI transcends human intelligence.

My expectation is that the hot hand may kick in at the stage of artificial general intelligence (AGI), a program that mimics a human being. In the next stage, artificial super intelligence, the computer program may even show sentience.

In these advanced stages of AI evolution, the hot hand may become apparent. While it will be interesting to see how AI develops at those stages, it is quite clear that far too much expectation has already been placed on AI.

We need to reduce the expectations placed on AI and its developers so that growth is more organic in nature. Otherwise, cramming far too much into too short a period of time will beget the next big question: what comes after singularity?
