ai caramba: What exactly is AI? | Part III: All AI know is that AI know nothing

Monday June 18, 2018, 4 min read

Just how relevant – or important – are philosophical questions around the field of AI? Part 3 of a three-part series that explores the questions around the often fantastical world of Artificial Intelligence to answer the most basic question of them all – what exactly is AI? (Read Part 1 here and Part 2 here.)

Tautology is the last refuge of the lexicographer; it offers to define by action what cannot be defined by articulation. Examples of this include Herbert Morrison’s maxim, “Socialism is what the Labour Government does”, or the most famous one, by Jacob Viner: “Economics is what economists do”.

Thus, for the definition of AI, we may need to jump out of the teleological fire and into the tautological pan.

Image: Shutterstock

Rather than being a cop-out, this actually mirrors a prominent machine learning technique for image recognition – the Neural Network. A Neural Network is a collection of neurons (mathematical functions loosely modelled on biological neurons) arranged in layers and joined by particular weights. When an input is fed in, the network performs various computations on it, slicing and dicing it within each layer and propagating the results through the subsequent layers until it generates an output. The computations it performs and the individual weights it settles on bear little resemblance to the initial datasets with which it was trained.

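To make that a little more concrete, below is a minimal sketch in Python (with NumPy) of the forward pass such a network performs. The layer sizes here are invented and the weights are random stand-ins for learned ones; in a real system those weights are fitted to training data, which is precisely why their final values are so hard to read.

import numpy as np

# A toy feed-forward network: random weights stand in for learned ones.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # input (4 features) -> hidden layer of 8 neurons
W2 = rng.normal(size=(8, 8))   # hidden layer -> hidden layer
W3 = rng.normal(size=(8, 1))   # hidden layer -> single output

def relu(x):
    return np.maximum(0, x)    # the non-linearity applied at each neuron

def forward(x):
    h1 = relu(x @ W1)          # each layer slices and recombines its input...
    h2 = relu(h1 @ W2)         # ...and propagates the result to the next layer
    return h2 @ W3             # the output emerges from the stacked computations

sample = rng.normal(size=(1, 4))   # one input with four features
print(forward(sample))
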
Though this glosses over the finer technicalities, it illustrates an important point – nobody really knows what the @*&# goes on within those layers, or how.

This is similar to the conundrum of what happens in our minds when we store and retrieve memories. Beginning around 1916, US psychologist Karl Lashley went hunting for the engram – a hypothesised physical trace in the brain corresponding to a single memory. He trained rats on a maze, destroyed a piece of their cortex, and ran them through the maze again to see if they still remembered the path. They did – no matter which piece he removed, the memory survived, suggesting it was not stored in any one place. Even today, processes as basic as memory are hotly debated; no wonder more complex, abstract concepts like intelligence are still undefined.

There exists a school of thought about the paradox of the brain seeking to understand itself: to succeed, the brain would need to be simple enough to be understood, yet complex enough to do the understanding. This leads many to invoke Gödel's Incompleteness Theorems – the result that any sufficiently powerful formal system contains true statements it cannot prove from within.

Others believe that neural networks and artificial intelligence may theoretically help us better understand our own brains and intelligence by providing an analogue that’s sufficiently similar but not the same. But in the words of Yogi Berra, “In theory, there is no difference between theory and practice. In practice there is.”

Thus, it’s no wonder that the core concept of intelligence is still undefined and that this body of work lacks a formal definition. As illustrated above, large chunks of human endeavour lack a formal and satisfying definition yet maintain a steady and fervent practice. John McCarthy, one of the founders of the field of Artificial Intelligence, stated that the philosophy around Artificial Intelligence is “unlikely to have any more effect on the practice of AI research than the philosophy of science generally has on the practice of science.”

Regardless of the definition, greater care and emphasis must be placed upon the training and use of these systems. The lack of a definition of Artificial Intelligence or even intelligence hasn’t stymied progress in these fields, though it has stifled understanding and acceptance of the same.

After all, far older questions are yet to be answered. In the words of the immortal philosopher Homer (of Simpson, not Iliad fame),

“What is mind?
No matter.
What is matter?
Never mind.”

Siddarth Pai is the Founding Partner and Chief Financial Officer at 3one4 Capital.

(Disclaimer: The views and opinions expressed in this article are those of the author and do not necessarily reflect the views of YourStory.)