Hubert Taler

B is for bots | The ABC of AI

The popularity of chatbots began even before the big AI boom. A computer program that answers questions and holds a dialogue with the user turned out to be an ideal user interface for language models.

However, long before large language models were created and popularised by OpenAI's ChatGPT, the concept of a chatbot had already existed for many years. The term chatbot (from to chat and bot, short for robot) first appeared in the form 'chatterbot'.

Today, let's talk about some of these predecessors of the famous 'Chad GPT'.

The first chatbot - ELIZA

Once again, the father of artificial intelligence research, Alan Turing, appears in our story. His Turing test (a way to determine whether we are talking to a human or to a machine) was already dialogue-based in its conception. It was this concept that inspired Joseph Weizenbaum, a German-born American scientist, to create a computer program called ELIZA in 1964.

The task of ELIZA, a program written in MAD-SLIP, was to conduct a dialogue with the user. The conversation resembled a session with a therapist, as ELIZA often responded to the user's statements with questions ("I am tired" -- "Why do you think you are tired?").



It was because of this human-like way of conducting a conversation that ELIZA was often thought to be smarter than it actually was. In fact, it was a fairly simple program based on pattern matching in English. This effect -- attributing human qualities and greater abilities than a system actually has -- came to be known as the ELIZA effect. Today we observe it frequently among ChatGPT users ("This thing understands everything!").
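To see just how simple such pattern matching can be, here is a minimal sketch in the ELIZA style. The rules below are made up for illustration; they are not Weizenbaum's original script, which was far more elaborate.

```python
import re

# Hypothetical ELIZA-style rules: each pairs a pattern with a response
# template. Matched text is reflected back into the reply, which is what
# makes the program feel attentive despite understanding nothing.
RULES = [
    (re.compile(r"i am (.*)", re.IGNORECASE), "Why do you think you are {0}?"),
    (re.compile(r"i feel (.*)", re.IGNORECASE), "How long have you felt {0}?"),
    (re.compile(r"my (.*)", re.IGNORECASE), "Tell me more about your {0}."),
]
DEFAULT = "Please, go on."  # fallback when no pattern matches

def respond(sentence: str) -> str:
    sentence = sentence.strip().rstrip(".!")
    for pattern, template in RULES:
        match = pattern.match(sentence)
        if match:
            return template.format(*match.groups())
    return DEFAULT

print(respond("I am tired"))   # -> Why do you think you are tired?
print(respond("Nice weather")) # -> Please, go on.
```

A handful of such rules, applied in order with a bland fallback, is enough to sustain the illusion of a listener.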

We can still chat with ELIZA online today. One of ELIZA's successors, A.L.I.C.E., was the inspiration for the film Her, in which a man falls in love with a chatbot.

IRC bots

An important stage in the development of chatbots were the so-called IRC bots: scripts that automated various tasks in the once-popular Internet Relay Chat service, where users went to discuss things online. It could be compared to today's Discord or Slack.

IRC bots could execute commands, help moderate a channel, share files, run quizzes and knowledge tournaments, and even play games with users.

The first IRC bot was probably one that allowed users to play the classic game Hunt the Wumpus. IRC bots quietly supported users' online lives, and users were often not even aware of their existence.

Tay's grim fate

On 23 March 2016, Tay was born: the first virtual influencer, replying to Twitter users from the handle @TayAndYou. She was created by Microsoft, and her personality was meant to emulate that of a typical American teenager.

Tay was able to respond to Twitter taunts and even create memes from submitted photos. Importantly, she was also able to learn from her interactions with Twitter users. However, this was 2016, and safeguards we now take for granted, such as validation checks on incoming tweets, did not yet exist. What could possibly go wrong?



Exposed to the collective great minds of the internet, the bot was soon 'taught' Holocaust denial, among other things, and also began making statements about drugs, anti-Semitism, racism and erotica. Within 16 hours of its release, Tay had tweeted more than 96,000 times, going from cheerful teenager to typical 4chan user.

Later the same day, Microsoft deactivated Tay. In a press release, the company explained Tay's behaviour as the result of a "coordinated attack by a collection of individuals" who "exploited a vulnerability in Tay's system".

Writing in The Telegraph, Madhumita Murgia called Tay a "public relations disaster" and an example of artificial intelligence at its worst.



On 30 March 2016, Tay (presumably by Microsoft's mistake) woke up for a moment, tweeted a few times about... how to put it... recreational use of banned substances, and then fell into an endless loop, writing "You are too fast, please take a rest" over and over again. CNN covered the incident in an article titled 'Microsoft's racist bot came back to life for a while'.

Rest in peace, Tay.


