Hubert Taler

C as Cloud | The ABC of AI

Cloud computing may seem like a thoroughly modern concept, a product of the most recent chapter in the history of technology. Its roots, however, reach back much further. In this article, we trace the evolution of this revolutionary idea from the early days of computing to the present day.

The prehistory of the cloud

The origins of cloud computing can be traced back to 1963, when DARPA (Defense Advanced Research Projects Agency) funded the MAC (Multiple Access Computer) project, one of the first systems to allow CPU time to be shared among multiple users. In the 1960s, the concept of time-sharing gained popularity, largely through Remote Job Entry (RJE) technology offered mainly by companies such as IBM and DEC.

The dominant model at the time was the ‘data centre’ model, in which users submitted jobs to be run by operators on IBM mainframes. In the 1970s, full time-sharing solutions began to appear, such as Multics on GE hardware, Cambridge CTSS, and early UNIX ports on DEC hardware. In practice, this meant that researchers who needed computing time could use the machine themselves instead of relying on operators.

Another breakthrough came with the development of the Internet in the late 1980s and early 1990s. In the 1990s, telecoms companies began to offer virtual private network (VPN) services, which provided a quality of service comparable to dedicated connections, but at lower prices. Cloud computing began to symbolise a new era in data management. Today we mostly associate VPNs with changing our virtual geographical location, but at its core a VPN simply means accessing remote resources as if they were part of our local network.

All of this went by various names (e.g. remote access), but without the metaphor of cloud computing or cloud storage.



The concept of the cloud emerges

In 1994, General Magic used the cloud metaphor to describe its Telescript environment, in which mobile agents could travel around the network in search of access to various sources of information. It was then that the cloud began to be seen not just as remote access to services, but as a platform for building complex virtual services: a place where applications run independently of the user's attention.

In 2002, Amazon established a subsidiary called Amazon Web Services, enabling developers to build applications independently of traditional IT infrastructures. In 2006, the company introduced further services: Simple Storage Service (S3) and Elastic Compute Cloud (EC2), which were among the first services to use server virtualisation on a pay-per-use basis. The term Infrastructure as a Service (IaaS) was born.
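To make the idea of Infrastructure as a Service a little more concrete, below is a minimal sketch of what pay-per-use object storage can look like from a developer's point of view, written in Python with the boto3 SDK for S3. The bucket name is a placeholder and the example assumes AWS credentials are already configured; it is an illustration of the model, not a recipe from Amazon's documentation.

# A minimal sketch of "infrastructure as an API call": storing and reading a
# file in S3 without owning or managing any server.
# Assumes AWS credentials are configured locally; the bucket name is a placeholder.
import boto3

s3 = boto3.client("s3")

# Store an object: you pay for the storage you actually use.
s3.put_object(
    Bucket="example-demo-bucket",   # hypothetical bucket name
    Key="hello.txt",
    Body=b"Hello from the cloud",
)

# Read it back on demand, from anywhere with network access.
obj = s3.get_object(Bucket="example-demo-bucket", Key="hello.txt")
print(obj["Body"].read().decode())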

2007 brought further breakthroughs: Netflix launched its online movie streaming service, creating the first streaming service based on the Software as a Service model, while IBM and Google collaborated with universities to build server farms for research purposes.

The 2010s began with Microsoft introducing the Microsoft Azure platform. Shortly thereafter, Rackspace Hosting and NASA initiated the OpenStack project, aimed at making it easier for organisations to offer cloud computing services on standard hardware, as opposed to network equipment dedicated to server farms.

In the following years, cloud development accelerated. IBM introduced the IBM SmartCloud framework and the US government created the FedRAMP programme, setting security standards for cloud services. In 2011, Apple launched iCloud and in 2012 Oracle announced Oracle Cloud.



Without the cloud, would there be no AI?

Over the past few years, especially after the 2020 pandemic, cloud computing has grown in popularity as a tool to provide remote working flexibility and data security. Currently, global spending on cloud services stands at $706 billion and is expected to reach $1.3 trillion by 2025.

Modern advances in artificial intelligence, such as ChatGPT, would be impossible without the infrastructure provided by cloud computing. The cloud offers the immense computing power and resources needed to process and analyse the large data sets that are central to machine learning and AI. With cloud computing, complex AI models can be trained on massive amounts of data far faster and more efficiently than ever before. What's more, the cloud makes these resources easy to scale and keep available, which is critical for ongoing exploration and innovation in AI. By providing flexible and powerful on-demand computing resources, cloud computing not only supports the development of new capabilities in AI, but also enables faster deployment and integration of smart applications into everyday life and business. Thus, the cloud has become the foundation on which modern AI is built, transforming theoretical concepts into practical applications that change the way we work, learn and communicate.
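To illustrate what "on-demand computing resources" means in practice, here is a hedged sketch, again in Python with boto3, of requesting a GPU machine for a training job and releasing it afterwards. The machine image ID is a placeholder and the instance type is only an example; the point is that compute for AI can be rented on demand through an API call rather than bought as hardware.

# A sketch of on-demand compute: request a GPU instance for a training job,
# then release it when the job is done.
# The ImageId is a placeholder; instance types and availability vary by region.
import boto3

ec2 = boto3.client("ec2")

# Ask the cloud for a GPU machine only when a training job needs one.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image ID
    InstanceType="p3.2xlarge",        # example GPU instance type
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Started training instance {instance_id}")

# ... run the training job on the instance, then give the hardware back:
ec2.terminate_instances(InstanceIds=[instance_id])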

The story of cloud computing shows how far technology can take us, offering ever-new opportunities for businesses and individual users alike. It is a story of the ongoing quest for more efficient and flexible use of the computing resources that define the modern world of technology.
