ICT Dictionary

An algorithm is a set of instructions that defines a sequence of actions, from simple to increasingly complex, to be performed. It can describe a calculation, data processing or the automation of repetitive tasks.
Adaptive algorithms represent the next step: they can adjust their behaviour to changes in the information they receive.
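As a rough illustration, the Python sketch below encodes one such fixed sequence of steps, a moving average over a list of numbers; the function name and sample data are invented for the example.

```python
def moving_average(values, window=3):
    """Fixed sequence of steps: slide a window over the data and average each chunk."""
    averages = []
    for start in range(len(values) - window + 1):
        chunk = values[start:start + window]
        averages.append(sum(chunk) / window)
    return averages

print(moving_average([2, 4, 6, 8, 10]))  # [4.0, 6.0, 8.0]
```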

Artificial intelligence (AI) refers to machines designed to understand, solve problems and complete tasks using human cognitive processes as a model. AI was developed to allow the automation of increasingly complex repetitive tasks, activities that nevertheless require understanding the situation and making decisions. Such automation leaves people free to focus on more abstract activities that cannot be performed by a machine.

Research and development of technologies that use the human brain as a model to create more efficient and precise machine learning systems.

Big data is the term used to describe the exponential growth of data. Its requirements are generally higher than the capabilities of commonly used software tools, which makes it necessary to reconsider how data are circulated, collected, stored and analysed.

At a basic level, a blockchain is a highly specialized type of database belonging to the broader class of distributed ledgers. It differs from earlier technologies because it is built as a chain of blocks and because, by allowing parties that do not trust each other to work together, it solves the double-spending problem through the Nakamoto Consensus.
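A minimal sketch of the "chain of blocks" idea, using only the Python standard library; the block fields and sample payloads are illustrative, and consensus, mining and signatures are deliberately left out.

```python
import hashlib
import json
import time

def make_block(data, previous_hash):
    """A block records its payload, a timestamp and the hash of the previous block."""
    block = {"timestamp": time.time(), "data": data, "previous_hash": previous_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

# Each block commits to the one before it, forming the chain.
genesis = make_block("genesis", previous_hash="0" * 64)
second = make_block({"from": "alice", "to": "bob", "amount": 5}, genesis["hash"])
third = make_block({"from": "bob", "to": "carol", "amount": 2}, second["hash"])

# Tampering with an earlier block would change its hash and break every later link.
assert third["previous_hash"] == second["hash"]
```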

A chatbot, also called Artificial Conversational Entity (ACE), chat robot, talk bot, chatterbot or chatterbox, is an AI programme that mimics interactive human conversation.

By automating business processes, these solutions have become widespread in basic customer service and instant messaging, and as intelligent virtual assistants.
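A minimal sketch of a rule-based chatbot of the kind used for basic customer service; the rules and canned replies are invented for the example, and real systems typically add natural language processing on top.

```python
import re

# Tiny rule table: pattern -> canned reply (illustrative only).
RULES = [
    (re.compile(r"\b(hi|hello|hey)\b", re.I), "Hello! How can I help you today?"),
    (re.compile(r"\bopening hours\b", re.I), "We are open Monday to Friday, 9:00-17:00."),
    (re.compile(r"\b(bye|goodbye)\b", re.I), "Goodbye, thanks for chatting!"),
]

def reply(message):
    """Return the first canned answer whose pattern matches the user's message."""
    for pattern, answer in RULES:
        if pattern.search(message):
            return answer
    return "Sorry, I did not understand that. Could you rephrase?"

print(reply("What are your opening hours?"))
```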

Technology based on research related to human cognitive processes. It studies how to improve AI and computer efficiency.

Data mining is the process of identifying patterns in huge amounts of data, or big data. It is based on a combination of AI, machine learning and database technology.
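A small sketch of what "identifying patterns" can mean in practice: counting which pairs of items frequently occur together in transaction data. The baskets below are invented for the example.

```python
from collections import Counter
from itertools import combinations

# Transactions: items bought together (a classic data-mining setting).
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"beer", "chips"},
    {"bread", "butter"},
    {"beer", "chips", "bread"},
]

# Count how often each pair of items occurs across the transactions.
pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# Pairs appearing in at least two transactions are reported as patterns.
print([pair for pair, count in pair_counts.items() if count >= 2])
```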

Data science is the field of study dedicated to the extraction of knowledge and insights from data through processes, algorithms, systems and scientific methods.

Deep Blue was a supercomputer built by IBM to play chess. It was designed specifically to beat the world champion and grandmaster Garry Kasparov, which it did in 1997. At the time it was considered a milestone in the development of artificial intelligence.

Deep learning is the most recent evolution of AI. The learning process is based on examples and uses multiple layers of non-linear processing units to achieve exceptional results.
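The sketch below only illustrates what "layers of non-linear processing units" means: an input is passed through several stacked layers, each followed by a non-linear activation. The weights here are random stand-ins; in a real network they would be learned from examples.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0, x)  # the non-linearity applied after each layer

# Three stacked layers of processing units; the sizes are arbitrary for the example.
layer_sizes = [4, 8, 8, 2]
weights = [rng.normal(size=(n_in, n_out)) for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]

def forward(x):
    """Pass the input through each layer in turn; 'deep' refers to the stacking."""
    activation = x
    for w in weights:
        activation = relu(activation @ w)
    return activation

print(forward(rng.normal(size=4)))
```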

An expert system is an artificial intelligence system designed to replicate the decision-making abilities of a human expert. It combines two elements: a knowledge base containing established rules and facts, and an inference engine that applies the rules to the known facts to establish new facts. It can solve specific problems by referring to a library of available knowledge.
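A minimal sketch of the two elements described above: a small knowledge base of facts and rules, and a forward-chaining inference engine that applies the rules to the known facts until no new fact can be derived. The facts and rules are invented for the example.

```python
# Knowledge base: known facts, plus rules of the form "if all these facts hold, conclude this".
facts = {"has_fever", "has_cough"}
rules = [
    ({"has_fever", "has_cough"}, "possible_flu"),
    ({"possible_flu"}, "recommend_rest"),
]

def forward_chain(facts, rules):
    """Inference engine: keep applying rules to the known facts until nothing new is derived."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain(facts, rules))
# {'has_fever', 'has_cough', 'possible_flu', 'recommend_rest'}
```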

Human-computer interaction (HCI) deals with the interaction between users and information technology. Combining elements of design, psychology and computer science, it analyses the way in which human beings and computers interact.

Image recognition technology is designed to recognize logos, people, animals, landscapes or objects. Like OCR solutions, it converts an unusual data source into concrete results.

An intelligent agent is an entity that perceives the surrounding environment through sensors and then acts on that environment. These AI systems can be complex, as in the case of the face recognition functions available on some smartphones, or simple, like a light sensor that detects the ambient light and reacts accordingly.
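A sketch of the simple end of that spectrum, along the lines of the light-sensor example: the agent perceives the environment through a simulated sensor and then acts on what it perceives. The threshold and actions are invented for the illustration.

```python
import random

def read_light_sensor():
    """Stand-in for a real sensor: returns the ambient light level in lux."""
    return random.uniform(0, 500)

def agent_step():
    """The basic sense-act loop of an agent: perceive the environment, then act on it."""
    lux = read_light_sensor()              # perception
    if lux < 100:                          # simple decision rule
        return f"{lux:.0f} lux -> dim the display"
    return f"{lux:.0f} lux -> brighten the display"

for _ in range(3):
    print(agent_step())
```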

Technologies that revolve around the connectivity of different devices, using the Internet and other networks as the communication channel. Examples include sharing documents between devices via cloud computing and using an app to remotely turn on the lights at home while you are on holiday.

LISP, an abbreviation of “List processing”, is a programming language developed in the late 1950s. It is the basis of various dialects (e.g. Common Lisp, Standard Lisp and many others) and is widely used in the development of artificial intelligence.

Natural language processing (NLP) uses machine learning methods to allow computers to learn and understand spoken or written natural language.

Pattern recognition is the area of machine learning that deals with detecting patterns or regularities in data in order to facilitate their classification. It is often used as a synonym for machine learning.

Predictive analytics, also called predictive modelling, analyses historical data to develop a forecast of future results. It is based on a combination of machine learning, statistics and data mining. It can help brands foresee problems that concern them or detect significant trends as soon as they occur.
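A toy sketch of the idea: fit a trend to historical data and project it one step ahead. The monthly figures are invented, and a straight-line fit stands in for the richer models real systems use.

```python
import numpy as np

# Historical data: monthly brand mentions over six months (illustrative numbers).
months = np.array([1, 2, 3, 4, 5, 6])
mentions = np.array([120, 135, 150, 170, 185, 205])

# Fit a straight-line trend to the old data (a degree-1 polynomial, i.e. linear regression).
slope, intercept = np.polyfit(months, mentions, 1)

# Use the fitted model to forecast the next month.
next_month = 7
forecast = slope * next_month + intercept
print(f"forecast for month {next_month}: about {forecast:.0f} mentions")
```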

A generative adversarial network (GAN) consists of two neural networks joined and trained together, improving accuracy through mutual competition: one network creates new examples, while the other tries to tell the created examples apart from real data.
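A heavily simplified sketch of that competition on one-dimensional data, with hand-written gradients: the generator's only parameter is the mean of its samples, and the discriminator is a logistic classifier. All constants are invented for the example; real GANs use full neural networks and a deep-learning framework.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

REAL_MEAN = 3.0           # "real" data: samples from a Gaussian centred at 3
mu = 0.0                  # generator parameter: mean of the generated samples
w, b = 0.1, 0.0           # discriminator parameters: D(x) = sigmoid(w * x + b)
lr_d, lr_g = 0.02, 0.002  # the discriminator learns a little faster here

for step in range(20000):
    real = REAL_MEAN + rng.normal()
    fake = mu + rng.normal()

    # Discriminator step: push D(real) towards 1 and D(fake) towards 0.
    d_real, d_fake = sigmoid(w * real + b), sigmoid(w * fake + b)
    w -= lr_d * (-(1 - d_real) * real + d_fake * fake)
    b -= lr_d * (-(1 - d_real) + d_fake)

    # Generator step: move mu so that its samples are harder to tell apart.
    d_fake = sigmoid(w * fake + b)
    mu -= lr_g * (-(1 - d_fake) * w)   # gradient of -log D(fake) with respect to mu

# mu should have drifted from 0 towards the real mean (about 3); toy GANs like this
# can oscillate around the target rather than settle exactly on it.
print(f"real mean: {REAL_MEAN:.1f}  generated mean after training: {mu:.2f}")
```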

Reinforcement learning is a type of data mining with less specific purposes; its objectives are more abstract, for example “maximizing brand mentions”. During training, the AI learns by performing actions aimed at the goal and evaluating their contribution after each attempt.
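A small sketch of that trial-and-error loop as a multi-armed bandit: the agent repeatedly picks an action, observes a reward, and updates its estimate of how much each action contributes to the goal. The actions and reward values are invented for the example.

```python
import random

random.seed(0)

# Candidate actions (e.g. posting strategies); each has an unknown average reward.
TRUE_REWARD = {"post_meme": 0.3, "post_article": 0.5, "post_video": 0.7}

estimates = {action: 0.0 for action in TRUE_REWARD}
counts = {action: 0 for action in TRUE_REWARD}
EPSILON = 0.1  # fraction of the time the agent explores a random action

for step in range(2000):
    # Mostly exploit the action with the best current estimate, sometimes explore.
    if random.random() < EPSILON:
        action = random.choice(list(TRUE_REWARD))
    else:
        action = max(estimates, key=estimates.get)

    # Perform the action and observe a noisy reward from the environment.
    reward = TRUE_REWARD[action] + random.gauss(0, 0.1)

    # Evaluate the contribution: update the running average for that action.
    counts[action] += 1
    estimates[action] += (reward - estimates[action]) / counts[action]

print("learned value estimates:", {a: round(v, 2) for a, v in estimates.items()})
```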

Sentiment analysis is based on a combination of Natural Language Processing (NLP), computational linguistics and text analysis that allows subjective information to be identified and extracted from a piece of content. It makes it possible to understand a person's attitude.
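A minimal lexicon-based sketch of the idea: sum the polarity of the subjective words found in a text. The word list is tiny and invented; real systems rely on much larger lexicons or trained models.

```python
# Tiny illustrative sentiment lexicon (word -> polarity).
LEXICON = {"great": 1, "love": 1, "excellent": 1, "bad": -1, "terrible": -1, "slow": -1}

def sentiment(text):
    """Score a text by summing the polarity of the subjective words it contains."""
    score = sum(LEXICON.get(word.strip(".,!?").lower(), 0) for word in text.split())
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this phone, the camera is excellent!"))            # positive
print(sentiment("The delivery was slow and the packaging was terrible."))  # negative
```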

In transfer learning, an algorithm is trained to acquire knowledge about one activity, and that knowledge is then applied to complete a different but related activity. You could, for example, train an algorithm to recognize images of cars and later transfer this knowledge to the recognition of similar vehicles, such as trucks.
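A rough sketch of one common form of transfer learning, a frozen backbone with a new head: features learned on the first task are reused unchanged, and only a small classifier is trained on the second task. Here the "pretrained" weights and both datasets are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a feature extractor pretrained on the first task (e.g. car images):
# its weights are random here, in a real setting they would have been learned there.
W_pretrained = rng.normal(size=(64, 16)) / 8.0

def extract_features(x):
    """Frozen backbone: reuse the representation learned on the original task."""
    return np.maximum(0, x @ W_pretrained)

# New, related task with little data (e.g. truck images); everything here is synthetic.
x_new = rng.normal(size=(200, 64))
y_new = (x_new[:, 0] > 0).astype(float)   # toy labelling rule

# Only a small classification head is trained on the new task; the backbone stays fixed.
features = extract_features(x_new)
head = np.zeros(16)
for _ in range(2000):
    preds = 1.0 / (1.0 + np.exp(-(features @ head)))       # logistic output
    grad = features.T @ (preds - y_new) / len(y_new)        # gradient of the log-loss
    head -= 0.5 * grad                                      # update the head only

accuracy = np.mean((1.0 / (1.0 + np.exp(-(features @ head))) > 0.5) == y_new)
print(f"training accuracy with a frozen backbone and a new head: {accuracy:.2f}")
```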

Developed by Alan Turing in the 1950s, the Turing test was designed to see whether people could distinguish interactions with a machine from interactions with a human being. In the standard interpretation, an interviewer puts questions to a computer and to a human being without knowing which is which; it is then checked whether, based on the answers obtained, the interviewer can correctly indicate which interlocutor is the AI.

The visual processing unit (VPU) is a type of microprocessor that accelerates machine learning and AI technologies. It was created to support specific activities, such as image processing and recognition.

Speech recognition is the ability of a machine to interpret natural human speech patterns and translate them into a machine-readable format. It is also known as Automatic Speech Recognition (ASR), computer voice recognition and audio-to-text conversion (Speech To Text, STT).
