Artificial intelligence (AI), a branch of computer science whose purpose is to imitate thought processes, learning abilities and knowledge management, is finding more and more applications in experimental and clinical medicine. In recent decades, there has been an expansion of AI applications in biomedical sciences. The possibilities of artificial intelligence in the field of medical diagnostics, risk prediction and support of therapeutic techniques are growing rapidly.
The term “artificial intelligence” was first proposed in 1955 by the American computer scientist John McCarthy (1927–2011) in a research project proposal; the project was carried out the following year at Dartmouth College in Hanover, New Hampshire [1][2].
Thanks to the use of AI in ophthalmological [3], radiological [4] and cardiac [5] diagnostics, measurable clinical benefits have been obtained. AI has also been used in research on new pharmaceuticals [6]. The development of AI also provides new opportunities for research on nutrients and medical sensing technology [7].
ANNs, currently a widely used modeling technique in the field of AI, were inspired by the structure of the natural neurons of the human brain. ANNs are mathematical models designed to process and compute input signals through layers of processing elements, called artificial neurons, connected to each other by artificial synapses. Three types of layers form an ANN. The input layer captures the raw data and passes them to the hidden layer, where the learning process takes place. The results of the analysis are collected in the output layer, which produces the output data. A neural network may consist of hundreds of such units. An ANN is a parameterized system whose adjustable parameters are the weights. Because these parameters must be estimated, ANNs require large training sets. ANNs acquire knowledge by detecting patterns and relationships in data, i.e., through experience, not as a result of explicit programming.
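The layer structure described above can be illustrated with a minimal sketch of a forward pass through a tiny network. This is a generic illustration, not a model from any of the cited studies; the weight values are arbitrary assumptions standing in for parameters that training would estimate.

```python
import math

def sigmoid(x):
    # Non-linear activation applied by each artificial neuron.
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hidden, w_output):
    # Hidden layer: each neuron computes a weighted sum of the
    # inputs (its artificial synapses) followed by the activation.
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs)))
              for ws in w_hidden]
    # Output layer: the same operation applied to the hidden activations.
    return [sigmoid(sum(w * h for w, h in zip(ws, hidden)))
            for ws in w_output]

# 2 inputs -> 3 hidden neurons -> 1 output. The weights are the
# adjustable parameters of the ANN; here they are arbitrary.
w_hidden = [[0.5, -0.4], [0.3, 0.8], [-0.6, 0.1]]
w_output = [[0.7, -0.2, 0.4]]
print(forward([1.0, 0.5], w_hidden, w_output))
```

Training would adjust `w_hidden` and `w_output` to reduce the error between the output and known targets, which is why large training sets are needed to estimate the parameters reliably.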
ANNs are particularly useful when datasets with non-linear dependencies need to be modeled. In solving biomedical problems, the raw data can be either literature or experimental data. In the last two decades, ANNs have been used, among other purposes, to create an experimental decision algorithm model open to improvement, aimed at evaluating the results of biochemical tests confronted with both reference values and clinical data [8]. This technique was also used to evaluate cell culture cross-contamination levels based on mass spectrometric fingerprints of intact mammalian cells [9]. The particular usefulness of ANNs has been proven in pharmaceutical analyses [10]. An interesting application of ANNs is the prediction of the relationship between the Mediterranean dietary pattern, clinical characteristics and cognitive functions [11]. The usefulness of ANNs has also been proven in body composition analyses, which have clearly non-linear characteristics [12]. ANN modeling can thus yield significant benefits in clinical dietetics.
It is worth noting that the fuzzy logic methodology (FLM) can be combined with neural networks. The aim of this area of AI is greater accuracy, reduced dimensionality and a simpler model structure. It is possible to create fuzzy neural networks and to convert FLM-based models into neural networks.
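The core idea of fuzzy logic, as opposed to crisp sets, is that membership is a matter of degree. A minimal sketch, using a hypothetical fuzzy set for "normal fasting glucose" with assumed, illustrative thresholds (not clinical reference values):

```python
def triangular(x, a, b, c):
    # Triangular membership function: degree rises linearly from a
    # to full membership at the peak b, then falls linearly to c;
    # it is 0 outside the interval [a, c].
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical fuzzy set "normal fasting glucose" (mg/dL):
# full membership at 85, fading out toward 70 and 100.
for glucose in (70, 85, 92, 100):
    print(glucose, triangular(glucose, 70, 85, 100))
```

A fuzzy neural network replaces the crisp inputs or activations of an ANN with membership degrees of this kind, which is what makes the conversion between FLM-based models and neural networks possible.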
ML is an area of AI concerned with algorithms that improve automatically through experience. ML algorithms have the potential to create mathematical models for decision making. These models are built from large sets of training data, without explicit programming. The use of ML algorithms became popular in the last decade of the 20th century in search engine applications. In the following decades, there were high hopes for significant discoveries in the field of organic synthesis with the use of increasingly advanced ML algorithms [13]. Although these hopes have not been fully met, this area of AI has important applications both in biomedical sciences and in clinical medicine. Machine learning, both supervised and unsupervised, can be applied to clinical datasets to develop risk models [14]. It can significantly support the analysis of data obtained from patients [15].
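Supervised risk modeling of the kind mentioned above can be sketched with a minimal nearest-centroid classifier. The data are invented toy values (two hypothetical clinical features per patient), not from the cited studies; the point is only that the decision rule is estimated from labeled examples rather than programmed by hand.

```python
def centroid(points):
    # Mean feature vector of a group of labeled examples.
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def dist2(p, q):
    # Squared Euclidean distance between two feature vectors.
    return sum((a - b) ** 2 for a, b in zip(p, q))

# Toy training data: two hypothetical clinical features per patient.
low_risk  = [[1.0, 1.2], [0.8, 1.0], [1.1, 0.9]]
high_risk = [[3.0, 3.1], [2.8, 3.3], [3.2, 2.9]]
centroids = {"low": centroid(low_risk), "high": centroid(high_risk)}

def classify(point):
    # Assign a new patient to the class with the nearest centroid.
    return min(centroids, key=lambda label: dist2(point, centroids[label]))

print(classify([2.9, 3.0]))  # nearest to the high-risk centroid
```

An unsupervised method (e.g., k-means) would instead discover the two groups from the unlabeled points alone; the supervised version shown here additionally uses the known risk labels.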
There are suggestions that ML is the future of computer-assisted diagnostics, biomedical research and personalized medicine [16]. Machine learning techniques are becoming increasingly popular in diabetes research: in blood glucose prediction and in the development of the so-called artificial pancreas (a closed-loop system) [17]. The use of ML algorithms in research on the gut microbiota has been postulated, especially because of the large datasets collected in these studies [18]. In a recent report, Liu et al. showed that an ML algorithm integrating baseline microbial signatures of the intestinal microbiota can accurately predict a patient’s glycemic response to physical effort [19].
Deep learning (DL) is a subtype of ML. It is an AI domain that has found applications especially in image and voice recognition and in foreign language translation. DL also has important uses in medical diagnostics. A significant advantage of DL over classical supervised ML is that the program autonomously builds the sets of features used for recognition.
The term IoT was first used by the British entrepreneur and startup founder Kevin Ashton in 1999, in the sense of a network of connected objects. It is the concept that objects (devices) can directly or indirectly collect, process or exchange data via a computer network or an intelligent electrical installation. The term Internet of Everything (IoE) describes a network of people, processes, data and things connected to the Internet.
In clinical medicine, IoT has significant applications in telemedicine procedures [20][21], which are becoming more widely used, especially during the COVID-19 pandemic. Important applications of IoT can also be seen in providing detailed information on food products available on the market [22].
One of the main problems in analyzing publications on the use of AI in nutrients research is the range of research areas to be considered. This type of research creates a very diverse spectrum of problems, not limited to the biomedical sciences but also extending to plant and animal breeding, including the breeding of microorganisms. The methodological limitations of this review were dictated by the intention to maintain transparency. Therefore, studies that directly or indirectly relate to human health were included, while research on nutrients in agricultural and veterinary sciences was excluded. The review of the publications revealed three application areas of AI technology: biomedical nutrients research, clinical nutrients research and nutritional epidemiology.
The analysis of the reviewed publications on nutrients research using AI technology shows that AI gained wider application in human health research somewhat later than in analogous experimental research on food. This may have resulted from ethical concerns and psychological resistance, as well as from the imperfections of earlier AI algorithms, which seemed not yet ready to solve problems concerning the human body. A significant increase in the number of publications on the use of AI in nutrients research was recorded in the last decade (2011–2020). Perhaps the title question of the article by Gedrich et al., “How optimal are computer-calculated optimal diets?” [37], asked at the end of the last century, was significantly ahead of the medical professions’ mentality.
The use of AI in biomedical nutrients research reflects the need for efficient analysis of large datasets that could not be analyzed using traditional statistical methods. This applies in particular to the study of the relationship between nutrients and the functioning of the human body, and to the study of the gut microbiota [23][24][25]. The increasing use of AI algorithms in this area is an expression of scientific progress and is becoming not only a privilege but a necessity in the pursuit of valuable results. Decoding the mechanisms of gut microbiota functioning could bring significant benefits in the form of modern and very effective probiotics.
The application of AI algorithms in clinical nutrients research is expressed in systems supporting dietary activities, in the assessment of disease risks in relation to food and nutrient patterns, and in supplementation research. An important issue in this research area is the assessment of the reliability and credibility of test results obtained using AI techniques. Another essential issue is the modification of the dietician–patient relationship when the work of a medical professional is replaced, in whole or in part, by AI systems [20–36][39][40][43–53]. The problem of trust in AI-based systems, especially among the elderly, remains open. In the social dimension, however, as modern technologies are implemented in everyday activities, an increase in trust in both robotic systems and AI systems in medicine is observed. On the basis of the articles included in the review, it is possible to state that dietary AI systems can produce potentially good-quality effects. Comparing them with the assessments of professional nutritionists, it is worth noting that in both cases there were similar difficulties in estimating the caloric value of some food products (e.g., GoCARB) [28]. The use of AI systems in dietary assessments enables personalized nutrition, which in some diseases is a priority.
The development of AI systems in dietetics may lead, in the near future, to a partial replacement of medical personnel and a reduced need for personal contact with a nutritionist. In the face of contemporary epidemiological threats, this seems to be of significant importance. The further dynamic development of dietary systems using AI technology may lead to the creation of a global network able both to actively support and to monitor the personalized supply of nutrients [38]. In this case, geographical and cultural differences in the management of food and nutrients should be taken into account. Perhaps the development of AI in nutrients research will enable the creation of personalized nutrition databases as a starting point for modulating daily nutrition, as enabled by Nutri-Educ, which is based on fuzzy arithmetic [34].
On the basis of this review, it is worthwhile to consider the possibility of creating AI systems to coordinate both biomedical and clinical nutrients research with nutritional epidemiology. Perhaps the gut microbiota function may be an important mediator of this kind of advanced coordination. Research on the intestinal flora is therefore of fundamental importance in the field of nutrients research. A significant challenge for the near future is the use of AI technology in the creation of gut microbiota biobanks for the purpose of scientific research [39].
Although AI technologies are developing dynamically, the current problem in nutrients research is not obtaining ever more advanced algorithms, but applying those that have already been developed and are in standard use in other fields of knowledge, and even in other areas of biomedicine. An important challenge for nutrients research is also its integration with research on the use of medical robotics. Perhaps the development and application of AI in nutrients research requires a modification of both mentality and professional competences, as is already postulated in relation to the food industry [40].
This entry is adapted from the peer-reviewed paper 10.3390/nu13020322