Summary

The Data Science topic collection has helped to develop interdisciplinary links between computer science, statistics, mathematics, and the information and intelligence sciences, and it has fostered cross-domain interaction between academia and industry in data science and big data analytics. The Encyclopedia of Data Science topic collection welcomes contributions on, but not limited to, the following topics of interest:

Data science and analytical methods;
The machine learning foundations of data science;
Infrastructures, tools and systems focusing on data processing and analytics;
Real-world data science applications and case studies;
Learning from data with domain knowledge;
Emerging data science applications;
Human-centric data science;
Data science for the next digital frontier (telecommunications and 5G, predictive maintenance, sustainability and the environment, etc.);
Systems for practical applications of data science, data analytics and applied machine learning, demonstrating real-world impact;
Solutions or advances towards understanding the issues related to deploying data science technologies/solutions in the real world;
Processes and methodologies related to data science.

Please click here to find Guidelines for Submissions.

Editors
Kamran Munir

Institution: Computer Science and Creative Technologies, University of the West of England, Bristol, UK

Interests: data science; big data and analytics; artificial intelligence

José Raúl Romero

Institution: Department of Computer Science and Numerical Analysis, School of Engineering Sciences, University of Cordoba, Cordoba, Spain

Interests: data science; search-based software engineering; intelligent systems; model-driven engineering

Khalid Hafeez

Institution: Leicester Castle Business School, De Montfort University, Leicester LE1 9BH, UK

Interests: digital transformation; knowledge management; strategic management; supply chain management; Industry 4.0

Entries
Topic Review Peer Reviewed
Optimisation of Small-Scale Aquaponics Systems Using Artificial Intelligence and the IoT: Current Status, Challenges, and Opportunities
Environmental change, water scarcity, soil depletion, and urbanisation are making it harder to produce food using traditional methods in many regions and countries. Aquaponics is emerging as a sustainable alternative: a closed-loop system that produces fish and plants together, independent of soil and external environmental factors. It uses fish waste to fertilise plants and can save up to 90–95% of water. Although aquaponics is an innovative and promising way of growing food, it has its challenges: it is a complex ecosystem that requires multidisciplinary knowledge, careful monitoring of all crucial parameters, and high maintenance and initial investment costs. Artificial intelligence (AI) and the Internet of Things (IoT) are key technologies that can help overcome these challenges. Numerous recent studies focus on using AI and the IoT to automate the process, improve efficiency and reliability, provide better management, and reduce operating costs. However, these studies often address limited aspects of the system, each considering different domains and parameters of the aquaponics system. This paper aims to consolidate the existing work, identify the state-of-the-art use of the IoT and AI, explore the key parameters affecting growth, analyse the sensing and communication technologies employed, highlight the research gaps in this field, and suggest future research directions. Based on the reviewed research, energy efficiency and economic viability were found to be major bottlenecks of current systems. Moreover, inconsistent sensor selection, a lack of publicly available data, and poor reproducibility of existing work were common issues among the studies.
  • 19 Feb 2024
Topic Review Peer Reviewed
Large Language Models and Logical Reasoning
In deep learning, large language models are typically trained on corpus data taken as representative of current knowledge. However, natural language is not an ideal form for the reliable communication of concepts. Formal logical statements are preferable, since they support verifiability, reliability, and applicability. Another reason for this preference is that natural language was not designed for an efficient and reliable flow of information and knowledge; it instead arose as an evolutionary adaptation shaped by a prior set of natural constraints. As a formally structured language, logical statements are also more interpretable. A statement may be informally expressed in natural language, but its formalized logical counterpart is expected to follow a stricter set of rules, such as the use of symbols for the logic-based operators that connect multiple simple statements into verifiable propositions.
  • 31 May 2023
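The contrast this entry draws between informal natural-language claims and verifiable propositions can be sketched with a small, hypothetical example (not taken from the entry itself): a claim of the form "if P and Q, then R" rendered as the formal proposition (P ∧ Q) → R, whose truth can be checked mechanically over every assignment.

```python
from itertools import product

# Material implication: "a implies b" is false only when a holds and b does not.
def implies(a: bool, b: bool) -> bool:
    return (not a) or b

# The hypothetical formalized statement (P AND Q) -> R.
def proposition(p: bool, q: bool, r: bool) -> bool:
    return implies(p and q, r)

# Unlike an informal claim, the proposition is verifiable: enumerate all
# truth assignments and collect the ones that falsify it.
failures = [(p, q, r) for p, q, r in product([True, False], repeat=3)
            if not proposition(p, q, r)]
print(failures)  # [(True, True, False)]
```

The enumeration confirms the stricter rules the entry describes: the statement fails in exactly one assignment (P and Q true, R false), so its truth conditions are fully determined by the operator symbols rather than by interpretation.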
Topic Review Peer Reviewed
Tokenization in the Theory of Knowledge
Tokenization is a procedure for recovering the elements of interest in a sequence of data. This term is commonly used to describe an initial step in the processing of programming languages, and also for the preparation of input data in the case of artificial neural networks; however, it is a generalizable concept that applies to reducing a complex form to its basic elements, whether in the context of computer science or in natural processes. In this entry, the general concept of a token and its attributes are defined, along with its role in different contexts, such as deep learning methods. Included here are suggestions for further theoretical and empirical analysis of tokenization, particularly regarding its use in deep learning, as it is a rate-limiting step and a possible bottleneck when the results do not meet expectations.
  • 11 Apr 2023
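As a hypothetical illustration of the general concept this entry describes (the regular expression and function name are assumptions, not taken from the entry), a minimal tokenizer that reduces a character sequence to its basic elements of interest, here words and punctuation, might look like:

```python
import re

def tokenize(text: str) -> list[str]:
    # \w+ recovers word tokens; [^\w\s] recovers single punctuation marks.
    return re.findall(r"\w+|[^\w\s]", text)

tokens = tokenize("Tokenization recovers elements, one by one.")
print(tokens)
# ['Tokenization', 'recovers', 'elements', ',', 'one', 'by', 'one', '.']
```

In deep learning practice, the elements would typically be subword units learned from data rather than fixed by a pattern, which is one reason the entry flags tokenization as a rate-limiting step and possible bottleneck.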