The 10th International Congress on Information and Communication Technology, held concurrently with the ICT Excellence Awards (ICICT 2025), will take place in London, United Kingdom, February 18-21, 2025.
Authors - Reena (Mahapatra) Lenka, Rajiv Divekar, Jaya Chitranshi Abstract - A performance measurement analysis system is proposed to overcome the problems higher education institutions face in improving their performance. The system satisfies students' requirements with respect to the institute. It also meets the needs of faculty, who can track student details, attendance, and marks, upload assignments, form a clear picture of each student, and identify how performance can be improved. It further satisfies the administration's need to maintain student records in line with the institute's requirements. Implemented in the higher education sector, this system would substantially improve an institute's performance and strengthen its brand name.
Authors - Jaya Chitranshi, Rajiv Divekar, Reena (Mahapatra) Lenka Abstract - ‘Performance management’ should be the focus of any institution of ‘higher education’ to achieve its sublime objective of educating, training and steering mature minds. The purpose of transforming students into high-performance individuals can be achieved completely when there are certain checkpoints and steps in the process of ‘performance management’. The paper attempts to identify existing gaps in the literature with respect to ‘performance management’ in ‘higher education’. This research paper proposes a nine-step model of ‘performance management’ for increasing the performance of students in ‘higher education’. These steps are (i) Goal-setting, where targets should be set for students for one term; (ii) Coaching and guiding, done to help students achieve the goals set; (iii) Performance measurement, done to assess performance with respect to the goals set; (iv) Mentoring, done to help students explore their strengths, potential, and opportunities for improving performance with respect to the set goals; (v) Counselling, done to help students identify areas where they still lag or where their potential is still unused, so as to improve performance with respect to the specific goals set; (vi) Performance measurement, done to assess performance again; (vii) Performance aggregate for the goal(s), measured with respect to each specific goal and then combined across all goals; (viii) Reward/advisory, decided based on the aggregate of performance; (ix) New goal(s)/revising and recalibrating goals: based on the reward/advisory and the aggregate of performance, new goals can be set, or existing goals can be revised and recalibrated for the next term.
It is essential that ‘continuous feedback’ be provided to students in ‘higher education’ institutions, so that students get a clear direction for improving their performance.
Authors - Jaya Chitranshi, Rajiv Divekar, Reena (Mahapatra) Lenka Abstract - One of the main objectives of higher education is to help students improve their academic performance. This objective can only be accomplished when students consistently receive tailored feedback on how to improve their performance levels. This research paper focuses on the process and model of performance feedback communication in higher education. The process flow of performance feedback communication illustrates the inputs received from the student, faculty, feedback type, login, and model user. On the basis of these inputs, feedback reports (student report, faculty report, feedback-type report and model-user report) can be created and user login details can be checked. The step-wise model provides continuous performance feedback to students in higher education through four main steps: Step 1) Monthly feedback collection (360 degree); Step 2) Matching with expectations; Step 3) Continuous feedback communication, comprising 3(A) positive feedback, 3(B) constructive feedback, and 3(C) a supplement, active listening; and Step 4), comprising 4.A(i) the positive-feedback script, 4.A(ii) the mode of communicating positive feedback (in public), 4.B(i) the constructive-feedback script, and 4.B(ii) the mode of communicating constructive feedback (in private).
Authors - Eng Bah Tee, Insu Song Abstract - Mobile educational games have arisen as a fascinating tool for teaching difficult concepts in an interactive and engaging manner. A mobile educational game uses a game type such as puzzle, strategy, or role-playing to drive the delivery of the learning content. Currently the game type is selected by the game designer or programmer. Previous research has identified the match between game type and lesson content as a critical area requiring further study: at the moment, a teacher or game designer cannot be sure which game type would best teach a lesson on, say, Geography or Mathematics. In fact, earlier work found that game type does have a significant impact on learning outcome and experience. To address this research gap, we embarked on Stage 2 of our study using artificial intelligence (AI): a machine learning model is employed to predict the evaluation score of the game type used to teach a subject lesson and to recommend the best game type for teaching that lesson. In Stage 3 we evaluated the performance of the AI model by creating a test set of twenty games; twenty undergraduates were recruited at an Indonesian university to evaluate them. The average score of all mobile game evaluations is above 3.5, supporting hypothesis H1 set out for Stage 3.
Authors - Robert Kudelic Abstract - The world of scientific publishing is changing; the days of the old subscription-based earnings model for publishers seem over, and we are entering a new era. An ever-increasing number of journals from disparate publishers are going Gold, Open Access that is, yet have we rigorously ascertained the issue in its entirety, or are we touting the strengths while forgetting constructive criticism and a careful weighing of evidence? We therefore present the current state of the art, in a compact review/bibliometric style, of this highly relevant topic, including challenges and potential solutions that are most likely to be acceptable to all parties. The suggested solution, as per the performed analysis, at least for the time being, is an inclusive publishing environment in which multiple publishing models compete for a piece of the pie and thus inhibit each other's flaws. The analysis also shows an apparent link between trends in scientific publishing and tumultuous world events, which has special significance for the publishing environment on the current world stage, implying that academic publishing may now have found itself at a tipping point of change.
Authors - Jaya Chitranshi, Rajiv Divekar, Reena (Mahapatra) Lenka Abstract - To satisfy its high purpose of shaping, developing, and directing adult minds, an institution of higher learning should have ‘performance management’ as its top priority. When there are clear phases and benchmarks in the ‘performance management’ process, the aim of transforming students into high-achieving individuals can be adequately achieved. The paper's objective is to identify any research gaps in ‘performance management’ in ‘higher education’ and make recommendations for future courses using Bibliometric Analysis.
Authors - Tajim Md. Niamat Ullah Akhund, Kenbu Teramoto Abstract - The demand for efficient human activity recognition systems has surged recently, driven by the need for intelligent monitoring in environments such as smart homes and workplaces. This paper introduces a novel one-dimensional modeling approach for measuring human activeness using a single Passive Infrared (PIR) sensor, highlighting its simplicity, cost-effectiveness, and privacy-conscious design, and incorporating the Laplace distribution to analyze movement patterns. We define an activeness index μ, quantifying average human activity over time and allowing precise numerical assessment. Our method uses the sensor's capabilities to gather data on human movement and generate numerical metrics of average activeness over time. The results demonstrate that this approach effectively captures human activity levels while minimizing equipment complexity. This work contributes to the growing field of human activity recognition by offering a practical solution that balances performance with user privacy and affordability.
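The abstract does not give the exact definition of the activeness index μ or of the Laplace fit, so the following is only an illustrative sketch: μ is taken here as the time-averaged PIR trigger rate, and the Laplace model is fitted to inter-event gaps by its standard maximum-likelihood estimators (median for location, mean absolute deviation for scale). The window length is an assumption.

```python
import numpy as np

def activeness_index(events, window_s=60.0):
    """Illustrative activeness index from PIR trigger timestamps.

    `events` is a 1-D array of trigger times in seconds. Triggers are
    counted per fixed window and the time-averaged rate is returned.
    """
    events = np.asarray(events, dtype=float)
    duration = events.max() - events.min()
    n_windows = max(int(np.ceil(duration / window_s)), 1)
    counts, _ = np.histogram(events, bins=n_windows)
    return counts.mean() / window_s  # average triggers per second

def laplace_fit(deltas):
    """MLE of a Laplace(location, scale) model for inter-event gaps:
    location = median, scale b = mean absolute deviation from it."""
    deltas = np.asarray(deltas, dtype=float)
    loc = np.median(deltas)
    b = np.abs(deltas - loc).mean()
    return loc, b
```

A heavier-tailed fitted scale b would correspond to burstier movement patterns; a flat trigger stream gives b = 0.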
Authors - Hasti Vakani, Mithil Mistry, Hardikkumar Jayswal, Nilesh Dubey, Nitika Sharma, Rohan Patel, Dipika Damodar Abstract - Obesity has become a significant global health concern due to its association with various non-communicable diseases. Traditional methods for obesity assessment, such as BMI, often fail to capture the complexity of the condition, highlighting the need for more accurate predictive tools. This research utilizes machine learning algorithms, including Random Forest, Gradient Boosting, Support Vector Machines, and Neural Networks, in a stacking ensemble model to predict obesity levels. Using datasets from diverse populations, the model achieved a high accuracy of 96.69%. Key features such as BMI, age, and dietary habits were identified as critical predictors through Recursive Feature Elimination. The findings demonstrate the potential of advanced data-driven techniques to provide personalized insights into obesity management and underscore the transformative role of machine learning in public health initiatives.
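A stacking ensemble of the four base learners named in the abstract can be sketched with scikit-learn as below. This is not the authors' implementation: their features, hyperparameters, and dataset are not given in the abstract, so synthetic data and default settings stand in.

```python
# Illustrative stacking ensemble: RF, GB, SVM and a neural network as
# base learners, with logistic regression as the meta-learner.
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Synthetic stand-in for the obesity dataset.
X, y = make_classification(n_samples=600, n_features=12,
                           n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("gb", GradientBoostingClassifier(random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
        ("nn", MLPClassifier(max_iter=500, random_state=0)),
    ],
    final_estimator=LogisticRegression(),  # combines base predictions
)
stack.fit(X_tr, y_tr)
print(round(stack.score(X_te, y_te), 3))
```

The meta-learner sees cross-validated predictions of the base models, which is what lets stacking outperform any single base learner when their errors are partly uncorrelated.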
Authors - Kutub Thakur, Md Liakat Ali, Suzanna Schmeelk, Joan Debello, Denise Dragos Abstract - The escalating prevalence of obesity in young adults has become a pressing public health concern, requiring innovative risk prediction and intervention approaches. This paper examines the potential of combining traditional lifestyle factors with social media behavior to predict obesity risk in young adults while addressing ethical considerations related to data privacy and informed consent. By identifying the most predictive social media metrics associated with obesity risk, this research offers novel insights that could inform targeted prevention strategies. Through a mixed-methods approach, the study examines the associations between social media behavior, traditional lifestyle factors, and obesity risk while ensuring adherence to ethical guidelines and protecting individual privacy. The findings highlight the importance of integrating social media metrics into risk prediction models, offering new avenues for intervention and prevention efforts. This research provides a deeper understanding of the complex interplay between social media behavior, lifestyle factors, and obesity risk, emphasizing the need for multidisciplinary approaches to tackle this growing public health challenge.
Authors - Alisher Ikramov, Shakhnoza Mukhtarova, Raisa Trigulova, Dilnoza Alimova, Dilafruz Akhmedova Abstract - Hospital readmissions pose a significant burden on healthcare systems, especially for patients with type 2 diabetes mellitus (T2DM) and cardiovascular diseases. Early readmission risk prediction is crucial for improving patient outcomes and reducing costs. In this study, we develop a predictive model based on accessible clinical features to estimate the risk of future hospitalizations. Using data from 260 patients at the Republican Specialized Scientific and Practical Medical Center for Cardiology in Uzbekistan, we trained a Generalized Linear Model that achieved a ROC AUC of 0.898 on the test set.
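A minimal sketch of the modelling setup described above: a Generalized Linear Model (here, logistic regression) scored by ROC AUC on a held-out set. The feature structure and cohort below are synthetic stand-ins; the study's actual clinical features are not listed in the abstract.

```python
# Readmission-risk style GLM sketch: fit logistic regression, report
# ROC AUC on a held-out test set.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 260                                   # matching the cohort size above
X = rng.normal(size=(n, 5))               # stand-in clinical features
logit = X @ np.array([1.2, -0.8, 0.5, 0.0, 0.3])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
model = LogisticRegression().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(round(auc, 3))
```

ROC AUC is the natural metric here because readmission cohorts are usually imbalanced, and AUC is insensitive to the chosen decision threshold.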
Authors - Lucas V. Santos, Vitor B. Souza Abstract - Fog computing emerges as an innovative solution for edge data processing, proving to be particularly important in the context of the Internet of Things (IoT) by delivering low latency and high bandwidth at the cost of requiring a stable connection. One application that has greatly benefited from this concept is the use of Unmanned Aerial Vehicles (UAVs), also known as drones, for various applications requiring real-time communication between these devices and, potentially, a base station. This paper focuses on the use of UAVs, highlighting the connectivity challenges posed by the limitations of wireless communication technologies, such as Wi-Fi. To address these challenges, we propose a model based on deep reinforcement learning (DDQN), which helps drones make decisions on the best route between the origin and destination, balancing the minimization of travel time and the maximization of connectivity throughout the journey. Using a simulated environment where drones are trained to avoid disconnection areas, we found that the proposed model significantly improves connection stability in areas with limited coverage, albeit with an unavoidable increase in route distance. Comparisons with traditional routing methods demonstrate the advantages of our model.
Authors - Sacrificio Sithole Junior, Mohammad Gulam Lorgat Abstract - The increase in the number of university students has resulted in long queues and delays in services, both during orientation events and in resolving general queries. A service chatbot is an artificial intelligence tool designed to interact with users, answering frequently asked questions and assisting in solving problems in an automated and efficient manner. This study presents the development of a chatbot prototype for the Faculty of Engineering administrative office in Chimoio, at the Universidade Católica de Moçambique (UCM), aiming to optimise service delivery, reduce waiting times, and increase efficiency in resolving common issues. Using a mixed-method approach, the study involved direct observation and questionnaires administered to students to identify the main problems with traditional service. The chatbot's development was carried out in two phases: the first involved data collection and the identification of needs, while the second covered the implementation of the prototype. This chatbot can provide a viable and effective solution to the challenges faced, delivering faster and more efficient service, while freeing up human resources for more complex tasks.
Authors - Bella Holub, Viktor Kyrychenko, Dmytro Nikolaienko, Maryna Lendiel, Dmytro Shevchenko, Andrii Khomenko Abstract - The article discusses the informational and algorithmic support for an atmospheric air quality monitoring system. It describes the system's architecture and individual components, along with a logical data model and two approaches to calculating the air quality index. Research on the use of caching methods, pre-aggregation, and sorting is presented to improve the efficiency of processing large volumes of data (Big Data).
Authors - Nguyen Ngoc Tu, Phan Duy Hung, Vu Thu Diep Abstract - In today's Industry 4.0 era, information technology has penetrated every industry, making work easier, faster and helping businesses operate more effectively. The ultimate measure of a business's success is customer satisfaction and loyalty. This work aims to enhance customer care by automating the processing of customer feedback through the development of an automatic classification system using deep learning techniques, specifically the Long Short-Term Memory model. The system will automatically classify customer problems, thereby improving service quality and enhancing the company's image. The study used customer feedback data from our company's customer care system, including 41,886 comments from Vietnamese customers. The study proposes to use the LSTM model to process text data and solve the problem of imbalanced data to improve the accuracy and efficiency of the classification system. Test results of the models show that the highest accuracy is about 80%. The study also recommends improving data labeling and testing more advanced natural language processing techniques to achieve better performance in the future.
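The abstract mentions solving the problem of imbalanced feedback data. One standard remedy, shown here as a sketch rather than the authors' actual method, is to weight each class inversely to its frequency during training; the category names below are invented for illustration.

```python
# Inverse-frequency class weights for imbalanced text classification.
from collections import Counter

def inverse_frequency_weights(labels):
    """Return {class: weight} with weight = n_samples / (n_classes * count),
    so rare classes contribute more to the training loss."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * cnt) for c, cnt in counts.items()}

# Hypothetical feedback categories with an 80/15/5 imbalance.
labels = ["billing"] * 80 + ["outage"] * 15 + ["other"] * 5
weights = inverse_frequency_weights(labels)
print(weights)
```

A dictionary of this shape can typically be passed to a deep learning framework's class-weight argument (e.g. Keras `class_weight` in `fit`) so that an LSTM classifier does not simply learn to predict the majority category.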
Authors - Pham Hong Duong, Phan Duy Hung, Vu Thu Diep Abstract - Text classification is a popular problem with various applications in natural language processing (NLP). One of its core tasks is assigning labels or tags to units of text data such as sentences, paragraphs, and documents by exploring the relations between words or even characters. Many applications derive from text classification, namely sentiment analysis, topic classification, spam detection, document classification, and so on. The main object of analysis is text data, which can come from various sources such as newspapers, documents, or the text messages people exchange daily. As one of the most important forms of communication, text is an extremely rich source of data; however, due to its unstructured nature and high dependence on context, extracting insights from it can be challenging and time-consuming. This study focuses on exploring the data and building a classification model for some gaming-application test sets. We approach the problem with basic text-analysis methods and perform text classification by applying a deep learning method, the Convolutional Neural Network model. The dataset is collected from handwritten test sets for various in-game content written by Quality Assurance Engineers. The main label to be classified is the priority of the test cases in a test set (four priority levels, from highest to lowest); this priority is ultimately used to decide which test cases fall into the regression test set. Finally, we analyze the performance of the deep learning model against the evaluation metrics, compare it with a self-built traditional machine learning model using Logistic Regression, and test against real test-case input. From this, we expect to learn how to improve the deep learning model, and we discuss possible future directions.
Authors - Makhabbat Bakyt, Khuralay Moldamurat, Luigi La Spada, Sabyrzhan Atanov, Zhanserik Kadirbek, Farabi Yermekov Abstract - This paper presents a geographic information system for monitoring and forecasting the spread of forest fires based on intelligent processing of aerospace data from low-orbit vehicles (LOA). The system uses convolutional neural networks (CNN) for fire detection and recurrent neural networks (RNN) for fire spread forecasting. To ensure the security of high-speed data transmission from LOA, a quantum key distribution (QKD) system is implemented, providing virtually unbreakable encryption. Experimental results demonstrate a 30% improvement in fire detection efficiency compared to traditional methods. The paper also discusses the potential costs of implementing QKD and AI, as well as the steps required for practical implementation of QKD on a large scale, taking into account factors such as the influence of the atmosphere on quantum key distribution.
Authors - Hiep. L. Thi Abstract - This paper investigates robust control strategies for managing unmanned aerial vehicles (UAVs) and other systems in emergency situations. We explore the challenges associated with maintaining stability and performance under unforeseen and critical conditions, present current approaches to robust control, and propose new methodologies to enhance system resilience. The paper also discusses practical applications and future research directions in this vital area of control systems engineering.
Authors - Fisiwe Hlophe, Sara Saartjie Grobbelaar Abstract - By adhering to a systematic design approach informed by scientific and engineering principles, Advanced Frugal Innovations yield products that optimize resource utilization, enhancing environmental sustainability and achieving significant cost savings. Following the Joanna Briggs Institute (JBI) framework, this article presents a scoping review that explores the landscape of AFIs in agriculture in developing countries. The Bibliometrix software package was used to facilitate the analysis of the bibliometric data included in this study. This study discovered that AFIs are based on advanced engineering techniques facilitated by research and development and rigorous design. This allows them to be suitable for mass production and have a wide range of novelty. The significant cost savings allow AFIs to be competitive in all markets, not exclusive to lower-income markets. This study discovered that factors such as a suitable innovation ecosystem, user-centered design, availability of highly skilled labor, and technology development enable the generation and development of AFIs. In contrast, skills shortage, lack of cohesion, funding issues, regulatory issues, and market access are some of the hindrances to the development of AFIs. We propose a research agenda for a better understanding of the requirements for setting up innovation ecosystems in the agricultural context that will drive the development and wide adoption of AFIs.
Authors - Karen McCall, Bevi Chagnon Abstract - The advent of the Internet and digital content has underscored the need to ensure equal access to data tables for individuals with disabilities, particularly those who are blind. However, the conventional 'one size fits all’ solutions, akin to Alt Text, have proven inadequate in the face of the growing volume of complex digital data tables. This paper presents research findings that could significantly enhance the accessibility of complex data tables for all users. Past and current research typically focuses on two areas of digital table access: HTML and PDF, and simple rather than complex data tables [1] [2] [3] [4]. For those using screen readers, basic information about a data table is provided with two options. It is either a “uniform” or simple data table or a “non-uniform” complex data table, which can have potential accessibility barriers such as merged or split cells. This paper provides insight and the results of original research in the form of an end-user survey on the multifaceted accessibility barriers of complex data tables. The research highlights that any solution designed for those using screen readers can benefit everyone — regardless of their abilities — in understanding complex data tables. This inclusive approach not only underscores the value of our work to all users, making them feel included and valued, but also holds the promise of a more accessible digital future across all digital technologies and media formats and for all users.
Authors - Luz Norma Caisal, Mocha-Bonilla Julio A. Abstract - The digital era, together with Learning and Knowledge Technologies (LKT), has radically transformed the forms and methods of teaching and learning: LKT are tools that have evolved digital teaching towards the creation of personalized, meaningful learning experiences. One application context is the teaching of Physical Education, an area that offers a wide variety of strategies in the teaching-learning process and into which various technological tools can be incorporated for both teaching and practice. We worked with a group of 84 third-year students of the Unified General Baccalaureate, aged ±16 years. The instrument was a structured questionnaire with polytomous questions distributed in three sections, and data processing and analysis were carried out using the IBM SPSS Statistics version 24 package. The results of the first section show that students feel satisfied or very satisfied when practicing physical education. In the second section, 89% of the students, the vast majority, reported that they used, applied, and improved their learning thanks to learning and knowledge technologies in the physical education teaching process. Finally, the third section highlights the most-used technological tools: Genially, Google Meet, Kahoot, the Moodle platform, Prezi, and Socrative. It is concluded that the application of Kinovea in physical education processes is essential for improving human movement.
Authors - Robert, Tubagus Maulana Kusuma, Hustinawati, Sariffudin Madenda Abstract - The formation of a good dataset is a decisive step in the success of a facial expression recognition/classification system. This paper proposes 24 scenarios for forming facial expression datasets involving the Viola-Jones face detection algorithm, YCbCr and HSV color space conversion, and the Local Binary Pattern (LBP) and Local Monotonic Pattern (LMP) feature extraction algorithms. The results of the 24 dataset scenarios were then grouped into five dataset categories used for training and testing two machine learning classification models, namely Support Vector Machine (SVM) and Convolutional Neural Network (CNN). The SVM classification model is designed using four different kernels: radial basis function, linear, sigmoid, and polynomial. The CNN classification model uses the MobileNetV2 architecture. Across the five categories, the best accuracy of 83.04% was achieved by the SVM classifier using the sigmoid kernel and a combined dataset of LBP and LMP features extracted from only the facial area returned by the Viola-Jones face detection algorithm. For the CNN classifier, the best accuracy of 82.14% was obtained using the Y-grayscale dataset, which also focuses on the facial area only but without the feature extraction step. The best-accuracy results for the two classifiers show that the face detection stage plays an important role in a facial expression recognition/classification system, and that the LBP and LMP algorithms are suitable for feature extraction when forming datasets for the SVM classification model.
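The LBP feature extraction mentioned above can be sketched for the basic 8-neighbour, radius-1 case as follows. This is a generic textbook formulation, not the paper's exact implementation: each interior pixel is compared with its 8 neighbours, the comparisons form an 8-bit code, and the histogram of codes serves as the feature vector fed to a classifier such as an SVM.

```python
import numpy as np

def lbp_codes(img):
    """Return the LBP code (0..255) for each interior pixel of a 2-D array."""
    c = img[1:-1, 1:-1]
    # Neighbours clockwise from top-left; each contributes one bit.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros(c.shape, dtype=np.int32)
    h, w = img.shape
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes += (nb >= c).astype(np.int32) * (1 << bit)
    return codes

def lbp_histogram(img):
    """256-bin normalised histogram of LBP codes, usable as an SVM feature vector."""
    hist = np.bincount(lbp_codes(img).ravel(), minlength=256)
    return hist / hist.sum()
```

Because the code depends only on sign comparisons against the centre pixel, the descriptor is invariant to monotonic illumination changes, which is one reason LBP remains popular for facial analysis.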
Authors - Toufik Mechouma, Ismail Biskri, Serge Robert Abstract - This paper introduces Syntax-Constraint-Aware BERT, a novel variant of BERT designed to inject syntactic knowledge into the attention mechanism using augmented Lagrange multipliers. The model employs syntactic dependencies as a form of ground truth to supervise the learning of word representations, ensuring that syntactic structure influences the model's word representations. Augmented Lagrangian optimisation enables the imposition of constraints on the attention mechanism, facilitating the learning of syntactic relationships. The approach augments the standard BERT architecture by modifying the prediction layer: the objective is to predict an adjacency matrix that encodes the words' syntactic relationships in place of the masked tokens. Our experiments demonstrate that injecting syntactic knowledge improves on BERT in training time as well as on AG News text classification as a downstream task. By combining the flexibility of deep learning with structured linguistic knowledge, we introduce a merge of bottom-up and top-down approaches. Furthermore, Syntax-Constraint-Aware BERT enhances the interpretability and performance of Transformer-based models.
Authors - Anda Batraga, Tatjana Volkova, Jelena Salkovska, Liene Kaibe, Didzis Rutitis, Eduards Aksjonenko, Marta Kontina Abstract - As AI develops, it is becoming increasingly important in digital marketing processes. AI has become an essential part of the digital marketing world, enabling businesses to reach their customers faster and to improve their business operations by automating some of the simplest tasks. Through technology transfer, AI brings significant improvements offering new opportunities and creative approaches to achieving the goals of a digital marketing strategy. The aim of this study is to investigate and analyse the possibilities of using AI in digital marketing strategy in order to draw conclusions and make proposals on the possibilities of improving digital marketing strategy in Latvian companies using AI. The results show that the transfer of AI technology can provide companies with several advantages. The need for a well-thought-out technology transfer is emphasised by the experts in order to make the technology work and help the company achieve its goals.
Authors - Lerato Mashiloane, Khutso Lebea Abstract - Since the beginning of the internet, there has been a continuous effort to secure and encrypt data transmitted through web browsers. VPN stands for "Virtual Private Network", the capability of creating a secure network connection while using public networks. Commercial and defence organisations have adopted virtual private networks because they offer secure connectivity at reduced cost. This paper discusses what VPNs are, their importance, and the mechanics behind them, to give users an understanding of how they provide their highest level of security. The paper further examines the factors to consider when choosing a VPN and the balance between security and performance.
Authors - J R Harshavardhan, Anjan Kumar K N, Prasanna Kumar M Abstract - Chronic Kidney Disease (CKD) is a pressing global health concern, where early diagnosis and effective management are vital to prevent progression to end-stage renal failure. This review paper analyzes advancements in the prediction and classification of CKD and related kidney disorders through machine learning (ML) techniques. It explores a spectrum of methodologies, ranging from traditional statistical models to advanced deep learning approaches, assessing their effectiveness in enhancing diagnostic accuracy. A key contribution of this work is the proposal of a novel methodology and block diagram for integrating diverse data sources, including patient demographics, clinical measurements, and medical images, to improve predictive outcomes. The proposed system leverages Convolutional Neural Networks (CNNs) for image analysis and employs ensemble methods for feature integration, aiming to optimize predictive performance. The review also addresses significant limitations, such as data quality and feature selection challenges, while emphasizing the advantages of early detection and personalized treatment through advanced ML models. By identifying research gaps and suggesting future directions, this paper aims to foster the development of more effective algorithms and real-time monitoring systems for CKD and kidney disorder management, ultimately contributing to improved patient outcomes.
Authors - Philane Tshabalala, Rangith B. Kuriakose Abstract - The Fourth Industrial Revolution has had a significant and far-reaching impact on the manufacturing industry. A substantial transformation has taken place within the manufacturing industry, with a notable shift from the conventional approach of mass production to a more bespoke model driven by the global market's demand for enhanced product diversity. This requires the redesign of assembly lines to enable the production of multiple product variants, thereby increasing their complexity. In order to effectively manage the increased complexity and avoid potential bottlenecks caused by longer cycle times, it is essential to implement a virtual system capable of real-time monitoring and fault detection. The current methods for reducing cycle time are deficient in their lack of utilization of real-time data inputs. This article presents a case study of a water bottling plant that employs a mixed-model stochastic assembly line. Two virtual systems, a digital shadow and a digital twin, were developed using MATLAB and SIMULINK as potential solutions. The two systems processed the identical input data in order to calculate cycle times. The results of the study indicate that the application of real-time data and digital twins can lead to a significant reduction in cycle times in a mixed-model assembly line, with an average improvement of 19% in comparison to the digital shadow.
Authors - Hiep L. Thi Abstract - This paper explores the critical issue of stability in Unmanned Aerial Vehicle (UAV) control systems, particularly under varying environmental conditions and mission requirements. We discuss current challenges, including adaptive control, autonomous missions, urban navigation, and sensor integration. The paper also highlights recent advances in ensuring robust stability and outlines future research directions for improving UAV performance in complex and dynamic environments.
Authors - Abdul Wahab Samad, Noerlina AnggIvanraeni, Khairul Ismed, Ivan Lilin Suryono, Zahera Mega Utama Abstract - Cooperatives are a vital part of Indonesia's economy, and their growth and development have changed several times over the country's history. One of the cornerstones of any functional economy is the cooperative. Much headway was made towards assisting farmers during the New Order through the formation of Village Unit Cooperatives, often called Koperasi Unit Desa (KUD). At present, however, these cooperatives are encountering roadblocks and challenges in their development. To weigh the pros and cons of cooperatives, it is necessary to set up a cooperative framework that considers the cooperative movement and backs regulatory standards; building this structure is a prerequisite to achieving this goal. Embracing this paradigm ensures that cooperative goods will be available and adequately supported in the future. The quantitative research methodology used for this examination was Smart Partial Least Squares (Smart PLS) analysis. An investigation was conducted to assess the scope of the opportunities and constraints that the cooperative market faces in its pursuit of integration into the Indonesian economy. The data gathered by the academic community at the Institute of Business and Informatics in 1957 show that the p-value is less than 0.05 and the association coefficient is higher than 0.7, indicating a substantial relationship. This opens up many possibilities for the development and expansion of cooperatives in Indonesia, which may form the bedrock of the country's future economic success.
Authors - Paula Cristina De Almeida Marques, Paulo Alexandre Teixeira Faria de Oliveira Abstract - Global crises such as COVID-19 further underscore the need for crisis-resilience capabilities in a continuously evolving healthcare environment. This paper explores the impact of intelligent systems on crisis resilience in healthcare, framing the analysis with the Balanced Scorecard (BSC). When a hospital implements AI and ML technologies, it can dramatically enhance crisis surveillance, reduce the time needed for escalation predictions, and facilitate timely interventions accompanied by quick reactions to unexpected events. Based on multiple case studies, the literature review suggests that intelligent systems can greatly assist in resource optimization, operational efficiency improvement, and crisis decision-making. The BSC analyses these systems through its four perspectives: financial, customer, internal processes, and learning and growth. The case evidence indicates that integrating intelligent systems contributes substantially to operational efficiency and clinical effectiveness in times of crisis. The study also highlights the importance of support from upper management and of the ongoing tailoring of smart systems to align with the strategic goals of healthcare organizations. Overall, this study argues that combining intelligent systems in healthcare with the Balanced Scorecard would provide a rapid-response health system able to respond appropriately to forthcoming crises. For policymakers and healthcare managers, the findings offer incentives for the strategic integration of these technologies to support crisis-management capabilities, as well as broader benefits for human health.
Authors - Asmae CHAKIR, Mohamed TABAA Abstract - Worldwide, the transportation and residential sectors are among the most energy-intensive, and their consumption continues to increase, especially with steady urban development. Consequently, electricity consumption will rise in these sectors. To satisfy this demand, electrical production must be increased in an environmentally friendly way, which calls for non-conventional, renewable generation. To overcome intermittency, the concept of hybridizing complementary sources has been introduced. In this context, we consider a small-scale PV-Wind-Battery hybrid system supplying a house already connected to the grid, addressing the issue of increasing consumption in the residential sector. Regarding the transportation sector, a strategy to switch to electric means of transportation has been initiated as well; accordingly, the system is hybridized with an electric vehicle used by the building's inhabitants. In this paper, we propose to manage the energy of this system on three sides: source, storage, and load. This combination allows optimization of the energy produced by the renewable system and management of energy-storage preference depending on the home's energy states. Moreover, the load-side management system minimizes the energy drawn from the electricity utility during periods of energy deficit by restricting consumption to critical loads, especially with the presence of the mobile battery of the electric vehicle via the vehicle-to-home and home-to-vehicle strategy.
Authors - Etian Ngobeni, Sara Grobbelaar, Christopher Mejia-Arguata Abstract - Infrastructure maturity models largely guide an organization towards adopting advanced technologies. However, knowledge of how such models can be developed for wholesale food markets is still lacking. This study fills the gap by using a pragmatic approach and the application of design science research methodology to develop a roadmap for building a maturity model for infrastructure in wholesale food markets. This paper proposes a comprehensive three-phase framework for developing a maturity model.
Authors - Senthil Kumar Subramani Anandan, Lorenzo Garbagna, Lakshmi Babu Saheer, Mahdi Maktar Dar Oghaz Abstract - Air quality monitoring systems have become an important part of urban areas due to recent attempts to monitor pollution levels to tackle problems such as climate change and population health risks. In recent years, research has been conducted on the utilisation of low-cost pollution-concentration sensors to improve and expand current air monitoring systems, as well as on creating mobile systems that could be deployed in different scenarios. However, the spread of Internet of Things (IoT) devices for monitoring systems has brought the need for calibration between the many different devices that may operate within the same network. This project explores the utilisation of Machine Learning and Deep Learning models to calibrate custom and Aeroqual sensors for PM2.5 and PM10 monitoring against an existing network run by the city council in Cambridge, UK. For PM2.5, collection with the custom sensor provided the highest accuracy when calibrated to the council sensor: a Keras Regressor achieved an RMSE of 1.6240 and R2 of 0.8831, while with the data from Aeroqual a GRU Regressor achieved an RMSE of 1.9263 and R2 of 0.4867. On the other hand, collection with Aeroqual on PM10 concentration levels achieved an RMSE of 2.2087 and R2 of 0.6428 utilising an RNN Regressor, while an MLP with Attention achieved a lower accuracy, with an RMSE of 4.9582 and R2 of 0.3297.
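The calibration idea behind these results can be sketched with a minimal example: fit a mapping from a low-cost sensor's readings to a co-located reference sensor's readings, then score it with RMSE and R2. The sketch below uses a plain least-squares linear fit on made-up values; it is not the Keras/GRU models or the data from the study.

```python
# Minimal calibration sketch: fit y = a*x + b mapping a low-cost sensor's
# PM2.5 readings (x) to a reference sensor's readings (y), then score the
# calibrated values with RMSE and R^2. All values are illustrative.
import math

def fit_linear(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    b = my - a * mx
    return a, b

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def r2(y_true, y_pred):
    my = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - my) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

sensor = [10.0, 12.5, 15.0, 20.0, 25.0]    # hypothetical low-cost readings
reference = [9.0, 11.8, 14.1, 19.2, 24.5]  # hypothetical reference readings
a, b = fit_linear(sensor, reference)
calibrated = [a * x + b for x in sensor]
print(round(rmse(reference, calibrated), 3), round(r2(reference, calibrated), 3))
```

The study's deep models replace the linear map with learned nonlinear ones, but the evaluation loop (calibrate, then compute RMSE and R2 against the council sensor) is the same.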
Authors - Ferlyn P. Calanda Abstract - The main goal of this project was to develop a Poultry Farm Monitoring and Control System using a web application platform. This system was designed to assist farmers by providing real-time data on temperature, humidity, ammonia levels and the overall environmental conditions within the poultry houses. As a result, farmers were able to access this information and make informed decisions to maintain animal welfare and productivity. The study employed a combination of descriptive and developmental research methods. A total of thirty (30) respondents including farmers, agriculturists, veterinarians, and faculty members from the agriculture department, took part in the study. The number of respondents was based on the suggestion of Jakob Nielsen [2012], which states that for quantitative studies, usability tests can be deployed on at least twenty (20) users to get statistically significant numbers. These respondents were able to remotely monitor the data and use it to inform decision-making processes.
Authors - J. Coetzer, R B Kuriakose, H J Vermaak Abstract - As manufacturing and business sectors adopted Industry 4.0, the Fifth Industrial Revolution (Industry 5.0) emerged. Unlike its predecessor, Industry 5.0 extends its focus beyond economic growth and job creation, recognizing the manufacturing sector's potential to support broader societal goals. The continuous technological advancements and system improvements of Industry 5.0 have sparked a new area of research: enhancing human-machine interaction in commercial and industrial manufacturing environments by fostering better collaboration between humans and machines. There have been limited studies on how to establish a collaborative decision-making (CDM) process that takes into account the worker's recognition of, and ability to adapt to, this development. The aim of the paper is to explore whether existing protocols for Human-Machine Collaboration (HMC) are present in the manufacturing sector. If such protocols do not exist, the paper seeks to develop a universal protocol suitable for implementation in an Industry 5.0 context. An entirely mechanized water bottling plant serves as a case study to examine the effects of HMC. The study aims to create a protocol that supports CDM within an Industry 5.0 environment. To support this goal, a single-case experiment has been conducted to test the theory that HMC will lead to optimal production time of an automated system in an Industry 5.0 context. The paper details the background that motivated the research and the methodology used, showcases the steps taken in creating a protocol for CDM, and concludes with an investigation of preliminary results, which show an average reduction in production time of up to 24%.
Authors - Vusumzi Funda, Bingwen Yan Abstract - Knowledge is a strategic asset and a critical source of competitive advantage for organisations. Consequently, organisations employ various knowledge management (KM) enablers to acquire, store, secure, retrieve, share and utilize knowledge, all of which are crucial for enhancing organizational performance. Information and Communication Technologies (ICTs) play a pivotal role in facilitating these processes. This study aimed to evaluate the effectiveness of ICT usage in KM within the context of South Africa, with a specific focus on identifying barriers to ICT utilization. A quantitative research approach was adopted, using surveys. The findings revealed that the selected university lacked a comprehensive guideline on ICT usage, which hindered effective KM. The study concluded that while KM is essential at the University, significant efforts are needed to improve its practices. Additionally, a comparative methodology was proposed to analyse disparities in ICT utilization across different institutions. This study contributes valuable insights into KM and offers practical implications for policy review, potentially influencing management and other stakeholders to initiate necessary reforms.
Authors - Maria Sahakyan, Meri Badalyan, Lusine Karapetya Abstract - The article studies the essence and characteristics of discrimination in the IT workplace. The field of information technology is one of the priority areas for the development of the economy of the Republic of Armenia. The area is developing rapidly, and the average salary in IT companies is higher than the average salary in other spheres in Armenia. On the one hand, we still face the stereotype that a successful IT professional is a man; on the other hand, women in Armenia are starting to play an increasingly important role in coding, product development, web design, and other IT areas. The average share of women employed in IT worldwide does not exceed 20%, even though the tech world aspires to achieve gender balance and diversity. According to 2022 data, more than 43% of employees in the IT sector in Armenia are women, which is quite high by global standards. Still, women in the IT sector earn on average about 1.5 times less than men. Despite the efforts of the various bodies engaged in diminishing discrimination in the workplace, this remains a serious issue.
Authors - Welekazi Ntloko, Sara S. (Saartjie) Grobbelaar Abstract - Social franchising (SF) is a business model in which a successful social enterprise is replicated in multiple areas, often by providing franchisees with training, support, and resources. It aims to help social entrepreneurs impact a larger number of people with their services by scaling their operations while maintaining their standards of excellence and consistency, and it is used to scale social business models (SBMs) in new locations, allowing them to expand their impact. This article analyses and reviews the literature surrounding social franchising. Preliminary results reveal a substantial focus on healthcare in social franchising research, with limited multidisciplinary studies. Challenges include the limited legal frameworks in many jurisdictions, impacting stakeholder certainty. The study aims to contribute insights into the evolving landscape of social franchising, emphasizing the intersection with SBMs and HO for sustainable and impactful outcomes, with potential implications for sustainable economic and social development.
Authors - Kuhlula Mathebula, Noluntu Mpekoa, Khutso Lebea Abstract - This research aims to assess the suitability of a multi-factor authentication (MFA) scheme for protecting a university's Wi-Fi network from threat actors. Given the vulnerabilities of current single-factor authentication methods, which often rely on usernames and passwords, implementing MFA is proposed as a more secure alternative. MFA enhances security by requiring users to pass through multiple authentication mechanisms, such as knowledge-based, possession-based, and biometric methods, making unauthorised access significantly more difficult. The research seeks to determine the most effective combination of authentication factors for a university environment. The research findings may have broader implications for securing educational institutions' networks.
Authors - Amr Abu Alhaj, Omar Safwat, Youssef Ghoneim, Imran Zualkernan, Ali Reza Sajun Abstract - This paper examines the use of pre-trained models like Bidirectional Encoder Representations from Transformers (BERT) and A Robustly Optimized BERT Pretraining Approach (RoBERTa) to create reliable models for detecting fake news in media articles. Traditional Machine Learning (ML) methods frequently have difficulty in accurately identifying the nuances of misinformation due to extensive feature-engineering dependencies. The latest advancements in Large Language Models (LLMs) such as BERT and RoBERTa have fundamentally transformed misinformation detection by providing deep context. The research utilizes the LIAR dataset, containing 12.8k manually labeled statements from PolitiFact.com, along with associated metadata and speaker credit scores. The approach combines BERT/RoBERTa embeddings with complementary architectures for binary classification, introducing a credit-score calculation reflecting speakers' historical truthfulness. Notably, the BERT-BiLSTM-CNN-FC and RoBERTa-BiLSTM-CNN-FC configurations achieved state-of-the-art F1-scores of 0.76 and 0.74, respectively.
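The credit-score idea can be illustrated with a small sketch. LIAR-style metadata includes counts of a speaker's past ratings; the weights below are an illustrative assumption, not the paper's actual formula.

```python
# Hedged sketch of a speaker credit score from LIAR-style history counts.
# Each past rating contributes a truthfulness weight in [0, 1]; the exact
# weights here are assumptions for illustration only.
WEIGHTS = {
    "pants_on_fire": 0.0,
    "false": 0.2,
    "barely_true": 0.4,
    "half_true": 0.6,
    "mostly_true": 0.8,
}

def credit_score(history):
    """history: dict mapping past rating label -> count for one speaker."""
    total = sum(history.values())
    if total == 0:
        return 0.5  # neutral prior for unseen speakers (an assumption)
    return sum(WEIGHTS[label] * n for label, n in history.items()) / total

print(round(credit_score({"mostly_true": 8, "half_true": 2}), 2))  # → 0.76
```

A score like this can then be concatenated with the BERT/RoBERTa embedding as an extra input feature to the downstream classifier.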
Authors - Ana Martinez-Gamez, Heberto Ferreira-Medina, Bernardo Lopez-Sosa, Sayra Orozco, Mario Morales-Maximo, Carlos A. Garcia, Michel Rivero Abstract - This project aims to develop a methodology for predicting solar radiation in San Francisco Pichátaro, a community in the municipality of Tingambato, Michoacán, Mexico. This community lies within the Purépecha indigenous zone. The project utilized two databases: one from a solarimetric station in the area and the other from the Solcast platform, which provides access to solar irradiance and other pertinent meteorological variables. Rigorous data cleansing and analysis procedures were implemented to ensure data quality and compatibility. Subsequently, both linear and decision tree regression models were applied to the refined and prepared data to forecast solar radiation.
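As a minimal illustration of the decision-tree family of models mentioned, the sketch below fits a one-level regression tree (a stump) to made-up hour-of-day/irradiance pairs. The feature, values, and single-split criterion are assumptions for illustration, not the study's data or models.

```python
# One-level decision-tree (stump) regression: pick the threshold on x that
# minimizes the squared error of the two leaf means. Data are made up.
def fit_stump(x, y):
    best = None
    xs = sorted(set(x))
    for i in range(1, len(xs)):
        t = (xs[i - 1] + xs[i]) / 2
        left = [yi for xi, yi in zip(x, y) if xi <= t]
        right = [yi for xi, yi in zip(x, y) if xi > t]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = sum((yi - ml) ** 2 for yi in left) + \
              sum((yi - mr) ** 2 for yi in right)
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda xi: ml if xi <= t else mr

hour = [8, 10, 12, 14]           # hour of day (hypothetical feature)
ghi = [100, 300, 500, 700]       # irradiance in W/m^2 (made-up morning ramp)
predict = fit_stump(hour, ghi)
print(predict(9), predict(13))   # → 200.0 600.0
```

A full decision-tree regressor applies this split search recursively; the linear model from the abstract would instead fit a single line through the same pairs.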
Authors - Haryadi Sarjono, Safina Alya Zahira, Ine Silviya, Boyke Setiawan Soertin Abstract - This study aims to identify the office layout that best suits Gen Z workers' preferences and enhances productivity and work quality. A qualitative method with a descriptive approach was employed, focusing on Gen Z employees in the Information and Technology Division. Among the 38 employees in this division, ten are Gen Z, and eight of them participated in the study through a questionnaire and partial interviews to delve deeper into their responses. The questionnaire covered six different office layout types and assessed their impact on work productivity and efficiency. Gen Z employees in the Information and Technology Division favored new layouts, particularly the Relax Corner, Desk Facing Outside Window, Mini Bar, and WFO Feel Like WFC. They prefer cozy, flexible office spaces with diverse work environments. The findings suggest that these new office layouts can enhance productivity and work efficiency for Gen Z employees. However, some participants noted that their productivity and efficiency were more influenced by factors like their colleagues and teamwork rather than the office layout itself.
Authors - Rolph Abraham YAO, Ferdinand Tonguim GUINKO Abstract - Software-defined networking (SDN) is a growing concept that allows the separation of the control layer from the data layer, making the network programmable, and having a centralized view and management of the network. The control layer is an important component of the network because it is composed of controllers that play a role in supervising and controlling the entire SDN network. For efficient traffic management in SDN, it is essential to have a high-performance controller. In this paper, the performance of the Floodlight, ONOS, OpenDaylight (ODL) and Ryu controllers is analyzed. A custom network topology is created with Mininet, and the ping and iperf tools are used to evaluate the four controllers on bandwidth utilization, jitter, packet transmission rate, round-trip time (RTT), and throughput. Our analysis reveals that ONOS has the best performance in terms of jitter, bandwidth utilization, and throughput, Floodlight performs better in terms of round-trip time (RTT), and ODL provides better performance in terms of transmission rate.
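Metrics like those compared here are derived from raw tool output; the sketch below computes mean RTT and a simple jitter estimate from hypothetical ping samples. Jitter is taken as the mean absolute difference between consecutive RTTs, a common simplification rather than any specific tool's definition.

```python
# Derive mean RTT and a simple jitter estimate from a list of RTT samples
# (milliseconds), such as those parsed from `ping` output. Values are
# illustrative, not measurements from the study.
def rtt_stats(samples):
    mean_rtt = sum(samples) / len(samples)
    # Jitter as the mean absolute difference between consecutive RTTs.
    jitter = sum(abs(b - a) for a, b in zip(samples, samples[1:])) / \
             (len(samples) - 1)
    return mean_rtt, jitter

rtts = [1.2, 1.5, 1.1, 1.8, 1.3]  # hypothetical RTT samples in ms
mean_rtt, jitter = rtt_stats(rtts)
print(round(mean_rtt, 2), round(jitter, 3))  # → 1.38 0.475
```

Running the same computation over each controller's samples gives directly comparable per-controller RTT and jitter figures.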
Authors - Swayamjit Saha, Garga Chatterjee, Kuntal Ghosh Abstract - Visualizing data plays a pivotal role in conveying important scientific information: visualization techniques display graphical interpretations that would otherwise stay hidden in the raw structure of the data. In this paper, we explore COVID-19 pandemic trends in India, examining how far the infection rate spiked in 2021 and how the country's public health authorities helped curb the spread of the novel virus by installing vaccination centers and administering vaccine doses to the population across the country. The paper contributes an empirical study of the impact of the virus on the country through extensive exploratory data analysis of data collected from MoHFW, India. Our work supports the understanding that data visualization is central to understanding public health problems and beyond, and to taking the measures necessary to curb an ongoing pandemic.
Authors - Louay Al Nuaimy, Hazem Migdady, Mahammad Mastan Abstract - Accurate time series forecasting is vital in areas such as finance, weather prediction, and energy management. Traditional forecasting methods often struggle to effectively model the intricate patterns and nonlinearities present in real-world data. This study proposes the feedback-matching neural network (FMNN), a deep learning model that evolves from the feedback-matching algorithm (FMA). By embedding the core concepts of FMA into a neural network structure, the FMNN can recognize and match historical patterns in time series data, leading to more accurate predictions. Extensive experiments reveal that the FMNN outperforms several conventional statistical models and modern neural networks in terms of forecasting accuracy, as evaluated by the weighted absolute percentage error (WAPE). The FMNN enhances prediction accuracy by offering a sophisticated method for identifying and leveraging repeating trends within the data.
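The evaluation metric named in this abstract, WAPE, has a compact definition that can be stated directly: the sum of absolute forecast errors divided by the sum of absolute actuals. The values below are illustrative, not from the study.

```python
# Weighted absolute percentage error (WAPE), the metric used to evaluate
# the FMNN: sum of absolute errors over sum of absolute actuals.
def wape(actual, forecast):
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / \
           sum(abs(a) for a in actual)

actual = [100, 120, 80, 90]    # hypothetical observed series
forecast = [110, 115, 85, 88]  # hypothetical model forecasts
print(round(wape(actual, forecast), 4))  # → 0.0564
```

Unlike MAPE, WAPE weights errors by the magnitude of the actuals, so it stays well-behaved when individual actual values are near zero.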
Authors - Alexandre Evain, Redouane Khemmar, Mathieu Orzalesi, Sofiane Ahmedali Abstract - This paper presents MYv7 (Mono-YOLOv7), an adaptation of the YOLOv7 architecture tailored specifically for 3D monocular object detection. Rather than competing with specialized 3D methods, we demonstrate the efficacy of enhancing 3D monocular detection using improved 2D object detection algorithms. We showcase how improvements in 2D algorithms can enhance 3D predictions, presenting MYv7’s twofold advantage over a YOLOv5-based method: increased speed and accuracy. These gains are crucial for efficient operation on embedded systems with limited computational resources. Our results highlight the potential of using advancements in 2D detection methods to significantly improve 3D monocular object recognition, opening new avenues for real-world applications.
Authors - Ali Belgacem, Abbas BRADAI Abstract - This survey paper provides a comprehensive overview of Vehicle-to-Everything (V2X) communications, including various communication types and the roles of base stations. It covers resource allocation techniques and beamforming for high-quality connectivity and addresses energy-efficiency optimization metrics. The paper also discusses artificial intelligence methods and their integration to optimize these systems and enhance performance. This research serves as a valuable guide for those aiming to contribute to advancements in 6G technologies for efficient vehicular communications.
Authors - Iaroslav Omelianenko Abstract - Neuroevolution algorithms need to evaluate at the end of each epoch the fitness scores of each organism in a population of solvers within the problem space where a solution is sought. This evaluation often involves running complex environmental simulations, which can significantly slow down the training speed if done sequentially. This work proposes a solution that utilizes the inherent capabilities of the Go programming language to run complex simulations in local parallel processes (routines). The efficiency of this proposed solution is compared to sequential evaluation using two classic reinforcement learning experiments, specifically single and double pole balancing. Direct comparisons indicate that the proposed solution is up to five times faster than the sequential approach when complex environmental simulations are required for objective function evaluation.
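The parallel-evaluation pattern this abstract describes can be sketched in Python (the paper itself uses Go routines); the sketch below is an analogous stand-in, with a toy fitness function in place of a real environment simulation.

```python
# Evaluate a population's fitness in parallel rather than sequentially.
# `evaluate` is a stand-in for a costly environment simulation (e.g., pole
# balancing); the population values are made up for illustration.
from concurrent.futures import ThreadPoolExecutor

def evaluate(genome):
    # Toy fitness: sum of squared genes (a placeholder, not a real simulator).
    return sum(g * g for g in genome)

population = [[0.1 * i, 0.2 * i] for i in range(8)]

# Sequential baseline.
seq_scores = [evaluate(g) for g in population]

# Parallel evaluation; executor.map preserves population order.
with ThreadPoolExecutor(max_workers=4) as pool:
    par_scores = list(pool.map(evaluate, population))

print(par_scores == seq_scores)  # → True
```

In Go, each evaluation would run in its own goroutine with results collected over a channel; in Python, CPU-bound simulations would need processes rather than threads to escape the GIL, so the thread pool here only keeps the sketch self-contained.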
Authors - Kayode Oyetade, Anneke Harmse, Tranos Zuva Abstract - The introduction of AI in education has the potential to address educational inequalities and improve outcomes, but it also raises concerns about cultural responsiveness and biases in AI systems. To ensure equitable outcomes, strategies are needed to address these concerns. However, there is a limited understanding of effective approaches for promoting cultural sensitivity and equity in AI-powered educational content, highlighting a significant gap in existing literature. Using literature review methodology, this study explores strategies to enhance cultural sensitivity and mitigate biases in AI-powered educational content, focusing on the intersection of technology and cultural diversity. By addressing concerns related to bias in AI algorithms, our findings highlight the importance of cultural inclusivity in AI-driven educational tools and advocate for proactive measures to embed cultural responsiveness into AI development processes. This review contributes to the discussion on responsibly integrating AI in education, promoting educational environments that value and reflect diverse cultural identities, and fostering a more inclusive educational experience globally.
Authors - Sulafa Badi, Salam Khoury, Kholoud Yasin, Khalid Al Marri Abstract - This study investigates consumer attitudes toward Mobility as a Service (MaaS) in the context of the UAE's diverse population, focusing on the factors influencing adoption intentions. A survey of 744 participants was conducted to assess public perceptions, employing hierarchical and non-hierarchical clustering methods to identify distinct consumer segments. The analysis reveals five clusters characterised by varying demographics, travel lifestyles, and attitudes towards MaaS, highlighting the influence of UTAUT2 variables, including performance expectancy, social influence, hedonic motivation, price value, and perceived risk. Among the clusters, ‘Enthusiastic Adopters’ and ‘Convenience-Driven Adopters’ emerge as key segments with a strong reliance on public transport and a willingness to adopt innovative transportation solutions. The findings indicate a shared recognition of the potential benefits of MaaS despite differing opinions on its implementation. This research contributes to the theoretical understanding of MaaS adoption by offering an analytical typology relevant to a developing economy while also providing practical insights for policymakers and transport providers. By tailoring services to meet the unique needs of various consumer segments, stakeholders can enhance the integration of MaaS technologies into the UAE's transportation system. Future research should explore the dynamic nature of public sentiment regarding MaaS to inform ongoing development and implementation efforts.
Authors - Rakhi Bharadwaj, Priyanshi Patle, Bhagyesh Pawar, Nikita Pawar, Kunal Pehere Abstract - The detection of forged signatures is a critical challenge in various fields, including banking, legal documentation, and identity verification. Traditional methods for signature verification rely on handcrafted features and machine learning models, which often struggle to generalize across varying handwriting styles and sophisticated forgeries. In recent years, deep learning techniques have emerged as powerful tools for tackling this problem, leveraging large datasets and automated feature extraction to enhance accuracy. In this literature survey paper, we have studied and analyzed various research papers on fake signature detection, focusing on the accuracy of different deep learning techniques. The primary models reviewed include Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), and Generative Adversarial Networks (GANs). We evaluated the performance of these methods based on their reported accuracy on benchmark datasets, highlighting the strengths and limitations of each approach. Additionally, we discussed challenges such as dataset scarcity and the difficulty of generalizing models to detect different types of forgeries. Our analysis provides insights into the effectiveness of these methods and suggests potential directions for future research in improving signature verification systems.
Authors - Shilpa Bhairanatti, Rubini P Abstract - While the rollout of 5G cellular networks will extend into the next decade, there is already significant interest in the technologies that will form the foundation of its successor, 6G. Although 5G is expected to revolutionize our lives and communication methods, it falls short of fully supporting the Internet-of-Everything (IoE). The IoE envisions a scenario where over a million devices per cubic kilometer, both on the ground and in the air, demand ubiquitous, reliable, and low-latency connectivity. 6G and future technologies aim to create ubiquitous wireless connectivity for the entire communication system, accommodating the rapidly increasing number of intelligent devices and the growing communication demand. These objectives can be achieved by incorporating THz-band communication and wider spectrum resources with minimized communication error. However, this communication technology faces several challenges, such as energy efficiency, resource allocation, and latency, which need to be addressed to improve overall communication performance. To overcome these issues, we present a roadmap for Point-to-Point (P2P) and Point-to-Multipoint (P2MP) communication in which a channel-coding mechanism is introduced, taking the Turbo channel-coding scheme as the base approach. Furthermore, deep learning-based training is applied to improve the error-correcting performance of the system. The performance of the proposed model is measured in terms of BER for varied SNR levels and additive white noise channel scenarios; experimental analysis shows that the proposed coding approach outperforms existing error-correcting schemes.
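The BER-versus-noise style of evaluation described here can be illustrated with a toy experiment: BPSK symbols over an additive white Gaussian noise channel, comparing uncoded transmission against a 3x repetition code with majority-vote decoding. This is a stand-in for illustration only, not the Turbo or deep-learning scheme studied.

```python
# Toy BER measurement: BPSK over an additive white Gaussian noise channel,
# comparing uncoded bits with a 3x repetition code (majority vote). This is
# an illustrative stand-in, not the paper's Turbo/deep-learning scheme.
import random

def ber(num_bits, noise_sigma, repeat=1, seed=42):
    rng = random.Random(seed)
    errors = 0
    for _ in range(num_bits):
        bit = rng.randint(0, 1)
        symbol = 1.0 if bit else -1.0
        # Count positive decisions over `repeat` noisy copies of the symbol.
        votes = sum(1 if symbol + rng.gauss(0, noise_sigma) > 0 else 0
                    for _ in range(repeat))
        decoded = 1 if votes * 2 > repeat else 0
        errors += decoded != bit
    return errors / num_bits

uncoded = ber(5000, noise_sigma=1.0)
coded = ber(5000, noise_sigma=1.0, repeat=3)
print(uncoded, coded)  # repetition coding should lower the error rate
```

Sweeping `noise_sigma` (i.e., the SNR) and plotting both curves reproduces the BER-versus-SNR comparison that stronger codes, such as Turbo codes, improve on further.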
Authors - Hiep L. Thi Abstract - This paper highlights the increasing role of UAVs in various sectors, the challenges related to data storage on UAVs, and proposed solutions for improving both the efficiency and security of data management, and notes the scope of the study, its methodologies, and key findings.
Authors - Gareth Gericke, Rangith B. Kuriakose, Herman J. Vermaak Abstract - Communication architectures are demonstrating their significance in the development landscape of the Fourth Industrial Revolution. Nonetheless, the progress of architectural development lags behind that of the Fourth Industrial Revolution itself, resulting in subpar implementations and research gaps. This paper examines the prerequisites of Smart Manufacturing and proposes the utilization of a novel communication architecture to delineate a pivotal element, information appropriateness, showcasing its efficient application in this domain. Information appropriateness leverages pertinent information within the communication flow at the machine level, facilitating real-time monitoring, decision-making, and control over production metrics. The metrics scrutinized herein include production efficiency, bottleneck mitigation, and network intelligence, while accommodating architectural scalability. These metrics are communicated and computed at the machine level to assess the efficacy of a communication architecture at this level, while also investigating its synergistic relationship across other manufacturing tiers. Results of this ongoing study offer insights into data computation and management at the machine level and demonstrate an effective approach for handling pertinent information at critical junctures. Furthermore, the adoption of a communication architecture helps minimize information redundancy and overhead in both transmission and storage for machine-level communication.
Authors - Y. Abdelghafur, Y. Kaddoura, S. Shapsough, I. Zualkernan, E. Kochmar Abstract - Early reading comprehension is crucial for academic success, involving skills like making inferences and critical analysis, and the Early Grade Reading Assessment (EGRA) toolkit is a global standard for assessing these abilities. However, creating stories that meet EGRA's standards is time-consuming and labour-intensive and requires expertise to ensure readability, narrative coherence, and educational value. In addition, creating these stories in Arabic is challenging due to the limited availability of high-quality resources and the language's complex morphology, syntax, and diglossia. This research examines the use of large language models (LLMs), such as GPT-4 and Jais, to automate Arabic story generation, ensuring readability, narrative coherence, and cultural relevance. Evaluations using Arabic readability formulas (OSMAN and SAMER) show that LLMs, particularly Jais and GPT, can effectively produce high-quality, age-appropriate stories, offering a scalable solution to support educators and enhance the availability of Arabic reading materials for comprehension assessment.
Authors - Md Fahim Afridi Ani, Abdullah Al Hasib, Munima Haque Abstract - This research explores the possibility of improving insect farming by integrating Artificial Intelligence (AI), unravelling the complicated relationship between butterflies and the plants they pollinate to reconsider the way species are classified and to inform farming practices for butterflies. Traditional methods of butterfly classification are morphologically and behaviorally intensive, and thus time-consuming to conduct and subject to a high level of subjective interpretation. We therefore apply our approach to ecological interactions between butterfly species and their associated plants for efficient, data-driven solutions. The work also focuses on applying AI to get the most out of butterfly farming by determining where each species will be best located. The system classifies and manages butterflies with much greater ease, saving the time and energy usually spent on conventional classification methods and passing those savings on to the farmer or industrial client. The research deepens the understanding of insect-plant relationships for better forecasting of butterfly behavior and, therefore, healthier ecosystems through optimized pollination and habitat balance. For this purpose, a dataset of butterfly species and related plants was developed, on which machine learning models were applied, including decision trees, random forests, and neural networks. It turned out that the neural network outperformed the others with an accuracy of 93%. Beyond classification, the system helps identify habitats that provide the best possible conditions for rearing butterflies. Applying AI in this field simplifies the work of butterfly farming, making it an important tool for improving growth and the conservation of biodiversity. Integrating machine learning into ecological research and industry provides scalable, time-efficient solutions for species classification toward the sustainable farming of butterflies.
Authors - Zachary Matthew Alabastro, Stephen Daeniel Mansueto, Joseph Benjamin Ilagan Abstract - Product innovation is critical in strategizing business decisions in highly competitive markets. For product enhancements, the entrepreneur must gather data from a target demographic through research. One solution involves qualitative customer feedback. The study proposes the viability of artificial intelligence (AI) as a co-pilot model for simulating synthetic customer feedback with agentic systems. Prompting ChatGPT-4o with its homo silicus attribute can generate feedback on certain business contexts. Results show that large language models (LLMs) can generate qualitative insights to be utilized in product innovation, producing human-like responses through few-shot techniques and Chain-of-Thought (CoT) prompting. Data were validated with a Python script, using cosine similarity to quantify the correspondence between synthetic and actual customer feedback. This model can be essential in reducing the total resources needed for product evaluation through preliminary analysis, which can support a sustainable competitive advantage.
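The validation step described above, cosine similarity between synthetic and actual feedback, can be sketched with a simple bag-of-words variant; the paper's actual script and text representation are not specified here, and the example sentences are invented:

```python
from collections import Counter
import math

def cosine_similarity(text_a, text_b):
    """Cosine similarity between two texts using bag-of-words counts."""
    va, vb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = math.sqrt(sum(c * c for c in va.values())) * \
           math.sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

# Hypothetical pair: one real review, one LLM-generated review.
actual = "the checkout flow is confusing and the app crashes often"
synthetic = "the checkout flow feels confusing and the app is slow"
print(round(cosine_similarity(actual, synthetic), 3))
```

In practice one would compute this over embedding vectors rather than raw word counts, but the comparison logic is identical.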
Authors - Jeehaan Algaraady, Mohammad Mahyoob Albuhairy Abstract - Sarcasm, a sentiment often used to express disdain, is the focus of our comprehensive research. We aim to explore the effectiveness of various machine learning and deep learning models, such as Support Vector Machines (SVM), Recurrent Neural Networks (RNN), Bidirectional Long Short-Term Memory (BiLSTM), and fine-tuned pre-trained transformer-based (BERT) models, for detecting sarcasm using the News Headlines dataset. Our framework investigates the impact of DistilBERT text embeddings on enhancing the accuracy of the DL models (RNN and BiLSTM) for training and classification. To assess the proposed models, the authors utilized four performance metrics: F1 score, recall, precision, and accuracy. The outcomes revealed that incorporating the BERT model achieves outstanding performance and outperforms the other models for sarcasm classification, with a state-of-the-art F1 score of 98%. The F1 scores for SVM, BiLSTM, and RNN are 93%, 95.05%, and 95.52%, respectively. Our experiment on the News Headlines dataset demonstrates that incorporating DistilBERT to produce the word vectors notably improves the accuracy of the RNN and BiLSTM models. The accuracy of the BiLSTM and RNN models when incorporating TF-IDF, Word2Vec, and GloVe embeddings was 93.9% and 93.8%, respectively; these scores increased to 95.05% and 95.52% when the models incorporated DistilBERT for text embedding. This improvement can be attributed to the capability of DistilBERT to capture contextual information and semantic relationships between words, thereby enriching the word vector representation.
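The four performance metrics named in the abstract (accuracy, precision, recall, F1) all derive from the binary confusion matrix. As a quick generic reference, independent of the paper's models and data (the labels below are invented):

```python
def classification_metrics(y_true, y_pred):
    """Accuracy, precision, recall, and F1 for binary labels (1 = sarcastic)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

# Toy ground truth vs. predictions for eight headlines.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
print(classification_metrics(y_true, y_pred))
```

F1 is the harmonic mean of precision and recall, which is why it is preferred over raw accuracy when classes are imbalanced.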
Authors - Lois Abigail To, Zachary Matthew Alabastro, Joseph Benjamin Ilagan Abstract - Customer development (CD) is a Lean Startup (LS) methodology for startups to validate their business hypotheses and refine their business model based on customer feedback. This paper proposes designing a large language model-based multi-agent system (LLM MAS) to enhance the customer development process by simulating customer feedback. Using LLMs’ natural language understanding (NLU) and synthetic multi-agent capabilities, startups can conduct faster validation while obtaining preliminary insights that may help refine their business model before engaging with the real market. The study presents a model in which the LLM MAS simulates customer discovery interactions between a startup founder and potential customers, together with design considerations to ensure real-world accuracy and alignment with CD. If carefully designed and implemented, the model may serve as a useful co-pilot that accelerates the customer development process.
Authors - Prince Kelvin Owusu, George Oppong Ampong, Joseph Akwetey Djossou, Gibson Afriyie Owusu, Thomas Henaku, Bless Ababio, Jean Michel Koffel Abstract - In today's dynamic digital landscape, understanding customer opinions and sentiments has become paramount for businesses striving to maintain competitiveness and foster customer loyalty. However, the banking sector in Ghana faces challenges in effectively harnessing innovative technologies to grasp and respond to customer sentiments. This study aims to address this gap by investigating the application of ChatGPT technology within Ghanaian banks to augment customer service and refine sentiment analysis in real time. Employing a mixed-method approach, the study interviewed 40 representatives, including IT specialists, data analysts, and customer service managers, from four banks in Ghana. Additionally, 160 customers, 40 from each bank, participated in a survey. The findings revealed a significant misalignment between customer expectations and current service provisions. To bridge this gap, the integration of ChatGPT technology is proposed, offering enhanced sentiment analysis capabilities. This approach holds promise for elevating customer satisfaction and fostering loyalty within Ghana's competitive banking landscape.
Authors - Japheth Otieno Ondiek, Kennedy Ogada, Tobias Mwalili Abstract - This experiment models the implementation of distance metrics and three-way decisions for K-Nearest Neighbor (KNN) classification. As a machine learning method, KNN has inherent classification deficits owing to its high computational cost, sensitivity to outliers, and the curse of dimensionality. Many researchers have found that combining algorithmic methods can yield better results in prediction and forecasting. In this experiment, we used the strengths of the Euclidean distance metric for computing query distances to nearest neighbors, together with weighted three-way decisions, to model a highly adaptable and accurate KNN classification technique. The implementation follows an experimental design method to verify that the computed Euclidean distances and weighted three-way decision classification achieve better computational efficiency and predictability. Our experimental results reveal that distance metrics significantly affect the performance of the KNN classifier through the choice of K-values. We found that the K-value on the applied datasets tolerates noise levels to a certain degree, while some distance metrics are less affected by noise. The experiment primarily focused on the finding that the best K-value from the distance-metric measure underpins three-way KNN classification accuracy and performance. The combination of the best distance metric and a three-way decision model for the KNN classification algorithm showed improved performance compared with other conventional algorithm set-ups, making it more suitable for classification in the context of this experiment. It outperforms KNN, ANN, DT, NB, and SVM on the crop yield datasets used in the experiment.
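The combination described above, Euclidean distance-weighted voting plus a three-way (accept / reject / defer) decision, can be sketched generically as follows. The thresholds `alpha` and `beta` and the toy training points are illustrative assumptions, not values from the paper:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length coordinate tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def three_way_knn(train, query, k=3, alpha=0.7, beta=0.3):
    """Three-way KNN: accept the positive class if the distance-weighted
    vote share exceeds alpha, reject it below beta, otherwise defer."""
    neighbors = sorted(train, key=lambda p: euclidean(p[0], query))[:k]
    weights = [(1 / (euclidean(x, query) + 1e-9), label) for x, label in neighbors]
    total = sum(w for w, _ in weights)
    pos_share = sum(w for w, label in weights if label == 1) / total
    if pos_share >= alpha:
        return "positive"
    if pos_share <= beta:
        return "negative"
    return "defer"

# Two tiny clusters: label 0 near the origin, label 1 near (1, 1).
train = [((0.0, 0.0), 0), ((0.1, 0.2), 0), ((1.0, 1.0), 1), ((0.9, 1.1), 1)]
print(three_way_knn(train, (1.0, 1.05)))       # near the positive cluster
print(three_way_knn(train, (0.05, 0.1)))       # near the negative cluster
print(three_way_knn(train, (0.5, 0.55), k=4))  # boundary region
```

The "defer" branch is what distinguishes three-way decisions from ordinary KNN: ambiguous queries are deferred rather than forced into a class.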
Authors - Indika Udagedara, Brian Helenbrook, Aaron Luttman Abstract - This paper presents a reduced order modeling (ROM) approach for radiation source identification and localization using data from a limited number of sensors. The proposed ROM method comprises two primary steps: offline and online. In the offline phase, a spatial-energetic basis representing the radiation field for various source compositions and positions is constructed. This is achieved using a stochastic approach based on principal component analysis and maximum likelihood estimation. The online step then leverages these basis functions to determine the complete radiation field from limited data collected from only a few detectors. The parameters are estimated using Bayes' rule with a Gaussian prior. The effectiveness of the ROM approach is demonstrated on a simplified model problem using noisy data from a limited number of sensors. The impact of noise on the model's performance is analyzed, providing insights into its robustness. Furthermore, the approach was extended to real-world radiation detection scenarios, demonstrating that these techniques can be used to localize and identify the energy spectra of mixed radiation sources, composed of several individual sources, from noisy sensor data collected at limited locations.
Authors - Shima Pilehvari, Wei Peng, Yasser Morgan, Mohammad Ali Sahraian, Sharareh Eskandarieh Abstract - Overfitting is a common problem during model training, particularly for binary medical datasets with class imbalance. This research specifically addresses this issue in predicting Multiple Sclerosis (MS) progression, with the primary goal of improving model accuracy and reliability. By investigating various data resampling techniques, ensemble methods, feature extraction, and model regularization, the study thoroughly evaluates the effectiveness of these strategies in enhancing stability and performance for highly imbalanced datasets. Compared to prior studies, this research advances existing approaches by integrating Kernel Principal Component Analysis (KPCA), moderate under-sampling, the Synthetic Minority Oversampling Technique (SMOTE), and post-processing techniques, including Youden's J statistic and manual threshold adjustments. This comprehensive strategy significantly reduced overfitting while improving the generalization of models, particularly the Multilayer Perceptron (MLP), which achieved an Area Under the Curve (AUC) of 0.98, outperforming previous models in similar applications. These findings establish important best practices for developing robust prognostic models for MS progression and underscore the importance of tailored solutions in complex medical prediction tasks.
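One of the post-processing steps named above, decision-threshold selection via Youden's J statistic, can be illustrated with a small generic sketch. The scores and labels below are toy values, not the study's MS data:

```python
def youdens_j_threshold(scores, labels):
    """Pick the decision threshold that maximises Youden's J statistic
    (sensitivity + specificity - 1) over the observed score values."""
    best_t, best_j = 0.5, -1.0
    for t in sorted(set(scores)):
        tp = sum(s >= t and y == 1 for s, y in zip(scores, labels))
        fn = sum(s < t and y == 1 for s, y in zip(scores, labels))
        tn = sum(s < t and y == 0 for s, y in zip(scores, labels))
        fp = sum(s >= t and y == 0 for s, y in zip(scores, labels))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Imbalanced toy scores: the minority positives cluster at high probabilities,
# so the J-optimal threshold sits well above the default 0.5.
scores = [0.05, 0.1, 0.2, 0.3, 0.35, 0.4, 0.8, 0.9]
labels = [0, 0, 0, 0, 0, 0, 1, 1]
print(youdens_j_threshold(scores, labels))
```

This is why manual threshold adjustment matters for imbalanced data: the default 0.5 cutoff is rarely the point that balances sensitivity and specificity.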
Authors - Ain Nadhira Mohd Taib, Fauziah Zainuddin, M. Rahmah Abstract - This paper presents AdaptiCare4U, an interactive mental health assessment tool for high school settings. By integrating an adaptive technique with an established mental health assessment instrument in a user-friendly format, AdaptiCare4U improves the experience of answering mental health assessments. Through an expert-review validation technique, AdaptiCare4U demonstrates high effectiveness in accessibility, ease of use, and practical value, with mean scores of 5, 4.2, and 4.4, respectively. Additionally, students' perceptions further support the tool's usability, with positive feedback highlighting its engaging interface, use of multimedia elements, and stress-reducing design. A favorable usability rating from both students and experts makes AdaptiCare4U a promising tool for aiding counselors in conducting efficient mental health assessments.
Authors - Aayush Kulkarni, Mangesh Bedekar, Shamla Mantri Abstract - This paper proposes a novel serverless computing model that addresses critical challenges in current architectures, namely cold start latency, resource inefficiency, and scalability limitations. The research integrates advanced caching mechanisms, intelligent load balancing, and quantum computing techniques to enhance serverless platform performance. Advanced distributed caching with coherence protocols is implemented to mitigate cold start issues. An AI-driven load balancer dynamically allocates resources based on real-time metrics, optimizing resource utilization. The integration of quantum computing algorithms aims to accelerate specific serverless workloads. Simulations and comparative tests demonstrate significant improvements in latency reduction, cost efficiency, scalability, and throughput compared to traditional serverless models. While quantum integration remains largely theoretical, early results suggest potential for substantial performance gains in tasks like function lookups and complex data processing. This research contributes to the evolving landscape of cloud computing, offering insights into optimizing serverless architectures for future applications in edge computing, AI, and data-intensive fields. The proposed model sets a foundation for more efficient, responsive, and scalable cloud solutions.
Authors - Nouha Arfaoui, Mohmed Boubakir, Jassem Torkani, Joel Indiana Abstract - The increasing reliance on surveillance systems and the vast amounts of video data have created a growing need for automated systems that detect violent and aggressive behaviors in real time. Violence is a critical societal issue, occurring in public spaces, workplaces, and social environments, and is a leading cause of injury and death. While video surveillance is a key tool for monitoring such behaviors, manual video analysis is labor-intensive, prone to errors, and subject to human fatigue, particularly in large-scale monitoring situations. Machine learning and deep learning have therefore gained significant attention for their ability to enhance the accuracy and efficiency of violence detection in images and videos. Early ML methods relied on manual feature extraction, which limited their flexibility in dynamic scenarios. Ensemble techniques, including AdaBoost and Gradient Boosting, provided improvements but still required extensive feature selection. The introduction of deep learning, particularly Convolutional Neural Networks (CNNs), has enabled automatic feature learning, making these models more effective in violence detection tasks. This study focuses on detecting violence and aggression in workplace settings by addressing key aspects such as violent actions and aggressive objects, utilizing various deep learning algorithms to identify the most efficient model for each task.
Authors - Kalupahanage A. G. A, Bulathsinhala D.N, Herath H.M.S.D, Herath H.T.M.T, Shashika Lokuliyana, Deemantha Siriwardana Abstract - The explosive growth of the Internet of Things (IoT) has had a substantial impact on daily life and businesses, allowing for real-time monitoring and decision-making. However, increased connectivity also brings higher security risks, such as botnet attacks, and the need for stronger user authentication. This research explores how machine learning can enhance IoT security by identifying abnormal activity, utilizing behavioral biometrics to secure cloud-based dashboards, and detecting botnet threats early. The authors tested several machine learning methods, including K-Nearest Neighbors (KNN), Decision Trees, and Logistic Regression, on publicly available datasets. The Decision Tree model achieved the highest accuracy, 73%, for anomaly identification among the models tested. The findings show the effectiveness of these strategies in enhancing the security and reliability of IoT devices. This study provides significant insights into the use of machine learning to protect IoT devices while also addressing crucial concerns such as power consumption and privacy.
Authors - A B Sagar, K Ramesh Babu, Syed Usman, Deepak Chenthati, E Kiran Kumar, Boppana Balaiah, PSD Praveen, G Allen Pramod Abstract - Agricultural disasters, particularly those caused by biological threats, pose severe risks to global food security and economic stability. Early detection and effective management are essential for mitigating these risks. In this research paper, we propose a comprehensive disaster prediction and management framework that integrates resources such as social networks or the Internet of Things (IoT) for data collection. The model combines real-time data collection, risk assessment, and decision-making processes to forecast agricultural disasters and suggest mitigation strategies. The mathematical foundation of the model defines the relationships between key variables, such as plant species, infestation agent species, tolerance levels, and infestation rates. The system relies on IoT or mobile-based social network agents for data collection at the ground level, to obtain precise and consistent information from diverse geographic regions. The model further includes a hierarchical risk assessment process that identifies, evaluates, and assesses risks based on predefined criteria, enabling informed decision-making for disaster mitigation. Multi-plant-species and multi-infestation-agent interactions are also considered to capture the complexities of agricultural systems. The proposed framework provides a scalable approach to predicting and managing agricultural disasters, particularly biological threats. By incorporating real-time data and dynamic decision-making mechanisms, the model considerably improves the resilience of agricultural systems against both localized and large-scale threats.
Authors - Herrera Nelson, Paul Francisco Baldeon Egas, Gomez-Torres Estevan, Sancho Jaime Abstract - Quito, the capital of Ecuador, is the economic core of the country, where commercial, administrative, and tourist activities are concentrated. With population growth, the city has undergone major transformations, resulting in traffic congestion problems that affect health, cause delays in daily activities, and increase pollution levels, among other inconveniences. Over time, important mobility initiatives have been implemented, such as traffic control and monitoring systems, the construction of peripheral roads, and the "peak and license plate" measure that restricts the use of vehicles during peak hours according to their license plate, a strategy also adopted in several Latin American countries. However, these actions have not been enough, and congestion continues to increase, causing discomfort to citizens. Given this situation, the implementation of a low-cost computer application is proposed that identifies traffic situations in real time and supports decision-making to alleviate this problem, using processed data from the social network Twitter and traffic records from the city of Quito.
Authors - Elissa Mollakuqe, Hasan Dag, Vesa Mollakuqe, Vesna Dimitrova Abstract - Groupoids are algebraic structures that generalize groups by allowing partial symmetries, and they are useful in various fields, including topology, category theory, and algebraic geometry. Understanding the variance explained by Principal Component Analysis (PCA) components and the correlations among variables within groupoids can provide valuable insights into their structures and relationships. This study explores the use of PCA as a dimensionality reduction technique to understand the variance explained by different components in the context of groupoids. Additionally, we examine the interrelationships among variables through a color-coded correlation matrix, facilitating insights into the structure and dependencies within groupoid datasets. The findings contribute to the broader understanding of data representation and analysis in mathematical and computational frameworks.
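The two analyses described, explained variance of PCA components and pairwise correlations, can be illustrated with a minimal bivariate sketch using the closed-form eigenvalues of a 2x2 covariance matrix. The data below are toy values, not the groupoid datasets:

```python
import math

def covariance(xs, ys):
    """Sample covariance of two equal-length sequences."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

def pca_2d_explained(xs, ys):
    """Explained-variance ratios of the two principal components of a
    bivariate dataset, via closed-form eigenvalues of the 2x2 covariance
    matrix [[a, c], [c, b]]."""
    a, b, c = covariance(xs, xs), covariance(ys, ys), covariance(xs, ys)
    root = math.sqrt((a - b) ** 2 + 4 * c * c)
    l1, l2 = (a + b + root) / 2, (a + b - root) / 2
    total = l1 + l2
    return l1 / total, l2 / total

def correlation(xs, ys):
    """Pearson correlation coefficient, one cell of a correlation matrix."""
    return covariance(xs, ys) / math.sqrt(covariance(xs, xs) * covariance(ys, ys))

# Two strongly related variables: PC1 should capture most of the variance,
# and the off-diagonal correlation should be close to 1.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.1, 2.0, 2.9, 4.2, 5.0]
print(pca_2d_explained(xs, ys), round(correlation(xs, ys), 3))
```

For more than two variables, the same idea uses the eigendecomposition of the full covariance matrix, and the correlation values fill the color-coded matrix mentioned in the abstract.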
Authors - Laurent BARTHELEMY Abstract - In 2024 [7], the author proposed a calculation of weather criteria for vessel boarding against the ladder of an offshore wind turbine, based on a regular wave. However, international guidelines [2] prescribe that "95% waves pass with no slip above 300mm (or one ladder rung)". To meet this acceptability criterion, it becomes necessary to investigate boarding under a real sea state, which is an irregular wave. The findings agree with results from other publications [6] [7]. The outcome is a set of proposed boarding optimisation strategies, compared with present professional practice. The purpose is to achieve lower gas emissions by minimising fuel consumption.
Authors - Amro Saleh, Nailah Al-Madi Abstract - Machine learning (ML) enables valuable insights from data, but traditional ML approaches often require centralizing data, raising privacy and security concerns, especially in sensitive sectors like healthcare. Federated Learning (FL) offers a solution by allowing multiple clients to train models locally without sharing raw data, thus preserving privacy while enabling robust model training. This paper investigates using FL for classifying breast ultrasound images, a crucial task in breast cancer diagnosis. We apply a Convolutional Neural Network (CNN) classifier within an FL framework, evaluated through methods like FedAvg on platforms such as Flower and TensorFlow. The results show that FL achieves competitive accuracy compared to centralized models while ensuring data privacy, making it a promising approach for healthcare applications.
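At its core, the FedAvg method mentioned above aggregates locally trained client models into a global model by a dataset-size-weighted average of their parameters; the server never sees raw images. A minimal sketch (flat parameter lists and hypothetical clinic names, not the paper's CNN or the Flower API):

```python
def fed_avg(client_weights, client_sizes):
    """Federated averaging (FedAvg): aggregate client model parameters as a
    weighted mean, weighted by each client's local dataset size.
    Parameters are represented as flat lists of floats for simplicity."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(n_params)
    ]

# Two hypothetical clinics train locally; only parameters reach the server.
clinic_a = [0.2, -0.4, 1.0]   # trained on 200 ultrasound images
clinic_b = [0.6,  0.0, 0.0]   # trained on 100 ultrasound images
global_model = fed_avg([clinic_a, clinic_b], [200, 100])
print(global_model)
```

In a real FL round this averaging step repeats after each round of local training, with frameworks such as Flower handling communication and orchestration.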
Authors - Ahmed D. Alharthi, Mohammed M. Tounsi Abstract - The Hajj pilgrimage represents one of the largest mass gatherings globally, posing substantial challenges in terms of health and safety management. Millions of pilgrims converge each year in Saudi Arabia to fulfil their religious obligations, underscoring the critical need to address the various health risks that may emerge during such a large-scale event. Health volunteering plays a pivotal role in delivering timely and high-quality medical services to pilgrims. This study introduces the Integrated Health Volunteering (IHV) framework, designed to enhance health and safety outcomes through an optimised, rapid response system. The IHV framework facilitates the coordinated deployment of healthcare professionals—including doctors, anaesthetists, pharmacists, and others—in critical medical emergencies such as cardiac arrest and severe haemorrhage. Central to this framework is the integration of advanced technologies, including Artificial Intelligence algorithms, to support health volunteers’ decision-making. The framework has been validated and subjected to accuracy assessments to ensure its efficacy in real-world situations, particularly in the context of mass gatherings like the Hajj.