Summary of articles related to modern trends in ICT
NAMES OF 10 ARTICLES RELATED TO MODERN TRENDS IN ICT:
> INCLUSION OF ARTIFICIAL INTELLIGENCE IN COMMUNICATION NETWORKS AND SERVICES;
By:
XU Guibao, MU Yubo, LIU Jialiang
[Institute of Technology and Standards Research, China Academy of Information and Communication Technology, Beijing, China PR]
> THE CONVERGENCE OF MACHINE LEARNING AND COMMUNICATIONS;
By:
Wojciech Samek, Slawomir Stanczak, Thomas Wiegand
[Fraunhofer Heinrich Hertz Institute, Berlin, Germany
Dept. of Electrical Engineering & Computer Science, Technische Universität Berlin, Berlin, Germany]
> Block-chain Technology: Benefits, Challenges faced by Researchers and their studies, Applications, and Integration of Block-chain Technology with Cloud Computing.
By:
Gousia Habib, Sparsh Sharma, Sara Ibrahim, Imtiaz Ahmed, Shaima Qureshi, Malik Ishfaq.
[Department of Computer Science and Engineering, National Institute of Technology, Srinagar, India
Applied Sciences for Economics and Management, FOM University, 45127 Essen, Germany
Department of Mathematics, University of Kashmir, Main Campus, Hazratbal, Srinagar, India]
> Challenges and Future Directions of Big Data and Artificial Intelligence in Education;
By:
Hui Luan, Peter Geczy, Hollis Lai, Janice, Hiroaki Ogata, Jacky Baltes, Rodrigo Guerra, Ping Li, Ronnel B. King
[ Institute for Research Excellence in Learning Sciences, National Taiwan Normal University, Taipei, Taiwan
National Institute of Advanced Industrial Science and Technology, Tsukuba, Japan
School of Dentistry, Faculty of Medicine & Dentistry, University of Alberta, Edmonton, AB, Canada
Graduate School of Education, Rutgers – The State University of New Jersey, New Brunswick, NJ, United States
Apprendis, LLC, Berlin, MA, United States
Department of Computer Science and Information Engineering, College of Electrical Engineering and Computer Science, National Central University, Taoyuan City, Taiwan
Graduate School of Informatics, Kyoto University, Kyoto, Japan
Department of Electrical Engineering, College of Technology and Engineering, National Taiwan Normal University, Taipei, Taiwan
The University of Hong Kong, Hong Kong, SAR China
]
> ICT for Cyber Security in Business;
By:
R Hristev and M Veselinova
[ University of Plovdiv Paisii Hilendarski, Plovdiv, Bulgaria ]
>Social Media Activism in the Digital Age;
By:
Myoung-Gi Chon, Hyojung Park
[Auburn University, AL, USA]
[Reference Link: https://journals.sagepub.com/doi/full/10.1177/1077699019835896]
>A systemic and cognitive vision for IoT security;
By:
Arbia Riahi Sfar; Zied Chtourou; Yacine Challal
[ STD lab, Military Academy of Tunisia, Science and Technology for Defense Laboratory
Center for Military Research Aouina, Tunisia ]
[Reference Link: https://ieeexplore.ieee.org/document/8071828]
>Study and Investigation on 5G Technology
By:
Ramraj Dangi, Praveen Lalwani, Ilsun You and Giovanni Pau.
[
School of Computing Science and Engineering, VIT University Bhopal, India
Department of Applied Mathematics and Computer Science, Technical University of Denmark, 2800 Lyngby, Denmark
Department of Information Security Engineering, Soonchunhyang University, Asan-si 31538, Korea
Faculty of Engineering and Architecture, Kore University of Enna, 94100 Enna, Italy ]
[Reference Link: https://www.mdpi.com/1424-8220/22/1/26 ]
>Edge Computing: Vision and Challenges;
By:
Weisong Shi, Jie Cao, Quan Zhang, Youhuizi Li and Lanyu Xu
[Wayne State University]
[Reference link: https://scholar.google.com.pk/scholar_url?url=https://www.researchgate.net/profile/Nirmala-Svsg/post/What_are_the_current_researches_that_are_happening_on_Edge_computing/attachment/59d63ee579197b807799b6f0/AS%253A425783211630593%25401478526036651/download/Edge%2BComputing%2B-%2B%2BVision%2Band%2BChallenges.pdf&hl=en&sa=X&ei=MHKQZPXBO-aay9YPvpKEgAY&scisig=AGlGAw86zK_M3c_PdCqmQ_g-3s-S&oi=scholarr]
>Confidential computing and related technologies: a critical review
By:
Muhammad Usama Sardar and Christof Fetzer
[Reference Link: https://cybersecurity.springeropen.com/articles/10.1186/s42400-023-00144-1]
NOTE: Reference links for all the articles I selected are given with each summary.
ARTICLES THAT I CHOSE TO SUMMARIZE AND WHY I CHOSE THEM:
>Block-chain Technology: Benefits, Challenges faced by Researchers and their studies, Applications, and Integration of Block-chain Technology with Cloud Computing.
The above-mentioned article is very diverse, covering many topics:
1. The paper reviews blockchain technology in detail, including its applications and evolution.
2. The paper highlights blockchain technology's critical challenges and suggests a few solutions.
3. The paper describes blockchain technology in the transaction system and its cryptocurrencies.
4. The study integrates blockchain technology with cloud computing and broadly discusses the application of blockchain in cloud computing.
5. Scalability and hardware issues of blockchain technology are explained in detail, and a few solutions are suggested to combat the major challenges.
6. Finally, the paper critically reviews its own detailed study and suggests some future directions.
7. Blockchain- and cryptocurrency-related problems.
I included blockchain technology in my study on modern trends in ICT due to its significant impact on various sectors and its potential to revolutionize the way information and transactions are securely managed. Blockchain, as a decentralized and immutable ledger, offers increased transparency, trust, and security in digital interactions. Its applications span beyond cryptocurrencies, with potential uses in supply chain management, healthcare, finance, and more. Thus, I have only included the topics that were relevant to my study, which are summarized below.
> INCLUSION OF ARTIFICIAL INTELLIGENCE IN COMMUNICATION NETWORKS AND SERVICES;
In my study on recent developments in ICT, I included the topic of "Inclusion of Artificial Intelligence in Communication Networks and Services" because it represents a transformation in the area of technology. Integrating artificial intelligence (AI) into communication networks and services has the potential to significantly improve their efficiency, responsiveness, and overall user experience. Communication systems can become smarter, more flexible, and more capable by integrating AI technologies such as machine learning and natural language processing.
> THE CONVERGENCE OF MACHINE LEARNING AND COMMUNICATIONS;
In my study on recent developments in ICT, I included the confluence of machine learning and communications since it represents a major advancement. Machine learning, an artificial intelligence discipline, has shown extraordinary capabilities in analyzing massive volumes of data and extracting relevant insights. Communication systems can adapt and self-optimize by employing machine learning algorithms, resulting in increased efficiency, better user experiences, and more intelligent decision-making.
> Challenges and Future Directions of Big Data and Artificial Intelligence in Education;
In my study on present developments in ICT, I covered the issues and future directions of big data and artificial intelligence in education because these themes have substantial consequences for the education sector. The incorporation of big data and artificial intelligence technology in education has the potential to transform teaching and learning processes by providing students with personalized and adaptive learning experiences. However, in addition to the potential, there are problems to be addressed, such as data privacy and security concerns and ethical considerations.
> ICT for Cyber Security in Business;
In my study on modern trends in ICT, I included the topic of Cyber Security in Business because of its critical importance in today's digital world. With the rapid advancement of technology and the increasing reliance on the internet and data, businesses are becoming more exposed to cyber threats and attacks. Cyber security has become a top concern for organizations, as data breaches, hacking attempts, and other cyber incidents can result in significant financial losses and damage. Integrating ICT for Cyber Security in Business allows for the development and implementation of up-to-date security measures, policies, and practices to protect assets, sensitive information, and customer data.
SUMMARIES:
1-INCLUSION OF ARTIFICIAL INTELLIGENCE IN COMMUNICATION NETWORKS AND SERVICES:
In recent years, the industrialization of artificial intelligence (AI) has progressed thanks to the development and maturation of technologies such as cloud computing, big data and deep learning, and AI has attracted more and more attention. AI technology has been introduced into a number of areas, such as communications, which is a sector with heavy ICT use. With AI's advantages in learning, understanding, reasoning, and cooperating gradually being discovered, software-defined networks (SDN) and network functions virtualization (NFV) have appeared, technologies for deep packet inspection and service-aware networks are nearly mature, and the intelligent building of communication networks and services is becoming possible. Operators have a keen interest in AI, which may decrease capital expenditure (CAPEX) and operating expense (OPEX).
TRENDS IN COMMUNICATION NETWORKS AND SERVICES:
The demand for specialized businesses is increasing, with customized networks and services being provided for enterprise users. In the future, there will be special service packages and networks for each user, making such complex requirements impossible to meet without intelligent tools.
>The Web 2.0 era has enabled Internet users to produce and consume information in multimedia, increasing Internet traffic at an incredible speed. AI can help us handle this challenge.
>As smartphone use grows, networks must consider the various dimensions and granularity of wireless traffic models.
Network function virtualization and software-defined networks have made network management more precise. AI-based technologies allow operators to set up on-demand networks for special users and achieve energy-saving goals.
Researchers are working on internationalization, improving abilities of learning, understanding and reasoning, strengthening the ability to collaborate, and paying more attention to security and safety.
The main points are summarized below:
>Networks are becoming increasingly heterogeneous, with users using a variety of wireless access technologies. 5G is expected to reshape telecommunication networks in the near future. Network management has become more difficult to maintain with an acceptable QoS. Operators are expected to increase their network performance with smart tools and intelligence technologies to meet customer needs, make more profits, reduce operating costs, and improve network performance.
>AI can be used to improve security protection, and machine learning can be used to detect attacks, analyze data, and identify relationships between isolated behaviors. This can lead to significant commercial consequences.
>Machine learning has the strength to deal with fuzzy logic and uncertainty reasoning. Deep learning constructs a multi-hidden layer model and uses the hierarchical network structure to transform the feature representation of the sample into a new feature space layer by layer. AI does not need to describe the mathematical model accurately and has the ability to deal with uncertainty or even 'unknowability'.
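To make the layer-by-layer feature transformation described above concrete, here is a minimal Python sketch (my own illustration, not code from the paper; the weights are random rather than trained) of a multi-hidden-layer model mapping a sample into successive feature spaces:

    import math, random

    def dense_layer(inputs, weights, biases):
        # One fully connected layer: weighted sum of inputs followed by a tanh non-linearity.
        return [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
                for row, b in zip(weights, biases)]

    def random_layer(n_in, n_out):
        # Randomly initialised weights and biases stand in for trained parameters.
        weights = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]
        biases = [random.uniform(-1, 1) for _ in range(n_out)]
        return weights, biases

    sample = [0.3, 0.7, 0.1, 0.9]      # e.g. a few normalized traffic measurements
    layer_sizes = [4, 8, 5, 2]         # input -> two hidden layers -> output
    features = sample
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        weights, biases = random_layer(n_in, n_out)
        features = dense_layer(features, weights, biases)   # a new feature space, layer by layer
    print(features)                    # the final, higher-level representation of the sample

In a trained model the weights would be learned from data, which is what allows such a network to handle uncertainty without an explicit mathematical model of the system.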
POSSIBILITY TO USE AI IN COMMUNICATIONS ON WHICH RESEARCHERS ARE WORKING:
The communications industry has been seeking to introduce intelligence into network operations, management and maintenance. N. Kojic, SUI Dan and JIN Xian-hua suggested a neural network algorithm for routing in communication networks, Sandra Sendra et al. introduced AI into a routing protocol using SDN, and Sahebu, K.M. suggested an AI approach to planning and managing communication networks. However, their research focused on theoretical analysis and simulation and could not be used in real communication networks or services. Today, SDN, NFV, network slicing and other technologies, coupled with integrated network management systems, are able to directly issue orders which can be executed by network equipment, and DPI systems can be deployed on network equipment, allowing for real-time monitoring and intelligent management.
With extensive and thorough studies on AI in SDN and network functions virtualization (NFV), the establishment of an AI-based network framework is possible in the near future, according to the writers.
Conclusion:
In this paper, the writers have highlighted an AI-based network framework, FINE, to give a total solution to introducing AI in communication networks and services. This was then illustrated with an SDN/NFV collaboratively-deployed network.
It has been proven that the FINE framework is feasible to be used in real communication networks and services. Not only that, we can depend on this framework to set up a standard system for AI-based communication networks and services by defining detailed functions of nodes, layers, planes, related interfaces, etc. And finally, questions around privacy and security should be considered.
[Reference link : https://www.itu.int/dms_pub/itu-s/opb/journal/S-JOURNAL-ICTF.VOL1-2018-1-PDF-E.pdf]
2-THE CONVERGENCE OF MACHINE LEARNING AND COMMUNICATIONS:
Modern communication networks, particularly mobile networks, generate a large amount of data at the network infrastructure and user/customer level. The vision of network operators is to either enable new businesses through the provisioning of this data to external service providers and customers or exploit the network data for in-house services. To make this vision reality, there is a strong need for the development and implementation of new machine learning methods for big data analytics in communication networks. These methods aim to extract useful information from the network data while taking into account limited communication resources, and then to leverage this information for external or in-house services. Machine learning methods are also a core part in many emerging applications of communication technology, such as smart cities and the Internet of things.
The use of machine learning methods in various communication applications has been successful, but there are still many challenges and questions that need to be addressed. For example, the large size and high computational demands of modern machine learning algorithms prevent the large-scale use of these models in embedded devices, and 5G networks call for novel machine learning-based approaches to radio resource management and radio streaming.
MACHINE LEARNING IN COMMUNICATIONS
According to the writers, much research is in progress on the use of machine learning algorithms in different application fields of communications.
>Researchers are trying to improve communication networks and wireless communication, enhance security and privacy in communication, and introduce smart services, smart infrastructures, and image and video communication.
>Machine learning has been used to tackle the multicast routing problem, which arises when data is sent to multiple receivers through a communication network. Genetic algorithms have also been used to address the construction of multicast trees in mobile ad-hoc networks. Machine learning techniques have also been used for throughput or traffic prediction in communication networks, which is an important topic as it can help fulfill quality of service (QoS) requirements while efficiently utilizing the network resources (see the illustrative sketch after this list). Traffic identification is also an important topic for network operators as it helps them to manage their networks, to assure QoS and to deploy security measures.
>To achieve a high efficiency at the desired QoS, it is essential in wireless systems to continuously adapt different parameters of MIMO-OFDM systems, in particular the link parameters, to the variations in the communication environment. Various work tackle this parameter selection problem using machine learning.
>Machine learning methods play a pivotal role in tackling privacy and security-related problems in communications, such as monitoring network activities and detecting anomalies. Other security applications include automatic spam filtering and phishing attack detection.
>Preserving data privacy is an important security aspect in communications, and the design of machine learning algorithms that respect data privacy has recently gained increased attention. The authors of one cited study demonstrated that it is possible to build a decision-tree classifier from corrupted data without significant loss in accuracy compared to classifiers built with the original data, while at the same time it is not possible to accurately estimate the original values in the corrupted data records. This way, one can hide private information from the algorithm, but still obtain accurate classification results (a toy sketch follows this list).
>The recent advances in communication technology have led to the development of "smart" applications, such as smart homes, smart cities, smart grids, and the Internet of things. Machine learning algorithms are often the core part of these applications, such as forecasting and managing the power production of a photovoltaic plant. Machine learning can also help detect malicious events before they occur, such as in smart-grid networks. Tasks such as resource usage prediction, estimation of task response times, data traffic monitoring, and optimal scheduling have also been tackled with learning algorithms.
>Machine learning methods have been used for various tasks in multimedia communication and processing. Signal compression is one important field of application of these methods as it is part of almost every multimedia communication system.
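As a hedged illustration of the traffic-prediction point above (not the specific method used in the surveyed works), the following sketch fits a simple least-squares AR(1) predictor to a synthetic traffic series and forecasts the next interval's load:

    # Minimal sketch: predict next-interval traffic from the previous interval
    # using a least-squares AR(1) fit. The data are purely illustrative.
    traffic = [120, 135, 150, 160, 155, 170, 180, 175, 190, 200]  # e.g. Mbps per interval

    x = traffic[:-1]                 # previous-interval load
    y = traffic[1:]                  # current-interval load
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    num = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    den = sum((xi - mean_x) ** 2 for xi in x)
    slope = num / den
    intercept = mean_y - slope * mean_x

    next_load = slope * traffic[-1] + intercept
    print(f"Forecast for next interval: {next_load:.1f} Mbps")
    # A QoS-aware scheduler could reserve capacity based on such a forecast.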
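The privacy-preserving learning idea above can be sketched roughly as follows; this is a toy of my own in which a one-level decision stump stands in for the decision tree, and the perturbation is simple uniform noise rather than the authors' actual scheme:

    import random

    # Each client adds random noise to a sensitive value before sharing it,
    # so the learner never sees the original record.
    def randomize(value, spread=10.0):
        return value + random.uniform(-spread, spread)

    # Toy dataset: labels follow the rule "feature > 50".
    values = [random.uniform(0, 100) for _ in range(1000)]
    original = [(v, 1 if v > 50 else 0) for v in values]
    corrupted = [(randomize(v), label) for v, label in original]

    # The learner picks the threshold that best separates the noisy values.
    best_threshold = max(range(0, 101, 5),
                         key=lambda t: sum((v > t) == (label == 1) for v, label in corrupted))
    accuracy = sum((v > best_threshold) == (label == 1) for v, label in original) / len(original)
    print(f"threshold={best_threshold}, accuracy on original data={accuracy:.2f}")

Even though the learner only sees perturbed values, the rule it finds still classifies the original records well, which mirrors the trade-off described in the paper.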
The writers also describe an exemplary application in wireless networking (a summary is given below):
The design and operation of wireless networks is a highly challenging task, particularly on the road to the fifth generation of mobile networks (5G). The main cause of the problems and limitations in the context of 5G is the radio propagation channel, which can strongly distort transmission signals in a manner that varies with frequency, time, space and other system parameters. This leads to interference between different mobile users, which in turn may have a detrimental impact on network operation. The capacity of wireless links is of an ephemeral and highly dynamic nature, and it depends on global channel parameters such as path loss, path delay and carrier phase shifts, all of which vary with time, frequency and space. Traditional approaches to this problem are usually based on the assumptions that 1) the wireless channel can be modeled with a sufficient accuracy and 2) a sufficient number of pilot-based channel measurements can be performed in real time.
However, the continuously increasing need for pilot-based channel measurements has led to a need for more sophisticated approaches. High-spectral efficiency and the utilization of extremely high frequencies (above 6 GHz) make these assumptions untenable in future networks. A potential solution will not be an adaptation or extension within an existing framework, but rather a paradigm shift to meet the requirements of 5G networks. Modern wireless networks collect and process a large amount of data and this data can be used for tackling the problem of channel reconstruction, tracking and prediction. In this context, special attention has been given to the development of new machine learning algorithms that are able to process spatially distributed data in real time while efficiently using scarce wireless communication resources.
This calls for the development of distributed algorithms that in addition must provide robust results, have good tracking (online) capabilities, and exhibit a relatively low complexity.
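To give a rough flavor of such a low-complexity online estimator (my own simplification; the paper's distributed algorithms are considerably more sophisticated), the sketch below uses a least-mean-squares update to track a slowly drifting channel gain from noisy pilot observations:

    import random

    true_gain = 1.0          # slowly varying channel coefficient
    estimate = 0.0           # receiver's running estimate
    step = 0.1               # LMS step size: trades tracking speed against noise

    for t in range(200):
        true_gain += 0.005                                    # the channel drifts over time
        pilot = 1.0                                           # known pilot symbol
        received = true_gain * pilot + random.gauss(0, 0.05)  # noisy observation
        error = received - estimate * pilot                   # prediction error
        estimate += step * error * pilot                      # O(1) per-sample update

    print(f"true gain ~ {true_gain:.3f}, estimate ~ {estimate:.3f}")

The per-sample update is constant-time and needs no channel model, which is the kind of property the authors argue is required for real-time, resource-constrained operation.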
Later on, the writers also describe the mathematical and physical aspects and derivations (to see them, visit the above-mentioned link).
Future research topics are:
1. Low-complexity models.
2. Security and privacy mechanisms.
3. Standardized formats for learning machines.
4. Radio resource and network management.
According to the writers, research in these fields of ICT will make a huge impact, as it will not only simplify things but also pave the way for further research.
Conclusion:
This paper discussed the increasing mutual influence of machine learning and communication technology. Learning algorithms have been shown to excel in traditional network management tasks such as routing, but also to be a core part of many emerging application fields. The availability of large amounts of data and recent improvements in deep learning methodology will further foster the convergence of these two fields. However, before resource-intensive models such as deep neural networks can be applied on a large scale in communication applications, several practical challenges (e.g. complexity, security, privacy) need to be solved. Additionally, more research is needed on theoretical topics at the intersection of communications and machine learning, such as incremental learning, learning in non-stationary environments or learning with side information.
[Reference link : https://www.itu.int/dms_pub/itu-s/opb/journal/S-JOURNAL-ICTF.VOL1-2018-1-PDF-E.pdf]
3-Block-chain Technology: Benefits, Challenges faced by Researchers and their studies, and Applications:
INTRODUCTION:
Blockchain technology is a revolutionary technology that helps to reduce security risks, eliminate fraud, and bring transparency to a scale that has never been seen before. It was originally associated with cryptocurrency and non-fungible tokens (NFTs) in the 2010s, but has since evolved into a management solution for all types of global industries. Using blockchain technology, cryptocurrencies (such as Bitcoin) and other digital information can move freely from one person to another without third-party involvement. Multiple nodes on the network verify a transaction simultaneously, and every subsequent transaction is added to a chain to maintain a historical record on the DLT. This secure method of exchanging data without a third party makes block-chain technology so appealing. The paper provides a detailed review of block-chain technology, applications platforms, and critical challenges faced by the technology. It also focuses on integrating blockchain technology with cloud computing. Blockchain dates back to 1991, when Stuart Haber and Wakefield Scott Stornetta introduced the concept of a cryptographically protected chain of records. In 2008, Satoshi Nakamoto gave the technology an established model and applications, and in 2009, blockchain's impact on the tech sector began to unfold. The paper concludes with a summary of the various blockchain platforms in cloud computing and the open challenges in this field that need to be addressed.
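A minimal Python sketch (my own illustration, not taken from the paper) of the chaining idea described above: each block stores the hash of its predecessor, so tampering with any earlier record breaks the links that follow.

    import hashlib, json, time

    def make_block(transactions, previous_hash):
        # A block stores its transactions plus the hash of the previous block.
        block = {
            "timestamp": time.time(),
            "transactions": transactions,
            "previous_hash": previous_hash,
        }
        block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
        return block

    genesis = make_block(["genesis"], previous_hash="0" * 64)
    block1 = make_block(["Alice pays Bob 5"], previous_hash=genesis["hash"])
    block2 = make_block(["Bob pays Carol 2"], previous_hash=block1["hash"])

    # Tampering with block1 changes its hash, so block2's stored link no longer matches.
    block1["transactions"] = ["Alice pays Bob 500"]
    recomputed = hashlib.sha256(json.dumps(
        {k: block1[k] for k in ("timestamp", "transactions", "previous_hash")},
        sort_keys=True).encode()).hexdigest()
    print("chain still valid:", recomputed == block2["previous_hash"])   # False

In a real network, many nodes verify and store this chain independently, which is what makes the ledger tamper-evident without a third party.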
Topics:
Demanding Applications of Blockchain Technology and the Motivation behind Them:
Blockchain technology is being used to reduce data breach costs for organizations, increase cross-border remittances, and reduce trans-border transaction costs. Companies such as Ripple are using blockchain technology to overcome the high cost of cross-border transactions, eliminate inefficient supply chain and trade finance transactions, and reduce costs. Companies such as IBM, R3’s Marco Polo, the Digital Trade Chain operated by various banks, and the Hong Kong Trade Finance Platform are using blockchain platforms to solve these problems. Blockchain technology is being used by governments for the management of digital identities, protection of copyright, tracking and tracing prescription drugs, and cybersecurity. It is being used to digitize national identity records, secure citizen data, and reduce the inefficiencies of legacy digital ID management platforms.
Companies such as Blockai and Copyrobo use blockchain technology and artificial intelligence to help artists protect their art online in seconds. In pharmaceutical supply chains, blockchain technology is utilized for tracking and tracing prescription drugs, and in cybersecurity it is being used to mitigate cyberattacks. The public blockchain is highly secure compared to the consortium and private blockchains due to the nature of the members and the consensus mechanism.
Major Challenges that researchers are facing and working on:
Blockchain is used widely on the Internet for a variety of purposes, but its hardware and energy consumption is a major challenge. A secure public key infrastructure engine is used to offload public key operations such as signature generation and verification. Bitcoin is estimated to consume 127 terawatt-hours (TWh) per year, exceeding the annual electricity consumption of many countries. The average energy consumption of Bitcoin per transaction is 707 kilowatt-hours (kWh), 11 times more than Ethereum. Bitcoin requires computers to solve ever more complicated math problems to verify transactions, and the proof-of-work consensus mechanism uses much more energy than many people realize.
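The "ever more complicated math problems" are proof-of-work puzzles: miners search for a nonce whose block hash meets a difficulty target. A rough illustrative sketch of my own (each extra leading zero in the hex digest multiplies the expected number of hash attempts, and hence the energy spent, by 16):

    import hashlib

    def mine(block_data, difficulty):
        # Find a nonce so the block hash starts with `difficulty` zero hex digits.
        target = "0" * difficulty
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
            if digest.startswith(target):
                return nonce, digest
            nonce += 1

    for difficulty in (3, 4, 5):        # real networks use far higher difficulty targets
        nonce, digest = mine("example transactions", difficulty)
        print(f"difficulty {difficulty}: {nonce + 1} hashes tried -> {digest[:16]}...")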
Bitcoin transactions take upwards of 10 minutes to mine a new block, and Visa transactions are faster and require less energy. Bitcoin’s energy consumption problem cannot be solved by returning to centralized networks such as Visa, but Bitcoin’s advocates have several other options. Scalability is essential for scalable blockchain services, and cloud computing can provide on-demand computer resources for blockchain activities. Bitcoin and Ethereum, the major blockchain systems, experienced modest exchange rates and higher exchange fees due to a large increase in clients. BCH was created to solve some of Bitcoin’s existing problems, especially Scalability and transaction fees, and IOTA offers a variety of solutions.
Integration of Blockchain Technology and Cloud Computing:
This paper examines the security patterns in cloud storage and the potential applications for blockchain technology. It does not focus on any specific application of blockchain technology for cloud storage, but does address current trends, classifications, and unresolved difficulties that previous studies have not addressed. Cloud computing is predicated on utilizing centralized servers to store data and then make that data accessible to consumers through software. However, it is common practice for businesses to have this form of centralized organization, which could undermine security, privacy, and authority. Blockchain technology has unique benefits for data security and can be used to protect data from hackers and viruses. However, it is important to note that the reliance of cloud computing on centralized servers could lead to monetary losses, the loss of customers, harm to the company's brand, and other impacts.
Benefits of Blockchain in Cloud Computing which researchers have achieved and are still working on:
Decentralization:
The Internet of Things (IoT) and cloud computing technologies rely on a centralized server for data management and decision-making. If the primary server has technical difficulties, the whole system may be rendered inoperable and the potential loss of critical data could have catastrophic consequences. To fix this problem, the decentralized blockchain system can store several copies of the same data on many different computers, reducing the risk of the whole system failing if only one server fails.
Enhanced Data Protection:
Data protection is a major challenge in the Internet of Things field, as weak protection can lead to the theft and illegal sale of personal details. Blockchains in cloud computing are a solution to this problem, as they protect data from theft and illegal sale.
Improved Goods and Service Ownership Tracking:
A centralized approach in designing software products, such as package tracking systems, might cause problems due to design flaws. Blockchain has a great deal of promise for keeping tabs on these products and services.
Tolerance for Errors:
Data may be replicated over a network of computer servers linked to each other through collaborative clouds. As a result, the chance of a single cloud node failing will be reduced, allowing for continuous operation.
Conclusions, Limitations, and Future Work:
The limitations of blockchain technology include high energy consumption, expensive hardware requirements, and time-consuming transactions. These challenges need to be addressed to enhance the efficiency of coin creation and promote the widespread adoption of blockchain in the mining industry. Integrating blockchain into various economic aspects can also contribute to reducing businesses' carbon footprints. In the realm of cloud computing, there is a growing concern about the associated risks, particularly in terms of security, compliance, and centralization. The COVID-19 pandemic further accelerated the adoption of cloud computing without sufficient attention to legal and regulatory requirements, jeopardizing data security. Blockchain technology offers potential solutions by enhancing the security, speed, and reliability of data storage, transactions, and overall business operations. A combination of blockchain and cloud computing is seen as a promising approach to address security and decentralization concerns while improving authorization and privacy. However, further research is needed to address challenges such as data storage at every node and additional security measures. Exploring the potential of this combination can help businesses combat data threats and make informed decisions in today's competitive landscape.
[Reference link : https://www.mdpi.com/1999-5903/14/11/341]
4-Challenges and Future Directions of Big Data and Artificial Intelligence in Education:
This paper discusses the new challenges, research, and directions facing the use of big data and artificial intelligence (AI) in education research, policy-making, and industry. It highlights a novel trend in leading-edge educational research, with the convenience and embeddedness of data collection within educational technologies, paired with computational techniques, making the analyses of big data a reality. The key research trends in the domains of big data and AI are associated with assessment, individualized learning, and precision education. Model-driven data analytic approaches will grow quickly to guide the development, interpretation, and validation of the algorithms. At the education policy level, the government should be devoted to supporting lifelong learning, offering teacher education programs, and protecting personal data. With regard to the education industry, reciprocal and mutually beneficial relationships should be developed in order to enhance academia-industry collaboration. Finally, it is important to make sure that technologies are guided by relevant theoretical frameworks and are empirically tested.
INTRODUCTION:
This paper discusses the current status, opportunities, and challenges of big data and artificial intelligence (AI) in education. It draws upon opinions and panel discussions from an international conference on big data and AI in education, involving experts from various disciplines. The paper provides an overview of recent progress in utilizing big data and AI in education and highlights the major challenges and emerging trends. The conclusion and future scope of big data and AI in education are also discussed. Advancements in big data and AI technologies have had a significant impact on society, including the education sector. The characteristics of big data, such as volume, variety, velocity, veracity, and value, pose challenges in processing and utilizing large and complex datasets using traditional methods. Novel computational technologies are required for the acquisition, storage, analysis, and management of big data. Big data analytics, employing various techniques like statistical analysis, data mining, and machine learning, extract valuable insights and patterns from the data. Machine learning, as a subset of AI, focuses on building computer systems that learn from data without explicit programming. Machine learning algorithms can provide personalized insights and solutions based on individual needs and circumstances. In the field of education, AI applications have been explored for several decades, including intelligent tutoring systems, robotic systems, and chatbots. Recent breakthroughs in information technologies have enabled educational psychologists to access big data from sources like social media, online learning environments, and learning management systems. Learning analytics, combined with machine learning and AI techniques, have the potential to optimize learning, teaching, and administration in education. The integration of big data and AI in education research is gaining significance, as it provides valuable information for enhancing learning and teaching practices. The adoption of big data and AI in educational psychology is also emerging as a cutting-edge research method. The paper emphasizes the importance of further research and development in utilizing big data and AI to address the challenges and leverage the opportunities in the field of education.
The Position Formulation:
This paper discusses the current progress and potential of applying big data and artificial intelligence (AI) in education. The existing literature in this field includes systematic literature reviews, bibliometric studies, qualitative analyses, and social network analysis. The research on the learner side focuses on identifying learning and affective behavior patterns, improving assessment methods, predicting individual performance, and providing personalized support. On the teacher side, studies aim to enhance course planning, curriculum development, teaching evaluation, and teaching support. Big data technologies, such as learning analytics and machine learning, have shown promise in accurately predicting academic performance. However, there is a gap between technological capabilities and their utilization in education, with slow adoption in the education industry. Bridging this gap requires an interdisciplinary approach involving education, technology, and government. Education needs more knowledge and skills in AI and big data applications, while technology experts should be aware of advancements in educational psychology. Government policies must address regulatory and ethical challenges related to educational reforms and data-oriented technologies. Overall, this paper highlights the need for collaboration and integration between different domains to fully leverage the potential of big data and AI in education.
An Interdisciplinary Approach to Educational Adoption of Big Data and AI:
In light of the opportunities and challenges presented by the big data and AI revolution, collaboration among academics, educators, policy-makers, and professionals is crucial. They must work together to develop the necessary competencies and skills in learners for the knowledge economy of the 21st century. Collaboration across disciplines and sectors can be challenging, especially when there is a lack of shared vision and knowledge. This paper emphasizes the importance of collaboration in research, policy-making, and industry engagements. Researchers and industry can benefit from the development and transfer of educational technology to commercial products. Businesses and governments can benefit from legislation that supports technology markets while safeguarding data and privacy. Academics and policy-makers can benefit from prioritizing educational reforms that facilitate the adoption of technology-enhanced curricula. By working together, these stakeholders can harness the potential of big data and AI in education and drive positive change.
Big Data and AI in Education: Research
The integration of big data and AI in education has the potential to enable personalized and precise learning. However, there are several emerging trends, research gaps, and controversies that need to be addressed for the effective implementation of these technologies. These include:
Transition to Precision Education:
The shift from a one-size-fits-all approach to personalized learning is necessary to consider individual differences and tailor education to learners' specific needs. Adaptive tools and flexible learning systems are required to accommodate individual learners' interaction and pace.
Cognitive Focus in AI:
The research focus is moving towards incorporating cognitive aspects, such as perception, emotion, and cognitive thinking, in the design of AI systems for education. Collaboration across disciplines and domain transfers are facilitating this shift.
Design and Interpretation of Data Analytics:
The design of machine-generated data and the interpretation of algorithmically generated evidence must be carefully guided by theoretical models and evaluated constructs. Cultural differences, contextual factors, and student opinions should be taken into account when collecting affective data.
Balancing Human and Machine Learning:
Balancing the adoption of technology and human involvement in education is a challenge. Overreliance on technology may hinder certain essential skills and first-hand experiences. The convergence of human and machine learning has the potential for effective teaching and learning.
Algorithmic Bias:
Algorithms extensively rely on data, and if the data is biased, it can lead to systematic errors disadvantaging certain groups. Addressing algorithmic bias is crucial before wide implementation in education to ensure fairness and equity.
Technology Expansion and Inequalities:
The rapid expansion of technology raises concerns about inequalities in learning opportunities. Proper learning theories and active participation of learners are necessary to navigate the changing learning landscape. Efforts are needed to reduce the digital divide and provide support to developing countries.
An overarching theme is the importance of theories from cognitive and educational psychology to guide the development of personalized learning tools and practices. Technologies like VR and AR should be informed by educational psychology to understand how they interact with learners' abilities and provide effective learning experiences. Empirical data and research on individual differences are essential for optimizing the use of technology in education.
Big Data and AI in Education: Policy-Making
The integration of big data and AI into education requires addressing policy-oriented challenges and gaps. The following points highlight key issues in this regard:
Paradigm Shift in Education:
Traditional formal education systems are undergoing changes to accommodate lifelong learning through online and project-based learning. The concept of continual education necessitates the development of new instructional methods, engagement strategies, and assessment approaches, along with the implementation of micro-credits or micro-degrees to support learners' efforts.
Teacher Adoption of Emerging Technologies:
There is a gap between pre-service and in-service teachers' willingness to adopt emerging technologies. While pre-service teachers are generally more open to adopting modern technologies, in-service teachers may rely more on their practical experience. Effective teacher education programs and continuing education initiatives are needed to bridge this gap and support teachers in adopting and utilizing emerging technologies.
Big Data and AI in Education: Industry
The commercialization of educational tools and systems that include the latest scientific and technological advances has been a major challenge in education. Numerous countries have attempted to stimulate innovation-based growth through enhancing technology transfer and fostering academia-industry collaboration. In the United States, this was initiated by the Bayh-Dole Act (Mowery et al., 2001). It is important that collaboration is mutually beneficial, as it can provide educators with tools for developing more effective curricula, pedagogical frameworks, assessments, and programs. Timely release of educational research advances onto commercial platforms is desirable by vendors from development, marketing, and revenue perspectives. Implementation of the latest research enables progressive development of commercial products and distinctive differentiation for marketing purposes. Novel features may also be suitably monetized, expanding revenue streams. The gaps between availability of the latest research and its practical adoption are slowing progress and negatively impacting commercial vendors. A viable solution is a closer alignment and/or direct collaboration between academia and industry. A greater spectrum of commercially and freely available tools helps maintain healthy market competition and avoid monopolies and oligopolies that stifle innovation. With more tools available on the market, educators and academics may explore novel avenues for improving education and research. Collaborative research projects sponsored by the industry should provide support and opportunities for academics to advance educational research. Vocational and practical education provides numerous opportunities for fruitful academia-industry collaboration. Domain knowledge provided by teachers is beneficially supplemented by AI-assisted learning environments in academia, while practical skills are enhanced in industrial environments with hands-on experience and feedback from both trainers and technology tools. Effective vocational training demands teachers and trainers on the human-learning side, and AI environments and actual technology tools on machine-learning side. Collaboration between academia and industry, as well as government initiatives, should be encouraged.
Discussion and Conclusion:
The paper emphasizes the potential of big data and AI in education, highlighting their ability to drive effective learning and teaching. It acknowledges that while these technologies are still unfamiliar to many researchers and educators, they have become a mainstream research paradigm with significant opportunities and challenges. The major challenges and potential solutions are summarized in a table (please consult the reference link). The future development of big data and AI in education should focus on theory-based precision education, cross-disciplinary application, and appropriate use of educational technologies. The government should support lifelong learning, provide teacher education programs, and ensure the protection of personal data. Collaboration between academia and the education industry is crucial for mutual benefits and enhanced collaboration. The paper emphasizes the need for a balanced approach that combines technological advancement with human-centered education. While automation may impact routine jobs, the teacher's role remains indispensable due to the teacher-student relationship and its impact on students' learning and personal growth. Technology, particularly data analysis, can assist teachers in understanding students' learning patterns and developing effective teaching strategies. It can also free teachers from routine tasks, enabling them to focus on complex issues and foster higher-order thinking skills in students. The adoption of big data and AI in education is still in its early stages, facing technological and mindset challenges. However, the convergence of psychology, data science, and computer science holds promise for transforming educational research, practice, and industry. The authors hope that the presented achievements and future directions contribute to the shared goal of supporting learners and teachers in pursuing sustainable development.
[Reference link:https://www.frontiersin.org/articles/10.3389/fpsyg.2020.580820/full#B103 ]
5- ICT for Cyber Security in Business:
INTRODUCTION:
The article highlights the importance of enhancing data storage security in a company's IT infrastructure. As businesses grow, the implementation of various information technologies becomes necessary to streamline operations. File servers are widely used in both small companies and large corporations to address these needs.
File servers serve two primary purposes: storing digitized company data and facilitating easy information sharing among employees. They act as centralized repositories for secure data storage, enabling authorized personnel to access and manage files. Additionally, file servers promote efficient collaboration by providing a convenient means of sharing and exchanging information within the company. File servers offer scalability options, allowing for expansion as data requirements increase. Additionally, file servers may offer additional services such as file backup, version control, and remote access. Overall, file servers play a crucial role in modern business environments by ensuring secure data storage and facilitating seamless information sharing. Companies should prioritize the scalability and security of their file servers.
Main body :
The article discusses the vulnerability of data stored on file servers and the rise of malware attacks, particularly ransomware. The WannaCry attack in May 2017 serves as an example of a massive malware attack that affected numerous organizations worldwide. Ransomware encrypts data on workstations or file servers, demanding a ransom for its recovery. Payment is typically required in untraceable cryptocurrency like Bitcoin, making it difficult to trace and prosecute the perpetrators. Antivirus software companies invest resources in developing decryption tools, but the process takes time, leaving users unable to access their data during the recovery process. Some ransomware strains remain unrecoverable, adding to the complexity of the situation. Cryptoviruses, a type of encrypting ransomware, pose the most significant threat to data. They target specific file types, encrypting either the entire file or just the metadata. Cryptoviruses can spread through Trojan horses and mass email messages, infecting network drives and shared folders, thereby making file servers vulnerable. Traditional methods of data recovery, such as bit-by-bit hard disk scanning and backups, are ineffective against cryptoviruses. These malware strains delete original files and shadow copies, rendering recovery attempts futile. Backup systems are only reliable if backups are stored in locations disconnected from the local network, but this may limit automatic backups and lead to data loss between the last backup and the infection. The article also introduces different types of clouds—public, private, and hybrid—as alternatives to shared network resources and file servers. Public clouds like Google Drive and Dropbox are suitable for storing and sharing non-sensitive data, but they may lack control over server security. Hybrid clouds, on the other hand, combine multiple cloud infrastructures and allow data and applications to transfer between components. In summary, the article emphasizes the vulnerability of data stored on file servers to malware attacks, particularly ransomware. It suggests that cloud solutions, such as hybrid clouds, may offer better security and control over data storage and sharing compared to traditional file servers and shared network resources.
( Concepts are delivered in the form of figures please consult the article for clarification )
Opportunities to Access Information Stored on Private Clouds:
The article discusses the desktop clients developed by OwnCloud and NextCloud for various operating systems. These clients offer file synchronization, which has its advantages and disadvantages. If the cloud is used by a single user or does not involve sharing files and simultaneous collaboration, the desktop client is a suitable choice. One disadvantage of desktop clients is that they store local copies of files on the machines where they are installed, and these copies are in an unencrypted form. This requirement for local copies also leads to increased disk space usage on the workstation. However, desktop clients are useful in scenarios where large files need to be shared among different users. For example, departments like prepress or X-ray cabinets may benefit from using desktop clients. In these cases, if the department does not work with local copies or copy files to local computers, each open file generates network traffic of the file's size (e.g., 200 MB). This can overload the network and slow down the work of other users, especially in larger departments. Using a desktop client, the user generates network traffic to the server only once when saving a local copy of the file. This minimizes network congestion and improves overall efficiency.
The article explains that using a desktop client can reduce network traffic and improve efficiency. When using a desktop client, the file is read from the local file system, and saving it to the local disk is faster. Synchronization between the local computer and the server becomes transparent to users, allowing them to work without hindering others on the server. Desktop clients are freely available on the official pages of the chosen private cloud and can be easily installed. WebDAV (Web Distributed Authoring and Versioning) is an extension of the HTTP protocol that allows users to manipulate information arrays directly on the server. OwnCloud and NextCloud utilize WebDAV and allow users to mount disks in the operating system, similar to file servers. However, some older software may have compatibility issues with WebDAV when saving files in this manner.
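To make the WebDAV idea concrete, here is a hedged sketch using Python's requests library; the endpoint path, folder names and credentials are assumptions of mine rather than values from the article (check your own ownCloud/NextCloud instance for its exact WebDAV URL):

    import requests

    BASE = "https://cloud.example.com/remote.php/webdav"   # hypothetical server endpoint
    AUTH = ("alice", "app-password")                        # hypothetical credentials

    # Upload a file directly to the server (no synchronized local copy needed).
    with open("report.pdf", "rb") as f:
        r = requests.put(f"{BASE}/reports/report.pdf", data=f, auth=AUTH)
        r.raise_for_status()

    # List the contents of a remote folder with a PROPFIND request.
    listing = requests.request("PROPFIND", f"{BASE}/reports/",
                               auth=AUTH, headers={"Depth": "1"})
    print(listing.status_code)          # 207 Multi-Status on success
    print(listing.text[:500])           # XML describing the folder's entries

Because the files are manipulated directly on the server, no unencrypted local copy is left behind, which is the trade-off the article draws against the synchronization client.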
Information recovery methods:
The article discusses the differences between file servers and private clouds regarding data protection and recovery methods. Private clouds offer additional safeguards such as versioning control and double deletion to protect against cryptovirus attacks. Version control can help recover unencrypted files in cases where a cryptovirus encrypts only part of the file's metadata without deleting the original copy. Double deletion is a method employed by private clouds to protect against cryptoviruses that encrypt and delete entire files. To ensure seamless recovery, approximately 2 TB of free cloud space is recommended. To mitigate the need for large storage space, the private clouds offer a "Ransomware protection" add-on. This add-on creates a blacklist of file extensions associated with known cryptoviruses, preventing files with those extensions from being uploaded. However, it does not prevent the attacking virus from deleting original copies. Restoring files from the "Deleted files" section is necessary in such cases.
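The extension-blacklist behaviour of the "Ransomware protection" add-on can be approximated with a few lines of Python; this is a simplified sketch of my own, with an illustrative subset of extensions rather than the add-on's real, regularly updated list:

    # Simplified sketch of an upload filter that rejects file names whose
    # extensions are associated with known cryptoviruses.
    BLACKLISTED_EXTENSIONS = {".locky", ".crypt", ".wncry", ".cerber"}  # illustrative subset

    def is_upload_allowed(filename: str) -> bool:
        name = filename.lower()
        return not any(name.endswith(ext) for ext in BLACKLISTED_EXTENSIONS)

    for f in ["budget.xlsx", "budget.xlsx.wncry", "photo.jpg.locky"]:
        print(f, "->", "accepted" if is_upload_allowed(f) else "rejected")
    # Note: this blocks encrypted uploads but cannot stop the virus from
    # deleting originals, so the "Deleted files" restore step is still needed.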
Integration of Private Clouds in existing IT infrastructure:
When integrating a private cloud into an existing IT infrastructure, there are several considerations to take into account. The choice between NextCloud and ownCloud should be made, and the hardware or virtual machine on which it will be installed should be considered. The number of users who will be using the private cloud, the number and size of files that will be stored on the private cloud, and the anticipated number and size of files that will be added to the private cloud storage within a year should be considered. After setting up the private cloud, existing data needs to be transferred to it. It is recommended to create primary users on the system where the files are initially stored, who will retain ownership of the files and can easily be deleted from the system without transferring file ownership when an employee leaves the company. There are two ways to migrate data from a file server or shared directories to the private cloud: mounting the main user via WebDAV to a server and manually copying the data. The second method for configuring user access to files is to use a synchronization client. By configuring the synchronization client with a new user, a local directory is specified where the files will be stored and the synchronization client will automatically upload the files to the private cloud. It is important to consider the hierarchy of users and determine who should have access to specific data. Private clouds like NextCloud and ownCloud offer various options for accessing data, including through a browser, synchronization client, and WebDAV. Some organizations may require a combination of different access methods based on user needs and preferences. There is no one-size-fits-all rule for configuring user access to files, so it is important to tailor it to the organization's specific requirements.
CONCLUSION:
Ransomware has become one of the most urgent issues in the digital world. It is clear that protecting against ransomware is a difficult task. The private clouds considered in this article are open-source solutions through which we can better protect our data by storing it in accordance with ISO 27001 - Information Security Management Systems. The article describes approaches by which user data can be recovered after being encrypted by a ransomware attack. The main advantages of using the considered solutions in comparison with standard file servers are also described. An approach for integrating a private cloud into an existing IT infrastructure is presented. A comparison of the positive and negative sides of the private clouds and file servers discussed in the article is made.
[Reference Link https://iopscience.iop.org/article/10.1088/1757-899X/1099/1/012035/meta]