Journal of Cloud Computing: Advances, Systems and Applications


Cross-chain asset trading scheme for notaries based on edge cloud storage

Blockchain has penetrated various fields, such as finance, healthcare, supply chain, and intelligent transportation, but the value exchange between different blockchains limits their expansion. Cross-chain ...


An overview of QoS-aware load balancing techniques in SDN-based IoT networks

Increasing and heterogeneous service demands have led to increased traffic and load-imbalance challenges among network entities in Internet of Things (IoT) environments. This can affect Quality of Service (Q...

MSCO: Mobility-aware Secure Computation Offloading in blockchain-enabled Fog computing environments

Fog computing has evolved as a promising computing paradigm to support the execution of latency-sensitive Internet of Things (IoT) applications. The mobile devices connected to the fog environment are resource...

Correction to: Edge intelligence‑assisted animation design with large models: a survey

The original article was published in Journal of Cloud Computing 2024, 13:48

Provably secure data selective sharing scheme with cloud-based decentralized trust management systems

The smart collection and sharing of data is an important part of cloud-based systems, since huge amounts of data are being created all the time. This feature allows users to distribute data to particular recip...

Stacked-CNN-BiLSTM-COVID: an effective stacked ensemble deep learning framework for sentiment analysis of Arabic COVID-19 tweets

Social networks are popular for advertising, idea sharing, and opinion formation. Due to COVID-19, coronavirus information disseminated on social media affects people’s lives directly. Individuals sometimes ma...

Traffic prediction for diverse edge IoT data using graph network

More researchers are proposing artificial intelligence algorithms for Internet of Things (IoT) devices and applying them to themes such as smart cities and smart transportation. In recent years, relevant resea...

Predicting UPDRS in Parkinson’s disease using ensembles of self-organizing map and neuro-fuzzy

Parkinson's Disease (PD) is a complex, degenerative disease that affects nerve cells that are responsible for body movement. Artificial Intelligence (AI) algorithms are widely used to diagnose and track the pr...

A cloud-edge computing architecture for monitoring protective equipment

The proper use of protective equipment is very important to avoid fatalities. One sector in which this has a great impact is that of construction sites, where a large number of workers die each year. In this s...

A cloud-edge collaborative task scheduling method based on model segmentation

With the continuous development and combined application of cloud computing and artificial intelligence, some new methods have emerged to reduce task execution time for training neural network models in a clou...

Analysis and prediction of virtual machine boot time on virtualized computing environments

Starting a virtual machine (VM) is a common operation in cloud computing platforms. In order to achieve better management of resource provisioning, a cloud platform needs to accurately estimate the VM boot tim...

IoT workload offloading efficient intelligent transport system in federated ACNN integrated cooperated edge-cloud networks

Intelligent transport systems (ITS) provide various cooperative edge cloud services for roadside vehicular applications. These applications offer additional diversity, including ticket validation across transp...

Intelligent code search aids edge software development

The growth of multimedia applications poses new challenges to software facilities in edge computing. Developers must effectively develop edge computing software to accommodate the rapid expansion of multimedia...

Correction to: Advanced series decomposition with a gated recurrent unit and graph convolutional neural network for non‑stationary data patterns

The original article was published in Journal of Cloud Computing 2024, 13:20

PMNet: a multi-branch and multi-scale semantic segmentation approach to water extraction from high-resolution remote sensing images with edge-cloud computing

In the field of remote sensing image interpretation, automatically extracting water body information from high-resolution images is a key task. However, facing the complex multi-scale features in high-resoluti...

Correction: FLM-ICR: a federated learning model for classification of internet of vehicle terminals using connection records

The original article was published in Journal of Cloud Computing 2024, 13:57

CG-PBFT: an efficient PBFT algorithm based on credit grouping

Because of its excellent properties of fault tolerance, efficiency and availability, the practical Byzantine fault tolerance (PBFT) algorithm has become the mainstream consensus algorithm in blockchain. Howeve...

Time-aware outlier detection in health physique monitoring in edge-aided sport education decision-makings

The increasing popularity of various intelligent sensor and mobile communication technologies has enabled quick health physique sensing, monitoring, collection and analyses of students, which significantly pro...

Accurate and fast congestion feedback in MEC-enabled RDMA datacenters

Mobile edge computing (MEC) is a novel computing paradigm that pushes computation and storage resources to the edge of the network. The interconnection of edge servers forms small-scale data centers, enabling ...

Optimus: association-based dynamic system call filtering for container attack surface reduction

While container adoption has witnessed significant growth in facilitating the operation of large-scale applications, this increased attention has also attracted adversaries who exploit numerous vulnerabilities...

A secure cross-domain authentication scheme based on threshold signature for MEC

The widespread adoption of fifth-generation mobile networks has spurred the rapid advancement of mobile edge computing (MEC). By decentralizing computing and storage resources to the network edge, MEC signific...

Edge intelligence empowered delivery route planning for handling changes in uncertain supply chain environment

Traditional delivery route planning faces challenges in reducing logistics costs and improving customer satisfaction with growing customer demand and complex road traffic, especially in uncertain supply chain ...

Security issues of news data dissemination in internet environment

With the rise of artificial intelligence and the development of social media, people's communication has become increasingly convenient. However, in the Internet environment, the untrue dissemination of news da...

Short-term forecasting of surface solar incident radiation on edge intelligence based on AttUNet

Solar energy has emerged as a key industry in the field of renewable energy due to its universality, harmlessness, and sustainability. Accurate prediction of solar radiation is crucial for optimizing the econo...

Edge computing-oriented smart agricultural supply chain mechanism with auction and fuzzy neural networks

Powered by data-driven technologies, precision agriculture offers immense productivity and sustainability benefits. However, fragmentation across farmlands necessitates distributed transparent automation. We d...

AIoT-driven multi-source sensor emission monitoring and forecasting using multi-source sensor integration with reduced noise series decomposition

The integration of multi-source sensors based AIoT (Artificial Intelligence of Things) technologies into air quality measurement and forecasting is becoming increasingly critical in the fields of sustainable a...

An integrated SDN framework for early detection of DDoS attacks in cloud computing

Cloud computing is a rapidly advancing technology with numerous benefits, such as increased availability, scalability, and flexibility. Relocating computing infrastructure to a network simplifies hardware and ...

An optimized neural network with AdaHessian for cryptojacking attack prediction for Securing Crypto Exchange Operations of MEC applications

Bitcoin exchange security is crucial because of MEC's widespread use. Cryptojacking has compromised MEC app security and bitcoin exchange ecosystem functionality. This paper proposes a cutting-edge neural netwo...

Privacy-preserving federated learning based on partial low-quality data

Traditional machine learning requires collecting data from participants for training, which may lead to malicious acquisition of privacy in participants’ data. Federated learning provides a method to protect p...

A secure data interaction method based on edge computing

Deep learning has achieved outstanding success in edge scenarios due to the appearance of lightweight neural networks. However, a number of works show that these networks are vulnerable to adversarial examples,...

TCP Stratos for stratosphere based computing platforms

Stratosphere computing platforms (SCPs) benefit from free cooling but face challenges necessitating transmission control protocol (TCP) re-design. The redesign should be considered due to stratospheric gravity...

Optimizing the resource allocation in cyber physical energy systems based on cloud storage and IoT infrastructure

Given the prohibited operating zones, losses, and valve point effects in power systems, energy optimization analysis in such systems includes numerous non-convex and non-smooth parameters, such as economic dis...

SRA-E-ABCO: terminal task offloading for cloud-edge-end environments

The rapid development of Internet technology, along with the emergence of intelligent applications, has put forward higher requirements for task offloading. In Cloud-Edge-End (CEE) environments, offloading c...

FLM-ICR: a federated learning model for classification of internet of vehicle terminals using connection records

With the rapid growth of Internet of Vehicles (IoV) technology, the performance and privacy of IoV terminals (IoVT) have become increasingly important. This paper proposes a federated learning model for IoVT c...

The Correction to this article has been published in Journal of Cloud Computing 2024, 13:75

Multi-dimensional resource allocation strategy for LEO satellite communication uplinks based on deep reinforcement learning

In the LEO satellite communication system, the resource utilization rate is very low due to the constrained resources on satellites and the non-uniform distribution of traffic. In addition, the rapid movement...

Edge-cloud computing oriented large-scale online music education mechanism driven by neural networks

With the advent of the big data era, edge cloud computing has developed rapidly. In this era of popular digital music, various technologies have brought great convenience to online music education. But vast da...

RNA-RBP interactions recognition using multi-label learning and feature attention allocation

In this study, we present a sophisticated multi-label deep learning framework for the prediction of RNA-RBP (RNA-binding protein) interactions, a critical aspect in understanding RNA functionality modulation a...

Low-cost and high-performance abnormal trajectory detection based on the GRU model with deep spatiotemporal sequence analysis in cloud computing

Trajectory anomalies serve as early indicators of potential issues and frequently provide valuable insights into event occurrence. Existing methods for detecting abnormal trajectories primarily focus on compar...

AI-empowered mobile edge computing: inducing balanced federated learning strategy over edge for balanced data and optimized computation cost

In Mobile Edge Computing, the framework of federated learning can enable collaborative learning models across edge nodes, without necessitating the direct exchange of data from edge nodes. It addresses signifi...

Automated visual quality assessment for virtual and augmented reality based digital twins

Virtual and augmented reality digital twins are becoming increasingly prevalent in a number of industries, though the production of digital-twin systems applications is still prohibitively expensive for many s...

Detection of cotton leaf curl disease’s susceptibility scale level based on deep learning

Cotton, a crucial cash crop in Pakistan, faces persistent threats from diseases, notably the Cotton Leaf Curl Virus (CLCuV). Detecting these diseases accurately and early is vital for effective management. Thi...

Unified ensemble federated learning with cloud computing for online anomaly detection in energy-efficient wireless sensor networks

Anomaly detection in Wireless Sensor Networks (WSNs) is critical for their reliable and secure operation. Optimizing resource efficiency is crucial for reducing energy consumption. Two new algorithms developed...

Edge intelligence-assisted animation design with large models: a survey

The integration of edge intelligence (EI) in animation design, particularly when dealing with large models, represents a significant advancement in the field of computer graphics and animation. This survey aim...

The Correction to this article has been published in Journal of Cloud Computing 2024, 13:87

Target tracking using video surveillance for enabling machine vision services at the edge of marine transportation systems based on microwave remote sensing

Automatic target tracking in emerging remote sensing video-generating tools based on microwave imaging technology and radars has been investigated in this paper. A moving target tracking system is proposed to ...

Multiple objectives dynamic VM placement for application service availability in cloud networks

Ensuring application service availability is a critical aspect of delivering quality cloud computing services. However, placing virtual machines (VMs) on computing servers to provision these services can prese...

Investigation on storage level data integrity strategies in cloud computing: classification, security obstructions, challenges and vulnerability

Cloud computing provides outsourcing of computing services at a lower cost, making it a popular choice for many businesses. In recent years, cloud data storage has gained significant success, thanks to its adv...

A secure and efficient electronic medical record data sharing scheme based on blockchain and proxy re-encryption

With the rapid development of the Internet of Medical Things (IoMT) and the increasing concern for personal health, sharing Electronic Medical Record (EMR) data is widely recognized as a crucial method for enh...

A fog-edge-enabled intrusion detection system for smart grids

The Smart Grid (SG) heavily depends on the Advanced Metering Infrastructure (AMI) technology, which has shown its vulnerability to intrusions. To effectively monitor and raise alarms in response to anomalous a...

Enhanced mechanism to prioritize the cloud data privacy factors using AHP and TOPSIS: a hybrid approach

Cloud computing is a new paradigm in this new cyber era. Nowadays, most organizations are placing more reliance on this environment. The increasing reliance on the Cloud also makes it vulnerable. As vuln...

Dynamic routing optimization in software-defined networking based on a metaheuristic algorithm

Optimizing resource allocation and routing to satisfy service needs is paramount in large-scale networks. Software-defined networking (SDN) is a new network paradigm that decouples forwarding and control, enab...


Annual Journal Metrics

2022 Citation Impact: 4.0 (2-year Impact Factor); 4.4 (5-year Impact Factor); 1.711 (SNIP, Source Normalized Impact per Paper); 0.976 (SJR, SCImago Journal Rank)

2023 Speed: 10 days from submission to first editorial decision for all manuscripts (median); 116 days from submission to acceptance (median)

2023 Usage: 733,672 downloads; 49 Altmetric mentions

ISSN: 2192-113X (electronic)


Cloud Computing: Recently Published Documents


Simulation and performance assessment of a modified throttled load balancing algorithm in cloud computing environment

Load balancing is crucial to ensure scalability and reliability, minimize response and processing times, and maximize resource utilization in cloud computing. However, the load fluctuation that accompanies the distribution of a huge number of requests among a set of virtual machines (VMs) is challenging and needs effective and practical load balancers. In this work, a two listed throttled load balancer (TLT-LB) algorithm is proposed and further simulated using the CloudAnalyst simulator. The TLT-LB algorithm is based on a modification of the conventional TLB algorithm to improve the distribution of tasks between different VMs. The performance of the TLT-LB algorithm compared to the TLB, round robin (RR), and active monitoring load balancer (AMLB) algorithms has been evaluated using two different configurations. Interestingly, the TLT-LB significantly balances the load between the VMs, reducing the loading gap between the heaviest and lightest loaded VMs to 6.45%, compared to 68.55% for the TLB and AMLB algorithms. Furthermore, the TLT-LB algorithm considerably reduces the average response time and processing time compared to the TLB, RR, and AMLB algorithms.
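As a rough Python sketch of the two-list idea (our illustration, not the paper's exact algorithm; the class and method names are invented), a throttled balancer that keeps idle and busy VMs in separate lists never scans busy machines when assigning a request:

    from collections import deque

    class TwoListThrottledLB:
        # Toy two-list throttled load balancer.
        def __init__(self, vm_ids):
            self.available = deque(vm_ids)  # idle VMs, FIFO order
            self.busy = set()               # VMs currently serving a request

        def assign(self):
            # Pick the next idle VM, or None if all are throttled (busy).
            if not self.available:
                return None                 # request must wait in a queue
            vm = self.available.popleft()
            self.busy.add(vm)
            return vm

        def release(self, vm):
            # Return a VM to the available list when its task completes.
            self.busy.discard(vm)
            self.available.append(vm)

    lb = TwoListThrottledLB(["vm-0", "vm-1", "vm-2"])
    v = lb.assign()   # -> "vm-0"
    lb.release(v)     # "vm-0" becomes available again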

An improved forensic-by-design framework for cloud computing with systems engineering standard compliance

Reliability of trust management systems in cloud computing.

Cloud computing is an innovation that delivers services such as software, platform, and infrastructure over the web. This computing structure is widespread and dynamic, operating on a pay-per-use model and supporting virtualization. Cloud computing is expanding quickly among consumers, and many organizations offer services through the web. It provides flexible, on-demand service but still carries various security risks. Its dynamic nature makes it customizable to user and provider requirements, a notable benefit of cloud computing. On the other hand, this also creates trust issues and concerns such as security, privacy, identity, and legitimacy. The major challenge in the cloud environment is therefore selecting a suitable provider. For this, the trust mechanism plays a critical role, based on the evaluation of QoS and feedback ratings. Nonetheless, various difficulties remain in trust management systems for monitoring and evaluating QoS. This paper discusses the obstacles present in current trust systems. The goal of this paper is to review the available trust models. Issues such as insufficient trust between provider and user, which have hindered data sharing, are also addressed. Moreover, it lays out the limitations and possible improvements to help researchers who intend to investigate this topic.

Cloud Computing Adoption in the Construction Industry of Singapore: Drivers, Challenges, and Strategies

An extensive review of web-based multi-granularity service composition.

The paper reviews efforts to compose SOAP, non-SOAP, and non-web services. Traditionally, efforts focused on composing SOAP services and did not include RESTful and non-web services. A SOAP service uses a structured exchange methodology for dealing with web services, while a non-SOAP service follows a different approach. The paper reviews the invocation and composition of a combination of SOAP, non-SOAP, and non-web services into a composite process to execute complex tasks on various devices. It also shows the systematic integration of SOAP, non-SOAP, and non-web services, describing the composition of heterogeneous services from the perspective of resource consumption, in contrast to conventionally used approaches. The paper further compares and reviews different layout models for the discovery, selection, and composition of services in cloud computing. Recent research trends in service composition are identified, and research on microservices is evaluated and presented in tables and graphs.

Integrated Blockchain and Cloud Computing Systems: A Systematic Survey, Solutions, and Challenges

Cloud computing is a network model of on-demand access for sharing configurable computing resource pools. Compared with conventional service architectures, cloud computing introduces new security challenges in secure service management and control, privacy protection, data integrity protection in distributed databases, data backup, and synchronization. Blockchain can be leveraged to address these challenges, partly due to its underlying characteristics such as transparency, traceability, decentralization, security, immutability, and automation. We present a comprehensive survey of how blockchain is applied to provide security services in the cloud computing model, and we analyze the research trends of blockchain-related techniques in current cloud computing models. During the review, we also briefly investigate how cloud computing can affect blockchain, especially the performance improvements that cloud computing can provide for the blockchain. Our contributions include the following: (i) summarizing the possible architectures and models of the integration of blockchain and cloud computing and the roles of cloud computing in blockchain; (ii) classifying and discussing recent, relevant works based on different blockchain-based security services in the cloud computing model; (iii) briefly investigating what improvements cloud computing can provide for the blockchain; (iv) introducing the current development status of the industry/major cloud providers in the direction of combining cloud and blockchain; (v) analyzing the main barriers and challenges of integrated blockchain and cloud computing systems; and (vi) providing recommendations for future research and improvement on the integration of blockchain and cloud systems.

Cloud Computing and Undergraduate Researches in Universities in Enugu State: Implication for Skills Demand

Cloud building block chip for creating FPGA and ASIC clouds.

Hardware-accelerated cloud computing systems based on FPGA chips (FPGA cloud) or ASIC chips (ASIC cloud) have emerged as a new technology trend for power-efficient acceleration of various software applications. However, the operating systems and hypervisors currently used in cloud computing will lead to power, performance, and scalability problems in an exascale cloud computing environment. Consequently, the present study proposes a parallel hardware hypervisor system that is implemented entirely in special-purpose hardware, and that virtualizes application-specific multi-chip supercomputers, to enable virtual supercomputers to share available FPGA and ASIC resources in a cloud system. In addition to the virtualization of multi-chip supercomputers, the system's other unique features include simultaneous migration of multiple communicating hardware tasks, and on-demand increase or decrease of hardware resources allocated to a virtual supercomputer. By partitioning the flat hardware design of the proposed hypervisor system into multiple partitions and applying the chip unioning technique to its partitions, the present study introduces a cloud building block chip that can be used to create FPGA or ASIC clouds as well. Single-chip and multi-chip verification studies have been done to verify the functional correctness of the hypervisor system, which consumes only a fraction (10%) of hardware resources.

Study On Social Network Recommendation Service Method Based On Mobile Cloud Computing

Cloud-based network virtualization in IoT with OpenStack.

In Cloud computing deployments, specifically in the Infrastructure-as-a-Service (IaaS) model, networking is one of the core enabling facilities provided for the users. The IaaS approach ensures significant flexibility and manageability, since the networking resources and topologies are entirely under users' control. In this context, considerable efforts have been devoted to promoting the Cloud paradigm as a suitable solution for managing IoT environments. Deep and genuine integration between the two ecosystems, Cloud and IoT, may only be attainable at the IaaS level. In light of extending the IoT domain's capabilities with Cloud-based mechanisms akin to the IaaS Cloud model, network virtualization is a fundamental enabler of infrastructure-oriented IoT deployments. Indeed, an IoT deployment without networking resilience and adaptability is unsuitable to meet user-level demands and services' requirements. Such a limitation restricts IoT-based services to very specific and statically defined scenarios, thus leading to limited plurality and diversity of use cases. This article presents a Cloud-based approach for network virtualization in an IoT context using the de facto standard IaaS middleware, OpenStack, and its networking subsystem, Neutron. OpenStack is being extended to enable the instantiation of virtual/overlay networks between Cloud-based instances (e.g., virtual machines, containers, and bare metal servers) and/or geographically distributed IoT nodes deployed at the network edge.
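As a hedged sketch of what that looks like in practice (not code from the article; the cloud name and CIDR are placeholders), the OpenStack SDK can create the kind of overlay network and subnet that Cloud instances and edge IoT nodes would attach to:

    # Create an overlay network and subnet with the OpenStack SDK (Neutron).
    import openstack

    conn = openstack.connect(cloud="my-iot-cloud")  # reads clouds.yaml; placeholder name

    net = conn.network.create_network(name="iot-overlay")
    subnet = conn.network.create_subnet(
        network_id=net.id,
        name="iot-overlay-subnet",
        ip_version=4,
        cidr="10.20.0.0/24",  # placeholder address range
    )
    print(net.id, subnet.cidr)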


[Illustration: how cloud computing enables access to infrastructure and applications]

Published: 14 February 2024 | Contributors: Stephanie Susnjara, Ian Smalley

Cloud computing is the on-demand access of computing resources—physical servers or virtual servers, data storage, networking capabilities, application development tools, software, AI-powered analytic tools and more—over the internet with pay-per-use pricing.

The cloud computing model offers customers greater flexibility and scalability compared to traditional on-premises infrastructure.

Cloud computing plays a pivotal role in our everyday lives, whether accessing a cloud application like Google Gmail, streaming a movie on Netflix or playing a cloud-hosted video game.

Cloud computing has also become indispensable in business settings, from small startups to global enterprises. Its many business applications include enabling remote work by making data and applications accessible from anywhere, creating the framework for seamless omnichannel customer engagement and providing the vast computing power and other resources needed to take advantage of cutting-edge technologies like generative AI and quantum computing.

A cloud services provider (CSP) manages cloud-based technology services hosted at a remote data center and typically makes these resources available for a pay-as-you-go or monthly subscription fee.


Compared to traditional on-premises IT, in which a company owns and maintains physical data centers and servers to access computing power, data storage and other resources, cloud computing offers many benefits (depending on the cloud services you select), including the following:

Cloud computing lets you offload some or all of the expense and effort of purchasing, installing, configuring and managing mainframe computers and other on-premises infrastructure. You pay only for cloud-based infrastructure and other computing resources as you use them. 

With cloud computing, your organization can use enterprise applications in minutes instead of waiting weeks or months for IT to respond to a request, purchase and configure supporting hardware and install software. This capability empowers users, specifically DevOps and other development teams, to leverage cloud-based software and supporting infrastructure.

Cloud computing provides elasticity and self-service provisioning, so instead of purchasing excess capacity that sits unused during slow periods, you can scale capacity up and down in response to spikes and dips in traffic. You can also use your cloud provider’s global network to spread your applications closer to users worldwide.

Cloud computing enables organizations to use various technologies and the most up-to-date innovations to gain a competitive edge. For instance, in retail, banking and other customer-facing industries, generative AI-powered virtual assistants deployed over the cloud can deliver better customer response time and free up teams to focus on higher-level work. In manufacturing, teams can collaborate and use cloud-based software to monitor real-time data across logistics and supply chain processes.

The origins of cloud computing technology go back to the early 1960s when Dr. Joseph Carl Robnett Licklider (link resides outside ibm.com), an American computer scientist and psychologist known as the "father of cloud computing", introduced the earliest ideas of global networking in a series of memos discussing an Intergalactic Computer Network. However, it wasn't until the early 2000s that modern cloud infrastructure for business emerged.

In 2002, Amazon Web Services started offering cloud-based storage and computing services. In 2006, it introduced Elastic Compute Cloud (EC2), an offering that allowed users to rent virtual computers to run their applications. That same year, Google introduced the Google Apps suite (now called Google Workspace), a collection of SaaS productivity applications. In 2009, Microsoft launched its first SaaS application, Microsoft Office 2011. Today, Gartner predicts worldwide end-user spending on the public cloud will total USD 679 billion and is projected to exceed USD 1 trillion in 2027 (link resides outside ibm.com).

The following are a few of the most integral components of today’s modern cloud computing architecture.

CSPs own and operate remote data centers that house physical or bare metal servers, cloud storage systems and other physical hardware that create the underlying infrastructure and provide the physical foundation for cloud computing.

In cloud computing, high-speed networking connections are crucial. Typically, an internet connection known as a wide-area network (WAN) connects front-end users (for example, the client-side interface made visible through web-enabled devices) with back-end functions (for example, data centers and cloud-based applications and services). Other advanced cloud computing networking technologies, including load balancers, content delivery networks (CDNs) and software-defined networking (SDN), are also incorporated to ensure data flows quickly, easily and securely between front-end users and back-end resources.

Cloud computing relies heavily on the virtualization of IT infrastructure—servers, operating system software, networking and other infrastructure that's abstracted using special software so that it can be pooled and divided irrespective of physical hardware boundaries. For example, a single hardware server can be divided into multiple virtual servers. Virtualization enables cloud providers to make maximum use of their data center resources.
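For a concrete, simplified view of that division, the sketch below uses the libvirt Python bindings to list the virtual servers carved out of a single physical host. It assumes a local QEMU/KVM hypervisor and is illustrative only, not part of the original article:

    # Hedged illustration: enumerate the guests a hypervisor host has been
    # divided into, via the libvirt Python bindings (libvirt-python package).
    import libvirt

    conn = libvirt.open("qemu:///system")  # local QEMU/KVM hypervisor (assumption)
    print("Host:", conn.getHostname())
    for dom in conn.listAllDomains():
        # info() returns [state, maxMemKB, memKB, vcpus, cpuTimeNs]
        state, max_mem, mem, vcpus, _ = dom.info()
        running = state == libvirt.VIR_DOMAIN_RUNNING
        print(f"guest={dom.name()} vcpus={vcpus} mem={mem // 1024}MB running={running}")
    conn.close()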

IaaS (Infrastructure-as-a-Service), PaaS (Platform-as-a-Service), SaaS (Software-as-a-Service) and serverless computing are the most common models of cloud services, and it’s not uncommon for an organization to use some combination of all four.

IaaS (Infrastructure-as-a-Service) provides on-demand access to fundamental computing resources—physical and virtual servers, networking and storage—over the internet on a pay-as-you-go basis. IaaS enables end users to scale and shrink resources on an as-needed basis, reducing the need for high up-front capital expenditures or unnecessary on-premises or "owned" infrastructure and for overbuying resources to accommodate periodic spikes in usage. 
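As a minimal sketch of that self-service model (our illustration, with placeholder values, not an IBM example), renting and releasing a virtual server with the AWS SDK for Python looks roughly like this:

    # Provision a pay-as-you-go virtual server, then release it.
    # The AMI ID, region and instance type below are placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")
    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder machine image
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
    )
    instance_id = resp["Instances"][0]["InstanceId"]
    print("provisioned:", instance_id)

    # Scaling back down is the same self-service API in reverse;
    # billing stops once the instance terminates.
    ec2.terminate_instances(InstanceIds=[instance_id])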

According to a Business Research Company report (link resides outside ibm.com), the IaaS market is predicted to grow rapidly in the next few years, reaching $212.34 billion in 2028 at a compound annual growth rate (CAGR) of 14.2%.

PaaS (Platform-as-a-Service) provides software developers with an on-demand platform—hardware, complete software stack, infrastructure and development tools—for running, developing and managing applications without the cost, complexity and inflexibility of maintaining that platform on-premises. With PaaS, the cloud provider hosts everything at their data center. These include servers, networks, storage, operating system software, middleware and databases. Developers simply pick from a menu to spin up servers and environments they need to run, build, test, deploy, maintain, update and scale applications.

Today, PaaS is typically built around containers, a virtualized compute model one step removed from virtual servers. Containers virtualize the operating system, enabling developers to package the application with only the operating system services it needs to run on any platform without modification and without the need for middleware.

Red Hat® OpenShift® is a popular PaaS built around Docker containers and Kubernetes, an open source container orchestration solution that automates deployment, scaling, load balancing and more for container-based applications.
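As a small, hedged example of the kind of automation such orchestration provides (the Deployment name and namespace below are invented), scaling a container-based application with the official Kubernetes Python client looks like this:

    # Ask Kubernetes for 5 replicas of a (hypothetical) Deployment; the
    # orchestrator handles placement, restarts and load balancing.
    from kubernetes import client, config

    config.load_kube_config()  # uses your local kubeconfig
    apps = client.AppsV1Api()
    apps.patch_namespaced_deployment_scale(
        name="web-frontend",   # hypothetical Deployment
        namespace="default",
        body={"spec": {"replicas": 5}},
    )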

SaaS (Software-as-a-Service), also known as cloud-based software or cloud applications, is application software hosted in the cloud. Users access SaaS through a web browser, a dedicated desktop client or an API that integrates with a desktop or mobile operating system. Cloud service providers offer SaaS based on a monthly or annual subscription fee. They may also provide these services through pay-per-usage pricing.

In addition to the cost savings, time-to-value and scalability benefits of cloud, SaaS offers the following:

  • Automatic upgrades:  With SaaS, users use new features when the cloud service provider adds them without orchestrating an on-premises upgrade.
  • Protection from data loss:  Because SaaS stores application data in the cloud with the application, users don’t lose data if their device crashes or breaks.

SaaS is the primary delivery model for most commercial software today. Hundreds of SaaS solutions exist, from focused industry and broad administrative applications (for example, Salesforce) to robust enterprise database and artificial intelligence (AI) software. According to an International Data Corporation (IDC) survey (link resides outside ibm.com), SaaS applications represent the largest cloud computing segment, accounting for more than 48% of the $778 billion worldwide cloud software revenue.

Serverless computing, or simply serverless, is a cloud computing model that offloads all the back-end infrastructure management tasks, including provisioning, scaling, scheduling and patching, to the cloud provider. This frees developers to focus all their time and effort on the code and business logic specific to their applications.

Moreover, serverless runs application code on a per-request basis only and automatically scales the supporting infrastructure up and down in response to the number of requests. With serverless, customers pay only for the resources used when the application runs; they never pay for idle capacity. 

FaaS, or Function-as-a-Service, is often confused with serverless computing when, in fact, it's a subset of serverless. FaaS allows developers to run portions of application code (called functions) in response to specific events. Everything besides the code—physical hardware, virtual machine (VM) operating system and web server software management—is provisioned automatically by the cloud service provider in real time as the code runs and is spun back down once the execution is complete. Billing starts when execution starts and stops when execution stops.
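To make that concrete, here is what a FaaS function typically looks like, following the AWS Lambda Python handler convention (the event shape is invented for illustration):

    import json

    # Invoked once per event; nothing runs (or bills) between requests.
    def handler(event, context):
        name = event.get("name", "world")  # hypothetical event field
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}!"}),
        }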

A public cloud is a type of cloud computing in which a cloud service provider makes computing resources available to users over the public internet. These include SaaS applications, individual virtual machines (VMs), bare metal computing hardware, complete enterprise-grade infrastructures and development platforms. These resources might be accessible for free or according to subscription-based or pay-per-usage pricing models.

The public cloud provider owns, manages and assumes all responsibility for the data centers, hardware and infrastructure on which its customers’ workloads run. It typically provides high-bandwidth network connectivity to ensure high performance and rapid access to applications and data.

Public cloud is a multi-tenant environment where all customers pool and share the cloud provider's data center infrastructure and other resources. In the world of the leading public cloud vendors, such as Amazon Web Services (AWS), Google Cloud, IBM Cloud®, Microsoft Azure and Oracle Cloud, these customers can number in the millions.

Most enterprises have moved portions of their computing infrastructure to the public cloud since public cloud services are elastic and readily scalable, flexibly adjusting to meet changing workload demands. The promise of greater efficiency and cost savings through paying only for what they use attracts customers to the public cloud. Still others seek to reduce spending on hardware and on-premises infrastructure. Gartner predicts (link resides outside ibm.com) that by 2026, 75% of organizations will adopt a digital transformation model predicated on cloud as the fundamental underlying platform.

A private cloud is a cloud environment where all cloud infrastructure and computing resources are dedicated to one customer only. Private cloud combines many benefits of cloud computing—including elasticity, scalability and ease of service delivery—with the access control, security and resource customization of on-premises infrastructure.

A private cloud is typically hosted on-premises in the customer’s data center. However, it can also be hosted on an independent cloud provider’s infrastructure or built on rented infrastructure housed in an offsite data center.

Many companies choose a private cloud over a public cloud environment to meet their regulatory compliance requirements. Entities like government agencies, healthcare organizations and financial institutions often opt for private cloud settings for workloads that deal with confidential documents, personally identifiable information (PII), intellectual property, medical records, financial data or other sensitive data.

By building private cloud architecture according to cloud-native principles, an organization can quickly move workloads to a public cloud or run them within a hybrid cloud (see below) environment whenever ready.

A hybrid cloud is just what it sounds like: a combination of public cloud, private cloud and on-premises environments. Specifically (and ideally), a hybrid cloud connects a combination of these three environments into a single, flexible infrastructure for running the organization's applications and workloads.

At first, organizations turned to hybrid cloud computing models primarily to migrate portions of their on-premises data into private cloud infrastructure and then connect that infrastructure to public cloud infrastructure hosted off-premises by cloud vendors. This process was done through a packaged hybrid cloud solution like Red Hat® OpenShift® or through middleware and IT management tools to create a "single pane of glass." Teams and administrators rely on this unified dashboard to view their applications, networks and systems.

Today, hybrid cloud architecture has expanded beyond physical connectivity and cloud migration to offer a flexible, secure and cost-effective environment that supports the portability and automated deployment of workloads across multiple environments. This feature enables an organization to meet its technical and business objectives more effectively and cost-efficiently than with a public or private cloud alone. For instance, a hybrid cloud environment is ideal for DevOps and other teams to develop and test web applications. This frees organizations from purchasing and expanding the on-premises physical hardware needed to run application testing, offering faster time to market. Once a team has developed an application in the public cloud, they may move it to a private cloud environment based on business needs or security factors.

A public cloud also allows companies to quickly scale resources in response to unplanned spikes in traffic without impacting private cloud workloads, a feature known as cloud bursting. Streaming channels like Amazon use cloud bursting to support the increased viewership traffic when they start new shows.

Most enterprise organizations today rely on a hybrid cloud model because it offers greater flexibility, scalability and cost optimization than traditional on-premises infrastructure setups. According to the IBM Transformation Index: State of Cloud, more than 77% of businesses and IT professionals have adopted a hybrid cloud approach.

To learn more about the differences between public, private and hybrid cloud, check out "Public cloud vs. private cloud vs. hybrid cloud: What's the difference?"


Multicloud uses two or more clouds from two or more different cloud providers. A multicloud environment can be as simple as email SaaS from one vendor and image editing SaaS from another. But when enterprises talk about multicloud, they typically refer to using multiple cloud services—including SaaS, PaaS and IaaS services—from two or more leading public cloud providers. 

Organizations choose multicloud to avoid vendor lock-in, to have more services to select from and to access more innovation. With multicloud, organizations can choose and customize a unique set of cloud features and services to meet their business needs. This freedom of choice includes selecting "best-of-breed" technologies from any CSP, as needed or as they emerge, rather than being locked into offerings from a single vendor. For example, an organization may choose AWS for its global reach with web-hosting, IBM Cloud for data analytics and machine learning platforms and Microsoft Azure for its security features.

A multicloud environment also reduces exposure to licensing, security and compatibility issues that can result from "shadow IT": any software, hardware or IT resource used on an enterprise network without the IT department's approval and often without IT's knowledge or oversight.

Today, most enterprise organizations use a hybrid multicloud model. Apart from the flexibility to choose the most cost-effective cloud service, hybrid multicloud offers the most control over workload deployment, enabling organizations to operate more efficiently, improve performance and optimize costs. According to an IBM® Institute for Business Value study, the value derived from a full hybrid multicloud platform technology and operating model at scale is two-and-a-half times the value derived from a single-platform, single-cloud vendor approach.

Yet the modern hybrid multicloud model comes with more complexity. The more clouds you use—each with its own management tools, data transmission rates and security protocols—the more difficult it can be to manage your environment. With over 97% of enterprises operating on more than one cloud and most organizations running 10 or more clouds, a hybrid cloud management approach has become crucial. Hybrid multicloud management platforms provide visibility across multiple provider clouds through a central dashboard where development teams can see their projects and deployments, operations teams can monitor clusters and nodes, and cybersecurity staff can monitor for threats.


Traditionally, security concerns have been the primary obstacle for organizations considering cloud services, mainly public cloud services. Maintaining cloud security demands different procedures and employee skillsets than in legacy IT environments. Some cloud security best practices include the following:

  • Shared responsibility for security:  Generally, the cloud service provider is responsible for securing cloud infrastructure, and the customer is responsible for protecting its data within the cloud. However, it’s also essential to clearly define data ownership between private and public third parties.
  • Data encryption:  Data should be encrypted while at rest, in transit and in use. Customers need to maintain complete control over security keys and hardware security modules.
  • Collaborative management:  Proper communication and clear, understandable processes between IT, operations and security teams will ensure seamless cloud integrations that are secure and sustainable.
  • Security and compliance monitoring:  This begins with understanding all regulatory compliance standards applicable to your industry and establishing active monitoring of all connected systems and cloud-based services to maintain visibility of all data exchanges across all environments, on-premises, private cloud, hybrid cloud and edge.

Cloud security is constantly changing to keep pace with new threats. Today’s CSPs offer a wide array of cloud security management tools, including the following:  

  • Identity and access management (IAM): IAM tools and services automate policy-driven enforcement protocols for all users attempting to access both on-premises and cloud-based services.
  • Data loss prevention (DLP): DLP services combine remediation alerts, data encryption and other preventive measures to protect all stored data, whether at rest or in motion.
  • Security information and event management (SIEM): SIEM is a comprehensive security orchestration solution that automates threat monitoring, detection and response in cloud-based environments. SIEM technology uses artificial intelligence (AI)-driven technologies to correlate log data across multiple platforms and digital assets. This allows IT teams to successfully apply their network security protocols, enabling them to react to potential threats quickly.
  • Automated data compliance platforms: Automated software solutions provide compliance controls and centralized data collection to help organizations adhere to regulations specific to their industry. Regular compliance updates can be baked into these platforms so organizations can adapt to ever-changing regulatory compliance standards.


Sustainability in business, a company's strategy to reduce the negative environmental impact of its operations in a particular market, has become an essential corporate governance mandate. Moreover, Gartner predicts (link resides outside ibm.com) that by 2025, the carbon emissions of hyperscale cloud services will be a top-three criterion in cloud purchase decisions.

As companies strive to advance their sustainability objectives, cloud computing has evolved to play a significant role in helping them reduce their carbon emissions and manage climate-related risks. For instance, traditional data centers require power supplies and cooling systems, which depend on large amounts of electrical power. By migrating IT resources and applications to the cloud, organizations not only enhance operational and cost efficiencies but also boost overall energy efficiency through pooled CSP resources.

All major cloud players have made net-zero commitments to reduce their carbon footprints and help clients reduce the energy they typically consume using an on-premises setup. For instance, IBM is driven by sustainable procurement initiatives to reach net zero by 2030. By 2025, IBM Cloud worldwide data centers will comprise energy procurement drawn from 75% renewable sources.

According to an  International Data Corporation (IDC) forecast  (link resides outside ibm.com), worldwide spending on the whole cloud opportunity (offerings, infrastructure and services) will surpass USD 1 trillion in 2024 while sustaining a double-digit compound annual growth rate (CAGR) of 15.7%. Here are some of the main ways businesses are benefitting from cloud computing: 

  • Scale infrastructure:  Allocate resources up or down quickly and easily in response to changes in business demands.
  • Enable business continuity and disaster recovery: Cloud computing provides cost-effective redundancy to protect data against system failures, as well as the physical distance required to apply disaster recovery strategies and recover data and applications during a local outage or disaster. All of the major public cloud providers offer Disaster-Recovery-as-a-Service (DRaaS).
  • Build and test cloud-native applications: For development teams adopting Agile, DevOps or DevSecOps to streamline development, the cloud offers on-demand end-user self-service that prevents operations tasks, such as spinning up development and test servers, from becoming development bottlenecks.
  • Support edge and IoT environments: Address latency challenges and reduce downtime by bringing data sources closer to the edge. Support Internet of Things (IoT) devices (for example, patient monitoring devices and sensors on a production line) to gather real-time data.
  • Leverage cutting-edge technologies: Cloud computing supports storing and processing huge volumes of data at high speeds—much more storage and computing capacity than most organizations can or want to purchase and deploy on-premises. These high-performance resources support technologies like blockchain, quantum computing and large language models (LLMs) that power generative AI platforms like customer service automation.



Cloud computing applications for biomedical science: A perspective

Vivek Navale (Center for Information Technology, National Institutes of Health, Bethesda, Maryland, United States of America) and Philip E. Bourne (Department of Biomedical Engineering, University of Virginia, Charlottesville, Virginia, United States of America)

* E-mail: [email protected]

Published: June 14, 2018

  • https://doi.org/10.1371/journal.pcbi.1006144


Biomedical research has become a digital data–intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research.

Citation: Navale V, Bourne PE (2018) Cloud computing applications for biomedical science: A perspective. PLoS Comput Biol 14(6): e1006144. https://doi.org/10.1371/journal.pcbi.1006144

Editor: Francis Ouellette, Genome Quebec, CANADA

This is an open access article, free of all copyright, and may be freely reproduced, distributed, transmitted, modified, built upon, or otherwise used by anyone for any lawful purpose. The work is made available under the Creative Commons CC0 public domain dedication.

Funding: The authors received no specific funding for this article.

Competing interests: The authors have declared that no competing interests exist.

Introduction

Progress in biomedical research is increasingly driven by insight gained through the analysis and interpretation of large and complex data sets. As the ability to generate and test hypotheses using high-throughput technologies has become technically more feasible and even commonplace, the challenge of gaining useful knowledge has shifted from the wet bench to include the computer. Desktop computers, high-performance workstations, and high-performance computing systems (HPC clusters) are currently the workhorses of the biomedical digital data research endeavor. Recently, however, cloud computing, enabled by the broad adoption and increasing capabilities of the internet and driven by market need, has emerged as a powerful, flexible, and scalable approach to disparate computational and data–intensive problems. The National Institute of Standards and Technology (NIST) states the following:

Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

NIST categorizes clouds as one of 4 types: public, private, community, and hybrid. In a public cloud, the infrastructure exists on cloud provider premises and is managed by the cloud provider, whereas in a private cloud, the infrastructure can exist on or off the premises of the cloud provider but is managed by the private organization. Examples of public clouds include Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure. A community cloud is a collaborative effort where infrastructure is shared between several organizations—a specific community—that have common requirements for security and compliance. For example, the Federal Risk and Authorization Management Program (FedRAMP) is a United States government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring of information technology (IT) infrastructure [1]. The AWS GovCloud is an example of a FedRAMP-accredited resource that operates as a community cloud addressing US government community needs. The JetStream Cloud [2] is a community cloud serving the scientific community. A hybrid cloud is a composition of 2 or more distinct cloud infrastructures—private, community, public—that remain unique entities but are bound together in a way that enables portability of data and software applications [3].

Cloud types discussed above can use one or more cloud services—Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). SaaS enables the consumer to use the cloud provider’s applications (e.g., Google Docs) that are running on a cloud provider’s infrastructure, whereas PaaS enables consumers to create or acquire applications and tools and to deploy them on the cloud provider’s infrastructure. IaaS enables a consumer to provision processing, storage, networks, and other fundamental computing resources. Most public cloud providers like AWS, GCP, and Microsoft Azure provide IaaS, PaaS, and SaaS, and the customer can select the best applicable solution for their individual needs.

Cloud adoption, regardless of type, has varied in industry because of different levels of security and other features required for operation. Previously, both public and private clouds have been used more in unregulated industries and to a lesser extent in regulated industries, but this is changing [ 4 ]. Federally funded scientific data sets are being made available in public clouds [ 5 ]. For example, Human Microbiome Project (HMP) data, funded by the National Institutes of Health (NIH), is available on AWS simple storage service (S3) [ 6 ], and more biomedical data sets are becoming available in the cloud. Research investigators can now request permission from NIH to transfer controlled-access genomic and associated phenotypic data obtained from NIH-designated data repositories to public or private cloud systems for data storage and analysis [ 7 ]. Subject to appropriate access controls on human subjects’ data, the NIH is committed to making public access to digital data a standard for all NIH-funded research [ 8 ].

Advances across the biological scales, from sequencing instruments and health monitoring devices to image collections and the expansion of electronic health record (EHR) platforms, will continue to reduce the cost of acquiring data. Estimates indicate that in 2016, the NIH alone was supporting 650 petabytes (PB) of data at various institutional repositories. Both the volume and complexity of biomedical data will increase significantly in coming years, bringing challenges for storage, management, and preservation and suggesting increased usage of cloud computing. Biomedical research can benefit from the growing number of cloud-based big data tools and platforms being developed and refined for other industries.

Adopting cloud for biomedical work

Consider examples of how clouds and cloud services have been deployed in biomedical work (Table 1). In genomics alone, usage ranges from single applications to complete virtual machines with multiple applications. Additional information on cloud resources in bioinformatics has been provided previously [ 9 ].

[Table 1: https://doi.org/10.1371/journal.pcbi.1006144.t001]

Individual tools

BLAST [ 10 ] is one of the most frequently used tools in bioinformatics research. A BLAST server image can be hosted on AWS, Azure, and GCP public clouds to allow users to run stand-alone searches with BLAST. Users can also submit searches using BLAST through the National Center for Biotechnology Information (NCBI) application programming interface (API) to run on AWS and Google Compute Engine [ 11 ]. Additionally, the Microsoft Azure platform can be leveraged to execute large BLAST sequence matching tasks within reasonable time limits. Azure enables users to download sequence databases from NCBI, run different BLAST programs on a specified input against the sequence databases, and generate visualizations from the results for easy analysis. Azure also provides a way to create a web-based user interface for scheduling and tracking the BLAST match tasks, visualizing results, managing users, and performing basic tasks [ 12 ].
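
As a sketch of what such a scripted remote search looks like, the snippet below uses Biopython's qblast wrapper for the NCBI URL API; the query sequence is a made-up placeholder, and production use should respect NCBI usage limits:

```python
# Minimal sketch: submit a remote BLAST search to NCBI from a script and
# print the top hits. Requires Biopython; the query sequence is a placeholder.
from Bio.Blast import NCBIWWW, NCBIXML

# The search itself runs on NCBI's servers; only the query and the results
# cross the wire, which is the point of cloud-hosted BLAST services.
result_handle = NCBIWWW.qblast("blastn", "nt", "AGCTGATCGATCGTACGATCGTAGCTAG")

record = NCBIXML.read(result_handle)
for alignment in record.alignments[:5]:
    hsp = alignment.hsps[0]
    print(f"{alignment.title[:60]}  E={hsp.expect:.2e}")
```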

CloudAligner is a fast and full-featured MapReduce-based tool for sequence mapping, designed to be able to deal with long sequences [ 13 ], whereas CloudBurst [ 14 ] can provide highly sensitive short read mapping with MapReduce. High-throughput sequencing analyses can be carried out by the Eoulsan package integrated in a cloud IaaS environment [ 15 ]. For whole genome resequencing analysis, Crossbow [ 16 ] is a scalable software pipeline. Crossbow combines Bowtie, an ultrafast and memory efficient short read aligner, and SoapSNP, a genotyper, in an automatic parallel pipeline that can run in the cloud.
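
The map/reduce division of labor that such tools build on can be illustrated with a toy seed-matching sketch. This is a simplification for intuition only, not CloudAligner's or CloudBurst's actual algorithm:

```python
from collections import defaultdict

def map_seeds(name, sequence, k=8):
    """Map step: emit (seed, (source, offset)) pairs for every k-mer."""
    for i in range(len(sequence) - k + 1):
        yield sequence[i:i + k], (name, i)

def reduce_candidates(pairs):
    """Reduce step: group reference and read offsets that share a seed."""
    groups = defaultdict(list)
    for seed, location in pairs:
        groups[seed].append(location)
    # A seed shared between a read and the reference is a candidate hit
    # that a real aligner would then extend and score.
    return {seed: locs for seed, locs in groups.items()
            if any(src == "ref" for src, _ in locs)
            and any(src != "ref" for src, _ in locs)}

reference = "AGCTTAGCTAGGATCCGATCGATCGTAGCTAG"
reads = {"read1": "GATCCGAT", "read2": "TTTTTTTT"}

pairs = list(map_seeds("ref", reference))
for name, read in reads.items():
    pairs.extend(map_seeds(name, read))

print(reduce_candidates(pairs))  # read1's seed matches the reference; read2's does not
```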

Workflows and platforms

Integration of genotype, phenotype, and clinical data is important for biomedical research. Biomedical platforms can provide an environment for establishing an end-to-end pipeline for data acquisition, storage, and analysis.

Galaxy, an open source, web-based platform, is used for data–intensive biomedical research [ 17 ]. For large scale data analysis, Galaxy can be hosted in cloud IaaS (see tutorial [ 18 ]). Reliable and highly scalable cloud-based workflow systems for next-generation sequencing analyses have been achieved by integrating the Galaxy workflow system with Globus Provision [ 19 ].

The Bionimbus Protected Data Cloud (BPDC) is a private cloud-based infrastructure for managing, analyzing, and sharing large amounts of genomics and phenotypic data in a secure environment, which was used for gene fusion studies [ 20 ]. BPDC is primarily based on OpenStack, open source software that provides tools to build cloud platforms [ 21 ], with a service portal for a single point of entry and a single sign-on for various available BPDC resources. Using BPDC, data analysis for the acute myeloid leukemia (AML) resequencing project was rapidly performed to identify somatic variants expressed in adverse-risk primary AML samples [ 22 ].

Scalable and robust infrastructure for Next Generation Sequencing (NGS) analysis is needed for diagnostic work in clinical laboratories. CloudMan, available on the AWS cloud infrastructure [ 23 ], has been used as a platform for distributing tools, data, and analysis results. Improvements in using CloudMan for genetic variant analysis have been made by reducing storage costs for clinical analysis work [ 24 ].

As part of the Pan Cancer Analysis of Whole Genomes (PCAWG), common patterns of mutation in over 2,800 cancer whole genome sequences were studied, which required significant scientific computing resources to investigate the role of the noncoding parts of the cancer genome and for comparing genomes of tumor and normal cells [ 25 ]. The PCAWG data coordinating center currently lists collaborative agreements with cloud provider AWS and the Cancer Collaboratory [ 26 ], an academic compute cloud resource maintained by the Ontario Institute for Cancer Research and hosted at the Compute Canada facility.

Multiple academic resources were used to complete analysis of 1,827 samples, which took over 6 months. This was supplemented by the use of cloud resources, where 500 samples were analyzed on AWS in 6 weeks [ 27 ]. This showed that public cloud resources can be rapidly provisioned to quickly scale up a project if increased compute resources are needed. In this instance, AWS S3 data storage was used to scale from 600 terabytes to multiple PBs. Raw reads, genome alignments, metadata, and curated data can also be incrementally uploaded to AWS S3 for rapid access by the cancer research community. Data search and access tools are also available for other researchers to use or reuse. Sequence read-level data and germline data are maintained at the controlled tier of the cloud, and access to read data requires preapproval from the International Cancer Genome Consortium (ICGC) data access compliance office.

The National Cancer Institute (NCI) has funded 3 cloud pilots to provide genomic analysis, computational support, and access capabilities for The Cancer Genome Atlas (TCGA) data [ 28 ]. The objective of the pilots was to develop a scalable platform to facilitate research collaboration and data reuse. All 3 cloud pilots have received authoritative and harmonized reference data sets from the NCI Genomic Data Commons (GDC) [ 29 ] that have been analyzed using a common set of workflows against a reference genome (e.g., GRCh38). The Broad Institute pilot developed FireCloud [ 30 ] using the elastic compute capacity of Google Cloud for large-scale data analysis, curation, storage, and data sharing. Users can also upload their own analysis methods and data to workspaces and/or use the Broad Institute's best practice tools and pipelines on preloaded data. FireCloud uses the Workflow Description Language (WDL) to enable users to run scalable, reproducible workflows [ 31 ].

The Institute for Systems Biology (ISB) pilot leverages several services on the GCP. Researchers can use web-based software applications to interactively define and compare cohorts, examine the underlying molecular data for specific genes or pathways of interest, share insights with collaborators, and apply their individual software scripts and programs to various data sets [ 32 ].

The ISB Cancer Genomics Cloud (CGC) has loaded processed data and TCGA project metadata into the BigQuery managed database service, enabling easy data mining and data warehouse approaches to be used on large-scale genomics data. The Seven Bridges Genomics (SBG) CGC offers both genomics SaaS and PaaS and uses AWS [ 33 ]. The platform also enables researchers to collaborate on the analysis of large cancer genomics data sets in a secure, reproducible, and scalable manner. The SBG CGC implements the Common Workflow Language [ 34 ] to enable developers, analysts, and biologists to deploy, customize, and run reproducible analysis methods. Users may choose from over 200 tools and workflows covering many aspects of genomics data processing to apply to TCGA data or their own data sets.

Efforts are underway by the NIH Center for Excellence in Big Data Computing at the University of Illinois, Urbana-Champaign to construct a Knowledge Engine for Genomics (KnowEnG). The KnowEnG system is deployed on a public cloud infrastructure—currently AWS—to enable biomedical scientists to access data-mining, network-mining, and machine-learning algorithms that can aid in extracting knowledge from genomics data [ 35 ]. A massive knowledge base of community data sets called the Knowledge Network is at the heart of the KnowEnG system, and data sets, even those in spreadsheets, can be brought to KnowEnG for analysis.

Commercial cloud-based platforms (e.g., DNAnexus, which runs on AWS and Microsoft Azure) enable analyses of massive amounts of sequencing data integrated with phenotypic or clinical information [ 36 ]. Deep learning-based data analysis tools (e.g., DeepVariant) have also been used in conjunction with DNAnexus to call genetic variants from next-generation sequencing data [ 37 ]. Other bioinformatics platforms (e.g., DNAstack) use the GCP to provide processing capability for over a quarter of a million whole human genome sequences per year [ 38 ].

Cloud computing applications in healthcare include telemedicine/teleconsultation, medical imaging, public health, patient self-management, hospital management and information systems, therapy, and secondary use of data.

Patients with chronic conditions who reside at considerable distances from their health service providers have difficulty having their conditions monitored in real time. One poignant example is patients who suffer from cardiac arrhythmias requiring continuous episode detection and monitoring. Wearable sensors can be used for real-time electrocardiogram (ECG) monitoring, arrhythmia episode detection, and classification. Using AWS EC2, mobile computing technologies were integrated, and ECG monitoring capabilities were demonstrated for recording, analyzing, and visually displaying data from patients at remote locations. In addition, software tools that monitored and analyzed ECG data were made available via cloud SaaS for public use [ 39 ]. The Microsoft Azure platform has also been implemented for a 12-lead ECG telemedicine service [ 40 ]. For storage and retrieval of medical images, Picture Archive and Communication System modules were deployed in a public cloud [ 41 ]. A review of publications on cloud computing in healthcare has pointed out that many healthcare-related publications use the term "cloud" synonymously with "virtual machines" or "web-based tools", which is not consistent with the characteristics that define cloud computing, its models, and its services [ 42 ]. Several commercial vendors are working with hospitals and healthcare providers to establish healthcare services through cloud computing options.
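
As an illustration of the data path in such a system, the sketch below publishes a batch of ECG samples from a device gateway to a cloud message broker, here AWS IoT Core via boto3. The topic name, identifiers, and payload layout are hypothetical; the cited studies used their own architectures:

```python
import json
import boto3

# Data-plane client for AWS IoT Core; credentials come from the environment.
iot = boto3.client("iot-data", region_name="us-east-1")

# A gateway might batch a few seconds of samples before publishing.
# All field names and values here are illustrative, not a clinical format.
ecg_batch = {
    "patient_id": "patient-042",                    # placeholder identifier
    "sample_rate_hz": 250,
    "samples_mv": [0.12, 0.15, 0.91, 0.33, 0.10],   # toy waveform values
}

# Cloud-side analysis services would subscribe to this (assumed) topic.
iot.publish(
    topic="clinic/ecg/patient-042",
    qos=1,
    payload=json.dumps(ecg_batch),
)
```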

General purpose tools

CloVR is a virtual machine image with preinstalled libraries and packages for biological data analysis [ 43 ]. Similarly, Cloud BioLinux is a publicly available resource providing virtual machine images with over 100 software packages for high-performance bioinformatics computing [ 44 ]. Both the CloVR and Cloud BioLinux virtual machine images are available for use within a cloud IaaS environment.

Cloud adoption can also include managed services that are designed for general big data problems. For example, each of the major public cloud providers offers a suite of services for machine learning and artificial intelligence, some of which are pretrained to solve common problems (e.g., text-to-speech). Database systems such as Google BigQuery [ 45 ] and Amazon Redshift [ 46 ] combine the scalable and elastic nature of the cloud with tuned software and hardware solutions to deliver database capabilities and performance not easily achieved otherwise. For large, complex biomedical data sets, such databases can reduce management costs, ease database adoption, and facilitate analysis. Several big data applications used in biomedical research, such as the Apache Hadoop software library, are cloud based [ 47 ].
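
For instance, a managed-warehouse query can run entirely server side with a few lines of client code. The sketch below assumes the google-cloud-bigquery library and ambient credentials; the project, dataset, and table names are hypothetical:

```python
from google.cloud import bigquery

# The query executes on BigQuery's managed infrastructure over cloud-hosted
# storage, so no cluster setup or bulk data download is required locally.
client = bigquery.Client(project="my-research-project")  # placeholder project

sql = """
    SELECT gene_symbol, COUNT(*) AS n_variants
    FROM `my-research-project.genomics.variants`   -- hypothetical table
    GROUP BY gene_symbol
    ORDER BY n_variants DESC
    LIMIT 10
"""

for row in client.query(sql).result():
    print(row.gene_symbol, row.n_variants)
```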

Developing a cloud-based digital ecosystem for biomedical research

The examples introduced above, some ongoing for several years, illustrate a departure from the traditional approach to biomedical computing. The traditional approach has been to download data to local computing systems from public sites and then perform data processing, analysis, and visualization locally. The download time, cost, and redundancy involved in enhancing local computing capabilities to meet data-intensive biomedical research needs (e.g., in sequencing and imaging) make this approach worthy of re-evaluation.

Large-scale projects, like PCAWG introduced above, have shown the advantage of using resources, both local and public cloud, from various collaborating institutions. For institutions with established on-premises infrastructure (e.g., high-speed network infrastructure, secure data repositories), developing a cloud-based digital ecosystem with options to leverage any of the cloud types (public, hybrid) can be advantageous. Moreover, developing and utilizing a cloud-based ecosystem increases the likelihood of open science.

To promote knowledge discovery and innovation, open data and analytics should be findable, accessible, interoperable, and reusable (FAIR). The FAIR principles serve as a guide for data producers, stewards, and disseminators for enhancing reusability of data, inclusive of data algorithms, tools, and workflows that are essential for good data lifecycle management [ 48 ]. A biomedical data ecosystem should have capabilities for indexing of data, metadata, software, and other digital objects—a hallmark of the NIH Big Data to Knowledge (BD2K) initiative [ 49 ].

Being FAIR is facilitated by an emerging paradigm for running complex, interrelated sets of software tools, like those used in genomics data processing: packaging software using Linux container technologies, such as Docker, and then orchestrating "pipelines" using domain-specific workflow languages such as WDL and the Common Workflow Language [ 34 ]. Cloud providers also offer batch processing capabilities (e.g., AWS Batch) that automatically provision the optimal quantity and type of compute resources based on the volume and specific resource requirements of the batch jobs submitted, thereby significantly facilitating analysis at scale.
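
A minimal sketch of that pattern with boto3 is shown below. The job queue and job definition names are hypothetical and would need to be created beforehand, with the job definition pointing at a suitable Docker image:

```python
import boto3

batch = boto3.client("batch", region_name="us-east-1")

# Submit one containerized alignment task; the service decides where it runs
# and provisions compute to match the declared resource requirements.
response = batch.submit_job(
    jobName="align-sample-001",
    jobQueue="genomics-queue",            # hypothetical, pre-created queue
    jobDefinition="bwa-mem-jobdef:1",     # hypothetical job definition
    containerOverrides={
        "command": ["bwa", "mem", "ref.fa", "s3://bucket/sample-001.fastq"],
        "resourceRequirements": [
            {"type": "VCPU", "value": "4"},
            {"type": "MEMORY", "value": "8192"},
        ],
    },
)
print(response["jobId"])
```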

In Fig 1, we illustrate integration of data producers, consumers, and repositories via a cloud-based platform for supporting the FAIR principles.

[Fig 1: https://doi.org/10.1371/journal.pcbi.1006144.g001]

The core of a cloud-based platform should support the notion of a commons—a shared ecosystem maximizing access and reuse of biomedical research data and methods.

A cloud-based commons ecosystem can collocate computing capacity, storage resources, and databases with informatics tools and applications for analyzing and sharing data across the research community. For multiple commons to interoperate with each other, there are 6 essential requirements: permanent digital IDs, permanent metadata, APIs, data portability, data peering, and pay-for-compute [ 50 ].

Other features of the ecosystem include indexing and search capabilities similar to DataMed [ 51 ] and a metalearning framework for ranking and selection of the best predictive algorithms [ 52 ]. Many of the bioinformatics software tools that we have discussed in the previous section have been successfully deployed in cloud environments and can be adapted to the commons ecosystem, including Apache Spark, a successor to Apache Hadoop and MapReduce for data analysis of Next Generation Sequencing Data [ 53 ]. In addition, the data transfer and sharing component of the cloud-based commons ecosystem can include features discussed for the Globus Research Data Management Platform [ 54 ]. We also envision cloud-based commons to be supported by techniques and methods that use a semantic data–driven discovery platform designed to continuously grow knowledge from a multitude of genomic, molecular, and clinical data [ 55 ].

Security is an integral part of a cloud commons architecture, along with data policy, governance, and a business case for sustaining a biomedical digital ecosystem. For initial security controls assessment, guidance documents such as the Federal Information Security Management Act (FISMA), NIST SP 800-53, and the Federal Information Processing Standards (FIPS) can provide tools for an organizational assessment of risk and for validation purposes [ 56 , 57 , 58 ]. Security in public cloud services is a shared responsibility, with the cloud provider supplying security services and the end user remaining responsible for the data and software that leverage those services. A wide range of issues involving ethical, legal, policy, and technical boundaries influence data privacy, all of which depend on the type of data being processed and supported [ 59 ].

A regular training program for cloud data users, especially those handling sensitive data (e.g., personally identifiable information), is important. The training should include methods for securing data moved to the cloud and for controlling access to cloud resources, including the virtual machines, containers, and cloud services involved in data lifecycle management. Protecting access keys, using multifactor authentication, creating identity and access management user lists with controlled permissions, and following the principle of least privilege (granting users only the actions they need) are some of the recommended practices that can minimize security vulnerabilities arising from inexperienced cloud users and/or from malicious external entities [ 54 ].
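
As a concrete sketch of least privilege, the snippet below creates an IAM policy granting read-only access to a single hypothetical S3 bucket instead of blanket storage permissions; the bucket and policy names are placeholders:

```python
import json
import boto3

iam = boto3.client("iam")

# Least-privilege policy: only read and list one project bucket, nothing else.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-genomics-data",      # placeholder bucket
                "arn:aws:s3:::example-genomics-data/*",
            ],
        }
    ],
}

# The resulting policy can then be attached to specific users or roles.
iam.create_policy(
    PolicyName="GenomicsDataReadOnly",
    PolicyDocument=json.dumps(policy_document),
)
```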

Assessing risk is key to reliably determining the required level of protection needed for data in the cloud. A structured questionnaire approach developed as a Cloud Service Evaluation Model (CSEM) can be used to ascertain risks prior to migration of data to the cloud [ 60 ]. Based on the results of risk assessment, a suitable cloud deployment model can be chosen to ensure compliance with internal policies, legal, and regulatory requirements, which, externally, differ in different parts of the world, potentially impacting the ubiquitous nature of cloud resources.

Striving towards open biomedical data has motivated an interest in improving data access while maintaining security and privacy. For example, a community-wide open competition for developing novel genomic data protection methods has shown the feasibility of secure data outsourcing and collaboration for cloud-based genomic data analysis [ 61 ]. The findings from the work demonstrate that cryptographic techniques can support public cloud-based comparative analysis of human genomes. Recent work has shown that by using a hybrid cloud deployment model, 50%–70% of the read mapping task can be carried out accurately and efficiently in a public cloud [ 62 ].

In summary, a cloud-based ecosystem requires interoperability between clouds and the development of tools that can operate in multiple cloud environments and address the challenges of data protection, privacy, and legal constraints imposed by different countries (see [ 63 ] for a discussion as it relates to genomic data).

Cloud advantages and disadvantages for biomedical projects

Cloud costs vary among biomedical projects and among vendors, so defining technical requirements for provisioning resources (e.g., amount of memory, disk storage, and CPU use) is an important first step in estimating costs. Remember that the intent of commercial public cloud providers is to have you continue to use their cloud environment. For example, data may be free to upload but expensive to download, making adoption of a commons approach in the cloud even more important for hosting large-scale biological data sets. This approach can meet the needs of data producers, consumers, and stewards (Fig 1) to improve access and minimize the need for downloading data sets to local institutions. To test this approach, NIH has initiated a data commons pilot [ 64 ] by supporting the hosting of 3 important data sets in the cloud, namely, the Trans-Omics for Precision Medicine initiative (TOPMed), the Genotype-Tissue Expression project (GTEx), and the Alliance of Genome Resources, a consortium of Model Organism Databases (MODs).

Many cloud providers make available calculators for estimating approximate usage costs for their respective cloud services [ 65 ]. Without any point of reference to start with, estimating costs may be challenging. Commercial public cloud providers generally offer free credit with new accounts, which may be sufficient to kickstart the planning and evaluation process. Cloud service charges are based on exact usage in small time increments, whereas on-site compute costs are typically amortized over 3- to 5-year periods for systems that can be used for multiple projects. Though cost comparisons between local infrastructure and cloud approaches are frequently sought, in practice such comparisons are often difficult to perform effectively because good data on actual local costs are lacking. Moreover, funding models for cloud computing differ among institutions receiving the funds and the funders themselves. For example, use of cloud resources may be subject to institutional overhead, whereas on-site hardware may not. This is not the best use of taxpayer money, and funding agencies should review their policies with respect to cloud usage by institutions charging overhead. Given the growing competitiveness in the cloud market, cloud resources may be negotiable or available under special agreements for qualifying research and education projects [ 66 – 68 ].
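
As a rough illustration of why download (egress) charges can dominate for widely shared data sets, consider the back-of-the-envelope sketch below. The per-GB prices are assumptions for illustration only, not current vendor rates; real estimates should come from the providers' own calculators:

```python
# Back-of-the-envelope cost sketch for hosting a data set in a public cloud.
STORAGE_PER_GB_MONTH = 0.023   # assumed object-storage price, USD
EGRESS_PER_GB = 0.09           # assumed download (egress) price, USD

def monthly_cost(stored_gb, downloaded_gb):
    """Storage accrues every month; egress is charged per GB downloaded."""
    return stored_gb * STORAGE_PER_GB_MONTH + downloaded_gb * EGRESS_PER_GB

# A 50 TB data set that 10 collaborators each download once costs far more
# in egress than in storage, which is the argument for analyzing in place.
print(monthly_cost(stored_gb=50_000, downloaded_gb=500_000))  # 46150.0 USD
```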

Biomedical researchers, in collaboration with IT professionals, will need to determine the best way to leverage cloud resources for their individual projects [ 69 ]. Estimating computing costs for on-premises infrastructure requires determining the total cost of ownership (TCO). Both direct and indirect costs contribute to TCO. Direct costs include hardware purchases, network services, the data center, electricity, software licenses, and salaries. Indirect costs typically include technical support services, data management, and training. Indirect institutional costs vary significantly depending on the complexity of the project. Productivity is also a consideration when assessing costs. For example, a whole genome pipeline in a cloud environment, once prototyped, can be scaled up to process entire genomes with minimal subsequent human cost [ 70 ].

Using idle computing nodes in the cloud that are preemptible is one way to reduce computing cost, at the risk of increasing time to compute. For example, a recent report that used the NCI cloud pilot ISB-CGC to quantify transcript-expression levels for over 12,000 RNA-sequencing samples on 2 different cloud-based configurations, cluster-based versus preemptible, showed that the per-sample cost of the preemptible configuration was less than half that of the cluster-based method [ 71 ].
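
The trade-off can be sketched as an expected-cost calculation: preempted work must be redone, inflating effective runtime, yet the discounted rate can still win. The prices and restart-overhead factor below are illustrative assumptions, not values from the cited study:

```python
# Toy expected-cost comparison between on-demand and preemptible nodes.
ON_DEMAND_PER_HOUR = 0.40      # assumed on-demand node price, USD
PREEMPTIBLE_PER_HOUR = 0.10    # assumed preemptible node price, USD

def job_cost(hours, rate, restart_overhead=0.0):
    """Preemptions force reruns, so effective runtime grows by the overhead."""
    return hours * (1.0 + restart_overhead) * rate

print(job_cost(1000, ON_DEMAND_PER_HOUR))                          # 400.0
print(job_cost(1000, PREEMPTIBLE_PER_HOUR, restart_overhead=0.3))  # 130.0
```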

Other approaches have used linear programming methods to optimally bid on and deploy a combination of underutilized computing resources in genomics to minimize the cost of data analysis [ 72 ].

Cloud environments are pay-as-you-go, whereas research funding for computation is typically given at the beginning of an award and estimated on an annual basis. This can lead to a mismatch between the need for compute and the resources to meet that need. The NIH undertook a cloud credits pilot to assess an alternative funding model for cloud resources; details will be fully described in [ 73 ]. Credits were awarded when needed, as opposed to up front, thereby matching usage patterns. A simple application and review mechanism available to a funded investigator means credits can typically be awarded in weeks or less. The investigator can choose with which cloud provider to spend the credits, thereby driving competition into the marketplace and presumably increasing the amount of compute that can be performed on research monies.

Cloud credits have focused on incentivizing cloud usage; however, a challenge that remains to be addressed is longer-term data sustainability in cloud environments. The cost of data management and storage for retaining all the data produced during a research program can become prohibitive as collections grow. One way to proactively tackle this issue is to engage data producers, consumers, and curators from the beginning of the research data lifecycle in developing value-based models for data retention, which can be implemented via cloud storage. Based on usage patterns, a policy-driven migration of data to the least expensive cloud storage pools can be adopted. Our perspective is that long-term retention of biomedical data is an excellent venue for public and private institutions to partner, exploring co-ownership models for managing cost and policy so that research data remain accessible over time.

Summary and conclusions

Cloud usage, from large-scale genomics analysis to remote monitoring of patients to molecular diagnostics work in clinical laboratories, has advantages but also potential drawbacks. A first step is the determination of what type of cloud environment best fits the application and then whether it represents a cost-effective solution. This introduction attempts to indicate what should be considered, what the options are, and what applications are already in use that may serve as references in making the best determination on how to proceed.

Cloud vendors provide multiple services for compute, storage, deployment of virtual machines, and access to various databases. Cloud vendors and third parties provide additional services to support users ranging from novices to experts. The ubiquitous nature of clouds raises questions regarding security and accessibility, particularly as they relate to geopolitical boundaries. The cost benefits of using clouds over other compute environments need to be carefully assessed as they relate to the size, complexity, and nature of the task. Clouds are termed elastic because they expand to embrace the compute needs of a task. For example, a simple, small prototype can be tested in a cloud environment and immediately scaled up to handle very large data. On the other hand, there is a cost associated with such usage, particularly in extricating the outcomes of the computation. Cloud vendors are seeking an all-in model: once you commit to using their services, you continue to do so or pay a significant penalty. This, combined with the pay-as-you-go model, has implications when mapped to the up-front funding models of typical grants. The idea of environments where multiple public cloud providers are used in a collective ecosystem is still mostly on the horizon. What is clear, however, is that clouds are a growing part of the biomedical computational ecosystem and are here to stay.

Acknowledgments

The authors acknowledge the constructive comments provided by Dr. Sean Davis, Center for Cancer Research, National Cancer Institute; Ms. Andrea Norris, Director, Center for Information Technology; and Dr. Vivien Bonazzi, Office of Director, National Institutes of Health.

The opinions expressed in the paper are those of the authors and do not necessarily reflect the opinions of the National Institutes of Health.

  • 1. FedRAMP.gov . In: FedRAMP.gov [Internet]. [cited 18 Sep 2017]. Available from: https://www.fedramp.gov/
  • 2. Indiana University Pervasive Technology Institute. Jetstream: A National Science and Engineering Cloud [Internet]. [cited 19 Sep 2017]. Available from: https://jetstream-cloud.org/
  • 3. Mell P, Grance T. The NIST definition of cloud computing [Internet]. National Institute of Standards and Technology; 2011 [cited 18 Sep 2017]. Report No.: Special Publication 800-145. Available from: http://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-145.pdf
  • 4. Palian J. Cloud Computing Adoption Across Industries. In: Expedient [Internet]. 19 Mar 2013 [cited 18 Sep 2017]. Available from: https://www.expedient.com/blog/how-cloud-computing-adoption-varies-across-industries/
  • 5. Amazon Web Services. AWS Public Datasets. In: AWS Public Datasets [Internet]. [cited 18 Sep 2017]. Available from: https://aws.amazon.com/datasets/
  • 6. Amazon Web Services. Human Microbiome Project on Amazon Web Services. In: Amazon Web Services Public Datasets [Internet]. [cited 18 Sep 2017]. Available from: https://aws.amazon.com/datasets/human-microbiome-project/
  • 7. National Institutes of Health. Use of Cloud Computing Services for Storage and Analysis of Controlled-Access Data Subject to the NIH Genomic Data Sharing Policy [Internet]. [cited 18 Sep 2017]. Available from: https://gds.nih.gov/pdf/NIH_Position_Statement_on_Cloud_Computing.pdf
  • 8. National Institutes of Health. National Institutes of Health Plan for Increasing Access to Scientific Publications and Digital Scientific Data from NIH Funded Scientific Research [Internet]. [cited 18 Sep 2017]. NIH; 2015 Feb. Available from: https://grants.nih.gov/grants/NIH-Public-Access-Plan.pdf
  • 11. NCBI. Cloud BLAST. In: Cloud BLAST Documentation [Internet]. [cited 18 Sep 2017]. Available from: https://blast.ncbi.nlm.nih.gov/Blast.cgi?PAGE_TYPE=BlastDocs&DOC_TYPE=CloudBlast
  • 12. NCBI BLAST on Windows Azure. In: Microsoft Download Center [Internet]. [cited 18 Sep 2017]. Available from: https://www.microsoft.com/en-us/download/details.aspx?id=52513
  • 18. Taylor J. Galaxy on the Cloud. In: Coursera [Internet]. [cited 18 Sep 2017]. Available from: https://www.coursera.org/learn/galaxy-project/lecture/veQKq/galaxy-on-the-cloud
  • 21. Home—OpenStack Open Source Cloud Computing Software. In: OpenStack [Internet]. [cited 16 Oct 2017]. Available from: https://www.openstack.org/
  • 25. PanCancer Analysis Working Group. In: ICGC Data Portal [Internet]. [cited 19 Sep 2017]. Available from: https://dcc.icgc.org/pcawg
  • 26. International Cancer Genome Consortium. Cancer Collaboratory. In: Cancer Collaboratory [Internet]. [cited 19 Sep 2017]. Available from: https://dcc.icgc.org/icgc-in-the-cloud/collaboratory
  • 28. National Cancer Institute. National Cancer Institute Cancer Genomics Cloud Pilots [Internet]. [cited 18 Sep 2017]. Available from: https://cbiit.cancer.gov/sites/nci-cbiit/files/Cloud_Pilot_Handout_508compliant.pdf
  • 30. Broad Institute. FIRECLOUD. In: FireCloud [Internet]. [cited 18 Sep 2017]. Available from: https://software.broadinstitute.org/firecloud/
  • 31. Broad Institute. Workflow Description Language [Internet]. [cited 19 Sep 2017]. Available from: https://software.broadinstitute.org/wdl/
  • 32. Institute for Systems Biology. Institute for Systems Biology: Cancer Genomics Cloud [Internet]. [cited 19 Sep 2017]. Available from: http://cgc.systemsbiology.net/
  • 33. SevenBridges Genomics. Cancer Genomics Cloud. In: Cancer Genomics Cloud [Internet]. [cited 19 Sep 2017]. Available from: http://www.cancergenomicscloud.org/
  • 34. Amstutz P, Crusoe MR, Tijanić N, Chapman B, Chilton J, Heuer M, et al. Common Workflow Language, v1.0. figshare. 2016; https://doi.org/10.6084/m9.figshare.3115156.v2
  • 37. DNAnexus Platform Offers Google-Developed DeepVariant | GEN. In: GEN [Internet]. 13 Dec 2017 [cited 26 Jan 2018]. Available from: https://www.genengnews.com/gen-news-highlights/dnanexus-platform-offers-google-developed-deepvariant/81255267
  • 38. DNAstack—Genomics made simple [Internet]. [cited 28 Jan 2018]. Available from: https://dnastack.com/#/team/mission
  • 45. Google. An Inside Look at Google BigQuery [Internet]. [cited 19 Sep 2017]. Google; 2012. Available from: https://cloud.google.com/files/BigQueryTechnicalWP.pdf
  • 46. AWS. Enterprise Data Warehousing on Amazon Web Services [Internet]. [cited 19 Sep 2017]. 2016. Available from: https://d0.awsstatic.com/whitepapers/enterprise-data-warehousing-on-aws.pdf
  • 53. Szczerba M, Wiewiórka MS, Okoniewski MJ, Rybiński H. Scalable Cloud-Based Data Analysis Software Systems for Big Data from Next Generation Sequencing. Big Data Analysis: New Algorithms for a New Society. Springer, Cham; 2016. pp. 263–283.
  • 54. Foster I, Gannon DB. Cloud Computing for Science and Engineering. MIT Press; 2017.
  • 55. Data4Cure I. Data4Cure :: Biomedical Intelligence [Internet]. [cited 2 Feb 2018]. Available from: https://www.data4cure.com/solutions.html
  • 56. O’Reilly PD. Federal Information Security Management Act (FISMA) Implementation Project [Internet]. 2009 [cited 7 Feb 2018]. Available from: https://www.nist.gov/programs-projects/federal-information-security-management-act-fisma-implementation-project
  • 57. Ross RS. Security and Privacy Controls for Federal Information Systems and Organizations [includes updates as of 5/7/13] [Internet]. 2013 [cited 7 Feb 2018]. Available from: https://www.nist.gov/publications/security-and-privacy-controls-federal-information-systems-and-organizations-includes
  • 58. National Institute of Standards and Technology. FIPS 200, Minimum Security Requirements for Federal Information and Information Systems | CSRC [Internet]. [cited 7 Feb 2018]. Available from: http://csrc.nist.gov/publications/fips/fips200/
  • 64. Data Science at NIH [Internet]. [cited 1 Feb 2018]. Available from: https://datascience.nih.gov/DataCommonsPilotPhaseAwards
  • 65. Google Cloud Platform Pricing Calculator -. In: Google Cloud Platform [Internet]. [cited 22 Sep 2017]. Available from: https://cloud.google.com/products/calculator/
  • 66. Amazon Web Services. AWS Programs for Research and Education. In: AWS Programs for Research and Education [Internet]. [cited 22 Sep 2017]. Available from: https://aws.amazon.com/grants/
  • 67. Google. Education Grants—Free Credits for University Computer Science Classes | Google Cloud Platform. In: Google Cloud Platform [Internet]. [cited 22 Sep 2017]. Available from: https://cloud.google.com/edu/
  • 68. Microsoft. Microsoft Azure for Research—Microsoft Research. In: Microsoft Research [Internet]. [cited 22 Sep 2017]. Available from: https://www.microsoft.com/en-us/research/academic-program/microsoft-azure-for-research/

What is Cloud Computing?

Provisioning traditional computing resources can be time-consuming and costly. Enterprises need to buy physical servers and other infrastructure through procurement processes that can take months, and then support the resulting architecture themselves. The acquired systems require physical space, typically a specialized room with sufficient power and cooling. After configuring and deploying the systems, enterprises need expert personnel to manage them.

This long process is difficult to scale when demand spikes or the business expands. Enterprises can also end up acquiring more computing resources than they need, resulting in low utilization.

Cloud computing addresses these issues by offering computing resources as scalable, on-demand services. Learn more about Google Cloud, a suite of cloud computing service models offered by Google.


Cloud computing defined

Cloud computing is the on-demand availability of computing resources (such as storage and infrastructure) as services over the internet. It eliminates the need for individuals and businesses to buy and manage physical resources themselves, and it lets them pay only for what they use.

The main cloud computing service models are infrastructure as a service, which offers compute and storage services; platform as a service, which offers a develop-and-deploy environment for building cloud apps; and software as a service, which delivers applications as services.

Understanding how cloud computing works

Cloud computing service models are based on the concept of sharing on-demand computing resources, software, and information over the internet. Companies or individuals pay to access a virtual pool of shared resources, including compute, storage, and networking services, which are located on remote servers that are owned and managed by service providers. 

One of the many advantages of cloud computing is that you only pay for what you use. This allows organizations to scale faster and more efficiently without the burden of having to buy and maintain their own physical data centers and servers.  

In simpler terms, cloud computing uses a network (most often, the internet) to connect users to a cloud platform where they request and access rented computing services. A central server handles all the communication between client devices and servers to facilitate the exchange of data. Security and privacy features are common components that keep this information safe.

When adopting cloud computing architecture, there is no one-size-fits-all. What works for another company may not suit you and your business needs. In fact, this flexibility and versatility is one of the hallmarks of cloud, allowing enterprises to quickly adapt to changing markets or metrics.

There are three different cloud computing deployment models: public cloud, private cloud, and hybrid cloud.

Types of cloud computing deployment models

Public cloud

Public clouds are run by third-party cloud service providers. They offer compute, storage, and network resources over the internet, enabling companies to access shared on-demand resources based on their unique requirements and business goals.

Private cloud

Private clouds are built, managed, and owned by a single organization and privately hosted in their own data centers, commonly known as “on-premises” or “on-prem.” They provide greater control, security, and management of data while still enabling internal users to benefit from a shared pool of compute, storage, and network resources.

Hybrid cloud

Hybrid clouds combine public and private cloud models, allowing companies to leverage public cloud services and maintain the security and compliance capabilities commonly found in private cloud architectures.

What are the types of cloud computing services?

There are three main types of cloud computing service models that you can select based on the level of control, flexibility, and management your business needs: 

Infrastructure as a service (IaaS)

Infrastructure as a service (IaaS) offers on-demand access to IT infrastructure services, including compute, storage, networking, and virtualization. It provides the highest level of control over your IT resources and most closely resembles traditional on-premises IT resources.
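
To make "on-demand access to IT infrastructure" concrete, the sketch below provisions a small virtual machine programmatically. It assumes the google-cloud-compute Python client library and ambient credentials; the project ID, zone, and instance name are placeholders, and this is one provider's IaaS API among several:

```python
from google.cloud import compute_v1

def create_vm(project_id: str, zone: str, name: str) -> None:
    """Provision a small VM with a boot disk on the default network."""
    instance = compute_v1.Instance()
    instance.name = name
    instance.machine_type = f"zones/{zone}/machineTypes/e2-small"

    # Boot disk created from a public OS image, deleted with the VM.
    disk = compute_v1.AttachedDisk()
    disk.boot = True
    disk.auto_delete = True
    disk.initialize_params = compute_v1.AttachedDiskInitializeParams(
        source_image="projects/debian-cloud/global/images/family/debian-12"
    )
    instance.disks = [disk]

    # Attach the instance to the project's default network.
    nic = compute_v1.NetworkInterface()
    nic.network = "global/networks/default"
    instance.network_interfaces = [nic]

    operation = compute_v1.InstancesClient().insert(
        project=project_id, zone=zone, instance_resource=instance
    )
    operation.result()  # block until provisioning completes

create_vm("my-project", "us-central1-a", "demo-vm")  # placeholder values
```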

Platform as a service (PaaS)

Platform as a service (PaaS) offers all the hardware and software resources needed for cloud application development. With PaaS, companies can focus fully on application development without the burden of managing and maintaining the underlying infrastructure.

Software as a service (SaaS)

Software as a service (SaaS) delivers a full application stack as a service, from underlying infrastructure to maintenance and updates to the app software itself. A SaaS solution is often an end-user application, where both the service and the infrastructure are managed and maintained by the cloud service provider.

What are the benefits of cloud computing?

It’s flexible

Due to the architecture of cloud computing, enterprises and their users can access cloud services from anywhere with an internet connection, scaling services up or down as needed.

It’s efficient

Enterprises can develop new applications and rapidly get them into production—without worrying about the underlying infrastructure.

It offers strategic value

Because cloud providers stay on top of the latest innovations and offer them as services to customers, enterprises can get more competitive advantages—and a higher return on investment—than if they’d invested in soon-to-be obsolete technologies.

It’s secure

Enterprises often ask: what are the security risks of cloud computing? They are considered relatively low. Cloud computing security is generally recognized as stronger than that of enterprise data centers because of the depth and breadth of the security mechanisms that cloud providers put in place. Plus, cloud providers’ security teams are known as top experts in the field.

It’s cost-effective

Whatever cloud computing service model is used, enterprises only pay for the computing resources they use. They don’t need to overbuild data center capacity to handle unexpected spikes in demand or business growth, and they can deploy IT staff to work on more strategic initiatives.


How cloud computing can help your organization

The pace of innovation—and the need for advanced computing to accelerate this growth—makes cloud computing a viable option to advance research and speed up new product development. Cloud computing can give enterprises access to scalable resources and the latest technologies without needing to worry about capital expenditures or limited fixed infrastructure. What is the future of cloud computing? It’s expected to become the dominant enterprise IT environment.

If your organization experiences any of the following, you’re probably a good candidate for cloud computing:

  • High business growth that outpaces infrastructure capabilities
  • Low utilization of existing infrastructure resources
  • Large volumes of data that are overwhelming your on-premises data storage resources
  • Slow response times with on-premises infrastructure
  • Delayed product development cycles due to infrastructure constraints
  • Cash flow challenges due to high computing infrastructure expenses
  • Highly mobile or distributed user population

These scenarios require more than traditional data centers can provide.

Infrastructure scaling

Many organizations, including those in retail, have wildly varying needs for compute capacity. Cloud computing easily accommodates these fluctuations.  

Disaster recovery

Rather than building more data centers to ensure continuity during disasters, businesses use cloud computing to safely back up their digital assets.

Data storage

Cloud computing helps overloaded data centers by storing large volumes of data, making it more accessible, easing analysis, and making backup easier.

Application development

Cloud computing offers enterprise developers quick access to tools and platforms for building and testing applications, speeding up time to market.

Big data analytics

Cloud computing offers almost unlimited resources to process large volumes of data to speed research and reduce time to insights.

Related products and services

Google Cloud is a suite of cloud computing services that runs on the same infrastructure that Google uses internally for its own consumer products, such as Google Search, Gmail, and YouTube.

The list of available Google Cloud services is long—and it keeps growing. When developing applications or running workloads on Google Cloud, enterprises can mix and match these services into combinations that provide the infrastructure they need.



What is cloud computing? Everything you need to know about the cloud explained

Steve Ranger

What is cloud computing, in simple terms?

Cloud computing is the delivery of computing services—including servers, storage, databases, networking, software, analytics, and intelligence—over the Internet ("the cloud") to offer faster innovation, flexible resources, and economies of scale.

How does cloud computing work?

Rather than owning their own computing infrastructure or data centres, companies can rent access to anything from applications to storage from a cloud service provider.

One benefit of using cloud-computing services is that firms can avoid the upfront cost and complexity of owning and maintaining their own IT infrastructure, and instead simply pay for what they use, when they use it.

In turn, providers of cloud-computing services can benefit from significant economies of scale by delivering the same services to a wide range of customers.

What cloud-computing services are available?

Cloud-computing services cover a vast range of options now, from the basics of storage, networking and processing power, through to natural language processing and artificial intelligence as well as standard office applications. Pretty much any service that doesn't require you to be physically close to the computer hardware that you are using can now be delivered via the cloud – even quantum computing.

What are examples of cloud computing?

Cloud computing underpins a vast number of services. That includes consumer services like Gmail or the cloud backup of the photos on your smartphone, through to the services that allow large enterprises to host all their data and run all of their applications in the cloud. For example, Netflix relies on cloud-computing services to run its video-streaming service and its other business systems, too.

Cloud computing is becoming the default option for many apps: software vendors are increasingly offering their applications as services over the internet rather than standalone products as they try to switch to a subscription model. However, there are potential downsides to cloud computing, in that it can also introduce new costs and new risks for companies using it.

Why is it called cloud computing?

A fundamental concept behind cloud computing is that the location of the service, and many of the details such as the hardware or operating system on which it is running, are largely irrelevant to the user. It's with this in mind that the metaphor of the cloud was borrowed from old telecoms network schematics, in which the public telephone network (and later the internet) was often represented as a cloud to denote that the location didn't matter – it was just a cloud of stuff. This is an over-simplification of course; for many customers, location of their services and data remains a key issue.

What is the history of cloud computing?

Cloud computing as a term has been around since the early 2000s, but the concept of computing as a service has been around for much, much longer – as far back as the 1960s, when computer bureaus would allow companies to rent time on a mainframe, rather than have to buy one themselves.

These 'time-sharing' services were largely overtaken by the rise of the PC, which made owning a computer much more affordable, and then in turn by the rise of corporate data centres where companies would store vast amounts of data.

But the concept of renting access to computing power has resurfaced again and again – in the application service providers, utility computing, and grid computing of the late 1990s and early 2000s. This was followed by cloud computing, which really took hold with the emergence of software as a service and hyperscale cloud-computing providers such as Amazon Web Services.

How important is the cloud?

Building the infrastructure to support cloud computing now accounts for a significant chunk of all IT spending, while spending on traditional, in-house IT slides as computing workloads continue to move to the cloud, whether that is public cloud services offered by vendors or private clouds built by enterprises themselves.

Indeed, it's increasingly clear that when it comes to enterprise computing platforms,  like it or not, the cloud has won .

Tech analyst Gartner predicts that as much as half of spending across application software, infrastructure software, business process services and system infrastructure markets will have shifted to the cloud by 2025, up from 41% in 2022. It estimates that almost two-thirds of spending on application software will be via cloud computing, up from 57.7% in 2022.


That's a shift that only gained momentum in 2020 and 2021 as businesses accelerated their digital transformation plans during the pandemic. The lockdowns throughout the pandemic showed companies how important it was to be able to access their computing infrastructure, applications and data from wherever their staff were working – and not just from an office.

Gartner said that demand for integration capabilities, agile work processes and composable architecture will drive the continued shift to the cloud.

The scale of cloud spending continues to rise. For the full year 2021, tech analyst IDC expects cloud infrastructure spending to have grown 8.3% compared to 2020 to $71.8 billion, while non-cloud infrastructure is expected to grow just 1.9% to $58.4 billion. Long term, the analyst expects spending on compute and storage cloud infrastructure to see a compound annual growth rate of 12.4% over the 2020-2025 period, reaching $118.8 billion in 2025, and it will account for 67.0% of total compute and storage infrastructure spend. Spending on non-cloud infrastructure will be relatively flat in comparison and reach $58.6 billion in 2025.

All predictions around cloud-computing spending are pointing in the same direction, even if the details are slightly different. The momentum they are describing is the same: tech analyst Canalys reports that worldwide cloud infrastructure services expenditure topped $50 billion in a quarter for the first time in Q4 2021. For the full year, it put cloud infrastructure services spending at $191.7 billion, up 35%.

Canalys argues that there is already a new growth opportunity for cloud on the horizon, in the form of augmented and virtual reality and the metaverse. "This will be a significant driver for both cloud services spend and infrastructure deployment over the next decade. In many ways, the metaverse will resemble the internet today, with enhanced capabilities and an amplified compute consumption rate," the analyst said .

What are the core elements of cloud computing?

Cloud computing can be broken down into a number of different constituent elements, focusing on different parts of the technology stack and different use cases. Let's take a look at some of the best known in a bit more detail.

What is Infrastructure as a Service?

Infrastructure as a Service (IaaS) refers to the fundamental building blocks of computing that can be rented: physical or virtual servers, storage and networking. This is attractive to companies that want to build applications from the very ground up and want to control nearly all the elements themselves, but it does require firms to have the technical skills to be able to orchestrate services at that level. 

What is Platform as a Service?

Platform as a Service (PaaS) is the next layer up – as well as the underlying storage, networking, and virtual servers, this layer also includes the tools and software that developers need to build applications on top, which could include middleware, database management, operating systems, and development tools.

What is Software as a Service?

Software as a Service (SaaS) is the delivery of applications as a service, probably the version of cloud computing that most people are used to on a day-to-day basis. The underlying hardware and operating system are irrelevant to the end user, who will access the service via a web browser or app; it is often bought on a per-seat or per-user basis.

SaaS is the largest chunk of cloud spending simply because the variety of applications delivered via SaaS is huge, from CRM such as Salesforce, through to Microsoft's Office 365. And while the whole market is growing at a furious rate, it's the IaaS and PaaS segments that have consistently grown at much faster rates, according to analyst IDC: "This highlights the increasing reliance of enterprises on a cloud foundation built on cloud infrastructure, software-defined data, compute and governance solutions as a Service, and cloud-native platforms for application deployment for enterprise IT internal applications." IDC predicts that IaaS and PaaS will continue growing at a higher rate than the overall cloud market "as resilience, flexibility, and agility guide IT platform decisions".

What is multi-cloud computing?

While the big cloud vendors would be very happy to provide all the computing needs of their enterprise customers, increasingly businesses are looking to spread the load across a number of suppliers. All of this has led to the rise of multi-cloud. Part of this approach is to avoid being locked in to just one vendor (which can lead to the sort of high costs and inflexibility that the cloud is often claimed to avoid), and part of it is to find the best mix of technologies across the industry.

That means being able to connect and integrate cloud services from multiple vendors is going to be a new and increasing challenge for business. Problems here include skills shortages (a lack of workers with expertise across multiple clouds) and workflow differences between cloud environments. Customers will also want to manage all their different cloud infrastructure from one place, make it easy to build applications and services and then move them, and ensure that security tools can work across multiple clouds – none of which is especially easy right now.

What are the benefits of cloud computing?

The exact benefits will vary according to the type of cloud service being used but, fundamentally, using cloud services means companies not having to buy or maintain their own computing infrastructure.

No more buying servers, updating applications or operating systems, or decommissioning and disposing of hardware or software when it is out of date, as it is all taken care of by the supplier. For commodity applications, such as email, it can make sense to switch to a cloud provider, rather than rely on in-house skills. A company that specializes in running and securing these services is likely to have better skills and more experienced staff than a small business could afford to hire, so cloud services may be able to deliver a more secure and efficient service to end users.

Using cloud services means companies can move faster on projects and test out concepts without lengthy procurement and big upfront costs, because firms only pay for the resources they consume. This concept of business agility is often mentioned by cloud advocates as a key benefit. The ability to spin up new services without the time and effort associated with traditional IT procurement should mean that it is easier to get going with new applications faster. And if a new application turns out to be wildly popular, the elastic nature of the cloud means it is easier to scale it up fast.

For a company with an application that has big peaks in usage, such as one that is only used at a particular time of the week or year, it might make financial sense to have it hosted in the cloud, rather than have dedicated hardware and software lying idle for much of the time. Moving to a cloud-hosted application for services like email or CRM could remove a burden on internal IT staff, and if such applications don't generate much competitive advantage, there will be little other impact. Moving to a services model also moves spending from capital expenditure (capex) to operational expenditure (opex), which may be useful for some companies.

  • Business continuity is the ultimate killer application for cloud
  • It's official: Supercomputing is now ho-hum (thanks, cloud)

What are the advantages and disadvantages of cloud computing?

Cloud computing is not necessarily cheaper than other forms of computing, just as renting is not always cheaper than buying in the long term. If an application has a regular and predictable requirement for computing services it may be more economical to provide that service in-house.

Some companies may be reluctant to host sensitive data in a service that is also used by rivals. Moving to a SaaS application may also mean you are using the same applications as a rival, which might make it hard to create any competitive advantage if that application is core to your business.

While it may be easy to start using a new cloud application, migrating existing data or apps to the cloud might be much more complicated and expensive. And it seems there is now something of a shortage in cloud skills, with staff with DevOps and multi-cloud monitoring and management knowledge in particularly short supply.

In one report, a significant proportion of experienced cloud users said they thought upfront migration costs ultimately outweigh the long-term savings created by IaaS.

And of course, you can only access your applications if you have an internet connection.

What is cloud-computing adoption doing to IT budgets?

Cloud computing tends to shift spending from capex to opex, as companies buy computing as a service rather than in the form of physical servers. This may allow companies to avoid large increases in IT spending which would traditionally be seen with new projects; using the cloud to make room in the budget might be easier than going to the CFO and looking for more money.

Of course, this doesn't mean that cloud computing is always or necessarily cheaper than keeping applications in-house; for applications with a predictable and stable demand for computing power, it might be cheaper (from a processing power point of view at least) to keep them in-house.

  • Cloud computing spending is growing even faster than expected  

How do you build a business case for cloud computing?

To build a business case for moving systems to the cloud, you first need to understand what your existing infrastructure actually costs. There's a lot to factor in: obvious things like the cost of running data centres, extras such as leased lines, and the cost of physical hardware – servers, with specifications like CPUs, cores and RAM – plus the cost of storage. You'll also need to calculate the cost of applications, whether you plan to dump them, re-host them in the cloud unchanged, completely rebuild them for the cloud, or buy an entirely new SaaS package. Each of these options will have different cost implications. The cloud business case also needs to include people costs (often second only to the infrastructure costs) and more nebulous concepts like the benefit of being able to provide new services faster. Any cloud business case should also factor in the potential downsides, including the risk of being locked into one vendor for your tech infrastructure (see multi-cloud, above).
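As a rough illustration of that calculation, the sketch below compares three-year totals under entirely invented figures; every number here is an assumption to be replaced with the results of your own infrastructure audit.

```python
# Illustrative back-of-the-envelope comparison of on-premises vs cloud costs.
# Every figure below is a made-up assumption; substitute your own numbers.

YEARS = 3

# On-premises: capex amortised over the period, plus running costs.
server_capex = 120_000          # hardware purchase
datacentre_per_year = 30_000    # power, cooling, space, leased lines
staff_per_year = 60_000         # share of ops staff time

on_prem_total = server_capex + YEARS * (datacentre_per_year + staff_per_year)

# Cloud: pure opex, paying only for consumed resources, plus migration.
cloud_per_month = 7_500         # assumed monthly bill for equivalent capacity
migration_one_off = 40_000      # rewriting or re-hosting applications

cloud_total = migration_one_off + YEARS * 12 * cloud_per_month

print(f"On-premises over {YEARS} years: {on_prem_total:,}")
print(f"Cloud over {YEARS} years:       {cloud_total:,}")
```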

  • Cloud computing: How to build a business case
  • Cloud computing: What it's like to make the move

Cloud-computing adoption

Analysts argue that as the cloud now underpins most new technological disruptions in everything from mobile banking to healthcare, usage is only going to grow. It's hard to see many new technology projects being delivered that don't harness the cloud in some way. Gartner says that more than 85% of organizations will embrace a cloud-first principle by 2025 and will not be able to fully execute on their digital strategies without it. The analyst says new workloads deployed in a cloud-native environment will be pervasive, not just popular, and anything non-cloud will be considered legacy. By 2025, Gartner estimates that over 95% of new digital workloads will be deployed on cloud-native platforms, up from 30% in 2021.

And if that sounds unrealistic, it may be that figures on adoption of cloud depend on who you talk to inside an organisation. Not all cloud spending will be driven centrally by the CIO: cloud services are relatively easy to sign up for, so business managers can start using them, and pay out of their own budget, without needing to inform the IT department. This can enable businesses to move faster, but also can create security risks if the use of apps is not managed.

Adoption will also vary by application: cloud-based email is much easier to adopt than a new finance system, for example. And for systems such as supply chain management, that are working efficiently as they are, there will be less short-term pressure to do a potentially costly and risky shift to the cloud.

What about cloud-computing security?

Many companies remain concerned about the security of cloud services, although breaches of security are rare. How secure you consider cloud computing to be will largely depend on how secure your existing systems are. In-house systems managed by a team with many other things to worry about are likely to be more leaky than systems monitored by a cloud provider's engineers dedicated to protecting that infrastructure.

However, concerns do remain about security, especially for companies moving their data between many cloud services, which has led to growth in cloud security tools, which monitor data moving to and from the cloud and between cloud platforms. These tools can identify fraudulent use of data in the cloud, unauthorised downloads, and malware. There is a financial and performance impact, however: these tools can reduce the return on investment of the cloud by 5% to 10%, and impact performance by 5% to 15%. The country of origin of cloud services is also worrying some organisations (see 'Is geography irrelevant when it comes to cloud computing?' below).

  • Cloud security and IoT are the new peanut butter and jelly
  • Azure confidential computing: Microsoft boosts security for cloud data
  • Three smart cloud services that can help keep your business more secure
  • Cloud computing security: This is where you'll be spending the money
  • Security as a Service? We want it, say IT leaders

What is public cloud?

Public cloud is the classic cloud-computing model, where users can access a large pool of computing power over the internet (whether that is IaaS, PaaS, or SaaS). One of the significant benefits here is the ability to rapidly scale a service. The cloud-computing suppliers have vast amounts of computing power, which they share out between a large number of customers – the 'multi-tenant' architecture. Their huge scale means they have enough spare capacity that they can easily cope if any particular customer needs more resources, which is why it is often used for less-sensitive applications that demand a varying amount of resources.

What is private cloud?

Private cloud allows organizations to benefit from some of the advantages of public cloud – but without the concerns about relinquishing control over data and services, because it is tucked away behind the corporate firewall. Companies can control exactly where their data is being held and can build the infrastructure in a way they want – largely for IaaS or PaaS projects – to give developers access to a pool of computing power that scales on-demand without putting security at risk. However, that additional security comes at a cost, as few companies will have the scale of AWS, Microsoft or Google, which means they will not be able to create the same economies of scale. Still, for companies that require additional security, private cloud might be a useful stepping stone, helping them to understand cloud services or rebuild internal applications for the cloud, before shifting them into the public cloud.

What is hybrid cloud?

Hybrid cloud is perhaps where everyone is in reality: a bit of this, a bit of that. Some data in the public cloud, some projects in private cloud, multiple vendors and different levels of cloud usage.

  • Infographic: Companies are turning to hybrid cloud to save money
  • What does 'hybrid cloud' mean? It depends on whom you ask
  • Managing the multi-cloud: It's complicated

What are the cloud-computing migration costs?

For startups that plan to run all their systems in the cloud, getting started is pretty simple. But for the majority of companies, it is not so simple: with existing applications and data, they need to work out which systems are best left running as they are, and which to start moving to cloud infrastructure. This is a potentially risky and expensive move, and migrating to the cloud could cost companies more if they underestimate the scale of such projects.

A survey of 500 businesses that were early cloud adopters found that the need to rewrite applications to optimise them for the cloud was one of the biggest costs, especially if the apps were complex or customised. A third of those surveyed cited high fees for passing data between systems as a challenge in moving their mission-critical applications. The skills required for migration are both difficult and expensive to find – and even when organisations could find the right people, they risked them being stolen away by cloud-computing vendors with deep pockets.

Beyond this, the majority also remained worried about the performance of critical apps, and one in three cited this as a reason for not moving some critical applications.

  • Cloud computing migration: More expensive and complicated than you thought
  • Technology migrations are more painful, and cloud isn't making them any easier
  • Where does the NAS fit in an increasingly cloud-centric world?

Is geography irrelevant when it comes to cloud computing?

Actually, it turns out that geography is where the cloud really does matter: geopolitics is forcing significant changes on cloud-computing users and vendors. Firstly, there is the issue of latency: if the application is coming from a data centre on the other side of the planet, or on the other side of a congested network, it might feel sluggish compared to a local connection.

Secondly, there is the issue of data sovereignty. Many companies, particularly in Europe, have to worry about where their data is being processed and stored. European companies are worried that, for example, if their customer data is being stored in data centres in the US (or owned by US companies), it could be accessed by US law enforcement. As a result, the big cloud vendors have been building out a regional data centre network so that organizations can keep their data in their own region.

Some have gone further, effectively detaching some of those datacenters from their main business to make it much harder for US authorities – and others – to demand access to the customer data stored there. The customer data in the data centres is under the control of an independent company, which acts as a "data trustee", and the US parents cannot access data at the sites without the permission of customers or the data trustee. Expect to see cloud vendors opening more data centres around the world to cater to customers with requirements to keep data in specific locations.

Cloud security is another issue; the UK government's cyber security agency has warned that government agencies need to consider the country of origin when it comes to adding cloud services into their supply chains. While it was warning about antivirus software in particular, the issue is the same for other types of services too.

What is a cloud-computing region? And what is a cloud-computing availability zone?

Cloud-computing services are operated from giant datacenters around the world. AWS divides this up by 'regions' and 'availability zones'. Each AWS region is a separate geographic area, like EU (London) or US West (Oregon), which AWS then further subdivides into what it calls availability zones (AZs). An AZ is composed of one or more datacenters that are far enough apart that in theory a single disaster won't take both offline, but close enough together for business continuity applications that require rapid failover. Each AZ has multiple internet connections and power connections to multiple grids: AWS has over 80 AZs.

Google uses a similar model, dividing its cloud-computing resources into regions that are then subdivided into zones, which include one or more datacenters from which customers can run their services. Google recommends customers deploy applications across multiple zones and regions to help protect against unexpected failures.

Microsoft Azure divides its resources slightly differently. It offers regions that it describes as a "set of datacentres deployed within a latency-defined perimeter and connected through a dedicated regional low-latency network". It also offers 'geographies', typically containing two or more regions, that can be used by customers with specific data-residency and compliance needs "to keep their data and apps close". It also offers availability zones made up of one or more data centres equipped with independent power, cooling and networking.
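Regions and zones are visible directly in the providers' APIs. As a small illustration, this hedged Python sketch uses the AWS boto3 SDK (assuming credentials are already configured) to list the regions available to an account and the availability zones within one region; the other clouds expose equivalent calls.

```python
# Minimal sketch: enumerating AWS regions and the availability zones visible
# in one region via boto3. Assumes AWS credentials are configured.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

# Regions available to this account.
regions = [r["RegionName"] for r in ec2.describe_regions()["Regions"]]
print(f"{len(regions)} regions, e.g. {regions[:3]}")

# Availability zones within the client's region (here us-west-2).
for zone in ec2.describe_availability_zones()["AvailabilityZones"]:
    print(zone["ZoneName"], zone["State"])
```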

Cloud computing and power usage

Those data centres are also sucking up a huge amount of power: for example, Microsoft struck a deal with GE to buy all of the output from its new 37-megawatt wind farm in Ireland for the next 15 years in order to power its cloud data centres. Ireland said it now expects data centres to account for 15% of total energy demand by 2026, up from less than 2% back in 2015.

  • Cloud computing: IBM overhauls access rules at Euro data centre
  • AWS just sold some of its cloud computing infrastructure in China

Which are the big cloud-computing companies?

When it comes to IaaS and PaaS, there are really only a few giant cloud providers. Leading the way is Amazon Web Services, followed by Microsoft Azure, Google, and IBM. According to data from Synergy Research, Amazon, Microsoft and Google continue to attract well over half of worldwide cloud spending, with Q3 market shares of 33%, 20% and 10% respectively. And with growth rates that are higher than the overall market's, their share of worldwide revenues continues to grow. However, that still leaves plenty of revenue for the chasing pack of companies – about $17 billion. "Clearly there are challenges with the big three companies lurking in the background, so the name of the game is not competing with them head on," said the analyst.

AWS, Azure and Google Cloud – what's the difference?

The big three cloud companies all have their own strengths. AWS is the most established player and was behind Amazon's ability to support huge seasonal swings in demand from consumers. Being first out to market with cloud services and pushing hard to gain market share has made it the market leader, and it continues to innovate. Microsoft's Azure has become an absolutely core part of Microsoft's strategy, and the company has the enterprise history and products to support businesses as they switch to the cloud. Google Cloud is the smallest of the big three players, but clearly has the might of the advertising-to-Android giant behind it.

Who are the other main cloud-computing players?

Beyond the big three there are others, such as Alibaba Cloud, IBM, Dell and Hewlett Packard Enterprise, that all want to be part of the enterprise cloud project. And of course, from giants like Salesforce down to tiny startups, pretty much every software company is a SaaS company now.  

Can cloud computing go wrong?

There are and will continue to be cloud outages. Those outages might happen at a local level because your internet is disrupted either by physical means (a digger cuts your broadband) or because of cyberattacks. But the big vendors have outages too, and because we are all increasingly reliant on their services, when the cloud stops, work stops. Few companies have backup systems to turn to in this situation. So long as cloud vendors keep outages to a minimum, then users will probably consider that using the cloud is more reliable than home-grown apps. But if outages become widespread, that opinion might change.

What is the future of cloud computing?

Cloud computing is reaching the point where it is likely to account for more of enterprise tech spending than the traditional forms of delivering applications and services in-house that have been around for decades. Use of the cloud is only likely to climb as organisations get more comfortable with the idea of their data being somewhere other than a server in the basement. And now cloud-computing vendors are increasingly pushing cloud computing as an agent of digital transformation instead of focusing simply on cost. Moving to the cloud can help companies rethink business processes and accelerate business change, goes the argument, by helping to break down data and organisational silos. Some companies that need to boost momentum around their digital transformation programmes might find this argument appealing; others may find enthusiasm for the cloud waning as the costs of making the switch add up.

  • Why you're still scared of the Cloud (it's not about security or reliability)
  • Cloud computing switch as digital transformation takes priority
  • Moving to the cloud? Some advice to consider

Cloud-computing case studies

There are plenty of examples of organisations deciding to go down the cloud-computing route: here are a few examples of recent announcements.

  • What's the best way to make the most of the cloud?
  • US Air Force plots IT overhaul, aims for cloud
  • DuluxGroup paints a future of procurement in the cloud
  • Marketo to migrate to Google Cloud as part of multi-year deal
  • AWS infrastructure is now behind three main streaming media providers
  • American Airlines to move consumer-facing apps to IBM Cloud, says Cloud Foundry key

Previous coverage

The Art of the Hybrid Cloud

Cloud computing is gobbling up more of the services that power businesses. But, some have privacy, security, and regulatory demands that preclude the public cloud. Here's how to find the right mix.

  • Public cloud, private cloud, or hybrid cloud: What's the difference?

Trying to understand and articulate the differences between public, private, and hybrid cloud? Here's a quick breakdown.

Read more on cloud computing

  • Businesses will spend $128 billion on public cloud this year, says IDC



Top 10 Cloud Computing Research Topics in 2020

Cloud computing has suddenly seen a spike in employment opportunities around the globe, with tech giants like Amazon, Google, and Microsoft hiring people for their cloud infrastructure. Before the onset of cloud computing, companies and businesses had to set up their own data centers and allocate hardware and IT professionals, thereby increasing costs. The rapid development of the cloud has led to more flexibility, cost-cutting, and scalability.


The cloud computing market is at an all-time high, with a current market size of USD 371.4 billion that is expected to grow to USD 832.1 billion by 2025. The field is quickly evolving and gradually realizing its business value, attracting more and more researchers, scholars, computer scientists, and practitioners. Cloud computing is not a single topic but a composition of various techniques which together constitute the cloud. Below are the 10 most in-demand research topics in the field of cloud computing:

1. Big Data

Big data refers to the large amounts of data produced by various programs in a very short duration of time. It is quite cumbersome to store such huge and voluminous amounts of data in company-run data centers. Also, gaining insights from this data becomes a tedious task and takes a lot of time to run and provide results, therefore cloud is the best option. All the data can be pushed onto the cloud without the need for physical storage devices that are to be managed and secured. Also, some popular public clouds provide comprehensive big data platforms to turn data into actionable insights. 

2. DevOps

DevOps is an amalgamation of two terms, Development and Operations. It has led to Continuous Delivery, Integration, and Deployment, thereby reducing boundaries between the development team and the operations team. Heavy applications and software need elaborate and complex tech stacks that demand extensive labor to develop and configure, which can easily be eliminated by cloud computing. The cloud offers a wide range of tools and technologies to build, test, and deploy applications within a few minutes and a single click. They can be customized as per client requirements and discarded when not in use, making the process seamless and cost-efficient for development teams.

3. Cloud Cryptography

Data in the cloud needs to be protected and secured from foreign attacks and breaches. To accomplish this, cryptography in the cloud is a widely used technique to secure data present in the cloud. It allows users and clients to easily and reliably access shared cloud services, since all the data is secured using encryption techniques or private keys. Cryptography can make plain text unreadable and limit the view of the data being transferred. The best cloud cryptographic security techniques are the ones that do not compromise the speed of data transfer and provide security without delaying the exchange of sensitive data.
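As a simple illustration of the encryption side of this, the following Python sketch uses the symmetric Fernet scheme from the `cryptography` package to encrypt data before it is handed to any cloud store. Key management is deliberately out of scope; in practice the key would live in a key-management service, not in the script.

```python
# Minimal sketch of client-side encryption before uploading data to the cloud,
# using the symmetric Fernet scheme from the `cryptography` package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep secret, ideally in a key-management service
cipher = Fernet(key)

plaintext = b"sensitive customer record"
ciphertext = cipher.encrypt(plaintext)   # safe to store in the cloud

# Only a holder of the key can recover the original data.
assert cipher.decrypt(ciphertext) == plaintext
```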

4. Cloud Load Balancing

Cloud load balancing refers to splitting and distributing incoming load to a server from various sources. It permits companies and organizations to govern and supervise workload and application demands by redistributing, reallocating, and administering resources between different computers, networks, or servers. It encompasses managing the circulation of traffic and demands over the Internet. This reduces the problem of sudden outages, improves overall performance, lowers the chances of server crashes, and provides an advanced level of security. Cloud-based server farms can achieve more precise scalability and availability using the server load balancing mechanism, through which workload demands can be easily distributed and controlled.
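The simplest distribution strategy behind this idea is round-robin rotation. The toy Python sketch below (with invented server addresses) shows the core mechanism; production load balancers add health checks, weighting and session affinity on top.

```python
# Toy round-robin load balancer: rotate incoming requests across a pool of
# servers. Server addresses are invented for illustration.
from itertools import cycle

servers = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]
rotation = cycle(servers)

def route(request_id: int) -> str:
    """Pick the next server in rotation for this request."""
    server = next(rotation)
    print(f"request {request_id} -> {server}")
    return server

for i in range(6):   # requests 0..5 spread evenly over the three servers
    route(i)
```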

5. Mobile Cloud Computing

Mobile cloud computing is a mixture of cloud computing, mobile computing, and wireless networks that provides services such as seamless and abundant computational resources to mobile users, network operators, and cloud computing professionals. The handheld device is the console, and all the processing and data storage take place outside the physical mobile device. Some advantages of mobile cloud computing are that there is no need for costly hardware, battery life is longer, data storage capacity and processing power are extended, data synchronization is improved, and availability is high thanks to "store in one place, accessible from anywhere". The integration and security aspects are taken care of by the backend, which supports an abundance of access methods.

6. Green Cloud Computing

The major challenge in the cloud is to develop energy-efficient, and hence environmentally friendly, cloud computing solutions. Data centers with large numbers of servers, cables, air conditioners and networks consume a lot of power and release enormous quantities of carbon dioxide into the atmosphere. Green cloud computing focuses on making virtual data centers and servers more environmentally friendly and energy-efficient, since cloud resources often consume so much power that they contribute to energy shortages and affect the global climate. Green cloud computing provides solutions to make such resources more energy-efficient and to reduce operational costs. This pivots on power management, virtualization of servers and data centers, recycling of e-waste, and environmental sustainability.

7. Edge Computing

Edge computing is an advancement on, and a more efficient form of, cloud computing in which data is processed nearer to its source. It holds that computation should be carried out at the edge of the network itself, rather than on a centrally managed platform or in data warehouses. Edge computing distributes data processing across different locations, making data deliverable to the nearest node and processed at the edge. This also increases the security of the data, since it stays closer to the source, and reduces response time and latency without affecting productivity.

8. Containerization

Containerization in cloud computing is a form of operating-system virtualization in which a program is packaged together with its dependencies so it can run using remote resources. Containers serve as building blocks that aid operational effectiveness, version control, developer productivity, and environmental stability. They give additional, granular control over resources, which upgrades the infrastructure, and their use in online services assists cloud storage with data security, elasticity, and availability. Containers provide certain advantages such as a steady runtime environment, the ability to run virtually anywhere, and low overhead compared to virtual machines.

9. Cloud Deployment Model

There are four main cloud deployment models, namely public cloud, private cloud, hybrid cloud, and community cloud. Each deployment model is defined by the location of the infrastructure. The public cloud allows systems and services to be easily accessible to the general public; it can also be less reliable since it is open to everyone (email services, for example). A private cloud allows systems and services to be accessible inside an organization with no access for outsiders, and offers better security due to its access restrictions. Hybrid cloud is a mixture of private and public clouds, with critical activities performed using the private cloud and non-critical activities performed using the public cloud. Community cloud allows systems and services to be accessible by a group of organizations.

10. Cloud Security

Since the number of companies and organizations using cloud computing is increasing at a rapid rate, the security of the cloud is a major concern. Cloud computing security detects and addresses every physical and logical security issue across the varied service models of software, platform, and infrastructure, whether those services are delivered through a public, private, or hybrid model. Security in the cloud protects data from leakage, theft, disaster, and deletion. With the help of tokenization, Virtual Private Networks, and firewalls, data can be secured.


What Is Cloud Computing? How the Cloud Works: Cloud Computing 101

Simply put, cloud computing is when computing services are stored and accessed over the internet instead of through physical hard drives. Our guide will walk you through what cloud computing is, how it works and how it's being used today.


What Is Cloud Computing?

Cloud computing refers to any kind of hosted service delivered over the internet. These services often include servers, databases, software, networks, analytics and other computing functions that can be operated through the cloud.

Files and programs stored in the cloud can be accessed anywhere by users on the service, eliminating the need to always be near physical hardware. In the past, for example, user-created documents and spreadsheets had to be saved to a physical hard drive, USB drive or disk. Without some kind of hardware component, the files were completely inaccessible outside the computer they originated on. Thanks to cloud storage, few people worry anymore about fried hard drives or lost or corrupted USB drives. Cloud computing makes the documents available everywhere because the data actually lives on a network of hosted servers that transmit data over the internet.
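As an illustration of "the data actually lives on a network of hosted servers", the sketch below uses Python's boto3 SDK to push a document to Amazon S3 and pull it back; the bucket and file names are placeholders, and the bucket is assumed to exist already.

```python
# Minimal sketch of cloud file storage with boto3 and Amazon S3.
# Bucket and file names are placeholders; credentials assumed configured.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-document-bucket"   # hypothetical, pre-created bucket

# Upload a local document; it now lives on hosted servers, not your disk.
s3.upload_file("report.xlsx", BUCKET, "documents/report.xlsx")

# Fetch it back from any machine with the right credentials.
s3.download_file(BUCKET, "documents/report.xlsx", "report-copy.xlsx")
```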

Cloud Computing Service Types

Cloud computing services are broken down into three major categories: software-as-a-service (SaaS), platform-as-a-service (PaaS) and infrastructure-as-a-service (IaaS).

Software-as-a-Service

SaaS is the most common cloud service type. Many of us use it on a daily basis. The SaaS model makes software accessible through an app or web browser. Some SaaS programs are free, but many require a monthly or annual subscription to maintain the service. Requiring no hardware installation or management, SaaS solutions are a big hit in the business world. Notable examples include Salesforce, Dropbox and Google Docs.

Platform-as-a-Service

PaaS is a cloud environment supporting web application development and deployment. PaaS supports the full lifecycle of applications, helping users build, test, deploy, manage and update all in one place. The service also includes development tools, middleware and business intelligence solutions. Notable examples include Windows Azure, AWS Elastic Beanstalk and Google App Engine.

Infrastructure-as-a-Service

IaaS provides users with basic computer infrastructure capabilities like data storage, servers and hardware — all in the cloud. IaaS gives businesses access to large platforms and applications without the need for large onsite physical infrastructures. Notable examples of IaaS include DigitalOcean, Amazon EC2 and Google Compute Engine.

Related Reading 25 Top IaaS Providers and Companies to Know

Benefits of Cloud Computing

There’s lots to be gained from cloud computing. In an increasingly remote world, one of the cloud’s big draws is that it allows for business operations to continue regardless of location. A report from Grand View Research highlighted the transition to remote work models as a significant driver of growth in the U.S. cloud computing market, which is projected to see a compound annual growth rate of 15.7 percent from 2022 to 2030.

Cloud storage also comes with a number of other benefits. Business leaders have reported the top benefits of cloud computing to be increased efficiency, faster deployment, collaboration tools, security and remote accessibility, according to Statista.

How Does Cloud Computing Work?

The cloud is basically a decentralized place to share information over networks of remote servers. Every cloud application has a host, and the hosting company is responsible for maintaining the massive data centers that provide the security, storage capacity and computing power needed to maintain all of the information users send to the cloud.


These hosting companies can sell the rights to use their clouds and store data on their networks, while also offering the end user an ecosystem that can communicate between devices and programs (for example, download a song on your laptop and it’s instantly synced to the music app on your iPhone).

More on Cloud Computing 10 Cloud Computing in Healthcare Examples to Know

Types of Cloud Computing Delivery Models

Public Cloud

This is the most common model, and all of the major players in cloud computing (Amazon, Microsoft, Apple and Google) run public clouds accessible anywhere with login credentials and the right web app.

Private Cloud

This model offers the same kind of flexibility as the public cloud, but with the infrastructure needs (hosting, data storage, IT staff, etc.) provided by the companies or users of the service. Additionally, the restricted access and hands-on management of hosting gives the private model an extra layer of security.

Hybrid Cloud

Hybrid cloud computing is a combination of the public and private models. The two cloud types are linked over the internet and can share resources when needed (for example, if the private cloud reaches storage capacity or becomes corrupted, the public cloud can step in and save the day).

Related Reading 15 Cloud Storage Companies Handling Our Data

What Are Cloud Companies?

Companies and individuals use cloud computing in a variety of unique and exciting ways. The Cloud Infrastructure Report 2021 from cloud management company CloudCheckr revealed close to 60 percent of the 304 IT and business stakeholders surveyed said they had more than half their infrastructure already in the cloud.

The most prominent companies hosting the cloud are major players like Amazon (Amazon Web Services), Microsoft (Azure), Apple (iCloud) and Google (Google Drive), but there's also a bunch of other players, large and small.

Among the most common applications of cloud computing people might encounter both at work and in their everyday lives are cloud-based collaboration tools like Microsoft OneDrive and Dropbox, movie and music streaming and backup storage for iPhones and other mobile devices.

In addition to some of the examples already covered, here is a quick look at some other important application areas.

Cloud Computing Applications

  • Communication and collaboration
  • Media streaming
  • Big data analytics and insights
  • Streamlining business processes
  • Storage backups and data recovery

Communication and Collaboration

The entire Google suite of applications is cloud-based, from calendar to Google Chat. Additionally, so are popular apps like Zoom and WhatsApp, and all empower people to communicate and collaborate on a global scale.

Entertainment

A combination of cloud computing and vastly improved internet speed has given rise to media streaming giants like Netflix and Hulu, which host enormous databases of movies and TV shows available via the cloud. Cloud storage allows these companies, and others like Spotify and Tidal, to exist.

Big Data Analytics

Before the cloud, using big data to glean patterns and insights was a cumbersome and expensive process. The cloud has changed all that, eliminating the need for in-house development resources when compiling and analyzing data. Nowadays companies can collect data from a variety of sources, connect them to the cloud and dig for insights in real time.

Business Processes

Without the cloud, innovative tools like Salesforce, Slack and myriad others designed to enhance and streamline the daily operations of companies would not exist.

Storage Backups

Cloud computing is an important answer to the issue of data-loss and recovery on physical hard drives. Most individuals who’ve owned a computer have experienced the stress of losing irreplaceable files. Whether it’s a term paper, family photos or the company payroll, cloud storage offers an easily accessible backup solution to keep data safe.



Top 10 Cloud Computing Research Topics of 2024


Cloud computing is a fast-growing area of the technical landscape due to its recent developments. Looking ahead to 2024, a number of new research topics in cloud computing are gaining traction among researchers and practitioners, ranging from new developments in security and privacy and the use of AI and ML in the cloud, to new cloud-based applications for specific domains or industries. In this article, we will investigate some of the top cloud computing research topics for 2024 and explore what researchers and cloud practitioners can get out of them. To master the field, check out these Cloud Computing online courses.

Why Is Cloud Computing Important for Data-driven Businesses?

Cloud computing is crucial for data-driven businesses because it provides scalable and cost-effective ways to store and process huge amounts of data. Cloud-based storage and analytics platforms help businesses access their data whenever required, irrespective of where it is physically located, which helps them make better decisions about their products and marketing plans.

Cloud computing can also help businesses improve their data security: cloud providers offer features such as data encryption and access control so that customers can protect their data from unauthorized access.

A few benefits of cloud computing are listed below:

  • Scalability: Cloud computing provides scalable applications suited to large-scale production systems for businesses that store and process large sets of data.
  • Cost-effectiveness: Cloud computing is a cost-effective solution compared to traditional on-premises storage and analytics, because its scaling capacity saves IT costs.
  • Security: Cloud providers offer security features, including data encryption and access control, that can help businesses protect their data from unauthorized access.
  • Reliability: Cloud providers ensure high reliability through their SLAs, which is useful for data-driven businesses that operate 24x7.

Top 10 Cloud Computing Research Topics

1. Neural Network-Based Multi-Objective Evolutionary Algorithm for Dynamic Workflow Scheduling in Cloud Computing

This paper proposes a neural-network-based multi-objective evolutionary algorithm (NN-MOEA) for dynamic workflow scheduling in cloud computing. Scheduling workflows in the cloud is difficult due to the dynamic nature of cloud resources and the numerous competing objectives that need to be optimized. The NN-MOEA algorithm utilizes neural networks to optimize multiple objectives, such as makespan, cost, and resource utilization, with the potential to enhance the efficiency and effectiveness of businesses' cloud-based workflows.

The algorithm predicts workflow completion time using a feedforward neural network based on input and output data sizes and cloud resources. It generates a balanced schedule by taking into account conflicting objectives and projected execution time. It also includes an evolutionary algorithm for future improvement.
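This is not the authors' NN-MOEA implementation, but a minimal sketch of the prediction step alone: a small feedforward network (scikit-learn's MLPRegressor) fitted on synthetic data to estimate completion time from data sizes and allocated resources, all features and the target function being invented for illustration.

```python
# Sketch of the prediction step only: a feedforward network estimating task
# completion time from data sizes and resources. Data is entirely synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Features: [input MB, output MB, vCPUs]; target: completion time in seconds.
X = rng.uniform([10, 1, 1], [1000, 100, 16], size=(500, 3))
y = 0.05 * X[:, 0] + 0.02 * X[:, 1] + 40.0 / X[:, 2] + rng.normal(0, 1, 500)

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X, y)

# Predicted completion time for a 500 MB-in, 20 MB-out task on 8 vCPUs.
print("predicted seconds:", model.predict([[500, 20, 8]])[0])
```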

The proposed NN-MOEA algorithm has several benefits, such as the capacity to manage dynamic changes in cloud resources and the capacity to simultaneously optimize multiple objectives. The algorithm is also capable of handling a variety of workflows and is easily expandable to include additional goals. The algorithm's use of neural networks to forecast task execution times is a crucial component because it enables the algorithm to generate better schedules and more accurate predictions.

The paper concludes by presenting a novel neural-network-based multi-objective evolutionary approach to dynamic workflow scheduling in cloud computing. The proposed NN-MOEA algorithm exhibits encouraging results in optimizing multiple objectives, such as makespan and cost, and achieving a better balance between them.

Key insights and Research Ideas:

Investigate the use of different neural network architectures for predicting the future positions of optimal solutions. Explore the use of different multi-objective evolutionary algorithms for solving dynamic workflow scheduling problems. Develop a cloud-based workflow scheduling platform that implements the proposed algorithm and makes it available to researchers and practitioners.

2. A systematic literature review on cloud computing security: threats and mitigation strategies 

This topic addresses security in the cloud computing paradigm. The authors provide a systematic literature review of studies, published between 2010 and 2020, that address security threats to cloud computing and mitigation techniques. They list and classify the risks and defense mechanisms covered in the literature, as well as the frequency and distribution of these subjects over time.

The paper finds that data breaches, insider threats and DDoS attacks are the most discussed threats to cloud computing security, while identity and access management, encryption, and intrusion detection and prevention systems are the most frequently discussed mitigation techniques. The authors suggest that, in future, machine learning and artificial intelligence might help cloud computing mitigate its risks.

The paper offers a thorough overview of security risks and mitigation techniques in cloud computing, and it emphasizes the need for more research and development in this field to address the constantly changing security issues of cloud computing. This research could help businesses understand and reduce the security risks of their cloud deployments.

Key insights and Research Ideas:

Explore the use of blockchain technology to improve the security of cloud computing systems. Investigate the use of machine learning and artificial intelligence to detect and prevent cloud computing attacks. Develop new security tools and technologies for cloud computing environments.

3. Spam Identification in Cloud Computing Based on Text Filtering System

A text filtering system is suggested in the paper "Spam Identification in Cloud Computing Based on Text Filtering System" to help identify spam emails in cloud computing environments. Spam emails are a significant issue in cloud computing because they can use up computing resources and jeopardize the system's security. 

To detect spam emails, the suggested system combines text filtering methods with machine learning algorithms. The email content is first pre-processed by the system, which eliminates stop words and stems the remaining words. The preprocessed text is then subjected to several filters, including a blacklist filter and a Bayesian filter, to identify spam emails.

In order to categorize emails as spam or non-spam based on their content, the system also employs machine learning algorithms like decision trees and random forests. The authors use a dataset of emails gathered from a cloud computing environment to train and test the system. They then assess its performance using metrics like precision, recall, and F1 score.
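A hedged sketch of the Bayesian-filtering idea, not the paper's system: a bag-of-words Naive Bayes classifier built with scikit-learn, trained on a tiny invented email dataset purely to show the classification step.

```python
# Sketch of Bayesian spam filtering: bag-of-words + Naive Bayes.
# The tiny dataset below is invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

emails = [
    "win a free cloud server now",       # spam
    "claim your prize account upgrade",  # spam
    "meeting notes for the migration",   # ham
    "quarterly cloud spend report",      # ham
]
labels = ["spam", "spam", "ham", "ham"]

# Pipeline: tokenize into word counts, then fit a multinomial Naive Bayes.
clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(emails, labels)

print(clf.predict(["free prize server upgrade"]))   # expected: ['spam']
```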

The findings demonstrate the effectiveness of the proposed system in detecting spam emails, achieving high precision and recall rates. By contrasting their system with other spam identification systems, the authors also show how accurate and effective it is. 

The method presented in the paper for identifying spam emails in cloud computing environments has the potential to improve the overall security and performance of cloud computing systems, and is an interesting current research topic for protecting cloud-based mail from threats.

Key insights and Research Ideas:

Create a stronger spam filtering system that can recognize spam emails even when they are crafted to avoid detection by common spam filters. Examine the application of artificial intelligence and machine learning to evaluating the accuracy of spam filtering systems. Create a more efficient spam filtering system that can handle a large volume of emails quickly and accurately.

4. Blockchain data-based cloud data integrity protection mechanism 

The "Blockchain data-based cloud data integrity protection mechanism" paper suggests a method for safeguarding the integrity of cloud data and which is one of the Cloud computing research topics. In order to store and process massive amounts of data, cloud computing has grown in popularity, but issues with data security and integrity still exist. For the proposed mechanism to guarantee the availability and integrity of cloud data, data redundancy and blockchain technology are combined.

A data redundancy layer, a blockchain layer, and a verification and recovery layer make up the mechanism. The data redundancy layer replicates the cloud data across multiple cloud servers for availability in the event of server failure. The blockchain layer stores the hash values of the cloud data along with metadata such as access rights and access control information.
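To illustrate what such a blockchain layer buys you, here is a toy Python hash chain; this is not the paper's mechanism, just the underlying idea that each block records a data hash plus the previous block's hash, so tampering with any stored record becomes detectable on verification.

```python
# Toy hash chain: each block stores the hash of one cloud record plus the
# previous block's hash, so tampering with any record breaks verification.
import hashlib
import json

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

records = [b"file-A contents", b"file-B contents", b"file-C contents"]

chain, prev_hash = [], "0" * 64
for record in records:
    block = {"data_hash": sha256(record), "prev_hash": prev_hash}
    prev_hash = sha256(json.dumps(block, sort_keys=True).encode())
    chain.append(block)

def verify(records, chain):
    """Recompute every hash and compare against the stored chain."""
    prev = "0" * 64
    for record, block in zip(records, chain):
        if sha256(record) != block["data_hash"] or block["prev_hash"] != prev:
            return False
        prev = sha256(json.dumps(block, sort_keys=True).encode())
    return True

print(verify(records, chain))                                  # True
print(verify([b"file-A TAMPERED"] + records[1:], chain))       # False
```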

Using a dataset of cloud data, the authors assess the performance of the suggested mechanism and compare it to other cloud data protection mechanisms. The findings demonstrate that the suggested mechanism offers high levels of data availability and integrity and is superior to other mechanisms in terms of processing speed and storage space.

Overall, the paper offers a promising strategy for using blockchain technology to guarantee the availability and integrity of cloud data. The suggested mechanism may assist in addressing cloud computing's security issues and enhancing the dependability of cloud data processing and storage. This research could help businesses to protect the integrity of their cloud-based data from unauthorized access and manipulation.

Key insights and Research Ideas:

Create a blockchain-based data integrity protection system capable of detecting and preventing data tampering in cloud computing environments. Look into the use of various blockchain consensus algorithms for enhancing the functionality and scalability of blockchain-based data integrity protection mechanisms. Create a blockchain-based data integrity protection system that is compatible with current cloud computing platforms, and that is secure and privacy-preserving.

5. A survey on internet of things and cloud computing for healthcare

This article surveys how recent technology trends like the Internet of Things (IoT) and cloud computing could transform the healthcare industry. These emerging technologies open exciting possibilities by enabling remote patient monitoring, personalized care, and efficient data management.

The authors categorize the research into IoT-based systems, cloud-based systems, and integrated systems using both IoT and the cloud, and discuss the benefits of real-time data collection, improved care coordination, and automated diagnosis and treatment.

However, the authors also acknowledge concerns around data security, privacy, and the need for standardized protocols and platforms. Widespread adoption of these technologies faces challenges in ensuring they are implemented responsibly and ethically. To begin the journey, KnowledgeHut's Cloud Computing online courses are a good starting point for beginners who want to combine cloud computing with IoT.

Overall, the paper provides a comprehensive overview of this rapidly developing field, highlighting opportunities to revolutionize how healthcare is delivered. New devices, systems and data analytics powered by IoT and cloud computing could enable more proactive, preventative and affordable care in the future. But careful planning and governance will be crucial to maximize the value of these technologies while mitigating risks to patient safety, trust and autonomy. This research could help businesses explore the potential of IoT and cloud computing to improve healthcare delivery.

Key insights and Research Ideas:

Examine how IoT and cloud computing are affecting patient outcomes in various healthcare settings, including hospitals, clinics, and home care. Analyze how well various IoT devices and cloud computing platforms perform in real-time patient data collection, archival, and analysis. Assess the security and privacy risks connected with IoT devices and cloud computing in the healthcare industry and develop mitigation strategies.

6. Targeted influence maximization based on cloud computing over big data in social networks

The paper "Targeted Influence Maximization based on Cloud Computing over Big Data in Social Networks" proposes a targeted influence maximization algorithm to identify the most influential users in a social network. Influence maximization is the process of identifying a group of users in a social network who can have a significant impact or spread information.

The suggested algorithm comprises four steps: data preprocessing, feature extraction, classification, and influence maximization. During data preprocessing, the authors gather and preprocess social network data, such as user profiles and interaction data. During feature extraction, they extract features from the data using machine learning methods like text mining and sentiment analysis. Overall, the paper offers a promising strategy for targeted influence maximization using big data and cloud computing; the suggested algorithm could assist companies and organizations in targeting their marketing or communication strategies to reach the most influential members of a social network.
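As a generic illustration only (the paper's algorithm is considerably more sophisticated), the sketch below picks seed users by direct reach in a tiny invented follower graph, a crude degree-based proxy for influence.

```python
# Crude influence-seed selection: rank users by how many others they can
# directly reach. The follower graph below is invented for illustration.
from collections import defaultdict

follows = [  # (follower, followed) pairs
    ("bob", "alice"), ("carol", "alice"), ("dave", "alice"),
    ("erin", "bob"), ("frank", "bob"), ("alice", "carol"),
]

reach = defaultdict(set)
for follower, followed in follows:
    reach[followed].add(follower)   # 'followed' can influence 'follower'

def top_influencers(k: int):
    """Return the k users who directly reach the most others."""
    return sorted(reach, key=lambda u: len(reach[u]), reverse=True)[:k]

print(top_influencers(2))   # ['alice', 'bob']
```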

Key insights and Research Ideas: 

Develop a cloud-based targeted influence maximization algorithm that can effectively identify and influence a small number of users in a social network to achieve a desired outcome. Investigate the use of different cloud computing platforms to improve the performance and scalability of cloud-based targeted influence maximization algorithms. Develop a cloud-based targeted influence maximization algorithm that is compatible with existing social network platforms. Design a cloud-based targeted influence maximization algorithm that is secure and privacy-preserving.

7. Security and privacy protection in cloud computing: Discussions and challenges

This topic provides an overview of the challenges and discussions surrounding security and privacy protection in cloud computing. The authors highlight the importance of protecting sensitive data in the cloud, given the potential risks and threats to data privacy and security. The article explores various security and privacy issues that arise in cloud computing, including data breaches, insider threats, and regulatory compliance.

The article explores challenges associated with implementing these security measures and highlights the need for effective risk management strategies. The Azure Solution Architect Certification course is suitable for anyone who needs to work on the Azure cloud as an architect doing system design with security in mind.

The authors conclude by discussing some of the emerging trends in cloud security and privacy, including the use of artificial intelligence and machine learning to enhance security, and the emergence of new regulatory frameworks designed to protect data in the cloud.

Develop a more comprehensive security and privacy framework for cloud computing. Explore machine learning and artificial intelligence as ways to enhance the security and privacy of cloud computing. Develop more robust security and privacy mechanisms for cloud computing. Design security and privacy policies for cloud computing that are fair and transparent. Educate cloud users about security and privacy risks and best practices.

8. Intelligent task prediction and computation offloading based on mobile-edge cloud computing

The paper "Intelligent Task Prediction and Computation Offloading Based on Mobile-Edge Cloud Computing" proposes a task prediction and computation offloading mechanism to improve the performance of mobile applications, a promising direction among cloud computing research ideas.

The suggested mechanism has two main parts: a task prediction model and a computation offloading algorithm. The task prediction model employs machine learning techniques to forecast a mobile application's upcoming tasks based on its usage patterns. The prediction is then used to decide whether to execute a specific task locally on the mobile device or offload its computation to the cloud.
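The paper's concrete decision rule isn't spelled out in this summary; as a rough sketch, offloading decisions of this kind typically compare a weighted time-plus-energy cost of local execution against transmission plus remote execution. All device and network parameters below are hypothetical placeholders, not the authors' settings.

```python
from dataclasses import dataclass

@dataclass
class Task:
    cycles: float      # CPU cycles the task requires
    data_bits: float   # input data to transmit if offloaded

# Illustrative device and network parameters (assumed values).
LOCAL_CPS = 1e9            # device CPU cycles per second
CLOUD_CPS = 10e9           # cloud CPU cycles per second
UPLINK_BPS = 5e6           # uplink bandwidth, bits per second
ENERGY_PER_CYCLE = 1e-9    # joules per local CPU cycle
TX_POWER_W = 0.5           # radio transmit power in watts

def should_offload(task: Task, w_time=0.5, w_energy=0.5) -> bool:
    """Offload when the weighted time+energy cost is lower in the cloud."""
    local_time = task.cycles / LOCAL_CPS
    local_energy = task.cycles * ENERGY_PER_CYCLE

    tx_time = task.data_bits / UPLINK_BPS
    remote_time = tx_time + task.cycles / CLOUD_CPS
    remote_energy = TX_POWER_W * tx_time  # device only pays for transmission

    local_cost = w_time * local_time + w_energy * local_energy
    remote_cost = w_time * remote_time + w_energy * remote_energy
    return remote_cost < local_cost

print(should_offload(Task(cycles=5e9, data_bits=2e6)))
```

In a full mechanism, the predicted task stream would feed this rule ahead of time, so the offloading choice is ready before the task actually arrives.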

Using a dataset of mobile application usage patterns, the authors assess the performance of the suggested mechanism and compare it to other computation offloading mechanisms. The findings demonstrate that the suggested mechanism performs better in terms of energy usage, response time, and network usage.

The authors also go over the difficulties in putting the suggested mechanism into practice, including the need for real-time task prediction and the trade-off between offloading computation and network usage. Additionally, they outline future research directions for mobile-edge cloud computing applications, including the use of edge caching and the integration of blockchain technology for security and privacy. 

Overall, the paper offers a promising strategy for enhancing mobile application performance through mobile-edge cloud computing. The suggested mechanism could improve the user experience for mobile users while lowering the energy consumption and response time of mobile applications. This cloud computing dissertation topic opens the door to many innovative ideas.

Develop an accurate task prediction model considering mobile device and cloud dynamics. Explore machine learning and AI for efficient computation offloading. Create a robust framework for diverse tasks and scenarios. Design a secure, privacy-preserving computation offloading mechanism. Assess computation offloading effectiveness in real-world mobile apps.

9. Cloud Computing and Security: The Security Mechanism and Pillars of ERPs on Cloud Technology

Enterprise resource planning (ERP) systems face particular security challenges in the cloud, and the paper "Cloud Computing and Security: The Security Mechanism and Pillars of ERPs on Cloud Technology" discusses these challenges and suggests a security mechanism and pillars for protecting ERP systems on cloud technology.

The authors begin by reviewing the benefits of ERP systems and cloud computing, as well as the security issues with cloud computing, such as data breaches and insider threats. They then present a security framework for cloud-based ERP systems built around four pillars: access control, data encryption, data backup and recovery, and security monitoring. The access control pillar restricts user access, while the data encryption pillar secures sensitive data. The data backup and recovery pillar regularly backs up data so it can be restored after loss or failure, and security monitoring continuously watches the ERP system for threats. The authors also discuss interoperability challenges and the need for standardization in securing ERP systems on the cloud, and they propose future research directions, such as applying machine learning and artificial intelligence to security analytics.
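The paper presents the pillars conceptually rather than as code. Purely as an illustration, three of the four pillars (access control, data encryption, and security monitoring) can be sketched in a few lines of Python with the third-party cryptography package; the roles, permissions, and record contents here are invented, and a real deployment would pull keys from a KMS and keep encrypted backups to cover the fourth pillar.

```python
import logging
from cryptography.fernet import Fernet  # pip install cryptography

logging.basicConfig(level=logging.INFO)            # security monitoring pillar
ROLE_PERMISSIONS = {"finance": {"read", "write"},  # hypothetical roles
                    "auditor": {"read"}}

key = Fernet.generate_key()                        # in practice, from a KMS
cipher = Fernet(key)                               # data encryption pillar

def access_record(role: str, action: str, encrypted_record: bytes) -> bytes:
    """Enforce role-based access, decrypt on authorized reads, log every attempt."""
    if action not in ROLE_PERMISSIONS.get(role, set()):  # access control pillar
        logging.warning("DENIED: role=%s action=%s", role, action)
        raise PermissionError(f"{role} may not {action}")
    logging.info("ALLOWED: role=%s action=%s", role, action)
    return cipher.decrypt(encrypted_record)

# Store only ciphertext; backup copies (the fourth pillar) stay encrypted too.
record = cipher.encrypt(b"Q3 payroll data")
print(access_record("finance", "read", record))
```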

Overall, the paper outlines a thorough strategy for safeguarding ERP systems on cloud computing and emphasizes the importance of addressing the security issues this technology raises. By implementing these security pillars and mechanisms, organizations can protect their ERP systems and ensure the security and privacy of their data.

Investigate the application of blockchain technology to enhance the security of cloud-based ERP systems. Look into the use of machine learning and artificial intelligence to identify and stop security threats in cloud-based ERP systems. Create new security measures designed specifically for cloud-based ERP systems. Improve the security of cloud-based ERP systems by managing access control and data encryption more effectively. Inform ERP users about the security risks that come with cloud-based ERP systems and how to avoid them.

10. Optimized data storage algorithm of IoT based on cloud computing in distributed system

The article proposes an optimized data storage algorithm for Internet of Things (IoT) devices that runs on cloud computing in a distributed system. The algorithm aims to improve storage efficiency and speed up data retrieval in IoT applications, which typically generate huge volumes of data from many devices.

The algorithm proposed includes three main components: Data Processing, Data Storage, and Data Retrieval. The Data Processing module preprocesses IoT device data by filtering or compressing it. The Data Storage module distributes the preprocessed data across cloud servers using partitioning and stores it in a distributed database. The Data Retrieval module efficiently retrieves stored data in response to user queries, minimizing data transmission and enhancing query efficiency. The authors evaluated the algorithm's performance using an IoT dataset and compared it to other storage and retrieval algorithms. Results show that the proposed algorithm surpasses others in terms of storage effectiveness, query response time, and network usage. 
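The article's exact partitioning scheme isn't given in this summary. The toy sketch below shows one common realization of the three components: compress readings on ingest, hash-partition each device's data to a fixed storage node, and route queries straight to the owning node. The node names and the in-memory store are stand-ins for a real distributed database.

```python
import hashlib
import zlib

NODES = ["node-a", "node-b", "node-c"]   # hypothetical storage servers

def preprocess(reading: bytes) -> bytes:
    """Data Processing: compress raw sensor payloads before storage."""
    return zlib.compress(reading)

def partition(device_id: str) -> str:
    """Data Storage: hash-partition each device's data to a fixed node."""
    digest = hashlib.sha256(device_id.encode()).digest()
    return NODES[int.from_bytes(digest[:4], "big") % len(NODES)]

store = {node: {} for node in NODES}     # stand-in for a distributed database

def put(device_id: str, reading: bytes) -> None:
    store[partition(device_id)][device_id] = preprocess(reading)

def get(device_id: str) -> bytes:
    """Data Retrieval: route the query straight to the owning node."""
    return zlib.decompress(store[partition(device_id)][device_id])

put("sensor-42", b"temp=21.5;humidity=40")
print(get("sensor-42"), "stored on", partition("sensor-42"))
```

Because the partition function is deterministic, a query never has to scan the whole cluster, which is the property that reduces transmission and improves query efficiency.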

They suggest future directions such as leveraging edge computing and blockchain technology to optimize data storage and retrieval in IoT applications. In conclusion, the paper introduces a promising method for improving data archival and retrieval in distributed, cloud-based IoT applications, enhancing their effectiveness and scalability.

Create a data storage algorithm capable of storing and managing large amounts of IoT data efficiently. Examine the use of cloud computing to improve the performance and scalability of data storage algorithms for IoT. Create a secure and privacy-preserving data storage algorithm. Assess the performance and effectiveness of data storage algorithms for IoT in real-world applications.

How to Write a Perfect Research Paper?

  • Choose a topic: Select a topic that genuinely interests you; it is far easier to write engaging, well-supported content about something you care about.
  • Do your research: Read books, articles, and websites on your topic. Take notes and gather evidence to support your arguments.
  • Write an outline: This will help you organize your thoughts and make sure your paper flows smoothly.
  • Start your paper: Start with an introduction that grabs the reader's attention. Then, state your thesis statement and support it with evidence from your research. Finally, write a conclusion that summarizes your main points.
  • Edit and proofread your paper: Check carefully for grammatical errors and spelling mistakes.

Cloud computing is a rapidly evolving area, with ever more research topics gaining traction among researchers and practitioners. Cloud providers run their own research programs to keep customer data secure, covering encryption algorithms, improved access control, and mitigation of Distributed Denial of Service (DDoS) attacks, among other areas.

With the improvements in AI and ML, new techniques are being developed to improve the performance, efficiency, and security of cloud computing systems. Research topics in this area include developing new algorithms for resource allocation, optimizing cloud workflows, and detecting and mitigating cyberattacks.

Cloud computing is being used in industries such as healthcare, finance, and manufacturing. Some of the research topics in this area include developing new cloud-based medical imaging applications, building cloud-based financial trading platforms, and designing cloud-based manufacturing systems.

Frequently Asked Questions (FAQs)

Data security and privacy problems, vendor lock-in, complex cloud management, a lack of standardization, and the risk of service provider disruptions are all current issues in cloud computing. Because data is housed on third-party servers, data security and privacy are key considerations. Vendor lock-in makes transferring providers harder and increases reliance on a single one. Managing many cloud services complicates things. Lack of standardization causes interoperability problems and restricts workload mobility between providers. 

Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) are the cloud computing service models on which industry is focusing right now.

The six major components of cloud infrastructure are compute, storage, networking, security, management and monitoring, and database. These components enable cloud-based processing and execution, data storage and retrieval, communication between components, security measures, management and monitoring of the infrastructure, and database services.  

Profile

Vinoth Kumar P

Vinoth Kumar P is a Cloud DevOps Engineer at Amadeus Labs. He has over 7 years of experience in the IT industry and specializes in DevOps, GitOps, DevSecOps, MLOps, Chaos Engineering, and the cloud and cloud-native landscapes. He has published articles and blogs on recent tech trends and best practices on GitHub, Medium, and LinkedIn, delivered a DevSecOps 101 talk to a developer community, and presented a GitOps with Argo CD webinar for a DevOps community. He has helped multiple enterprises with their cloud migration, cloud-native design, CI/CD pipeline setup, and containerization journeys.



The role of cloud computing technology: A savior to fight the lockdown in COVID 19 crisis, the benefits, characteristics and applications

Sharaf Alhomdy

a Professor, Emirates International University, Yemen

Fursan Thabit

b Ph.D. Researcher, School of Computational Sciences, S.R.T.M. University, Nanded, India

Fua'ad Hasan Abdulrazzak

c Associate Professor, Department of Communication Technology and Networks, Thamar University, Yemen

Anandakumar Haldorai

d Associate Professor, Department of Computer Science and Engineering, Sri Eshwar College of Engineering, Coimbatore, Tamil Nadu, India

Sudhir Jagtap

e Professor and Principal, Swami Vivekanand Campus, India

The contagion of the coronavirus (COVID-19) led to a global lockdown that put governments in emergency mode. With the total number of positive cases worldwide exceeding the 97.46 million mark, social distancing appears to be the only effective strategy to contain the virus at the moment. As a result, companies face obstacles and find it difficult to respond to the new challenge of remote working. The impact of the novel COVID-19 has created many new challenges, and many of us have had to adopt new ways of working. With the need to access critical applications and scale infrastructure, cloud computing is emerging as an underlying technology. Cloud technology has had a major role in fighting the epidemic; it became a salvation for governments and organizations in numerous fields: daily life, education, health, industry, communication, remote surveillance, and more. Therefore, this study presents the benefits, characteristics, and applications of cloud computing and explains how the cloud contributed to improving life in all regions of the world during COVID-19. It shows that cloud computing helped countries combat COVID-19 in the education and health sectors, as well as in economic and commercial aspects. It investigates the current state by distributing an online questionnaire to people of academic and non-academic backgrounds in the ICT and education sectors in different places around the world. The results showed that cloud computing played an effective role during the epidemic.

1. Introduction

Cloud computing has been growing steadily in recent years. This growth has been achieved on the back of both innovation and expansion in cloud-powered solutions for the information technology industry, offering a modern way to handle different information systems. Several features attract enterprises to cloud-based data storage solutions, including simplified IT administration, efficient remote connectivity from anywhere in the world over a secure internet connection, and cost-effective storage built on on-demand delivery, resource pooling, and integration of everything in the delivery environment [ 1 ].

Since COVID-19 shattered investor confidence and turned our lives upside down, technology has become our saving grace. One of the solutions is for workers to work remotely. However, some companies face obstacles and find it difficult to respond to this challenge, while others struggle to cope with the new phenomenon. Enterprises that invested in cloud computing infrastructure before the current pandemic are functioning well: cloud computing helps employees and co-workers collaborate and communicate safely with each other in a remote environment, while organizations lacking a cloud computing infrastructure find it difficult to run smoothly. Cloud computing reduces the challenges of operating remotely, such as chatting, interacting, and tracking all work from home safely [ 2 ].

In addition, during the COVID-19 period, cloud technology played a major role in fighting the epidemic; it became a salvation for governments and organizations in numerous fields: daily life, education, health, industry, communication, remote surveillance, and more.

Tech companies have become more reliant on delivering their work from home while staying productive. For instance, in a blog post, the CEO of Google Cloud spoke about the increased demand for these services: "Over the past few weeks, Meet's daily growth rate has exceeded 60%, and as a result, its daily use is 25 times more than it was in January." In a positive development, Google revealed in the same article that advanced features in Google Meet are now free for both G Suite and G Suite for Education customers around the world. During the pandemic, he also announced new locations for Google Meet hardware in "South Korea, Hong Kong, Taiwan, Indonesia, and South Africa," to ensure consumers have the correct hardware to complement the Meet solution. Remote staff as a service was introduced in collaboration with AWS, Microsoft Azure, Google Cloud Platform, Alibaba Cloud, and 3i Infotech. By hosting local applications or servers, a virtual private network (VPN) as a service makes it simple to operate remotely, and it also enables organizations to move to the cloud easily [ 3 ].

Cloud computing represents a fundamental pillar in helping countries fight COVID-19 in education, telehealth, the economy, and business.

1.1. Problem of the study

The impact of the novel coronavirus (COVID-19) outbreak led to a global lockdown, putting countries on high alert. With more than 97.46 million positive cases globally, social distancing appears to be the only viable approach to containing the virus at present. As a result, organizations face challenges and find it difficult to adapt to this new challenge of working remotely. COVID-19 has created many new challenges, making us embrace new ways of working. With the need to access critical applications and infrastructure scalability, cloud computing is emerging as a core technology; it has become the need of the hour and a big part of our war against the epidemic in various areas of life.

As presented previously, the study problem can be formulated in the following main questions:

What is the role of cloud computing applications in assisting the community to mitigate the effects of the epidemic?

How successful was the experience of using cloud computing as a savior during the COVID-19 lockdown, particularly in helping countries' education sectors?

The following sub-questions emerge from the main questions:

  • What is the percentage of users' satisfaction with cloud services during the COVID-19 lockdown period?
  • What is the availability of the most common cloud applications used during the COVID-19 lockdown period?
  • What is the nature of the cloud computing techniques used as a savior during the COVID-19 lockdown period?
  • What are the benefits of applying the cloud during the COVID-19 lockdown period?
  • Are the government sectors and the private organization sectors planning to migrate to cloud computing?

1.2. Objectives of this study

The main objective of this study is to assess the experience of how cloud computing has been a lifesaver to fight the lockdown during the COVID 19 crisis.

The following sub-goals emanate from the main objective:

  • Knowing the basic structure of cloud computing.
  • Assessing the contribution of cloud computing to improving life all over the world during the COVID-19 lockdown period.
  • Determining the nature of the available cloud computing techniques that supported countries in fighting the COVID pandemic, especially in the educational sector.
  • Learning about the benefits of applying cloud computing during the COVID-19 lockdown period.
  • Measuring the level of planning by government sectors and private organizations to move to cloud computing.

1.3. The importance of study

The importance of this research stems from the importance of cloud computing as a new technology and an effective method of conducting business. This evaluation study runs in parallel with a global challenge; its importance also lies in exploring and evaluating an experience that arose in exceptional circumstances and faced a serious worldwide challenge.

Accordingly, the aspects of benefiting from this study can be described in the following points:

  • 1. Describing the advantages of cloud computing, its characteristics, and its applications in different areas of life during the COVID-19 crisis.
  • 2. Assisting officials and decision-makers in taking appropriate decisions for employing educational technology and shaping it for future benefit.
  • 3. The study is important to researchers and those interested in evaluating experiences in the field of cloud computing.

1.4. The scope of the study

The online questionnaires are distributed via Google Docs to a government university, educational institutions, colleges of applied sciences, private universities/colleges of technologies, and employees who work in ICT in different countries.

The rest of the paper is organized as follows. Section (2) presents some remarks on the COVID-19 crisis. A brief overview of cloud computing is provided in Section (3). Section (4) presents some remarks on related work. The main part of the paper is section (5), which is devoted to several issues related to the scientific framework of the study of using cloud computing service during the COVID 19 lockdown crisis. Section (6) shows the results and discussion. The conclusion and further recommendations will be highlighted in section (7).

2. Overview of COVID 19 crisis

This section briefly describes the COVID-19 crisis and its impact on the education sector.

2.1. Introduction to the COVID-19 crisis

Coronavirus is a single-stranded RNA virus with an envelope, about 0.1 μm in diameter. The virus spreads through droplets (such as from coughing), direct contact with an infected person, or hands touching contaminated environmental surfaces. The COVID-19 pandemic is the defining public health crisis of our time and the largest threat humanity has faced since the Second World War. It first appeared in Asia at the end of 2019 and then spread throughout the world. This pandemic is more than a health crisis straining many countries; it has the potential to cause devastating societal, economic, and political consequences that can leave deep and long-standing scars. Every day, people lose work and wealth, with no way to tell when normality will return. The International Labour Organization reports that 195 million jobs could be lost [ 4 ].

According to Geneva [ 5 ], January 24th (TASS): The World Health Organization (WHO) announced in its daily Sunday newsletter that more than 577,000 confirmed cases of coronavirus were registered worldwide the previous day, bringing the cumulative number of cases to 97.46 million. Up to 97,464,094 cases of coronavirus infection were recorded on January 24, 2021. According to the WHO, North and South America account for more than 56% of the daily COVID-19 toll (323,950 cases), followed by Europe (166,047 cases) and Southeast Asia (29,198 cases). The largest numbers of coronavirus cases were reported from the United States (24,604,325), India (10,654,533), Brazil (8,753,920), Russia (3,719,400), the United Kingdom (3,617,463), France (2,985,259), Spain (2,456,675), Italy (2,455,185), Germany (2,134,936), Colombia (1,987,418), Argentina (1,853,830), and Mexico (1,732,290), as shown in Fig. 1.

Fig. 1

WHO (2021).

2.2. Impact of COVID-19 on the education sector

The COVID-19 pandemic is first and foremost a health crisis. Many countries have decided to close schools, colleges, and universities [ [6] , [7] , [8] , [9] ]. The epidemic encapsulates the challenge that politicians face in deciding whether to close schools (to reduce interaction and save lives) or to leave them open (allowing workers to work and preserving the economy). In the short term, many households around the world are experiencing this major disruption: home schooling is a massive blow not just to parental productivity, but also to children's social lives and learning. At an unprecedented and untested rate, education is going online, and many exams have been cancelled outright. Importantly, these disruptions may not only be a short-term problem; they may also have long-term consequences for the affected groups, increasing inequalities [ 10 ].

2.2.1. Impacts on education: schools

Going to school is the most effective public policy instrument for improving skills. Although education can be enjoyable and can help children develop social skills and knowledge, the most important economic benefit of being in school is that it improves a child's potential, and even a short period of absence from school has implications for skill growth. Can we, then, quantify the learning lost to the COVID-19 interruption? We cannot be precise, since we are in new territory, but we can get an order of magnitude from other research.

2.2.2. Impacts on education: families

Perhaps to some people's dismay, children are currently not being sent to school to learn; the expectation is that they continue their schooling at home, hopefully without missing too much. Families are an integral component of education [ 11 ] and are generally accepted to provide key inputs to a child's learning, so the new global growth in homeschooling might at first glance be regarded favorably. However, this role is typically a supplement to the school's input. Parents can supplement a child's arithmetic learning by counting with them or highlighting simple math problems in daily situations; they can enrich history lessons by taking them to important monuments or museums. Being the primary driver of learning, even with online materials, is a different matter: while many parents around the world actively teach their children at home, this cannot be expected of the entire population. So while global homeschooling will undoubtedly yield some exciting, furious, enjoyable, and stressful moments, it is unlikely to replace the learning lost from school on a consistent basis. The bigger point is that there are likely to be significant differences between households in their ability to help their children learn. The key disparities are the amount of time available to devote to teaching, parents' non-cognitive skills, resources (for example, not everyone will have the tools to find the right materials online), and the amount of knowledge available: it is hard to help your child learn something you cannot work out yourself. As a result, this episode will widen inequalities in human capital development among the affected cohorts.

  • Assessments

The closure of schools, colleges, and universities not only interrupts teaching worldwide; it also coincides with a significant assessment cycle, which has resulted in several exams being delayed or cancelled. Internal assessments are probably regarded as less significant, and many have simply been cancelled as a result.

  • Graduates

The COVID-19 pandemic will have a significant impact on this year's college graduates and their job prospects. They have had significant disruptions to teaching in the latter stages of their studies, are seeing major disruptions to their assessment, and are about to graduate at the start of a major global recession.

3. Overview of cloud computing

3.1. Cloud computing concepts

The NIST definition describes cloud computing as a paradigm for providing ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (such as software, utilities, storage, networks, and servers) that can be rapidly provisioned and released with minimal effort and little interaction with support staff or service providers. Scalability, manageability, and flexibility are all advantages of cloud computing, and cloud infrastructure also offers on-demand operation, economy, universality, comfort, multi-tenancy, reliability, and versatility. A cloud customer can use these resources on demand to flexibly develop, operate, and host applications and services anytime, anywhere, and on any device. The NIST definition highlights the three service models - Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS) - delivered by a Cloud Service Provider (CSP) to its customers, as shown in Fig. 2. It also summarizes the four deployment models - public cloud, hybrid cloud, private cloud, and community cloud - while highlighting the shared-infrastructure model for delivering cloud services. Moreover, it provides an integrated view of the five essential characteristics of every cloud service: resource pooling, on-demand self-service, rapid elasticity, measured service, and broad network access [ [11] , [12] , [13] ].

Fig. 2

Reference architecture of cloud computing with different deployment models [ 19 , 55 ].

3.2. Cloud service delivery models

With the development of internet technology and cloud computing for big data, a new concept of services has emerged that can link the growing number of online activities. Beyond the three basic service delivery models and four deployment modes, many service models are now available, each named for the function and capability it provides, leading to "anything as a service" (AaaS) delivery models. The following describes the service delivery models (SaaS, PaaS, and IaaS) used by a cloud provider to deliver cloud services to its customers [ 14 ].

  • Infrastructure as a Service (IaaS) sits at the bottom of the paradigm. IaaS delivers computer hardware (processor, memory, network storage, server/virtual machine, and data center) as a service.
  • Platform as a Service (PaaS) sits in the middle of the service model and provides services in the form of development tools, frameworks, architectures, software, and IDEs.
  • Software as a Service (SaaS) is a combination of remote computing services and sits at the top of the delivery models; applications are deployed remotely by a third party.

3.3. The core component of cloud computing

This section discusses the basic components of cloud computing. These components consist of a wide range of services that can be used over the internet.

  • Virtualization: It plays a crucial role in cloud deployment and is a critical component of the cloud that enables many users to share physical resources. It provides a simulated instance of a resource or unit, such as an operating system, server, network infrastructure, or storage device, that applications can use in several execution environments [ 9 ].
  • Multi-tenancy: Multiple clients or consumers in a multi-tenant system do not view or share each other's data but can share resources or applications in an execution environment, even though they do not belong to the same entity. Multi-tenancy enables the most efficient use of hardware and data storage mechanisms [ 9 ].
  • Cloud storage: A component that is maintained, managed, and backed up remotely and made available over the network, where users can access their data.
  • The hypervisor: The virtual machine monitor is a core component of virtualization. It enables the execution of several Virtual Machines (VMs) on a single hardware host and controls and tracks the various operating systems that share a physical system [ 15 ].
  • Cloud network: A cloud can span more than one conventional data center; a typical data center contains hundreds or thousands of servers [ 15 ]. To build and manage storage efficiently, the cloud requires a secure network infrastructure called cloud networking. It requires an internet connection and, similar to a virtual private network, enables users to securely access printers, applications, files, etc.

3.4. Cloud computing components vitality during COVID-19

In the current COVID-19 pandemic condition, cloud computing services may be readily deployed and exploited, allowing facilities such as data management, recording information, and so on to become more productive [ 16 ].

To fulfil its intended purpose, the cloud computing approach requires several significant elements/components organized together; the main components are summarized in Fig. 3. Cloud clients are software/computer setups designed primarily to consume cloud services, whereas cloud services are solutions, products, and services used for real-time delivery. Cloud applications are used in cloud software architecture so that medical staff may access shared data, while a cloud platform is a type of service that includes software infrastructure and services [ 16 , 17 ]. All records, patient-shareable reports, information, and other data are stored in the cloud, and cloud infrastructure is the provision of computing infrastructure as a facility/service in the domain of interest.

Fig. 3

Main components of cloud computing.

3.5. Application of cloud computing

Cloud computing has swept the digital globe since its inception. Cloud computing is the practice of managing, storing, or processing data on a network of remote servers hosted on the internet. Previously, this data had to be stored on local servers or personal computers, which restricted storage space. With cloud computing, it is possible to access almost unlimited space while also improving remote processing. Cloud computing's potential has expanded considerably as a result of optimized resource use, scalability, cost reduction, and flexibility, and machine learning in the cloud brings a number of advantages. Furthermore, cloud computing applications have grown to include mobile phones.

4. Related work

In addition to a number of studies examining the effective role of technology in trying to reduce the impact of the pandemic on society as a whole, many studies have been conducted on how COVID-19 affects all aspects of life. The next subsections briefly describe related work on how COVID-19 influences our lives in society and show the roles of technology in reducing the pandemic's impacts.

4.1. COVID 19 and its impacts on the society

According to Ref. [ 18 ], the emerging coronavirus, known as COVID-19, was discovered in the last month of 2019 at a seafood market in Wuhan. Clinical analysis showed person-to-person transmission of the virus.

In March 2020, the World Health Organization (WHO, 2020) declared COVID-19 a pandemic after evaluating the speed of the spread and the severity of the deadly virus worldwide, with additional announcements about social distancing as a way to limit the spread of the epidemic. Social distancing is a conscious increase in the physical gap between people in order to limit the spread of disease (Red Cross, 2020). This pandemic has forced the global physical shutdown of businesses, sporting activities, and schools by forcing all institutions to migrate to online platforms [ 5 ].

The paper [ 19 ] aims to study the repercussions of the coronavirus on various sectors of the economy, highlight the reasons for India's advantage in the post-pandemic period, and identify the business survival strategies that are key to overcoming this difficult situation. The papers [ [20] , [21] , [22] ] highlight efforts to keep education seamless in some countries; the positive and negative impacts of COVID-19 are discussed, and some fruitful suggestions for implementing educational activities during COVID-19 are also noted.

The authors of [ 23 ] studied the impact of COVID-19 on the health sector: prevention and treatment services for non-communicable diseases (NCDs) have been severely disrupted since the COVID-19 pandemic began, according to a WHO survey released during the pandemic. The authors of [ 7 ] focused on the impact of COVID-19 on the economic sector; the outbreak of the COVID-19 pandemic has disturbed the political, social, economic, religious, and financial structures of the whole world.

The authors of [ 8 ] studied the impact of COVID-19 on the education sector. The COVID-19 pandemic is first and foremost a health crisis, and many countries have (rightly) decided to close schools, colleges, and universities.

4.2. Roles of the technology in reducing the impact of the pandemic on society

Technology has been reducing the impact of the pandemic on society. The paper [ 24 ] discusses the migration of higher-education institutions, students, and faculty from normal learning to online learning in response to the pandemic; the opportunities and challenges related to COVID-19 add value to the current body of literature on online learning and provide a comprehensive picture of migration methods for choosing an appropriate mode of communication between institutions and students. The authors of [ 25 ] shed light on the growth of ed-tech start-ups during a time of pandemic and natural disasters and include suggestions for academic institutions on how to deal with the challenges associated with online learning. The paper [ 3 ] explains the role of IT governance during the COVID-19 pandemic. The authors study the cloud's significance in enabling remote data access and storage, which has become even more evident in light of the COVID-19 situation, enabling business continuity and avoiding risk [ 26 ].

5. The scientific outline of the study

The survey was conducted in four parts and distributed to participants of different nationalities. The first part collects general information about the participants. The second part evaluates participants' knowledge of cloud computing. The third part evaluates the adoption of cloud computing in participants' organizations during the COVID-19 lockdown, the main concerns in adopting cloud computing in their institutions, whether participants had sufficient knowledge to use the cloud during the pandemic, and users' satisfaction with cloud services and the benefits of cloud applications in this period. The last part measures the need for cloud computing in the future. The following subsections describe the methodology and sample design.

5.1. Research methodology

Exploratory, descriptive, empirical, and statistical analysis approaches are the four categories of research methods. Descriptive analysis is useful for identifying and categorizing elements or characteristics of a subject [Neville, 2007]. According to Creswell [1994], a descriptive system of analysis is used to gather knowledge about the current situation.

In this paper, the researchers used the descriptive exploratory approach appropriate to the objectives of the study. This method helps the researchers form a more accurate perception and a general framework that supports a deeper study later. In addition, it establishes whether cloud computing services were a savior in fighting the lockdown during the COVID-19 crisis, along with their benefits, characteristics, and applications.

5.2. Design sample

The sample design was created in electronic form with Google Docs, and participants were selected randomly from different educational institutions and the ICT sector. The sample comprised 101 participants from 16 different countries, surveyed from October 2nd to November 2nd, 2020.

5.3. The study tool

The researchers designed forms to collect data from the study sample, and the questionnaire included a set of questions aligned with the objectives of the study. It was distributed after verification of its validity and reliability. The online questionnaire contains 37 questions divided into four parts:

  • a. Personal information (nationality, age).
  • b. Socioeconomic data (designation of respondent, level of education, place of work, years of work experience).
  • b.1 Designation of the respondent, classified into three classes: academic (faculty member), non-academic, other.
  • b.2 Level of education, considered in five subclasses: Prof./Ph.D., Master, Bachelor, Diploma, Other.
  • b.3 Place of work, considered in five subclasses: education sector, health sector, management and administration, business, other.
  • b.4 Years of work experience, classified into four subclasses: 1–5 years, 6–15 years, 15–30 years, over 30 years.
  • Part two: specifies the participant's knowledge of cloud computing.
  • Part three: checks cloud computing services at government and non-government institutions and organizations during the COVID-19 lockdown period.
  • Part four: displays users' satisfaction with cloud services and the benefits of cloud applications in this period.

5.4. Data analysis tool

The Statistical Package for the Social Sciences (SPSS) version 25 [ 26 ] was used to tabulate the metadata, including the basic characteristics and fields of the questionnaires. A computational method was adopted to determine the general direction of the participants regarding the questionnaire.

6. Results and discussion

The results will be presented and discussed according to the study questions using ratios, frequencies, arithmetic means, and relative weights. Reliability (stability, internal consistency) was measured by Cronbach's alpha: the overall questionnaire scored 0.951, reflecting excellent stability, while the alpha values of the elements in each of the four parts were below 0.951, so internal consistency is satisfied.
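For readers unfamiliar with the statistic, Cronbach's alpha compares the sum of per-item variances with the variance of respondents' total scores. A minimal sketch follows, using made-up Likert data rather than the study's responses:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: respondents x items matrix of Likert answers."""
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy data: 5 respondents answering 4 Likert items (illustrative only).
answers = np.array([[4, 5, 4, 4],
                    [3, 3, 4, 3],
                    [5, 5, 5, 4],
                    [2, 3, 2, 3],
                    [4, 4, 5, 4]])
print(round(cronbach_alpha(answers), 3))
```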

The results showed that 61.4% of participants were from Yemen, 7.9% each from Saudi Arabia and India, and 22.8% from other countries, as seen in Fig. 4.

Fig. 4

Distribution frequency of participants by nationality.

The sample results show that half of the participants were between 30 and 39 years old, 24% were between 20 and 29, 19.8% were between 40 and 50, and 5% were over 50.

The most common respondent designation was academic (faculty member) at 67.3%, with 22% non-academic and 11% in other designations. For education level, 41.6% of participants held a professorship/doctorate, 36.6% a master's degree, 14.9% a bachelor's degree, 1% a diploma, and 5.9% other, as seen in Fig. 5.

Fig. 5

Distribution frequency of respondent designation.

Most participants work in the education sector, with 16.8% in management and administration, 15.8% in other sectors, 5.9% in the health sector, and the lowest share, 5%, in business. The most common range of work experience was 1–5 years (57.4%), followed by 6–15 years (29.7%) and 15–30 years (12.9%), as seen in Fig. 6 and Fig. 7.

Fig. 6

Distribution frequency of education level and workplace.

Fig. 7

Distribution frequency of type(s) of cloud services.

Overall, 52.5% of participants (n = 53) reported sufficient knowledge of cloud computing, agreeing that general information about cloud computing is well known. The average score for questions concerning participants' knowledge of cloud computing ranged from 3.72 ± 1.031 to 3.82 ± 1.014 (see Table 1). As Table 1 shows, most respondents have good knowledge of cloud computing and its applications: 19.6% (n = 19) strongly agreed and 43.3% (n = 43) agreed. In addition, 98% of participants reported having used a cloud application. The fields in which participants most used cloud applications are also shown in Table 1.

Table 1

Frequency and mean ± standard deviation (SD) of participants' questionnaire answers about their knowledge of cloud computing.

Among respondents, 72.3% (n = 73) agreed that their organization used and adopted cloud computing during the COVID-19 lockdown (see Fig. 8).

Fig. 8

Distribution frequency of fields in which cloud applications were used.

For the main question (Do you think implementing cloud computing in your organization or institution will improve the quality of delivered services?), 32% strongly agreed, 45.8% agreed, 11.9% were neutral, and 10.2% strongly disagreed. The types of services participants use in their institutions are shown in Fig. 9.

Fig. 9

The mean score for questions concerning the use and adoption of cloud computing in participants' organizations during the COVID-19 lockdown ranged from 3.29 ± 0.852 to 3.93 ± 0.886 (see Table 2).

Table 2

Frequency and mean ± standard deviation (SD) of participants' questionnaire answers about the use and adoption of cloud computing in their organization during the COVID-19 lockdown.

The results in this section describe the main concerns in adopting cloud computing: privacy and confidentiality of corporate data, security and availability of services or data, the need for a stable and fast internet connection, and loss of control over services and/or data. All of these are considered central to providing a good cloud computing service.

Overall, 79.2% of participants (n = 80) agreed that their institution addressed the main concerns in adopting cloud computing during the COVID-19 lockdown.

The mean score for questions concerning the main concerns in adopting cloud computing in participants' institutions ranged from 3.24 ± 0.789 to 4.00 ± 0.787 (see Table 3).

Table 3

Frequency and mean ± standard deviation (SD) of participants' questionnaire answers about the main concerns in adopting cloud computing in their institution.

7. Conclusion

Epidemic diseases have caused radical social changes throughout human history. COVID-19 has dramatically changed the consumption habits of people around the world and continues to do so. The current situation indicates that one of the most significant visible impacts of COVID-19 is a widespread shift to digital social and business life, and the pandemic has had numerous impacts on the environment and on nations. Technology has played an important role in mitigating the effects of the coronavirus pandemic, and the most common shift has been toward technology, in particular cloud computing technology.

People avoided contact with each other as much as possible and tried to fulfil almost all of their needs online, using cloud computing technology, during the period when workplaces were closed. This state of affairs led to significant improvements for companies, universities, and schools all over the world. In the current situation, cloud computing has become necessary for educational institutions, companies, and health facilities.

This paper gave a brief overview of the COVID-19 epidemic and how it has affected the world as a whole, showing that awareness of this topic gives people better knowledge of how the coronavirus affects education, trade, industry, and national economies. The paper presented the benefits, characteristics, and applications of cloud computing; explained how the cloud has contributed to improving life in all regions of the world during COVID-19; and showed that the cloud helps countries fight COVID-19 in education, business, and more, based on an online questionnaire distributed to academic and non-academic respondents around the world. The first section of the results showed that 52.5% of participants (n = 53) reported sufficient knowledge of cloud computing, agreeing that general information about cloud computing is well known. Most respondents reported good knowledge of cloud computing and its applications: 19.6% (n = 19) strongly agreed and 43.3% (n = 43) agreed. In addition, 98% reported having used a cloud application, as seen in our questionnaire data analysis.

The results also showed that cloud computing played an important role in improving public life, especially in the field of education; universities and schools used many cloud computing applications to keep the learning process going. Participants' answers about the use and adoption of cloud computing in their organizations during the COVID-19 lockdown were mostly positive: 72.3% (n = 73) agreed that their organization adopted cloud computing. For the main question (Do you think implementing cloud computing in your organization or institution will improve the quality of delivered services?), 32% strongly agreed, 45.8% agreed, 11.9% were neutral, and 10.2% strongly disagreed, which also reflects people's increased confidence in cloud computing technology.

Future work will study the risks of adopting cloud computing across organizations, as well as the future of cloud computing in a post-COVID-19 world from the users' point of view.

Declaration of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Password based authentication for web based graphics computing services retrieval in cloud

  • Published: 19 April 2024


  • Eman S. Alkhalifah

In the global environment, privacy and security issues are critical when handling a huge number of participants. Many security-focused research works have been undertaken in the cloud, including multi-factor authentication using the Elliptic Curve Computational Diffie-Hellman problem, multi-modal biometrics, signatures, graphical one-time passwords, and more. In this paper, we propose a multi-level, multi-factor authentication procedure comprising three significant entities: users, a Trusted Third Party (TTP), and the cloud. The procedure is segregated into three phases. In the first phase, the HMAC-SHA 256 algorithm, a watermarking algorithm, and a logical OR operation are applied over the user ID, password, and fingerprint. In the second phase, three-level authentication permits users to view, upload, and download files from the cloud; on validating each constraint, the user is permitted to participate in the cloud within the allowed limit. Finally, the third phase allows users, for convenience and security, to replace the prior password with a new one. Overall, our main goal is to validate the user's legitimacy for accessing the cloud through multiple factors: ID, password, fingerprint, and graphical one-time password. Our work is implemented in a Java environment; the technique improves performance indicators such as successful login rate (94.4%), mean login time (10 ms), authentication efficiency, and overall user experience. In conclusion, our suggested MFA-MLS system provides a strong answer to the difficulty of safe authentication in cloud environments, prioritizing user ease and experience while also improving security measures.


1 Introduction


The ubiquitous nature of the cloud attracts the attention of many people, but because of the threats and security issues involved, cloud computing is critical to manage. For security purposes, cloud environments use different cryptographic techniques. The ciphertext-policy Attribute-Based Encryption (CP-ABE) scheme supports secure data sharing [ 1 , 2 ]; weighted attributes (CP-WABE) are expressed based on the access policies of the data owner. Factor-based authentication is also widely used in the cloud to ensure high security: as the number of factors grows from two to three and beyond, the level of security increases [ 3 , 5 ]. Two-factor authentication involves a user password and a smart card. A random number is generated from the password and identity, and the smart card is required during both the registration and login phases. Similarly, harmful attacks can be thwarted with two factors based on Elliptic Curve Cryptography: from the user ID and password a session key is generated, which is shared if valid and discarded otherwise. Alternatively, a security key and a security device serve as the two factors, verified by a proof-of-knowledge protocol. Certificate verification is based on the BBS+ signature, which deals with the satisfaction of a monotone Boolean function over a set of attributes. Three-factor authentication (3FA) in the cloud is adopted to counter various security threats [ 6 ]: a smart mobile user can log in with a credential ID, password, and fingerprint from anywhere, with the focus on biometric and smart-card-based authentication. A fuzzy verifier is used for password verification and a fuzzy extractor for biometric verification.
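As a concrete reference point for the ECC-based schemes mentioned above, the sketch below derives a shared session key via ECDH with the Python cryptography package. The curve choice, HKDF parameters, and info label are illustrative assumptions, not any cited scheme's specification.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side generates an ephemeral EC key pair (curve P-256 assumed here).
client_priv = ec.generate_private_key(ec.SECP256R1())
server_priv = ec.generate_private_key(ec.SECP256R1())

# Public keys are exchanged; both sides derive the same raw shared secret.
client_shared = client_priv.exchange(ec.ECDH(), server_priv.public_key())
server_shared = server_priv.exchange(ec.ECDH(), client_priv.public_key())

def session_key(shared: bytes) -> bytes:
    """Stretch the raw ECDH secret into a 32-byte session key."""
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"2fa-session").derive(shared)

assert session_key(client_shared) == session_key(server_shared)
print(session_key(client_shared).hex())
```

In the cited two-factor schemes, a key of this kind would additionally be bound to the user ID and password before being accepted or discarded.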

Authentication with a single factor or two factors provides only partial and incomplete security, so combining these factors is the best way to attain a higher level of security with Multi-Factor Authentication (MFA). The challenges in security and privacy can be substantially addressed using MFA with handy smart devices. A handwritten signature is one factor that matches global and local features, and MFA addresses the issues of efficiency, robustness, privacy, and usability. A generic design for multi-factor authentication works with a smart card, password, and biometric [ 6 ]: a Message Authentication Code (MAC) is generated and exchanged between entities for verification, with the MAC key derived from the user's previously registered biometric. MFA has been adopted for public displays in the GTmoPass multimodal scheme, which integrates gaze and touch input [ 7 ]; gaze corresponds to the movement of the eyes toward the right or left. Watermarking is also applied in MFA, using the Discrete Wavelet Packet Transform (DWPT) and Quantization Index Modulation (QIM) to perform robust speech watermarking [ 8 ]: speech signals from the user are divided into frames and processed with DWPT, and the retrieved output is a watermark. A gesture is a free-form pattern that is also considered a factor, capable of providing better security for smartphone users [ 9 ]. Gestures are handwritten, i.e., swiped on the display with twists and turns of one's choosing; several Android devices already support gesture security, and these gestures are analyzed through preprocessing, feature extraction, and feature selection.

For security in the cloud, many researchers have introduced novel techniques and schemes, since user authentication has become a significant requirement in the face of threats and security issues. MFA mechanisms are usually designed around the knowledge, possession, and inherence of each user; hence, research works in MFA design are broadly classified as follows:

Hardware-based authentication falls under the category of possession (i.e., 'something you have'): users carry security devices with storage capacity, so they do not need to remember any key terms. But this fails when the device is lost or stolen, since it is carried by hand.

Biometric-based authentication relies on the user's inherence (i.e., 'something you are'), which is unique to each person and cannot be changed over a lifetime. Biometric instances are fingerprints, iris, retina, etc.

Security-question-based authentication is one of the simpler authentication procedures; it draws on the user's registered information and poses random questions to each user. However, the answers can be guessed by malicious users seeking to gather data.

Knowledge-based authentication (i.e., 'something you know') is popularly used in many social networks and other applications; examples are the password, Personal Identification Number (PIN), and session password.

In this paper, we focus on biometric-based and knowledge-based authentication in the design of a novel MFA-MLS technique. The MFA-MLS technique involves a registration phase with an ID, password, and biometric; these three factors allow the MFA-MLS technique to sustain a secure, privacy-preserving cloud environment. Since the password must remain secret in the cloud, a nonce value is used to encrypt it. The login phase is then executed as three-level authentication considering multiple factors. Finally, the third phase, the password-amending phase, focuses on user privacy and convenience. Our main goal is to achieve a higher level of security.
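The paper's exact message layout appears later in the text, so the following is only a minimal sketch of how HMAC-SHA 256 might bind the three registration factors together with a nonce-blinded password; the field ordering, nonce size, and fingerprint encoding are assumptions, not the authors' construction.

```python
import hashlib
import hmac
import secrets

def register(user_id: str, password: str, fp_template: bytes, server_key: bytes):
    """Bind ID, password, and biometric into one HMAC-SHA256 credential.

    A fresh nonce blinds the password so the server never stores it in clear.
    """
    nonce = secrets.token_bytes(16)
    blinded_pw = hashlib.sha256(nonce + password.encode()).digest()
    message = user_id.encode() + blinded_pw + fp_template
    tag = hmac.new(server_key, message, hashlib.sha256).hexdigest()
    return nonce, tag            # server stores (user_id, nonce, tag)

def verify(user_id, password, fp_template, server_key, nonce, stored_tag):
    """Recompute the tag and compare in constant time."""
    blinded_pw = hashlib.sha256(nonce + password.encode()).digest()
    message = user_id.encode() + blinded_pw + fp_template
    candidate = hmac.new(server_key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(candidate, stored_tag)

key = secrets.token_bytes(32)
nonce, tag = register("alice", "s3cret", b"\x01\x02fp-template", key)
print(verify("alice", "s3cret", b"\x01\x02fp-template", key, nonce, tag))  # True
```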

1.1 Contributions of this paper

The major contributions of this work are summarized as follows.

We propose a Multi-factor Authentication with Multi-Level Security (MFA-MLS) technique in a cloud environment comprising ' \(n\) ' users, a Trusted Third Party (TTP), and cloud servers.

Incorporating the user ID, password, and biometric as factors helps identify each user uniquely; a temporary password generated for a short period provides additional security.

HMAC-SHA256 secures user registration; the fingerprint biometric is protected through watermarking with the Discrete Wavelet Transform-Discrete Cosine Transform (DWT-DCT); and a Graphical One-Time Password (GOTP) based on novel grid-pair shuffling of grid images is generated for each user, supporting a higher level of security.

The password amending phase serves user convenience: the password can be altered using an octet key obtained from the TTP, generated through a logical OR operation and valid for a limited period, ensuring both ease of use and security.

1.2 Organization of the paper

The subsequent sections of this paper are structured as follows: Section II provides a comprehensive review of previous research on the topic of multifactor authentication. Section III identifies the issues and limitations of previous multi-factor authentication solutions. Section IV introduces the suggested MFA-MLS approach and explains how it works. Section V covers the experimental evaluation, which includes a full study of the suggested technique and comparison results. Finally, Section VI summarizes this study on multi-factor authentication in cloud contexts, identifying interesting areas for future research and enhancements.

2 Literature review

Cloud Access Management (CAM) with authentication uses multiple factors to perform mutual authentication between the server and the cloud service provider [ 10 ]. All registered users are validated in an authentication phase that includes low-level, medium-level, and high-level authentication; the levels evaluate three factors: CAPTCHA, One-Time Password, and IMEI number. The CAPTCHA is very simple, and any person with knowledge of the language and numerals will pass this level. The IMEI in the third level is unique, but when the mobile device is stolen or lost, the secrecy is broken. Data confidentiality was maintained by the Cyclic Shift Transposition Algorithm (CSTA) and an authentication protocol with QR codes [ 11 , 12 ]. Partitioning and shifting are the two operations in encryption and decryption that protect data from hackers. A QR code is much better than a barcode, but QR-code scanning is available only on advanced smartphones, so these schemes are feasible only for Android users.

Mansour et al. [ 13 ] designed an MFA based on multiple biometrics for real applications, supporting remote users who access the system from anywhere via the internet. Biometric data (fingerprint, iris, or other) are projected into keys using the Keyed Random Projections and Arithmetic Hashing (KRP-AH) scheme. Multiple biometrics are integrated using the FaMSL-MBS scheme, which extracts features for matching; tokens and passwords stored in the database are also matched. Storage remains a challenge due to the use of multiple biometrics for each user.

Nedjah et al. [ 14 ] chose the iris as the biometric for authentication. Iris matching is carried out through segmentation, normalization, and binary code formation. The Hough transform predicts geometric objects, but the iris is a complicated biometric: the default matching method sometimes accepts an unauthorized iris, lacking accuracy. Biometrics in MFA require accurate matching algorithms rooted in image processing. A biometric-based Multi-Factor Authenticated Key Exchange (MFAKE) protocol was proposed [ 15 ] for robustness. The MFAKE protocol follows a parallel mode for simultaneous authentication, matching templates with a fuzzy extractor. An authentication challenge is created between the server and the client; on exchanging the Message Authentication Code (MAC), the challenge is completed if a valid key is exchanged. The MFAKE protocol involves lengthy computations and becomes more complex at peak hours. To maintain robustness, a 3FA scheme was proposed for the multi-server environment [ 16 ]. Password, smart card, and biometric are the three factors, alongside a unique ID. The registration center validates the user within a threshold time; this procedure comprises several computations, leading to higher time consumption as the number of users increases.

He et al. [ 17 ] designed a three-factor authentication scheme secure against denial-of-service, replay, and password-guessing attacks. Burrows-Abadi-Needham (BAN) logic is followed for authentication, and a USB Mass Storage Device (MSD) stores the computations during authentication and validation. However, the storage capacity of a USB-MSD is limited, so either old computations must be erased or the storage device must be replaced when it becomes full. Mouse behavior is also considered as a factor: the movement of the device is measured to determine distance and pattern, mapping user input against the registered signature [ 18 ]. A token is then generated by an open-source application with QR-code registration, and an OTP is generated by scanning the QR code. MFA is designed to support several applications for remote users; an online banking application with MFA was discussed in [ 19 ]. An exclusive-OR operation integrates the user's fingerprint with a generated secret key, and the user's identity is also encrypted and stored. During authentication, the fingerprint is hashed with a random number and matched against the received OTP to proceed with online transactions. If the user fails to recall the random number, accessing the cloud becomes difficult; moreover, the hash function supports data integrity but not authentication, so this procedure carries higher risk for online banking applications. MFA schemes have typically included a smart device, an OTP, and other factors to guarantee higher security.

An Identity-Based Encryption (IBE) mechanism was used to provide enhanced security protection [ 20 ]. This method stores data in encrypted form using two factors: a unique identifier and a security device. Although security maintenance was good, stored data could not be retrieved if the security device was lost (broken) or stolen. MFA in the cloud has been addressed by several research works considering different significant factors; however, previous works provided security and privacy using single-factor, two-factor, and three-factor models, among which multi-factor proved the most effective for assuring a higher level of security. The limitations and problems of previous algorithms and techniques are stated in the next section.

3 Problem statement

The cloud environment requires security maintenance for user data and must inhibit the involvement of unauthorized users. The authors of [ 21 ] designed authentication between client and server based on installing a pass-generator application. The server verifies the ID and password and then requests an OTP from the pass generator; the client likewise requests its own OTP. Both OTPs are matched, and authentication is approved. The pass-generator application must be installed, which requires a smart device, and even users who carry smart devices may not be authorized to use such apps. Furthermore, the user password is stored directly in the cloud without any encryption, so it can be exploited by hackers. MFA-MB was proposed in an environment of cloud and trusted third party with Class-Authentication Rules (CAR) [ 22 ]. MFA-MB could not handle the different credential factors from users, since it focused on a matching process that failed to provide a higher security level.

AKE protocols provide authentication according to the factors used. In the Two-Factor AKE (TF-AKE) of [ 23 ], the user ID is encrypted, and a security device (smart card) stores the user's encrypted data. The user-assigned password is appended with a nonce value to avert information leakage. Two factors are not sufficient to uphold a higher level of security, and the security device must be carried everywhere the user goes. Elliptic Curve Cryptography (ECC) was utilized in a 3FA scheme including passwords, smart cards, and biometrics [ 24 ]; ECC incurs space complexity, since the encrypted data tend to be larger. Authentication was also provided with a combination of images: the Graphical One-Time Password (GOTPass) authentication system balances usability and security [ 25 ]. Images are organized in a 4 × 4 panel that generates an OTP based on the selected images. GOTPass assigns row and column values corresponding to the images; hence it is vulnerable to shoulder surfing and other attacks. Many research works focus on cloud authentication through factors incorporated into mechanisms and techniques. To solve the problems and limitations of previous cloud authentication environments, this paper focuses on MFA with novel procedures that sustain highly secure data. The MFA-MLS technique can therefore be applied to applications (banking, health care, administration of any organization, etc.) where sensitive data are stored [ 26 ].

4 Proposed system model

The notations used throughout the paper are listed as follows,

4.1 Overview

Authentication has become mandatory in cloud computing environments, especially in applications that store sensitive data. We propose a novel Multi-Factor Authentication with Multi-Level Security (MFA-MLS) technique in a cloud environment. The MFA-MLS technique is executed in a novel architecture that encompasses registered users, a Trusted Third Party (TTP), and the cloud. The goal of the MFA-MLS technique is to achieve the robustness that previous works, with their poor performance and weak authentication, lack; this is a significant factor in MFA technique design. Usability refers to the active participation of people using the MFA-MLS technique, i.e., a user-friendly technique; constraints such as users' unwillingness to carry additional security devices and long passwords that are difficult to remember are taken into account. Regarding privacy, recent studies acknowledge that biometrics inhibit information leakage; therefore authentication cannot be broken and remains strong. Efficiency is improved by avoiding complex authentication protocol designs with heavy mathematical computations and computation costs. The MFA-MLS technique involves minimal computation and resists attacks. Authentication is mandatory, since adversarial users work to access secure data beyond their permitted limits [ 27 ].

Figure 1 illustrates our proposed system model using the MFA-MLS technique in a cloud environment with ' \(N\) ' users. The MFA-MLS technique is divided into the following phases: (i) registration phase, (ii) login phase, and (iii) password amending phase. Initially, each user registers with the cloud using the authentication factors \({U}_{ID}\) , \({U}_{PW}\) , and \({U}_{FP}\) . The fingerprint is the biometric taken into account in the MFA-MLS technique. \({U}_{PW}\) authentication is supported by the HMAC-SHA256 algorithm, and \({U}_{FP}\) is watermarked for authentication using DWT-DCT. The TTP generates \({Oc}_{K}\) from \({U}_{PW}\) and \({U}_{FP}\) , which is stored for future use in the password amending phase. The login phase involves three levels of authentication to grant the user the appropriate access limit in the cloud: the first level verifies the user password, the second level verifies \({G}_{OTP}\) , and the third level verifies the watermarked \({U}_{FP}\) . For user convenience, the password amending phase allows the registered password to be changed; after verification of the generated \({Oc}_{K}\) , the \({PU}_{PW}\) is changed into \({NU}_{PW}\) . Our MFA-MLS technique consists of the following entities:

Trusted Third Party (TTP): It is responsible for generating octet keys and verifying the multiple factors while \(U\) performs the login phase. It also acts as a trusted intermediary between the user and the Cloud Service Provider (CSP). In the MFA-MLS technique, the TTP plays an important role in sustaining authentication.

User: The central entity in this paper is the user, who performs authentication with the CSP via the TTP. Each user is issued a nonce value and an octet key after registration to maintain trust in the data held at the CSP.

Cloud Service Provider: The CSP provides services only to authorized (yet anonymous) users. The CSP works together with the TTP during the authentication process.

Fig. 1 System model

4.2 MFA-MLS technique

To solve the flaws and limitations illustrated in Section III, we propose the novel MFA-MLS technique in a cloud environment. The technique is performed in three sequential phases, as mentioned in the previous section. Users utilize the cloud environment only after creating an account through registration; users' registered details are stored at the TTP and the CSP.

4.2.1 Registration phase

The MFA-MLS technique is initiated with this phase for all users who wish to become an \({Ac}_{H}\) for secure service access from the cloud. \(U\) registers with \({U}_{ID}\) , \({U}_{PW}\) , and \({U}_{FP}\) , i.e., 'something users know' and 'something users are':

Initialization: The user requests \({Nc}_{v}\) from the TTP by providing their unique \({U}_{ID}\) . The TTP forwards \({U}_{ID}\) to the CSP, which generates \({Nc}_{v}\) for the purpose of securing the password. The generated \({Nc}_{v}\) needs to be stored for future use, since it is required when making changes to \({U}_{PW}\) .

Key generation: \({Nc}_{v}\) is a pseudo-random number issued by the CSP via the TTP for authentication. Upon receiving \({Nc}_{v}\) , the TTP checks whether \({U}_{ID}\) has requested registration; if so, the TTP proceeds with key generation for \(U\) . Two distinct random prime numbers are assumed, defined as \({P}_{k}\) and \({S}_{k}\) . \({Nc}_{v}\) and \({P}_{k}\) are sent to \(U\) via the TTP.
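As a minimal sketch (in Java, the paper's implementation language), the nonce and the prime pair could be produced as follows; the bit lengths, class name, and method names are illustrative assumptions rather than the paper's specification:

```java
import java.math.BigInteger;
import java.security.SecureRandom;

public class KeyGeneration {
    private static final SecureRandom RNG = new SecureRandom();

    // Nc_v: a pseudo-random nonce issued by the CSP via the TTP.
    static byte[] nonce() {
        byte[] ncv = new byte[16];   // 128-bit nonce (length is an assumption)
        RNG.nextBytes(ncv);
        return ncv;
    }

    // Two distinct random primes P_k and S_k.
    static BigInteger[] primePair(int bits) {
        BigInteger pk = BigInteger.probablePrime(bits, RNG);
        BigInteger sk;
        do {
            sk = BigInteger.probablePrime(bits, RNG);
        } while (sk.equals(pk));     // ensure the primes are distinct
        return new BigInteger[] { pk, sk };
    }
}
```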

Encryption: \(U\) encrypts \({Nc}_{v}({U}_{PW})\) using the HMAC-SHA256 (Hash-based Message Authentication Code with SHA-256) algorithm. This is essential to ensure the security of user credentials during the registration step, combining the security capabilities of HMAC and SHA-256. The approach uses the SHA-256 hash function to construct a unique hash value, or message digest, based on the user's ID, password, and fingerprint data. This hash value is a secure representation of the user's credentials, ensuring confidentiality, integrity, and validity while preventing unauthorized access and modification [ 28 ]. The HMAC-SHA256 algorithm's specifications are illustrated in Table 1.

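A minimal sketch of this step with the standard Java crypto API, keying HMAC-SHA256 with the nonce \({Nc}_{v}\) ; the class and method names are illustrative assumptions:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;

public class PasswordDigest {
    // Nc_v(U_PW): HMAC-SHA256 tag of the password, keyed with the nonce.
    static byte[] encryptPassword(byte[] nonceKey, String password) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(nonceKey, "HmacSHA256"));
        return mac.doFinal(password.getBytes(StandardCharsets.UTF_8));
    }
}
```

Under this reading, it is the resulting tag, not the raw password, that is forwarded for storage, so a leaked record reveals neither the password nor the nonce.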

After secure \({U}_{PW}\) storage, user registration continues with the biometric, owing to its ability to differentiate a particular person from others. To inhibit adversaries, \({U}_{FP}\) is stored securely by watermarking the biometric: \({U}_{FP}\) is embedded into an image to form a watermarked image. The Discrete Wavelet Transform (DWT) and the Discrete Cosine Transform (DCT) are integrated to strengthen the security of the watermarked fingerprint image.

Watermarking the fingerprint: \({U}_{FP}\) is embedded into an image ( \(Im\) ) at minimized size by selecting minutiae points from \({U}_{FP}\) and then applying DWT-DCT, as shown in Fig. 2. Before watermarking \({U}_{FP}\) , binarization, thinning, and minutiae-point extraction are performed. In binarization, \({U}_{FP}\) is converted into a binary image ( \({B}_{I}\) ), and a copy of this image is stored for a later process ( \({Oc}_{K}\) generation). Ridge extraction plays a significant role in collecting exact information; in the obtained image, pixel value ' \(0\) ' represents the background and ' \(1\) ' the informative areas. Image thinning eliminates redundant ridge pixels, minimizing the width of the ridges and constructing a skeleton image. Minutiae points are extracted by evaluating each central ridge pixel against its neighboring pixels; discontinuities in the ridges define the minutiae points. When the central pixel is 1 and exactly one neighboring pixel is 1, the point is a ridge ending (three ridge neighbors indicate a bifurcation) and is stored for the next process [ 29 ].
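One common way to implement this classification on the thinned binary image is the crossing-number test; the following sketch is our formulation under that assumption, not necessarily the authors' exact rule:

```java
public class MinutiaeDetector {
    // Offsets of the 8 neighbors of a pixel, in circular (clockwise) order.
    private static final int[] DR = { -1, -1, -1, 0, 1, 1, 1, 0 };
    private static final int[] DC = { -1, 0, 1, 1, 1, 0, -1, -1 };

    // Crossing number of ridge pixel (r, c) in a thinned 0/1 image:
    // half the number of 0/1 transitions around the neighborhood.
    // CN == 1 marks a ridge ending; CN == 3 marks a bifurcation.
    static int crossingNumber(int[][] img, int r, int c) {
        int transitions = 0;
        for (int i = 0; i < 8; i++) {
            int a = img[r + DR[i]][c + DC[i]];
            int b = img[r + DR[(i + 1) % 8]][c + DC[(i + 1) % 8]];
            transitions += Math.abs(a - b);
        }
        return transitions / 2;
    }

    static boolean isRidgeEnding(int[][] img, int r, int c) {
        return img[r][c] == 1 && crossingNumber(img, r, c) == 1;
    }
}
```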

Fig. 2 Watermarking fingerprint

Minutiae points extracted from \({U}_{FP}\) are applied to the DWT-DCT algorithm. \({U}_{FP}\) is divided into sub-bands: (i) the low-frequency band (LL) and (ii) the high-frequency bands (HL, LH, HH). \(Im\) is chosen to be watermarked into \({U}_{FP}\) over the high-frequency sub-bands. DCT is integrated to perform compression, quantization, and binary conversion. The 2-dimensional DCT applied to \({U}_{FP}\) is expressed as:
$$D(i,j)={{U}_{FP}}_{i}\,{{U}_{FP}}_{j}\sum_{x=0}^{N-1}\sum_{y=0}^{N-1}{U}_{FP}(x,y)\cos\left[\frac{(2x+1)i\pi }{2N}\right]\cos\left[\frac{(2y+1)j\pi }{2N}\right] \quad (1)$$
From (1), the pixel intensities of \({U}_{FP}\) , which has \(N\times N\) dimensions with \(i\) rows and \(j\) columns, are transformed; \(D(i,j)\) denotes the DCT coefficient in row \(i\) and column \(j\) of the DCT matrix. The terms \({{U}_{FP}}_{i}\) and \({{U}_{FP}}_{j}\) are given as,
$${{U}_{FP}}_{i}=\begin{cases}\sqrt{1/N}, & i=0\\ \sqrt{2/N}, & i>0\end{cases}\qquad {{U}_{FP}}_{j}=\begin{cases}\sqrt{1/N}, & j=0\\ \sqrt{2/N}, & j>0\end{cases} \quad (2)$$
\({U}_{FP}\) is divided into 8 × 8 pixel blocks, each comprising 64 values; blocks are selected from left to right and top to bottom. The obtained 8 × 8 DCT coefficients are compressed through quantization. According to the level of compression, the quantization matrix is selected, keeping 32 coefficients in zigzag order. The quantized matrix contains both positive and negative values; a rounding operation is applied to obtain the final compressed binary array of \(Im\) . \({U}_{FP}\) is embedded into the binary array \(Im\) with respect to the sub-bands. The method undergoes testing at regular intervals, with updates and improvements made as needed; daily automated tests ensure the integrity and security of the system.
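To make the block transform concrete, the sketch below applies Eq. (1) to one 8 × 8 block and then selects the first 32 coefficients in zigzag order; it is a naive reference implementation for illustration, not the paper's optimized code:

```java
public class BlockDct {
    static final int N = 8;

    // 2-D DCT-II of one 8x8 pixel block, following Eq. (1) with N = 8.
    static double[][] dct(double[][] f) {
        double[][] d = new double[N][N];
        for (int i = 0; i < N; i++) {
            for (int j = 0; j < N; j++) {
                double sum = 0.0;
                for (int x = 0; x < N; x++)
                    for (int y = 0; y < N; y++)
                        sum += f[x][y]
                             * Math.cos((2 * x + 1) * i * Math.PI / (2.0 * N))
                             * Math.cos((2 * y + 1) * j * Math.PI / (2.0 * N));
                double ai = (i == 0) ? Math.sqrt(1.0 / N) : Math.sqrt(2.0 / N);
                double aj = (j == 0) ? Math.sqrt(1.0 / N) : Math.sqrt(2.0 / N);
                d[i][j] = ai * aj * sum;
            }
        }
        return d;
    }

    // First k coefficients of the block in zigzag order (low frequencies first).
    static double[] zigzag(double[][] d, int k) {
        double[] out = new double[k];
        int idx = 0, r = 0, c = 0;
        boolean up = true;
        while (idx < k) {
            out[idx++] = d[r][c];
            if (up) {
                if (c == N - 1)      { r++; up = false; }
                else if (r == 0)     { c++; up = false; }
                else                 { r--; c++; }
            } else {
                if (r == N - 1)      { c++; up = true; }
                else if (c == 0)     { r++; up = true; }
                else                 { r++; c--; }
            }
        }
        return out;
    }
}
```

Calling zigzag(dct(block), 32) reproduces the 32-coefficient zigzag selection described above.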

Let \(P\) be the probability of detecting a security vulnerability in a single test. The probability of not detecting a vulnerability in a single test is \(1-P\) , and with \(n\) independent tests the probability of not detecting it in any of them is \((1-P)^{n}\) . Therefore, the probability of detecting at least one vulnerability in \(n\) tests is \(1-(1-P)^{n}\) . By increasing the number of tests \(n\) , the probability of detecting vulnerabilities approaches 1, ensuring rigorous testing.
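A one-method sketch of this calculation (the values in main are illustrative, not measured ones):

```java
public class DetectionProbability {
    // Probability of detecting at least one vulnerability in n independent
    // tests, each of which detects it with probability p: 1 - (1 - p)^n.
    static double atLeastOnce(double p, int n) {
        return 1.0 - Math.pow(1.0 - p, n);
    }

    public static void main(String[] args) {
        System.out.println(atLeastOnce(0.2, 10));  // ~0.89 for p = 0.2, n = 10
    }
}
```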

The DCT of the LH1 sub-band is retrieved from the division of the high-frequency sub-bands. The inverse DCT and DWT are then computed to watermark \({U}_{FP}\) into the image, expressed as:
$${W}_{Im}=\mathrm{IDWT}\left(LL,\,HL,\,\mathrm{IDCT}\left({D}_{LH}^{\prime}\right),\,HH\right) \quad (3)$$

where \({D}_{LH}^{\prime}\) denotes the LH1 DCT coefficients after embedding.
Equation (3) yields a watermarked fingerprint that is sent to the CSP for secure storage. The quality of the watermarked \({U}_{FP}\) is sustained; to minimize space complexity, minutiae points are extracted from the thinned fingerprint image and then passed through the DWT-DCT algorithm for hiding in the watermark.

Octet key generation: Combining \({U}_{PW}\) and \({U}_{FP}\) with a logical OR operation produces \({Oc}_{K}\) . \({B}_{I}\) and \({U}_{PW}\) are converted into 8-bit binary values, and from the generated value the 8-bit \({Oc}_{K}\) is selected from either the Most Significant Bits (MSB) or the Least Significant Bits (LSB).


\({Oc}_{K}\) is generated at the TTP and sent to \(U\) ; it is required when making changes to the password.
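A sketch of the derivation under our reading of the scheme: the password bits and the binarized-fingerprint bits are OR-ed byte-wise, and the eight bits at the most- or least-significant end of the result are kept; the exact selection rule is an assumption:

```java
public class OctetKeyGenerator {
    // Oc_K: OR the password bits with the binarized-fingerprint bits and take
    // eight bits from either end of the OR-ed stream.
    static int octetKey(byte[] pwBits, byte[] fpBits, boolean fromMsb) {
        int n = Math.min(pwBits.length, fpBits.length);
        byte[] ored = new byte[n];
        for (int i = 0; i < n; i++) {
            ored[i] = (byte) (pwBits[i] | fpBits[i]);   // logical OR
        }
        // Eight most-significant bits = first byte of the stream;
        // eight least-significant bits = last byte of the stream.
        return (fromMsb ? ored[0] : ored[n - 1]) & 0xFF;
    }
}
```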

4.2.2 Login phase

The login phase comprises three levels of authentication that permit each \(U\) into the cloud to access files by viewing, uploading, or downloading. Only registered users are permitted to participate in the cloud. When a user \(U\) wishes to log in to the CSP, \(U\) has to enter identity, password, and fingerprint details.

First level authentication (FLA)

\(U\) submits his/her identity and password to the TTP. During the registration phase, the TTP stored each \(U\) 's \({Nc}_{v}\) along with \({U}_{ID}\) and \({U}_{PW}\) .

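Assuming the TTP stores the HMAC-SHA256 tag computed at registration, FLA reduces to recomputing and comparing tags; MessageDigest.isEqual gives a constant-time comparison:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class FirstLevelAuth {
    // Recompute the Nc_v-keyed digest of the submitted password and compare
    // it with the digest stored at registration.
    static boolean verify(byte[] nonceKey, String submittedPassword,
                          byte[] storedTag) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(nonceKey, "HmacSHA256"));
        byte[] tag = mac.doFinal(submittedPassword.getBytes(StandardCharsets.UTF_8));
        return MessageDigest.isEqual(tag, storedTag);  // constant-time compare
    }
}
```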

Second level authentication (SLA)

Upon completing FLA, the cloud provides an image to move \(U\) to the next level, SLA. A unique image is sent to \(U\) via the e-mail address provided during registration.

A 5 × 5 graphical grid is constructed from 25 flower images that include the image provided to \(U\) . Figure 3 shows a grid with columns (C1, C2, C3, C4, C5) and rows (R1, R2, R3, R4, R5), explaining two instances of GOTP using grid-pair-based authentication. The circled flower image (C3R3) is the OTP sent to \(U\) from the CSP. \(U\) receives a grid of images and needs to select two different images from the grid. The logic is that the intersection of the selected images must correspond to the image received from the CSP: selecting any one image in C3 and any one image in R3 completes SLA, as shown in the sketch after Fig. 3. A \(U\) completing this level is permitted to view and download files from the CSP.

Fig. 3 Graphical OTP
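The grid-pair rule can be captured in a few lines; this sketch assumes the two picks must lie, in either order, in the secret image's column and row:

```java
public class GridPairOtp {
    // SLA check for a 5x5 grid: the two selected cells pass when one lies in
    // the secret image's column and the other in its row, so their
    // intersection is the cell the CSP mailed to the user (e.g., C3R3).
    static boolean verify(int secretRow, int secretCol,
                          int r1, int c1, int r2, int c2) {
        boolean colThenRow = (c1 == secretCol) && (r2 == secretRow);
        boolean rowThenCol = (r1 == secretRow) && (c2 == secretCol);
        return colThenRow || rowThenCol;
    }
}
```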

Third level authentication (TLA)

In TLA, \(U\) submits a fingerprint for final-level verification. The user fingerprint is validated against the watermarked fingerprint stored at the CSP: the inverse DWT-DCT process is applied to obtain \({U}_{FP}\) from the watermarked image. The minutiae points of \({U}_{FP}\) are matched, and valid users are granted complete access to the cloud, i.e., viewing, uploading, and downloading files.

4.2.3 Password amending phase

For user convenience, the registered password can be changed to a new one. \(U\) requests a password change from the cloud via the TTP. To validate the authenticity of \(U\) , the TTP asks \(U\) for the generated \({Oc}_{K}\) ; on receiving it, the TTP matches it against the previously stored \({Oc}_{K}\) . Once \({Oc}_{K}\) matches, \(U\) is deemed authorized, and the TTP notifies the cloud to generate \({Nc}_{v}\) , since users can change their password only with the \({Nc}_{v}\) value. \(U\) then receives the \({Nc}_{v}\) value for changing the password, which is subsequently encrypted and stored in the cloud.


The generation of \({Oc}_{K}\) by applying the logical OR operation is depicted in Fig. 4. The password amending phase is not performed by all users; if a \(U\) 's password has been leaked, the \({PU}_{PW}\) can be changed into \({NU}_{PW}\) . MFA has been actively employed in cloud computing to provide a higher level of security. In this paper, the MFA-MLS technique is designed with multiple factors and multiple security levels to keep adversaries out of the cloud. The cloud hosts several applications that store sensitive and confidential data; hence single- and two-factor security is not sufficient. MFA solves this, substantially enhancing the level of security.

Fig. 4 Octet key generation

5 Performance evaluation

In this section, a set of experiments was designed to evaluate the effectiveness and performance of the MFA-MLS technique. Java is preferred for implementing the technique, supporting secure cloud construction and comparison with previous research works. Different attacks as well as comparative results are discussed in the rest of this section to validate the proposed MFA-MLS technique [ 30 ]. Table 2 illustrates the problems and limitations of previous algorithms, due to which their performance is low and the level of secrecy in the cloud is reduced. These limitations are solved in this work through the novel design of the MFA-MLS technique in a cloud environment.

Table 3 depicts the security provided in different research works. Most MFA schemes are supported by cloud and key-management algorithms to resist several data-gathering attacks. All types of attacks should be inhibited by predicting their behavior and authenticating each participating user to determine their originality [ 31 , 32 ].

5.1 Implemented environment

This work is implemented using Java, which supports the server configurations. Our design involves three significant entities: users, the TTP, and the cloud service provider. The TTP acts as an intermediary between the cloud and the user for secure data storage, providing data integrity and confidentiality. Table 4 lists the specifications taken into account in our implementation of the cloud environment. The listed specifications are not limited to these entities; the number of users is increased, and their performance is validated with the MFA-MLS technique.

5.2 Security analysis

Credential change: In the MFA-MLS technique, users can change their password at any time, as detailed in Sect. 4.2.3. During a password change, the TTP verifies the generated octet key to ensure that the requester is the original cloud user. Man-In-the-Middle Attack (MIMA): Such attacks cannot succeed while the MFA-MLS technique is in place; even knowing a \(U\) 's ID and password, the attacker is unable to log in to the cloud. In MFA-MLS, the password is encrypted and stored using the nonce value, so the nonce value is mandatory to pass FLA.

Unauthorized access attack: Unauthorized access is very difficult; a single factor gathered by an attacker is not sufficient to log in to an account with unlimited cloud accessibility. In MFA-MLS, the password is encrypted and stored, and the fingerprint is likewise watermarked and stored; hence gathering a \(U\) 's registered factors from the cloud is complex. Password guessing attack: In the MFA-MLS technique, even if an attacker guesses or predicts the password, he/she cannot log in without knowing the nonce value. Impersonation attack: An impersonator poses within the CSP to cheat \(U\) ; however, even if this attacker obtains \(U\) 's factors, they cannot be decrypted, since \({S}_{k}\) is present only at the TTP. The MFA-MLS technique therefore resists this attack.

Forward secrecy: The level of secrecy must be sustained while forwarding sensitive data from an end user to the storage environment. Forward secrecy concerns the retrieval of users' long-term secret keys to predict previously used session keys; from these keys, the session-key generation procedure can be determined. In the MFA-MLS technique, session keys are not generated, and the involvement of the TTP supports highly secure transmission of the keys generated for each factor stored in the cloud. The MFA-MLS technique thus performs well against attackers who try to gather sensitive data either by retrieving secret keys or by intercepting data during transmission. Although the cloud is aware of attackers, it remains mandatory to provide security by validating each user before proceeding to data storage.

Table 5 illustrates the factors utilized by several authors in their research works. The level of secrecy increases when more than one factor is considered for authentication verification; MFA is required when more sensitive data are stored.

The MFA-MLS technique uses ID, password, fingerprint, and OTP as factors. Before storing sensitive data, each user must complete the registration process so that legitimate users can be recognized during login. Factors obtained from users are stored securely in encrypted and watermarked form. This secures the stored factors beyond the knowledge of the cloud. The intermediate party is trusted and is responsible for sustaining the secrecy level between the user and the cloud. The results obtained for the MFA-MLS technique are shown in Table 6.

5.3 Comparative results

In this section, the MFA-MLS technique is compared in detail with previous authentication works. The results are compared to analyze the improvement achieved by the MFA-MLS technique, which supports its use in real-world cloud-assisted applications.

5.3.1 User experience

User experience is defined as "a person's perceptions and responses resulting from the use of a product, system, or service"; it includes the user's opinions, sensations, preferences, activities, and psychological responses, either during or after usage [ 22 ]. User experience does not remain static; it varies dynamically over time, since usage circumstances change with individuals' thoughts. User experience is validated with the following expression,
$${U}_{ex}=1-\frac{{Nb}_{di}}{{Nb}_{Auth}}$$
Figure 5 shows the obtained \({U}_{ex}\) computed from the above expression, where \({Nb}_{Auth}\) denotes the number of authentications performed by the user and \({Nb}_{di}\) denotes the number of discards during the user login process. The \({U}_{ex}\) value ranges between \(0\) and \(1\) ; as discards increase, \({U}_{ex}\) gradually drops from \(1\) . Therefore, the number of discards needs to be minimized to improve \({U}_{ex}\) ; compared with previous work, the MFA-MLS technique shows improvements in \({U}_{ex}\) .

Fig. 5 User experience

5.3.2 Effectiveness

Effectiveness is evaluated using the following measures: success rate, failure rate, and mean login time. The proposed system is compared with the three-factor authentication protocol based on an elliptic curve cryptosystem (TFP-ECC), Multifactor Authentication with Multimodal Biometrics (MFA-MB), and two-factor data security protection (T-FDSP).

In the comparison, the suggested method surpasses existing strategies in terms of Mean Login Time (MLT), with a significantly lower value of 10 ms than TFP-ECC, MFA-MB, and T-FDSP. Furthermore, the proposed technique achieves a better Average Success Rate (SR) of 94.4%, outperforming TFP-ECC and T-FDSP, and the lowest Average Failure Rate (FR) of 6%, indicating authentication performance above all other techniques.

Figures 6 and 7 represent the Success Rate (SR) and Failure Rate (FR) obtained during authentication over three sequential trials. The SR results in Fig. 6 show approximately 95% successful attempts. The three trials show nearly similar behavior and achieve more than 90% SR, supporting authorized user login. The technique's SR and FR change with the number of user logins in each trial. FR likewise decreases in the MFA-MLS technique compared with the existing methods, which shows that the MFA-MLS technique is much better at providing authentication to all participating users. The average SR and FR over the three trials are listed in Table 7, in which both SR and FR show about a 1% improvement for the MFA-MLS technique relative to previous research work.

Fig. 6 Success rate

Fig. 7 Failure rate

5.3.3 Time analysis

This comparative result depicts time variations across three constraints: login time, the technique's simplicity, and authentication convenience. The time variations are highlighted in Fig. 8 with respect to the available evaluated data of the MFA-MLS technique and a previous authentication technique.

Fig. 8 Time analysis

These results demonstrate that the MFA-MLS technique supports a large number of users while increasing authentication accuracy and enhancing reliability. The reduction in time consumption makes the MFA-MLS technique attractive to cloud users, providing a secure storage environment in a short time.

6 Conclusions

In this paper, we propose a novel Multi-Factor Authentication with Multi-Level Security (MFA-MLS) technique. We develop an enhanced MFA technique that improves user experience by combining the authentication process with secure storage. Although the proposed factors exist in previous work, their secure storage is required due to the involvement of several types of attackers. The effectiveness of the MFA-MLS technique is shown by constructing a secure environment of users, TTP, and cloud. The TTP is more trusted than the cloud, so it plays a major role in transferring factors from users in a highly secure manner. The user password is encrypted along with the generated nonce value using HMAC-SHA256 and sent to the cloud via the TTP. The second factor is a biometric, the user fingerprint, which is watermarked and stored; image processing techniques are used for watermarking, and to minimize the size of the fingerprint we extract the minutiae points for watermarking. A combination of DWT and DCT is used to watermark the selected minutiae points from the given user fingerprint. The third factor is the computation of the GOTP, where a pair of images is selected from the grid with respect to their intersection. Our MFA-MLS technique also supports a password changing process that enhances user convenience, allowing the password to be modified using the generated octet key. Overall, the MFA-MLS technique in the cloud is a step forward in the secure storage of authentication factors discussed in prior research. In future work, we plan to equip the TTP with a neural network to further increase processing speed and Quality of Service.

Data availability

Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.

References

1. Babu JA, Neha HP, Babu KS, Pinto RN (2022) Secure data retrieval system using biometric identification. In: 2022 IEEE international conference on data science and information system (ICDSIS), pp 1–4

2. Zhou Z, Yang C, Yang Y, Sun X (2019) Polynomial-based Google map graphical password system against shoulder-surfing attacks in cloud environment. Complexity 2019:2875676

3. Kameswara Rao M, Usha Switha T, Naveen S (2016) A novel graphical password authentication mechanism for cloud services. In: Information systems design and intelligent applications: proceedings of third international conference INDIA 2016, vol 1. Springer, India, pp 447–453

4. Adamu HI, Mohammed AD, Adepoju SA, Aderiike AO (2022) A three-step one-time password, textual and recall-based graphical password for an online authentication. In: 2022 IEEE Nigeria 4th international conference on disruptive technologies for sustainable development (NIGERCON), pp 1–5

5. Chimakurthi BN, Prakash KB (2022) Image and video-based graphical password authentication. In: 2022 first international conference on electrical, electronics, information and communication technologies (ICEEICT), pp 1–8

6. Kenneth MO, Olujuwon SM (2021) Web application authentication using visual cryptography and cued clicked point recall-based graphical password. Journal of Computer Science Research 3(3):29–41

7. Takada T, Yoshida M (2021) Pict-Place authentication: recognition-based graphical password using image layout for better balance of security and operation time. In: CHItaly 2021: 14th biannual conference of the Italian SIGCHI chapter

8. Chu X, Sun H, Chen Z (2020) PassPage: graphical password authentication scheme based on web browsing records. In: Financial cryptography workshops

9. Nandi P, Savant DP (2022) Graphical password authentication system. Int J Res Appl Sci Eng Technol 10(4):1759–1765

10. Banyal RK, Jain P, Jain VK (2013) Multi-factor authentication framework for cloud computing. In: Fifth international conference on computational intelligence, modelling and simulation. IEEE, pp 105–110

11. Neela KL, Kavitha V (2017) Enhancement of data confidentiality and secure data transaction in cloud storage environment. Cluster Computing. Springer, pp 1–10

12. Divya R, Muthukumarasamy S (2015) An impervious QR-based visual authentication protocols to prevent black-bag cryptanalysis. In: 2015 IEEE 9th international conference on intelligent systems and control (ISCO), pp 1–6

13. Mansour A, Sadik M, Sabir E (2015) Multi-factor authentication based on multimodal biometrics (MFA-MB) for cloud computing. In: IEEE/ACS 12th international conference of computer systems and applications, pp 1–4

14. Nedjah N, Wyant RS, Mourelle LM, Gupta BB (2017) Efficient yet robust biometric iris matching on smart cards for data high security and privacy. Future Generation Computer Systems. Elsevier

15. Zhang R, Xiao Y, Sun S, Ma H (2017) Efficient multi-factor authenticated key exchange scheme for mobile communications. IEEE Trans Dependable Secure Comput 16(4):625–634

16. Chandrakar P, Om H (2017) A secure and robust anonymous three-factor remote user authentication scheme for multi-server environment using ECC. Computer Communications. Elsevier

17. He D, Kumar N, Lee JH, Sherratt RS (2014) Enhanced three-factor security protocol for consumer USB mass storage devices. IEEE Trans Consum Electron 60(1):30–37

18. Hem D, Bhanumathi S (2016) Mouse behaviour based multi-factor authentication using neural networks. In: 2016 international conference on circuit, power and computing technologies (ICCPCT). IEEE, pp 1–8

19. Nagaraju S, Parthiban L (2015) Trusted framework for online banking in public cloud using multi-factor authentication and privacy protection gateway. Journal of Cloud Computing 4:1–23

20. Liu JK, Liang K, Susilo W, Liu J, Xiang Y (2016) Two-factor data security protection mechanism for cloud storage system. IEEE Trans Comput 65(6):1992–2004

21. Abdellaoui A, Khamlichi YI, Chaoui H (2016) A novel strong password generator for improving cloud authentication. In: International conference on computational modelling and security. Elsevier, pp 293–300

22. Mansour A, Sadik M, Sabir E, Azmi M (2016) A context-aware multimodal biometric authentication for cloud-empowered systems. In: International conference on wireless networks and mobile communications. IEEE, pp 278–285

23. Xie Q, Wong DS, Wang G, Tan X, Chen K, Fang L (2017) Provably secure dynamic ID-based anonymous two-factor authenticated key exchange protocol with extended security model. IEEE Trans Inf Forensics Secur 12(6):1382–1392

24. Jiang Q, Khurram Khan M, Lu X, Ma J, He D (2016) A privacy preserving three-factor authentication protocol for e-health clouds. J Supercomput 72(10):3826–3849

25. Alsaiari H, Papadaki M, Dowland P, Furnell S (2016) Graphical one-time password (GOTPass): a usability evaluation. Information Security Journal: A Global Perspective 25(1–3):94–108

26. Alkhalifah ES, Almalki FA (2023) Developing an intelligent cellular structure design for a UAV wireless communication topology. Axioms 12(2):129

27. Ilakkiya N, Rajaram A (2023) Blockchain-assisted secure routing protocol for cluster-based mobile ad-hoc networks. Int J Comput Commun Control 18(2)

28. Kaur M, Alzubi AA, Walia TS, Yadav V, Kumar N, Singh D, Lee HN (2023) EGCrypto: a low-complexity elliptic Galois cryptography model for secure data transmission in IoT. IEEE Access

29. Kaur M, AlZubi AA, Singh D, Kumar V, Lee HN (2023) Lightweight biomedical image encryption approach. IEEE Access

30. Shafiq M, Tian Z, Bashir AK, Jolfaei A, Yu X (2020) Data mining and machine learning methods for sustainable smart cities traffic classification: a survey. Sustain Cities Soc 60:102177

31. Shafiq M, Tian Z, Bashir AK, Du X, Guizani M (2020) CorrAUC: a malicious bot-IoT traffic detection method in IoT network using machine-learning techniques. IEEE Internet Things J 8(5):3242–3254

32. Shafiq M, Tian Z, Bashir AK, Du X, Guizani M (2020) IoT malicious traffic identification using wrapper-based feature selection mechanisms. Comput Secur 94:101863

Funding

No funding was involved in this work.

Author information

Authors and affiliations

Graphic Design and Digital Media Department, College of Arts and Design, Princess Nourah Bint Abdulrahman University, P.O. Box 84428, 11671, Riyadh, Saudi Arabia

Eman S. Alkhalifah


Contributions

All authors contributed equally to this work.

Corresponding author

Correspondence to Eman S. Alkhalifah .

Ethics declarations

Ethics approval and consent to participate

No human participants were involved in this implementation process.

Human and animal rights

No violation of Human and Animal Rights is involved.

Conflict of interest

Conflict of interest is not applicable to this work.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article

Alkhalifah, E.S. Password based authentication for web based graphics computing services retrieval in cloud. Multimed Tools Appl (2024). https://doi.org/10.1007/s11042-024-19044-8


Received: 21 August 2023

Revised: 22 March 2024

Accepted: 24 March 2024

Published: 19 April 2024


  • Computer Graphics
  • Visualization
  • Multi-factor Authentication
  • Fingerprint
  • Cryptography Algorithm
  • Watermarking

Fall 2024 CSCI Special Topics Courses

Cloud Computing

Meeting Time: 09:45 AM-11:00 AM TTh
Instructor: Ali Anwar
Course Description: Cloud computing serves many large-scale applications ranging from search engines like Google to social networking websites like Facebook to online stores like Amazon. More recently, cloud computing has emerged as an essential technology to enable emerging fields such as Artificial Intelligence (AI), the Internet of Things (IoT), and Machine Learning. The exponential growth of data availability and demands for security and speed has made the cloud computing paradigm necessary for reliable, financially economical, and scalable computation. The dynamicity and flexibility of cloud computing have opened up many new forms of deploying applications on infrastructure that cloud service providers offer, such as renting of computation resources and serverless computing. This course will cover the fundamentals of cloud services management and cloud software development, including but not limited to design patterns, application programming interfaces, and underlying middleware technologies. More specifically, we will cover the topics of cloud computing service models, data center resource management, task scheduling, resource virtualization, SLAs, cloud security, software defined networks and storage, cloud storage, and programming models. We will also discuss data center design and management strategies, which enable the economic and technological benefits of cloud computing. Lastly, we will study cloud storage concepts like data distribution, durability, consistency, and redundancy.
Registration Prerequisites: CS upper div, CompE upper div., EE upper div., EE grad, ITI upper div., Univ. honors student, or dept. permission; no cr for grads in CSci. Complete the following Google form to request a permission number from the instructor ( https://forms.gle/6BvbUwEkBK41tPJ17 ).

CSCI 5980/8980 

Machine Learning for Healthcare: Concepts and Applications

Meeting Time: 11:15 AM-12:30 PM TTh
Instructor: Yogatheesan Varatharajah
Course Description: Machine Learning is transforming healthcare. This course will introduce students to a range of healthcare problems that can be tackled using machine learning, different health data modalities, relevant machine learning paradigms, and the unique challenges presented by healthcare applications. Applications we will cover include risk stratification, disease progression modeling, precision medicine, diagnosis, prognosis, subtype discovery, and improving clinical workflows. We will also cover research topics such as explainability, causality, trust, robustness, and fairness.

Registration Prerequisites: CSCI 5521 or equivalent. Complete the following Google form to request a permission number from the instructor ( https://forms.gle/z8X9pVZfCWMpQQ6o6  ).

Visualization with AI

Meeting Time: 04:00 PM-05:15 PM TTh
Instructor: Qianwen Wang
Course Description: This course aims to investigate how visualization techniques and AI technologies work together to enhance understanding, insights, or outcomes.

This is a seminar style course consisting of lectures, paper presentation, and interactive discussion of the selected papers. Students will also work on a group project where they propose a research idea, survey related studies, and present initial results.

This course will cover the application of visualization to better understand AI models and data, and the use of AI to improve visualization processes. Readings for the course cover papers from the top venues of AI, Visualization, and HCI, topics including AI explainability, reliability, and Human-AI collaboration.    This course is designed for PhD students, Masters students, and advanced undergraduates who want to dig into research.

Registration Prerequisites: Complete the following Google form to request a permission number from the instructor ( https://forms.gle/YTF5EZFUbQRJhHBYA  ). Although the class is primarily intended for PhD students, motivated juniors/seniors and MS students who are interested in this topic are welcome to apply, ensuring they detail their qualifications for the course.

Visualizations for Intelligent AR Systems

Meeting Time: 04:00 PM-05:15 PM MW
Instructor: Zhu-Tian Chen
Course Description: This course aims to explore the role of Data Visualization as a pivotal interface for enhancing human-data and human-AI interactions within Augmented Reality (AR) systems, thereby transforming a broad spectrum of activities in both professional and daily contexts. Structured as a seminar, the course consists of two main components: the theoretical and conceptual foundations delivered through lectures, paper readings, and discussions; and the hands-on experience gained through small assignments and group projects. This class is designed to be highly interactive, and AR devices will be provided to facilitate hands-on learning. Participants will have the opportunity to experience AR systems, develop cutting-edge AR interfaces, explore AI integration, and apply human-centric design principles. The course is designed to advance students' technical skills in AR and AI, as well as their understanding of how these technologies can be leveraged to enrich human experiences across various domains. Students will be encouraged to create innovative projects with the potential for submission to research conferences.

Registration Prerequisites: Complete the following Google form to request a permission number from the instructor ( https://forms.gle/Y81FGaJivoqMQYtq5 ). Students are expected to have a solid foundation in either data visualization, computer graphics, computer vision, or HCI. Having expertise in all would be perfect! However, a robust interest and eagerness to delve into these subjects can be equally valuable, even though it means you need to learn some basic concepts independently.

Sustainable Computing: A Systems View

Meeting Time: 09:45 AM-11:00 AM
Instructor: Abhishek Chandra
Course Description: In recent years, there has been a dramatic increase in the pervasiveness, scale, and distribution of computing infrastructure: ranging from cloud, HPC systems, and data centers to edge computing and pervasive computing in the form of micro-data centers, mobile phones, sensors, and IoT devices embedded in the environment around us. The growing amount of computing, storage, and networking demand leads to increased energy usage, carbon emissions, and natural resource consumption. To reduce their environmental impact, there is a growing need to make computing systems sustainable. In this course, we will examine sustainable computing from a systems perspective. We will examine a number of questions:
  • How can we design and build sustainable computing systems?
  • How can we manage resources efficiently?
  • What system software and algorithms can reduce computational needs?
Topics of interest would include:
  • Sustainable system design and architectures
  • Sustainability-aware systems software and management
  • Sustainability in large-scale distributed computing (clouds, data centers, HPC)
  • Sustainability in dispersed computing (edge, mobile computing, sensors/IoT)

Registration Prerequisites: This course is targeted towards students with a strong interest in computer systems (Operating Systems, Distributed Systems, Networking, Databases, etc.). Background in Operating Systems (Equivalent of CSCI 5103) and basic understanding of Computer Networking (Equivalent of CSCI 4211) is required.


Modernize Your Cloud Governance To Match Today’s Cloud Strategy

Tracy Woo, Principal Analyst

Your cloud usage continues to grow, and the workloads you're migrating are increasingly mission-critical. Your cloud governance program must match this new reality. For this reason, along with new and developing industry regulations, growing sovereignty requirements, and a plethora of breaches and vulnerabilities, companies are revisiting their cloud governance programs or standing up ones that long-standing cloud programs have lacked.

The motivation for cloud governance is obvious. Implementation is much more difficult. Part of the issue: there are many paths to cloud governance. Some cover just cost, basic access security, and DevOps. Other paths tie in broader operations, data management, change management, and collaboration. Even the cloud providers themselves differ vastly in scope when it comes to governance framework recommendations. As with any enterprise process, it's important to start with the definition. Since starting this coverage, I've reviewed over 100 governance strategies from enterprises across the globe; across these companies, definitions vary widely. Since it is one of the top topics of 2024, I've spent the beginning of this year revamping our own cloud governance coverage, starting with the definition.

Forrester defines cloud governance as:

A set of rules, policies, and processes (implementation, enablement, and maintenance) that guides an organization’s cloud operations without breaching the parameters of risk tolerance or compliance obligations.

We developed research that manifested in three reports: Build Your Cloud Governance Framework, Assess Your Cloud Governance Maturity, and, written with my colleague Andras Cser, The Forrester Guide To Cloud Governance. In this work, the scope of cloud governance is:

  • Security: a security baseline, security toolchain options, classification of data schema, risk assessment and planning, and security policies and triggers
  • Cost: maximizing the value of cloud investments, forecasting cloud spend, leveraging automation for billing, reporting on cost and cost reduction, and enforcing cost policies
  • Identity baseline: identity authentication protocol, user/role-based permissions, designation of access groups, collaboration restrictions, identity program audits, and log activity audits
  • Resource configuration: syncing with corporate CMDB, reusable templates and blueprints, and creation and maintenance of landing zones
  • Automated DevOps governance: automated workflows (deployment and updates to infra, configs, libraries, secrets, keys, and certificates), CI/CD pipelines, and enforcing governance for build, test, release, and deployment

No matter your approach, a few truths remain:

  • Cost and security exist for almost every definition.
  • Guardrails are the goal and must walk the delicate balancing act between minimally inhibiting productivity and standardizing governance principles across functions; leaders in the DevOps world call this "wide boulevards and high curbs".
  • The tired adage of alignment and exec support is still true and absolutely crucial.

If you have questions or want direction on how to set up or upgrade your cloud governance program, set up an inquiry or guidance session with me.


Wide Open: NVIDIA Accelerates Inference on Meta Llama 3   

NVIDIA today announced optimizations across all its platforms to accelerate Meta Llama 3 , the latest generation of the large language model ( LLM ).

The open model combined with NVIDIA accelerated computing equips developers, researchers and businesses to innovate responsibly across a wide variety of applications.

Trained on NVIDIA AI

Meta engineers trained Llama 3 on a computer cluster packing 24,576 NVIDIA H100 Tensor Core GPUs, linked with an NVIDIA Quantum-2 InfiniBand network. With support from NVIDIA, Meta tuned its network, software and model architectures for its flagship LLM.

To further advance the state of the art in generative AI , Meta recently described plans to scale its infrastructure to 350,000 H100 GPUs.

Putting Llama 3 to Work

Versions of Llama 3, accelerated on NVIDIA GPUs, are available today for use in the cloud, data center, edge and PC.

From a browser, developers can try Llama 3 at ai.nvidia.com. It’s packaged as an NVIDIA NIM microservice with a standard application programming interface that can be deployed anywhere.

Businesses can fine-tune Llama 3 with their data using NVIDIA NeMo, an open-source framework for LLMs that’s part of the secure, supported NVIDIA AI Enterprise platform. Custom models can be optimized for inference with NVIDIA TensorRT-LLM and deployed with NVIDIA Triton Inference Server.
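
To make the "standard application programming interface" concrete, the sketch below assumes an OpenAI-compatible chat-completion endpoint; the URL, model identifier and NVIDIA_API_KEY environment variable are placeholders for illustration, not values confirmed by this article.

```python
# Hedged sketch: calling a NIM-style chat endpoint over HTTP.
# The endpoint URL, model id, and API-key variable are assumptions.

import os
import requests

ENDPOINT = "https://integrate.api.nvidia.com/v1/chat/completions"  # placeholder URL

payload = {
    "model": "meta/llama3-70b-instruct",  # placeholder model id
    "messages": [
        {"role": "user", "content": "Explain NVIDIA NIM in one sentence."}
    ],
    "max_tokens": 128,
}

resp = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {os.environ['NVIDIA_API_KEY']}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
# OpenAI-style responses nest the generated text under choices[0].message.content.
print(resp.json()["choices"][0]["message"]["content"])
```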

Taking Llama 3 to Devices and PCs

Llama 3 also runs on NVIDIA Jetson Orin for robotics and edge computing devices, creating interactive agents like those in the Jetson AI Lab.

What’s more, NVIDIA RTX and GeForce RTX GPUs for workstations and PCs speed inference on Llama 3. These systems give developers a target of more than 100 million NVIDIA-accelerated systems worldwide.

Get Optimal Performance with Llama 3

Best practice in deploying an LLM for a chatbot involves balancing low latency, good reading speed and optimal GPU use to reduce costs.

Such a service needs to deliver tokens (the rough equivalent of words to an LLM) at about twice a typical user’s reading speed, which works out to about 10 tokens/second.

Applying these metrics, a single NVIDIA H200 Tensor Core GPU generated about 3,000 tokens/second — enough to serve about 300 simultaneous users — in an initial test using the version of Llama 3 with 70 billion parameters.

That means a single NVIDIA HGX server with eight H200 GPUs could deliver 24,000 tokens/second, further optimizing costs by supporting more than 2,400 users at the same time.

For edge devices, the version of Llama 3 with eight billion parameters generated up to 40 tokens/second on Jetson AGX Orin and 15 tokens/second on Jetson Orin Nano.
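
To make the arithmetic above explicit, the short sketch below divides aggregate throughput by the roughly 10 tokens/second per-user budget quoted in this article. Real-world throughput varies with model size, batch size and context length, so treat the outputs as rough estimates.

```python
# Back-of-the-envelope capacity math using the figures quoted above.
# Assumes each chatbot user consumes ~10 tokens/second.

PER_USER_TOKENS_PER_SEC = 10

def concurrent_users(tokens_per_sec_per_gpu: float, num_gpus: int = 1) -> int:
    """Estimate how many simultaneous users a deployment can serve."""
    aggregate = tokens_per_sec_per_gpu * num_gpus
    return int(aggregate // PER_USER_TOKENS_PER_SEC)

print(concurrent_users(3_000))               # single H200 on Llama 3 70B -> 300
print(concurrent_users(3_000, num_gpus=8))   # HGX server, eight H200s -> 2400
print(concurrent_users(40))                  # Jetson AGX Orin, Llama 3 8B -> 4
```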

Advancing Community Models

An active open-source contributor, NVIDIA is committed to optimizing community software that helps users address their toughest challenges. Open-source models also promote AI transparency and let users broadly share work on AI safety and resilience.

Learn more about NVIDIA’s AI inference platform, including how NIM, TensorRT-LLM and Triton use state-of-the-art techniques such as low-rank adaptation to accelerate the latest LLMs.


Oracle (ORCL) to Invest More Than $8B in Cloud Computing & AI

Oracle (ORCL) has announced its intention to invest more than $8 billion over the next decade to meet the growing demand for cloud computing and artificial intelligence (AI) infrastructure in Japan. The investment aims to expand the presence of Oracle Cloud Infrastructure (“OCI”) throughout Japan. The company also plans to grow its operations and support engineering teams in Japan to assist customers and partners with digital sovereignty requirements.

To address the needs of customers and partners in Japan, ORCL will enhance local customer support for its public cloud regions in Tokyo and Osaka and expand its local operations teams for services like Oracle Alloy and OCI Dedicated Region. These efforts aim to ease the transition of critical workloads to OCI, allowing the Japanese government and businesses to adopt sovereign AI solutions securely. The company’s sovereign cloud and AI services will be available within Japan’s borders or on-premises, offering various operational controls for security. Oracle positions itself as the sole hyperscaler capable of delivering AI and a comprehensive suite of more than 100 cloud services locally and anywhere.

Shares of this Zacks Rank #3 (Hold) company have gained 12.6% year to date, compared with the Zacks Computer and Technology sector’s growth of 6.1%, an outperformance driven by ORCL’s close alignment with customer needs. You can see the complete list of today’s Zacks #1 Rank (Strong Buy) stocks here.

[Chart: Oracle Corporation price and consensus | Oracle Corporation Quote]

ORCL Faces Competition in Japan’s Cloud Computing Market

According to a report by Technavio, the cloud computing market in Japan is projected to reach $30.24 billion in 2027, witnessing a CAGR of 12.54% between 2022 and 2027. The Japanese cloud computing market is expanding as small and medium-sized businesses increasingly adopt cloud services. Companies of all sizes want to cut spending by adopting new technologies that let them adjust computing resources to changing needs without investing in additional hardware or software.

Oracle faces tough competition in Japan’s cloud computing market from players like Microsoft (MSFT), Amazon (AMZN) and Alphabet (GOOGL).

Microsoft Cloud offers various technology tools and solutions to help businesses adapt to the evolving landscape. It recently declared its intention to invest $2.9 billion over the next couple of years to enhance its large-scale cloud computing and AI infrastructure in Japan. The company also plans to broaden its digital training initiatives, aiming to provide AI skills training to more than three million individuals over the next three years. MSFT will establish its first research lab in Japan under Microsoft Research Asia and strengthen its collaboration with the Japanese government on cybersecurity.

AMZN’s cloud division, Amazon Web Services (“AWS”), provides cloud services to government entities, with more than 7,500 agencies using its platform. In January 2024, AWS announced plans to invest 2.26 trillion yen ($15.24 billion) in Japan by 2027 to expand its cloud computing infrastructure, which supports AI services. AWS is expanding its facilities in Tokyo and Osaka to meet rising customer demand.

Alphabet-owned Google Cloud provides a collection of flexible cloud services for computing, storing and analyzing data, as well as machine learning. In 2023, GOOGL set up its first data center in Japan, near Tokyo, to better support its services. The company also initiated a project to lay a subsea cable connecting Canada and Japan. It operates cloud regions in Tokyo and Osaka, offering services for data storage and AI infrastructure.

ORCL recently partnered with Fujitsu Limited to provide cloud and AI services that meet the digital sovereignty needs of the Japanese government and businesses. Through Oracle Alloy, Fujitsu will enhance its Hybrid IT services for Fujitsu Uvance, helping customers grow their businesses and address societal challenges. This is expected to aid the company’s Asia-Pacific revenues in the upcoming quarters. The Zacks Consensus Estimate for ORCL’s fiscal 2024 Asia-Pacific revenues is pegged at $6.87 billion, indicating year-over-year growth of 3.8%. The Zacks Consensus Estimate for earnings is pegged at $5.58 per share, indicating year-over-year growth of 8.98%.


  1. Research Note Cloud computing research: A review of research themes, frameworks, methods and future research directions

    Cloud computing research started to gain recognition around 2009 and has risen considerably over the years. From six journal articles in 2009, cloud computing research continues to grow yearly, with over 200 journal articles currently. We predict that more studies will be conducted on cloud computing in the coming years.

  2. Articles

    The smart collection and sharing of data is an important part of cloud-based systems, since huge amounts of data are being created all the time. This feature allows users to distribute data to particular recip... S. Velmurugan, M. Prakash, S. Neelakandan and Arun Radhakrishnan. Journal of Cloud Computing 2024 13 :86.

  3. Adoption of cloud computing as innovation in the organization

    Barriers to cloud computing deployment can be observed in the work of Jangjou M et al. (2022), which focuses strongly on the cybersecurity risks of adopting cloud computing technology in both client- and server-side layers of cloud architecture. These risks include providing vulnerable APIs to cloud users, lack of awareness of the ...

  4. Cloud services selection: A systematic review and future research

    Cloud computing is a paradigm that provides on-demand computing resources such as storage, network, servers, ... An SRS is a type of SLR in which current research work is gathered and classified in order to provide a detailed outline of a specific field using a simple and effective search strategy. The goal of this SRS is to collect and analyze ...

  5. cloud computing Latest Research Papers

    The paper further compares and reviews different layout models for the discovery, selection and composition of services in cloud computing. Recent research trends in service composition are identified, and research on microservices is evaluated and presented in tables and graphs.

  6. (PDF) Cloud Computing: Current Research & Summary

    This research paper presents what cloud computing is, the various cloud models and an overview of the cloud computing architecture. ... manage VM instances, and work with both cloud-based and ...

  7. What Is Cloud Computing?

    Cloud computing is the on-demand access of computing resources—physical servers or virtual servers, data storage, networking capabilities, application development tools, software, AI-powered analytic tools and more—over the internet with pay-per-use pricing. The cloud computing model offers customers greater flexibility and scalability ...

  8. Cloud computing as a platform for genomic data analysis and

    The cloud is also the substrate for the NIH Data Commons Pilot, an effort to increase availability and utility of data and software from NIH-funded efforts [16, 17]. The cloud has disadvantages as well. Depending on the user's financial incentives, the cloud might be more expensive than a local cluster [18].

  9. Research Advances in Cloud Computing

    About this book. This book addresses the emerging area of cloud computing, providing a comprehensive overview of the research areas, recent work and open research problems. The move to cloud computing is no longer merely a topic of discussion; it has become a core competency that every modern business needs to embrace and excel at.

  10. Cloud computing applications for biomedical science: A perspective

    Biomedical research has become a digital data-intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches.

  11. (PDF) A COMPREHENSIVE STUDY ON CLOUD COMPUTING

    A Comprehensive Study on Cloud Computing Paradigm. Ab Rashid Dar and Dr. D. Ravindran, Department of Computer Science, St. Joseph's College (Autonomous), Tiruchirappalli, Tamil Nadu, India ...

  12. What is Cloud Computing?

    Cloud computing defined. Cloud computing is the on-demand availability of computing resources (such as storage and infrastructure) as services over the internet. It eliminates the need for individuals and businesses to manage physical resources themselves, letting them pay only for what they use. The main cloud computing service models include ...

  13. What is cloud computing? Everything you need to know about the cloud

    What is cloud computing, in simple terms? Cloud computing is the delivery of computing services—including servers, storage, databases, networking, software, analytics, and intelligence—over ...

  14. Top 10 Cloud Computing Research Topics in 2020

    Below are the 10 most in-demand research topics in the field of cloud computing: 1. Big Data. Big data refers to the large amounts of data produced by various programs in a very short duration of time. It is quite cumbersome to store such huge and voluminous amounts of data in company-run data centers. Also, gaining insights from this data ...

  15. What Is Cloud Computing? How the Cloud Works

    Cloud Computing Service Types. Cloud computing services are broken down into three major categories: software-as-a-service (SaaS), platform-as-a-service (PaaS) and infrastructure-as-a-service (IaaS). Software-as-a-Service. SaaS is the most common cloud service type. Many of us use it on a daily basis.

  16. 12 Latest Cloud Computing Research Topics

    Cloud computing is gaining popularity and demand in the market and is being implemented in many organizations very quickly. One of the major barriers for the cloud is a real and perceived lack of security. There are many cloud computing research topics that can be pursued to fruitful results. In this tutorial, we discuss 12 of the latest cloud computing research topics.

  17. Top 10 Cloud Computing Research Topics of 2024

    4. Blockchain data-based cloud data integrity protection mechanism. The "Blockchain data-based cloud data integrity protection mechanism" paper suggests a method for safeguarding the integrity of cloud data, one of the key cloud computing research topics. In order to store and process massive amounts of data, cloud computing has grown ...

  18. Cloud computing

    Cloud computing [1] is the on-demand availability of computer system resources, especially data storage (cloud storage) and computing power, without direct active management by the user. [2] Large clouds often have functions distributed over multiple locations, each of which is a data center.

  19. The role of cloud computing technology: A savior to fight the lockdown

    Every day, people lose work and wealth, with no way to tell when normality will return. The International Labour Organization reports that 195 million jobs could be lost. ... After much research in cloud computing, which basically provides three service delivery models (IaaS, PaaS and SaaS) and four deployment modes, there ...

  20. (PDF) Research Paper on Cloud Computing

    Student, M.Sc. I.T., I.C.S. College, Khed, Ratnagri. Abstract: Cloud computing has come of age since Amazon introduced the first cloud services of its kind in 2006. It is particularly ...

  21. Top 10 cloud computing careers of 2024 and how to get started

    3. Cloud security analyst. Cloud security analysts have the responsibility of ensuring the integrity and security of a company's cloud presence. They do this by assessing threats and shoring up defenses against them, preventing data breaches, securing data and eliminating security gaps if a breach occurs.

  22. Leverage Cloud Computing in Research Management

    Cloud computing has revolutionized the way research is managed. As a research manager, embracing this technology can significantly enhance your team's productivity and collaboration. Cloud ...

  23. Password based authentication for web based graphics computing services

    In the global environment, privacy and security issues are critical when handling huge numbers of participants. Many security-based research works have been undertaken in the cloud, including multi-factor authentication using the Elliptic Curve Computational Diffie-Hellman problem, multi-modal biometrics, signatures, graphical one-time passwords and more. In this paper, we propose a multi-level multi ...

  24. Fall 2024 CSCI Special Topics Courses

    The dynamicity and flexibility of Cloud computing have opened up many new forms of deploying applications on infrastructure that cloud service providers offer, such as renting of computation resources and serverless computing. ... Students will also work on a group project where they propose a research idea, survey related studies, and present ...

  25. 4 Cloud Computing Career Paths to Know in 2024

    Cloud computing is a fast-growing field. According to research published on Statista, the worldwide market size for cloud computing applications is projected to reach $168.6 billion by 2025 [1]. That's many times higher than the 2013 global market size for cloud computing applications, which sat at just $30.4 billion that year.

  26. Top 6 Entry-Level Cloud Computing Jobs

    According to Built In, the average yearly salary of a cloud support engineer in the US is $137,911, with a low of $95,000 and a high of $190,000. According to ZipRecruiter, the average salary for this position in the US is $130,802, with a low of $49,000 and a high of $181,500. Cloud Support Engineer Requirements.

  27. Modernize Your Cloud Governance To Match Cloud Strategy

    Modernize Your Cloud Governance To Match Today's Cloud Strategy. Tracy Woo, Principal Analyst. Apr 17 2024. Your cloud usage continues to grow. The types of workloads you're migrating are trending increasingly mission-critical. Your cloud governance program must match this new reality. For this reason, along with new and developing industry ...

  28. Wide Open: NVIDIA Accelerates Inference on Meta Llama 3

    Putting Llama 3 to Work. Versions of Llama 3, accelerated on NVIDIA GPUs, are available today for use in the cloud, data center, edge and PC. From a browser, developers can try Llama 3 at ai.nvidia.com. It's packaged as an NVIDIA NIM microservice with a standard application programming interface that can be deployed anywhere.

  29. Africa's Biggest Mobile Carrier Boosts Huawei Ties With Tech Lab

    Africa's biggest wireless carrier, MTN Group Ltd., has opened a research lab with Huawei Technologies Co. in Johannesburg, deepening ties with the Chinese company and potentially ...

  30. Oracle (ORCL) to Invest More Than $8B in Cloud Computing & AI

    Oracle (ORCL) has announced its intention to invest more than $8 billion over the next decade to meet the growing demand for cloud computing and artificial ...