Remote Data Integrity For Cloud Based Storage System

Cloud computing provides a reliable and robust infrastructure for users to remotely store and access huge amounts of data. However, data integrity in the cloud is a major security concern, since data owners no longer physically possess their sensitive data. To mitigate this challenge, remote data integrity has been proposed as a mechanism to enable data owners to verify the correctness of their outsourced data. The remote verification process needs to be done with reduced communication, computation, and storage overhead.

Product Number: 51216-010-SG
Author: Sravanthi Sukireddy, Ayad Barsoum
Publication Date: 2016
Industry: Coatings
$0.00
$20.00

Cloud computing provides a reliable and robust infrastructure for users to remotely store and access huge amounts of data. However, data integrity in the cloud is a major security concern, since data owners no longer physically possess their sensitive data. To mitigate this challenge, remote data integrity has been proposed as a mechanism to enable data owners to verify the correctness of their outsourced data. The remote verification process needs to be done with reduced communication, computation, and storage overhead. That is why traditional cryptographic primitives for data integrity based on hashing and signature schemes are not applicable: it is impractical to download all stored data to validate its integrity (expensive I/O operations and immense communication overhead). Therefore, provable data possession (PDP) has been the main focus of many research studies that aim to efficiently, periodically, and securely validate that a remote server, which supposedly stores the owner's potentially very large amount of data, is actually storing the data intact. There are many variations of PDP schemes under different cryptographic assumptions. In this study, we provide a comparative analysis of various PDP schemes. We investigate not only PDP schemes for static data, but also protocols that handle the dynamic behavior of outsourced data. We implement a prototype that allows the data owner to outsource their data and dynamically update it by inserting, deleting, or modifying data blocks. The prototype also evaluates the performance of different PDP schemes from different perspectives, such as pre-computation times, computation times, verification times, and storage overhead.
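
To make the challenge-response idea behind remote integrity checking concrete, the Python sketch below shows a minimal spot-checking scheme, not any of the PDP constructions evaluated in the paper: the owner precomputes one HMAC tag per data block, keeps the tags locally, and later challenges the server on a random sample of block indices. Real PDP schemes instead use homomorphic verifiable tags so the server can answer with a compact proof rather than returning the sampled blocks; the block size, function names, and sample size here are illustrative assumptions.

# Illustrative spot-checking sketch (assumed names and parameters; not the paper's schemes).
import hmac
import hashlib
import secrets

BLOCK_SIZE = 4096  # bytes per data block (illustrative choice)

def split_blocks(data: bytes, size: int = BLOCK_SIZE) -> list[bytes]:
    """Split the outsourced file into fixed-size blocks."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def precompute_tags(key: bytes, blocks: list[bytes]) -> list[bytes]:
    """Owner-side setup: one HMAC tag per block, bound to the block index."""
    return [hmac.new(key, i.to_bytes(8, "big") + b, hashlib.sha256).digest()
            for i, b in enumerate(blocks)]

def challenge(num_blocks: int, sample_size: int) -> list[int]:
    """Verifier picks a random subset of block indices to spot-check."""
    return sorted(secrets.SystemRandom().sample(range(num_blocks), sample_size))

def prove(stored_blocks: list[bytes], indices: list[int]) -> list[bytes]:
    """Server-side response: return the challenged blocks."""
    return [stored_blocks[i] for i in indices]

def verify(key: bytes, tags: list[bytes], indices: list[int], proof: list[bytes]) -> bool:
    """Owner re-derives tags for the returned blocks and compares with the stored tags."""
    for i, block in zip(indices, proof):
        expected = hmac.new(key, i.to_bytes(8, "big") + block, hashlib.sha256).digest()
        if not hmac.compare_digest(expected, tags[i]):
            return False
    return True

# Example run: outsource 1 MiB of data, then spot-check 10 random blocks.
if __name__ == "__main__":
    key = secrets.token_bytes(32)
    data = secrets.token_bytes(1024 * 1024)
    blocks = split_blocks(data)
    tags = precompute_tags(key, blocks)  # kept by the owner
    server_copy = list(blocks)           # what the cloud stores
    chal = challenge(len(blocks), 10)
    proof = prove(server_copy, chal)
    print("data intact:", verify(key, tags, chal, proof))

Because only a random sample of blocks is checked per challenge, communication grows with the sample size rather than the file size, which is the efficiency property the PDP literature formalizes.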

Also Purchased
Available for download

02084 Pipeline Mapping: Using INS/GPS Data as the Fundamental Tool for Integrating Historical and Current Pipeline Data

Product Number: 51300-02084-SG
ISBN: 02084 2002 CP
Author: Mark J. Slaughter and Todd Porter
$20.00
Available for download

Pipeline Predictive Analytics Through On-Line Remote Corrosion Monitoring

Product Number: 51319-12899-SG
Author: Ivan Stubelj
Publication Date: 2019
$20.00

Pipelines are vast and complex networks delivering fossil fuels from remote locations to gas processing facilities, refineries, and petrochemical manufacturers, and refined products all the way to end users. Pipeline operators rely on Pipeline Integrity Management (PIM) systems to conduct safe and reliable hydrocarbon transportation operations, cope with local regulations, maximize transportation capacity, and identify integrity threats.

Internal and external corrosion are leading causes of incidents in pipelines that can lead to spills, explosions, and increased downtime. ASME describes these threats as time-dependent; however, they are commonly assessed with methods such as in-line inspection, direct assessment, and hydrostatic pressure tests, whose measurement intervals can range from months to years, providing only isolated snapshots throughout the pipeline lifetime. Moreover, executing these techniques requires extensive planning and execution, pipelines ready to accommodate in-line inspection tools, and, in some instances, stopping hydrocarbon transportation activities.

Coping with increased demand pushes operators to boost their pipelines' utilization rates to serve their customers and communities safely and reliably. In consequence, PIM systems will require more data to constantly monitor dynamic changes along the infrastructure (whether in high consequence areas or not) and leverage predictive analytics. Increasing the number of remote corrosion monitoring locations along several pipeline segments provides continuous input to feed PIM systems with on-line data that is seamlessly integrated into the operator's control systems and data historian, minimizing human intervention.

This paper will explore remote corrosion monitoring technologies and how increasing real-time insights into risk, maintenance, and performance can increase reliability and decrease downtime through predictive analytics.
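
As a rough illustration of the kind of predictive check that continuous remote monitoring data can feed, the Python sketch below (not taken from the paper) fits a corrosion rate to a history of wall-thickness readings from one hypothetical monitoring location and estimates the time remaining until a minimum allowable thickness is reached; the sensor data, thresholds, and units are illustrative assumptions.

# Hypothetical sketch: trend a corrosion rate from periodic thickness readings.
from dataclasses import dataclass

@dataclass
class Reading:
    day: float          # days since monitoring started
    thickness_mm: float  # measured wall thickness

def corrosion_rate_mm_per_year(readings: list[Reading]) -> float:
    """Least-squares slope of thickness vs. time, converted to mm/year (positive = metal loss)."""
    n = len(readings)
    mean_t = sum(r.day for r in readings) / n
    mean_x = sum(r.thickness_mm for r in readings) / n
    num = sum((r.day - mean_t) * (r.thickness_mm - mean_x) for r in readings)
    den = sum((r.day - mean_t) ** 2 for r in readings)
    slope_per_day = num / den
    return -slope_per_day * 365.0

def remaining_life_years(readings: list[Reading], min_thickness_mm: float) -> float:
    """Years until the latest reading would reach the minimum allowable thickness at the fitted rate."""
    rate = corrosion_rate_mm_per_year(readings)
    latest = readings[-1].thickness_mm
    return float("inf") if rate <= 0 else (latest - min_thickness_mm) / rate

# Example: one year of weekly readings from a single hypothetical location.
if __name__ == "__main__":
    history = [Reading(day=7 * i, thickness_mm=12.70 - 0.004 * i) for i in range(52)]
    print(f"corrosion rate: {corrosion_rate_mm_per_year(history):.2f} mm/y")
    print(f"remaining life: {remaining_life_years(history, min_thickness_mm=9.50):.1f} years")

In practice such checks would run continuously against data arriving through the operator's control systems and data historian, which is the integration the paper discusses.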