Cloud computing provides a reliable and robust infrastructure for users to remotely store and access huge amounts of data. However, data integrity in the cloud is a major security concern, because data owners no longer physically possess their sensitive data. To mitigate this challenge, remote data integrity verification has been proposed as a mechanism that enables data owners to verify the correctness of their outsourced data. The remote verification process needs to be done with reduced communication, computation, and storage overhead. That is why traditional cryptographic primitives for data integrity based on hashing and signature schemes are not applicable: it is impractical to download all stored data to validate its integrity (expensive I/O operations and immense communication overhead). Therefore, provable data possession (PDP) has been the main focus of many research studies that aim to efficiently, periodically, and securely validate that a remote server, which supposedly stores the owner's potentially very large amount of data, is actually storing the data intact. There are many variations of PDP schemes under different cryptographic assumptions. In this study, we provide a comparative analysis of various PDP schemes. We investigate not only PDP schemes for static data, but also protocols that handle the dynamic behavior of outsourced data. We implement a prototype that allows the data owner to outsource their data and dynamically update it by inserting, deleting, or modifying data blocks. The prototype also evaluates the performance of different PDP schemes from different perspectives, such as pre-computation time, computation time, verification time, and storage overhead.
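The pre-computation, challenge, proof, and verification phases mentioned above can be illustrated with a minimal spot-checking sketch. This is not a full PDP construction (real PDP schemes let the server answer with a compact aggregate proof rather than the blocks themselves); it is a simplified keyed-MAC audit, with all names and the block size chosen for illustration, that shows why only a random sample of blocks, rather than the whole file, needs to cross the network per audit.

```python
import hmac, hashlib, os, secrets

BLOCK_SIZE = 4096  # assumed block size for illustration

def split_blocks(data: bytes, size: int = BLOCK_SIZE):
    return [data[i:i + size] for i in range(0, len(data), size)]

def precompute_tags(key: bytes, blocks):
    # Owner computes one MAC tag per block before outsourcing.
    # Binding the block index into the MAC prevents block swapping.
    return [hmac.new(key, i.to_bytes(8, "big") + b, hashlib.sha256).digest()
            for i, b in enumerate(blocks)]

def challenge(num_blocks: int, sample_size: int):
    # Owner picks a small random subset of block indices to audit.
    return secrets.SystemRandom().sample(range(num_blocks), sample_size)

def prove(blocks, indices):
    # Server returns only the challenged blocks, not the whole file.
    return {i: blocks[i] for i in indices}

def verify(key: bytes, tags, proof):
    # Owner recomputes the MAC of each returned block and compares
    # it against the tag precomputed before outsourcing.
    return all(hmac.compare_digest(
        hmac.new(key, i.to_bytes(8, "big") + b, hashlib.sha256).digest(),
        tags[i]) for i, b in proof.items())

key = os.urandom(32)
blocks = split_blocks(os.urandom(10 * BLOCK_SIZE))
tags = precompute_tags(key, blocks)   # pre-computation phase
idx = challenge(len(blocks), 3)       # challenge phase
proof = prove(blocks, idx)            # server-side proof
assert verify(key, tags, proof)       # verification phase
```

In this sketch, communication per audit is proportional to the sample size rather than the file size, which is the overhead reduction the abstract refers to; genuine PDP schemes go further by compressing the sampled blocks into a constant-size proof.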
Pipelines are vast and complex networks delivering fossil fuels from remote locations to gas processing facilities, refineries, and petrochemical manufacturers, and refined products all the way to end users. Pipeline operators rely on Pipeline Integrity Management (PIM) systems to conduct safe and reliable hydrocarbon transportation operations, cope with local regulations, maximize transportation capacity, and identify integrity threats. Internal and external corrosion are leading causes of pipeline incidents, which can lead to spills, explosions, and increased downtime. ASME describes these threats as time-dependent; however, they are commonly assessed with methods such as in-line inspection, direct assessment, and hydrostatic pressure tests, whose measurement intervals can range from months to years, providing only isolated snapshots throughout the pipeline lifetime. Moreover, executing these techniques requires extensive planning, pipelines ready to accommodate in-line inspection tools, and, in some instances, stopping hydrocarbon transportation activities. Coping with increased demand pushes operators to boost their pipelines' utilization rates to serve their customers and communities safely and reliably. In consequence, PIM systems will require more data to constantly monitor dynamic changes along the infrastructure (whether in high consequence areas or not) and to leverage predictive analytics. Increasing the number of remote corrosion monitoring locations along several pipeline segments provides continuous input that feeds PIM systems with on-line data, seamlessly integrated into the operator's control systems and data historian with minimal human intervention. This paper explores remote corrosion monitoring technologies and how increasing real-time insight into risk, maintenance, and performance can increase reliability and decrease downtime through predictive analytics.
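The kind of on-line input described above can be sketched in a few lines: a remote probe periodically reports cumulative metal loss, and the monitoring layer converts consecutive readings into a corrosion rate that a PIM system can trend or alarm on. The reading values, the linear-rate assumption, and the alarm threshold below are all hypothetical and chosen only for illustration.

```python
from datetime import datetime, timedelta

def corrosion_rate_mm_per_year(readings):
    """Linear corrosion rate between the first and last reading.

    `readings` is a list of (timestamp, cumulative_metal_loss_mm)
    tuples, as an ER-style probe might report them. Returns mm/year.
    """
    (t0, m0), (t1, m1) = readings[0], readings[-1]
    years = (t1 - t0).total_seconds() / (365.25 * 24 * 3600)
    return (m1 - m0) / years

ALERT_MM_PER_YEAR = 0.1  # hypothetical alarm threshold for the PIM system

t = datetime(2023, 1, 1)
# Two monthly readings: 0.020 mm of metal loss over 30 days.
readings = [(t, 0.000), (t + timedelta(days=30), 0.020)]
rate = corrosion_rate_mm_per_year(readings)
if rate > ALERT_MM_PER_YEAR:
    print(f"ALERT: corrosion rate {rate:.3f} mm/yr exceeds threshold")
```

The point of the sketch is the cadence: because readings arrive continuously, the rate can be recomputed on every sample instead of once per multi-year inspection interval, which is what enables the predictive analytics the abstract refers to.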
Tensile pull-off adhesion testing is becoming a more frequent specification requirement for in situ quality assurance testing to confirm proper surface preparation and adhesion of high-performance protective linings applied to concrete. It is also becoming a widely used test for forensic analysis of protective linings in existing installations. Numerous investigations have led to the development of different devices and pull-off adhesion test methods used to assess the bond strengths of mortars and overlay materials.