This paper discusses measurement resolution for in-line inspection (ILI) of pipelines by intelligent pigs. In the early days of ILI, defect sensing was limited to a modest number of sensor channels, and inspection reports typically placed corrosion defects into only three categories such as light, moderate, and severe. Today's high-resolution ILI tools have shrunk the coverage of individual sensor channels to, in some cases, a few millimeters. This, among other improvements, has made it possible for inspection vendors to promise defect depth measurement accuracies of 5 to 10 percent of wall thickness and defect length measurement accuracies of several millimeters. Once the defect configuration has been estimated from ILI data, assessment procedures are invoked to forecast the remaining strength of the pipe. In this paper, we consider the effect of measurement resolution on the accuracy of those assessment calculations and quantify the error introduced into failure-pressure predictions when they are based on low-resolution rather than high-resolution data.
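To illustrate how measured defect dimensions feed into a remaining-strength assessment, the following is a minimal sketch of a Modified B31G-style failure-pressure calculation; it is offered only as a representative example of such an assessment procedure, not as the specific method or data used in this paper. The function name and the pipe and defect parameters in the usage example are hypothetical.

```python
"""Illustrative sketch: Modified B31G-style remaining-strength estimate,
showing how ILI-reported defect depth and length map to a failure pressure."""
import math


def modified_b31g_failure_pressure(D, t, smys, d, L):
    """Estimate failure pressure (MPa) of a pipe with a corrosion defect.

    D    : outside diameter (mm)
    t    : nominal wall thickness (mm)
    smys : specified minimum yield strength (MPa)
    d    : measured defect depth (mm), e.g. from ILI depth sizing
    L    : measured defect axial length (mm)
    """
    s_flow = smys + 69.0                  # flow stress: SMYS + 69 MPa (10 ksi)
    z = L**2 / (D * t)                    # dimensionless defect-length parameter
    if z <= 50.0:
        M = math.sqrt(1.0 + 0.6275 * z - 0.003375 * z**2)  # Folias bulging factor
    else:
        M = 0.032 * z + 3.3
    s_fail = s_flow * (1.0 - 0.85 * d / t) / (1.0 - 0.85 * d / t / M)
    return 2.0 * s_fail * t / D           # hoop-stress (Barlow) conversion to pressure


if __name__ == "__main__":
    # Hypothetical line pipe: the same defect sized at two resolutions,
    # depth reported as 30% of wall (coarse) vs. 26% of wall (fine).
    D, t, smys, L = 610.0, 9.5, 359.0, 120.0   # mm, mm, MPa, mm
    for label, d in [("low-res depth (30% wt)", 0.30 * t),
                     ("high-res depth (26% wt)", 0.26 * t)]:
        pf = modified_b31g_failure_pressure(D, t, smys, d, L)
        print(f"{label}: Pf = {pf:.2f} MPa")
```

Comparing the two printed failure pressures shows how a difference in reported defect depth of only a few percent of wall thickness propagates directly into the predicted remaining strength, which is the kind of sensitivity examined in the remainder of the paper.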