My view has always been that the U.S. News methodology basically presents a veneer of science, doctored to substantially resemble the results of the peer assessment survey at its core. That is, the peer assessment survey provides a reasonable measure of what informed people think about the schools, and U.S. News wants to come close to that, so as to have as much credibility as possible, but not replicate it precisely, so that they can claim their methodology adds some value of its own. This is why library size, which used to carry a respectable weight in the rankings, got reduced in importance: U.S. News discovered at some point in the 1990s that it didn't correlate very well with reputation, so they downsized it.
Anyway, I have reconstructed the rankings for the schools that got at least a 3.0 score on the peer assessment scale, and have attached a list showing what the U.S. News list would look like if it used only the peer assessment score. It is in rank order, with the peer assessment score shown in parentheses right after the rank. As you can see, it doesn't change the top ranks very much. Chicago, Michigan, and Texas do a little better than they do under the U.S. News overall ranks, and Northwestern and Vanderbilt a little less well. But there is a big difference for some traditionally strong schools that, for one reason or another, don't do well under the U.S. News methodology. Wisconsin (U.S. News rank of 36) and Florida (U.S. News rank of 46) come in at a more reasonable 24 and 35, respectively. Alabama, which beats both of those schools in the U.S. News ranks, doesn't show up on my list at all, because its peer assessment score was only 2.9. That seems to me to better reflect reality, but opinions may vary.
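The reconstruction described above is just a filter-and-sort: drop any school below the 3.0 cutoff, then order the rest by peer assessment score. A minimal sketch in Python, using made-up school names and scores rather than the actual survey data:

```python
# Hypothetical peer assessment scores -- illustrative only, not actual U.S. News data.
peer_scores = {
    "School A": 4.8,
    "School B": 3.6,
    "School C": 3.1,
    "School D": 2.9,  # below the 3.0 cutoff, so it drops off the list entirely
}

# Keep schools at or above the 3.0 cutoff, then rank by score, highest first.
ranked = sorted(
    ((name, score) for name, score in peer_scores.items() if score >= 3.0),
    key=lambda pair: pair[1],
    reverse=True,
)

# Print in rank order, with the peer assessment score in parentheses after the rank.
for rank, (name, score) in enumerate(ranked, start=1):
    print(f"{rank}. ({score}) {name}")
```

Ties would need a rule of their own (shared ranks versus alphabetical tie-breaking), which the sketch above does not address.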