It used to be obvious in Google Maps where the boundaries of different satellite images were. Each image had different brightness, contrast, colour, etc., which gave away the stitching.
I always wondered whether there were techniques to normalize that.
I guess there are: today I noticed that the satellite images are now stitched together seamlessly.
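(I have no idea what Google's actual pipeline does, but one standard way to normalize adjacent tiles is histogram matching: remap each colour channel of one tile so its intensity distribution follows a neighbouring reference tile. Here's a rough numpy sketch of the idea; everything in it is illustrative, not Google's method.)

```python
import numpy as np

def match_histograms(source, reference):
    """Remap each channel of `source` so its intensity distribution
    matches `reference` (classic histogram matching)."""
    matched = np.empty_like(source)
    for c in range(source.shape[-1]):
        src = source[..., c].ravel()
        ref = reference[..., c].ravel()
        # Unique pixel values, where they occur, and how often
        s_values, s_idx, s_counts = np.unique(
            src, return_inverse=True, return_counts=True)
        r_values, r_counts = np.unique(ref, return_counts=True)
        # Empirical CDFs of both tiles
        s_cdf = np.cumsum(s_counts) / src.size
        r_cdf = np.cumsum(r_counts) / ref.size
        # Map each source value to the reference value with the nearest CDF
        remapped = np.interp(s_cdf, r_cdf, r_values)
        matched[..., c] = remapped[s_idx].reshape(source.shape[:-1])
    return matched
```

Real mosaicking pipelines presumably do something fancier (per-tile gain/offset solved globally, feathered blending across seams), but even this simple remapping hides most of the brightness and colour jumps between tiles.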
I also noticed some level-of-detail differences between land and ocean, and that transition is handled pretty seamlessly too.
It actually makes navigating around the satellite view a little eerie.
Anyone know when the change was made?
UPDATE: Actually, it depends on the zoom level. Compare this to this, and notice the image credits are different. Interestingly, my home town of Perth looks fully normalized at all scales, even though the image sources are still TerraMetrics for the large scale and DigitalGlobe/GeoEye for the small scale.