Surface of the ocean in Google Earth

Why doesn’t Google Earth display more of the ocean’s top surface (in addition to blurring a ring around the land masses and giving us an ocean floor view)?

Is there a technical reason?

The ring around the continents used to be larger years ago. I have always loved looking for whales, sharks, and ships.

Expense I guess. Flying a plane over the ocean just to take a picture of the same blue is quite costly with no real benefit.

Maybe they preferred to direct resources to land mapping.

Aren’t the satellites constantly roving everywhere though?

Much of the imagery on Google Maps is based on photos from airplanes.

I don’t think Google owns any satellites, they’ll buy imagery from those that do. Again, cost. I can’t think of the benefit of mapping a faceless ocean which would offset it.

The images you see on Google Maps, for land imagery at least, are a mix of images taken from satellite, plane, and car cameras, depending on your zoom level. Most recently they’ve used ROVs to take pictures of reefs, which is quite cool.

I much prefer looking at the sea floor morphology. Geology, plate boundaries, the Marianas Trench! It’s all there and it’s all good. I get a more intuitive grasp of plate tectonics looking at Google Earth sea floor imaging than from anything else.

Right. Typical resolution significantly exceeds what’s possible from a satellite (except possibly one operated by the DoD).

Maybe the cost of the satellite imagery is part of it, but I also think it was a design decision.

Years ago they used to show the ocean surface, it just wasn’t great resolution. Then they changed it to show the bottom. I think they thought that was better than a fairly blurry ocean surface. It was cool for a little while, but I also wish you could switch it back to the surface.

Just wait.

It’s called SkyNet for a reason.

It depends from country to country, but a lot of the imagery is collected by governments and is freely available. But, yeah, there’s all sorts of applications for having comprehensive aerial photos of land areas, but not so much with the surface of the ocean so governments don’t bother with those.

EarthViewer 3D was created by Keyhole, Inc., a CIA-funded company acquired by Google in 2004 (look into In-Q-Tel). It maps the Earth by the superimposition of images obtained from satellite imagery, aerial photography, and geographic information system (GIS) data. Melbourne, Australia; Las Vegas, Nevada; and Cambridge, Cambridgeshire include examples of the highest resolution, at 15 cm (6 inches). For large parts of the surface of the Earth only 2D images are available. For other parts, 3D images of terrain and buildings are available. Google Earth uses digital elevation model (DEM) data collected by NASA’s Shuttle Radar Topography Mission (SRTM). This means one can view almost the entire Earth in three dimensions. Since November 2006, the 3D views of many mountains, including Mount Everest, have been improved by the use of supplementary DEM data to fill the gaps in SRTM coverage.
Introduced in version 5.0 (February 2009), the Google Ocean feature allows users to zoom below the surface of the ocean and view the 3D bathymetry beneath the waves. Supporting over 20 content layers, it contains information from leading scientists and oceanographers. On April 14, 2009, Google added underwater terrain data for the Great Lakes. In 2010, Google added underwater terrain data for Lake Baikal.

In June 2011, the resolution of some deep ocean floor areas increased from 1-kilometer grids to 100-meter grids, thanks to a new synthesis of seafloor topography released through Google Earth. The high-resolution features were developed by oceanographers at Columbia University’s Lamont-Doherty Earth Observatory from scientific data collected on research cruises. The sharper focus is available for about 5 percent of the oceans (an area larger than North America). Underwater scenery can be seen of the Hudson Canyon off New York City, the Wini Seamount near Hawaii, and the sharp-edged 10,000-foot-high Mendocino Ridge off the U.S. Pacific Coast. There is a Google 2011 Seafloor Tour for those interested in viewing ocean deep terrain.

(straight from Wikipedia).

There are also certain parts of Google Earth which are deliberately misleading. For example, the Arctic Ocean, which most people associate with icebergs and snow, now largely gone due to global warming. If you attempt to view the Arctic Ocean in Google Earth it will not show you the original images from the 1980s, when it was still covered in icebergs, but if you simply google images of the Arctic Ocean you’ll see that they were there. It’s as if Google Earth doesn’t want people asking “what happened to the Arctic Ocean” or something. If the time slider were accurate for that region you’d see it covered in ice at one point in time, and then year by year you’d watch it all melt away (this is what the actual historical imagery shows).

They probably also don’t want people locating the “Great Pacific Garbage Patch” which is larger than the state of Texas. If you think that’s disgusting, consider that 70% of the trash that gets dumped into the ocean sinks to the bottom and the Pacific Garbage Patch only accounts for 30% of the total amount of garbage in our oceans.

They’ll buy one from Major League Baseball.

The garbage patch is a serious issue and a symptom of an even larger issue, but it’s not visible on satellite images.

This is a silly theory. It’s not like people get their information about the Earth from Google Earth alone and will actually be misled by the lack of Arctic sea ice. And it’s not “all gone”, please read the forum tag-line.

And as long as we’re making corrections, the Garbage Patch does not contain 30% of all the garbage in our oceans.

**All** of the landmass imagery, IMO.

Satellites orbit above the clouds, AFAIK, and are therefore useless for taking meaningful pics of landmass.

Great for taking pics of cloud formations, however.

Google definitely uses satellite imagery for the zoomed-out views. You can’t argue this is a shot from an airplane. Pretty much everything with a scale of around 1 inch to a mile is from an airplane, then it switches to satellite in the multiple miles per inch scales.

Satellites orbit often enough that you can find plenty of images that aren’t obscured by clouds. The images might not be meaningful for finding your house, but they are certainly meaningful for some applications.

Even though satellites can see the ocean, it takes bandwidth to transmit those images back to Earth, and bandwidth is an extremely precious resource on a satellite. I don’t see why they’d waste it by taking visible-light images of the ocean surface.