DigitalGlobe's New Satellite WorldView I is Black and White?

Lots of folks, including myself, may have been confused by the news about the new DigitalGlobe satellite which just launched today. I’m glad the launch was successful and will improve DigitalGlobe’s imagery data gathering capabilities. But, I’m not sure this satellite will be destined to help improve Google Earth imagery. Closer inspection of the specifications of the imaging system of the WorldView I reveals that it is “panchromatic” – which, while sensitive to many wavelengths, is essentially a black and white imaging sensor.
I’m going to do some more checking on this. It seems odd that they would spend all this money and not be able to produce color imagery. I know some imaging systems use color filters and blend processes to produce colors. Any remote sensing experts out there care to comment?
[UPDATE 1835: QuickBird's main sensor is also panchromatic, but it produces color using separate multispectral sensors. Those types of sensors aren't listed in the specs for WorldView I.]
[UPDATE 1920: Ok, so according to analysis in the comments below, WorldView I isn't color, but DG will be able to fuse the imagery data with other color sources to get imagery that is mostly colored correctly. The limitation was possibly due to government contracts on the satellite, which apparently won't be the case with the WorldView II satellite.]

About Frank Taylor

Frank Taylor started the Google Earth Blog in July, 2005 shortly after Google Earth was first released. He has worked with 3D computer graphics and VR for many years and was very impressed with this exciting product. Frank completed a 5.5 year circumnavigation of the earth by sailboat in June 2015 which you can read about at Tahina Expedition, and is a licensed pilot, backpacker, diver, and photographer.


  1. Craig Hughes says:

    Uh. Panchromatic does not mean what you think it means. The first place you might want to start your checking is with the definition of “panchromatic”…

  2. Panchromatic allows for the best resolution, but it would be nice if it was multi-band data. Maybe Google will use it to pan-sharpen existing color or false-color imagery.

  3. from wikipedia on panchromatic:
    Digital panchromatic imagery of the Earth’s surface is also produced by some modern satellites, such as QuickBird and IKONOS. This imagery is extremely useful, as it generally is of a much higher resolution than the multispectral imagery from the same satellite. For example, the QuickBird satellite produces panchromatic imagery having a pixel equivalent to an area 0.6m x 0.6m, while the multispectral pixels represent an area of 2.4m x 2.4m.

  4. No, you were right – panchromatic is black and white. Color imagery would be acquired with multi-spectral sensors, typically with a lower resolution than the panchromatic one. Check out the sensor specs for Quickbird:
    And you’ll see that there are two classes of sensors listed:
    * 60-cm (2-ft) panchromatic at nadir
    * 2.4-m (8-ft) multispectral at nadir
    I can’t believe they would have launched a satellite without color capabilities. Maybe they just left the color sensor specs off by mistake.

  5. Craig: Panchromatic means that it’s sensitive to all wavelengths of visible light. This results in a broad-spectrum greyscale image. Contrast this with orthochromatic, which is only sensitive to greens and blues.
    In theory they could probably do multi-colour imagery by using transmission filters, but I don’t see any mention of filters on the spec page.

  6. There’s an interesting and rather dramatic history regarding the WorldView sensor program. But without getting into that, I’ll try to point out a couple of things.
    – WorldView 1 is a panchromatic full-spectrum sensor, as Craig describes.
    – WorldView 1 will be primarily tasked in relationship with the government — and that data will be available to the commercial sector.
    – The panchromatic imagery, because of its wavelength capacity, will be able to be fused with existing and newly collected multi-spectral sources, such as Quickbird, and other sensors (IKONOS, etc). Or, the multi-spectral can be pan-sharpened by the new panchromatic source.
    – The panchromatic system also shortens collect-time dramatically, allowing it to capture imagery at a far more rapid rate than any other commercial sensor.
    – Due to such a rapid collect-time and rapid re-visit rate — and the nature of the wide spectral range — more accurate DEMs can be generated, and at high resolutions.
    Put 2 and 2 together, and what you get is a sensor that will fill in all the gaps, along with providing some of the most useful data available to the commercial and government sectors.
    To my understanding, WorldView 2 will be fully owned and operated by Digital Globe and with no other interests — making it the first fully commercial sensor. That sensor is supposed to be a multi-spectral system, but superior to Quickbird. (Thus you can imagine the additional possibilities in fused/sharpened sources between the sensors — along with the advantages of rapid DEM collection from the panchromatic WorldView 1 system, mapping the entire earth at 700,000 km² per day.)
    Also note that, because of the spectral variety in WorldView 1’s panchromatic source, we’ll see some interesting processing methodologies emerging using that data as well. Classification will probably be more easily achieved due to the spectral variety present.

  7. In theory, since Google has multi-color imagery already, it could be joined with panchromatic data coming from the new bird to create sharper looking color imagery. TV broadcasts do the same thing (look up 4:2:2 on Wikipedia), and it all works because our eyes are much more sensitive to luma (brightness) than chroma (color).
    This could really be a great boon for those areas of the country not already covered by aerial photos (like the mountains I like to camp in).

  8. We can restore colors by combining data from multiple sensors:
    Hi-res panchromatic + low-res color = hi-res color
    This needs more processing time and good software.
    For Google it’s a good solution.

  9. PS: We’ll probably see two methodologies occurring with the DEMs generated from this high-resolution source.
    1) Automated 3D mapping and modelling.
    2) The most complete high-resolution global elevation dataset.
    Am I convinced it’ll enhance their relationship with Google Earth? Yes, I most certainly am.
    BUT — that assumes the NGA is going to play ‘nice’ with the taxpayers as well.
    I think the emphasis is on the DEM data in this case, along with some relative variants of classification-based methods. GE could probably benefit from some of the more robust vis applications that use classification methods to generate a more realistic environment based on the classes that each material represents.

  10. It’s common across all of these satellites that the panchromatic sensor is of much higher resolution than the multispectral one. Whenever they cite a satellite as having, say, 0.5m resolution, they usually refer to its best sensor, which is usually the panchromatic.
    And like everybody said, when processing remote sensing imagery, all you need to do is combine the panchromatic layer with color to get high-res color. This process is called pan-sharpening.
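The pan-sharpening the commenters describe can be sketched in a few lines of NumPy. This is a rough, Brovey-style illustration only, not DigitalGlobe's or Google's actual processing chain, and it assumes the low-res multispectral bands have already been resampled onto the panchromatic pixel grid:

```python
import numpy as np

def brovey_pansharpen(ms, pan, eps=1e-6):
    """Brovey-style pan-sharpening (illustrative sketch).

    ms  : (H, W, 3) multispectral image, already upsampled to the pan grid
    pan : (H, W)    high-resolution panchromatic band
    Each color band is rescaled so the per-pixel brightness
    matches the panchromatic band, keeping the color ratios.
    """
    intensity = ms.mean(axis=2)           # crude luminance estimate per pixel
    ratio = pan / (intensity + eps)       # per-pixel brightness gain
    return ms * ratio[..., None]          # apply the gain to every band

# Toy example: flat gray "color" image sharpened against a brighter pan band
ms = np.ones((4, 4, 3)) * 0.5
pan = np.full((4, 4), 0.8)
sharp = brovey_pansharpen(ms, pan)
```

In real pipelines the ratio step is weighted per band (and the pan band covers a wider spectral range than the visible-color bands), but the principle is the same: detail from the pan band, hue from the multispectral bands.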

  11. Would they really want color sensors in the visible wavelengths? Not only would it be more expensive to have 3 focal planes or filters for each of the three wavelengths (R, G, B), but then you’d have to throw away all your imagery from cloudy days. For high resolutions, it may be best to have sensors with passbands that see directly to the ground.

  12. full wiki on Panchromatic:
    Panchromatic film is a type of black-and-white photographic film that is sensitive to all wavelengths of visible light. A panchromatic film therefore produces a realistic image of a scene. Almost all modern photographic film is panchromatic, but some types are orthochromatic and are not sensitive to certain wavelengths of light. As naturally prepared, silver halide emulsions are very much more sensitive to blue and UV light than to green and red wavelengths. The German chemist Hermann W. Vogel found out how to extend the sensitivity into the green, and later the orange, by adding sensitising dyes to the emulsion. However, his technique was not extended to achieve a fully panchromatic film until the early 1900s, shortly after his death.
    Digital panchromatic imagery of the Earth’s surface is also produced by some modern satellites, such as QuickBird and IKONOS. This imagery is extremely useful, as it generally is of a much higher resolution than the multispectral imagery from the same satellite. For example, the QuickBird satellite produces panchromatic imagery having a pixel equivalent to an area 0.6m x 0.6m, while the multispectral pixels represent an area of 2.4m x 2.4m.
    full wiki on Orthochromatic:
    Orthochromatic photography refers to an emulsion that is sensitive to only blue and green light, and thus can be processed with a red safelight. Using it, blue objects appear lighter and red ones darker because of increased blue sensitivity. A standard black and white film can be used with a cyan lens filter (devoid of red light) to produce a similar effect.
    Orthochromatic films were first produced by Hermann Carl Vogel in 1873 by adding small amounts of certain aniline based dyes to ordinary photo emulsions, work that was extended by others including J M Eder, who introduced the use of the red dye erythrosine in 1884.
    Most important note on panchromatic: “… that is sensitive to all wavelengths of visible light.” The question is whether ALL data is recorded and sent back to earth…
    But just this sentence tells us that it’s possible to edit the footage so it appears “normal” to us…

  13. Smokeonit,
    Those are generic definitions of film-based photography and shouldn’t be confused entirely with the spectral wavelengths of the panchromatic sensor in this case.
    The actual spectral range of WorldView 1 will be 0.40 – 0.90 µm, with a center wavelength of 650 nm (min 400 nm, max 900 nm).
    You can think of each band of a multi-spectral sensor, for instance, as being ‘each band panchromatic’. Each band is taken in a spectral range due to filtering to collect within those ranges. Merged, fused, or pan-sharpened, you can create various combinations of imagery data/products.
    The bands generally known for their ‘color characteristics’ are simply narrower panchromatic wavelengths, which are typically captured at lower resolution. This is because if they tried to collect all spectral bands at once at full resolution, it wouldn’t be feasible with accuracy and downlink speed in mind. (The more data you try to pipe down through the data-link, the more time it takes.)
    In the case of World View 1 — NGA opted for the panchromatic sensor, because it was an agreed upon trade-off to launch the sensor and get it operational.
    Hopefully that helps a little.

  14. As I posted before, the question is how much data will be sent down to earth from the satellite…
    That will become known when DigitalGlobe posts the first pictures in high res and the specs for it…

  15. For a history of why WV01 is pan and what WV02 is expected to deliver see the following article from Oct 05.

  16. What do DigitalGlobe and President Bush have in common? They both have a black and white world view. Sorry, couldn’t resist. 😉

Leave a Reply