Imagine NASA had just landed a robot spacecraft in Richmond Park, West London, looking for Life, as it has just done with Perseverance on Mars. These might have been the first pictures the lander sent back for the scientists to pore over and use to form theories and plan explorations. Subsequent photos might have been in weird colour, perhaps taken with a telephoto lens rather than a panoramic one, and at a higher resolution, but still from the high vantage point of the lander. If they had been looking for Life, what would they have seen, and what would they plan to do next?
Apart from using a high point of view to mimic the top of a space lander, I have rearranged the colorimetry on the premise that to see as humans see, you have to be looking at the same frequencies, and in the same proportions, as a human eye. A life form on another planet, using a different biochemistry, wouldn't necessarily see as we do on Earth, so it's unlikely a lander would see in the same way as the local life forms. Birds, for instance, don't see colours as we do, and bees see far more ultraviolet than humans do; many flowers have stripes visible to bees that humans cannot see.
NASA is very particular about minimising lens distortion so that the scale of objects can be judged, so lens aberrations are not part of my simulation.
This is a variation on the art-school exercise of seeing how many photos you can take from a single location, plus a bit of fun climbing a tree to gain height for the viewpoint.
The project was inspired by the recent successful landing of NASA's Perseverance rover on Mars in the quest for life elsewhere in the Universe. Or are they really looking for minerals to extract?