The USA offers more than just city hubs and famous landmarks. Its deserts are home to some of the country's most astonishing landscapes, wildlife and experiences, all waiting to be explored. Here's a short guide to some of the best deserts in America.