When you visit the western US today, you might see cattle everywhere. But did you know that cattle were not native to these parts? Cattle were introduced to the Americas by the Spanish conquistadors, and cowboys later drove them across the West. Follow the history of early cattle ranching and how it shaped life on the plains. Start reading today.