Why did Britain Colonise America?
While the French moved into the north and the Spanish settled in the south and west, the British founded colonies on the east coast. British settlers came to these new lands for many reasons: some wanted to make money or set up trade with their home country, while others sought religious freedom.