A policy or practice by which a country extends its power and dominion over other nations or territories, often through military force, colonization, or economic and political influence.
The age of imperialism saw European powers expand their territories across Africa and Asia.