Radicalism
Radicalism is the belief that society needs fundamental change, and that such change is possible only through revolutionary means. Most people associate the noun radicalism with left-wing politics, although people on both ends of the political spectrum can be described as radical. The word comes from the Latin radicalis, "of or having roots," which in turn derives from radix, "root." Both radical and radicalism grew out of the idea that political change must "come from the root," the most basic source of society.