Not to deny our past, but I am sick of hearing Westerners complain about how imperialistic and evil we were during World War II. Didn't the British colonize India and Malaysia? Didn't the Americans colonize the Philippines and leave them permanently dependent on the US?