The start of the 1990s saw the fall of the Berlin Wall and the reunification of Germany into a single nation poised to become a formidable economic force around the world. But to many Americans educated by the news and entertainment media, the image of Germany remained a holdover from World War II and the Holocaust. When the American media were not presenting an outdated, jackbooted view of Germany, they were portraying it as a country epitomizing...