Hollywood had become the center of the American film industry by 1915 as more independent filmmakers relocated there from the East Coast. For more than three decades, from early silent films through the advent of "talkies," figures such as D.W...