When did the NFL start to become an international brand?

When the NFL started to become an international brand in the 1970s, there was still a perception that it was a distinctly American sport.