The movie industry has hit major milestones throughout its history, but the 1990s stands out as the decade that changed what audiences expected from Hollywood movies.