In today's evolving society, there is no denying that pop culture, media, and entertainment have a significant influence on us. They are everywhere we go. When we turn on the television, some company is marketing a new brand of clothing as "the next best thing." In magazines geared toward teen readers, nearly every other page carries a Photoshopped image of a model advertising a line of cosmetics that is sure to make us feel "prettier" and "happier" because "we're worth it." I do not believe this influence can, or needs to, be "abolished"; in reality, businesses will always try to market their brands through conspicuous, often explicit, means.