Most Americans agree that a college degree is important in helping a young person succeed in the world. According to the Pew Research Center, most college graduates themselves say their degree helped them grow and develop the skills they needed for the workplace, and the economic advantages for college graduates over those without a degree are clear and growing. Even so, there is an undercurrent of public dissatisfaction with the role colleges play in society, whether the concern is the cost of tuition or the perception that faculty bring their political views into the classroom. Kim Parker of the Pew Research Center shares some interesting polling on Americans' differing views of higher education. See the full story here.