Yes, but it's what you do with it that counts. I earned an undergraduate degree in journalism with an emphasis in news/editorial writing. If I'd stayed in that career field, my answer would probably have been "no"... I could likely have worked my way up through the ranks eventually, and I was an editor at a small paper just a few short years after graduating. Still, the cost of the degree compared to my ROI wouldn't have been worth it, sad to say.

Realizing that, I went back and earned a master of arts in English, with an emphasis in teaching composition and rhetoric. I taught English 101 for several years at our local D-1 university... the pay was still relatively abysmal. If I wanted to "make it" financially, it was going to require a PhD. Universities and small colleges are notorious for using ABD (all but dissertation) faculty as "slave labor" to make up the bulk of their teaching staff.

I left to pursue a career in sales, which led to several hard years financially. We lived primarily on my wife's income for a couple of those. At that point in my life, you could STILL say the college degree wasn't necessarily worth it.

Today, though, I work for the largest medical company in the world, and they won't even give your resume a second look if it doesn't list a college diploma. My master's helps separate me from the field a bit, too. You can imagine the income, benefits, and perks are pretty nice... and though it took a LOT of work to get here, I could NOT have worked myself into this position without the degree.