You may be resting in your dorm room, listening to your Beatles and smoking your "bong" pipe and composing your protest march folk song without a care in the world, secure in the knowledge that college degrees are kind of worth the money, more or less. But are they? Well, maybe not for you, hippie.

A new study out of Georgetown University says that yes, generally speaking, getting a college degree means you'll earn significantly more money over your lifetime. But! Let's get this out of the way, from the NYT's Economix blog:

The bachelor's degree holders most likely to be at a disadvantage, he said, are those with liberal arts degrees. "The truth is that if you get a liberal arts degree, there are a smaller and smaller set of occupations that you can go into," he said.

If you want to make the most money over a lifetime, your best bet, as it turns out, is to go to medical school.

It also really helps to be a white male! Can't emphasize that enough. If you want to earn money, please be a white male, if at all possible. Then, a doctor.

Also, the chart above shows you just how many people with a lower level of education than you will end up making more money than you. Not a ton, but just enough to make you question whether that art history degree was really worth it, or whether you would have been better off using that money to pay for plastic surgery to turn yourself into a white male. (Yes.) Haha. But seriously, overall, chances are you're better off going to school. As long as it's not law school.

You can read the full report here.