I have noticed recently that a few "evil" corporations have started to put forth an eco-friendly message. When empires, er... organizations, like Fox (with their new "Green It, Mean It" ad campaign) and Walmart (with their recent focus on organic produce and doing well in taste tests against Whole Paycheck, er... Foods) start being "green," I have two simultaneous reactions.

The first is to call bullshit. My uber-liberal educational background makes it hard for me not to assume it is just for show. My second reaction, however, is that even if it is just posturing due to mounting public concern about the environment, it is still a good thing. It means that all the bad press about being eco-terrible has finally gotten to them, or that they realize being green is something people actually care about.

While I wish we would all just start cherishing nature and being better stewards of the planet because it is the right thing to do, I know that is not going to happen. It is going to take large corporations with big capital to actually make meaningful change, but they certainly won't do it out of the goodness of their "hearts"; we need to keep their feet to the fire and make sure they are not just posturing. So here is a tentative cheers to Fox and Walmart, even though it makes me choke a little to say it.
Also, for some science-art-earth inspiration, check out what is happening at one amazingly awesome small uber-liberal arts college:
Feet to the Fire