A recent study from New Zealand shows that musicians' brains respond differently to music than non-musicians' do. In short, when asked to perform difficult language-based tasks with music playing, musicians did worse than they did in silence (suggesting that their brains process music much like language), while non-musicians performed equally well whether music was playing or not (suggesting that they don't treat it linguistically).
That makes sense, of course. I recall that as a kid I could just let music wash over me without dissecting how it was made and structured, the way I do now after more than 16 years of making money playing it, even if not always often or well.
But it's not an entirely binary thing either, as I discovered yesterday. The two main interface designers at work were away, but we needed to make some application mockups for a presentation taking place today, so my boss called on my not-so-mad Photoshop skillz to get it done.
Normally at work I'm editing words—letters, web content, proposals, emails, and so on. I've found that while I can listen to music while doing that, I can't listen to talk podcasts or other discussion-heavy audio, just as I can't listen to them while browsing text-heavy websites or writing blog posts like this one. I find myself focusing on the work onscreen, and missing out on the talk in my ears.
However, when working on the mockups, which are at least as intellectually demanding but more visual than linguistic, I found I could happily listen to talk podcasts at the same time, or to music instead, and both get the work done efficiently and enjoy the shows or music. So my brain evidently treats written and spoken language differently from music, and differently from visual design as well. I notice that while working and listening to music, I actually do revert a bit to my childhood listening pattern, where I don't analyze the music as much, just hear it—presumably because my left brain is occupied with the tasks at hand.