(04-20-2010, 09:38 PM)Tayrtahn Wrote: It is definitely possible to implement this type of audio system in a game. You would have two "listener" objects in the world to represent the player's ears. When a sound is made in the game world, both listeners record their distances from the source of the sound - this information is used to slightly offset the playback of the sound in either the left or the right channel. Putting that together wouldn't be unreasonable, though it would definitely require more memory and CPU usage. Most developers opt for graphical improvements instead.
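If I'm reading that right, the per-channel offset would just come from the difference between the two source-to-ear distances, converted into a sample delay. Here's a back-of-the-envelope version of how I picture it; the sample rate, speed of sound, positions, and the little Vec3 helper are all just placeholder choices for illustration, not anything from a real engine:

Code:
/* Rough sketch of the two-listener idea quoted above: measure how far
 * the sound source is from each "ear" and turn the difference into a
 * per-channel sample delay. All numbers here are made up. */
#include <math.h>
#include <stdio.h>

#define SPEED_OF_SOUND 343.0f   /* m/s, roughly room temperature */
#define SAMPLE_RATE    44100.0f /* samples per second */

typedef struct { float x, y, z; } Vec3;

static float vec3_distance(Vec3 a, Vec3 b)
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return sqrtf(dx * dx + dy * dy + dz * dz);
}

/* Returns how many samples later the right channel should start
 * relative to the left (negative means the left channel is delayed). */
static int interaural_delay_samples(Vec3 source, Vec3 leftEar, Vec3 rightEar)
{
    float dl = vec3_distance(source, leftEar);
    float dr = vec3_distance(source, rightEar);
    return (int)((dr - dl) / SPEED_OF_SOUND * SAMPLE_RATE);
}

int main(void)
{
    Vec3 src   = {  2.0f, 0.0f, 1.0f };  /* made-up source position   */
    Vec3 left  = { -0.1f, 0.0f, 0.0f };  /* ears ~20 cm apart         */
    Vec3 right = {  0.1f, 0.0f, 0.0f };
    printf("delay right channel by %d samples\n",
           interaural_delay_samples(src, left, right));
    return 0;
}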
About the only problem I see there is that, from my teensy bit of experience with sound programming (I've only just started with OpenAL), it might not always be possible to have two listener objects within the same context. At least, in my OpenAL use so far I haven't seen anything about creating a second listener. That's not to say it's impossible, but it may not be a commonly supported feature, since there usually isn't much need for a second listener, not to mention the CPU cost you outlined.
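For what it's worth, this is roughly why I don't think a second listener exists in OpenAL: there's no alGenListeners() or listener handle, the listener is just part of the context. The only workaround I can picture is creating a second context and treating each context's listener as one ear, which would mean mixing everything twice. The positions below are made up, and I'm not claiming this is how anyone actually does it, just a sketch of the idea:

Code:
/* Sketch only: one listener per OpenAL context, so "two ears" would
 * need two contexts. Each context would also need its own copies of
 * the sources, since sources belong to a context. */
#include <AL/al.h>
#include <AL/alc.h>

int main(void)
{
    ALCdevice  *device   = alcOpenDevice(NULL);            /* default device */
    ALCcontext *leftCtx  = alcCreateContext(device, NULL);
    ALCcontext *rightCtx = alcCreateContext(device, NULL);

    /* "Left ear" listener */
    alcMakeContextCurrent(leftCtx);
    alListener3f(AL_POSITION, -0.1f, 0.0f, 0.0f);

    /* "Right ear" listener */
    alcMakeContextCurrent(rightCtx);
    alListener3f(AL_POSITION, 0.1f, 0.0f, 0.0f);

    alcMakeContextCurrent(NULL);
    alcDestroyContext(leftCtx);
    alcDestroyContext(rightCtx);
    alcCloseDevice(device);
    return 0;
}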
Otherwise, though, I agree with your post; it's more or less what I was thinking when I first read this thread.