If an average could be computed (and it cannot be), it would theoretically be 0.999...5, but you can't compute the exact average because you never reach the point of tacking on the final '5'.
But of course, even this opens up a can of worms: 0.999...5 is intuitively less than 0.999..., yet the average should lie between 0.999... and 1 (and thus be greater than 0.999...). The thing is, in order to take an average you have to have a finite number, so the 0.999... has to be terminated, and any terminated decimal can have any number of zeros tacked onto the end. So now it should be clear why 0.999...5 is theoretically the average, since it's halfway between 0.999...0 and 1.0. And it is only the theoretical average if you allow mathematical operations on the theoretical limit (immediately beyond the leading edge) of infinity without computing the infinite expansion itself.
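The "terminated decimal" part of this can actually be checked numerically: for any finite truncation with n nines, the average of that truncation and 1 really is the same truncation with a 5 appended. A minimal Python sketch using exact rationals (the helper name `truncated_nines` is just made up for illustration):

```python
from fractions import Fraction

def truncated_nines(n):
    """The terminated decimal 0.999...9 with exactly n nines, as an exact rational."""
    return Fraction(10**n - 1, 10**n)

for n in (1, 2, 3):
    x = truncated_nines(n)
    avg = (x + 1) / 2  # halfway between the truncation and 1.0
    print(f"n={n}: x={float(x)}, average={float(avg)}")

# n=1: x=0.9,   average=0.95
# n=2: x=0.99,  average=0.995
# n=3: x=0.999, average=0.9995
```

Every finite case ends in that trailing 5, which is exactly why the pattern suggests 0.999...5; the catch, as described above, is that the actual 0.999... never terminates, so there is no final place for the 5 to land.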
I'm not a math major, fwiw. This debate really is summed up as simply as 'one... not one...', but anyhow, you didn't accept that. So... still with me?