I don't quite understand the question. Is the question, who invented the currently used charsets or who invented the concept of mapping a number to a symbol?
As for the concept, it already exists in the alphabet itself: there is a rather intuitive mapping of 1, 2, 3 onto A, B, C and so on. Even before the Latin alphabet there were the Phoenicians and others, so the idea of a "character map" goes way back; its use in computers is just a natural adaptation of it.
Regarding the currently used charsets, the Wikipedia article on ASCII is a good starting point:
http://en.wikipedia.org/wiki/Ascii
Typically, a charset refers to a mapping of 8 bits to a character, and that's where the mess comes from. ASCII only defines characters for 7 bits, so half of the space is free for others to use. As a result, zillions of different mappings exist so that each nationality can have its own special characters. It's quite a mess.
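As a quick illustration (a minimal Python sketch; the byte value and the particular legacy charsets are just examples I picked), the very same byte above the 7-bit ASCII range decodes to a completely different character depending on which 8-bit charset you assume:

    # One byte above the 7-bit ASCII range (0xE4) means something
    # different in each legacy 8-bit charset.
    raw = bytes([0xE4])

    for charset in ("latin-1", "iso8859-7", "cp1251", "koi8-r"):
        print(charset, "->", raw.decode(charset))

    # latin-1   -> ä  (Latin small a with diaeresis)
    # iso8859-7 -> δ  (Greek small delta)
    # cp1251    -> д  (Cyrillic small de)
    # koi8-r    -> Д  (Cyrillic capital De)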
Thankfully, there is an ASCII-compatible way to extend 8-bit tokens to represent a symbol space larger than 8 bits, and it's called UTF-8. Multiple bytes are used to encode characters above the ASCII range, so in theory no other character sets would be needed at all.
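To make that concrete, here is a small sketch (plain Python, nothing beyond the standard library; the sample characters are arbitrary) showing how UTF-8 leaves ASCII bytes untouched while spending two to four bytes on characters further up the Unicode range:

    # UTF-8 is backwards compatible with ASCII: code points below 0x80
    # encode to a single identical byte, everything else takes 2-4 bytes.
    for ch in ("A", "ä", "€", "你", "🙂"):
        encoded = ch.encode("utf-8")
        print(f"{ch!r}: U+{ord(ch):04X} -> {len(encoded)} byte(s): {encoded.hex(' ')}")

    # 'A':  U+0041  -> 1 byte(s):  41           (identical to ASCII)
    # 'ä':  U+00E4  -> 2 byte(s):  c3 a4
    # '€':  U+20AC  -> 3 byte(s):  e2 82 ac
    # '你': U+4F60  -> 3 byte(s):  e4 bd a0
    # '🙂': U+1F642 -> 4 byte(s):  f0 9f 99 82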
So, screw the whole charset crap and move to UTF-8. It's the wave of the future: Unicode will take over the world!