I have a question about wire gauge ratings. Why do wire gauge charts
show how many amps a wire can safely carry, and not the wattage
it can safely dissipate? Resistors are rated in watts, not
amps.

OK, let's say that you've got a 100-foot extension cord and it's
rated to safely dissipate 10 watts. What are you going to do with
that information? More importantly, how do you know what you can
plug into it safely?
On the other hand, if it's rated to safely carry 10 amps, all you
have to do is look at the nameplate of whatever you're going to plug
into it to know how much current the extension cord will have to
carry.

I would think that if I had a length of wire that was just barely
capable of carrying 10 amps at 12 volts, then if I hooked 120 volts
up to the same piece of wire and tried to run 10 amps through it,
the wire, since P = I*E, would be dissipating heat much more rapidly
and could overheat.

You forget that the wire isn't supposed to be a significant portion
of the load. Using our 100 foot extension cord as an example, if it
can safely dissipate 10 watts with 10 amps going through it, then
the voltage dropped across it will be:
E = P/I = 10W/10A = 1V,
with the balance being dropped across the load, and its (the
extension cord's) resistance will be:
R = E/I = 1V/10A = 0.1 ohm,
and the circuit will look like this:
 E1
  |
 [R1]
  |
 +E2
  |
 [R2]
  |
 GND
Where E1 is the supply voltage, E2 is the voltage appearing across
the load, R1 is the resistance of the extension cord, and R2 is the
resistance of the load.
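As a sanity check, here's that arithmetic as a short Python snippet. The 10 W at 10 A premise comes from the example above; everything else follows from Ohm's law:

```python
# The example's premise: the cord can safely dissipate 10 W while
# carrying 10 A. From that, the drop and the resistance follow:
P = 10.0   # watts the cord can safely dissipate
I = 10.0   # amperes through the cord

E = P / I  # volts dropped across the cord (E = P/I)
R = E / I  # the cord's resistance in ohms (R = E/I)

print(E, "V,", R, "ohms")  # -> 1.0 V, 0.1 ohms
```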
So, let's take a look at what actually happens with a load hooked up
to our extension cord.
First, if we have a 120V supply feeding a load which draws 10 amps,
the load will look like:
R = E/I = 120V/10A = 12 ohms
and our circuit will look like:
 120V
  |
 [R1] 0.1R
  |
 +E2
  |
 [R2] 12R
  |
 GND
Now, since the extension cord and the load are in series, that's a
total of 12.1 ohms, so the current the extension cord will have to
carry will be:
I = E/R = 120V/12.1R ~ 9.9 amperes
and instead of 120V appearing across the load, there'll be about 119
there because of the ~1V drop in the cord.
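The series-circuit numbers can be checked the same way, using the cord resistance and load resistance worked out above:

```python
# Series-circuit check: 120 V supply, the cord's ~0.1 ohm in series
# with a 12-ohm load (a nominal 10 A load at 120 V).
V_supply = 120.0
R_cord = 0.1
R_load = 12.0

I = V_supply / (R_cord + R_load)  # same current through both
V_drop = I * R_cord               # volts lost in the cord
V_load = V_supply - V_drop        # volts the load actually sees

print(round(I, 2), "A,", round(V_drop, 2), "V dropped,",
      round(V_load, 1), "V across the load")
```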
On the other hand, if you wanted to treat the extension cord like a
resistor, all you'd have to do would be to short out the socket end
of it and plug it in.
Since the cord looks like about a tenth of an ohm, when you first
plugged it in you'd get:
I = E/R = 120V/0.1R = 1200 amperes
through it, and the power it would be dissipating would be:
P = IE = 1200A * 120V = 144,000 watts!
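The shorted-cord case works out the same way, assuming the cord comes to roughly 0.1 ohm as in the example's premise:

```python
# Shorting the far end makes the cord's own resistance the whole
# circuit, so the full supply voltage appears across the cord.
V = 120.0
R_cord = 0.1

I_fault = V / R_cord  # amperes through the shorted cord
P_cord = I_fault * V  # watts the cord must now dissipate

print(I_fault, "A,", P_cord, "W")
```

Compare that with the 10 watts the cord was rated to dissipate, and it's clear why nothing but the load is supposed to provide the resistance.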

Why isn't voltage taken into account in determining the right diameter
for a conductor in a circuit?

Because it's largely irrelevant, since the bulk of the voltage is
supposed to appear across the load. The diameter is important
because that, in conjunction with the length of the wire, will
determine its resistance, and that, in turn, will determine how hot
the wire gets with a specified current running through it.
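That last point, resistance from diameter and length, can be sketched numerically with R = rho * L / A. The copper resistivity, the 16 AWG diameter, and the cord length below are illustrative assumptions, not figures from the discussion above:

```python
import math

# R = rho * L / A: resistance from material, length, cross-section.
# Assumed illustrative values: copper resistivity ~1.68e-8 ohm-m;
# a 100 ft cord of 16 AWG wire (~1.29 mm diameter), with the length
# doubled because current flows out one conductor and back the other.
rho = 1.68e-8              # ohm-metres, copper at room temperature
length = 2 * 100 * 0.3048  # metres of conductor in a 100 ft cord
diameter = 1.29e-3         # metres, roughly 16 AWG

area = math.pi * (diameter / 2) ** 2  # cross-sectional area, m^2
R = rho * length / area               # total cord resistance, ohms
P = 10.0 ** 2 * R                     # heat at 10 A, since P = I^2 R

print(round(R, 2), "ohms,", round(P, 1), "watts at 10 A")
```

Note that the heating depends only on the current and the wire's own resistance, which is why ampacity, not wattage or supply voltage, is the natural rating for a conductor.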