It's a bit of a Schrödinger's cat type of question.
You can't measure one without affecting the other.
For example, you might short out the output and measure 100mA. Then you might measure the open-circuit voltage and see 2.5V RMS. From that you might (incorrectly) assume you can draw 100mA at 2.5V. But the voltage was measured at zero current, and the current at zero voltage.
To really answer the question you need to define a load (say a 33 ohm resistor) and measure the voltage across it and the current through it.
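To see why the short-circuit and open-circuit readings don't simply combine, here's a minimal Python sketch. It assumes the output behaves like an ideal Thevenin source (a perfect voltage source behind a fixed series resistance), which a real amplifier output only roughly approximates, and it derives that source from the two readings above:

```python
# A minimal sketch, assuming the output is an ideal Thevenin source.
# Real amplifier outputs are not this linear, so treat the numbers
# as an illustration, not a prediction.

V_OPEN = 2.5     # open-circuit voltage, V RMS (measured at zero current)
I_SHORT = 0.100  # short-circuit current, A RMS (measured at zero voltage)

# Equivalent source resistance implied by the two measurements.
r_source = V_OPEN / I_SHORT  # 25 ohms

def loaded_output(r_load):
    """Voltage across and current through a resistive load."""
    i = V_OPEN / (r_source + r_load)  # current around the series loop
    v = i * r_load                    # voltage left across the load
    return v, i

v, i = loaded_output(33.0)
print(f"33 ohm load: {v:.2f} V RMS at {i*1000:.0f} mA RMS")
# -> roughly 1.42 V at 43 mA, nowhere near "2.5 V at 100 mA"
```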
On another level, you also need to consider the source of the signal and the volume level. In this case both will have a substantial effect.
If you know the rating of the output in watts (or in this case probably milliwatts), you can calculate the relationship between voltage and current, since power is just voltage times current (P = V × I). What the amplifier can do at its best will be a point on that curve.
For your amplifier, let's say it's rated at 150mW. That means it may be capable of delivering 75mA at 2 volts. But note that this is just one point on the curve, and the amplifier can't necessarily reach every other point on it (for example, you wouldn't expect 1mA at 150V or 15A at 10mV, even though both also multiply out to 150mW).
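If it helps to visualize, here's a quick sketch of that constant-power curve, taking the 150mW figure above as an assumed rating:

```python
# Points on a constant-power curve P = V * I, assuming a 150 mW rating.
# Every pair below multiplies out to 150 mW; a real amplifier can only
# actually deliver the ones within its own voltage and current limits.

P_RATED = 0.150  # assumed output rating, watts

for v in [0.010, 0.5, 1.0, 2.0, 5.0, 150.0]:  # volts
    i = P_RATED / v  # current that gives 150 mW at this voltage
    print(f"{v:8.3f} V  ->  {i*1000:10.1f} mA")
```

The 2V row comes out at 75mA (the plausible point), while the 0.010V and 150V rows reproduce the absurd 15A and 1mA extremes.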
Now, remember I said you have to read the voltage across the load and the current through it? Actually, that's a lie. From the voltage across it we can determine the current through it (for a simple resistor, I = V/R). But in either case (measure or calculate) you will end up with a voltage and a current.
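So in practice, with your chosen load connected and a signal playing, a single voltmeter reading does the whole job. The 1.42V below is just a made-up example reading, not a prediction for any particular amplifier:

```python
# Ohm's law: for a simple resistor, one voltage reading gives us
# the current and the power too.

R_LOAD = 33.0   # ohms, the resistor from the example above
v_meas = 1.42   # V RMS measured across the load (hypothetical reading)

i = v_meas / R_LOAD  # current through the resistor, A RMS
p = v_meas * i       # power delivered to the load, watts

print(f"{i*1000:.1f} mA RMS, {p*1000:.0f} mW into the load")
# -> about 43 mA and 61 mW
```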