A possible starting point would be:

1. Use peak values. Set your minimum (peak = √2 × RMS ≈ 1.414 × RMS) in software, perhaps as a constant; see the sketch after this list.

2. Use a voltage divider to give you a sample output, calculated so the maximum never exceeds 5 V. The total series resistance of the divider will depend on the output voltage of your bridge and the current you let it draw, but it needn't draw even a milliamp; 100 microamps may well do. (A worked example follows this list.)

3. With an ADC port, repeatedly sample the voltage from the divider for half the period of the bridge input; since a full-wave bridge repeats every half cycle, that window is guaranteed to contain a peak. (How many samples you take depends on the precision you require.) For instance, if the input is 60 Hz, the period is 16.7 ms, so sampling for about 8.3 ms will do it. Or, if your processor isn't doing much of anything else, just sample continuously.

4. Continue to sample as needed, and compare the maximum value from each sampling window to the minimum you established as a limit. When it drops below that minimum, record how many consecutive windows the value stays below it, as in the sketch below.
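
To put rough numbers on the divider in step 2 (purely for illustration): suppose the bridge puts out about 12 V RMS, so peaks near 17 V. Then 150 kΩ over 47 kΩ gives 17 V × 47/(150 + 47) ≈ 4.1 V at the tap, safely under 5 V, while drawing only about 86 µA at the peak. Any pair with a suitable ratio and acceptably low current will serve.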
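
And here is a minimal sketch of steps 1, 3, and 4 in C, assuming a 60 Hz input, a 10-bit ADC with a 5 V reference, and the illustrative divider above. `adc_read()` and `delay_us()` stand in for whatever your platform actually provides; all the constants are assumptions to adapt.

```c
#include <stdint.h>

#define ADC_FULL_SCALE  1023.0  /* 10-bit ADC (assumption) */
#define ADC_VREF        5.0     /* 5 V reference (assumption) */
#define DIVIDER_RATIO   0.239   /* e.g. 47k / (150k + 47k) */
#define MIN_RMS         11.0    /* lowest acceptable RMS at the bridge input (assumption) */

/* Step 1: minimum acceptable peak, converted to ADC counts. */
#define MIN_PEAK_COUNTS \
    ((uint16_t)(MIN_RMS * 1.414 * DIVIDER_RATIO / ADC_VREF * ADC_FULL_SCALE))

#define SAMPLES_PER_HALF_CYCLE 64  /* spread over ~8.3 ms; your precision needs set this */

extern uint16_t adc_read(void);     /* hypothetical: one ADC conversion */
extern void delay_us(uint32_t us);  /* hypothetical: busy-wait */

/* Step 3: sample for one half-period of the input and return the peak seen. */
static uint16_t peak_over_half_cycle(void)
{
    uint16_t peak = 0;
    for (int i = 0; i < SAMPLES_PER_HALF_CYCLE; i++) {
        uint16_t v = adc_read();
        if (v > peak)
            peak = v;
        delay_us(8333 / SAMPLES_PER_HALF_CYCLE);  /* half of 16.7 ms, split evenly */
    }
    return peak;
}

int main(void)
{
    uint32_t windows_below = 0;  /* step 4: consecutive windows under the limit */

    for (;;) {
        if (peak_over_half_cycle() < MIN_PEAK_COUNTS)
            windows_below++;
        else
            windows_below = 0;

        /* act on windows_below here, e.g. flag a dropout after N windows */
    }
}
```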

One note: if the output from the bridge is filtered, you'll need to put a diode between the output of the bridge and any filtering (e.g., a capacitor), and place the voltage divider ahead of that diode; otherwise the capacitor will hold the divider's input up and mask brief dropouts.

This is off the top of my head (or "shooting from the hip").

I'm sure there will be other suggestions that are better, simpler, or both.