AC is good for induction motor loads and for transferring power over distances under about 300 km. Everything else really should (and in most cases can) use DC. AC motors rely on the alternating current to turn, and the line frequency sets their speed; household air conditioners and pool pumps are good examples. In transmission, AC is efficient because transformers can step it up to high voltage cheaply, and the current's zero crossings make it easy to interrupt with breakers. But the skin effect pushes AC current toward the surface of the conductor, raising its effective resistance, and those losses accumulate with distance; past roughly 300 km on high-voltage lines, that starts to matter. I found a study by Siemens that found up to 9% of the energy sent through a 1200 km AC line was lost, which is why HVDC lines are commonly used for long-distance, high-power transfers. 9% may not sound like much, but when you're transferring power on the gigawatt scale, you're losing a whole neighborhood's worth of power just in the wire.
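The resistive part of that loss is just I²R, which is why running at higher voltage (lower current for the same power) matters so much. Here's a rough sketch; the 0.03 Ω/km figure and the voltages are illustrative assumptions, not numbers from the Siemens study:

```python
def line_loss_fraction(power_w, voltage_v, ohms_per_km, length_km):
    """Fraction of transmitted power lost to resistive (I^2 * R) heating.

    Ignores reactive, corona, and converter losses -- this is only the
    resistive term, to show how it scales with voltage and distance.
    """
    current_a = power_w / voltage_v          # I = P / V
    resistance = ohms_per_km * length_km     # total line resistance
    return (current_a ** 2 * resistance) / power_w

# Illustrative: 1 GW over 1200 km of line at an assumed 0.03 ohm/km
print(line_loss_fraction(1e9, 800e3, 0.03, 1200))  # ~5.6% at 800 kV
print(line_loss_fraction(1e9, 400e3, 0.03, 1200))  # ~22.5% at 400 kV
```

Halving the voltage quadruples the resistive loss, so long lines get pushed to the highest voltage practical; HVDC then also drops the skin-effect and reactive penalties on top of that.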
I work in a lab at my university and I'm currently helping test power supplies and devices running on DC, as we're trying to build a DC microgrid with DC right at the outlet, plus point-of-use modified-sine inverters for the few places AC is absolutely required. It makes more sense to run on DC because you lose power in rectification at every device, and in inverting the power at generation in the first place. A coworker actually put on a cool demo where he produced 100 V DC with some bench power supplies and plugged a Dell monitor into it. If you think something needs AC to run, think about how it actually works. If it's electronics, you're not sending AC through the circuitry and microcontrollers; the first thing the power supply does is rectify it back to DC anyway. If the device is electromechanical, it may need AC, but chances are it doesn't.
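The argument for the DC microgrid is really just multiplying conversion efficiencies along each path. A toy comparison, where all the stage efficiencies are assumptions I made up for illustration (not measurements from our lab):

```python
from math import prod

def chain_efficiency(stage_efficiencies):
    """Overall efficiency of power passing through several converters in series."""
    return prod(stage_efficiencies)

# Assumed stage efficiencies (illustrative, not measured values):
INVERTER_EFF  = 0.96  # DC source (e.g. solar) -> AC distribution
RECTIFIER_EFF = 0.94  # AC outlet -> DC inside each device's power supply
DCDC_EFF      = 0.98  # DC outlet -> device's DC rail directly

ac_path = chain_efficiency([INVERTER_EFF, RECTIFIER_EFF])  # invert, then rectify
dc_path = chain_efficiency([DCDC_EFF])                     # one DC-DC stage

print(f"AC path: {ac_path:.1%}, DC path: {dc_path:.1%}")
```

With those made-up numbers the AC round trip eats roughly 10% of the energy versus 2% for staying on DC, and the gap only grows as you add more conversion stages per device.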