That's because Celsius uses 0 as the freezing point of water and 100 as the boiling point of water. That makes it easier to use and understand. Same with the rest of the metric system.
Celsius is NEAR 0 for freezing water (even in ideal lab conditions, it isn't exactly 0°C), and is rarely at 100 for boiling water (it is often several degrees off due to air pressure variations). And both temperatures depend a lot on the number of nucleation points available (water will stay liquid far below 0°C if there are no nucleation points). Plus using water for your calibration is arbitrary anyway. Why not ethanol or any other common chemical?
Fahrenheit at least has the advantages of:
1: being weather related (most of the world is between 0°F and 100°F, and extremes on either side of that range are a health hazard).
2: being nicely tuned to health. A temperature above 100°F is a fever. Why would you want to memorize 38°C when 100°F is easier? Note: using human health is just as arbitrary as using water.
3: matching the precision of human perception. We can tell the difference between temperatures that are roughly 1°F apart. A 1°F change in the outside temperature or the thermostat is just enough to notice. Like the OP said, 1°C is too coarse a step, and you are forced into decimals (see the quick conversion sketch after this list).
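If you want to check the numbers in points 2 and 3, here's a quick Python sketch using the standard textbook conversions (nothing fancy, just the formulas):

```python
def c_to_f(c):
    """Celsius to Fahrenheit: F = C * 9/5 + 32."""
    return c * 9 / 5 + 32

def f_to_c(f):
    """Fahrenheit to Celsius: C = (F - 32) * 5/9."""
    return (f - 32) * 5 / 9

# The fever threshold: 38 C is really 100.4 F, so "above 100 F"
# is a close, easier-to-remember rule of thumb.
print(c_to_f(38))               # 100.4

# Granularity: a 1 F step is only 5/9 of a Celsius degree, which is
# why whole-degree Celsius thermostats feel coarse.
print(f_to_c(71) - f_to_c(70))  # 0.555...
```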
But, in the end, both suck. Neither has absolute zero at 0. Rankine is the best of both worlds: a true zero, and degree intervals that match human perception.
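And Rankine costs almost nothing if you already think in Fahrenheit; it's just the Fahrenheit scale shifted so absolute zero lands at 0. A minimal sketch of that conversion:

```python
ABSOLUTE_ZERO_F = -459.67  # absolute zero on the Fahrenheit scale

def f_to_r(f):
    """Fahrenheit to Rankine: same degree size, zero moved to absolute zero."""
    return f - ABSOLUTE_ZERO_F

print(f_to_r(-459.67))           # 0.0    -> a true zero
print(f_to_r(32))                # 491.67 -> freezing point of water
print(f_to_r(100) - f_to_r(99))  # 1.0    -> intervals still match Fahrenheit
```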