The calendar best associated with Rome was also notorious, in its original usage, for requiring an obscene amount of maintenance and upkeep: because the year was the wrong length, an extra month had to be inserted from time to time to keep the solstices reasonably aligned with the dates. The present Gregorian calendar reduces this feat to a mere day every (slightly less than) four years, a comparatively modest complication that upsets few people on a regular basis. That is hardly the Roman system's most egregious dysfunction, however: days of the month were not numbered counting upward from one, but downward, indicating the number of days remaining until the next significant marker (the Kalends, Nones, or Ides, the exact dates of which varied between months). The Kalends was the first day of the month, so the last week or so of month m would generally be known as 'day x before the Kalends of month m + 1', except that the counting was inclusive: the day of the Kalends itself counted as x = 1, so 'day x before the Kalends' was really only x - 1 days before it.
There were still leap days, and this is where it gets really egregious: the extra day was generally inserted in February, created by doubling the sixth (fifth, by exclusive counting!) day before the Kalends of March. The result was two days with the same designation, a.d. VI Kal. Mart., quite a bit different from the modern practice of simply adding an extra day at the end of the month.
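The countdown scheme, including the bissextile doubling, can be sketched in a few lines of Python. This is a simplified model of my own: the abbreviations are conventional but the case endings are not inflected (month names are accusative plural throughout, where the marker days themselves would properly take the ablative), and the pre-Julian intercalary month is ignored entirely.

```python
# Classical month names (accusative plural) and lengths.
MONTHS = ["Ianuarias", "Februarias", "Martias", "Apriles", "Maias", "Iunias",
          "Iulias", "Augustas", "Septembres", "Octobres", "Novembres", "Decembres"]
LENGTHS = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]

def to_roman(n):
    # Just enough Roman-numeral conversion for day counts.
    out, vals = "", [(10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    for v, s in vals:
        while n >= v:
            out, n = out + s, n - v
    return out

def roman_day(month, day, leap=False):
    # March, May, July, and October put the Nones on the 7th, Ides on the 15th;
    # every other month uses the 5th and 13th.
    nones = 7 if month in {3, 5, 7, 10} else 5
    ides = nones + 8
    if day == 1:
        return f"Kal. {MONTHS[month - 1]}"
    if leap and month == 2 and day == 25:
        # The bissextile day: the sixth day before the Kalends, doubled.
        return "a.d. bis VI Kal. Mart."
    # Days after the doubled day count against a 29-day February.
    length = 29 if (leap and month == 2 and day > 25) else LENGTHS[month - 1]
    for marker, name, mname in [(nones, "Non.", MONTHS[month - 1]),
                                (ides, "Id.", MONTHS[month - 1]),
                                (length + 1, "Kal.", MONTHS[month % 12])]:
        if day == marker:
            return f"{name} {mname}"
        if day < marker:
            if day == marker - 1:
                return f"prid. {name} {mname}"
            # Inclusive counting: both endpoints count, hence the + 1.
            return f"a.d. {to_roman(marker - day + 1)} {name} {mname}"
```

The inclusive `+ 1` is the whole trick: February 24 is six days before March 1 only if you count both February 24 and March 1 themselves.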
Exhaustive summaries of these rules can be found here and here. This system existed for centuries upon centuries, in various states of disrepair, before the modern, simple indexing of dates arose.
I admire this system for its idiosyncrasies, much as I admire old computers for their distinctive quirks and not merely the technological achievements they crystallize (in fact, I had a Roman date stamp at the top of my desktop for some years), but it is unquestionably impractical. Anyone who has wrestled with the simple question of determining how many days it has been since some event will immediately accept that even now, our calendar system is far from perfect; still, it is greatly simplified by at least being merely a calculation involving inconsistent radices (i.e., month and year lengths). Like Roman numerals, now deprecated for arithmetic, the Roman calendar is a clear example of how layers of history can accrete into an unwieldy and overly complex system.
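For contrast, the modern version of that question really is a single subtraction; Python's standard datetime module absorbs all the Gregorian leap-year rules internally (the dates here are mine, chosen only for illustration):

```python
from datetime import date

# "How many days between these two dates?" is one subtraction;
# the library handles every Gregorian leap-year rule for us.
elapsed = date(2024, 7, 20) - date(1969, 7, 20)
print(elapsed.days)  # 20089
```

No intercalation, no doubled days, no counting backward from markers: just positional arithmetic over those inconsistent radices.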
(And it goes without saying this is a lesson any software engineer should already know.)
People can often develop unhealthy attachments to obsolete norms. This is a phenomenon seen again and again in retrocomputing, where people have been known to irrationally claim the superiority of their favourite platform without any evidence beyond nostalgia factors. Contrariwise, some standards are simply too hard to replace, like the persistence of the customary units system in the United States. Neither of these is a reasonable justification for resisting change, though, when change is possible. In the case of Pluto, we know it is merely one of almost countless possible objects of its size, and refusing to distinguish these objects from major planets would be objectively less useful to explorers in the future, given the critical criterion of orbit-clearing behaviour that would render settlement relatively hazardous. Perhaps the choice of the word 'dwarf' is a little antiquated (the more general category of minor planet, or the Asimovian coinage mesoplanet, might be friendlier), but in the remarkable absence of any other reason to avoid reclassification, it seems to me that this is the sort of thing that is simply inevitable, and that Plutonian exceptionalism will go down in history as misguided and utterly pointless, or perhaps as an anthropomorphic meme of pity.
Then again, it was once believed that the Americans would inevitably adopt the metric system.