The very earliest days of computing "were heady, indeed," as the whole world realized that "punched-card tabulators governed by wired plugboards" could be replaced by ... "software!" 😲
Very quickly, many different computer manufacturers jumped into the ring – most of them long forgotten: Honeywell, Sperry Rand, and others. (And one, IBM, whose continuity is maintained only by its brand name, not by its hardware of the time.)
And from this came yet another realization: "that there needed to be 'a basement line.'" Above the line would be "computer programming as an abstract endeavor." Below the line would be various constantly-evolving hardware implementations. And, sitting precisely on that line, there would be a brand-new thing: "programming languages."
Henceforth, computer programmers would not write their software in "architecture-specific" ways. Instead, they would write for standardized "architecture-abstract" models, trusting that a "compiler" would exist which could translate their ideas into architecture-specific executable code on whatever platforms there might be.
As expected, there were many early candidates – FORTRAN, ALGOL, LISP – but one was very decidedly different: COBOL.
"So, What's Special About COBOL?"
COBOL (an obligatory acronym, as was the custom of the time, for "COmmon Business-Oriented Language") was the first organized effort to create "a standard programming language for the United States Government." Of course it was an ambitious effort, and it created what is now perhaps considered a remarkably wordy language. ("ADD X TO Y GIVING Z ON SIZE ERROR PERFORM ERROR-PROC." You be the judge.)
Nonetheless, there was a "lost wisdom(!)" in their approach: they split the source code into four parts – "IDENTIFICATION DIVISION. ENVIRONMENT DIVISION. DATA DIVISION. PROCEDURE DIVISION." Emphasis: "DIVISION."
Superficially, the "wisdom" might appear to be simply that "they specified the correct environment in which this piece of software was intended to be run" – the language does define four separate DIVISIONs. But there is a much deeper wisdom in the DATA DIVISION.
The Wisdom Of PICture:
Buried within the DATA DIVISION, we find endless repetitions of the PICture clause, which are very easily overlooked. But here we find a very specific declaration of how the value of each variable is to be handled: whether the value is to be binary floating-point, or "binary-coded decimal (BCD)." And, if decimal, precisely how many digits there are and where the decimal point falls. The PICture clause also specifies how a value is to be edited for printing.
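To make the idea concrete for readers who have never seen a PIC clause: a declaration like "PIC 9(5)V99" means up to five integer digits, an implied decimal point, and exactly two decimal digits. Here is a minimal sketch (in Python, not COBOL – the function name and SIZE-ERROR handling are illustrative assumptions) of what such a clause pins down:

```python
# Sketch of what a COBOL clause like "PIC 9(5)V99" declares:
# five integer digits, an implied decimal point, two decimal digits.
# Python's decimal module plays the role of BCD arithmetic here.
from decimal import Decimal, ROUND_HALF_UP

TWO_PLACES = Decimal("0.01")  # analogous to the V99 part of the PICture

def pic_9_5_v99(value: str) -> Decimal:
    """Coerce a value into the fixed 9(5)V99 shape, rounding half-up."""
    d = Decimal(value).quantize(TWO_PLACES, rounding=ROUND_HALF_UP)
    if d >= Decimal("100000"):            # more than five integer digits
        raise OverflowError("SIZE ERROR") # roughly COBOL's ON SIZE ERROR
    return d

print(pic_9_5_v99("123.456"))  # -> 123.46: excess digits rounded away
```

The point is not the Python mechanics but that, in COBOL, this shape is declared once, in the DATA DIVISION, rather than being an implicit property of whatever arithmetic the code happens to perform.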
"Very simply stated, no other programming language ever did this." Most of them never even acknowledged the existence of BCD arithmetic – and therefore never considered its impact upon Dollars and Cents!
"Dollars And Cents" ...
When accountants want to "verify" a computer's output, they invariably turn to desktop calculators and their paper tapes. They add up their columns of numbers ... digitally, decimal digit by decimal digit. And they perform their multiplications and their divisions likewise. Whereas most computer algorithms instead behave like "scientific calculators," computing in binary floating-point. The end result is that, after you total up a column of figures, you may well find yourself "off by ±1 cent." Which happens to be a monumentally big deal to any accountant.
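The effect is easy to reproduce. Here is the classic demonstration (a Python sketch; COBOL's BCD arithmetic behaves like the `Decimal` column below):

```python
# Summing ten dimes: binary floating-point vs. digit-by-digit decimal
# arithmetic (what a desk calculator -- or a COBOL program -- does).
from decimal import Decimal

column = ["0.10"] * 10          # ten dimes: exactly one dollar

float_total   = sum(float(x) for x in column)
decimal_total = sum(Decimal(x) for x in column)

print(float_total)    # 0.9999999999999999 -- NOT 1.0
print(decimal_total)  # 1.00 exactly
```

Binary floating-point cannot represent 0.10 exactly, so the tiny per-item error accumulates; decimal arithmetic carries the cents exactly, just as the paper tape does.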
The COBOL language – uniquely – provides explicit control of numeric precision, "rounding" handling, and other such fundamental concerns throughout the entire program, because it specifies all such things in an entirely separate DIVISION.
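COBOL makes the programmer say, per statement, how results are rounded (ROUNDED) and what happens on overflow (ON SIZE ERROR). The same kind of explicit control can be sketched with Python's decimal rounding modes – the mode names below are Python's, not COBOL syntax:

```python
# Explicit, declared rounding rather than whatever the hardware does.
from decimal import Decimal, ROUND_HALF_UP, ROUND_HALF_EVEN

cent = Decimal("0.01")
price = Decimal("2.665")   # an exact decimal tie, halfway between cents

print(price.quantize(cent, rounding=ROUND_HALF_UP))    # 2.67 -- "schoolbook"
print(price.quantize(cent, rounding=ROUND_HALF_EVEN))  # 2.66 -- "banker's"
```

Which answer is "correct" is a business decision, not a hardware accident – which is exactly why a business-oriented language made it something the programmer must state.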
So far as I am aware, no other programming language ever had the fundamental concept of "DIVISIONs."