As the author states, traditional data processing was centralized: all processing power in the organization resided on one large computer, or on a cluster of large computers, within a central data processing facility. The term "computing center" derives from this arrangement.
The term centralized applies not only to the computers but also, naturally, to the processing, the data, the control, and the support.
Two of the primary reasons for this were cost and architecture. In the past, computing power was expensive. As computer scientists are fond of pointing out, the average $500 desktop computer has more power than a $1,000,000 computer did N years ago (you pick N; it varies, but as of 2005, I'd say N is roughly between 20 and 30 years).
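The price/performance claim above can be sanity-checked with a quick back-of-the-envelope calculation. The two-year doubling period below is a common Moore's-law-style assumption chosen for illustration, not a figure from the text:

```python
def price_performance_gain(years, doubling_period=2.0):
    """Factor by which the cost of a fixed amount of computing power
    falls over `years`, assuming it halves every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# With N = 25 (the midpoint of the 20-30 year range above):
gain = price_performance_gain(25)
print(f"Improvement factor over 25 years: {gain:,.0f}x")

# What the power of a $1,000,000 machine from 25 years ago would cost now:
print(f"Equivalent cost today: ${1_000_000 / gain:,.2f}")
```

Under this assumption the old million-dollar machine's power costs well under $500 today, which is consistent with the claim; with a shorter doubling period the gap only widens.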
The architectural reason was that these computers were physically very large, consumed enormous amounts of power, and generated enormous amounts of heat. I've heard tales of an IBM 360 (a large mainframe) being hooked up to provide the winter heat for a particular government building. I doubt that the desktop PC that could replace it would adequately heat the same building, though. Regardless, their size, power consumption, and cooling requirements necessitated large rooms that were air-conditioned, had false floors (so that cabling could be run underneath), and often had special power conditioners and uninterruptible power supplies. All of this cost money, so it was natural to concentrate the computing power in one place so that only one such room needed to be built.
Economies of scale were present because one computer, or a few, could serve the needs of hundreds of users. Furthermore, a small group of good programmers could serve those same users. Since users interfaced with the computer through some kind of terminal connected in a star fashion (see the prior chapter's example), all interfaces were controlled centrally. There was no customization of a desktop; rather, the programming staff created the interface, which was identical for every user. Although the interface itself was duplicated on every terminal, the programming effort behind it was not, which kept it consistent. Likewise, this made it easy for the programming and support staff to enforce standards and security.
One last note: I may have implied here that the centralized form of processing is obsolete and no longer used. That isn't necessarily true, and I don't mean to imply it. Large-scale mainframe computers still have their place in certain environments, and they could certainly "make a comeback" at some point in the future if some advantage they hold over distributed data processing facilities becomes economically important.