History of mainframe software

Christopher Tozzi, March 5

Major developments in mainframe history include:

First mainframe — By most measures, the first mainframe computer was the Harvard Mark I. Development began in the 1930s, and the machine was not ready for use until 1944. It was also enormously expensive to build; keep that in mind the next time you complain about the cost of your iPhone.

ENIAC — World War II spurred the next wave of development: the U.S. Army funded the ENIAC to perform ballistics calculations. Although the ENIAC was not actually completed until a year after the war ended, it signaled the start of heavy government investment in mainframe development.

Magnetic storage — While early mainframes relied on vacuum tubes for storing data, a major innovation came to the mainframe world with the development of what was called core memory.

In place of vacuum tubes, core memory stores information magnetically. It was no match for a modern hard disk, but core memory provided a faster and more reliable way to store and retrieve data than vacuum tubes. Core memory was first used in the early 1950s and soon replaced vacuum tubes entirely.

COBOL — Introduced in 1959, COBOL remains one of the oldest programming languages still in wide use. It beats even C, which originated in the early 1970s.

Mainframe vendors — IBM is the name most closely associated with mainframes but, historically, the commercial mainframe ecosystem was more diverse. More than a half-dozen companies, including Univac, General Electric, and RCA, also sold mainframes during the first few decades of mainframe computing.

Linux for mainframes — For the first decades of their history, mainframes ran special-purpose mainframe operating systems (see, for example, Tandem's NonStop platform), but by the late 1990s this changed, and Linux arrived on the mainframe.

Java, very popular for business software, has been ported to a number of mainframe platforms. Even so, older applications are not typically rewritten, since (1) they work fine as-is and (2) rewriting risks changing behavior and introducing new bugs. One, when mainframes were replaced, their applications were replaced with new applications written in modern languages for the new platforms.

Many an IBM mainframe has been replaced by modern Unix machines running completely new software applications. Two, those who didn't want to rewrite their applications, but did want to change platforms, relied on virtual environments to run the old software. In that case, the system did not need to be rewritten. Likely this was just a matter of modern systems and modern environments having more value to developers than COBOL could offer them at the time.

I'd venture to say that it was neither of the things you mention; I think it was what IBM termed "access methods": data structures and low-level system libraries allowing programmatic access to files, first sequential (on tapes), later random (on DASDs).

A large portion of COBOL applications have been what we would today call ETL jobs: reading files, parsing input records, manipulating them, then writing new files with a different record structure. Minicomputer operating systems offered a different, higher-level abstraction for file access, which required new programming paradigms, languages, and tools.
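The record-at-a-time, fixed-width style of those COBOL jobs can be sketched in modern terms. The following Python sketch is illustrative only: the field layout, column positions, and sample data are invented, but the shape of the job (read fixed-width records, transform each one, write records with a different layout) is the point.

```python
# Sketch of a COBOL-style ETL step: parse fixed-width input records,
# transform them, and emit records with a different fixed-width layout.
# Field positions and data below are invented for illustration.

def parse_record(line):
    # Input layout: cols 1-6 account id, 7-26 name, 27-35 balance in cents
    return {
        "account": line[0:6],
        "name": line[6:26].rstrip(),
        "cents": int(line[26:35]),
    }

def format_record(rec):
    # Output layout: account id, balance in dollars (12 wide), name (20 wide)
    dollars = rec["cents"] / 100
    return f"{rec['account']}{dollars:>12.2f}{rec['name']:<20}"

def run_job(in_lines):
    # One pass over the batch: parse, transform, reformat each record
    return [format_record(parse_record(line)) for line in in_lines]

if __name__ == "__main__":
    batch = ["000042John Smith          000012345"]
    for out in run_job(batch):
        print(out)
```

A real COBOL program would declare these layouts as record descriptions in the DATA DIVISION rather than slicing strings, but the control flow is the same kind of sequential data shoveling.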

By the time various COBOL implementations were ported to mini- and microcomputer platforms, the "native" alternatives had already established themselves, bringing about a new generation of programmers and related technologies.

COBOL is a compiled language, so you need a compiler for it. The compiler needs an operating system. The operating system needs a mainframe computer. The mainframe computer needs field engineers (FEs) to maintain it, electricity to run it, and cooling in a special room with a raised floor to permit cabling underneath.

There were many mainframes that ran COBOL. There were several operating systems that ran COBOL, although only two main classes of those: "Big OS" and DOS. There were several compilers with varying completeness of features.

I'll have to disagree on that one. It's almost unused today, everyone having gone over to relational (SQL-style) databases.

Even the cheap SATA hard drives of today have significant cache and transfer rates that would leave the old mainframe systems standing.

JCL, the job control language. Roughly equivalent to bash? Probably substantial chunks of a typical business application could be written in this?

DB2, a relational database. This actually seems less likely to be a problem, partly because it was first released only in 1983, and partly because there are other relational databases, and the effort of porting between them, while nontrivial, would be less than if you had to port to a different kind of database.

IMS, a pre-relational database. This seems much more likely to be a problem, partly because it goes back to the 1960s and partly because the effort of porting from it to a relational database would be correspondingly greater. Other major software components that I'm not aware of?

Could it be the time it would take to double-check all corner cases, verifying that all output data really becomes what you expect, that makes them refrain from leaving a perfectly working system?

@UncleBod Sure, for some customers, the safest approach of all, stick with the mainframe, was and is compelling. But then again, there have been many customers for whom "spend a zillion dollars rewriting it all in Java, with uncertain prospects of success" looked like the least bad option.

Surely for at least some of them, recompiling the COBOL on a different platform would have been more appealing had it been an available option.

Mainframes are very, very good at what they do. COBOL is very efficient at moving data around and doing simple calculations, which is what many business applications need to do.

The only reason moving away from mainframes is even considered is that they are expensive. But then, so are supertankers. Add CICS to your list.

This was a problem created entirely by industry marketing strategies and personal career goals.

No, they are not and they were not.

For completeness: JCL, the job control language. Well, yes, as it's about controlling jobs, but not so much as programming. Think of it as a framework for data shoveling. This seems much more likely to be a problem. IMS is more of a runtime environment and transaction system than a database. – Raffzahn

Build it on the PC, test locally, then ship it to the mainframe to compile and do final testing there.

@Raffzahn: any idea how much this was used? After all, coding is an interactive process, and a supportive local environment improves processes a lot.

I was a die-hard terminal user all my life, or rather, as long as I was using our own IDE, which was a tight fit to our coding and testing process.

But I did see and enjoy the advantage a local PC environment has over standard development processes on mainframes. So for other projects I did switch to PC use, especially as supportive PC-based IDEs became more capable during the 1990s, while mainframe-based tools did not evolve. – Raffzahn

Ahhh, it sounds as if the practice of running JavaScript on both frontend and backend has an ancient root.

@Schezuk JCL does not usually handle data itself, but defines the environment for programs to run in: assigning input and output files, requesting volumes, managing import or export to robot archives, and so on. From today's POV, JCL jobs are more like config and parameter files.
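To make that "config and parameter file" role concrete, here is roughly what a minimal JCL job looks like. This is a hedged sketch from memory, not production JCL: the job name, program name (COPYPROG), and dataset names are invented. Each DD statement binds a logical file name to a concrete dataset; the program itself only ever refers to the logical names, so the same program can be pointed at different files by editing the JCL alone.

```
//PAYJOB   JOB (ACCT),'NIGHTLY EXTRACT',CLASS=A,MSGCLASS=X
//STEP1    EXEC PGM=COPYPROG
//*  INFILE/OUTFILE are the logical names the program opens;
//*  the DD statements map them to actual datasets.
//INFILE   DD DSN=PROD.CUSTOMER.MASTER,DISP=SHR
//OUTFILE  DD DSN=PROD.CUSTOMER.EXTRACT,
//            DISP=(NEW,CATLG,DELETE),
//            SPACE=(CYL,(10,5)),
//            DCB=(RECFM=FB,LRECL=80)
//SYSOUT   DD SYSOUT=*
```

Nothing here computes anything: it allocates resources and wires up files, which is why the comparison to bash only goes so far and the comparison to a config file fits better.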

The workhorse mainframe computers that met these demands in turn reshaped how businesses operate, increasing centralization and nourishing new demand. In addition to industries like banking, insurance, and healthcare, the mainframe found another enthusiastic customer in the 1960s: NASA.

Mainframes played an important role in space travel, helping NASA solve complex computational problems for spaceflight, as documented in the 2016 film Hidden Figures.

In the early years, mainframes had no interactive interface; they accepted data and programs only on punched cards, paper tape, or magnetic tape. By the early 1970s, mainframes had acquired interactive computer terminals and supported multiple concurrent online users along with batch processing.

For decades, mainframes were without question the best option for massive data processing at big organizations. But as other computer technologies advanced and personal computers became more pervasive and popular, analysts began, prematurely and inaccurately, to predict the end of the mainframe. The mainframe has continued to thrive in the modern era, evolving to meet changing business needs and concerns, consistently revered for its stability and reliability.


