Hello all,
Today, with multi-core CPUs running at gigahertz speeds attached to gigabytes of RAM and terabytes of disc space, we're well used to running programs that require what would have been enormous resources back when mainframes ruled the data centre.
My question is "how much did one second of computer time cost in 1975?" This assumes the program in question is being run on a mainframe from IBM or any of the other manufacturers of the day. Of course, there are a lot of factors to consider: leasing costs, staffing, power and cooling, whether the system could run more than one program simultaneously, and how computer time was charged back to the users.
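To make the arithmetic concrete, here's a toy chargeback calculation in Python; every number in it is a made-up placeholder, and the real 1975 figures are exactly what I'm hoping someone can recall:

    # Back-of-envelope chargeback sketch. All inputs are hypothetical
    # placeholders, not real 1975 figures.
    monthly_lease = 50_000.0         # mainframe lease, $/month
    monthly_staff = 20_000.0         # operators and systems programmers, $/month
    monthly_power_cooling = 5_000.0  # machine-room overhead, $/month
    total_monthly_cost = monthly_lease + monthly_staff + monthly_power_cooling

    # Assume 24x7 operation and that multiprogramming keeps the CPU
    # usefully busy (hence billable) 60% of the time.
    billable_cpu_seconds = 30 * 24 * 3600 * 0.60
    print(f"${total_monthly_cost / billable_cpu_seconds:.3f} per CPU-second")
    # prints $0.048 with these inputs

Even this toy version shows why multiprogramming matters so much: halve the utilization and the rate per billable second doubles.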
I'm interested to know if anyone on this list has had experience with this sort of system accounting, and if they can recall some numbers.
Regards, Brian
There are many people here who used the UofM mainframe back when it was operated on a chargeback system (including me, sadly, although my Dept paid the bill, not me). Of course I didn't keep a copy of the rate sheet, so I don't remember what one second cost. It wouldn't quite be the fully-loaded cost you describe, but it would be close - I don't think Computer Services was expected to turn a profit back in 1991. -Adam
Whoops. Definitely not 1975. I know most systems *were* operated on a chargeback basis then, but I didn't even know computers existed in '75. Maybe email the docents at the Computer History Museum? -Adam
I remember being told in 1987 that it was nanosecond billing, not per-second, for TSO services with IBM here. But I don't know the rate.
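In other words, the nanosecond part is the metering granularity, not the price unit. A toy illustration with a made-up rate:

    # Toy illustration: nanosecond-granularity metering, hypothetical rate.
    rate_per_cpu_second = 0.10   # made-up $/CPU-second, not a real figure
    metered_ns = 3_217_406_118   # CPU time consumed, metered in nanoseconds
    charge = metered_ns * 1e-9 * rate_per_cpu_second
    print(f"${charge:.4f}")      # $0.3217 for ~3.2 CPU-seconds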
Roundtable mailing list Roundtable@muug.ca https://muug.ca/mailman/listinfo/roundtable