R – How to find the running time given algorithm speed and computer speed


I'm currently working through an assignment that deals with Big-O and running times. I have this one question presented to me that seems to be very easy, but I'm not sure if I'm doing it correctly. The rest of the problems have been quite difficult, and I feel like I'm overlooking something here.

First, you have these things:
Algorithm A, which has a running time of 50n^3.
Computer A, which has a speed of 1 millisecond per operation.
Computer B, which has a speed of 2 milliseconds per operation.
An instance of size 300.

I want to find how long it takes algorithm A to solve this instance on computer A, and how long it takes it on computer B.

What I want to do is substitute 300 for n, so you have 50*(300^3) = 1,350,000,000.

Then, multiply that by 1 millisecond for the first computer, and by 2 milliseconds for the second.
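The two steps above can be sketched in a few lines of Python (the function name and structure are my own, just to make the arithmetic explicit: treat 50n^3 as an operation count, then multiply by each computer's cost per operation):

```python
def running_time_ms(n: int, ms_per_op: float) -> float:
    """Total running time in milliseconds for an input of size n,
    assuming the algorithm performs 50 * n^3 operations."""
    operations = 50 * n ** 3          # operation count from the formula 50n^3
    return operations * ms_per_op     # (ops) * (ms per op) -> ms

ops = 50 * 300 ** 3                   # 1,350,000,000 operations
time_a = running_time_ms(300, 1.0)    # Computer A: 1 ms per operation
time_b = running_time_ms(300, 2.0)    # Computer B: 2 ms per operation

print(ops)      # 1350000000
print(time_a)   # 1350000000.0 (ms)
print(time_b)   # 2700000000.0 (ms)
```

Note the units work out: operations (a dimensionless count) times milliseconds per operation gives milliseconds, not milliseconds squared.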

This feels odd to me, though, because it says the "running time" is 50n^3, not, "the number of operations is 50n^3", so I get the feeling that I'm multiplying time by time, and would end up with units of milliseconds squared, which doesn't seem right at all.

I would like to know if I'm right, and if not, what the question actually means.

Best Solution

It wouldn't make sense if you had O(n^3), but you are not using big O notation in your example. That is, if the running time were given as O(n^3), you would know the asymptotic complexity of your algorithm, but you would not know the exact number of operations, as you said.

Instead, it looks as though you are given the exact number of operations taken (even though it is not explicitly stated), so substituting 300 for n makes sense.

Big O notation describes how the size of the input affects your running time or memory usage, but with big O alone you could not deduce an exact running time, even given the speed of each operation.
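To make that point concrete, here is a small illustration (my own example, not from the question): two algorithms that are both O(n^3) can differ by a large constant factor in their actual operation counts, which is exactly the information big O discards.

```python
def ops_without_constant(n: int) -> int:
    # Hypothetical algorithm performing n^3 operations -- O(n^3)
    return n ** 3

def ops_with_constant(n: int) -> int:
    # Hypothetical algorithm performing 50 * n^3 operations -- also O(n^3)
    return 50 * n ** 3

n = 300
print(ops_without_constant(n))  # 27000000
print(ops_with_constant(n))     # 1350000000
# Same big O class, but a 50x difference in real work:
print(ops_with_constant(n) // ops_without_constant(n))  # 50
```

So the question only has a single numeric answer because it gives the exact count 50n^3 rather than the class O(n^3).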

Including a short explanation of why your answer looks so simple (as described above) would also be a safe move, but I'm sure you'll get the marks for the question even without it.
