I decided to compare n³ and n⁴ running times to get a feel for what it means to have a slow algorithm.
Suppose we have an algorithm with cubic time complexity:
int n = 2500;
int[] arr = new int[n];
System.out.println("Start");
long st = System.currentTimeMillis();
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        for (int k = 0; k < n; k++) {
            arr[k]++;
        }
    }
}
System.out.printf("%dms\nDone!", System.currentTimeMillis() - st);
For n = 2500, it completed in 1.827 seconds on a modern i5 CPU:
Start
1827ms
Done!
Now consider an algorithm with quartic (n⁴) time complexity:
int n = 2500;
int[] arr = new int[n];
System.out.println("Start");
long st = System.currentTimeMillis();
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        for (int k = 0; k < n; k++) {
            for (int l = 0; l < n; l++) {
                arr[l]++;
            }
        }
    }
}
System.out.printf("%dms\nDone!", System.currentTimeMillis() - st);
With just one additional loop, for the same n = 2500, it took:
Well, it didn't complete at first... I waited 15 minutes and still had no answer!
EDIT: It completed after 47 minutes!
Start
2863936ms
Done!
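As a sanity check on these numbers, the quartic version performs n times as many increments as the cubic one, so a back-of-the-envelope extrapolation from the measured cubic time looks like this (a sketch I added; the class name is mine, and the measured values come from the runs above):

```java
public class Extrapolate {
    public static void main(String[] args) {
        int n = 2500;
        long cubicMs = 1827;                  // measured runtime of the cubic loop
        long expectedQuarticMs = cubicMs * n; // quartic does n times the work
        System.out.println(expectedQuarticMs + "ms ~ "
                + expectedQuarticMs / 60000 + " minutes"); // prints "4567500ms ~ 76 minutes"
    }
}
```

The naive estimate (~76 minutes) and the measured 47 minutes are in the same ballpark; JIT optimizations of the innermost loop can plausibly account for the gap.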
Why is this comparison important to me?
This comparison became interesting to me because I have implemented a Tree Edit Distance algorithm whose time complexity is quartic in the input size (the number of nodes in the trees). I ran this algorithm on HTML DOM trees, which in my case have around 1000 to 2500 HTML elements.
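To make the scale concrete, here is a small sketch (my own illustration, not part of the original measurements) of the raw operation counts an O(n⁴) algorithm implies at typical DOM tree sizes:

```java
// Back-of-the-envelope operation counts for an O(n^4) algorithm
// at DOM tree sizes in the 1000-2500 node range.
public class QuarticOps {
    public static void main(String[] args) {
        for (long n : new long[]{1000, 1500, 2500}) {
            long ops = n * n * n * n; // n^4 elementary operations
            System.out.println("n = " + n + " -> " + ops + " operations");
        }
    }
}
```

Even at the low end, n = 1000 already means on the order of 10¹² operations, and n = 2500 pushes that to roughly 3.9 × 10¹³.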
Summary
Thus, an O(n⁴) algorithm is not an acceptable solution for tree edit distance on web pages with more than 1000 HTML elements.
I am looking for a better algorithm.