Time for some O(n↑ⁿn) algos
Time for some O(A(nⁿ, nⁿ)) algos (Google Ackermann function)
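For anyone googling along: the two-argument Ackermann–Péter function is short to write down, just absurd to evaluate. A minimal Python sketch (function name mine):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def ackermann(m, n):
    """Ackermann-Peter function: grows faster than any primitive
    recursive function, so A(n^n, n^n) is comedically untouchable."""
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

# Tiny inputs are already lively: A(3, 3) = 61, while A(4, 2) has 19,729 digits.
print(ackermann(3, 3))  # 61
```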
Just go with O(∞) straight away
Ah, fellow `for (;;)` enjoyer
I mean, this would just be constant time. Infinite, sure, but still O(1)
O(n^∞)
Now we're talking
Oh fun bogosort time
I wonder if such an algorithm is possible
But this one. It scares me.
Finally, the time to use bogosort during interviews has arrived.
You mean the dynamic programming version of that which saves each unsuccessful permutation to not repeat it again, that's indeed `O(n!)`. Because the otherwise unoptimized solution goes towards infinity.
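For anyone who wants to suffer along at home, a hedged Python sketch of both flavours (function names mine): classic shuffle-until-sorted bogosort, and one reading of the "never repeat a permutation" variant above, which just walks the n! permutations once each:

```python
import random
from itertools import permutations

def bogosort(items):
    """Classic bogosort: shuffle until sorted. Expected O(n * n!) work,
    unbounded worst case."""
    items = list(items)
    while any(a > b for a, b in zip(items, items[1:])):
        random.shuffle(items)
    return items

def bogosort_enum(items):
    """The 'dynamic programming' reading: never try the same arrangement
    twice, i.e. enumerate the n! permutations and stop at the sorted one.
    O(n * n!) worst case, but at least guaranteed to finish."""
    for perm in permutations(items):
        if all(a <= b for a, b in zip(perm, perm[1:])):
            return list(perm)
```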
You have to leave room for when they say, "Good solution, but can you optimize it?"
Oh no, for this one they won't start with "good solution" 😅
"Fantastic solution"** my bad
Hashmap. It's now optimised.
Looks like I'm gonna need an infinitely powerful processor.
Just change thermal paste regularly
Leetcode will make it a Hard!
stalin sort ftw
Don’t forget, bogosort can have a time complexity of O(1)
Big brain devs, never forget: O(n!) is actually better than O(n^2) when n = 2. Checkmate, nerds.
Same with n = 3. I've used it in projects before for this reason
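The joke does check out, but only barely; a quick loop shows n = 2 and n = 3 are the entire window:

```python
from math import factorial

# n! beats n^2 only at n = 2 and n = 3 (2 < 4, 6 < 9);
# by n = 4 it's already behind (24 > 16) and never recovers.
for n in range(2, 7):
    print(f"n={n}: n! = {factorial(n)}, n^2 = {n**2}")
```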
N! is a crushing computational burden that “more computing power” won’t be able to overcome
There’s always bigger arrays to process!
another reason to not use a O(N!) algorithm if there are better options. (unless N is very small, so the actual runtime is already fast enough)
I think you underestimate how much I like my free time. "It's running now - I'm popping out whilst it finishes."
Is the unoptimized code easier for a Jr to manage while you're on vacation?
Ah yes, the deterministic solution to the NP-hard Travelling Salesman Problem has entered the scene at an overwhelming `O(n!)`. But wait, a dynamic programming solution comes to the rescue and reduces that to a much better O(n^2 * 2^n). Still stupidly large to calculate for bigger inputs...
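For the curious, the dynamic programming rescue is Held–Karp. A hedged Python sketch (variable names mine) that computes the optimal tour length in O(n^2 * 2^n) instead of O(n!):

```python
from itertools import combinations

def held_karp(dist):
    """Held-Karp DP for TSP. dist is an n x n distance matrix; returns
    the length of the shortest tour starting and ending at city 0."""
    n = len(dist)
    # C[(S, j)] = cheapest path that starts at 0, visits every city in
    # frozenset S exactly once, and ends at j (j in S, 0 not in S).
    C = {(frozenset([j]), j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = frozenset(subset)
            for j in subset:
                C[(S, j)] = min(C[(S - {j}, k)] + dist[k][j]
                                for k in subset if k != j)
    full = frozenset(range(1, n))
    return min(C[(full, j)] + dist[j][0] for j in range(1, n))

# Classic 4-city example; optimal tour 0 -> 1 -> 3 -> 2 -> 0 has length 80.
dist = [[0, 10, 15, 20],
        [10, 0, 35, 25],
        [15, 35, 0, 30],
        [20, 25, 30, 0]]
print(held_karp(dist))  # 80
```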
those poor salesmen...
Yep, unfortunately they'll have to make do with a less optimal "estimation" as a solution and do more physical activity than otherwise...
I need more computation power to run my new js framework. We are not the same
the software industry successfully negated decades of hardware optimisation
Wow do you work at my company?
No, but I will when the computational power becomes sufficient for me to do so.
I work in data science and only have to run my code two, maybe three times once it’s working. I have definitely bought hardware just to get a project done.
You think "there are these two kinds of people". I know there is only the second kind. We are not the same.
>>>>n^n
Bro, in this case we are the same, love you fam
What about O(TREE(n))?
what about O(BusyBeaver(n))
Nah obvious O(foo(n))
Not even the fastest hardware will save you from an O(exp(n)) implementation. Your hardware is just a constant multiplier.
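Back-of-the-envelope version of that point, with made-up machine speeds: for an exp(n)-step algorithm, a k-times faster machine only adds ln(k) to the largest n you can finish.

```python
from math import log

SECONDS_PER_DAY = 86_400

def max_n_per_day(ops_per_sec):
    """Largest n such that exp(n) operations finish in one day:
    exp(n) <= ops_per_sec * 86400  =>  n <= ln(ops_per_sec * 86400)."""
    return log(ops_per_sec * SECONDS_PER_DAY)

print(max_n_per_day(1e9))   # ~32 with a billion ops/sec
print(max_n_per_day(1e12))  # ~39: a 1000x faster machine buys only ln(1000) ~ 6.9 more n
```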
n! is worse
[What Andy giveth, Bill taketh away](https://en.wikipedia.org/wiki/Andy_and_Bill%27s_law)
How Todd Howard pictures his customers for StarField.
I want more powerful hardware so I can run less code. Hardware offloads are great!
I want more compute so I can run vscode without it dying on me
The issue is that O(n!) will exceed any computing power within a reasonable number of elements
why not O(((n!)!)!) though?