Ask HN: Codex is too slow. Is there any solution?
The Codex backend is good quality and the frontend is average, but most importantly it is too slow. I wonder if OpenAI will improve it.
It seems to work with fewer issues than CC Opus.
I don’t mind if it takes longer as long as the answer is correct more often.
You can always be doing other work while one chat is running.
The new 0.47 release has better performance now, imho.
Sonnet and Gemini are good and fast. Can't speak for Grok.
Grok Turbo is fast.