# general-english
Ummmm… no, there will be cases where the LLM will mess up, because LLMs weren't originally built to perform a task step by step. Sure, you can reach decent accuracy, but expecting it to follow each step 100% of the time is never going to happen. LLMs are made for generating content, not for following instructions line by line. I might sound biased, but that's my opinion.