
Ironically, I was just using ChatGPT to answer another seemingly simple question:

What is an example of an item that would show up in EBIT but not operating income?

> In standard financial reporting and accounting terminology, "Operating Income" and "EBIT" (Earnings Before Interest and Taxes) are typically considered to be the same thing. They both represent a measure of a company's profitability from its core operating activities, and they exclude non-operating income and expenses...

It's an answer, stated as fact, that is also totally wrong. When I went to Google, I found the correct answer:

"The key difference between EBIT and operating income is that operating income does not include non-operating income, non-operating expenses, or other income."

After pressing ChatGPT further, I ultimately got it to spit out the correct answer:

"I apologize for any confusion in my previous responses, and you are correct. In standard financial reporting, EBIT does include non-operating income, whereas operating income does not."

ChatGPT is great in some cases, and in other cases it sends you down random tangents, references things that either don't exist or are misleading, or just flat-out lies. That is a REAL time waster, because you might end up proceeding based on ChatGPT's incorrect explanation, only to find out later that you have to re-learn the concept correctly.



Are you using GPT-4 or ChatGPT? I only trust GPT-4, and I have a fairly good idea of its 'limits', so to speak (don't ask it to walk step by step through a decision-tree question, for instance, but if you generally want to know about xgboost or lgbm, it's good). I agree that if you don't know what ChatGPT can be trusted on and what it can't, it frustrates the whole process again.



