
Years ago I transitioned from a developer role to a manager role, and suddenly I had to do a lot more talking. Not all of that talking needs to be a deeply involved exchange of complex ideas; a lot of it serves a different purpose. Sometimes it can be as simple as filling up the time in a pleasant way with a group of people who may or may not know each other that well.

After getting some experience with this, I noticed that I had developed a talking on/off button in my head. I could simply turn it on and start talking, generating words that sounded good together and fit the purpose of the moment. But they seemed to come from a different place in my brain than my conscious mind, which was not involved in the process at all. The only job my conscious mind had was to turn the button off again at the right moment; for the rest, it was free to think whatever it wanted.

(I transferred back to development a couple of years later.)



The very fact that we can be conscious of our ability to generate language, and operate it separately from the rest of our consciousness, tells me that chatbots might emulate our language-generation ability. But we shouldn't invert that to conclude they can reason about what they generate.


> We shouldn't invert it to conclude that means they can reason about it.

Agreed. However, I think it's a somewhat accepted view that the bulk of the work constituting reasoning happens subconsciously, with the conscious mind playing the role of a censor/gatekeeper, occasionally hand-holding when reasoning through tougher problems.



