Dr. Adam Hart
Nov 8, 2024


Thank you, that would be wonderful. If it is indeed possible to create a mimicry of a "conscience" for code, that would be something special. I worry that we conflate our interpretation of a code's symbolic outputs with the code itself being intelligent, as Landgrebe and Smith point out. I think we have yet to move beyond Asimov, tbh.

The Loebner Prize was abandoned when ChatGPT came along, which was a shame; I felt Steve Worswick's chatbot was more human somehow! But that doesn't mean any of it is intelligent. And does it have to be? If code works safely, does what it says on the tin, and behaves in a predictable manner, then that's the job done.

If you watch the Writing Doom video above, they refer indirectly to the paperclip problem: a code base given access to all resources to optimise the production of paperclips ends up starving the world of those resources (https://cepr.org/voxeu/columns/ai-and-paperclip-problem). That is not only unconscionable but stupid. How we get code to learn common sense is an intriguing question.

I think the bad reaction to Senate Bill 1047 shows me this has more to do with what I am writing about in this essay (companies trying to justify their astronomical share valuations with the next hyped innovation) than with what is useful (like eradicating polio; good on Bill Gates). Much of the useful work is using pattern detection to further science (https://www.nobelprize.org/prizes/chemistry/2024/press-release/), but the cost to get this far has been extraordinary, like the 1969 moonshot. How did landing on the moon help the bulk of humanity?

Thank you for your comment. Kind Regards. Adam



