For more information on the robots.txt file used to control bot access to web sites, see https://www.theverge.com/24067997/robots-txt-ai-text-file-web-crawlers-spiders . That article does not address a new standard for controlling AI access to APIs, as I discussed in "What is the Meaning of This?", but it is good background.
See also https://mamund.substack.com/p/what-is-alps by Mike Amundsen on ALPS (Application-Level Profile Semantics). ALPS is (in the words of an AI :-) :
> a format created by Mike Amundsen for describing the semantics, or meaning, of an application's data and actions in a simple and reusable way.
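To make that concrete, here is a minimal sketch of what an ALPS profile document can look like. The top-level `alps`/`version`/`descriptor` shape follows the ALPS media type; the "contact service" domain and the descriptor ids (`fullName`, `email`, and so on) are invented for illustration, not taken from any real profile:

```python
import json

# A hypothetical ALPS profile for a simple contact service.
# "semantic" descriptors name data elements; "safe"/"unsafe"
# descriptors name read-only vs. state-changing actions, so a
# client (human or AI) can learn what the API *means*, not just
# what shapes its payloads have.
profile = {
    "alps": {
        "version": "1.0",
        "doc": {"value": "Profile for a simple contact service."},
        "descriptor": [
            {"id": "fullName", "type": "semantic",
             "doc": {"value": "A person's full name."}},
            {"id": "email", "type": "semantic",
             "doc": {"value": "A contact email address."}},
            {"id": "collection", "type": "safe",
             "doc": {"value": "Retrieve the list of contacts."}},
            {"id": "create", "type": "unsafe",
             "doc": {"value": "Add a new contact."}},
        ],
    }
}

print(json.dumps(profile, indent=2))
```

The point of the format is that the same small vocabulary of descriptors can be reused across representations (JSON, XML, HTML) and across services that share a domain.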
Here's a comment from TallVenti (@TallVenti@mastodon.world) which complements my thoughts on using AI to write API client code: https://fosstodon.org/@TallVenti@mastodon.world/111778694924021266
> So… using #AI to write me some integration code based on a well documented #API.
> It just made it up. The code was reasonably complete. Just wrong.
> I knew it was wrong. I had checked the documentation previously.
> And I have clients telling me how to do stuff based on their questionable #AI research.
> It’s bonkers.
> Can you #AI fanboys come back to me when it’s finished.
(this was in response to https://apidesignmatters.substack.com/p/what-is-the-meaning-of-this )
Check out Keith Casey's take on this topic, also part of #APIFutures: https://caseysoftware.com/blog/how-chatgpt-will-solve-all-api-problems-except-yours . Notably (Keith put this more elegantly than I did):
> In short – just like humans – an AI can’t build with what it can’t understand.