Lightning Labs reveals Bitcoin AI tools.

Add this to the list of tasks that artificial intelligence (AI) is now capable of: Sending bitcoin.

AI applications like OpenAI’s GPT series can now hold, send and receive bitcoin (BTC) using a suite of new tools that Lightning infrastructure firm Lightning Labs unveiled on Thursday.

The AI industry exploded in popularity after the viral debut of OpenAI’s ChatGPT last November. OpenAI is an artificial intelligence research company, and ChatGPT is one of its advanced chatbots, which garnered more than 100 million users a mere two months after launching. ChatGPT is a large language model (LLM) – a piece of software that is trained on large data sets and can then generate human-like text in response to user prompts.

Read more: MicroStrategy’s Saylor Integrates Bitcoin Lightning Address Into Corporate Email

Lightning Labs says one glaring problem with current LLMs is the lack of a native Internet-based payment mechanism for them to use. This forces AI platforms to rely on outdated payment methods like credit cards and pass on the costs of using such methods to end users, limiting use cases and reducing general access to AI software.

Enter Lightning – a second-layer payment network for cheaper and faster bitcoin transactions.

Lightning Labs says bitcoin is the Internet’s native currency, and the company has built tools that integrate high-volume bitcoin micropayments via Lightning with popular AI software libraries like LangChain. Incorporating Lightning into popular LLMs will not only make software deployment cheaper, it will also increase the depth of possible use cases for AI, according to the company.

“We are in the realm of enabling use cases that weren’t previously possible,” Lightning Labs CEO Elizabeth Stark said.

One interesting use case floated by the firm is the ability to create software that can charge for application programming interface (API) access. APIs allow different pieces of software to communicate.

In a post by Lightning Labs, the authors give an example of a piece of AI software, or agent, that queries another agent on a paid basis. The querying agent would be designed to pay for API access to the agent being queried, but additional payments would go through only after a satisfactory response was provided.

“A user can sell a prompt by gating access to an API capable of responding to queries,” the post states. “Potential buyers can then ask their own local agent to evaluate the response given a set of criteria. If the agent approves of the response, then further responses can be purchased.”
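The pay-per-query loop described in the post can be sketched in a few lines of Python. This is a toy simulation under assumed details: the class names, the flat 10-satoshi price, and the "any non-empty response is satisfactory" criterion are all hypothetical, and the wallet is an in-memory stand-in rather than a real Lightning node or any Lightning Labs library.

```python
"""Toy sketch of the pay-per-query agent flow: pay, evaluate, keep buying
only while responses are satisfactory. All names and prices are hypothetical."""
from dataclasses import dataclass

PRICE_SATS = 10  # assumed flat per-query price


@dataclass
class MockLightningWallet:
    """Stand-in for a Lightning wallet; just tracks a satoshi balance."""
    balance_sats: int

    def pay(self, invoice_sats: int) -> bool:
        if self.balance_sats >= invoice_sats:
            self.balance_sats -= invoice_sats
            return True
        return False  # insufficient funds


class SellingAgent:
    """The agent gating its responses behind a paid API."""

    def answer(self, prompt: str, paid: bool):
        if not paid:
            return None  # analogous to refusing an unpaid request
        return f"response to: {prompt}"


class BuyingAgent:
    """The querying agent: pays per request and evaluates each response."""

    def __init__(self, wallet: MockLightningWallet):
        self.wallet = wallet

    def evaluate(self, response) -> bool:
        # Hypothetical criterion: any non-empty response is satisfactory.
        return bool(response)

    def buy_responses(self, seller: SellingAgent, prompts: list) -> list:
        results = []
        for prompt in prompts:
            if not self.wallet.pay(PRICE_SATS):
                break  # out of funds, stop querying
            response = seller.answer(prompt, paid=True)
            results.append(response)
            if not self.evaluate(response):
                break  # unsatisfactory response: buy no further responses
        return results


buyer = BuyingAgent(MockLightningWallet(balance_sats=25))
answers = buyer.buy_responses(SellingAgent(), ["q1", "q2", "q3"])
print(answers)                    # 25 sats only covers two 10-sat queries
print(buyer.wallet.balance_sats)  # 5
```

The key design point mirrored from the post is that payment precedes each response, while the buyer's local evaluation gates whether any *further* purchases happen.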

Despite ChatGPT’s phenomenal success, Michael Levin, who leads product growth at Lightning Labs, predicted in a tweet that chatbots and their user interfaces (UIs) are only the beginning. He says the lion’s share of use cases will be in yet-to-be-discovered enterprise and software-as-a-service (SaaS) applications.

“Chat UIs are just the tip of the iceberg for LLM usage,” Levin tweeted. “Ninety percent of use cases lie beyond this initial foray. The most useful products won’t be chat UIs, but SaaS/Enterprise/API products built on LLMs to uniquely solve user problems.”