Etherscan releases AI Code Reader.
On June 19, Etherscan, an Ethereum block explorer and analytics platform, released a new tool called “Code Reader” that uses artificial intelligence to retrieve and interpret the source code of a specific contract address. After the user inputs a prompt, Code Reader generates a response using OpenAI’s large language model (LLM), providing insight into the contract’s source code files. The Etherscan developers wrote:
“To use the tool, you need a valid OpenAI API Key and sufficient OpenAI usage limits. This tool does not store your API keys.”
Code Reader can be used to gain a deeper understanding of contracts’ code through AI-generated explanations, obtain comprehensive lists of smart contract functions associated with Ethereum data, and understand how the underlying contract interacts with decentralized applications (dApps). The developers added that “once the contract files are retrieved, you can choose a specific source code file to read through. Additionally, you may modify the source code directly inside the UI before sharing it with the AI.”
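For readers curious about the underlying pattern, the sketch below is a minimal illustration, not Etherscan's own implementation: it assumes Etherscan's public getsourcecode endpoint and OpenAI's chat completions API, with the contract address, prompt, and model choice as placeholders.

```python
# Illustrative sketch only: NOT Etherscan's Code Reader implementation, just the
# workflow the developers describe -- fetch a contract's verified source code,
# then ask an OpenAI model about it using the user's own API key.
import requests

ETHERSCAN_API = "https://api.etherscan.io/api"
OPENAI_API = "https://api.openai.com/v1/chat/completions"

def fetch_contract_source(address: str, etherscan_key: str) -> str:
    """Retrieve a contract's verified source code from the Etherscan API."""
    resp = requests.get(ETHERSCAN_API, params={
        "module": "contract",
        "action": "getsourcecode",
        "address": address,
        "apikey": etherscan_key,
    })
    resp.raise_for_status()
    return resp.json()["result"][0]["SourceCode"]

def explain_contract(source: str, prompt: str, openai_key: str) -> str:
    """Ask an OpenAI chat model to answer a prompt about the source code."""
    resp = requests.post(
        OPENAI_API,
        headers={"Authorization": f"Bearer {openai_key}"},
        json={
            "model": "gpt-3.5-turbo",  # hypothetical choice; the article names no model
            "messages": [
                {"role": "system", "content": "You explain Solidity smart contracts."},
                {"role": "user", "content": f"{prompt}\n\n{source}"},
            ],
        },
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

# Example usage (address and keys are placeholders):
# source = fetch_contract_source("0x...", "<ETHERSCAN_KEY>")
# print(explain_contract(source, "What do this contract's public functions do?", "<OPENAI_KEY>"))
```

Note that, as with Code Reader itself, the OpenAI key in this sketch is supplied by the user per request rather than stored.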
- Tel Aviv Stock Exchange allows nonbank customers to trade crypto.
- Two hackers arrested in France.
- $70M in BTC and ETH donated to Ukraine.
- Proposal to raise the stake limit per Ethereum validator from 32 Ether to 2,048 Ether.
- Cyberport in Hong Kong attracted 150 Web3 firms in a year.
Decentralized networks face steep hurdles in training large AI models.
Despite growing demand for training large AI models on decentralized, distributed computing networks, researchers say that current prototypes face significant constraints, including complex data synchronization, network optimization, and data privacy and security concerns. In one example, Foresight researchers noted that merely storing the parameters of a large model with 175 billion parameters in single-precision floating point would require around 700 gigabytes of memory. Distributed training, however, requires these parameters to be frequently transmitted and updated between computing nodes. With 100 computing nodes, each needing to update all parameters at each unit step, the model would require the transmission of 70 terabytes of data per second, far exceeding the capacity of most networks. The researchers summarized:
“In most scenarios, small AI models are still a more feasible choice, and should not be overlooked too early in the tide of FOMO on large models.”
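The bandwidth figure follows from straightforward arithmetic: 175 billion single-precision (4-byte) parameters occupy about 700 GB, and a full copy exchanged by each of 100 nodes amounts to roughly 70 TB per synchronization step; the cited per-second figure corresponds to about one such update per second. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the Foresight researchers' figures
# (decimal GB/TB, as the cited numbers imply).
params = 175e9          # parameters in the example large model
bytes_per_param = 4     # single-precision (FP32) floats are 4 bytes each

param_memory_gb = params * bytes_per_param / 1e9
print(param_memory_gb)  # 700.0 -> ~700 GB just to hold the parameters

nodes = 100             # computing nodes in the example
# If every node exchanges a full copy of the parameters at each unit step:
traffic_tb = param_memory_gb * nodes / 1e3
print(traffic_tb)       # 70.0 -> ~70 TB of traffic per update step
```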