Introduction
The annotate command-line tool annotates papers with concepts. It supports a variety of LLMs, including models from the Hugging Face Hub.
Usage
To use the tool, run the following command:
annotate <paper_url> [options]
Replace <paper_url> with the URL of the paper you want to annotate.
Options
--keywords : Specify a list of keywords to search for in the paper.
--progress : Control whether a progress bar is shown (default: True).
--LLM : Set the LLM to use. This can be a model from the Hugging Face Hub and can also be set via ENV parameters.
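For example, a hypothetical invocation that searches a paper for two keywords with the progress bar disabled might look like this (the exact flag and value syntax is assumed here and may differ):
annotate https://arxiv.org/pdf/2005.11401 --keywords "hallucinations,llm" --progress False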
Supported LLMs
The following LLMs are currently supported:
gpt-4o
claude-3-5-sonnet-20240620
ollama/llama3.1
bartowski/Llama-3.1_OpenScholar-8B-GGUF/Llama-3.1_OpenScholar-8B-Q4_K_M.gguf
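Any of these identifiers should work with the --LLM option. For example, a sketch of annotating a paper with a local GGUF model pulled from the Hugging Face Hub (again, the flag syntax is assumed):
annotate https://arxiv.org/pdf/2005.11401 --LLM bartowski/Llama-3.1_OpenScholar-8B-GGUF/Llama-3.1_OpenScholar-8B-Q4_K_M.gguf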
Docker Web Application
A web application version of the tool is available on Docker Hub.
To run the web application, use the following command:
docker run -d --gpus=all -it -p 8501:8501 neuml/annotateai
The LLM can also be set via ENV parameters. For example:
docker run -d --gpus=all -it -p 8501:8501 -e LLM=bartowski/Llama-3.2-1B-Instruct-GGUF/Llama-3.2-1B-Instruct-Q4_K_M.gguf neuml/annotateai
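Hosted models such as gpt-4o typically also need a provider API key. A minimal sketch, assuming the underlying LLM client reads the standard OPENAI_API_KEY variable (the exact variable name is an assumption and depends on the provider):
docker run -d --gpus=all -it -p 8501:8501 -e LLM=gpt-4o -e OPENAI_API_KEY=$OPENAI_API_KEY neuml/annotateai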
The code for the web application can be found in the app folder.
API Documentation
The tool provides the following API endpoints:
POST /annotate: Annotate a paper with concepts.
Example request:
curl -X POST \
http://localhost:8501/annotate \
-H 'Content-Type: application/json' \
-d '{"paper_url": "https://arxiv.org/pdf/2005.11401", "keywords": ["hallucinations", "llm"], "progress": false}'
LLM APIs
The tool also provides the following LLM API endpoints:
GET /llms: Get a list of supported LLMs.
Example request:
curl -X GET \
http://localhost:8501/llms
POST /llms: Set the LLM to use (e.g., LLM=bartowski/Llama-3.2-1B-Instruct-GGUF/Llama-3.2-1B-Instruct-Q4_K_M.gguf).
Example request:
curl -X POST \
http://localhost:8501/llms \
-H 'Content-Type: application/json' \
-d '{"llm_name": "bartowski/Llama-3.2-1B-Instruct-GGUF/Llama-3.2-1B-Instruct-Q4_K_M.gguf"}'