About 6,350,000 results
  1. Llama.cpp integration - Docs by LangChain

    Integrate with the Llama.cpp chat model using LangChain Python.

  2. Download Ollama on macOS

    Download Ollama for macOS, or paste this in a terminal: curl -fsSL https://ollama.com/install.sh | sh

  3. Industry Leading, Open-Source AI | Llama

    Discover Llama 4's class-leading AI models, Scout and Maverick. Experience top performance, multimodality, low costs, and unparalleled efficiency.

  4. GitHub - ollama/ollama: Get up and running with Kimi-K2.5, GLM-5 ...

    Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models. - ollama/ollama

  5. What is Ollama - GeeksforGeeks

    January 27, 2026 · Output: phi3 response. Pre-Trained Model Support in Ollama: Ollama enables developers to run pre-trained, open-weight language and multimodal models locally through a unified …

  6. GitHub - run-llama/llama_index: LlamaIndex is the leading document ...

    A note on verification of build assets: By default, llama-index-core includes a _static folder that contains the nltk and tiktoken cache that is included with the package installation. This ensures that you can …

  7. codellama:python

    July 18, 2023 · Code Llama is a model for generating and discussing code, built on top of Llama 2. It’s designed to make workflows faster and more efficient for developers and to make it easier for people to learn …

  8. Meta Llama 3: The most capable openly available LLM to date

    April 18, 2024 · Readme: Llama 3, the most capable openly available LLM to date. Meta Llama 3, a family of models developed by Meta Inc., is the new state of the art, available in both 8B and 70B …

  9. GitHub - OllamaRelease/Ollama: Download and run Llama 3.3 ...

    February 26, 2025 · Download and run Llama 3.3, DeepSeek-R1, Phi-4, Gemma 2, and other large language models. - OllamaRelease/Ollama

  10. meta-llama (Meta Llama) - Hugging Face

    Org profile for Meta Llama on Hugging Face, the AI community building the future.