
How It Works

Inference API

  1. Connect Your Application: Use the Co-Builder interface to link your application or agent with the Inference API.

  2. LLM Selection: Choose the LLM that best suits your apps or agents.

  3. Real-Time Processing: The API processes data and delivers results with optimized speed and accuracy.

  4. Monitor Usage: Track performance metrics and adjust settings to optimize for cost and efficiency.
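The four steps above can be sketched in code. This is an illustrative sketch only: Grafilab has not published a client SDK in this section, so `InferenceClient`, its method names, and the usage fields are hypothetical stand-ins for connect, select, process, and monitor.

```python
# Hypothetical sketch of the Inference API flow -- not a real Grafilab SDK.
from dataclasses import dataclass, field

@dataclass
class InferenceClient:
    app_id: str                              # step 1: link your app/agent
    model: str = "default-llm"               # step 2: preferred LLM
    usage: dict = field(default_factory=lambda: {"requests": 0, "tokens": 0})

    def select_model(self, model: str) -> None:
        """Step 2: switch to an LLM suited to the app/agent."""
        self.model = model

    def infer(self, prompt: str) -> str:
        """Step 3: stand-in for real-time processing (no network call here)."""
        self.usage["requests"] += 1
        self.usage["tokens"] += len(prompt.split())
        return f"[{self.model}] response to: {prompt}"

    def usage_report(self) -> dict:
        """Step 4: metrics a developer would monitor to tune cost/efficiency."""
        return dict(self.usage)

client = InferenceClient(app_id="my-agent")
client.select_model("llama-3-70b")
result = client.infer("Summarize the Grafi whitepaper")
print(result)                  # [llama-3-70b] response to: Summarize the Grafi whitepaper
print(client.usage_report())   # {'requests': 1, 'tokens': 4}
```

In a real integration the `infer` call would be an HTTP request to the Inference API, and `usage_report` would read from the platform's dashboard metrics rather than a local counter.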

Deploy Your Own Apps/Agents

  1. Upload Your Dataset: Developers upload pre-trained datasets or configure new ones within the Co-Builder platform.

  2. Select Resources: Choose from centralized or decentralized GPUs for deployment, ensuring scalability and cost efficiency.

  3. Train and Scale: Co-Builder handles training and scaling based on performance requirements.

  4. Monitor Performance: Use real-time dashboards to oversee training speed, utilization, and outcomes.

  5. Monetize: Get listed on the AI App Store for subscriptions, earning rewards for usage.
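The deployment lifecycle above can be modeled as a simple state machine. This is a hypothetical sketch under assumed names: the `Deployment` class, the GPU pool labels, and the listing string are placeholders for the five Co-Builder steps, not a published Grafilab API.

```python
# Hypothetical sketch of the Co-Builder deployment lifecycle.
class Deployment:
    GPU_POOLS = {"centralized", "decentralized"}   # step 2: resource options

    def __init__(self, dataset: str):
        self.dataset = dataset      # step 1: uploaded/configured dataset
        self.resource = None
        self.trained = False
        self.metrics = {}
        self.listed = False

    def select_resources(self, pool: str) -> None:
        """Step 2: pick centralized or decentralized GPUs."""
        if pool not in self.GPU_POOLS:
            raise ValueError(f"unknown GPU pool: {pool}")
        self.resource = pool

    def train_and_scale(self, replicas: int = 1) -> None:
        """Step 3: Co-Builder trains and scales to the requested size."""
        if self.resource is None:
            raise RuntimeError("select resources before training")
        self.trained = True
        # step 4: the kind of figures a real-time dashboard would surface
        self.metrics = {"replicas": replicas, "utilization": 0.0}

    def monetize(self) -> str:
        """Step 5: list on the AI App Store once training has finished."""
        if not self.trained:
            raise RuntimeError("train before listing")
        self.listed = True
        return f"listed:{self.dataset}@{self.resource}"

dep = Deployment("my-pretrained-dataset")
dep.select_resources("decentralized")
dep.train_and_scale(replicas=2)
print(dep.monetize())   # listed:my-pretrained-dataset@decentralized
```

The ordering checks mirror the numbered steps: resources must be selected before training, and training must finish before the app can be listed for rewards.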

