ChatGLM.cpp is a C++ implementation of the ChatGLM-6B model, enabling efficient local inference without requiring a Python environment. It is optimized for running on consumer hardware.

Features

  • Provides a C++ implementation of ChatGLM-6B
  • Supports running models on CPU and GPU
  • Optimized for low-memory hardware and edge devices
  • Supports integer quantization (e.g. 4-bit) to reduce memory and compute requirements
  • Works as a lightweight alternative to Python-based inference
  • Offers real-time chatbot capabilities
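
A typical local workflow converts the Hugging Face checkpoint to a quantized GGML file and then chats with it from the CLI. The commands below are a sketch based on the project's README; exact script names, flags, and output paths (e.g. `convert.py`, `q4_0`, `chatglm-ggml.bin`) may differ between versions, so check the repository before running.

```shell
# Clone the repository with its submodules (ggml etc.)
git clone --recursive https://github.com/li-plus/chatglm.cpp.git
cd chatglm.cpp

# Convert the ChatGLM-6B checkpoint to a 4-bit quantized GGML file
# (requires Python only for this one-time conversion step)
python3 chatglm_cpp/convert.py -i THUDM/chatglm-6b -t q4_0 -o chatglm-ggml.bin

# Build the C++ inference binary
cmake -B build
cmake --build build -j --config Release

# Run an interactive chat session on the quantized model
./build/bin/main -m chatglm-ggml.bin -i
```

After conversion, inference itself needs no Python environment, which is the main point of the project: the quantized `.bin` file plus the compiled binary are enough to run the model on consumer hardware.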

License

MIT License

Additional Project Details

Operating Systems

Linux, Mac, Windows

Programming Language

C++

Related Categories

C++ Large Language Models (LLM), C++ Natural Language Processing (NLP) Tool, C++ AI Models, C++ LLM Inference Tool

Registered

2025-01-21