For AI and computer vision teams in organizations of all sizes
The AI-assisted features of Ango Hub automate your AI data workflows to improve data labeling efficiency and model RLHF, all while allowing domain experts to focus on providing high-quality data.
Securely stream and govern industrial data to power intelligent operations with agentic insights.
For IoT Developers, Solution Architects, Technical Architects, CTOs, OT/IT Engineers
Trusted MQTT Platform — Fully-managed and cloud-native MQTT platform for bi-directional IoT data movement.
The Monkey-Spider is a crawler-based low-interaction honeyclient project. It is developed as a honeyclient but is not restricted to that use. The Monkey-Spider crawls websites to expose their threats to web clients.
Universal Information Crawler (Uicrawler) is a fast, precise, and reliable Internet crawler: a program (automated script) that browses the World Wide Web in a methodical, automated manner and builds an index of the documents it accesses.
Ruya is a Python-based breadth-first, level-based, delayed, event-driven crawler for crawling English and Japanese websites. It is targeted solely at developers who want to add crawling functionality and crawl control to their own projects through its API.
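The two crawler blurbs above describe the same core idea: a breadth-first traversal of the link graph, limited by depth level and throttled by a politeness delay. The sketch below is a minimal illustration of that idea, not Ruya's or Uicrawler's actual API; the fetcher is injected as a function so the traversal logic stays self-contained (a real crawler would fetch pages over HTTP and extract links from the HTML).

```python
import time
from collections import deque

def crawl(start_url, get_links, max_depth=2, delay=0.0):
    """Breadth-first crawl up to max_depth levels from start_url.

    get_links(url) must return an iterable of linked URLs; here it is
    injected so the example runs without any network access.
    """
    seen = {start_url}
    queue = deque([(start_url, 0)])
    order = []
    while queue:
        url, depth = queue.popleft()
        order.append(url)  # "visit" the page; a real crawler indexes it here
        if depth < max_depth:
            for link in get_links(url):
                if link not in seen:
                    seen.add(link)
                    queue.append((link, depth + 1))
        if delay:
            time.sleep(delay)  # politeness delay between requests
    return order

# Toy link graph standing in for real pages (URLs are invented):
graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(crawl("a", lambda u: graph.get(u, []), max_depth=1))
```

With `max_depth=1`, page "d" is discovered but never visited, which is exactly the level-based cutoff the blurb refers to.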
The all-in-one solution built to help you stay organised and get more bookings with thousands of connections to online travel agencies (OTAs), resellers and suppliers.
zSearch is a simple Python-based crawler and search engine. Raw HTML is stored in bzip2 archives, the index is built with PyLucene, and Twisted provides an internal HTTP server. Results are returned as XML over HTTP.
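Storing raw HTML in bzip2 archives, as zSearch does, keeps the original pages around for re-indexing at a fraction of their size. A minimal sketch of that storage layer using Python's standard `bz2` module (the function names are invented for illustration, not zSearch's API):

```python
import bz2
import os
import tempfile

def store_page(path, html):
    """Compress a fetched page into a bzip2 archive on disk."""
    with bz2.open(path, "wt", encoding="utf-8") as f:
        f.write(html)

def load_page(path):
    """Decompress a stored page, e.g. for re-indexing."""
    with bz2.open(path, "rt", encoding="utf-8") as f:
        return f.read()

html = "<html><body>hello, crawler</body></html>"
path = os.path.join(tempfile.mkdtemp(), "page.html.bz2")
store_page(path, html)
assert load_page(path) == html
```

Keeping the raw pages means the PyLucene index can be rebuilt with a different analysis pipeline without re-crawling anything.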
A configurable knowledge management framework. It works out of the box, but it is meant mainly as a framework for building complex information retrieval and analysis systems. Its three major components (Crawler, Analyzer, and Indexer) can also be used separately.
Nomad is a tiny but efficient search engine and web crawler. It works very well for searching within a set of corporate websites on the Internet, or within an intranet's HTML documents or knowledge repositories.
PySMBSearch is a crawler and search engine for SMB shares. It consists of a crawler script, which creates an index and stores it in an SQL database, and a CGI script that can be used to run queries against the database.
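The crawler/CGI split described above is a classic two-process design: one process writes the index into an SQL database, another reads it to answer queries. The sketch below shows the shape of that split with SQLite; PySMBSearch's real schema and function names are not documented here, so the table, columns, and share paths are all invented for illustration.

```python
import sqlite3

# In-memory database standing in for the shared SQL index.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE files (share TEXT, path TEXT, name TEXT, size INTEGER)"
)

def index_file(share, path, name, size):
    """Crawler side: record one file found while walking an SMB share."""
    conn.execute("INSERT INTO files VALUES (?, ?, ?, ?)", (share, path, name, size))

def search(term):
    """CGI side: substring match on file names, via a parameterized query."""
    cur = conn.execute(
        "SELECT share, path, name FROM files WHERE name LIKE ?", (f"%{term}%",)
    )
    return cur.fetchall()

index_file("//fileserver/docs", "/reports", "q3-summary.pdf", 120_000)
index_file("//fileserver/docs", "/reports", "notes.txt", 2_048)
print(search("summary"))
```

Because the two halves only share the database, the crawler can run on a schedule while the query side stays stateless, which is what makes a plain CGI script sufficient.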
Gokstad will be a basic crawler and text analysis engine. Its current scope is to download news webpages and perform simple text analysis on them. The name "Gokstad" comes from a seaworthy, clinker-built Viking ship constructed largely of oak.
Enterprise-grade platform designed to connect strategy, planning, and execution across digital product development and software delivery.
Planview links your technology vision directly to teams' daily work, providing complete visibility and control over your digital product delivery ecosystem.
Qvark is many things: a multi-user dungeon crawler (like the countless text-based MUDs out there) with ASCII graphics (NetHack, Angband, etc.) and the persistent world and storyline of a modern MMORPG. It is terminal-based and written in C++ and Python.