MLOS is a project to enable autotuning for systems.
Learn Locust from scratch.
JEval helps you evaluate your JMeter test plan and provides recommendations before you start your performance testing. All contributions welcome.
Black Friday performance testing experiment.
This utility converts your LoadRunner rules to JMeter Correlation Recorder rules. All contributions welcome.
A production-grade telemetry-aware suite for benchmarking LLM inference performance on NVIDIA RTX 3080.
Performance experiments in GIL-free Python 3.13.
Collection of examples and links that use different profiling tools to show memory usage and timings.
Load testing suite built with Locust for web applications
Standalone LLM inference benchmarking pipelines on AMD GPUs using ROCm, vLLM, MAD, and data visualization scripts.
Field-theoretic dual-track Python subset. Curvature-, phase-, and domain-driven optimization. Deterministic and interpretable.
Open ML systems platform for training, profiling, evaluating, and serving AI models.
Anthropic Performance Take-Home: 1,339 cycles (110.3x speedup, 9/9 tests) - Claude Opus 4.6 solution.
Library to compute auto-tuning and performance metrics.
Distributed training profiler for analyzing compute, communication, memory, and scaling bottlenecks in ML training systems.
GitHub App for detecting benchmark regressions in pull requests.
DeltaPerf AI: Analyze. Detect. Summarize.
Profile-first ML systems project optimizing a multi-camera end-to-end driving model for hardware efficiency using PyTorch, CUDA streams, NVTX instrumentation, and Nsight Systems.
NFR-driven performance test generation for mission-critical systems. Part of the PRESTO framework for performance resilience engineering.