From 67981a200e8fcd7e1d9c3f4234fb2a9ab68311d5 Mon Sep 17 00:00:00 2001
From: Sam Mirazi
Date: Tue, 3 Jun 2025 12:22:43 -0700
Subject: [PATCH] 24

---
 .gitignore |  67 +++++++++++++++----------------
 LICENSE    |  21 ++++++++++
 README.md  | 114 +++++++++++++++++++++++++++++++++++++++++++++++++++++
 3 files changed, 169 insertions(+), 33 deletions(-)
 create mode 100644 LICENSE
 create mode 100644 README.md

diff --git a/.gitignore b/.gitignore
index 5ef6a52..acc4d80 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,41 +1,42 @@
 # See https://help.github.com/articles/ignoring-files/ for more about ignoring files.
 
-# dependencies
-/node_modules
-/.pnp
-.pnp.*
-.yarn/*
-!.yarn/patches
-!.yarn/plugins
-!.yarn/releases
-!.yarn/versions
-
-# testing
-/coverage
-
-# next.js
-/.next/
-/out/
-
-# production
-/build
-
-# misc
+# Python
+__pycache__/
+*.py[cod]
+*$py.class
+
+# Virtual environment
+.venv/
+env/
+venv/
+ENV/
+
+# IDEs and editors
+.idea/
+.vscode/
+*.swp
+
+# OS-specific
 .DS_Store
-*.pem
+Thumbs.db
+
+# Distribution / packaging
+.Python
+build/
+dist/
+*.egg-info/
+*.egg
 
-# debug
-npm-debug.log*
-yarn-debug.log*
-yarn-error.log*
-.pnpm-debug.log*
+# Logs and databases
+*.log
+*.sqlite3
+*.db
 
-# env files (can opt-in for committing if needed)
+# Environment variables
 .env*
 
-# vercel
-.vercel
+# pytest cache
+.pytest_cache/
 
-# typescript
-*.tsbuildinfo
-next-env.d.ts
+# Other misc
+*.pem
diff --git a/LICENSE b/LICENSE
new file mode 100644
index 0000000..c34edf8
--- /dev/null
+++ b/LICENSE
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) [year] [fullname]
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
\ No newline at end of file
diff --git a/README.md b/README.md
new file mode 100644
index 0000000..4ba7a13
--- /dev/null
+++ b/README.md
@@ -0,0 +1,114 @@

# FastAPI vs Flask Performance Benchmark

This project provides a benchmarking suite to compare the performance of the FastAPI and Flask web frameworks under various conditions, including scenarios with and without artificial delays.

## Project Overview

The primary goal is to offer a clear, reproducible way to measure and contrast the request handling capabilities of FastAPI (using Uvicorn) and Flask (using Werkzeug's development server). It includes:

- Simple web applications for both FastAPI and Flask.
- Benchmark scripts to send a configurable number of concurrent requests.
- Scripts to orchestrate the server startup, benchmark execution, and result tabulation for different scenarios.
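
The applications under test are intentionally small. The sketch below gives a rough idea of what the delayed variants look like: one FastAPI endpoint and one Flask endpoint that each sleep for 0.3 seconds before responding. It is an illustration only, assuming a single `/` route and simple JSON payloads, and is not the exact contents of the files under `app_fastapi/` and `app_flask/`; only the 0.3 s delay and the ports (8000 for FastAPI, 3000 for Flask) come from the rest of this README.

```python
# Illustrative sketch only: not the exact contents of the files under
# app_fastapi/ and app_flask/. The "/" route and JSON payloads are assumptions;
# the 0.3 s delay and the ports (8000/3000) follow the rest of this README.
import time

from fastapi import FastAPI
from flask import Flask, jsonify

# FastAPI variant; in this project the FastAPI app is served by Uvicorn on port 8000.
fastapi_app = FastAPI()


@fastapi_app.get("/")
def fastapi_with_delay() -> dict:
    time.sleep(0.3)  # artificial delay standing in for slow I/O
    return {"framework": "FastAPI", "delayed": True}


# Flask variant; served by Werkzeug's development server on port 3000.
flask_app = Flask(__name__)


@flask_app.route("/")
def flask_with_delay():
    time.sleep(0.3)  # the same artificial delay in the Flask handler
    return jsonify(framework="Flask", delayed=True)


if __name__ == "__main__":
    flask_app.run(port=3000)
```

FastAPI runs plain `def` handlers in a thread pool, so in both applications the sleep stands in for blocking I/O rather than CPU work.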

## Directory Structure

```
Hacker Dojo/
├── .venv/                            # Virtual environment
├── app_fastapi/                      # FastAPI application files
│   ├── FastAPI_no_delay.py
│   └── FastAPI_with_delay.py
├── app_flask/                        # Flask application files
│   ├── Flask_no_delay.py
│   └── Flask_with_delay.py
├── benchmark/                        # Core benchmark script
│   └── run_benchmark.py
├── Design Docs/                      # (Placeholder for design documents)
├── project/                          # (Purpose to be clarified)
├── SOP/                              # Standard Operating Procedures & Handoff notes
├── static/                           # (Placeholder for static assets)
├── tests/                            # (Placeholder for automated tests)
├── .gitignore                        # Files and directories ignored by Git
├── custom.css                        # (Purpose to be clarified)
├── README.md                         # This file
├── requirements.txt                  # Python dependencies
├── run_benchmark_NO_RESTRICTIONS.py  # Runs benchmarks for no-delay apps
└── run_benchmark_table.py            # Runs benchmarks for apps with delays and generates a table
```

## Prerequisites

- Python 3.8+
- Git

## Setup and Installation

1. **Clone the repository:**

   ```bash
   git clone <repository-url>
   cd Hacker-Dojo
   ```

2. **Create and activate a virtual environment:**

   - On Windows:

     ```bash
     python -m venv .venv
     .venv\Scripts\activate
     ```

   - On macOS/Linux:

     ```bash
     python3 -m venv .venv
     source .venv/bin/activate
     ```

3. **Install dependencies:**

   ```bash
   pip install -r requirements.txt
   ```

## Running the Benchmarks

This project provides two main scripts to run benchmarks:

1. **`run_benchmark_table.py`:**
   This script benchmarks the applications that include an artificial `time.sleep(0.3)` delay in their request handlers. It typically sends fewer requests and is useful for observing behavior with I/O-bound-style operations.

   ```bash
   python run_benchmark_table.py
   ```

   The results will be displayed in a table in your console.

2. **`run_benchmark_NO_RESTRICTIONS.py`:**
   This script benchmarks the applications that do _not_ have any artificial delays. With no delay in the handlers, it measures raw request throughput and per-request framework overhead, using a higher number of requests (typically 1000).

   ```bash
   python run_benchmark_NO_RESTRICTIONS.py
   ```

   Results will also be displayed in a table in the console.

**Important Notes:**

- Ensure no other applications are using ports 3000 (for Flask) or 8000 (for FastAPI) when running the benchmarks. The scripts attempt to manage server processes, but conflicts can still occur.
- Benchmark results can be influenced by your system's hardware, OS, and current load. For the most accurate comparisons, run benchmarks in a consistent environment.

## Interpreting Results

The benchmark scripts output tables summarizing:

- **Framework:** The web framework and configuration tested.
- **#Reqs:** The total number of requests sent.
- **Success:** The number of successful requests out of the total.
- **Total s:** The total time taken, in seconds, to complete all requests.
- **Avg s/req:** The average time per request, in seconds.

Lower `Total s` and `Avg s/req` values generally indicate better performance; a `Success` count below `#Reqs` means some requests failed or were dropped.

## Contributing

Contributions are welcome! Please feel free to submit pull requests or open issues for bugs, feature requests, or improvements.

## License

This project is licensed under the MIT License - see the `LICENSE` file for details.
-- 
2.25.1
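
For reference, the kind of measurement loop that produces the tables described under "Interpreting Results" can be sketched with the Python standard library alone: send a batch of concurrent GET requests, count HTTP 200 responses as successes, and derive `Total s` and `Avg s/req` from a single wall-clock timer. This is a simplified sketch, not the code in `benchmark/run_benchmark.py` or the two top-level runner scripts; it assumes the servers are already listening on ports 8000 (FastAPI) and 3000 (Flask), and the request and worker counts are illustrative.

```python
# Simplified sketch of a concurrent measurement loop. It is not the code in
# benchmark/run_benchmark.py or the runner scripts; the request count, worker
# count, and URLs are assumptions, apart from the ports named in the README.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor


def hit(url: str) -> bool:
    """Send one GET request and report whether it returned HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except Exception:
        return False


def benchmark(name: str, url: str, n_requests: int = 100, workers: int = 50) -> dict:
    """Fire n_requests concurrent GETs at url and summarize the outcome."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(hit, [url] * n_requests))
    total = time.perf_counter() - start
    return {
        "Framework": name,
        "#Reqs": n_requests,
        "Success": sum(results),
        "Total s": round(total, 3),
        "Avg s/req": round(total / n_requests, 4),
    }


if __name__ == "__main__":
    # Assumes both servers are already running on the ports used by this project.
    for row in (
        benchmark("FastAPI (Uvicorn)", "http://127.0.0.1:8000/"),
        benchmark("Flask (Werkzeug)", "http://127.0.0.1:3000/"),
    ):
        print(row)
```

With the delayed handlers, `Avg s/req` is dominated by the 0.3 s sleep; with the no-delay apps it mostly reflects framework and server overhead.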