Flask vs FastAPI Performance Testing
Background
The sy project receives business data from business systems via MQ and runs developer-written Python scripts to verify consistency between business-system and finance-system data.
The sy system executes a large volume of Python scripts every day: currently 60,000+ script runs per day on Flask. Because this has hit a performance bottleneck, we are evaluating the FastAPI framework to make fuller use of CPU and memory and to remove the current bottleneck. This document presents a performance test report for both frameworks and provides the data needed to choose between them.
About Apache ab
Apache ab (ApacheBench) is a command-line HTTP benchmarking tool; all tests below are driven with it.
Installation
yum -y install httpd-tools
Selected options
-n  total number of requests to perform
-c  concurrency, i.e. the number of requests issued at the same time; c must not exceed n
-p  file containing the POST body
-T  Content-Type header for POST data (default: 'text/plain')
Testing a GET request
ab -n 100 -c 10 http://127.0.0.1:8081/cppla
Testing a POST request
ab -n 100 -c 10 -T 'application/json' -p httpjson.txt http://127.0.0.1:8081/cppla1
// httpjson.txt contains: {"recordId":123}
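The POST body file can be created with a one-liner (the filename and payload are the ones used above):

```shell
# create the POST body file used by the ab command above
printf '%s' '{"recordId":123}' > httpjson.txt
cat httpjson.txt
```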
Test Plan
Each request simulates one real script invocation; we test every order of magnitude of total request volume, at three concurrency levels each.

Total requests   Concurrency levels
100              10, 100, 1000
1000             10, 100, 1000
10000            10, 100, 1000
20000            10, 100, 1000
30000            10, 100, 1000
40000            10, 100, 1000
50000            10, 100, 1000
60000            10, 100, 1000
80000            10, 100, 1000
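The full matrix above can be driven by a small shell loop. A sketch (host, port, and body file are the ones used elsewhere in this article; the function only prints each ab command so the plan can be reviewed before running):

```shell
# emit one ab command per (total requests, concurrency) cell of the test plan
gen_ab_commands() {
  for n in 100 1000 10000 20000 30000 40000 50000 60000 80000; do
    for c in 10 100 1000; do
      # ab rejects a concurrency level greater than the total request count
      [ "$c" -gt "$n" ] && continue
      echo "ab -n $n -c $c -T 'application/json' -p httpjson.txt http://127.0.0.1:8081/cppla1"
    done
  done
}
gen_ab_commands
```

Remove the `echo` (or pipe the output to `sh`) to actually execute the runs.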
Test Code
Each endpoint handles a POST request and sleeps 3 s before returning, simulating script execution. Flask is started with 20 worker processes; FastAPI with a single process.
## Flask code
# coding: utf-8
from gevent import monkey
monkey.patch_all()  # patch blocking calls so time.sleep yields to other greenlets

from gevent.pywsgi import WSGIServer
from multiprocessing import Process
from flask import Flask, request
from loguru import logger
import time

app = Flask(__name__)

# endpoint under test: echo the JSON body back after a 3s delay
@app.route("/cppla1", methods=['POST', 'GET'])
def cppla1():
    data = request.json
    time.sleep(3)
    return data

# start listening on the given ip/port
def run(MULTI_PROCESS):
    if MULTI_PROCESS == False:
        WSGIServer(('0.0.0.0', 8081), app).serve_forever()
    else:
        mulserver = WSGIServer(('0.0.0.0', 8081), app)
        mulserver.start()

        def server_forever():
            mulserver.start_accepting()
            mulserver._stop_event.wait()

        # for i in range(cpu_count()):
        for i in range(20):
            logger.info('starting worker process {}', i)
            p = Process(target=server_forever)
            p.start()

if __name__ == "__main__":
    # single process + coroutines
    # run(False)
    # multiple processes + coroutines
    run(True)
## FastAPI code
# coding: utf-8
# https://www.javazhiyin.com/80750.html
from fastapi import FastAPI
import time

app = FastAPI()

# endpoint under test: echo the JSON body back after a 3s delay
@app.post("/cppla1")
def function_benchmark(data: dict):
    time.sleep(3)
    return {"item": data}

# start listening on the given ip/port
if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8081)
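One reason a single FastAPI process keeps up with 20 Flask processes here: uvicorn runs plain `def` endpoints in a thread pool, so many 3 s sleeps overlap inside one process. A minimal sketch of that overlap, using a 0.2 s stand-in delay rather than the framework itself:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handler(_):
    # stand-in for the benchmark endpoint body (time.sleep(3) in the article)
    time.sleep(0.2)
    return "ok"

start = time.monotonic()
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(handler, range(10)))
elapsed = time.monotonic() - start

# ten 0.2s sleeps finish in roughly 0.2s total because they run concurrently
print(f"{elapsed:.2f}s for {len(results)} calls")
```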
Test Results
Framework   Total requests   c=10 time (s)   c=100 time (s)   c=1000 time (s)
fastapi     100              33.119          12.148           not supported by ab (c > n)
flask       100              45.088          81.106           not supported by ab (c > n)
fastapi     1000             304.057         78.283           78.631
flask       1000             327.472         198.273          303.442
fastapi     10000            x               754.296          757.719
flask       10000            x               1550.119         1970.427
fastapi     20000            x               x                x
flask       20000            x               x                x
fastapi     30000            x               x                x
flask       30000            x               x                x
fastapi     40000            x               x                x
flask       40000            x               x                x
fastapi     50000            x               x                x
flask       50000            x               x                x
fastapi     60000            x               x                x
flask       60000            x               x                x
fastapi     80000            x               x                x
flask       80000            x               x                x
(x = run not performed)
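The speedup can be checked directly from the table. A quick sketch computing flask/fastapi time ratios for the runs both frameworks completed at c=100 and c=1000 (values copied from the table above):

```python
# (total requests, concurrency) -> (flask seconds, fastapi seconds)
runs = {
    (1000, 100): (198.273, 78.283),
    (1000, 1000): (303.442, 78.631),
    (10000, 100): (1550.119, 754.296),
    (10000, 1000): (1970.427, 757.719),
}
ratios = {k: round(flask_s / fastapi_s, 2) for k, (flask_s, fastapi_s) in runs.items()}
for (n, c), r in ratios.items():
    print(f"n={n}, c={c}: flask/fastapi = {r}x")
```

The ratios come out between roughly 2x and 3.9x depending on the run.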
Conclusion
Across the runs both frameworks completed, FastAPI was roughly two to four times faster than Flask, about 3x on average; FastAPI is recommended.
Copyright belongs to the original author, 我爱看明朝. In case of infringement, please contact us for removal.